QUESTIONS AND ANSWERS 100% PASS
Large Language Models (LLMs) - ✔✔- rely on Neural Networks
- Neural Networks and LLMs are not the same thing
- LLMs are the most popular implementation of Neural Networks
LLMs use... - ✔✔an attention mechanism
The purpose of the "attention mechanism" is to... - ✔✔determine the significance
of a word (or token) as it appears in a string
There are many models for determining... - ✔✔attention
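One common model for determining attention is the scaled dot-product: score each token's key vector against a query vector, then normalize the scores with a softmax so they sum to 1. A minimal sketch in plain Python (the vectors and values here are made up purely for illustration):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(query, keys):
    """Score each key against the query (dot product scaled by sqrt(d)),
    then normalize. The resulting weights express how much 'significance'
    each token receives relative to the query."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    return softmax(scores)

# Toy 2-d token embeddings (hypothetical values)
query = [1.0, 0.0]
keys = [[1.0, 0.0],   # aligned with the query -> highest weight
        [0.0, 1.0],   # orthogonal to the query -> lowest weight
        [0.5, 0.5]]
weights = attention_weights(query, keys)
```

The key most similar to the query ends up with the largest weight, which is the sense in which attention "determines significance."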
The modern explosion in LLM functionality - ✔✔stems from the "Transformer Model"
and "Transformer Architecture"
"Transformer Architecture" - ✔✔- introduced by the paper "Attention is All You
Need" (2017)
- revolutionized the landscape of AI
Key ideas from the paper: - ✔✔- You don't need a complicated model to get better
attention; you need a model that can grow with more hidden layers
- Your attention process must evaluate input sequences in parallel (rather than
sequentially)
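The parallel-evaluation idea can be sketched as follows: in self-attention, the scores for every token against every other token come from one matrix product over the whole sequence, so no row has to wait for earlier rows to finish (unlike an RNN, which walks the sequence token by token). The embeddings below are hypothetical toy values:

```python
import math

def matmul(A, B):
    """Naive matrix product: C[i][j] = sum_k A[i][k] * B[k][j]."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention_scores(X):
    """Compute every token-vs-token score in a single matrix product.

    Row i depends only on the full input X, not on rows < i having been
    processed first, so all rows can be evaluated in parallel.
    """
    d = len(X[0])
    Xt = [list(col) for col in zip(*X)]   # transpose of X
    S = matmul(X, Xt)                     # all pairwise dot products at once
    return [[s / math.sqrt(d) for s in row] for row in S]

# Toy sequence of three 2-d token embeddings
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
scores = self_attention_scores(X)
```

In a real Transformer the queries, keys, and values are separate learned projections of X, but the parallelism shown here, one batched matrix product instead of a sequential scan, is the same.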
COPYRIGHT © 2025 BY EMILLY CHARLOTTE, ALL RIGHTS RESERVED