Questions and CORRECT Answers
Large Language Models (LLMs) - CORRECT ANSWER - - rely on Neural Networks
- Neural Networks and LLMs are not the same thing
- LLMs are the most popular implementation of Neural Networks
LLMs use... - CORRECT ANSWER - attention mechanism
The purpose of the "attention mechanism" is to... - CORRECT ANSWER - determine the
significance of a word (or token) as it appears in a string
There are many models for determining... - CORRECT ANSWER - attention
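To make the attention cards concrete, here is a minimal NumPy sketch of scaled dot-product attention, the scoring scheme from the Transformer paper cited below. The sequence length, embedding size, and random vectors are illustrative assumptions, not part of the course material.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Score every token against every other token, then mix the values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # pairwise significance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True) # softmax: each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))             # 4 tokens, 8-dim embeddings (toy values)
out, w = scaled_dot_product_attention(x, x, x)
print(w.round(2))                       # row i = token i's attention over the string
```

Row i of the weight matrix is how strongly token i attends to each other token, which is exactly the "significance of a word (or token) as it appears in a string" described above.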
The modern explosion in LLM functionality - CORRECT ANSWER - "Transformer Model"
and "Transformer Architecture"
"Transformer Architecture" - CORRECT ANSWER - - introduced by the paper "Attention is
All You Need" (2017)
- revolutionized the landscape of AI
Key ideas from the paper: - CORRECT ANSWER - - You don't need a complicated model to
get better attention; you need a model that can grow with more hidden layers
- Your attention process must evaluate input sequences in parallel (rather than sequentially); see the sketch after this list
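As a rough illustration of the second idea, the sketch below contrasts a sequential RNN-style loop, where each step must wait on the previous hidden state, with the parallel scoring attention allows. All shapes and weight values are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
seq = rng.normal(size=(6, 8))    # 6 tokens, 8-dim embeddings (toy values)
W = rng.normal(size=(8, 8))      # toy recurrent weight matrix

# Sequential (RNN-style): each step depends on the previous hidden state,
# so tokens cannot be processed at the same time.
h = np.zeros(8)
for token in seq:
    h = np.tanh(W @ h + token)

# Parallel (Transformer-style): all pairwise attention scores come out of
# one matrix product, with no step-to-step dependency to wait on.
scores = seq @ seq.T / np.sqrt(seq.shape[-1])
print(scores.shape)              # (6, 6): every token scored against every other
```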
Weights: - CORRECT ANSWER - a numeric measure of the strength of the connection between
neurons. Weights are also called "parameters". It's common to hear about "model size" or
"number of parameters," which refers to the total number of these weights (or coefficients) in the model.
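One way to see what "number of parameters" counts is to tally the weights (and biases) in a toy fully connected network; the layer sizes below are arbitrary assumptions for illustration.

```python
# A tiny fully connected network: 8 inputs -> 16 hidden -> 4 outputs.
layer_sizes = [8, 16, 4]

# Each connection between neurons carries one weight; each neuron in a
# layer also gets one bias. Both count toward the model's "parameters".
n_params = sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))
print(n_params)  # (8*16 + 16) + (16*4 + 4) = 212 parameters
```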