CS7643 Quiz 5 – Comprehensive Exam on Embeddings, Graphs, RNNs, LSTMs, Word2Vec, Masked Language Modeling, Knowledge Distillation, t-SNE, and Conditional Language Models (2025 Edition)


Master advanced deep learning concepts with this CS7643 Quiz 5 comprehensive exam resource, designed for Georgia Institute of Technology's OMSCS program. This guide includes realistic, high-level exam questions with correct answers and detailed rationales, aligned with the latest CS7643 – Deep Learning (2025 Edition) curriculum. Focused on representations, generative models, and sequence learning, this quiz provides targeted reinforcement for the most conceptually challenging topics in the course.

Topics covered:

• Embeddings & Graph Embedding – Node2Vec, DeepWalk, and similarity encoding
• Recurrent Models – RNNs, LSTMs, and GRUs
• Language Representation Models – Word2Vec, Masked LM, BERT fundamentals
• Knowledge Distillation – Model compression and student–teacher frameworks
• t-SNE & Dimensionality Reduction – Visualization and clustering
• Conditional Language Models – Sequence-to-sequence models, attention mechanisms

Each question is structured to mirror real exam difficulty, ensuring you gain the conceptual and applied mastery needed for success in Quiz 5 and the final.

Ideal for:

• OMSCS students at Georgia Tech (CS7643 – Deep Learning)
• Graduate students in Machine Learning, NLP, or AI specializations
• Researchers and professionals looking to strengthen NLP and embedding knowledge
• Learners preparing for comprehensive exams or interviews involving deep learning architectures


Written for

Institution: CS7643 Quiz 5
Course: CS7643 Quiz 5

Document information

Uploaded on: November 2, 2025
Number of pages: 8
Written in: 2025/2026
Type: Exam (elaborations)
Contains: Questions & answers

Content preview

CS7643 Quiz 5 – Comprehensive Exam on
Embeddings, Graphs, RNNs, LSTMs,
Word2Vec, Masked Language Modeling,
Knowledge Distillation, t-SNE, and
Conditional Language Models (2025 Edition)
Overview:

This quiz assesses your understanding of advanced machine learning and natural language
processing concepts. Topics include:

• Embeddings and Graph Embeddings: Learning vector representations for entities and nodes, preserving similarity and structure for downstream tasks.
• Recurrent Neural Networks (RNNs) and LSTMs: Sequential modeling, vanishing/exploding gradients, and gate mechanisms (input, forget, output) to manage long-term dependencies.
• Word2Vec Models (Skip-Gram and CBOW): Learning word vectors, training objectives, negative sampling, and hierarchical softmax.
• Masked Language Modeling (MLM) and Teacher Forcing: Pre-training strategies for improved language model performance.
• Knowledge Distillation: Compressing large models into smaller ones while maintaining performance (see the sketch after this list).
• Evaluation of Embeddings: Intrinsic (analogy and similarity tasks) vs. extrinsic (downstream tasks) evaluation.
• Dimensionality Reduction and Visualization: Using t-SNE to map high-dimensional embeddings to 2D/3D space.
• Conditional Language Models: Predicting sequences of tokens conditioned on previous tokens and training strategies.
• Bias Mitigation: Debiasing embeddings to reduce gender or other societal biases.
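
To make the knowledge-distillation item above concrete, here is a minimal PyTorch-style sketch of a student–teacher loss. The function name, temperature T=2.0, and weighting alpha=0.5 are illustrative assumptions, not values taken from the course:

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: the student matches the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes stay comparable across temperatures
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

In practice the teacher runs in evaluation mode with gradients disabled, and only the smaller student model is updated with this combined loss.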




1. Which of the following defines an embedding?
A. A fixed-length input representation
B. A learned map from entities to vectors that encodes similarity
C. A linear classifier
D. A pre-trained RNN output
Answer: B
Rationale: Embeddings map entities into a vector space to capture
similarity.
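
As a minimal illustration of answer B, an embedding in a framework like PyTorch is just a trainable lookup table from ids to vectors; the vocabulary size, dimension, and example ids below are arbitrary choices for the sketch:

import torch
import torch.nn as nn
import torch.nn.functional as F

# A learned map from 10,000 entity / word ids to 128-dimensional vectors.
embed = nn.Embedding(num_embeddings=10_000, embedding_dim=128)

ids = torch.tensor([3, 17, 42])      # three entity ids
vectors = embed(ids)                 # shape: (3, 128), trained jointly with the model

# Similarity between entities can then be read off the learned vectors,
# e.g. via cosine similarity.
sim = F.cosine_similarity(vectors[0], vectors[1], dim=0)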

2. Graph embeddings aim to:
A. Reduce graph size
B. Encode connected nodes as more similar vectors than unconnected nodes
C. Train MLPs more efficiently
D. Compute word probabilities
Answer: B
Rationale: Embeddings preserve graph structure for downstream tasks.
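
One common way to get the behavior in answer B is the DeepWalk / node2vec recipe mentioned in the topic list: sample random walks and feed them to a skip-gram model so that co-visited (i.e., connected) nodes end up with similar vectors. The toy adjacency list and walk settings below are illustrative assumptions:

import random

# Toy undirected graph as an adjacency list (illustrative only).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def random_walk(start, length=5):
    # Truncated random walk starting at `start`; each walk is treated like a
    # "sentence" whose "words" are node ids.
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(graph[walk[-1]]))
    return walk

# A corpus of walks that a Word2Vec-style trainer (e.g., the skip-gram
# objective sketched under question 3) would consume.
walks = [random_walk(node) for node in graph for _ in range(10)]
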
3. Skip-Gram Word2Vec predicts:
A. Center word from context
B. Context words from a center word
C. Entire sentence probability
D. Edge embeddings in graphs
Answer: B
Rationale: Skip-Gram maximizes probability of context words given the
center.
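
A bare-bones version of the Skip-Gram objective, assuming a full softmax for clarity (the vocabulary size, dimension, and word ids are made up for the sketch):

import torch
import torch.nn as nn
import torch.nn.functional as F

V, D = 5_000, 100                     # vocabulary size and embedding dim (arbitrary)
in_embed = nn.Embedding(V, D)         # center-word ("input") vectors
out_embed = nn.Embedding(V, D)        # context-word ("output") vectors

center = torch.tensor([10])           # center-word id
context = torch.tensor([37])          # one context word observed in the window

# Score every vocabulary word as a possible context of the center word,
# then maximize the probability of the context word actually observed.
scores = in_embed(center) @ out_embed.weight.T   # shape: (1, V)
loss = F.cross_entropy(scores, context)          # -log p(context | center)
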
4. CBOW Word2Vec predicts:
A. Center word from context words
B. Context words from center word
C. Next word in sequence
D. Hidden state of RNN
Answer: A
Rationale: CBOW uses surrounding words to predict the center word.
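
The mirror-image CBOW objective, again sketched with a full softmax and made-up sizes and ids; note that the context vectors are averaged before predicting the center word:

import torch
import torch.nn as nn
import torch.nn.functional as F

V, D = 5_000, 100
in_embed = nn.Embedding(V, D)
out_embed = nn.Embedding(V, D)

context = torch.tensor([[12, 48, 7, 301]])   # window of context-word ids
center = torch.tensor([37])                  # the center word to predict

# Average the context vectors, score the whole vocabulary, and maximize
# the probability of the true center word.
ctx_vec = in_embed(context).mean(dim=1)      # shape: (1, D)
scores = ctx_vec @ out_embed.weight.T        # shape: (1, V)
loss = F.cross_entropy(scores, center)       # -log p(center | context)
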
5. Negative sampling in Word2Vec:
A. Reduces computation compared to full softmax
B. Increases vocabulary size
C. Ensures exact probabilities
D. Is used only in LSTMs
Answer: A
Rationale: Approximates softmax efficiently with a small set of negative
samples.
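
A rough sketch of the skip-gram-with-negative-sampling loss, which scores only the observed pair plus K sampled "noise" words instead of the full vocabulary. The sizes, ids, and uniform negative sampler are simplifying assumptions (the original Word2Vec paper samples negatives from a smoothed unigram distribution):

import torch
import torch.nn as nn
import torch.nn.functional as F

V, D, K = 5_000, 100, 5                  # vocab size, embedding dim, negatives per pair
in_embed = nn.Embedding(V, D)
out_embed = nn.Embedding(V, D)

center = torch.tensor([10])
pos_context = torch.tensor([37])
neg_context = torch.randint(0, V, (1, K))    # K sampled "noise" words

c = in_embed(center)                         # (1, D)
pos = out_embed(pos_context)                 # (1, D)
neg = out_embed(neg_context)                 # (1, K, D)

# Pull the true (center, context) pair together, push the K sampled pairs
# apart; only K + 1 dot products are computed instead of a softmax over V.
pos_loss = -F.logsigmoid((c * pos).sum(-1))
neg_loss = -F.logsigmoid(-(neg @ c.unsqueeze(-1)).squeeze(-1)).sum(-1)
loss = (pos_loss + neg_loss).mean()
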
6. Vanilla RNN training challenges:
A. Overfitting
B. Vanishing and exploding gradients
C. Lack of embeddings
D. Fixed input size
Answer: B
Rationale: Multiplicative effects of weights over time steps cause gradients
to vanish or explode.
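
The multiplicative effect named in the rationale can be demonstrated with a tiny experiment; the scaled-identity recurrent weight and 50-step horizon below are illustrative assumptions, not course code:

import torch

# Unrolling a linear recurrence h_t = W h_{t-1} for many steps makes gradients
# scale roughly like the norm of W raised to the number of time steps.
for scale in (0.5, 1.5):                 # contractive vs. expansive recurrent weight
    W = scale * torch.eye(4)
    h0 = torch.ones(4, requires_grad=True)
    h = h0
    for _ in range(50):                  # 50 "time steps"
        h = W @ h
    h.sum().backward()
    print(f"scale={scale}: gradient norm after 50 steps = {h0.grad.norm().item():.3e}")

# The 0.5 case prints a vanishingly small norm and the 1.5 case an enormous one.
# Exploding gradients are commonly controlled with clipping, e.g.
# torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0).
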
7. Input gate in LSTM:
A. Controls what information to forget
