IT 350 AI & Machine Learning Comprehensive Final Exam 2025 (With Solutions)

Rating: -
Sold: -
Pages: 30
Grade: A+
Uploaded on: 03-05-2025
Written in: 2024/2025


IT 350 Artificial Intelligence & Machine Learning

Comprehensive Final Exam (Qns & Ans)

2025


Question 1 (Multiple Choice)
Question:
Which optimization algorithm adapts its learning rates based on
estimates of first and second moments of gradients, making it
especially effective in training deep neural networks?
A) Stochastic Gradient Descent (SGD)
B) RMSProp
C) Adam
D) Adagrad


Correct ANS:
C) Adam
©2025

Rationale:
Adam (Adaptive Moment Estimation) calculates individual
adaptive learning rates for each parameter by using estimates of
the first (mean) and second (uncentered variance) moments of the
gradients. This method results in improved convergence on deep
learning tasks compared to standard SGD or even RMSProp.
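To make the rationale concrete, here is a minimal NumPy sketch of a single Adam update step. The hyperparameter defaults (`lr`, `beta1`, `beta2`, `eps`) follow the commonly cited values; the quadratic objective below is purely illustrative.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moment estimates of the gradient with bias correction."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                 # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step size
    return w, m, v

# Demo: minimize f(w) = w^2 starting from w = 5.0
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
```

Note how the step size is normalized by `sqrt(v_hat)`: parameters with consistently large gradients take proportionally smaller steps, which is the "adaptive learning rate" behavior the question refers to.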


---


Question 2 (Fill in the Blank)
Question:
The process of leveraging a pre-trained model on one task and
adapting it to a related but different task is known as ________ .


Correct ANS:
transfer learning


Rationale:
Transfer learning uses the knowledge gained from a source task to
improve learning in a target task. This technique is especially
useful when labeled data for the target task is limited and is
commonly applied in deep learning for computer vision and NLP.
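Illustrative only: in this NumPy sketch, a hypothetical fixed matrix `W_pretrained` stands in for the early layers of a network trained on a source task. Those weights stay frozen, and only a small new classification head is fitted on the (limited) target-task data, which is the core mechanic of transfer learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pre-trained" feature extractor: stands in for layers learned on a source task.
W_pretrained = rng.normal(size=(4, 8))

def extract_features(X):
    return np.tanh(X @ W_pretrained)  # frozen: never updated below

# Target task: a small labeled set; we train only a new linear head on top.
X_target = rng.normal(size=(20, 4))
y_target = (X_target[:, 0] > 0).astype(float)

F = extract_features(X_target)
head = np.zeros(8)
for _ in range(500):                    # gradient descent on the head only
    pred = 1 / (1 + np.exp(-F @ head))  # sigmoid
    grad = F.T @ (pred - y_target) / len(y_target)
    head -= 0.5 * grad

acc = ((1 / (1 + np.exp(-F @ head)) > 0.5) == y_target).mean()
```

Because only the head is trained, far fewer labeled target examples are needed than training the whole model from scratch would require.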


---


Question 3 (True/False)
Question:
True/False: Transformers rely purely on self-attention
mechanisms and completely abandon recurrence and convolution
in order to capture long-range dependencies in data.


Correct ANS:
True


Rationale:
Transformer architectures are built entirely on self-attention
mechanisms, allowing them to model relationships between all
input positions without using recurrent or convolutional layers.
This design enables them to efficiently capture long-range
dependencies with higher parallelism during training.
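The mechanism the rationale describes can be sketched in a few lines: scaled dot-product self-attention lets every position attend to every other position in one matrix product, with no recurrence or convolution. Shapes and weights below are arbitrary placeholders.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # all pairwise position scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ V                               # weighted mix of all values

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 16))                         # 5 tokens, 16-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
```

Since `scores` is computed for all position pairs at once, distant tokens interact in a single step, which is why long-range dependencies are captured without recurrence and why training parallelizes well.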


---


Question 4 (Multiple Response)

