APML Summary - Applications of Machine Learning

Summary of the APML course for Information Science at Utrecht University (UU)

Institution
Course










Whoops! We can’t load your doc right now. Try again or contact support.

Written for

Institution
Study
Course

Document information

Uploaded on
January 22, 2024
Number of pages
23
Written in
2022/2023
Type
Summary

Subjects

Content preview

Content
Lecture 2
Lecture 3
Lecture 4
Lecture 5
Lecture 6
Lecture 7
Lecture 8
Lecture 9
Lecture 10
Lecture 11
Lecture 12
Lecture 13
Lecture 14
Lecture 15
Lecture 16
Lecture 17 (guest lectures)
    Booking.com
    eScience center
Lecture 2
Decision tree

- Split the set of instances into subsets such that the variation within each subset becomes
smaller

Entropy = degree of uncertainty
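
For class proportions p_i, entropy is H = -sum(p_i * log2(p_i)): 0 for a pure node, 1 bit for a 50/50 binary node. A minimal sketch (my own helper, not course code) of computing a node's entropy from its class counts:

```python
import math

def entropy(class_counts):
    """Entropy (in bits) of a node, given the number of instances per class."""
    total = sum(class_counts)
    probs = [c / total for c in class_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

print(entropy([381, 0]))   # 0.0  (pure node)
print(entropy([50, 50]))   # 1.0  (maximum uncertainty for two classes)
print(entropy([326, 55]))  # ~0.60 (e.g. a node with 55 of 381 misclassified, as in the exercise below)
```

A split is good when the weighted entropy of the resulting subsets is lower than the entropy of the parent node.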

In the example tree, the first number in a node is the total number of instances classified there; the second number is how many of those are incorrectly classified.

Exercise

What is the tree depth? → 3; count the levels of splits (the squares in the tree).

Would you grow the tree further? → Yes; the 'young' node has 55 incorrectly classified instances out of 381, so further growing could prove useful.




Confusion matrix and measures




Quality measures

Error

- (FP + FN) / total
How many instances did the model classify incorrectly out of all instances?

Accuracy

- (TP + TN) / total
How many instances did the model classify correctly over all instances?

Precision

- TP / (TP + FP)
How many of the predicted positive instances are actually positive?

Recall

- TP / (TP + FN)
How many of the actual positive instances did the model identify?

F1 score

- 2 * (precision * recall) / (precision + recall)
A balance between precision and recall
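
As an illustrative sketch (not course code; the function and variable names are my own), all five measures can be computed directly from the confusion-matrix counts:

```python
def classification_metrics(tp, fp, fn, tn):
    """Compute the quality measures above from confusion-matrix counts."""
    total = tp + fp + fn + tn
    error = (fp + fn) / total
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * (precision * recall) / (precision + recall)
    return {"error": error, "accuracy": accuracy,
            "precision": precision, "recall": recall, "f1": f1}

# Example with made-up counts: 40 TP, 10 FP, 5 FN, 45 TN.
print(classification_metrics(tp=40, fp=10, fn=5, tn=45))
```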

Exercise

What setting would you choose for the tree size?

Answer → tree sizes between 10 and 20 give the best results; since a smaller tree is generally preferable, choose 10.




Lecture 3
Overfitting

- The model is too specific for the data set used to learn the model and performs poorly on
new instances
- High variance

Underfitting

- The model is too general and does not exploit the data
- High bias
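
A minimal sketch of how both show up in practice (my own example, not from the lecture), using scikit-learn decision trees of different depths: a large train/test gap indicates overfitting (high variance), low scores on both indicate underfitting (high bias).

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data set for illustration; any labelled data set would do.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for depth in (1, 5, None):  # very shallow, moderate, unlimited depth
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
    tree.fit(X_train, y_train)
    print(f"max_depth={depth}: "
          f"train={tree.score(X_train, y_train):.2f}, "
          f"test={tree.score(X_test, y_test):.2f}")

# Depth 1 tends to underfit (both scores low); unlimited depth tends to
# overfit (train score near 1.0, clearly lower test score).
```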
