Complete WEEK5 note: Machine Learning & Learning Algorithms (BM05BAM)


THIS IS A COMPLETE NOTE COVERING ALL THE BOOKS + THE LECTURE! Save your time for internships and other courses by studying from this note. Are you a 1st/2nd-year Business Analytics Management student at RSM who wants to survive the block 2 Machine Learning module? Are you overwhelmed by 30 pages of reading every week, full of brand-new terms and formulas? If you are lost about where to start, if you are struggling to keep up because of your other courses, or if you simply want to learn about Machine Learning, I have got you covered. I passed the course Machine Learning & Learning Algorithms at RSM with a 7.6, WITHOUT A TECHNICAL BACKGROUND before this Master. So if you come from a non-technical bachelor programme, this note will point you to the knowledge you should focus on to pass the exam and complete the assignments, and if you already have some machine learning knowledge, it will certainly make your life easier and give your grade a boost.

Document information

Uploaded on: March 12, 2024
Number of pages: 6
Written in: 2023/2024
Type: Class notes
Professor(s): Jason Roos
Contains: All classes

Content preview

5.2 The Bootstrap
Bootstrap: a method to obtain new sample sets without collecting independent
data sets from the population, by repeatedly sampling observations from the
original data set.

It can be applied to a wide range of statistical learning methods, including ones
for which a measure of variability is otherwise difficult to obtain and is not
automatically output by statistical software.

Estimates computed on bootstrapped data (e.g. accuracy) can perform very well,
even comparably to estimates based on simulated data sets drawn from the true
population.

Purpose
The bootstrap can be used to estimate and quantify the uncertainty associated
with a given estimator or statistical learning method.

Process
1. Assume our sample is representative of the population of interest.
2. Randomly select n observations from the data set to produce a
bootstrap data set, Z*1.

The sampling is performed with replacement: the same observation can occur more
than once in the bootstrap data set.




We can use Z*1 to produce a new bootstrap estimate of the quantity of interest (e.g.
the accuracy, denoted alpha*1). Repeating this B times yields bootstrap estimates
alpha*1, ..., alpha*B, whose spread quantifies the uncertainty of the original estimate.
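
A minimal sketch of this process in Python (my own illustration, not from the lecture; it assumes numpy is available, and the toy data, the bootstrap_se helper, and the choice of the mean as the statistic are made up for illustration). It resamples the data with replacement B times and uses the spread of the bootstrap estimates as a standard-error estimate.

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)   # stand-in for the original sample Z

def bootstrap_se(sample, statistic, B=1000):
    n = len(sample)
    estimates = np.empty(B)
    for b in range(B):
        # draw n observations WITH replacement -> bootstrap data set Z*b
        z_star = rng.choice(sample, size=n, replace=True)
        estimates[b] = statistic(z_star)          # bootstrap estimate alpha*b
    # the spread of alpha*1, ..., alpha*B approximates the standard error of the statistic
    return estimates.std(ddof=1)

print("bootstrap SE of the mean:", bootstrap_se(data, np.mean))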

8.2 Bagging, Random Forests, Boosting
Ensemble method: an approach that combines many simple "building block"
models (weak learners) to obtain a single model with strong prediction performance.

Decision trees have low bias but high variance. Averaging many trees reduces the variance.
(The bias is low because a deep tree can capture the information in interacting features.)

Ensemble methods that use regression/classification trees as building blocks (a small sketch follows the list):
1) Bagging
2) Random forests
3) Boosting
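
For orientation, here is a small sketch (my own, not from the lecture) showing how each of the three methods can be tried out with scikit-learn, assuming it is installed; the toy data and parameter choices are made up for illustration.

import numpy as np
from sklearn.ensemble import BaggingRegressor, RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))                              # toy features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)

models = {
    "1) bagging": BaggingRegressor(n_estimators=100),              # bagged trees (a decision tree is the default base model)
    "2) random forest": RandomForestRegressor(n_estimators=100, max_features="sqrt"),  # bagging + a random subset of features at each split
    "3) boosting": GradientBoostingRegressor(n_estimators=100),    # shallow trees fit sequentially, each on the previous residuals
}
for name, model in models.items():
    model.fit(X, y)
    print(name, model.predict(X[:3]).round(2))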

8.2.1 Bagging
Bagging: a general-purpose procedure for reducing the variance of a statistical learning method.
Bagging uses bootstrapping: it takes repeated samples from the single training data set,
builds a separate prediction model on each bootstrapped sample, and averages the resulting
predictions, which leads to low variance.

Bagged trees are grown deep and left unpruned.

Bagging can be used with many regression methods, but it is particularly useful for decision trees.

Advantages
1. Low bias
   a. Trees are grown deep and not pruned: thanks to the variance reduction from averaging,
      each tree can be fit closely to its bootstrapped data set.
2. Low variance
   a. Averaging the trees (possibly hundreds or thousands of them!) built on bootstrapped
      training data reduces the variance, because the prediction does not rely on any single tree.
Disadvantages
1. The resulting model can be difficult to interpret.
2. When variable importances vary strongly among the predictors, the bagged trees can be highly
   correlated, which makes the variance reduction less effective (this motivates Random Forests).

Process
1. Create B bootstrapped training data sets.
2. Construct B regression trees, fitting the bth tree on the bth bootstrapped training set to
   obtain the prediction f*b(x).
3. For regression trees: average them to obtain a single low-variance statistical learning
   model, given by
   f_bag(x) = (1/B) * sum over b = 1, ..., B of f*b(x)
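
A hand-rolled sketch of these three steps (my own illustration, not from the notes; it assumes numpy and scikit-learn are available, and the toy data and the bagged_trees/predict_bagged helper names are made up):

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))                              # toy features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)

def bagged_trees(X, y, B=100):
    # Steps 1 + 2: fit one deep, unpruned tree per bootstrapped training set
    n = len(y)
    trees = []
    for b in range(B):
        idx = rng.integers(0, n, size=n)       # bootstrap indices (sampling with replacement)
        tree = DecisionTreeRegressor()         # defaults: grown deep, no pruning
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def predict_bagged(trees, X_new):
    # Step 3: f_bag(x) = (1/B) * sum_b f*b(x), the average of the B tree predictions
    return np.mean([t.predict(X_new) for t in trees], axis=0)

trees = bagged_trees(X, y, B=100)
print(predict_bagged(trees, X[:5]))

In practice the same idea is packaged in scikit-learn's BaggingRegressor; writing it out by hand just makes the bootstrap-then-average structure explicit.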