Complete WEEK4 note: Machine Learning & Learning Algorithms (BM05BAM)


THIS IS A COMPLETE NOTE FROM ALL BOOKS + THE LECTURE! Save your time for internships and other courses by studying from this note! Are you a 1st/2nd-year Business Analytics Management student at RSM who wants to survive the block 2 Machine Learning module? Are you overwhelmed by 30 pages of reading every week, full of brand-new terms and formulas? If you don't know where to start, if you are struggling to keep up because of your other courses, or if you simply want to learn about Machine Learning, I've got you covered. I passed the course Machine Learning & Learning Algorithms at RSM with a 7.6, WITHOUT A TECHNICAL BACKGROUND before this Master. So if you come from a non-technical bachelor programme, this note will point you to the knowledge you should focus on to pass the exam and complete the assignments, and for people with some machine learning knowledge it will certainly make your life easier and give your grade a boost.

Document information

Published on
12 March 2024
Number of pages
13
Written in
2023/2024
Type
Lecture notes
Professor(s)
Jason Roos
Contains
All classes


Content preview

ISLR 8: Tree-Based Methods
Tree-based methods for regression and classification involve stratifying/segmenting the predictor space into a number of simple regions.

For prediction, we use the mean (regression) or mode (classification) of the response values of the training observations in the region to which the new observation belongs.

Decision tree: the set of splitting rules used to segment the predictor space can be summarized in a tree.
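Since the note itself contains no code, here is a minimal illustrative sketch (assuming scikit-learn and NumPy, which are not part of the original note) of how a regression tree stratifies a single predictor into regions and predicts the mean response within each region:

```python
# Minimal sketch, assuming scikit-learn and NumPy (not part of the original note).
# A regression tree stratifies the predictor space into regions and predicts the
# mean training response within each region.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))                                   # one predictor
y = np.where(X[:, 0] < 4.5, 5.1, 6.4) + rng.normal(0, 0.2, size=200)    # step-like response

tree = DecisionTreeRegressor(max_depth=2)     # shallow tree -> only a few regions
tree.fit(X, y)

print(export_text(tree, feature_names=["x"]))  # the splitting rules, one line per internal node
print(tree.predict([[3.0], [7.0]]))            # each prediction = mean y of the leaf's region
```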

8.1: The basics of decision trees
8.1.1 Regression trees

In the tree analogy, decision trees are typically drawn upside down.
Splitting rules are applied starting at the top of the tree.

For every observation that falls into a given region, we make the same prediction: the mean of the response values of the training observations in that region.

Decision tree: superficial similarities with K-NN
- K-NN is expensive because each prediction requires a comparison with the entire training set.
- What if we partitioned the training set ahead of time, summarized the partitions with simple rules, and pre-calculated the predictions? (See the sketch below.)
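To make that contrast concrete, a small hypothetical sketch (invented data and rule names, assuming NumPy): K-NN touches every training point at prediction time, while a pre-partitioned "tree" only consults a couple of pre-computed rules and region means.

```python
# Hypothetical sketch (invented data): prediction cost of K-NN vs. pre-computed tree rules.
import numpy as np

rng = np.random.default_rng(1)
X_train = rng.normal(size=(10_000, 5))
y_train = X_train[:, 0] + rng.normal(scale=0.1, size=10_000)

def knn_predict(x, k=5):
    # K-NN: compute the distance to ALL 10,000 training points for every query.
    dist = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argsort(dist)[:k]].mean()

# A tree partitions the training set once, up front; each region keeps a pre-calculated mean.
region_mean = {
    "x0 < 0":  y_train[X_train[:, 0] < 0].mean(),
    "x0 >= 0": y_train[X_train[:, 0] >= 0].mean(),
}

def tree_predict(x):
    # Prediction is just a rule lookup, with no pass over the training data.
    return region_mean["x0 < 0"] if x[0] < 0 else region_mean["x0 >= 0"]

x_new = rng.normal(size=5)
print(knn_predict(x_new), tree_predict(x_new))
```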

Purpose
Partition the feature space into regions where the values of the target variable are relatively homogeneous (or pure, in the classification-tree case); the sketch below illustrates both measures.
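As a rough illustration with made-up numbers (not from the note), homogeneity in a regression region is typically measured by the residual sum of squares (RSS) around the region mean, and purity in a classification region by the Gini index:

```python
# Sketch with made-up numbers: homogeneity (regression) vs. purity (classification).
import numpy as np

# Regression region: homogeneity measured by the RSS around the region mean.
y_region = np.array([5.0, 5.2, 4.9, 5.1])
rss = np.sum((y_region - y_region.mean()) ** 2)   # small RSS -> homogeneous region

# Classification region: purity measured by the Gini index, G = sum_k p_k * (1 - p_k).
labels = np.array(["yes", "yes", "yes", "no"])
_, counts = np.unique(labels, return_counts=True)
p = counts / counts.sum()
gini = np.sum(p * (1 - p))                        # 0 would mean a perfectly pure region

print(rss, gini)
```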

Decision tree terminology
Terminal nodes / leaves = the regions where all observations end up.
Internal nodes = points along the tree where the predictor space is split
- The first internal node (the top split) usually involves the most important predictor of the response.

Value shown at the top of each box = the predicted y value
Advantages:
- Interpretability
- Similarity to human decision-making
- No dummy variables required
- Good “base learners”
- Can be displayed visually and intuitively
- Easy to handle qualitative predictors without creating dummy variables
- Can capture non-linear relationships by splitting on the same feature multiple times

Disadvantages
- Not competitive in terms of prediction accuracy (aggregating many trees can improve it)
- Trees can be very non-robust: a small change in the data can cause a large change in the final estimated tree

Process: greedy approach
It is computationally infeasible to consider every possible partition of the feature space into J boxes.
- To build the decision tree, we therefore take a top-down, greedy approach known as recursive binary splitting.
  - It begins at the top of the tree and then successively splits the predictor space, each split producing two new branches.
  - It is greedy because the best split is made at each step, rather than looking ahead and picking a split that would lead to a better tree in some future step. (See the sketch after this list.)
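The sketch below is a deliberately simplified illustration of recursive binary splitting for a regression tree (RSS criterion, made-up stopping rules, no pruning), not the exact ISLR algorithm: at each node it greedily scans every predictor j and cutpoint s, keeps the split that minimizes the combined RSS of the two halves, and then recurses on both branches.

```python
# Simplified sketch of greedy recursive binary splitting for a regression tree
# (RSS criterion, illustrative stopping rules, no pruning).
import numpy as np

def rss(y):
    # Residual sum of squares around the region mean.
    return float(np.sum((y - y.mean()) ** 2)) if len(y) else 0.0

def best_split(X, y):
    # Greedy step: scan every predictor j and cutpoint s, return the split that
    # minimizes RSS(left) + RSS(right). No look-ahead to future splits.
    best = None
    for j in range(X.shape[1]):
        for s in np.unique(X[:, j]):
            left = X[:, j] < s
            if left.sum() == 0 or left.sum() == len(y):
                continue
            total = rss(y[left]) + rss(y[~left])
            if best is None or total < best[0]:
                best = (total, j, s)
    return best  # (total RSS, predictor index, cutpoint) or None

def grow(X, y, depth=0, max_depth=3, min_size=5):
    # Top-down: start with all observations at the root, split, recurse on both branches.
    split = best_split(X, y)
    if depth >= max_depth or len(y) < min_size or split is None:
        return {"leaf": True, "pred": y.mean()}   # terminal node: mean response of the region
    _, j, s = split
    left = X[:, j] < s
    return {"leaf": False, "j": j, "s": s,
            "left": grow(X[left], y[left], depth + 1, max_depth, min_size),
            "right": grow(X[~left], y[~left], depth + 1, max_depth, min_size)}

def predict(node, x):
    # Follow the splitting rules from the root down to a terminal node.
    while not node["leaf"]:
        node = node["left"] if x[node["j"]] < node["s"] else node["right"]
    return node["pred"]
```

Usage would look like model = grow(X_train, y_train) followed by predict(model, x_new); the terminal node reached by x_new supplies the mean response of its region, exactly as described above.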
Example: predicting the salary of baseball players
1. First split: Years < 4.5?
2. For players with Years < 4.5, the prediction is the mean response value for those players (e.g. a mean log salary of 5.107, i.e. about $165,174).
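To see where $165,174 comes from (in the ISLR Hitters data, Salary is recorded in thousands of dollars and the tree is fit to log-transformed salary), a quick back-of-the-envelope check:

```python
# Back-transforming the example's predicted mean log salary into dollars.
# Salary in the ISLR Hitters data is recorded in thousands of dollars.
import math

mean_log_salary = 5.107
predicted_salary = math.exp(mean_log_salary) * 1000  # undo the log, convert thousands -> dollars
print(round(predicted_salary))  # roughly 165,000; ISLR reports $165,174 using the unrounded mean
```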