
Part 1 (midterm) summary Business Analytics: Week 1 - Week 3


Summary for the midterm of the course 'Business Analytics'. Includes all the reading material for weeks 1, 2, and 3. Week 1 --> read chapters 2.1-2.2. Week 2 --> read chapters 10.1 and 10.3. Week 3 --> read chapters 3.1-3.3 and 3.5. Also, check out my free summary of the knowledge clips from week 1 to week 3!


Document information

Whole book summarized?
No
Which parts of the book are summarized?
2.1, 2.2, 10.1, 10.3, 3.1, 3.2, 3.3, 3.5
Uploaded on
11 November 2020
File last updated on
1 December 2020
Number of pages
22
Written in
2020/2021
Type
Summary

Preview of the content

Summary midterm Business Analytics week 1 - week 3

WEEK 1 CHAPTERS

2.1 What Is Statistical Learning?

Input variables: e.g. advertising budgets. Typically denoted using the symbol X, with a subscript to distinguish them. X1: the TV budget, X2: the radio budget, etc.
→ also called predictors, independent variables, features, or sometimes just variables

Output variable: e.g. sales. Typically denoted using the symbol Y.
→ also called the response variable or dependent variable

We assume that there is some relationship between Y and X = (X1, X2, ..., Xp), which can be written in the very general form:

Y = f(X) + ϵ

Here f is some fixed but unknown function of X1, ..., Xp, and ϵ is a random error term, which is independent of X and has mean zero. f represents the systematic information that X provides about Y.

In essence, statistical learning refers to a set of approaches for estimating f.
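To make the setup concrete, here is a minimal Python sketch (my own illustration, not from the book; the particular f and noise level are arbitrary) that simulates data from the model Y = f(X) + ϵ. Every method in this chapter is, in one way or another, a recipe for recovering f from data like this.

```python
import numpy as np

rng = np.random.default_rng(42)

def f(x):
    # The "true" (normally unknown) systematic relationship between X and Y.
    return 3.0 + 2.0 * np.sin(x)

n = 200
X = rng.uniform(0.0, 10.0, size=n)            # observed inputs (predictors)
eps = rng.normal(loc=0.0, scale=1.0, size=n)  # random error: independent of X, mean zero
Y = f(X) + eps                                # the model Y = f(X) + eps

print(Y[:5])  # a few simulated responses; statistical learning = estimating f from (X, Y)
```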

2.1.1 Why Estimate f

Prediction
To predict Y when the inputs X are readily available but the output Y cannot easily be obtained. Since the error term averages to zero, we can predict Y using

Ŷ = f̂(X)

Here f̂ represents our estimate for f and Ŷ represents the resulting prediction for Y. f̂ is often treated as a black box: we are not typically concerned with the exact form of f̂, provided that it yields accurate predictions for Y.

The accuracy of Ŷ depends on two quantities:
● Reducible error: f̂ will not be a perfect estimate for f, and this inaccuracy will introduce some error. It is reducible because we can potentially improve the accuracy of f̂ by using the most appropriate statistical learning technique.
● Irreducible error: Y is also a function of ϵ, which cannot be predicted using X. Therefore this error also affects the accuracy of our predictions. No matter how well we estimate f, we cannot reduce the error introduced by ϵ.
   ○ Why is the irreducible error larger than zero?
      ■ Unmeasured variables
      ■ Unmeasurable variation





E(Y − Ŷ)² represents the average, or expected value, of the squared difference between the predicted and actual value of Y, and Var(ϵ) represents the variance associated with the error term. The expected squared prediction error decomposes as

E(Y − Ŷ)² = E[f(X) + ϵ − f̂(X)]² = [f(X) − f̂(X)]² + Var(ϵ)

where the first term is the reducible error and Var(ϵ) is the irreducible error.
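A small simulation (my own sketch, not from the book) makes this decomposition tangible: even predictions made with the true f still incur roughly Var(ϵ) of squared error, while an imperfect estimate f̂ adds reducible error on top.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                      # true function
    return 3.0 + 2.0 * np.sin(x)

def f_hat(x):                  # a deliberately imperfect (linear) estimate of f
    return 3.0 + 0.2 * x

sigma = 1.0                    # sd of the error term, so Var(eps) = 1.0
X = rng.uniform(0.0, 10.0, size=100_000)
Y = f(X) + rng.normal(0.0, sigma, size=X.size)

mse_true = np.mean((Y - f(X)) ** 2)      # only irreducible error remains: about Var(eps)
mse_hat = np.mean((Y - f_hat(X)) ** 2)   # irreducible plus reducible error

print(f"MSE with true f: {mse_true:.3f}  (close to Var(eps) = {sigma**2})")
print(f"MSE with f_hat : {mse_hat:.3f}  (larger: adds reducible error)")
```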

Inference
To understand the relationship between X and Y, that is, to understand how Y changes as a function of X1, ..., Xp. Now f cannot be treated as a black box, because we need to know its exact form.
● Which predictors are associated with the response?
● What is the relationship between the response and each predictor?
● Can the relationship between Y and each predictor be adequately summarized using a linear equation, or is the relationship more complicated?

Linear models allow for relatively simple and interpretable inferences, but may not yield as accurate predictions as some other approaches.
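As a hedged illustration of inference with a linear model, the sketch below (assuming scikit-learn and synthetic advertising-style data; the variable names and true coefficients are my own) fits a regression and reads off the coefficients, which is exactly the kind of question-answering that needs the explicit form of f.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 500

# Hypothetical predictors in the spirit of the advertising example: TV and radio budgets.
tv = rng.uniform(0, 300, n)
radio = rng.uniform(0, 50, n)
sales = 3.0 + 0.045 * tv + 0.18 * radio + rng.normal(0, 1.0, n)

X = np.column_stack([tv, radio])
model = LinearRegression().fit(X, sales)

# For inference we inspect the fitted coefficients, not just the predictions.
for name, coef in zip(["TV", "radio"], model.coef_):
    print(f"estimated effect of {name}: {coef:.3f}")
print(f"intercept: {model.intercept_:.3f}")
```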

2.1.2 How do we estimate f?

Training data: observations that will be used to train our method how to estimate f.
Let xij represent the value of the jth predictor, or input, for observation i, where i = 1, 2, ..., n and j = 1, 2, ..., p. Correspondingly, let yi represent the response variable for the ith observation. Then our training data consist of {(x1, y1), (x2, y2), ..., (xn, yn)} where xi = (xi1, xi2, ..., xip)T.
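In code this notation usually maps to an n × p input matrix and a length-n response vector; the sketch below (my own convention, not prescribed by the book) shows the correspondence.

```python
import numpy as np

n, p = 5, 3                      # n observations, p predictors
rng = np.random.default_rng(7)

X = rng.normal(size=(n, p))      # X[i, j] = x_ij: value of the j-th predictor for observation i
y = rng.normal(size=n)           # y[i]    = y_i : response for observation i

x_2 = X[1]                       # x_2 = (x_21, ..., x_2p), the second training observation
print(x_2, y[1])                 # one (x_i, y_i) pair from the training data
```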

Parametric methods
Involve a two-step model-based approach.
1. Make an assumption about the functional form, or shape, of f. Example assumption, the linear model:

   f(X) = β0 + β1X1 + β2X2 + ... + βpXp

   Only the p + 1 coefficients β0, β1, ..., βp have to be estimated.
2. Now we need a procedure that uses the training data to fit or train the model. We want to find values of the coefficients such that

   Y ≈ β0 + β1X1 + β2X2 + ... + βpXp

The most common approach is (ordinary) least squares.
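To make step 2 concrete, here is a minimal ordinary least squares fit on simulated data (a sketch using NumPy's lstsq; the true coefficients are made up for illustration).

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 2

X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0])
y = 0.5 + X @ beta_true + rng.normal(0, 0.3, n)    # linear model plus noise

# Ordinary least squares: choose the coefficients that minimize the sum of squared residuals.
X_design = np.column_stack([np.ones(n), X])        # prepend a column of ones for beta_0
beta_hat, *_ = np.linalg.lstsq(X_design, y, rcond=None)

print("estimated coefficients (beta_0, beta_1, beta_2):", np.round(beta_hat, 3))
```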

The potential disadvantage of a parametric approach is that the model we choose will usually not match the true unknown form of f. We can try to address this problem by choosing flexible models that can fit many different possible functional forms for f.
→ Flexible models can lead to overfitting the data. This means they follow the errors, or noise, too closely.
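Overfitting is easy to demonstrate with polynomial fits of increasing flexibility (a sketch of my own, with degrees and sample sizes chosen arbitrarily): the most flexible fit tracks the training noise and performs worse on fresh test data.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):
    return np.sin(x)

def make_data(n):
    x = rng.uniform(0, 6, n)
    return x, f(x) + rng.normal(0, 0.3, n)

x_train, y_train = make_data(20)
x_test, y_test = make_data(1000)   # fresh data the models never saw

for degree in (1, 3, 10):          # degree 10 is far too flexible for 20 points
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```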





Non-parametric methods
Do not make explicit assumptions about the functional form of f. They seek an estimate of f that gets as close to the data points as possible without being too rough or wiggly.

Advantage: they have the potential to accurately fit a wider range of possible shapes for f, and they avoid the danger of the estimate of f being very different from the true f.
Disadvantage: they do not reduce the problem of estimating f to a small number of parameters, so a very large number of observations is required in order to obtain an accurate estimate for f.

A thin-plate spline can be used to estimate f. It does not impose any pre-specified model on f. It instead attempts to produce an estimate for f that is as close as possible to the observed data, subject to the fit being smooth.
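As a rough sketch of such a fit (assuming SciPy's RBFInterpolator, which offers a thin-plate-spline kernel; this example is mine, not the book's), note how a smoothing parameter keeps the estimate from chasing every data point.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(5)

# Two predictors and a non-linear true surface, observed with noise.
X = rng.uniform(-3, 3, size=(100, 2))
y = np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1]) + rng.normal(0, 0.1, 100)

# Thin-plate spline fit: no pre-specified functional form for f.
# smoothing > 0 keeps the estimate from following every data point exactly (less wiggly).
tps = RBFInterpolator(X, y, kernel="thin_plate_spline", smoothing=1.0)

X_new = np.array([[0.0, 0.0], [1.0, -1.0]])
print(tps(X_new))   # estimated f at new input values
```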

2.1.3 The trade-off between prediction accuracy and model interpretability
Why would we ever choose a more restrictive method (e.g. a linear model) instead of a very flexible approach (e.g. a thin-plate spline)? → When we are mainly interested in inference, restrictive models are much more interpretable.
A restrictive approach is more interpretable because it is less complex; a flexible approach, by contrast, can be much harder to interpret.




Trade-off between flexibility and interpretability of different statistical learning methods:
Lasso: relies upon the linear model but uses an alternative fitting procedure for estimating the coefficients. More restrictive and therefore less flexible and more interpretable (see the sketch after this list).
Least squares linear regression: relatively inflexible but quite interpretable.
Generalized additive models (GAMs): extend the linear model to allow for certain non-linear relationships. More flexible than linear regression and less interpretable.
Bagging, boosting, and support vector machines: highly flexible approaches that are harder to interpret.
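To see why the lasso sits at the interpretable end of this trade-off, the sketch below (assuming scikit-learn; the penalty strength is arbitrary) shows it shrinking irrelevant coefficients to exactly zero, leaving a simpler model than plain least squares.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(6)
n, p = 200, 10

X = rng.normal(size=(n, p))
# Only the first two predictors actually matter in this simulated data.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 0.5, n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("least squares coefficients:", np.round(ols.coef_, 2))
print("lasso coefficients        :", np.round(lasso.coef_, 2))
print("predictors kept by lasso  :", int(np.sum(lasso.coef_ != 0)))
```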

2.1.4 Supervised versus Unsupervised Learning

Supervised learning: for each observation of the predictor measurements there is an associated response measurement.
Unsupervised learning: when we lack a response variable (Y) that can supervise our analysis.
E.g. cluster analysis: the goal is to ascertain whether the observations fall into relatively distinct groups.
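As a hedged example of unsupervised learning (a sketch assuming scikit-learn's KMeans; the data are simulated), note that the fit uses only the input measurements, with no response Y to supervise it.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)

# Observations only, no response variable: two blobs of points in 2-D.
group_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
group_b = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))
X = np.vstack([group_a, group_b])

# Cluster analysis: ask whether the observations form relatively distinct groups.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_[:5], kmeans.labels_[-5:])   # cluster assignment for each observation
```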
