STATISTICAL INFERENCE & DECISION THEORY
Bayes' theorem: $\pi(\theta \mid x) = \dfrac{\pi(\theta)\,L(\theta; x)}{m(x)}$, where $m(x) = \int_\Theta \pi(\theta)\,L(\theta; x)\,d\theta$ is the marginal likelihood.
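A minimal numerical sketch of this update on a discrete parameter grid; the grid values, prior weights, and binomial data below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import binom

# Bayes' theorem on a discrete grid of candidate theta values.
theta = np.array([0.2, 0.5, 0.8])        # candidate parameter values (illustrative)
prior = np.array([0.3, 0.4, 0.3])        # pi(theta)

like = binom.pmf(7, 10, theta)           # L(theta; x): 7 successes in 10 trials
m_x = np.sum(prior * like)               # marginal likelihood m(x)
posterior = prior * like / m_x           # pi(theta | x)
print(posterior, posterior.sum())        # posterior sums to 1
```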
Hypothesis testing
$H_0: \theta = \theta_0$ vs $H_1: \theta = \theta_1$.
Prior: $\pi(\theta_0) = \pi_0$ and $\pi(\theta_1) = \pi_1$, with $\pi_0 + \pi_1 = 1$.
· e.g. odds of $H_1$ being true $= 3{:}1$ means the probability of $H_1$ is $\pi_1 = 0.75$.
Posterior probability that $H_1$ is true (Bayes):
$\pi(\theta_1 \mid x) = \dfrac{\pi_1 L(\theta_1; x)}{\pi_0 L(\theta_0; x) + \pi_1 L(\theta_1; x)}$
Odds ratio: $\dfrac{\pi(\theta_1 \mid x)}{\pi(\theta_0 \mid x)} = \dfrac{\pi_1}{\pi_0} \cdot \dfrac{L(\theta_1; x)}{L(\theta_0; x)}$
· $C_{01}$: cost of a type I error
· $C_{10}$: cost of a type II error
Posterior expected losses: $C_{01}\,\pi(\theta_0 \mid x)$ vs $C_{10}\,\pi(\theta_1 \mid x)$.
Accept $H_1$ if $C_{01}\,\pi(\theta_0 \mid x) < C_{10}\,\pi(\theta_1 \mid x)$, i.e. if
$\dfrac{C_{01}}{C_{10}} < \dfrac{\pi(\theta_1 \mid x)}{\pi(\theta_0 \mid x)}$.
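As a sketch, the whole test can be run numerically; the two hypotheses, priors, costs, and binomial data below are illustrative assumptions:

```python
from scipy.stats import binom

pi0, pi1 = 0.5, 0.5        # prior probabilities of H0, H1 (illustrative)
C01, C10 = 1.0, 2.0        # costs of type I and type II errors (illustrative)
x, n = 14, 20              # observed successes in n trials

L0 = binom.pmf(x, n, 0.5)  # L(theta0; x) under H0: theta = 0.5
L1 = binom.pmf(x, n, 0.7)  # L(theta1; x) under H1: theta = 0.7

post1 = pi1 * L1 / (pi0 * L0 + pi1 * L1)   # pi(theta1 | x)
post0 = 1.0 - post1                        # pi(theta0 | x)

# Accept H1 when the posterior odds exceed the cost ratio C01/C10.
print("posterior odds:", post1 / post0)
print("accept H1:", post1 / post0 > C01 / C10)
```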
Bayesian Estimation
$\hat{\theta} = \operatorname*{argmin}_a\, E[L(\theta, a)]$, where the expectation can be taken under either the posterior or the prior.
Squared error loss function: $L(\theta, a) = (\theta - a)^2$ results in $\hat{\theta} = E[\theta \mid x]$, the Bayes optimal estimator.
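A quick numerical check that the posterior mean is the argmin under squared error loss; the discrete posterior here is an arbitrary illustrative example:

```python
import numpy as np

theta = np.linspace(0.0, 1.0, 201)             # parameter grid
post = np.exp(-(theta - 0.6) ** 2 / 0.02)      # unnormalized posterior (illustrative)
post /= post.sum()

actions = np.linspace(0.0, 1.0, 2001)
risk = [(post * (theta - a) ** 2).sum() for a in actions]  # E[(theta - a)^2 | x]
print(actions[np.argmin(risk)], (post * theta).sum())      # argmin matches E[theta | x]
```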
· Credibility interval: an interval $(a, b)$ reported with the probability that $\theta$ lies in it,
$P(a < \theta < b \mid x) = \int_a^b \pi(\theta \mid x)\, d\theta$.
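For example, a 95% equal-tailed credibility interval just inverts the posterior CDF at 2.5% and 97.5%; the Beta(8, 4) posterior here is an illustrative assumption:

```python
from scipy.stats import beta

lo, hi = beta.ppf([0.025, 0.975], 8, 4)   # posterior quantiles of Beta(8, 4)
print(f"P({lo:.3f} < theta < {hi:.3f} | x) = 0.95")
```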
Binomial sampling, $\theta \in (0, 1)$:
· Choose a uniform prior, $\pi(\theta) \propto 1$, if all values of $\theta$ are equally probable.
· Choose a beta prior, $\pi(\theta) \propto \theta^{\alpha-1}(1-\theta)^{\beta-1}$, if some values of $\theta$ are more likely than others; this is the conjugate prior.
· Beta prior $\Rightarrow \theta \mid x \sim \mathrm{Beta}(\alpha + x,\ \beta + n - x)$.
· Natural Bayesian estimator: $E[\theta \mid x]$ is an average between the prior mean and the MLE from the data:
$E[\theta \mid x] = \dfrac{\alpha + x}{\alpha + \beta + n} = \dfrac{\alpha + \beta}{\alpha + \beta + n} \cdot \dfrac{\alpha}{\alpha + \beta} + \dfrac{n}{\alpha + \beta + n} \cdot \dfrac{x}{n}$
· Note: if $\alpha + \beta = 0$, all inference should be based on the MLE:
$E[\theta \mid x] = \dfrac{x}{n} = \hat{\theta}$ and $\operatorname{Var}(\theta \mid x) = \dfrac{\hat{\theta}(1 - \hat{\theta})}{n + 1}$.
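A sketch of the beta-binomial update and the weighted-average identity above; the hyperparameters and data are illustrative:

```python
a, b = 2.0, 2.0              # Beta(alpha, beta) prior hyperparameters (illustrative)
x, n = 15, 20                # successes and number of trials (illustrative)

post_a, post_b = a + x, b + n - x            # theta | x ~ Beta(a + x, b + n - x)
post_mean = post_a / (post_a + post_b)       # E[theta | x]

w = (a + b) / (a + b + n)                    # weight on the prior mean
blend = w * a / (a + b) + (1 - w) * x / n    # prior mean blended with MLE x/n
print(post_mean, blend)                      # identical values
```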
TECHNIQUES OF BAYESIAN ANALYSIS
Sampling from the Normal Distribution
Parameterize by the precision $\tau = 1/\sigma^2$:
$f(x \mid \mu, \tau) = \left(\dfrac{\tau}{2\pi}\right)^{1/2} \exp\!\left(-\dfrac{\tau}{2}(x - \mu)^2\right)$
$L(\mu, \tau; x) = \prod_{i=1}^n \left(\dfrac{\tau}{2\pi}\right)^{1/2} \exp\!\left(-\dfrac{\tau}{2}(x_i - \mu)^2\right) \propto \tau^{n/2} \exp\!\left(-\dfrac{\tau}{2} \sum_{i=1}^n (x_i - \mu)^2\right)$
$\sum_i (x_i - \mu)^2 = \sum_i (x_i - \bar{x} + \bar{x} - \mu)^2 = \sum_i (x_i - \bar{x})^2 + n(\mu - \bar{x})^2 = S_{xx} + n(\mu - \bar{x})^2$,
where $S_{xx} = \sum_i (x_i - \bar{x})^2$ is the sample sum of squares.
Joint posterior:
$\pi(\mu, \tau \mid x) \propto \pi(\mu, \tau)\, \tau^{n/2} \exp\!\left(-\dfrac{\tau S_{xx}}{2}\right) \exp\!\left(-\dfrac{n\tau}{2}(\mu - \bar{x})^2\right)$
Independence assumption, $\pi(\mu, \tau) = \pi(\mu)\,\pi(\tau)$:
$\pi(\mu \mid \tau, x) \propto \pi(\mu) \exp\!\left(-\dfrac{n\tau}{2}(\mu - \bar{x})^2\right)$ looks normal (specify a normal conjugate prior)
$\pi(\tau \mid \mu, x) \propto \pi(\tau)\, \tau^{n/2} \exp\!\left(-\dfrac{\tau}{2} \sum_i (x_i - \mu)^2\right)$ looks gamma (specify a gamma conjugate prior)
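Since both full conditionals have standard forms, the joint posterior can be explored by Gibbs sampling. A minimal sketch, assuming the conjugate priors $\mu \sim N(m, 1/b)$ and $\tau \sim \mathrm{Gamma}(a_0, b_0)$ worked out in the two subsections below; all hyperparameters and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=50)        # synthetic data: true mu = 5, sigma = 2
n, xbar = len(x), x.mean()
m, b = 0.0, 0.01                         # mu ~ N(m, 1/b): prior mean and precision
a0, b0 = 1.0, 1.0                        # tau ~ Gamma(a0, b0)

mu, tau, draws = xbar, 1.0, []
for _ in range(5000):
    # mu | tau, x ~ N((b*m + n*tau*xbar)/(b + n*tau), 1/(b + n*tau))
    prec = b + n * tau
    mu = rng.normal((b * m + n * tau * xbar) / prec, prec ** -0.5)
    # tau | mu, x ~ Gamma(a0 + n/2, rate = b0 + sum((x - mu)^2)/2)
    tau = rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * np.sum((x - mu) ** 2)))
    draws.append((mu, tau))

mu_s, tau_s = np.array(draws[1000:]).T   # discard burn-in
print(mu_s.mean(), (1.0 / tau_s).mean()) # posterior means of mu and sigma^2
```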
Unknown mean and known variance:
$\pi(\mu \mid x) \propto \pi(\mu) \exp\!\left(-\dfrac{n\tau}{2}(\mu - \bar{x})^2\right)$
Conjugate prior $\mu \sim N(m, 1/b)$, i.e. prior precision $b$: $\pi(\mu) \propto \exp\!\left(-\dfrac{b}{2}(\mu - m)^2\right)$
$\pi(\mu \mid x) \propto \exp\!\left(-\dfrac{1}{2}\left[n\tau(\mu - \bar{x})^2 + b(\mu - m)^2\right]\right)$
$n\tau(\mu - \bar{x})^2 + b(\mu - m)^2 = (b + n\tau)\mu^2 - 2(bm + n\tau\bar{x})\mu + \cdots = (b + n\tau)\left(\mu - \dfrac{bm + n\tau\bar{x}}{b + n\tau}\right)^2 + \cdots$
$\Rightarrow \mu \mid x \sim N\!\left(\dfrac{bm + n\tau\bar{x}}{b + n\tau},\ \dfrac{1}{b + n\tau}\right)$
Posterior precision $= b + n\tau$: prior precision plus data precision.
$E[\mu \mid x] = \dfrac{bm + n\tau\bar{x}}{b + n\tau}$, a precision-weighted average of the prior mean $m$ and the sample mean $\bar{x}$.
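The closed-form update in code; data and hyperparameters are illustrative:

```python
import numpy as np

x = np.array([4.1, 5.3, 4.8, 5.9, 5.2])   # illustrative sample
n, xbar, tau = len(x), x.mean(), 1.0       # known precision tau
m, b = 0.0, 0.5                            # prior: mu ~ N(m, 1/b)

post_prec = b + n * tau                    # precisions add
post_mean = (b * m + n * tau * xbar) / post_prec
print(post_mean, 1.0 / post_prec)          # E[mu | x], Var(mu | x)
```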
Unknown variance and known mean:
$\pi(\tau \mid x) \propto \pi(\tau)\, \tau^{n/2} \exp\!\left(-\dfrac{\tau}{2} \sum_i (x_i - \mu)^2\right)$
Note: $S_\mu^2 = \dfrac{1}{n} \sum_i (x_i - \mu)^2$ is the variance estimate when the mean is known (and the variance is unknown).
Conjugate prior $\tau \sim \mathrm{Gamma}(\alpha, \beta)$: $\pi(\tau) \propto \tau^{\alpha - 1} e^{-\beta\tau}$
$\pi(\tau \mid x) \propto \tau^{\alpha + n/2 - 1} \exp\!\left(-\tau\left(\beta + \dfrac{n S_\mu^2}{2}\right)\right)$
$\Rightarrow \tau \mid x \sim \mathrm{Gamma}\!\left(\alpha + \dfrac{n}{2},\ \beta + \dfrac{n S_\mu^2}{2}\right)$, so $E[\tau \mid x] = \dfrac{\alpha + n/2}{\beta + n S_\mu^2 / 2}$.
Equivalently, for the variance $\sigma^2 = 1/\tau$:
$\sigma^2 \mid x \sim \mathrm{IG}\!\left(\alpha + \dfrac{n}{2},\ \beta + \dfrac{n S_\mu^2}{2}\right)$ and $E[\sigma^2 \mid x] = \dfrac{\beta + n S_\mu^2 / 2}{\alpha + n/2 - 1}$.
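And the corresponding update for the precision and variance, again with illustrative numbers:

```python
import numpy as np

x = np.array([4.1, 5.3, 4.8, 5.9, 5.2])    # illustrative sample
mu, n = 5.0, len(x)                        # known mean
S2 = np.mean((x - mu) ** 2)                # S_mu^2 = sum((x_i - mu)^2) / n

a0, b0 = 2.0, 1.0                          # prior: tau ~ Gamma(a0, b0)
a_n, b_n = a0 + n / 2, b0 + n * S2 / 2     # posterior Gamma parameters

print("E[tau | x] =", a_n / b_n)
print("E[sigma^2 | x] =", b_n / (a_n - 1)) # inverse-gamma mean; requires a_n > 1
```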