When is Bayes' rule (the decision rule, not Bayes' theorem) optimal? Explain what this means using a 2x2 confusion matrix.
Bayes' rule is as follows:

$$P(H_i \mid x) = \frac{P(x \mid H_i)\, P(H_i)}{P(x)}$$

Through marginalization we have:

$$P(x) = \sum_i P(x \mid H_i)\, P(H_i)$$

where $P(H_i \mid x)$ is the posterior probability, $P(x \mid H_i)$ is the likelihood, $P(H_i)$ is the prior probability, and $P(x)$ is the evidence.
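As a quick numerical illustration, here is a minimal Python sketch of this computation; the priors and likelihoods are made-up numbers chosen only to show the mechanics of marginalization and normalization.

```python
# Minimal sketch of Bayes' rule for two hypotheses H0 and H1.
# The priors and likelihoods below are made-up illustration numbers.

priors = {"H1": 0.01, "H0": 0.99}         # P(H_i)
likelihoods = {"H1": 0.95, "H0": 0.05}    # P(x | H_i) for the observed data x

# Evidence P(x) by marginalizing over the hypotheses
evidence = sum(likelihoods[h] * priors[h] for h in priors)

# Posterior P(H_i | x) by Bayes' rule
posteriors = {h: likelihoods[h] * priors[h] / evidence for h in priors}
print(posteriors)   # {'H1': 0.161..., 'H0': 0.838...}
```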
The optimal decision rule to choose between the two hypotheses is as follows: if the prior probabilities are fixed and they are the only information available, decide $H_i$ if $P(H_i) > P(H_j)$ for all $j \neq i$. This decision rule is optimal in the sense that it gives the minimum error rate (probability of deciding incorrectly) possible with the information at hand.
The Bayes rule with more information (an observation $x$) is as follows: decide $H_i$ if

$$P(H_i \mid x) > P(H_j \mid x) \quad \text{for all } j \neq i,$$

or equivalently, since the evidence $P(x)$ is common to both sides, if $P(x \mid H_i)\,P(H_i) > P(x \mid H_j)\,P(H_j)$ for all $j \neq i$.
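A short sketch of that decision rule in Python (again with made-up numbers): picking the hypothesis with the largest posterior is the same as picking the one with the largest likelihood times prior.

```python
# Sketch of the Bayes (MAP) decision rule for two hypotheses.
# Dividing by the common evidence P(x) does not change which posterior is
# largest, so comparing likelihood * prior is equivalent to comparing posteriors.
# The numbers are made up for illustration.

priors = {"H1": 0.01, "H0": 0.99}         # P(H_i)
likelihoods = {"H1": 0.95, "H0": 0.05}    # P(x | H_i)

def bayes_decision(likelihoods, priors):
    """Return the hypothesis maximizing P(x | H_i) * P(H_i)."""
    return max(priors, key=lambda h: likelihoods[h] * priors[h])

print(bayes_decision(likelihoods, priors))   # 'H0', since 0.05*0.99 > 0.95*0.01

# With no observation at all, the rule reduces to picking the largest prior.
print(max(priors, key=priors.get))           # 'H0'
```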
Example of a confusion matrix:

|     | D=1 | D=0 |
|-----|-----|-----|
| l=1 | ppv | fdr |
| l=0 | for | npv |

D represents the true disease status (1 = disease present, 0 = disease absent).
l represents the test result (1 = test positive, 0 = test negative).
ppv = positive predictive value = $P(D=1 \mid l=1)$
npv = negative predictive value = $P(D=0 \mid l=0)$
false discovery rate: fdr = $P(D=0 \mid l=1)$ = 1 - ppv
false omission rate: for = $P(D=1 \mid l=0)$ = 1 - npv
ppv, npv, fdr, and for are the posterior probabilities $P(D \mid l)$.
sensitivity = $P(l=1 \mid D=1)$
specificity = $P(l=0 \mid D=0)$
sensitivity, specificity, and their complements $1 - \text{sensitivity} = P(l=0 \mid D=1)$ and $1 - \text{specificity} = P(l=1 \mid D=0)$ are the likelihoods $P(l \mid D)$.
prevalence = $P(D=1)$
$P(D=1)$ (the prevalence) and $P(D=0) = 1 - \text{prevalence}$ are the priors.
plr = positive labeling rate = $P(l=1) = P(l=1 \mid D=1)\,P(D=1) + P(l=1 \mid D=0)\,P(D=0)$
nlr = negative labeling rate = $P(l=0) = P(l=0 \mid D=1)\,P(D=1) + P(l=0 \mid D=0)\,P(D=0)$
The plr and nlr are the evidence.
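To tie the confusion-matrix quantities back to Bayes' rule, here is a short Python sketch; the sensitivity, specificity, and prevalence values are assumed purely for illustration. It computes the evidence (plr, nlr) by marginalization and the four posterior cells of the table (ppv, fdr, npv, for) from the likelihoods and the prior.

```python
# Sketch: confusion-matrix quantities as Bayes' rule applied to a test.
# sensitivity = P(l=1 | D=1), specificity = P(l=0 | D=0)   (likelihoods)
# prevalence  = P(D=1)                                      (prior)
# The numeric values are assumed for illustration only.

sensitivity = 0.90
specificity = 0.95
prevalence  = 0.02

# Evidence: positive and negative labeling rates, by marginalization over D
plr = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)   # P(l=1)
nlr = (1 - sensitivity) * prevalence + specificity * (1 - prevalence)   # P(l=0)

# Posteriors: the four cells of the confusion matrix
ppv = sensitivity * prevalence / plr            # P(D=1 | l=1)
fdr = 1 - ppv                                   # P(D=0 | l=1)
npv = specificity * (1 - prevalence) / nlr      # P(D=0 | l=0)
for_ = 1 - npv                                  # P(D=1 | l=0) ('for' is a Python keyword)

print(f"plr={plr:.4f}, nlr={nlr:.4f}")
print(f"ppv={ppv:.4f}, fdr={fdr:.4f}, npv={npv:.4f}, for={for_:.4f}")
```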