Statistics


Box-Cox transformation

Finding a transformation that makes the data approximately normally distributed
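
A minimal sketch of estimating a Box-Cox transformation, assuming SciPy's stats.boxcox (which chooses the transformation parameter by maximum likelihood); the lognormal sample is synthetic and only for illustration.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # skewed, strictly positive data

# boxcox returns the transformed values and the lambda that maximizes the
# normal log-likelihood of the transformed data.
x_bc, lam = stats.boxcox(x)
print(f"estimated lambda: {lam:.3f}")

# Quick normality check on the transformed values.
stat, pvalue = stats.shapiro(x_bc)
print(f"Shapiro-Wilk p-value after transformation: {pvalue:.3f}")
</syntaxhighlight>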

Visualize the random effects

http://www.quantumforest.com/2012/11/more-sense-of-random-effects/
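
The linked post works in R; below is only a rough Python sketch of the same idea, assuming statsmodels' MixedLM for a random-intercept model. The dataset, variable names (y, x, group), and the caterpillar-style plot are invented for illustration.

<syntaxhighlight lang="python">
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
groups = np.repeat(np.arange(20), 15)              # 20 groups, 15 observations each
u = rng.normal(0.0, 1.0, 20)                       # true random intercepts
x = rng.normal(size=groups.size)
y = 2.0 + 0.5 * x + u[groups] + rng.normal(0.0, 0.5, groups.size)
df = pd.DataFrame({"y": y, "x": x, "group": groups})

# Random-intercept model: y ~ x with a random intercept per group.
fit = smf.mixedlm("y ~ x", df, groups=df["group"]).fit()

# fit.random_effects maps each group label to a Series of predicted effects;
# with only a random intercept there is a single entry per group.
re = pd.Series({g: v.iloc[0] for g, v in fit.random_effects.items()}).sort_values()

# Caterpillar-style plot of the predicted (BLUP) intercepts, sorted by size.
plt.plot(re.values, range(len(re)), "o")
plt.axvline(0.0, linestyle="--")
plt.xlabel("predicted random intercept")
plt.ylabel("group (sorted)")
plt.show()
</syntaxhighlight>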

Sensitivity/Specificity/Accuracy

{| border="1" style="border-collapse:collapse; text-align:center;"
|-
|  ||  || colspan="2" | Predict ||
|-
|  ||  || 1 || 0 ||
|-
| rowspan="2" | True || 1 || TP || FN || Sens=TP/(TP+FN)
|-
| 0 || FP || TN || Spec=TN/(FP+TN)
|-
|  ||  ||  ||  || N = TP + FP + FN + TN
|}
  • Sensitivity = TP / (TP + FN)
  • Specificity = TN / (TN + FP)
  • Accuracy = (TP + TN) / N
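
A small sketch of computing the three measures just listed, assuming scikit-learn's confusion_matrix; the label vectors are made up for illustration.

<syntaxhighlight lang="python">
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]

# With labels=[0, 1], rows are the true class (0, 1) and columns the
# predicted class (0, 1), so ravel() returns tn, fp, fn, tp in that order.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()

sensitivity = tp / (tp + fn)               # true positive rate
specificity = tn / (tn + fp)               # true negative rate
accuracy = (tp + tn) / (tp + fp + fn + tn)

print(f"Sens={sensitivity:.2f}  Spec={specificity:.2f}  Acc={accuracy:.2f}")
</syntaxhighlight>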

ROC curve and Brier score
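
A hedged sketch of both summaries, assuming scikit-learn's roc_curve, roc_auc_score, and brier_score_loss; the true labels and predicted probabilities are invented.

<syntaxhighlight lang="python">
from sklearn.metrics import brier_score_loss, roc_auc_score, roc_curve

y_true = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
p_hat = [0.1, 0.2, 0.3, 0.4, 0.6, 0.45, 0.7, 0.8, 0.85, 0.9]

auc = roc_auc_score(y_true, p_hat)         # area under the ROC curve
brier = brier_score_loss(y_true, p_hat)    # mean squared error of the probabilities

fpr, tpr, thresholds = roc_curve(y_true, p_hat)   # points tracing the ROC curve
print(f"AUC={auc:.3f}  Brier={brier:.3f}")
</syntaxhighlight>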

Elements of Statistical Learning

Bagging

Chapter 8 of the book.

  • Bootstrap mean is approximately a posterior average.
  • Bootstrap aggregation, or bagging, averages the prediction over a collection of bootstrap samples, thereby reducing its variance (see the sketch after the formula). The bagging estimate is defined by
[math]\displaystyle{ \hat{f}_{bag}(x) = \frac{1}{B}\sum_{b=1}^B \hat{f}^{*b}(x). }[/math]
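
A minimal sketch of this bagging average, using a regression tree as the base learner and synthetic data; these choices are illustrative and not taken from the book.

<syntaxhighlight lang="python">
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.3, size=200)

def bagged_predict(X_train, y_train, X_test, B=100):
    """Average the base learner's predictions over B bootstrap fits."""
    n = len(y_train)
    preds = np.zeros((B, len(X_test)))
    for b in range(B):
        idx = rng.integers(0, n, size=n)                 # bootstrap sample b
        tree = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        preds[b] = tree.predict(X_test)                  # f_hat^{*b}(x)
    return preds.mean(axis=0)                            # (1/B) * sum_b f_hat^{*b}(x)

X_grid = np.linspace(-3.0, 3.0, 50).reshape(-1, 1)
f_bag = bagged_predict(X, y, X_grid)
print(f_bag[:5])
</syntaxhighlight>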