# Posts Tagged ‘ statistics ’

## A new candidate for worst figure

September 1, 2014

Today I read a paper that had been submitted to the IJF which included the following figure along with several similar plots. I haven’t seen anything this bad for a long time. In fact, I think I would find it very diffi...

## BREAKING THE LAW! (of likelihood): to keep their fit measures in line (A), (B)

August 29, 2014

1. An Assumed Law of Statistical Evidence (law of likelihood) Nearly all critical discussions of frequentist error statistical inference (significance tests, confidence intervals, p-values, power, etc.) start with the following general assumption about the nature of inductive evidence or support: Data x are better evidence for hypothesis H1 than for H0 if x are more probable under H1 than […]
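As a minimal numerical illustration of the assumed law (the coin-toss data and the two hypotheses below are invented for the sketch, not taken from the post), suppose we observe 7 heads in 10 tosses and compare a fair coin against a biased one:

```python
from math import comb

# Invented example: 7 heads in 10 coin tosses.
# H0: p = 0.5 (fair coin); H1: p = 0.7 (biased coin).
n, k = 10, 7

def binom_lik(p, n, k):
    """Binomial likelihood of k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

lik_h0 = binom_lik(0.5, n, k)
lik_h1 = binom_lik(0.7, n, k)

# Law of likelihood: x supports H1 over H0 iff this ratio exceeds 1.
ratio = lik_h1 / lik_h0
```

Here the ratio comes out a little above 2, so under the assumed law the data count as (mildly) better evidence for H1 than for H0.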

## Mathematical and Applied Statistics Lesson of the Day – The Motivation and Intuition Behind Markov’s Inequality


Markov’s inequality may seem like a rather arbitrary pair of mathematical expressions that are coincidentally related to each other by an inequality sign: $P(X \geq c) \leq \frac{E(X)}{c}$, where $c > 0$. However, there is a practical motivation behind Markov’s inequality, and it can be posed in the form of a simple question: How often is the random variable “far” away from […]
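That question can be probed by simulation. Here is a sketch (using an exponential distribution as an arbitrary non-negative example, not anything from the post) comparing the empirical tail probability $P(X \geq c)$ with Markov’s bound $E(X)/c$:

```python
import random

random.seed(0)

# Simulate a non-negative random variable (exponential with mean 1)
# and compare the empirical tail probability P(X >= c) with
# Markov's bound E(X)/c, here for c = 3.
n, c = 100_000, 3.0
samples = [random.expovariate(1.0) for _ in range(n)]

mean_x = sum(samples) / n
tail_prob = sum(x >= c for x in samples) / n
markov_bound = mean_x / c

# The bound holds, though it is loose here:
# the true tail is exp(-3) ~ 0.05, while the bound is ~ 0.33.
```

The bound is guaranteed, but as the comment notes it can be far from tight; its value is that it needs nothing but non-negativity and a finite mean.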

## What is the Gauss-Markov theorem?

August 26, 2014

What is the Gauss-Markov theorem? From “The Cambridge Dictionary of Statistics” B. S. Everitt, 2nd Edition: A theorem that proves that if the error terms in a multiple regression have the same variance and are uncorrelated, then the estimators of the parameters in the model produced by least squares estimation are better (in the sense […]
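A quick way to see the “better” in the Gauss-Markov sense is a simulation sketch (invented data, not from the dictionary entry): under the theorem’s assumptions, the OLS slope should have smaller variance than any other linear unbiased estimator, such as the slope through the two endpoints:

```python
import random

random.seed(1)

# Fixed design and true line for the simulation.
x = [float(i) for i in range(1, 21)]
true_intercept, true_slope = 1.0, 2.0

def ols_slope(x, y):
    """Least-squares slope estimate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

ols_est, endpoint_est = [], []
for _ in range(5000):
    # Errors with equal variance and no correlation (Gauss-Markov holds).
    y = [true_intercept + true_slope * xi + random.gauss(0, 1) for xi in x]
    ols_est.append(ols_slope(x, y))
    # Another linear unbiased estimator: slope through the two endpoints.
    endpoint_est.append((y[-1] - y[0]) / (x[-1] - x[0]))

def var(v):
    m = sum(v) / len(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

# Both estimators are unbiased, but OLS has the smaller variance,
# exactly as the Gauss-Markov theorem guarantees.
```

Both estimators center on the true slope of 2; the difference is entirely in their spread, which is what “best linear unbiased estimator” means.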

## The Chi-Squared Test of Independence – An Example in Both R and SAS


Introduction The chi-squared test of independence is one of the most basic and common hypothesis tests in the statistical analysis of categorical data. Given 2 categorical random variables, $X$ and $Y$, the chi-squared test of independence determines whether or not there exists a statistical dependence between them. Formally, it is a hypothesis test with the following null and […]
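The post works the example in R and SAS; as a language-agnostic sketch of the same computation (the 2×2 table below is invented, not the post’s data), the statistic can be assembled directly from observed and expected counts:

```python
# Invented 2x2 contingency table.
# Rows: group A / group B; columns: outcome present / absent.
observed = [[30, 70],
            [20, 180]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand = sum(row_totals)

# Expected count under independence: (row total * column total) / grand total.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        exp = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - exp) ** 2 / exp

# For a 2x2 table, df = (2-1)*(2-1) = 1; the 5% critical value is 3.841.
reject_independence = chi2 > 3.841
```

For this table the statistic is 19.2, well past the 5% critical value, so independence would be rejected. (In R the same test is one call to `chisq.test`, as the post presumably shows.)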

## Recent Articles

August 20, 2014

I have uploaded a few papers that I have written and presented at national conferences over the past several years. Currently, all the articles relate to election research.

## GEFCom 2014 energy forecasting competition is underway

August 18, 2014

GEFCom 2014 is the most advanced energy forecasting competition ever organized, both in terms of the data involved, and in terms of the way the forecasts will be evaluated. So everyone interested in energy forecasting should head over to the competitio...

## Are P Values Error Probabilities? or, “It’s the methods, stupid!” (2nd install)

August 18, 2014

Despite the fact that Fisherians and Neyman-Pearsonians alike regard observed significance levels, or P values, as error probabilities, we occasionally hear allegations (typically from those who are neither Fisherian nor N-P theorists) that P values are actually not error probabilities. The denials tend to go hand in hand with allegations that P values exaggerate evidence against […]
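One concrete sense in which P values behave like error probabilities: when the null is true and we reject whenever p ≤ α, the Type I error rate is α. A simulation sketch (a two-sided z-test with known σ; the setup is invented for illustration):

```python
import random
from math import erf, sqrt

random.seed(2)

def z_test_p(sample, mu0=0.0, sigma=1.0):
    """Two-sided p-value for H0: mean = mu0, known sigma."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / sqrt(n))
    # P(|Z| >= |z|) via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

alpha, trials, n = 0.05, 20_000, 25
rejections = 0
for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]  # H0 is true
    if z_test_p(sample) <= alpha:
        rejections += 1

# Under H0 the p-value is uniform, so this should be close to alpha.
type1_error_rate = rejections / trials
```

The simulated rejection rate hovers near 0.05: rejecting at p ≤ α really does control the error probability at α, which is the operational link the post’s title is arguing about.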

## Teaching random variables and distributions

August 18, 2014

Why do we teach about random variables, and why is it so difficult to understand? Probability and statistics go together pretty well and basic probability is included in most introductory statistics courses. Often maths teachers prefer the probability section as …

## Mathematical Statistics Lesson of the Day – Markov’s Inequality


Markov’s inequality is an elegant and very useful inequality that relates the probability of an event concerning a non-negative random variable, $X$, with the expected value of $X$. It states that $P(X \geq c) \leq \frac{E(X)}{c}$, where $c > 0$. I find Markov’s inequality to be beautiful for 2 reasons: It applies to both continuous and discrete random variables. It applies to any non-negative […]
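A standard one-line proof sketch (not taken from the post) shows why non-negativity is all that is needed: for $c > 0$ and $X \geq 0$,

```latex
P(X \geq c)
  = E\!\left[\mathbf{1}\{X \geq c\}\right]
  \leq E\!\left[\frac{X}{c}\,\mathbf{1}\{X \geq c\}\right]
  \leq \frac{E(X)}{c}.
```

The first inequality holds because $X/c \geq 1$ on the event $\{X \geq c\}$, and the second because dropping the indicator can only increase the expectation when $X$ is non-negative.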