Blog Archives

Can we try to make an adjustment?

November 14, 2014

In most of our data science teaching (including our book Practical Data Science with R) we emphasize the deliberately easy problem of “exchangeable prediction.” We define exchangeable prediction as: given a series of observations with two distinguished classes of variables/observations denoted “x”s (denoting control variables, independent variables, experimental variables, or predictor variables) and “y” (denoting […]


Bias/variance tradeoff as gamesmanship

October 30, 2014

Continuing our series of reading out loud from a single page of a statistics book, we look at page 224 of the 1972 Dover edition of Leonard J. Savage’s “The Foundations of Statistics.” On this page we are treated to an example attributed to Leo A. Goodman in 1953 that illustrates how for normally distributed […]


Factors are not first-class citizens in R

September 23, 2014

The primary user-facing data types in the R statistical computing environment behave as vectors. That is: one-dimensional arrays of scalar values that have a nice operational algebra. There are additional types (lists, data frames, matrices, environments, and so on), but the most common data types are vectors. In fact, vectors are so common in R […]
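A classic illustration of the sharp corner this post alludes to (a standard example, not taken from the post itself): factors print like strings but are stored as integer level codes, so a naive numeric conversion silently returns the codes rather than the values.

```r
# A factor whose levels happen to look like numbers:
f <- factor(c("10", "20", "20"))

# as.numeric() on a factor returns the internal level codes,
# not the numbers you see printed:
as.numeric(f)
# [1] 1 2 2

# The idiomatic fix is to convert through character first:
as.numeric(as.character(f))
# [1] 10 20 20
```

This is one of several ways factors fail to behave like the vectors they superficially resemble.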


Reading the Gauss-Markov theorem

August 26, 2014

What is the Gauss-Markov theorem? From “The Cambridge Dictionary of Statistics”, B. S. Everitt, 2nd Edition: A theorem that proves that if the error terms in a multiple regression have the same variance and are uncorrelated, then the estimators of the parameters in the model produced by least squares estimation are better (in the sense […]
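For reference, a standard statement of the setup and conclusion (our paraphrase in common notation, not a quote from the dictionary):

```latex
% Linear model with fixed design matrix X and unknown coefficients \beta:
y = X\beta + \varepsilon, \qquad
\mathbb{E}[\varepsilon] = 0, \qquad
\operatorname{Var}(\varepsilon) = \sigma^2 I
% (equal variances, uncorrelated error terms).
%
% Gauss-Markov: among all linear unbiased estimators of \beta,
% the least squares estimator
\hat{\beta}_{LS} = (X^\top X)^{-1} X^\top y
% has minimum variance (it is "BLUE": best linear unbiased estimator).
```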


Automatic bias correction doesn’t fix omitted variable bias

July 8, 2014

Page 94 of Gelman, Carlin, Stern, Dunson, Vehtari, Rubin “Bayesian Data Analysis” 3rd Edition (which we will call BDA3) provides a great example of what happens when common broad frequentist bias criticisms are over-applied to predictions from ordinary linear regression: the predictions appear to fall apart. BDA3 goes on to exhibit what might be considered […]


Frequentist inference only seems easy

July 1, 2014

Two of the most common methods of statistical inference are frequentism and Bayesianism (see Bayesian and Frequentist Approaches: Ask the Right Question for some good discussion). In both cases we are attempting to perform reliable inference of unknown quantities from related observations. And in both cases inference is made possible by introducing and reasoning over […]


R minitip: don’t use data.matrix when you mean model.matrix

June 10, 2014

A quick R mini-tip: don’t use data.matrix when you mean model.matrix. If you do so, you may (without noticing) lose a lot of your model’s explanatory power due to poor encoding. For some modeling tasks you end up having to prepare a special expanded data matrix before calling a given machine learning algorithm. For example […]
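The encoding difference is easy to see on a small hypothetical data frame (our own toy example, sketching the distinction the tip is about): data.matrix silently replaces a factor with its integer level codes, imposing a false ordering and spacing on the categories, while model.matrix builds proper indicator (dummy) columns.

```r
# A toy data frame with one categorical variable:
d <- data.frame(x = factor(c("a", "b", "c")))

# data.matrix() substitutes the factor's internal level codes:
data.matrix(d)
#      x
# [1,] 1
# [2,] 2
# [3,] 3

# model.matrix() expands the factor into indicator columns
# (treatment contrasts: "a" is the reference level):
model.matrix(~ x, data = d)
#   (Intercept) xb xc
# 1           1  0  0
# 2           1  1  0
# 3           1  0  1
```

A model fit on the data.matrix encoding is forced to treat “b” as exactly halfway between “a” and “c”, which is rarely what you want.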


R style tip: prefer functions that return data frames

June 6, 2014

While following up on Nina Zumel’s excellent Trimming the Fat from glm() Models in R, I got to thinking about code style in R. And I realized: you can make your code much prettier by designing more of your functions to return data.frames. That may seem needlessly heavyweight, but it has a lot of down-stream […]
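A minimal sketch of the style being recommended (the function and column names here are our own invention, not from the post): instead of returning a bare vector of numbers, return a small data.frame that labels what each value is. Such results then combine cleanly with rbind().

```r
# A function that returns a labeled data.frame rather than a bare vector:
model_metrics <- function(model_name, predictions, actuals) {
  residuals <- predictions - actuals
  data.frame(
    model = model_name,
    rmse  = sqrt(mean(residuals^2)),
    bias  = mean(residuals)
  )
}

# Per-model results stack into one tidy summary table:
r1 <- model_metrics("null model", rep(0, 3), c(1, 2, 3))
r2 <- model_metrics("mean model", rep(2, 3), c(1, 2, 3))
do.call(rbind, list(r1, r2))
#        model      rmse bias
# 1 null model 2.1602469   -2
# 2 mean model 0.8164966    0
```

The resulting table is immediately ready for printing, plotting, or further aggregation, which is the down-stream payoff.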


Skimming statistics papers for the ideas (instead of the complete procedures)

June 2, 2014

Been reading a lot of Gelman, Carlin, Stern, Dunson, Vehtari, Rubin “Bayesian Data Analysis” 3rd edition lately. Overall, in the Bayesian framework some ideas (such as regularization and imputation) are far easier to justify (though calculating some seemingly basic quantities becomes tedious). A big advantage (and weakness) of this formulation is statistics has a much […]


How does Practical Data Science with R stand out?

June 2, 2014

There are a lot of good books on statistics, machine learning, analytics, and R. So it is valid to ask: how does Practical Data Science with R stand out? Why should a data scientist or an aspiring data scientist buy it? We admit, it isn’t the only book we own. Some relevant books from the […]


