Category: Zombies

A couple of thoughts regarding the hot hand fallacy fallacy

For many years we all believed the hot hand was a fallacy. It turns out we were all wrong. Fine. Such reversals happen. Anyway, now that we know the score, we can reflect on some of the cognitive biases that led us to stick with the “hot hand fallacy” story for so long. Jason Collins […]

The post A couple of thoughts regarding the hot hand fallacy fallacy appeared first on Statistical Modeling, Causal Inference, and Social Science.
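The reversal here turns on a subtle selection bias (the point usually credited to Miller and Sanjurjo): take short sequences of fair coin flips, compute within each sequence the proportion of heads among flips that immediately follow a head, and then average those per-sequence proportions — the average comes out below 1/2 even though the coin has no memory. A minimal stdlib-Python sketch (mine, not from the post; the four-flip setup is just illustrative):

```python
import random

def prop_heads_after_heads(seq):
    """Share of heads among flips that immediately follow a head
    (None if no head in the sequence is followed by another flip)."""
    after = [seq[i + 1] for i in range(len(seq) - 1) if seq[i] == 1]
    return sum(after) / len(after) if after else None

random.seed(1)
props = []
for _ in range(100_000):
    seq = [random.randint(0, 1) for _ in range(4)]  # 4 fair coin flips
    p = prop_heads_after_heads(seq)
    if p is not None:
        props.append(p)

# Averaging the per-sequence proportions (as the old hot-hand analyses
# effectively did) gives a value noticeably below 0.5; that gap is the
# selection bias, not evidence against streakiness.
avg = sum(props) / len(props)
print(f"average within-sequence P(H | previous H): {avg:.3f}")
```

The bias arises because sequences contributing few "after a head" flips get the same weight as sequences contributing many.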

Niall Ferguson and the perils of playing to your audience

History professor Niall Ferguson had another case of the sillies. Back in 2012, in response to Stephen Marche’s suggestion that Ferguson was serving up political hackery because “he has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk,” I wrote: But I don’t think it’s just […]

These 3 problems destroy many clinical trials (in context of some papers on problems with non-inferiority trials, or problems with clinical trials in general)

Paul Alper points to this news article in Health News Review, which says: A news release or story that proclaims a new treatment is “just as effective” or “comparable to” or “as good as” an existing therapy might spring from a non-inferiority trial. Technically speaking, these studies are designed to test whether an intervention is […]

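To make the "just as effective" issue concrete: a non-inferiority trial does not test whether the new treatment is as good as the old one; it tests the much weaker null that the new treatment is worse by at least some pre-chosen margin. A toy sketch for a difference in proportions (the numbers and the 10-point margin are hypothetical, not from any of the cited papers):

```python
from statistics import NormalDist

def noninferiority_z(p_new, p_std, n_new, n_std, margin):
    """One-sided non-inferiority z-test for a difference in proportions.
    H0: p_new - p_std <= -margin (new arm worse by at least the margin)."""
    diff = p_new - p_std
    se = (p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std) ** 0.5
    z = (diff + margin) / se
    return z, 1 - NormalDist().cdf(z)  # one-sided p-value

# The new arm is numerically WORSE (68% vs 70%), yet with a generous
# 10-point margin the weak null is rejected, and a press release can
# call the treatments "comparable."
z, p = noninferiority_z(0.68, 0.70, 500, 500, margin=0.10)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

The statistical machinery is fine; the trouble is that the margin choice, not the data, largely determines how impressive the headline sounds.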
“Using numbers to replace judgment”

Julian Marewski and Lutz Bornmann write: In science and beyond, numbers are omnipresent when it comes to justifying different kinds of judgments. Which scientific author, hiring committee-member, or advisory board panelist has not been confronted with page-long “publication manuals”, “assessment reports”, “evaluation guidelines”, calling for p-values, citation rates, h-indices, or other statistics in order to […]

Robustness checks are a joke

Someone pointed to this post from a couple years ago by Uri Simonsohn, who correctly wrote: Robustness checks involve reporting alternative specifications that test the same hypothesis. Because the problem is with the hypothesis, the problem is not addressed with robustness checks. Simonsohn followed up with an amusing story: To demonstrate the problem I [Simonsohn] […]

Chocolate milk! Another stunning discovery from an experiment on 24 people!

Mike Hull writes: I was reading over this JAMA Brief Report and could not figure out what they were doing with the composite score. Here are the cliff notes: Study tested milk vs dark chocolate consumption on three eyesight performance parameters: (1) High-contrast visual acuity (2) Small-letter contrast sensitivity (3) Large-letter contrast sensitivity Only small-letter […]

“Recapping the recent plagiarism scandal”

Benjamin Carlisle writes: A year ago, I received a message from Anna Powell-Smith about a research paper written by two doctors from Cambridge University that was a mirror image of a post I wrote on my personal blog roughly two years prior. The structure of the document was the same, as was the rationale, the […]

The purported CSI effect and the retroactive precision fallacy

Regarding our recent post on the syllogism that ate science, someone points us to this article, “The CSI Effect: Popular Fiction About Forensic Science Affects Public Expectations About Real Forensic Science,” by N. J. Schweitzer and Michael J. Saks. We’ll get to the CSI Effect in a bit, but first I want to share the […]

Cornell prof (but not the pizzagate guy!) has one quick trick to getting 1700 peer reviewed publications on your CV

From the university webpage: Robert J. Sternberg is Professor of Human Development in the College of Human Ecology at Cornell University. . . . Sternberg is the author of over 1700 refereed publications. . . . How did he compile over 1700 refereed publications? Nick Brown tells the story: I [Brown] was recently contacted by […]

An actual quote from a paper published in a medical journal: “The data, analytic methods, and study materials will not be made available to other researchers for purposes of reproducing the results or replicating the procedure.”

Someone writes: So the NYT yesterday has a story about this study I am directed to it and am immediately concerned about all the things that make this study somewhat dubious. Forking paths in the definition of the independent variable, sample selection in who wore the accelerometers, ignorance of the undoubtedly huge importance of interactions […]

“Fudged statistics on the Iraq War death toll are still circulating today”

Mike Spagat shares this story entitled, “Fudged statistics on the Iraq War death toll are still circulating today,” which discusses problems with a paper published in a scientific journal in 2006, and errors that a reporter inadvertently included in a recent news article. Spagat writes: The Lancet could argue that if [Washington Post reporter Philip] […]

Statistical Modeling, Causal Inference, and Social Science Regrets Its Decision to Hire Cannibal P-hacker as Writer-at-Large

It is not easy to admit our mistakes, particularly now, given the current media climate and general culture of intolerance on college campuses. Still, we feel that we owe our readers an apology. We should not have hired Cannibal P-hacker, an elegant scientist and thinker who, we have come to believe, after serious consideration, does […]

Don’t calculate post-hoc power using observed estimate of effect size

Aleksi Reito writes: The statement below was included in a recent issue of Annals of Surgery: But, as 80% power is difficult to achieve in surgical studies, we argue that the CONSORT and STROBE guidelines should be modified to include the disclosure of power—even if less than 80%—with the given sample size and effect size […]

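The short version of the argument, which comes up repeatedly on this blog: "observed power," computed by plugging the observed estimate back in as if it were the true effect, is a deterministic function of the p-value, so it cannot add any information beyond the p-value itself. A quick sketch for a two-sided z-test (a toy illustration, not from the post or the Annals of Surgery piece):

```python
from statistics import NormalDist

def posthoc_power(z_obs, alpha=0.05):
    """'Observed power': power recomputed as if the true effect equaled
    the observed estimate (z_obs is the observed z-statistic)."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    # Probability of exceeding the critical value when Z ~ Normal(z_obs, 1);
    # the negligible opposite-tail term is omitted.
    return 1 - NormalDist().cdf(z_crit - z_obs)

def z_from_p(p):
    """Observed z-statistic implied by a two-sided p-value."""
    return NormalDist().inv_cdf(1 - p / 2)

# "Observed power" is just a relabeling of the p-value: each p maps to
# exactly one power value (p = 0.05 always maps to power of about 50%).
for p in (0.05, 0.20, 0.50):
    print(f"p = {p:.2f}  ->  observed power = {posthoc_power(z_from_p(p)):.2f}")
```

So disclosing "observed power" alongside a p-value, as the quoted proposal suggests, reports the same number twice in different units.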
“Tweeking”: The big problem is not where you think it is.

In her recent article about pizzagate, Stephanie Lee included this hilarious email from Brian Wansink, the self-styled “world-renowned eating behavior expert for over 25 years”: OK, what grabs your attention is that last bit about “tweeking” the data to manipulate the p-value, where Wansink is proposing research misconduct (from NIH: “Falsification: Manipulating research materials, equipment, […]

Narcolepsy Could Be ‘Sleeper Effect’ in Trump and Brexit Campaigns

Kevin Lewis sent along this example of what in social science is called the “ecological fallacy”: UNDER EMBARGO UNTIL MARCH 8, 2018 AT 10 AM EST Media Contact: Public and Media Relations Manager Society for Personality and Social Psychology press@spsp.org Narcolepsy Could Be ‘Sleeper Effect’ in Trump and Brexit Campaigns Regions where voters have more […]

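For readers unfamiliar with the term: the ecological fallacy is inferring individual-level relationships from group-level (here, regional) correlations, which can differ in magnitude and even in sign. A toy simulation (mine, not from the press release; region count, sizes, and noise scales are made up):

```python
import random

def corr(xs, ys):
    """Pearson correlation, stdlib only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
xs, ys = [], []           # individuals, pooled across regions
gx, gy = [], []           # region-level means
for g in range(10):       # 10 hypothetical regions
    base = 0.3 * g        # both regional means rise with g
    rx, ry = [], []
    for _ in range(200):  # individuals within a region
        e = random.gauss(0, 2)
        rx.append(base + e)                          # x above the region mean...
        ry.append(base - e + random.gauss(0, 0.5))   # ...pushes y below it
    xs += rx; ys += ry
    gx.append(sum(rx) / 200); gy.append(sum(ry) / 200)

print(f"individual-level corr: {corr(xs, ys):+.2f}")  # negative within regions
print(f"region-level corr:     {corr(gx, gy):+.2f}")  # positive across regions
```

Regions where a trait is more common voting a certain way tells you little about whether the individuals with that trait are the ones casting those votes.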