Statistics Done Wrong: The Woefully Complete Guide
From a brilliant web rant to an indispensable guide to the perils of statistics and their remedies, Alex Reinhart's Statistics Done Wrong is a spotter's guide to arrant nonsense cloaked in mathematical respectability.
For 15 years, I've been a faculty member in the Ed.D. program at Nova Southeastern University, where most of my doctoral students use quantitative methods in their education research. Although students take a mandatory methods class and get templates providing a rough statistical framework, they are often deeply confused when it comes to designing their methodology and analyzing their data.
It's not just the students: despite my own background in mathematics (I teach linear and abstract algebra), I am sometimes uncertain when advising my students on their data analysis, and sometimes at odds with colleagues about what counts as statistically valid. Typically, I turn to statistics textbooks and other colleagues for advice.
An article in the April 16, 2015 edition of Scientific American boldly claimed that research psychologists are wringing their hands over the inadequacy of the statistical tools they have been using. The use of p values as the gold-standard test of significance has fallen into disrepute, a consequence of over-reliance and of their inadequacy as a measure of the quality of results. This is where Alex Reinhart comes in.
Reinhart is a physicist turned statistician who set out to write a book that improves the statistical education and understanding researchers need. Statistics Done Wrong is not a textbook. It is a highly informed discussion of the frequent inadequacy of published statistical results, and it confronts the sacred cow: the p value. Here is what he has to say on page 2.
Since the 1980s, researchers have described numerous statistical fallacies and misconceptions in the popular peer-reviewed scientific literature and have found that many scientific papers -- perhaps more than half -- fall prey to these errors. Inadequate statistical power renders many studies incapable of finding what they're looking for, multiple comparisons and misinterpreted p values cause numerous false positives, flexible data analysis makes it easy to find a correlation where none exists, and inappropriate model choices bias important results. Most errors go undetected by peer reviewers and editors, who often have no specific statistical training, because few journals employ statisticians to review submissions and few papers give sufficient statistical detail to be accurately evaluated.
Astonishing to my eyes was his conclusion that
The methodological complexity of modern research means that scientists without extensive statistical training may not be able to understand most published research in their fields.
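The multiple-comparisons trap mentioned in the passage quoted above is easy to see for yourself. The short simulation below (my own illustration, not from the book) runs 100 z-tests on pure noise, where no real effect exists anywhere, and counts how many come out "significant" at p < 0.05 purely by chance:

```python
import math
import random

random.seed(42)

def z_test_p(sample):
    """Two-sided z-test of 'mean is zero' for data with known sd = 1."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    # Standard normal tail probability via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 100 independent experiments on pure noise: no real effect exists,
# yet roughly 5% of tests will clear the p < 0.05 bar anyway.
false_positives = sum(
    z_test_p([random.gauss(0, 1) for _ in range(50)]) < 0.05
    for _ in range(100)
)
print(f"{false_positives} of 100 null experiments were 'significant'")
```

Run enough comparisons and some will always look significant; this is exactly why flexible data analysis "makes it easy to find a correlation where none exists."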
Reinhart advises users of statistics to replace bare p values with confidence intervals, which convey an estimate's uncertainty. He discusses statistical power (the probability that a test will detect a real effect of a given size). He discusses and illustrates, with clear and uncomplicated examples, such things as the effects of sample size and reasonable estimates of bias (suggestive of the Bayesian approach).
He also covers the pitfalls associated with underpowered and overpowered statistical testing, and what contextual cues you can use to detect them.
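What an underpowered study looks like is also easy to demonstrate by simulation. The sketch below (my own illustration, assuming a true effect of 0.3 standard deviations and a simple z-test, not an example from the book) estimates power at three sample sizes; the smallest study misses the real effect most of the time:

```python
import math
import random

random.seed(1)

def detects_effect(n, effect=0.3, alpha=0.05):
    """One simulated study: n draws from N(effect, 1); True if p < alpha."""
    mean = sum(random.gauss(effect, 1) for _ in range(n)) / n
    z = mean * math.sqrt(n)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p < alpha

# Estimated power: the fraction of 1000 simulated studies in which
# a genuine 0.3-sigma effect reaches statistical significance.
powers = {}
for n in (10, 50, 200):
    powers[n] = sum(detects_effect(n) for _ in range(1000)) / 1000
    print(f"n={n:3d}  estimated power = {powers[n]:.2f}")
```

With n = 10 the real effect is usually missed; with n = 200 it is almost always found. This is the kind of power calculation that, as Reinhart notes, most published studies never perform.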
The book is full of mind-arresting statistics, like, “fewer than 3% of articles” in the prestigious journals Science and Nature “calculate statistical power before starting their study.” Imagine what this means for my doctoral students.
This slim book, only 129 pages long, is forceful in outlining and documenting the deficiencies of quantitative research in the social and medical sciences. Toward the end, Reinhart addresses research scientists directly and advises them on how to go about ameliorating the situation, which is not a simple task.
This is a useful, reasonably detailed guide for those who consume research, produce research, or want to understand the limits of current research. Crucially, it provides precise and unequivocal guidance for instructors of statistics.
They say that in an argument you're entitled to your own opinions, but not your own facts. Statistical validity is a difficult nexus of opinion and fact, and this book is a vital contribution to the argument.
Statistics Done Wrong: The Woefully Complete Guide [Alex Reinhart/No Starch Press]
(Image: Spurious Correlations)