Despite public pledges, leading scientific journals still allow statistical misconduct and refuse to correct it

A leading form of statistical malpractice in scientific studies is to retroactively comb through the data for "interesting" patterns. While such patterns may provide useful leads for future investigation, simply cherry-picking data that looks significant out of a study that has otherwise failed to prove out the researcher's initial hypothesis can generate false (but plausible-seeming) conclusions.
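To see why this fails, here is a minimal sketch using made-up, pure-noise data: a study with no real effect, followed by 20 unplanned subgroup comparisons. At a 5% significance threshold, chance alone tends to hand the researcher a "finding."

```python
import random
import statistics

# Hypothetical illustration: every measurement below is pure noise,
# so any "significant" subgroup result is a false positive.
random.seed(1)

def two_sample_z(a, b):
    """Crude z-like statistic for the difference of two sample means."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

false_positives = 0
for subgroup in range(20):                              # 20 post-hoc comparisons
    treat = [random.gauss(0, 1) for _ in range(50)]     # no true effect anywhere
    ctrl = [random.gauss(0, 1) for _ in range(50)]
    if abs(two_sample_z(treat, ctrl)) > 1.96:           # roughly p < 0.05
        false_positives += 1

print(false_positives)  # by chance, about 1 in 20 comparisons "succeeds"
```

Run a study, slice it 20 ways, and you can usually write up whichever slice crossed the line, which is exactly the malpractice described above.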

Big Data's "theory-free" analysis is a statistical malpractice

One of the premises of Big Data is that it can be "theory free": rather than starting with a hypothesis ("men at buffets eat more when women are present," "more people will click this button if I move it here," etc.) and then gathering data to validate your guess, you just gather a ton of data and look for patterns in it.
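The trouble is that a big enough pile of data always contains patterns. A minimal sketch, using entirely random made-up variables: correlate 1,000 random candidate "signals" against a random outcome and report the best one. The winner looks impressive even though every variable is noise by construction.

```python
import random

# Hypothetical "theory-free" mining: nothing here measures anything real.
random.seed(42)
n = 30
outcome = [random.gauss(0, 1) for _ in range(n)]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Try 1,000 random signals and keep the strongest-looking correlation.
best = max(
    abs(pearson([random.gauss(0, 1) for _ in range(n)], outcome))
    for _ in range(1000)
)
print(round(best, 2))  # a strong-looking correlation, despite pure noise
```

Search enough variables and a "discovery" is guaranteed; without a theory to constrain the search, the pattern tells you nothing.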

A/B testing tools have created a golden age of shitty statistical practices in business

A team of researchers examined 2,101 commercial experiments run with A/B testing tools like Google Optimize, Mixpanel, Monetate and Optimizely, using regression analysis to detect whether p-hacking, a statistical cheating technique that makes it look like you've found a valid cause-and-effect relationship when you haven't, had taken place.
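One common p-hacking pattern in A/B testing is "optional stopping": peeking at the results after every batch of users and halting the moment the difference looks significant. A minimal simulated sketch (hypothetical, pure-noise experiments) shows how this inflates the false-positive rate well beyond the nominal 5%.

```python
import random
import statistics

random.seed(0)

def z_stat(a, b):
    """Crude z statistic for the difference of two sample means."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

def hacked_experiment(batches=10, batch_size=20):
    """A/B test with NO real effect, peeking after every batch."""
    a, b = [], []
    for _ in range(batches):
        a += [random.gauss(0, 1) for _ in range(batch_size)]
        b += [random.gauss(0, 1) for _ in range(batch_size)]
        if abs(z_stat(a, b)) > 1.96:   # peek: roughly p < 0.05
            return True                # stop early, declare a "winner"
    return False

hits = sum(hacked_experiment() for _ in range(1000))
print(hits / 1000)  # far above the nominal 0.05 false-positive rate
```

Each individual peek honors the 5% threshold, but taking the first of up to ten peeks that crosses it multiplies the chances of a spurious "winner", which is why dashboards that invite continuous peeking make this cheat so easy.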

The work of the world's leading nutrition researchers appears to be riddled with statistical errors

Brian Wansink is one of the most-cited nutrition researchers in the world; 30,000 US schools design their lunch programs around his advice, drawing on his studies showing that kids eat more carrots when they're called "X-ray vision carrots," that putting out fruit bowls improves eating habits, and that smaller plates reduce portion sizes.

Psychology's reproducibility crisis: why statisticians are publicly calling out social scientists

Princeton University psychology professor Susan Fiske published an open letter denouncing the practice of using social media to call out statistical errors in psychology research, describing the people who do this as "terrorists" and arguing that, given the structure of social-science scholarship, such callouts are toxic and have an outsized effect on careers.