Amateur scientists vs. cranks

This is a video of a talk given last year by David Dixon, assistant professor of math, science and engineering at Saddleback College in California. He used to work in the Physics Department at California Polytechnic State University, which, like many physics departments around the world, received loads of correspondence from non-scientists who thought they had come up with earth-shattering, game-changing hypotheses that needed to be shared.

Now, sometimes, laypeople come up with good ideas that should be explored. But many of these letters are better classified as the work of cranks — folks who had big ideas, cared deeply about those big ideas, but who were dead wrong ... and utterly impervious to the idea that they might be wrong.

In this talk, Dixon delves into the collection of crank letters received by California Polytechnic State University over the years to explain the hallmarks of crankitude, the behaviors that raise red flags for professional scientists, and what we can actually learn about real science by studying fake science.

YouTube says the video is over two hours long, but that's apparently inaccurate. The actual talk is an hour long and just somehow got loaded twice into the same video.

If this is a topic that interests you, I'd also recommend reading this MetaFilter thread, where scientists explain to a poster why the poster's friend is setting off red flags among the very scientists whose attention he's trying to capture. It's a fascinating look at what to do — and what not to do — if you have a hypothesis you want to share.

Notable Replies

  1. These are all interesting conversations, but notice that people who talk about cranks tend not to talk about the other side of the equation: the danger of dogma within our scientific institutions.

    Joseph Novak is an education researcher best known for inventing the concept map. He has tirelessly advocated for a distinction in science education between rote memorization and meaningful learning. Meaningful learning involves attaching new concepts to pre-existing ones (a practice I'm sure Cory is familiar with, as it is routinely employed by Tim O'Reilly), whereas rote memorization tends to produce many disconnected knowledge structures that fade from memory far faster because of that disconnection.

    The way in which science tends to be taught today -- through lectures and problem sets -- invites students to rote-memorize the material. Needless to say, rote memorization does not activate the same cognitive circuitry that thinking like a scientist or critical thinking does. Eric Mazur has demonstrated that even in a Harvard undergraduate physics class, when problems are posed as conceptual questions -- as in the Force Concept Inventory -- many students who rely on rote memorization cannot actually answer basic conceptual questions about what they claim to "know".

    This presents the flip side of the "crank" coin: the persistence of dogmas within our scientific institutions. What I try to remind people as often as possible is that there is no sense in talking about crankism without also mentioning the problem of dogmatism. And this is unfortunately where most science journalism today fails to meet the needs of the public: journalists tend to shine far more light on the easily observable problem of cranks than on the much more complex problem of dogma. This creates a secondary phenomenon of pseudo-skepticism: skepticism applied toward all ideas that compete with conventional theories, but not also toward conventional theory itself. Authentic skepticism should be applied toward both.

    The dogma problem is very, very tricky because it would seem that part of the PhD training is to "enculture" grad students into "thinking like a scientist". So, what does it mean to think like a scientist? Is it that the person agrees with what would seem to be consensus views -- the fundamental claims of scientists? What happens to grad students who challenge the work of other professors in their university? Are some questions simply out of bounds?

    Jeff Schmidt -- author of Disciplined Minds -- claims to know the answer. He suggests that the weeding-out process in PhD programs is NOT politically neutral. In fact, he observed that physics grad students who stopped to think about, and possibly question, what they were memorizing would become bogged down and eventually either drop out or be kicked out.

    So, this raises what might turn out to be the most important question related to dogma: Is the way in which we've been teaching science creating some of the agreement (aka consensus) we see in our scientific institutions? In other words, have we in some cases failed to effectively teach scientists how to question their own discipline's theory?

    The question will certainly be more relevant for some disciplines than others. I would propose that where disciplines are empirically challenged -- as in cosmology and astrophysics, for instance -- the seriousness of the problem will predictably rise.
