
The One True Cause of all disease (All 52 of them)

A few years ago, Harriet Hall googled "The One True Cause of all disease", just to see what the Internet would come up with. She counted 67 One True Causes before she got bored (52 of them made it into the handy chart above).

Besides making for an amusing anecdote, this little exercise also helps illustrate why there's a problem with ideologically driven medical treatments — the sort that comes from people who are pushing a lifestyle or a philosophy along with ostensible healthcare. It's both intriguing and convenient to think that, if we just open the right secret door, we can find the thing that's actually causing all our problems. The truth, unfortunately, seems to be that our bodies and the world they inhabit are complicated and messy, and that lots of things can lead to disease (doctors typically learn to divide these things into nine different categories, Hall says). In fact, a disease we think of as a single entity can have its roots in more than one thing. All of this is pretty obvious, but it's the kind of obvious that's worth rubbing our noses in on occasion. If somebody tells you that everything from obesity to bipolar disorder to allergies to cancer all stems from the same root and can be treated or prevented with the exact same treatment, there's probably good reason to question what they're telling you.

The art and science of searching for water

The United States Geological Survey has an interesting FAQ report on dowsing — the practice of attempting to locate underground water with divining rods. It's got some interesting history and comparisons between dowsing and modern hydrology. The part on evidence for and against dowsing, though, is pretty sparse. If you want more on that, The Skeptic's Dictionary has some deeper analysis. The basic gist — what little research there has been suggests the successes of dowsing aren't any better than chance. (Via an interesting piece by Mary Brock at Skepchick about dowsing in the wine industry.) Maggie

It's not okay to threaten to rape people you don't like: Why I stand with Rebecca Watson

Every now and then, I am reminded of how lucky I am. I'm lucky that none of my readers has ever responded to a comment I made, which they didn't like, by calling me ugly. I'm lucky that they've never called me a cunt or a whore. I'm lucky that they've never threatened to rape me and then called me a humorless bitch when I pointed out how messed up that was. In general, the worst comments I've ever had directed to me, here, were from people accusing me of being a paid shill for Big Conspiracy, which is just funny.

But that shouldn't be luck, guys. My experience should not represent a minority experience among the female science bloggers I know. (And it is.) I shouldn't have to feel like thanking you, the BoingBoing readers, for being kind enough to not treat me like shit just because I'm a lady person.

Treating people with respect should not be a controversial position. It should not be a mindblowingly crazy idea to point out the fact that women are quite often treated as objects and, thus, have to deal with a lot more potentially threatening situations than men do. It shouldn't be offensive to say, hey, because of that fact, it's generally not a good idea to follow a woman you've never spoken to into an elevator late at night and ask her to come to your hotel room. Chances are good that you will make her feel threatened, rather than complimented.

And, even if you disagree, it's still totally not okay to threaten to rape people you disagree with. Seriously. Other than the specific bit about rape, we should have all learned this in preschool. And the fact that so many of the people engaging in this behavior claim to be rational thinkers and members of a community I strongly identify with ... well, that just makes me want to vomit. I honestly don't know what else to say.

Read Rebecca Watson's full article, Sexism in the Skeptic Community

Parody of anti-gay pamphlets offers detailed, behind-the-scenes view of how liars misuse real citations

The Box Turtle Bulletin has put together a great parody of anti-gay, fear-mongering pamphlets. Entitled, "The Heterosexual Agenda: Exposing the Myths", it includes important revelations about the heterosexuals and their plans for your children and our country. Here's a quick excerpt from a section that documents some of the depraved behaviors that heterosexuals are known to engage in:

... unsafe behavior is often compounded by drug use, which is an integral part of the heterosexual lifestyle. College students who engage in heterosexuality are 30% more likely to use marijuana than gay students, and they are nearly 40% more likely to use other drugs. (71) Among Redbook readers, 90% of heterosexual women admitted to initiating sex while under the influence of alcohol, and 30% had sex after smoking marijuana. For women under twenty, marijuana use before sex skyrocketed to 63%, with 45% of them using it often. (72)

Those numbered citations are important. In fact, this slim booklet contains more than 100. And they're not just part of the parody. Instead, author Jim Burroway uses these ostensibly unbiased sources of information as a way of showing how people can use real information to corroborate a lie. Follow up on his citations at the end of The Heterosexual Agenda, and you'll find a breakdown of how, exactly, he contorted each cited source to fit his own goals.

Read the rest

Climate science, climate change, and denial

CONvergence, Minneapolis' great big science fiction and fantasy convention, also has a whole series of panels based on hard science—Skepchickcon. This year, I was invited to speak on a few of the panels, including two that dealt with climate science. The best bits of those panels—"The Chilling Effects of Denialism" and "Who Will Save the Polar Bears"—have been edited up and published online as this week's Skeptically Speaking podcast. Besides myself, the panels included engineering professor John Abraham, science advocate and writer Shawn Otto, and biological anthropologist Greg Laden. We had some great conversations! Take a listen. Maggie

Apply Truth Goggles, learn truths

Truth Goggles is a web app that highlights facts in the text of a web page or news story and provides a link that tells you whether or not those facts are true. Maggie

How to: Read science news

How you read matters as much as what you read. That's because nothing is written in a vacuum. Every news story or blog post has a perspective behind it, a perspective that shapes what you are told and how that information is conveyed. This is not, necessarily, a bad thing. Having a perspective doesn't mean being sensationalistic, or deceitful, or spreading propaganda. It can mean those things, but it doesn't have to. In fact, I'm fairly certain that it's impossible to tell any story without some kind of perspective. When you relate facts, even in your personal life, you make choices about what details you will emphasize, what emotions you'll convey, who you will speak to—and all of those decisions are based on your personal perspective. How we tell a story depends on what we think is important.

Unfortunately, sometimes, perspective can be misleading. That's why it's important to be aware that perspective exists. If you look at what you're reading, you can see the decisions the author made, you can get an idea of what perspective they were trying to convey, and you will know whether that perspective is likely to distort the facts.

Emily Willingham is a scientist who blogs about science for the general public. Over at Double X Science, she's come up with a handy, six-step guide for reading science news stories. These rules are a great tool for peeking behind the curtain, and learning to think about the perspective behind what you read. In the post, she explains why each of these rules is important, and then applies them to a recent news story about chemical exposure and autism.

3. Look at the words in the articles. Suspected. Suggesting a link. In other words, what you're reading below those headlines does not involve studies linking anything to autism. Instead, it's based on an editorial listing 10 compounds [PDF] that the editorial authors suspect might have something to do with autism (NB: Both linked stories completely gloss over the fact that most experts attribute the rise in autism diagnoses to changing and expanded diagnostic criteria, a shift in diagnosis from other categories to autism, and greater recognition and awareness--i.e., not to genetic changes or environmental factors. The editorial does the same). The authors do not provide citations for studies that link each chemical cited to autism itself, and the editorial itself is not focused on autism, per se, but on "neurodevelopmental" derailments in general.

4. Look at the original source of information. The source of the articles is an editorial, as noted. But one of these articles also provides a link to an actual research paper. The paper doesn't even address any of the "top 10" chemicals listed but instead is about cigarette smoking. News stories about this study describe it as linking smoking during pregnancy and autism. Yet the study abstract states that they did not identify a link, saying "We found a null association between maternal smoking in pregnancy and ASDs and the possibility of an association with a higher-functioning ASD subgroup was suggested." In other words: No link between smoking and autism. But the headlines and how the articles are written would lead you to believe otherwise.

The one rule of Willingham's that I would question is "Ask a Scientist", not because it's bad advice, but because it's not something most people can easily do. Twitter helps, but only if you're already tied into social networks of scientists and science writers. Again, most people aren't. If you want to connect to these networks, I'd recommend starting out by picking up a copy of The Open Laboratory, an annual anthology of the best science writing on the web. Use that to find scientists who write for the public and whose voice you enjoy. Add them to your social networks, and then add the people that those scientists are spending a lot of time talking to. That's the easiest way to connect with some trustworthy sources. And remember: An expert in one subject is not the same thing as an expert in every subject. It doesn't make sense to ask a mechanical engineer for their opinion on cancer treatments. It doesn't make sense to ask an oncologist about building better engines.

Read the rest of Emily Willingham's post on reading science news.

Buy The Open Laboratory 2010 (the 2011 edition hasn't been published yet).

Indian skeptic charged with "blasphemy" for revealing secret behind "miracle" of weeping cross

Sanal Edamaruku, an Indian skeptic, went to Mumbai and revealed that a "miraculous" weeping cross was really just a bit of statuary located near a leaky drain whose liquid reached it by way of capillary action. The local Catholic Church demanded that he retract his statements, and when he refused, they had him arrested for blasphemy.

On 10th March, Sanal Edamaruku, President of the Rationalist International, flew to Mumbai. The TV channel TV-9 had invited him to investigate a “miracle” that caused local excitement. He went with the TV team to Irla in Vile Parle to inspect the crucifix standing there in front of the Church of Our Lady of Velankanni. This crucifix had become the centre of attraction for an ever-growing crowd of believers coming from far and wide. The news of the miracle spread like wildfire. For some days, there were little droplets of water trickling from Jesus’ feet. Hundreds of people came every day to pray and collect some of the “holy water” in bottles and vessels. Sanal Edamaruku identified the source of the water (a drain near a washing room) and the mechanism by which it reached Jesus’ feet (capillary action). The local church leaders, present during his investigation, appeared to be displeased.

Some hours later, in a live program on TV-9, Sanal explained his findings and accused the concerned Catholic Church officials of miracle mongering, as they were beating the big drum for the dripping Jesus statue with aggressive PR measures and by distributing photographs certifying the “miracle”. A heated debate began, in which the five church people, among them Fr. Augustine Palett, the priest of Our Lady of Velankanni church, and representatives of the Association of Concerned Catholics (AOCC) demanded that Sanal apologize. But Sanal refused and argued against them. [The whole TV program is recorded. You can watch an abridged version of it on YouTube.]

When they saw that Sanal refused to bow to their demands, they threatened to file a blasphemy case against him. And they did. Yesterday (10th April, 2012), Sanal received a phone call from a police official of Juhu Police Station in Mumbai directing him to come to the station to face the charges and be arrested. He also said that FIRs had been filed in Andheri and some other police stations u/s 295 of the Indian Penal Code on allegations of hurting the religious sentiments of a particular community. Mumbai police announced that they were out to arrest him. It is apprehended that he could be arrested at any moment.

Letter from Sanal Edamaruku defence committee (via /.)

A neat finding about pseudonymous commenters—and why you should question it

Here's some interesting data that I would like to believe is true—mainly because it matches up with what I've experienced here at BoingBoing. Many of you use some kind of pseudonym in the comments, whether it's first-name-only, an Internet handle, or a completely fake name. My experience here has taught me that, despite this, you all are perfectly capable of writing fascinating, informative, worthwhile comments and having good discussions that add to the usefulness of the original post. (That doesn't always happen, as I'm sure Antinous will attest. But it happens often enough that I talk y'all up to other journalists and bloggers who are nervous about having a comments section on their site.)

After an analysis of 500,000 comments, Disqus now says that pseudonymous commenters are the most prolific commenters—and that the quality of their comments is actually a little better than the quality of comments from people who logged in through Facebook, using their real names.

If this is correct, it's pretty cool. It might not be correct, though. So do think about that before you start touting this as absolute fact in the #nymwars. For instance, the key measure of quality here is whether or not a post generates "likes" and replies, and, if so, how many. Another thing I've learned from watching the comments on BoingBoing: Likes and replies are not necessarily indicative of actual quality. Likewise, the measures that branded a post as "low quality" seem designed to address only the worst of the worst: comments that get flagged, deleted, or marked as spam. There's a lot of room left over for comments that are low quality, but not outright trolling/spam.

Another issue: "Real identity," in this case, means "logged in through Facebook." I can think of several of you, off the top of my head, who I know use real names in the comments, but don't log in through a social media site.

Finally, I can't find anything about where the 500,000 comments were pulled from. Depending on the site(s), this may or may not be a representative sample. After all, the site you're posting on—what the content is, what the community is like, how well moderated it is—probably does a lot to influence how you behave there.

So, basically, what I'm saying is this: Disqus has published an infographic confirming my personal beliefs. Hooray! The problem is, I don't really feel like I can trust it.

Image: jack masque, a Creative Commons Attribution (2.0) image from speculummundi's photostream

Cloning a wooly mammoth: Harder than it sounds

A Japanese research team has begun the process of cloning a wooly mammoth and thinks it can pull off the job in 5 years. Discover magazine is skeptical. Maggie

Scooby-Doo is Veggie Tales for secular humanists

At Comics Alliance, Chris Sims makes such a good argument that I can only gape and think, "Oh my god, why had I never noticed this before?"

Because that's the thing about Scooby-Doo: The bad guys in every episode aren't monsters, they're liars.

I can't imagine how scandalized those critics who were relieved to have something that was mild enough to not excite their kids would've been if they'd stopped for a second and realized what was actually going on. The very first rule of Scooby-Doo, the single premise that sits at the heart of their adventures, is that the world is full of grown-ups who lie to kids, and that it's up to those kids to figure out what those lies are and call them on it, even if there are other adults who believe those lies with every fiber of their being. And the way that you win isn't through supernatural powers, or even through fighting. The way that you win is by doing the most dangerous thing that any person being lied to by someone in power can do: You think.

But it's not just that the crooks in Scooby-Doo are liars; nobody ever shows up to bilk someone out of their life savings by pretending to be a Nigerian prince or something. It's always phantasms and Frankensteins, and there's a very good reason for that. The bad guys in Scooby-Doo prey on superstition, because that's the one thing that an otherwise rational person doesn't really think through. It's based on belief, not evidence, which is a crucial element for the show. If, for example, someone knocks on your door and claims to be a police officer, you're going to want to see a badge because that's the tangible evidence that you've come to expect to prove their claim. If, however, you hold the belief that the old run-down theater has a phantom in the basement, then the existence of that phantom himself -- or at least a reasonably convincing costume -- is all the evidence that you need to believe that you were right all along. The bad guys are just reinforcing a belief that the other characters already have, and that they don't need any evidence for, because it's based in superstition, not reason.

... To paraphrase G.K. Chesterton, Scooby Doo has value not because it shows us that there are monsters, but because it shows us that those monsters are just the products of evil people who want to make us too afraid to see through their lies, and goes a step further by giving us a blueprint that shows exactly how to defeat them.

Via Chad Towle

Why you should be skeptical of evolutionary psychology

Using the attractiveness of waist-to-hip ratio as an example, psychologist and blogger Sabrina Golonka explains why you have to be skeptical when someone declares a psychological finding to be a universal human truth. It's not universal if it doesn't cross cultures. But we don't have great cross-cultural psychology data, and, where the data does exist, it suggests that things we assume must be true across cultures often are not. Maggie

Coffee: An antidepressant and religion preventative?

A recently published study found a correlation between higher rates of coffee drinking in women and decreased risk of depression. Naturally, that finding made headlines. But blogger Scicurious has a really nice analysis of the paper that picked up a significant flaw in the way the data is being interpreted. There was a correlation between drinking more coffee and a lowered risk of depression. But that wasn't the only correlation the researchers found—just the only correlation they made a big deal of in their conclusions.

On her blog, Scicurious lists the other correlations and explains why it's hard to draw any solid conclusion from this data set:

1) Smoking. The interaction between depression risk, smoking, and coffee consumption was “marginally” significant (p=0.06), but they dismiss it as being due to chance because it was “unexpected”. Um. Wait. Nicotine is a STIMULANT. It is known to have antidepressant like effects in animal models (though the withdrawal is no fun). This is not unexpected.

2) Drinking: heavy coffee drinkers drink more. But note that they don’t say that drinking coffee puts you at risk for drinking alcohol.

3) Obesity: heavy coffee drinkers are, on average, thinner, but not more physically active. They do not conclude that coffee drinking prevents obesity.

4) Church going: heavy coffee drinkers are less likely to go to church. Less likely to go to church, less likely to develop depression…heck, forget depression, maybe coffee prevents religion now! Now THAT would be a heck of a finding.

Here’s the thing. I do believe that high coffee consumption correlates with decreased risk of depression. But a lot of other things do as well. I am not convinced that the high coffee consumption wasn’t part of a lifestyle that correlated with decreased risk of depression, maybe they have stronger support networks or less incidence of depression in the family. It could be many other things.
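Scicurious's confounding worry is easy to make concrete with a toy simulation (entirely hypothetical numbers, not the study's data): if some hidden lifestyle factor drives both coffee consumption and depression risk, heavy coffee drinkers will show less depression even when coffee itself does nothing.

```python
import random

random.seed(0)

# Toy model: a hidden "lifestyle" factor influences BOTH coffee
# consumption and depression risk. Coffee has no causal effect here.
n = 10_000
coffee = []
depressed = []
for _ in range(n):
    lifestyle = random.random()                       # hidden confounder, 0..1
    cups = 1 + 4 * lifestyle + random.gauss(0, 0.5)   # more "lifestyle" -> more coffee
    p_depression = 0.25 - 0.15 * lifestyle            # more "lifestyle" -> less depression
    coffee.append(cups)
    depressed.append(random.random() < p_depression)

# Compare depression rates in the bottom and top halves by coffee intake.
pairs = sorted(zip(coffee, depressed))
half = n // 2
low_rate = sum(d for _, d in pairs[:half]) / half
high_rate = sum(d for _, d in pairs[half:]) / half
print(f"depression rate, low-coffee half:  {low_rate:.3f}")
print(f"depression rate, high-coffee half: {high_rate:.3f}")
# Heavy coffee drinkers come out less depressed even though coffee does
# nothing in this model -- the lifestyle factor explains both variables.
```

The correlation is real; the causal story is not. That's exactly why the smoking, drinking, obesity, and church-going correlations in the same data set matter.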

Image: Coffee, a Creative Commons Attribution (2.0) image from dyobmit's photostream

Three common mistakes in medical journalism

I love Gary Schwitzer, a former journalism professor at the University of Minnesota and a key advocate for better health and medical reporting at HealthNewsReview.org. Schwitzer has a quick list of the most common mistakes reporters make when writing about medical science, and I think it's something that everybody should take a look at.

Why does this bit of journalism inside-baseball matter to you? Simple. If you know how journalists are most likely to screw up, you'll be less likely to be led astray by those mistakes. And that matters a lot, especially when it comes to health science, where people are likely to make important decisions based partly on what they read in the media.

The three mistakes:

Absolute versus relative risk/benefit data

Many stories use relative risk reduction or benefit estimates without providing the absolute data. So, in other words, a drug is said to reduce the risk of hip fracture by 50% (relative risk reduction), without ever explaining that it’s a reduction from 2 fractures in 100 untreated women down to 1 fracture in 100 treated women. Yes, that’s 50%, but in order to understand the true scope of the potential benefit, people need to know that it’s only a 1% absolute risk reduction (and that all the other 99 who didn’t benefit still had to pay and still ran the risk of side effects).
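The arithmetic behind that first mistake can be sketched in a few lines, using Schwitzer's hip-fracture numbers. (The "number needed to treat" figure at the end is a standard epidemiology statistic I've added for illustration; it isn't in his excerpt.)

```python
def risk_reductions(events_control, n_control, events_treated, n_treated):
    """Return (relative, absolute) risk reduction as fractions."""
    risk_control = events_control / n_control
    risk_treated = events_treated / n_treated
    absolute = risk_control - risk_treated
    relative = absolute / risk_control
    return relative, absolute

# Schwitzer's example: 2 fractures per 100 untreated women,
# 1 fracture per 100 treated women.
rel, abs_ = risk_reductions(2, 100, 1, 100)
print(f"relative risk reduction: {rel:.0%}")   # 50%
print(f"absolute risk reduction: {abs_:.0%}")  # 1%

# Number needed to treat: how many people must take the drug
# for one of them to benefit.
nnt = 1 / abs_
print(f"number needed to treat: {nnt:.0f}")    # 100
```

Same data, wildly different headlines: "cuts risk in half" versus "helps 1 woman in 100."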

Association does not equal causation

A second key observation is that journalists often fail to explain the inherent limitations in observational studies – especially that they cannot establish cause and effect. They can point to a strong statistical association but they can’t prove that A causes B, or that if you do A you’ll be protected from B. But over and over we see news stories suggesting causal links. They use active verbs that inaccurately suggest established benefits.

How we discuss screening tests

The third recurring problem I see in health news stories involves screening tests. ... “Screening,” I believe, should only be used to refer to looking for problems in people who don’t have signs or symptoms or a family history. So it’s like going into Yankee Stadium filled with 50,000 people about whom you know very little and looking for disease in all of them. ... I have heard women with breast cancer argue, for example, that mammograms saved their lives because they were found to have cancer just as their mothers did. I think that using “screening” in this context distorts the discussion because such a woman was obviously at higher risk because of her family history. She’s not just one of the 50,000 in the general population in the stadium. There were special reasons to look more closely in her. There may not be reasons to look more closely in the 49,999 others.

Via The Knight Science Journalism Tracker

How do we know that the moon isn't cheese?

Sean Carroll explains why there are some ideas science doesn't have to test in order to know that they're ridiculous. (Via Bora Zivkovic.) Maggie