
Green women, fat livers, and the cultural side of disease

Once upon a time, there was apparently a disease called chlorosis. (There is, still, a plant disease of the same name, but we're talking about human chlorosis, here.) It existed in young women from the U.S. and Europe. It turned their skin green. The diagnosed cause: Excessive virginity. Prescription: A husband and, for best results, babies.

The thing with chlorosis is that the actual biological parts of it — the green skin — really did exist. It was the culturally influenced medical interpretation that was all off. In 1936, researchers proved it was actually just a type of anemia — an iron deficiency that could happen in males and females. The greenish tinge to the skin happened because the red blood cells were suddenly a lot less red.

Medicine isn't just anatomy and biology. It's also how we culturally interpret the importance and meaning of what we see in anatomy and biology. That's the point made by Druin Burch in a really interesting piece at Slate.com, where he compares chlorosis to a modern scourge — fatty liver disease.

Fatty liver disease affects up to a quarter of us. Its harms—a significantly increased risk of death among them—are taken seriously by hepatologists and other doctors. But it may not be a real disease at all ... Those with fatty liver disease won't know for certain they have the disease without a scan, be it ultrasound or some other modality. Usually fatty liver disease causes no symptoms. Yet those who have it are more likely to suffer heart attacks and strokes, more likely to develop liver cirrhosis, more likely to have high blood pressure and diabetes. Their health is improved from lowering their blood pressure and cholesterol levels, from dieting and exercising, and even (if they're particularly obese) from having a gastric bypass or similar surgery to help them lose weight.

The problem comes into focus when you realise these same hazards and recommendations can be invoked for any other manifestation of being overweight. Take fatty elbow disease. As far as I'm aware, I'm the first to describe it, but I think it could take off. It's associated with being overweight and underactive and it carries with it the same range of real risks. Sufferers are often asymptomatic, unaware of their illness, although I admit that it can be picked up without much use of an MRI scanner. Shortly I'll be writing to the New England Journal of Medicine to expose the problem. I'll demand action to raise the profile of fatty elbow disease, with programs to screen elbows nationwide and make patients aware of their affliction. I'll accept lucrative posts advising drug companies and seek out a celebrity patient or two. I'll attend so many lavish conference dinners I may develop the disease myself.

Read the rest

Image: Sean & Sarah, a Creative Commons Attribution (2.0) image from duncanh1's photostream

The natural history of the European werewolf

Where did the European werewolf come from and why did this particular mythology become so powerful that we're still telling stories about it today?

In a fascinating talk recorded at Skepticon 5 last month, Deborah Hyde discusses the history of lycanthropy and its various roles in European society. Lycanthropy was more than one thing, Hyde explains. It functioned as a legitimate medical diagnosis — usually denoting some kind of psychotic break. It served as a placeholder to explain anything particularly horrific — like the case of a French serial killer. And, probably most importantly, lycanthropy went hand-in-hand with witchcraft as part of the Inquisition.

Hyde is the editor of The Skeptic magazine and she blogs about the cultural history of belief in the supernatural. As part of this talk, she's tracked down cases of werewolf trials in the 16th and 17th centuries and attempted to understand why people were charged with lycanthropy, what connected those cases to one another, and the role the trials played in the history of religious liberty. Great stuff!

Read Deborah Hyde's blog

What it's like to be a journalist in China

In Foreign Policy magazine Eveline Chao writes a fascinating, insider account of working with Chinese censors and trying to do the job of a journalist in a place where your entire staff can be fired for the crime of accidentally having a Taiwanese flag in the background of a photograph.

Every legally registered publication in China is subject to review by a censor, sometimes several. Some expat publications have entire teams of censors scouring their otherwise innocuous restaurant reviews and bar write-ups for, depending on one's opinion of foreigners, accidental or coded allusions to sensitive topics. For example, That's Shanghai magazine once had to strike the number 64 from a short, unrelated article because their censors believed it might be read as an oblique reference to June 4, 1989, when the Chinese government bloodily suppressed a pro-democracy movement in Tiananmen Square. Many Chinese-run publications have no censor at all, but their editors are relied upon to know where the line falls -- i.e., to self-censor.

... Business content is not censored as strictly as other areas in China, since it seems to be understood that greater openness is needed to push the economy forward and it doesn't necessarily deal with the political issues Chinese rulers seem to find the most sensitive. English-language content isn't censored as much either, since only a small fraction of the Chinese population reads English. (As foreigners reporting on non-sensitive subjects in English, we could worry much less about the dangers -- threats, beatings, jail time -- that occasionally befall muckraking Chinese journalists.) And, in the beginning, most of Snow's edits were minor enough that we didn't feel compromised. We couldn't say that a businessperson came back to China from the United States after "Tiananmen," but we could say "June 1989," knowing that our readers knew the significance of the month. We couldn't say "the Cultural Revolution" but could write "the late 1960s and early 1970s," to allude to then Communist Party chairman Mao Zedong launching his disastrous campaign that sent millions of intellectuals to the countryside. Writing that a company planned to expand into "foreign markets like Taiwan and Korea" was forbidden because it suggested that Taiwan was a separate country from China, but we could say "overseas markets," since, according to Snow, Taiwan literally is over a body of water from the mainland.

Read the full story at Foreign Policy

Via Marilyn Terrell

Fraud, failure, and FUBAR in science

Here's an issue we don't talk about enough. Every year, peer-reviewed research journals publish hundreds of thousands of scientific papers. But every year, several hundred of those are retracted — essentially, unpublished. There are a number of reasons retraction happens. Sometimes, the researchers (or another group of scientists) will notice honest mistakes. Sometimes, other people will prove that the paper's results were totally wrong. And sometimes, scientists misbehave, plagiarizing their own work, plagiarizing others, or engaging in outright fraud. Officially, fraud only accounts for a small proportion of all retractions. But the number of annual retractions is growing, fast. And there's good reason to think that fraud plays a bigger role in science than we like to think. In fact, a study published a couple of weeks ago found that misconduct was behind three-quarters of all retracted papers. Meanwhile, previous research has shown that, while only about 0.02% of all papers are retracted, 1-2% of scientists admit to having invented, fudged, or manipulated data at least once in their careers.

The trouble is that dealing with this isn't as simple as uncovering a shadowy conspiracy or two. That's not really the kind of misconduct we're talking about here.

Read the rest

Meet NASA's apocalypse expert

OK, I know that I promised to never post anything ever again about a certain hypothetical disaster that rhymes with Schmapocalypse MiffyMelve, but hear me out. This really isn't about that. Instead, I want to highlight an excellent profile of a scientist whose work and interactions with the public have been affected by that unnamed bit of urban mythology.

David Morrison is a 72-year-old senior scientist at NASA's Ames Research Center. He runs NASA's "Ask an Astrobiologist" column, and considers it his way of following in the footsteps of Carl Sagan. In this story, written by Dan Duray at The Awl, we learn about Morrison's deep commitment to communicating science to the public ... a commitment that has led him to spend the last eight years answering an increasingly heavy flood of letters about the end of the world. It's an interesting look at the effects pop culture has on real people.

The questions that Dr. Morrison receives circle around a surprisingly cohesive set of theories, each grounded in some kind of real science that then veers off in a wild direction ... It's possible that many of the people who write to Dr. Morrison are trolls, or have Kindle books to sell, or want to garner enough YouTube views to merit an ad before their videos (some of the "Nibiru exposed" videos now feature a pre-roll for the conspiracy movie Branded). But his younger questioners certainly aren't faking it. He read me some of the more serious emails over the phone:

"I know that everyone has been asking you the same question but how do I know the world is not going to end by a planet or a flood or something? I'm scared because I'm in 10th grade and I have a full life ahead of me so PLEASE I WOULD REALLY LIKE AN ANSWER TO MY QUESTION."

"I am really scared about the end of the world on 21 December. I'm headed into 7th grade and I am very scared. I hear you work for the government and I don't know what to do. Can someone help me? I can't sleep, I am crying every day, I can't eat, I stay in my room, I go to a councilor, it helps, but not with this problem. Can someone help me?"

It's not all serious business, though. In one of the funnier moments, a 72-year-old man tries to figure out how to deal with YouTube commenters accusing him of being a secret Lizard Person.

Read the full profile at The Awl

Image: Apocalypse, a Creative Commons Attribution No-Derivative-Works (2.0) image from torek's photostream

Why some technologies fail, and others succeed

My second column for the New York Times Magazine went online today. It's about the history of technology and the forces that determine which tools end up in our everyday portfolio and which become fodder for alternate history sci-fi novels.

The key thing to remember: The technologies we use today aren't necessarily the best technologies that were available. We don't really make these decisions logically, based solely on what works best. It's more complicated than that. Technology is shaped by sociocultural forces. And, in turn, it shapes them as well. The best analogy I've come up with to summarize this: The history of technology isn't a straight line. It's more like a ball of snakes fucking. (Sadly, I couldn't figure out a good way to reword this analogy for publication in the Paper of Record.) One of my big examples is the history of the electric car:

There are plenty of reasons Americans should have adopted electric cars long ago. Early E.V.’s were easier to learn to drive than their gas cousins, and they were far cleaner and better smelling. Their battery range and speed were limited, but a vast majority of the trips we take in our cars are short ones. Most of the driving we do has been well within the range of electric-car batteries for decades, says David Kirsch, associate professor of management at the University of Maryland and the author of “The Electric Vehicle and the Burden of History.” We drive gas-powered cars today for a complex set of reasons, Kirsch says, but not because the internal-combustion engine is inherently better than the electric motor and battery.

Read the rest

Why the fedora grosses out geekdom

The fedora draws increasing controversy in internet circles. In just one hour I found no fewer than three Tumblrs devoted to shaming people who wear the creased, curve-brimmed hat—formal with a touch of classic dandy—and the censure is interestingly specific. The targets are usually men.

Read the rest

The champagne of national unity

According to a survey of 200,000 Americans, Miller High Life is the most bi-partisan of beers. Republicans favor Samuel Adams and, apparently, there are a lot of Democrats drinking Heineken. (Although one might argue that these results are heavily skewed, as the survey did not include either microbrews or microparties. God only knows what the Libertarians are drinking.) There's a chart. Yay, charts! (Via Kevin Zelnio) Maggie

Black American sign language and American sign language are different languages

I've been fascinated by the history and development of sign language for a while now. Closely linked to local Deaf cultures, individual sign languages have deep roots in the home-made systems people came up with in order to communicate with one another and with their families at times when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British and American sign language aren't at all the same thing. English is spoken in both countries, but the cultural history that gave birth to sign was sufficiently different to produce two completely different, mutually unintelligible languages. (Meanwhile, American sign language is much closer to French Sign Language, because it has roots in a system imported from France in the 19th century.)

In that case, it was physical distance that led to the development of two different sign languages. But, within the United States, the same thing happened because of social distance. Turns out, there is a Black American sign language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read, rather than sign — because it was thought to be better. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing. By the late 1960s, the two systems were almost completely different languages.

Carolyn McCaskill remembers exactly when she discovered that she couldn’t understand white people. It was 1968, she was 15 years old, and she and nine other deaf black students had just enrolled in an integrated school for the deaf in Talledega, Ala.

... The teacher’s quicksilver hand movements looked little like the sign language McCaskill had grown up using at home with her two deaf siblings and had practiced at the Alabama School for the Negro Deaf and Blind, just a few miles away. It wasn’t a simple matter of people at the new school using unfamiliar vocabulary; they made hand movements for everyday words that looked foreign to McCaskill and her fellow black students.

...So, McCaskill says, “I put my signs aside.” She learned entirely new signs for such common nouns as “shoe” and “school.” She began to communicate words such as “why” and “don’t know” with one hand instead of two as she and her black friends had always done. She copied the white students who lowered their hands to make the signs for “what for” and “know” closer to their chins than to their foreheads. And she imitated the way white students mouthed words at the same time as they made manual signs for them.

Read the full story at The Washington Post

PREVIOUSLY
Martha's Vineyard: Birthplace of American Deaf Culture
What the invention of Nicaraguan sign language teaches us about the human brain
How To: Spell with your fingers in different languages
CWA: Your language is your worldview
The sign language of science
Learn the sign language of physics, male genitalia

Via Stan Carey

Death on Mount Everest

Back in May, we linked you to the reporting of Outside's Grayson Schaffer, who was stationed in the base camps of Mount Everest, watching as the mountain's third deadliest spring in recorded history unfolded. Ten climbers died during April and May. But the question is, why?

From a technological standpoint, as Schaffer points out in a follow-up piece, Everest ought to be safer these days. Since 1996 — the mountain's deadliest year, documented in Jon Krakauer's Into Thin Air — weather forecasts have improved (allowing climbers to avoid storms like the one responsible for many of the 1996 deaths), and new helicopters can reach stranded climbers at higher altitudes. But those things, Schaffer argues, are about reducing deaths related to disasters. This year, he writes, the deaths that happened on Everest weren't about freak occurrences of bad luck. It wasn't storms or avalanches that took those people down. It wasn't, in other words, about the random risks of nature.

This matters because it points to a new status quo on Everest: the routinization of high-altitude death. By and large, the people running the show these days on the south side of Everest—the professional guides, climbing Sherpas, and Nepali officials who control permits—do an excellent job of getting climbers to the top and down again. Indeed, a week after this year’s blowup, another hundred people summited on a single bluebird day, without a single death or serious injury.

But that doesn’t mean Everest is being run rationally. There are no prerequisites for how much experience would-be climbers must have and no rules to say who can be an outfitter. Many of the best alpinists in the world still show up in Base Camp every spring. But, increasingly, so do untrained, unfit people who’ve decided to try their hand at climbing and believe that Everest is the most exciting place to start. And while some of the more established outfitters might turn them away, novices are actively courted by cut-rate start-up companies that aren’t about to refuse the cash.

It’s a recipe that doesn’t require a storm to kill people. In this regard, things are much different now than in the past: they’re worse.

Read the rest at Outside

Image via Outside and photographer Rob Sobecki

The grisly business of buffalo bones

By this point in your lives, most of you are no doubt aware of the massive slaughter of buffalo that happened in the United States in the late 19th century. Across the plains, thousands of buffalo were killed every week during a brief period when the hides of these animals could fetch upwards of $10 a pop. (The Bureau of Labor Statistics inflation calculator only goes back to 1913, so it's hard for me to say what that's worth today. But we know from the context that even when the value of buffalo hides dropped to $1 each, the business of killing and skinning buffalo was still considered a damned fine living.)

You might think that the business ended there, with dead, skinned buffalo left to rot on the prairie. And you're sort of right. But, in a story at Bloomberg News, Tim Heffernan explains that, a few years later, those dead buffalo created another boom and bust industry—the bone collection business.

Animal bones were useful things in the 19th century. Dried and charred, they produced a substance called bone black. When coarsely crushed, it could filter impurities out of sugar-cane juice, leaving a clear liquid that evaporated to produce pure white sugar -- a lucrative industry. Bone black also made a useful pigment for paints, dyes and cosmetics, and acted as a dry lubricant for iron and steel forgings.

... And so the homesteaders gathered the buffalo bones. It was easy work: Children could do it. Carted to town, a ton of bones fetched a few dollars. Sent to rendering plants and furnaces in the big industrial cities, that same ton was worth between $18 and $27. Boiled, charred, crushed or powdered, it was worth as much as $60.

... By the 1880s, however, a few reporters were expressing nervous awe at the scale of the cleansing, and even despair for what had been lost. In 1891, not 25 years after the slaughter began, the Chicago Daily Tribune ran a dispatch titled “Relics of the Buffalo.” The relics were the animals’ empty pathways and dust wallows, worn into the surface of the Manitoba plains over countless years. The bones, let alone the living creatures, were long gone.

Read the rest

David Byrne's How Music Works

Former Talking Heads frontman and all-round happy mutant David Byrne has written several good books, but his latest, How Music Works, is unquestionably the best of the very good bunch, possibly the book he was born to write. I could make a good case for calling this How Art Works or even How Everything Works.

Though there is plenty of autobiographical material in How Music Works that will delight avid fans (like me) -- inside dope on the creative, commercial and personal pressures that led to each of Byrne's projects -- this isn't merely the story of how Byrne made it, or what he does to turn out such great and varied art. Rather, this is an insightful, thorough, and convincing account of the way that creativity, culture, biology and economics interact to prefigure, constrain and uplift art. It's a compelling story about the way that art comes out of technology, and as such, it's widely applicable beyond music.

Byrne lived through an important transition in the music industry: having gotten his start in the analog recording world, he skilfully managed a transition to an artist in the digital era (though not always a digital artist). As such, he has real gut-feel for the things that technology gives to artists and the things that technology takes away. He's like the kids who got their Apple ][+s in 1979, and keenly remember the time before computers were available to kids at all, the time when they were the exclusive domain of obsessive geeks, and the point at which they became widely exciting, and finally, ubiquitous -- a breadth of experience that offers visceral perspective.

There were so many times in this book when I felt like Byrne's observations extended beyond music and dance and into other forms of digital creativity. For example, when Byrne recounted his first experiments with a cellular automata exercise for dance choreography, from his collaboration with Noemie Lafrance:

1. Improvise moving to the music and come up with an eight-count phrase (in dance, a phrase is a short series of moves that can be repeated).

2. When you find a phrase you like, loop (repeat) it.

3. When you see someone else with a stronger phrase, copy it.

4. When everyone is doing the same phrase, the exercise is over.

It was like watching evolution on fast-forward, or an emergent lifeform coming into being. At first the room was chaos, writhing bodies everywhere. Then one could see that folks had chosen their phrases, and almost immediately one could see a pocket of dancers who had all adopted the same phrase. The copying had already begun, albeit in just one area. This pocket of copying began to expand, to go viral, while yet another one now emerged on the other side of the room. One clump grew faster than the other, and within four minutes the whole room was filled with dancers moving in perfect unison. Unbelievable! It only took four minutes for this evolutionary process to kick in, and for the "strongest" (unfortunate word, maybe) to dominate.
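Byrne's four rules really do read like a textbook cellular automaton, and the clumping-then-takeover dynamic he describes falls out of even a very crude simulation. Here's a minimal sketch -- not from the book; the number of dancers, the neighbourhood size, and the numeric "strength" scores are all made-up stand-ins -- in which each simulated dancer repeatedly copies the strongest phrase among their immediate neighbours:

```python
import random

def simulate_dance(n_dancers=30, seed=0):
    """Toy cellular-automaton version of the copying exercise: dancers
    stand in a ring, each starts with a unique phrase, and everyone
    adopts the strongest phrase they can see among their neighbours."""
    rng = random.Random(seed)
    # Each dancer begins with their own phrase; 'strength' stands in for
    # whatever makes a phrase compelling enough to be copied.
    phrases = list(range(n_dancers))
    strength = {p: rng.random() for p in phrases}

    step = 0
    while len(set(phrases)) > 1:
        new_phrases = phrases[:]
        for i in range(n_dancers):
            # Look at yourself and your two immediate neighbours
            # (wrapping around the room) and copy the strongest phrase.
            neighbourhood = [phrases[(i - 1) % n_dancers],
                             phrases[i],
                             phrases[(i + 1) % n_dancers]]
            new_phrases[i] = max(neighbourhood, key=strength.get)
        phrases = new_phrases
        step += 1
        print(f"step {step}: {len(set(phrases))} distinct phrases remain")
    return step

if __name__ == "__main__":
    simulate_dance()
```

Run it and you see roughly the pattern Byrne describes: local pockets of agreement appear, spread, and eventually one phrase takes over the whole room.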

Read the rest

Meet "Big Trash"

Over the long run, keeping stuff like tree limbs and compostable waste out of landfills is good for cities. There's only so much space in a landfill and getting more land is extremely expensive. So why haven't more cities hopped on the curbside composting bandwagon, or at least banned yard waste from landfills? There are probably a lot of factors that go into those decisions, but one, apparently, is the influence of large, private companies that handle waste collection and see the diversion of re-usable waste as a detriment to their income. (Via Chris Tackett) Maggie

The best cat video on the Internet

That is a high claim, I know. But over Labor Day weekend, a combination of dedicated curation and popular vote resulted in Henri 2, Paw de Deux being named the best Internet cat video.

The Internet Cat Film Festival, sponsored by Minneapolis' Walker Art Center, drew a live audience of more than 10,000 people last Thursday night. Videos were curated from a massive collection submitted online, and were grouped into thematic categories — foreign films, for instance, or comedies. Henri 2 took home the Golden Kitty, a People's Choice award.

Bonus: If arguing about the merits of Henri 2 weren't enough of a gift to your procrastination tendencies, you can also check out a full list of all the films screened at the festival, including links.

Better services, less piracy

John Brownlee on why he stopped pirating music:

It’s clear to me, in retrospect, that my piracy was mostly mere collecting, and like the most fetishistic of collectors, it was conducted with mindless voracity. A good collection is supposed to be made up of relics, items that conjure up memories, feelings and ideas for the owner so strongly that he gets pleasure in simply being in close contact with them. A tended garden. My collection was nothing like this: it was just a red weed, swallowing up and corroding anything I did care about within its indiscriminating mass.

tl;dr newer streaming/subscription services, such as Spotify and Rdio, have nailed it.