What your New Year's Resolutions tell us about the way you think


It's a little late, but I kind of love these 2013 props made by PaperandPancakes on Etsy.

How did you write your New Year's resolutions? I don't mean, like, the tools you used — pencil and paper vs. tablet and bluetooth keyboard. What I'm talking about is how you put the goals into words — how you described what it was you wanted to do.

There's more than one way to make a resolution.

A couple of weeks ago, I ran across a great example of this in an old sociology paper from 1977. Researchers had collected New Year's resolutions from two groups of 6th graders — one of average, middle-class kids, and another made up of Amish and Mennonites.

The researchers had set out to study gender differences. They were trying to figure out how different cultural backgrounds affected behavior that we tend to associate with one gender or another. But in that data, they noticed something odd, something they couldn't easily translate into statistics. The Amish kids' resolutions were different from those of the "normal" children.

Read the rest

What you can learn from the million-dollar tuna

On Saturday, a bluefin tuna was sold at Tokyo's Tsukiji fish market tuna auction for $1.76 million. Which is a little crazy. (Also crazy, the size of the fish in question.) But the amount paid for this specimen of a chronically overfished species doesn't really represent simple supply and demand, explains marine biologist Andrew David Thaler. It shouldn't be read as a measurement of tuna scarcity, he says, but rather as an artifact of culture (and marketing).

How humans evolved to explore

Boldly going where nobody's gone before. In a lot of ways, that idea kind of defines our whole species. We travel. We're curious. We poke our noses around the planet to find new places to live. We're compelled to explore places few people would ever actually want to live. We push ourselves into space.

This behavior isn't totally unique. But it is remarkable. So we have to ask: Is there a genetic, evolution-driven cause behind the restlessness of humanity?

At National Geographic, David Dobbs has an amazing long read digging into that idea. The story is fascinating, stretching from Polynesian sailors to Quebecois settlers. And it's very, very good science writing. Dobbs resists the urge to go for easy "here is the gene that does this" answers. Instead, he helps us see the complex web of genetics and culture that influences and encourages certain behaviors at certain times. It's a great read.

Not all of us ache to ride a rocket or sail the infinite sea. Yet as a species we’re curious enough, and intrigued enough by the prospect, to help pay for the trip and cheer at the voyagers’ return. Yes, we explore to find a better place to live or acquire a larger territory or make a fortune. But we also explore simply to discover what’s there.

“No other mammal moves around like we do,” says Svante Pääbo, a director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, where he uses genetics to study human origins. “We jump borders. We push into new territory even when we have resources where we are. Other animals don’t do this. Other humans either. Neanderthals were around hundreds of thousands of years, but they never spread around the world. In just 50,000 years we covered everything. There’s a kind of madness to it. Sailing out into the ocean, you have no idea what’s on the other side. And now we go to Mars. We never stop. Why?”

Why indeed? Pääbo and other scientists pondering this question are themselves explorers, walking new ground. They know that they might have to backtrack and regroup at any time. They know that any notion about why we explore might soon face revision as their young disciplines—anthropology, genetics, developmental neuropsychology—turn up new fundamentals. Yet for those trying to figure out what makes humans tick, our urge to explore is irresistible terrain. What gives rise to this “madness” to explore? What drove us out from Africa and on to the moon and beyond?

Read the full story

Photos of a simpler time ... in North Korea

Retro DPRK is a blog that collects images of North Korea from the 1950s, 1960s, 1970s, and 1980s. Getting into North Korea from the United States and Western Europe is not easy today. But up until the collapse of the Soviet Union, it was even more difficult. If you weren't also from a Communist country, chances were good that you weren't going to get even a glimpse of the place.

But, at the same time, North Korea was also promoting itself through propaganda, and as a tourist destination for citizens of the USSR. Christopher Graper — who leads tours into North Korea today from Canada — has scanned scenes from postcards and tourism brochures — rare peeks into the little-documented history of a secretive country.

The collection blends familiar scenes that wouldn't look terribly different from American advertisements of the same era with an amusingly odd sensibility (who wouldn't want a whole book of postcards documenting every detail of Pyongyang's new gymnasium?) and quietly disconcerting scenes like the one above, where a seaside resort town appears eerily empty — like a theme park before opening time.

Retro DPRK

Thanks for pointing me toward this, Gidjlet!

Green women, fat livers, and the cultural side of disease

Once upon a time, there was apparently a disease called chlorosis. (There is, still, a plant disease of the same name, but we're talking about human chlorosis here.) It existed in young women from the U.S. and Europe. It turned their skin green. The diagnosed cause: Excessive virginity. The prescription: A husband and, for best results, babies.

The thing with chlorosis is that the actual biological parts of it — the green skin — really did exist. It was the culturally influenced medical interpretation that was all off. In 1936, researchers proved it was actually just a type of anemia — an iron deficiency that could happen in males and females. The greenish tinge to the skin happened because the red blood cells were suddenly a lot less red.

Medicine isn't just anatomy and biology. It's also how we culturally interpret the importance and meaning of what we see in anatomy and biology. That's the point made by Druin Burch in a really interesting piece at Slate.com, where he compares chlorosis to a modern scourge — fatty liver disease.

Fatty liver disease affects up to a quarter of us. Its harms—a significantly increased risk of death among them—are taken seriously by hepatologists and other doctors. But it may not be a real disease at all ... Those with fatty liver disease won't know for certain they have the disease without a scan, be it ultrasound or some other modality. Usually fatty liver disease causes no symptoms. Yet those who have it are more likely to suffer heart attacks and strokes, more likely to develop liver cirrhosis, more likely to have high blood pressure and diabetes. Their health is improved from lowering their blood pressure and cholesterol levels, from dieting and exercising, and even (if they're particularly obese) from having a gastric bypass or similar surgery to help them lose weight.

The problem comes into focus when you realise these same hazards and recommendations can be invoked for any other manifestation of being overweight. Take fatty elbow disease. As far as I'm aware, I'm the first to describe it, but I think it could take off. It's associated with being overweight and underactive and it carries with it the same range of real risks. Sufferers are often asymptomatic, unaware of their illness, although I admit that it can be picked up without much use of an MRI scanner. Shortly I'll be writing to the New England Journal of Medicine to expose the problem. I'll demand action to raise the profile of fatty elbow disease, with programs to screen elbows nationwide and make patients aware of their affliction. I'll accept lucrative posts advising drug companies and seek out a celebrity patient or two. I'll attend so many lavish conference dinners I may develop the disease myself.

Read the rest

Image: Sean & Sarah, a Creative Commons Attribution (2.0) image from duncanh1's photostream

The natural history of the European werewolf

Where did the European werewolf come from and why did this particular mythology become so powerful that we're still telling stories about it today?

In a fascinating talk recorded at Skepticon 5 last month, Deborah Hyde discusses the history of lycanthropy and its various roles in European society. Lycanthropy was more than one thing, Hyde explains. It functioned as a legitimate medical diagnosis — usually denoting some kind of psychotic break. It served as a placeholder to explain anything particularly horrific — like the case of a French serial killer. And, probably most importantly, lycanthropy went hand-in-hand with witchcraft as part of the Inquisition.

Hyde is the editor of The Skeptic magazine and she blogs about the cultural history of belief in the supernatural. As part of this talk, she's tracked down cases of werewolf trials in the 16th and 17th centuries and attempted to understand why people were charged with lycanthropy, what connected those cases to one another, and the role the trials played in the history of religious liberty. Great stuff!

Read Deborah Hyde's blog

What it's like to be a journalist in China

In Foreign Policy magazine, Eveline Chao writes a fascinating insider account of working with Chinese censors and trying to do the job of a journalist in a place where your entire staff can be fired for the crime of accidentally having a Taiwanese flag in the background of a photograph.

Every legally registered publication in China is subject to review by a censor, sometimes several. Some expat publications have entire teams of censors scouring their otherwise innocuous restaurant reviews and bar write-ups for, depending on one's opinion of foreigners, accidental or coded allusions to sensitive topics. For example, That's Shanghai magazine once had to strike the number 64 from a short, unrelated article because their censors believed it might be read as an oblique reference to June 4, 1989, when the Chinese government bloodily suppressed a pro-democracy movement in Tiananmen Square. Many Chinese-run publications have no censor at all, but their editors are relied upon to know where the line falls -- i.e., to self-censor.

... Business content is not censored as strictly as other areas in China, since it seems to be understood that greater openness is needed to push the economy forward and it doesn't necessarily deal with the political issues Chinese rulers seem to find the most sensitive. English-language content isn't censored as much either, since only a small fraction of the Chinese population reads English. (As foreigners reporting on non-sensitive subjects in English, we could worry much less about the dangers -- threats, beatings, jail time -- that occasionally befall muckraking Chinese journalists.) And, in the beginning, most of Snow's edits were minor enough that we didn't feel compromised. We couldn't say that a businessperson came back to China from the United States after "Tiananmen," but we could say "June 1989," knowing that our readers knew the significance of the month. We couldn't say "the Cultural Revolution" but could write "the late 1960s and early 1970s," to allude to then Communist Party chairman Mao Zedong launching his disastrous campaign that sent millions of intellectuals to the countryside. Writing that a company planned to expand into "foreign markets like Taiwan and Korea" was forbidden because it suggested that Taiwan was a separate country from China, but we could say "overseas markets," since, according to Snow, Taiwan literally is over a body of water from the mainland.

Read the full story at Foreign Policy

Via Marilyn Terrell

Fraud, failure, and FUBAR in science

Here's an issue we don't talk about enough. Every year, peer-reviewed research journals publish hundreds of thousands of scientific papers. But every year, several hundred of those are retracted — essentially, unpublished. There are a number of reasons retraction happens. Sometimes, the researchers (or another group of scientists) will notice honest mistakes. Sometimes, other people will prove that the paper's results were totally wrong. And sometimes, scientists misbehave, plagiarizing their own work, plagiarizing others, or engaging in outright fraud. Officially, fraud accounts for only a small proportion of all retractions. But the number of annual retractions is growing, fast. And there's good reason to think that fraud plays a bigger role in science than we'd like to admit. In fact, a study published a couple of weeks ago found misconduct behind three-quarters of all retracted papers. Meanwhile, previous research has shown that, while only about 0.02% of all papers are retracted, 1-2% of scientists admit to having invented, fudged, or manipulated data at least once in their careers.

The trouble is that dealing with this isn't as simple as uncovering a shadowy conspiracy or two. That's not really the kind of misconduct we're talking about here.

Read the rest

Meet NASA's apocalypse expert

OK, I know that I promised to never post anything ever again about a certain hypothetical disaster that rhymes with Schmapocalypse MiffyMelve, but hear me out. This really isn't about that. Instead, I want to highlight an excellent profile of a scientist whose work and interactions with the public have been affected by that unnamed bit of urban mythology.

David Morrison is a 72-year-old senior scientist at NASA's Ames Research Center. He runs NASA's "Ask an Astrobiologist" column, and considers it his way of following in the footsteps of Carl Sagan. In this story, written by Dan Duray at The Awl, we learn about Morrison's deep commitment to communicating science to the public ... a commitment that has led him to spend the last eight years answering an increasingly heavy flood of letters about the end of the world. It's an interesting look at the effects pop culture has on real people.

The questions that Dr. Morrison receives circle around a surprisingly cohesive set of theories, each grounded in some kind of real science that then veers off in a wild direction ... It's possible that many of the people who write to Dr. Morrison are trolls, or have Kindle books to sell, or want to garner enough YouTube views to merit an ad before their videos (some of the "Nibiru exposed" videos now feature a pre-roll for the conspiracy movie Branded). But his younger questioners certainly aren't faking it. He read me some of the more serious emails over the phone:

"I know that everyone has been asking you the same question but how do I know the world is not going to end by a planet or a flood or something? I'm scared because I'm in 10th grade and I have a full life ahead of me so PLEASE I WOULD REALLY LIKE AN ANSWER TO MY QUESTION."

"I am really scared about the end of the world on 21 December. I'm headed into 7th grade and I am very scared. I hear you work for the government and I don't know what to do. Can someone help me? I can't sleep, I am crying every day, I can't eat, I stay in my room, I go to a councilor, it helps, but not with this problem. Can someone help me?"

It's not all serious business, though. In one of the funnier moments, a 72-year-old man tries to figure out how to deal with YouTube commenters accusing him of being a secret Lizard Person.

Read the full profile at The Awl

Image: Apocalypse, a Creative Commons Attribution No-Derivative-Works (2.0) image from torek's photostream

Why some technologies fail, and others succeed

My second column for the New York Times Magazine went online today. It's about the history of technology and the forces that determine which tools end up in our everyday portfolio and which become fodder for alternate history sci-fi novels.

The key thing to remember: The technologies we use today aren't necessarily the best technologies that were available. We don't really make these decisions logically, based solely on what works best. It's more complicated than that. Technology is shaped by sociocultural forces. And, in turn, it shapes them as well. The best analogy I've come up with to summarize this: The history of technology isn't a straight line. It's more like a ball of snakes fucking. (Sadly, I couldn't figure out a good way to reword this analogy for publication in the Paper of Record.) One of my big examples is the history of the electric car:

There are plenty of reasons Americans should have adopted electric cars long ago. Early E.V.’s were easier to learn to drive than their gas cousins, and they were far cleaner and better smelling. Their battery range and speed were limited, but a vast majority of the trips we take in our cars are short ones. Most of the driving we do has been well within the range of electric-car batteries for decades, says David Kirsch, associate professor of management at the University of Maryland and the author of “The Electric Vehicle and the Burden of History.” We drive gas-powered cars today for a complex set of reasons, Kirsch says, but not because the internal-combustion engine is inherently better than the electric motor and battery.

Read the rest

Why the fedora grosses out geekdom

The fedora draws increasing controversy in internet circles. In just one hour, I found no fewer than three Tumblrs devoted to shaming people who wear the creased, curve-brimmed hat—formal with a touch of classic dandy—and the censure is interestingly specific. The targets are usually men.

Read the rest

The champagne of national unity

According to a survey of 200,000 Americans, Miller High Life is the most bipartisan of beers. Republicans favor Samuel Adams and, apparently, there are a lot of Democrats drinking Heineken. (Although one might argue that these results are heavily skewed, as the survey did not include either microbrews or microparties. God only knows what the Libertarians are drinking.) There's a chart. Yay, charts! (Via Kevin Zelnio)

Black American Sign Language and American Sign Language are different languages

I've been fascinated by the history and development of sign language for a while now. Tightly linked to local Deaf cultures, individual sign languages have deep roots in the home-made systems people came up with in order to communicate with one another and with their families, at a time when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British Sign Language and American Sign Language aren't at all the same thing. English is spoken in both countries, but the cultural histories that gave birth to the two sign languages were different enough to produce two completely different, mutually unintelligible languages. (Meanwhile, American Sign Language is much closer to French Sign Language, because it has roots in a system imported from France in the 19th century.)

In that case, it was physical distance that led to the development of two different sign languages. But, within the United States, the same thing happened because of social distance. Turns out, there is a Black American Sign Language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read, rather than sign — because it was thought to be better. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing. By the late 1960s, the two systems had become almost completely different languages.

Carolyn McCaskill remembers exactly when she discovered that she couldn’t understand white people. It was 1968, she was 15 years old, and she and nine other deaf black students had just enrolled in an integrated school for the deaf in Talledega, Ala.

... The teacher’s quicksilver hand movements looked little like the sign language McCaskill had grown up using at home with her two deaf siblings and had practiced at the Alabama School for the Negro Deaf and Blind, just a few miles away. It wasn’t a simple matter of people at the new school using unfamiliar vocabulary; they made hand movements for everyday words that looked foreign to McCaskill and her fellow black students.

...So, McCaskill says, “I put my signs aside.” She learned entirely new signs for such common nouns as “shoe” and “school.” She began to communicate words such as “why” and “don’t know” with one hand instead of two as she and her black friends had always done. She copied the white students who lowered their hands to make the signs for “what for” and “know” closer to their chins than to their foreheads. And she imitated the way white students mouthed words at the same time as they made manual signs for them.

Read the full story at The Washington Post

PREVIOUSLY
Martha's Vineyard: Birthplace of American Deaf Culture
What the invention of Nicaraguan sign language teaches us about the human brain
How To: Spell with your fingers in different languages
CWA: Your language is your worldview
The sign language of science
Learn the sign language of physics, male genitalia

Via Stan Carey

Death on Mount Everest

Back in May, we linked you to the reporting of Outside's Grayson Schaffer, who was stationed in the base camps of Mount Everest, watching as the mountain's third-deadliest spring in recorded history unfolded. Ten climbers died during April and May. But the question is, why?

From a technological standpoint, as Schaffer points out in a follow-up piece, Everest ought to be safer these days. Since 1996 — the mountain's deadliest year, documented in Jon Krakauer's Into Thin Air — weather forecasts have improved (allowing climbers to avoid storms like the one responsible for many of the 1996 deaths), and new helicopters can reach stranded climbers at higher altitudes. But those things, Schaffer argues, are about reducing deaths related to disasters. This year, he writes, the deaths that happened on Everest weren't freak occurrences of bad luck. It wasn't storms or avalanches that took those people down. It wasn't, in other words, about the random risks of nature.

This matters because it points to a new status quo on Everest: the routinization of high-altitude death. By and large, the people running the show these days on the south side of Everest—the professional guides, climbing Sherpas, and Nepali officials who control permits—do an excellent job of getting climbers to the top and down again. Indeed, a week after this year’s blowup, another hundred people summited on a single bluebird day, without a single death or serious injury.

But that doesn’t mean Everest is being run rationally. There are no prerequisites for how much experience would-be climbers must have and no rules to say who can be an outfitter. Many of the best alpinists in the world still show up in Base Camp every spring. But, increasingly, so do untrained, unfit people who’ve decided to try their hand at climbing and believe that Everest is the most exciting place to start. And while some of the more established outfitters might turn them away, novices are actively courted by cut-rate start-up companies that aren’t about to refuse the cash.

It’s a recipe that doesn’t require a storm to kill people. In this regard, things are much different now than in the past: they’re worse.

Read the rest at Outside

Image via Outside and photographer Rob Sobecki

The grisly business of buffalo bones

By this point in your lives, most of you are no doubt aware of the massive slaughter of buffalo that happened in the United States in the late 19th century. Across the plains, thousands of buffalo were killed every week during a brief period when the hides of these animals could fetch upwards of $10 a pop. (The Bureau of Labor Statistics inflation calculator only goes back to 1913, so it's hard for me to say what that's worth today. But we know from the context that even when the value of buffalo hides dropped to $1 each, the business of killing and skinning buffalo was still considered a damned fine living.)

You might think that the business ended there, with dead, skinned buffalo left to rot on the prairie. And you're sort of right. But, in a story at Bloomberg News, Tim Heffernan explains that, a few years later, those dead buffalo created another boom-and-bust industry: the bone collection business.

Animal bones were useful things in the 19th century. Dried and charred, they produced a substance called bone black. When coarsely crushed, it could filter impurities out of sugar-cane juice, leaving a clear liquid that evaporated to produce pure white sugar -- a lucrative industry. Bone black also made a useful pigment for paints, dyes and cosmetics, and acted as a dry lubricant for iron and steel forgings.

... And so the homesteaders gathered the buffalo bones. It was easy work: Children could do it. Carted to town, a ton of bones fetched a few dollars. Sent to rendering plants and furnaces in the big industrial cities, that same ton was worth between $18 and $27. Boiled, charred, crushed or powdered, it was worth as much as $60.

... By the 1880s, however, a few reporters were expressing nervous awe at the scale of the cleansing, and even despair for what had been lost. In 1891, not 25 years after the slaughter began, the Chicago Daily Tribune ran a dispatch titled “Relics of the Buffalo.” The relics were the animals’ empty pathways and dust wallows, worn into the surface of the Manitoba plains over countless years. The bones, let alone the living creatures, were long gone.

Read the rest