It's a little late, but I kind of love these 2013 props made by PaperandPancakes on Etsy.
How did you write your New Year's resolutions? I don't mean, like, the tools you used — pencil and paper vs. tablet and Bluetooth keyboard. What I'm talking about is how you put the goals into words — how you described what it was you wanted to do.
There's more than one way to make a resolution.
A couple of weeks ago, I ran across a great example of this in an old sociology paper from 1977. Researchers had collected New Year's resolutions from two groups of 6th graders — one of average middle class kids, and another group made up of Amish and Mennonites.
The researchers had set out to study gender differences: they were trying to figure out how different cultural backgrounds affected behavior that we tend to associate with one gender or another. But in that data, they noticed something odd, something they couldn't easily translate into statistics. The Amish kids' resolutions were different from those of the "normal" children. Read the rest
On Saturday, a bluefin tuna was sold at Tokyo's Tsukiji fish market tuna auction for $1.76 million. Which is a little crazy. (Also crazy: the size of the fish in question.) But the amount paid for this specimen of a chronically overfished species doesn't really represent simple supply and demand, explains marine biologist Andrew David Thaler. It shouldn't be read as a measurement of tuna scarcity, he says, but rather as an artifact of culture (and marketing). Read the rest
Boldly going where nobody's gone before. In a lot of ways, that idea kind of defines our whole species. We travel. We're curious. We poke our noses around the planet to find new places to live. We're compelled to explore places few people would ever actually want to live. We push ourselves into space.
This behavior isn't totally unique. But it is remarkable. So we have to ask: is there a genetic, evolution-driven cause behind the restlessness of humanity?
At National Geographic, David Dobbs has an amazing long read digging into that idea. The story is fascinating, stretching from Polynesian sailors to Quebecois settlers. And it's very, very good science writing. Dobbs resists the urge to go for easy "here is the gene that does this" answers. Instead, he helps us see the complex web of genetics and culture that influences and encourages certain behaviors at certain times. It's a great read.
Read the rest
Not all of us ache to ride a rocket or sail the infinite sea. Yet as a species we’re curious enough, and intrigued enough by the prospect, to help pay for the trip and cheer at the voyagers’ return. Yes, we explore to find a better place to live or acquire a larger territory or make a fortune. But we also explore simply to discover what’s there.
“No other mammal moves around like we do,” says Svante Pääbo, a director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, where he uses genetics to study human origins.
Retro DPRK is a blog that collects images of North Korea from the 1950s, 1960s, 1970s, and 1980s. Getting into North Korea from the United States and Western Europe is not easy today. But up until the collapse of the Soviet Union, it was even more difficult. If you weren't also from a Communist country, chances were good that you weren't going to get even a glimpse of the place.
But, at the same time, North Korea was also promoting itself through propaganda, and as a tourist destination for citizens of the USSR. Christopher Graper — who leads tours into North Korea today from Canada — has scanned scenes from postcards and tourism brochures — rare peeks into the little-documented history of a secretive country.
The collection blends familiar scenes that wouldn't look terribly different from American advertisements of the same era with an amusingly odd sensibility (who wouldn't want a whole book of postcards documenting every detail of Pyongyang's new gymnasium?) and quietly disconcerting scenes like the one above, where a seaside resort town appears eerily empty — like a theme park before opening time.
Thanks for pointing me toward this, Gidjlet! Read the rest
Once upon a time, there was apparently a disease called chlorosis. (There is, still, a plant disease of the same name, but we're talking about human chlorosis, here.) It existed in young women from the U.S. and Europe. It turned their skin green. The diagnosed cause: Excessive virginity. Prescription: A husband and, for best results, babies.
The thing with chlorosis is that the actual biological parts of it — the green skin — really did exist. It was the culturally influenced medical interpretation that was all off. In 1936, researchers proved it was actually just a type of anemia — an iron deficiency that could happen in males and females. The greenish tinge to the skin happened because the red blood cells were suddenly a lot less red.
Medicine isn't just anatomy and biology. It's also how we culturally interpret the importance and meaning of what we see in anatomy and biology. That's the point made by Druin Burch in a really interesting piece at Slate.com, where he compares chlorosis to a modern scourge — fatty liver disease.
Read the rest
Fatty liver disease affects up to a quarter of us. Its harms—a significantly increased risk of death among them—are taken seriously by hepatologists and other doctors. But it may not be a real disease at all ... Those with fatty liver disease won't know for certain they have the disease without a scan, be it ultrasound or some other modality. Usually fatty liver disease causes no symptoms. Yet those who have it are more likely to suffer heart attacks and strokes, more likely to develop liver cirrhosis, more likely to have high blood pressure and diabetes.
A tour of werewolves in European history — the mad, the bad, and the heretics.
In Foreign Policy magazine Eveline Chao writes a fascinating, insider account of working with Chinese censors and trying to do the job of a journalist in a place where your entire staff can be fired for the crime of accidentally having a Taiwanese flag in the background of a photograph.
Read the rest
Every legally registered publication in China is subject to review by a censor, sometimes several. Some expat publications have entire teams of censors scouring their otherwise innocuous restaurant reviews and bar write-ups for, depending on one's opinion of foreigners, accidental or coded allusions to sensitive topics. For example, That's Shanghai magazine once had to strike the number 64 from a short, unrelated article because their censors believed it might be read as an oblique reference to June 4, 1989, when the Chinese government bloodily suppressed a pro-democracy movement in Tiananmen Square. Many Chinese-run publications have no censor at all, but their editors are relied upon to know where the line falls -- i.e., to self-censor.
... Business content is not censored as strictly as other areas in China, since it seems to be understood that greater openness is needed to push the economy forward and it doesn't necessarily deal with the political issues Chinese rulers seem to find the most sensitive. English-language content isn't censored as much either, since only a small fraction of the Chinese population reads English. (As foreigners reporting on non-sensitive subjects in English, we could worry much less about the dangers -- threats, beatings, jail time -- that occasionally befall muckraking Chinese journalists.) And, in the beginning, most of Snow's edits were minor enough that we didn't feel compromised.
Here's an issue we don't talk about enough. Every year, peer-reviewed research journals publish hundreds of thousands of scientific papers. But every year, several hundred of those are retracted — essentially, unpublished. There are a number of reasons why retraction happens. Sometimes, the researchers (or another group of scientists) will notice honest mistakes. Sometimes, other people will prove that the paper's results were totally wrong. And sometimes, scientists misbehave, plagiarizing their own work, plagiarizing others, or engaging in outright fraud. Officially, fraud only accounts for a small proportion of all retractions. But the number of annual retractions is growing, fast. And there's good reason to think that fraud plays a bigger role in science than we like to think. In fact, a study published a couple of weeks ago found that misconduct was behind three-quarters of all retracted papers. Meanwhile, previous research has shown that, while only about 0.02% of all papers are retracted, 1-2% of scientists admit to having invented, fudged, or manipulated data at least once in their careers.
The trouble is that dealing with this isn't as simple as uncovering a shadowy conspiracy or two. That's not really the kind of misconduct we're talking about here.
Read the rest
OK, I know that I promised to never post anything ever again about a certain hypothetical disaster that rhymes with Schmapocalypse MiffyMelve, but hear me out. This really isn't about that. Instead, I want to highlight an excellent profile of a scientist whose work and interactions with the public have been affected by that unnamed bit of urban mythology.
David Morrison is a 72-year-old senior scientist at NASA's Ames Research Center. He runs NASA's "Ask an Astrobiologist" column, and considers it his way of following in the footsteps of Carl Sagan. In this story, written by Dan Duray at The Awl, we learn about Morrison's deep commitment to communicating science to the public ... a commitment that has led him to spend the last eight years answering an increasingly heavy flood of letters about the end of the world. It's an interesting look at the effects pop culture has on real people.
Read the rest
The questions that Dr. Morrison receives circle around a surprisingly cohesive set of theories, each grounded in some kind of real science that then veers off in a wild direction ... It's possible that many of the people who write to Dr. Morrison are trolls, or have Kindle books to sell, or want to garner enough YouTube views to merit an ad before their videos (some of the "Nibiru exposed" videos now feature a pre-roll for the conspiracy movie Branded). But his younger questioners certainly aren't faking it. He read me some of the more serious emails over the phone:
"I know that everyone has been asking you the same question but how do I know the world is not going to end by a planet or a flood or something?
My second column for the New York Times Magazine went online today. It's about the history of technology and the forces that determine which tools end up in our everyday portfolio and which become fodder for alternate history sci-fi novels.
The key thing to remember: The technologies we use today aren't necessarily the best technologies that were available. We don't really make these decisions logically, based solely on what works best. It's more complicated than that. Technology is shaped by sociocultural forces. And, in turn, it shapes them, as well. The best analogy I've come up with to summarize this: The history of technology isn't a straight line. It's more like a ball of snakes fucking. (Sadly, I couldn't figure out a good way to reword this analogy for publication in the Paper of Record.) One of my big examples is the history of the electric car:
Read the rest
There are plenty of reasons Americans should have adopted electric cars long ago. Early E.V.’s were easier to learn to drive than their gas cousins, and they were far cleaner and better smelling. Their battery range and speed were limited, but a vast majority of the trips we take in our cars are short ones. Most of the driving we do has been well within the range of electric-car batteries for decades, says David Kirsch, associate professor of management at the University of Maryland and the author of “The Electric Vehicle and the Burden of History.” We drive gas-powered cars today for a complex set of reasons, Kirsch says, but not because the internal-combustion engine is inherently better than the electric motor and battery.
The fedora draws increasing controversy in internet circles. In just one hour I found no fewer than three Tumblrs related to shaming people who wear the creased, curve-brimmed hat—formal with a touch of classic dandy—and the censure is interestingly specific. The targets are usually men.
According to a survey of 200,000 Americans, Miller High Life is the most bi-partisan of beers. Republicans favor Samuel Adams and, apparently, there are a lot of Democrats drinking Heineken. (Although one might argue that these results are heavily skewed, as the survey did not include either microbrews or microparties. God only knows what the Libertarians are drinking.) There's a chart. Yay, charts! (Via Kevin Zelnio) Read the rest
I've been fascinated by the history and development of sign language for a while now. Closely tied to local Deaf cultures, individual sign languages have deep roots in the home-made systems people came up with in order to communicate with one another and with their families at times when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British and American sign language aren't at all the same thing. English is spoken in both countries, but the cultural history that gave birth to sign was sufficiently different to produce two completely different, mutually unintelligible languages. (Meanwhile, American sign language is much closer to French sign language, because it has roots in a system imported from France in the 19th century.)
In that case, it was physical distance that led to the development of two different sign languages. But, within the United States, the same thing happened because of social distance. Turns out, there is a Black American sign language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read, rather than sign — because it was thought to be better. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing. Read the rest
Back in May, we linked you to the reporting of Outside's Grayson Schaffer, who was stationed in the base camps of Mount Everest, watching as the mountain's third deadliest spring in recorded history unfolded. Ten climbers died during April and May. But the question is, why?
From a technological standpoint, as Schaffer points out in a follow-up piece, Everest ought to be safer these days. Since 1996 — the mountain's deadliest year, documented in Jon Krakauer's Into Thin Air — weather forecasts have improved (allowing climbers to avoid storms like the one responsible for many of the 1996 deaths), and new helicopters can reach stranded climbers at higher altitudes. But those things, Schaffer argues, are about reducing deaths related to disasters. This year, he writes, the deaths that happened on Everest weren't about freak occurrences of bad luck. It wasn't storms or avalanches that took those people down. It wasn't, in other words, about the random risks of nature.
Read the rest
This matters because it points to a new status quo on Everest: the routinization of high-altitude death. By and large, the people running the show these days on the south side of Everest—the professional guides, climbing Sherpas, and Nepali officials who control permits—do an excellent job of getting climbers to the top and down again. Indeed, a week after this year’s blowup, another hundred people summited on a single bluebird day, without a single death or serious injury.
But that doesn’t mean Everest is being run rationally. There are no prerequisites for how much experience would-be climbers must have and no rules to say who can be an outfitter.
By this point in your lives, most of you are no doubt aware of the massive slaughter of buffalo that happened in the United States in the late 19th century. Across the plains, thousands of buffalo were killed every week during a brief period when the hides of these animals could fetch upwards of $10 a pop. (The Bureau of Labor Statistics inflation calculator only goes back to 1913, so it's hard for me to say what that's worth today. But we know from the context that even when the value of buffalo hides dropped to $1 each, the business of killing and skinning buffalo was still considered a damned fine living.)
You might think that the business ended there, with dead, skinned buffalo left to rot on the prairie. And you're sort of right. But, in a story at Bloomberg News, Tim Heffernan explains that, a few years later, those dead buffalo created another boom and bust industry—the bone collection business.
Read the rest
Animal bones were useful things in the 19th century. Dried and charred, they produced a substance called bone black. When coarsely crushed, it could filter impurities out of sugar-cane juice, leaving a clear liquid that evaporated to produce pure white sugar -- a lucrative industry. Bone black also made a useful pigment for paints, dyes and cosmetics, and acted as a dry lubricant for iron and steel forgings.
... And so the homesteaders gathered the buffalo bones. It was easy work: Children could do it. Carted to town, a ton of bones fetched a few dollars.
Former Talking Heads frontman and all-round happy mutant David Byrne has written several good books, but his latest, How Music Works, is unquestionably the best of the very good bunch, possibly the book he was born to write. I could make a good case for calling this How Art Works or even How Everything Works.
Though there is plenty of autobiographical material in How Music Works that will delight avid fans (like me) -- inside dope on the creative, commercial and personal pressures that led to each of Byrne's projects -- this isn't merely the story of how Byrne made it, or what he does to turn out such great and varied art. Rather, this is an insightful, thorough, and convincing account of the way that creativity, culture, biology and economics interact to prefigure, constrain and uplift art. It's a compelling story about the way that art comes out of technology, and as such, it's widely applicable beyond music.
Byrne lived through an important transition in the music industry: having gotten his start in the analog recording world, he skilfully managed the transition to being an artist in the digital era (though not always a digital artist). As such, he has real gut-feel for the things that technology gives to artists and the things that technology takes away. He's like the kids who got their Apple ][+s in 1979, and keenly remember the time before computers were available to kids at all, the time when they were the exclusive domain of obsessive geeks, and the point at which they became widely exciting, and finally, ubiquitous -- a breadth of experience that offers visceral perspective. Read the rest
Over the long run, keeping stuff like tree limbs and compostable waste out of landfills is good for cities. There's only so much space in a landfill, and getting more land is extremely expensive. So why haven't more cities hopped on the curbside composting bandwagon, or at least banned yard waste from landfills? There are probably a lot of factors that go into those decisions, but one, apparently, is the influence of large, private companies that handle waste collection and see the diversion of re-usable waste as a detriment to their income. (Via Chris Tackett) Read the rest