In Foreign Policy magazine, Eveline Chao writes a fascinating insider account of working with Chinese censors and trying to do the job of a journalist in a place where your entire staff can be fired for the crime of accidentally having a Taiwanese flag in the background of a photograph.
Every legally registered publication in China is subject to review by a censor, sometimes several. Some expat publications have entire teams of censors scouring their otherwise innocuous restaurant reviews and bar write-ups for, depending on one's opinion of foreigners, accidental or coded allusions to sensitive topics. For example, That's Shanghai magazine once had to strike the number 64 from a short, unrelated article because their censors believed it might be read as an oblique reference to June 4, 1989, when the Chinese government bloodily suppressed a pro-democracy movement in Tiananmen Square. Many Chinese-run publications have no censor at all, but their editors are relied upon to know where the line falls -- i.e., to self-censor.
... Business content is not censored as strictly as other areas in China, since it seems to be understood that greater openness is needed to push the economy forward and it doesn't necessarily deal with the political issues Chinese rulers seem to find the most sensitive. English-language content isn't censored as much either, since only a small fraction of the Chinese population reads English. (As foreigners reporting on non-sensitive subjects in English, we could worry much less about the dangers -- threats, beatings, jail time -- that occasionally befall muckraking Chinese journalists.) And, in the beginning, most of Snow's edits were minor enough that we didn't feel compromised.
Here's an issue we don't talk about enough. Every year, peer-reviewed research journals publish hundreds of thousands of scientific papers. But every year, several hundred of those are retracted — essentially, unpublished. There are a number of reasons retraction happens. Sometimes, the researchers (or another group of scientists) will notice honest mistakes. Sometimes, other people will prove that the paper's results were totally wrong. And sometimes, scientists misbehave, plagiarizing their own work, plagiarizing others, or engaging in outright fraud. Officially, fraud accounts for only a small proportion of all retractions. But the number of annual retractions is growing, fast. And there's good reason to think that fraud plays a bigger role in science than we like to think. In fact, a study published a couple of weeks ago found that misconduct was involved in three-quarters of all retracted papers. Meanwhile, previous research has shown that, while only about 0.02% of all papers are retracted, 1-2% of scientists admit to having invented, fudged, or manipulated data at least once in their careers.
The trouble is that dealing with this isn't as simple as uncovering a shadowy conspiracy or two. That's not really the kind of misconduct we're talking about here.
OK, I know that I promised to never post anything ever again about a certain hypothetical disaster that rhymes with Schmapocalypse MiffyMelve, but hear me out. This really isn't about that. Instead, I want to highlight an excellent profile of a scientist whose work and interactions with the public have been affected by that unnamed bit of urban mythology.
David Morrison is a 72-year-old senior scientist at NASA's Ames Research Center. He runs NASA's "Ask an Astrobiologist" column, and considers it his way of following in the footsteps of Carl Sagan. In this story, written by Dan Duray at The Awl, we learn about Morrison's deep commitment to communicating science to the public ... a commitment that has led him to spend the last eight years answering an increasingly heavy flood of letters about the end of the world. It's an interesting look at the effects pop culture has on real people.
The questions that Dr. Morrison receives circle around a surprisingly cohesive set of theories, each grounded in some kind of real science that then veers off in a wild direction ... It's possible that many of the people who write to Dr. Morrison are trolls, or have Kindle books to sell, or want to garner enough YouTube views to merit an ad before their videos (some of the "Nibiru exposed" videos now feature a pre-roll for the conspiracy movie Branded). But his younger questioners certainly aren't faking it. He read me some of the more serious emails over the phone:
"I know that everyone has been asking you the same question but how do I know the world is not going to end by a planet or a flood or something?
My second column for the New York Times Magazine went online today. It's about the history of technology and the forces that determine which tools end up in our everyday portfolio and which become fodder for alternate history sci-fi novels.
The key thing to remember: The technologies we use today aren't necessarily the best technologies that were available. We don't really make these decisions logically, based solely on what works best. It's more complicated than that. Technology is shaped by sociocultural forces. And, in turn, it shapes them as well. The best analogy I've come up with to summarize this: The history of technology isn't a straight line. It's more like a ball of snakes fucking. (Sadly, I couldn't figure out a good way to reword this analogy for publication in the Paper of Record.) One of my big examples is the history of the electric car:
There are plenty of reasons Americans should have adopted electric cars long ago. Early E.V.’s were easier to learn to drive than their gas cousins, and they were far cleaner and better smelling. Their battery range and speed were limited, but a vast majority of the trips we take in our cars are short ones. Most of the driving we do has been well within the range of electric-car batteries for decades, says David Kirsch, associate professor of management at the University of Maryland and the author of “The Electric Vehicle and the Burden of History.” We drive gas-powered cars today for a complex set of reasons, Kirsch says, but not because the internal-combustion engine is inherently better than the electric motor and battery.
I've been fascinated by the history and development of sign language for a while now. Deeply tied to local Deaf cultures, individual sign languages have roots in the home-made systems people came up with in order to communicate with one another and with their families at times when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British and American sign language aren't at all the same thing. English is spoken in both countries, but the cultural history that gave birth to sign was sufficiently different to produce two completely different, mutually unintelligible languages. (Meanwhile, American sign language is much closer to French sign language, because it has roots in a system imported from France in the 19th century.)
In that case, it was a physical distance that led to the development of two different sign languages. But, within the United States, the same thing happened because of social distance. Turns out, there is a Black American sign language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read, rather than sign — because it was thought to be better. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing.
Back in May, we linked you to the reporting of Outside's Grayson Schaffer, who was stationed in the base camps of Mount Everest, watching as the mountain's third deadliest spring in recorded history unfolded. Ten climbers died during April and May. But the question is, why?
From a technological standpoint, as Schaffer points out in a follow-up piece, Everest ought to be safer these days. Since 1996 — the mountain's deadliest year, documented in Jon Krakauer's Into Thin Air — weather forecasts have improved (allowing climbers to avoid storms like the one responsible for many of the 1996 deaths), and new helicopters can reach stranded climbers at higher altitudes. But those things, Schaffer argues, are about reducing deaths related to disasters. This year, he writes, the deaths that happened on Everest weren't about freak occurrences of bad luck. It wasn't storms or avalanches that took those people down. It wasn't, in other words, about the random risks of nature.
This matters because it points to a new status quo on Everest: the routinization of high-altitude death. By and large, the people running the show these days on the south side of Everest—the professional guides, climbing Sherpas, and Nepali officials who control permits—do an excellent job of getting climbers to the top and down again. Indeed, a week after this year’s blowup, another hundred people summited on a single bluebird day, without a single death or serious injury.
But that doesn’t mean Everest is being run rationally. There are no prerequisites for how much experience would-be climbers must have and no rules to say who can be an outfitter.
By this point in your lives, most of you are no doubt aware of the massive slaughter of buffalo that happened in the United States in the late 19th century. Across the plains, thousands of buffalo were killed every week during a brief period when the hides of these animals could fetch upwards of $10 a pop. (The Bureau of Labor Statistics inflation calculator only goes back to 1913, so it's hard for me to say what that's worth today. But we know from the context that even when the value of buffalo hides dropped to $1 each, the business of killing and skinning buffalo was still considered a damned fine living.)
You might think that the business ended there, with dead, skinned buffalo left to rot on the prairie. And you're sort of right. But, in a story at Bloomberg News, Tim Heffernan explains that, a few years later, those dead buffalo created another boom and bust industry—the bone collection business.
Animal bones were useful things in the 19th century. Dried and charred, they produced a substance called bone black. When coarsely crushed, it could filter impurities out of sugar-cane juice, leaving a clear liquid that evaporated to produce pure white sugar -- a lucrative industry. Bone black also made a useful pigment for paints, dyes and cosmetics, and acted as a dry lubricant for iron and steel forgings.
... And so the homesteaders gathered the buffalo bones. It was easy work: Children could do it. Carted to town, a ton of bones fetched a few dollars.
Former Talking Heads frontman and all-round happy mutant David Byrne has written several good books, but his latest, How Music Works, is unquestionably the best of the very good bunch, possibly the book he was born to write. I could make a good case for calling this How Art Works or even How Everything Works.
Though there is plenty of autobiographical material in How Music Works that will delight avid fans (like me) -- inside dope on the creative, commercial and personal pressures that led to each of Byrne's projects -- this isn't merely the story of how Byrne made it, or what he does to turn out such great and varied art. Rather, this is an insightful, thorough, and convincing account of the way that creativity, culture, biology and economics interact to prefigure, constrain and uplift art. It's a compelling story about the way that art comes out of technology, and as such, it's widely applicable beyond music.
Byrne lived through an important transition in the music industry: having gotten his start in the analog recording world, he skillfully managed the transition to being an artist in the digital era (though not always a digital artist). As such, he has a real gut feel for the things that technology gives to artists and the things that technology takes away. He's like the kids who got their Apple ][+s in 1979, and keenly remember the time before computers were available to kids at all, the time when they were the exclusive domain of obsessive geeks, the point at which they became widely exciting, and finally, their ubiquity -- a breadth of experience that offers visceral perspective.
That is a high claim, I know. But over Labor Day weekend, a combination of dedicated curation and popular vote resulted in Henri 2, Paw de Deux being named the best Internet cat video.
The Internet Cat Video Festival, sponsored by Minneapolis' Walker Art Center, drew a live audience of more than 10,000 people last Thursday night. Videos were curated from a massive collection submitted online, and were grouped into thematic categories—foreign films, for instance, or comedies. Henri 2 took home the Golden Kitty, a People's Choice award.
Bonus: If arguing about the merits of Henri 2 weren't enough of a gift to your procrastination tendencies, you can also check out a full list of all the films screened at the festival, including links.
John Brownlee on why he stopped pirating music:
It’s clear to me, in retrospect, that my piracy was mostly mere collecting, and like the most fetishistic of collectors, it was conducted with mindless voracity. A good collection is supposed to be made up of relics, items that conjure up memories, feelings and ideas for the owner so strongly that he gets pleasure in simply being in close contact with them. A tended garden. My collection was nothing like this: it was just a red weed, swallowing up and corroding anything I did care about within its indiscriminating mass.
[Video Link] The 30 Days Ramadan guys have put out a wonderful new short film in their series of profiles on Muslim life in America. This one was directed by Zeshawn Ali, and focuses on a father-son legacy of music, in Brooklyn. Snip:
Mohammad Boota walks the streets of NYC waking Muslims up with a dhol drum during Ramadan - a rich tradition he inherited from his family in Pakistan. He came to America in 1992 and spent 9 years saving enough money to bring the rest of his family over. Now, fully reunited with his family, he rekindles the bond he has with his son over their love for drumming.
As you watch, remember that these are the regular people the NYPD and DHS want to surveil all the time, every day, solely because of their heritage.
You can subscribe to the 30 Days Ramadan YouTube channel for more great videos like this.
I've written here before about seed art at the Minnesota State Fair. Every year, Minnesotans glue thousands of tiny seeds to heavy backing material to create some surprisingly elaborate examples of portraiture and political commentary. Oddly, given that this is folk art at a state fair in the Midwest, most of that political commentary is solidly liberal.
I wasn't able to make it to the Minnesota State Fair this year, but Minnesota Public Radio's Nikki Tundel was there. At least four different entries in this year's seed art competition feature marriage equality themes—responses to the coming election when Minnesotans will decide whether or not to enshrine discriminatory marriage laws into our state constitution. It's safe to say: Minnesota's seed artists want you to vote "No".
Via the Stuff About Minneapolis blog, and Andrew Balfour
When I was about 10, I developed an obsessive love for The X-Men. It started with the Saturday morning cartoon show, but quickly became about comic books, as well. To this day, long-overwritten plot points from the Marvel universe take up a significant portion of my memory space (as my husband can attest). In my marriage, I am the one who is called upon to flesh out the backstory and conflicts with source material after my husband and I have seen an action-hero movie.
But I didn't own a single comic book until I was 19.
In fact, I'm not sure my parents or friends even knew I liked comic books. All my reading, for nine years, was done in secret. I'd slip into the comic book aisle at the bookstore when nobody was around to see, grab an anthology off the shelf, and spend the next two hours nestled in a corner somewhere — with the comics safely hidden behind a magazine or large book. I did the same thing at the public library. Never even checked one out. If I couldn't finish a library comic anthology in one afternoon, I'd hide it in a seldom-used section and come back the next day. (My apologies to the librarians of the world for that.)
Partly, that shame and fear was about being labeled a nerd in general. But there was, for me, also a pretty heavy gender component. Tall, clumsy, nerdy, ignorant of fashion or makeup, and definitely not "attractive" in the way that sheltered pre-teen and teenage society defines it, I spent a good chunk of my adolescence paranoid about my identity as a female.