Boldly going where nobody's gone before. In a lot of ways, that idea kind of defines our whole species. We travel. We're curious. We poke our noses around the planet to find new places to live. We're compelled to explore places few people would ever actually want to live. We push ourselves into space.
This behavior isn't totally unique. But it is remarkable. So we have to ask: is there a genetic, evolution-driven cause behind the restlessness of humanity?
At National Geographic, David Dobbs has an amazing long read digging into that idea. The story is fascinating, stretching from Polynesian sailors to Quebecois settlers. And it's very, very good science writing. Dobbs resists the urge to go for easy "here is the gene that does this" answers. Instead, he helps us see the complex web of genetics and culture that influences and encourages certain behaviors at certain times. It's a great read.
Not all of us ache to ride a rocket or sail the infinite sea. Yet as a species we’re curious enough, and intrigued enough by the prospect, to help pay for the trip and cheer at the voyagers’ return. Yes, we explore to find a better place to live or acquire a larger territory or make a fortune. But we also explore simply to discover what’s there.
“No other mammal moves around like we do,” says Svante Pääbo, a director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, where he uses genetics to study human origins. “We jump borders. We push into new territory even when we have resources where we are. Other animals don’t do this. Other humans either. Neanderthals were around hundreds of thousands of years, but they never spread around the world. In just 50,000 years we covered everything. There’s a kind of madness to it. Sailing out into the ocean, you have no idea what’s on the other side. And now we go to Mars. We never stop. Why?”
Why indeed? Pääbo and other scientists pondering this question are themselves explorers, walking new ground. They know that they might have to backtrack and regroup at any time. They know that any notion about why we explore might soon face revision as their young disciplines—anthropology, genetics, developmental neuropsychology—turn up new fundamentals. Yet for those trying to figure out what makes humans tick, our urge to explore is irresistible terrain. What gives rise to this “madness” to explore? What drove us out from Africa and on to the moon and beyond?
Retro DPRK is a blog that collects images of North Korea from the 1950s, 1960s, 1970s, and 1980s. Getting into North Korea from the United States and Western Europe is not easy today. But up until the collapse of the Soviet Union, it was even more difficult. If you weren't also from a Communist country, chances were good that you weren't going to get even a glimpse of the place.
But, at the same time, North Korea was also promoting itself through propaganda, and as a tourist destination for citizens of the USSR. Christopher Graper — who leads tours into North Korea today from Canada — has scanned scenes from postcards and tourism brochures — rare peeks into the little-documented history of a secretive country.
The collection blends familiar scenes that wouldn't look terribly different from American advertisements of the same era with an amusingly odd sensibility (who wouldn't want a whole book of postcards documenting every detail of Pyongyang's new gymnasium?) and quietly disconcerting scenes like the one above, where a seaside resort town appears eerily empty — like a theme park before opening time.
Thanks for pointing me toward this, Gidjlet!
Once upon a time, there was apparently a disease called chlorosis. (There is, still, a plant disease of the same name, but we're talking about human chlorosis, here.) It afflicted young women in the U.S. and Europe. It turned their skin green. The diagnosed cause: Excessive virginity. Prescription: A husband and, for best results, babies.
The thing with chlorosis is that the actual biological parts of it — the green skin — really did exist. It was the culturally influenced medical interpretation that was all off. In 1936, researchers proved it was actually just a type of anemia — an iron deficiency that could happen in males and females. The greenish tinge to the skin happened because the red blood cells were suddenly a lot less red.
Medicine isn't just anatomy and biology. It's also how we culturally interpret the importance and meaning of what we see in anatomy and biology. That's the point made by Druin Burch in a really interesting piece at Slate.com, where he compares chlorosis to a modern scourge — fatty liver disease.
Fatty liver disease affects up to a quarter of us. Its harms—a significantly increased risk of death among them—are taken seriously by hepatologists and other doctors. But it may not be a real disease at all ... Those with fatty liver disease won't know for certain they have the disease without a scan, be it ultrasound or some other modality. Usually fatty liver disease causes no symptoms. Yet those who have it are more likely to suffer heart attacks and strokes, more likely to develop liver cirrhosis, more likely to have high blood pressure and diabetes. Their health is improved from lowering their blood pressure and cholesterol levels, from dieting and exercising, and even (if they're particularly obese) from having a gastric bypass or similar surgery to help them lose weight.
The problem comes into focus when you realise these same hazards and recommendations can be invoked for any other manifestation of being overweight. Take fatty elbow disease. As far as I'm aware, I'm the first to describe it, but I think it could take off. It's associated with being overweight and underactive and it carries with it the same range of real risks. Sufferers are often asymptomatic, unaware of their illness, although I admit that it can be picked up without much use of an MRI scanner. Shortly I'll be writing to the New England Journal of Medicine to expose the problem. I'll demand action to raise the profile of fatty elbow disease, with programs to screen elbows nationwide and make patients aware of their affliction. I'll accept lucrative posts advising drug companies and seek out a celebrity patient or two. I'll attend so many lavish conference dinners I may develop the disease myself.
Where did the European werewolf come from and why did this particular mythology become so powerful that we're still telling stories about it today?
In a fascinating talk recorded at Skepticon 5 last month, Deborah Hyde discusses the history of lycanthropy and its various roles in European society. Lycanthropy was more than one thing, Hyde explains. It functioned as a legitimate medical diagnosis — usually denoting some kind of psychotic break. It served as a placeholder to explain anything particularly horrific — like the case of a French serial killer. And, probably most importantly, lycanthropy went hand-in-hand with witchcraft as part of the Inquisition.
Hyde is the editor of The Skeptic magazine and she blogs about the cultural history of belief in the supernatural. As part of this talk, she's tracked down cases of werewolf trials in the 16th and 17th centuries and attempted to understand why people were charged with lycanthropy, what connected those cases to one another, and the role the trials played in the history of religious liberty. Great stuff!
In Foreign Policy magazine Eveline Chao writes a fascinating, insider account of working with Chinese censors and trying to do the job of a journalist in a place where your entire staff can be fired for the crime of accidentally having a Taiwanese flag in the background of a photograph.
Every legally registered publication in China is subject to review by a censor, sometimes several. Some expat publications have entire teams of censors scouring their otherwise innocuous restaurant reviews and bar write-ups for, depending on one's opinion of foreigners, accidental or coded allusions to sensitive topics. For example, That's Shanghai magazine once had to strike the number 64 from a short, unrelated article because their censors believed it might be read as an oblique reference to June 4, 1989, when the Chinese government bloodily suppressed a pro-democracy movement in Tiananmen Square. Many Chinese-run publications have no censor at all, but their editors are relied upon to know where the line falls -- i.e., to self-censor.
... Business content is not censored as strictly as other areas in China, since it seems to be understood that greater openness is needed to push the economy forward and it doesn't necessarily deal with the political issues Chinese rulers seem to find the most sensitive. English-language content isn't censored as much either, since only a small fraction of the Chinese population reads English. (As foreigners reporting on non-sensitive subjects in English, we could worry much less about the dangers -- threats, beatings, jail time -- that occasionally befall muckraking Chinese journalists.) And, in the beginning, most of Snow's edits were minor enough that we didn't feel compromised. We couldn't say that a businessperson came back to China from the United States after "Tiananmen," but we could say "June 1989," knowing that our readers knew the significance of the month. We couldn't say "the Cultural Revolution" but could write "the late 1960s and early 1970s," to allude to then Communist Party chairman Mao Zedong launching his disastrous campaign that sent millions of intellectuals to the countryside. Writing that a company planned to expand into "foreign markets like Taiwan and Korea" was forbidden because it suggested that Taiwan was a separate country from China, but we could say "overseas markets," since, according to Snow, Taiwan literally is over a body of water from the mainland.
Via Marilyn Terrell
Here's an issue we don't talk about enough. Every year, peer-reviewed research journals publish hundreds of thousands of scientific papers. But every year, several hundred of those are retracted — essentially, unpublished. There are a number of reasons retraction happens. Sometimes, the researchers (or another group of scientists) will notice honest mistakes. Sometimes, other people will prove that the paper's results were totally wrong. And sometimes, scientists misbehave, plagiarizing their own work, plagiarizing others, or engaging in outright fraud. Officially, fraud only accounts for a small proportion of all retractions. But the number of annual retractions is growing, fast. And there's good reason to think that fraud plays a bigger role in science than we like to think. In fact, a study published a couple of weeks ago found that misconduct was behind three-quarters of all retracted papers. Meanwhile, previous research has shown that, while only about 0.02% of all papers are retracted, 1-2% of scientists admit to having invented, fudged, or manipulated data at least once in their careers.
The trouble is that dealing with this isn't as simple as uncovering a shadowy conspiracy or two. That's not really the kind of misconduct we're talking about here.
OK, I know that I promised to never post anything ever again about a certain hypothetical disaster that rhymes with Schmapocalypse MiffyMelve, but hear me out. This really isn't about that. Instead, I want to highlight an excellent profile of a scientist whose work and interactions with the public have been affected by that unnamed bit of urban mythology.
David Morrison is a 72-year-old senior scientist at NASA's Ames Research Center. He runs NASA's "Ask an Astrobiologist" column, and considers it his way of following in the footsteps of Carl Sagan. In this story, written by Dan Duray at The Awl, we learn about Morrison's deep commitment to communicating science to the public ... a commitment that has led him to spend the last eight years answering an increasingly heavy flood of letters about the end of the world. It's an interesting look at the effects pop culture has on real people.
The questions that Dr. Morrison receives circle around a surprisingly cohesive set of theories, each grounded in some kind of real science that then veers off in a wild direction ... It's possible that many of the people who write to Dr. Morrison are trolls, or have Kindle books to sell, or want to garner enough YouTube views to merit an ad before their videos (some of the "Nibiru exposed" videos now feature a pre-roll for the conspiracy movie Branded). But his younger questioners certainly aren't faking it. He read me some of the more serious emails over the phone:
"I know that everyone has been asking you the same question but how do I know the world is not going to end by a planet or a flood or something? I'm scared because I'm in 10th grade and I have a full life ahead of me so PLEASE I WOULD REALLY LIKE AN ANSWER TO MY QUESTION."
"I am really scared about the end of the world on 21 December. I'm headed into 7th grade and I am very scared. I hear you work for the government and I don't know what to do. Can someone help me? I can't sleep, I am crying every day, I can't eat, I stay in my room, I go to a councilor, it helps, but not with this problem. Can someone help me?"
It's not all serious business, though. In one of the funnier moments, a 72-year-old man tries to figure out how to deal with YouTube commenters accusing him of being a secret Lizard Person.
Image: Apocalypse, a Creative Commons Attribution No-Derivative-Works (2.0) image from torek's photostream
My second column for the New York Times Magazine went online today. It's about the history of technology and the forces that determine which tools end up in our everyday portfolio and which become fodder for alternate history sci-fi novels.
The key thing to remember: The technologies we use today aren't necessarily the best technologies that were available. We don't really make these decisions logically, based solely on what works best. It's more complicated than that. Technology is shaped by sociocultural forces. And, in turn, it shapes them, as well. The best analogy I've come up with to summarize this: The history of technology isn't a straight line. It's more like a ball of snakes fucking. (Sadly, I couldn't figure out a good way to reword this analogy for publication in the Paper of Record.) One of my big examples is the history of the electric car:
There are plenty of reasons Americans should have adopted electric cars long ago. Early E.V.’s were easier to learn to drive than their gas cousins, and they were far cleaner and better smelling. Their battery range and speed were limited, but a vast majority of the trips we take in our cars are short ones. Most of the driving we do has been well within the range of electric-car batteries for decades, says David Kirsch, associate professor of management at the University of Maryland and the author of “The Electric Vehicle and the Burden of History.” We drive gas-powered cars today for a complex set of reasons, Kirsch says, but not because the internal-combustion engine is inherently better than the electric motor and battery.
Read the rest
I've been fascinated by the history and development of sign language for a while now. Closely tied to local Deaf cultures, individual sign languages have deep roots in the home-made systems people came up with in order to communicate with one another and with their families at times when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British and American sign language aren't at all the same thing. English is spoken in both countries, but the cultural history that gave birth to sign was sufficiently different to produce two completely different, mutually unintelligible languages. (Meanwhile, American sign language is much closer to French sign language, because it also has roots in a system imported from France in the 19th century.)
In that case, it was physical distance that led to the development of two different sign languages. But, within the United States, the same thing happened because of social distance. Turns out, there is a Black American sign language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read, rather than sign — because it was thought to be better. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing. By the late 1960s, the two systems were almost completely different languages.
Carolyn McCaskill remembers exactly when she discovered that she couldn’t understand white people. It was 1968, she was 15 years old, and she and nine other deaf black students had just enrolled in an integrated school for the deaf in Talladega, Ala.
... The teacher’s quicksilver hand movements looked little like the sign language McCaskill had grown up using at home with her two deaf siblings and had practiced at the Alabama School for the Negro Deaf and Blind, just a few miles away. It wasn’t a simple matter of people at the new school using unfamiliar vocabulary; they made hand movements for everyday words that looked foreign to McCaskill and her fellow black students.
...So, McCaskill says, “I put my signs aside.” She learned entirely new signs for such common nouns as “shoe” and “school.” She began to communicate words such as “why” and “don’t know” with one hand instead of two as she and her black friends had always done. She copied the white students who lowered their hands to make the signs for “what for” and “know” closer to their chins than to their foreheads. And she imitated the way white students mouthed words at the same time as they made manual signs for them.
• Martha's Vineyard: Birthplace of American Deaf Culture
• What the invention of Nicaraguan sign language teaches us about the human brain
• How To: Spell with your fingers in different languages
• CWA: Your language is your worldview
• The sign language of science
• Learn the sign language of physics, male genitalia
Via Stan Carey
Back in May, we linked you to the reporting of Outside's Grayson Schaffer, who was stationed in the base camps of Mount Everest, watching as the mountain's third deadliest spring in recorded history unfolded. Ten climbers died during April and May. But the question is, why?
From a technological standpoint, as Schaffer points out in a follow-up piece, Everest ought to be safer these days. Since 1996 — the mountain's deadliest year, documented in Jon Krakauer's Into Thin Air — weather forecasts have improved (allowing climbers to avoid storms like the one responsible for many of the 1996 deaths), and new helicopters can reach stranded climbers at higher altitudes. But those things, Schaffer argues, are about reducing deaths related to disasters. This year, he writes, the deaths that happened on Everest weren't about freak occurrences of bad luck. It wasn't storms or avalanches that took those people down. It wasn't, in other words, about the random risks of nature.
This matters because it points to a new status quo on Everest: the routinization of high-altitude death. By and large, the people running the show these days on the south side of Everest—the professional guides, climbing Sherpas, and Nepali officials who control permits—do an excellent job of getting climbers to the top and down again. Indeed, a week after this year’s blowup, another hundred people summited on a single bluebird day, without a single death or serious injury.
But that doesn’t mean Everest is being run rationally. There are no prerequisites for how much experience would-be climbers must have and no rules to say who can be an outfitter. Many of the best alpinists in the world still show up in Base Camp every spring. But, increasingly, so do untrained, unfit people who’ve decided to try their hand at climbing and believe that Everest is the most exciting place to start. And while some of the more established outfitters might turn them away, novices are actively courted by cut-rate start-up companies that aren’t about to refuse the cash.
It’s a recipe that doesn’t require a storm to kill people. In this regard, things are much different now than in the past: they’re worse.
Image via Outside and photographer Rob Sobecki
By this point in your lives, most of you are no doubt aware of the massive slaughter of buffalo that happened in the United States in the late 19th century. Across the plains, thousands of buffalo were killed every week during a brief period when the hides of these animals could fetch upwards of $10 a pop. (The Bureau of Labor Statistics inflation calculator only goes back to 1913, so it's hard for me to say what that's worth today. But we know from the context that even when the value of buffalo hides dropped to $1 each, the business of killing and skinning buffalo was still considered a damned fine living.)
You might think that the business ended there, with dead, skinned buffalo left to rot on the prairie. And you're sort of right. But, in a story at Bloomberg News, Tim Heffernan explains that, a few years later, those dead buffalo created another boom and bust industry—the bone collection business.
Animal bones were useful things in the 19th century. Dried and charred, they produced a substance called bone black. When coarsely crushed, it could filter impurities out of sugar-cane juice, leaving a clear liquid that evaporated to produce pure white sugar -- a lucrative industry. Bone black also made a useful pigment for paints, dyes and cosmetics, and acted as a dry lubricant for iron and steel forgings.
... And so the homesteaders gathered the buffalo bones. It was easy work: Children could do it. Carted to town, a ton of bones fetched a few dollars. Sent to rendering plants and furnaces in the big industrial cities, that same ton was worth between $18 and $27. Boiled, charred, crushed or powdered, it was worth as much as $60.
... By the 1880s, however, a few reporters were expressing nervous awe at the scale of the cleansing, and even despair for what had been lost. In 1891, not 25 years after the slaughter began, the Chicago Daily Tribune ran a dispatch titled “Relics of the Buffalo.” The relics were the animals’ empty pathways and dust wallows, worn into the surface of the Manitoba plains over countless years. The bones, let alone the living creatures, were long gone.
Read the rest
Former Talking Heads frontman and all-round happy mutant David Byrne has written several good books, but his latest, How Music Works, is unquestionably the best of the very good bunch, possibly the book he was born to write. I could make a good case for calling this How Art Works or even How Everything Works.
Though there is plenty of autobiographical material in How Music Works that will delight avid fans (like me) -- inside dope on the creative, commercial and personal pressures that led to each of Byrne's projects -- this isn't merely the story of how Byrne made it, or what he does to turn out such great and varied art. Rather, this is an insightful, thorough, and convincing account of the way that creativity, culture, biology and economics interact to prefigure, constrain and uplift art. It's a compelling story about the way that art comes out of technology, and as such, it's widely applicable beyond music.
Byrne lived through an important transition in the music industry: having gotten his start in the analog recording world, he skilfully managed a transition to an artist in the digital era (though not always a digital artist). As such, he has real gut-feel for the things that technology gives to artists and the things that technology takes away. He's like the kids who got their Apple ][+s in 1979, and keenly remember the time before computers were available to kids at all, the time when they were the exclusive domain of obsessive geeks, and the point at which they became widely exciting, and finally, ubiquitous -- a breadth of experience that offers visceral perspective.
There were so many times in this book when I felt like Byrne's observations extended beyond music and dance and into other forms of digital creativity. For example, when Byrne recounted his first experiments with a cellular automata exercise for dance choreography, from his collaboration with Noemie Lafrance:
1. Improvise moving to the music and come up with an eight-count phrase (in dance, a phrase is a short series of moves that can be repeated).
2. When you find a phrase you like, loop (repeat) it.
3. When you see someone else with a stronger phrase, copy it.
4. When everyone is doing the same phrase, the exercise is over.
It was like watching evolution on fast-forward, or an emergent lifeform coming into being. At first the room was chaos, writhing bodies everywhere. Then one could see that folks had chosen their phrases, and almost immediately one could see a pocket of dancers who had all adopted the same phrase. The copying had already begun, albeit in just one area. This pocket of copying began to expand, to go viral, while yet another one now emerged on the other side of the room. One clump grew faster than the other, and within four minutes the whole room was filled with dancers moving in perfect unison. Unbelievable! It only took four minutes for this evolutionary process to kick in, and for the "strongest" (unfortunate word, maybe) to dominate.
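The four-step exercise above is, in effect, a tiny agent-based model, and you can watch the same convergence happen in code. Here's a minimal sketch of that dynamic — not Byrne and Lafrance's actual exercise, just an illustration. The function `simulate_dancers`, its parameters, and the use of a numeric id as a stand-in for a phrase's "strength" are all invented for this example:

```python
import random

def simulate_dancers(n=30, phrases=10, seed=1):
    """Toy model of the copy-the-strongest-phrase exercise.

    Each of n dancers starts with a random phrase; a phrase's
    "strength" is just its numeric id. At every step, one dancer
    glances at one random other dancer and copies that dancer's
    phrase if it is stronger. Returns (steps, winning_phrase) once
    the whole room is doing the same phrase.
    """
    rng = random.Random(seed)
    room = [rng.randrange(phrases) for _ in range(n)]
    strongest = max(room)  # this phrase can never be displaced
    steps = 0
    while len(set(room)) > 1:
        i = rng.randrange(n)
        j = rng.randrange(n)
        if room[j] > room[i]:
            room[i] = room[j]  # copy the stronger phrase
        steps += 1
    assert room[0] == strongest
    return steps, room[0]
```

Run it and the room always collapses onto the strongest phrase present at the start — the "pockets" of copying Byrne describes are just the moments when a strong phrase's carriers happen to cluster.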
Read the rest
That is a high claim, I know. But over Labor Day weekend, a combination of dedicated curation and popular vote resulted in Henri 2, Paw de Deux being named the best Internet cat video.
The Internet Cat Film Festival, sponsored by Minneapolis' Walker Art Center, drew a live audience of more than 10,000 people last Thursday night. Videos were curated from a massive collection submitted online, and were grouped into thematic categories — foreign films, for instance, or comedies. Henri 2 took home the Golden Kitty, a People's Choice award.
Bonus: If arguing about the merits of Henri 2 weren't enough of a gift to your procrastination tendencies, you can also check out a full list of all the films screened at the festival, including links.
John Brownlee on why he stopped pirating music:
It’s clear to me, in retrospect, that my piracy was mostly mere collecting, and like the most fetishistic of collectors, it was conducted with mindless voracity. A good collection is supposed to be made up of relics, items that conjure up memories, feelings and ideas for the owner so strongly that he gets pleasure in simply being in close contact with them. A tended garden. My collection was nothing like this: it was just a red weed, swallowing up and corroding anything I did care about within its indiscriminating mass.
[Video Link] The 30 Days Ramadan guys have put out a wonderful new short film in their series of profiles on Muslim life in America. This one was directed by Zeshawn Ali, and focuses on a father-son legacy of music, in Brooklyn. Snip:
Mohammad Boota walks the streets of NYC waking Muslims up with a dhol drum during Ramadan - a rich tradition he inherited from his family in Pakistan. He came to America in 1992 and spent 9 years saving enough money to bring the rest of his family over. Now, fully reunited with his family, he rekindles the bond he has with his son over their love for drumming.
As you watch, remember that these are the regular people the NYPD and DHS want to surveil all the time, every day, solely because of their heritage.
You can subscribe to the 30 Days Ramadan YouTube channel for more great videos like this.
I've written here before about seed art at the Minnesota State Fair. Every year, Minnesotans glue thousands of tiny seeds to heavy backing material to create some surprisingly elaborate examples of portraiture and political commentary. Oddly, given that this is folk art at a state fair in the Midwest, most of that political commentary is solidly liberal.
I wasn't able to make it to the Minnesota State Fair this year, but Minnesota Public Radio's Nikki Tundel was there. At least four different entries in this year's seed art competition feature marriage equality themes—responses to the coming election when Minnesotans will decide whether or not to enshrine discriminatory marriage laws into our state constitution. It's safe to say: Minnesota's seed artists want you to vote "No".
Via the Stuff About Minneapolis blog, and Andrew Balfour
When I was about 10, I developed an obsessive love for The X-Men. It started with the Saturday morning cartoon show, but quickly became about comic books, as well. To this day, long-overwritten plot points from the Marvel universe take up a significant portion of my memory space (as my husband can attest). In my marriage, I am the one who is called upon to flesh out the backstory and conflicts with source material after my husband and I have seen an action-hero movie.
But I didn't own a single comic book until I was 19.
In fact, I'm not sure my parents or friends even knew I liked comic books. All my reading, for nine years, was done in secret. I'd slip into the comic book aisle at the bookstore when nobody was around to see, grab an anthology off the shelf, and spend the next two hours nestled in a corner somewhere — with the comics safely hidden behind a magazine or large book. I did the same thing at the public library. Never even checked one out. If I couldn't finish a library comic anthology in one afternoon, I'd hide it in a seldom-used section and come back the next day. (My apologies to the librarians of the world for that.)
Partly, that shame and fear was about being labeled a nerd, in general. But there was, for me, also a pretty heavy gender component. Tall, clumsy, nerdy, ignorant of fashion or makeup, and definitely not "attractive" in the way that sheltered pre-teen and teenage society defines it, I spent a good chunk of my adolescence paranoid about my identity as a female. Where and when I grew up, there weren't a lot of good role models for diversity of female experience. My parents always supported who I was, but society and my peers seemed to have a pretty strict definition of who girls were and what they liked ... and I didn't fit. Admitting that I was into comics felt like it would be just one more thing I did wrong. That's why I really, really love Women Reading Comics in Public Day, an unofficial holiday started by the bloggers at DC Women Kicking Ass.
Read the rest
Technology solves problems. But there's usually more than one way to solve a problem. Cars don't have to run on internal combustion — and they don't have to look like smoothly curved pods. (In fact, when I was in grade school, they didn't.) Our electric grid isn't the result of a rational discussion about ideal technology. Instead, it was built partly based on convenience and speed, and partly based on cost.
Basically, there are lots of ways to solve a problem and for almost every tool we use there's an alternative we chose (somewhere along the line) to not use. I'm working on my second column for The New York Times Magazine, which will come out in September. In the course of researching that, I stumbled across a really fascinating research paper about the history of the refrigerator. See, the electric fridge we're all familiar with wasn't the only option in home refrigeration. In the 20th century, the low hum of the electric refrigerator competed with a silent version powered by natural gas.
"How the Refrigerator Got its Hum" is an article written by science historian Ruth Schwartz Cowan. It was published in 1985, in a book called The Social Shaping of Technology. The article traces the development of the refrigerator and the story of why we use electricity, rather than natural gas, to cool our food today. I couldn't fit it into my NYT column, but it's absolutely fascinating and well worth the read. The key point of Cowan's article: Our world is full of "failed machines", technologies that worked just fine, but that we don't use today.
These are not junked cars and used refrigerators that people leave along roadsides and in garbage dumps, but the rusting hulks of aborted ideas; patents that were never exploited; test models that could not be manufactured at affordable prices; machines that had considerable potential, but were, for one reason or another, actively suppressed by the companies that had the license to manufacture them; devices that were put on the market, but never sold well and were soon abandoned. The publications of the Patent Office and the "new patents" columns in technical magazines reveal that the ratio of "failed" machines to successful ones is high, although no scholar has yet devised a formula by which it can actually be determined.
At Neatorama, librarian John Farrier helpfully points out some places where fictional pony librarian Twilight Sparkle could stand to improve her professional practice. It is simultaneously a dedicated bit of pony fandom and an interesting overview of the many responsibilities of a real-world librarian.
Conducting a reference interview is the act of translating a patron’s request into terms that are congruent with the library’s resources. It may surprise non-librarians to learn this, but yes: reference interviewing is a skill. And it is one that Twilight should develop.
A good reference interview begins with the librarian conducting him/herself in a manner that is welcoming. Helping the patron is the first priority of a librarian working the reference desk. The patron is not a distraction or an annoyance. In the first reference interview in the series, Twilight interacts with her patron, Rainbow Dash. “Can I help you?” is a good beginning. But her tone and body language suggest that she would rather not.
... Twilight has some good reference interviewing sense. One pitfall that rookie librarians fall into is to give professional advice instead of information—especially medical and legal advice. In “Cutie Pox,” Applejack and Applebloom visit the library and ask for medical advice. Twilight, aware that doing so could expose the library and herself to liability, deftly avoids doing so and refers Applejack and Applebloom to Zecora, a qualified medical professional.
Editorial note — Cow Week is a tongue-in-cheek look at risk analysis and why we fear the things we fear. It is inspired by the Discovery Channel's Shark Week, the popularity of which is largely driven by the public's fascination with and fear of sharks. Read the rest
It is very hard, and very weird, to try to get a handle on how human health has changed between the 19th century and today. Obviously, the way we live has changed dramatically. But understanding how that impacts health (or doesn't) is complicated by the fact that healthcare, science, and public health research changed dramatically during those years, as well.
And all that science hasn't happened in a vacuum. The names we give various disorders change. Whether or not we consider something to be a disorder, at all, might change. And our cultural understanding changes, too—especially when it comes to mental illness.
At the Mind Hacks blog, Vaughan Bell has an excellent breakdown of two recent studies that try to put the modern diagnosis of post-traumatic stress disorder (PTSD) into a cultural and historical context. Many people assume that PTSD is just a new name for something that has always existed—look at shell shock, which made it onto Downton Abbey last season. But these new papers suggest that the distinction between what soldiers experienced in the past and what they experience today might go deeper than naming conventions.
The diagnosis of PTSD involves having a traumatic experience and then being affected by a month of symptoms from three main groups: intrusive memories, hyper-arousal, and avoidance of reminders or emotional numbing ... there has been a popular belief that PTSD has been experienced throughout history but simply wasn’t properly recognised. Previous labels, it is claimed, like ‘shell shock’ or ‘combat fatigue’, were just early descriptions of the same universal reaction.
But until now, few studies have systematically looked for PTSD or post-trauma reactions in the older historical record. Two recent studies have done exactly this, however, and found no evidence for a historical syndrome equivalent to PTSD.
A study just published in the Journal of Anxiety Disorders looked at the extensive medical records for soldiers in the American Civil War, whose mortality rate was about 50-80 times greater than that of modern soldiers fighting in Iraq and Afghanistan. In other words, there would have been many more having terrifying experiences but despite the higher rates of trauma and mentions of other mental problems, there is virtually no mention of anything like the intrusive thoughts or flashbacks of PTSD.
David Dobbs adds some more context to Bell's post at the Neuron Culture blog.
IUDs are a weird form of birth control. We don't really know exactly how they work, for instance. And they've been largely unpopular my entire lifetime—really, ever since a couple of poorly designed IUDs set off a mini-panic in the late 1970s and early 1980s. But IUDs are effective birth control. The ones that you can buy today are safe. And, more importantly, they represent birth control that you don't have to think about, and birth control that is really hard to get wrong.
If you've ever done research on the effectiveness of various methods of birth control, you'll notice that the statistics usually come with a little asterisk. That * represents a concept that few of the people who rely on birth control ever think about—perfect use. Let's use condoms as an example. With perfect use, 2 out of 100 women will get pregnant over the course of a year's worth of condom-protected sex. Without perfect use—maybe you don't use a condom every time, maybe you don't put it on right when you both get naked—the number of accidental pregnancies jumps to 18 out of 100. The same basic problem affects birth control pills, as well. Ladies, did you know you're supposed to take those things at the same time of day every day? That's the kind of use error that can make a difference between 1 out of 100 women getting pregnant in a year, and 9 out of 100 getting pregnant.
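Those annual failure rates compound, too. As a rough sketch (my own back-of-the-envelope math, not from any of the sources above), here's what those per-year numbers imply over several years of use, assuming each year's risk is independent:

```python
# Annual failure rates from the figures above (accidental pregnancies
# per woman per year, expressed as probabilities).
failure_rates = {
    "condoms, perfect use": 0.02,
    "condoms, typical use": 0.18,
    "pill, perfect use": 0.01,
    "pill, typical use": 0.09,
}

def cumulative_risk(annual_rate, years):
    """Chance of at least one accidental pregnancy over `years`,
    assuming each year's risk is independent: 1 - (1 - p)^n."""
    return 1 - (1 - annual_rate) ** years

for method, rate in failure_rates.items():
    print(f"{method}: {cumulative_risk(rate, 5):.0%} risk over 5 years")
```

Under those assumptions, "typical use" of condoms works out to better-than-even odds of an accidental pregnancy over five years, while perfect use stays under 10 percent. That gap is the asterisk.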
In contrast, IUDs represent a fit-it-and-forget-it method of birth control. Which is a big part of why they're up there with outright sterilization as the most effective means of birth control available. Bonus: Depending on which kind you use, you can avoid hormonal side effects. This, experts say, is why IUDs are experiencing something of a resurgence in popularity. In an article at Wired, Jennifer Couzin-Frankel writes that 5.5 percent of American women who use birth control use IUDs. That's up from only 1.3 percent in 1995.
Somewhat unbelievably, no one is quite sure how they work, but the theory goes like this: The human uterus has one overriding purpose, which is to protect and sustain a fetus for nine months. If you stick a poker-chip-sized bit of plastic in there, the body reacts the way it does to any foreign object, releasing white blood cells to chase after the invader. Once those white blood cells are set free in the uterus, they start killing foreign cells with efficient zeal. And sperm, it turns out, are very, very foreign. White blood cells scavenge them mercilessly, preventing pregnancy. In copper-containing IUDs, metal ions dissolving from the device add another layer of spermicidal action.
... Most modern IUDs incorporate copper, which has an assortment of benefits, including increased durability and effectiveness. They’re also free of hormones and can be made cheaply, a boon for women in developing countries. But copper IUDs can cause heavy menstrual bleeding and cramping. The Mirena solves that problem by forgoing the metal for a synthetic version of the hormone progesterone. Here again, the mode of action isn’t completely understood, but researchers suspect that the hormone thickens cervical mucus, which makes it nearly impossible for sperm to swim upstream. It may also thin the uterine lining, rendering it inhospitable to an embryo should fertilization occur. The hormone-based IUD has the opposite side effect of the copper ones: It sometimes leaves women with little uterine lining to shed, so they hardly get any period at all.
... Even though many more doctors are comfortable with the IUD, a generation of doctors didn’t get practice inserting it. And if they don’t know how to put one in, they’re less likely to recommend it as an option. Also, the devices are expensive—the ParaGard costs $500, the Mirena $850. “It’s absolute highway robbery that these companies charge so much,” Espey says. “If you went to Home Depot and got the raw materials for a copper IUD, it would cost less than 5 cents.” And the hormones don’t contribute much more to the cost, she adds. In fact, amortized over years of use—10 for the ParaGard and five for the Mirena—an IUD is far cheaper than birth control pills, which can cost $30 or more a month. But the initial outlay is difficult for some women to manage, and it’s not always covered by insurance.
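The amortization math in that excerpt is easy to check. Using only the prices quoted above (and ignoring insurance, interest, and the cost of insertion), a quick sketch:

```python
# Sticker price and usable lifespan (in months) from the figures above.
devices = {
    "ParaGard": (500, 10 * 12),  # $500, good for 10 years
    "Mirena": (850, 5 * 12),     # $850, good for 5 years
}
pill_monthly = 30  # "birth control pills, which can cost $30 or more a month"

for name, (price, months) in devices.items():
    monthly = price / months
    print(f"{name}: ${monthly:.2f}/month vs ${pill_monthly}/month for pills")
```

Spread over its lifespan, even the pricier Mirena comes out to roughly half the monthly cost of the cheapest pills; the catch, as the article says, is the up-front outlay.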
Read more about different kinds of birth control, their effectiveness, and how to use them correctly at Planned Parenthood
Image: X-Ray showing an IUD in place. Photo taken by Wikipedia user Nevit Dilmen, used via CC license.
I've mentioned here before that I went to fundamentalist Christian schools from grade 8 through grade 11. I learned high school biology from a Bob Jones University textbook, watched videos of Ken Ham talking about cryptozoology as extra credit assignments, and my mental database of American history probably includes way more information about great revival movements than yours does. In my experience, when the schools I went to followed actual facts, they did a good job in education. Small class sizes, lots of hands-on, lots of writing, and lots of time spent teaching to learn rather than teaching to a standardized test. But when they decided that the facts were ungodly, things went to crazytown pretty damn quick.
All of this is to say that I usually take a fairly blasé attitude towards the "OMG LOOK WHAT THE FUNDIES TEACH KIDS" sort of expose that pops up occasionally on the Internet. It's hard to be shocked by stuff that you long ago forgot isn't general public knowledge. You say A Beka and Bob Jones University Press are still freaked about Communism, take big detours into slavery/KKK apologetics, and claim the Depression was mostly just propaganda? Yeah, they'll do that. Oh, the Life Science textbook says humans and dinosaurs totally hung out and remains weirdly obsessed with bombardier beetles? What else is new?
Well, for me, this is new:
"Unlike the "modern math" theorists, who believe that mathematics is a creation of man and thus arbitrary and relative, A Beka Book teaches that the laws of mathematics are a creation of God and thus absolute....A Beka Book provides attractive, legible, and workable traditional mathematics texts that are not burdened with modern theories such as set theory." — ABeka.com
Read the rest
I recently posted a couple of articles featuring heartfelt letters from people who had earned their Eagle Scout awards as boys, but no longer wanted to be associated with the Boy Scouts of America and its rule banning gay scouts and GBLT troop leaders. Instead, they were choosing to return their awards to the BSA, in hopes that scouting's national organization would recognize that this rule isn't something all scouts want. In fact, many wrote about their frustration with what they see as the BSA failing to live up to the values that scouting teaches.
As of August 4, more than 80 former Eagle Scouts have sent photos of their resignation letters to the Eagle Scouts Returning Our Badges Tumblr blog, where the letters and the protest they represent are being archived.
Reading the comments that have turned up here at BoingBoing, I get the sense that there are many more Eagle Scouts—and active Boy Scout troops—that also disagree with the BSA, but don't want to resign from local connections that don't reflect the national organization's bigotry. In fact, the Northern Star Council, which represents 75,000 scouts in Minnesota and Wisconsin, is openly bucking Boy Scouts of America policy, and has been for years.
The Associated Press ran a piece yesterday looking at this dissent and the effect—or, it seems, lack thereof—it is having on BSA policy.
Deron Smith, the Boy Scouts' national spokesman, said there was no official count at his office of how many medals had been returned. He also noted that about 50,000 of the medals are awarded each year.
Beyond the Eagle Scout protests, the Boy Scouts' reaffirmation of the no-gays policy has drawn condemnation from liberal advocacy groups, newspaper editorialists and others. In Washington state, Republican gubernatorial candidate Rob McKenna, an Eagle Scout, joined his Democratic opponent, Jay Inslee, in suggesting the policy be changed.
But overall there has been little evidence of any new form of outside pressure that might prompt the Scouts to reconsider.
The leadership of the Scouts' most influential religious partners - notably the Mormons, Roman Catholics and Southern Baptists - appears to support the policy. And even liberal politicians seem reluctant to press the issue amid a tense national election campaign.
When you pull out a laser pointer and get your cat to chase the dot of light around your house*, you are using a patented method of cat exercise. The rights are owned by Kevin Amiss and Martin Abbott (both of Virginia), who patented it in the early 1990s. In the abstract, they describe this method of cat exercise as:
A method for inducing cats to exercise consists of directing a beam of invisible light produced by a hand-held laser apparatus onto the floor or wall or other opaque surface in the vicinity of the cat, then moving the laser so as to cause the bright pattern of light to move in an irregular way fascinating to cats, and to any other animal with a chase instinct.
In other words, they own the rights on doing this with ferrets, as well.
This might also be a good time to note an NPR story from this week, which documents IBM and Halliburton attempting to patent the process of patent trolling.
Method of exercising a cat: United States Patent 5443036
*Fact: This game becomes more fun if you have a rug. Just run the light up to the edge of the rug and then turn it off. The cat will become convinced that the little red light has gone under said rug and you will get to amuse yourself watching your cat try to lift the corner of something heavy without the use of opposable thumbs.
Thanks, Sam Ley!