At the Brainwaves blog, Ferris Jabr writes about a fascinating project. Anthropologist Andrew Irving talked random strangers on the streets of New York City into putting on a headset and speaking their inner monologue out loud as he followed behind them with a camera. The result is something that approximates what it might be like to be able to hear someone else's thoughts.
A woman worries about where she can find a Staples and contemplates her relationship with a friend who has cancer. A man deals with his emotions over two close friends (or possibly roommates, or lovers) having a baby together. Another man flits between internal discussions of totalitarianism, speculation about other people on the street, and his own attempts to figure out which direction he's heading. In general, it's all a mixture of the engaging and the mundane, swirled together.
There are other videos in the series, as well. You can watch them at Brainwaves.
New York's Grand Central Terminal, as it stands today, was built between 1903 and 1913. But it is the third Grand Central. Two earlier buildings — one called Grand Central Depot, and the other known as Grand Central Station (which remains the colloquial name for the Terminal) — stood on pretty much the same spot. But neither lasted nearly as long. The Depot opened in 1871 and was drastically reconstructed in 1899. The new building, the Station, stood for only three years before it began to come down in sections, eventually replaced by the current building.
That's a lot of structural shuffling, and at the Anthropology in Practice blog, Krystal D'Costa explains some of the history behind it. Turns out, the rapid reconfiguration of Grand Central had a lot to do with crowd control — figuring out how to use architecture to make the unruly masses a little more ruly. One early account that D'Costa quotes describes regular mad scrambles to board the train — intimidating altercations that could leave less-aggressive passengers stranded on the platform as their train left them behind.
The problem, it seemed, was that the interior of the depot did nothing to manage the Crowd—which could resume the same patterns of movement as they did on the street—and believe me, it was just as unruly out there. In the depot, where passengers were confronted with the unbridled power of locomotives, it was necessary to impose some sort of structure on the meeting: the Crowd had to be domesticated.
... A deadly collision in 1902 preceded public demand for an even safer, more accessible terminal. Warren and Wetmore won the bid for reconstruction, and the plan they produced included galleries, which added yet another transition area but, more importantly, rendered the Crowd into a spectacle. This design, which is the one visitors experience today, preserves the Crowd in a central area, providing raised balconies from which there are plenty of opportunities to people-watch. Being placed on display is not lost on the subconscious of the Crowd: what appears to be hustle and bustle are manifestations of many synchronizations happening at once. So what appears to be chaos to the casual observer is actually a play directed by design that makes the Crowd a key feature of the space even as it is minimized by the architectural elements that Grand Central Terminal is known for: the grand ceiling, the large windows, and the deep main concourse. These items add perspective to the Crowd and diminish its psychological power as an uncontrollable mass.
In an interview with The Houston Chronicle, paleoanthropologist Jean-Jacques Hublin hits on an interesting point that I don't think we (the media and laypeople) consider enough when we talk about our closest ancient relatives. Although we have an increasingly deep picture of Neanderthal anatomy and genetics, that doesn't necessarily tell us a great deal about their minds and behavior.
Truth is, for how little we understand the wiring and functioning of our own brains, we understand even less about the Neanderthal mind. It's quite possible that they could mate with us, but couldn't think the same way we do. And it's those unseen, unstudied differences that could really account for the vast disparities that we see between how humans lived and how their Neanderthal neighbors lived.
The picture we have so far is that the Neanderthals are sort of opportunistic, good at hunting middle- to large-sized mammals. They have a territory in which they probably go through a cycle of habitation in different places, basically when one place is exhausted they move to another one. What we don't see with Neanderthals is long-distance exchanges with other groups. What we see with modern humans in the same areas is different. For example, we find shells in Germany coming from the Mediterranean or from the French Atlantic Coast. It means there was a network of people. So, the question is, what kind of relationship did a Neanderthal have with his brother-in-law? Humans did not just live with their families and their neighbors, but they knew they had a brother-in-law in another village, and that beyond the mountain there is the family of their mother, or uncle, or something like that. There is a large network of groups that, if necessary, could help each other. I think this is where we would like to go to find differences between Neanderthals and modern humans.
Via Marc Kissel
I've been fascinated by the history and development of sign language for a while now. Closely tied to local Deaf cultures, individual sign languages have deep roots in the home-made systems people came up with in order to communicate with one another and with their families, at times when Deaf people were often a lot more socially isolated than they are today. That means that each sign language is unique — even British and American sign language aren't at all the same thing. English is spoken in both countries, but the cultural histories that gave birth to the two sign languages were different enough to produce two completely separate, mutually unintelligible languages. (Meanwhile, American sign language is much closer to French Sign Language, because it has roots in a system imported from France in the 19th century.)
In that case, it was physical distance that led to the development of two different sign languages. But within the United States, the same thing happened because of social distance. Turns out, there is a Black American sign language that is distinctly different, as a language, from ASL. Its roots lie in segregation, and especially in separate-and-not-at-all-equal school systems. Ironically, though, that meant sign language had a more prominent place in black schools for much of the 20th century. At white schools, up until the 1970s and 1980s, students were heavily pressured to speak and lip-read rather than sign — because oral communication was thought to be better for deaf students. Meanwhile, at black schools, sign language continued to be heavily used, growing and changing. By the late 1960s, the two systems were almost completely different languages.
Carolyn McCaskill remembers exactly when she discovered that she couldn’t understand white people. It was 1968, she was 15 years old, and she and nine other deaf black students had just enrolled in an integrated school for the deaf in Talledega, Ala.
... The teacher’s quicksilver hand movements looked little like the sign language McCaskill had grown up using at home with her two deaf siblings and had practiced at the Alabama School for the Negro Deaf and Blind, just a few miles away. It wasn’t a simple matter of people at the new school using unfamiliar vocabulary; they made hand movements for everyday words that looked foreign to McCaskill and her fellow black students.
...So, McCaskill says, “I put my signs aside.” She learned entirely new signs for such common nouns as “shoe” and “school.” She began to communicate words such as “why” and “don’t know” with one hand instead of two as she and her black friends had always done. She copied the white students who lowered their hands to make the signs for “what for” and “know” closer to their chins than to their foreheads. And she imitated the way white students mouthed words at the same time as they made manual signs for them.
• Martha's Vineyard: Birthplace of American Deaf Culture
• What the invention of Nicaraguan sign language teaches us about the human brain
• How To: Spell with your fingers in different languages
• CWA: Your language is your worldview
• The sign language of science
• Learn the sign language of physics, male genitalia
Via Stan Carey
It is very hard, and very weird, to try to get a handle on how human health has changed between the 19th century and today. Obviously, the way we live has changed dramatically. But understanding how that impacts health (or doesn't) is complicated by the fact that healthcare, science, and public health research changed dramatically during those years as well.
And all that science hasn't happened in a vacuum. The names we give various disorders change. Whether we consider something to be a disorder at all might change. And our cultural understanding changes, too—especially when it comes to mental illness.
At the Mind Hacks blog, Vaughan Bell has an excellent breakdown of two recent studies that try to put the modern diagnosis of post-traumatic stress disorder (PTSD) into a cultural and historical context. Many people assume that PTSD is just a new name for something that has always existed—look at shell shock, which made it onto Downton Abbey last season. But these new papers suggest that the distinction between what soldiers experienced in the past and what they experience today might go deeper than naming conventions.
The diagnosis of PTSD involves having a traumatic experience and then being affected by a month of symptoms from three main groups: intrusive memories, hyper-arousal, and avoidance of reminders or emotional numbing ... there has been a popular belief that PTSD has been experienced throughout history but simply wasn’t properly recognised. Previous labels, it is claimed, like ‘shell shock’ or ‘combat fatigue’, were just early descriptions of the same universal reaction.
But until now, few studies have systematically looked for PTSD or post-trauma reactions in the older historical record. Two recent studies have done exactly this, however, and found no evidence for a historical syndrome equivalent to PTSD.
A study just published in the Journal of Anxiety Disorders looked at the extensive medical records for soldiers in the American Civil War, whose mortality rate was about 50-80 times greater than that of modern soldiers fighting in Iraq and Afghanistan. In other words, there would have been many more soldiers having terrifying experiences, but despite the higher rates of trauma and mentions of other mental problems, there is virtually no mention of anything like the intrusive thoughts or flashbacks of PTSD.
David Dobbs adds some more context to Bell's post at the Neuron Culture blog.
The story is familiar to us today: Somebody, usually a young man, walks into a public place, kills a bunch of people seemingly at random, and (usually) ends the murder spree with a suicide-by-cop.
But this story—at least, in Western culture—is startlingly new, relatively speaking. In fact, Paul Mullen, a forensic psychologist, says we can pin a date and place on the first time it happened. On September 4, 1913, in the German towns of Degerloch and Mühlhausen an der Enz, Ernst August Wagner killed his wife, his children, and at least nine strangers. He shot more than 20 people and set several fires during his killing spree. He ended up spending the rest of his life in an insane asylum.
But when we try to pin killings like these on mental illness, Mullen says, we're not quite hitting the right point. The people who go on killing sprees are mad, sure. But that's not the same thing as diagnosable, objective, physical mental illness. Only about 10% of the people arrested for crimes like this had actual mental illnesses. In fact, Mullen thinks these killings have more to do with culture than brain chemistry. His argument is interesting, and it might sound a little similar to the old "angry music made him do it!" trope. But what Mullen is talking about is different from that. Science journalist David Dobbs tries to explain the distinction:
I’m not saying the movies made Holmes crazy or psychopathic or some such. But the movies are an enormous, constant, heavily influential part of an American culture that fetishizes violence and glamorizes, to the point of ten-year wars, a militarized, let-it-rain approach to conflict resolution. And culture shapes the expression of mental dysfunction — just as it does other traits. This is why, say, relatively ‘simple’ schizophrenia — not the paranoid sort — takes very different forms in Western and some Eastern cultures. On an even simpler level, this is why competitive athleticism is more likely to express itself as football (the real kind) in Britain but as basketball in the U.S. Culture shapes the expression of behavioral traits.
It's a distinction worth chewing on.
Read the rest of David Dobbs' post about the difference between blaming movies for violence and talking about the consequences of violence in culture.
Read a very good post at the Neuroanthropology blog that expands on Paul Mullen's ideas and provides more interesting links.
In certain parts of the United States (including Birmingham, Alabama) shooting guns into the air is one way that some locals celebrate major holidays, like the 4th of July.
For those of us who didn't grow up with celebratory gunfire, this cultural practice can be difficult to understand—especially given the fact that it is dangerous. Bullets that go up come back down, and they can injure and kill people. It's unclear exactly how risky the practice is. If you're hit by a falling bullet, your chances of dying are significantly higher than with a typical gunshot wound, largely because falling bullets tend to strike people in the head. And a study of celebratory gunfire injuries in Los Angeles turned up 118 victims, including 38 deaths, between 1985 and 1992. But I wasn't able to find a good analysis that put those deaths into perspective against shots fired (that is, for every X shots fired into the air, Y people are injured). Without that, it's hard to tell whether celebratory gunfire is really, really dangerous or only kind of dangerous sometimes. Either way, when you do it, especially in urban areas, you're taking a risk of killing someone.
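To see why that missing denominator matters, here's a back-of-the-envelope sketch in Python. The victim counts come from the Los Angeles study above; the shots-fired total is a made-up placeholder, since that's exactly the number nobody seems to have.

```python
# Back-of-the-envelope risk estimate for celebratory gunfire.
# Victim counts are from the 1985-1992 Los Angeles study cited above.
# SHOTS_FIRED is a hypothetical placeholder -- the real denominator is unknown.

INJURIES = 118           # falling-bullet victims, Los Angeles, 1985-1992
DEATHS = 38              # fatalities among those victims

SHOTS_FIRED = 5_000_000  # HYPOTHETICAL total shots fired over the same period

injury_rate = INJURIES / SHOTS_FIRED
death_rate = DEATHS / SHOTS_FIRED

print(f"Injuries per shot: {injury_rate:.2e} (about 1 in {SHOTS_FIRED // INJURIES:,})")
print(f"Deaths per shot:   {death_rate:.2e} (about 1 in {SHOTS_FIRED // DEATHS:,})")
```

Plug in a different guess for SHOTS_FIRED and the per-shot risk swings by orders of magnitude, which is exactly why the raw victim counts alone can't tell us how dangerous any single shot is.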
Usually, though, when we talk about celebratory gunfire, we're talking about unorganized huzzahs fired off with impromptu vigor in backyards and at family gatherings. In Cherryville, North Carolina, however, the whole thing is a lot more official ... and safer. Starting at midnight on New Year's Eve, the Cherryville New Year's Shooters go door to door throughout a three-county area singing traditional New Year's shooting songs, and calling residents out to shoot with them. It's a lot like going caroling, but with weaponry. Thankfully, it's all done with blanks these days.
For more than 18 hours, and through three different counties — Gaston, Lincoln, and Cleveland — the shooters follow the route bringing ceremony and good tidings to neighbors. At each stop along the way, a crier recites the “Chant of the New Year’s Shooters,” and then participants fire their muskets, one by one, each loaded with black powder, no bullets allowed. The noise of the musket is thought to drown out evil spirits and bad luck, while the chant — part poem, part speech, and part song — asks for peace and prosperity in the New Year.
Joyce Green sent this story in to me. While she was raised in one of these communities—Shelby, North Carolina—she would like you to know that "I never wake up on New Year’s day and think, 'I’d better get on down to the nursing home and fire off a couple of shots to bring in the New Year right.'"
Read more about the Cherryville New Year's Shooters
Read more about the dangers of celebratory gunfire that involves real bullets.
On Tor.com, author and reviewer Jo Walton has an insightful look at why so many science fiction readers and writers are discussing David Graeber's Debt: The First 5,000 Years, a book that is already a darling of the Occupy movement:
One of the problems with writing science fiction and fantasy is creating truly different societies. We tend to change things but keep other things at societal defaults. It’s really easy to see this in older SF, where we have moved on from those societal defaults and can thus laugh at seeing people in the future behaving like people in the fifties. But it’s very difficult to create genuinely innovative societies, and in genuinely different directions. As a British reader coming to SF there were a lot of things I thought were people’s amazing imagination that turned out to be normal American things and cultural defaults. And no matter how much research you do, it’s always easier in the anglosphere to find books and primary sources in English and about our own history and the history of people who have interacted with us. And both history and anthropology tend to be focused on one period, one place, so it’s possible to research a specific society you know you want to know about, but hard to find things that are about the range of options different societies have chosen.
What Debt does is to focus on a question of morality, first by framing the question, and then by examining how a really large number of human societies over a huge geographical and historical range have dealt with this issue, and how they have interacted with other people who have very different ideas about it. It’s a huge issue of the kind that shapes societies and cultures, so in reading it you encounter a whole lot of contrasting cultures. Graeber has some very interesting ideas about it, and lots of fascinating details, and lots of thought-provoking connections.
For a more academic discussion of Debt among political scientists and economists, see this Crooked Timber seminar on the book, and the author's reply. I liked Debt, but was also frustrated by the amount of circling back and meandering the author engages in. That said, it was one of my more thought-provoking reads of 2011.