My latest Locus column is "Where Characters Come From," and it advances a neurological theory for why fiction works, and where writers find their characters.
As a writer, I know that there’s a point in the writing when the engine of the story really seems to roar to life, and at that moment, the characters start feeling like real people. When you start working on a story, the characters are like finger-puppets, and putting words into their mouths is a bit embarrassing, like you’re sitting at your desk waggling your hands at one another and making them speak in funny, squeaky voices. But once those characters "catch," they become people, and writing them feels more like you’re recounting something that happened than something you’re making up. This reality also extends to your autonomic nervous system, which will set your heart racing when your characters face danger, make you weepy at their tragedies, and have you grinning foolishly at their victories.
In some ways, this is even weirder. For a writer to trick himself into feeling emotional rapport with the imaginary people he himself invented seems dangerous, akin to a dealer who starts dipping into the product. Where does this sense of reality – this physical, limbic reaction to inconsequential non-events – spring from?
The New Yorker's profile of Apollo Robbins is one of the most interesting things I've read all year (ha). Robbins is a self-trained virtuoso pickpocket who once managed to lift a pen out of Penn Jillette's pocket, steal the ink cartridge, and return the pen, all while he was demurely insisting to Jillette that he wasn't really comfortable performing in front of magicians.
Josh grew increasingly befuddled, as Robbins continued to make the coin vanish and reappear—on his shoulder, in his pocket, under his watchband. In the middle of this, Robbins started stealing Josh’s stuff. Josh’s watch seemed to melt off his wrist, and Robbins held it up behind his back for everyone to see. Then he took Josh’s wallet, his sunglasses, and his phone. Robbins dances around his victims, gently guiding them into place, floating in and out of their personal space. By the time they comprehend what has happened, Robbins is waiting with a look that says, “I understand what you must be feeling.” Robbins’s simplest improvisations have the dreamlike quality of a casual encounter gone subtly awry. He struck up a conversation with a young man, who told him, “We’re going to Penn and Teller after this.”
“Oh, then you’ll probably want these,” Robbins said, handing over a pair of tickets that had recently been in the young man’s wallet.
When Robbins hits his stride, it starts to seem as if the only possible explanation is an ability to start and stop time. At the Rio, a man’s cell phone disappeared from his jacket and was replaced by a piece of fried chicken; the cigarettes from a pack in one man’s breast pocket materialized loose in the side pocket of another; a woman’s engagement ring vanished and reappeared attached to a key ring in her husband’s pants; a man’s driver’s license disappeared from his wallet and turned up inside a sealed bag of M&M’s in his wife’s purse.
After the performance, Robbins and I had dinner at the bar. “A lot of magic is designed to appeal to people visually, but what I’m trying to affect is their minds, their moods, their perceptions,” he told me. “My goal isn’t to hurt them or to bewilder them with a puzzle but to challenge their maps of reality.”
My fascination with the profile doesn't just come from the recounting of Robbins's many impressive deeds (though they are impressive, and if I ever had cause to book a magician for a gig, he'd be it), but also the struggle that Robbins has had in coming up with ways to maximize his prodigious talent.
Reading further down, I noticed that Apollo Robbins collaborated with neuroscientists on a book called Sleights of Mind: What the Neuroscience of Magic Reveals About Our Everyday Deceptions, which I've ordered. I was also unsurprised to learn that Robbins had consulted on the late, lamented caper-show Leverage, which explains quite a lot about why that show was so good.
Using brain scans, scientists are trying to find out how great freestyle rappers drop dope lines. Discovery News reports on a study conducted by researchers at the voice, speech and language branch of the National Institute on Deafness and Other Communication Disorders (NIDCD) at the National Institutes of Health (NIH). Here's the paper: "Neural Correlates of Lyrical Improvisation: An fMRI Study of Freestyle Rap." (via Clive Thompson; image photoshopped by me from the original study)
Miles O'Brien has a wonderful piece on NewsHour about the neuroscience of sleep and other forms of brain-rest, including meditation. I was present for some of the taping and research, and I love how the story turned out.
Sleep deprivation can cause serious health and cognitive problems in humans. In short, it can make us fat, sick and stupid. But why do humans need so much sleep? Science correspondent Miles O'Brien talks to scientists on the cutting edge of sleep research and asks if there's any way humans might evolve to get by with less.
Draper Laboratory and University of South Florida researchers are developing a prototype "brain-on-a-chip." No, it's not an AI but rather a combination of living cells and microfluidics in a bio-artificial model of the brain's neurovascular unit, the system of neurons, capillaries, and other cells that control the supply of nutrients to the brain. Eventually, such a device could be used to test medications and vaccines. And that's just the beginning.
“In addition to screening drugs, we could potentially block vascular channels and mimic stroke or atherosclerotic plaque," says lead researcher Anil Achyuta. "Furthermore, this platform could eventually be used for neurotoxicology, to study the effects of brain injury like concussions, blast injuries, and implantable medical devices such as in neuroprosthetics.”
Psychobiologist Dario Maestripieri returned from a neuroscience meeting in New Orleans and posted to Facebook that he was disappointed with the "unusually high concentrations of unattractive women. The super model types are completely absent. What is going on? Are unattractive women particularly attracted to neuroscience? Are beautiful women particularly uninterested in the brain?"
He added, "No offense to anyone."
Many people took offense, starting with the Drugmonkey blog, which reposted the remarks.
Janet Stemwedel on Adventures in Ethics and Science has a good post explaining why she is offended by this:
The thing is, that denial is also the denial of the actual lived experience of a hell of a lot of women in science (and in other fields -- I've been sexually harassed in both of the disciplines to which I've belonged).
I can't pretend to speak for everyone who calls out sexism like Maestripieri's, so I'll speak for myself. Here's what I want:
1. I want to shine a bright light on all the sexist behaviors, big or small, so the folks who have managed not to notice them so far start noticing them, and so that they stop assuming their colleagues who point them out and complain about them are making a big deal out of nothing.
2. I want the exposure of the sexist behaviors to push others in the community to take a stand on whether they're cool with these behaviors or would rather these behaviors stop. If you know about it and you don't think it's worth talking about, I want to know that about you -- it tells me something about you that might be useful for me to know as I choose my interactions.
3. I want the people whose sexist behaviors are being called out to feel deeply uncomfortable -- at least as uncomfortable as their colleagues (and students) who are women have felt in the presence of these behaviors.
4. I want people who voice their objections to sexist behaviors to have their exercise of free speech (in calling out the behaviors) be just as vigorously defended as the free speech rights of the people spouting sexist nonsense.
5. I want the sexist behavior to stop so scientists who happen to be women can concentrate on the business of doing science (rather than responding to sexist behavior, swallowing their rage, etc.)
I've got a daughter who, at four and a half, wants to be a scientist. Every time she says this, it makes me swell up with so much pride, I almost bust. If she grows up to be a scientist, I want her to be judged on the reproducibility of her results, the elegance of her experimental design, and the insight in her hypotheses, not on her ability to live up to someone's douchey standard of "super model" looks.
The methodology is straightforward. You take your subject and slide them into an fMRI machine, a humongous, sleek white ring, like a donut designed by Apple. Then you show the subject images of people engaging in social activities — shopping, talking, eating dinner. You flash 48 different photos in front of your subject's eyes, and ask them to figure out what emotions the people in the photos were probably feeling. All in all, it's a pretty basic neuroscience/psychology experiment. With one catch. The "subject" is a mature Atlantic salmon.
And it is dead.
Between the downfall of Jonah Lehrer, and Naomi Wolf's new book that claims chemicals in women's brains force us to demand our lovers shower us with roses and candy and refer to us as "goddess"*, there's been some growing backlash against the long-popular idea of better living through neuroscience. You know what I'm talking about here: You (yes, you!) can succeed at work, be more creative, improve your relationships, and have a better sex life — all you have to do is read this one interpretation of the latest in neuroscience research!
Perhaps unsurprisingly, that pitch oversells the reality. What we know about how the brain works isn't really that clear cut. But more than that, the idea of scientific self-help quite often has to severely distort science in order to make any sense. The public comes away with a massive misunderstanding of what MRI does and doesn't tell us, what hormones like dopamine actually do, and what the lab tells us about real life.
There are two big essays that you need to read before you pick up another story or book that tries to make connections between cutting-edge brain science and real life. The first, in New Statesman, is by Steven Poole and offers a broad overview of why it's such a problem when neuroscience becomes neuro-speculation. The second, by Maia Szalavitz at Time Magazine's Healthland blog, focuses on Naomi Wolf's new book and uses it as a springboard to talk about the bigger issue of brain chemicals, what they are, and what they aren't.
Moran Cerf is a neuroscientist. In the video above, which Cory posted on Friday, he tells the story of how a paper he published in the journal Nature ended up getting him phone calls from Apple and invitations to appear with Christopher Nolan on the publicity tour for Inception. The problem: Nolan, Apple, and a lot of other people thought Cerf had figured out a way to record dreams. He hadn't. Not even close.
Cory's piece, and a link that Xeni sent me to the video, got me reading up on this case, and I wanted to provide more of the scientific background, so you can see clearly what Cerf's research was really about and how the media got it wrong. Back in 2010, Cerf and his colleagues were trying to figure out how humans look at a world cluttered with different faces, objects, smells, and sounds and manage to filter out the specific things we're interested in. What happens when I look at a messy desk and immediately focus in on one piece of paper? If there are two objects on the desk that are familiar to me, but only one of them really matters, how does my brain resolve the conflict and direct my attention in a single direction?
Turns out, at least under laboratory conditions, humans can filter out the important stuff by consciously controlling the firing of neurons in their own brains. Here's how Alison Abbott at Nature News described the research at the time:
In the last six years or so they have shown that single neurons can fire when subjects recognise — or even imagine — just one particular person or object. They propose that activity in these neurons reflect the choices the brain is making about what sensory information it will consider further and what information it will neglect.
In this experiment, the scientists flashed a series of 110 familiar images — such as pictures of Marilyn Monroe or Michael Jackson — on a screen in front of each of the 12 patients and identified individual neurons which uniquely and reliably responded to one of the images. They selected four images for which they had found responsive neurons in different parts of a subject's MTL. Then they showed the subject two images superimposed on each other. Each was 50% faded out.
The subjects were told to think about one of the images and enhance it.
If you watch or read much science fiction, you know that all it takes to suspend disbelief about fictional science is an explanation that sounds good on the surface and makes use of terms and ideas that your audience doesn't fully understand but does find emotionally compelling. It's why "radioactive spider" made sense in the 1960s.
Apparently (and unfortunately) this effect is true for actual science as well.
This slide comes from a lecture given by Oxford University neuroscientist Dorothy Bishop. Basically, it's showing that an explanation of a psychological phenomenon became more believable if you added in some hand-wavey neuroscience and pictures of brain scans. Suddenly, an explanation of human behavior that's based on circular reasoning and poor logic changes from something lay people won't accept to something we're happy to buy into.
Bishop's entire, hour-long presentation on the science of bad neuroscience is available to watch online for free. If you don't have time, check out this summary of the key points at the Neurobonkers blog.
Via Mind Hacks
Time is relative. Remember how each day in grade school (especially summer days) seemed to last for an eternity? Ever notice how it seems to take forever to travel a new route on your bike, while the return trip along the same path is done in the blink of an eye?
Turns out, both of those things are connected and they have important implications for the nature of memory. There's a great summary of the science on this up at The Irish Times. It's written by William Reville, emeritus professor of biochemistry at University College Cork.
The key issue, according to Reville, is that the amount of information your brain can store during a given time period isn't really dependent on the length of that time period. You could store up a lot of new information during 10 minutes of a really interesting lecture. You might store only a little new information during 10 minutes of walking your dog along a path you know very well.
The higher the intensity, the longer the duration seems to be. In a classic experiment, participants were asked to memorise either a simple [a circle] or a complex figure. Although the clock-time allocated to each task was identical, participants later estimated the duration of memorising the complex shape to be significantly longer than for the simple shape.
... [H]ere is a “guaranteed” way to lengthen your life. Childhood holidays seem to last forever, but as you grow older time seems to accelerate. “Time” is related to how much information you are taking in – information stretches time. A child’s day from 9am to 3.30pm is like a 20-hour day for an adult. Children experience many new things every day and time passes slowly, but as people get older they have fewer new experiences and time is less stretched by information. So, you can “lengthen” your life by minimising routine and making sure your life is full of new active experiences – travel to new places, take on new interests, and spend more time living in the present.
I think this also has some implications for my exercise routine. I am well aware that my ability to run any distance at all is heavily dependent on psychological factors. I am not one of those people who likes to go running in new places, along unfamiliar trails, because it has always made me feel like the distance was much, much longer — and, consequently, led me to stop running and start walking sooner than I actually had to. I've had a lot more luck running on tracks and elliptical machines — situations where it seems to be easier for me to get into a zone and lose track of time. When I run that way, it's my physical limitations that matter, not my psychological ones.
Of course, I know a lot of people who feel exactly the opposite. Maybe, for those people, running in a routine situation, like a track, makes them start to think more about their day or what's going on around them, and processing all that information makes the workout seem longer. I'm not sure. But this is awfully interesting.
Via Graham Farmelo
I was on Minnesota Public Radio's morning show The Daily Circuit today—along with Ivan Semeniuk, chief of correspondents for the journal Nature—talking about the Curiosity rover, human evolution, and dealing with the big unknowns in science. You can listen to that segment online.
But right at the end of my bit, as I was packing up my stuff to leave the studio, I heard the next segment on the show, and it was AWESOME. Ask a Neuroscientist is, precisely, reader questions answered by a neuroscientist. But you have to read the transcript for today's first question, where a 5-year-old exchanged ideas with Baylor College of Medicine neuroscientist David Eagleman.
Madeline, 5 years old: How does a brain think?
David Eagleman: We don't know. Part of modern neuroscience's quest is to answer that. One theory goes that, in the same way brains control muscle movement, your brain controls your arms and legs and mouth and so on. Thought might be, essentially, covert muscle movement. In other words, it's going through the same routine that says 'bend this, flex that, extend that' - except that it's not controlling a muscle. Instead, it's controlling something conceptual.
Read the rest at The Daily Circuit website