• The hilariously practical reason why He-Man has a giant green tiger

    I'm endlessly fascinated by the behind-the-scenes chaos of the 80s action-figure-cartoon pipeline. Advertising directly to kids was frowned upon — but no one would necessarily stop you from creating a kids show that inadvertently advertised its own action figure line. Who cared about storyline or quality? As long as the next knockoff GI Joe looked cooler than the last one, with a cheap "hologram" on its chest or whatever, and sold well, nothing else mattered!

    This is absurd, but it also made for some interesting creative tensions and experiments. Consider: Battle Cat, the giant green tiger that hung around with He-Man. Masters of the Universe was already kind of an odd franchise — an obvious Conan ripoff that also tried to cash in on Star Wars' success by setting its sword-and-sorcery stories in space, with guns and robots! In a franchise that arguably already had too much going on, a giant green tiger stands out as almost too normal.

    As one of the creators explains in this clip from Netflix's The Toys That Made Us, it turns out this was an entirely pragmatic decision.

    Basically, they had promised to deliver a cool new toy line, but time and money were getting tight. Someone on the creative team realized that Mattel already had a mold for a tiger action figure from a different, older toy line — but it was at a different scale, twice as large as the He-Man figures. Still, re-using an existing mold could save a lot of money. So they slapped a saddle on it and called it a day. A steed-sized tiger would hardly be the weirdest thing about the franchise, right?

    The way the story is told above, though, really goes to show how by-the-seat-of-their-pants this absurd operation was.

  • How conspiracy theories challenge Snopes' mission to find the "truth"

    I've mentioned here before that I recently wrote a novel about an addiction support group for conspiracy theorists who accidentally blow a hole in reality (any publishers or lit agents out there: hit me up). One of the fun writing challenges with the book was trying to get the characters to reach common ground, despite the irreconcilable differences between each of their respective conspiracy beliefs. If these are people who have all chosen to seek help for their obsessions, then how can any of them rightly tell anyone else that what they believe is weird or wrong — even if that belief involves God, or gravity?

    The answer didn't come easy in dramatic writing. In reality, it comes even harder for places like the long-standing fact-checking site Snopes. Over at Medium, author Colin Dickey (Ghostland: An American History in Haunted Places and The Unidentified: Mythical Monsters, Alien Encounters, and Our Obsession with the Unexplained) has a great essay exploring the 25-year history of the site, and the unique predicament it finds itself in at a time when different Americans have radically different definitions of "objective reality." As Dickey relays, the early days of Snopes were spent debunking urban legends like Bigfoot and rumors of Halloween candy secretly spiked with HIV-infected needles — which sounds absurd, but was in fact an important service in those early days of the Internet:

    Once these urban legends moved to the internet, they became singularly vulnerable to debunking, and Snopes proved that sunlight was indeed a magnificent disinfectant. The power of an urban legend depends in part on its vagueness, but also its specificity. […] Friend-of-a-friend stories lose their believability once you can Google the friend of a friend's actual name, and they lose much of their power once you can trace how they've mutated and spread, placing each story alongside all of its various variants. Snopes was able to build its reputation and its following on this pretense — that diligent research could discredit even the most virulent of stories.

    Reading Snopes in the era of urban legends reassured us that the world wasn't as scary as we thought, but it did more than that. Urban legends also have the capacity to generate shame. Believing them is seductive, but as soon as one is debunked, you might feel dumb and sheepish for thinking you could ever believe it in the first place. Snopes allowed us to feel superior to those who'd been duped while covering up our own gullibility. It was the lights in the theater coming on after the horror movie—reassuring, but in a way where no one ever had to know how scared you were in the dark.

    (That passage reminds me of a joke I've heard a few variations on recently — the idea that our parents used to warn us about all the scary strangers on the Internet, and now they'll torpedo their own lives over a Satanic Democrat Pizza Pedophile Ring or a 5G-laced COVID vaccine rumor they saw on Facebook.)

    But, Dickey explains, the post-9/11 era brought some new challenges to Snopes' reliable debunking tactics:

    The slow evolution of Snopes' focus began with the September 11 attacks, as the Mikkelsons found themselves increasingly responding to conspiratorial (and often anti-Semitic) rumors about who brought down the towers. By the time Barack Obama was elected, the partisan nature of such rumors was increasingly evident: Despite attempts to maintain neutrality (David Mikkelson told Wired's Michelle Dean that he was "essentially apolitical"), Fox News and other right-wing sources targeted Snopes as a stalking-horse for liberal bias. And as conspiracy theories surrounding Snopes continued to swirl, the site itself slowly moved from deep-fried rodents to the Deep State.

    […]

    Conspiracy theories also invert those feelings of gullibility when one has fallen for an urban legend. With conspiracy theories, it's the sheeple willing to take the word of public health experts and politicians who are the naive ones. Given their perpetual ironic distance from accepted fact, conspiracy theorists resist being shamed no matter how much debunking you subject them to.

    As such, Snopes' long-held formula of wry, patient debunking has increasingly fallen on deaf ears. And as authority, expertise, and facts themselves have all been called into question, the whole mechanism of debunking has lost its power. 

    It's a thoughtful essay that gets at a lot of the hard questions around ideas of "fact-checking" and "objectivity." There are plenty of valid critiques of these concepts: how even what we think of as a fact-based retort is inherently framed by our pre-existing unconscious biases, which shape the language we use to describe those facts, which in turn prevents us from presenting them as something truly "objective," et cetera. But, as with any grain of truth, those legitimate questions can easily be weaponized by bad-faith actors and turned into dangerous propaganda in the form of conspiracies. And it's hard to stop that without tearing down the entire pedagogical and philosophical system our society was built on.

    Snopes Debunked the World. Then the World Changed. [Colin Dickey / Medium]

    Image: Markus Allen / Flickr (CC-BY-SA 2.0)

  • See Rebecca Black of "Friday" fame as Harley Quinn

    Rebecca Black has had a strange decade since she first went accidentally viral with the so-bad-it's-good pop hit "Friday." She recently released a 10-year-anniversary version of the song in collaboration with hyper-pop artist Dorian Electra, who also brought Black in for their own Joker-themed music video. The song is called "Edgelord," and everything about it is very edgelord.

    But where the Joker goes, Harley Quinn follows — and this time, she's played by Rebecca Black.

    I'm honestly not sure where the irony and satire begin or end with this song and video. Is it earnest? Is it parody? Something tells me that lack of clarity is largely the point.

  • Space travel can literally shrink your heart

    A recent paper in the scientific journal Circulation, "Cardiac Effects of Repeated Weightlessness During Extreme Duration Swimming Compared With Spaceflight," presents new findings on the impact of space travel on human hearts. When astronaut Scott Kelly returned to Earth after nearly a year aboard the International Space Station, his heart had apparently lost more than 25% of its original mass. As The New York Times explained:

    A smaller heart did not appear to have any ill effects on Mr. Kelly.

    "He did remarkably well over one year," said Dr. Benjamin D. Levine, the senior author of the Circulation paper and a professor of internal medicine at the University of Texas Southwestern Medical Center and Texas Health Presbyterian Dallas.

    "His heart adapted to the reduced gravity," Dr. Levine said. "It didn't become dysfunctional, the excess capacity didn't get reduced to a critical level. He remained reasonably fit. His heart shrank and atrophied kind of as you'd expect from going into space."

    Without the pull of gravity, the heart does not have to pump as hard, and like any other muscle, it loses some fitness from less strenuous use. For Mr. Kelly, the shrinkage occurred even though he exercised almost every day on the space station, a regimen that has proved effective at limiting the brittling of bone and loss of muscle overall.

    Makes sense!

    Space travel is known to have some other weird physical effects as well, including bloated heads, brittle bones, and swollen eyeballs.

    Note to Future Space Travelers: Prepare for a Shrinking Heart [Kenneth Chang / The New York Times]

    Image via Public Domain Pictures

    Full Disclosure: I also write for Wirecutter, which is owned by the New York Times Company, which also publishes The New York Times.

  • That time Angie Bowie almost made a Black Widow and Daredevil movie

    In a recent Wikipedia wormhole binge, I came across quite a surprise: Angie Bowie, former wife of David Bowie and estranged mother of writer/director Duncan Jones, had once procured the rights to Daredevil and Black Widow from Marvel in the mid-1970s, and tried to make a movie starring herself and Ben Carruthers. As Bowie explained to the Daredevil fanzine Man Without Fear:

    Thank you for your interest and e-mail. I am surprised that you ask me that question. I received permission from Stan Lee to have the rights to Daredevil and Black Widow for a year. We were unable to place the series.

    Actor, writer, Benny Carruthers and I did the photo shoot with Terry O'Neill and Natasha Kornilkoff costume designer and Barbara Daly – make-up in London and that was all that ever happened. Unfortunately at that time it was considered too difficult and expensive to film, special effects etc.

    I appreciate your asking,

    kindest regards,
    Angie

    A few of those promo photos are still available online, but the project never got any farther than that. Which is probably for the best, because Daredevil's horns look worse than Captain America's gross rubber ears from the 90s.

  • How one electronic music producer wants to decolonize music software

    I've spent a lot of my time over this past pandemic year improving my skills with Digital Audio Workstation (DAW) software. I also feel like I tend to have a pretty keen eye for identifying issues around colonialism. Yet somehow, I never considered the relationship between the two until I read this Pitchfork article about Khyam Allami, a musician and musicologist of Iraqi descent who was born in Syria but raised in London.

    Allami had grown up in London playing guitar and drums in punk bands. He was exploring Arabic music for the first time—or at least trying to, but the music's distinctive quarter-tones were proving difficult to emulate. The software simply wasn't made for him.

    While every part of the world has its own distinct acoustic instruments, electronic producers around the globe must make do with a narrow range of production tools. Popular digital audio workstations like Ableton, FL Studio, Logic, and Cubase were built primarily to facilitate music-making in a Western mode, according to the principles of European classical music. If an artist wants to compose with the common features of music from Africa, Asia, or Latin America, they have to fight against the software and rely on complex workarounds.

    […]

    Through his research, Allami discovered that it had been possible to explore microtonality using MIDI, the language of electronic music tools, since 1992, but software developers had not implemented functions to make microtonal tunings intuitive to use. As one product manager of a popular music notation program told him, they simply didn't believe that there was a market for such features.

    For those not hip to the lingo of music theory: Western music relies on half-step and whole-step intervals. There are 12 half-steps in an octave (C, C#/Db, D, D#/Eb, E, F, F#/Gb, G, G#/Ab, A, A#/Bb, B), at which point the scale repeats at C, an octave higher. But non-European music doesn't always conform to that system. In traditional music from places like India, China, and Iraq, for example, you can find quarter-tones (notes that fall between the Western notes) or other intervals entirely.

    But most digital music software is still based on the piano roll, which follows that 12-note, half-step octave. While you can custom-program your own microtonal tunings via MIDI, it's not always easy.
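    To make that concrete, here's a minimal sketch in plain Python of the classic workaround: approximating a quarter-tone by pairing an ordinary MIDI note with a pitch-bend message. It assumes the common default bend range of ±2 semitones, and the function names are mine for illustration, not from any particular DAW or MIDI library.

        import math

        BEND_RANGE = 2.0    # assumed default: pitch wheel spans +/- 2 semitones
        BEND_CENTER = 8192  # 14-bit pitch-bend value meaning "no bend" (range 0-16383)

        def freq_to_midi(freq_hz: float) -> float:
            """Convert a frequency in Hz to a (possibly fractional) MIDI note number."""
            return 69 + 12 * math.log2(freq_hz / 440.0)

        def note_and_bend(fractional_note: float) -> tuple[int, int]:
            """Split a fractional MIDI note into (nearest whole note, pitch-bend value)."""
            note = math.floor(fractional_note + 0.5)  # nearest equal-tempered note
            offset = fractional_note - note           # leftover, in semitones
            bend = BEND_CENTER + round(offset / BEND_RANGE * BEND_CENTER)
            return note, max(0, min(16383, bend))

        # E half-flat: a quarter-tone (half a semitone) below E4 (MIDI note 64),
        # a characteristic pitch of Arabic maqamat such as Rast.
        print(note_and_bend(63.5))            # (64, 6144): play E4, bent a quarter-tone flat
        print(round(freq_to_midi(440.0), 2))  # 69.0: sanity check, A4 maps to MIDI note 69

    The catch, and part of why the article calls these workarounds complex: a pitch-bend message affects an entire MIDI channel, so every simultaneously sounding microtonal note needs its own channel.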

    So Allami spent the last 15 years on a journey (during which DAWs have evolved tremendously anyway). He founded a record label, Nawa Recordings, that highlights alternative Arabic music. And now he, along with collaborators Tero Parviainen and Samuel Diggins from the creative technology studio Counterpoint, has released two new pieces of free software, Leimma and Apotome, that help to decolonize your digital audio workstation. As Pitchfork explained:

    Leimma allows users to explore tuning systems from around the world or create their own, while Apotome offers generative music creation using these diverse tuning systems. They intend to give musicians a blank musical slate, rather than nudging them towards any specific musical tradition.

    Pretty cool!

    I ran into my own frustrations while working on my Irish folk album Forfocséic earlier this year; while the scales were still very basic European-based stuff, the rhythms are typically played slightly ahead of the beat. I made the mistake of trying to auto-correct the rhythms (a process called "quantizing") to make sure it all lined up … but the software wanted to break things down into very clean fractions of a beat that did not actually work with the music. I tried to use a digital cheat and ended up creating hours and hours of extra work for myself in having to undo it. So I can't imagine trying to get MIDI triggers to actually do what you want with non-Western music modes!
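    For the curious, quantizing boils down to snapping each note's onset to the nearest grid line, optionally only partway there. Here's a toy Python illustration of the idea (my own simplification, not any DAW's actual algorithm) that shows why full-strength quantizing flattens an ahead-of-the-beat feel:

        def quantize(onsets_in_beats, grid=0.25, strength=1.0):
            """Pull note onsets (measured in beats) toward the nearest grid line.

            strength=1.0 snaps exactly to the grid (the robotic mistake I made);
            smaller values preserve some of the human push ahead of the beat.
            """
            return [t + strength * (round(t / grid) * grid - t) for t in onsets_in_beats]

        # A phrase played slightly ahead of each sixteenth-note grid line:
        played = [0.0, 0.23, 0.48, 0.71, 0.97]
        print(quantize(played, strength=1.0))  # [0.0, 0.25, 0.5, 0.75, 1.0] -- feel gone
        print(quantize(played, strength=0.4))  # ~[0.0, 0.238, 0.488, 0.726, 0.982]

    Most DAWs expose a strength setting along these lines; dialing it down, instead of snapping at 100%, is roughly what would have saved me all those hours of undoing.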

    (I also vaguely recall doing a mashup of digital drum machine beats with some sitar samples for a project in my World Music class in college, but in hindsight, that was probably some gross appropriation, not to mention that it probably sucked as well.)

    Decolonizing Electronic Music Starts With Its Software [Tom Faber / Pitchfork]

    Leimma and Apotome [Khyam Allami and Counterpoint]

    Image: Public Domain via PxHere

  • Dracula's Castle in Romania now offering COVID-19 vaccines

    The BBC reports:

    Visitors to Dracula's castle are being jabbed with needles rather than fangs after a Covid-19 vaccine centre has been set up at the Transylvanian site.

    Medics with fang stickers on their scrubs are offering Pfizer shots to everyone who visits the 14th-century Bran Castle in central Romania.

    […]

    Bran Castle hopes its unique initiative will help boost vaccination numbers. During every weekend in May, anyone can turn up without an appointment to get a jab, and they also get free entry to the castle's exhibit of 52 medieval torture instruments.

    "The idea… was to show how people got jabbed 500-600 years ago in Europe," the castle's marketing director, Alexandru Priscu told Reuters news agency.

    While Bran Castle is widely believed to have been the inspiration for Dracula's castle in Bram Stoker's novel, there is little historical evidence that it ever hosted any actual vampires. Nor is it likely that Vlad the Impaler, the notoriously brutal warlord from nearby Wallachia who is similarly believed to have inspired that literary bloodsucker, ever set foot in the place. But hey, tourism's down all over the world, and Romanians gotta make a living somehow.

    Also, vampires gotta eat, and COVID-19 ain't a threat when you're undead.

    Covid: Dracula's castle in Romania offers tourists vaccine [BBC]

    Image: Public Domain via Universal Studios and CDC

  • The incredibly complicated question of how to translate inaugural poet Amanda Gorman

    Amanda Gorman blew a lot of people's minds when she performed her poem "The Hill We Climb" at the inauguration of President Joe Biden. The poem itself almost immediately became a bestseller, and her forthcoming debut children's book was soon in high demand as well.

    So naturally, the publishing world saw some international opportunities with Gorman's work. But that's where things got complicated. As The New York Times detailed, Gorman's poem posed a problem for translators, who saw a myriad of social and political challenges in transforming her words into a foreign language. Translations are rarely literal, and are more often a matter of "interpretation" — and how one individual personally interprets a text across both languages and cultures can have a tremendous impact on how readers of that translation receive and perceive the text. As The Times explains:

    A translator's main task is to capture the nuance and feeling of a language in a way that you could never achieve with Google Translate, and most translators have long happily wrestled with questions of how to faithfully translate works when they are about people completely unlike them.

    "No good translator denies they're bringing their own experience to a text," Mr. Robertson said.

    In a video interview, the members of the German team said they had certainly done such wrestling to make sure their translation of the text — about a weary country whose "people diverse and beautiful will emerge," — was faithful to Ms. Gorman's spirit.

    The team spent a long time discussing how to translate the word "skinny" without conjuring images of an overly thin woman, Ms. Gumusay said. They also debated how to bring a sense of the poem's gender-inclusive language into German, in which many objects — and all people — are either masculine or feminine. A common practice in Germany to signify gender neutrality involves inserting an asterisk in the middle of a word then using its feminine plural form. But such accommodations would be "catastrophic" to a poem, Ms. Strätling said, as it "destroys your metric rhythm." They had to change one sentence where Gorman spoke of "successors" to avoid using it, she added.

    "You're constantly moving back and forth between the politics and the composition," she said.

    "To me it felt like dancing," Ms. Gumusay said of the process. Ms. Haruna-Oelker added that the team tried hard to find words "which don't hurt anyone."

    Even something as simple as translating Gorman's self-description of her Blackness is rife with loaded connotations and complications. Does a White or Asian translator possess the full cultural grasp to figure out how, exactly, to transform "Black" into German, for example, while still capturing all of the nuances beyond color that are implied by the English usage of the word? The answer is: it's complicated, and made even more so by the limited number of translators available to choose from.

    By way of personal example: a friend of mine served as producer for the recent radio play adaptation of Mike Lew's Tiger Style!, a play about Asian-American experiences. The producing company provided an ASL interpretation of the story for those who couldn't enjoy the audio version … but it would, admittedly, be weird to make a D/deaf or hard of hearing person watch a non-Asian person interpret this explicitly Asian story in sign language.

    Meanwhile, there are languages in which the literal translation of a phrase like "black man" can colloquially refer to the Devil. In order to accurately translate Gorman's poem, with its references to her Blackness, a translator would need to understand and accommodate all of that cultural context. And that's no easy task!

    There are no easy answers to these kinds of translation issues, but I think the article from The New York Times does a good job of articulating the myriad arguments and nuances involved.

    Amanda Gorman's Poetry United Critics. It's Dividing Translators. [Alex Marshall / The New York Times]

    Image: DOD Photo by Navy Petty Officer 1st Class Carlos M. Vazquez II / Flickr (CC-BY-SA 2.0)

    (Full disclosure: I also write for Wirecutter, which is owned by the New York Times Company, which also publishes The New York Times.)

  • See THE DARK CRYSTAL as a ballet next year in London

    London's Royal Opera House has announced its upcoming post-COVID performance season, which will apparently include a new adaptation of Jim Henson's classic fantasy film The Dark Crystal as a ballet!

    As the company wrote in a press release:

    The Royal Ballet presents Company Wayne McGregor in The Dark Crystal: Odyssey, a work for family audiences choreographed and directed by Wayne McGregor. Based on Jim Henson's iconic 1982 film, this magical coming-of-age story brings together a team of world-class collaborators including artists Brian and Wendy Froud, composer Joel Cadbury, digital designers kontrastmoment, lighting designer Lucy Carter, dramaturg Uzma Hameed, costume designer Philip Delamore and face-and-body-artist Alex Box, with puppets and props from Jim Henson's Creature Shop.

    That's all the information that's available so far; supposedly more info (including dates) will be coming in June.

    While I'm more of a Labyrinth fan myself, I think the grandiose symbolism of ballet could be a cool way to bring the magical world of The Dark Crystal to life.

  • Read this free comic about the history of climate change denial propaganda

    Illustrator and animator Céline Keller undertook the fascinatingly complex task of adapting a research paper on climate change propaganda into a graphic narrative. Here's how the artist explains it:

    This is a comic adaption of the 'Discourses of Climate Delay' study by the Mercator Research Institute on Global Commons and Climate Change (MCC). I used the quotes from their supplementary materials and added some extra examples with context information gathered mostly from the fantastic Climate Disinformation Database at Desmog.

    You can download a PDF of the comic, or check it out at Keller's website. A print run is supposedly on the way as well.

    Discourses of Climate Delay (Lamb, W., Mattioli, G., Levi, S., Roberts, J., Capstick, S., Creutzig, F., Minx, J., Müller-Hansen, F., Culhane, T., Steinberger, J. / Adapted by Céline Keller)

  • These long-lost TMNT / Star Wars mashups are incredible

    Teenage Mutant Ninja Turtles have a long history of wild crossovers with other nerdy properties. But according to the Jedi Temple Archives, the best one that could have happened never actually did. In the early 90s, Hasbro/Kenner had let the Star Wars action figure license lapse, so Playmates made a pitch for the property, and even commissioned famed Ninja Turtles artist Michael Dooney to mock up some potential artwork for a line of TMNT / Star Wars crossover toys.

    I never knew I needed a pizza lightsaber all this time.

    These closeups with the call-out details make it even better. A nunchuck bowcaster! Sewer pipe lightsaber handles! A miniature Han Solo In Carbonite strapped to April Leia's belt! Various versions of Donatello strapped to the other Turtles' backs! This is the stuff of childhood dreams.

    There are even some customized fan-made toy versions of those, to show us all what could have been (such as Krang riding an AT-ST).

  • Ted Chiang talks about magic, AI, capitalism, and superheroes

    I'm not a regular listener to the Ezra Klein Show, but I tune in for the occasional interesting guest — so I assumed that the Ted Chiang episode was not one to miss, and I was right.

    Chiang is the author of speculative short fiction like the collection Stories of Your Life and Others, and is critically acclaimed by writers and reviewers alike for the way he combines hard sci-fi with heartfelt explorations of humans living in society. And that's basically what Chiang and Klein talk about for an hour.

    Here are a few excerpts I particularly enjoyed, like this observation about magic vs. science:

    When people quote the Arthur C. Clarke line [about sufficiently advanced science being indistinguishable from magic], they're mostly talking about marvelous phenomena, that technology allows us to do things that are incredible and things that, in the past, would have been described as magic, simply because they were marvelous and inexplicable. But one of the defining aspects of technology is that eventually, it becomes cheaper, it becomes available to everybody. So things that were, at one point, restricted to the very few are suddenly available to everybody. […]

    Magic is something which, by its nature, never becomes widely available to everyone. Magic is something that resides in the person and often is an indication that the universe sort of recognizes different classes of people, that there are magic wielders and there are non-magic wielders. That is not how we understand the universe to work nowadays.

    This leads into a cool examination of alchemy and religion.

    He also offers his thoughts on superheroes:

    Most of the most popular superhero stories, they are always about maintaining the status quo. Superheroes, they supposedly stand for justice. They further the cause of justice. But they always stick to your very limited idea of what constitutes a crime, basically the government idea of what constitutes a crime.

    Superheroes pretty much never do anything about injustices perpetrated by the state. And in the developed world, certainly, you can, I think, make a good case that injustices committed by the state are far more serious than those caused by crime, by conventional criminality. The existing status quo involves things like vast wealth inequality and systemic racism and police brutality. And if you are really committed to justice, those are probably not things that you want to reinforce. Those are not things you want to preserve.

    This is slightly ahistorical — Superman was originally created by two Jewish men in poverty who wanted a hero to rail against the landlords and bankers and other corrupt representatives of a system that didn't care for them; and Warren Ellis vividly explored the fascism of that exact hypocrisy in Stormwatch and The Authority, which were wildly successful (and which Klein does bring up). But Chiang expands on his theory about the anti-egalitarianism of superheroes, which coincides with his criticisms of the unequal distribution of magic, which leads into some of his other observations about things like capitalism and artificial intelligence:

    I tend to think that most fears about A.I. are best understood as fears about capitalism. And I think that this is actually true of most fears of technology, too. Most of our fears or anxieties about technology are best understood as fears or anxiety about how capitalism will use technology against us. And technology and capitalism have been so closely intertwined that it's hard to distinguish the two.

    The whole conversation is absolutely worth an hour of your time.

    Ezra Klein Interviews Ted Chiang [Ezra Klein / The New York Times]

    Image: Arturo Villarrubia / Flickr (CC-BY-SA 2.0) and Irn / Wikimedia Commons (CC 4.0)

  • How credit cards work, explained with evil fairy mobsters

    Game designer Avery Alder took to Twitter to explain how credit cards work, using a much more sensible and accessible framing than the magical metaphor we typically use, which is "money."

    The thread starts here:

    You take a handful of gold coins, because you really do need the money. As long as you pay this strange creature back before the next full moon, nothing bad will come of it. Now, obviously the fairy is trying to trick you. You know that! But you're confident you can outwit it.

    You borrow what you need, and you return it to that magical forest place before the moon fills. All is well. Better than well! The fairy grows fond of you, leaving you larger and larger piles of gold to borrow. Other fairies begin to make offerings to you as you walk the woods.

    One month, life is particularly cruel to you. You can't pay back the gold you borrowed. On the night of the full moon, the being appears. "Don't worry, my sweet. I am merciful. Just give me what you have today, and pay the rest by next moon." It strokes its gruesome necklace.

    "I'm sorry," you say. "I'll earn the money. I'll pay back the debt!" "Don't worry at all, precious darling! All in due time. Things have a way of working themselves out." And then, before disappearing in a puff of smoke, whispered under its breath: "that's the first finger."

    The fairy keeps leaving you bigger piles of gold. The temptation grows stronger. Eventually, you come to think of it as *your* gold. You borrow too much sometimes, and can't pay it back. "Two fingers," it whispers without sound. Then three. It keeps letting you take more money.

    One day, you realize that regardless of whether you make good on your debts, you can't stop borrowing more gold. Not only because you've built your life around it, but also because if you ever stopped borrowing it would make the fairies very angry. Not just this one. All of them.

    You look around you at the world. Your fellow villagers have all fallen under the sway of the fairies. Borrowed fairy gold runs your whole town. People only do business with others if they are known to be in the favour of the fairies. Every day, more hands with missing fingers.

    The savvy villager knows just what to do: borrow small amounts of gold regularly, to attract the attention and good graces of the fairies, and always repay it in full before the next moon, knowing it is not their gold. Get charmed and keep their fingers. Few villagers are savvy.

    Anyways, "sinister temptations from the fairy mafia, who will love you dearly if you play their game right" is the framework that helps me make my best credit decisions. Maybe it will be helpful to someone else out there too.

    Makes more sense than, "You want pieces of paper whose only value is derived from the collective belief of our society, so we're going to use an NSA-derived spying metric to determine your capitalist morality and help us decide how many non-physical, inherently valueless pieces of paper we will temporarily lend you, which you will repay with additional inherently valueless pieces of paper as punishment for not already having enough inherently valueless pieces of paper of your own." And when you break it down like that, you start to realize that faeryfolk are actually more real than currency.

    If you like this, check out some of Alder's role-playing games — I haven't played them myself, but if they take a similar approach to her credit explainer, then they're probably pretty cool.

  • Read an exclusive excerpt from "An Internet in Your Head: A New Paradigm for How the Brain Works"

    A few months back, I shared a piece from neuroscientist (and longtime BB reader!) Dan Graham about research into the language we use to discuss and understand the human brain. For years, scientists have relied on computer metaphors as the go-to point-of-comparison for brain function. But in his new book An Internet in Your Head: A New Paradigm for How the Brain Works, which is out today, Graham proposes a new way of looking at the language we use to talk about our minds: the internet.

    Whether we realize it or not, we think of our brains as computers. In neuroscience, the metaphor of the brain as a computer has defined the field for much of the modern era. But as neuroscientists increasingly reevaluate their assumptions about how brains work, we need a new metaphor to help us ask better questions.

    The computational neuroscientist Daniel Graham offers an innovative paradigm for understanding the brain. He argues that the brain is not like a single computer—it is a communication system, like the internet. Both are networks whose power comes from their flexibility and reliability. The brain and the internet both must route signals throughout their systems, requiring protocols to direct messages from just about any point to any other. But we do not yet understand how the brain manages the dynamic flow of information across its entire network. The internet metaphor can help neuroscience unravel the brain's routing mechanisms by focusing attention on shared design principles and communication strategies that emerge from parallel challenges. Highlighting similarities between brain connectivity and the architecture of the internet can open new avenues of research and help unlock the brain's deepest secrets.

    An Internet in Your Head presents a clear-eyed and engaging tour of brain science as it stands today and where the new paradigm might take it next. It offers anyone with an interest in brains a transformative new way to conceptualize what goes on inside our heads.

    This sparked some neat discussion about the ways we talk about and understand the functions of our minds, so I've gotten permission to share an excerpt from Graham's book, which hopefully leads to some more rollicking debate!


    We think of our brains as computers. Whether we notice it or not, we invoke the metaphor of the brain as a computer anytime we talk about retrieving a memory, running on autopilot, being hardwired for something, or rebooting our minds. Neuroscientists are no less trapped in the computer metaphor. For almost as long as neuroscience has been a recognized field, the default approach has been to imagine the brain as a computing device.

    Of course, most neuroscientists don't think the brain is literally a digital computer. But textbooks across the brain sciences routinely describe neurobiological processes of thinking and behavior as directly analogous to those of a computer, with programs, memory circuits, image processing, output devices, and the like. Even consciousness is described as the internal computational modeling of the external world. And although comparisons of the brain to a computing device are usually somewhat qualified, they are nearly ubiquitous. The metaphor is hard to escape or even notice because it is so ingrained in the way we think about the brain.

    This situation exists in part because neuroscientists use the computer metaphor when describing the brain for the general public. Neuroscientist Dean Buonomano, in his 2011 book Brain Bugs, calls brain injuries and disorders a "system crash," and he writes of "disk space" and "upgrades" for our memory systems. Cognitive scientist Donald Hoffman analogizes our visual perception of the world with a computer desktop interface: "my icon of an apple guides my choice of whether to eat, as well as the grasping and biting actions by which I eat." Others, like brain scientist Gary Marcus, are uncompromising: "Face it," Marcus wrote in the New York Times, "your brain is a computer."

    Neuroscientists typically see the job of a given part of the brain—single neurons, neural circuits, or brain regions—as computing something. At each level, electrical or chemical signals are passed among components and the components operate on the signals by computing something. Computing in this sense means taking in a signal, making the signal bigger or smaller, faster or slower, and then passing the signal along for further mathematical adjustment. What matters is the computational relationship between the magnitude of the signal coming in and the magnitude of the signal going out.

    A neuron's job is often to compute a response when provided with some stimulus: a pattern of light, a sound, a social situation. With lots of neurons performing specialized computations, properties of our environment can be sensed, analyzed, stored, and linked to behavior. Working neuroscientists mostly agree that, although brains and computers differ in innumerable ways, they share a common set of "hacks." In other words, brains and computers exploit many of the same fundamental design principles.

    There is no doubt that the computer metaphor has been helpful and that the brain does perform computations. But neuroscience based on the computer metaphor is incomplete because it does not consider the principles of network communication. Neuroscientists are starting to realize that, in addition to performing computations, the brain also must communicate within itself. The key point is that, although communication involves computation, communication systems rely on different fundamental design principles than those of computing systems.

    Although it has been little studied, brain-wide communication is attracting greater interest. We increasingly understand the physical structure of the brain as a highly interconnected network. The connectomics movement aims to map this network, as well as its dynamic activity. Through increasingly massive studies of the structure of neuronal networks, a new picture of brain function in complex animals is emerging. We are beginning to understand that one of the connectome's main jobs is to support brain intercommunication.

    At the moment, however, there is no guiding principle for how these interconnected networks carry messages to and from a given part of the brain. We don't know the rules about how traffic on brain networks is directed or how the rules relate to our capabilities of thinking and behavior. We don't even know how to investigate this. What's missing, at least in part, is an appropriate metaphor to help us think about how the brain communicates within itself. I propose that the internet is that metaphor. The computer metaphor and the internet metaphor can coexist and inform one another. For one thing, the internet is obviously made up of computers. But it has emergent properties and rules that differ from those that govern single computers.

    The coexistence of computation and communication metaphors—and the change in perspective needed to understand communication strategies—can be understood as being analogous to a time traveler from the past encountering today's internet. Imagine a 1950s-era electrical engineer transported to the present day. The engineer doesn't know what the internet is, but given a standard Wi-Fi router, she is curious enough to open it up and record electrical currents from its circuit board. By carefully measuring voltage changes over time at various locations on the circuit board, the engineer could probably learn to identify different kinds of components, such as diodes and transistors. In doing so, she could deduce the computations each one performs. But the stream of voltage variations entering or leaving the router would be very difficult to interpret. Measuring only the sheer number of signals would reveal little.

    In brains, we have something similar. We can measure the activity of individual cells in the brain and deduce the rules that govern their electrical changes. In a much more limited way, we can measure large-scale brain activity. But we can't observe how messages are transmitted across several synapses in the brain or the branching, dynamic paths these messages may take.

    In short, we don't know the brain's strategy for passing messages across the whole brain. Indeed, supposing the existence of "messages" is somewhat heretical. But returning to our time-traveling engineer, if she knew the general rules for message passing on a computer network, she might be able to identify the role played by a given diode or transistor in the router. The same should be true for brains: if we could work out the basic principles of message passing, we could understand the role of individual neural computations.

    For decades, neuroscientists have been measuring diodes and transistors and ignoring the larger system of message passing. We should think more about the brain as a unified communication system in science—and in society. Going further, we can investigate the brain in reference to the general principles that make the internet the universe's most powerful, flexible, and robust communication system. This change in viewpoint can also help us all understand and utilize our own brains more effectively.

    We know that brains must intercommunicate at all levels, from the biochemistry of synapses to whole-brain oscillations in electrical activity. Most importantly, it must be possible to send messages selectively in the brain without changing the structure of the cellular network of neurons. All kinds of tasks involve sending messages to one place sometimes and to another place at other times. This seems obvious when stated directly, but it is rarely acknowledged.

    It's like what happens at an ice cream shop when we decide between chocolate and vanilla. It must be possible for a decision-making neuron in our brain to direct a signal to the neural output for saying "chocolate" or, alternatively, to the neural output for saying "vanilla." We might even say "chocolate—no, wait! Vanilla!" because we remember that the vanilla at the shop is especially tasty, and thereby draw upon memories stored elsewhere on the network to change the route of the message in real time. The trace of communication across the network can change almost instantaneously. But our brain accomplishes this without altering neuronal network connectivity.

    Neuroscientists have extensively studied the decision-making computations occurring in neurons. These neurons appear to "decide" to fire or not fire by accumulating evidence from input signals over time. But it is not known how the computed decision is routed to the selected output neurons. This question has not really even been asked.

    Other parts of the body also intercommunicate, and it's worth considering whether the solutions adopted in other biological systems are useful comparisons. The immune system, for example, is predicated on the ability to pass information about the presence of pathogens to the appropriate internal security forces. Great armies of antibodies patrol every milliliter of blood, applying tiny labels to anything suspicious. As tagged microbes circulate through the body, the tags are eventually noticed and the offender pulled aside and killed. The message, as it were, has been received. If antibodies are the immune system's messages, passed by physical movement in miles of blood vessels, the brain's messages are something altogether different. In the brain, messages consist of electrical signals and their chemical intermediaries. Messages travel over a highly interconnected—but fixed—network of "wires." No individual component of the brain moves very far, at least in the short term. It is this kind of networked message passing that defines neural communication. Just like the immune system, the brain must have effective global rules and strategies for communication. But these rules are specialized for a system made of neurons and linked to functions of thinking and behavior.

    In recent years, a small but growing community of researchers has investigated the message-passing rules operating on brain networks. A few researchers have proposed internet-like solutions to the challenge of passing signals in the brain in a flexible way, though the theories have only occasionally been described as routing theories. Routing here refers to the directing of signals from one part of the network to another part according to a set of global rules. We can start to see things from a new perspective—and see how the internet metaphor can aid us—by recasting neural computation as neural routing.

    Excerpted from An Internet in Your Head by Daniel Graham. Copyright (c) 2021 Columbia University Press. Used by arrangement with the Publisher. All rights reserved.

    An Internet in Your Head: A New Paradigm for How the Brain Works by Daniel Graham [Columbia University Press / Amazon / Indiebound]

  • The stunning book design behind the new Philip K. Dick collection

    The Folio Society is releasing a limited edition collection of all 118* short stories written by Philip K. Dick — and like a lot of the Folio Society's work, the graphic design is absolutely stunning. Check out the video for more about the process:

    This limited edition of Philip K. Dick's The Complete Short Stories, with 24 illustrations by 24 different artists, is a celebration of the freewheeling imagination of a science-fiction master. The Complete Short Stories is limited to 750 hand-numbered copies and presented in a special display box designed by independent studio La Boca. The bindings, endpapers, title pages, page edges – even the ribbon markers – are colour co-ordinated in a fabulous fluorescent rainbow. Each binding is emblazoned with an eye-catching symbol that is echoed on the relevant title page and spot-varnished on the two-part presentation box. The interior of the box itself is lined with two specially designed papers; a multi-coloured 'glitch' pattern, and a night sky sparkling with silver stars to reflect Dick's fascination with the possibilities of space travel and technology.

    Now for the bad news: the 750-copy print run of The Complete Short Stories is already sold out. The good news is, at least now you won't be tempted to drop $745 on this gorgeous work of literary art!

    The Complete Short Stories [Philip K. Dick / The Folio Society]

    *Wikipedia says that PKD actually wrote 121 short stories, so there's either an interesting reason why three of them aren't included in the collection, or else you can chalk it up to some Dick-like mindfuck that makes you question reality, which is the more likely case.

  • Prince's epic "While My Guitar Gently Weeps" guitar solo has a new director's cut

    The 2004 Rock & Roll Hall of Fame Induction Ceremony included a memorial performance for the then-recently-deceased George Harrison. An all-star line-up of musicians including Tom Petty and Steve Winwood performed the best Beatles song ever, which was written by Harrison, who was objectively the best Beatle.

    Then, about halfway through the six-minute performance, Prince magically appears and rips one of the most face-melting guitar solos in rock n' roll history. And just when it can't get any more epic, Prince throws his guitar up in the air and it … never comes down. It just disappears. It's fucking incredible.

    Joel Gallen, who directed and produced the original broadcast, recently revisited the footage and re-edited the sequence to give the world what we want: more Prince. As he explains:

    17 years after this stunning performance by Prince, I finally had the chance to go in and re-edit it slightly – since there were several shots that were bothering me. I got rid of all the dissolves and made them all cuts, and added lots more close ups of Prince during his solo. I think it's better now.

    Fortunately, Gallen preserved the disappearing guitar at the end. To this day, it seems that no one knows what happened to that thing. Heartbreakers drummer Steve Ferrone reminisced about the performance in The New York Times in 2016, saying:

    It was a hell of a guitar solo, and a hell of a show he actually put on for the band. When he fell back into the audience, everybody in the band freaked out, like, "Oh my God, he's falling off the stage!" And then that whole thing with the guitar going up in the air. I didn't even see who caught it. I just saw it go up, and I was astonished that it didn't come back down again. Everybody wonders where that guitar went, and I gotta tell you, I was on the stage, and I wonder where it went, too.

    Maybe it was waiting for Prince up there in rock n' roll heaven all along.

    Gallen shared this behind-the-scenes story with The Times then, too:

    The Petty rehearsal was later that night. And at the time I'd asked him to come back, there was Prince; he'd shown up on the side of the stage with his guitar. He says hello to Tom and Jeff and the band. When we get to the middle solo, where Prince is supposed to do it, Jeff Lynne's guitar player just starts playing the solo. Note for note, like Clapton. And Prince just stops and lets him do it and plays the rhythm, strums along. And we get to the big end solo, and Prince again steps forward to go into the solo, and this guy starts playing that solo too! Prince doesn't say anything, just starts strumming, plays a few leads here and there, but for the most part, nothing memorable.

    They finish, and I go up to Jeff and Tom, and I sort of huddle up with these guys, and I'm like: "This cannot be happening. I don't even know if we're going to get another rehearsal with him. [Prince]. But this guy cannot be playing the solos throughout the song." So I talk to Prince about it, I sort of pull him aside and had a private conversation with him, and he was like: "Look, let this guy do what he does, and I'll just step in at the end. For the end solo, forget the middle solo." And he goes, "Don't worry about it." And then he leaves. They never rehearsed it, really. Never really showed us what he was going to do, and he left, basically telling me, the producer of the show, not to worry. And the rest is history. It became one of the most satisfying musical moments in my history of watching and producing live music.

    Amazing.