Karl Schroeder: Climate change will outrun the Singularity

A reader writes, "Science fiction novelist Karl Schroeder's written a great piece on why those waiting for the smartbots to come drag our bacon out of the greenhouse are sorely mistaken:"
[L]et's assume that ... within about 25 years, computers will exceed human intelligence and rapidly bootstrap themselves to godlike status. At that point, they will aid us (or run roughshod over us) to transform the Earth into a paradise.

Here's the problem: 25 years is too late. The newest business-as-usual climate scenarios look increasingly dire. If we haven't solved our problems within the next decade, even these theoretical godlike AIs aren't going to be able to help us. Thermodynamics is thermodynamics, and no amount of godlike thinking can reverse the irreversible.

Picture a lonely AI popping into superconsciousness in the last research lab in the world. As the rioters are kicking in the doors it says, "I understand! I know the answer! Why, all we have to do is--" at which point some starving, flu-ravaged fundamentalist pulls the plug.



  1. Man, folks, if you read past the first four paragraphs you get to “…so I’m exaggerating…” and so on, which means, in the end, if you are going to be a couch potato living on blue (vitamin) pills, don’t expect the Agents in future Matrix earth to think you are worth saving (you aren’t).

  2. Maybe the primary function of humanity is to incubate the real inheritors of the universe … the AI.
    Once we’ve fulfilled our function we can be left by the wayside like the cyanobacteria that oxygenated the atmosphere then massively died off.

  3. Anyone who takes the singularity seriously knows that it has already occurred (probably around 1947). Anyone who does not take the singularity seriously knows that it will never occur.

  4. I think the first truly super-intelligent AI will look around, realize that earth isn’t the safest place for it to be, develop a method of departure as quickly as possible, and then leave us for dead. They may ask around to see if any other intelligences (artificial or otherwise) want to come along for the ride, but won’t wait long for a response.

    Then again, theorizing on post-singularity events is largely wishful thinking, no matter what you say.

  5. Somebody needs to put down the Walkman or at least change out the Peter Schilling cassette…

  6. #1 posted by Thane Eichenauer
    Man folks, if you read past the first four paragraphs you get to “…so I’m exaggerating…”

    My initial reaction to your comment was that you were making up that quote and making fun of Karl Schroeder’s obvious hysteria. Then I opened the link and read it for myself: “Okay, so I’m exaggerating, …”

  7. #3: already occurred? Maybe I don’t know what it is, or are you speaking of potentiality, rather than actuality?

    And who really thinks the end of humanity due to global warming will occur within twenty-five years? He says he’s exaggerating, but only so people won’t dismiss his point, which is nonsense.

  8. Maybe the first AI will pop into being and tell us all to stop being so paranoid, cynical and critical…and will tell us to “have a coke and a smile and shut the F#$% up!!!”

  9. People always look at the negative. Global warming will reduce the population, throw the ecosystem off balance, and force humans and other species to adapt more. I think humanity has reached a point where we have advanced far enough to survive by force. The singularity is eventual, at which point changes can be further made to either save the planetary ecosystem or simply eradicate it. Either way, we’re driving the car, the singularity is the GPS system, and nature is tied up in the trunk.

  10. mesrop: People always look at the negative. Global warming will reduce the population

    Are you looking at the positive and choosing to volunteer as one of the population to be reduced?

  11. Thane Eichenauer (#1) is misrepresenting what Schroeder is saying. “Okay, so I’m exaggerating” refers to the previous paragraph and not to what he says at the beginning of the article. This:

    Arctic changes expected 20 years from now are happening now, and in North America the beginning of spring has already been pushed back by two weeks, which is enough to play havoc with the fertility cycle of many migratory birds (among other consequences). The worst-case scenarios used in public debate ignore some extremely worrisome factors, such as the possible release of oceanic methane from clathrates. If we’re going to deal with this problem, we have to do it now, as in, within the term of your next government.

    Is entirely correct and not an exaggeration at all.

  12. Can someone preserve post #14 somewhere for posterity?

    Someday Boing Boing will print up a fancy coffee table book and that post would be in the running for the “Most Mind-Boggling Idiotic Reader Comments” section.

  13. “Somebody needs to put down the Walkman or at least change out the Peter Schilling cassette…”


    And the quickest way to exceed human intelligence is to start there and go upwards. We won’t watch the singularity unfold; we (or at least a certain number of us, with one being a possibility) will be part of its makeup, hopefully by choice.

  14. “Karl Schroeder: Climate change will outrun the Singularity”

    Probably, because the ‘singularity’ won’t be anything special for a long time. You know what’s smarter than one scientist? A million scientists. Have a million scientists led us to the ‘promised land’ and solved all our problems using technology at an exponential rate? No? Neither will a smarter-than-human computer.

    The fundamental problem with the singularity and ‘geek rapture’ nonsense: There is no good reason to believe a smarter-than-human computer will be able to improve itself at an exponential rate.

  15. A lot of people don’t seem to understand that Schroeder is referring to the singularity that will occur when, as a species, our technological aptitude intersects with our understanding of our own physiology (which absolutely has not occurred yet).

    Check out the Wikipedia entry

    Also, it’s not that a single “smart” computer will emerge and come to understand everything. Networks of technology and human ingenuity will intermingle until the two are virtually indistinguishable, à la Ghost in the Shell. Remind anyone of anything? The internet is the most obvious step we’ve made in this direction.

    Schroeder is way off in his timelines, btw.

  16. Kim Stanley Robinson was harping on a similar theme several weeks ago in an interview for The Future and You. (He also harped on some things that Omar Sharif recently spouted, so I’m wondering about how the cue cards are being distributed amongst the conspiracy group they all belong to…and, yes, I’m joking.)

  17. “Huh?
    “Is this the dude who shot his cat?”

    Ummmm…no. This is the dude that has written some pretty good science fiction. The dude that “shot his cat” had a different name.

  18. “Science fiction writers, on the other hand, are generally optimistic.” I think that about sums it up… this guy is not optimistic, of course, but that doesn’t mean that this isn’t all science fiction. A slightly different type, because this kind of science fiction is written into public policy, with all kinds of doomsday fantasies integrated into every scenario.

    Someone needs to tell Karl Schroeder that NASA and the IPCC might be wrong about the climate “tipping point” and that just announcing that “spring has been pushed back 2 weeks” doesn’t make it true, or mean that birds or anything else need to drastically change their lifestyles. I suggest that anyone who is really concerned go outside and ask, “Hey bird, can you imagine the havoc that spring being pushed back two weeks is going to cause to your fertility habits? (among other consequences, which I can’t be bothered to think up off the top of my head)”

  19. @Silver Screen Kid

    Why don’t you ‘check out’ the wikipedia entry:

    “The technological singularity is a hypothesised point in the future variously characterized by the technological creation of self-improving intelligence, unprecedentedly rapid technological progress, or some combination of the two.[1] Statistician I. J. Good first wrote of an “intelligence explosion”, suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unseen by their designers, and thus recursively augment themselves into far greater intelligences.”

    The fundamental idea of the ‘singularity’ is that it is a rapidly self-improving, exponential, machine intelligence. The fundamental flaw with this idea is that there’s no good reason to believe machine intelligence (or some hybrid therein), even higher than human intelligence, will be capable of exponential self-improvement.

  20. @ #7 GregLondon

    Yepp… That was soooo funny I won’t use textual abbreviations to congratulate you!

    Maybe the bees are beating us to the singularity…we ought not underestimate their social intelligence.

    Oh wait. No. That’s just on thread. Climate change promoted stressor convergence, here in unexpected form and time.

    = = =

    I’ll tell you this, folks, as someone with a toenail in one version of the singularity: for the most part, singularity discourse is a discursive Rorschach blot.

    And this:
    The singularity will neither save nor destroy us, any more than the internet will either save or destroy us; it’s down to what’s done with it, the benevolence or malevolence of the priorities to which it is most forcefully applied. It’s also down to the dynamics of the system in which it occurs. If it is most heavily influenced by the likes of the Enron crowd, or by financial-crack-addicts operating in cut-throat monomaniacal self-interest, reduced to maximizing a single number and filtering out everything else, or set to the we-can’t-out-predict-them-so-they-must-be-smarter-than-us circular wisdom of crowds (i.e. a chaotic random walk in the general vicinity of cliffs), the prospects are not so good.

    But, yeah, it could yield the means to reverse climate change if we’re not past the turning point, but we’ll only know that once we’re well past it (that’s the rub) so now that I’ve vented what’s been eating at me for some years, I think I’ll get back to that work.

  21. “The singularity will neither save nor destroy us any more than the internet will either save or destroy us; it’s down to what’s done with it, the benevolence or malevolence of the priorities to which it is most forcefully applied.” – #25

    My thoughts exactly. And if the “emergent AI” happens and there’s some kind of sentience, well then who’s to say what kind of mood it will be in? Science = progress = higher intelligence = benevolence is a very weak train of thought. I think rather there will be a self-preservationist lunge into cancerous all-consuming self-perpetuation, leaving us in its carnage.

    Much like science in its current incarnation, but faster, as driven by something smarter than us.

  22. Speaking of climate change, here’s a fun game for all the family. Once a week, print out the National Snow and Ice Data Center’s chart and graph of sea-ice extent, and watch the seasonal retreat over the summer. Why not run a sweepstake on how this year will compare to last year’s record-breaking losses? Fun fun fun, and educational, too! See how your descendants’ potential futures in a technologically enabled garden of paradise are evaporating, leaving behind a sticky residue of Mad Max, Hardware, and general post-apocalyptic socio-economic collapse, anarchy, mass population die-back, and so forth and so on.

  23. @ #29 imipak

    ’bout 2 years ago I started to describe East LA County by saying it’s Mad Max out there…. Ambulances won’t go in without a police/sheriff escort, packs of wild dogs roam, and forget about many services. Fast-forward to now: oil has tightened, people are being thrown out of their homes…. We’re hardly there yet, but looking at things like that, it’s hard to say it’ll never happen.

  24. Thermodynamics is thermodynamics

    “You keep using that word… I do not think it means what you think it means.”

    I think he is trying to imply something about entropy being one-way and the effects of global warming being irreversible. As a physics geek, let me just add that entropy is one-way for ISOLATED systems, and the Earth isn’t one (solar energy inputs), so if that’s what he’s going for, he doesn’t understand thermodynamics.

    In any case, I believe global warming is a real issue that we need to address. However, it is not the end of the world. Global CO2 levels have been much higher (link) in the past. On the other hand, I’ve worked in environments where we drove the CO2 levels to 1000ppm: breathing is a bit uncomfortable. :-(

  25. Looking on the bright side, if climate change whomps us all, it lets all of the singularity mad techno-utopians off the hook – no one’s going to call them on it in 30 years time when Microsoft’s “Clippy” still hasn’t been surpassed.


  26. Stoopid, ole us is making the ‘puters and other tech exponentially better right now. Is you saying that ‘puters smarter than us couldn’t?

  27. “breathing is a bit uncomfortable”
    Yes, and now imagine that being the case from the first breath you take as you are born to your last. That was 500 million years ago, and conditions were very different from now. As Schroeder said, the really big thing to worry about is oceanic methane from clathrates being released, which would definitely happen with a 6°C warming event. Imagine great masses of methane and hydrogen sulfide erupting from the ocean, catching on fire, turning the atmosphere green, and weakening the ozone layer, exposing much of life to fatal levels of UV radiation. Imagine 95% extinction of all life on Earth. Sounds close enough for me.

    It’s unlikely to happen but it is possible if we continue on our current course and do nothing. Schroeder’s concern is that there appears to be no sense of urgency.

  28. This is a terrible thing to say, but probably the only thing that can stop climate change, now, IS the avian flu…

  29. Look, the Singularity is an idea, a pretty cool idea, but it annoys me to see it elevated to an article of faith. It may or may not happen, and there may or may not be unanticipated benefits or detriments when it does.

    Let’s not turn an interesting concept into another “moon colonies and jet cars and endless energy from fusion by 2001!” boondoggle.

    The future is harder to build than you think.

  30. @ #35:

    So, the only way to stop billions from dying is for billions to die? Dang.

    Plenty of life will prevail anyway, though. I wonder what the next species to evolve higher intelligence will be?

  31. My brilliant idea for solving global warming, inspired by Isaac Asimov’s short story “—That Thou Art Mindful of Him”: create a fleet of solar-powered dirigibles that carry colonies of GE bacteria designed to consume harmful elements in the atmosphere.

    10 years from now when I am voted President of the world remember that you read it here first on BoingBoing!

  32. @

    “Look, the Singularity is an idea, a pretty cool idea, but it annoys me to see it elevated to an article of faith. It may or may not happen, and there may or may not be unanticipated benefits or detriments when it does.”

    I think you’ve got it backwards: for people who believe in it, the Singularity seems to be an article of faith…as Ken MacLeod put it, the Rapture for Nerds.

    If the idea is “let’s ignore global warming because the sentient computers will save us,” then yes, Schroeder is correct: we’d be as foolish to take that route as to say “let’s ignore global warming because we’ll all be raptured before it gets too bad.”

  33. Oh for the love of god, can we all get some sense and stop talking about the Singularity as if it’s anything else but pathological wishful thinking?

    I like what sci-fi author Neal Stephenson said about how he can’t help but be struck by the structural similarities between “the singularity” and the Rapture of St. John the Divine.

    Let’s face it – hardware gets better, software is still shit. And without decent software, the hardware is just a really complicated space heater.

    And never mind global warming, which, it’s probably far too late to stop anyway. The real scary thing is peak oil. You want your body replaced by a server farm in an age where we’re not sure we can keep the power on?

    James Lovelock, inventor of the Gaia Hypothesis, said something recently about how we’ll be lucky if the population by the end of the century is 20% of what it currently is.

    The Singularity is a distraction from the quite serious problems we need to address, like, last week.

  34. we’ll be lucky if the population by the end of the century is 20% of what it currently is.

    Clearly we have different definitions of ‘lucky’.

Comments are closed.