Life in a city made of computers

Here's a transcript of a classic Charlie Stross rant, his speech at TNG's Big Tech Day in Munich last June. Entitled "How low (power) can you go?", it's a look at life in a city whose entire surface is made of sensing, computing smart matter:

I also noted that the combined video and audio streams from the entire population of Germany, over a period of a century, would occupy on the order of a hundred kilograms of Memory Diamond — a hypothetical crystalline form of carbon used for data storage, in which each bit is represented positionally by an atom of one isotope or another (in this case, carbon-12 or carbon-13). With Avogardro's [sic] number of bits storable in 12.5 grams of carbon, if we can figure out how to read and write this stuff we can store roughly 0.5 petabytes in each gram of substrate.

(Using this yardstick, on a world-wide scale Google currently processes about 2 grams of data per hour.)

So, the first point to note is that if the world of 2032 has this level of ambient computing power at all, we're likely to have the data storage to go with it.

Let's assume we have found a use for our billion cpu city, and we're running a billion operations per second on each cpu. If each operation generates one byte of useful output — from air quality sensors, or cameras, or whatever — then our city is producing 1018 bytes of data per second. That's heavy data: that's 2000 grams per second. We're really going to have to get our data de-duplication strategies under control, lest we build up memory diamond landfill at a rate of seven tons per hour! Luckily most computer programs don't generate anything like one byte of output per operation — that's a ridiculous edge condition. Given the bandwidth and power constraints on our tiny solar powered processors, I'd be surprised if they averaged even a megabit per second of output — and even that would correspond to uncompressed high-definition video from every square metre of our city. So let's arbitrarily hack six orders of magnitude off that peak data output figure. Our city of 2032 is emitting as much information in a second as Google processes in an hour today: remarkable, but not outrageous in context.
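The talk's chain of arithmetic is easy to reproduce. A minimal sketch in Python, taking the talk's own figures at face value (including the 0.5 petabytes-per-gram density, which commenters below dispute):

```python
# Back-of-envelope check of the data-rate arithmetic in the quoted talk.
# All figures are the talk's own assumptions, not established facts.
cpus = 1e9                      # a billion-CPU city
ops_per_cpu = 1e9               # a billion operations per second on each CPU

# Worst case: one byte of useful output per operation.
bytes_per_second = cpus * ops_per_cpu            # 10^18 B/s

# The talk's claimed memory-diamond density: 0.5 PB per gram of substrate.
bytes_per_gram = 0.5e15
grams_per_second = bytes_per_second / bytes_per_gram   # 2000 g/s of "heavy data"
tons_per_hour = grams_per_second * 3600 / 1e6          # ~7.2 metric tons/hour

# "Hack six orders of magnitude off that peak data output figure."
realistic_bytes_per_second = bytes_per_second / 1e6    # 10^12 B/s
```

At 2000 grams per second the landfill figure comes out to about 7.2 tons per hour, consistent with the "seven tons" in the quote.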

How low (power) can you go?



  1. “…then our city is producing 1018 bytes of data per second.” I am assuming this is supposed to be 10^18.

  2. The numbers, at least in the first paragraph, are way, way, way, way, way, way, way off.  (In my opinion each order of magnitude deserves one “way” :-) 12.5 g of carbon contains 6*10^23 atoms, so 1 g of this notional “memory diamond” would store 5*10^22 bits.  That’s 6*10^21 bytes or 6 billion terabytes, i.e. 6 million petabytes, not 0.5 petabytes.

    1. His previous ‘memory diamond’ used two atoms to represent each bit; 12C-13C is one state, 13C-12C is another. Still 3 million petabytes.

      I would have thought he could have spelt ‘Avogadro’ correctly…

      1. A tendency to be sloppy with words (or names) may go hand-in-hand with a tendency to be sloppy with numbers.  Of course, neither one is a particularly good attribute to possess if you’re trying to make a living as a science fiction writer!  There’s something to be said for the old greats of hard SF (Heinlein, Clarke, etc.) — a lot of them were incredibly careful in both domains.

        As for the two-atom, half-capacity memory diamond, I guess that is the RAM version.  While one could theoretically represent a bit using a single atom of carbon with one or the other atomic weight, having to knock out (or in) a neutron to flip that bit puts a writable version of such a memory out of the realm of the merely highly implausible and into that of the virtually inconceivable.  If, on the other hand, we stipulate some sort of nanomachinery that can interact with individual atoms in a diamond crystal to “read them out” then it’s perhaps not so long a stretch to imagine transposing two of them.

        Still, either way the whole thing is so far from our current capabilities that I find it a bit amusing that he chose to describe the memory diamond in the context of storing the “combined video and audio streams from the entire population of Germany, over a period of a century”.  Surely it will be at least 50 years before a memory diamond is attainable technology.  50 years ago the country of Germany didn’t even exist… I don’t see any reason one way or the other to believe it will still exist 50 years hence, so memory diamond may never have a chance to record a century’s worth of its population’s sensory experiences :)
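The capacity correction worked out in this thread can be reproduced directly. A quick sketch, using the talk's own figure of Avogadro's number of atoms in 12.5 grams of carbon:

```python
import math

# Reproduce the commenter's correction of the talk's 0.5 PB/gram claim.
AVOGADRO = 6.0e23                        # atoms per 12.5 g, per the talk's figure

bits_per_gram = AVOGADRO / 12.5          # one atom per bit -> ~5 * 10^22 bits
bytes_per_gram = bits_per_gram / 8       # ~6 * 10^21 bytes
petabytes_per_gram = bytes_per_gram / 1e15   # ~6 million PB, not 0.5 PB

# How far off was the talk? One "way" per order of magnitude.
orders_off = math.log10(petabytes_per_gram / 0.5)   # ~7
```

This gives about 6 million petabytes per gram for the one-atom-per-bit version (half that for the two-atom version), roughly seven orders of magnitude above the quoted 0.5 petabytes — matching the seven "way"s in the comment above.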
