Is it possible that modern humans aren't the first civilization on Earth?
This is the insanely interesting question probed by "The Silurian Hypothesis," a new paper by Gavin A. Schmidt, a climate scientist at NASA, and Adam Frank, an astrophysicist at the University of Rochester.
As they point out, if an industrialized civilization existed in the deep past, it's not clear there would be easily recognizable traces of it. The record of intact, exposed land surfaces doesn't reach back much further than the Quaternary period, which began about 2.6 million years ago. "Go back much farther than the Quaternary," as Frank writes in an essay about the paper in The Atlantic, "and everything has been turned over and crushed to dust."
It's not even clear we'd find fossilized remains of a previous civilization. Museumgoers might think fossils are reasonably common, but they're actually incredibly rare: a near-zero percentage of the life that has ever existed on Earth has been fossilized. A civilization could last what seems, to us, like a super-long time and still not produce any fossils, as Frank notes:
So, could researchers find clear evidence that an ancient species built a relatively short-lived industrial civilization long before our own? Perhaps, for example, some early mammal rose briefly to civilization building during the Paleocene epoch about 60 million years ago. There are fossils, of course. But the fraction of life that gets fossilized is always minuscule and varies a lot depending on time and habitat. It would be easy, therefore, to miss an industrial civilization that only lasted 100,000 years—which would be 500 times longer than our industrial civilization has made it so far.
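That "500 times" figure is easy to sanity-check. Here's a quick back-of-envelope sketch, assuming our industrial era began around 1800, so roughly 200 years ago (the round number is my assumption, not Frank's):

```python
# Back-of-envelope check of the "500 times longer" comparison.
# Assumption: our industrial civilization began around 1800, making it
# roughly 200 years old. That round number is an assumption for this sketch.
our_industrial_age_years = 200
hypothetical_civilization_years = 100_000

ratio = hypothetical_civilization_years / our_industrial_age_years
print(f"A 100,000-year civilization would outlast ours by a factor of {ratio:.0f}")
# Prints: A 100,000-year civilization would outlast ours by a factor of 500
```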
So Frank and Schmidt wind up focusing on the chemical traces of an advanced civilization. If previous lifeforms industrialized and began making stuff the way we do, you might see suspiciously large buildups of, say, nitrogen (in our case, from fertilizer) or rare-earth elements (in our case, from making electronic gadgets).
Indeed, the truly massive chemical signal you might see is the shift in carbon that comes from burning fossil fuels, along with its attendant global warming. Schmidt and Frank study the Paleocene-Eocene Thermal Maximum, a period 56 million years ago when the global average temperature rose as much as 15 degrees Fahrenheit higher than today's. The spike in carbon and oxygen isotope ratios was, they conclude, very much like what you'd see if an industrial society burned fossil fuels the way we do. But the thing that's different is the speed: the rise in atmospheric CO2 these days is much, much sharper than the rise back then.
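To get a feel for that difference in speed, here's a rough, illustrative rate comparison. The figures below are round, literature-style estimates I'm assuming for the sketch, not numbers taken from Schmidt and Frank's paper: PETM carbon releases are commonly estimated in the low thousands of gigatons, spread over thousands of years, while modern fossil-fuel emissions run on the order of ten gigatons of carbon per year:

```python
# Rough, illustrative comparison of carbon-release rates, then vs. now.
# All numbers are round assumptions for this sketch, not values from the paper.
petm_carbon_gtc = 4_000        # assumed total PETM carbon release, gigatons of carbon
petm_onset_years = 5_000       # assumed duration of the PETM carbon release
modern_rate_gtc_per_year = 10  # assumed current fossil-fuel emissions, GtC per year

petm_rate = petm_carbon_gtc / petm_onset_years
speedup = modern_rate_gtc_per_year / petm_rate

print(f"PETM release rate:   ~{petm_rate:.1f} GtC per year")
print(f"Modern release rate: ~{modern_rate_gtc_per_year} GtC per year")
print(f"Today's release is roughly {speedup:.0f}x faster")
# Prints rates of ~0.8 vs ~10 GtC per year: about an order of magnitude faster
```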
The upshot, Frank writes, is that the evidence doesn't really suggest a previous civilization existed. But engaging in the counterfactual is useful for pondering our modern society and the detritus we're producing …
It's not often that you write a paper proposing a hypothesis that you don't support. Gavin and I don't believe the Earth once hosted a 50-million-year-old Paleocene civilization. But by asking if we could "see" truly ancient industrial civilizations, we were forced to ask about the generic kinds of impacts any civilization might have on a planet. That's exactly what the astrobiological perspective on climate change is all about. Civilization building means harvesting energy from the planet to do work (i.e., the work of civilization building). Once the civilization reaches truly planetary scales, there has to be some feedback on the coupled planetary systems that gave it birth (air, water, rock). This will be particularly true for young civilizations like ours still climbing up the ladder of technological capacity. There is, in other words, no free lunch. While some energy sources will have lower impact—say solar vs. fossil fuels—you can't power a global civilization without some degree of impact on the planet.
By the way, Doctor Who fans will no doubt recognize the reference in the title of "The Silurian Hypothesis": the Silurians are a race of humanoid reptiles that first appeared on the show in the 1970s and that, as the lore goes, existed millions of years before humanity.