Security failures will live on forever, because protocols have no sell-by date. Glenn Fleishman exposes the eternity we face with broken software.

The notion of built-in obsolescence has been an axiom for decades. Manufacturers sell stuff, whether appliances or electronics, that is intentionally designed to have a lifespan shorter than its component parts require. As the owner of a 13-year-old major-brand dishwasher that suddenly developed multiple component failures, where a repair would only buy time and cost more than the original purchase price, I'm well aware that value-priced items come with a ticking time bomb.

But we also face unplanned, eternal obsolescence with modern embedded hardware, computers, and other devices. Operating systems, firmware, and add-on software can run indefinitely with no path to an upgrade when operational flaws or exploits for local or remote access appear. There is no comprehensive philosophy to cope with this across industries and scales, whether at power plants that serve millions of customers or in the embedded code that runs set-top boxes and home routers. Many products reach an effective end of life without any upgrade path, or without the people or organizations that use them knowing a problem even exists, until there's a major exploit.

Consider the SCADA (supervisory control and data acquisition) systems exploited by the Stuxnet worm; the tens of millions (or potentially more) of older Android mobile devices for which security updates will never be released or installed; and the hundreds of millions of generic embedded Linux devices whose owners mostly don't know they're running an operating system at all, and whose hardware either has no upgrade path or is never upgraded when fixes exist. These devices may run for years or decades longer despite the risks to users, companies, and the public. Just this week, a massive hole in bash, a common shell for Linux and Unix, was discovered; it will surely affect uncounted millions of devices exposed via Web interfaces. "Protocols don't come with sell-by dates," says Chris Soghoian, principal technologist at the ACLU's Speech, Privacy, and Technology Project.
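That bash flaw (widely nicknamed Shellshock) is easy to probe for on a machine you control. Here's a minimal sketch in Python, assuming bash lives at /bin/bash; a vulnerable bash runs a command smuggled in through an exported environment variable the moment it starts up.

    import subprocess

    # Minimal Shellshock-style probe: a vulnerable bash executes the command
    # hidden after the function definition in the environment variable "x"
    # as soon as it launches, before running the command we actually asked for.
    payload = {"PATH": "/usr/bin:/bin", "x": "() { :;}; echo VULNERABLE"}
    result = subprocess.run(
        ["/bin/bash", "-c", "echo probe finished"],  # /bin/bash is an assumed path
        env=payload,
        capture_output=True,
        text=True,
    )
    if "VULNERABLE" in result.stdout:
        print("This bash executes code from environment variables; patch it.")
    else:
        print("No sign of the environment-variable injection in this bash.")

Run it against a patched bash and the probe stays quiet; run it against an unpatched one and the planted command fires, which is the same behavior a hostile Web request can trigger through a CGI interface.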

I was prompted to consider this issue again by Google's recent decision to try to hasten the demise of outdated SSL/TLS certificates. Eric Mill wrote a full rundown of the problem and of Google's plan to make Chrome's warnings progressively more alarming. The brief version: SHA-1, a cryptographic hash algorithm used to sign certificates, is in use by about 90 percent of Web sites that employ encryption, and it has been known to have significant flaws since 2005. Governments surely already have the power to exploit those known weaknesses to forge security certificates that look absolutely legitimate. The price for criminals and determined businesses is falling rapidly as well.
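It's simple to check which hash a given site's certificate relies on. Here's a quick sketch in Python, assuming the third-party cryptography package is installed; the hostname is just a placeholder, not a site singled out in Mill's writeup.

    import ssl

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes

    # Fetch a site's certificate over TLS and report which hash signed it.
    host = "example.com"  # placeholder hostname
    pem = ssl.get_server_certificate((host, 443))
    cert = x509.load_pem_x509_certificate(pem.encode())
    algo = cert.signature_hash_algorithm

    print(f"{host}: certificate signed with {algo.name.upper()}")
    if isinstance(algo, hashes.SHA1):
        print("Still SHA-1, the kind of certificate Chrome plans to flag.")

Note that this inspects only the site's own certificate; intermediate certificates in the chain can also be signed with SHA-1, which is part of why the migration is so slow.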

Despite that, there's little urgency to make SHA-1 obsolete. Soghoian points out that browser makers like Mozilla and Google have little leverage over the certificate authorities (CAs) that continue to issue SHA-1 certificates. Google plans to have Chrome surface increasingly severe security warnings to users as a December 2016 deadline to drop SHA-1 certificates approaches. And that's with a centralized system. (The CA system is deeply broken, and the topic of many thousands of other words. "It's like a taxi medallion," Soghoian says of being a CA baked into an OS or browser.)

Here's the thing. With a centralized system that affects billions of users and a handful of browser makers, three of them enormous global corporations (Google, Microsoft, and Apple), there's still no impetus to repair a problem that has been well known for nine years. Soghoian points out that basic cellular voice calls remain protected by mid-1980s encryption that was broken about 15 years ago, and which poses a clear and present danger of interception by governments and others. And yet it remains.

Eternal persistence of vulnerable, network-connected hardware is not a new realization, but it's getting worse. The tens to hundreds of millions of computers worldwide still running Windows XP, a good portion of which mask XP behind a dedicated front-end interface for a particular purpose (as with SCADA), are just one part of the problem. Those computers will eventually brick or fail, or their users will move to patched XP, tablets, or newer OSes, even in the developing world where such XP systems predominate. Soghoian notes that companies are still buying new computers with XP installed or installing the OS themselves.

Rather, the biggest problems are devices without a face (embedded systems), devices without an upgrade path (abandoned hardware, which includes plenty of mobile devices just a few years old, early Apple iOS hardware among them), and devices that require a fiddly update process that the vast, vast majority of owners will never follow, even if it's just downloading a file and clicking through an administrative Web page.

The logical way to force upgrades is to make things break for people and devices that don't move forward. But this is impossible or untenable for a variety of reasons. "Once you have a billion devices, no one is going to replace it," Soghoian says.

The persistent problem is twofold. First, companies don't nuke outdated versions of their software, even when there are massive security problems. If an iOS 5 flaw appeared today, Apple isn't going to push a brick button and kill millions of older devices running that OS. Likewise, if Linksys found that its 2005-era bestselling Wi-Fi router had an unpatchable vulnerability, or that most owners never applied a patch, it wouldn't call in the nukes from orbit.

Second, there's no revenue associated with fixing old problems. Companies continue to push updates only so far as it seems reasonable to them or they're required by law. In most nations, there's no regulatory framework to require a period of time, much less an indefinite one, in which firms have a security responsibility. They'd rather you trash your gear and buy a new thing.

People typically either have no idea they're running insecure or exploitable hardware or software, or they have no choice. The software or hardware they use doesn't exist in an updated form, or the cost is too high even when they want to upgrade. (Please note, for instance, that it's possible to root old Android phones and patch software with CyanogenMod, but most human beings with Android devices will never, ever do this.)

Even when reasonable and affordable solutions exist, most people and organizations likely lack the expertise to apply them, ignore them, or can't afford the related work required to test upgrades and deploy them. The security team has a budget for improving security, Soghoian notes, but the human-resources department doesn't; if its aging Visual Basic application, written by an intern a decade ago for XP, continues to work and a whole company is trained to use it, security has a hell of a time making the case for that scale of change.

Soghoian is impatient. "Napalm. Burn it all," he says. But that's unlikely to happen so long as a sufficient number of users and devices need backward-compatible support, and blocking those devices would cause an unknown amount of disruption that users would likely blame on, say, browser makers instead of certificate authorities, or on their cable company instead of their router maker.

"How much longer are the rest of us going to be forced to put up with those users?" Soghoian asks rhetorically. However depressing it sounds, seemingly forever.
