I'm guessing there aren't many of us who are terribly concerned that Skynet will unleash its army of Terminator robots on us. But to hear tech visionaries like Bill Gates and Elon Musk tell it, there's probably good reason to worry that computers will one day become too smart for our own good.
We already know the Internet is segmenting us into distinct groups based on economic, social, educational, geographic, political and behavioral classifiers, among others. Internet titans rely on these classifiers to "filter" the world for us – which really means they are deciding what stories and opinions we read, which ads and offers we see, and the types of opportunities we receive. They also decide for us what we don't see.
But they are only just warming up; companies like Facebook and Google are racing to develop artificial intelligence technologies to expand their "deep learning" capabilities. These new technologies will be used to mine our data to more accurately assess who we are and what we want, and – to hear the Internet giants tell it – deliver elegantly tailored experiences that help us better understand and interact with the world around us.
There would be nothing inherently wrong with this if we could be absolutely certain the companies that control this technology would act only in our best interests. But if not, we could all be susceptible to manipulation by powerful systems we couldn't possibly understand. Some academics have even raised the specter of techno-social engineering and questioned whether we are moving into an age in which "humans become machine-like and pervasively programmable."
There is no shortage of Silicon Valley insiders who share these concerns. That's right. Some of the very coders, network architects, security experts and entrepreneurs who are driving big data to new heights understand better than any of us the full consequences of the trade-offs we all make when we hand our data to companies offering "free" Internet services. And quite a few of these insiders are quietly freaking out.
I know this because I spoke with dozens of them while writing Terms of Use, my upcoming novel that explores the dark side of social media. They understand how easy it is to segment us, model our behavior and interests, predict our wants and needs, and potentially manipulate our thoughts and actions. In fact, the seeds of some of the signature lines in my novel were planted by these insiders.
One engineer told me he was disturbed by the ease with which he and his colleagues casually mine our data, almost on a whim, to see what patterns they might discover, which in turn could give rise to algorithms and models that exploit our personalities and behaviors in ways not yet imagined. "We do it because we can. And because it's a competitive advantage," he explained.
Another told me that one big Internet company he worked for had backed up and stored every single piece of data you and I ever left on its servers. The company didn't dare delete anything because executives and engineers couldn't possibly imagine all the uses they might discover for our data in the future. We are leaving breadcrumbs for these companies to exploit next year, or perhaps next decade.
Several contacts shared with me the same "what if?": What might happen when market forces pressure these profit-driven companies to develop ever more sophisticated algorithms, which could in turn underpin new services not necessarily built with users' best interests in mind?
Here are a few scenarios they came up with:
What if one of the big social networks started offering background checks that predicted and ranked the suitability of job applicants based on each candidate's data set – regardless of whether the information was "public" or not?
Many of us are starting to use wearable computers on our wrists. What if your insurance company could marry your biometric data with your health history and genetic profile and, for example, predict you were 10 times more likely than average to suffer a heart attack? Might you one day be required by your insurer to live a certain lifestyle in order to minimize its financial risk?
Another contact, who did classified work for one government agency (he couldn't possibly say which one), offered a different but equally chilling twist. Sooner or later, he predicted, we will all come to fully understand that we won't be able to say, search, browse, buy, like, watch or listen to anything without our actions and thoughts being sliced, diced, and churned through powerful analytical systems. And then what? Will we, creeped out and perhaps a little afraid, start to second-guess our every move? Will we self-censor our speech and behavior to avoid being labeled?
The profit-driven companies that dominate the Internet insist the trust of their users is of paramount importance to them. And yet, these are often the same companies that keep moving the privacy goalposts and rewriting their terms of use (or service) to ensure they enjoy wide latitude and broad legal protection to use our data as they see fit.
Yes, some of these scenarios seem pretty far out there. But not to some of the Silicon Valley insiders I count as friends and contacts. They understand the consequences – certainly better than I do – should these powerful technologies be misused. And I couldn't have written Terms of Use without them.
Scott Allan Morrison was a journalist for almost twenty years, covering politics, business, and technology in Mexico, Canada, and the United States. Morrison arrived in Silicon Valley as a reporter for the Financial Times during the darkest days of the dot-com crash. He later covered the Web 2.0 boom for Dow Jones Newswires and the Wall Street Journal. Over the course of a decade, Morrison covered most of the world's top tech companies and chronicled many of Silicon Valley's greatest stories, including the rise of Internet insecurity and the explosion of social media. Before setting his sights on journalism, he spent four years teaching English and traveling in Southeast Asia. He speaks fluent Spanish and very rusty Mandarin. He lives in Northern California with his wife and his hockey sticks.