CARR RESPONDS TO DYSON

Nicholas G. Carr [7.31.13]

NICHOLAS G. CARR, former executive editor of the Harvard Business Review, writes and speaks on technology, business, and culture. He is the author of The Shallows and The Big Switch.

Nicholas G. Carr's Edge Bio Page

In the summer of 2006, America Online released a log of all the web searches made by more than a half million of its members over the course of a three-month period earlier in the year. AOL acted with the best of intentions. It hoped researchers would be able to use the logs to improve the workings of search engines. To protect the privacy of its members, it stripped all personal information from the data set. Each member was identified only by a number. But, much to AOL’s surprise and embarrassment, the "anonymization" didn’t work. It took a couple of New York Times reporters just a few hours to figure out one AOL member’s identity—her name and address—simply by examining her list of search keywords. "My goodness," the woman exclaimed when the reporters tracked her down and showed her the search log, "it’s my whole personal life."

In the age of the web, as George Dyson expertly explains, we are our metadata. We all disclose ourselves—our names, our addresses, our acquaintances, our thoughts and intentions—through what we search for, whom we friend and follow, the people we call and text. Dyson warns that once a powerful and secretive government bureaucracy is able to automate the deciphering of thoughts, it is on a path that leads, logically though not inevitably, to the ability to automate the control of thoughts. A drone strike is a particularly lethal means of reminding someone that their intentions have strayed out of bounds. One can imagine an array of more subtle tactics to nudge people away from dangerous or merely suspicious ideas.

Dyson, in his essay "NSA: The Decision Problem", has done us a favor by connecting the dots, both backward to the origins of modern predictive algorithms and forward to the potentially stifling effect of using such algorithms to spy on personal action and speech. I wonder whether there’s another set of dots to be connected to the commercial use of data-mining and prediction tools. The data collection and processing infrastructure that the NSA and other spy agencies use for espionage is the infrastructure built by internet companies to monitor people’s behavior and thoughts for business purposes. The new, digitized "military-industrial complex" still depends on the capabilities of its "industrial" partners, whether they take part willingly or reluctantly. The Snowden disclosures should encourage us to take a hard look at the secrecy of commercial data collection in "the cloud."

Dyson argues, drawing on historical precedent, that "a secret program can be brought into the open, to the benefit of all, without necessarily being brought to a halt." That goes for the data-mining programs of companies like Google, Facebook, Microsoft, and Apple as well as those of agencies like the NSA. What personal data is being collected? How is it being used? With whom is it being shared? The development of a stifling surveillance culture begins at the moment that the data on our thoughts and behavior is initially recorded.