
WHO GETS TO KEEP SECRETS? [12.6.10]
Hillis's Question: An Edge Special Event!

The question of secrecy in the information age is clearly a deep social (and mathematical) problem, and well worth paying attention to.

When does my right to privacy trump your need for security? Should a democratic government be allowed to practice secret diplomacy? Would we rather live in a world with guaranteed privacy or a world in which there are no secrets? If the answer is somewhere in between, how do we draw the line?

I am interested in hearing what the Edge community has to say in this regard that's new and original, and goes beyond the political. Here's my question:


WHO GETS TO KEEP SECRETS?


I hope to hear from you.

— Danny Hillis

W. DANIEL (Danny) HILLIS is an inventor, scientist, engineer, author, and visionary. Hillis pioneered the concept of parallel computers that is now the basis for most supercomputers, as well as the RAID disk array technology used to store large databases. He holds over 150 U.S. patents, covering parallel computers, disk arrays, forgery prevention methods, and various electronic and mechanical devices. He is also the designer of a 10,000-year mechanical clock.

Presently, he is Chairman and Chief Technology Officer of Applied Minds, Inc., a research and development company in Los Angeles, creating a range of new products and services in software, entertainment, electronics, biotechnology, security, and mechanical design. The company also provides advanced technology, creative design, and security and cryptography consulting services to a variety of clients.

W. Daniel Hillis's Edge Bio Page

___

Hillis on Edge — Further Reading:

• "Proteomics" The Edge Master Class 2010
• "The Hillis Knowledge Web: An Idea Whose Time Has Come"
Part V: "Something Beyond Ourselves" in The Third Culture: Beyond The Cybernetic Revolution
"The Genius" in Digerati: Encounters With The Cyber Elite



MORE DISCUSSION ON "WHO GETS TO KEEP SECRETS"

December 21st +

[MOST RECENT FIRST:] NEW Danny Hillis, Clay Shirky, Francesco De Pretis, Danny Hillis, Emanuel Derman, Danny Hillis, John Markoff, Dave Winer, Nathan Myhrvold, Nicholas Carr, Danny Hillis, Nicholas Carr, Danny Hillis, Nicholas Carr, Evgeny Morozov, Danny Hillis, Nathan Myhrvold, Dave Winer, Esther Dyson, Daniel C. Dennett, Nathan Myhrvold, John Markoff, Daniel Kahneman, Nassim Taleb, David Gelernter, David Berreby, Dave Winer, Danny Hillis, Clay Shirky, Juan Zarate


NEW JUAN ZARATE
Former Deputy Assistant to the President and Deputy National Security Advisor for Combating Terrorism (2005 to 2009); Currently Senior Adviser, Center for Strategic and International Studies (CSIS); Senior National Security Analyst, CBS News.

Just like it's impossible to think of human interaction without secrets or intimate communications, it's quite naïve to think that there will not be secrets within and between governments — at a minimum during war or in the context of sensitive diplomacy.

Secrets are not just useful for governments to operate with others around the world and to defend their respective citizens but morally necessary in certain instances, as in the case of secret negotiations to end a genocide or civil war or tracking a suicide bomber with the intent of saving the lives of innocent civilians.

Normatively, secrets can certainly be destructive and be used to hide the indefensible, but they can also enable good and moral events and developments.

If we accept there are scenarios in which secrets can be morally necessary and important, then there is no absolute, a priori rule of openness and transparency for information — unless one is purposely amoral or destructively relativistic. If so, we then need to create structures and systems as societies that allow for appropriate secrets, oversight, and revelation — all of which we've debated within American culture and politics since the founding of the republic.

A more relevant question in this context then — and for a democracy — is what that system looks like, who gets to divulge secrets, and how this works in a way to ensure there isn't abuse. We have answered that as a society — perhaps not well enough and perhaps in need of more consistent review — but we've answered it. Assange may not feel a part of that debate or understand it, but he isn't American and has no right to make these judgments for the American people.

Shouldn't Americans be offended then that a non-American is determining what it means to be a transparent democracy and what should be divulged globally — to all actors (friends and foes alike)?

Have Assange or his supporters addressed the validity of the balance struck within the United States between necessary secrecy and transparency — existing declassification procedures, the Freedom of Information Act, the role of internal Inspectors General in all the key departments and agencies, the oversight of all of the government's activities by the people's representatives in Congress, and the ability of the media to act consistently as a check on the government in the American context?

This system may not be perfect, but it allows for a balance between the need for secrets and the need for transparency and accountability in a democracy. There is a deafening silence in this current debate regarding this balance and how it works in the American context. The fact that no grand revelation or policy bombshell has emerged in anything Wikileaks has produced may in fact be a reflection of this system working — one which tends to reveal over time alleged government abuses and missteps along with the rich contours of American policies consistent with what the media and public already know and expect.

All of this is revelatory of Wikileaks' real purpose. Wikileaks' wanton dumping of diplomatic cables (dribbled out to create headaches and headlines) demonstrates nothing other than a contempt for American influence and an attempt to undercut the ability of the U.S. government to operate effectively around the world. (Ironically, most of these documents revealed in this latest tranche are merely sensitive and not classified.) Assange and his supporters may argue that they are defending a broad principle of transparency, but this is less about principles and more a normative judgment about the nature of American power.

His focus on American "secrets," vice those of the most reclusive, repressive, and purposely opaque regimes in the world, demonstrates a failed moral compass and a flawed, fulsome principle unveiled now as simple anti-Americanism.


NEW DANNY HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone

Clay,

I understand your point about attempts to silence Wikileaks, but I am not sure I understand what your position is on Wikileaks itself. Should a democratic society be able to empower its diplomats to conduct confidential discussions? If so, and if those diplomats do the job they were asked to, would you consider breaking the confidentiality of those discussions a short-circuit of the process?

Danny

NEW CLAY SHIRKY
Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Cognitive Surplus

Danny,

Well, I think two things: diplomats need secrets, and government secrets are a threat to democracy. What I don't know is how those two conflicting observations should be balanced.

If the last couple of centuries of democracy have taught us anything, it's that designing political process is more important than designing outcomes, so I also don't regard diplomats 'doing the job they were asked to' as being much of a Get Out of Jail Free card, since the job is in part to represent our interests, while working in our name (and, of course, being paid with our taxes). All that suggests to me that diplomats (and foreign policy generally) can't be governed by any simple test of them 'doing their jobs' — unlike, say, postal employees or park rangers, the job of diplomat requires a lot of on-the-job judgment, and so should be part of democratic checks and balances.

I'm enough of a liberal (in the 19th century sense of the word) to believe that when fundamental principles clash (in this case, "international actors need private discourse" vs. "citizens should be able to observe the State's actions"), no one point of view is adequate to resolving them. My not having an answer isn't just "Better minds than mine..." begging off either. It's not just that I don't think I can resolve the tension between those contradictory goods. I don't believe any single mind can resolve the tension.

(A funny illustration of the cognitive dissonance caused by one person advocating contradictory views is the famous "If by whiskey" speech by a Mississippi lawmaker on the possibility of legalizing whiskey, who gave an impassioned speech simultaneously for and against the proposition; the overall argument ran roughly "If by whiskey, you mean its bad effects, I'm against legalization, and if by whiskey, you mean its good effects, I'm in favor.")

Unlike people, though, systems can contain contradictory views. Democracies, when they work, are designed to prevent intellectually or ideologically coherent groups from governing. As a result, we have a bunch of processes where the composite position does not resolve the issues so much as hold them in tension. (Processes like trial by jury or the hashing out of spending bills, leading to results like the Fair Use doctrine or the current rules on abortion.)

These processes are not based on processing a set of facts through a clear theory. They are bargains, balancing fundamental incompatibilities in a way pleasing to no individual or group, but representing something that won't derail the system as a whole.

Consider our immigration policy. It is a farrago of differing rationales around letting people in legally, and around pursuing those here illegally, but the fact of the matter is that all of those arguments were as valid in 2004, and there was little such debate. The key question for immigration is not rational, but cultural and economic — when first-generation immigrants account for 1 in 10 citizens or more, or when there is a recession, the mood of the country turns anti-immigrant. There simply isn't a principled argument about immigration that accounts for these 'stocks and flows' thresholds around the US's cultural and economic carrying capacity.

Similarly, Richard Posner once pointed out that not only is it generally impossible to say whether the Supreme Court has correctly decided any given case, it should be impossible, since the design of the system should keep the Court from spending time on cases where current legal principles provide a clear answer.

So it is with Wikileaks and diplomatic secrets — principled argument isn't adequate to resolving the issue. Diplomats and world leaders have to be able to talk in private, to explore possibilities outside the range of acceptable public discourse. (The Israeli/Palestinian situation would be impossible to even approach diplomatically without this.) Yet the ability of diplomats to operate in secret not only violates principles of democratic oversight, it provides a huge loophole allowing for overproduction of secrets. And Wikileaks changes the balance of power in that situation.

Consider the nested questions involved in figuring out how Wikileaks does and should change that balance:

  • Has Wikileaks, the organization, committed a crime? Is that crime bad enough to convince our allies to shut it down?
  • Has Julian Assange committed a crime? Is it a crime bad enough to convince our allies to extradite him?

The answer to the first question seems likely to be No, per the Supreme Court's reasoning in the Pentagon Papers case. (IANAL.) The answer to the second question is less likely to be No, because of the wording of the Espionage Act; if Julian aided or persuaded Manning in any significant way, the Act seems to indicate that's a crime, but we don't know how much protection he gets as a publisher. There is also not much case law around the Espionage Act, so it's less clear what the precedents even say. And Julian, an Australian citizen, may be un-extraditable. And so on.

However, all previous reasoning about the legal liabilities of publishers and journalists — all of it — has taken place against a set of background assumptions about publishing: It is expensive. It requires a big, formal organization. The bigger its reach, the more expensive it is to be that kind of organization. Only a tiny group of people can be publishers with global reach. Every media outlet is deeply rooted in some country. And so on.

Like tree roots growing around a big rock, the law has grown up around these facts. Now the landscape has changed; the facts the law grew up around have changed shape, but the law has not yet been reshaped to fit the new facts.

We can't just assume that the old interpretations apply to the new situation, because a law that carves out a special place for a minority class of citizens — publishers and journalists — can't just be extended to the general populace without breaking things. If everyone who has the capacity to broadcast public speech is a journalist, and journalists are immune from testifying about people they've gotten secrets from, no one could ever be compelled to testify.

While I like the status quo ante (it's illegal to leak secrets but not illegal to publish leaks), I also recognize that that situation balanced a bunch of competing interests in an environment that no longer exists. So I don't think my opinion is adequate to capture whatever new balance is required, because the systems we have for setting the balance haven't done their work yet.

Julian's publication of the State Dept cables may be a crime, and if it is, the US may be able to make a case to our allies that it is one so serious he should be extradited. Wikileaks may also have committed a crime, and it may be so serious that the punishment merits shutting them down.

I am skeptical of these propositions to varying degrees, but I also know that the act of deciding them will create global precedents and important new information about the publishing law in the age of the internet. And that new information is almost certain to be contained in an If-by-whiskey-ish bargain, where incompatible principles are both affirmed. And we don't know what that bargain will look like yet.

Wolfram made this point in A New Kind of Science: sometimes the energy required to calculate the answer to a problem is so significant that there is no way to model that outcome with heuristics and short-cuts. Sometimes the system just has to run flat-out for the full time it takes to arrive at the answer.
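[A toy illustration of Wolfram's point, not his or Shirky's own code: Rule 30, a one-dimensional cellular automaton whose state after n steps has no known closed-form shortcut. The function name and grid size below are arbitrary choices; the only way to learn what the system does is to run it, step by step.]

```python
# Illustrative sketch of computational irreducibility: simulate Rule 30.
# There is no known formula that predicts row n without computing rows 1..n-1.

def rule30_step(cells):
    # New cell = left XOR (center OR right), with wrap-around at the edges.
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

width, steps = 31, 15
row = [0] * width
row[width // 2] = 1  # start from a single "on" cell in the middle

for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```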

So this is a long-winded way of saying that the situation gives me ontological blindness. It's not that I think there is an answer to your question but I don't happen to know it. I don't think there is an answer to your question, period, and the novelty of the circumstance means there won't be an answer until the system has run flat-out for the time it takes to have a trial, and to see what new laws the world's governments are able to get passed.

And this is why I focused on the terrible risk of an extra-legal shutdown of Wikileaks; it's not just that if Julian doesn't get a fair trial, justice won't be served. It's that if Julian doesn't get a fair trial, we won't know what justice is in a world where something like Wikileaks is possible.

— Clay

NEW DANNY HILLIS

Clay,

So it seems to me that you are saying that my question is uncomputable, in the mathematical sense of the word. I want to know where the lines "should" be drawn, but you are pointing out that the very concept of "should" assumes an understanding of cause and effect that is impossible in this situation. Is that a fair summary of what you are saying?

[...PREVIOUS]


NEW DAVE WINER
Pioneer in the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software

I don't know who gets to keep secrets, but it's good that these secrets are coming out.

We're not really learning anything we didn't already know (but couldn't prove).

Now maybe some of our leaders will go to jail, and maybe some will be driven from office, and maybe some in the future will be a little less bold with the lies and trumped up reasons to go to war. (And maybe some reporters will dig in a little more and press harder for the truth.)

Maybe cynicism won't rule the day so much. Maybe we have a chance at getting reality-based in the future. Seems we could use a bit of that to deal with the exploding population.

We've gone way overboard with the secrecy thing. Now it's time for the pendulum to swing back from that extreme. It's good. Stop worrying.


NEW DAVID BERREBY
Author, Us and Them

The Czech dissident Jan Prochazka was spied upon for years by the Communist government in Prague, but he didn't let this inhibit his conversation. He spoke to his friends as anyone would, expecting that his talk would go no further than his social circle. Then one day the police did a shocking thing: They broadcast Prochazka's private chatter on the radio. He was devastated. Milan Kundera, who has written about this assault several times, explained it in terms of the contrast between public and private life: "that we act different in private than in public is everyone's most conspicuous experience, it is the very ground of the life of the individual," as he writes in Testaments Betrayed.

I'd like to propose that the issue isn't that simple — that we don't have a private self and a public self, but rather many different selves. One for work. One for the home we've made as adults, another for the family in which we were once children. One for old friends and one for new acquaintances. This variability is well-documented by social psychologists, behavioral economists and other researchers.

Now, imagine you say something cruel and funny, but unfair, about one friend to another friend. If the listener repeats what you said, you will feel (a) betrayed and (b) that you didn't "really mean it." You feel betrayed because you expected that Listener Friend would respect the unspoken assumption of a conversation, which is that you were presenting a version of yourself appropriate to that time and place, a version which is almost always not for export to the wider world.

Why, though, do you feel that you didn't "really mean it"? Why is the self in the bar at 2 a.m. a less legitimate, less real version of you than another self--for example, the self who knows he will be on a radio program talking about his friend tomorrow? I think it's because when you know you're going to be broadcast, you can prepare a version of your self to fit the occasion. It's probably axiomatic that we always mean what we say, when we say it. The important thing is to have control over your own self-presentation--to say what you want to mean at the funeral, and not have to hear what you said in the bar.

Privacy is version control. It's not a set of protections around a single True Self. It's the right to decide which of many versions of your self will be known to others, in a particular context or situation. It's the expectation that, if your voice is broadcast on the radio, it will be your "radio self" that the world hears, not the last-round-of-drinks self. People who can't control their self-presentation — children, prisoners, celebrities who have lost control of their own stories — feel the pain of missing autonomy, missing adulthood. Someone else is deciding what they really meant; someone else decides which version is true and which others don't count. So the opposite of privacy, as Kundera noted, isn't serene secretless glass-house transparency. It's totalitarianism.

Hence I was appalled by Wikileaks' violation of privacy in its cables release. Sure, I was titillated and amused to read what American diplomats think of Vladimir Putin or Nicolas Sarkozy. But Wikileaks did to them what the Communist apparatchiks did to Prochazka. It robbed them of that control over self-presentation that's essential to human dignity and autonomy.

Yes, but what about the other issue — state secrecy, rather than personal privacy? Surely some "secrets" are just the nation-state equivalent of personal privacy: The State Department can't do any business if it's limited to one version of itself (the one that issues bland communiqués about cooperation with Russia, or the one that calls Putin and Medvedev Robin and Batman, take your pick. Both are necessary).

Other kinds of state secrets have political and material consequences. They are the kinds of information that people in governments have decided to withhold from their own people. You could take the view that this is always a bad thing, but that's hard to defend. Should we really let Osama bin Laden know we've found his cave, in the name of perfect transparency?

I accept that states should keep consequential secrets, then, but it puts me in a bind: I must trust people in government to do what is right. But given the sacredness of privacy, I cannot know more about those people than they choose to let me know. So can I really trust them? It seems to me that democratic societies don't have a stable place to stand between the imperative of individual privacy and the imperative of trust-in-others. Yet I feel sure that if we let our justified mistrust lead us to deny the importance of privacy, we'll end up with a worse State, not a better one.


NEW DAVID GELERNTER
Computer Scientist, Yale University; Chief Scientist, Mirror Worlds Technologies; Author, Mirror Worlds

In the matter of Wikileaks, certain observers know they're right a priori. On keeping secrets, isn't it too bad that some Superior Moral Father didn't spill the beans about Bletchley Park & the breaking of Enigma in (say) 1943? Then Britain could have starved and the Allies could still have lost the war. Of course, maybe today's state secrets are OK to reveal, and things were different back then. It's a tough moral problem. Who is to decide? A democratically elected government, obviously; not a hacker.


NEW NASSIM TALEB
Distinguished Professor of Risk Engineering, New York University; Author, The Bed of Procrustes

David Gelernter has a great point. All these discussions miss something central: in this romantic utopianism for free information, few of the advocates realize that the wiki concept may not be that much of a bottom-up process. Rather, it is one that can easily be hijacked by a few techies and bully hackers creating cliques and their own mafias. It gives disproportionate power to people we did not elect. And it seems to want to escape the legal system, which we spent so much of our history fighting to refine in order to protect individuals.


NEW DANIEL KAHNEMAN
Psychologist, Princeton; Recipient, 2002 Nobel Prize in Economic Sciences

There is a reason for all the instances of secret negotiations among adversaries. Concessions are often impossible because of the fired-up constituencies on both sides. Giving that up is giving up a lot — too much in my opinion. It will cause conflicts to go on much longer than they should.

The underlying problem is the ease with which the press is co-opted or coerced — and in some cases so partisan that it is not interested in exposing truths that could be inconvenient to power. What happened at the New York Times in the run-up to the Iraq war, and what is happening daily on Fox News are strong arguments for supporting leakers. How do we achieve an independent, adversarial but still responsible press?


NEW JOHN MARKOFF
Journalist; Covers Cybersecurity for The New York Times; Author, What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry

David Gelernter picked a hypothetical exposure of a state secret. I wonder what he thinks about an actual one with historical consequences, say the Pentagon Papers?


NEW NATHAN MYHRVOLD
CEO, Intellectual Ventures; Former Chief Technology Officer, Microsoft Corporation; Physicist, Paleontologist; Photographer; Chef; Coauthor (with Bill Gates), The Road Ahead

Who gets to keep secrets?

I think that the answer is that we all want to keep secrets, at some level.

That's why we have clothing, and doors to our bedrooms and bathrooms. Logically speaking, we don't need those things but we find them very desirable. A post by one of us argued that privacy is a recent concept — which I find quite unpersuasive in this context. Lots of desirable things are relatively recent, like proper nutrition and healthcare. The fact is that as society has gained the ability for people to have privacy and keep secrets, there is a clear preference for doing so. Along those lines, Douglas Adams, in one of his Hitchhiker books, argued that mental telepathy was considered a great punishment to species that had it because they couldn't have any private thoughts.

Secrecy is one of the reasons we play cards. Is some idealistic killjoy going to argue that a real time Wikileaks ought to tweet what cards are being dealt in poker or bridge games? Secrecy in that context is part of the fun. Secrecy in games works in part because they are a model for real world situations in which we have strong vested interests in keeping secrets. Much of the Wikileaks fallout is to ensure that US ambassadors can't send frank assessments to the Secretary of State. They are daily engaged in a poker game, of sorts, with their counterparts. That is the game of diplomacy. So why should that be exempt from secrets?

It is very difficult to imagine having negotiations without having some secrets. The reality is that secrecy and privacy are deemed to be highly desirable in a lot of human contexts.

Everybody on this email has secrets. One set is proprietary information. Anybody doing something nontrivial has some information which they consider proprietary. Usually the word "proprietary" has a business connotation to it, but a scientist's research results are often guarded even more tightly than commercial secrets. I know a physicist who used to throw himself onto his desk to cover up the papers he was working on if a colleague walked into his office — and that was AFTER he had won the Nobel Prize. Right now, the LHC results at CERN are proprietary — they need to be checked and verified, and the people involved want to get credit. The draft of the next book that John Brockman is about to sell — that is proprietary. The research result that isn't published yet. Everybody has proprietary interests.

Wanting to keep secrets does not, by itself, mean that it is always desirable. People committing crimes have an obvious desire to keep their activities secret. Finding the balance between personal and societal rights is difficult. The way we run that in our society, as Nassim and David Gelernter point out, is a system of laws and elected officials. However imperfect they are, that is our system. Stepping outside that is something that is hard to condone as a general rule. Who elected Julian Assange to be in charge of this for the planet?

But what is the difference between what Wikileaks is doing and the Chinese hackers who were penetrating Google and other web sites trying (in part) to find evidence of dissidents that could then be punished and imprisoned? Does Google get to keep account info secret? Is it really OK that Chinese hackers can penetrate Google and read everybody's email?

Much of the response to Danny's question, and the response to Wikileaks more generally, has a liberal / populist "let's stick it to The Man" quality to it. Dave Winer's response captures much of that. Wikileaks has so far attacked the US government, so people who don't like our government (which at one point or another is most of us!) can treat that as a special case and make special pleadings (as he did) that basically say it is OK to break laws and violate privacy so long as it is somebody I don't like that you're doing it to. That isn't an intellectual argument — it is an emotional and political argument.

I am sure that somebody will argue that the Chinese hacking of Google is different because it is the Chinese Government that is sponsoring it. If you follow this line of logic you wind up with an answer to Danny's question which is just "the people I don't like shouldn't keep secrets", which is hardly a principled position.


NEW DANIEL C. DENNETT
Philosopher; University Professor, Co-Director, Center for Cognitive Studies, Tufts University; Author, Breaking the Spell

I think the issue of "where we draw the line" is difficult — as such issues almost always are — but not insoluble, and in fact I think we already know pretty well where to draw the line, and have done so in the past. The Pentagon Papers is a pretty good example — but nothing is a perfect exemplar.

States, like individual agents, need secrecy if they are to avoid being exploited in all their dealings. This is simply undeniable. You can't play 'rock paper scissors' if you write your list of moves on a piece of paper visible to your opponent. In a world in which agents (of all sizes and shapes) are constantly looking for an edge, keeping one's plans and one's knowledge of the world (and ignorance about the world) secret is a Good Trick with no alternative. So unless you are a radical anarchist/nihilist who prefers life in Hobbes' state of nature to the security of living in a law-governed society, you should endorse the need for state secrecy. (And corporate secrecy—think of the law of trade secrets—and individual secrecy, and ACLU Governing Board secrecy, and . . . .)

But of course state secrecy is often abused, and then we have to hope that enterprising and tenacious journalists and conscience-stricken insiders will spill the beans, selectively, for a purpose, minimizing collateral damage. Newspaper editors have huddled in secret over whether or not to print various secrets they have uncovered, and have a track record of making pretty good decisions. That going public with state secrets will be law-breaking in most instances is undeniable, but the state that chooses to prosecute the exemplary citizen who exposes major corruption/treason or other crime does so at the risk of its own perceived legitimacy. In a basically free society, judicious leakers earn our respect, even if they get harassed for a time or even jailed by those they expose.


NEW ESTHER DYSON
Catalyst, Information Technology Startups, EDventure Holdings; Former Chairman, Electronic Frontier Foundation and ICANN; Author, Release 2.1

Re: Nassim Taleb's comments, WikiLeaks is not a wiki. For better or worse, it's tightly controlled. And it does not have power over anything: it gives the power of information to anyone who wants it.

I'm actually a big fan (with limits) of the free flow of information, but I see one point/coming problem that has been so far neglected: One strength of WikiLeaks, for all its flaws, is its practice of vetting the information for authenticity — usually with original sources, but also with common sense, knowledge of the sources, etc. So far as I know, nothing WikiLeaks has posted has proved to be inauthentic (as opposed to biased, scurrilous, embarrassing, dangerous, etc.).

Indeed, in an odd way, WL is an example of centralization and earning a brand: the central, acknowledged source for quality leaks. Now, there are likely to be imitations, fake mirrors, knock-offs and the like that are less responsible — as you say, hijacked by who knows what. In a truly distributed world, misinformation runs rampant.


NEW DAVE WINER
Pioneer in the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software

As Esther Dyson writes, WikiLeaks is not a wiki. It was when they started, but they gave up on that idea and decided to work with professional reporters instead, people at The Guardian, Le Monde, New York Times, Der Spiegel and El Pais.

There really isn't all that much difference between this and previous leaks, except that this time, the Internet was used to transmit them. Not that big a distinction.


NEW NATHAN MYHRVOLD
CEO, Intellectual Ventures; Former Chief Technology Officer, Microsoft Corporation; Physicist, Paleontologist; Photographer; Chef; Coauthor (with Bill Gates), The Road Ahead

Concerning John Markoff's comments on the Pentagon Papers, this seems very much like pandering to a special case. When there is an unpopular war then it is OK to reveal secrets, but not otherwise?

Who gets to decide that?

Also, I have to ask. Does anybody have a really good analysis of the counterfactual — what would have happened if the Pentagon Papers had not been released? Was it a good thing?

Within journalism, the idea that publishing the Pentagon Papers was a major event is holy writ, but that does not mean it is so.

The popular mythology is that the Pentagon Papers release was a good thing — but I have to wonder. Nobody (me included) is going to stand up in favor of the Vietnam war, but it is very far from clear that the publication of the Pentagon Papers was actually significant. Perhaps it is, but that would require real scholarship and analysis to make a coherent argument.

The papers themselves were largely an academic exercise — a study of the history of the Vietnam war from 1945 to 1967 commissioned by Robert McNamara. Although the final date covered in the papers was 1967, they weren't released until 1971. The Vietnam war didn't end until 1975, so it's not like the papers brought a quick end to the war. They were secret even within the Johnson administration and it is unclear why McNamara even had them written. Supposedly it was done for future historians. Many assessments of the Pentagon Papers say that they never should have been classified in the first place — there was no actual national security information involved. That point of view is often taken to justify Daniel Ellsberg's actions, but by the same token, one could use it to ask: what was the actual importance of publishing them?


NEW DANNY HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone

In considering the Pentagon Papers, it may be helpful to make a distinction between secrets and lies. A lie often requires keeping the truth a secret, but a secret does not necessarily require a lie. My understanding of the Pentagon Papers is that they were revealed to expose the creation of a false narrative. Should this make a difference?


NEW DAVID GELERNTER
Computer Scientist, Yale University; Chief Scientist, Mirror Worlds Technologies; Author, Mirror Worlds

I agree that John Markoff raised an important point in mentioning the Pentagon Papers. No one can condemn all principled breaking of the law a priori, even of the democratically-enacted law of a legitimate democratic govt. But it seems to me that you have moral standing to break the law on principle to the extent that (at a minimum) you're willing to stand trial & face the consequences. Ellsberg did.


NEW ESTHER DYSON
Catalyst, Information Technology Startups, EDventure Holdings; Former Chairman, Electronic Frontier Foundation and ICANN; Author, Release 2.1

Precisely ... We need heroes willing to sin for us if necessary. (Yes, who decides it's necessary?)


NEW EVGENY MOROZOV
Commentator on Internet and politics "Net Effect" blog; Contributing editor, Foreign Policy

While I don't have a comprehensive theory as to who gets to keep their secrets, I'd like to correct several misperceptions about WikiLeaks that have arisen on this list:

1) Most of the redactions that WikiLeaks applied to the cables were suggested by the journalists working for its media partners — The Guardian, Le Monde, El Pais, Der Spiegel, and The New York Times. WikiLeaks publicly acknowledged that they do not have the expertise to do the redactions and relied on the journalists' expertise.

Furthermore, WikiLeaks did contact the US State Department prior to the publication of the cables and asked them to suggest any further redactions in case the release of the cables might endanger lives or national security. The State Department refused to weigh in on this matter out of principle, making Assange conclude that there was no such information contained in the cables.

I am not aware of any cables that were released by WikiLeaks but were not released by their media partners — i.e., all of the cables released so far seem to have been vetted by establishment media (and contrary to numerous reports, WikiLeaks has only released 2,000 cables so far, out of 250,000 total). It's certainly not a "dump" as is commonly assumed.

2) To portray Assange as a hacker may help us in understanding where he comes from ideologically, but there is no evidence whatsoever that the cables were obtained via hacking. WikiLeaks claims to have obtained them through their electronic dropbox technology, which provides anonymity to the leaker; thus, even WikiLeaks doesn't know who leaked the cables to it.

I think it is counterproductive to bring too many assumptions about hackers and techies in general into this discussion — at least as long as we want to understand the ethics of leaking. For all we know, these cables may have been leaked to Human Rights Watch; I'm not sure that their response to receiving such a package would be much different to Assange's, despite them not having many hackers on staff.

Furthermore, it's not clear if WikiLeaks has violated any laws, as is evident by the tremendous difficulty that the US government is having in building a case against Assange. The question of "who elected Assange?" is a fairly common question in global governance — but, once again, it is also asked of Human Rights Watch, Amnesty International, Doctors Without Borders and plenty of other transnational players (and, if you want to think really globally, the Internet fits the bill too — technically, no one "elected" it and governments, both democratic and authoritarian ones, have quite a bit of trouble controlling it).

3) It may be useful to look at some of Assange's published writings to understand the philosophy behind WikiLeaks. Two of Assange's most telling essays are analyzed here. As you would notice, they contain very little discussion about secrecy in the abstract; Assange's focus is almost entirely on what he calls "government conspiracies".

While I don't endorse Assange's philosophy, I think he's quite clear in that he is not seeking to violate the privacy of individuals; governments (and increasingly corporations) are his only focus. I'd also like to add that Assange was one of the masterminds behind Rubberhose, an encryption technology — to be used by human rights workers in authoritarian states — that would help them secure their data even if they were being tortured. Thus, as long as we want to zoom in on WikiLeaks, it's quite clear that they have not set out to eliminate secrecy per se. In fact, WikiLeaks has a history of sharing personnel with Tor, the leading tool for online anonymity (google "Jake Appelbaum").

Hope this clarifies a few common misconceptions.


NEW NICHOLAS CARR
Author, The Shallows: Mind, Memory and Media in an Age of Instant Information

Danny,

This discussion is fascinating, but largely beside the point. The answer to the question "Who gets to keep secrets?" will not be based on ideology or morality. It will be based, as always, on power, perspicacity, and technical skill. The technologies of secret-keeping and the technologies of secret-leaking advance hand in hand. The WikiLeaks case shows that, thanks to the remarkable ease and inexpensiveness of mass data storage and transmission, secret-leaking can now occur on a scale unimaginable before. But that's also true of secret-keeping. Tapping into the Internet and other communication networks allows governments, corporations, and others to collect and store private information on an unprecedented scale.

The U.S. government may in the end come to admit — secretly, of course — that it owes a debt of gratitude to WikiLeaks. Cablegate revealed, after all, that there are huge security holes at the State Department. One wonders how many other CD-Rs, thumb drives, SDHC cards, and other diminutive storage devices filled with confidential data have passed unnoticed through the doors of government agencies—and into whose hands they've fallen. At least with WikiLeaks the government knows what's been purloined; the undetected security breach tends to be the more dangerous one. And at least with WikiLeaks there seems to have been, so far, a careful vetting of the information made public. It's worth remembering that whoever gave the data to WikiLeaks could just as easily have turned it into a torrent and blasted it, in raw form, across the Net. Julian Assange may be a hero or he may be a creep, but he's far from a worst-case scenario.

If the government wants to keep its secrets secret, it needs to modernize its computing systems, with a particular emphasis on reducing the proliferation of copies of sensitive documents and messages. Wherever possible, for example, the computing devices used by functionaries with security clearances should act as dumb network terminals, with as little onboard storage as possible and without USB and other ports. Such safeguards are not foolproof, of course. Computer security is and always will be a cat-and-mouse game. We can hope that the cats and the mice who gain the upper hand will always be the good ones, but that has never been the case in the past and is unlikely to be the case in the future. We don't get to choose who keeps the secrets.

— Nick

~~

NEW DANNY HILLIS

Nick,

You are right, but the intent of my question was who should get to keep secrets. As you point out, the relative strengths of secret-keeping and secret-exposing technologies make a difference. As an inventor of technology, which way should I try to tilt the balance?

— Danny

~~

NEW NICHOLAS CARR

Danny,

I think I'd tilt the technological balance slightly toward secret-keeping when it comes to individuals and slightly toward secret-exposing when it comes to institutions. Is that possible?

— Nick

~~

NEW DANNY HILLIS

Nick,

Let me ask a narrower question, which will make clearer my motivation and help me understand your position. Should I work to increase or decrease the technical ability of institutions to keep secrets?

— Danny

~~

NEW NICHOLAS CARR

Danny,

I think institutions have a legitimate need, and often an obligation, to keep some information secret, so I think it's entirely appropriate for you to help them build technical defenses against theft and sabotage, and it would be inappropriate for you to work to create tools to make theft and sabotage easier.

At the same time, I also believe it would be appropriate for you to work on technologies that would make it easier for individuals to discern when institutions are collecting information about them — and to block such efforts when desired. I also think it would be appropriate to work on technologies that would help shield the identity of whistleblowers inside institutions.

These efforts might well conflict with one another, but I don't think such conflicts are bad. In fact, I think the conflicts are necessary in order to keep a balance between the need to keep information secret and the potential for secrecy to be abused.

— Nick


NEW NATHAN MYHRVOLD
CEO, Intellectual Ventures; Former Chief Technology Officer, Microsoft Corporation; Physicist, Paleontologist; Photographer; Chef; Coauthor (with Bill Gates), The Road Ahead

The fact is that institutions have multiple people and devices, so in general their security problem is much harder. A single person who keeps a secret thought to him or herself is pretty invulnerable. The more people, devices and networks you interconnect to, the harder your security problem as a general rule. The bias against institutions is built into the problem.

But as Danny points out, the distinction is silly when you consider institutions that hold personal information. I note a lot of Gmail accounts among the people on this message — they all depend on Google being secure. Those of you at mac.com depend on Apple, and so forth. Which case is that?


NEW DAVE WINER
Pioneer in the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software

I have a different problem, I want to be sure my information can be found.


NEW JOHN MARKOFF
Journalist; Covers Cybersecurity for The New York Times; Author, What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry

What baffles me most in all of this is the design of SIPRnet that made Cablegate possible. Bill Joy once proposed that the first Sun workstation DES encrypt all data that entered and exited the machine. That never happened. I can name at least six companies that could have sold the Government software that would have secured the workstation used to exfiltrate the cables that went to Wikileaks. I assume that over the next half decade secret data will be both encrypted and watermarked, which will at least raise the bar a bit above the technical ability to do file transfers.
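[As a rough sketch of the "encrypt secret data at rest" idea Markoff anticipates — not a description of SIPRnet or of any actual government system — the snippet below uses the third-party Python cryptography package's Fernet recipe. The variable names are illustrative, and the genuinely hard parts (key management, access logging, watermarking) are deliberately left out.]

```python
# Illustrative only: what encrypting secret data at rest looks like, so that
# copying the raw file off the network yields ciphertext rather than cables.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held by a separate key-management service
vault = Fernet(key)

cable = b"Frank assessment of a foreign counterpart"
blob_on_disk = vault.encrypt(cable)   # what actually sits in storage

# Without the key, the exfiltrated blob reveals nothing readable.
assert vault.decrypt(blob_on_disk) == cable
print(len(blob_on_disk), "bytes of ciphertext at rest")
```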


NEW DANNY HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone

All true, but it only would have made the exfiltration of the information less convenient, not impossible. Fundamentally, an organization's ability to keep secrets is dependent on the individuals who have access to the information. Thus, as Nathan pointed out, organizations are at a fundamental disadvantage relative to individuals in their ability to conceal.


NEW EMANUEL DERMAN
Professor, Financial Engineering, Columbia University; Principal, Prisma Capital Partners; Former Head, Quantitative Strategies Group, Equities Division, Goldman Sachs & Co.; Author, My Life as a Quant

Governments and corporations sometimes use secrets and lies to harm the people that make their existence possible.

When they err, and they do, they err on the side of excess.

One cannot expect the world's (in this case, Wikileaks') antithetic response to be finely tuned and error-free either. Where power is the issue, human affairs advance by overshooting the equilibrium on either side, and the equilibrium keeps changing too.

Meanwhile, perhaps Wikileaks has decreased the asymmetry between the power of bureaucracies and the power of individuals.


NEW DANNY HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone

There is an interesting paradox in talking about the asymmetry between the power of bureaucracies and the power of individuals when those bureaucracies were empowered by individuals to do a job. In a democracy, creating institutions is one of the ways that individuals exercise their collective power. As individuals, we would like to strengthen our institutions' power to do what we want them to do, but weaken their ability to do what we don't.

From your comment, I am guessing that you have an underlying model that assumes that our bureaucracies are behaving as if they have goals of their own, at odds with the goals of the individuals who created them, and that they need to be weakened. Is that your assumption, or am I reading too much into your comment?


NEW FRANCESCO DE PRETIS
Assistant Professor, Link Campus University (Rome, Italy)

In replying to Danny Hillis' question, I would like to put the focus on the bias between the social and scientific aspects of secrecy, a theme I deem interesting for Edge and the Third Culture's community.

Assuming that secrecy is still the main way to carry on human social relationships (diplomacy, politics and, above all, love!), we could take a look at modern science and infer that here the game and the rules are completely different.

In modern science, the main way to make progress — a process through which many try to enter history — is to reach and spread information as freely and widely as possible.

It is worth noting that Charles Darwin was pushed to publish his milestone books mainly to achieve consensus among the scientific community, and that Albert Einstein did the same thing, communicating his hitherto unknown results on special relativity through the Annalen der Physik.

From this perspective, the Internet and more conventional scientific journals (whether peer-reviewed or not) are the furthest things from secrecy we could ever think of, and science should be held up as a model of open human cooperation toward free knowledge.

Well... I think this view is quite far from reality, however much many people believe it or would like to believe it!

Science — as a product of human activity — has no fewer constraints than other activities that openly use secrecy to pursue their final goals. First of all, the naïve picture I gave of modern science is just about modern science.

We must not forget that for many centuries, until the Age of Enlightenment, science was just a matter of secrecy.

Until the end of the eighteenth century, even outstanding scientists such as Isaac Newton or Pierre de Fermat did science in a completely different way: basically, they searched for answers in a framework where secrecy prevailed and access to knowledge was not free but regulated by byzantine processes, more similar to alchemy than to today's science.

Secondly, even after the Enlightenment, science has been regulated, more or less softly, by internal or external authorities. For a long time the Holy See played an important role in doing this, and so has war! The Manhattan Project — maybe the most remarkable advance of science in the previous century — was managed in strict secrecy at a facility in New Mexico administered by the Army.

Finally, are we sure that scientific research nowadays is carried on in an unlimited way?

Here, I do not want to refer to the laws prohibiting particular types of research in biology or genetics.

Here, I would like to stress the fact that maybe there could be something too big to discover!

Here's an example.

Today, modern cryptography and almost all the systems used to assure secrecy rely on a conjecture belonging to mathematics: the idea that the so-called "prime numbers" are distributed in a way that is unknown and very hard to track.

So far, no specific mathematical law exists to explain how these numbers behave. It is deemed such a difficult problem — even more complicated than any of the Millennium Prize puzzles — that almost everyone who wants to share information in a secret manner (from banks to the CIA) relies on prime numbers.

Imagine what would happen if someone stood up and said: "I've got it! I have discovered the prime numbers law!".

Julian Assange and WikiLeaks would then become trivial puppets with respect to this mathematician.
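[A minimal sketch of the dependence De Pretis describes, using textbook RSA with toy primes rather than code from any real system: the public pair can be published freely, while the security of the secret rests on how hard it is to recover the hidden primes from their product.]

```python
# Toy RSA: encryption needs only the public (n, e); decryption needs d, which
# is easy to compute only if you know the secret primes p and q.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    g, x, _ = egcd(a, m)
    assert g == 1, "a must be coprime to m"
    return x % m

p, q = 61, 53                 # toy stand-ins for enormous secret primes
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # computable only if p and q are known
e = 17                        # public exponent
d = modinv(e, phi)            # private exponent: the secret that is kept

message = 42
ciphertext = pow(message, e, n)           # anyone can encrypt with (n, e)
assert pow(ciphertext, d, n) == message   # only the holder of d can decrypt
```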

So, can we be sure that science and secrecy aren't still bound together, as they have been for so many years in the past?


NEW CLAY SHIRKY
Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Cognitive Surplus

Late to the conversation, but just checked in on Christmas night to find this present of a conversation gift-wrapped and sitting in my mail box.

Late though I am, I'll chime in with a few things.

1. To Danny's original question, one obvious answer to "Who gets to keep secrets?" is "Anyone capable of keeping them." As noted, that list is increasingly hard for organizations to get on, because the tension between access for insiders and lack of access for outsiders is no longer aided by a media environment characterized by a cartel of nationally-based actors who own substantially all modes of global distribution, and who would decline to publish such material.

2. It is astonishing that Pfc. Manning (or the person who copied the cables from SIPRnet) has gotten almost zero attention, even though, in both law and practice, he is clearly the person most culpable. Like the music industry, the government is now desperate to add new forms of control at intermediary points, as they have witnessed the million-fold expansion of edge points capable of acting on their own, without needing to ask anyone for help or permission.

3. It's unlikely we'll ever again see as massive and indiscriminate a collection of secret documents leaked from a single governmental source by a single person. As Markoff said, the poor design of SIPRnet access controls, as well as the insanity of a demoted specialist being allowed to keep his clearance, are not the kind of mistakes governments are likely to make twice. We will still see leaks, but they will be narrower in scope, and the leakers will be more centrally placed.

(3a. It is, however, the sort of mistakes hospitals will make. They have exactly the same "Multiplication of insiders/resistance to internal segmentation" that .mil had. Electronic records are proliferating, while the health care system continues to resist requirements to take on the expense and inconvenience of protecting its data (including resisting requirements to encrypt all data at rest). As a result, we will see at least one #cablegate-scale leak of healthcare data.)

4. As a consequence of the government's reaction, the Age of Intellipedia is over (9/12/01-11/28/10, RIP). After the shock of the 9/11 attacks, the intelligence community had been re-thinking its approach to silo'ed data, moving, fitfully, from a culture of "Need to know" to one of "Need to share." That giant sucking sound you hear is a billion putatively shared documents being slurped back into their silos, and even now, somewhere in the bowels of the Pentagon, there is doubtless a Powerpoint deck being crafted whose title is "Need to Know 2.0".

5. Julian claims that the history of these matters will be divided into "pre-Wikileaks" and "post-Wikileaks" periods. This claim is grandiose and premature. However, it is not, on present evidence, visibly wrong.

6. There are several ways in which Wikileaks marks a break with previous examples of such leaks, but for my money, the most dramatic is its global nature. Even the NY Times vs. the United States (1971), universally referenced as the apposite legal judgment, was a local affair.

Though, as Nathan noted, the leaking of the Pentagon Papers leading to that case didn't much change the prosecution of that war, it did affect the principal target of the protests of the 1960s, which was ending the draft. The Papers made it harder to ask middle-class parents to sacrifice their children to that kind of action. (It also helped feed into the collapse of trust in the government and of authority figures generally.) All this, however, was in the US context, as was every actor involved: Ellsberg, McNamara, Sulzberger, the NY Times, the Supreme Court, and so on. No one was out of the reach of the Federal Government.

Wikileaks, as both an institution and as a capability, has been global from the beginning, and the additional complexity of both jurisdiction and extradition make this particular problem much much more complex than any issues, legal or practical, triggered by the Pentagon Papers. Wikileaks has been operating since 2006, and the AG still has huge difficulty bringing charges.

7. Because of the massive increase in the number of actors who can make things public and the globalization of the stage on which those actors can act, I don't think there is yet a practical answer to Danny's question in the context of Wikileaks.

For many of our most important social systems, we resolve clashing principles by providing an escape valve, in the form of a set of actors who are less rule-bound than the rest of the system. The most famous and ancient is the jury, a collection of amateurs who can, in the face of clear laws and evidence, simply not return the verdict a judge would have returned.

So while I share Nathan's belief that the publication of the Pentagon Papers wasn't itself such a momentous event, I am nevertheless willing to defend the subsequent court decision not to punish the NY Times as "Holy Writ", on different principles. The Supreme Court's ruling seems to say (IANAL) that there is no law-like way to balance the State's need for secrets with the threats secrets pose to democracies, so it simply said that the 1st Amendment provides immunity to publishers; it is illegal to leak secrets, but it is not illegal to publish leaks.

8. This immunity sets up publishers as self-regulating checks to government power, albeit in a system that can never be made intellectually coherent — neither total success nor total failure of the state to keep secrets would protect the Republic as well as a regime of mostly success with periodic failures.

In this, it is analogous to other such legal systems, like Fair Use, which says "It's fair to use some bits of copyrighted work without permission, except when you use too much, or in the wrong way, and we're not going to define 'too much' or 'wrong way' too predictably." The effect of this is that claiming fair use is an assertion of a sufficiently low probability of being successfully sued; put another way, fair use is simply the concatenation of behaviors that won't get you fined for violating fair use.

9. However, a system like the one we got in 1971, a system that lets publishers operate as self-regulating checks to government power, was constructed in an environment of significant extra-legal constraints. A publisher, circa the Pentagon Papers, was a commercial, nationally-rooted media firm subject to significant tradeoffs between scale and partisanship — large publishers had to be deeply embedded in the culture they operated in, and they had to reflect mainstream views on most matters most of the time, to find and retain both revenue and audience.

Wikileaks operates with none of those extra-legal constraints. As a result, the legal bargain from 1971 (which was in turn public codification of US practice; no publisher has ever been successfully prosecuted under the Espionage Act) simply does not and cannot produce the outcome it used to, even though no aspects of the law itself have changed.

Society is made up of good things that can't be resolved in any perfect way — freedom vs. liberty, state secrets vs. citizen oversight — but the solutions to those tensions always take place in a particular context. Sometimes a bargain is so robust it lasts for centuries, as with trial by jury, but sometimes it is so much a product of its time that it does not survive the passing of that time.

I think that this latter fate has befallen our old balance between secrets and leaks. This does not mean that the Pentagon Papers precedent shouldn't free Wikileaks from prosecution, but I think it does mean that we can't assume that the old rules should be applied without amendment to the new situation.

I hope for all of our sakes that Holder brings a case under the Espionage Act, because that outcome would be better than either extra-legal attacks on a media outlet the US Government doesn't like or finding ways to hold Julian on charges that don't revisit the basic balance the Court struck 40 years ago. This is new ground, and needs to be hashed out as an exemplar of the clash of basic principles that it is.

And with that, from snowy Geneva, a Merry Christmas to all, and to all, a good night.


NEW DANNY HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone

Thank you all for being so generous with your time, sharing your insights into this complex question. I especially appreciate those of you who have been willing to express opinions that you know to be unpopular with your friends and colleagues. Thank you for not keeping your opinions a secret.

— Danny


PREVIOUS December 6 — 21, 2010

RESPONSES: Aalam Wassef, Clay Shirky, Gloria Origgi, George Church, Noga Arikha, Douglas Rushkoff, George Dyson, Simona Morini, Andrian Kreye, Marc Rotenberg, James O'Donnell, Lee Smolin, Ross Anderson, Jennifer Jacquet, Donald Hoffman

CORRESPONDENCE: Danny Hillis responds to George Church, Lee Smolin, Clay Shirky, Marc Rotenberg, James O'Donnell, Jennifer Jacquet, Noga Arikha, Andrian Kreye

IN THE NEWS: O Globo, CNet



PREVIOUS CORRESPONDENCE: Danny Hillis responds to George Church, Lee Smolin, Clay Shirky, Marc Rotenberg, James O'Donnell, Jennifer Jacquet, Noga Arikha, Andrian Kreye


TO GEORGE CHURCH

George,

I agree with you that the world improves when we can remove the need for secrecy, but in saying that you tacitly admit that there is sometimes a need for it in our not-yet-perfect world. I especially like your notion that secrets are symptoms of something that needs improving. I think this applies to both public and private secrets. Is this a rule or just a heuristic? Are some truths better left obscured? For example, imagine the effect on human relationships if a mind reader were invented that allowed others to know our private thoughts and feelings. Evolution has evolved deception because it serves a purpose. Cannot that purpose sometimes serve the common good?

— Danny

~~

GEORGE CHURCH:

Danny,

Yes "Evolution has evolved deception" — as well as flu, HIV, tb, etc. Could
those "sometime serve the common good"? Maybe — but this seems like a very
weak argument. Perhaps better would be that we might need separation of
ideas/memes/cultures long enough to test them — and then recombine the
parts we like best. But separation could be done without deception.

Furthermore, let's say that we had strong AI. Would we prefer to be able to read the computers' minds or teach them how to lie to us and each other? Would we like them to explain and justify their decisions or just strut off with the taunt "It's for me to know and you to find out!"?

— George

~~

DANNY HILLIS:

Your AI example is a great thought experiment, but there is a big difference between lies and secrets. If we were designing Asimov's laws of robotics, would we add "Never hide information" to the list? We certainly do not program our current computers that way. I would not want a robot to reveal information that would put me in immediate physical danger. Nor would I want it to reveal information that I told it not to reveal. For example, if I told it not to reveal useless information that would hurt someone else's feelings, is that a symptom of something that is wrong?

~~

GEORGE CHURCH:

My lab programs a variety of computers, robots, and nanobots, with a variety of safety features — but we are not heavily influenced by Asimov's laws. We do generally use the rule "Never hide information from the programmers". The ability to check the robot logic when a bug is found is quite valuable.

Yes; "there is a big difference between lies and secrets", but I was addressing your seemingly favorable use of the word "deception". Yes I can imagine scenarios where a person might not want to know something — but that should be their choice — not the choice of some paternalistic data hoarding computer. Software should help people decide if information is truly useless, harmful, or helpful for them (before they see the details).. Deception and secrecy (as typically implemented) do not have such choices as top priorities.

~~

DANNY HILLIS:

Since we agree that an individual might legitimately want to choose not to know something, shouldn't a society of people be able to make a similar choice? In other words, shouldn't a group be allowed to reach a consensus that they would be better off if some information was not known to the group as a whole? We choose to do this, for example, when we play card games. We also choose not to publicly reveal information that would hurt people's feelings for no purpose. Are secrets always a symptom of something that needs to be fixed, or are they just a good warning sign?

~~

GEORGE CHURCH:

I'm glad that you chose the word "consensus" rather than "majority rule". But that means that the secrecy-averse minority has a strong role. Card games seem like a fairly specialized case, possibly popular to hone our skills of deception and secrecy — something that might be an amusing cultural vestige well into the future.

Are secrets always symptoms? I'm not keen on the word "always", but perhaps we could benefit from specific examples of "information that would hurt people's feelings for no purpose."

1) If you tell me that I'm ugly, why should that hurt my feelings? Is that symptomatic of society favoring handsome people? If I need a shave, then what hurts me more — hearing about it now or a decade from now, after I've been passed over in the marketplace of life? Or does this just save you from (awkwardly) helping me?

2,3,4,5) Or could you hurt me by telling me that I've invested stupidly, or had cancer, or was conceived out of wedlock, or that someone was insulted by my narcolepsy — all true. Or could you hurt me via false rumors (though telling me gives me a better chance of nipping them in the bud)?

Should we keep our current vast and destructive mechanisms for secret keeping (and deception) just for those rare cases where we can achieve consensus as a group to not let individuals decide if they want to hear a specific class of secret? Perhaps you have more compelling examples than mine above.


TO LEE SMOLIN

Lee,

I like the explicitness of the rules that you offer, but I don't think they cover all the important cases. For example, you say:

2) As a private individual, I cannot have access to information about other private individuals, without their explicit permission.

Currently, I do have some access to information about other individuals without explicit permission. Records of public transactions, such as births and marriages, are an example. I also have (today) the right to some information that they would presumably not want me to know, like whether they have been convicted of a felony. I might also want to live in a society that allowed me access to information about them as a statistic, detached from their name, such as whether they had side effects to a certain drug. This type of information revelation without permission could help society without harming the individual.

— Danny

~~

LEE SMOLIN:

Dear Danny,

Thanks for writing back. I am certainly aware that what I wrote just touched the surface of a major area of law in transition. Let me see if I can improve on these.

Meanwhile, just to attempt to reply: marriage is a public act, so it is one in which one voluntarily cedes certain privacy rights. As for birth and death, there is probably not a right to exist without it being known to the community. Privacy is only one of the rights you lose when you are convicted of a crime.

Getting statistical information from medical records is just one area where there is an issue of safeguarding privacy while getting useful information out of records. The telecom companies have troves of data, some of which would be very useful if it could be separated out without violating the privacy of individuals, such as real-time maps of traffic flows. Our PI building, like many workplaces, knows who is in the building and where; as statistical information this could be used to save a lot of energy, but why not go further and post real-time information about where we are on an intranet page so we can always find each other easily? Do I have a right to privacy while at work? Even if not, is it wise for everyone to be able to reach each other quickly in a scientific institute? I suspect none of this is new to you.

— Lee


TO CLAY SHIRKY

Clay, you say: "The only thing that could go really, terribly wrong right now would be short-circuiting that process." Do you see Wikileaks as an example of short-circuiting that process?

— Danny

~~

CLAY SHIRKY:

No, I see attempts to silence Wikileaks without due process as the short-circuit.

Wikileaks is just the kind of thing that happens from time to time (for which read: once every 500 years or so, as with the Dutch printers). What's at stake is whether democratic or extra-democratic processes become the way the world's democracies adjust.

— Clay

~~

DANNY HILLIS:

I understand your point about attempts to silence Wikileaks, but I am not sure I understand what your position is on Wikileaks itself. Should a democratic society be able to empower its diplomats to conduct confidential discussions? If so, and if those diplomats do the job they were asked to do, would you consider breaking the confidentiality of those discussions a short-circuit of the process?

— Danny


TO MARC ROTENBERG

Marc,

I wish you would say more about this. When the government withholds information about individuals, is that an issue of privacy or secrecy? Do individuals have a fundamental right to withhold any information, or can society demand transparency from them in certain circumstances? For instance, should courts be stripped of their current powers to force individuals to reveal information?

— Danny

~~

MARC ROTENBERG:

Interesting questions. Here are a few more thoughts.

Neither privacy nor secrecy is absolute. But in democratic societies, the default settings are toward privacy for the individual and transparency for government. In authoritarian societies — and in prisons — it is transparency for the individual and secrecy for the government. Privacy is also highly dynamic. A person's ability to move between a private sphere and a public sphere is a measure of political freedom.

When the interests in personal privacy and government transparency come into conflict, there are often non-zero sum solutions. So for example, it is reasonable to expect that governments should make available records to the public while redacting the names of individuals. At times, of course, we need to understand the activities of individuals when they are acting in an official capacity.


TO JAMES O'DONNELL

Jim,

Your operational answer to the question of who gets to keep secrets, "Whoever can get away with it", is hard to argue with, but it does not get at why I was asking the question. What I really meant was "Who should get to keep secrets?" As a technologist and as a citizen, I imagine that I have at least a small ability to influence who can get away with it. Who should I help or oppose?

— Danny

~~

JAMES O'DONNELL:

For that we need the old Roman's line, 'who will guard the guardians?' All our methods of controlling the controllers have depended ultimately on trusting the elected representatives, but that has all gotten frayed and unpersuasive. Right now, the government of the US knows a lot more about terrorist threats than I do — but I don't know how to trust them. Are their public measures overreactions, underreactions, or incredibly astute? I have absolutely no way of knowing. Is there a Team B, a UN, a Club of Rome we could trust to be the check and balance? I don't think so. I think we're just in a tough Machiavellian world with very few tools other than insurgency (to put the fear into powerful abusers, but also to make them react viperously) on our side.

Thanks for writing,

j'od


TO JENNIFER JACQUET:

I love your notion that secrecy is a cost of imagination. It brings to mind the fig leaves in the story of Adam and Eve: When their "eyes were opened" by eating from the Tree of Knowledge, they felt the need for the first time to hide their shame.

— Danny


TO NOGA ARIKHA:

I think it is interesting that you take such a balanced view of the virtue of opposing tendencies, kept in balance. Your description of the system of the four humours in "Passions and Tempers" has helped me appreciate the merits of this perspective.

— Danny


TO ANDRIAN KREYE:

Your observation that many Germans are upset by Google Street View adds another dimension to the question of secrecy. The view of one's house from the street is surely not a secret, yet somehow, gathering these views into a database has aroused memories of authoritarian dictatorships. One of the few revelations that seems to have provoked European indignation in the recent diplomatic leaks is that non-hidden information was being collected on the leadership of the United Nations. Do you think Europeans make more of a distinction than Americans between collected and uncollected non-secrets? Is the concentration of information necessarily linked to the concentration of power?

— Danny


RESPONSES: Aalam Wassef, Clay Shirky, Gloria Origgi, George Church, Noga Arikha, Douglas Rushkoff, George Dyson, Simona Morini, Andrian Kreye, Marc Rotenberg, James O'Donnell, Lee Smolin, Ross Anderson, Jennifer Jacquet, Donald Hoffman

DONALD HOFFMAN
Cognitive Scientist, UC, Irvine; Author, Visual Intelligence

Secrets shape our bodies and brains. The need to keep a secret, and to break that secret, powers a creative cycle through natural selection, leading to better encrypting bodies and better decrypting perceptual systems.

The Australian bird dropping spider, Celaenia excavata, has a secret to keep: I am food. The need to keep this secret has, through natural selection, creatively shaped its entire body to resemble a bird dropping. This secret has thus been encrypted to hide the secret from a specific audience that wants the information, viz., predatory birds. The method of encryption strikes us as clever, even humorous. The engine behind this cleverness was the need to keep a secret. The need was pressing. Those less adept at keeping the secret are more likely to die prematurely. Only those who can keep the secret long enough can pass on their encryption method to a new generation.

The creative process of encryption probably took time. A sequence of mutations was probably required, a sequence that led to greater and greater mimicry of bird dung. The secret gradually became more secure.

The secret was a creative force not just for the body of the spider encrypting it, but also for the perceptual systems of the predators. Each time the encryption became more secure, some predators were fooled. Their perceptual systems failed to decrypt the secret. But others were not fooled, others with perceptual systems better-adapted for the decryption. They reaped a better chance to pass on their decryption technology to a new generation. Secrets are a creative force of perceptual evolution.

Who gets to keep secrets? Those who survive. As long as the spider gets to keep its secret, it avoids predation.

Who does not get to keep secrets? Those who are about to be eaten. Once a predator has decrypted the spider's secret, chances are that the spider will soon perish along with its secret.

The metaphor of encryption does not capture the full extent of the spider's deceit. The spider actually disseminates false information. The truth is: I am food. The information disseminated is: I am inedible dung. Disinformation is key to survival of the secret and the spider. Predators must see through the dung to find the spider.

Secrets also shape our governments. When governments in the information age keep secrets, employ encryption, and disseminate false information, they are following a pattern ubiquitous in nature for millions of years. This is, of course, no blanket endorsement of these governmental activities, any more than tracing the biological roots of cancer is an endorsement of cancer.

One reason to understand government secrets within a biological context is to remove the element of shock and surprise. Why waste time being shocked and surprised at revelations of governmental abuse of secrets? Instead such abuses should be expected. Those government officials and clerks who perpetrate them are, like the rest of us, the offspring of those who more successfully kept their secrets.

A second reason to understand government secrets in a biological context is to provide a reservoir of methods for understanding and countering abuses. What innovative methods do biological systems use to hide secrets and disseminate false information? What innovative methods do they use to break the secrets and see through the falsehoods?

Understanding government secrets in this way can provide more effective tools for citizens to counter abuses. It can also provide the government more effective ways to perpetrate abuse.


JENNIFER JACQUET
Post-doctoral researcher at the University of British Columbia

I believe (but cannot prove) that we will always live in a world with secrets. It seems that no sooner is there a tool for truth than it is equally possible to use it to deceive. After language evolved, in part to keep track of one another, humans harnessed its combinatorial possibilities to lie. The same is true for writing and now digital tools, such as photography, video, and the blogosphere.

The philosophical discussions around secrecy are likely similar to those around my current research interest: shame (secrets exposed could lead to shame). I believe shame can be more effective and more democratic if it is leveraged against institutions, rather than individuals. The same is likely true for efforts to expose secrets. However, if what an individual is doing relates to the public good, then even an individual's secrets should not be safe.

Another individual's right to privacy trumps my need for security if what that person is doing does not directly affect me or the public. But this principle is often at odds with commerce, since revealing good secrets means good money.

If there is a line for secrecy, it will be drawn between those who benefit from it and those who suffer on its account. Bankers should not be allowed to keep secrets.

I believe we will always live in a world with secrets. I would not want it any other way. Secrecy is one cost of imagination.


ROSS ANDERSON
FRS; Professor, Security Engineering, Cambridge Computer Laboratory; Researcher in Security Psychology

The economics of security and privacy have been hot research topics recently. I will use one fact from security economics, and another from security engineering, to answer Danny's question; I'll then ask another one.

One puzzle in security economics was how the world manages to function despite our reliance on computers and their terrible vulnerability. We are starting to realise that computer crime has split into quite separate mass markets and elite markets. In the first, PCs are compromised by automated attacks, such as malware loaded by porn sites, and sold for a few tens of cents each to spammers. In the second, smart attackers study a high-value target — a company CFO, or a diplomat — and send carefully-crafted emails that trick him into installing snooping software on his PC. These attacks don't cost tens of cents per machine, but thousands of dollars. The reason the world still works is that most of us are not worth the effort of a targeted attack.
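
To make the economics concrete, here is a back-of-the-envelope sketch in Python; the numbers are purely illustrative and are not figures from the text. The attacker's decision rule is simply that a target is worth attacking only when its expected value exceeds the cost of compromising it, which is why cheap automated attacks are aimed at everyone while expensive hand-crafted attacks are reserved for CFOs and diplomats.

def worth_attacking(cost_per_target, expected_value_per_target):
    # Attack only if the expected loot exceeds the cost of the compromise.
    return expected_value_per_target > cost_per_target

# Mass-market malware: pennies to compromise a PC, pennies of spam value gained.
print(worth_attacking(cost_per_target=0.30, expected_value_per_target=0.50))    # True
# A hand-crafted, targeted attack aimed at an ordinary user: thousands of
# dollars of effort, little to gain.
print(worth_attacking(cost_per_target=5000, expected_value_per_target=200))     # False
# The same effort aimed at a CFO or diplomat: a far larger prize.
print(worth_attacking(cost_per_target=5000, expected_value_per_target=250000))  # True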

On the engineering side, the critical fact is that security (and privacy) don't scale. A secret diplomatic telegram known to a million US government employees just isn't secret any more. And a national medical record system is no different. One was built in Scotland, making five million residents' records available to tens of thousands of doctors and nurses; in short order, the records of politicians, footballers and other celebrities were compromised. And a privacy-conscious celeb would not keep her money in a big money-center bank that lets 200,000 staff at 2,000 branches look up any customer's statement; a small private bank in Geneva is a much better bet. But that will never work for the masses; the average schoolteacher or bus driver is never going to pay $300 a month in account maintenance charges.

So this leads me to a prediction about privacy economics. At equilibrium the elite will use private banks, exclusive clinics and so on, while the mass will have no privacy — as Scott McNealy famously remarked. Most people won't care, as no-one is particularly interested in them. In fact if you're poor it can be an advantage to have no privacy; if a shop knows you're poor it'll offer you discounts it won't offer to the rich. It's the rich who have an incentive to be inconspicuous, so they don't get charged extra. So my answer to "When does my right to privacy trump your need for security?" is "When I am prepared to pay for it."

Is this all fine then? Not entirely. Let me give an example: a woman being assisted by an NGO I advise was tracked down by her ex-husband and seriously assaulted, after he found her new address from an aunt of his who worked at a hospital. The victim is now suing the hospital, and I hope she will eventually get justice. Where the poor do need privacy, it will have to come from laws rather than markets, and as we see a small number of people suffering serious harm, rather than many people being just annoyed, privacy law may be made by judges rather than legislators. The European Court of Human Rights found in I v Finland (2008) that Europeans have a right to restrict their medical records to the clinicians treating them.

So here is my own question. I wonder if a case like this one might persuade the US Supreme Court to draw a similar conclusion? Surely a woman's right to privacy over her own body includes the right not to be put in fear and peril of her life by the negligence of her doctor's computer supplier. A decision along these lines could annoy the computer industry just as much as Roe v Wade annoyed religious people, but it may just be the way forward.


LEE SMOLIN

Secrecy may sound like a bad thing, but let us not forget that the right to privacy is a necessary component of civil society. There are lots of things I don't want to know about my neighbors and colleagues, or have them know about me. A related issue is the vast amount of information that institutions such as our own workplaces keep about us that we are not allowed to see. I want to suggest three principles to address Danny's question of where we draw the line on information about individuals.

1) As a private individual I have a right to see all information about me that may be gathered by businesses, institutions and governments. These include medical records, work evaluations, letters of recommendation, records of my communications and travel, credit histories, financial information etc.

2) As a private individual, I cannot have access to information about other private individuals, without their explicit permission.

3) In some cases we may be asked to cede these rights in exchange for explicit privileges. In exchange for a license to practice medicine or engineering, or fly an airplane, I can waive my right to see confidential assessments of my work. In exchange for the privilege of traveling by airplane, I can be asked to cede my right to see information governments and the airline may keep about my travel history. When one applies for employment or entrance into university, one may be asked to cede one's right to read confidential evaluations about oneself.

Given this framework there remain issues about what is wise policy. Like many academics, I spend a lot of time writing and reading confidential letters of recommendation; such secret information is the core of our system of hiring and promotions. Is this really the wisest way to inform our bets on who is going to do important science? The claim is that the advice you get is more honest than it would be in a system of open assessments, but is this really the case? The information one wants is often in the confidential letters, but so is a lot of exaggeration, bias and sloppiness that would not survive some transparency. There is also the slowdown in the rate of progress due to the power differential which arises when older academics are confidentially evaluating the work of people significantly younger.

A system of open evaluations, in which candidates are allowed to see what is written about them, would take some getting used to, but it might lead to wiser decisions at less of a cost in time. I would also guess that a system of open evaluations would be weighted more favorably towards independent thinkers who do high-risk/high-payoff science than the present system, which, by its emphasis on confidential evaluations by senior scientists, is weighted towards scientists who follow well-established research programs.

So my answer to Danny is: err on the side of allowing people to see information stored about them, while keeping the right for individuals to keep secrets from each other.


JAMES O'DONNELL
Classicist; Provost, Georgetown University; Author, The Ruin of the Roman Empire

Secrecy is ignorance made useful, ignorance turned to advantage. The secret is something that one person or two persons or (by one count) the 854,000 people with top secret clearances in the U.S. Government can know and that is worth keeping somebody else in the dark about. It's only a secret when there's advantage to the ignorance. I know what I had for lunch today and you don't: it's not a secret. I know what I'm getting my dearly beloved for Christmas, she doesn't: that's a secret. That one's a benign secret and will bring us both pleasure. Not all secrets are benign.

So who gets to keep secrets? Whoever can get away with it and wants the advantage badly enough to exploit the ignorance of others. If in an information age, it gets harder to keep people in ignorance, then lots of secrets will get harder to keep. There will be fights over these things and a new normal will be achieved.

Secrets are ignorance crafted by artifice. They represent knowledge made into a tool for advantage. When one breaks (that is, when somebody learns the secret who isn't supposed to), power and advantage can shift suddenly and disproportionately. We relish the moments of secret-breaking because we like the spectacle of sudden reversal — like an intercepted pass in football.

The most interesting thing about secrets is the high moral dudgeon that attaches to keeping and leaking them. Edge readers should be looking at Jonathan Haidt and Simone Schnall on these pages to be reminded that morality isn't what we've always thought it was. Schnall would surmise that if you say to someone, "You're looking very professional and impressive today: can you keep a secret?", the secret will likelier get kept than if you say, "Dressed in the dark this morning and forgot to take your shower, eh? can you keep a secret?" The trouble with secret-keeping is really that we don't yet understand that side of ourselves very well, and so we are surprised and betrayed all the time.


MARC ROTENBERG
President and Executive Director of the Electronic Privacy Information Center (EPIC), Washington, DC; Co-author, Information Privacy Law

It is important to distinguish between personal privacy and government secrecy. In the first instance, we are considering a fundamental human right, in the second an instrumental technique that extends the power of the state.


ANDRIAN KREYE
Editor, the Feuilleton (Arts & Essays) Section, Sueddeutsche Zeitung, Munich/Germany

The European, and especially the German, suspicion of new media and the new forms of transparency they bring or demand is definitely rooted in the dictatorships of the 20th century. It was only 21 years ago that the one-party state of the GDR collapsed, and with it a secret police and a network of informants that permeated not only every corner of society but even circles of friends and families. All over Europe scandals and heartbreaks are still being discovered in the files of those dictatorships. Right now, for example, there is a fervent debate amongst the intellectuals of Romania that spills over to Germany, where many Romanians of German descent had settled during the reign of Ceaușescu.

A lot of what was collected by these secret services was banal and mundane, details of everyday life that were used to create profiles of the observed. That is why Europeans tend to react strongly to the collection of banal and mundane data by Google Street View or Facebook, even if you wouldn't classify them as secrets. Thus the amount of information kept by any state is definitely seen as a sign of the concentration of power.

In Germany the fear of new media goes even deeper. It goes hand in hand with a deep suspicion of science and pop culture. The rise of the Nazis and their genocidal ideology in German society was facilitated by the extensive use and abuse of new media, science, and pop culture. After World War 2 this resulted in a deep, almost dogmatic passion for highbrow culture in the German middle class, which also resulted in a complete lack of middlebrow culture. The younger generation now imports middlebrow culture mostly from the US, be it intelligent pop music such as that heard on American college radio, DVDs of TV series like "Mad Men" or "The Wire", or literature by writers like Jonathan Franzen or Bret Easton Ellis. The result is a culturally divided class system with a trashy pop market for the uneducated masses and an overly highbrow culture for the bourgeoisie and the elites.

In most countries the internet and all its cultural phenomena fall mostly into a middlebrow segment of culture. That adds to the aversion to digital culture. Since that middlebrow segment is non-existent in Germany, the tendency of the establishment is to see the internet as a purveyor of lowbrow trash undermining the core values of culture.


SIMONA MORINI
Philosopher; Dipartimento delle Arti e del Disegno Industriale, IUAV University Venice

Game theory and international relations have influenced each other almost since the publication in 1944 of Theory of Games and Economic Behaviour by von Neumann and Morgenstern. Since the first applications during the Cold War, both the international situation and game theory have significantly evolved. For example, equilibrium points, that is, "solutions" where each party has no interest in modifying its strategies, can be defined either by restrictions on the individuals' or parties' behaviour or as restrictions on the environment in which they interact. Also, information plays a crucial role in defining the end result of a game (understood as an interaction among individuals).
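
As a toy illustration of an equilibrium point, the short Python sketch below enumerates the pure-strategy equilibria of a two-player "conceal or reveal" game; the payoffs are invented for the example and are not taken from any real negotiation. An equilibrium is a pair of strategies from which neither party can gain by changing its own strategy alone.

# Payoffs are purely illustrative: payoffs[(row, col)] = (row player's payoff, column player's payoff).
strategies = ["conceal", "reveal"]
payoffs = {
    ("conceal", "conceal"): (2, 2),
    ("conceal", "reveal"):  (0, 3),
    ("reveal",  "conceal"): (3, 0),
    ("reveal",  "reveal"):  (1, 1),
}

def is_equilibrium(r, c):
    # Neither player should be able to improve by deviating unilaterally.
    row_payoff, col_payoff = payoffs[(r, c)]
    row_ok = all(payoffs[(alt, c)][0] <= row_payoff for alt in strategies)
    col_ok = all(payoffs[(r, alt)][1] <= col_payoff for alt in strategies)
    return row_ok and col_ok

for r in strategies:
    for c in strategies:
        if is_equilibrium(r, c):
            print("equilibrium point:", r, c)   # for these payoffs, only ("reveal", "reveal")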

From the agent's point of view, full transparency can be detrimental to the end result of the game or can completely alter its nature (as if we were playing poker while showing our cards to the other players). But in the case of "games" played by the representatives of democratic states or institutions (i.e., played with the citizens' money and on behalf of their safety or interests), it is clear that there can be no secrecy about the kind of game that is being played, the end of the game, and its rules. Julian Assange has been very clear in stating that secrecy is necessary to diplomacy and that he is just fighting against secrecy used to cover abuses, corruption, private interests, or human rights violations.

A decision-making process that doesn't offer justification for its choices, that infringes the norms it publicly defends, that benefits group interests instead of the interests of the community, or that tells lies about facts or its aims (for example, limiting freedom well beyond what is needed to assure security) is bound to raise questions about its legitimacy.

In such situations civil and political obedience (as defended by Julian Assange and, before him, by Daniel Ellsberg, who claimed to have obeyed the Constitution in delivering the Pentagon Papers) becomes "revolutionary" and can be labeled by some as "terrorism".

This perverse effect is not related to the content of the "leaks" or to their possible influence on the parties' behaviour (the accusation of endangering people), but to the fact that it questions the rules of the game, radically changing the environment in which decision making takes place, thus possibly eliminating the possibility of safely reaching those "equilibrium points" that violate the transparency required by democratic institutions; i.e., it weakens one of the main preconditions of (non-democratic) power.


GEORGE DYSON
Science Historian; Author, Darwin Among the Machines

It is important to make a distinction between classification and secrecy. Thanks to digital encryption, it is easier than ever to keep information genuinely secret — between those who hold the keys. Designating something as classified information, however, does not keep it secret; on the contrary, it specifies a class of people with whom it can be shared.
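
A minimal sketch of the distinction, assuming the third-party Python cryptography package and an invented message: material encrypted this way is genuinely secret to anyone who does not hold the key, whereas a classification marking merely labels who is allowed to see it.

from cryptography.fernet import Fernet

key = Fernet.generate_key()        # whoever holds this key can read the message
cipher = Fernet(key)

token = cipher.encrypt(b"contents of a diplomatic cable")   # safe to store or transmit openly
print(cipher.decrypt(token))       # only key-holders recover the plaintext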

The problem is that as more and more information has been classified, those classes have become very, very large. Apparently (although the amount of classified information is classified information) the United States Government now produces more classified information than unclassified information. Since no information can be useful unless it is shared, we have developed a vast and unwieldy apparatus for sharing classified information.

Any such system is bound to leak. There are many possible motivations for such leaks: money, national interest, commercial advantage, etc., and in most cases it is advantageous to those with access to a compromised system not to disclose this.

If, however, the motivation is to publicize the information, rather than take advantage of it, then it will become widely evident that there has been a leak. As any survey of the rich history of cryptography and cryptanalysis will show, the real damages are usually less the result of the system leaking secrets and more the result of believing (or hoping) that it won't.


DOUGLAS RUSHKOFF
Media analyst; Documentary filmmaker; Author, Program or Be Programmed

There's no such thing as a secret. There's just denial — agreement to pretend we don't know.

I concluded this after watching a performer a couple of years ago, a fellow who could tell when people were lying and when they were telling the truth. He did it like a parlor trick on stage, but his services have been used by governments, police, jury selectors, and so on. And pretty much every technique he was using to discern the truth was based on one cue or another that the liar was giving. What poker players call a "tell." And these "tells" are part of the overall communications matrix. They are part of the 93% of human communication that takes place non-verbally.

See, on some level, we are all telling the truth no matter what. Our husbands and wives know when we are lying to them, even if they haven't allowed what they know to be true to fully enter consciousness. Our lying is ultimately useless, except insofar as it serves as "manners." We pretend we don't know.

The net, by enabling faceless communication, has made it a bit easier to delay the process through which the truth inevitably rises to the surface. We think of it as an affront to our privacy and secrecy, but it has actually promoted it. The only difference now is that this privacy is itself apparent. And its violation is just as visible.

I used to say that the net was preparing us for a future when we are all connected, anyway. That we will eventually become a biologically networked organism, where we will all know each other's thoughts. That the net was a dry run, a practice session for this eventuality. Now, however, I think that's the way we have always been. The privacy we have experienced in our lives and world has been like those Japanese paper walls. We hear everything going on behind them — we simply act like we don't, and sometimes we believe it ourselves.


NOGA ARIKHA
Historian of ideas; Author, Passions and Tempers: A History of the Humours

A world in which there are no secrets would be scary, because the line between secrecy and privacy is hard to draw. Such a world might resemble a Big Brother house. But it is unlikely ever to exist. For a start, complete transparency does not exist within an individual life, let alone in between groups or nations. Institutions have secret rules even when their explicit laws are declared and obeyed, and it is considered a crime, indeed a theft, to get hold of a company's trade secrets without authorization. Creators do not usually divulge the secrets of their trade — how a chef makes a special dish, how a sculptor uses a resin, how a filmmaker uses a lens. Families have secrets that it would be abhorrent to display to others; friendships and romances are filled with treasured secrets that it would be unseemly to reveal out of place. In such cases secrets are kept for the purpose of preserving an entity's identity, history and coherence, just as, in Freudian terms, we keep secrets from ourselves in order to safeguard our psychic economy.

And so it is not necessarily a bad thing to keep a secret. One entrusts someone with a secret — and the ability to keep a secret is a mark of trustworthiness. Military, political, indeed diplomatic missions function on the basis of secrecy, where secrecy entails exclusive participation: if everyone knew all the communications that led to a decision, or the strategy that underlay political or military decisions, then there would be no more use for such missions, which function on the basis of exclusion. In fact very little can get done, or, arguably, communicated at all, without some sort of withholding of information from one set of people. There is nothing intrinsically wrong with respecting secrets; not to do so would allow one also not to respect privacy.

But one must determine who keeps what secrets in the name of what. Secrets become a problem when they (rather than the revelations they prevent) are out of place and harmful, means to a destructive end, or when the right to secrecy is abused in such a way as to allow for lying, manipulation, distortion of reported data, and so on.

Democracies are full of secrets. One difference between democracies and dictatorships is the kind and amount of divulgation that is allowed. We do not want crimes to be concealed, or bad faith to be hidden away. We do not want hypocrisy. We want to expose bad deeds. But to say, along with the Wikileaks founder, that we should live in a fully transparent world is to condone the pervasive use of closed circuit cameras and praise the day when they will be placed in the toilets of the world's parliaments. A world where everyone knows everything would be chaotic, and knowledge itself would cease to be communicable. The founder of Wikileaks, who is himself living in hiding — alone with the secrets of his motivations — has assumed that a world without secrets was both desirable and, thanks to his efforts, possible. He seems to believe that the alternative to the traditional secrecy of diplomacy is a world without secrets at all, single-handedly deciding that all diplomatic secrets and deeds should be made public — forgetting that diplomacy functions on the basis of secrets. Without the information both exposed and concealed by these secrets, however, one is not in a position to know what secret is worth revealing. It takes expertise to decide what needs to be kept secret, for the sake of security, for instance. The Wikileaks cables put together the opinions of individual diplomats with various state secrets, without any mission other than to unveil all, for the idealistic sake of unveiling all that pertains to America's foreign missions. It is fine to want to know what is going on behind political scenes, but there are more legitimate ways of finding out than by such secret and, in the end, clumsy stealth.

So how do we draw the line between secrecy and transparency? By realizing that this line is not a matter of principle, but can only be drawn ad hoc, since secrecy and transparency are equally necessary.


GEORGE CHURCH
Professor, Harvard University, Director, Personal Genome Project

Destigmatize

Privacy is a relatively new social phenomenon. When our ancestors died in the tiny village of their birth, lacking walls, and surrounded by relatives, there were no secrets.

Privacy vs security is a false dichotomy and the answer may not fall between those poles.

A third option is lowering the need for secrets. This wave seems to be growing in momentum. Over the past decades the number of people hiding their psychiatric status, sexual orientation, STDs, cancer, and salary has shrunk for a variety of reasons. Some of these topics seem much less prone to extortion or scandal than in the past.

People now share more because new technologies make data more accessible (2010 Google vs 1950 private eye) or make sharing more attractive (Facebook).

Technologies also make sharing more important, for example, discussing which new drugs to take for cancer, AIDS, or depression. In PatientsLikeMe and PersonalGenomes.org individuals share all of this and more to benefit people globally.

Secrecy is false security. The memory hole of Orwell's "Nineteen Eighty-Four", a purposefully disingenuous illusion of information security, is not so fictional (e.g. old personal web searches revived in crime investigations). Add to that mathematical tricks, human error and willful individuals and teams, and we see strong arguments against dark secrets with nowhere to hide.

Secrets are symptoms — demanding not a better bandage, but a treatment for the underlying disease. AIDS created an activism that led to open discussion and to less stigma associated with sexual preference. Even the apparent "essentiality" of secrets for police and war-fighters is symptomatic of a failure of diplomacy and a failure of technology policy to provide a decent life for all.

There may be a trend toward less violence with global increases in the standard of living or education — and as we learn more about the underlying biology of violence. What will we see as anachronistic as we look back from a few decades hence? Will it be the wimpiness of our security or the prevalence of our secrets?


GLORIA ORIGGI
Philosopher, Institut Nicod, CNRS, Paris; www.interdisciplines.org

Secrecy is the forbidden fruit: you want to know more even at the risk of losing the heavenly security of the Garden of Eden. Speech is power: some information is so potent that it could be dangerous. God created the universe with speech, and he put the forbidden tree there to remind his creatures that they could not get the overall picture, that some files remained classified.

In classical mythology, those who steal secrets from God are damned heroes, like Prometheus, who stole the secret of fire from Zeus. Being human is a damned heroic destiny: we are scavengers, scraping off layers of lies and prohibitions to reach bitter truths.

Truth is not just an epistemic commodity: it is a human value. It mixes the needs of sincerity, accuracy and honesty that are essential to trust each other, to feel that we belong to the same species, that we are playing the same game.

But secrecy is not a sacred value: it is perceived as an abuse of power. It may have rational motivations, it may be indispensable in order to keep order and peace, but the secret-keeper never has the part of the hero, apart from extreme cases when lying is a way of saving people against an oppressive power that wants to brutally extort information to act in an evil way.

State secrecy is not a clear principle: no constitution in the Western world endorses State secrecy as a legal or moral principle. It is an old privilege of sovereigns that has taken different shapes in political history. It goes from the British Majesties' privilege of Habeas Corpus, which overrules local authorities, to the Machiavellian precepts to the Prince, who must classify some information in order to succeed in governing the people. What is called Raison d'Etat is the privilege of the sovereign to act "out of law" for the State's interests. That is why it is so difficult these days to see State secrecy as legitimate, and to see those who violate it as traitors.

In our times, the first time the United States advocated exclusion of evidence in a trial based only on affidavit was in 1953, in the United States vs. Reynolds case, which involved the crash of a military plane whose mission had to be kept secret.

That is to say: it is difficult to have a spontaneous sympathy for the keepers of secrets, and the damned heroes à la Julian Assange have every chance of gaining popular consensus.

Also, we have come out of a decade in which truth-wars have been at the centre of the most difficult political choices, such as the Iraq invasion. For those who have studied the whole story, the balance between secrecy and security was really odd: the report from British Intelligence on which Colin Powell based his speech at the UN contained a major plagiarism from the journal Middle Eastern Studies. The subsequent British report had been "sexed up" in order to affirm that an Iraqi nuclear attack was possible in 45 minutes.

But what are the truths we value in the information society? Now that the Information Age is giving way to the Reputation Age, we want certified truths, attested by authoritative sources: we want the seal of quality that tells us where the truth comes from and which authority endorses it. Plain, factive truths, like plain facts, don't exist anymore: we trust a chain of production of truths, with its labels and legitimacies. The naked "truth" that leaks from unknown sources is unreadable; it is a noisy voice that we do not know what to do with. Yet the Wikileaks scandal comes from the fact that many newspapers have given credence to the source, thus showing that they endorse this chain of production. They have provided the reputation these naked truths needed.

We have to understand better how these chains of reputation of information are constructed and endorsed. We have to take the epistemic responsibility of asking ourselves why we trust a piece of news or an information provider. And perhaps, with the power of collaborative work on the Web, we can contribute by giving the appropriate labels to the information we are able to control, thus contributing to the damned human enterprise of unveiling the forbidden truths.


CLAY SHIRKY
Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Cognitive Surplus

"When does my right to privacy trump your need for security?" seems analogous to the question "What is fair use?", to which the answer is "Whatever activities don't get people convicted for violating fair use." Put another way, fair use is a legal defense rather than a recipe.

Like fair use, the tension between a right to privacy and a right to know isn't one-dimensional — there are many independent variables involved. For instance, does the matter at hand involve elected or appointed officials? Working in the public's name? The presumption should run to public review.

Does it involve private citizens? Would its release subject someone to harassment or other forms of attack? The presumption should run to privacy. And so on.

There is a set of clear cases. Congress should debate matters openly. The President should have a private channel with which to communicate with other world leaders. Restaurants must publish the results of their latest health inspection. You shouldn't have your lifetime browsing history published by Google. These are not complicated cases.

The complicated cases are where our deeply held beliefs clash. Should all diplomats have the right to have all their communications classified as secret? Should all private individuals be asked, one at a time, how they feel about having their house photographed by Google?

I do not know the answer to this class of questions. You do not know the answer to this class of questions. This is not because you or I don't have strong opinions about them; it is because our opinions differ significantly from the opinions of our fellow citizens, and putting just one of us in charge would be bad for democracy.

So my answer to "When does my right to privacy trump your need for security?" is whenever various democratic processes say it does. If I publish something you wanted private and you can't successfully sue me, I was in the right, and when not, then not.

The faith here has to be in a heuristic, not an algorithm, because the values that come into conflict, and the depth of our intuitions about those values, prevent there being any simple answer. The hardest questions in mediating our lives together have to be adjudicated using lawmakers and the courts and executive choice about what to enforce and how.

The only thing that could go really, terribly wrong right now would be short-circuiting that process.


AALAM WASSEF
Visual artist based in Cairo and Paris; Founder, PeerEvaluation.org

Cablegate Is No Watergate

Washington, 1972. Two inquisitive journalists, an informant, dozens of supporting sources, hundreds of physical documents, and an editorial board waking up to the word cor-ro-bo-ra-te! That was Watergate.

Fast forward to Cablegate, the Web, November 2010, and be warned that any resemblance to journalism as you knew it is purely coincidental.

In an interview with the Belfast Telegraph, Julian Assange, WikiLeaks's founder, explains: "We don't verify our sources, we verify the documents. As long as they are bona fide it doesn't matter where they come from." And he concludes: "We would rather not know."

The WikiLeaks Cablegate trove, as it is referred to in the press, seems to have disintegrated journalistic common practice and routines. Verifying your sources and then protecting their identity has shifted to not knowing your sources and digitally encrypting their identities.

Julian Assange seems to believe that truth and good faith are to be found within the documents: "As long as they [the documents] are bona fide it doesn't matter where they come from". Challenging thought, considering that forging digital text is easier than forging your own mother's signature.

Responding to concerned New York Times readers, Bill Keller, executive editor, explains: "[…] the format is familiar from embassy cables we have seen from other sources." He adds: "No official has questioned the genuineness of the material, or suggested that they have been manipulated in any way."

According to Keller, graphic design on one hand, and silence on the other, prove the authenticity of information.

The Times's safeguards seem too weak compared to the many risks potentially involved, and yet, for a reason we wish to understand, five world-famous newspapers have endorsed this material, namely the New York Times, The Guardian, El País, Der Spiegel and Le Monde. Why? On what grounds? The previous grounds — source verification, corroboration, material proof — seem to be definitely out of fashion.

When so much opacity, encryption and silence are involved, how exactly does WikiLeaks gain a newspaper's trust and, consequently, how does a newspaper gain its readers' trust? Answering the second question is an easy call: we trust the Times, period.

As for the first question, someone should step forward and provide us with a proper explanation.

While claiming that violating secrets in the name of greater transparency is good and informative, one should be reminded that those secrets are in the very hands of men and women, institutions and administrations that have been entrusted and elected by the people.

In that respect, Wikileaks's opacities can in no way be compared to US diplomatic secrecy. WikiLeaks is unelected, unmonitored and unregulated by the people or any of its representatives.

Cablegate raises, again, two issues that challenge the "Wiki" world and the Web in general: Internet popularity versus authority, and Internet popularity versus quality.

Wikipedia is impressive and popular, but can we trust it? Is it an authoritative source on the information it provides? WikiLeaks is powerful and is potentially an amazing counter-power ("contre pouvoir"), but has it been granted the authority — by any international organization — to handle material that involves the public good on a global scale?

Popularity of online content grants it high rankings in search engines such as Google. The first results are not necessarily the best or the most accurate; they are the ones getting the highest number of hits.

On its Twitter page, WikiLeaks boasted on December 4th, 2010 that, according to Google, it was twice as well known as Wikipedia.

This quantitative metric alone might well explain the global media's interest in WikiLeaks, considering it — just as search engines would — worthy of attention and authoritative by popularity.

Everything would then fall into place. Wikileaks is popular, meaning it is authoritative. According to the same logic, one should consider WikiLeaks as an acclaimed and legitimate representative of those who searched it, clicked it and made it more famous than Wikipedia.

"One click, one vote" says Google, making its ranking algorithm sound like the very essence of democracy.

The prospect of knowledge and information becoming authoritative by popularity is a rather disturbing one. If the production processes of one and the other started mimicking "bottom-up" marketing strategies, wouldn't the end product merely reflect the crowd's "likes" and expectations, regardless of facts, regardless of what is right and regardless of the truth?




December 10, 2010

"I always come back to Edge. In the world of Anglo-Saxon ideas (that still prevail throughout the whole world, or among the elite of the world), there is no smarter guide."

FRONTEIRA
By Hermano Vianna

When I received the invitation to write here, there was the question of whether the new columns would have names different from those of their authors. I was thinking about some possibilities. The first idea was to be a "name dropper," the English term for those in the habit of naming names of important people to impress listeners. I even thought about beginning all the texts with some name and gradually forming an idiosyncratic biographical catalogue, which could be useful for adventurous spirits.

The fact that I had not found a good ironic translation for that English expression made me give up the game in the end. So I thought about the title "Frontier". In the background, still thinking in English, I moved from "the border" toward "edge". The columns would deal only with cultural production that crosses the limits of the commonplace, transforming the world or inventing new ways to think about life. My inspiration came from a number of different things, such as "Close to the Edge" or Brian Eno's Edge feature "A Big Theory Of Culture". But mostly I wanted to emulate, in an absurdly individual and uselessly pretentious way, the site http://www.edge.org/.

I tracked the trajectory of John Brockman, the man who founded Edge before the Web existed. I bought the first book in his series "The Reality Club" at the time of its launch in 1990. I was impressed by such an interesting gathering of thinkers from different areas, such as the philosopher Daniel Dennett, the biologist Lynn Margulis, and the psychologist Mihaly Csikszentmihalyi. I learned that what was published there was only a sample of a much greater diversity. The Reality Club's monthly "invitation only" meetings in New York — which began in 1981 — brought together a fascinating group, ranging from the physicist Freeman Dyson to the theater director Richard Foreman, almost all of my idols. The motto of the club was ambitious: "To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together and have them ask each other the questions they are asking themselves."

Today, the meeting room has become the website Edge. The transformation has not exactly been democratizing. The club remains as elitist (not a criticism, an observation) as before, maybe even more so, since its members have become celebrities (a sign of the times: today scientists can be more pop than Mick Jagger) and many of them are incredibly rich. It is not an open site where anyone can contribute; it remains invitation-only and editorially driven. The difference: the general reader can now monitor the selected conversation almost in real time, after a light filter. Brockman still decides who may speak at the forum. Currently he is one of the most powerful literary agents in the world (specializing mainly in science books), managing to convince the major publishing houses to pay millions in advances to his clients. (One of the legends surrounding his working method is that if a book begins to earn royalties, he says he has failed — because he did not get a large enough advance from the publisher.) Brockman is the agent of Richard Dawkins, Jared Diamond, Martin Rees and others of the same caliber.

"An invitation to the 2010 dinner was not easy to come by as the figures who were present were the owners of Google, Twitter, Huffington Post, Bill Gates, Benoit Mandelbrot (fractals), Craig Venter (Human Genome Project). Do I need to drop more names? A bomb at dinner and we would lose much of a certain creative intelligence that drives our world and our future, or the future that these people have created for all of us. The nerd on the edge has now became the center of power."

The site has several sections. One of them, a sort of "lifestyles of the rich and famous" — of the people Edge considers the most interesting and intelligent in the world — is an album of photos of an annual event hosted by Brockman, originally named "The Millionaires' Dinner" and later upgraded to "The Billionaires' Dinner." An invitation to the 2010 dinner was not easy to come by, as the guests included the owners of Google, Twitter and the Huffington Post, as well as Bill Gates, Benoit Mandelbrot (fractals) and Craig Venter (Human Genome Project). Do I need to drop more names? A bomb at that dinner and we would lose much of the creative intelligence that drives our world and our future, or the future that these people have created for all of us. The nerd on the edge has now become the center of power.

Another very popular section is the Edge Annual Question. Every year a new question is asked. In November, Richard H. Thaler, the father of "behavioral economics" (the hottest area in economic studies), asked the following question: "Can you name your favorite examples of wrong scientific belief that were held for long periods of time?" So far 65 responses have been received, authored by, among others, the physicist Lee Smolin and the artist Matthew Ritchie. This week a special question was published. The inquisitor is Danny Hillis, a pioneer in supercomputing, who — under the impact of WikiLeaks — wants to know whether we can, or whether we must, keep secrets in the information age.

But this is the festive side of Edge. What makes my neurons burn are the regular features, which are frequently brilliant texts, such as the most recent: "Metaphors, Models and Theories" by Emanuel Derman, one of those physicists who, in recent decades, have left the university to try to discover the laws of financial markets. (I will go deeper into this subject in a future column.) And this is why I always come back to Edge. In the world of Anglo-Saxon ideas (that still prevail throughout the whole world, or among the elite of the world), there is no smarter guide.

___

Hermano Vianna is a Brazilian anthropologist and writer who currently works in television. The original Portuguese-language column, published behind O Globo's subscription paywall, is available, with an introduction, on Hermano Vianna's blog.




December 9, 2010

PRIVACY, INC.

Edited by Declan McCullagh

Edge.org has a solid collection of essays addressing these questions: "When does my right to privacy trump your need for security? Should a democratic government be allowed to practice secret diplomacy? Would we rather live in a world with guaranteed privacy or a world in which there are no secrets? If the answer is somewhere in between, how do we draw the line?"


John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

contact: [email protected]
Copyright © 2010 By Edge Foundation, Inc
All Rights Reserved.
