The idea that computers are people has a long and storied history. It goes back to the very origins of computing, and even earlier. There has always been a question about whether a program is something alive or not, since it intrinsically has some kind of autonomy at the very least, or it wouldn't be a program. There has been a domineering subculture—the most wealthy, prolific, and influential subculture in the technical world—that for a long time has promoted not only the idea that there's an equivalence between algorithms and life, and between certain algorithms and people, but also a historical determinism: that we're inevitably making computers that will be smarter and better than us and will take over from us. ...That mythology, in turn, has spurred a reactionary, perpetual spasm from people who are horrified by what they hear. You'll have a figure say, "The computers will take over the Earth, but that's a good thing, because people had their chance and now we should give it to the machines." Then you'll have other people say, "Oh, that's horrible, we must stop these computers." Most recently, some of the most beloved and respected figures in the tech and science world, including Stephen Hawking and Elon Musk, have taken that position of: "Oh my God, these things are an existential threat. They must be stopped."
In the history of organized religion, it's often been the case that people have been disempowered precisely to serve what was perceived to be the needs of some deity or another, when in fact what they were doing was supporting an elite class that was the priesthood for that deity. ... That looks an awful lot like the new digital economy to me, where you have (natural language) translators and everybody else who contributes to the corpora that allow the data schemes to operate, contributing to the fortunes of whoever runs the computers. You're saying, "Well, but they're helping the AI. It's not us, they're helping the AI." It reminds me of somebody saying, "Oh, build these pyramids, it's in the service of this deity," when, on the ground, it's in the service of an elite. It's an economic effect of the new idea. The new religious idea of AI is a lot like the economic effect of the old idea, religion.
JARON LANIER is a Computer Scientist; Musician; Author of Who Owns the Future?
KEVIN KELLY is Senior Maverick at Wired magazine. He helped launch Wired in 1993 and served as its Executive Editor until January 1999. He is currently editor and publisher of the popular Cool Tools, True Films, and Street Use websites. His most recent books are Cool Tools and What Technology Wants.
by John Brockman
A few weeks ago David Carr profiled Kevin Kelly on page 1 of the New York Times Business section. He wrote that Kelly's pronouncements were "often both grandiose and correct." That’s a pretty good summary of Kevin Kelly's style and his prescience.
For the thirty years I've known him, Kelly has been making bold declarations about the world we are crafting with new technologies. He first began to attract notice when he helped found Wired as its first executive editor. "The culture of technology," he notes, "was the prime beat of Wired. When we started the magazine 20 years ago, we had no intention to write about hardware—bits and bauds. We wrote about the consequences of new inventions and the meaning of new stuff in our lives. At first, few believed us, and many dismissed my claim that technology would become the central driver of our culture. Now everyone sees this centrality, but some are worried this means the end of civilization."
The biggest change in our lives is the rate of change, and it's interesting to note that this week marks the 10th anniversary of the 2004 founding of Facebook. (Twitter, founded in 2006, was still two years away.) If you got your news electronically at that time, it was most likely on a BlackBerry pager. Nobody was talking about "sharing".
Kelly is well aware that his complete embrace of what he calls "The Technium" is a lightning rod for criticism. But he points out that "we are still at the beginning of the beginning. We have just started to make a technological society. The technological changes in the next 20 years will dwarf those of the last 20 years. It will almost be like nothing at all has happened yet."
In the meantime Kelly is doing what he's been up to for decades, acting as a sensing and ruddering mechanism for the rest of us, finding his way through this new landscape.
Thinking beyond myself, beyond our individual silos: achieving prosperity and development in a place like Sierra Leone does not involve giving free devices to victims, which leads to low self-efficacy and dependence on external actors; we need to make new minds. That involves giving young people the platform to innovate, to learn from making, and to solve very tangible problems within their communities.
DAVID MOININA SENGEH is a doctoral student at the MIT Media Lab, and a researcher in the Lab’s Biomechatronics group.
As all the people and computers on our planet get more and more closely connected, it's becoming increasingly useful to think of all the people and computers on the planet as a kind of global brain.
THOMAS W. MALONE is the Patrick J. McGovern Professor of Management at the MIT Sloan School of Management and the founding director of the MIT Center for Collective Intelligence. He was also the founding director of the MIT Center for Coordination Science and one of the two founding co-directors of the MIT Initiative on "Inventing the Organizations of the 21st Century".
"If we're going to get science policy right, it's really important for us to study the economic benefit of open access and not accept the arguments of incumbents. Existing media companies claim that they need ever stronger and longer copyright protection and new, draconian laws to protect them, and meanwhile, new free ecosystems, like the Web, have actually led to enormous wealth creation and enormous new opportunities for social value. And yes, they did in fact lead in some cases to the destruction of incumbents, but that's the kind of creative destruction that we should celebrate in the economy. We have to accept that, particularly in the area of science, there's an incredible opportunity for open access to enable new business models."
"One question that fascinated me in the last two years is, can we ever use data to control systems? Could we go as far as, not only describe and quantify and mathematically formulate and perhaps predict the behavior of a system, but could you use this knowledge to be able to control a complex system, to control a social system, to control an economic system?"
"With Big Data we can now begin to actually look at the details of social interaction and how those play out, and are no longer limited to averages like market indices or election results. This is an astounding change. The ability to see the details of the market, of political revolutions, and to be able to predict and control them is definitely a case of Promethean fire—it could be used for good or for ill, and so Big data brings us to interesting times. We're going to end up reinventing what it means to have a human society."
"Today, what you want is you want to have resilience and agility, and you want to be able to participate in, and interact with the disruptive things. Everybody loves the word 'disruptive innovation.' Well, how does, and where does disruptive innovation happen? It doesn't happen in the big planned R&D labs; it happens on the edges of the network. Most important ideas, especially in the consumer Internet space, but more and more now in other things like hardware and biotech, you're finding it happening around the edges."
"There's a new kind of socio-inspired technology coming up, now. Society has many wonderful self-organization mechanisms that we can learn from, such as trust, reputation, culture. If we can learn how to implement that in our technological system, that is worth a lot of money; billions of dollars, actually. We think this is the next step after bio-inspired technology."
"We're missing a tremendous opportunity. We're asleep at the switch because it's not a metaphor. In 1945 we actually did create a new universe. This is a universe of numbers with a life of their own, that we only see in terms of what those numbers can do for us. Can they record this interview? Can they play our music? Can they order our books on Amazon? If you cross the mirror in the other direction, there really is a universe of self-reproducing digital code. When I last checked, it was growing by five trillion bits per second. And that's not just a metaphor for something else. It actually is. It's a physical reality."