This can't be the end of human evolution. We have to go someplace else.
It's quite remarkable. It's moved people off of personal computers. Microsoft's business, while it's a huge monopoly, has stopped growing. There was this platform change. I'm fascinated to see what the next platform is going to be. It's totally up in the air, and I think that some form of augmented reality is possible and real. Is it going to be a science-fiction utopia or a science-fiction nightmare? It's going to be a little bit of both.
JOHN MARKOFF is a Pulitzer Prize-winning journalist who covers science and technology for The New York Times. His most recent book is the forthcoming Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots.
...Today, you can send a design to a fab lab, and you need ten different machines to turn the data into something. Twenty years from now, all of that will be in one machine that fits in your pocket. This is the sense in which it doesn't matter. You can do it today. How it works today isn't how it's going to work in the future, but you don't need to wait twenty years for it. Anybody can make almost anything almost anywhere.
...Finally, when I could own all these machines I got that the Renaissance was when the liberal arts emerged—liberal for liberation, humanism, the trivium and the quadrivium—and those were a path to liberation, they were the means of expression. That's the moment when art diverged from artisans. And there were the illiberal arts that were for commercial gain. ... We've been living with this notion that making stuff is an illiberal art for commercial gain and it's not part of means of expression. But, in fact, today, 3D printing, micromachining, and microcontroller programming are as expressive as painting paintings or writing sonnets but they're not means of expression from the Renaissance. We can finally fix that boundary between art and artisans.
...I'm happy to take credit for saying computer science is one of the worst things to happen to computers or to science because, unlike physics, it has arbitrarily segregated the notion that computing happens in an alien world.
NEIL GERSHENFELD is a Physicist and the Director of MIT's Center for Bits and Atoms. He is the author of FAB.
What interests me is how bits and atoms relate—the boundary between digital and physical. Scientifically, it's the most exciting thing I know. It has all sorts of implications that are widely covered almost exactly backwards. Playing it out, what I thought was hard technically is proving to be pretty easy. What I didn't think was hard was the implications for the world, so a bigger piece of what I do now is that. Let's start with digital.
Digital is everywhere; digital is everything. There's a lot of hubbub about what's the next MIT, what's the next Silicon Valley, and those were all the last war. Technology is leading to very different answers. To explain that, let's go back to the science underneath it and then look at what it leads to.
The idea that computers are people has a long and storied history. It goes back to the very origins of computers, and even from before. There's always been a question about whether a program is something alive or not since it intrinsically has some kind of autonomy at the very least, or it wouldn't be a program. There has been a domineering subculture—that's been the most wealthy, prolific, and influential subculture in the technical world—that for a long time has not only promoted the idea that there's an equivalence between algorithms and life, and certain algorithms and people, but a historical determinism that we're inevitably making computers that will be smarter and better than us and will take over from us. ...That mythology, in turn, has spurred a reactionary, perpetual spasm from people who are horrified by what they hear. You'll have a figure say, "The computers will take over the Earth, but that's a good thing, because people had their chance and now we should give it to the machines." Then you'll have other people say, "Oh, that's horrible, we must stop these computers." Most recently, some of the most beloved and respected figures in the tech and science world, including Stephen Hawking and Elon Musk, have taken that position of: "Oh my God, these things are an existential threat. They must be stopped."
In the history of organized religion, it's often been the case that people have been disempowered precisely to serve what was perceived to be the needs of some deity or another, where in fact what they were doing was supporting an elite class that was the priesthood for that deity. ... That looks an awful lot like the new digital economy to me, where you have (natural language) translators and everybody else who contributes to the corpora that allows the data schemes to operate, contributing to the fortunes of whoever runs the computers. You're saying, "Well, but they're helping the AI, it's not us, they're helping the AI." It reminds me of somebody saying, "Oh, build these pyramids, it's in the service of this deity," and, on the ground, it's in the service of an elite. It's an economic effect of the new idea. The new religious idea of AI is a lot like the economic effect of the old idea, religion.
JARON LANIER is a Computer Scientist; Musician; Author of Who Owns the Future?
KEVIN KELLY is Senior Maverick at Wired magazine. He helped launch Wired in 1993, and served as its Executive Editor until January 1999. He is currently editor and publisher of the popular Cool Tools, True Film, and Street Use websites. His most recent books are Cool Tools, and What Technology Wants.
by John Brockman
A few weeks ago David Carr profiled Kevin Kelly on page 1 of the New York Times Business section. He wrote that Kelly's pronouncements were "often both grandiose and correct." That’s a pretty good summary of Kevin Kelly's style and his prescience.
For the thirty years I've known him, Kelly has been making bold declarations about the world we are crafting with new technologies. He first began to attract notice when he helped found Wired as its first executive editor. "The culture of technology," he notes, "was the prime beat of Wired. When we started the magazine 20 years ago, we had no intention of writing about hardware—bits and bauds. We wrote about the consequences of new inventions and the meaning of new stuff in our lives. At first, few believed us, and many dismissed my claim that technology would become the central driver of our culture. Now everyone sees this centrality, but some are worried this means the end of civilization."
The biggest change in our lives is the rate of change, and it's interesting to note that this week marks the 10th anniversary of the 2004 founding of Facebook. (Twitter, founded in 2006, was still two years away.) If you got your news electronically at that time, it was most likely on a BlackBerry pager. Nobody was talking about "sharing".
Kelly is well aware that his complete embrace of what he calls "The Technium" is a lightning rod for criticism. But, he points out, "we are still at the beginning of the beginning. We have just started to make a technological society. The technological changes in the next 20 years will dwarf those of the last 20 years. It will almost be like nothing at all has happened yet."
In the meantime, Kelly is doing what he's been up to for decades: acting as a sensing and ruddering mechanism for the rest of us, finding his way through this new landscape.
I think beyond me, beyond our individual silos. Achieving prosperity and development in a place like Sierra Leone does not involve giving free devices to victims, which leads to low self-efficacy and dependence on external actors; we need to make new minds. That involves giving young people the platform to innovate, to learn from making, and to solve very tangible problems within their communities.
DAVID MOININA SENGEH is a doctoral student at the MIT Media Lab, and a researcher in the Lab’s Biomechatronics group.
As all the people and computers on our planet get more and more closely connected, it's becoming increasingly useful to think of all the people and computers on the planet as a kind of global brain.
THOMAS W. MALONE is the Patrick J. McGovern Professor of Management at the MIT Sloan School of Management and the founding director of the MIT Center for Collective Intelligence. He was also the founding director of the MIT Center for Coordination Science and one of the two founding co-directors of the MIT Initiative on "Inventing the Organizations of the 21st Century".
"If we're going to get science policy right, it's really important for us to study the economic benefit of open access and not accept the arguments of incumbents. Existing media companies claim that they need ever stronger and longer copyright protection and new, draconian laws to protect them, and meanwhile, new free ecosystems, like the Web, have actually led to enormous wealth creation and enormous new opportunities for social value. And yes, they did in fact lead in some cases to the destruction of incumbents, but that's the kind of creative destruction that we should celebrate in the economy. We have to accept that, particularly in the area of science, there's an incredible opportunity for open access to enable new business models."
"One question that fascinated me in the last two years is, can we ever use data to control systems? Could we go as far as, not only describe and quantify and mathematically formulate and perhaps predict the behavior of a system, but could you use this knowledge to be able to control a complex system, to control a social system, to control an economic system?"
"With Big Data we can now begin to actually look at the details of social interaction and how those play out, and are no longer limited to averages like market indices or election results. This is an astounding change. The ability to see the details of the market, of political revolutions, and to be able to predict and control them is definitely a case of Promethean fire—it could be used for good or for ill, and so Big data brings us to interesting times. We're going to end up reinventing what it means to have a human society."
"Today, what you want is you want to have resilience and agility, and you want to be able to participate in, and interact with the disruptive things. Everybody loves the word 'disruptive innovation.' Well, how does, and where does disruptive innovation happen? It doesn't happen in the big planned R&D labs; it happens on the edges of the network. Most important ideas, especially in the consumer Internet space, but more and more now in other things like hardware and biotech, you're finding it happening around the edges."