Edge in the News: 2010

The New York Times [1.18.10]

Today’s idea: Filtering, not remembering, is the most important mental skill in the digital age, an essay says. But this discipline will prove no mean feat, since mental focus must take place amid the unlimited distractions of the Internet.

Internet | Edge, the high-minded ideas and tech site, has posed its annual question for 2010 — “How is the Internet changing the way you think?” — and gotten some interesting responses from a slew of smart people. They range from the technology analyst Nicholas Carr, who wonders if the Web made it impossible for us to read long pieces of writing; to Clay Shirky, social software guru, who sees the Web poised uncertainly between immature “Invisible High School” and more laudable “Invisible College.”

David Dalrymple, a researcher at the Massachusetts Institute of Technology, thinks human memory will no longer be the key repository of knowledge, and focus will supersede erudition. Quote:

Before the Internet, most professional occupations required a large body of knowledge, accumulated over years or even decades of experience. But now, anyone with good critical thinking skills and the ability to focus on the important information can retrieve it on demand from the Internet, rather than her own memory. On the other hand, those with wandering minds, who might once have been able to focus by isolating themselves with their work, now often cannot work without the Internet, which simultaneously furnishes a panoply of unrelated information — whether about their friends’ doings, celebrity news, limericks, or millions of other sources of distraction. The bottom line is that how well an employee can focus might now be more important than how knowledgeable he is. Knowledge was once an internal property of a person, and focus on the task at hand could be imposed externally, but with the Internet, knowledge can be supplied externally, but focus must be forced internally. [Edge via The Daily Dish]

http://ideas.blogs.nytimes.com/2010/01/19/the-age-of-external-knowledge/ [1.17.10]

FIELDS: I compute, therefore I am
Washington Times [1.14.10]

The New York Times [1.13.10]

In 2006, the artist and computer scientist Jaron Lanier published an incisive, groundbreaking and highly controversial essay about “digital Maoism” — about the downside of online collectivism, and the enshrinement by Web 2.0 enthusiasts of the “wisdom of the crowd.” In that manifesto Mr. Lanier argued that design (or ratification) by committee often does not result in the best product, and that the new collectivist ethos — embodied by everything from Wikipedia to “American Idol” to Google searches — diminishes the importance and uniqueness of the individual voice, and that the “hive mind” can easily lead to mob rule.

YOU ARE NOT A GADGET: A Manifesto
By Jaron Lanier
209 pages. Alfred A. Knopf. $24.95.

Now, in his impassioned new book “You Are Not a Gadget,” Mr. Lanier expands this thesis further, looking at the implications that digital Maoism or “cybernetic totalism” have for our society at large. Although some of his suggestions for addressing these problems wander into technical thickets the lay reader will find difficult to follow, the bulk of the book is lucid, powerful and persuasive. It is necessary reading for anyone interested in how the Web and the software we use every day are reshaping culture and the marketplace.

Mr. Lanier, a pioneer in the development of virtual reality and a Silicon Valley veteran, is hardly a Luddite, as some of his critics have suggested. Rather he is a digital-world insider who wants to make the case for “a new digital humanism” before software engineers’ design decisions, which he says fundamentally shape users’ behavior, become “frozen into place by a process known as lock-in.” Just as decisions about the dimensions of railroad tracks determined the size and velocity of trains for decades to come, he argues, so choices made about software design now may yield “defining, unchangeable rules” for generations to come.

Decisions made in the formative years of computer networking, for instance, promoted online anonymity, and over the years, as millions upon millions of people began using the Web, Mr. Lanier says, anonymity has helped enable the dark side of human nature. Nasty, anonymous attacks on individuals and institutions have flourished, and what Mr. Lanier calls a “culture of sadism” has gone mainstream. In some countries anonymity and mob behavior have resulted in actual witch hunts. “In 2007,” Mr. Lanier reports, “a series of ‘Scarlet Letter’ postings in China incited online throngs to hunt down accused adulterers. In 2008, the focus shifted to Tibet sympathizers.”

Mr. Lanier sensibly notes that the “wisdom of crowds” is a tool that should be used selectively, not glorified for its own sake. Of Wikipedia he writes that “it’s great that we now enjoy a cooperative pop culture concordance” but argues that the site’s ethos ratifies the notion that the individual voice — even the voice of an expert — is eminently dispensable, and “the idea that the collective is closer to the truth.” He complains that Wikipedia suppresses the sound of individual voices, and similarly contends that the rigid format of Facebook turns individuals into “multiple-choice identities.”

Like Andrew Keen in “The Cult of the Amateur,” Mr. Lanier is most eloquent on how intellectual property is threatened by the economics of free Internet content, crowd dynamics and the popularity of aggregator sites. “An impenetrable tone deafness rules Silicon Valley when it comes to the idea of authorship,” he writes, recalling the Wired editor Kevin Kelly’s 2006 prediction that the mass scanning of books would one day create a universal library in which no book would be an island — in effect, one humongous text, made searchable and remixable on the Web.

“It might start to happen in the next decade or so,” Mr. Lanier writes. “Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what’s important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don’t know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video.”

While this development might sound like a good thing for consumers — so much free stuff! — it makes it difficult for people to discern the source, point of view and spin factor of any particular fragment they happen across on the Web, while at the same time encouraging content producers, in Mr. Lanier’s words, “to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind.” A few lucky people, he notes, can benefit from the configuration of the new system, spinning their lives into “still-novel marketing” narratives, as in the case, say, of Diablo Cody, “who worked as a stripper, can blog and receive enough attention to get a book contract, and then have the opportunity to have her script made into a movie — in this case, the widely acclaimed ‘Juno.’ ” He fears, however, that “the vast majority of journalists, musicians, artists and filmmakers” are “staring into career oblivion because of our failed digital idealism.”

Paradoxically enough, the same old media that is being destroyed by the Net drives an astonishing amount of online chatter. “Comments about TV shows, major movies, commercial music releases, and video games must be responsible for almost as much bit traffic as porn,” Mr. Lanier observes. “There is certainly nothing wrong with that, but since the Web is killing the old media, we face a situation in which culture is effectively eating its own seed stock.”

In other passages in this provocative and sure-to-be-controversial book he goes even further, suggesting that “pop culture has entered into a nostalgic malaise,” that “online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media.”

Online culture, he goes on, “is a culture of reaction without action” and rationalizations that “we were entering a transitional lull before a creative storm” are just that — rationalizations. “The sad truth,” he concludes, “is that we were not passing through a momentary lull before a storm. We had instead entered a persistent somnolence, and I have come to believe that we will only escape it when we kill the hive.”

ATLANTIC WIRE [1.13.10]

Edge is an organization of deep, visionary thinkers on science and culture. Each year the group poses a question, this year collecting 168 essay responses to the question, "How is the Internet changing the way you think?" 

In answer, academics, scientists and philosophers responded with musings on the Internet enabling telecommunication, or functioning as a sort of prosthesis, or robbing us of our old, "linear" mode of thinking. Actor Alan Alda described the Web as "speed plus mobs." Responses alternate between the quirky and the profound ("In this future, knowledge will be fully outside the individual, focus will be fully inside, and everybody's selves will truly be spread everywhere.")

Since it takes a while to read the entire collection--and the Atlantic Wire should know, as we tried--here are some of the more piquant answers. Visit the Edge website for the full experience. For a smart, funny answer in video form, see here.

  • We Haven't Changed, declares Harvard physician and sociologist Nicholas Christakis. Our brains "likely evolved ... in response to the demands of social (rather than environmental) complexity," and would likely only continue to evolve as our social framework changes. Our social framework has not changed: from our family units to our military units, he points out, our social structures remain fairly similar to what they were over 1000 years ago. "The Internet itself is not changing the fundamental reality of my thinking any more than it is changing our fundamental proclivity to violence or our innate capacity for love."
  • Bordering on Mental Illness Barry C. Smith of the University of London writes of the new importance of "well-packaged information." He says he is personally "exhilarated by the dizzying effort to make connections and integrate information. Learning is faster. Though the tendency to forge connecting themes can feel dangerously close to the search for patterns that overtakes the mentally ill."
  • New 'Survival of the Focused' Stanford psychologist Brian Knutson thinks the Internet may bias us towards our "present" selves rather than "future" selves, leading to procrastination: "I worry that the Internet may impose a 'survival of the focused,' in which individuals gifted with some natural capacity to stay on target or who are hopped up on enough stimulants forge ahead, while the rest of us flail helplessly in some web-based attentional vortex."
  • Language is a Technology, Too, points out another Stanford psychologist, Lera Boroditsky. Some technologies "we no longer even notice as technologies: they just seem like natural extensions of our minds. Numbers are one such example: a human-invented tool that once learned has incredible productive power in the mind. Writing is another such example. It no longer seems magical in the literate world that one could communicate a complex set of thoughts silently across vast reaches of time and space using only a cocktail napkin and some strategically applied stains." Boroditsky ends with a jab at renowned philosopher Dan Dennett, who makes his own point about how "absolute power corrupts absolutely," and the Internet is absolute.
  • We Are Immortal, is Juan Enriquez's startling conclusion. "Future sociologists and archaeologists," unlike current ones studying ancient Rome, "will have access to excruciatingly detailed pictures on an individual basis." There are drawbacks: "those of a certain age learned long ago, from the triumphs and tragedies of Greek Gods, that there are clear rules separating the mortal and immortal. Trespasses tolerated and forgiven in the fallible human have drastic consequences for Gods. In the immortal world all is not forgiven and mostly forgotten after you shuffle off to Heaven."
  • Cells are to Humans as Humans are to Internet Humanity W. Tecumseh Fitch, cognitive biologist at the University of Vienna, looks at the way single cells gradually grouped into multi-celled organisms that required organization, with certain cells exerting control over others through hormones and neurons. Humans are now "the metaphoric neurons or the global brain," he says, with HTML for neurotransmitters as we rush to "the brink of a wholly new system of societal organization." He sees "two main problems," though, with his metaphor:

First, the current global brain is only tenuously linked to the organs of international power ... Second, our nervous systems evolved over 400 million years of natural selection, during which billions of competing false-starts and miswired individuals were ruthlessly weeded out. But there is only one global brain today, and no trial and error process to extract a functional configuration from the trillions of possible configurations. This formidable design task is left up to us.

boingboing [1.11.10]

Each year, John Brockman of Edge.org asks a question of a number of science, tech, and media personalities, and compiles the answers. This year's question: "How is the internet changing the way you think?" Lots of good, meaty responses that make for great reading, from interesting people whose work and ideas have been blogged here on Boing Boing before: Kevin Kelly, Jaron Lanier, Linda Stone, George Dyson, Danny Hillis, Esther Dyson, Tim O'Reilly, Doug Rushkoff, Jesse Dylan, Richard Dawkins, Alan Alda, Brian Eno, and many more.

I'm far out-classed by the aforementioned thinkers. But here's a snip from my more modest contribution, "I DON'T TRUST ALGORITHM LIKE I TRUST INTUITION":

I travel regularly to places with bad connectivity. Small villages, marginalized communities, indigenous land in remote spots around the globe. Even when it costs me dearly, on a spendy satphone or in gold-plated roaming charges, my search-itch, my tweet twitch, my email toggle, those acquired instincts now persist.

The impulse to grab my iPhone or pivot to the laptop, is now automatic when I'm in a corner my own wetware can't get me out of. The instinct to reach online is so familiar now, I can't remember the daily routine of creative churn without it. The constant connectivity I enjoy back home means never reaching a dead end. There are no unknowable answers, no stupid questions. The most intimate or not-quite-formed thought is always seconds away from acknowledgement by the great "out there."

The shared mind that is the Internet is a comfort to me. I feel it most strongly when I'm in those far-away places, tweeting about tortillas or volcanoes or voudun kings, but only because in those places, so little else is familiar. But the comfort of connectivity is an important part of my life when I'm back on more familiar ground, and take it for granted.

Does the Internet make us smart or stupid? By Alan Posener
Die Welt [1.11.10]
