Edge in the News
Planet's biggest brains answer this year's Edge question: 'What scientific concept would improve everybody's cognitive toolkit?'
Being comfortable with uncertainty, knowing the limits of what science can tell us, and understanding the worth of failure are all valuable tools that would improve people's lives, according to some of the world's leading thinkers.
The ideas were submitted as part of an annual exercise by the web magazine Edge, which invites scientists, philosophers and artists to opine on a major question of the moment. This year it was, "What scientific concept would improve everybody's cognitive toolkit?"
The magazine called for "shorthand abstractions" – a way of encapsulating an idea or scientific concept into a short description that could be used as a component of bigger questions. The responses were published online today.
Many responses pointed out that the public often misunderstands the scientific process and the nature of scientific doubt. This can fuel public rows over the significance of disagreements between scientists about controversial issues such as climate change and vaccine safety. ...
What scientific concept, if everyone mastered it, could represent an immense leap in people's capacity to understand and actively participate in the affairs of the world?
This is, in essence, the question that John Brockman, the American literary agent and director of the site edge.org, presented in late December to a constellation of world-famous scientists. The results were published online this morning.
The question was formulated more precisely as follows: "What Scientific Concept Would Improve Everybody's Cognitive Toolkit?"
Since this question is not as direct and explicit as some of its predecessors (last year's question, for example, was "How is the Internet changing the way you think?"), Edge is quick to contextualize it.
The point, according to James Flynn, an expert on human intelligence at the University of Otago, New Zealand, is that there are words and short phrases — such as "market" or "natural selection" — that constitute "conceptual abbreviations" (shorthand abstractions, or SHAs): terms that stand for a whole constellation of abstract and complex concepts and that, "although extremely brief, have immense utility for perceiving the world."
These SHAs, according to Flynn, have "penetrated the cognitive repertoire of educated people, expanding their intellectual capabilities by becoming available as cognitive units that can be used as elements of reasoning and debate." In other words, an economist speaking of a "market", a biomedical specialist thinking of a "control group", or a statistician speaking of a "random sample" knows very well that there is no need to waste time reprocessing these concepts each time they are used.
By Friday evening, 115 people, scientists from various fields of knowledge, had already responded to the challenge. Some answers are extensive and very complex. Others do not answer the question exactly. But there are, as always, approaches to suit all tastes, and most are interesting enough to make it worth taking a look.
Theories of social networks and their relationship to contemporary sociology, and the dangerous ideas of scientists, on Radio3 Scienza on Radio3.
Does the Internet affect our brains? 170 scientists and artists are trying to answer that question.
Since 1998, John Brockman of The Edge Foundation, an association of scientists and intellectuals, has asked its members a question every year, to be answered with a short essay. Examples from recent years include "What do you believe is true even though you cannot prove it?" and "What have you changed your mind about?" Last year's question is also the title of this book. It is of course a very topical issue, although the Internet has not really been around long enough for serious statements about its impact on our brains or our way of thinking; nevertheless, dozens of works describing the effects under investigation have been published in recent years.
Here, 170 scientists and artists, including big names such as Richard Dawkins, Steven Pinker, Daniel C. Dennett, Nassim Nicholas Taleb and Brian Eno, each get a few pages to explain what the advent of the Internet has meant to them.
With so many contributors, this is a very diverse book. Optimists like Kevin Kelly, founder of Wired magazine, sing a hymn to the possibilities of the Internet, while several pages later another contributor describes the worldwide web as "the biggest distraction from serious thinking since television was invented." Some authors point to the disastrous consequences the Internet is already having on our brains, while an evolutionary biologist says that the Internet has so far changed us very little, because the information we find there is still viewed through our hunter-gatherer lenses. That we cannot quite handle it leads, for example, to a greater sense of insecurity.
Publishers, I was told by an august member of that tribe soon after I first wrote about them, are exactly like farmers. Whatever the weather, whatever the harvest, they just love to moan. Much in the world of books has changed since that moment, but not the propensity to grumble. During 2010, the "we're all doomed" tendency fed us a bumper crop of gloomy prognostications. Will almost-free digital distribution drain cash and credit out of the entire book-supply system? Do electronic books as a whole threaten to bankrupt publishers and pauperise authors? Has the spread of new media destroyed an appetite for reading any text tougher than a tweet among the born-digital generation?
Can anybody stop Google (with Amazon and Apple not far behind) seizing control of humanity's written heritage and using it to promote their partisan corporate agendas? Will independent high-street booksellers, well-stocked local libraries and reasonable advances for authors who don't appear on TV fade into the mists of bookish history, along with quill pens and lazy lunches? And can we ever hope to resist the takeover of publishing by celebrity clout when our Christmas chart-topper – Jamie Oliver's Jamie's 30-Minute Meals – comes from an author who cheerily admits that "I've never read a book in my life, ever, apart from my own"? Respect to the recipes, though.
So book lovers need to embark on a chapter of hope. Every new year, John Brockman of the online intellectual powerhouse Edge (www.edge.org) asks its virtual community of scientists and social thinkers one question. In 2007, it was this: "What are you optimistic about?" To strike a less than despondent chord this January, I put the same question to a few people in the British book world who are best placed to know. Read their answers on these pages.
This Christmas, a bevy of elegant models are on display. Not just the long-legged female variety (although you can see those at this season’s parties in New York); instead, regulators, bankers and investors have been flaunting their own smart models, as they attempt to predict what 2011 might deliver.
But as this economic catwalk gets underway, it is shot through with irony. When the financial crisis hit, many observers blamed the disaster on the misuse of financial models. Not only had these flashy computer systems failed to forecast behaviour in the sub-prime mortgage world, but they had also seduced bankers and investors into taking foolhardy risks, or been used to justify some crazy behaviour.
But these days, in spite of all those mis-steps, there is little sign that financiers are falling out of love with those models; on the contrary, if you flick through the recent plethora of reports from the Basel Committees — or look at the 2011 forecasts emanating from investment banks — these remain heavily reliant on ever-more complex forms of modelling.
So what are investors to make of this? One particularly thought-provoking set of ideas can be seen in the current work of Emanuel Derman, a former physicist turned banker who shot to fame within the banking industry two decades ago by co-developing some ground-breaking financial models, such as the Black-Derman-Toy model (one of the first interest-rate models) and the Derman-Kani local volatility model (the first model consistent with the volatility smile).
At first glance, Derman’s past might suggest he should be a model-lover — or "modeliser" — par excellence. In the banking world, he is often hailed as one of the great, original "quants", who paved the way for the derivatives revolution. Yet in reality, Derman has always been pretty cynical about those models that won him, and other quants, earlier accolades. For while investment bank salesmen might have treated his creations as near infallible, in truth Derman — like many brilliant scientists-turned-quants — has always recognised their flaws. ...
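For readers curious what such a model actually looks like, here is a standard textbook sketch of the Black-Derman-Toy dynamics (drawn from the general quant literature, not from Tett's column): the short rate $r(t)$ is taken to be lognormal,

$$ d\ln r(t) = \left[\theta(t) + \frac{\sigma'(t)}{\sigma(t)}\,\ln r(t)\right]dt + \sigma(t)\,dW(t), $$

where $\theta(t)$ is calibrated to today's yield curve, $\sigma(t)$ to the term structure of volatilities, and $W(t)$ is a Brownian motion. What is worth noticing is how much is assumed away: a single driving factor and a fitted drift, precisely the kind of simplification that makes a model useful and, as the column goes on to argue, fallible.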
Some of the world’s greatest thinkers came together recently to answer the really big question — what will change the world? Roger Highfield, editor of New Scientist, reveals their predictions, from crowd-sourced charity to space colonisation and built-in telepathy.
It is not hard to think of examples of wide-eyed predictions that have proved somewhat wide of the mark. Personal jetpacks, holidays on the moon, the paperless office and the age of leisure all underline how futurologists are doomed to fail.
Any predictions should thus be taken with a heap of salt, but that does not mean crystal ball-gazing is worthless: on the contrary, even if it turns out to be bunk, it gives you an intriguing glimpse of current fads and fascinations.
A few weeks ago, a science festival in Genoa, Italy, gathered together some leading lights to discuss the one aspect of futurology that excites us all: cosa farà cambiare tutto — this will change everything.
The event was organised by John Brockman, a master convener, both online and in real life, and founder of the Edge Foundation, a kind of crucible for big new ideas.
With him were two leading lights of contemporary thought: Stewart Brand, the father of the Whole Earth Catalog, co-founder of a pioneering online community called The Well and of the Global Business Network; and Clay Shirky, web guru and author of Cognitive Surplus: Creativity and Generosity in a Connected Age. ...
When I received the invitation to write here, there was the question of whether the new columns would have names different than those of their authors. I was thinking about some possibilities. The first idea was to be a "name dropper," the English term for those in the habit of naming names of important people to impress listeners. I even thought about beginning all the texts with some name and gradually forming an idiosyncratic biographical catalogue, which could be useful for adventurous spirits.
The fact that I could not find a good ironic translation for that English expression made me give up the game in the end. So I thought about the title "Frontier". At the back of my mind, still thinking in English, I moved from "frontier" toward "border" and on to "edge". The columns would deal only with cultural production that crossed the limits of the commonplace, transforming the world or inventing new ways to think about life. My inspiration came from a number of different things, such as "Close to the Edge" or Brian Eno's Edge feature "A Big Theory Of Culture". But mostly, I wanted to emulate, in an absurdly individual and uselessly pretentious way, the site http://www.edge.org/.
I have tracked the trajectory of John Brockman, the man who founded Edge before the Web existed. I bought the first book in his series "The Reality Club" at the time of its launch in 1990. I was impressed with such an interesting gathering of thinkers from different areas, such as the philosopher Daniel Dennett, the biologist Lynn Margulis, and the psychologist Mihaly Csikszentmihalyi. I learned that what was published there was only a sample of a much greater diversity. The Reality Club's monthly invitation-only meetings in New York — which began in 1981 — brought together a fascinating group, ranging from the physicist Freeman Dyson to the theater director Richard Foreman, almost all of my idols. The motto of the club was ambitious: "To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together and have them ask each other the questions they are asking themselves."
Today, the meeting room has become the website Edge. The transformation has not exactly been democratizing. The club remains as elitist (not a criticism, an observation) as before, maybe even more so, since its members have become celebrities (a sign of the times: today scientists can be more pop than Mick Jagger) and many of them are incredibly rich. It is not an open site where anyone can contribute; it remains invitation-only and editorially driven. The difference: the general reader can now follow the selected conversation almost in real time, after a light filter. Brockman still decides who may speak at the forum. He is currently one of the most powerful literary agents in the world (specializing mainly in science books), managing to convince the major publishing houses to pay millions in advances to his clients. (One of the legends surrounding his working method is that if a book begins to earn royalties, he says he has failed — because he didn't get a large enough advance from the publisher.) Brockman is the agent of Richard Dawkins, Jared Diamond, Martin Rees and others of the same caliber.
"An invitation to the 2010 dinner was not easy to come by as the figures who were present were the owners of Google, Twitter, Huffington Post, Bill Gates, Benoit Mandelbrot (fractals), Craig Venter (Human Genome Project). Do I need to drop more names? A bomb at dinner and we would lose much of a certain creative intelligence that drives our world and our future, or the future that these people have created for all of us. The nerd on the edge has now became the center of power."
The site has several sections. In one of them, a sort of "lifestyles of the rich and famous" — of the people Edge considers the most interesting and intelligent in the world — is an album of photos of an annual event hosted by Brockman, originally named "The Millionaires' Dinner" and later upgraded to "The Billionaires' Dinner." An invitation to the 2010 dinner was not easy to come by, as the figures present included the owners of Google, Twitter and the Huffington Post, as well as Bill Gates, Benoit Mandelbrot (fractals) and Craig Venter (Human Genome Project). Do I need to drop more names? A bomb at that dinner and we would lose much of the creative intelligence that drives our world and our future, or the future that these people have created for all of us. The nerd on the edge has now become the center of power.
Another very popular section is the Edge Annual Question. Every year a new question is asked. In November, Richard H. Thaler, the father of "behavioral economics" (the hottest area in economic studies), asked the following question: "Can you name your favorite examples of wrong scientific beliefs that were held for long periods of time?" So far 65 responses have been received, authored by, among others, the physicist Lee Smolin and the artist Matthew Ritchie. This week a special question was published. The inquisitor is Danny Hillis, a pioneer in supercomputing, who, under the impact of WikiLeaks, wants to know whether we can, or must, keep secrets in the age of information.
But this is the festive aspect of Edge. What makes my neurons burn are the regular features, which are frequently brilliant texts, such as the most recent: "Metaphors, Models and Theories" by Emanuel Derman, one of those physicists who in recent decades have left the university to try to discover the laws of financial markets. (I will go deeper into this subject in a future column.) And this is why I always come back to Edge. In the world of Anglo-Saxon ideas (which still prevails throughout the world, or at least among the world's elite), there is no smarter guide.
___
Hermano Vianna is a Brazilian anthropologist and writer who currently works in television. The original Portuguese-language column, published behind O Globo's subscription pay-wall, is available, with an introduction, on Hermano Vianna's blog.
• Edge.org has a solid collection of essays addressing these questions: "When does my right to privacy trump your need for security? Should a democratic government be allowed to practice secret diplomacy? Would we rather live in a world with guaranteed privacy or a world in which there are no secrets? If the answer is somewhere in between, how do we draw the line?"
Now comes the "binary-turn"? A Nerdisierung clues to the FAZ.
Some time ago, more in jest than anything, I described the venerable Frankfurter Allgemeine Zeitung in a couple of tweets as the "nerds' central organ". The occasion was Frank Schirrmacher's full-page defense of nerds just before the election, the paper's publishing offensive around the iPad unveiling, and the opening of the FAZ as a platform for self-confessed nerds, as in the article by Frank Rieger (Chaos Computer Club). Since the phrase "central organ" has been picked up now and again, I have been thinking about it some more and have set down a few thoughts on the "binary turn" of the FAZ.
One thing is clear: Frank Schirrmacher is the driving force behind this process, as a glance at his editorial of 23 January shows, in which he vehemently calls for more digital intelligence for Germany (a rogue, whoever already heard the nerd alarm bells ringing at that date). One might suspect that the reorientation of the FAZ feuilleton toward debates and articles on digital culture is above all an accompanying campaign for his recent book project Payback, but that would be very superficial and would not do justice to Mr Schirrmacher's feel for the issues.
Even if it leads at least partly away from the genetics, bio- and nanotechnology debates of recent years, the substance of this pivot is absolutely to be welcomed, because public discourse in Germany does in fact lag behind on serious issues of digitization and its social consequences. This nerdification gets a special twist, however, from the arrogance toward and rejection of online culture that breaks through every now and then, which shows at very different points in the paper, for example currently in its stance in the Hegemann case.
If you look a little closer, there are some indications that the FAZ has been a place for nerds for much longer. A few small clues:
Exhibit 1: The feature of 27 June 2000
Across several pages, the FAZ printed an extract from the code of the human genome, to some "the most unread article in recent media history". It opened a publishing initiative around the "Third Culture" debate led by John Brockman, which aimed at intertwining the intellectual discourse of the natural sciences and the humanities. The "digital revolution", by contrast, was this time opened "only" with a few Schirrmacher texts. But who knows, maybe a spread in binary code is still to come.
Exhibit 2: The daily blog entry on page one
As a major journalistic break, the relaunch of 5 October 2007 mounted a color photo on the traditional page one. Since then, this teaser photo has developed more and more into a daily journal. Of course the picture and the accompanying text comment on a central theme of world affairs, but stylistically there are often elements of the reduction, reference and commentary that are characteristic of the blogosphere so often criticized in the paper's inside pages.
And maybe it's just a personal perception, but haven't the captions lately fallen below the magic 140-character limit more than once? Even if not, many of these picture comments work exactly like a typical Twitpic post: the image illustrates a theme or an idea, and the text refers the reader onward for further reading in the paper. Put the other way around: the FAZ could tweet its page one in just the same way.
Exhibit 3: The nerds are not just in the features section
Not so much a concrete piece of evidence as a collection of circumstantial evidence for the thesis: even a superficial look at other departments turns up a surprising number of examples of "nerdy" reporting, from the surprisingly flowery attention to detail of the "Technology and Motor" desk to the often highly encrypted reports from the gourmet world. The material from Duckburg smuggled into the paper by the professed Donaldists Patrick Bahners and Andreas Platthaus belongs here, as does the knowledgeable "Network Economy" page every Tuesday in the business section; all of these are small indications of the FAZ's anchoring in the proverbial nerd culture. If you add FAZ.net, the signing of Don Alphonso and Michael Seemann should also be mentioned at this point.
But what does this partial and expandable "evidence" actually add up to? Is there a result at all? The starting point, after all, was the rather humorous designation of the FAZ as the "nerds' central organ", and a cursory critique of the paper has provided at least some evidence of elements of the much-quoted "nerd culture". So what?
To say it clearly once again: the call for more "digital intelligence", or better, the increased talk of "digital subjects" in the German public sphere deserves strong support, Schirrmacher's Payback and the theories of information overload notwithstanding. However, the terms "nerd" and "nerd culture" should be spelled out a little more clearly than they are now; otherwise the FAZ walks a fine line and leaves itself open to attack. Where that can lead, Thomas Knüwer recently showed when he took apart an article by Frank Schirrmacher.
Such examples also illustrate the fragility of the construction of a "nerd FAZ", which does recognize the opportunity to set the tone for a new social discourse, but which would have to form a more accurate picture of the situation on the digital-culture front.
If you were a sophisticated and up-to-the-minute science buff in 17th-century Europe, you knew that there was only one properly scientific way to explain anything: "the direct contact-action of matter pushing on matter" (as Peter Dear puts it in The Intelligibility of Nature). Superstitious hayseeds thought that one object could influence another without a chain of physical contact, but that was so last century by 1680. Medieval physics had been rife with such notions; modern thought had cast those demons out. To you, then, Newton's theory of gravity looked like a step backwards. It held that the sun influenced the Earth without touching it, even indirectly via other objects. At the time, that just sounded less "sciencey" than the theories it eventually replaced.
This came to mind the other day because, over at Edge.org, Richard H. Thaler asked people to nominate examples of "wrong scientific beliefs that were held for long periods." He also asked us to suggest a reason that our nominee held sway for too long. ...
Science can contradict itself. And that's OK. It's a fundamental part of how research works. But from what I've seen, it's also one of the hardest parts for the general public to understand. When an old theory dies, it's not because scientists have lied to us and can't be trusted. In fact, exactly the opposite. Those little deaths are casualties of the process of fumbling our way towards Truth*.
Of course, even after the pulse has stopped, the dead can be pretty interesting. Granted, I'm biased. I like dead things enough to have earned a university degree in the sort of anthropology that revolves around exactly that. But I'm not alone. A recent article at the Edge Foundation website asked a broad swath of scientists and thinkers to name their favorite long-held theory, which later turned out to be dead wrong. The responses turn up all sorts of fascinating mistakes of science history—from the supposed stupidity of birds, to the idea that certain, separate parts of the brain controlled nothing but motor and visual skills.
One of my favorites: The idea that complex, urban societies didn't exist in Pre-Columbian Costa Rica, and other areas south of the Maya heartland. In reality, the cities were always there. I took you on a tour of one last January. It's just that the people who lived there built with wood and thatch, rather than stone. The bulk of the structures decayed over time, and what was left was easy to miss, if you were narrowly focused on looking for giant pyramids.
What's your favorite dead theory?
Edge: Wrong Scientific Beliefs That Were Held for Long Periods of Time ...
There's a fascinating list of scientific ideas that endured for a long time, but were wrong, over at Edge.org, the Web site created by the agent and intellectual impresario John Brockman.
The cautionary tale of the fight over the cause of stomach ulcers, listed by quite a few contributors there, is the kind of saga that gives science journalists (appropriately) sleepless nights. One of my favorites in the list is the offering of Carl Zimmer, the author and science journalist, who discusses some durable misconceptions about the stuff inside our skulls:
"This laxe pithe or marrow in man's head shows no more capacity for thought than a Cake of Sewet or a Bowl of Curds."
This wonderful statement was made in 1652 by Henry More, a prominent seventeenth-century British philosopher. More could not believe that the brain was the source of thought. These were not the ravings of a medieval quack, but the argument of a brilliant scholar who was living through the scientific revolution. At the time, the state of science made it very easy for many people to doubt the capacity of the brain. And if you've ever seen a freshly dissected brain, you can see why. It's just a sack of custard. Yet now, in our brain-centered age, we can't imagine how anyone could think that way.
The list grew out of a query from Richard Thaler, the director of the Center for Decision Research at the University of Chicago Graduate School of Business and coauthor, with Cass Sunstein, of "Nudge: Improving Decisions About Health, Wealth, and Happiness." (He also writes a column for The Times.)
Here's his question:
The flat earth and geocentric world are examples of wrong scientific beliefs that were held for long periods. Can you name your favorite example and for extra credit why it was believed to be true?
Earlier this week Richard H. Thaler posted a question to selected Edge contributors, asking them for their favorite examples of wrong scientific theories that were held for long periods of time. You know, little ideas like "the earth is flat."
The contributors' responses came from all different fields and thought processes, but there were a few recurring themes. One of the biggest hits was the theory that ulcers were caused by stress; this was discredited by Barry Marshall and Robin Warren, who proved that the bacterium H. pylori brings on the ulcers. Gregory Cochran explains:
One favorite is helicobacter pylori as the main cause of stomach ulcers. This was repeatedly discovered and then ignored and forgotten: doctors preferred 'stress' as the cause, not least because it was undefinable. Medicine is particularly prone to such shared mistakes. I would say this is the case because human biology is complex, experiments are not always permitted, and MDs are not trained to be puzzle-solvers–instead, to follow authority.
Another frequent topic of disbelief among Edge responders was theism and its anti-science offshoots–in particular the belief in intelligent design, and the belief that the Earth is only a few thousand years old. Going by current political discussions in America it may seem that these issues are still under contention and shouldn't be included on the list, but I'm going to have to say differently, and agree with Milford Wolpoff:
Creationism's stepsister, intelligent design, and allied beliefs have been held true for some time, even as the mountain of evidence supporting an evolutionary explanation for the history and diversity of life continues to grow. Why has this belief persisted? There are political and religious reasons, of course, but history shows that neither politics nor religion requires a creationist belief in intelligent design. ...
The conversation about the role of the two great branches of learning is important, and still topical. C.P. Snow crystallized this debate in the mid-twentieth century with his suggestion that two cultures existed in the British academy, the literary and the scientific, and that they were at odds.
In his quest to argue that science will solve global social problems, why, Snow asked, should one be held responsible for knowing the works of Shakespeare, but not understand the Second Law of Thermodynamics? His insight gave steam to an internecine intellectual fight that had surfaced a number of times in the past. (Today, one can chart the most recent "science wars" all the way back through Snow to Arnold and Huxley, on to the Romantic critiques of the Enlightenment Project, to the debate between the Ancients and Moderns, which revolved around the new science's assault on both Aristotelianism and Renaissance Humanism.)
However, what shouldn't be forgotten in the admission that the humanities will resist ultimate reduction is that there are those in the humanities who suffer from science envy. This envy was given impetus by entomologist and evolutionary theorist E.O. Wilson in his monograph, Consilience, wherein Wilson suggests that the humanities must move toward the methods of the sciences to remain relevant.
While this gentle shimmy sounds harmless enough, there are those in humanistic disciplines like literary studies who have taken such a move to heart. For example, any cursory examination of the nascent approach called Literary Darwinism reveals a loose confederacy of individuals who think literary texts are best read within Darwinian contexts (think: reading Jane Austen to understand how her characters represent universals in human behavior related to, say, their inclusive fitness). ...
...That term, 'third culture', was popularized by literary agent and Edge founder John Brockman in a response to Snow to suggest that a new culture is emerging that is displacing the traditional, literary intellectual with thinkers in the sciences who are addressing key concepts and categories found in the humanities; Richard Dawkins writing about religion or Carl Sagan expounding on the awe of the cosmos both come to mind as quintessential examples. One need only browse through the science section of the bookstore to see that a bridge between the two cultures has been built, with much of the traffic in the popular sphere going one way.
"The Shallows" which explores what the Internet is doing to our brains) are clear examples of a profession lacking in these latitudes: the dedicated writer to think of our new state environment, technology and science. They are "tecnoescritores" or scientific writers such as the British biologist Richard Dawkins (The Selfish Gene "), Daniel Dennett (" Darwin's Dangerous Idea "), psychologist Steven Pinker (The Blank Slate), Matt Ridley ("genome"), Malcolm Gladwell ("Blink"), Bill Bryson ("Short History of Nearly Everything"), Brian Greene ("The Elegant Universe"), Michio Kaku ("Physics of the Impossible"), Paul Davies ( "The last three minutes"), and many more as the hypermedia Stephen Hawking (A Brief History of Time "). Direct descendants of Carl Sagan, Richard Feynman and Stephen Jay Gould, is a breed of authors who write and release science laboratories. And, oddly enough-attention-Argentine publishers, they sell many books. It's true: in recent years came over here-very interesting collections, such as barking Science (Siglo XXI), led by Diego Golombek biologist who trains scientists and science communicators to tell beyond a cryptic paper or news article forgettable . But you have to admit, compared to the international market for "literature" are still in the First B. Each in its own way and located in what CP Snow called "third culture" (that bridge between science and literature currently represented by the site Edge.org), the great science writers take a scientific publication, the link with the literature and in doing so take it one floor up