That letter is nice because it illustrates what's important to that girl at that particular moment in her life. It mattered less that man had landed on the moon than things like what she was wearing, what clothes she was into, who she liked, who she didn't like. This is the period of life when the sense of self, and particularly the sense of social self, undergoes profound transition. Just think back to when you were a teenager. It's not that you don't have a sense of self before then; of course you do. A sense of self develops very early. What happens during the teenage years is that your sense of who you are—your moral beliefs, your political beliefs, what music you're into, what fashion, what social group you're into—undergoes profound change.
SARAH-JAYNE BLAKEMORE is a Royal Society University Research Fellow and Professor of Cognitive Neuroscience, Institute of Cognitive Neuroscience, University College London. Sarah-Jayne Blakemore's Edge Bio
One of the great things about cognitive science is that it has allowed us to continue that seamless integration of the sciences, from physics, to chemistry, to biology, and then to the mind sciences, and it's been quite successful at doing this in a relatively short time. But on the whole, I feel there's still a failure to continue this integration towards some of the social sciences, such as anthropology, to some extent, and sociology or history, which still remain very much shut off from what some would see as progress, and as further integration.
HUGO MERCIER, a Cognitive Scientist, is an Ambizione Fellow at the Cognitive Science Center at the University of Neuchâtel. Hugo Mercier's Edge Bio Page
We're going to pretend that modern-day vampires don't drink the blood of humans; they're vegetarian vampires, which means they only drink the blood of humanely farmed animals. You have a one-time-only chance to become a modern-day vampire. You think, "This is a pretty amazing opportunity. Do I want to gain immortality, amazing speed, strength, and power? But do I want to become undead, become an immortal monster, and have to drink blood? It's a tough call." Then you go around asking people for their advice and you discover that all of your friends and family members have already become vampires. They tell you, "It is amazing. It is the best thing ever. It's absolutely fabulous. It's incredible. You get these new sensory capacities. You should definitely become a vampire." Then you say, "Can you tell me a little more about it?" And they say, "You have to become a vampire to know what it's like. You can't, as a mere human, understand what it's like to become a vampire just by hearing me talk about it. Until you're a vampire, you're just not going to know what it's going to be like."
L.A. PAUL is Professor of Philosophy at the University of North Carolina at Chapel Hill, and Professorial Fellow in the Arché Research Centre at the University of St. Andrews. L.A. Paul's Edge Bio page
What I want to do today is raise one cheer for falsification, maybe two cheers for falsification. Maybe it’s not philosophical falsificationism I’m calling for, but something more like methodological falsificationism. It has an important role to play in theory development, one that maybe we have turned our backs on more than we should have in some areas of this racket we’re in, particularly the part of it that I do—Ev Psych.
MICHAEL MCCULLOUGH is Director, Evolution and Human Behavior Laboratory, Professor of Psychology, Cooper Fellow, University of Miami; Author, Beyond Revenge. Michael McCullough's Edge Bio page
The idea that computers are people has a long and storied history. It goes back to the very origins of computers, and even before. There's always been a question about whether a program is something alive or not, since it intrinsically has some kind of autonomy at the very least, or it wouldn't be a program. There has been a domineering subculture—that's been the most wealthy, prolific, and influential subculture in the technical world—that for a long time has not only promoted the idea that there's an equivalence between algorithms and life, and certain algorithms and people, but a historical determinism that we're inevitably making computers that will be smarter and better than us and will take over from us. ...That mythology, in turn, has spurred a reactionary, perpetual spasm from people who are horrified by what they hear. You'll have a figure say, "The computers will take over the Earth, but that's a good thing, because people had their chance and now we should give it to the machines." Then you'll have other people say, "Oh, that's horrible, we must stop these computers." Most recently, some of the most beloved and respected figures in the tech and science world, including Stephen Hawking and Elon Musk, have taken that position of: "Oh my God, these things are an existential threat. They must be stopped."
In the history of organized religion, it's often been the case that people have been disempowered precisely to serve what was perceived to be the needs of some deity or another, where in fact what they were doing was supporting an elite class that was the priesthood for that deity. ... That looks an awful lot like the new digital economy to me, where you have (natural language) translators and everybody else who contributes to the corpora that allow the data schemes to operate, contributing to the fortunes of whoever runs the computers. You're saying, "Well, but they're helping the AI, it's not us, they're helping the AI." It reminds me of somebody saying, "Oh, build these pyramids, it's in the service of this deity," and, on the ground, it's in the service of an elite. It's an economic effect of the new idea. The effect of the new religious idea of AI is a lot like the economic effect of the old idea, religion.
JARON LANIER is a Computer Scientist; Musician; Author of Who Owns the Future?
There is a new fundamental theory of physics called constructor theory, proposed by David Deutsch, who pioneered the theory of the universal quantum computer. David and I are working on this theory together. The fundamental idea in this theory is that we formulate all laws of physics in terms of which tasks are possible, which are impossible, and why. In this theory we have an exact physical characterization of an object that has those properties, and we call that knowledge. Note that knowledge here means knowledge without a knowing subject, as in the theory of knowledge of the philosopher Karl Popper.
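As a rough sketch of what such a formulation looks like (the notation here follows the published constructor-theory papers rather than anything stated in this talk, so treat it as illustrative): a task is a set of input/output pairs of attributes, and a law of physics is a statement that a given task is possible or impossible,

\[ T = \{\, x_1 \rightarrow y_1,\ x_2 \rightarrow y_2,\ \dots \,\}, \qquad T^{\checkmark} \ \text{(possible)} \quad \text{or} \quad T^{\times} \ \text{(impossible)}, \]

where “possible” means that a constructor capable of performing the task T over and over again, with arbitrarily high accuracy, is permitted by the laws of physics.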
We’ve just come to the conclusion that the fact that extinction is possible means that knowledge can be instantiated in our physical world. In fact, extinction is the very process by which that knowledge loses its ability to remain instantiated in physical systems, because there are problems that it cannot solve. With any luck that bit of knowledge can be replaced with a better one.
What I wanted to talk about is somewhat of a parallel of that in human populations. If you were to go to a textbook on human biology from the time of Darwin or a bit later, you would certainly get an image that looked a bit like this. This is an image of the so-called races of humankind—racial types, as they called them. I’m not going to go into the question of whether there are real races of humankind because there aren’t. It’s interesting to note that until quite recently people assumed, and scientists assumed too, that the human species was divided into distinct groups that were biologically different from each other and had been isolated from each other for a long, long time.
Well, to some extent that was true. Until quite recently, human populations were isolated from each other. That’s changing quite quickly. ...
I dream about the sea cow and imagine what it would be like to see one in the wild, but the case of the Pinta Island giant tortoise produced a particularly strange feeling for me personally, because I had spent many afternoons in Lonesome George’s den with him in the Galapagos Islands when I was a volunteer with the Sea Shepherd Conservation Society. If any of you have visited the Galapagos, you know that you can even feed the giant tortoises at the Charles Darwin Research Station. This is Lonesome George here.
He lived to a ripe old age but failed, as they pointed out many times, to reproduce. Just recently, in 2012, he died, and with him went the last of his species. He was couriered to the American Museum of Natural History and taxidermied there. A couple of weeks ago his body was unveiled. This was the unveiling that I attended, and at that exact moment I can say I was feeling a little like I am now: nervous and kind of nauseous, while everyone else seemed calm. I wasn’t prepared to see Lonesome George. Here he is taxidermied, looking out over Central Park, which was strange as well. At that moment I realized that I had known the last individual of this species to go extinct. That presents a strange predicament for us in the 21st century—this idea of conspicuous extinction.
... A strange thing happened on the way to a better world in pursuit of an admirable quest, that is, a world free of sex discrimination where you’re judged on your own qualities and not your sex. Truth and falsity went topsy-turvy. The truth—the science of sex differences—became dangerous, unmentionable, and in its place the conventional wisdom, which is a ragbag of ideas that have long been extinct but are kept ghoulishly alive by popularity, became the entrenched orthodoxy influencing public thinking, agendas and policy-making, and completely crowding out science and sense.
My aim is to show you why the current orthodoxy should be abandoned and why, if you really care about a fairer world, the science does matter. It matters profoundly. I’m going to take two examples, both about the professions, because they very well epitomize the orthodox litany: how society systematically discriminates against women, and how at work they are victims of pervasive sexism.