At a time when groundbreaking discoveries seem almost commonplace, it is difficult to predict which scientific news is important enough to "stay news" for longer than a few days. To stick around, a discovery would have to potentially redefine "who and what we are." One recent scientific advancement that, in my mind, fulfills these prerequisites is the decoding and reprogramming of DNA via bioinformatics.
While mapping the complete human genome was itself a great achievement, it was bioinformatics that allowed for a practical application of the acquired knowledge. Uploading a genome onto a computer has enabled researchers to use genetic markers and DNA amplification technologies in ways that shed real light on the intricate, otherwise unfathomable gene-environment interactions causing disease.
Researchers also hope to use bioinformatics to solve real-life problems; imagine microbes "programmed" to generate inexpensive energy, clean water, fertilizer, drugs, and food, or tackle global warming by sucking carbon dioxide from the air.
But as with most things in life, there are possible negative side effects. With DNA being written like software, cloning and "designing" more complex living creatures, including humans, no longer seems a mere fantasy from sci-fi movies. All possible advantages aside, this technology is likely to stir a wide range of ethical debates, requiring us to ponder what it means to be human: a "naturally" conceived, unique, and largely imperfect creature, or a pre-designed, aimed-to-be-perfect being.
Asked about the future of DNA coding and bioinformatics, Craig Venter replied, "We’re only limited by our imagination." I am somewhat more skeptical than that, but one thing is for sure: digital DNA is here to stay, both in scientific news about evolutionary biology, forensic science, and medicine, and in debates about how we humans define ourselves.