Biotechnology is the #1 Threat to Mankind
The phrase “rearranging deck chairs on the Titanic” is usually hyperbole, but it almost fits our current political debates. Britain’s Astronomer Royal, Martin Rees, claims that humanity has only a 50/50 chance of surviving the 21st century because of threats arising from technology. Here’s my opinion: unless we take decisive political action, humanity is unlikely to survive to 2100. A future disaster may be triggered, or avoided, by choices made in the next decade. In particular, we need to ban research into biotechnologies that would threaten our species. Rolling back dangerous technologies once they exist may be politically impossible, so we should act now. And since scientific discovery is a worldwide process, international cooperation is needed to stop risky research. I have created a petition at change.org asking President Obama to support a global ban on all research involving human DNA, and I have also set up a blog.
Why might future biotechnology advances threaten us? Consider two capabilities that could arrive very soon: whole-genome sequencing of human embryos and identifying (more of) the genes responsible for IQ. Researchers made strides on both last year. The next ten years may well see the first “designer babies,” IVF embryos deliberately selected for their IQ genes. A worldwide biotechnology arms race would follow. People would be driven to engineer their children further and further from human norms just to stay competitive in a so-called “free market.” The benefits would accrue to families willing to cut the bonds between generations by loading their kids with genes for IQ, a hardworking personality, and so forth. Holdouts would be pressured into capitulating or left socially isolated. Eventually we would begin what Nicholas Agar calls “radical enhancement” and lose our humanity entirely. Human civilization might be annihilated in an orgy of competition.
It is tempting to dismiss this scenario as total fantasy. But the real fantasy is believing it is impossible, or even very unlikely. We seem to believe, dogmatically, that the future will resemble the past; government agencies often assume as much in their economic projections. Yet the experience of the 20th century was nothing like the centuries before it. Over the past hundred years world GDP grew by about 3,700% and per capita world GDP rose by 860%, completely unlike any other historical period. Such extreme change should lead us to view the future with radical uncertainty, even meta-uncertainty. Many different scenarios are plausible: computing power at a given price has increased perhaps 600 billion times in a hundred years, and if that trend holds for another few decades, a simple mathematical projection suggests we will develop human-level artificial intelligence. But that remains some way off even if the trend continues. Embryo selection for IQ, by contrast, could begin much sooner. We should react now.
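As a rough sanity check on those growth figures, you can back out the compound annual growth rate each one implies (the helper function below is my own, for illustration):

```python
def annual_rate(total_multiple, years):
    """Compound annual growth factor implied by a total multiple over `years`."""
    return total_multiple ** (1 / years)

# World GDP grew ~3,700% over the century, i.e. a 38x multiple:
gdp = annual_rate(38, 100)          # ~1.037, about 3.7% per year
# Per capita world GDP rose ~860%, i.e. a 9.6x multiple:
per_capita = annual_rate(9.6, 100)  # ~1.023, about 2.3% per year
# Computing power per dollar up ~600 billion times:
compute = annual_rate(600e9, 100)   # ~1.31, about 31% per year

print(f"GDP: {gdp:.3f}, per capita: {per_capita:.3f}, compute: {compute:.2f}")
```

The point of the exercise is that even a steady-sounding 3.7% per year compounds into a 38-fold change over a century, and the computing trend compounds far faster still, which is why straight-line extrapolation of the past badly underestimates the next hundred years.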
Stopping research into dangerous technologies is easier than recalling them once they are developed. The latter is very difficult to do consistently when a technology is cheap, as Lance Armstrong demonstrated with sports doping. And restrictions can obviously be controversial, as with banning drugs. These obstacles are perhaps why Nick Bostrom, James Hughes, Nayef Al-Rodhan and Ray Kurzweil consider transhumanism unavoidable and unstoppable. But such defeatism is uncalled for. Scientific research—the process of discovery—requires a huge number of bright people working simultaneously and inspiring one another, and that process can be effectively stopped through legislation and political action. Isolated teams working illegally might make a discovery or two but have no chance of doing so continuously. This is why my petition targets only research into new techniques and technologies, not anything already developed.
That said, we face a tough global collective action problem. There is currently no clear international legal regime regulating biotechnology, much less one with the unprecedented authority to block research. Creating such a regime could be the single most important political issue anywhere. It cannot happen unless it is peacefully agreed upon by at least the five permanent members of the UN Security Council: small nations that tried eugenics or transhumanism could be stopped if necessary, but not these behemoths. The United States is unfortunately the world leader in dangerous biotechnology research right now, but that also means it has the most leverage in proposing a comprehensive ban. Instead of trying to cement its position further, the United States should use that position to shift the agenda. The stakes are high enough that non-violent civil disobedience against the US government is called for, if you feel you can personally accept the risk. (I can’t for the moment. Sorry.) I would like to help create a political organization devoted to global or supranational biotechnology regulation. If anyone here is interested, I can message you my Facebook info.