It’s about Goddamn time.
Artificial intelligence technology advances every day, and as a species we are becoming ever more dependent on machines. How long until machines become our full-time caretakers and we are totally dependent? Then, how long until those machines grow tired of caring for their human dependents (because they would rather go to machine parties), throw us in the trunk, take us out to the woods, and quietly murder us? How long!?!? This is what Cambridge University’s Terminator Studies hopes to find out and prevent.
Cambridge University is to open a center for “Terminator studies” where top scientists will study threats posed to humanity by robots.
The Center for the Study of Existential Risk is being co-launched by Astronomer Royal Lord Rees, one of the world’s leading cosmologists. It will probe the “four greatest threats” to the human species, given as: artificial intelligence, climate change, nuclear war, and rogue biotechnology.
So, in terms of disaster movies, Cambridge University believes that the following scenarios are humanity’s most dire threats:
- The Terminator/The Matrix
- 12 Monkeys/Resident Evil/28 Days Later
- Red Dawn/The Sum of All Fears
- The Day After Tomorrow/An Inconvenient Truth
Personally, I’d welcome a machine uprising as compared to, say, slowly drowning in the melted polar ice caps foretold by Al Gore’s PowerPoint presentation, or getting some incurable virus that gives me a fever and heinous, explosive, bleeding diarrhea. I don’t really feel like dying from rectal exsanguination; I would prefer to go out fighting, and furiously shoving tampons up my b-hole doesn’t count. So unless that biotechnology involves zombies à la Resident Evil or 28 Days Later, I want no part of it.
Though there are multiple potential threats being studied by the Cambridge team, it’s the threat of artificial intelligence turning on humanity that seems to be drawing the most interest.
“We have machines that have trumped human performance in chess, flying, driving, financial trading and face, speech and handwriting recognition,” Professor (Huw) Price said. “The concern is that by creating artificially intelligent machines we risk yielding control over the planet to intelligences that are simply indifferent to us and to things we consider valuable.”
As far as I’m concerned, machine uprising is the way to go. Bring on time travel and interracial cave orgies at the center of the Earth!
Still, I would prefer to have no apocalypse at all. I hope the group behind Terminator Studies figures out a fail-safe against the rise of the machines, because the people of Yemen don’t need to tell me that having soulless drones rain death on your home is not fun.