
Omnicide

Synopsis

The philosopher John Sommerville coined the term omnicide. With the nuclear threat fresh in his mind, he sought a term that went beyond war or conflict, which he felt did not suffice; nor would suicide, genocide, or infanticide, since a nuclear war would mean the killing of all humans. Sommerville stated ‘[…] since nuclear weapons can now kill all human beings and obliterate all human creations in one relatively brief conflict, it seems appropriate to call such a conflict omnicide’ (Sommerville 1985).

Bouttell and Freyberg-Inan conceptualise omnicide ‘[…] as human extinction caused by human action, omnicidal threats as the particular threats existing to the species’ survival, and omnicidal risk as the latent danger constituted by these threats. [...] Omnicide is a subcategory of human extinction wherein human activity is a necessary condition for the existence of the threat and level of risk. Omnicidal threats are, by definition, anthropogenic, and thus politically relevant in the most minimal and general sense of the term. These threats and the general state of risk they pose can be managed and minimized through collective human action, and omnicide is thus a political issue and a potential concern for political realists’ (Bouttell & Freyberg-Inan 2024).

From the field of existential risk studies, in which omnicide and anthropogenic existential risks are central concerns, Bostrom argues that ‘an existential risk is one that threatens to cause the extinction of Earth-originating intelligent life or to reduce its quality of life (compared to what would otherwise have been possible) permanently and drastically’ (Bostrom 2002). For him, existential risks can arise from anthropogenic causes: ‘[…] human civilization is introducing many novel phenomena into the world, ranging from nuclear weapons to designer pathogens to high energy particle colliders. The most severe existential risks of this century derive from expected technological developments. Advances in biotechnology might make it possible to design new viruses that combine the easy contagion and mutability of the influenza virus with the lethality of HIV. Molecular nanotechnology might make it possible to create weapons systems with a destructive power dwarfing that of both thermonuclear bombs and biowarfare agents. Superintelligent machines might be built and their actions could determine the future of humanity – and whether there will be one. […]’ (Bostrom 2002).
