It may sound incredible, and indeed it does, precisely because the human mind tends to make cognitive errors that distort our understanding of reality.

Did you know that you are more likely to be killed by a meteorite than by lightning, and that your chances of dying from a lightning strike are about four times higher than your chances of dying in a terrorist attack?
In other words, you should be far more worried about meteorites than about ISIS or al-Qaeda (for now).
This calculation is based on a study by the influential Stern Review on the Economics of Climate Change, a UK government report that describes climate change as "the greatest and widest-ranging market failure ever seen."
To make the case that climate change should be a priority, the Stern Review assumes a probability of human extinction of 0.1 percent per year.
This number may seem small, but over the course of a century it adds up to roughly a 9.5 percent probability that the human race goes extinct.
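For those who want to check the arithmetic: assuming the 0.1 percent annual risk is independent from one year to the next, the probability of surviving 100 such years is 0.999^100, or about 0.905, which leaves a cumulative extinction probability of 1 − 0.905 ≈ 0.095, or roughly 9.5 percent.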
For example, in 2008, a think tank put the probability of human extinction during this century at 19 percent.
And the co-founder of the Centre for the Study of Existential Risk, Sir Martin Rees, suggests that civilization has a 50 percent chance of surviving this century, essentially a coin flip.
How is it that the probability of dying in a global disaster is higher than that of dying in a car accident?
Admittedly, these estimates are somewhat dubious.
While some risks, such as asteroid impacts and supervolcanic eruptions, can be estimated from objective historical data, risks associated with future technologies require a good deal of speculation.
There are three broad categories of "existential risk," that is, scenarios that could cause our extinction or catapult us permanently back to the Stone Age.
The first includes natural hazards such as asteroids and comets, volcanic eruptions, global epidemics and even supernovae.
These form our cosmic background risk and, as noted above, some of them are relatively easy to estimate.
As you may remember from elementary school, a celestial killer, possibly a comet, crashed into the Yucatán 66 million years ago and wiped out nearly all the dinosaurs.
And roughly 75,000 years ago, a supervolcano in Indonesia caused the Toba catastrophe, which, according to some scientists, drastically reduced the human population, although the issue is controversial.
Although the "fear factor" of pandemics is lower than that of wars and terrorist attacks, these events have caused some of the most significant mass deaths in human history.
For example, the Spanish flu of 1918 killed about 3 percent of the human population (though some estimates put the figure at twice that), infecting a third of all humans between 1918 and 1920.
In absolute terms, it buried some 33 million more people than the bayonets, bullets, and bombs of World War I, which lasted from 1914 to 1918.
And the Black Death of the fourteenth century, caused by the bubonic plague, is thought to have claimed as many victims as the Second World War, the First World War, the Crusades, the Mongol conquests, the Russian Civil War, and the Thirty Years' War combined.
(Take note, anti-vaxxers!)
Patients during the epidemic of 1918 in Iowa.
Image: Office of the Public Health Service Historian
The second category of existential risk involves advanced technologies, which could cause unprecedented damage through "error or terror."
Historically speaking, humanity created the first anthropogenic risk in 1945, when the US government detonated an atomic bomb in the New Mexico desert.
Since that event, humanity has lived with the constant threat of a nuclear holocaust, which led a group of physicists to create the Doomsday Clock, a clock that metaphorically represents our proximity to disaster.
While nuclear tensions peaked during the Cold War, when President Kennedy at one point estimated the probability of nuclear war at "somewhere between one in three and even," the situation has improved significantly since the fall of the Iron Curtain.
Unfortunately, US-Russian relations have deteriorated recently, leading Russian Prime Minister Dmitry Medvedev to suggest that "we're back to the Cold War."
As we write, the Doomsday Clock stands at just three minutes to midnight; it has been closer to the end of the world only once since its creation in 1947.
While nuclear weapons currently constitute the greatest risk to human survival, they may turn out to be the least of our problems.
Why?
Because of the risks associated with rapidly advancing fields such as biotechnology, synthetic biology, and nanotechnology.
The key point to understand is that these fields are not only becoming exponentially more powerful, but their products are becoming more and more affordable for groups and individuals.
For example, thanks to CRISPR, it is increasingly possible for inexperienced individuals to manipulate genetic code.
DNA material can also be ordered from commercial suppliers, as journalists at the Guardian discovered in 2006, when they purchased "part of the smallpox genome through a mail order."
In addition, anyone with an internet connection can access a database with sequences of pathogens such as Ebola.
We are still far from being able to program the DNA of organisms the way we program software.
But if these trends continue (as seems likely), future terrorists and lone wolves will certainly have the ability to bring to life pathogens with global reach, perhaps more devastating than anything seen before.
In the field of nanotechnology, the most commonly cited risk scenario involves tiny self-replicating machines, or nanobots, programmed to disassemble any material they come into contact with and rearrange its atoms into exact replicas of themselves.
The resulting nanobot clones would in turn convert all the matter around them into further copies.
Because of this runaway replication, the entire biosphere could be turned into a swarm of nanobots in a relatively short period of time.
Alternatively, a terrorist could design nanobots to selectively destroy organisms with a specific genetic makeup.
An eco-terrorist who wanted to wipe humanity from the planet without damaging the global ecosystem could, in principle, create nanobots that attack only Homo sapiens.
Perhaps the biggest threat to the long-term future of humanity is artificial superintelligence.
Instilling values that promote human well-being in a super-intelligent machine could be very difficult.
For example, a superintelligence whose goal is to eliminate sadness from the world could simply exterminate Homo sapiens, because people who do not exist cannot be sad.
Or a superintelligence whose purpose is to help humans solve our energy crisis could inadvertently destroy us by covering the entire planet with solar panels.
The point is that there is a crucial difference between "do what I say" and "do what I intend," and figuring out how to program a machine to follow the second rule poses very different challenges.
A fire in a national park in Florida.
Image: Josh O'Connor / USFWS
This brings us to the last category of risk, which includes anthropogenic disasters such as climate change and biodiversity loss.
While neither would lead directly to extinction, both are powerful "conflict multipliers" that would push civilization to its limits and increase the likelihood of advanced technologies being misused.
In other words: is a nuclear war more or less likely in a world of extreme weather events, droughts, mass migration, and social and political instability?
Is an eco-terrorist attack involving nanotechnology more or less likely in a world of environmental degradation?
Climate change and biodiversity loss are destined to exacerbate current geopolitical tensions and conflicts between state and non-state actors.
Not only is this worrying in itself, but it could become disastrous as technology advances.
Considerations like these have led experts to rather pessimistic estimates about the future.
The fact is that in this era there are many more ways for our species to go extinct than in the past, and dying in a global catastrophe is far more likely than dying in a car accident.
There is also reason to believe that the threat of terrorism will grow in the coming decades, owing to the effects of environmental degradation, the democratization of technology, and the rise of religious extremism in some regions.
But that is not the whole story.
There are also reasons to be optimistic.
Though the existential risks are enormous, none of them is insurmountable.
Humanity also has the ability to overcome obstacles.
For example, technology can mitigate risks related to natural disasters.
An asteroid that threatens Earth can (hopefully) be destroyed with an atomic bomb.
Space colonization and underground bunkers may enable mankind to survive asteroid impacts and supervolcanic eruptions.
As for pandemics, recent outbreaks such as Ebola and SARS have shown that the international medical community can effectively curb the spread of pathogens that would otherwise cause a global disaster.
Other risks, such as climate change and biodiversity loss, can be limited by slowing population growth, promoting renewable energy, and preserving natural habitats.
The cosmos is a place of endless threats to humans.
And even though our extraordinary success as a species has greatly improved our condition, it has also introduced a series of new threats.
However, there are concrete actions mankind can take to mitigate the risk of catastrophe.
As many experts have affirmed, the future is full of hope, but to achieve it we need to take the dangers that surround us seriously.
Phil Torres is an author, works with the Future of Life Institute, and is the founder of the X-Risks Institute.
His latest book is called The End: What Science and Religion Tell Us About the Apocalypse.

From Vice