
March 08 2012

[...]

One possible strategic response to human-created risks is the slowing or halting of our technological evolution, but you have been a critic of that view, arguing that the permanent failure to develop advanced technology would itself constitute an existential risk. Why is that?
Bostrom:

Well, again I think the definition of an existential risk goes beyond just extinction, in that it also includes the permanent destruction of our potential for desirable future development. Our permanent failure to develop the sort of technologies that would fundamentally improve the quality of human life would count as an existential catastrophe. I think there are vastly better ways of being than we humans can currently reach and experience. We have fundamental biological limitations, which limit the kinds of values that we can instantiate in our life---our lifespans are limited, our cognitive abilities are limited, our emotional constitution is such that even under very good conditions we might not be completely happy. And even at the more mundane level, the world today contains a lot of avoidable misery and suffering and poverty and disease, and I think the world could be a lot better, both in the transhuman way, but also in this more economic way. The failure to ever realize those much better modes of being would count as an existential risk if it were permanent.

Another reason I haven't emphasized or advocated the retardation of technological progress as a means of mitigating existential risk is that it's a very hard lever to pull. There are so many strong forces pushing for scientific and technological progress in so many different domains---there are economic pressures, there is curiosity, there are all kinds of institutions and individuals that are invested in technology---so shutting it down is a very hard thing to do.

What technology, or potential technology, worries you the most?
Bostrom:

Well, I can mention a few. In the nearer term I think various developments in biotechnology and synthetic biology are quite disconcerting. We are gaining the ability to create designer pathogens, and there are these blueprints of various disease organisms that are in the public domain---you can download the gene sequence for smallpox or the 1918 flu virus from the Internet. So far the ordinary person will only have a digital representation of it on their computer screen, but we're also developing better and better DNA synthesis machines, which are machines that can take one of these digital blueprints as an input, and then print out the actual RNA string or DNA string. Soon they will become powerful enough that they can actually print out these kinds of viruses. So already there you have a kind of predictable risk, and then once you can start modifying these organisms in certain kinds of ways, there is a whole additional frontier of danger that you can foresee.

[...]
We're Underestimating the Risk of Human Extinction | Ross Andersen - The Atlantic - 2012-03-06
