Hacker News

Before nuclear weapons were built, the only examples of their large-scale destruction were in fiction. The World Set Free by H.G. Wells is the main example I can think of: https://en.wikipedia.org/wiki/The_World_Set_Free

It got some of the details wrong (e.g., the atomic bombs exploded over a long period of time rather than instantly), but the consequences were fairly accurate.

If you wait until all the actual details and examples play out in reality, it's too late to have much insight, especially for something capable of self-replication like a virus or artificial intelligence. The nature of the problem REQUIRES anticipation rather than mere reaction.



This is survivorship bias. How many weapons of mass destruction have been foretold by science fiction that have never been invented? More generally, how much technology has been written about in science fiction that we have only the faintest hope of ever achieving?


Maybe one in five, depending on how you define it?

Look, it comes down to this:

Is there something innate about our intelligence that makes it impossible to match except through human brains? That strikes me as akin to spiritualism. The answer is almost certainly "no." We're not talking about warping space through some hypothetical state of matter with laws of physics different from our own. We're talking about at least human-level intelligence, something we already have an existence proof of (us).

And humans are the most dangerous force on the planet. No animal (besides maybe microscopic organisms) stands a chance against a group of determined humans. We're nearly unstoppable due to our intelligence. Are we so unique? Is it so impossible that machines could someday (perhaps in our lifetimes) be built that achieve human-level intelligence? Considering the computing advances already achieved, it is clearly a real possibility if not a certainty.

Human-level intelligence is perhaps the most powerful (and thus most dangerous) thing on this planet. Building a machine that is at least as intelligent (and perhaps far more so) could clearly be incredibly dangerous, and we know such a thing is physically possible, unlike many things in science fiction.



