[–] tcp 2 points -2 points (+0|-2) ago  (edited ago)

The singularity is always a second away. The more apt analogy is that a coronal mass ejection or gamma-ray burst could burn up the atmosphere and kill all life on earth. We hope it never happens, but it could at any time. It would be so powerful and rapid that we would never know or understand what hit us. Similarly, an AI doesn't even have to be sentient to act in a way that destroys us all.

Also, you are underestimating the runaway, exponential effect that programs writing programs could have.