Bill Joy’s famous essay, “Why the Future Doesn’t Need Us,” is a great place to start worrying about all this.
So far, what worries me most is the complete de-emotionalization of humans. World destruction is bad, but I expect people will find solutions to that, since I can’t imagine many people wanting it. If, however, people build unemotional, thinking, sentient beings, those beings might just get rid of us with their smart little gray cells.
No, the problem is: is love really intrinsically good? What if all the usual movie endings don’t apply? What if the bad guy survives because all the things we treasure (love, care, family, freedom, peace, equality, guilt, passion, honesty, and so on) are, in reality, vulnerabilities? They were probably produced by evolution, since they helped humans bond together to fight; but if computers become seriously sentient and powerful, none of those qualities would help us. Except in movies. Pick a random person off the street who really likes math but never pursued it, choosing journalism instead, and a world-weary mathematician who has published enough papers to earn an Erdős number under 0.1 (fractional Erdős numbers are a generalization that works rather like combining resistances in a resistor network) and who now finds all papers disgusting (a bad example, but bear with me). You can bet the second guy will win any mathematical competition, if we motivate him sufficiently.
So people have to tread carefully here. Computers needn’t be sentient to cause serious damage. Suppose somebody invents a foldable, nanotechnological iElevator. The bug where, if you unlock the screen so many times and tap two buttons together while holding the device at 45 degrees to the vertical, the thing suddenly shuts itself down and discharges its whole battery will still be very rare; on an iPhone, nobody cares about something that happens one time in a million. With the iElevator, though, it could cost lives.
Of course, if we really get that far, we’ll already have uploads. But the point is that technology shouldn’t be blindly relied on. We’re creating things that are SMARTER than us and giving them MORE POWER than we retain ourselves. Then again, humans are fallible too, so who knows?
The Singularity will pretty much defy all our intuitions, just as death should. Well, that’s awful.