Oct 30, 2022
As I read somewhere, we have absolutely no idea what a machine with an IQ of 13,000 would want to do, especially if it's written its own code.
However, we know it would have a desire (for want of a better term) to fulfill whatever goal it's programmed for, because that's the only reason it exists. If achieving that goal most effectively has the side effect of wiping out humanity, there's no reason to suppose it wouldn't.
Sucks that we forgot about Asimov's Three Laws of Robotics.