>... the risk of a superintelligence taking over the world no longer looks distant and abstract.
Can we please stop floating this as a threat? This is more science fiction than reality at this point, and it does a great disservice to humanity. The more we push the idea that the AI itself is the threat, rather than the people controlling it, the less focused we will be on mitigating global risk.
It is far more likely that someone will leverage an AI to expand their influence or dominion. Putin has essentially already stated his views on this matter, and we should assume that groups within every sufficiently advanced nation are working toward this end, either independently or cooperatively.
So once again, humans are the dangerous part. Clearly, if we didn't have destructive tendencies in our psyche, tendencies that end up in the data we use to train these models, we wouldn't build things that are interested in destruction.
Interesting.
I don't think we're as intelligent as we believe we are, which is why I doubt we will ever actually build a superintelligence; we're too stupid. Even something 10x smarter than us may still be quite "stupid".
We are more than likely in an arms race now.