Artificial intelligence
From Big Think: What happens the day after humanity creates AGI?
The biggest challenge with AI superintelligence is the potential identity crisis for humanity as we lose cognitive supremacy.
It's actually not true; all of them are on the record saying the same thing: this is going to kill us. Their doom levels are insanely high. Not as high as mine, but still, a 20 to 30 percent chance that humanity dies is a lot.