Considering that the whole idea of the AGI singularity was the exponential function going straight up, I don’t think this person understands the problem. Lol, LMAO foomed the scorpion.
(Also that is some gross weird eugenics shit).
E: also isn’t IQ a number that gets regraded every now and then, with a common upper bound of 160? I know the whole post is intended more as vague eugenics aspiration, but still.
Anyway, time to start the lucrative field of HighIQHuman safety research. What do we do if the eugenics superhuman’s goals don’t align with humanity’s?
Smh, why do I feel like I understand the theology of their dumb cult better than its own adherents? If you believe that one day AI will foom into a 10 trillion IQ super-being, then it makes no difference at all whether your AI safety researcher has 200 IQ or spends their days eating rocks like the average LW user.