

ChatGPT’s got what intelligence craves… it’s got neurons
I actually think it’s part-and-parcel of Yarvin’s personality. As much as he rails against “the Cathedral,” PMCs, whatever, he himself is a perfect example of a pathological middle manager. Somebody who wants power without having to shoulder ultimate responsibility. He craves the childishly simplified social environment of a medieval-fantasy king’s court, but he doesn’t want to be the king himself. He wants to be (and has been, up until now) the scheming vizier who can run his manipulation games in the background, deciding who gets in front of the king but not having to take the heat if the king makes a bad decision. (And the “kings” he works for have made plenty of bad decisions, but consequences have only just begun to catch up.)
I suspect this newfound mainstream attention is far more uncomfortable than it is validating for him. Perhaps the NYT profile brought a burst of exhilaration, but the shine has worn off quickly. This tracks with the story last year about him coming back to Urbit as a “wartime CEO.” If Urbit is so damn important for building his ridiculous vision, why wasn’t he running it the whole time? He doesn’t actually want to be CEO of anything. Power without responsibility.
He will never stop to reflect that his “philosophy,” such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo
Obvious joke is obvious, but
The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned “into dust. Into quarks” with the coherence of a meth-addled squirrel.
Harvard isn’t already full of Quarks?
For my money, 2015/16 Adams trying to sell Trump as a “master persuader” while also desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he’s ever been. Once he switched from skimmable text blogging to livestreaming, though, he wanted to waste too much of my time to stay interesting.
Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they’re at best a student club that only aspires to be a proper curriculum. It’s surely no coincidence that they’re anchored in Berkeley, adjacent to the university’s famous student-led DeCal program.
FWIW, my capsule summary of TPOT/“post-rationalists” is that they’re people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.
I’ve been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me a Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity-gospel Christianity (https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion). She was wondering if it was “just another fad for these people,” and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back toward something more mainstream to keep growing their following.
I also prefer to highlight Kurzweil’s obsession with perpetual exponential growth curves as a central point. That’s often where I start when I’m explaining it all to somebody, since it provides the foundation for the bullshit towers that Yudkowsky and friends have erected. I also think that, long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than on Yudkowsky, because Kurzweil is better organized and professionally published. He’ll most likely be the main source in the lower-division undergraduate/AP high school history texts that treat this stuff as a background trend of the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately it will probably be interpreted by the specialized upper-division texts that grow out of people’s PhD theses.
Huh, 2 paradigm shifts is about what it takes to get my old Beetle up to freeway speed, maybe big Yud is onto something
It is what happened to look good in the valley between the Adderall comedown and yesterday evening’s edible really starting to hit
And the photos from a previous event are an ocean of whiteness. Hard to argue that they’re not, uh, cultivating a certain demographic…
I propose we pool funds, buy an old motel on the other side of the city limits in Oakland, and rename it Farthaven
Time magazine is, of course, now a property of Salesforce bobblehead Marc Benioff. So one wonders if there are editorial decisions being made at a high level, much like the Washington Post.
so an alternative, somewhat weaker fireball spell
On the other hand, bombing foreign data centers, likely located in densely populated urban areas, would be justified and morally upstanding if it seemed like they might be incarnating the imaginary computer god! I’m glad we have such a nuanced thinker guiding our modern morality.
For Yarvin, it always is and always will be someone else’s fault