Bistable multivibrator
Non-state actor
Tabs for AI indentation, spaces for AI alignment
410,757,864,530 DEAD COMPUTERS

  • 0 Posts
  • 17 Comments
Joined 2 years ago
Cake day: July 6th, 2023


  • Ooh, pants-on-head stupid semantics nonsense detected.

    First, the government needs to be run top-down from the Oval Office. This is why we call it the “executive” branch. “Executive” is a literal synonym of “monarchical”—from “mono,” meaning “one,” and “archy,” meaning “regime.” “Autocratic” is fine too. The “executive branch” is the “autocratic branch,” or should be if English is English. Libs: if these words don’t mean what they mean, what do they mean?

    Executive, as in pertaining to execution. Executing the duties of a government as defined by legislation. Where the fuck did Curtsy get the impression that “executive” is somehow synonymous with “monarchical”? Did he mix it up with “exclusive”? Even corporations often have multiple executives with different roles.

    Mr. Thiel, you have so much fucking money, couldn’t you afford a fascist philosopher-king who is more intelligent than this guy who thinks an “executioner” is a guy who makes you a dictator?


  • Those comments are tight, but really the problem with trying to explain any of this to laypeople isn’t exposing how wrong it is. The hard part is making any sense of it.

    Like if I told you Donald Trump has connections to a cult that believes grandmothers are a species of raspberry, whose goal is turning Denmark into cheese, and oh, a splinter group of theirs just murdered a police officer. That will just raise more questions than it answers. How the hell did they come to believe that? Why would anyone want that? And then I have to choose between looking like a loony conspiracy theorist, giving an impromptu lecture, or just deciding you actually probably don’t want to know.


  • I don’t think Yud is that hard to explain. He’s a science fiction fanboy who never let go of his adolescent delusions of grandeur. He was never successfully disabused of the notion that he’s always the smartest person in the room, and he didn’t pursue a high school, let alone college, education that could give him the expertise to recognize just how difficult his goal is. Blud thinks he’s gonna create a superhumanly intelligent machine when he struggles with basic programming tasks.

    He’s kinda comparable to Elon Musk in a way. Brain uploading and superhuman AI are sort of in the same “cool sci fi tech” category as Mars colonization, brain implants and vactrain gadgetbahns. It’s easy to forget that not too many years ago the public’s perception of Musk was very different. A lot of people saw him as a cool Tony Stark figure who was finally going to give us our damn flying cars.

    Yudkowsky is sometimes good at knowing just a bit more about things than his audience and making it seem like he knows a lot more than he does. The first time I started reading HPMoR I thought the author was an actual theoretical physicist or something, and when the story said I could learn everything Harry knows for free on this LessWrong site, I thought I could learn what it means for something to be “implied by the form of the quantum Hamiltonian” or what those “timeless formulations of quantum mechanics” were about. Instead it was just poorly paced essays on bog-standard logical fallacies and cognitive biases, explained using their weird homegrown terminology.

    Also, it’s really easy to be convinced of something when you really want to believe in it. I personally know some very smart and worldly people who have been way too impressed by ChatGPT. Convincing people in the San Francisco Bay Area that you’re about to invent Star Trek technology is basically the national pastime there.

    His fantasies of becoming immortal by having a God AI simulate his mind forever aren’t the weird part. Any imaginative 15-year-old computer nerd can have those fantasies. The weird parts are that he never grew out of them and that he managed to make some rich and influential contacts while holding on to his chuunibyō delusions.

    Anyone can become a cult leader through the power of buying into their own hype and infinite thielbux.



  • I listened to parts 2–4 today, and even with an above-average familiarity with rationalist weirdness it was a wild fucking ride.

    Kinda wild to have a community of vegan trans women whose most notable victims are a border patrol agent and a landlord, and who are still the baddies in this situation.

    The HPMOR/LW parts, and to a lesser extent bits of CFAR, were absolutely familiar (and all the more cringeworthy for it). I never quite realized just how much the CFAR events were obviously psychologically torturous cult shit.


  • Honestly, I’m really surprised to hear that IQ is not even a little bit heritable, given that IQ test performance correlates with level of education, which correlates with wealth, which is heritable.

    True, wealth is not genetic, but heritability has an interesting technical definition: roughly, the share of variation in a trait across a population that is statistically explained by genetic variation. That leads to some unintuitive cases of heritability and non-heritability. For instance, wearing earrings is heritable while having ears is not.
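
    To make that definition concrete, here’s a toy sketch in Python (all genotypes and rates are made up purely for illustration): heritability is estimated here as the share of a trait’s variance that genotype predicts, which is why a trait everyone has (ears) comes out at zero while a culturally driven trait that merely tracks a genotype (earrings tracking sex) comes out high.

```python
import random

random.seed(0)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Toy population with a binary genotype; the rates below are invented
# purely for illustration, not real data.
population = [random.choice(["XX", "XY"]) for _ in range(2000)]

# "Having ears": everyone has them, so the phenotype has zero variance,
# and zero heritability, even though ears are obviously genetic.
has_ears = [1] * len(population)
print(variance(has_ears))  # 0.0

# "Wearing earrings": culturally determined, but strongly associated
# with a genotype (sex), so the variance-ratio definition calls it heritable.
def wears_earrings(genotype):
    rate = 0.8 if genotype == "XX" else 0.05  # assumed rates
    return 1 if random.random() < rate else 0

earrings = [wears_earrings(g) for g in population]

# Crude heritability proxy: between-genotype variance / total variance.
group_mean = {
    g: sum(e for p, e in zip(population, earrings) if p == g)
       / population.count(g)
    for g in ("XX", "XY")
}
predicted = [group_mean[g] for g in population]
h2 = variance(predicted) / variance(earrings)
print(f"'heritability' of earring-wearing: {h2:.2f}")
```

    Note that nothing about earrings is encoded in DNA here; the ratio comes out high simply because the trait covaries with a genetic marker, which is exactly the unintuitive part of the definition.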