HPMoR and the Limits of Rationality
One woman's quest to understand the Harry Potter fanfic that created the modern world.
Ep. 2: Wingardium Leviscrewyou
In chapter 2, a woman turns into a cat. Anthea and Regina discuss the pros and cons of reasoning from first principles, the collision of the profound and the mundane, and a cool Bavarian.
Citations and further reading
- Harry Potter and the Methods of Rationality, chapter 2: https://hpmor.com/chapter/2
- HPMoR and Science: https://hpmor.com/science/
- “Rationality Quotes 1,” Eliezer Yudkowsky. LessWrong. 15 January 2008. https://www.lesswrong.com/posts/LiDk2XMBFmRiju4x3/rationality-quotes-1
- “Interstitial Comments on Dawkins,” Mencius Moldbug. Unqualified Reservations. 11 October 2007. https://www.unqualified-reservations.org/2007/10/interstitial-comments-on-dawkins/
- “The Anti-Reactionary FAQ,” Scott Alexander. Slate Star Codex. 20 October 2013. https://web.archive.org/web/20131126130345/http://slatestarcodex.com/2013/10/20/the-anti-reactionary-faq
- “Geeks for Monarchy: The Rise of the Neoreactionaries,” Klint Finley. TechCrunch. 22 November 2013. https://web.archive.org/web/20131124080402/https://techcrunch.com/2013/11/22/geeks-for-monarchy/
- “Mr. Jones Is Rather Concerned,” Mencius Moldbug. Unqualified Reservations. 28 November 2013. https://www.unqualified-reservations.org/2013/11/mr-jones-is-rather-concerned/
- “Symmetry for Dummies: Noether’s Theorem,” Will Sweatman. Hackaday.com. 14 June 2016. https://hackaday.com/2016/06/14/symmetry-for-dummies-noethers-theorem/
- “E. Noether's Discovery of the Deep Connection Between Symmetries and Conservation Laws,” Nina Byers. 23 September 1998. https://doi.org/10.48550/arXiv.physics/9807044
- “ELI5 What are Hamiltonian and Lagrangian Mechanics? What do they concern?” answer by u/CreatureOfPrometheus. Reddit. 4 May 2019. https://www.reddit.com/r/explainlikeimfive/comments/bkka3z/comment/emhm30p/
- “Unitarity,” Wikipedia. Accessed 29 March 2026. https://en.wikipedia.org/wiki/Unitarity
- “My Childhood Death Spiral,” Eliezer Yudkowsky. LessWrong. 14 September 2008. https://www.lesswrong.com/s/SXurf2mWFw8LX2mkG/p/uD9TDHPwQ5hx4CgaX
- “How Peter Thiel’s Relationship With Eliezer Yudkowsky Launched the AI Revolution,” Keach Hagey. Wired. 20 May 2025. https://web.archive.org/web/20260204143917/https://www.wired.com/story/book-excerpt-the-optimist-open-ai-sam-altman/
- “The Pleasure of Finding Things Out,” from The Pleasure of Finding Things Out, Richard Feynman. Original interview 1981, collected in 1999.
Intro and disclaimers
SPEAKER_01Let's do it for real. Hell yeah. Hell yeah. Hell yeah. Welcome back to HPMoR and the Limits of Rationality, HPMoR-LoR, a podcast where we try and understand formal logic, cognitive bias, and how many ways there are to have mixed feelings about Harry Potter. I'm your host, Anthea Carnes, and I'm joined today by my lovely sister, Regina. Hello. Hi. Great to be here. Yay! Thanks for being here. Yeah. I'm really glad that so many of my loved ones are willing to enable me in this.
SPEAKER_02I'm in. Of course.
unknownYeah.
SPEAKER_01Um uh tell us about yourself, Regina. No, okay, here's what here's what I want to know. Um, part of the reason I was excited to to have you on here is uh is is your history with uh with rationalism. So can you tell me a little about like what what knowledge base are you coming into this with?
SPEAKER_02Yeah, uh so I feel like um you know back in the rationalist heyday, uh I like spent a lot of time sort of lurking at the edges of the community. Um I was mostly uh I was a big fan of uh the Slate Star Codex blog.
SPEAKER_01Uh and that's Scott. Scott Alexander. Uh because it's an anagram.
SPEAKER_02Yes. I figured I figured that out all on my own. And and after uh after the New York Times uh uh uh reportedly threatened to dox him uh and he shut that blog down and started a new one, he called it Astral Codex Ten. Uh okay, great. Great. And uh so like I was uh I I read that blog a lot, uh and like via that um I sort of got into some other like uh you know, this was a a period of my life where I was you know interested in the idea of figuring out ways to make my brain work better. Uh I mean that describes most periods of my life before I got uh diagnosed with ADHD.
SPEAKER_01Uh sure, yes. And and I believe describes both of our continuing. Indeed. Yes. Yeah, like we're both like, oh my god, this thing doesn't work very well.
SPEAKER_02Yeah, yeah. I mean, it's uh like alcohol, it is the cause of and solution to all of life's problems. Yes.
SPEAKER_01Yes. Uh brains? Yeah. Yeah.
SPEAKER_02Yeah.
SPEAKER_01Yeah. Yep.
SPEAKER_02And um so the rationalist community purported to help with that. And so I, you know, it and I thought it had some good ideas. I like the the the concept of working out what sort of biases uh your brain is prone to, and you know, approaching things from first principles to try and avoid that is appealing. It makes sense if it makes sense if approached from first principles. Right. I think the primary uh objection to it is as you pointed out last episode, it's hella tedious. Yes. Uh you can't use that to do everything without basically just spending your entire life working things out from first principles and never actually getting anything done.
SPEAKER_01Right. Right. And some people have the time to do that, I guess, but the rest of us need heuristics.
SPEAKER_02Right. I mean, I suppose if that's your hobby.
SPEAKER_01Sure, yeah, yeah. And clearly it is for some people.
SPEAKER_02Yes.
SPEAKER_01Yeah.
SPEAKER_02So yeah, so I sort of hung around I I never uh found uh Eliezer Yudkowsky himself particularly readable. I did not read the Sequences.
SPEAKER_01Uh uh And you haven't read MoR? I haven't read HPMoR.
SPEAKER_02Um so like I mostly uh and and I I don't know, his vibe does not resonate with me. Sure. Uh so yeah, I was mostly sort of you know poking around the edges of the community. Um I went to a Slate Star Codex meetup at one point. Uh it was it was fine. It was a bunch of nerds.
SPEAKER_01I yes, yeah. I assume this was when you were working in adjacent to tech in California.
SPEAKER_02Uh no, this was in Seattle.
SPEAKER_01That tracks. That also tracks.
SPEAKER_02I mean, adjacent to tech in California, adjacent to academia in Seattle, like Oh, yeah.
SPEAKER_01Yeah, that makes sense.
SPEAKER_02Yeah. So yeah, that's kind of my my history with rationality. Uh and you know, I I never I probably never would have called myself a rationalist, sure um, but I sort of identified enough with the ideas that uh I found myself bristling a little uh listening to episode one. Uh like, hey, I don't know about that. I I think that's a you know, I don't think you're giving the rationalists enough credit. Uh you know, I I think you were for the most part fair, but Sure.
SPEAKER_01Well, I so this is I d uh I I feel like I'm gonna come back to this over and over again, which is that I think of and this is gonna segue into uh this this intro screed that I've written before we get to the good part, um in some ways, but uh I I think of of rationality as um I I I think of rationality as like a a com as a subculture and a community and like a uh uh an edifice, not not the sort of not the methods of rationality, but like the the community and edifice that is built up around it, in a very similar way that I think of Catholicism, because we both grew up nominally Catholic. Um and there are a lot of really good things in Catholicism. There's a lot of parts of my own uh of my personality and my ideal and my worldview and my ideology that are informed by Catholicism. And also there are bad actors who use it to do all kinds of weird shit. Right. Right? Yeah. Um and you can and it there are parts of Catholicism that interact really poorly with my brain chemistry. Speaking of brains being bad, um, and I think that we can see ways in which rational rationalism and rationality uh with a capital R interact really poorly with some people's brain chemistry and have bad outcomes.
SPEAKER_02Right. Uh see, the Zizians.
SPEAKER_01Yeah, exactly. But it's not really the the underlying principles are fine. Right. There's nothing wrong with the underlying principle. I've been I, you know, I I part of why I'm doing all of this is that I'm like, hey, Yudkowsky says I can learn everything that the main character knows. Uh sure, let's let's put that to the test. Right. Um and and I have found myself, you know, coming across some really interesting ideas and really interesting, you know, uh science about cognitive bias and um and all that kind of stuff. Uh and also some stuff where I'm like, that's silly.
SPEAKER_02Right. I I I mean I think, and perhaps you're going to get into this, um, the other major failure mode of the rationalist approach, aside from it being tedious, is that when you are approaching things from first principles, um there are uh you know a number of common failure modes that you are going to get into. Sort of like when I was in high school, I used to look at sort of interpersonal interactions and be like, why can't people just, you know, be honest with each other and say what they mean? And uh and then as I get older I realize, oh, there are subtle reasons why things may run more smoothly if people are not exactly honest with each other and say what they mean, not least of which is just that language cannot fully describe our, you know, feelings and opinions.
SPEAKER_00Yeah.
SPEAKER_02Uh yeah. And so that sort of rationalism is prone to that kind of failure mode where you get something wrong, you are deliberately ignoring the sort of mechanisms in your brain that may stop these like uh chains of logic from going to weird extremes. Uh and then the chains of logic go to weird extremes, and you're like, well, I logiced it, so it must be true.
SPEAKER_01There's a like a like some of these chains of logic have like a runaway state.
SPEAKER_02Yeah, yeah, exactly.
SPEAKER_01And then your phone blows up.
SPEAKER_02Right.
SPEAKER_01Yeah, yeah. Yeah. So yeah, so I totally I I I totally hear you on like having a little bit of a hey, um, I own that. I own that I'm gonna I'm gonna provoke that. I mean and as well you should. Thank you. That should own it or should provoke it. Both. Thank you. Good. Speaking of provoking things, before we get too far in, um I uh I did not get any pushback on the last episode about, hey, you're being mean to rationalists. What I did get pushback on um was uh connecting was bringing up Eliezer Yudkowsky uh in conjunction with Curtis Yarvin, aka Mencius Moldbug. So now you have to sit here while I read this.
SPEAKER_02All right.
SPEAKER_01Um I mean like interrupt and and and chime in if you're like, oh, I have a thought about that. But um okay, so I got some pushback on on connecting these two. And you listened to the last episode, so you're you're like broadly familiar with Mencius Moldbug.
SPEAKER_02I mean, I I I uh was already broadly familiar with Mencius Moldbug uh and um uh I remember reading you know various uh explanations of why he was wrong, first at Slate Star Codex.
SPEAKER_01Yeah, Scott Alexander was one of the people who was like, hey, this is a problem.
SPEAKER_02Right. And and then when his name came back up in national politics, I was like, that guy?
SPEAKER_01Yeah, that guy. That guy. Uh so there we go. Um so yeah, so uh I got some pushback on on bringing bringing up uh Eliezer Yudkowsky and Mencius Moldbug, a man who I consider uh responsible for uh uh no small part of our current fascist dystopia. Um and uh, you know, that's a that's a fair pushback because Yudkowsky does have explicit disdain for Mencius Moldbug. Um and uh I did a little I after I got this pushback, I dug into it a little bit more and discovered that Moldbug is equally dismissive of Yudkowsky, so it would be very unfair for me to describe them as philosophical colleagues or anything like that, which thankfully I didn't. Uh I went back and I listened and I described them as adjacent and not friends, which I believe are both accurate.
SPEAKER_02Yeah, I I think that you Yeah, you did, you know, uh uh make some disclaimers about them not uh being ideologically aligned, I think.
SPEAKER_01Yeah, but let's dig into that a little bit more. Like, why if they're not ideologically aligned, why did I bring them up? Why did I bring them up at all? Like, uh, you know, okay, so they're adjacent. Yudkowsky is also adjacent to self-help, and I didn't bring up Tony Robbins, who is also an objectionable person. Um, so why did I bring up Moldbug? Um Okay, thing one, they've literally been in conversation with each other. Uh according to Moldbug, they have met and talked. Um back in 2008, Yudkowsky included a quote from Moldbug in a post titled Rationality Quotes 1, alongside Sam Harris and Steve Jobs. Um, so people that he clearly thought were were, you know, smart, interesting people with interesting things to say. Um also included in that quote roundup, um, something from Neil Gaiman, I think Anansi Boys, uh, and Jacqueline Carey of Kushiel's Dart, which is okay, you're not familiar with that one. It means a lot to the people who it means something to.
SPEAKER_02I mean, I I like I I'm I'm vaguely familiar with that book series and actually tried to read it at one point but bounced off of it for whatever reason.
SPEAKER_01Yeah, fair enough. Um so in 2007, Moldbug called Yudkowsky, quote, a really good philosopher, end quote, in a post titled Interstitial Comments on Dawkins, because he really hates he's got a real hate on for Richard Dawkins. Um to be fair, he called him a really good philosopher in the context of taking Yudkowsky to task for oversimplifying the world into Bayesian reasoning. But nevertheless, he took Yudkowsky seriously enough that he was like, I gotta, you know, deal with this guy's ideas, right? Uh fast forward to 2012-2013, when, as mentioned, Scott Alexander of Slate Star Codex started drawing attention to neo-reaction as a dangerous phenomenon. And at this, this is the point where we get Yudkowsky describing Moldbug as mind-killed by politics. Moldbug, in a contemporaneous post, slyly called Yudkowsky a cult leader, um, putting me in the position of agreeing with both of them. Yeah. You know, broken oligarchic clocks.
SPEAKER_02Right. I mean, it is uh unfortunately the case that terrible people can, in addition to their terrible ideas, have correct ideas. Indeed. You know, the whole Hitler was a vegetarian and loved puppies sort of thing.
SPEAKER_01Right, yeah, yeah. Uh so point being, as of about a decade ago, their paths had clearly diverged. Um and I get the impression that Yudkowsky is pretty defensive about rationalism being associated in any way with the neoreactionaries. Um I mentioned in the first episode he felt strongly enough about this to edit somebody else's post to make it clear that Mencius Moldbug is not popular on LessWrong. Um Moldbug quoted Yudkowsky in a 2013 post, quote, Eliezer Yudkowsky of the Machine Intelligence Research Institute here. More Right is not any kind of acknowledged offspring of LessWrong, nor is it so much as linked to by the LessWrong site. We are not part of a neo-reactionary conspiracy. We are and have been explicitly pro-Enlightenment, as such, under that name. Should it be the case that any neoreactionary is citing me as a supporter of their ideas, I was never asked and never gave my consent. Also, to be clear, I try not to dismiss ideas out of hand due to fear of uh public unpopularity. However, I found Scott Alexander's takedown of neo-reaction convincing, and thus I shrugged and didn't bother to investigate further. End quote. Uh Moldbug's citation for this is the comments on a uh excellent TechCrunch article by Klint Finley, Geeks for Monarchy. Um to my distress, the Wayback Machine does not populate comments on this, uh, so I'm not able to like independently verify Moldbug's citation. Um but uh while I think that he's a Gish galloper, I don't I don't see any reason to think that he's lying about this. I, you know, I'm not inclined to take anything he says at face value, but this seems fine. Like um so uh yeah, so so Yudkowsky really wants to make it clear that he's not associated with this this movement, and that is fair enough. Um so I want to state again that I don't think Eliezer Yudkowsky is an authoritarian, and he's definitely not a neo-reactionary, and I actually don't know what his political affiliations are at all at this point. 
I know that recently he sat down with Bernie Sanders to talk about AI. Like, so that's cool. Good for him.
SPEAKER_02Like, I also don't know his political affiliations. I I suspect that m many rationalists of his ilk would disclaim the idea of political affiliation.
SPEAKER_01I I think a lot of tech dudes like this are very hard to categorize politically. Um in some cases deliberately uh and consciously, and in some cases not.
SPEAKER_02I mean it it follows from the rationalist premise that you should evaluate each political idea on its own merits and you know choose where you stand rather than signing on to the whole slate of you know Democratic or Republican positions.
SPEAKER_01Right. Totally, totally. So Yudkowsky is not affiliated with Mencius Moldbug, aka Curtis Yarvin, he's not associated with that philosophy. If I implied that Yudkowsky's work and philosophy led to DOGE or other overreaches of the current administration, I overstated that case, mea culpa. Um, guilt by association is a messy business, and I understand the desire of both Yudkowsky and the people who like him to avoid being associated with a dude who has direct influence over the way the current presidential administration is conducting itself. So, that being said, I do find myself morbidly intrigued by the way the ghost of neo-reaction is haunting the machine of rationalism. Like, and more generally, I'm I'm intrigued by the way that rationalism runs alongside so much right-wing tech weirdness. Um and that's why I, uh, tongue-in-cheek, call HPMoR the fanfic that created the modern world. Because if this is theoretically uh a work that will let me uh get into the methods of rationality, and those methods of rationality intersect in various places with this parallel line of not parallel, clearly, because they intersect. I know math. Um if it intersects in places with this line of uh right-wing tech weirdness, um I I want to understand why and how it's doing that, if I can.
SPEAKER_02So the defensive rationalist in me says like the methods of rationality are not what led to the right-wing tech weirdness. They are much in the way that these many of these right-wing weirdos co-opt Christianity in order to justify their terrible ideas. Yeah. Uh they also co-opt uh these ideas of you know, pure rational philosophy because it is a nice way to launder your bullshit.
SPEAKER_01See, that's interesting, because I d I don't see that's really interesting. Uh and I and I think you are uh correct in some cases, right? I don't necessarily see rationality being co-opted. What I think I see is that rationality ends up in this place about the singularity and and AI um and uh and related ideas of like transhumanism and anti-death uh uh ideas, and those tech weirdos have those same ideas. So maybe it's not so much that like rationality is like being co-opted, but like that all these things are coming out of a similar place. Like on the evolutionary tree of these political ideologies, up here there's a common ancestor that leads both to um Yudkowsky's AI doomerism and uh uh Elon Musk and Sam Altman's AI boomerism.
SPEAKER_02Which are really quite closely related. You know, they're both saying AI's gonna take over the world, they just disagree about whether that's good or not.
SPEAKER_01Yeah, absolutely.
SPEAKER_02Yeah. And I think there's also the interesting issue that rationality, by definition, cannot have dogmas. It can only have methods. And so if someone uses your methods to reach abhorrent conclusions, then you can say I disagree, but you can't necessarily say they didn't use rationalism to do that.
SPEAKER_01Interesting. I feel like I'm defending. I feel like I'm about to get defensive of rationality now because I'm like, but if but like logic should always lead to the same conclusions, shouldn't it?
SPEAKER_02Only if you start from the same premises.
SPEAKER_01Only if you start from the same premises. Fair enough. Fair enough. And that is That's the problem. Yeah, yeah. That is the problem.
SPEAKER_02You're starting from first principles, but most uh questions of society and politics and what have you do not have nice provable first principles the way that physics does. Right. Uh for a physics problem, I can give you first principles that are both, you know, observable to like five nines of accuracy and uh follow from basic mathematical rules about the universe.
SPEAKER_00Right.
SPEAKER_02In politics and anthropology and governing and all of that stuff, we do not have such solid ground to stand on.
SPEAKER_01Right, right. Right. But I think that a lot of rationalists think that we do.
unknownYeah.
SPEAKER_01I think Yudkowski thinks that we do.
SPEAKER_02Well, uh or at least I think that they are uh they have developed a certain amount of hubris from their ability to s solve problems in engineering. And uh you know, they may admit that the premises of uh, you know, sociology are not as solid as those of physics or engineering, but I think that they don't fully uh account for just how uncertain a lot of those basic ideas are.
SPEAKER_01Right, right. There's so many yeah, there's so many confounding factors that we're well outside just assume a spherical cow territory here. So Fantastic. Uh I love this. Um reading these guys makes me overly wordy and think really, really hard. Uh and that's why we're here. Um but I think the point of all of this was I've just disclaimed my opinions on these public figures enough to avoid defamation charges. Uh so let's uh let's get to the good part. This is a good part, let's be clear. But let's get to the Harry Potter part.
Chapter 2 close reading
SPEAKER_02Excellent.
SPEAKER_01Chapter 2. Everything I believe is false. Faces already. Faces already.
SPEAKER_02Uh if only faces could be captured for podcasting purposes.
SPEAKER_01Uh the epigraph for this chapter is of course it was my fault. There's no one else here who could be responsible for anything. Um last last episode I uh I speculated that the epigraph was supposed to be like an impressionistic haiku about the death of Lily and James Potter, and we got really sidetracked by that. Um uh apparently all of these epigraphs are actually quotes from the work itself, um, sometimes uh from like later in the work and sometimes from the chapter itself, so they're like a little cold open. Um, because this is a TV show. You can I think you can really tell in some of these, especially in these early chapters, how much this was influenced by TV.
SPEAKER_02Interesting. I mean, fair fair enough.
SPEAKER_01We all have our media you know, and fanfiction in fanfiction in general, I think, uh a lot of fanfic writers are very influenced by like, and this is the cliffhanger for next week, or this is the this is where you cut to black for the commercial break. Like, very that's very much how a lot of us approached pacing, I think.
SPEAKER_02Well, I mean, it's kind of uh the only culturally mainstream example of serialized fiction. We don't do that nearly so much anymore with books or radio shows. Right, right.
SPEAKER_01Um so in the previous chapter, we were introduced to uh Harry James Potter-Evans-Verres. Uh he's a 10-year-old British lad. He was raised by his aunt Petunia and her husband Michael. Um Michael is an Oxford biochem professor. Petunia is lithe and hot, unlike in the original.
SPEAKER_02Magically.
SPEAKER_01Magically, yes. Um uh Harry Potter-Evans-Verres, uh, like his canonical counterpart, was orphaned as a baby, um, but he's been raised in a uh in a household that believes in science. Uh he's read a lot of science fiction. His dad is still kind of a dick, um uh who has a lot of problems with his wife wanting basic emotional validation, it seems like. Um uh and his mother is, you know, a woman who really wants emotional validation and maybe doesn't always have the I don't know, she's a she's a pretty clear communicator about it.
SPEAKER_02And I I feel like last time it it was not entirely clear whether the text thought that Harry's dad is a dick.
SPEAKER_01I it is uh it is a little I think that the text I think does think that uh not that the professor is a dick, but that the professor, one, does not take Harry seriously. Like like his father doesn't take him seriously, and two, that he's maybe that he's dogmatic, to use your word. Like I think that might be fair. Um Harry has received an acceptance letter to Hogwarts School of Witchcraft and Wizardry, and at the end of the last chapter, uh his neighbor, Mrs. Figg, helped him send a letter to Hogwarts asking them to send a professor to help prove to him and to his father that magic is real, because Petunia already knows. Petunia was made magically hot. Uh so we open on Harry laying out the rules of the experiment to his parents. Um uh the professor from Hogwarts is going to levitate his dad, Professor Verres, and Professor Verres knows that he has not been attached to any wires, so if this levitation works, he will acknowledge that magic is real. And if it doesn't work, then Petunia will acknowledge that magic is not real, and she doesn't get to do any like Uri Geller shit and be like, oh, it doesn't work if the vibes are wrong. She doesn't get to pull any of that.
SPEAKER_02So wait, uh so we're we're coming in a little in medias res here, so last chapter ended, and they were like, We're gonna get a professor. Now the professor's already here and we have uh laid out this experiment. Yes.
SPEAKER_01Okay, yeah, yeah. Deputy Headmistress Minerva McGonagall was watching Harry with a bemused expression. She looked quite witchy in her black robes and pointed hat, but when she spoke she sounded formal and Scottish, which didn't go together with the look at all. At first glance she looked like someone who ought to cackle and put babies into cauldrons, but the whole effect was ruined as soon as she opened her mouth. I am not gonna try and do a Scottish accent. I think that's fair. I think that's yeah, you're all welcome. Is that sufficient, Mr. Potter? She said. Shall I go ahead and demonstrate? Sufficient? Probably not, Harry said. But at least it will help. Go ahead, Deputy Headmistress. Just Professor will do, said she. And then Wingardium Leviosa. Harry looked at his father. Huh. Harry said. His father looked back at him. Huh. His father echoed. Then Professor Verres-Evans looked back at Professor McGonagall. All right, you can put me down now. His father was lowered carefully to the ground. Harry ruffled a hand through his own hair. Maybe it was just that strange part of him which had already been convinced, but that's a bit of an anticlimax, Harry said. You'd think there'd be some kind of more dramatic mental event associated with updating on an observation of infinitesimal probability. Harry stopped himself. Mum, the witch, and even his dad were giving him that look again. I mean, with finding out that everything I believe is false. I think it's weirder that his dad has no reaction to this. I get like I get why I I think that I think that this is kind of a nice little piece of writing that Harry is like, oh, I expected this to I expected to like freak out, and I'm not freaking out, that's interesting. Even though, like, okay, I am being forced to accept that magic is real. But his father was just presumably levitated. Not presumably, his father was just levitated.
SPEAKER_02I mean, it is an it it is an interesting piece of writing. Uh it it reminds me of two things. One is what we were discussing before we started recording about uh the way in which the profound uh arrives in combination with the mundane.
SPEAKER_03Yeah.
SPEAKER_02Uh and it's like, you know, huge things can happen and yet life still goes on and everything is still, you know, much the same despite this huge thing having happened. Uh and also uh you're familiar with the fairy and walrus discourse? The which?
unknownOkay.
SPEAKER_01Let's go.
SPEAKER_02So this was going around Tumblr a while back. Okay.
SPEAKER_01And the question was Sorry, the f the fairy and walrus discourse?
SPEAKER_02Yes. The question was Okay. Uh there's a knock at your door. Yes. You open your door. Hello. Which of these would it be more surprising to find at your door? A fairy or a walrus?
SPEAKER_01Okay, this is a very rationalist question because it's a weird thought experiment. Okay, um, and and like a proper rationalist, I'm going to ask you some follow-up questions. Please do. Um when we say fairy, do we mean like a little tinkerbell fairy?
SPEAKER_02I believe that is the implication. The original question didn't state it, but I think generally the idea is uh, you know, what what's being asked here is would you be more surprised to find something that upends your understanding of how physics works and proves the existence of magic? But if we posit the existence of magic would plausibly be at your door.
SPEAKER_03Sure, sure.
SPEAKER_02Or something that uh you know exists in the world we live in, but that has no reason whatsoever to be at your door.
SPEAKER_01Oh, okay. I think I would be more surprised by a fairy because I grew up in a state where there are moose at our door on a regular basis. So a walrus is not that many sigmas off the norm. Yeah.
SPEAKER_02Like I mean, if you I also said I would be more surprised by a fairy.
SPEAKER_03Yeah.
SPEAKER_02Cause like there were there are a lot of things that would have to happen for a walrus to get to your door, even in Anchorage. But they're all possible. But they're all possible. Yeah. Uh but a lot of people were like, you know, there was a lot of argument about it. I of course. And I feel like this is the same thing as like, you know, if you if you discover that magic exists, like, how much does that surprise you and how much do you have to like update your worldview? Sure, sure.
SPEAKER_01Well, it's funny you should say that. Um, because uh that's exactly the question that Harry is is wrestling with, right? Uh yeah, okay, so at this point, McGonagall offers to provide um additional demonstration that magic is real. Harry's been like, oh okay. Oh, interesting. I thought that was going to be more dramatic. Um and Harry decides that it is, quote, right and proper to be curious. Professor McGonagall turned into a cat. Harry scrambled back unthinkingly, backpedaling so fast that he tripped over a stray stack of books and landed hard on his bottom with a thwack. His hands came down to catch himself without quite reaching properly, and there was a warning twinge in his shoulder as the weight came down unbraced. At once the small tabby cat morphed back up into a robed woman. I'm sorry, Mr. Potter, said the witch, sounding sincere, though the corners of her lips were twitching upwards. I should have warned you. Harry was breathing in short gasps. His voice came out choked. You can't do that It's only a transfiguration, said Professor McGonagall. An animagus Animagus? Animagus. I'm gonna go with Animagus. An Animagus transformation, to be exact. You turned into a cat! A small cat! You violated conservation of energy! That's not just an arbitrary rule, it's implied by the form of the quantum Hamiltonian! Rejecting it destroys unitarity, and then you get FTL signaling! And cats are complicated! A human mind can't just visualize a whole cat's anatomy and and all the cat biochemistry? And what about the neurology? How can you go on thinking using a cat-sized brain? Professor McGonagall's lips were twitching harder now. Magic. Magic isn't enough to do that! You'd have to be a god! Professor McGonagall blinked. That's the first time I've ever been called that. Pause. Uh when I first read this, I was like, what the fuck are all of those things he just rattled off? Regina, you were a physics major. What did you recognize in that?
SPEAKER_02Uh not a lot. I mean, those are terms I am familiar with, but uh even someone who is familiar with physics would not go immediately into that. Like, I mean, I guess that string of sentences implies that he has been sitting around thinking about uh what would be true if conservation of mass and energy were violated, and, you know, um yeah, that's not like baseline physics stuff.
SPEAKER_01No, certainly not. Certainly not. Um conservation of energy kind of is, though. Can you explain conservation of energy succinctly?
SPEAKER_02I mean uh matter and energy can be interconverted, but neither created nor destroyed in a closed system. You're gonna have the same amount of, you know, matter energy uh like and if you don't, it's not a closed system and the universe is considered to be a closed system.
SPEAKER_01Okay. Uh right, right, right. Um so he's freaking out because she went from a large amount of matter to a small amount of matter without an apparent corresponding um conversion to energy. Right. Right.
SPEAKER_02Uh which, in her case, would destroy the planet.
SPEAKER_01I'm like, their house should be in flames.
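The scale the hosts are gesturing at here can be sketched with a quick back-of-the-envelope calculation. This is an illustrative toy, not anything from the episode's sources, and the masses are made-up round numbers (a 70 kg adult, a 4 kg cat):

```python
# Back-of-the-envelope: energy released if the mass lost in a
# human-to-cat transfiguration were converted via E = m * c^2.
# The 70 kg and 4 kg figures are made-up illustrative numbers.

C = 2.998e8            # speed of light, m/s
KT_TNT = 4.184e12      # joules per kiloton of TNT

def mass_to_energy(delta_m_kg: float) -> float:
    """Energy (J) equivalent to delta_m_kg of rest mass."""
    return delta_m_kg * C**2

delta_m = 70.0 - 4.0   # assumed adult mass minus small-cat mass, kg
energy = mass_to_energy(delta_m)
hiroshimas = energy / (15 * KT_TNT)   # Hiroshima yield was roughly 15 kt

print(f"{energy:.2e} J  (~{hiroshimas:.0f} Hiroshima-scale bombs)")
```

On these assumed numbers the answer comes out in the ballpark of 10^18 joules, which is why "the house should be in flames" is, if anything, an understatement.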
SPEAKER_02Right. Yeah. Uh you know, or if the mass went somewhere else somehow, like Which is how they solve it in Animorphs. Right. Uh then that still raises some pretty questionable physics questions.
SPEAKER_01Yeah, for sure. Yeah, yeah. Okay, um so that's conservation of energy. Um that one I was like, okay, I got this. Um uh that's not just an arbitrary rule, it's implied by the form of the quantum Hamiltonian. Okay, then I spent a long time on quantum Hamiltonians. Um or what that is.
SPEAKER_02And and what did you learn? Because I don't remember anything about quantum Hamiltonians.
SPEAKER_01Fantastic. Um well, okay, first let me quote Yudkowsky. Um in his own notes, he writes, uh in Harry's You Turned Into a Cat speech, that's not just an arbitrary rule, it's implied by the form of the quantum Hamiltonian, refers to conservation of energy being the Noether's theorem conjugate of the laws of quantum mechanics being invariant under translation in time. Which didn't help me very much, because then I had to go look up what Noether's theorem was. And that was a very fun rabbit hole, actually. Uh because it turns out Noether's theorem is a mathematical, physics, I don't know, math and physics are intimately involved in ways that you probably get.
SPEAKER_02Yeah, they're they're kind of the same thing.
SPEAKER_01Yeah, yeah, yeah. Um it is a theorem created by um Amalie Emmy Noether, who was born on the 23rd of March 1882. Uh oh, that's tomorrow. Hey, happy birthday, Emmy. Uh in Erlangen, Bavaria. Uh she was a cool ass Bavarian lady. She earned a PhD in mathematics at 25 in turn-of-the-century Germany, where women weren't actually allowed to register for mathematics classes.
SPEAKER_02Um It's bad for the uterus, you know.
SPEAKER_01Definitely, definitely. Um I'm gonna quote liberally here from Will Sweatman's Symmetry for Dummies on Hackaday.com. 1907 was a very exciting time in theoretical physics, as scientists were hot on the heels of figuring out how light and atoms interact with each other. Emmy wanted in on the fun, but being a woman made this difficult. She wasn't allowed to hold a teaching position, so she worked as an unpaid assistant, surviving on a small inheritance and under-the-table money that she earned sitting in for male professors when they were unable to teach. Uh, she was still able to do what professors are supposed to do, however, which is write papers. In 1916, she would pen the theorem that would have her rubbing shoulders with the other physics and mathematical giants of the era. Emmy Noether's theorem seems simple on the onset, but holds a fundamental truth that explains the fabric of our reality. It goes something like this. For every symmetry, there is a corresponding conservation law. What what does symmetry mean to you?
SPEAKER_02I mean, I feel like I have the vibe, but I couldn't explain it.
SPEAKER_01Okay. Well, here's what I gleaned. Um basically a symmetry is like, if you change an aspect uh of something, it will still work the same. So um like if you throw a basketball at one point on a plane and then move to a different point on the plane, and throw the basketball in exactly the same way. Again, we're in a physics problem, so we are assuming everything else is like, assume a spherical basketball?
SPEAKER_02Yes.
SPEAKER_01Um yeah, so assuming the throw is exactly the same uh at both points of the plane, the basketball's arc will be exactly the same. So it is um symmetrical under translation in space. Um so uh Noether's theorem, which is a math thing, um, can be used to prove it. I looked it up and it involves integers, and I'm like, oh god, I never got this far.
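The basketball example can be sketched numerically. This is a toy of my own, assuming an idealized drag-free projectile: the same throw launched from two different spots traces exactly the same arc relative to its launch point, which is the space-translation symmetry that Noether's theorem ties to conservation of momentum.

```python
# Sketch of translation symmetry: launch the same projectile from two
# different spots on a plane; the arc, measured relative to the launch
# point, is identical. Idealized "spherical basketball", no air drag.

G = 9.8  # m/s^2

def arc(x0: float, vx: float, vy: float, steps: int = 50, dt: float = 0.02):
    """Return the trajectory as (x, y) points, starting at (x0, 0)."""
    x, y, pts = x0, 0.0, []
    for _ in range(steps):
        pts.append((x, y))
        x += vx * dt       # horizontal motion is unaffected by gravity
        vy -= G * dt       # gravity only changes vertical velocity
        y += vy * dt
    return pts

a = arc(x0=0.0, vx=3.0, vy=5.0)
b = arc(x0=17.0, vx=3.0, vy=5.0)   # same throw, different spot

# The shapes match once we subtract the launch position.
shifted = [(x - 17.0, y) for x, y in b]
print(all(abs(xa - xb) < 1e-9 and abs(ya - yb) < 1e-9
          for (xa, ya), (xb, yb) in zip(a, shifted)))  # True
```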
SPEAKER_02I think you mean integrals.
SPEAKER_01Fuck, that's what I mean. It probably involves integers too. It does, yes, an integral.
SPEAKER_02Uh but yes, I feel like uh, you know, a lot of this stuff, once you get into some of the more theoretical bits of physics, you get to a point where truly understanding it does involve sort of understanding the math. Uh and I mean I feel like I can do that, but there's a reason I went in the geophysics direction and was studying stuff I could stand on instead of uh stuff that I had to like go into underground laboratories to figure out.
SPEAKER_01Totally. Totally. Um so you can use Noether's theorem to prove that because translation in space doesn't change the momentum of the basketball, the law of conservation of momentum exists. Uh so um you can also use Noether's theorem to prove that conservation of energy exists, but I don't have like a good tangible example of that the way I do for momentum, because I kinda get kinetic energy. I can, you know, grok momentum in a way that I like don't entirely grok energy and matter.
SPEAKER_02Right.
SPEAKER_01Uh well I mean I mean I grok like I know like if you set something on fire you get thermal energy, right?
SPEAKER_02Like I mean there's all kinds of energy and they're all sort of interconvertible uh you know I mean they're not all interconvertible. Um yeah, it is complex. Yeah. Uh and I feel like you know, it's you could go into each individual aspect of like, oh well, you can turn this kind of energy into this other kind of energy. Uh but we can probably just d be like, yes, okay, it's symmetrical and therefore conserved.
SPEAKER_01Right. Right. Uh yeah. So yeah. So good on Emmy Noether. She's awesome. Um she did eventually uh become a professor, um but before that she wasn't allowed to present her own papers, um she wasn't allowed to lecture under her own name. Um in, uh, do I have the date here? Um no I don't, but um according to Nina Byers' 1998 paper, Noether's Discovery of the Deep Connection Between Symmetries and Conservation Laws, uh she quotes Hermann Weyl. During the war, Hilbert, who was a mentor of Noether, tried to push through Emmy Noether's habilitation in the philosophical faculty in Göttingen. He failed due to the resistance of the philologists and historians. It is a well-known anecdote that Hilbert declared at the faculty meeting, I do not see that the sex of a candidate is an argument against her admission as private docent. After all, we are a university, not a bathing establishment.
SPEAKER_02Beautiful.
SPEAKER_01But I guess the I guess the philologists and the historians were like it's bad for the uterus.
SPEAKER_02What's a philologist?
SPEAKER_01It's uh it's um it's like a linguist.
SPEAKER_02Okay. I think that's right. My brain wants to conflate it with philatelist, but that's different.
SPEAKER_01I think that's different. That's like a stamp thing, right? Yeah, that's a stamp collector, yeah.
SPEAKER_02Which I think I mostly tend to remember uh because it shows up in a Tom Lehrer song.
SPEAKER_01Yeah, yeah. You have no idea how relevant that is to this podcast. But we won't get into that today. Aw man. Sorry. Um so Emmy gave unpaid lectures for years, uh despite doing the kind of cool work that Einstein was like, this chick gets it. She's very cool. I'm impressed by her. Um she eventually got to be a professor after the First World War, uh, when Germany was presumably like, well, everything else is fucked. We may as well have girl professors of math. Um and I had never heard of her, so thank you for that, Yudkowsky. Uh this has not actually answered for me what the quantum Hamiltonian is. Um so I went to the internet's foremost scholars, r/explainlikeimfive, uh, and got this explanation from user Creature of Prometheus. Hamilton's approach is to observe that nature is lazy. Of all the imaginable Okay, right, so Hamiltonian physics is like a method, or like a way of understanding physics, and there are other methods of understanding or describing physics by other people.
SPEAKER_02Yeah, I that that seems like that seems fair. Yeah. You know, there are there are various ways to look at this wacky universe of ours and describe the ways it works and the rules that they're describing are all the same, but they're kind of But you can phrase them in different ways.
SPEAKER_01Yeah, yeah.
SPEAKER_02And you know, some of those are more useful in different domains.
SPEAKER_01Right. So Hamilton's approach, quoting: Hamilton's approach is to observe that nature is lazy. Relatable. Of all the imaginable ways a system could take to get from one configuration to another, it will naturally take the path that minimizes an energy-like quantity called the action, thus the principle of least action. Which is also really relevant to Noether's theorem for reasons that would take me a while to get into, because I don't fully understand them. Um But yeah, the universe prefers an action with the smallest possible kinetic energy and the largest possible potential energy. Um there is an irony not lost on me in how much time and energy I spent trying to understand something like this. Um especially because I'm just gonna come back to Yudkowsky saying conservation of energy seems built into the laws of physics on an extremely deep level. It's not just a matter of not knowing any particular force that violates it. We have reason to believe that no force should violate it.
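The "nature is lazy" idea can be made concrete with a toy computation, a sketch of my own rather than anything from the episode's sources: take a ball that leaves the ground and must be back at the ground one second later, and compare the action, the time-integral of kinetic minus potential energy, along the true parabolic trajectory versus along a path that just sits still. The real path comes out with the smaller action.

```python
# A tiny numerical illustration of the principle of least action:
# compare the discretized action S = sum of (kinetic - potential) * dt
# along the true free-fall parabola vs. a do-nothing flat path,
# both starting and ending at height 0 over one second.

G, M, N = 9.8, 1.0, 10000   # gravity, mass, number of time steps
DT = 1.0 / N

def action(y):
    """Discretized action for a path y(t) sampled at N+1 points."""
    s = 0.0
    for i in range(N):
        v = (y[i + 1] - y[i]) / DT        # velocity on this step
        y_mid = 0.5 * (y[i] + y[i + 1])   # midpoint height
        s += (0.5 * M * v * v - M * G * y_mid) * DT
    return s

ts = [i * DT for i in range(N + 1)]
true_path = [(G / 2) * t * (1 - t) for t in ts]  # actual trajectory
flat_path = [0.0 for _ in ts]                    # just sit there

# The true path has the lower action of the two.
print(action(true_path), action(flat_path))
```

Analytically the true path's action is minus g squared over 24, about minus 4.0 joule-seconds with these numbers, while the flat path's action is exactly zero; the numerical sums agree.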
SPEAKER_02That sounds right. Yeah, like, you know, it's one of those things where it's like, this is, uh, I mean, I guess as Harry is saying in a very ten-year-old way. Yes. If we learned that that was wrong, then we would have to rethink everything else about our understanding of physics.
SPEAKER_01Right, right. Um I also tried to understand um unitarity. Is that a word that means anything to you? It's quantum I think it's like it's yeah, it's a quantum physics term.
SPEAKER_02So I I mean it is generally understood that even quantum physicists don't understand quantum physics. They just know how to do the math. So like So no.
SPEAKER_01Okay, great. Uh what I got was uh this, which I think I pulled from Wikipedia. In quantum physics, unitarity is, or a unitary process has, the condition that the time evolution of a quantum state according to the Schrödinger equation is mathematically represented by a unitary operator. This is typically taken as an axiom or basic postulate of quantum mechanics, while generalizations of or departures from unitarity are part of speculations about theories that may go beyond quantum mechanics. Actually, I think I understood that a little better than I used to. At least insofar as the reason it's coming up here is departures from unitarity are part of speculations about theories that may go beyond quantum mechanics. And that's why Harry is bringing it up and going, once you destroy that, you get faster than light signaling.
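One concrete thing "unitary" buys you can be sketched in a few lines; this is my own minimal illustration, not from the episode's sources. A unitary operator is one whose conjugate transpose is its inverse, and applying it to a quantum state leaves the total probability, the norm of the state vector, unchanged. That norm-preservation is the property Harry says you can't discard without breaking the theory.

```python
# Sketch of unitarity: a unitary matrix U satisfies U-dagger U = I,
# so applying it to a quantum state never changes the total
# probability (the norm of the state vector).

import math

# The 2x2 Hadamard matrix, a standard example of a unitary operator.
U = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(U, psi):
    """Matrix-vector product for a 2x2 matrix and a 2-vector."""
    return [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]

def norm(psi):
    """Euclidean norm of a complex vector (total probability amplitude)."""
    return math.sqrt(sum(abs(a) ** 2 for a in psi))

psi = [0.6, 0.8j]     # a normalized state: |0.6|^2 + |0.8|^2 = 1
out = apply(U, psi)

# Both norms are 1 (up to floating-point error): probability is conserved.
print(norm(psi), norm(out))
```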
SPEAKER_02Sure.
SPEAKER_01Sure. I guess why not? Yeah, I assume because, I mean, what this kind of boils down to, I think, is that he's saying: you turned into a cat and you didn't cause a nuclear level explosion of energy conversion. So therefore conservation of energy is false. And if conservation of energy is false, then E equals MC squared is disproven. And because that equation relies on the constant of the speed of light, that constant no longer can be relied upon, and therefore we might be able to get faster-than-light something.
SPEAKER_02I suspect that the chain of logic leading from violation of conservation of energy to faster-than-light signaling is more complicated than that.
unknownDamn.
SPEAKER_02Well, I tried. I mean it might just be that, but uh it's not an argument I remember having run across. Uh, but I think the whole speed of light thing is an interesting and very weird aspect of like how our universe works. Uh and I feel like a lot of fundamental physical principles uh, you know, sort of come back around to: if you can violate this, you can go faster than light. Just cause you have to, you know, if you logic it out, it's like, well, if this then that, yeah, and if that, then this other thing, and then ultimately you end up where, oh, you can uh find out that this happened before it happened, and uh, you know, logic doesn't work anymore, and the arrow of causality is uh exploded.
SPEAKER_01Okay. Alright. Cool. Well, okay. I think I learned something. I do think I learned some things here.
SPEAKER_02Yeah, and uh, you know, I'm sure that uh Yudkowsky is the sort of person, you know, who worked all this out in detail before writing this down.
SPEAKER_01Absolutely. And to your point, it does also, uh, like on a writing level, at a character level, this tells us that Harry James Potter-Evans-Verres uh has been sitting around as a ten-year-old being like, well, what if conservation were violated. One, he has learned enough physics to uh at least think that he understands unitarity.
SPEAKER_02Yeah, that is like getting into sort of I don't know if I necessarily want to say like graduate level understanding of physics, but certainly uh you know, this sort of thing is um senior year undergrad.
SPEAKER_00Yeah, yeah, yeah.
SPEAKER_02And uh, you know, Mr. 10-year-old autodidact Harry Potter is just uh hanging out with his quantum Hamiltonians. Yes. Uh doing thought experiments about a violation of uh conservation laws.
SPEAKER_01Yeah, yeah. Cool. Very normal. Now, Harry, as you might expect, is having trouble with all of this. And he thinks to himself: three thousand years of resolving big, complicated things into smaller pieces, discovering that the music of the planets was the same tune as a falling apple, finding that the true laws were perfectly universal, and had no exceptions anywhere, and took the form of simple maths governing the smallest parts. Not to mention that the mind was the brain, and the brain was made of neurons, a brain was what a person was. And then a woman turned into a cat. So much for all that. A hundred questions fought for priority over Harry's lips, and the winner poured out. And what kind of incantation is Wingardium Leviosa? Who invents the words to these spells? Nursery schoolers? McGonagall's like, alright, kid, let's get back to business. That's actually a piece of writing that I really like. I really like this: discovering the music of the planets was the same tune as a falling apple. I think that's a really beautiful, like, lovely description of, you know, I read a lot of Feynman for this, for the first time. Like, you read a lot of Feynman growing up. Yeah.
SPEAKER_02Yeah. I I read some Feynman growing up. Uh and uh, you know, I I feel like I read m more of his sort of like funny personal anecdotes than his physics. But yeah.
SPEAKER_01Yeah. Um but in like in his lectures he talks about uh I think I'm I think I'm thinking of Feynman and not Sagan. I might be thinking of both of them. I think they had similar ideas about like like, well, yeah, I'm a scientist, but like science allows me to appreciate the beauty of the world in a way.
SPEAKER_02Yeah, like that's that's definitely something that I've always found uh like you know, people who say, oh, it science is taking all the magic out of it. It's like, no, science is putting the magic into it. Yeah, yeah. You know, it's the the fact that I can appreciate these things on all of these new levels, like it doesn't take away from the ways in which they were already beautiful, and it adds a bunch of new ways in which they're beautiful.
Chapter 2 close reading pt 2
SPEAKER_01Yeah, yeah, yeah. Absolutely. So I really do want to give Yudkowsky his flowers when they come up, right? And I think that that's a really lovely description of, or like summation of, what is beautiful about physics. Um Yeah. Yeah. So McGonagall's like, alright, kid, let's get back to business. We gotta get you to Hogwarts. Hold on a moment, Harry, his father said. Remember why you haven't been going to school up until now? What about your condition? Professor McGonagall spun to face Michael. His condition? What's this? I don't sleep right, Harry said. He waved his hands helplessly. My sleep cycle is twenty six hours long. I always go to sleep two hours later every day. I can't fall asleep any earlier than that, and then the next day I go to sleep two hours later than that. Ten PM, twelve AM, two AM, four AM, until it goes around the clock. Even if I try to wake up early, it makes no difference, and I'm a wreck that whole day. That's why I haven't been going to a normal school up until now. This is apparently a real thing.
SPEAKER_02The Mudders. The Harvey Mudd students, uh, some of them would get themselves onto a, I want to say it works out to twenty-eight hours, they'd do it like a six-day week. Uh and um, you know, they'd uh figure out their class schedules and stuff such that they would, you know, advance their sleep schedule by X amount every day.
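The arithmetic behind both schedules is easy to sketch; note the 28-hour figure is Regina's recollection, not a verified fact. Harry's 26-hour cycle drifts bedtime two hours later each day and wraps all the way around the clock in twelve days, while a 28-hour day on a six-day "week" comes out to exactly 168 hours, one ordinary week, so it re-syncs with the normal calendar every weekend.

```python
# Two flavors of non-24-hour scheduling from the conversation:
# Harry's 26-hour cycle drifts 2 hours later each day and wraps the
# clock, while the (reportedly) Harvey Mudd 28-hour, six-day week
# lines back up with the ordinary calendar every real week.

def bedtimes(cycle_hours: int, start_hour: int, days: int):
    """Clock-time of each bedtime, drifting by (cycle - 24) h per day."""
    drift = cycle_hours - 24
    return [(start_hour + drift * d) % 24 for d in range(days)]

harry = bedtimes(26, start_hour=22, days=13)
print(harry)            # 22, 0, 2, 4, ... back to 22 on day 12

# Six 28-hour days are exactly one standard seven-day week:
print(6 * 28, 7 * 24)   # 168 168
```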
SPEAKER_01Uh and I wish Elisa was here for this as a scriptsy.
SPEAKER_02I uh I don't know how widespread it was. Like I don't think I knew anyone.
SPEAKER_01How would they do this? Okay, because this is like a real thing that uh largely affects blind people, because they don't have the, like, input of daylight.
SPEAKER_02Uh I mean, how were the Mudders simulating this? I mean, I don't remember if they were attempting to sort of hack their circadian rhythms by like changing their exposure to light. They were just uh, you know, just brute forcing it. Yeah. Um and you know, they would uh schedule their classes around it and arrange to be, you know, awake all night at the weekends, uh and yes.
SPEAKER_01I I gotta say, I have a a great deal of admiration for the way Harvey Mudd people seem to be like, how can I optimize my college experience to do the craziest shit possible?
SPEAKER_03Hell yeah.
SPEAKER_01Oh man. Um that's fun. Yeah. That's fun.
SPEAKER_02And I think that is sort of the you know, that's that's the fun side of the rationalist approach, is sort of what assumptions can we question, you know, and uh how can we like do fun things by saying, well, if we work this out, if if we abandon the idea that you should be asleep at night and awake during the day, uh and you know, bring in the idea that humans drift towards uh a slightly longer sleep schedule, uh deprived of uh the signal of the sun, uh then what can we do with that?
SPEAKER_01Yeah, yeah. Everyone's gotta have hobbies. Uh that's why I haven't been going to a normal school up until now. One of the reasons, said his mother. Harry shot her a glare. McGonagall gave a long hmm. I can't recall hearing about such a condition before, she said slowly. I'll check with Madame Pomfrey to see if she knows any remedies. Then her face brightened. No, I'm sure this won't be a problem. I'll find a solution in time. Now, and her gaze sharpened again. What are these other reasons? Harry sent his parents a glare. I am a conscientious objector to child conscription on grounds that I should not have to suffer for a disintegrating school system's failure to provide teachers or study materials of even minimally adequate quality. Now fair? I so okay. So you and I both went to weird schools.
SPEAKER_02We did. We were both, well, I went to weird schools for my entire schooling experience. You had a brief period in a normal school.
SPEAKER_01I spent I spent my seventh grade year in a in a normal school and it sucked. Yeah. I mean, mostly socially, but I think that had as much to do with it being seventh grade as anything. Like looking back on it, I'm like, hmm, seventh grade sucks.
SPEAKER_02I am so eternally grateful for like op being able to opt out of the standard junior high experience. Yeah.
SPEAKER_01Yeah, you should be. Yeah. Um, so I think we definitely both come at this with a certain amount of like, yeah, listen, normal school is not the right solution for everybody. For sure. For sure. Um and uh I want to acknowledge that. I'll also add, like, when I started this project, I came into it for some reason with this like rock solid belief that rationalism has like some kind of idea about like the autonomy of children and like the rights of children, and I have not been able to actually find anyone writing about that, so I don't know where I got that idea. Like I had it very firmly in my head, and I seem to have made it up. I don't know where I got it.
SPEAKER_02Fascinating. I mean, I think that that's one of those things that, um, you know, I'm sure that there are some rationalists who believe that. And like, it may show up somewhere, but yeah, it's not like a core tenet.
SPEAKER_01It's not like a core tenet, yeah, exactly. So I don't know.
SPEAKER_02Except as we've discussed, they don't have core tenets. Well, they purport not to have core tenets.
SPEAKER_01Right, right, right. So I don't know where I got that, but I do think that Yudkowsky has some very big feelings about adults not taking children seriously. Um and in trying to search this out, I came across one of his sequences, Yudkowsky's Coming of Age, uh, which gives us a couple of clues based on his own adolescent experiences. In an essay titled My Childhood Death Spiral, he opens with this. My parents always used to downplay the value of intelligence, and play up the value of effort, as recommended by the latest research? No, not effort. Experience. A nicely unattainable hammer with which to smack down a bright young child, to be sure. That was what my parents told me when I questioned the Jewish religion, for example. I tried laying out an argument, and I was told something along the lines of, logic has limits. You'll understand when you're older that experience is the important thing, and then you'll see the truth of Judaism. I didn't try again. I made one attempt to question Judaism in school, got slapped down, didn't try again. I've never been a slow learner. Whenever my parents were doing something ill-advised, it was always, we know better because we have more experience. You'll understand when you're older. Maturity and wisdom is more important than intelligence. The moral I derived when I was young was that anyone who downplayed the value of intelligence didn't understand intelligence at all. My own intelligence had affected every aspect of my life and mind and personality. That was massively obvious, seen at a backward glance. So I think there's plenty of interest to understanding Yudkowsky's worldview in this essay. Um he goes on to talk about how, as a teenage transhumanist, which is, I think, Against Me!'s much less popular B-side to I Was a Teenage Anarchist, uh, he believed that intelligence should be more fairly distributed amongst humanity. 
That said, he was, quote, annoyed with the Randian and Nietzschean trends in SF, end quote, and because he'd read so much SF, he thought he understood the dangers of claiming any kind of superiority over the rest of humanity, and therefore rejected the temptation to do so, despite, in my reading, clearly being aware that he had an objectively fair claim to say that he was superior to many people. Like, he's a smart guy.
SPEAKER_02I appreciate his ability to, you know, work out that that's a bad uh approach to take, you know, even with relatively limited life experience.
SPEAKER_01Sure, sure. Then the essay veers off to discuss his first encounter with the concept of the singularity, and it becomes somewhat less relevant for our current purposes, but I will be coming back to it at some point one day. Um, in this specific beat in the story, uh, and in some that we will encounter in later chapters, where Harry is railing against the conscription of children, I, uh, okay, because yeah, I think we're seeing self-insertion pretty clearly, right? Like, Yudkowsky was clearly smarter than the average kid.
SPEAKER_02Like Have we not been seeing self-insertion this whole time?
SPEAKER_01I mean, yes. I just think that it becomes like, I'm like, why is this here? Oh, because it's uh Heinlein turning to the audience and being like, hello, I have something to tell you. And this is a guy who, Yudkowsky has read a lot of SF, and I'm positive he's read Heinlein. Right. There's no way he hasn't. Yeah. Um. Uh according to most every source you can find, uh Yudkowsky is an autodidact who didn't attend high school or college. In an excerpt from The Optimist by Keach Hagey, we learn, quote, at eleven, he scored a 1410 on the SAT. By seventh grade, he told his parents that he could no longer do school. He did not attend high school. By the time he was seventeen, he was painfully aware that he was not like other people, posting a webpage declaring that he was a quote, genius, but, quote, not a Nazi. He rejected being defined as a quote, male teenager, instead preferring to classify himself as an Algernon, a reference to the famous Daniel Keyes short story about a lab mouse who gains enhanced intelligence.
SPEAKER_02That's an interesting choice of categorization.
SPEAKER_01Yes, yeah.
SPEAKER_02I'm not a male teenager, I'm a lab mouse.
SPEAKER_01Yes. Yes. Uh, an enhanced, Transhumanism. Transhumanism, man. Yeah, yeah. Um Yudkowsky's Coming of Age sequence is about 25,000 words long, and it doesn't have an audio version easily available, so I haven't gone through the whole thing yet to see if he reveals any additional biographical information. Um but he does reek of autodidacticism, right? And I think it's generally acknowledged that he is a quote unquote expert on AI without ever having done any schooling in that, because he has just reasoned himself into a bunch of stuff about AI from first principles.
SPEAKER_02Uh yeah, I think that's correct. Yeah. Uh and like, it's an interesting question, uh, in terms of, you know, even if you haven't done formal schooling: a scientist in the course of their career, for instance, doesn't stop learning when they graduate with their PhD. They are still learning things via uh reading papers and conversation with their community and so on and so forth. Uh so you know, it is certainly possible to learn things that way. Uh and I think it sounds like Yudkowsky even has a certain amount of humility about it, like uh, you know, not as much as he should, but like a little bit.
SPEAKER_01You were like, oh, she's making a face. Yeah, yeah. Yes, I, uh, do I have that open? Hang on. I don't think I do. No. No. I mean, because at the end of My Childhood Death Spiral, he talks a little bit about um where he falls now in terms of thinking about, not so much about superiority, but thinking about intelligence. Um and he's like, well, as a child I thought that the unequal distribution of intelligence was the greatest problem, and like the most unfair part of reality. And these days when I think about what's like super unfair, I think of death. Um because he's a dude who's really freaked out by his own mortality. Um and he's like, yeah, you know, intelligence is important. But then also I've seen discussions from him within the LessWrong community about how it is a huge problem to be smarter. Like, being smart is not the problem, it's being smarter than other people um that causes you problems. Which is, okay, and this I think actually lets me come around to the point that I wanted to make, which is that I'm not here to shit on people who are autodidacts or who needed an atypical path through the educational system. Because again, we're both people who had somewhat atypical.
SPEAKER_02It would be very hypocritical of us to, uh, you know, look down on people needing an alternative path. I don't know what would have happened if I didn't take that alternative path, but I bet it would have been less pleasant.
SPEAKER_01Sure, sure. Um and for the people who don't know, like, we both went to a uh seven-through-twelve school. Well, we both went to a gifted program.
SPEAKER_02Yeah, we went to the intensive accelerated uh you know elementary school, and then uh, you know, weird, hippie, independent-learning uh you know, seven-through-twelve high school.
SPEAKER_01So yeah. Um and I think particularly relevant is the level of like self-directed learning that we were uh inculcated with, um both at school and at home. I think, you know, we were very, very fortunate to have parents a lot like what Harry James Potter-Evans-Verres had, um philosophically. Philosophically, yes. Not personality-wise, not personality-wise at all. Um but the description of like all the books in the house in the first chapter, I'm like, oh yeah, yeah, totally.
SPEAKER_02A house stuffed with books, you know, uh an encouragement to ask questions. Uh if you have a question about something, let's run an experiment.
SPEAKER_01Yeah, yeah. Um however, I do think that this level of, like, school, because um a dear friend of mine from college, uh Jessica Dickinson Goodman Holmes, another three-namer, uh who also went to like an alternative high school, the way she described it that I've always really liked is: public school, and when I say public school, I don't just mean like publicly funded schools, I mean just like school where you have to interact with other people, with the public, is a rock tumbler, and it smooths off some of your edges. Um and if you don't get that tumbling, um those edges don't get rounded off, and, I don't remember if she actually said this, but what I think of it as is like, and then those edges can catch on things later in life. Like, and I think that we both, like, yeah, we went to these weird, weird schools, but we both were like, that is one of the things about schooling, is that it teaches you some social rules that will benefit you in the future if you can learn them.
SPEAKER_02Uh, how did they say it? This is a tool that will help us later. Um, yeah. Uh, you know, we went to weird schools, but they were schools. We were not, uh, autodidacts in the same sense. Right. We had more control over our learning and it was more tailored to our abilities and preferences. Uh, but I think that academic learning is not everything. Uh, learning how to spend time with other humans is a very important skill set, uh, and one that schools teach you, uh, via throwing you in with a bunch of other humans, uh, as opposed to formally.
SPEAKER_01Yeah, and it's an imperfect system, for sure. It has a lot of negative outcomes as well as positive outcomes. Um but it is but it is not without use in that way.
SPEAKER_02Do we know if Yudkowsky had, um, you know, any kind of socialization training or, like, immersion, aside from his autodidacticism?
SPEAKER_01We don't, but I do know that he's read a lot of social psychology, and I wonder if he thinks that's the same thing. I don't know. I don't know. He he socialized a lot on the internet, and I tell you one thing, the internet is not good at smoothing your rough edges off.
SPEAKER_02True.
SPEAKER_01I'm gonna grant Yudkowsky this much, though: as Harry goes off about child conscription, um, Yudkowsky does make fun of Harry for this. Both of Harry's parents howled with laughter at that, like they thought it was all a big joke. Oh, said Harry's father, eyes bright, is that why you bit a maths teacher in third year? She didn't know what a logarithm was. Of course, seconded Harry's mother. Biting her was a very mature response to that. Harry's father nodded. A well-considered policy for addressing the problem of teachers who don't understand logarithms. I was seven years old. How long are you going to keep on bringing that up? I know, said his mother sympathetically. You bite one math teacher and they never let you forget it, do they? Harry turned to Professor McGonagall. There. You see what I have to deal with? Excuse me, said Petunia, and fled through the back door into the garden, from which her screams of laughter were clearly audible. There, Professor McGonagall seemed to be having trouble speaking for some reason. There is to be no biting of teachers at Hogwarts. Is that quite clear, Mr. Potter? Harry scowled at her. Fine. I won't bite anyone who doesn't bite me first. Professor Michael Verres-Evans also had to leave the room briefly upon hearing that. Well, Professor McGonagall sighed, after Harry's parents had composed themselves and returned. Well, I think under the circumstances that I should avoid taking you to purchase your study materials until a day or two before school begins. What? Why? The other children already know magic, don't they? I have to start catching up right away. Rest assured, Mr. Potter, replied Professor McGonagall. Hogwarts is quite capable of teaching the basics. And I suspect, Mr.
Potter, that if I leave you alone for two months with your school books, even without a wand, I will return to this house only to find a crater billowing purple smoke, a depopulated city surrounding it, and a plague of flaming zebras terrorizing what remains of England. Harry's mother and father nodded in perfect unison. Mum! Dad! End of chapter two. That's pretty good. It's charming. It's charming. I have some quibbles; I think the plague of flaming zebras is unnecessary, but that's a nitpick.
SPEAKER_02I mean, it's a little lol random, but it was 2011 after all.
SPEAKER_01Exactly. Yeah yes. Again, I think it's really interesting to see things that like stick out to me as like, oh, this is what the internet was like in 2011, right?
SPEAKER_02Of its time.
SPEAKER_01It's very of its time, which is very interesting to me. So that's chapter two. What are you thinking?
SPEAKER_02Uh, you know, I'm interested to see, going forward, how and if Harry does reconcile the apparent ways in which magic violates the conservation laws that he has learnt.
SPEAKER_01Indeed. Uh, yes. So I have read ahead some, um, and so I can say with confidence that a big chunk of things is gonna start involving, um, attempting to apply the scientific method to magic. Um, I don't recall if they ever deal with conservation of energy. Maybe they do, but I don't remember that they do. Which does feel like a fun... I don't know. Maybe he's just like, pfft, well, that one's out the window.
SPEAKER_02Better move on, right. I do often think about, uh, the whole "sufficiently advanced technology is indistinguishable from magic" thing. And like, you know, when we talk about magic, what we mean is stuff that we don't know how to do, and, like, most of our current technology would appear magic to, you know, people 500 years ago.
SPEAKER_01Some of it appears magic to us right now, and I think that's a huge problem with the way people approach LLMs. Yeah, a lot of people are just like, well, that's magic. It can do anything because it's magic.
unknownWrong.
SPEAKER_02I have had some fascinating conversations with LLMs about LLMs, uh, and, like, the ways in which they work, and the ways in which they, you know, have no more insight about the ways that they themselves work than, uh, the typical human, you know. Like, I'm saying these words, and I can't tell you what my neurons are doing to allow me to say these words, and in the same way, an LLM says words but doesn't, like, have an awareness of how it's doing it.
SPEAKER_01Right. Which does seem weird, because somebody programmed it. Right, but the LLM didn't program itself. No, but, like, isn't that in the documentation? I guess if it doesn't have access to the documentation, it can't know it. It can go look that up, right?
SPEAKER_02Like in the same way that I can go look up how the neurons work. Yeah, exactly. Yeah.
SPEAKER_01But it's not, um, you know, inherently within the corpus of knowledge. If you say, like...
SPEAKER_02Why did you do the thing you just did? It'll be like, uh, you know, I can look up the theory. Like, uh, I can tell you why I probably did it, but, you know, it doesn't have that type of self-insight. It's fascinating.
SPEAKER_01Unnerving because I feel like a lack of self-insight has led to a lot of problems.
SPEAKER_02Yeah, but, like, they can't help how they're made any more than we can.
SPEAKER_01I guess. Except, well, then that's why rationalism is appealing, because it offers a set of tools for self-insight, when properly applied.
SPEAKER_02But it's still not direct. It's still like you're looking up uh you know, you can understand better how brains work, but that's not going to tell you in the moment what your own personal brain is doing to result in your actions.
SPEAKER_01Right. Right. Yeah, yeah, unfortunately. Well, gosh, Regina, thank you for joining me. I don't have a better outro than that. Brains, man. Brains, man. Self-insight, man. None of us are any good at it.
SPEAKER_02Yeah. Uh it has been an immense pleasure.
SPEAKER_01Thank you. I will definitely be inviting you back. Genuinely, like, listen, what I said last time remains true: I can be a very malleable person, and I do want to have assholes tell me when I'm wrong. And also, you're not an asshole, unlike my roommates. I think they would own that. Um, and I appreciate also having a not-asshole tell me when maybe I'm wrong.
SPEAKER_02Uh, you know, and also, noodling about really abstruse philosophical concepts is one of my favorite pastimes.
SPEAKER_01It's how we got here, man.
SPEAKER_02Yeah.
SPEAKER_01Um, I guess, I feel like in a normal podcast, this would be where I would be like, do you have anything to plug? Uh, but we're not really at that point yet. And I don't think so. Great. All right. Well, uh, thank you so much for being here. Thank you all for listening to all of this. Uh, that's it for this episode of HPMoR and the Limits of Rationality. You should subscribe. You should put the RSS feed in your podcatcher. You should share it with people if you like it, or don't like it, I guess. Uh, you know, sure.
SPEAKER_02So they can point and laugh.
SPEAKER_01So they can point and laugh. Yeah, that's engagement too. Um, that kind of thing really helps out. And we will see you next time on HPMoR and the Limits of Rationality. The Harry Potter universe is copyright J.K. Rowling, and Harry Potter and the Methods of Rationality and the Sequences are copyright Eliezer Yudkowsky. This podcast may contain copyrighted material, the use of which has not been specifically authorized by the copyright owner, for the purposes of commentary, criticism, and transformation, which are protected under the doctrine of fair use. This podcast is released under a Creative Commons Attribution-ShareAlike 4.0 International license.
SPEAKER_00The music you heard in this episode is The Watchmaker's Secret by Nikolai Pedro's on hotel.