
Entropy Rising
Entropy Rising is a podcast where hosts Jacob and Lucas explore everything from today’s cutting-edge technology to futuristic concepts like Dyson spheres, discussing how these advancements will impact society. Dive into deep conversations about innovation, the future, and the societal shifts that come with the technology of tomorrow or the next thousand years.
The Dark Forest Theory: First Contact, Planet Killing Weapons and Cosmic Survival | Entropy Rising Episode 9
Support us on Patreon: https://patreon.com/EntropyRising?utm_medium=unknown&utm_source=join_link&utm_campaign=creatorshare_creator&utm_content=copyLink
Listen to The Multiverse Employee Handbook podcast: https://multiverseemployeehandbook.com
Follow us on Threads: https://www.threads.net/@entropyrisingpodcast
Website: https://www.entropy-rising.com/
Book highlight (The Lever): https://rebelsatori.com/product/the-lever/
In this episode of Entropy Rising, Jacob and Lucas tackle one of the most thought-provoking and unsettling ideas in the search for extraterrestrial life: the Dark Forest Theory. This theory, popularized by science fiction but rooted in game theory and strategic thinking, suggests that the universe isn’t just quiet—it’s silent because intelligent civilizations know that making noise could be a fatal mistake.
The Fermi Paradox is at the heart of this discussion. With billions of planets scattered across the galaxy, the odds of intelligent life emerging elsewhere seem overwhelmingly high. So why haven’t we seen or heard any evidence of extraterrestrials? Some theories suggest that civilizations self-destruct before achieving interstellar capabilities, while others propose that we simply haven’t been listening in the right way. But the Dark Forest Theory offers a much darker explanation—what if advanced civilizations are intentionally hiding?
Jacob and Lucas explore the game theory behind this idea, comparing it to real-world scenarios where acting first is often the safest option. If civilizations view each other as potential threats, they may adopt a "shoot first, ask questions later" mentality. This leads to the unsettling possibility that the first sign of another civilization could also be the last—because the best strategy for survival might be to eliminate any potential competition before they become a threat.
One of the most fascinating aspects of this discussion is the idea of relativistic kill missiles (RKMs)—hypothetical weapons capable of traveling at near-light speeds and delivering catastrophic destruction across interstellar distances. If a civilization detects another developing species, could they fire an RKM to ensure their own survival before the other side has a chance to grow into a threat? And more importantly—have we already made ourselves a target by actively broadcasting signals into space through projects like SETI and the Arecibo Message?
The conversation doesn’t stop at doom and gloom. Jacob and Lucas also consider alternative explanations for the eerie silence of the cosmos. Could an advanced galactic community already exist, with rules preventing young civilizations from making contact until they prove themselves worthy? Or is it possible that self-replicating von Neumann probes are already out there, quietly monitoring emerging civilizations like ours, waiting to see if we develop in a way that aligns with their interests?
Despite the grim implications of the Dark Forest Theory, there are reasons to question whether it holds true. Human history shows that as civilizations advance, they tend to become less violent and more cooperative. Could the same be true on a galactic scale? If interstellar expansion requires collaboration and intelligence, would an advanced alien species even think in terms of conquest and paranoia? Or is it possible that we are simply alone in a vast, empty universe?
As Jacob and Lucas navigate these big questions, they invite listeners to consider the implications for humanity's future. Should we continue searching for extraterrestrial life? Should we be cautious about broadcasting our presence? Or is the fear of a hostile universe just another layer of sc
Jacob:So taking something, getting it up to 99.999, a couple of repeating nines, percent of the speed of light, and hitting a sun, for example, could actually be enough to cause their sun to go nova.
Lucas:Now that's an interesting idea if you wanted to destroy their entire solar system just to be safe. I mean, well, why stop there? Let's just take out all the suns around them just to be safe.
Jacob:Hello and welcome to Entropy Rising, where we talk about science and futurism. I'm your host, Jacob, and I am here with my wonderful co-host, Lucas. Lucas, how are you doing today? Jake, I'm doing great today. How are you doing? I'm doing good. I'm happy to hear that. Today we have a viewer-requested episode, actually. We ran a poll on Threads, and this is what was voted for. This is our second such episode, so if you're interested in voting in these polls, follow us on Threads. Link in the description. The topic for discussion today is the Dark Forest Theory, which, as you know, is a solution to the Fermi Paradox.
Lucas:Yep, it is one of the solutions, and for those out there who don't know, the Fermi Paradox is essentially a bunch of different theories as to why we have not yet found alien life. Our galaxy is so vast, and there are so many planets out there, that we just don't know why we have not seen or heard from any other life so far.
Jacob:Exactly. And the Fermi Paradox we actually covered in a previous episode. It was episode 5, so for anyone who's more curious about the Fermi Paradox as a whole, feel free to check out that episode, but it is not required that you listen to that before this episode. Yeah, not at all. As you said, there are multiple "solutions" to the Fermi Paradox, as they're typically called, and the Dark Forest Theory is one of those solutions. Now, the Dark Forest Theory is a bit of a chilling solution, right? It's not a happy solution. The Dark Forest Theory essentially says: imagine that you're a hunter alone in the woods, which are full of other hunters. The idea is you don't want to make a lot of noise, because if you do, you risk drawing other hunters to your location who might shoot you. And conversely, if you see a hunter making a bunch of noise, lighting a big bonfire, the Dark Forest Theory says it's to your advantage to go ahead and eliminate that hunter, because that means there's one less out there who could potentially shoot you later on down the road.
Lucas:Yeah, in the Dark Forest Theory, it's pretty much thought that every other civilization we could see, anything creating a signal that could be read by our instruments, would be a possible threat. So with that being known, would the safest and best option to preserve ourselves be to shoot first? Yes,
Jacob:exactly. And a lot of the Dark Forest Theory comes from this idea of game theory. Are you familiar with game theory? I'm familiar with game theory. I think most people are, if not by name; most people have at least heard of the Prisoner's Dilemma. Have you heard of that thought experiment, the Prisoner's Dilemma? I have not. The Prisoner's Dilemma is a famous thought experiment. It basically imagines you have two prisoners who are brought into separate interview rooms, and they can stay silent, or they can rat out the other person. And it goes through this explanation saying that it's always in their best interest to rat out the other person. And the Dark Forest Theory has a similar kind of dilemma. Because, let's say you do discover a separate alien civilization. Let's say that I'm alien civilization Jacob and you're alien civilization Lucas. Now, Lucas, if you discover my civilization, one of the things you have to keep in mind is that we really don't have any way to communicate in a fast manner. We might be a hundred light years away from each other. When you discover me, you have three options. Option A is that you can choose to do nothing: you can discover me and go about your merry way. Option B is that you can choose to send me a message; you can reach out to my civilization in hopes of maybe building up some type of friendship. And then option C is, if you have the technology, you could try to destroy my civilization. Now, one thing you have to keep in mind for the Dark Forest Theory is that when you discover my civilization, I have these exact same three options. I could choose to ignore you, communicate with you, or destroy you. Now, the reasoning behind the Dark Forest Theory says that options A and B, ignoring me or reaching out to me, have very little benefit to you. Ignoring me obviously has no benefit to you. Reaching out to me, maybe we can work together on science, but due to the communication delay, it's not like we're really going to be able to share materials. We can try to share ideas, but any idea I send you is a hundred years out of date by the time it reaches you, so you've probably discovered that on your own. But both those options, A and B, do carry a lot of risk. Because if you identify yourself in this dark forest, and I choose to annihilate you, if I go for option C, that's the highest cost to you, with the least amount of gain in return for it. And you have to know that if you send a message to me, I also might choose option C and destroy you. And you have no way of knowing what my intentions are, because we're a hundred years away from each other for any type of communication. Yeah. So the Dark Forest Theory and the game theory behind it say that, mathematically, since options A and B pose very little benefit to you but high risk, and option C has a little bit of risk to you but high benefit, which is that you preserve your own civilization, option C is the option that you should go for.
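For anyone who wants to see that expected-value argument laid out, here is a minimal sketch. Every payoff and probability in it is an illustrative assumption made up for this example, not a figure from the episode; the point is only that once being annihilated is costly enough and a first strike is assumed to succeed, option C comes out on top.

```python
# Toy expected-payoff model of the three options described above.
# All numbers are illustrative assumptions, not values from the episode.

ANNIHILATED  = -100   # payoff if the other civilization destroys you
SURVIVE      = 0      # payoff if nothing happens
CONTACT_GAIN = 5      # modest payoff from century-delayed idea sharing
REMOVE_RIVAL = 10     # payoff from eliminating a potential future threat

p_struck_if_quiet   = 0.2   # they may already have spotted your biosphere
p_struck_if_contact = 0.6   # messaging them hands over your location

expected_payoff = {
    "A: ignore":       p_struck_if_quiet * ANNIHILATED + (1 - p_struck_if_quiet) * SURVIVE,
    "B: contact":      p_struck_if_contact * ANNIHILATED + (1 - p_struck_if_contact) * CONTACT_GAIN,
    "C: strike first": REMOVE_RIVAL,  # assumes, optimistically, that the strike always succeeds
}

for option, value in sorted(expected_payoff.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{option:>16}  expected payoff: {value:+.1f}")
```

Note the caveat baked into option C: the moment the strike is not guaranteed to succeed, or launching it reveals you to a stronger third party, its payoff collapses, which is exactly the objection the hosts raise later in the episode.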
Lucas:Yeah, that makes total sense. You would have first-strike advantage if you were the first to act. And, unfortunately, those who shoot first would prevail.
Jacob:That's at least the idea behind the Dark Forest Theory, right? Whoever shoots first has the best advantage. Now, do they? I don't know, that's a little bit up for debate, right? Because there is a huge risk, too, with trying to shoot first, which is that if you discover my civilization, and let's say that we're on about the same technological level, and you fire off an RKM, a relativistic kill missile, at me, I have a hundred years to develop before that reaches me. So there is a risk that maybe by the time it gets to me, we've expanded out past our solar system. Or let's say we're a thousand light years away from each other. By the time you detect my civilization, it took a thousand years for that light to reach you, and then you decide to send a relativistic missile back. It takes a thousand years to reach us. We've had two thousand years of development. We could be a huge, multi-star-spanning empire at that point, and you just took a pot shot at us.
Lucas:Absolutely. And thinking about that, that's actually a huge factor: depending on how far away they are, the actions you take give them that amount of time to prepare. Exactly. Assuming that they would be able to notice that you had done something hostile at that point.
Jacob:But that's tricky because of light delay. If you detect the light from a civilization and then immediately respond with something that travels near the speed of light, which is the idea behind a relativistic kill missile, then there's no way they could possibly know that you fired that off. Because by the time the light reaches them to let them know you launched such a weapon, the weapon's right behind that light wave.
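To put numbers on that, here's a quick sketch of how much warning the target actually gets, assuming the missile is launched the instant the attacker sees them. The 100 light-year separation and the chosen speeds are illustrative assumptions.

```python
# Warning time = (missile travel time) - (light travel time) for a launch
# made the instant the attacker detects the target. Illustrative numbers only.

def warning_time_years(distance_ly: float, beta: float) -> float:
    """Years between the light announcing the launch and the missile's arrival,
    for a missile moving at a fraction `beta` of the speed of light."""
    return distance_ly / beta - distance_ly

DISTANCE_LY = 100  # assumed separation between the two civilizations

for beta in (0.5, 0.9, 0.99999):
    years = warning_time_years(DISTANCE_LY, beta)
    print(f"missile at {beta}c: ~{years:.3f} years of warning (~{years * 365.25:.1f} days)")
```

At half the speed of light the target gets a full century of warning; at 0.99999c the gap shrinks to well under a day, which is the "right behind that light wave" point Jacob is making.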
Lucas:Right. Well, you know, it could be a fraction of the speed of light, moving just a smaller object, something the size of an elephant or a school bus, that we could shoot at a planet. And that's more likely, because trying to get close to the speed of light is more difficult.
Jacob:It is trickier, but it also has a lot of advantages: the closer you get to the speed of light, the more kinetic energy it carries. And you're gonna need a large weapon to annihilate a civilization. So taking something, getting it up to 99.999, a couple of repeating nines, percent of the speed of light, and hitting a sun, for example, could actually be enough to cause their sun to go nova.
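For a sense of scale, the relativistic kinetic energy formula KE = (gamma - 1)mc^2 shows how sharply the energy climbs with every extra nine. The 1,000 kg projectile mass below is an illustrative assumption, and nothing here speaks to whether that energy could actually destabilize a star; it's just the kinetic-energy arithmetic.

```python
# Relativistic kinetic energy of an impactor: KE = (gamma - 1) * m * c^2.
# The 1,000 kg mass and the chosen speeds are illustrative assumptions.
import math

C = 299_792_458.0        # speed of light in m/s
TSAR_BOMBA_J = 2.1e17    # ~50 megatons, the largest nuclear device ever tested
MASS_KG = 1000.0         # assumed projectile mass (roughly a small car)

def kinetic_energy_joules(mass_kg: float, beta: float) -> float:
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

for beta in (0.9, 0.99999, 0.9999999):
    ke = kinetic_energy_joules(MASS_KG, beta)
    print(f"{beta}c: {ke:.2e} J  (~{ke / TSAR_BOMBA_J:,.0f}x the largest nuke ever tested)")
```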
Lucas:Now that's an interesting idea if you wanted to destroy their entire solar system just to be safe.
Jacob:Which you should probably do, because there's always the risk that they have O'Neill cylinders, maybe orbiting other planets. They could have colonized their moon, other moons; it's really hard to be sure. So yeah, taking out the solar system is the best bet.
Lucas:Let's just say that they're a hundred light years away. They've had a hundred years to advance from the point that we saw them as being a hostile civilization, and then they have another hundred years, if we're traveling at the speed of light or just below it, to advance more. So they have 200 years of advancement by the time our projectile reaches them. Yeah. Yeah, I mean, well, why stop there? Let's just take out all the suns around them just to be safe.
Jacob:That's fair, because there is always the risk that they could have expanded into separate solar systems. And there's a huge risk if you don't completely annihilate them, because what if they stick around, right? They're going to hold a grudge against you. You just made a blood enemy.
Lucas:Yeah. And I mean, then you share the same fate, but they also have the same issue, right? If we're at similar points technologically, then we would have advanced, assuming we would have advanced the same amount that they would have, and span multiple star systems.
Jacob:I agree. Now, you actually brought up an interesting point there. You said if we were at similar points technologically, and that's a really big if, because what are the actual odds that two civilizations would discover each other when they're at the same point technologically? I mean, a thousand years is a humongous difference technologically. Oh yeah, it's massive. And that's a rounding error on the kinds of timescales civilizations develop over. So what are the odds you're going to stumble across a civilization that's equally as developed as you? I think rather slim.
Lucas:I, I would say so, yes. So
Jacob:you're either discovering a civilization that's significantly more advanced than you, which means you're not going to want to attack them, or you're discovering a civilization that's significantly less advanced than you. And I think the Dark Forest Theory argues, you're going to want to target any civilization less advanced than you, because you don't want them to become an advanced civilization that could rival you.
Lucas:Yeah, but then again, if you start targeting those civilizations, you then show yourself to the larger civilizations that could then target you, and you've proven yourself hostile.
Jacob:That is also a fair point, and an issue with the Dark Forest Theory is that, you know, if you're this raging civilization taking pot shots at anyone who reveals themselves, how long before someone decides to take you out too? Because, like you said, you're now showing yourself to be incredibly hostile.
Lucas:Exactly. Which could then be an actual benefit to not acting the way the Dark Forest Theory prescribes, right? You said that options A and B give us little benefit. But if we are able to see that there is other alien life, and we can assume that alien life out there could be in more abundance, perhaps even a galactic community of some type, then it would be in our best interest not to fire.
Jacob:That's true, but I do want to point out, though, that this notion of civilizations directly shooting at each other isn't the only way that you can have a dark forest state.
Lucas:Oh yeah, absolutely.
Jacob:A famous one is something like the Inhibitors from Revelation Space. If anyone's read that book, it's a great book. They're basically machine intelligences that roam around the galaxy and annihilate any budding civilization that's entering the interstellar theater. And there's another proposal along the lines of the Dark Forest Theory. Are you familiar with von Neumann probes? Yes. So von Neumann probes are self-replicating probes, and it's a proposed way to colonize the entire galaxy. You send them off, they go to another solar system, they replicate themselves, and then they exponentially spread across the galaxy. There's this proposal for the Dark Forest Theory that maybe a civilization has developed von Neumann probes and, either intentionally or accidentally, these probes are also annihilating any form of life that crops up. So if you send out messages and any of these von Neumann probes notice them, they come to your solar system and just cleanse it.
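To get a feel for how fast self-replicating probes could blanket a galaxy, here is a back-of-envelope sketch. The hop distance, cruise speed, replication time, and doubling assumption are all illustrative; the real constraint ends up being simply crossing the galactic disk.

```python
# Back-of-envelope spread of self-replicating probes, assuming each colonized
# system eventually seeds two new ones. All parameters are illustrative.
import math

STARS_IN_GALAXY = 1e11      # rough order of magnitude for the Milky Way
HOP_DISTANCE_LY = 50        # assumed distance to the next target system
CRUISE_SPEED_C  = 0.1       # assumed probe speed as a fraction of light speed
REPLICATION_YRS = 100       # assumed time to build the next probe generation
GALAXY_DIAMETER_LY = 100_000

doublings = math.ceil(math.log2(STARS_IN_GALAXY))             # ~37 generations
years_per_generation = HOP_DISTANCE_LY / CRUISE_SPEED_C + REPLICATION_YRS

print(f"Doublings to reach every star: {doublings}")
print(f"Naive exponential estimate: ~{doublings * years_per_generation:,.0f} years")
print(f"Lower bound from crossing the disk: ~{GALAXY_DIAMETER_LY / CRUISE_SPEED_C:,.0f} years")
```

Even with the geometry handled properly, estimates of this kind tend to land on the order of a few million years at a tenth of light speed, which is a blink compared to the age of the galaxy; that is why roaming probes come up so often in Fermi Paradox discussions.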
Lucas:See, I actually would see that as a far more likely outcome as to what could happen. Maybe a civilization advanced not just a couple of thousand years, but a million years prior to us, and they've reached levels that are unfathomable. And they thought in the same way that we would think it's in our best interest to destroy our neighbors that are similar in technology to us; they were like, let's just destroy everything that gets to a certain point, right? And it could be that they come down and they just eradicate the smartest creature on that planet, and then they fly off until it gets to another point like that. But we could be being monitored right now and have no idea. But that would mean, if that was something that occurred and all of the other civilizations around us were already quiet, they either truly believe in the same theory, or they actually were able to witness another civilization go dark because of something like this and they are already staying quiet. Which would mean that everybody that is quiet out in the galaxy is either staying hidden or dead at this point.
Jacob:Yeah, that's true. To stay quiet, you need to have witnessed someone else get annihilated, right? So that means there does need to be a certain rate of new civilizations cropping up, going, you know, cold, and then all the remaining civilizations being able to witness that. And this is possible, I mean, this could happen. We've only been able to really do radio astronomy, for example, for what, a hundred years now? So maybe it's the case that every two, three, four hundred years this happens, and any civilization that happens to have the technology to detect this, but doesn't quite have the technology to make themselves noticeable, are just the lucky ones. They get to see this happen and they know to shut up. Or, like you said, it's possible that every civilization just gets annihilated eventually, because one of the issues with the Dark Forest Theory is that you really can't stay hidden on the galactic scale. You really can't hide life on a planet. I think a lot of people imagine our radio signals and our broadcast signals are what gives us away. But those are actually pretty unlikely to be discovered by an alien civilization, at least right now, and they're not the biggest giveaway that there's life on Earth. Actually, our atmosphere is going to be the biggest giveaway that there's life on Earth. One of the ways that SETI actually looks for alien life, and something that an advanced civilization would be able to do much better, is that we will take a spectrum of the atmosphere of a planet when it's passing between its star and our telescopes. By the way the light from that star gets shifted as it passes through the atmosphere, we can figure out the chemical composition of the atmosphere of that planet, and we can look for things like high oxygen content, for example, because oxygen is a fairly volatile element. It's very reactive, so it's actually pretty rare to have high concentrations of oxygen on rocky planets, for example, because it typically gets sequestered in the form of oxides, like iron oxides. Mm-hmm. So if you see a planet with high oxygen content, it's a dead giveaway: that planet probably has life,
Lucas:at least plant life.
Jacob:Yes. At least plant life, or some type of life,
Lucas:Yeah, some type of life that converts something into oxygen. Exactly,
Jacob:because it has to be a constant, recurring process, and there aren't many, or as far as I know there might not be any, natural processes for this to happen. So that means, if there is some alien intelligence looking to eliminate any life in the galaxy, they already know Earth has life. And that leads to one possibility, right? If this is true, if the Dark Forest Theory is true, and there's an advanced species eliminating any lesser species, it must be the case that at least basic life is very common. Because otherwise, the life on Earth would have been eliminated a long time ago. But if you imagine a situation where maybe basic life is very common, then these biomarkers of life would be all over the galaxy, and there are too many of them to efficiently go and kill all of them. So the only situation where the Dark Forest Theory makes sense is that basic life must be common, and intelligent life must be very rare.
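As a toy illustration of the kind of atmospheric screen Jacob describes, here's a sketch that flags planets whose free-oxygen fraction looks too high to persist without something continually producing it. The planet names, compositions, and the 5% cutoff are all made up for the example.

```python
# Toy biosignature screen: flag atmospheres whose free-oxygen fraction would be
# hard to sustain without a continual source such as photosynthesis.
# The planet names, compositions, and threshold below are illustrative only.

O2_THRESHOLD = 0.05  # assumed cutoff; abiotic chemistry rarely sustains more

atmospheres = {
    "Earth-like analog": {"N2": 0.78, "O2": 0.21, "CO2": 0.0004},
    "Venus-like analog": {"CO2": 0.965, "N2": 0.035},
    "Mars-like analog":  {"CO2": 0.95, "N2": 0.027, "O2": 0.0013},
}

for planet, gases in atmospheres.items():
    o2 = gases.get("O2", 0.0)
    flag = "possible biosignature" if o2 > O2_THRESHOLD else "no strong oxygen signal"
    print(f"{planet}: O2 fraction {o2:.4f} -> {flag}")
```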
Lucas:Yeah, I definitely agree with that. They would be looking not just for life, but for signs that life is advancing in a way that it could get to the point that the higher civilization is at right now. Exactly. And actually become a threat. We could build a Dyson swarm and then get blown up 20 years later. Yeah, that,
Jacob:or we could just get annihilated down the road, because if they're constantly checking the spectra of planets that can harbor life, they might be looking for certain key compounds that indicate that industry has happened on that planet. If you see a sudden spike in carbon dioxide, for example, maybe that would be a sign of, hey, you know, over the past hundred years this planet has had an unprecedented spike of carbon dioxide or other greenhouse gases. Let's send a dark forest strike at them just to be safe.
Lucas:Yeah. Yeah, absolutely. Or they saw evidence of us splitting the atom and they were like, okay. It's time to go. Yeah. Yeah.
Jacob:Yeah. It's very possible there are certain compounds that are made once that happens, and once that's detected, maybe these hordes of robot intelligences go, yep, that planet needs to go.
Lucas:They could be on their way right now. They're just a thousand years out.
Jacob:Yeah, exactly.
Lucas:But yeah, no, it's all part of the fun.
Jacob:Oh, is it fun? I don't know.
Lucas:Now, but talking about all that doom and gloom, there is a little bit on the brighter side. Personally, I don't think that the Dark Forest Theory would be a viable solution to this paradox. And I feel like you feel the same, Jake.
Jacob:Yeah, I do. I think it has a lot of issues. I think the one I proposed, of roaming von Neumann probes, is the most likely version, but that also has a couple of problems with it, too, in the sense that I wouldn't have expected life on Earth to get as far as it has already if that were the case.
Lucas:Yeah, the biggest issue that I find with it is that as we progress as a civilization, we become less violent. Okay. So you would have to assume that trend would continue, and, assuming that all life grows up the same way, which it might not, but assuming that all life would grow in the same way culturally as ours does, then we would assume that as they progress, and as we become a more unified planet, we would have those thoughts of violence less and less, and a higher civilization right now might not even have thoughts of conquest and violence.
Jacob:Okay, I see where you're coming from, and I can see some logic behind that, or some rationale behind it. You would think as a civilization gets larger and tries to undertake larger projects, like developing an orbital economy and branching out into space, they would need more collaboration. I think we're going to need significant levels of international collaboration to significantly colonize our solar system. So perhaps you're right that that's a necessary prerequisite to advance to the stars: to become less violent, less tribal. Is it universalizable? To an extent, I think so, maybe, unless you have a violent hive mind out there that... Yeah, or
Lucas:you could have warrior cultures that pride themselves on killing and sacrifice, and they've advanced in some way. I
Jacob:think that's actually pretty unlikely, Yeah, but it
Lucas:is a possibility.
Jacob:Is it? I don't know. I have a hard time imagining a warrior culture like the Klingons from Star Trek could actually become spacefaring. Even in the lore of Star Trek, the Klingons weren't always warrior-like; they developed significantly before they became like that. And I think even their warrior-like antics are contained to very specific things. The majority of the population is not going around murdering everyone all the time.
Lucas:No, that, that is true. the reason why I say that could be a possibility is I imagine a solar system where there's intelligent life that develops side by side. You could have spacefaring wars between the neighboring planets, which could then drive their technology even farther and more advanced because war does advance technology in most cases.
Jacob:I guess that's true, or at least in humans it does, because it's the only time we actually put a lot of money behind R&D. That's not necessarily universally true, though. You might have a species that just values research and dedicates a significant portion of its version of a GDP to research anyway.
Lucas:Yes. And just to be clear, I do agree with you that it is probably not likely, but it's not impossible. I guess it's
Jacob:not impossible. I do think it's, yeah. And that's the thing with the Dark Forest Theory in general. People like to argue, and I've made several posts about this on Threads, the general argument being that it's unlikely that civilizations would advance that far and then still be violent. But my argument to that is always: you only need one. You only need one civilization to get spacefaring technology and decide to eliminate anyone else beneath them to have a dark forest state.
Lucas:That is true.
Jacob:So even if it's incredibly rare that does happen, it only needs to happen once. So it depends how common life is.
Lucas:Yeah, there could be a section in our galaxy that is currently experiencing a dark forest, right? But perhaps then there is even another filter that could weed them out. Yeah. And then we're into multiple different theories for, or solutions to, the Fermi Paradox.
Jacob:Absolutely. I will also say, too, an issue with the Dark Forest Theory is, I went into kind of the game theory behind it at the beginning of the episode, and mathematically that kind of works on paper, but there are some significant issues. One of them I briefly touched on, which is that there is a significant risk to you in performing a dark forest strike, in trying to eliminate a budding civilization: one, you're identifying yourself, and it's really hard to launch something like that without being seen by others; and two, you have to be 100 percent certain that your dark forest strike is going to eliminate that civilization, or else you've now just increased the likelihood that they're going to retaliate. You've taken a huge risk, and if you can't guarantee 100 percent success, is it worth it to take that risk in the first place? And that's the type of math that every civilization would be running. And because of light delay, you have no idea what that species is like when your relativistic missile actually reaches them.
Lucas:Yeah.
Jacob:So it could be the case that you're looking at a civilization 300 light years away, and you're a hundred percent certain that if you fired your RKM right now and hit their sun, it would eliminate them. But can you be sure that when it actually reaches there in 300 years, they haven't advanced more than that?
Lucas:yeah, it definitely is, a massive risk at that point when you do think about it
Jacob:And can you be sure there's not some more advanced civilization than you, also looking for people firing these types of weapons? The minute you fire yours, they might be firing theirs at you. Like we touched on earlier, you've now identified yourself to the whole galaxy as a risk.
Lucas:Exactly. They could have just been indoctrinated into a galactic security type of thing, where it's just mutually assured safety. Yes. Amongst everybody. And then the new guy comes into the room and starts launching missiles. Yeah. They're gonna be like, dude, get rid of him.
Jacob:Yeah, exactly.
Lucas:Get him out of
Jacob:here. I do think the notion of some type of intergalactic community is actually pretty unlikely, due to light delay. Having any type of coherent large body that spans a galaxy is just not doable, in my opinion. This goes for colonization as well. But even with alien species, I think it's unlikely you're gonna have any type of wide-scale communication or coordination, because how could you?
Lucas:Yeah, we could think of it this way: that super-advanced civilization that came a million years prior, they don't communicate with any of the planets, but they do ensure the mutual safety of them.
Jacob:One way you could somewhat go about this, and it won't be coordinated in the way you're thinking of, an intergalactic security force, but it could be the case that when we launch into space and we start expanding, we get a message from a more advanced species, or maybe several of them, that says: Hey, congratulations, welcome to the galaxy. You are allowed to expand 200 light years, maybe 1,000 light years, in a radius from your home planet. That is your volume of space, and everyone is going to immediately target you and destroy you if you expand any further than that. That's the space you get. Please put up some markers, maybe a giant broadcasting station that says, this is my area. And if you expand beyond that, then you've become a target for everyone.
Lucas:Yeah. See, a galactic community that never sees each other, but gets a singular message. Yes. To let us know that they are there. I feel like that would probably be the most realistic way that we could look to coexisting with alien life.
Jacob:Yeah, I agree. And the irony of that is that's actually how the dark forests here on Earth work. Predators mark their territory, and the reason you do that is you're telling any would-be attacker of your same species, or anything like that: hey, this is my territory. And that's going to dissuade the majority of them, who decide they don't want to deal with the fighting, because it's a huge risk in nature to actually have a battle; if you get injured, it's death. And it's kind of the same thing in space, right? If you try to take over a region of space, you're also risking getting injured, and in a dark forest state galaxy, taking a significant blow might not kill you, but now you've become significantly weakened, and there's a risk other people are going to attack you now. So just announcing your presence, saying, I'm taking this volume of space, it's mine, could be enough for someone to say, it's not worth it to expand into that space; even if it would be beneficial to me, it's just not worth the risk,
Lucas:which would definitely be much more of a safe route. It's just, establishing that communication, marking our territory to others, would be something that would have to be understood throughout. So it would need that generalized message.
Jacob:Yeah, exactly. Maybe some type of beacon that just announces it. The only problem is we don't see anything like this, and you would expect these beacons to work on some form of electromagnetic radiation. I guess people always say, ah, maybe they use some form of communication that we don't know of. It's not impossible. If you're a tribe in the Amazon, you might look around and say there's no other civilization, because you can't detect radio waves. I don't know if I fully buy that, though. There could maybe be some extra layer of physics that enables communication that we're unaware of, that maybe is beneficial on galactic scales and that we haven't discovered. And maybe when we develop that technology, we'll find that we actually are surrounded by aliens. But I do personally believe that's probably not likely.
Lucas:Yeah, I mean, I like to buy into the thought that we're just scratching the surface of anything that science can offer us, because that's how science always is: you think that you understand it, and it's never the same a hundred years later. I have no evidence to support that that wouldn't continue, right? So, you know, I will buy into that. But then, of course, on the other hand, we have to try to understand and rationalize these theories with what we do know currently. So I do see both sides of that.
Jacob:It could be the case that maybe the whole galaxy is communicating with gravitational waves, which is something that is real and can be manipulated, and maybe it turns out that's more efficient over long distances, and some technology allows that to be a way we communicate. Because right now, humans can detect gravitational waves, but we can only detect massive events, like two black holes colliding. So maybe all these civilizations have discovered an efficient way to send messages that way, and that's how they communicate, and we just can't detect those signals with the right level of fidelity. It definitely could be. And that's just one example. Maybe there's some other stuff, too. We don't really understand dark matter, dark energy. Maybe there's something that can be done with that, or maybe there's a level beyond that.
Lucas:Yeah, these are all theories for things that could work that way. But another reason why I don't feel like the Dark Forest Theory would be a real thing is watching our civilization and how we have eagerly thrown our information into space on our Voyagers. We've put our understanding of physics and the coordinates to our planet, specifically, out into the universe. We have SETI, whose entire job is to try and find alien life, which would then reveal us. At what point do we just completely shift and go, oh, stop all that, we're just going to hide?
Jacob:Yeah, I know. I agree with that. And there's this axiom in statistics that says if you only ever have one data point and you need to extrapolate from that data point to a much larger set, you should always assume that data point represents an average, because it's much more likely to represent an average result than an outlier. I think, with what you said, we should look at what humanity is doing, and it's not an unreasonable assumption to think aliens would do the same thing. And I think naturally, if you're going to be a spacefaring civilization, you have to be at least somewhat curious to develop science in the first place. So yeah, why wouldn't alien civilizations also announce their presence and try to find other ones out there?
Lucas:Yeah. Even with us, I feel that humans and our planet are particularly violent.
Jacob:I think what I said also applies here. We should assume humans are about an average level of aggression. Any alien civilization has also had to go through natural selection, evolution, predation, so I think it's reasonable to assume we're not uniquely violent. We're probably averagely violent.
Lucas:Okay, yeah. No, I do agree with that. We are averagely violent, right? But we are violent. And yet we still go into this new frontier, and what could be new civilizations, with eagerness instead of fear.
Jacob:I think so. Yeah.
Lucas:And I just feel like that will never naturally shift unless, like we talked about before, we saw evidence of life on a planet and then one day it just went out.
Jacob:Yeah, that, or maybe as we develop technology, we start finding signs of ruined civilizations. At that point, we might say, all right, guys, we've got to be quiet, something's going on here. A lot of civilizations are falling. Shut it all down, shut it all off.
Lucas:Yeah, but then you also have to assume if we are an average, and other civilizations developed like us, who started it?
Jacob:Yeah.
Lucas:who was like, oh, there they are. Let's launch the first missile and get this whole thing going. It's just, it just seems unlikely to me.
Jacob:Yeah. No, I agree, I agree. The Dark Forest Theory is good food for thought, and it makes a great book: The Three-Body Problem, if you haven't read it, that series, the second book of which is literally called The Dark Forest. And while it didn't invent this theory, it did make it popular. It was a great book. I liked it, but in reality, I don't believe it's the most realistic solution to the Fermi Paradox.
Lucas:No, it is one of the most exciting, though. It is. It makes for good books. It does. It's great, yeah.
Jacob:Tells a wonderful story, I think.
Lucas:Yeah.
Jacob:All right, Lucas, I think we both have similar feelings on the Dark Forest Theory. This was an interesting discussion. Hey, before you guys close out of your players of choice, I want to let you guys know that, as we are approaching our 10th episode, Lucas and I want to make an additional episode, sort of like an AMA, a get-to-know-us and all that, that we will release after episode 10 but before episode 11. So if you have any questions for us, whether it's personal questions about Lucas or me, questions regarding anything on science fiction or futurism, questions about previous episodes, or questions about the podcast itself, just let us know. Leave it in the comments of whatever platform you're on; I check all of those. If the podcast platform you're on doesn't allow comments, check out our Threads. I'm gonna put a post on there so that you guys can leave your questions, and those will be answered in that episode. And like I said, that's not gonna replace episode 10 or 11; it'll be released the Monday in between. Please leave your questions, we're eager to get to know you guys a little bit better. And I think it'd be a great way for you guys to get to know us personally.
Lucas:And it'll be fun.
Jacob:Exactly. thank you so much for joining us. Please stick around to the end to hear our author spotlight.
Lucas:Thank you all so much. Bye bye.