Galactic Civilization and ‘The Filter’ Fallacy

The Filter, in Brief

So, there is this idea of ‘The Filter’, the great divide that separates intelligent species from the galactic civilizations that are their supposed birthright. The reasoning goes as follows:

  • All intelligent species exist on the path to developing technology and culture that will enable them to expand to control their planet, then their solar system, then multiple solar systems, and then spread throughout the galaxy
  • Given the age of our own galaxy, we would expect other intelligent species to have evolved before humans, and to have already begun this colonization process
  • We don’t see any sign of them, and so there must be The Filter, the crisis that prevents intelligent species from moving along this inevitable staircase of development
  • So we wonder – is The Filter behind us, and we’re one of the few intelligent species to make it this far, or is The Filter ahead of us, and we’re doomed?

In thinking about what this Filter could be, one can come up with places it could occur starting all the way back with the origins of life:

  • It could be that it is very unlikely for life to begin at all, and so on planets in their star’s ‘Goldilocks zone’ we will only find various kinds of chemical soup
  • Maybe it is very unlikely for multicellular life to develop, and so that soup will just be filled with simple single-celled organisms
  • It could be very unlikely for intelligence to develop (this one is a hard sell for anyone who has looked at the intelligence of non-human animals)
  • Perhaps runaway feedback loops like climate change prevent intelligent species from living long enough, or maintaining a civilization long enough, to colonize their solar system
    • Ditto with something like thermonuclear Armageddon, or AI deciding to kill us off, or nanotechnology turning us all into grey goo, etc.
  • Or a lack of any faster-than-light travel solution could make colonizing worlds beyond one’s homeworld economically impossible
  • Or maybe something crazy, like an FTL-capable civilization wipes out all competition, and they just haven’t decided we are a threat yet

Clearly there are a lot of other options, but those above are common.

Flawed Premise

The problem I see in this formulation immediately (and I’m far from alone, nor innovative, in doing so) is that it is founded on the premise that all intelligent life will inevitably lead to something like our own technocratic, hierarchical and exploitative way of life. That is, we take the way we happen to live now as a cosmic given, and then reason from there.

That’s insane. That’s a failing grade on your term paper in Philosophy 101. That’s a huge argument built on a sample size of one, when we even have other intelligent species on Earth to look at for other examples. Why not argue that orca intelligence is inevitable, or cetacean intelligence, or chimpanzee intelligence, or the emergent intelligence of insect colonies? We’re not even the only intelligence here. We’re just the most disastrous for every other living thing.

Conclusion

Maybe there is no filter, and we are just caught in the throes of a suicidal trajectory because we are a particular kind of intelligent life in a particular situation. There’s no reason to assume that all life would be in a similar situation, much less to assume that all intelligent life would inexorably seek to exploit their entire planet, and then solar system, and then multiple solar systems.

Maybe as we find signs of life in other places, that life will be living in approximate balance with its ecosystems, like the various species of human did for hundreds of thousands of years before the last ten thousand or so. Maybe they will have developed means to detect us, and have meetings to decide what to do about this one rogue form of intelligent life out there that seems hell-bent on killing itself and everything around it. Can they somehow contain the damage we do? What do the thousands of other intelligent species on other worlds think?

The galaxy could be empty of star-spanning civilizations because of wisdom and no other reason. The “Filter” could exist only in our thinking about the nature of life, and intelligence, and civilization. It seems that we are catastrophically wrong about how to live on our own planet – it stands to reason that we would be catastrophically wrong about how to live on multiple planets circling multiple stars as well.

Sci-Fi: Resleeving, Star Wars, Supers and Moralism

Re-sleeving Madness

In science fiction settings like Eclipse Phase or Altered Carbon, there is the idea of “re-sleeving” – that you can download your consciousness into new bodies. This leads to a situation where those with means can live forever, continually downloading into new bodies (as well as more complicated things like downloading into multiple bodies at once, etc.). I’ve always thought that this would, over time, drive someone insane. I like the idea of having a table to roll on every time you re-sleeve that lists quirks and maybe even delusions which build up over time. Ideally, you end up with thoroughly deranged ultra-rich oligarchs who are almost impossible to kill.
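Just to sketch how that roll-on table might work in play, here is a minimal Python example that accumulates quirks with every re-sleeve and starts layering in delusions past an arbitrary threshold. The entries and the threshold are invented placeholders for illustration, not drawn from any published Eclipse Phase or Altered Carbon material.

    import random

    # Minimal sketch of a cumulative re-sleeving quirk table.
    # All quirk and delusion entries are invented placeholders.

    QUIRKS = [
        "phantom itch from a limb the old sleeve didn't have",
        "compulsively checks mirrors to confirm the current face",
        "craves foods the previous sleeve loved",
        "flinches at the smell of resleeving-clinic antiseptic",
        "mild dissociation when hearing recordings of their own voice",
        "insists on sleeping on the same side of the bed as sleeve #1",
    ]

    DELUSIONS = [
        "believes an earlier sleeve is still alive and following them",
        "convinced the 'original' never died and this copy is the impostor",
        "certain their oldest memories belong to someone else",
        "feels forks of themselves thinking in parallel",
    ]

    def resleeve(character, delusion_threshold=5):
        """Add one quirk per re-sleeve; past the threshold, add delusions too."""
        character["resleeves"] += 1
        character["quirks"].append(random.choice(QUIRKS))
        if character["resleeves"] > delusion_threshold:
            character["delusions"].append(random.choice(DELUSIONS))
        return character

    if __name__ == "__main__":
        oligarch = {"resleeves": 0, "quirks": [], "delusions": []}
        for _ in range(8):
            resleeve(oligarch)
        print(f"After {oligarch['resleeves']} re-sleeves:")
        print("Quirks:", *oligarch["quirks"], sep="\n  - ")
        print("Delusions:", *oligarch["delusions"], sep="\n  - ")

A real table would probably want duplicate rolls to intensify an existing quirk rather than repeat it, but the basic idea is the same: the longer you keep cheating death, the stranger you get.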

Sequel Requel

I ran a pretty fun Star Wars game using Fate Core which was a “requel.” We took the premises and starting-point of the Star Wars prequels and then took them in an entirely different (not-terrible) direction. I would like to run a game that is a sequel requel, which takes the basic premise of The Force Awakens and then takes the story in a new direction. I’m much happier with the sequels than the prequels, but they still left plenty to be desired. I’d personally want to take things in a more The Last Jedi sort of direction (I still think the saga should have ended with something like TLJ and that it was the wrong middle movie) but it would be fun to see what the players did with the basic elements of The Force Awakens: Luke is the MacGuffin, Ben Solo is the pseudo-Darth Vader, the old guard is passing the torch, the new Jedi Academy failed spectacularly and led to the Knights of Ren, etc.

Force Sensitivity

I would love to see Force sensitivity in the Star Wars universe presented as, in many ways, simply sensitivity. This would help explain why Jedi training is so opposed to feelings and interpersonal connections, and why Sith are so volatile. It would explain why Kylo Ren throws tantrums. Force sensitivity is sensitivity to every living thing, to every feeling, to every intuition. It would be overwhelming, like hyper-empathy, and would do a bit to explain the weirdness of Jedi teaching and the frothing man-rage of Sith lords.

Charlie and the Amazon Fulfillment Center

I was thinking of Charlie and the Chocolate Factory and its odd, idiosyncratic moralism. According to the story’s internal logic, moral failings include watching lots of TV, chewing gum and over-eating. These are the kinds of things that exclude the kids from winning the ultimate prize, earning them Oompa Loompa songs mocking them as they go. I always thought those were kind of stupid things to list as moral failings. I was wondering what it would look like for the moral failings to be actual wrongdoing. Maybe the story could be changed so that adults are on the tour, and the tour is of a magical Tesla factory or Amazon fulfillment center, full of robots and weird mechanisms and magical inventions.

So the moral failings could be real failings, eliminating the adults one by one, with swirling AI robots singing their mocking songs. At the end, the winner gets the prize of the factory or fulfillment center or whatever, and they immediately liquidate the place to give the proceeds to the poor, or fire themselves and all management to make the place into a worker-owned co-op, because that’s what an actual moral person would do in that situation. Update the morality tale for the 2020s.

Physics-Based Superpowers

I know this has been discussed many times in many places, but it is fun to think about superpowers that are more rooted in real-world physics, biology and chemistry (more than just “it’s a mutation” or something). Things connected to mechanisms we understand, just ramped up to superhuman levels. Maybe one could be slowing or accelerating entropy, or slowing or accelerating local time. One could reach into alternate dimensions and bring things to our own dimension from them, based on the multiverse theory. Is there a (pseudo)scientific basis for projecting one’s thoughts? Maybe powerful pheromone control for ‘mind control’ and hyper-empathy for mind-reading. Changing one’s own metabolism, so super-speed would lead to a glucose crash and being flooded with lactic acid. Engineering viruses to rewrite others’ DNA. Having extra organs derived from animals, giving you the abilities of an electric eel or shark or lionfish. Anyway, I’ve always thought it would be interesting to have a supers setting where the science behind it all was more of a consideration than it tends to be.