ETI Musings (Spoilers)

ChaserGrey

These are some ideas that have been ricocheting around my head for quite a while now. I'm posting them here mostly to get them out of my brainpan and in the hope someone here might find them amusing. If I'm going over ground that has already been beaten to death, I apologize in advance.

A quick note about tone. Perhaps it's because I've played too much Call/Trail of Cthulhu, but when I come to EP I find I don't really want another game of hopeless horror. I want there to be at least some chance for transhumanity to adapt and survive, provided we don't screw things up too badly. If extinction is inevitable, why play a game about fighting it?

That, to me, says there pretty much can't be an active, hostile ETI out there. To misquote Wells' War of the Worlds, there wouldn't be a fight between humans and the ETI, any more than there's a fight between humans and ants. (Dealing with an infestation in my kitchen actually got me thinking on this again.) They mine star clusters like they're mineral veins. We don't. If the Exsurgent virus is just the first layer of some cosmic antivirus system, imagine what active measures would look like. Set course for the scum barge, folks, because we might as well die drunk and high.

Solution? They have to be either not active, or not hostile.

Scenario 1: Not active. This was discussed in the main book as one of the possible origins for the Exsurgent virus: that it was a weapon in some unimaginable ancient war that hung around after the combatants were wiped out or ascended. To be honest, I never bought this at first. If a weapon was designed to devastate a transcendent civilization, how could we possibly survive it?

Then I got to thinking, and came up with the following thought experiment. Imagine a sufficiently advanced computer virus that magically bricks any system it comes into contact with. If that got loose in our current civilization, it would be cataclysmic. Computers keep our power plants running, fly our airplanes, run our medical devices, manage the delicate supply chains that deliver incredible abundance just in time for maximum efficiency, and handle virtually all of our communications. At best, we'd see a massive die-off in the first world and worldwide technological regression. At worst, well. You get the picture.

Now imagine that same virus got loose in the early 1980s. Some of the military's most advanced weapons would stop working. There would be financial disruptions as the computers that kept insurance and stock records failed. But computers weren't embedded in everything yet, so most technology would keep working. You might get a new Great Depression, but not megadeaths.

What if the Exsurgent virus gets worse the smarter you are? What if the TITANs are dead of some electronic meningitis that transhumanity can survive because we're not advanced enough to get the full effect of it yet? If that's true, there's some hope, because we know the virus is out there. We have the chance to advance our technology while building in appropriate safeguards, so that one day we might even reach the level of the ETI, with a more robust technology capable of living with the virus.

Or we could screw it all up and go extinct, but that's always true, isn't it?

Scenario 2: Not hostile. The ETI has been around a long time. Long enough to do studies on the evolution and extinction of sapient species with a statistically significant n (just ponder that one for a second, monkey boy). And what they discovered is, well, kind of depressing. The most likely path for sapient civilizations, it seems, is stagnation, followed by resource exhaustion of their immediate solar system and extinction. Very few species manage to discover the key enabling quantum-scale technologies (psi, stable wormhole gates, metastable self-improving AI) in time to avoid this fate. It can be done (obviously, because the original ETI species did it), but it's not very likely.

So they decided to help.

The Exsurgent virus is a kick in the ass. It's designed to take a species that has shown the potential to discover those key enabling technologies and force an existential crisis on their civilization. It's been shown that in a statistically significant number of cases, this leads to risk-acceptance behaviors and forced adaptation/innovation that improve the chances of successfully crossing the transcendence threshold before extinction. (See paper by Groxnar-11 and Jastreb-2 in next month's Journal of Interventional Xenoanthropology.) Or, you know, it kills them. But they were probably headed that way anyway.

Think of it this way. You have a ward full of cancer patients, each of whom has a 10% chance of survival. You also have a chemotherapy drug that will kill 25% of them and cure the rest. It's not a slam dunk by any means, but I think there's a serious ethical case to be made that you not only may administer the drug, you should. You may kill some people who would have survived, but ultimately there will be a lot more living people at the end of it.
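To put actual numbers on it: take a ward of 100 patients. Left alone, about 10 of them survive. Give everyone the drug and 25 die from the treatment itself, but the other 75 walk out cured. Trading roughly ten expected survivors for seventy-five is the whole argument in one line.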

The Exsurgent Virus. It's for your own good, kids!

Sleep tight.

Urthdigger

I typically portray the ETI as truly unknowable, rather than strictly hostile or benevolent. We weren't exterminated because, quite frankly, it doesn't care enough to bother. We caught a minor bit of attention and triggered a retaliation, but for the most part we're neither a threat nor even a curiosity. We survive by dealing with the fallout of that retaliation and figuring out what precisely we did to piss it off.