42 Comments

There's a passage in the Odyssey where Odysseus tries to fast-talk Athene into doing something. Athene's reaction shows that she's pleased by his cleverness, but rather like a mother whose child has just tried to be clever: You could see her thinking "Oh, he's so cute!"


From the opposite angle, a series that really dropped the ball and just didn't bother to look at what its technology suggests is "Undying Mercenaries". The basic premise is that there is an alien, Roman-style galactic empire, of which Earth and humanity are a far-flung provincial backwater. However, humans make pretty good mercenaries, and they gain access to alien technology that scans your body and constantly scans your mind, so if you die you can get a 3D-printed body, put the mind in, and hey presto, give the man a gun and send him back out. The author, over the course of 19 novels, kinda sorta scratches at the implications, as the tech goes from military-only to sorta-legal civilian/government use. No question of "What does it really mean when leadership, whether military, government, or business, is effectively immortal?" "What happens to people who are perpetually 25 due to dying over and over in battle, yet everyone they know outside their unit has died?" "You've just explicitly disproven every religion with an afterlife... now what?"

Early on the author at least makes an attempt to look at some implications around making multiple copies of yourself, and what happens when one criminally minded human breaks galactic law to do so, but then that's it. Quite frustrating... I wrote a review of it here: https://dochammer.substack.com/p/book-series-review-undying-mercenaries

author

Is there any discussion of the philosophical issue of whether you are being revived or whether you are dead and a copy has been created that thinks it is you? It is relevant to how willing people would be to risk death.


There is no philosophical angle at all, no. Early on there is a clever character who breaks the galactic laws about revivification and makes millions of copies of himself for nefarious purposes. About 3/4 of the way through the series folks realize that if you get the brain and body scan data from someone, you can revive them (or make a copy, really) and interrogate them without their other self (presumably home safe) ever knowing about it, and in fact do it over and over again to try different strategies. Sadly, that is the extent.

It is relevant to how willing people are to risk death, and to kill each other, from the first book on (when I think the author got all his ideas out), and that part was pretty good. New recruits are routinely killed during training exercises and as discipline, to get them used to it. Murder of another recruit is a misdemeanor, for wasting resources.

The sense of danger in the books comes from the possibility of being "permed": dying outside the range of the wifi network that tracks your brain scans and confirms you are dead. It is super illegal (a species-extermination offense... sometimes) to have multiple copies running around at once, so if they can't prove you died, you stay dead. Pretty clever.

One other aspect the series hints around, but never gets into over 19 books, is that the copies aren't always perfect. The bodies might come out wrong and get recycled, which has no real relevance, but the brain part can come out wrong too. A new version might be more high-strung or something, and it is suggested that older tech sometimes broke people's brains over time and they went nuts. But that's all fixed now! But maybe newer tech still has a chance of dropping memories as people live longer lives than human bodies were meant to... NOPE.

It was such a lot of wasted potential that I can't even recount it now, a few years on, without getting frustrated.


It's certainly relevant to me. I have thought for some time that if I were in the Star Trek universe and were told to get into a transporter, I would refuse, because the transporter would kill me and create a copy; entering it would be the last thing I would ever experience. I might choose to do so to avoid a more painful death by torture, or to save my wife from dying, but no ordinary punishment would be sufficient to persuade me. Of course, for obvious reasons, the Trek universe contains no one who seriously holds such views (Dr. McCoy's grumping doesn't count; he gets into the transporter every time).

author

I believe Heinlein has a character with those views in _Waldo and Magic Inc._, though I don't remember which part.


There was some handwaving early on suggesting that the transporter was doing a macro-scale quantum transposition, like an electron jumping from one shell to the next. That may just have been something I read in the show bible, or something Blish interpolated into his versions. In any case, stuff like getting stuck in the pattern buffer, filtering out foreign biological infections, and half-mirror duplications like the one Will Riker underwent pretty much rules that interpretation out.


Dr. Pulaski (season 2 of TNG) only used a transporter when it was a matter of life or death.


Hers, or her patients'? I suppose I can see a doctor being dedicated enough to commit suicide to save someone's life, but it's not something I'd do.

Though on the other hand, if transporters can make copies of people, I'd say, "Fine, make a copy of me and send it."


A noted transporter-phobe, she saves her own life in this manner. Or at least a copy of her. https://en.wikipedia.org/wiki/Unnatural_Selection_(Star_Trek:_The_Next_Generation)


I am amazed it took Trek what, six seasons of TNG to address the issue that they can make copies of people whenever they want. (I may be entirely misremembering that, but wasn't it Riker who had a second version of himself running around?) Clone soldiers never became a thing, for instance, nor did recreating entire cities as colonies, down to the very last brick and person.


“Better communications technology makes it easier to live in your own bubble”

Also easier to venture out of your bubble seeking the scalps of those guilty of wrongthink.

As for the plausibility of mere humans defeating highly intelligent gods, it does seem implausible that a human protagonist could win a game of 4D chess in such a situation. But it doesn't seem too implausible that a human protagonist, by making a mistake or getting a lucky break, might send events in an unexpected direction and hence foil such a god's carefully laid but perhaps fragile plan. Whether this would result in the outcome the human hoped for is a separate question, unless foiling the evil plan was the only intention.


AlphaGo's strategy was approximately optimized for robustness, maximizing its probability of winning rather than pursuing the more fragile goal of winning by as many points as possible. You'd expect a wise god to show similar wisdom.
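
A rough way to see the distinction, using a made-up move evaluator rather than anything from AlphaGo's actual code: a robust player takes whatever move gives the highest estimated probability of winning, while a point-maximizing player takes the move with the largest expected margin, accepting extra risk of losing outright.

```python
# Hypothetical sketch of two move-selection rules, not AlphaGo's real implementation.

def pick_robust(moves, win_prob):
    # win_prob(m): estimated chance of eventually winning after move m
    return max(moves, key=win_prob)

def pick_greedy(moves, expected_margin):
    # expected_margin(m): expected final point difference after move m,
    # which may trade a small drop in win probability for a bigger score
    return max(moves, key=expected_margin)
```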


Correct. It's the difference between playing a single opponent in a zero-sum game, and dealing with multiple agents each with different agendas.


More complex settings usually give *more* advantage to higher intelligence, not less. Even Taleb calls his target "intellectual yet idiot".


I don't think you got my point. In a one-on-one game, the mere human loses. But the god's real-world plan might be vulnerable to failure for weird, difficult-to-foresee reasons, which a human protagonist might stumble into. So the human wouldn't win in any real sense, except that the god's plan would fail. Of course, if the god is still around, it will just make another plan.


If I may restate your point a bit: the issue is that games typically have known probabilities for every action and are closed systems, in the sense that what can and cannot happen within the game is fixed. Reality has neither of those limitations, so being smart only gets you so far toward always having your plans succeed. Omniscience is required for your plans to always work out. Otherwise you still will not be able to know how robust your plan is and what outside forces might break it, and so you are vulnerable to whatever bugs get into your plan's system.

(I would also add that the existence of other gods with differing goal sets will also make things dicey.)

author

The Glowfic threads I mentioned assume that the god's intelligence maximizes expected payoff, but leave open the possibility of bad results from losing a rational gamble.


At the same time, though, gods also have a lot more to pay attention to. For things they really care about, they are probably very difficult to outmaneuver, but there are probably wide ranges of matters they are only mildly interested in.

Likewise, gods might play barbell strategies à la Taleb, optimizing for robustness in most of the plans they care about but keeping a few fragile, maximum-payoff plans as well, just in case. Of course, the gamble from the point of view of the mortal is guessing which type of game plan you are messing with.

With multiple competing gods it gets trickier too, since they might have to forgo robustness for bigger gambles in order to win in the long term, so mortals might have more opportunities to slip in.

Your example would make for a great distinction between archetypal gods, though. Some are more careful while others are more willing to take big risks. That should affect both how they respond to their followers' behavior and how their goals and plots work out and interact with gods and men. I suspect the rash god of thunder is going to have many more followers than the god of accounting, but over time... things might get interesting.

author

Have you read any of the Glowfic stories, one of which I quote in the post? The gods' attention problem is explicit in some of them. Most of the time you are interacting with, at most, a small fraction of the god, because the god is paying attention to many worlds at once, each with a bit of himself. If that fraction decides it is important enough, more of the god starts paying attention, but at the cost of less attention to other worlds.


I haven't, no, but I will have to pick it up this summer, thanks!

I do recall a similar mechanic from an old Forgotten Realms series involving a war of the gods and the death of some and the ascent of others. It used basically the same notion of gods' minds as faceted crystals, with each facet paying attention to different things, or multiple facets focusing on really important things.

I think in part it was used to explain why no one god becomes super powerful: the power they gain from additional worshippers doesn't keep up with the power they expend to pay attention to them or help them (although it has been nearly 30 years and I might be mixing a lot of my own explanation in with what is actually in the text). The really bad god who was trying to kill other gods and take their portfolios got dethroned largely because the remaining gods coordinated to create a whole lot of situations demanding attention all at once and then attacked him personally, creating a dilemma where either his followers got annihilated or he did. Or something.

At any rate, I remember it (kind of) three decades later because it made for a clever resource trade-off between attention, god-magic-power, and followers. You need followers to power yourself as a god, and retaining followers requires using that power to pay attention to them and grant them boons, but that also means you are not necessarily achieving your own ends in as focused a manner. Now that I think of it, it works like the dictator's trade-off in public choice... I wonder if the authors were aware of that or developed it in parallel.

author

The relevant Glowfic threads are set in Golarion, which originated as a Pathfinder scenario. I don't know how much of this the authors added and how much was in the scenario.

I don't think more followers increase a god's ability to pay attention. The function of the followers is to do things the god wants done. It's a little like Bujold's line "The gods have no hands but ours," although the Golarion gods can sometimes, not often, intervene directly.

Incidentally, Bujold's fantasy is another interesting example of taking gods seriously.


That's a good point, that the amount of direct intervention gods can inflict upon the material world differs a lot from world to world. Real-world religions tend to have gods that can directly reach out and change the world (or possibly make it), while most fantasy worlds are a little more hands-off for one reason or another. It then gets tricky to explain why the gods can bless or enchant their followers but not directly reach out and smite those who annoy them, or, if they can reach right out, why they don't.

I always liked the Dragonlance storyline that the gods accidentally half-broke the world, and so all agreed to back out and not get involved directly, until the evil goddess broke the deal. Not a good long-term solution, but it makes for a neat twist on the whole thing. It still leaves open the question of why the gods don't just dump a bunch of power through followers when it is convenient.

I will have to look up Bujold's stuff too. Treating actual, extant gods and their interactions with mortals seriously is rare, except for those cases where the answer is "The gods just don't care a lot and are busy screwing around with their own games, and they are kind of assholes," but a really serious work thinking through the actual implications would be super interesting.


I really love Alfred Bester's work, specifically The Demolished Man, for how it covers fictional telepathy.

That said, I really have to recommend the following short story (by my best friend) exploring the consequences of a fictional machine that can revert any object (including a person) to a previous state:

https://www.lesswrong.com/posts/CKgPFHoWFkviYz7CB/the-redaction-machine


"Taken seriously, they should be not only more powerful than humans but smarter. "

I remember reading a story about alien invaders who were substantially less intelligent than humans. I don't remember the title or the author, nor how he made the story interesting rather than "Well, we were better than them at everything, the end."


Christopher Anvil wrote a bunch like that, I think. There was also one that I *thought* was by Michael Flynn, but I can't find it in his list, where we are invaded by somebody who is just barely industrial. It turns out that space warp is actually really simple, but you don't get there by the Newton-Kepler-Einstein path we took, which for some reason obscures the insight you need.


I think about these things all the time for my Dungeons & Dragons campaigns. I came to some conclusions to make the lore of typical settings make sense.

Not just healing, but why aren't all or most problems solved with magic? That it's expensive, because of low supply and high demand, is a sufficient answer.

More important is the fallibility of divination magic. The game doesn't point this out, but it does provide magical countermeasures that can produce both false negatives and false positives for Detect Lies, Detect Evil, etc. Thus it is considered circumstantial evidence in courts of law. Thus it is considered a heinous misdeed when a paladin kills a random passerby simply for detecting as evil.
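
To put hedged numbers on why a positive reading is only circumstantial (the base rate and error rates below are invented for illustration): with genuinely evil passersby rare and the detection imperfect, most "evil" pings land on innocents.

```python
# Invented numbers: a quick Bayes check on a Detect Evil ping.
p_evil = 0.01                # prior: share of passersby who are genuinely evil
p_ping_given_evil = 0.95     # true positive rate of the spell
p_ping_given_good = 0.05     # false positive rate (countermeasures, error)

p_ping = p_ping_given_evil * p_evil + p_ping_given_good * (1 - p_evil)
p_evil_given_ping = p_ping_given_evil * p_evil / p_ping

print(round(p_evil_given_ping, 2))  # ~0.16: the paladin's victim is probably innocent
```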


Training your thoughts isn't some evil, hard thing. You must train your thinking unless you want to be acting on emotion and instinct. I would take this as an argument in support of modesty.

I've always been interested in modesty as a concept, and one observation is that over the past century we have lowered the expectation that people will refrain from looking at things, or will help others not see them. The ancient equivalent would be hiding objects of value in common pots. We do the opposite today. Is there an English word for this?

Reading people's minds would be the worst violation of modesty possible.


On truthtelling, I recommend Jim Halperin's 1996 novel, The Truth Machine. He does a good job exploring the social effects of a cheap, universally available device that tells you with certainty whether someone is lying.

https://www.amazon.com/Truth-Machine-Novel-Things-Come-ebook/dp/B000FC1KR0
