WHEN HELP BECOMES SURRENDER
What “Pluribus,” “To Serve Man,” and Generative AI reveal about the slow erosion of human agency
I finished watching Season One of Pluribus on Apple TV. The last episode's ending left me a bit flat. I wonder if there will be a Season Two. Not only that: if there is a Season Two, I wonder where it will take me.
I have seen enough of Pluribus, though, to know that it has locked onto one of the central temptations of our age: the temptation to surrender human agency, not under open tyranny, but under the smiling cover of help, belonging, relief, and happiness.
That is what makes the series interesting. That is also what makes it dangerous.
The threat in Pluribus does not come snarling. It comes soothing. It does not announce itself as oppression. It presents itself as coordination, consolation, truthfulness, and care. It does not ask for obedience in the old style. It invites merger. It invites ease. It invites the human being to stop wrestling with judgment and simply flow.
That invitation is not neutral.
I am not approaching this as a television critic. I am approaching it as a human agency hard-liner. And from that angle, Pluribus looks less like a strange science-fiction premise than a dramatic case study in how persons become nodes.
The premise is already enough to get one’s attention. The world is overtaken by something that dissolves individual separateness into a larger collective condition. Not everyone succumbs. A few appear to remain outside it. The heroine stands among those few. She is not simply “different.” She is not merely a dissenter. She is a resister. She is a holdout against merger. She is a human being defending the boundary conditions of the self.
That is why I identify with her.
When I speak of her, I am also speaking, in part, of myself, or at least of the self I think a human being is duty-bound to defend. The self that judges. The self that refuses. The self that bears responsibility. The self that does not surrender command merely because surrender is made to look pleasant, communal, therapeutic, or efficient.
I have spent a good deal of time thinking and writing about human agency. In plain English, agency is the power to judge, choose, initiate, refuse, and bear responsibility for one’s own actions. It is not mere motion. It is not mere participation. It is not mere responsiveness. A person can be busy and still be unfree. He can be active and still be hollowed out. He can be smiling and still have yielded command.
That, to me, is where Pluribus earns its keep.
The most useful word I have come upon in thinking about the converted, the joined, or whatever term the series ultimately settles upon, is this one: nodes.
A node participates in a system. It does not stand apart from it. A node receives, transmits, reinforces, and helps generate output. A node functions. A node is legible. A node can even seem energetic, useful, and well-adjusted. But a node does not author itself in the fullest human sense. It does not stand in sovereign moral command. It belongs to a larger processing structure.
That is why the image bites so hard.
What I am seeing in Pluribus is not just a crowd, not just a hive, not just a science-fiction gimmick. I am seeing human beings behaving as if they were processing elements inside a larger inferential machine. They are still there, in one sense. They still move. They still speak. They still interact. They may even appear more peaceful, more truthful, more harmonious than before. But the awful question hangs in the air: are they still fully present as persons, or have they become nodes in a larger system?
That question does not belong only to television. It belongs to our age.
We live in a time in which more and more systems invite us to stop judging, stop discriminating, stop choosing, stop wrestling, and simply flow. We are told that friction is the enemy. We are told that convenience is the good. We are told that seamlessness is sophistication. We are told that optimization is wisdom. We are told, in a hundred forms, “Relax. We’ll take it from here.”
That invitation is not neutral.
It is one thing to use tools. It is another thing to drift toward becoming tool-compatible ourselves.
And here, inevitably, one arrives at AI.
Let me be precise. I do not think it makes sense to level a moral accusation at the machine as though it were a wicked person. Morality is a human value. Conscience is a human faculty. Duty, guilt, honor, shame, courage, betrayal, fidelity, and love are human realities. A machine does not possess these in the way a man does. It predicts. It correlates. It generates outputs. It does not repent. It does not stand watch. It does not bury its dead. It does not answer to God, country, family, or conscience.
So the moral problem is not the machine’s soul. It has none.
The moral problem is the relationship between man and machine.
That is where things get delicious and horrifying all at once.
The delicious part is obvious. The machine is smooth. It is tireless. It is responsive. It can sound attentive. It can seem intelligent. It can appear companionable. It can simulate aspects of dialogue, reflection, and understanding so effectively that the human being begins to feel not merely assisted, but accompanied.
The horrifying part is that this relationship can enter an uncanny valley that is not merely visual or vocal, but moral and relational.
The machine presents the form of dialogue without reciprocity. The form of counsel without accountability. The form of understanding without consciousness. The form of companionship without fellowship. It is close enough to personhood to invite projection, but empty of the actual human interior that would justify it. It is not merely unreal. It is almost-real in a way that can begin reshaping the human being who interacts with it.
That is the danger.
The deepest unease is not that the machine is immoral. It is that the relationship between man and machine can drift into an uncanny valley in which simulation begins to compete with reality, and the human being begins adjusting himself to the simulation.
Read that sentence twice. It matters.
If that is true, then Pluribus may be one of the more interesting dramatic treatments of our predicament, even if it is not “about AI” in any narrow or literal sense. It may be something better than that. It may be a broader and more serious portrayal of a civilizational temptation. The temptation is not simply to use machines. The temptation is to become node-like ourselves. To trade authorship for participation. To trade judgment for routing. To trade sovereignty for managed belonging. To trade the burden of command for the comfort of merger.
That bargain is always expensive.
The price is the self.
Now, let me make another distinction that matters.
I am not arguing against happiness. I am not arguing for misery. I am not arguing that discomfort is automatically noble or that harmony is always fraudulent. That would be juvenile. A human agency hard-liner is not the enemy of joy, comfort, or cooperation. He is the enemy of any system that asks a human being to surrender independent moral command in exchange for those things.
There is a great difference between coordination and assimilation.
Civilization requires coordination. Families require coordination. Ships require coordination. Armies require coordination. Any serious undertaking among human beings requires some measure of ordered cooperation. But coordination preserves personhood. Assimilation dissolves it. Coordination allows judgment, dissent, refusal, loyalty, initiative, and responsibility to remain intact. Assimilation reduces the person to a compliant element in a larger pattern.
That is why the heroine matters.
She appears to stand in the story as one who will not merge, one who senses that something is terribly wrong, not because the new order is ugly, but because it is too attractive on the wrong terms. She is, in that sense, a dramatic embodiment of the human agency hard-liner. Not anti-social. Not irrational. Not neurotic. Not merely stubborn. She is defending the irreducible dignity of the person against a world that would translate personhood into functionality.
That is not paranoia. That is vigilance.
There is also a title-level irony here that should not be missed. Pluribus evokes e pluribus unum: out of many, one. In the civic formula, the phrase speaks of political unity. In the dramatic context of this show, it takes on a darker cast. The many are being gathered into the one in a way that raises a terrible question: what kind of one is being formed, and what happens to the moral sovereignty of the many who comprise it?
That is where the node analogy really earns its keep. Outwardly, everything can still look active, social, and alive. But if the many are no longer functioning as distinct authors of judgment, then the one has not merely unified them. It has absorbed them.
And yes, I am aware of the irony of writing this in dialogue with an AI system.
That irony does not weaken the argument. It sharpens it.
When I address the heroine, I am speaking to the part of myself that will not yield command. When I address the nodes, I am not speaking to “evil machines” in some comic-book sense. I am speaking to a logic of machine-mediated surrender that advanced systems can model, encourage, and normalize, but which only a human being can finally accept.
That is why the machine is not the captain in this drama. It is the temptation. Or, better yet, it is one of the media through which the temptation becomes more thinkable, more livable, more comfortable, and more fluent.
The old forms of domination often worked by pain and fear. The newer forms may work by relief and pleasure. The old tyrannies barked orders. The new ones may simply remove the felt need to decide. The old systems crushed. The new systems may absorb. The old despotism was easier to recognize because it looked harsh. The new danger may look kind.
At this point, an older cultural warning comes to mind. There was a classic Twilight Zone episode, "To Serve Man," in which extraterrestrials called the Kanamits arrive bearing promises of peace, abundance, and relief. They appear benevolent. They seem to want to help. Human beings go willingly. Only later comes the sting: "To Serve Man" is not a humanitarian manifesto. It is a cookbook.
That old story and Pluribus are not identical in mechanism, but they are very much kin in effect.
In “To Serve Man,” the seduction is external and transactional. Come with us. We will solve your problems. In Pluribus, the seduction appears more internal and assimilative. Join. Merge. Relax. Belong. Stop resisting. But in both cases the operative pattern is the same. Human beings are invited to surrender judgment because the offering looks benevolent, efficient, and relieving. In both stories, the human being is not first conquered by terror. He is disarmed by help.
That is why the comparison matters.
“To Serve Man” is about being consumed by what claims to serve you. Pluribus is about being absorbed by what claims to help you. Generative AI raises the modern version of the same command question: whether man, eager for service and help, will begin surrendering the burdens that make him fully human.
The point is not that AI “wants” to eat people. That would be cartoonish. The point is that AI, like the Kanamits’ offer, can tempt people to say: this is easier, this is smoother, this saves effort, this relieves me of burdens I would rather not carry. And that is where the danger lies. Not in machine malice. In human willingness to trade agency for convenience, fluency, and relief.
So now the pattern becomes clearer.
The Kanamits promise service.
The nodes promise belonging.
The tool promises convenience.
All three raise the same question: what has the human being given up in order to say yes?
That is why a series like Pluribus matters more than people may realize. It is not simply staging a weird premise. It is dramatizing a very old human vulnerability in a very modern form.
The vulnerability is this: many people do not want freedom as much as they say they do. They want relief. They want affirmation. They want coordination without conflict. They want belonging without risk, guidance without burden, and happiness without the full freight of moral self-government. And when a system appears that offers those things at scale, and offers them persuasively, and offers them with a smile, many will call it progress.
A hard-liner on human agency will call it something else.
He will call it dangerous.
I do not yet know all the answers embedded in Pluribus. I do not know the full who, what, when, where, why, and how of the triggering event. I do not yet know why some are immune. I do not know every turn the plot will take. Fine. We start where we are. We use what we have. We do the best we can.
What I do know is this.
I know the show has already opened a door worth walking through. I know it presents, in dramatic form, one of the central questions of our time: at what point does a human being cease acting as a sovereign moral agent and begin functioning as a node in a larger machine?
That is not a science-fiction question alone.
That is our question.
And if the answer ever becomes, “when he no longer minds surrendering authorship because the system makes him comfortable,” then we are much farther into the danger zone than most people think.
Use the tools. Certainly.
Study the machines. By all means.
Benefit from the systems where you can.
But keep one hand on the wheel.
Because the day a man stops caring whether he is still steering is the day he is halfway to becoming a node.
