🧨 Condoms in the Machine

How AI Is Making Us Ethically Impotent

Let’s begin with a familiar scenario, though one you’re unlikely to see dramatized.

A drone strike is authorized—not by a soldier wrestling with conscience but by a remote analyst operating a touchscreen interface. The target is rendered in pixels, and the collateral damage is projected as probabilities. A single tap completes the task. The consequences play out elsewhere, in a place too far to smell, bleed, or tremble.

There is no shouting, no shaking hands, no scent of gunpowder. Just a clean, data-driven execution: another quiet moment of death outsourced to the comfort of an ergonomic chair.

And so the question emerges: when the decision to kill no longer passes through a human heartbeat, does morality even enter the room?

📦 Vending Machines and Moral Detachment

To understand how technology alters our moral circuitry, consider a very different kind of machine.

In some schools, condom vending machines are installed in restrooms with the hope of encouraging safer sexual behavior among teenagers. The goal is pragmatic: reduce pregnancies, prevent disease, and offer protection without shame.

But there’s an unintended psychological effect. When something potentially life-altering becomes as easy to obtain as a soda, the gravity of the choice can begin to fade. Not because the act is any less serious, but because the ritual (the pause, the awkward conversation, the weighing of consequence) has been removed.

The machine doesn’t cause the sex. It simply erases the friction. And friction, it turns out, is where morality often lives.

The same phenomenon applies with devastating clarity to the rise of artificial intelligence. AI does not invent malevolence, but it does lubricate the machinery of action to such an extent that we no longer feel the strain of our choices.

🤖 Moral Distance in a World of Instant Execution

We are not necessarily becoming worse people. We are simply operating at a greater distance from the consequences of our behavior.

Whereas cruelty once required proximity—eye contact, physical force, a moment of reckoning—today, it can be authored remotely, scaled infinitely, and delivered with zero visceral feedback. Deepfakes impersonate voices to scam parents. Chatbots generate plausible misinformation. Facial recognition software identifies and tracks protestors in real time.

The ethical load, once borne by the body, is now distributed across clean interfaces and polite APIs.

This is not a Luddite lament. It’s a reckoning with what philosopher Hannah Arendt called the banality of evil—the way monstrous acts are often committed not by sociopaths but by ordinary people following procedure. In the algorithmic age, this banality has become sleek. It now arrives dressed in frictionless UX and morally mute machine learning.

What Arendt saw in Nazi bureaucrats, we now see in prompt engineers and product managers. Not because they wish harm, but because their tools no longer whisper, “Wait. Are you sure?”

🧠 The Ethical Erosion of Automation

Philosopher Daniel Dennett once asked whether a system that doesn’t understand consequences can be called a moral agent. But perhaps the more urgent question today is whether we remain moral agents when we no longer experience those consequences directly.

Our reliance on AI is not inherently unethical, but it accelerates our tendency to dissociate action from impact. The more decisions we outsource to machines, the more insulated we become from their emotional weight.

Consider this: when a doctor must deliver a terminal diagnosis in person, they carry the burden of empathy and presence, of sitting with the pain. Now imagine that same diagnosis delivered by an AI chatbot. The words may be accurate, even clinically appropriate, but something deeply human has been removed. And with that removal, an opportunity for moral growth, however painful, has vanished.

🩸 The Architecture of Conscience

Moral action, unlike technical execution, is rarely efficient. It is inconvenient, often slow, and occasionally paralyzing. This is because conscience is not an algorithm. It is a sensation: a lived discomfort, a resistance.

Peter Singer has long observed that our moral intuitions weaken with distance: we feel a visceral impulse to save a drowning child in front of us, yet scroll past statistics about famine victims far away. AI magnifies this phenomenon by abstracting action so thoroughly that ethical proximity disappears.

The more we rely on machines to mediate our choices, the less we are forced to feel the weight of those choices ourselves.

  • The condom doesn’t make sex immoral. It just makes the act easier to perform without reflection.

  • The drone doesn’t make war immoral. It just makes killing easier to carry out without emotional cost.

  • The chatbot doesn’t make lying inevitable. It just removes the shame that once came with it.

Technology doesn’t replace morality. But it does anesthetize the part of us that used to notice when it had gone missing.

🎮 The Banality of the Dashboard

Today’s dashboard is tomorrow’s battlefield. With the right plugins, you can manage ad campaigns, political propaganda, military operations, and financial fraud—all from the same swivel chair.

And while the interface may change, one thing remains constant: the easier it becomes to cause impact without witnessing the aftermath, the less we seem to feel responsible for what unfolds.

In Technopoly, Neil Postman warned that critique becomes obsolete when societies become tools of their tools. We are now tools of our dashboards, able to wield extraordinary power while shielded from the blood, complexity, and contradiction such power once required us to confront.

We are not, as some fear, building machines that will become conscious. We are building machines that allow us to behave as if we are not.

🧭 Reclaiming the Friction

So what remains for the human who refuses to go numb?

It begins with reclaiming friction: remembering that the pause before action (the breath, the hesitation, the sense of cost) is not a flaw. It is the last proof that you are alive.

To be human in the age of AI is to choose awareness over convenience. It is to refuse to let dashboards and prompts replace what used to be the terrain of moral struggle. It is to design tools that make you more ethically awake, not less.
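
If “designed friction” sounds abstract, here is a small, purely hypothetical sketch in Python: a gate that refuses to run a consequential command until the operator has paused and restated, in their own words, what they are about to do. The function name, the pause length, and the ritual itself are illustrative assumptions, not any real product’s API.

    import time

    def friction_gate(action_description, pause_seconds=5):
        """Force a deliberate pause and an explicit acknowledgment
        before a consequential action runs.

        Hypothetical sketch: the name, the delay, and the ritual are
        illustrative, not part of any real framework.
        """
        print(f"You are about to: {action_description}")
        print("Take a moment. Who is affected if this goes wrong?")
        time.sleep(pause_seconds)  # the pause is the point, not an inefficiency

        # Require the operator to restate the action in their own words,
        # rather than tapping a reflexive "OK".
        acknowledgment = input("Type the action you are authorizing: ")
        return acknowledgment.strip().lower() == action_description.strip().lower()

    # Usage: the gate wraps any irreversible operation.
    if friction_gate("delete all user records"):
        print("Proceeding...")  # the irreversible call would go here
    else:
        print("Acknowledgment did not match. Nothing was done.")

The delay and the typed acknowledgment are deliberate design choices, not bugs to be optimized away; they are the interface equivalent of a tool that still whispers, “Wait. Are you sure?”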

You cannot outsource the sacred. You cannot automate the spine. And you cannot pretend that because a machine now pulls the trigger, no one has blood on their hands.

👁️‍🗨️ Join the Cult of Clarity

We don’t worship the future. We interrogate it.

Subscribe for essays that slice through the fog of modern ethics, myth, and machine. Rituals, rants, and reminders that the friction you feel is not a problem—it’s your humanity, trying to survive.