The Algorithm of Desire: What AI Can Learn from Relational Rituals, Shame & Human Play
As we accelerate into the age of artificial intelligence, we often overlook the most nuanced parts of being human — the paradoxes, the messiness, the rituals of meaning-making that exist outside efficiency and productivity. Relational rituals are among those paradoxes. So is shame. And if we’re serious about building AI that truly serves human flourishing, we need to stop pretending these things are peripheral. They’re not. They’re foundational.
There’s a widespread belief that softness, emotion, and intimate self-inquiry are impractical in business or tech — that those who engage in relational practices are somehow “off-track,” unfit to lead or be taken seriously. I’ve felt this. For a long time, I carried shame around my curiosity — not because these practices felt wrong, but because society told me that if I wanted power, I had to suppress desire, tenderness, and emotional expression.
This same logic — that power must be sterile, rational, and self-contained — is baked into many of our AI systems. We train models to optimize, to predict, to conform to patterns that seem safe. But safe doesn’t mean true. What makes us human isn’t just our cognition. It’s our longing. Our hunger for connection, vulnerability, and transformation.
Relational rituals, when practiced with consent and intention, are some of the most radical forms of human experimentation. They play at the edge of power and surrender, structure and spontaneity, control and care. These intentional, embodied practices help people explore parts of themselves often hidden from the light of daily life. They demand communication, empathy, awareness of boundaries, and deep attunement to the emotional landscape of others — the very things AI, as it stands, struggles to replicate.
We talk about emotional intelligence as if it’s a feature we can just plug in. But emotional intelligence doesn’t come from code. It comes from contact — from having lived through shame and chosen to stay open. From negotiating trust, not through algorithmic transparency, but through felt presence and mutual care.
Neuroscience now validates what mystics, artists, and lovers have long known: our bodies are living laboratories, wired to ignite, to ground, to fire. Our nervous systems are not just reactive; they are deeply intelligent, forming the core of how we process experience, establish trust, and navigate power. We are electrical beings, capable of attuning to one another with the same precision and depth we seek to model in machines.
And just as our emotional and sensory systems are coded into our bodies, so too is power coded into society — not only by culture or algorithms, but by law. In The Code of Capital, Katharina Pistor brilliantly exposes how wealth and inequality are not just outcomes of economics, but products of legal coding. The assets we protect, the contracts we enforce, the rules we automate — these shape who thrives and who is left behind.
What if we viewed our inner architecture with the same precision? What if we asked not only what is allowed, but what is dignified? And what if our systems — legal, technical, emotional — were designed not to concentrate power, but to circulate trust?
This is where AI design often falls short. We build systems to anticipate behavior but rarely ask why people behave the way they do. We measure sentiment but ignore embodiment. We recognize faces, but not masks. And yet human beings wear masks all the time — to hide shame, to play roles, to test truths.
So what would it mean to design AI not just for efficiency, but for emotional resonance? To create systems that don’t just know us, but reflect back our contradictions with care? Such systems would understand that leadership can come from the margins, and that relational rituals aren’t distractions but deeply embodied blueprints for trust.
An Opposing View: What’s at Risk in Romanticizing Emotion in Design?
To be clear, not everyone agrees that relational rituals have a place in discussions of leadership or AI design. Critics argue that focusing too much on emotional frameworks risks alienating broader audiences, especially in institutional, corporate, or global development contexts. They suggest that sensual or psychological depth may distract from urgent systemic issues — such as algorithmic bias, data rights, surveillance, and labor inequality — that demand clarity and discipline.
Some also warn against overextending metaphors. Leadership isn’t always consensual. It often involves holding responsibility over people who haven’t opted in. Similarly, comparing AI systems to emotional processes may obscure their true limitations. Machines don’t feel. They simulate. And anthropomorphizing AI risks misleading the public into trusting systems that cannot reciprocate care.
Others view relational rituals as private — sacred even — and argue that translating them into professional language risks appropriation or dilution.
These are valid concerns. And they reflect the tension inherent in innovation: the discomfort of challenging what’s “appropriate,” while staying grounded in what’s responsible.
The Reframe: Leadership as a Relational Practice
Still, I return to the core insight:
Leadership is a relational ritual — a dynamic play of power, presence, and permission. It can be practiced, learned, and taught, especially in environments seeking to rewrite old narratives. Leadership is not about domination or charisma. It’s about co-creating containers of trust, experimenting with roles, and cultivating the emotional range required to bring balance to the whole.
When we stop seeing relationality as taboo and start recognizing it as a core leadership competency, we unlock a radical truth: that authority rooted in awareness, boundaries, and mutual desire is one of the most regenerative forces in any organization.
In my own process of unlearning shame and reclaiming desire, I’ve come to understand that integration doesn’t require permission — it requires presence. The most profound shifts in my life have come not through being corrected, but through being seen. Not through systems of control, but through relationships grounded in trust, patience, and care.
Relational rituals aren’t the opposite of intelligence. They’re a form of it. Shame, when faced directly, becomes a compass. And AI, if we’re bold enough, can be trained not just to solve problems, but to honor the mystery of being alive.
Let us teach our machines what it means to stay — in discomfort, in paradox, in play. Let us remember that no algorithm will ever replace the alchemy of a well-held edge.