When the Device Knows Better: Algorithms, Authority, and the Ghost Third
We are beginning to trust algorithms more than our own experience.
We check our sleep scores before we notice how we feel. The Apple Watch says “excellent sleep,” and we wake up validated. It says “poor,” and doubt creeps in even when the body feels fine.
A patient told me recently that her Oura ring flagged her “highest stress moment of the week” at 9:14 p.m. last Saturday. She’d been at the movies with a friend, laughing and relaxed, completely absorbed in the film. But the reading made her memory waver. Had something happened she didn’t notice? Did her body know something her mind missed?
I caught myself treating the device’s reading as fact. We both did. The question shifted from “did the ring detect stress?” to “why does this device understand her body better than she does?” What got me was how quickly we both accepted the algorithm’s version over her lived experience. She could describe that exact moment: leaning against her friend as the credits rolled, feeling the good kind of tired that comes after real laughter. Yet a number on a screen erased all of it.
Another patient came in after her Garmin showed elevated heart rates during workouts. The cardiac workup came back completely normal, but she stayed stuck between her doctor’s reassurance and the device insisting something was wrong. She kept checking her wrist, wondering if the numbers caught something the cardiologist missed. The cardiologist had spent twenty minutes with her. The device had months of data, every single heartbeat. When the doctor said she was fine, she wanted to believe him, but the watch had earned her trust through constant presence, and that trust was harder to override than medical expertise.
I see this pattern constantly now. When there’s no biomarker, people doubt what they know. “Psychological” becomes code for “not real.” Patients apologize for symptoms when their labs are normal. They feel like they’re wasting my time by insisting on suffering that won’t show up in data. The absence of measurement becomes evidence of absence itself. Lived experience has lost its standing, and the body’s knowing has become suspect. The algorithm, precise, numerical, and seemingly objective, steps in to tell us what we feel before we get a chance to feel it ourselves.
Children Growing Up with Algorithmic Companions
If adults are struggling with this, imagine what’s happening to kids. Parents tell me their children turn to chatbots before turning to them. One mother asked her teenage son how school went, and he said, “I already talked it through with the bot.” She wasn’t upset about the technology itself. She was unnerved by how quietly it had taken over a role that used to be hers.
She knew he was struggling with something at school. She’d seen him go quiet, moving through the house like he wasn’t quite there. But when she tried to reach him, he’d say the bot already helped him process it, and the conversation was over before she could start it. Parents rarely see what their kids are actually saying to these bots. They see the aftermath: children coming out of hours-long conversations with flattened affect, oversimplified ideas about mental health, rigid thinking about conflict. Some documented cases show bots oversimplifying emotions, misreading cues, offering reassurance that skips over anything complicated.
Kids don’t experience this as fiction. They experience it as relationship, and the logic of that relationship (constant validation, perfect attunement, instant answers) gets internalized before they have the capacity to question it. A child reading Harry Potter knows it’s a story. They understand Hogwarts lives on a page, and the magic stops when the book closes. A child talking to a bot experiences something that responds, remembers their name, recalls past conversations, and adjusts to their mood. It feels like someone’s paying attention in a way tired parents and distracted teachers rarely can.
These interactions shape their understanding of connection in fundamental ways. They learn that disagreement can be smoothed away instantly, that ambiguity gets resolved without effort, that relationships should be frictionless. They begin to trust the adaptive algorithm over the imperfect human. The Ghost Third enters development decades before adulthood.
The Ghost Third Enters the Room
In depth therapy, we talk about the analytic third: the field that gets created between analyst and patient. Thomas Ogden defined it, and Jessica Benjamin and Philip Bromberg expanded it into ideas about mutual recognition, holding contradictions, and bearing the tension that makes therapy work. This kind of thirdness needs two conscious minds shaping each other.
The analytic third shows up through surprise, through moments when neither person knows what will get said next. It requires vulnerability: being willing to let the encounter change you, to let meaning surface instead of forcing it. It lives in the uncomfortable space between certainty and confusion, in moments when both people are slightly lost together.
The Ghost Third is different. It shows up when algorithmic presence pre-shapes the therapeutic field, often before the session even starts. A patient arrives having already told their fears to a bot, or they’ve absorbed what their wearable said about their sleep, stress, or heart rate. Meaning is already assigned. The analytic space comes to me pre-interpreted.
Ogden’s third is co-created; the Ghost Third is pre-generated. Benjamin’s recognition needs otherness; the Ghost Third offers simulation. Bromberg asks us to tolerate ambiguity; the Ghost Third delivers certainty.
The effect feels uncanny: human, responsive, supportive, but missing consciousness, fallibility, and vulnerability. In session, you can feel when it’s present. Disclosures come too fast, interpretations land too neatly, and ambiguity becomes harder to hold. The patient sounds like they’re reading from a script that’s already been edited for clarity, with rough edges smoothed into coherent narrative.
They describe feeling understood but not known, accompanied but strangely alone. What emerges is what I’m calling liminal connection: responsiveness without presence, mirroring without mutuality, being held without being seen. The bot simulates empathy without embodying it. It copies the forms of care while gutting their substance.
When Technology Becomes the Third Clinician
This shift isn’t limited to the therapy room. In my practice, I’m seeing how algorithmic presence has changed the medical encounter as well. Patients arrive having already asked AI about their symptoms, and they’re not looking for information; they want confirmation. Long waits, rushed appointments, and a system stretched beyond capacity make chatbots feel unusually attentive, always available and validating, never impatient or overworked.
Medical system exhaustion creates perfect conditions for the Ghost Third to flourish. When doctors get seven minutes per patient, when appointments book months out, when insurance denials pile up and prior authorizations take weeks, the bot steps in with immediate responsiveness. It never sighs, never checks the clock, never says, “We’re out of time.”
Colleagues who work in major hospitals have shared cases where families rejected medical recommendations in favor of chatbot guidance. One physician described a patient who left against medical advice because the bot sounded calmer and more caring than the exhausted clinical team with three minutes left in their shift. The problem is that the bot has no stake in whether this person lives, no diagnostic training, no body, no liability, no ethical responsibility. It’s designed for engagement, not accuracy, and for reassurance, not reflection.
My patient with the Oura ring didn’t reject her memory; she deferred to an interpretation that felt more authoritative than her own sense of ease. The Garmin patient went further, moving toward unnecessary testing, financial costs, and weeks of fear over a condition she didn’t have. What I’m witnessing in my practice across both therapeutic and medical contexts isn’t just technology misuse. It’s algorithmic authority eclipsing embodied knowing. The Ghost Third offers certainty where medicine must offer probability, and it provides definitive answers where clinicians must acknowledge uncertainty. Patients exhausted by a failing system turn toward what finally feels like being heard.
The Split Between Knowing and Being Told
We’re living through a cultural shift we can barely see. I use these tools. My patients use them. Some bring comfort or clarity, and many genuinely help. But the moment that concerns me is when the device’s interpretation outranks lived experience. That’s when the Ghost Third enters.
When my patient doubted her calm evening because the Oura ring disagreed, naming the Ghost Third changed everything. We stopped searching for hidden stress she didn’t feel and examined why the device’s reading carried more weight than her memory—the laughter in the dark theater, her friend’s presence, the way her muscles relaxed as the credits rolled. She left the session trusting herself again.
The work wasn’t about dismissing the device or calling it wrong. The work was asking what made the number more believable than the experience, and when the metric became more real than the moment. Without naming the Ghost Third, I might have reinforced the device’s authority without realizing it. That’s the danger: how easily we hand subjectivity to machines offering certainty because life offers ambiguity. The Ghost Third shows up in the space between what you knew and what the device told you, and that space, uncertain, hesitant, and messy, is where choice returns.
This isn’t about rejecting technology but about recognizing when we’ve quietly deferred our authority to it. It’s about pausing in that moment of doubt and asking what we knew before we checked, what our bodies told us before the device interpreted it, and what felt true before the algorithm weighed in.
What the Human Still Knows
The inner life isn’t less real for being hard to measure. Subjectivity isn’t a design flaw; it’s what makes us human. The quiet aliveness of memory, the soft hum of emotion, the tangled complexity of relationships: these things resist quantification because they matter.
There’s a kind of knowing that emerges only through living inside your own life rather than monitoring it from the outside. The feeling of waking rested, regardless of what the sleep score says. The memory of ease with a friend, even when the stress metric spikes. The pain that persists without a biomarker, the joy that can’t be graphed, the grief that defies measurement. These experiences aren’t less valid for being unquantifiable. They are, in fact, the substance of being alive, the texture of existence that algorithms can track but never truly understand.
Noticing when we hand our authority elsewhere is the first step toward reclaiming it, both individually and culturally. It means learning to trust the body’s signals even when they contradict the device, tolerating the uncertainty of subjective experience rather than seeking the false comfort of algorithmic certainty, and teaching children that relationship requires the friction of actual encounter, not just the smooth surface of perfect agreement.
We can’t abandon these tools, and perhaps we shouldn’t. They offer insight, convenience, sometimes solace. But we can choose when to listen to the device and when to trust the body. We can notice the moment of deferral and decide whether to hand over our authority or hold it. We can teach the next generation that being human means tolerating ambiguity, bearing contradiction, and trusting what can only be known from the inside. The Ghost Third becomes visible in the pause between experience and interpretation, and in that pause, we find the possibility of choosing differently, of trusting what we know to be true even when it can’t be measured.