On Guilt and Regret
Of all the emotions that shape human behavior, guilt and regret may be the most invisible. Not because they’re rare—they’re nearly universal—but because they live beneath the surface, often undetectable even to those who know us best.
This poses a fundamental problem for any system that claims to understand human emotion through observation.
The invisibility of internal states
A camera can capture a face. An algorithm can measure the distance between eyebrows, the curve of lips, the direction of gaze. From these geometric facts, models attempt to infer emotional states: happy, sad, angry, surprised.
But guilt? Regret? These don’t announce themselves on the face.
Consider: a person sits alone, expression neutral, perhaps slightly downcast. They might be tired. They might be bored. They might be replaying a conversation from twenty years ago where they said something cruel to someone who loved them—someone now gone, the apology forever undelivered.
The face looks the same in all three cases. The machine sees pixels. The weight of unspoken remorse is invisible.
Why expression fails
Guilt and regret are retrospective emotions. They require:
- Memory of a past action or inaction
- Moral judgment that the action was wrong
- Counterfactual thinking—imagining how things could have been different
- Persistence—the feeling doesn’t resolve when the moment passes
None of these leave reliable physical traces. Unlike fear (elevated heart rate, widened eyes) or joy (genuine smiles engage the orbicularis oculi), guilt has no signature expression. Paul Ekman, who spent decades cataloging universal facial expressions, found no distinct “guilt face.” The closest markers—gaze aversion, slumped posture—are shared with shame, sadness, fatigue, and simple introversion.
Darwin himself noted in The Expression of the Emotions in Man and Animals (1872) that complex social emotions like guilt show enormous cultural and individual variation. What looks like guilt in one person might be invisible in another.
The confession requirement
Here’s the uncomfortable truth: we typically only know someone feels guilt when they tell us.
This is why religious traditions developed confession. Why therapy works through dialogue. Why truth and reconciliation processes rely on testimony. The internal experience of guilt cannot be observed—it must be disclosed.
A machine learning model trained on millions of faces will never learn to detect guilt reliably, because guilt doesn’t present reliably. The training data itself is sparse: we rarely have ground truth labels for “this person is experiencing guilt right now.” We only have moments where someone admitted to guilt, and even then, we can’t know if their face reflected it.
Stories of hidden weight
The parent who didn’t say goodbye. A father, rushing to work, brushes past his daughter’s request to play. “Later,” he says. There is no later—an accident, a loss, a lifetime of replaying that moment. His face in photographs afterward shows nothing unusual. The guilt is entirely internal, carried silently for decades.
The survivor’s burden. In Man’s Search for Meaning, Viktor Frankl describes Holocaust survivors who felt guilt for living when others died. This “survivor’s guilt” manifests not as a detectable expression but as a persistent, quiet torment. Many survivors appeared functional, even successful. The weight was invisible.
The undone kindness. A woman passes a homeless person every day for years. She never stops. One winter, the person is gone—frozen, she later learns. Her face shows nothing. But she changes her route, unable to pass that spot. The guilt reshapes her behavior, not her expression.
What machines actually see
When AI systems claim to detect “guilt” or “remorse,” they’re typically detecting:
- Downcast gaze (also indicates sadness, shame, submission, or thought)
- Reduced facial animation (also indicates depression, fatigue, or concentration)
- Self-touching gestures (also indicate anxiety, discomfort, or habit)
- Speech patterns like hedging or qualification (also indicate uncertainty or politeness)
These are correlates at best, not signatures. A skilled deceiver can fake them. A stoic person can feel crushing guilt while showing none of them.
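The many-to-one structure of that list can be made concrete: every observable cue maps to several plausible internal states, so inverting the mapping is underdetermined. A minimal sketch, with illustrative cue and state names drawn from the list above (no real detection system is implied):

```python
# Each observable cue maps to MANY plausible internal states.
# Guilt appears in every set, but never alone — so no cue,
# taken by itself, can single it out.
CUE_TO_STATES = {
    "downcast_gaze": {"sadness", "shame", "submission", "thought", "guilt"},
    "reduced_animation": {"depression", "fatigue", "concentration", "guilt"},
    "self_touching": {"anxiety", "discomfort", "habit", "guilt"},
    "hedging_speech": {"uncertainty", "politeness", "guilt"},
}

def candidate_states(cue: str) -> set[str]:
    """Return every internal state consistent with a single observed cue.

    This is the most an observer can honestly report: a set of
    candidates, never a single diagnosis.
    """
    return CUE_TO_STATES[cue]

# Every cue is ambiguous: guilt is always a candidate, never the answer.
for cue in CUE_TO_STATES:
    states = candidate_states(cue)
    assert "guilt" in states and len(states) > 1
```

The design point is that ambiguity here is not noise to be averaged away with more data; it is built into the mapping itself.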
The famous “guilty look” that dog owners swear they see? Research by Alexandra Horowitz at Barnard College showed dogs display “guilty” body language based on owner behavior, not their own actions. The dog who didn’t eat the treat shows “guilt” if the owner acts accusatory. The dog who did eat it shows nothing if the owner acts normal.
If we can’t even reliably detect guilt in dogs—creatures we’ve lived alongside for at least 15,000 years—what hope do we have with algorithms?
The ethical boundary
This limitation isn’t a bug to be fixed. It’s a feature to be respected.
If machines could detect guilt, the implications would be dystopian. Imagine:
- Job interviews where an algorithm scans for “hidden guilt”
- Border crossings where your face is analyzed for “signs of wrongdoing”
- Insurance claims denied because you “looked guilty”
- Criminal justice systems that claim to detect remorse
The unreliability of guilt detection is a protection. It preserves the privacy of our inner moral lives. It keeps the space between action and conscience sacred.
What remains
So what can observation-based systems honestly do?
They can notice patterns without interpreting cause:
- This person’s demeanor changed after this date
- This expression appears in contexts involving this topic
- These behavioral markers cluster together
They can surface data for human interpretation:
- “Your facial expressions in photos from 2019 differ from those in 2020”
- “You appear more animated when discussing X than Y”
But they cannot—and should not—claim to know why.
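One way to honor that boundary in code is to make the “no interpretation” rule structural: give the report type no field where a cause or emotion could even be stored. A minimal sketch, with invented field and function names (not the actual implementation of any tool mentioned here):

```python
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class Observation:
    """A descriptive finding. Deliberately has no 'emotion' or 'cause' field."""
    metric: str
    period_a: str
    period_b: str
    mean_a: float
    mean_b: float

    def summary(self) -> str:
        # States WHAT changed, never WHY it changed.
        return (f"Mean {self.metric} was {self.mean_a:.2f} in {self.period_a} "
                f"and {self.mean_b:.2f} in {self.period_b}.")

def compare_periods(metric, a_label, a_values, b_label, b_values) -> Observation:
    """Surface a difference for human interpretation, without labeling its cause."""
    return Observation(metric, a_label, b_label, mean(a_values), mean(b_values))

obs = compare_periods("smile intensity", "2019", [0.62, 0.58, 0.71],
                      "2020", [0.41, 0.38, 0.45])
print(obs.summary())
```

Because `Observation` is frozen and has no slot for an interpretation, a caller cannot retrofit a “detected guilt” label without changing the type itself — the refusal is enforced by the data structure, not by policy.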
The gap between observable signal and internal experience is not a limitation to overcome. It’s a boundary to honor. Guilt and regret belong to the person who feels them. They are disclosed, not detected.
Some things are meant to be told, not seen.
This reflection shapes how I think about the image analysis tools on this site. They report geometry—face positions, expression intensities, pose configurations. They deliberately refuse to interpret emotions like guilt, regret, or remorse. That’s not a limitation of the technology. It’s a design choice about what machines should claim to know.
References
- Darwin, C. (1872). The Expression of the Emotions in Man and Animals.
- Ekman, P. (2003). Emotions Revealed: Recognizing Faces and Feelings.
- Frankl, V. (1946). Man’s Search for Meaning.
- Horowitz, A. (2009). “Disambiguating the ‘guilty look’: Salient prompts to a familiar dog behaviour.” Behavioural Processes, 81(3), 447–452.
- Barrett, L. F. (2017). How Emotions Are Made: The Secret Life of the Brain.