Translating Distress: Can AI Read Emotional Signals?
- IO Kim

- Jul 6, 2025
Updated: Feb 23

Children often draw before they articulate.
In exploring psychological drawing interpretation, I became curious about how visual patterns might reflect emotional states. The House–Tree–Person test is one example of how clinicians attempt to interpret symbolic imagery. Yet interpretation is often subjective, shaped by experience and bias.
Could computational models assist in identifying patterns across thousands of drawings? Not to replace human judgment—but to examine consistency and structure.
Working with image datasets, I began exploring how machine learning could classify visual features at scale. What patterns repeat? What distortions correlate with certain annotations? What does algorithmic detection reveal—and what does it miss?
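A minimal sketch of what "classifying visual features at scale" could look like. The two features here (ink density and left-right asymmetry) and the synthetic drawings are purely illustrative choices of mine, not measures from any clinical protocol, and the two-group clustering is a toy stand-in for the kind of pattern-grouping a real pipeline might do:

```python
import numpy as np

def drawing_features(img):
    """Two simple visual features from a binary drawing.

    img: 2D array, 1 = drawn stroke, 0 = blank paper.
    Returns [ink_density, left-right asymmetry].
    """
    ink_density = img.mean()
    w = img.shape[1] // 2
    left, right = img[:, :w], img[:, -w:]
    # Mirror the right half and measure how much it differs from the left.
    asymmetry = np.abs(left - right[:, ::-1]).mean()
    return np.array([ink_density, asymmetry])

def two_group_cluster(X, iters=20):
    """Tiny 2-means: seed centers with the least- and most-inked drawings,
    then alternate assignment and center updates."""
    centers = X[[X[:, 0].argmin(), X[:, 0].argmax()]].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(2):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels

# Synthetic "drawings": five heavily inked, five sparse.
rng = np.random.default_rng(1)
dense = [(rng.random((32, 32)) < 0.5).astype(float) for _ in range(5)]
sparse = [(rng.random((32, 32)) < 0.05).astype(float) for _ in range(5)]
X = np.array([drawing_features(d) for d in dense + sparse])
labels = two_group_cluster(X)
```

Even this toy version shows the shape of the question: the algorithm can separate drawings by measurable properties, but nothing in it knows what a heavy or sparse drawing means for the child who made it.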
The more I explored, the more cautious I became. AI can identify patterns, but it cannot feel context. It can cluster shapes, but it cannot understand lived experience.
That tension interests me deeply.
Technology can scale pattern recognition. Empathy requires interpretation. The challenge is not to automate care, but to support it with structure.
I am interested in what computational tools can illuminate—and where human judgment must remain central.


