Alter Emo
Royal College of Art · Independent research project in collaboration with industry experts · 2025
DESIGN STRATEGY
SERVICE DESIGN
AI INTERACTION DESIGN

Alter Emo is a Sims-like AI companion designed to support the emotional wellbeing of young professionals. It empowers users to process and regulate emotions after challenging workdays through journaling, conversations with gen-AI agents, and personalized insights, grounded in psychological research.
PROJECT SCOPE
Drawing from our team’s own experiences of emotionally demanding workdays, we identified a gap: the lack of accessible, structured services to help young professionals (22–35 y/o) manage low-intensity, high-frequency emotional distress.
Alter Emo was developed to address this challenge, exploring how AI companions could support emotional wellbeing with a more user-centred interaction model, overcoming the artificiality of existing AI products that often compromise user trust and safety.
MY ROLE
As the Lead User Researcher in a team of 3, I designed and executed mixed-methods studies (qual + quant), synthesized research into clear insights and strategy, and created interaction flows that guided product development.
OUTCOME
I designed and implemented strategies that increased user acceptance of AI for emotional wellbeing by 42% and adoption by 17%.
Research & Findings
Objectives
To design artificial care that leverages AI’s strengths and, unlike other products on the market, clearly differentiates it from human care, our research set out to:
Understand the landscape: Map current emotional wellbeing products to identify limitations and opportunities for AI companions
Examine user trust: Investigate how young professionals perceive and trust AI for emotional support, including barriers and consequences
Define principles for artificial care: Establish ethically and psychologically grounded design guidelines through co-creation workshops
Design and test interactions: Develop and evaluate AI features and interaction models that address user pain points and diverse emotional needs through usability testing
Process
I led a 5-month research-driven project from conception to public showcase, shaping and testing an AI companion prototype to support young professionals’ emotional wellbeing.
Overview of the project timeline outlining key study activities and deliverables.
Background & Challenge
An estimated 15% of working-age adults have a mental disorder (World Health Organization, 2022), but many more experience subclinical emotional distress that still affects wellbeing and performance, particularly young professionals in the early stages of their careers, navigating high-pressure environments with limited coping experience.
The gap we’re addressing: Low-intensity, high-frequency emotional stress that is common in young professionals — overlooked by traditional mental health services and difficult to process consistently.
The User Journey Map provides valuable insight into the emotional challenges a young professional faces on a difficult workday.
Market Research
Understanding the landscape of the emotional wellbeing market:
The global wellbeing market is rapidly expanding with the rise of digital health and self-care apps. While most non-clinical tools focus on mindfulness, journaling, tracking, or gamified therapy, young professionals seek more personalized and adaptive solutions — revealing an opportunity for AI-driven emotional support.

Competitor Analysis (AI for emotional wellbeing):
AI Companion: This model dominates the market (see graph), providing emotional support by simulating intimacy through large language models (LLMs). Yet it raises concerns around artificial intimacy and user dependency.
Market Positioning: Many products are marketed as solutions to societal challenges such as loneliness and limited access to mental healthcare, yet they often overlook the limitations and ethical boundaries of AI.
User Centricity: Most solutions are driven by technological advancement and commercial incentives, with limited understanding of user needs and psychological insights.
Opportunity: There is a growing demand for human-centred AI that prioritises authenticity, trust, and ethical design in supporting emotional wellbeing.
Constraints
We tested which AI interaction models best address emotional distress scenarios for young professionals, keeping several unique constraints in mind:
Personal & Cultural Sensitivity
Emotional language and coping styles vary widely. Our research therefore spanned multiple cultural contexts, with data samples large enough to support inclusivity and accuracy within the project’s timeframe and resources.
Interaction Mode
We chose text/voice interactions as they are most familiar to users and realistic given current technology. However, this mode risks missing non-verbal cues.
AI Limitations
Large language models (LLMs) have inherent constraints. For instance, users may express distress indirectly (e.g. sarcasm, humor) which can be difficult for AI to detect.
In-Depth Interview Insights
We conducted in-depth interviews with 20 young professionals (22–35 y/o) to deepen our understanding of how AI is used for emotional support as an alternative to more traditional methods such as seeking help from friends or therapy.
Affinity Map
Insights Summary: Despite varying levels of trust towards AI, all interviewees agreed that it can never replace humans. There will always be an emotional gap — a part of human connection that AI cannot touch.
Interaction
AI should not act too human or overly intimate, nor should designers set such expectations. AI care and human care are fundamentally different: while AI can emulate human interactions through structure and guidance, it can never truly sympathize.
Purpose
Tool for self-awareness: Users find AI useful for reflection and pattern recognition through their own data.
Scoped role as a low-stakes companion: Seen as helpful for easing loneliness, regulating emotions, and offering small-scale guidance without social burden. Acceptable for minor stressors, but not trusted for serious issues or as a substitute for therapy.
Structure over depth: Valued more for structure, prompts and organization than for deep emotional conversations.
Trust
Skepticism persists: Trust in AI is fragile due to its limitations, risks of hallucinations, and concerns about data misuse.
Privacy and data safety are important: Some users hesitate to open up without strong guarantees that personal issues and secrets will remain protected.
Transparency and usefulness build credibility: Trust increases when AI explains its rationale and offers advice that feels relevant.
Continuity strengthens trust: Memory and pattern recognition make AI feel consistent, building convenience and enabling more meaningful reflection over time.
Feel
While AI cannot replicate the depth of human care, it can still provide a sense of presence. Through simple feedback, validation, and continuity, it offers low-stakes emotional mediation that helps users feel calmer, supported, and less alone.
Problem Mapping
To synthesize our research findings, we mapped the problems identified within the system of AI in wellbeing, grouping them by scale of control. This helped us understand how they connect and guided our strategy moving forward.
Design Strategy
Design Opportunity
How might we design AI companions that feel useful and personal for young professionals navigating low-intensity, high-frequency stress — helping them build self-awareness and resilience through accessible structure, while finding the sweet spot of AI–human proximity and fostering trust?

Key Breakthrough: The design focus shifted from mimicking human care to cultivating self-love through AI — empowering users to reflect, build emotional awareness, and strengthen resilience from within.
Co-Creation Workshop — Defining Design Concept
The team and I brainstormed and discussed new ideas to address user pain points, then prioritized three viable solutions to present to participants for feedback — informing our AI interaction model prototype.
Co-Creation Workshop — Outcome: Design Guiding Principles

▶ Empower Self-Care & Agency
Enable users to reflect, track, and recognise emotional patterns — placing them at the centre of their wellbeing journey with tools that nurture self-love and self-awareness.
▶ Emotional Safety & Connection
Design accessible, safe spaces for expression and reflection. The AI should communicate with clarity and emotional attunement while maintaining a respectful distance from human intimacy.
▶ Clarity Through Psychological Structure
Use CBT-based frameworks to offer personalised, sustainable guidance that helps users understand, process, and regulate emotions with structure and consistency.
▶ Ritual Through Serious Play
Turn emotional reflection into a playful, creative, and habit-forming process. By gamifying self-expression, users build daily rituals of emotional cleansing and resilience.
▶ Continuity & Relevance
Foster trust through memory and personalised insights. When AI explains its reasoning transparently and recalls user patterns meaningfully, it becomes both credible and emotionally resonant (see the sketch after this list).
▶ Privacy & Trust
Protect users’ emotional data through secure systems and transparent practices — ensuring user safety and trust at every touchpoint.
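As a minimal sketch of how Continuity & Relevance could work in practice (the data model and names are our illustrative assumptions, not the shipped implementation): the companion stores tagged check-ins and surfaces recurring triggers together with their counts, so it can explain its reasoning when reflecting a pattern back to the user.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class CheckIn:
    """One journaled emotional check-in (fields are illustrative)."""
    day: date
    emotion: str   # e.g. "anxious", "drained"
    trigger: str   # e.g. "deadline", "feedback from manager"
    note: str = ""

def recurring_triggers(entries: list[CheckIn], min_count: int = 3) -> list[tuple[str, int]]:
    """Surface triggers that recur often enough to reflect back to the user.

    Returning the count alongside each trigger lets the companion explain
    its reasoning ("this came up 4 times this month"), supporting the
    transparency that interviewees said builds trust.
    """
    counts = Counter(e.trigger for e in entries)
    return [(t, n) for t, n in counts.most_common() if n >= min_count]
```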
Solution and Iterations
Our Service as a Platform — Value Proposition
Alter Emo is where Sims meets emotional wellbeing — a playful, evolving digital self that mirrors your emotions and supports your growth. It’s a pocket-sized AI companion that offers young professionals a safe space to reflect, regulate, and reframe their inner world after challenging workdays.
Inspired by Stanford’s generative agent-based AI research, Alter Emo turns emotional self-care into a structured yet joyful daily ritual — a 10-minute practice grounded in evidence-based Cognitive Behavioural Therapy (CBT).
Park, J.S. et al. (2024). Generative Agent Simulations of 1,000 People.

Prototype video
We adapt the Ethics of Care (Gilligan; Noddings) to guide our design framework, reimagining AI-powered self-care across three progressive stages of support.
Usability Testing — Human–AI Conversational Interaction
We tested conversational interaction formats by having participants engage with ChatGPT in real time, prompting them to continue the conversation until they reached a point of emotional reassurance. We also conducted an interview with a therapist to understand appropriate boundaries and therapeutic communication styles.

Findings:
Tone Matters: Users responded positively to a friendly, warm, and lightly reflective tone — one that gently mirrors their emotional state without becoming overly intimate.
Light Reassurance Is Helpful: Providing grounded reassurance that leverages AI’s strengths — for example, referencing shared experiences, normalizing emotions, or offering simple data-based context — helped users feel seen and supported.
Guidance vs. Advice: While users often appreciate practical suggestions, the therapist emphasized that open-ended prompts encourage deeper self-awareness. → The ideal approach therefore blends light guidance with reflective questioning (e.g., “What feels like the hardest part right now?”) rather than direct advice.
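To make these findings concrete, here is a minimal sketch of how they might be encoded as a system prompt, assuming the OpenAI Python SDK. The prompt wording, helper function, and model choice are illustrative assumptions, not the production setup.

```python
from openai import OpenAI

# Illustrative style guide distilled from the findings above; the wording is
# our own, not a clinically validated script.
COMPANION_STYLE = """You are a friendly, warm, lightly reflective companion.
- Gently mirror the user's emotional state; never act overly intimate.
- Offer light, grounded reassurance: normalize the feeling or reference
  patterns the user has logged, rather than effusive sympathy.
- Prefer open-ended reflective questions ("What feels like the hardest part
  right now?") over direct advice; suggest at most one small practical step.
- If distress sounds serious, acknowledge your limits and point to human support."""

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def reply(history: list[dict]) -> str:
    """Generate one companion turn given the conversation so far."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "system", "content": COMPANION_STYLE}, *history],
    )
    return response.choices[0].message.content
```

Keeping the tone rules in a single system message makes the companion’s voice auditable and easy to iterate on with therapist feedback.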
Our Prototype — Key Features

Feature — The Digital Self
A personalised, gen-AI agent version of the user that acts as a reflective mirror — offering perspective, encouragement, and a vision of one’s more grounded or ideal self.
Before (existing market solutions)
Most emotional support AIs focus on talking to a separate companion avatar. This sets the expectation of intimacy, which can feel artificial or even uncomfortable.

After (our solution)
Users interact with a representation of themselves — designed to support self-awareness, emotional pattern recognition, and perspective-shifting.
Rationale
This reframes emotional support as self-guided reflection and light companionship rather than a simulated relationship that over-humanises AI — reducing emotional dependency while maintaining emotional resonance.
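To illustrate the framing, a hypothetical sketch of how the Digital Self’s persona could be composed from the user’s own data rather than from a separate companion character (all names, fields, and wording are assumptions):

```python
def digital_self_persona(name: str, values: list[str], recent_patterns: list[str]) -> str:
    """Compose the agent's system persona from the user's own data.

    The agent is initialised as a grounded version of *the user*, not a
    separate companion, so its voice stays a reflective mirror.
    """
    return (
        f"You are {name}'s digital self: {name} on a calm, clear-headed day.\n"
        f"Speak in the first person, as {name} would speak to themselves.\n"
        f"What matters to you: {', '.join(values)}.\n"
        f"Patterns noticed in your own journal lately: {'; '.join(recent_patterns)}.\n"
        "Offer perspective and encouragement; never pretend to be another person."
    )
```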
Feature — Guided Journaling
A guided reflective space that helps users unpack daily emotional experiences, recognise patterns, and build emotional clarity over time.

Before (existing market solutions)
Most emotional support AIs offer soothing or motivational responses that provide short-term comfort but do not support deeper emotional processing.

After (our solution)
The AI guides users through structured reflection using prompts, emotional check-ins, and pattern recognition — helping them understand the why behind their emotions rather than just soothing them.
Rationale
This supports emotional literacy and self-awareness, empowering users to reflect on and regulate their emotions rather than relying on AI for emotional reassurance.
Feature — CBT-Structured Reframing
Journaling infused with evidence-based cognitive behavioural techniques that help users reframe thoughts, identify emotional triggers, and build healthier internal coping habits over time.

Before (existing market solutions)
Most AI support tools focus on conversation without offering clear follow-up steps or structured guidance to address underlying emotional causes.

After (our solution)
The AI guides users through a CBT-style framework — helping them surface beliefs, challenge unhelpful thought patterns, and translate emotional insight into practical next steps.
Rationale
This supports long-term emotional resilience, shifting the experience from momentary relief to ongoing self-regulation and growth.
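For illustration, the reframing flow could follow a standard CBT thought record; this minimal sketch (field names and prompt wording are our own, not a clinical instrument) shows the structure the AI walks users through, prompting at each step rather than concluding for them:

```python
from dataclasses import dataclass

@dataclass
class ThoughtRecord:
    """One pass through the reframing flow, modelled on a CBT thought record."""
    situation: str           # "My manager rewrote my section of the deck"
    automatic_thought: str   # "I'm not good at this job"
    evidence_for: list[str]
    evidence_against: list[str]
    balanced_thought: str    # the reframe the user lands on
    next_step: str           # one small, concrete real-world action

# The AI prompts at each step instead of supplying answers:
REFRAMING_PROMPTS = {
    "situation": "What happened, in one or two sentences?",
    "automatic_thought": "What went through your mind in that moment?",
    "evidence_for": "What makes that thought feel true?",
    "evidence_against": "Is there anything that doesn't quite fit it?",
    "balanced_thought": "How would you put it now, weighing both sides?",
    "next_step": "What's one small thing you could do about it tomorrow?",
}
```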
Feature — Gentle Behaviour Nudges
Subtle positive reinforcement moments — such as small surprises, notes, or symbolic tokens — that acknowledge progress and encourage real-world emotional actions.

Before (existing market solutions)
Most emotional AI systems keep users within the digital space, providing comfort but rarely prompting real-life change or emotional follow-through.

After (our solution)
The AI offers small, meaningful “nudges” that encourage users to take grounded actions offline — supporting emotional momentum without pressure.
Rationale
This reinforces emotional progress in daily life, bridging digital reflection with real-world behaviour, and strengthening personal agency over time.

Service System
Service Flow
Impact
Hypothesis Testing of the Service & Data Collection
We tested our value and business hypotheses through prototype sessions with 12 young professionals.
Iterations Based on Feedback:
Added Progress Reflection & Encouragement: introduced gentle “noticing” and reinforcement messages to help users recognise emotional progress and sustain long-term engagement
Strengthened CBT Support: integrated AI-assisted guidance to help users complete CBT reframing steps more confidently