Yutong Zhang
@yutongzhang.bsky.social
CS master's student at Stanford | previously undergrad at UIUC
AI companions aren't science fiction anymore 🤖💬❤️
Thousands of people are turning to AI chatbots for emotional connection – finding comfort, sharing secrets, and even falling in love. But as AI companionship grows, the line between real and artificial relationships blurs.
June 18, 2025 at 4:27 PM

This raises an urgent question: Can these "artificial" bonds truly meet human needs, or are we creating new vulnerabilities?

📊 To find out, we surveyed 1,000+ Character.AI users and analyzed 4,363 chat sessions to understand how people really talk to AI. We combined three data sources to reveal how people connect with AI companions and how those connections affect their well-being.

People don't always say they use chatbots for companionship, but companionship is the primary actual use across all three data sources. Users view chatbots as friends or partners and turn to them to discuss emotional and intimate topics.

⚠️ Short-term comfort from a chatbot may cost long-term social health. Users report emotional dependence, withdrawal, and distorted expectations.

The illusion of care comes with real risks. Future designs must ask: How do we prevent over-attachment? How do we support healthier AI chatbot use?