Yutong Zhang
@yutongzhang.bsky.social
90 followers 400 following 10 posts
CS master's student at Stanford | previously undergrad at UIUC
yutongzhang.bsky.social
Huge thanks to my amazing coauthors (@dorazhao.bsky.social, Jeffrey T. Hancock, Robert Kraut, @diyiyang.bsky.social); I couldn't have done this without you. Grateful to learn from and work with you all :)
yutongzhang.bsky.social
⚠️ Short-term comfort from a chatbot may cost long-term social health. Users report emotional dependence, withdrawal, and distorted expectations.

The illusion of care comes with real risks. Future designs must ask: How do we prevent over-attachment? How do we build AI chatbots that support healthier use?
yutongzhang.bsky.social
5️⃣ Chatbot effects depend on your social environment – people with fewer real-life connections get less out of AI companionship, which doesn't make up for missing human support 🫂.
yutongzhang.bsky.social
3️⃣ It’s all about how chatbots are used: general interaction links to greater well-being 📈, but seeking companionship is tied to worse outcomes 📉.
4️⃣ More frequent interactions, deeper emotional connections, and more disclosure with AI companions are linked to lower well-being 📉.
yutongzhang.bsky.social
1️⃣ People with less human support, like single users, minority groups, and those with smaller social networks, are more likely to seek companionship from chatbots 💬.
2️⃣ The longer people chat, the more intense their relationship with the AI becomes 🔁.
yutongzhang.bsky.social
People don’t always say they use chatbots for companionship, but companionship emerges as the primary actual use across all three data sources. Users view chatbots as friends or partners and turn to them to discuss emotional and intimate topics.
yutongzhang.bsky.social
📊 We surveyed 1,000+ Character.AI users and analyzed 4,363 chat sessions to understand how people really talk to AI. We combined three data sources to reveal how people connect with AI companions and how it impacts their well-being.
yutongzhang.bsky.social
This raises an urgent question: Can these “artificial” bonds truly meet human needs, or are we creating new vulnerabilities?
yutongzhang.bsky.social
AI companions aren’t science fiction anymore 🤖💬❤️
Thousands are turning to AI chatbots for emotional connection – finding comfort, sharing secrets, and even falling in love. But as AI companionship grows, the line between real and artificial relationships blurs.
Reposted by Yutong Zhang
tiziano.bsky.social
Academic job market post! 👀

I’m a CS Postdoc at Stanford in the Stanford HCI group.

I develop ways to improve the online information ecosystem by designing better social media feeds & improving Wikipedia. I work on AI, Social Computing, and HCI.
piccardi.me 🧵
Reposted by Yutong Zhang
mbernst.bsky.social
This paper argues that online spaces become ghost towns because it's too easy to lurk without contributing, and that asking people to regularly re-commit—or the incoming messages start getting muted—reverses the trend. arxiv.org/abs/2410.23267

It works! #cscw2024 paper by @popowski.bsky.social.
Commit: Online Groups with Participation Commitments
In spite of efforts to increase participation, many online groups struggle to survive past the initial days, as members leave and activity atrophies. We argue that a main assumption of online group de...
arxiv.org