Josh Wenger
jdweng.bsky.social
PhD Student in Psychology and Social Data Analytics | NSF GRFP Fellow
Our findings highlight the impressive potential of AI for high-quality emotional support, while emphasizing the importance of respecting individual preferences in empathy-seeking behavior.
February 4, 2026 at 11:30 PM
This effect appears for participants’ real-life emotional situations, and even when the human empathizer is an expert (e.g., trained crisis responders).
February 4, 2026 at 11:30 PM
In our new research, we examined whether people choose to receive empathy from a human or AI empathizer when given the free option.

Across four studies, we find an “AI empathy choice paradox”:
—People generally choose human empathizers.
—But when they do choose AI, they rate it as more empathetic.
February 4, 2026 at 11:30 PM
We also see empathy as part of a broader philosophy of science conversation: how should scientists engage with participant experiences to inform construct definitions, and how should we bound our constructs as new technologies and relational possibilities emerge?

5/5
January 16, 2026 at 4:36 PM
Rather than letting modality (human vs. AI) dictate construct boundaries, we suggest grounding empathy in what it does for people—and why that matters for theory, measurement, and public relevance. As AI reshapes social interaction, our constructs need to be flexible enough to keep up.

4/5
January 16, 2026 at 4:36 PM
In our new preprint, we argue for a functional-relational approach to empathy, highlighting:
- the multiple functions empathy serves
- the role of relational context
- the importance of lived experience in defining psychological constructs

3/5
January 16, 2026 at 4:36 PM
Traditional models of empathy focus on the empathizer’s embodied emotional experience, which AI lacks. Yet people report feeling cared for by AI. This tension between human experience and researcher-imposed construct definitions raises questions about what it truly means for AI to “empathize.”

2/5
January 16, 2026 at 4:36 PM
Finally, we suggest that questions about whether empathy from one source is inherently “better” are difficult to answer without grounding them in a normative ethical framework, one that can guide judgments about the relative value of different empathic qualities and their effects on well-being.
October 17, 2025 at 7:11 PM
From an empathy recipient’s perspective, the preference for one source over another may depend on how they weigh these trade-offs in light of their particular emotional situation. In some moments, accessible empathy may be more valuable than selective empathy.
October 17, 2025 at 7:10 PM
Human empathy has the potential for unique qualities such as selectivity and effort, though its expression of these qualities manifests in a wide variety of forms. AI empathy, on the other hand, offers its own unique advantages, including consistency and accessibility.
October 17, 2025 at 7:10 PM
Across multiple studies, we examine this AI empathy choice paradox and explore how it varies across empathy vs. compassion, physical vs. emotional suffering, and positive vs. negative situations, as well as the importance of perceived empathizer effort.
March 5, 2025 at 1:52 PM