Dhanaraj Thakur
thakurdhanaraj.bsky.social
Research Director, Center for Democracy & Technology @cdt.org

Board Member: #shepersisted and the Baltimore Digital Equity Coalition.
Member: Global Majority Research Committee (Trust and Safety Foundation).
Reposted by Dhanaraj Thakur
AND running unpermitted, polluting gas turbines AND sited in the predominantly Black Memphis community of Boxtown, which already faces some of the highest rates of asthma in the nation time.com/7308925/elon...
November 25, 2025 at 3:01 PM
Reposted by Dhanaraj Thakur
I've been thinking about how many journalists of color I know have gone through multiple layoffs and are tired of the instability on top of the stagnant pay, increasing workload, demand for more skills, threat of AI, and constant complaints about paywalls. It's hard to have kids or make plans.
November 24, 2025 at 5:46 PM
Reposted by Dhanaraj Thakur
I’ve been running around asking tech execs and academics if language was the same as intelligence for over a year now - and, well, it isn’t. @benjaminjriley.bsky.social explains how the bubble is built on ignoring cutting-edge research into the science of thought www.theverge.com/ai-artificia...
November 25, 2025 at 1:54 PM
Reposted by Dhanaraj Thakur
"The Wall Street Journal interviewed teenage users of Character.AI with the permission of their parents. The teens described relying on the technology to cope with loneliness and having difficulty setting it aside."
Teens Are Saying Tearful Goodbyes to Their AI Companions
Chatbot maker Character.AI is cutting off access, citing mental-health concerns.
www.wsj.com
November 25, 2025 at 12:20 PM
Reposted by Dhanaraj Thakur
"My mentor always tells me, 'Kim, dogs don't bark at parked cars.' They're coming after critical race theory, 1619, intersectionality because these ideas mobilized people. They gave them the language to actually articulate what they were seeing with their own eyes," says Kimberlé Crenshaw.
November 25, 2025 at 1:26 PM
Reposted by Dhanaraj Thakur
“To be honest,” Etoria added, “it helped me imagine how the slaves might have felt, going to another land in shackles and chains—that loneliness, that disconnect, that sense of loss.”

www.newyorker.com/magazine/202...
Disappeared to a Foreign Prison
The Trump Administration is deporting people to countries they have no ties to, where many are being detained indefinitely or forcibly returned to the places they fled. Sarah Stillman reports.
www.newyorker.com
November 24, 2025 at 7:05 PM
Reposted by Dhanaraj Thakur
“AI workers said they distrust the models they work on because of a consistent emphasis on rapid turnaround time at the expense of quality.”
Meet the AI workers who tell their friends and family to stay away from AI
When the people making AI seem trustworthy are the ones who trust it the least, it shows that incentives for speed are overtaking safety, experts say
www.theguardian.com
November 22, 2025 at 5:12 PM
Reposted by Dhanaraj Thakur
It’s blowing my mind that schools, universities, public services would run headlong into this. We spent 15 years documenting black boxes. This is a black box in a black hole!
I don't understand how anyone can watch how blatantly Grok is manipulated to answer the way ownership desires it to and then act like the other LLM chatbots couldn't possibly be similarly but less obviously compromised to produce responses in whatever way corporate interests and priorities dictate.
November 23, 2025 at 11:16 PM
New report from @cdt.org led by Varun Rao and myself.
"Think Twice Before You Search: Deterrence Messaging Designs to Prevent Searches for Non-Consensual Intimate Images"

cdt.org/insights/thi...

#ncii #onlinegbv #ndii #deepfakes
Think Twice Before You Search: Deterrence Messaging Designs to Prevent Searches for Non-Consensual Intimate Images
This report is also authored by Princeton University’s Varun Nagaraj Rao. [ PDF version ] Non-consensual distribution of intimate images (NDII) occurs when explicit images or content of a person, whet...
cdt.org
November 21, 2025 at 7:00 PM
Reposted by Dhanaraj Thakur
Delighted to share my collaboration with @aliyabhatia.bsky.social on research that grounds online safety debates in what families actually say they need and value. We touched on four key topics: age verification, feed controls, screen-time features and parental access.
CDT released new research on what teens & parents actually want from social media safety features. CDT’s @mluria.bsky.social & @aliyabhatia.bsky.social found that many policy proposals—like strict age checks or rigid time limits—often feel intrusive, ineffective, or disconnected from family needs.
November 20, 2025 at 7:58 PM
Reposted by Dhanaraj Thakur
I really hate it when scientists keep saying that “we need to rebuild trust in science,” because it implies that scientists are to blame for the mistrust rather than the millions of dollars of dark money that have funded political attacks on science in order to advance a far right agenda.
November 19, 2025 at 9:48 PM
Reposted by Dhanaraj Thakur
This is a nuts story. An AI chatbot and image platform left millions of images exposed. They show what people are actually using the AI for: taking random women's yearbook, graduation, and social media photos and making super realistic hardcore porn with them
www.404media.co/ai-porn-secr...
Massive Leak Shows Erotic Chatbot Users Turned Women’s Yearbook Pictures Into AI Porn
Chatbot roleplay and image generator platform SecretDesires.ai left cloud storage containers of nearly two million images and videos exposed, including photos and full names of women from social me...
www.404media.co
November 19, 2025 at 3:28 PM
In my latest op-ed with @techpolicypress.bsky.social I address the problem of child sexual exploitation and abuse (CSEA) on livestreaming platforms.

www.techpolicy.press/livestreamin...

#csea #livestreaming #contentmoderation
Livestreaming Platforms Must Demonstrate Their Safety Measures' Effectiveness | TechPolicy.Press
Research by the Center for Democracy & Technology shows that platforms use three major approaches to address this problem, writes Dhanaraj Thakur.
www.techpolicy.press
November 19, 2025 at 2:23 PM
Reposted by Dhanaraj Thakur
Great. Really fantastic. I love computers.

www.404media.co/a-researcher...
A Researcher Made an AI That Completely Breaks the Online Surveys Scientists Rely On
“We can no longer trust that survey responses are coming from real people.”
www.404media.co
November 18, 2025 at 12:51 AM
Reposted by Dhanaraj Thakur
Horrific and illegal activities are playing out across livestreaming platforms. Research by the Center for Democracy & Technology shows that platforms use three major approaches to address the problem, writes Dhanaraj Thakur. But do they work? Platforms must provide proof of their efficacy, he says.
Livestreaming Platforms Must Demonstrate Their Safety Measures' Effectiveness | TechPolicy.Press
Research by the Center for Democracy & Technology shows that platforms use three major approaches to address this problem, writes Dhanaraj Thakur.
www.techpolicy.press
November 18, 2025 at 12:43 PM
Reposted by Dhanaraj Thakur
POV: you are a young woman celebrating a recent academic success
November 17, 2025 at 7:20 PM
Reposted by Dhanaraj Thakur
'Darkfakes,' 'Foefakes,' 'Fanfakes,' and 'Glowfakes': Morgan Wack, Christina Walker, Alena Birrer, Kaylyn Jackson Schiff, Daniel Schiff, and JP Messina systematically analyzed political deepfakes and developed a classification that categorizes them along key dimensions.
Scrutinizing the Many Faces of Political Deepfakes | TechPolicy.Press
Morgan Wack, Christina Walker, Alena Birrer, Kaylyn Jackson Schiff, Daniel Schiff, and JP Messina systematically analyzed political deepfakes.
www.techpolicy.press
November 17, 2025 at 1:47 PM
Reposted by Dhanaraj Thakur
CDT’s @kateruane.bsky.social in @404media.co: “Providing tech services to supercharge ICE operations while blocking tools that support accountability of ICE officers is entirely backwards.”
Google Has Chosen a Side in Trump's Mass Deportation Effort
Google is hosting a CBP app that uses facial recognition to identify immigrants, while simultaneously removing apps that report the location of ICE officials because Google sees ICE as a vulnerable group. “It is time to choose sides; fascism or morality? Big tech has made their choice.”
www.404media.co
November 14, 2025 at 5:30 PM
Reposted by Dhanaraj Thakur
New in @theverge.com - half of US scams originate on a Meta property. The company must do more. @lanalanalana.bsky.social and I weigh in www.theverge.com/tech/820906/...
Meta must rein in scammers — or face consequences
Scams are ruinous to users — but, reportedly, big business to Meta.
www.theverge.com
November 14, 2025 at 7:52 PM
Reposted by Dhanaraj Thakur
Google has moved from moral cowardice (removing ICE tracking apps) to direct complicity in the violent, unconstitutional abuse of vulnerable people.

As a civil liberties lawyer who worked at Google for ten years, this one hits me in the gut.
Google Has Chosen a Side in Trump's Mass Deportation Effort
Google is hosting a CBP app that uses facial recognition to identify immigrants, while simultaneously removing apps that report the location of ICE officials because Google sees ICE as a vulnerable gr...
www.404media.co
November 14, 2025 at 3:28 PM
Reposted by Dhanaraj Thakur
I'm attending a workshop hosted by @techpolicypress.bsky.social tomorrow. They invited me to write a "provocation" about the state of technology and democracy right now.

I think I wrote something appropriately dark and foreboding.

www.techpolicy.press/the-dance-wi...
The Dance with Big Tech is Different under Trump 2.0 | TechPolicy.Press
If we are going to repair democratic institutions, we are going to have to do it ourselves, writes Dave Karpf.
www.techpolicy.press
November 13, 2025 at 9:09 PM
Reposted by Dhanaraj Thakur
TOMORROW: How can civil society researchers produce work that’s both rigorous + high-impact? Join CDT, @datasociety.bsky.social, & @aclu.org for “Advocating with Evidence: Lessons for Tech Researchers in Civil Society."

📅 Nov 13 | 10–11 AM ET | Online

REGISTER: cdt.org/event/advoca...
November 12, 2025 at 5:26 PM