Teresa Barrio Traspaderne
tbtraspa.bsky.social
Researcher on technology and ESCR at Amnesty International. Amateur DJ and movement freak. Migrant in Mexico.
Reposted by Teresa Barrio Traspaderne
South Africa’s #G20 presidency has the potential to catalyse meaningful global #debt reform, but so far nothing tangible has been achieved.
1️⃣6️⃣5️⃣ organisations have called on @cyrilramaphosabsky.bsky.social to take action on the #debtcrisis.
Read the call here ➡️ www.eurodad.org/letter_to_sa...
October 14, 2025 at 10:07 AM
Reposted by Teresa Barrio Traspaderne
🚨 Urgent Action 🚨
For decades, the #Barí Indigenous people of Catatumbo, #Colombia 🇨🇴, have faced neglect. Today they are suffering an epidemic, with dozens sick and one child dead. MinSalud and Norte de Santander must guarantee the right to health. amn.st/63326AKxnA
September 11, 2025 at 12:00 AM
I'm very excited to take on an interim role as Researcher/Adviser on Technology and ESCR at @amnesty.org.
I’ll explore how AI and algorithms impact social protection, focusing on intersectional & potentially discriminatory effects. Looking forward to connecting with others in this space!
September 8, 2025 at 11:18 PM
Reposted by Teresa Barrio Traspaderne
TikTok says it’s safe for children and young people — but its actions don’t match its words.

Easy-to-bypass time limits, invasive data collection, not enough protection from harmful content. Sign our #FixTikTok petition & demand change: http://amn.st/63324NXrer
Make TikTok safer for children and young people.
It is becoming more toxic and addictive for children. Sign the petition now.
amn.st
May 13, 2025 at 12:20 PM
Reposted by Teresa Barrio Traspaderne
🚨 TikTok is still failing to address serious risks of harm to children and young people’s mental health almost 18 months after @Amnesty highlighted these risks in an extensive research project. Our analysis marking #MentalHealthAwarenessWeek

www.amnesty.org/en/documents...
TikTok fails to address risks to children and young people’s mental health despite past warnings - Amnesty International
TikTok is failing to address serious risks of harm to young users’ mental and physical health almost 18 months after Amnesty International highlighted these risks in a groundbreaking report. Ahead of ...
www.amnesty.org
May 13, 2025 at 3:25 PM
Reposted by Teresa Barrio Traspaderne
JD Vance's stance on AI deregulation undermines civil society efforts to push for regulations that protect human rights.

Check Amnesty's statement on why leaders at the #AIActionSummit must ensure tech companies follow binding rules and standards on AI, and not operate unchecked.
Global/France: AI Action Summit must meaningfully center binding and enforceable regulation to curb AI-driven harms  
Ahead of the AI Action Summit, which begins on February 10, Amnesty International’s Director of the technology and human rights programme, Damini Satija, said:  “With global leaders and tech executives gathering to attend the Artificial Intelligence (AI) Action Summit in Paris, the French government must not miss a crucial opportunity to make meaningful progress towards […]
amn.st
February 11, 2025 at 12:30 PM
Reposted by Teresa Barrio Traspaderne
We are now living in a world that feels increasingly terrifying. The omnipresence of predictive algorithms, coupled with a rising global backlash against civil liberties, risks giving carte blanche to tech companies to operate without rules or guidelines.

Read more 👇
Global/France: AI Action Summit must meaningfully center binding and enforceable regulation to curb AI-driven harms  
Ahead of the AI Action Summit, which begins on February 10, Amnesty International’s Director of the technology and human rights programme, Damini Satija, said:  “With global leaders and tech executives gathering to attend the Artificial Intelligence (AI) Action Summit in Paris, the French government must not miss a crucial opportunity to make meaningful progress towards […]
amn.st
February 10, 2025 at 10:20 AM
Reposted by Teresa Barrio Traspaderne
In the lead-up to next week’s Paris AI Summit, @lipstickkrantikari.bsky.social and I call on government leaders and tech executives to confront how AI and austerity, often coded as ‘government efficiency’, are driving inequality and entrenching corporate power.

www.techpolicy.press/ai-as-double...
AI as Double Speak for Austerity | TechPolicy.Press
Amnesty Tech's Likhita Banerji and Damini Satija say the Summit should prioritize people and communities over the whims of corporations.
www.techpolicy.press
February 7, 2025 at 6:12 PM
Reposted by Teresa Barrio Traspaderne
🚨As leaders gather for the Paris #AIActionSummit, they must confront a stark reality: rapid deployment of AI in the public sector is deepening inequality, expanding mass surveillance, & violating human rights.

Our analysis in @techpolicypress.bsky.social 👇
AI as Double Speak for Austerity | TechPolicy.Press
Amnesty Tech's Likhita Banerji and Damini Satija say the Summit should prioritize people and communities over the whims of corporations.
www.techpolicy.press
February 10, 2025 at 12:04 PM
Reposted by Teresa Barrio Traspaderne
Damini Satija, Director of Tech and Human Rights at Amnesty is at the Paris #AIActionSummit.

She's sharing three things we're asking decision makers to take heed of on AI harms and regulation 👇
February 11, 2025 at 11:48 AM
Reposted by Teresa Barrio Traspaderne
👀🔎 Last week Big Tech companies published their first reports of how they assess the systemic risks of their platforms to users under the EU’s Digital Services Act. Here’s what we’ve found so far: 🧵
December 3, 2024 at 1:32 PM