Nadia Jude
@nadiajude.bsky.social
470 followers 1.6K following 15 posts
Postdoctoral Research Fellow at the University of Edinburgh. Thinking critically about data infrastructures, digital platform content moderation and global security governance. She/her.
Pinned
nadiajude.bsky.social
Our @techpolicypress.bsky.social article is based on our recent paper examining Community Notes through the lens of humour. We find that CN enacts a narrow conception of disinformation that is ill-equipped to address hate and harm. eprints.qut.edu.au/254907/1/Mat...
Our research reveals a fundamental issue with the ‘community notes’ model as a preferred content moderation system: it is oriented toward an outdated and extremely narrow understanding of disinformation, one that views falsity and fakeness alone as the problem, ignoring the historical, social, cultural, economic and political nature of disinformation. By examining the tool through the lens of humor, we show how the system fails to assess the harms or ‘check-worthiness’ of false narratives. Assessing check-worthiness is a skill expert fact-checkers are trained in, making Community Notes a poor substitute for culturally informed content moderation performed by a combination of experts, AI, and crowd workers situated in place and time.
nadiajude.bsky.social
I'm organising an ECR workshop this Friday with an incredible lineup of scholars. It's called 'A field guide to infrastructural analysis'. We only have space for 20, so there is a short application form. If you're in Edinburgh this week, please consider joining us!
www.law.ed.ac.uk/news-events/...
A field guide to infrastructural analysis An early career workshop with Julie Cohen, Gavin Sullivan and Morgan Currie | Edinburgh Law School
nadiajude.bsky.social
Happy to see this out in the world! Please read if you're thinking critically about the fast adoption of content moderation systems like Community Notes, which combine crowdsourcing with bridging-based algorithmic ranking. These systems are being rolled out by X, YouTube, Meta and now TikTok.
Reposted by Nadia Jude
nadiajude.bsky.social
"These scholars... all highlight the centrality of popular discontent with neoliberalism—whether as a project of governance, a type of rationality, or a set of economic policies—as key to understanding the resurgence of an authoritarian, nationalist, and anti-globalist Right"
shannimcg.bsky.social
We here @citap.bsky.social, in collaboration w/ @mariher.bsky.social & Kristóf Szombati, put together a timely and important reading list. Authoritarian Politics: How to Understand It and How to Resist It.
We hope you'll read, reflect, and share this tool broadly. 1/ citap.unc.edu/wp-content/u...
nadiajude.bsky.social
We stress the importance of critically interrogating the design of content moderation systems, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms.
X, YouTube, and now Meta are taking advantage of these successes, borrowing elements and co-opting the language of ‘empowerment’ and ‘democracy’ to reduce moderation and market their techno-solutionist products as beneficial for the world. Research like ours is a useful reminder of the importance of critically interrogating platform content moderation systems: paying attention to their design, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms. Despite large tech companies' efforts to convince us otherwise, there is an urgent need to think beyond technology to address the societal challenges often associated with the disinformation problem. Media system reform, market-shaping approaches, and “big tent” civil society coalitions led by the Global Majority would be a fruitful start.
nadiajude.bsky.social
Thank you Tom ☺️ We felt our research really spoke to the global tech justice piece you wrote with @joncong.bsky.social on the impact of Meta's decisions for marginalised communities + regions grappling with deep-seated conflicts. CN is not designed to address harm + its ethos will likely sow harm.
nadiajude.bsky.social
Went to a great talk on innovation nationalism and the colonial dynamics of drone testing in Australia last night, thanks so much @thaophan.bsky.social. The documentary below, featuring interviews with people of Logan, is worth a watch. Logan is approx. 1h from my home.
www.careful.industries/ai-in-the-st...
Observatory - Logan — Careful Industries
nadiajude.bsky.social
I loved working with @ariadnamf.bsky.social on this. We conceptualise Community Notes as a "data infrastructure for soft moderation". CN has an operational logic that inscribes true-false, real-fake binaries, and fails to address false narratives that mobilise the ambiguities within humour to harm.
ariadnamf.bsky.social
Amidst Meta's content moderation changes, @nadiajude.bsky.social and I have a new paper in press on how the 'Community Notes' model enacts narrow understandings of the disinformation problem eprints.qut.edu.au/254907/1/Mat... We're also preparing a piece for Tech Policy Press that we'll share soon
nadiajude.bsky.social
No problem at all, Nicole! It’s what this place is for ☺️ I hope you find them useful!
nadiajude.bsky.social
“Worryingly, critical researchers and specialist organizations are redirected by donors away from healing initiatives to conflict frames. 💔”
joncong.bsky.social
New GloTech study discusses these Qs: What can South-to-South knowledge exchange contribute to Tech and Democracy research? What are divergence points between Global North & South tech justice advocacies? How are so-called global “whole-of-society” coalitions extractive of “local partners”? 🧿 1/6
nadiajude.bsky.social
Hi Nicole, I'm writing my PhD on shifting conceptions of disinformation within Australian policy discourses. I'll link to some readings I've found particularly useful below (have many more, if this is the kind of work you're looking for) ☺️
Reposted by Nadia Jude
qutdmrc.bsky.social
We made a starter pack of researchers affiliated with the QUT Digital Media Research Centre