Meredith Broussard, PhD
@merbroussard.bsky.social
19K followers 410 following 300 posts

Critical AI, data journalism, literary nonfiction. Professor at NYU. Author, "More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech." meredithbroussard.com

Meredith Broussard is a data journalism professor at the Arthur L. Carter Journalism Institute at New York University. Her research focuses on the role of artificial intelligence in journalism.

design-law.bsky.social
If a bunch of smart, financially disinterested experts take a look at something and they all come away with concerns, that's not "bias." It's data.

Reposted by Meredith Broussard

techpolicypress.bsky.social
🚨Less Than a Week Left to Apply!🚨

If you are someone looking to inform technology policy through rigorous original reporting or policy analyses, we want to hear from you! Click the link below and apply to be part of our 2026 Fellowship cohort!

airtable.com/appIrc1F9M5d...
carlzimmer.com
Today my @nytimes.com colleagues and I are launching a new series called Lost Science. We interview US scientists who can no longer discover something new about our world, thanks to this year's cuts. Here is my first interview with a scientist who studied bees and fires. Gift link: nyti.ms/3IWXbiE
nyti.ms
hypervisible.blacksky.app
“One of the negative consequences AI is having on students is that it is hurting their ability to develop meaningful relationships with teachers, the report finds. Half of the students agree that using AI in class makes them feel less connected to their teachers.”
Rising Use of AI in Schools Comes With Big Downsides for Students
A report by the Center for Democracy and Technology looks at teachers' and students' experiences with the technology.
www.edweek.org
datasociety.bsky.social
“Privacy-preserving” isn’t as private as you might think. Our new brief, published in collab w @powerswitchaction.org & Coworker, exposes how so-called “privacy-preserving” technologies can actually enable *more* worker surveillance — and what we can do about it. datasociety.net/library/the-...
hypervisible.blacksky.app
Altman claims the company didn’t anticipate people not wanting their deepfakes to say “offensive things or things that they find deeply problematic,” which sounds like a lie but is also indicative of how they recklessly release tech into the world.
OpenAI wasn’t expecting Sora’s copyright drama
It felt “more different to images than people expected.”
www.theverge.com

Reposted by Meredith Broussard

richardfletcher.bsky.social
A thread on how people's use of generative AI has changed in the last year - based on survey data from 6 countries (🇬🇧🇺🇸🇫🇷🇩🇰🇯🇵🇦🇷 ).

First, gen AI use has grown rapidly.

Most people have tried out gen AI at least once (61%), and 34% now use it on a weekly basis - roughly doubling from 18% a year ago.
hypervisible.blacksky.app
“…that Sora is being used for stalking and harassment will likely not be an edge case, because deepfaking yourself and others into videos is one of its core selling points.”

Far from an edge case, it’s the primary use case.
Stalker Already Using OpenAI's Sora 2 to Harass Victim
A journalist claims that her stalker used Sora 2, the latest video app from OpenAI, to churn out videos of her.
futurism.com

merbroussard.bsky.social
This is the first year I have seen multiple students taking notes by hand because they have a deeper understanding of how information is most effectively encoded in their brains. It’s exciting to see!

merbroussard.bsky.social
I definitely thought about getting printed coursepacks for this semester, but I don’t know where to order them!

merbroussard.bsky.social
Shoutout to all the first-year students who left home, grew tiny mustaches, and are debuting them in-person at parents' weekend.

Reposted by Meredith Broussard

merbroussard.bsky.social
King Chunk
explore.org
Meet your FAT BEAR WEEK 2025 champion.

Chunk the Hunk. The Chunkster. 32 Chunk.

All hail the new king of Brooks River 👑
dingdingpeng.the100.ci
A lot of psych is already conducted with online convenience samples & ppl are probably excited about silicon samples bc it would allow them to crank out more studies for even less 💸

How about we reconsider the idea that sciencey science involves collecting your own data.
www.science.org/content/arti...
AI-generated ‘participants’ can lead social science experiments astray, study finds
Data produced by “silicon samples” depends on researchers’ exact choice of models, prompts, and settings
www.science.org

alondra.bsky.social
"Data centers are proliferating in VA and a blind man in [MD] is suddenly contending with sharply higher power bills...It’s an increasingly dramatic ripple effect of the AI boom as energy-hungry data centers...[pull]...households into paying for the digital economy" www.bloomberg.com/graphics/202...
AI Data Centers Are Sending Power Bills Soaring
Wholesale electricity costs as much as 267% more than it did five years ago in areas near data centers. That’s being passed on to customers.
www.bloomberg.com
carlquintanilla.bsky.social
“.. OpenAI spent more on marketing and equity options for its employees than it made in revenue in the first half of 2025. That single fact sums up where we are in the AI cycle more neatly than anything we could ever write, so we’ll just end there.”

@financialtimes.com
www.ft.com/content/908d...

Reposted by Meredith Broussard

404media.co
New: landlords are demanding potential tenants hand over employer login credentials so a tool can verify their income. We were sent screenshots of the tool, Argyle, downloading much more data than necessary to approve the renter. "Opt-out means no housing" www.404media.co/landlords-de...
Landlords Demand Tenants’ Workplace Logins to Scrape Their Paystubs
Screenshots shared with 404 Media show tenant screening services ApproveShield and Argyle taking much more data than they need. “Opt-out means no housing.”
www.404media.co
hypervisible.blacksky.app
“This worldwide update includes the ability for parents, as well as law enforcement, to receive notifications if a child—in this case, users between the ages of 13 and 18—engages in chatbot conversations about self harm or suicide.”
OpenAI Adds Parental Safety Controls for Teen ChatGPT Users. Here’s What to Expect
OpenAI’s review process for teenage ChatGPT users who are flagged for suicidal ideation includes human moderators. Parents can expect an alert about alarming prompts within hours.
www.wired.com
disabilitystor1.bsky.social
I realize many people don't quite get that not every single professor in every single university teaches computer science, and that some of us might actually be trained in and invested in teaching other things, like, let's say, history? Or poetry.
Or sociology.
It is not our job to teach students how to use AI.
matt94250.bsky.social
If you don’t teach your students how to use AI, you’re doing them a huge disservice because they won’t have jobs in the future.
hypervisible.blacksky.app
The guide lists 7 risks “that come from AI governance: discriminatory work assignments, fluctuating wages, loss of worker control, constant surveillance, unreasonable performance evaluations, automated punishment, and non-payment.”
Report Warns That AI Is About to Make Your Boss a Panopticon Overlord
Far from a dystopian fantasy, algorithmic management tech is rapidly expanding across the US and European Union.
futurism.com