PauseAI
@pauseai.bsky.social
Community of volunteers who work together to mitigate the risks of AI. We want to internationally pause the development of superhuman AI until it's safe.

https://pauseai.info
SciShow's video (in collaboration with ControlAI) is definitely worth a watch! www.youtube.com/watch?v=90C3...
We’ve Lost Control of AI
If you find these trends concerning and you want to make a difference, you can go to http://controlai.com/scishow, where ControlAI has created tools to help you easily voice your concerns to your…
www.youtube.com
November 7, 2025 at 6:30 PM
This is absurd, and we would not accept it with any other industry. We need regulation to ensure the public is safe. Relying on Big Tech to do the right thing is already failing.

See the full list of signatories here: pauseai.info/dear-sir-dem...
Letter to Sir Demis Hassabis
Parliamentarians from across the UK call on Google DeepMind to honour their AI safety commitments
pauseai.info
November 7, 2025 at 6:30 PM
With the release of Gemini 2.5 Pro, no safety information was published. Eventually, a month later, they were gracious enough to provide us with a model card.
November 7, 2025 at 6:30 PM
In 2024, Google promised governments that they would release critical safety information with every new model, and explain how external bodies were involved in evaluations.
November 7, 2025 at 6:30 PM
The polls are clear: the public want a ban on superintelligence. Many politicians agree, but don't think the issue is politically salient enough to discuss comfortably. It's our job to change that, and it's something everyone, including you, can help with.

pauseai.info/join
November 5, 2025 at 4:55 PM
As Andrea Miotti writes in TIME, we've achieved a global treaty to prevent catastrophe before. In the 1980s, every nation recognised that it wasn't in their interest to allow anyone to destroy our ozone layer with CFCs. Self-interest was enough to save us then, and it can save us now.
November 5, 2025 at 4:55 PM
Both of these quotes are from a recent BBC Panorama documentary, Trump and the Tech Titans.
November 4, 2025 at 6:06 PM
The letter points to our current inability to control advanced AI, and the significant risk to human life that it would pose.

Sign here: superintelligence-statement.org
Statement on Superintelligence
“We call for a prohibition on the development of superintelligence, not lifted before there is (1) broad scientific consensus that it will be done safely and controllably, and (2) strong public…
superintelligence-statement.org
October 30, 2025 at 6:39 PM
Taking place from Thursday 11th to Saturday 13th of December, PauseCon Brussels will culminate in a large demonstration at the European Parliament.

Apply now! 👉 pausecon.org
PauseCon Brussels 2025
Join us for PauseCon in Brussels, December 11-13th 2025. A development and training event for anyone interested in volunteering for PauseAI.
pausecon.org
October 29, 2025 at 10:21 AM
Following on from our event in London this summer, we'll be heading to Belgium to give people interested in volunteering with PauseAI a chance to take part in workshops, connect with other activists, and learn more about how they can take action against reckless AI development.
October 29, 2025 at 10:21 AM
These figures accompanied the publication of the Future of Life Institute's open letter calling for a ban on superintelligence, at least until there's scientific consensus that it will be safe and strong public buy-in.

futureoflife.org/recent-news/...
The U.S. Public Wants Regulation (or Prohibition) of Expert‑Level and Superhuman AI - Future of Life Institute
Three‑quarters of U.S. adults want strong regulations on AI development, preferring oversight akin to pharmaceuticals rather than industry "self‑regulation."
futureoflife.org
October 27, 2025 at 12:37 PM
The same polling also found that 64% think that superhuman AI should not be developed until it is proven safe and controllable, or should never be developed at all.
October 27, 2025 at 12:37 PM
You get the picture. It's just waiting for one more signature - yours!

Sign here - superintelligence-statement.org
October 24, 2025 at 5:42 PM
It’s been signed by over 30,000 people, including the two most cited living scientists (Geoffrey Hinton and Yoshua Bengio), Apple co-founder Steve Wozniak, Richard Branson, Prince Harry, former President of Ireland Mary Robinson, Grimes, Stephen Fry, Steve Bannon, Kate Bush, will.i.am, ...
October 24, 2025 at 5:42 PM
To help us grow further to 300, simply upload a selfie using the link below. It only takes 30 seconds!

👉 pauseai.info/sayno
Stop Superintelligence
Join the photo petition to say no to the race to build superintelligent AI
pauseai.info
October 16, 2025 at 1:50 PM