Per Axbom
@axbom.com
1.5K followers 690 following 830 posts
Swedish communication theorist born in Liberia • Writing about the human impact of tech and teaching digital ethics • Mechanical heart valve ❤️ • He/him • My Next Heartbeat – Critical thinking in times of artificial urgency: https://heartbeat.email
Pinned
axbom.com
"We are asking permission because of GDPR."

No, you're asking permission because it's the decent, respectable thing to do.

Laws came into being because too many were busy being assholes.

If you say you're doing it because of GDPR you're saying you didn't really want to do the decent thing.
Reposted by Per Axbom
halfrobot.com
This is amazing; I very much agree with how the report calculated value to society here:
www.gov.ie/en/departmen...

But for people excited about a *literal* profit machine, note that the real net fiscal cost was almost €72M. It made €146M due to the estimated value of gains in mental wellbeing.
Reposted by Per Axbom
kojamf.bsky.social
Dr. Jane Goodall filmed an interview with Netflix in March 2025 that she understood would only be released after her death.
Reposted by Per Axbom
strandjunker.com
“Terrible things are happening outside. Poor helpless people are being dragged out of their homes. Families are torn apart. Men, women, and children are separated. Children come home from school to find that their parents have disappeared.”

Diary of Anne Frank
January 13, 1943
Reposted by Per Axbom
documentalope.bsky.social
This is an interesting and useful distinction. I colloquially refer to "AI tools" when working with students, etc., but it does seem useful to think through the ways that the products we are currently confronted with are not meaningfully tool-like.
olivia.science
Perhaps related to this, and definitely related to the conceptual distinction the linked piece tries to make: I think it's a problem to say AI (of any kind, actually) is a tool (or is definitely a tool). A possibly useful extract is attached, from: arxiv.org/abs/2507.19960
Extract from pages 5 and 6 of https://arxiv.org/pdf/2507.19960
axbom.com
❤️
theatlantic.com
Jane Goodall, who died Wednesday at 91, was not just a pioneering scientist, but also an expert at wielding the power of fame, Michelle Nijhuis writes.
Jane Goodall’s Second-Greatest Talent
She knew how to wield her fame to protect the animals she loved.
bit.ly
Reposted by Per Axbom
surfbruden.bsky.social
Hair-raising. Read the whole text

”After ten years as a care aide in home care services, Gjulfidan Saliji chose to train as an assistant nurse. Then she was told she is to be deported – because she had not applied for a new position.”

arbetet.se/2025/10/03/g...
Gjulfidan is being deported because she continued her studies
”It is impossible to describe in words how it feels.”
arbetet.se
Reposted by Per Axbom
iris-meredith.bsky.social
Today's article is about a form of bad-faith speech somewhat akin to bullshit, which I'm calling (because why not) wank. I don't quite think I've focused the ideas down as much as I can, but I think it says some interesting things regardless:
An essay on wank | deadSimpleTech
This captures well the uncomfortable, slightly disorienting feeling that wank creates when you're subjected to it, wherein you're expected to speak about and think about the statement as though it say...
deadsimpletech.com
Reposted by Per Axbom
davidbrax.bsky.social
”We have been drilled in the importance of not having anything politically provocative on our mobile phones (…) We have been taught which research fields and activities more often trigger suspicion and extra scrutiny, and what we can do if we are held at the border.” universitetslararen.se/2025/09/22/i...
In the queue for US border control - Universitetsläraren
"That I continue to travel to the US under these circumstances is not something I do out of routine." Universitetsläraren columnist Evelina Edfors describes the complexity of a continued research collaboration with the US.
universitetslararen.se
Reposted by Per Axbom
janmaarten.com
You too can save the world by not trusting the computer.
axbom.com
Happy Petrov Day to those who celebrate. On September 26, 1983, Stanislav Petrov made the correct decision to not trust a computer.

I've attached a short clip from a reenactment of the situation in the documentary The Man Who Saved the World.

1/11
Reposted by Per Axbom
jimwhittington.bsky.social
Excellent thread about performing under compressed time and high stress during an incident. There are few times in history where the conditions were so intense and the decisions were so critical for all of us.

Good job Lt. Col. Petrov.
axbom.com
Happy Petrov Day to those who celebrate. On September 26, 1983, Stanislav Petrov made the correct decision to not trust a computer.

I've attached a short clip from a reenactment of the situation in the documentary The Man Who Saved the World.

1/11
axbom.com
"AI will take over the world."

This can mean two things:

1. "AI" will become self-aware and decide to control humans.

2. Gen-slop and deepfakes will dominate available content, leading to deskilling, human abuse and eco-harm.

The first idea has no basis, but it's fair to worry about the second.
axbom.com
Maintaining an illusion of perfect, neutral and flawless systems will keep people from questioning those systems when they need to be questioned.

We need to stop punishing failure when it helps us understand something that can be improved.

10/11
axbom.com
3. Reward exposure of faulty systems

If we keep praising our tools for their excellence and efficiency, it's hard to later accept their defects. When shortcomings are found, they need to be communicated just as clearly and widely as successes.

9/11
axbom.com
On top of this: The launch detection system was new (and hence he did not fully trust it).

8/11
axbom.com
- He had been told a US attack would be all-out. An attack with only 5 missiles did not make sense to him.
- Ground radar gave no supporting evidence of an attack, even after minutes of waiting.
- The message passed too quickly through the 30 layers of verification he himself had devised.

7/11
axbom.com
2. Look for multiple confirmation points

Stanislav Petrov understood what he was looking for. While he admitted he could not be 100% sure the attack wasn't real, he mentioned several factors that played into his decision:

6/11
axbom.com
I've previously written about three lessons to take away from Petrov's actions:

1. Embrace multiple perspectives

Petrov being an engineer and not a military man speaks to me of how important it is to welcome a broad range of experiences and perspectives.

5/11
axbom.com
The computer was indeed wrong about the imminent attack, and Petrov likely saved us from nuclear disaster in those impossibly stressful minutes by daring to wait for ground confirmation. For context, one must also be aware that this was at a time when US-Soviet relations were extremely tense.

4/11
axbom.com
Many officers facing the same situation would have called their superiors to alert them of the need for a counter-attack, especially as fellow officers were shouting at him to retaliate quickly before it was too late. Petrov did not succumb.

3/11
axbom.com
The early warning system at command center Serpukhov-15, loudly alerting of a nuclear attack from the United States, was of course modern and up-to-date. Stanislav Petrov was in charge, working his second shift in place of a colleague who was ill.

2/11
axbom.com
Happy Petrov Day to those who celebrate. On September 26, 1983, Stanislav Petrov made the correct decision to not trust a computer.

I've attached a short clip from a reenactment of the situation in the documentary The Man Who Saved the World.

1/11