Andreas Longva
andlon.bsky.social
Researcher and PhD candidate @ RWTH Aachen University. I work in the intersection between math, physics and software. Simulating the macroscopic world 🌍🦀

andreaslongva.com
I'm very tempted to call this pure word salad too!
December 10, 2025 at 7:24 PM
I'm sorry to hear that. I get the sentiment.

I partially agree if it's only about scientists getting to do science ("what they love"). I think *humanity* no longer doing the science is problematic though.
December 10, 2025 at 7:23 PM
That's not at all what I'm talking about. Amateur hobbyists don't (generally) shape the course of society. Science does.

Science is bottlenecked by resources, and the bureaucracy to allocate them. "Strong automation" implies the AI handles this part too. This has major implications for society.
December 10, 2025 at 4:02 PM
turns humanity into mere consumers of whatever the AI spits out, no longer in charge of our own development.

I don't think that's an assertion, it's inherent in the premise of what the author calls "strong automation" (no human in the loop).

2/2
December 10, 2025 at 3:31 PM
Going back to the two points I just laid out, this addresses the first point (scientists want to do science), but not the second.

Given how closely scientific and societal development are interwoven, a full surrender of scientific agency to AI

1/
December 10, 2025 at 3:31 PM
being a scientist is a privilege, and taking away that privilege from a select few isn't necessarily "evil".

I'm personally more concerned with the second. I think science, like art, has intrinsic value, and shapes society. I am concerned about what happens if we give up that agency.
December 10, 2025 at 2:35 PM
Yeah, I think there are two points to his arguments:

1. scientists love to do science and they want to keep doing it
2. science (and its precursors) is an important part of our collective selves, and losing scientific agency is/might be detrimental to humanity

I agree with you that

1/
December 10, 2025 at 2:35 PM
I'm sorry, I don't see this anywhere in the post? I don't agree, and certainly not "literally", unless, again, I missed something in the text. Which is very possible.
December 10, 2025 at 2:20 PM
«But what I was arguing against was not what we might call "weak science automation", where humans stay in the loop in important roles, but "strong science automation", where humans are redundant.»
December 10, 2025 at 12:11 PM
Ok, I don't mean to be rude, but you keep adding your own wrongful interpretations of the post, and it's making this discussion border on meaningless. I nevertheless appreciate the discussion, but I'm going to leave it here.

The post explicitly addresses using AI as a tool:

1/
December 10, 2025 at 12:11 PM
That's not at all the post's position though?

At NeurIPS you will find people with a variety of positions. I'm sure some of them want to replace humans in the scientific process entirely. That's the viewpoint the author is arguing against, not some vague average consensus.
December 10, 2025 at 12:05 PM
Were you in the meeting? I wasn't, I have no idea what was said. I can only discuss what is in the post.
December 10, 2025 at 12:02 PM
«If you take the human out of the loop, meaning that humans no longer have any role in scientific research, ...»

You keep wanting to have a different discussion than the post. Its topic is the wholesale replacement of humans in scientific research.

Of course tool use is less problematic.
December 10, 2025 at 11:52 AM
You are having a different discussion than the original post. Of course then all the conclusions are also different.
December 10, 2025 at 11:47 AM
I'm sorry, you've lost me. Second sentence in the post:

«The panelists discussed how they plan to replace humans at all levels in the scientific process.»

The discussion is emphatically NOT about using AI simply as a tool by humans.
December 10, 2025 at 11:45 AM
By "giving up on society" I meant giving up self-determination, the ability to shape the direction society takes. While we are discussing scientific agency, I think it is difficult or impossible to entirely decouple it from society and government.

2/2
December 10, 2025 at 11:26 AM
The discussion is literally predicated on the idea of replacing humans at every level of the scientific process. If we don't make that assumption we are having a different discussion.

And I did not say anything about the end of society, but I admit my wording was imprecise.

1/
December 10, 2025 at 11:26 AM
our scientific agency? Sure, maybe AI could do it faster on its own, but who knows what else happens as a side effect.

Surrendering our agency is essentially giving up on society. I obviously have moral qualms about that.

2/2
December 10, 2025 at 11:15 AM
You are here implicitly assuming that surrendering our agency to AI will lead to utopia. This has always been the problem of utilitarianism: the tacit assumption that the future can be predicted, and the outcomes weighed appropriately.

And who says we can't cure cancer without giving up

1/
December 10, 2025 at 11:15 AM
What does this have to do with personal ego? I don't read it this way.

It's also not about curing cancer in isolation: it is taking *all* scientific agency away from humanity.
December 10, 2025 at 10:11 AM
I know some of these words! 😅 I had hoped it would be physics-related but I can sort of follow along
December 8, 2025 at 9:45 PM