Joachim W Pedersen
@joachimwpedersen.bsky.social
620 followers 480 following 27 posts
Bio-inspired AI, meta-learning, evolution, self-organization, developmental algorithms, and structural flexibility. Postdoc @ ITU of Copenhagen. https://scholar.google.com/citations?user=QVN3iv8AAAAJ&hl=en
Pinned
joachimwpedersen.bsky.social
In deep learning research, we often categorize meta-learning approaches as either gradient-based or black-box meta-learning. In my PhD thesis, I argued that it can sometimes be useful to classify approaches based on how the outer-loop optimization affects the inner-loop optimization.
joachimwpedersen.bsky.social
Looking forward to this!
najarro.science
We're excited to announce the final program of the @alife2025.bsky.social SONI session,
which will host a panel discussion with @blaiseaguera.bsky.social, @risi.bsky.social, @emilydolson.bsky.social & Sidney Pontes-Filho

Check out the full program: sites.google.com/view/soni-al...

See you in Kyoto ⛩️
Reposted by Joachim W Pedersen
sakanaai.bsky.social
Introducing The Darwin Gödel Machine

sakana.ai/dgm

The Darwin Gödel Machine is a self-improving agent that can modify its own code. Inspired by evolution, we maintain an expanding lineage of agent variants, allowing for open-ended exploration of the vast design space of such self-improving agents.
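Not Sakana AI's actual code, but a toy Python sketch of the archive-and-variation loop the post describes; `evaluate` and `self_modify` are hypothetical stand-ins for benchmarking an agent and letting it rewrite its own code.

```python
# Toy sketch of an open-ended archive of self-modifying agents.
# `evaluate` and `self_modify` are hypothetical stand-ins, not the DGM's API.
import random

def evolve_agents(seed_agent, evaluate, self_modify, generations=100):
    # The archive keeps every variant ever produced, not just the current best,
    # so later generations can branch off "stepping stone" ancestors.
    archive = [{"agent": seed_agent, "score": evaluate(seed_agent)}]
    for _ in range(generations):
        parent = random.choice(archive)        # pick any ancestor, not only the elite
        child = self_modify(parent["agent"])   # the agent edits its own code
        archive.append({"agent": child, "score": evaluate(child)})
    return max(archive, key=lambda e: e["score"])
```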
Reposted by Joachim W Pedersen
sakanaai.bsky.social
“Continuous Thought Machines”

Blog → sakana.ai/ctm

Modern AI is powerful, but it's still distinct from human-like flexible intelligence. We believe neural timing is key. Our Continuous Thought Machine is built from the ground up to use neural dynamics as a powerful representation for intelligence.
joachimwpedersen.bsky.social
New submission deadline: April 2nd!
So there's still some time to put together interesting thoughts on Evolving Self-Organization!

Also: We are very fortunate to have the great Risto Miikkulainen as the keynote speaker at the workshop!

Can't wait to see you all there! 🤩🙌
#Evolution #Gecco #ALife
joachimwpedersen.bsky.social
Join us for the Evolving Self-Organisation workshop at #GECCO this year! Great chance to submit your favourite ideas concerning self-organisation processes and evolution, and how they interact.
Relevant for Alifers #ALife and anyone interested in #evolution, #self-organisation, and #ComplexSystems.
risi.bsky.social
We're excited to announce the first Evolving Self-organisation workshop at GECCO 2025!

Submission deadline: March 26, 2025

More information: evolving-self-organisation-workshop.github.io
joachimwpedersen.bsky.social
Very satisfying to see one's code run on actual real-world robots and not just in simulation.
Check out the paper here:
arxiv.org/pdf/2503.12406
joachimwpedersen.bsky.social
www.youtube.com/watch?v=jnoa...
Bio-Inspired Plastic Neural Nets that continually adapt their own synaptic strengths can make for extremely robust locomotion policies!
Trained exclusively in simulation, the plastic networks transfer easily to the real world, even under various additional out-of-distribution (OOD) conditions.
[IROS25] Bio-Inspired Plastic Neural Nets for Zero-Shot Out-of-Distribution Generalization in Robots
YouTube video by Worasuchad Haomachai
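For rough intuition about what a plastic layer does, here is a minimal NumPy sketch assuming a generic Hebbian-style ABCD rule; the exact rule and architecture in the paper may differ.

```python
# Minimal sketch of a plastic linear layer whose weights keep changing at
# run time via a Hebbian-style ABCD rule (an illustrative assumption,
# not necessarily the exact rule used in the paper).
import numpy as np

class PlasticLayer:
    def __init__(self, n_in, n_out, eta=0.01, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))
        # A, B, C, D are (meta-)learned plasticity coefficients, one set per synapse.
        self.A, self.B, self.C, self.D = (rng.normal(scale=0.1, size=(n_out, n_in))
                                          for _ in range(4))
        self.eta = eta

    def forward(self, x):
        y = np.tanh(self.W @ x)
        # Hebbian update: each weight changes as a function of its
        # pre-synaptic activity x and post-synaptic activity y.
        dW = self.A * np.outer(y, x) + self.B * x + self.C * y[:, None] + self.D
        self.W += self.eta * dW
        return y

layer = PlasticLayer(4, 2)
for t in range(3):
    print(layer.forward(np.ones(4)))   # weights drift while the policy runs
```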
joachimwpedersen.bsky.social
Remember that 4-page submissions of early results are also welcome!

Also, does anyone know if #GECCO has an official 🦋 account? I cannot seem to find it...
joachimwpedersen.bsky.social
Both 4-pagers of early research as well as 8-page papers with more substantial results are welcome!
joachimwpedersen.bsky.social
Very cool! And great aesthetics as well 🙌 😊
Reposted by Joachim W Pedersen
risi.bsky.social
Ever wish you could coordinate thousands of units in games such as StarCraft through natural language alone?

We are excited to present our HIVE approach, a framework and benchmark for LLM-driven multi-agent control.
joachimwpedersen.bsky.social
With all the research coming from Sakana AI, this figure needs to be updated fast! direct.mit.edu/isal/proceed...

#LLM #ALife #ArtificialIntelligence
Reposted by Joachim W Pedersen
hardmaru.bsky.social
Transformer²: Self-adaptive LLMs

arxiv.org/abs/2501.06252

Check out the new paper from Sakana AI (@sakanaai.bsky.social). We show the power of an LLM that can self-adapt its weights to its environment!
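Roughly, as I read it (see the paper for the actual method), the weights are adapted by rescaling the singular values of frozen weight matrices with small learned vectors. A toy NumPy illustration of that idea:

```python
# Toy illustration only: adapt a weight matrix by rescaling its singular
# values with a small learned vector z. See arxiv.org/abs/2501.06252 for
# the actual Transformer^2 method; this is just the core linear-algebra idea.
import numpy as np

def adapt_weights(W, z):
    """Return W with its singular values scaled elementwise by z."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(s * z) @ Vt

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))          # a frozen base weight matrix
z = 1.0 + 0.1 * rng.normal(size=8)   # a tiny "expert" vector (hypothetical values)
W_adapted = adapt_weights(W, z)
```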
Reposted by Joachim W Pedersen
itu.dk
We've put together a starter pack of researchers and representatives from ITU on Bluesky. Meet them here 👇
go.bsky.app/E8WJwXS
Reposted by Joachim W Pedersen
lanalpa.bsky.social
Can Dynamic Neural Networks boost Computer Vision and Sensor Fusion?
We are very happy to share this awesome collection of papers on the topic!
Reposted by Joachim W Pedersen
matrig.net
If microchip ~= silicon
then AGI ~= huge pile of sand
Reposted by Joachim W Pedersen
hardmaru.bsky.social
Neural Attention Memory Models are evolved to optimize the performance of Transformers by actively pruning the KV cache memory. Surprisingly, we find that NAMMs are able to zero-shot transfer their performance gains across architectures, input modalities, and even task domains! arxiv.org/abs/2410.13166
sakanaai.bsky.social
An Evolved Universal Transformer Memory

sakana.ai/namm/

Introducing Neural Attention Memory Models (NAMM), a new kind of neural memory system for Transformers that not only boosts their performance and efficiency but is also transferable to other foundation models without any additional training!
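This is not the NAMM model itself, just a toy sketch of the underlying operation: keeping only the top-scoring entries of a KV cache. Here `scores` is a placeholder for what the evolved memory model would produce.

```python
# Toy sketch of score-based KV-cache pruning: keep only the top-k cached
# key/value pairs according to an importance score. In NAMM the scores come
# from an evolved neural memory model; here `scores` is just a placeholder.
import numpy as np

def prune_kv_cache(keys, values, scores, keep):
    """keys, values: (seq_len, dim) arrays; scores: (seq_len,); keep: int."""
    top = np.argsort(scores)[-keep:]   # indices of the `keep` highest scores
    top.sort()                         # preserve the original token order
    return keys[top], values[top]

rng = np.random.default_rng(0)
K, V = rng.normal(size=(128, 64)), rng.normal(size=(128, 64))
scores = rng.random(128)
K_small, V_small = prune_kv_cache(K, V, scores, keep=32)
```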
joachimwpedersen.bsky.social
3) Optimizer optimization: Think hyperparameter tuning, e.g., of the learning rate. The search procedure within the inner loop is altered.

We use meta-learning to achieve improved inner-loop optimization, so it is well worth considering exactly how our double-loop achieves this!
#meta-learning #deeplearning #ai
joachimwpedersen.bsky.social
1) Starting point optimization: Think MAML. Move the initial point of the inner-loop search to a better place from which to learn quickly.
2) Loss landscape optimization: Think neural architecture search. The loss landscape(s) of the inner loop are transformed.
joachimwpedersen.bsky.social
This can be thought of independently of which optimizer is used in the inner loop.
In any meta-learning approach, the outer-loop optimization will transform the inner-loop optimization process in at least one of three ways, and often in a combination of these three.
joachimwpedersen.bsky.social
In deep learning research, we often categorize meta-learning approaches as either gradient-based or black-box meta-learning. In my PhD thesis, I argued that it can sometimes be useful to classify approaches based on how the outer-loop optimization affects the inner-loop optimization.
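A toy sketch of the three categories on a single quadratic inner-loop problem: the outer loop can move the starting point, reshape the loss landscape, or tune the inner optimizer itself. All numbers and names here are purely illustrative.

```python
# Toy illustration of the three ways an outer loop can shape an inner loop,
# using a simple quadratic objective (all meta-parameters are illustrative).
import numpy as np

def inner_loop(theta0, curvature, lr, steps=20):
    """Plain gradient descent on f(theta) = 0.5 * curvature * theta**2."""
    theta = theta0
    for _ in range(steps):
        theta -= lr * curvature * theta   # gradient of the quadratic
    return 0.5 * curvature * theta**2     # final inner-loop loss

base = dict(theta0=5.0, curvature=1.0, lr=0.1)

# 1) Starting-point optimization (MAML-style): the outer loop picks theta0.
print(inner_loop(**{**base, "theta0": 0.5}))
# 2) Loss-landscape optimization (architecture-search-style): it reshapes f itself.
print(inner_loop(**{**base, "curvature": 2.0}))
# 3) Optimizer optimization (hyperparameter-tuning-style): it tunes the inner lr.
print(inner_loop(**{**base, "lr": 0.3}))
```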
Reposted by Joachim W Pedersen
craigreynolds.bsky.social
Like 130,000 others, I made a starter pack. This one is people working on or with evolutionary computation in its many forms: genetic algorithms, genetic programming, evolution strategies.

If you'd like to be added, or want to suggest someone else, message me or reply to this post.
joachimwpedersen.bsky.social
Thanks for making a pack putting the spotlight on evolutionary computation! I would love to join the list :)