Lorenzo Loconte
@loreloc.bsky.social
960 followers 770 following 8 posts
#probabilistic-ml #circuits #tensor-networks PhD student @ University of Edinburgh https://loreloc.github.io/
Reposted by Lorenzo Loconte
nesyconf.org
We're glad to announce the NeSy 2025 Test of Time award for "Probabilistic Inference Modulo Theories"!

🏆Rodrigo de Salvo Braz was here to accept the award.

This is groundwork for recent NeSy approaches like DeepSeaProbLog and the probabilistic algebraic layer.
Reposted by Lorenzo Loconte
euripsconf.bsky.social
EurIPS is coming! 📣 Mark your calendar for Dec. 2-7, 2025 in Copenhagen 📅

EurIPS is a community-organized conference where you can present accepted NeurIPS 2025 papers. It is endorsed by @neuripsconf.bsky.social and @nordicair.bsky.social and co-developed by @ellis.eu

eurips.cc
Reposted by Lorenzo Loconte
emilevankrieken.com
We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, combining powerful multimodal understanding with symbolic reasoning 🚀

Read more 👇
Reposted by Lorenzo Loconte
javaloyml.bsky.social
This year I am co-organizing the 8th iteration of the Tractable Probabilistic Modeling #TPM workshop at #UAI2025 🌴 Rio de Janeiro edition 🌴

🌐 lnkd.in/dDK8T5Au
⏰ Submission deadline: May 23rd AoE
🌴 Date: July 15th

🧵👇
Reposted by Lorenzo Loconte
javaloyml.bsky.social
Today we have @lennertds.bsky.social from KU Leuven teaching us how to adapt NeSy methods to deal with sequential problems 🚀

Super interesting topic combining DL + NeSy + HMMs! Keep an eye on Lennert's future works!
Reposted by Lorenzo Loconte
arnosolin.bsky.social
Have you ever considered that, in computer memory, model weights are stored as discrete values anyway? So why not do probabilistic inference directly on the discrete (quantized) parameters? @trappmartin.bsky.social is presenting our work at #AABI2025 today. [1/3]
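The appeal is easy to see in a toy setting: once a parameter is restricted to a finite grid of quantized values, its posterior is a finite categorical distribution that can be computed exactly. A minimal hand-made sketch of that idea (not the method in the paper):

```python
import numpy as np

# Toy illustration (not the paper's method): with a 4-bit weight, the
# parameter space is just 16 values, so the posterior is an exact
# categorical distribution over the grid.
grid = np.linspace(-2.0, 2.0, 16)            # the 16 representable values
log_prior = np.full(16, -np.log(16))         # uniform prior over the grid

rng = np.random.default_rng(0)
x = rng.normal(size=20)
y = 0.7 * x + 0.1 * rng.normal(size=20)      # data from y = w * x + noise

# Gaussian log-likelihood of the data for each candidate weight value.
log_lik = np.array([-0.5 * np.sum((y - w * x) ** 2) / 0.1 ** 2 for w in grid])
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum()                           # exact discrete posterior
print(grid[np.argmax(post)])                 # close to the true weight 0.7
```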
Reposted by Lorenzo Loconte
nolovedeeplearning.bsky.social
the #TPM ⚡Tractable Probabilistic Modeling ⚡Workshop is back at @auai.org #UAI2025!

Submit your works on:

- fast and #reliable inference
- #circuits and #tensor #networks
- normalizing #flows
- scaling #NeSy #AI
...& more!

🕓 deadline: 23/05/25
👉 tractable-probabilistic-modeling.github.io/tpm2025/
Reposted by Lorenzo Loconte
nolovedeeplearning.bsky.social
great to have David Watson (dswatson.github.io) visiting us today and talking about #trustworthy #AI #ML for tabular data with #trees and #circuits

with connections to #generative modeling, #causality and #fast inference!
loreloc.bsky.social
The last speaker of the workshop is Alexandros Georgiou, who is giving an introduction to polynomial networks and equivariant tensor network architectures, as well as how to implement them.
loreloc.bsky.social
After lunch break, Andrew G. Wilson (@andrewgwils.bsky.social) is now giving his presentation on the importance of linear algebra structures in ML, as well as on how to navigate such structures in practice.
loreloc.bsky.social
After Nadav it is now the turn of Guillaume Rabusseau, who is joining us online.

Guillaume guides us through interesting expressiveness relationships between families of RNNs that are parameterized through tensor factorization techniques.
loreloc.bsky.social
Live from the CoLoRAI workshop at AAAI
(april-tools.github.io/colorai/)

Nadav Cohen is now giving his talk on "What Makes Data Suitable for Deep Learning?"

Tools from quantum physics turn out to be useful for building more expressive deep learning models by adapting the data distribution.
Reposted by Lorenzo Loconte
nolovedeeplearning.bsky.social
we're almost ready for the @realaaai.bsky.social #AAAI25 Workshop on Connecting Low-rank Representations in #AI (#CoLoRAI) tomorrow!

we also have video presentations for some of the accepted papers you can already check (👉 april-tools.github.io/colorai/acce...)!

📽️ www.youtube.com/watch?v=JlVd...
[CoLoRAI] FinLoRA: Finetuning Quantized Financial Large Language Models Using Low-Rank Adaptation
YouTube video by april lab
www.youtube.com
Reposted by Lorenzo Loconte
jjcmoon.bsky.social
We all know backpropagation can calculate gradients, but it can do much more than that!

Come to my #AAAI2025 oral tomorrow (11:45, Room 119B) to learn more.
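One classic instance of this idea (whether or not it is the one in the talk): for a distribution written as a polynomial in indicator inputs, a single backward pass returns all single-variable marginals at once. A minimal PyTorch sketch, with the joint table `theta` made up for illustration:

```python
import torch

# Joint distribution over two binary variables A, B (rows: A, cols: B).
theta = torch.tensor([[0.1, 0.2],
                      [0.3, 0.4]])

# Indicator inputs lambda_A[a], lambda_B[b]; all ones means "no evidence".
lam_A = torch.ones(2, requires_grad=True)
lam_B = torch.ones(2, requires_grad=True)

# Network polynomial f = sum_{a,b} theta[a,b] * lam_A[a] * lam_B[b].
f = torch.einsum('ab,a,b->', theta, lam_A, lam_B)
f.backward()

print(lam_A.grad)  # tensor([0.3, 0.7]) = marginal P(A)
print(lam_B.grad)  # tensor([0.4, 0.6]) = marginal P(B)
```

Setting an indicator entry to zero encodes evidence, e.g. with lam_B = [0, 1] the same backward pass returns P(A=a, B=1) in lam_A.grad.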
loreloc.bsky.social
We are going to present our poster "Sum of Squares Circuits" at AAAI in Philadelphia today

Hall E, 12:30pm-2:00pm, poster #840

We trace expressiveness connections between different types of additive and subtractive deep mixture models and tensor networks

📜 arxiv.org/abs/2408.11778
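The core trick, in its simplest one-dimensional form (a toy sketch, not the construction in the paper): a mixture with a negative weight is not a density on its own, but squaring it and renormalizing yields one, and the subtraction lets it carve out shapes a purely additive mixture of the same size cannot:

```python
import numpy as np
from scipy.stats import norm

# Toy subtractive mixture: the negative coefficient means f can dip below 0,
# so f itself is not a valid density.
def f(x):
    return 1.0 * norm.pdf(x, 0.0, 1.0) - 0.6 * norm.pdf(x, 0.5, 1.5)

# Squaring makes it non-negative; renormalize numerically on a grid.
xs = np.linspace(-10.0, 10.0, 20001)
vals = f(xs) ** 2
Z = np.sum(vals) * (xs[1] - xs[0])           # Riemann-sum normalizer
density = vals / Z
print(density.min() >= 0.0, np.sum(density) * (xs[1] - xs[0]))  # True, ~1.0
```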
loreloc.bsky.social
Are you at AAAI in Philadelphia and interested in #tensor-factorizations or #circuits, or even both?

Then join us today at our tutorial: "From tensor factorizations to circuits (and back!)"

Details and materials here
april-tools.github.io/aaai25-tf-pc...

Time 4:15pm - 6:00pm, Room 117
Home | AAAI'25 tutorial
The AAAI'25 tutorial on Tensor Factorizations + Probabilistic Circuits
april-tools.github.io
Reposted by Lorenzo Loconte
nolovedeeplearning.bsky.social
I am at @realaaai.bsky.social #AAAI25 in sunny #Philadelphia 🌞

reach out if you want to grab coffee and chat about #probabilistic #ML #AI #nesy #neurosymbolic #tensor #lowrank models!

check out our tutorial
👉 april-tools.github.io/aaai25-tf-pc...

and workshop
👉 april-tools.github.io/colorai/
Reposted by Lorenzo Loconte
javaloyml.bsky.social
Right, but what are Causal NFs again? In case you missed our NeurIPS 2023 Oral, Causal NFs are Deep Learning models that learn causal systems (SCMs) while having *theoretical guarantees*!

In short, you can accurately use them for causal inference tasks 🧪

arxiv.org/abs/2306.05415
Causal normalizing flows: from theory to practice
In this work, we deepen on the use of normalizing flows for causal reasoning. Specifically, we first leverage recent results on non-linear ICA to show that causal models are identifiable from observat...
arxiv.org
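Roughly, a causal NF is an autoregressive flow whose variable ordering follows the causal graph, so an intervention amounts to clamping one component of the triangular map. A hand-coded toy SCM (not the paper's model) shows what that buys you for an interventional query:

```python
import numpy as np

# Toy SCM with the triangular structure a causal flow mimics:
# x1 = u1,  x2 = 0.8 * x1 + u2   (the u's are exogenous noise)
rng = np.random.default_rng(0)

def sample(n, do_x1=None):
    u1, u2 = rng.normal(size=n), rng.normal(size=n)
    x1 = u1 if do_x1 is None else np.full(n, do_x1)  # intervention clamps x1
    x2 = 0.8 * x1 + u2
    return x1, x2

# Observational vs. interventional mean of x2 under do(x1 = 2).
_, x2_obs = sample(100_000)
_, x2_do = sample(100_000, do_x1=2.0)
print(x2_obs.mean(), x2_do.mean())   # ~0.0 vs. ~1.6
```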
Reposted by Lorenzo Loconte
javaloyml.bsky.social
Have you ever been curious to try Causal Normalizing Flows for your project but found them intimidating? Say no more 😜

I just released a small library to easily implement and use causal-flows:

github.com/adrianjav/ca...
GitHub - adrianjav/causal-flows: CausalFlows: A library for Causal Normalizing Flows in Pytorch
CausalFlows: A library for Causal Normalizing Flows in Pytorch - adrianjav/causal-flows
github.com
loreloc.bsky.social
Happy to see our work at TMLR!

We systematically show the relationship between two apparently different fields, tensor factorizations and circuits, and how bridging the two lets us exchange results, research opportunities in ML, and practical implementation solutions.
tmlr-pub.bsky.social
New #Featured Certification:

What is the Relationship between Tensor Factorizations and Circuits (and How Can We Exploit it)?

Lorenzo Loconte, Antonio Mari, Gennaro Gala et al.

https://openreview.net/forum?id=Y7dRmpGiHj

#tensorized #factorizations #tensor
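The simplest bridge between the two views (a toy sketch in the spirit of the paper, not its full construction): a rank-R CP factorization of a joint probability table is exactly a shallow probabilistic circuit, i.e. one sum unit (a mixture over the rank index) over products of univariate input distributions:

```python
import numpy as np

# A rank-R CP factorization of a joint table P(X1, X2, X3), read as a
# shallow circuit: one sum unit over products of univariate categoricals.
rng = np.random.default_rng(0)
R, K = 4, 3                                  # rank and number of states
w = rng.dirichlet(np.ones(R))                # sum-unit (mixture) weights
factors = [rng.dirichlet(np.ones(K), size=R) for _ in range(3)]  # each R x K

# Circuit view: P(x1, x2, x3) = sum_r w[r] * prod_i factors[i][r, x_i]
def circuit(x1, x2, x3):
    return np.sum(w * factors[0][:, x1] * factors[1][:, x2] * factors[2][:, x3])

# Tensor view: materialize the same CP factorization explicitly.
P = np.einsum('r,ra,rb,rc->abc', w, *factors)
assert np.isclose(circuit(0, 2, 1), P[0, 2, 1])   # the two views coincide
```

Deeper, tensorized circuits then correspond to hierarchical factorizations such as Tucker-style and tensor-train decompositions.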
Reposted by Lorenzo Loconte
nicolabranchini.bsky.social
Interested in estimating posterior predictives in Bayesian inference? Really want to know if your approximate inference "is working"?
Come to our poster at the NeurIPS BDU workshop on Saturday - see TL;DR below.
Reposted by Lorenzo Loconte
samubortolotti.bsky.social
📣 Does your model learn high-quality #concepts, or does it learn a #shortcut?

Test it with our #NeurIPS2024 dataset & benchmark track paper!

rsbench: A Neuro-Symbolic Benchmark Suite for Concept Quality and Reasoning Shortcuts

What's the deal with rsbench? 🧵
Reposted by Lorenzo Loconte
andrewgwils.bsky.social
I wanted to make my first post about a project close to my heart. Linear algebra is an underappreciated foundation for machine learning. Our new framework CoLA (Compositional Linear Algebra) exploits algebraic structure arising from modelling assumptions for significant computational savings! 1/4
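A generic illustration of the kind of saving such structure gives you (plain NumPy, not CoLA's API): if a matrix is known to be a Kronecker product, matrix-vector products never need the dense matrix at all:

```python
import numpy as np

# If A = kron(B, C), then A @ x can be computed as B @ X @ C.T with
# X = x reshaped to a matrix, costing O(mn(m+n)) work instead of
# O((mn)^2) work and memory for the dense mn x mn matrix.
rng = np.random.default_rng(0)
B, C = rng.normal(size=(50, 50)), rng.normal(size=(60, 60))
x = rng.normal(size=50 * 60)

dense = np.kron(B, C) @ x                              # naive: 3000 x 3000 matrix
structured = (B @ x.reshape(50, 60) @ C.T).reshape(-1)  # structure-aware
assert np.allclose(dense, structured)
```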
Reposted by Lorenzo Loconte
nolovedeeplearning.bsky.social
@ropeharz.bsky.social and his pet dinosaur are on bsky!

follow him for #probabilistic #ML content!