Maureen de Seyssel
@maureendeseyssel.bsky.social
690 followers 370 following 20 posts
machine learning researcher @Apple | PhD from @CoML_ENS | speech, ml and cognition.
maureendeseyssel.bsky.social
In Vienna for ACL 2025! 🇦🇹 Come chat about speech, multilinguality, and cognitive modelling: I’ll be at the Apple booth Tuesday 1:30–3:30.
#ACL2025NLP
maureendeseyssel.bsky.social
⚠️ Correction: The tutorial is on August 17th (not September 21)
maureendeseyssel.bsky.social
We’ll show how bridging speech processing and psycholinguistics benefits both fields, and why speech researchers (you!) should care.

Collab. between ENS/Meta (Emmanuel Dupoux), Tampere University (Okko Räsänen) and Apple (me).

Hope to see many of you on September 21!

(2/2)
maureendeseyssel.bsky.social
Now that @interspeech.bsky.social registration is open, time for some shameless promo!

Sign up and join our Interspeech tutorial: Speech Technology Meets Early Language Acquisition: How Interdisciplinary Efforts Benefit Both Fields. 🗣️👶

www.interspeech2025.org/tutorials

⬇️ (1/2)
maureendeseyssel.bsky.social
New preprint out! 👇

We adapt the ABX task, commonly used to evaluate speech models, to investigate how multilingual text models represent form (language) vs content (meaning).

📄 arxiv.org/pdf/2505.17747

🙌 With Jie Chi, Skyler Seto, @maartjeterhoeve.bsky.social, Masha Fedzechkina & Natalie Schluter
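
For readers new to ABX: each trial presents two references A and B and a probe X that shares the tested property (say, language) with A; the model passes if it places X closer to A than to B. A minimal sketch, assuming precomputed embeddings from some multilingual encoder (all names and data here are illustrative, not the paper's code):

```python
# Minimal ABX discrimination sketch (illustrative, not the paper's code).
import numpy as np

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def abx_correct(a, b, x):
    # X and A share the tested property (e.g., same language); B differs.
    return cosine(x, a) > cosine(x, b)

def abx_score(triplets):
    # Fraction of (A, B, X) embedding triplets discriminated correctly.
    return float(np.mean([abx_correct(a, b, x) for a, b, x in triplets]))

# Stand-in random embeddings; in practice these would be sentence embeddings
# from a multilingual text encoder, with triplets built so that A/X match on
# either form (language) or content (meaning) while B mismatches.
rng = np.random.default_rng(0)
triplets = [tuple(rng.normal(size=(3, 768))) for _ in range(1000)]
print(f"ABX accuracy: {abx_score(triplets):.2f}")  # ~0.5 for random vectors
```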
maureendeseyssel.bsky.social
In this work, we investigate whether learning statistical regularities in speech supports word learning. We find that while people can detect these regularities, they rarely remember the words, suggesting a dissociation between pattern learning and memory. 🧠

Check it out!
maureendeseyssel.bsky.social
This work was primarily carried out by Ansgar Endress, who also gave me my first taste of research, of cognition, and even introduced me to programming. Couldn't have asked for a better introduction!
maureendeseyssel.bsky.social
New paper out in Cognition! 🎉
This one actually includes some of my (very) early work: my first-ever research project as an undergrad psychology student, now a decade ago. Nice to see it finally in print!

👉 authors.elsevier.com/sd/article/S...
maureendeseyssel.bsky.social
👀
marvinlavechin.bsky.social
Glad to share this new study comparing the performance and biases of the LENA and ACLEW algorithms in analyzing language environments in Down, Fragile X, Angelman syndromes, and populations at elevated likelihood for autism 👶📄
osf.io/preprints/ps...
🧵1/12
Reposted by Maureen de Seyssel
serge.belongie.com
Would you present your next NeurIPS paper in Europe instead of traveling to San Diego (US) if this were an option? Søren Hauberg (DTU) and I would love to hear the answer through this poll: (1/6)
NeurIPS participation in Europe
We seek to understand if there is interest in being able to attend NeurIPS in Europe, i.e. without travelling to San Diego, US. In the following, assume that it is possible to present accepted papers ...
maureendeseyssel.bsky.social
🥳 A huge thank you to all of our co-authors @hadware.bsky.social, @acristia.bsky.social, @hbredin.bsky.social, Guillaume Wisnewski & Emmanuel Dupoux, and of course to a larger extent to everyone in the CoML team!

🧵7/7
maureendeseyssel.bsky.social
💡 Finally, we aim to illustrate with this research how advanced computational models can simulate broad and realistic language learning and, in doing so, help advance psycholinguistic research.

For a theoretical take, check our other paper ⬇

📄 www.cambridge.org/core/journal...

🧵6/7
Realistic and broad-scope learning simulations: first results and challenges (Journal of Child Language, Volume 50, Issue 6)
maureendeseyssel.bsky.social
📚 We also analyse STELA's representations and find they don't match traditional linguistic categories like phonemes or words.

📈 This indicates that early phonetic and word learning can occur without these categories, which may only emerge later in development.

🧵5/7
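
One illustrative way to probe such a claim (a sketch under assumed inputs, not the paper's actual analysis) is to cluster frame-level representations and measure how well the clusters track gold phoneme labels:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 256))    # stand-in model frame embeddings
phonemes = rng.integers(0, 40, size=5000)  # stand-in gold phoneme labels

# Cluster the representations, then ask how well clusters align with phonemes.
clusters = KMeans(n_clusters=40, n_init=10, random_state=0).fit_predict(features)
ari = adjusted_rand_score(phonemes, clusters)
print(f"Cluster/phoneme agreement (ARI): {ari:.3f}")
# ARI near 0 means the learned space is not organised around phoneme categories.
```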
maureendeseyssel.bsky.social
🔬 After training, we evaluate STELA on a phone discrimination task and a word recognition task.

📊 We find the model replicates the *gradual* and *simultaneous* learning patterns found in infants, suggesting that statistical learning alone can bootstrap these early language patterns.

🧵4/7
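
Word recognition evaluations of this kind are often run spot-the-word style: the model should assign a higher score to a real word than to a matched nonword. A hedged sketch with a toy scorer (illustrative only, not STELA's evaluation code):

```python
import numpy as np

def spot_the_word_accuracy(pairs, score):
    # pairs: (real_word_input, matched_nonword_input); a model "recognises"
    # words if it systematically scores the real member higher.
    return float(np.mean([score(real) > score(fake) for real, fake in pairs]))

# Toy inputs and scorer for illustration only; chance level is 0.5.
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=16), rng.normal(size=16)) for _ in range(200)]
print(spot_the_word_accuracy(pairs, score=lambda x: float(np.sum(x))))
```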
maureendeseyssel.bsky.social
🤖 Here, we isolate statistical learning by developing STELA, a computational model that learns by predicting the near future of speech utterances.

🗣️ STELA is trained on raw speech only, with gradually increasing quantities, equivalent to those an infant would hear.

🧵3/7
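
The thread doesn't spell out the training objective, but "predicting the near future of speech" is commonly implemented as a contrastive predictive (CPC-style) loss; a minimal PyTorch sketch under that assumption (illustrative, not STELA's actual code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FuturePredictor(nn.Module):
    """Encode raw audio into frames, then predict each frame's near future."""
    def __init__(self, dim=256):
        super().__init__()
        self.encoder = nn.Conv1d(1, dim, kernel_size=160, stride=160)  # ~10 ms frames at 16 kHz
        self.context = nn.GRU(dim, dim, batch_first=True)
        self.predict = nn.Linear(dim, dim)

    def forward(self, wav):                                 # wav: (batch, samples)
        z = self.encoder(wav.unsqueeze(1)).transpose(1, 2)  # (batch, T, dim) frame codes
        c, _ = self.context(z)                              # running left-to-right context
        pred = self.predict(c[:, :-1])                      # guess for frame t+1
        target = z[:, 1:]                                   # the actual near future
        # Contrastive loss: the true next frame must outscore the other
        # frames of the same utterance, which act as negatives.
        logits = torch.einsum("btd,bsd->bts", pred, target)
        labels = torch.arange(logits.size(1)).expand(logits.size(0), -1)
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               labels.reshape(-1))

model = FuturePredictor()
loss = model(torch.randn(4, 16000))  # one second of placeholder audio
loss.backward()
```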
maureendeseyssel.bsky.social
👶 We know that infants learn multiple aspects of language - like distinguishing native sounds and recognising words - *gradually* and *simultaneously*.

Yet, it is still difficult to pinpoint which mechanisms are responsible for which aspect of language development.

🧵2/7
maureendeseyssel.bsky.social
Amazing opportunity for researchers with cognitive/psychology/linguistics (or related) backgrounds!

There is an open position to work as an AIML resident with Apple MLR in Copenhagen, with the great @maartjeterhoeve.bsky.social.

Details (topic + deadline) in the 🐦 thread: x.com/maartjeterho...
maureendeseyssel.bsky.social
👀
charlottemagister.bsky.social
Looking for an alternative to RAG for personalization?

With PLUM, a pipeline for teaching LLMs to remember prior user conversations, we aim to enable your future personalization research! Joint work with @maartjeterhoeve.bsky.social, Katherine Metcalf and Yizhe Zhang from my internship at Apple.

🧵
Reposted by Maureen de Seyssel
grzegorz.chrupala.me
I've started putting together a starter pack with people working on Speech Technology and Speech Science: go.bsky.app/BQ7mbkA

(Self-)nominations welcome!