ERCbravenewword
@ercbravenewword.bsky.social
190 followers 320 following 41 posts
Exploring how new words convey novel meanings in the ERC Consolidator project #BraveNewWord 🧠 Unveiling insights into language and cognition 🔍 Join our research journey! https://bravenewword.unimib.it/
Reposted by ERCbravenewword
marcociapparelli.bsky.social
I'm sharing a Colab notebook on using large language models for cognitive science! GitHub repo: github.com/MarcoCiappar...

It's geared toward psychologists & linguists and covers extracting embeddings, computing predictability measures, and comparing models across languages & modalities (vision). See examples 🧵
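For a flavor of what such predictability measures look like, here is a minimal sketch (not from the notebook): surprisal computed from a hypothetical next-word probability distribution, such as one read off a language model's softmax output. The vocabulary and probabilities below are made up for illustration.

```python
import numpy as np

def surprisal(p_word):
    """Surprisal in bits: -log2 P(word | context)."""
    return -np.log2(p_word)

# Hypothetical softmax output over a toy 4-word vocabulary.
vocab = ["dog", "cat", "run", "idea"]
probs = np.array([0.70, 0.20, 0.08, 0.02])
assert np.isclose(probs.sum(), 1.0)

# Less probable continuations are more surprising (higher surprisal).
for w, p in zip(vocab, probs):
    print(f"{w}: {surprisal(p):.2f} bits")
```

In reading-time studies, surprisal values like these (taken from a real model's next-word distribution) are used as word-level predictability predictors.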
Reposted by ERCbravenewword
gboleda.bsky.social
New paper! 🚨 I argue that LLMs represent a synthesis between distributed and symbolic approaches to language, because, when exposed to language, they develop highly symbolic representations and processing mechanisms in addition to distributed ones.
arxiv.org/abs/2502.11856
Sigmoid function. Non-linearities in a neural network allow it to behave in both distributed and near-symbolic fashions.
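As a rough illustration of the figure's point (a sketch, not code from the paper): a sigmoid unit is graded in its mid-range, but saturates toward discrete 0/1 values for large inputs, approximating a symbolic decision.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Mid-range: graded, distributed-style output.
print(sigmoid(0.5))
# Large |x|: output saturates near 0 or 1, a near-symbolic response.
print(sigmoid(10.0))
print(sigmoid(-10.0))
```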
Reposted by ERCbravenewword
ercbravenewword.bsky.social
Great week at #ESLP2025 in Aix-en-Provence! Huge congrats to our colleagues for their excellent talks on computational models, sound symbolism, and multimodal cognition. Proud of the team and the stimulating discussions!
Reposted by ERCbravenewword
caterinavillani.bsky.social
📣The chapter "𝗦𝗽𝗲𝗰𝗶𝗳𝗶𝗰𝗶𝘁𝘆: 𝗠𝗲𝘁𝗿𝗶𝗰 𝗮𝗻𝗱 𝗡𝗼𝗿𝗺𝘀"
w/@mariannabolog.bsky.social is now online & forthcoming in the #ElsevierEncyclopedia of Language & Linguistics
🔍 Theoretical overview, quantification tools, and behavioral evidence on specificity.
👉 Read: urly.it/31c4nm
@abstractionerc.bsky.social
Reposted by ERCbravenewword
valentinapescuma.bsky.social
The dataset includes over 240K fixations and 150K word-level metrics, with saccade, fixation, and (word) interest area reports. Preprint osf.io/preprints/os..., data osf.io/hx2sj/. Work conducted with @davidecrepaldi.bsky.social and Maria Ktori. (2/2)
Reposted by ERCbravenewword
dirkwulff.bsky.social
How can we reduce conceptual clutter in the psychological sciences?

@ruimata.bsky.social and I propose a solution based on a fine-tuned 🤖 LLM (bit.ly/mpnet-pers) and test it for 🎭 personality psychology.

The paper is finally out in @natrevpsych.bsky.social: go.nature.com/4bEaaja
ercbravenewword.bsky.social
For those who couldn't attend, the recording of Abhilasha Kumar's seminar on exploring form-meaning interactions in novel word learning and memory search is now available on our YouTube channel!!

Watch the full presentation here:
www.youtube.com/watch?v=VJTs...
Abhilasha Kumar, Beyond Arbitrariness: How a Word's Shape Influences Learning and Memory
YouTube video by Mbs Vector Space Lab
Reposted by ERCbravenewword
marcociapparelli.bsky.social
Happy to share that our work on semantic composition is out now -- open access -- in Cerebral Cortex!

With Marco Marelli (@ercbravenewword.bsky.social), @wwgraves.bsky.social & @carloreve.bsky.social.

doi.org/10.1093/cerc...
ercbravenewword.bsky.social
Great presentation by @fabiomarson.bsky.social last Saturday at #AMLAP2025! He shared his latest research using EEG to study how we integrate novel semantic representations, “linguistic chimeras”, from context.

Congratulations on a fascinating talk!
ercbravenewword.bsky.social
For those who couldn't attend, the recording of Prof. Harald Baayen's seminar on morphological productivity and the Discriminative Lexicon Model is now available on our YouTube channel.

Watch the full presentation here:
www.youtube.com/watch?v=zN7G...
The Computational Approach to Morphological Productivity | Harald Baayen at Bicocca
ercbravenewword.bsky.social
New seminar announcement!

Exploring form-meaning interactions in novel word learning and memory search
Abhilasha Kumar (Assistant Professor, Bowdoin College)

A fantastic opportunity to delve into how we learn new words and retrieve them from memory.

💻 Join remotely: meet.google.com/pay-qcpv-sbf
ercbravenewword.bsky.social
📢 Upcoming Seminar!

A computational approach to morphological productivity using the Discriminative Lexicon Model
Professor Harald Baayen (University of Tübingen, Germany)

🗓️ September 8, 2025
2:00 PM - 3:30 PM
📍 UniMiB, Room U6-01C, Milan
🔗 Join remotely: meet.google.com/dkj-kzmw-vzt
Reposted by ERCbravenewword
qlu.bsky.social
I’d like to share some slides and code for a “Memory Model 101 workshop” I gave recently, which has some minimal examples to illustrate the Rumelhart network & catastrophic interference :)
slides: shorturl.at/q2iKq
code (with colab support!): github.com/qihongl/demo...
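For a taste of catastrophic interference, here is a minimal sketch (independent of the linked workshop code): a single linear layer trained by gradient descent on one task, then on a second task whose patterns share input features; the second round of training overwrites the shared weights and degrades performance on the first task.

```python
import numpy as np

# Task A and task B share input features (columns), so their gradients
# compete over the same weights.
X_a = np.array([[1., 1., 0., 0.],
                [0., 1., 1., 0.]])
y_a = np.array([1., 1.])
X_b = np.array([[0., 1., 0., 1.],
                [1., 0., 0., 1.]])
y_b = np.array([-1., -1.])

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

def train(X, y, w, lr=0.1, steps=500):
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

w = np.zeros(4)
w = train(X_a, y_a, w)
loss_a_before = mse(X_a, y_a, w)   # near zero: task A is learned
w = train(X_b, y_b, w)             # sequential training on task B...
loss_a_after = mse(X_a, y_a, w)    # ...overwrites shared weights

print(f"task A loss before B: {loss_a_before:.4f}, after B: {loss_a_after:.4f}")
```

Interleaving the two tasks during training (rather than learning them sequentially) is the classic remedy, which is exactly what a Rumelhart-style network cannot do when tasks arrive one after the other.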
hidden state representation during training
ercbravenewword.bsky.social
🎉We're thrilled to welcome Jing Chen, PhD to our team!
She investigates how meanings are encoded and evolve, combining linguistic and computational approaches.
Her work spans diachronic modeling of lexical change in Mandarin and semantic transparency in LLMs.
🔗 research.polyu.edu.hk/en/publicati...
ChiWUG: A Graph-based Evaluation Dataset for Chinese Lexical Semantic Change Detection
Reposted by ERCbravenewword
cyhsieh.bsky.social
1st post here! Excited to share this work with Marelli & @kathyrastle.bsky.social. We found that readers "routinely" combine constituent meanings when computing the meanings of Chinese compounds, despite variability in constituent meaning and word structure, even when they're not asked to. See thread 👇 for more details:
Compositional processing in the recognition of Chinese compounds: Behavioural and computational studies - Psychonomic Bulletin & Review
Recent research has shown that the compositional meaning of a compound is routinely constructed by combining meanings of constituents. However, this body of research has focused primarily on Germanic ...
ercbravenewword.bsky.social
📢 Upcoming Seminar

Words are weird? On the role of lexical ambiguity in language
🗣 Gemma Boleda (Universitat Pompeu Fabra, Spain)
Why is language so ambiguous? Discover how ambiguity balances cognitive simplicity and communicative complexity through large-scale studies.
📍 UniMiB, Room U6-01C, Milan
ercbravenewword.bsky.social
📢 Upcoming Seminar
The Power of Words: The contribution of co-occurrence regularities of word use to the development of semantic organization
🗣 Olivera Savic (BCBL)
How do children grasp deeper word connections beyond simple meanings? Discover how word co-occurrence shapes semantic development
Reposted by ERCbravenewword
pnas.org
One of the most-viewed PNAS articles in the last week is “Is Ockham’s razor losing its edge? New perspectives on the principle of model parsimony.” Explore the article here: www.pnas.org/doi/10.1073/...

For more trending articles, visit ow.ly/Me2U50SkLRZ.
Double descent of prediction error. Degree-one, degree-three, degree-twenty, and degree-one-thousand polynomial regression fits (magenta; from Left to Right) to data generated from a degree-three polynomial function (green). Low prediction error is achieved by both degree-three and degree-one-thousand models.
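The figure's setup can be sketched as follows (an illustrative reconstruction, not the paper's code): polynomials of increasing degree fit to noisy samples of a degree-3 generating function, comparing training and test error. The generating function and noise level below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return x**3 - 2 * x              # degree-3 generating function

x_train = np.linspace(-2, 2, 40)
y_train = true_f(x_train) + rng.normal(scale=0.1, size=x_train.size)
x_test = np.linspace(-1.9, 1.9, 40)
y_test = true_f(x_test) + rng.normal(scale=0.1, size=x_test.size)

test_mse = {}
for deg in (1, 3, 9):
    coefs = np.polyfit(x_train, y_train, deg)
    train_err = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_mse[deg] = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {deg}: train MSE {train_err:.3f}, test MSE {test_mse[deg]:.3f}")
```

The degree-1 model underfits the cubic structure, while the degree-3 model reaches roughly the noise floor; the article's "double descent" observation is that, far beyond this range, very high-capacity models can achieve low prediction error again.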
ercbravenewword.bsky.social
Our @RBonandrini received a "Giovani talenti" award for his studies on word processing.

Congrats Rolando for this achievement! https://x.com/unimib/status/1870046485947265302