Digital Linguistics Lab
@dili-lab.bsky.social
77 followers 75 following 11 posts
DiLi lab at the Department of Computational Linguistics, University of Zurich. 👀🤖📖🧠💬 https://www.cl.uzh.ch/en/research-groups/digital-linguistics.html
Reposted by Digital Linguistics Lab
dili-lab.bsky.social
Late breaking work:

📄 Krakowczyk et al.: The More the Merrier: Boost Your Dataset Visibility and Discover Eye-Tracking Datasets with pymovements

Big congratulations to all the authors! 🎊
dili-lab.bsky.social
Workshop papers:

📄 Brasser et al.: Predicting Children’s Reading Comprehension Through Eye Movements: Insights from Visual Search and Interpretable Machine Learning

📄 Jakobi et al.: MultiplEYE: Creating a multilingual eye-tracking-while-reading corpus
dili-lab.bsky.social
Short papers (2/2):

📄 Bondar et al.: CoLAGaze: A Corpus of Eye Movements for Linguistic Acceptability

📄 Prasse et al.: Detection of Alcohol Inebriation from Eye Movements using Remote and Wearable Eye Trackers
dili-lab.bsky.social
Short papers (1/2):

📄 Jakobi et al.: Neural Additive Models Uncover Predictive Gaze Features in Reading

📄 Reich et al.: Proxy-Based Pre-Training for Eye-Tracking Applications
dili-lab.bsky.social
Long papers:

📄 Reich et al.: Evaluating Gaze Event Detection Algorithms: Impacts on Machine Learning-based Classification and Psycholinguistic Statistical Modeling

📄 Bolliger et al.: ScanDL 2.0: A generative model of eye movements in reading synthesizing scanpaths and fixation durations
dili-lab.bsky.social
Excited to share that our group will present 9 papers at this year's ACM Symposium on Eye Tracking Research & Applications (ETRA) in Tokyo!

We will post summaries of each paper in the coming weeks, but here's a quick sneak peek 👀
dili-lab.bsky.social
At this year's ACL in Vienna, @lenajaeger.bsky.social and David Reich from our group, together with @whylikethis.bsky.social and Omer Shubi, will be hosting a tutorial on Eyetracking and NLP 👀 🖥️ Join us there!

More information can be found here: acl2025-eyetracking-and-nlp.github.io
dili-lab.bsky.social
Keynote Speakers:
🎤 @stefanfrank.bsky.social (Radboud University)
🎤 Vera Demberg (Saarland University)

For detailed guidelines, templates, and additional information, visit our website: lnkd.in/dCuXnxvk

We look forward to your contributions!
dili-lab.bsky.social
The meeting will take place on December 18–19, 2025, in Utrecht, the Netherlands. It aims to connect researchers who use (neuro-)symbolic, Bayesian, deep-learning, connectionist, and mechanistic models (e.g., ACT-R) to study human language production, perception, and processing.
dili-lab.bsky.social
Hello Bluesky world! This is our very first post from the Digital Linguistics (DiLi) team — and we’re kicking things off with exciting news:

🖥️ 🧠 We are pleased to announce that the abstract submission for the first Computational Psycholinguistics Meeting 2025 is now open!