Thomas Nowotny
@drtnowotny.bsky.social
630 followers 160 following 36 posts
Professor of Informatics at the University of Sussex, Brighton, UK. President of @cnsorg.bsky.social I do research in bio-inspired AI and computational neuroscience. See more at https://profiles.sussex.ac.uk/p206151-thomas-nowotny/about
drtnowotny.bsky.social
Excellent paper and great new project. Very interesting to see how the technology race between GPUs and FPGAs shapes up.
neworderofjamie.bsky.social
Belated exciting news! First paper from my new(ish) project with the very talented Zainab Aizaz is out at ieeexplore.ieee.org/document/110.... Results from the first prototype of our fully-programmable FPGA-based SNN accelerator, showing off its stochastic rounding capabilities and speediness.
FeNN: A RISC-V vector processor for Spiking Neural Network acceleration
Spiking Neural Networks (SNNs) have the potential to drastically reduce the energy requirements of AI systems. However, mainstream accelerators like GPUs and TPUs are designed for the high arithmetic ...
ieeexplore.ieee.org
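The stochastic rounding that FeNN shows off is a standard trick for low-precision fixed-point arithmetic: instead of always rounding to the nearest representable value, round up with probability equal to the fractional remainder, so the quantisation error is zero in expectation. A minimal NumPy sketch of the idea (illustrative only, not FeNN's hardware implementation):

```python
import numpy as np

def stochastic_round(x, frac_bits, rng):
    """Round values onto a fixed-point grid with 2**-frac_bits resolution.

    Each value is rounded up with probability equal to its fractional
    remainder, so the rounding is unbiased in expectation."""
    scale = 2 ** frac_bits
    scaled = x * scale
    floor = np.floor(scaled)
    frac = scaled - floor
    rounded = floor + (rng.random(x.shape) < frac)
    return rounded / scale

rng = np.random.default_rng(42)
x = np.full(100_000, 0.3)
sr = stochastic_round(x, 4, rng)  # grid step is 1/16 = 0.0625
# 0.3 lies between 4/16 and 5/16; round-to-nearest would always give
# 5/16 = 0.3125, but the stochastic mean stays close to 0.3
assert abs(sr.mean() - 0.3) < 1e-3
```

This unbiasedness is what makes stochastic rounding attractive for training at low precision: small gradient updates that round-to-nearest would silently discard still move the weights on average.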
drtnowotny.bsky.social
Strong results on Speech Commands with GLE presented by Paul Haider at #cns2025florence today. @cnsorg.bsky.social
drtnowotny.bsky.social
And we are off - excited that #cns2025florence is under way with record attendance - first keynote starts in 5 minutes.
Reposted by Thomas Nowotny
mmaravall.bsky.social
Posting this after some recent conversations with potential international applicants - still time to apply to our Masters courses and International PhD Academy for 2025 entry - join the diverse and vibrant Neuroscience community on our beautiful campus next to Brighton
sussexneuro.bsky.social
Want to join our vibrant postgrad community?

Discover all our Neuroscience courses here: www.sussex.ac.uk/study/subjects/neuroscience

Including:
🏫 International PhD Academy
🏫 Masters courses
🏫 Opportunities across several schools: Life Science; Psychology; Computer Science & AI; and BSMS
Neuroscience : University of Sussex
Discover our undergraduate courses, Masters and PhD degrees in neuroscience.
www.sussex.ac.uk
drtnowotny.bsky.social
... and you can see the great lineup of tutorials and workshops here: ocns.memberclicks.net/cns-2025-mee...
Of course, tutorial and workshop registration can also be added later if you have already registered for the main meeting. @cnsorg.bsky.social
drtnowotny.bsky.social
The deadline is tomorrow - last push! @cnsorg.bsky.social
drtnowotny.bsky.social
Only 5 days to go to the (extended) deadline. Make them count. A great lineup of keynotes, tutorials and workshops is secured - add your research as an oral or poster by submitting an abstract.
incforg.bsky.social
Abstract submission is open for the 34th Annual Computational Neuroscience Meeting CNS*2025 in Florence, Italy, July 5-9, 2025!

Submit your abstract latest March 15th: www.cnsorg.org/cns-2...
Reposted by Thomas Nowotny
neworderofjamie.bsky.social
@drtnowotny.bsky.social and I are again participating in Google Summer of Code under the @incforg.bsky.social. We have 3 paid projects involving GeNN for contributors with a range of skills and experience levels. If you're interested, please get in touch via the forums linked from the thread:
Reposted by Thomas Nowotny
cnsorg.bsky.social
Call for Abstracts: 34th Annual Computational Neuroscience Meeting, CNS*2025 🌟

📍 Join us in Florence 🇮🇹 July 5-9, 2025! 🧠✨

🗓️ Abstract Deadline: March 11, 2025
🔗 Submit here: www.cnsorg.org/cns-2025-abs...

📚 Workshops: www.cnsorg.org/cns-2025-cal...
🎓 Tutorials: www.cnsorg.org/cns-2025-cal...
CNS 2025 Abstract Submission
www.cnsorg.org
Reposted by Thomas Nowotny
neworderofjamie.bsky.social
This is awesome work from Balazs! Not only does our Eventprop-based method support multiple spikes per neuron and recurrent connectivity, but it also uses less than half the memory of the current state-of-the-art delay-learning method and is up to 26x faster.
Reposted by Thomas Nowotny
tdverstynen.bsky.social
Wikipedia is one of the last major bastions of verified information. Which, of course, is why the oligarchs want to destroy it.

You can donate to them here.

donate.wikimedia.org/w/index.php?...
drtnowotny.bsky.social
We also classified the Spiking Speech Commands (SSC) dataset with good success. Finally, the GeNN implementation of Eventprop has very beneficial computational scaling properties compared to BPTT in Spyx (github.com/kmheckel/spyx). All details at doi.org/10.1088/2634.... @sussexai.bsky.social 5/5
Figure displaying the results of computational benchmarking of the algorithm. A) GPU memory use is plotted versus the number of timesteps in the simulation. We compared GeNN versus Spyx, for 256 hidden neurons and 1024 hidden neurons. The graph shows a linear increase of memory requirements for Spyx but essentially flat lines for GeNN. B) A similar graph for the training time as a function of timesteps. All curves increase, but the increase for Spyx is much steeper than for GeNN. C) A bar graph comparing the training time for different elements of the network as a function of timesteps. The most time is spent on the synapses, followed by neuron simulation and other components; compile time is negligible. All times increase roughly linearly with the number of timesteps and somewhat sub-linearly with the number of hidden neurons.
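The flat memory curves for GeNN follow from how the two methods store state: BPTT caches the hidden activations of every timestep for the backward pass, while Eventprop's adjoint dynamics only need the recorded spike times, whose count is set by firing rates and trial duration rather than by the timestep grid. A back-of-envelope sketch (the counts below are assumed for illustration, not measurements from the paper):

```python
def bptt_mem_floats(n_hidden, n_timesteps):
    # BPTT caches every hidden activation for the backward pass,
    # so memory grows linearly with the number of timesteps
    return n_hidden * n_timesteps

def eventprop_mem_floats(n_spikes):
    # Eventprop's backward pass only needs the stored spike times
    return n_spikes

# Doubling the number of timesteps (e.g. halving dt for a fixed-length
# trial) doubles BPTT's cache but leaves the spike count roughly unchanged:
assert bptt_mem_floats(256, 2000) == 2 * bptt_mem_floats(256, 1000)
assert eventprop_mem_floats(5_000) == eventprop_mem_floats(5_000)
```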
drtnowotny.bsky.social
We extended Eventprop to a wider class of loss functions and found that, with a ‘loss-shaping’ term, we could achieve fast and reliable learning. Combining this with 3 data augmentations, we obtained a SOTA SHD classification accuracy of 93.5±0.7% (n=8) on the test set after rigorous validation. 4/5
Bar graph showing the performance of our algorithm in classifying the Spiking Heidelberg Digits data set with four different variations: plain, with delay line input, with delay and shift, and with delay, shift and blend. The performance increases in this order. Bars are shown for all four conditions for three hidden layer sizes. The performance is best for the largest hidden layer but the differences are not very large.
drtnowotny.bsky.social
This initially failed due to the average cross-entropy loss creating unhelpful gradients in the hidden layer, and the fact that spike creation and deletion are not “visible” in the exact gradients calculated by Eventprop. 3/5
drtnowotny.bsky.social
We implemented Eventprop (Wunderlich & Pehle, 2021) in our GeNN (genn-team.github.io) simulator and attempted to classify the Spiking Heidelberg Digits (SHD, zenkelab.org/resources/sp...) in a 3-layer network. 2/5
GeNN · GeNN by genn-team GeNN by genn-team
genn-team.github.io
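For readers unfamiliar with the model class being trained here: the layers consist of leaky integrate-and-fire (LIF) neurons, which integrate weighted input spikes into a decaying membrane potential and emit a spike on threshold crossing. A plain-NumPy forward pass for one layer (a generic sketch with illustrative parameter values, not the GeNN API):

```python
import numpy as np

def lif_forward(inp_spikes, w, tau=20.0, v_th=1.0, dt=1.0):
    """Minimal leaky integrate-and-fire layer (plain NumPy, not GeNN).

    inp_spikes: (T, n_in) binary spike raster
    w:          (n_in, n_out) weight matrix
    Returns the (T, n_out) output spike raster."""
    alpha = np.exp(-dt / tau)  # membrane decay factor per timestep
    T = inp_spikes.shape[0]
    n_out = w.shape[1]
    v = np.zeros(n_out)
    out = np.zeros((T, n_out))
    for t in range(T):
        v = alpha * v + inp_spikes[t] @ w   # leak + synaptic input
        out[t] = v >= v_th                  # threshold crossing -> spike
        v = np.where(out[t] > 0, 0.0, v)    # reset membrane after a spike
    return out
```

The hard part, which Eventprop addresses, is that the threshold nonlinearity above has no useful derivative; Eventprop instead computes exact gradients by integrating adjoint dynamics between the recorded spike times.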
drtnowotny.bsky.social
Just out in J Neuromorph Comput & Eng: Loss shaping enhances exact gradient learning with Eventprop in Spiking Neural Networks doi.org/10.1088/2634.... With @neworderofjamie.bsky.social.
We report how Eventprop scales to harder learning tasks.
TL;DR: It works great but not without extra effort. 🧪🧵
Loss shaping enhances exact gradient learning with Eventprop in spiking neural networks - IOPscience
Loss shaping enhances exact gradient learning with Eventprop in spiking neural networks, Nowotny, Thomas, Turner, James P, Knight, James C
doi.org
Reposted by Thomas Nowotny
dpmoriarity.bsky.social
"A study of federally funded research projects in the United States estimated that principal investigators spend on average about 45% of their time on administrative activities related to applying for and managing projects rather than conducting active research"

www.pnas.org/doi/10.1073/...
drtnowotny.bsky.social
We are proposing PhD project ideas along these lines:
Event-based machine learning and neuromorphic computing (EP25/53)
EPSRC category: Interdisciplinary Research
Lead Supervisors: James Knight and Thomas Nowotny
Department: Informatics

Research outline: Event-based Spiking Neural Networks (SNNs) are inspired by the efficiency of biological neurons, and with recent advances such as Eventprop (Wunderlich & Pehle (2021), Nowotny et al. (2024)) we can train them using supervised learning. However, there are technical difficulties and the methods have thus far only been applied to a few benchmark problems. Equally important, as in ML more generally, lack of labelled data is becoming a problem. Self-supervised approaches are an exciting and competitive alternative (Chen et al. (2020), Illinger et al. (2020), Halvagal & Zenke (2023)).

We are looking for PhD students interested in working on these challenges using our GPU-accelerated SNN simulation framework (Knight et al. (2021), Knight & Nowotny (2023)) to:
• further improve methods for gradient descent in SNNs
• extend gradient descent methods towards real-world problems
• combine SNNs and self-supervised learning to solve real-world tasks
• train SNNs for deployment on neuromorphic hardware like Loihi 2

Key Publications:
Nowotny, T, Turner, JP & Knight, JC (2024). Loss shaping enhances exact gradient learning with EventProp in spiking neural networks, arXiv. (link)
Knight, JC & Nowotny, T (2023). Easy and efficient spike-based machine learning with mlGeNN. Neuro-Inspired Computational Elements Conference, 115–120. (link)
Reposted by Thomas Nowotny
kakape.bsky.social
More than 60 German-speaking universities and research institutes have just jointly announced they will cease activities on X because “the current orientation of the platform is not compatible with their core values”, incl. scientific integrity, transparency and democratic discourse 🧪
drtnowotny.bsky.social
Hi Manu,
I am a Professor of Informatics and work in Computational Neuroscience and neuromorphic computing. I would like to contribute to the Science feed. My staff page is profiles.sussex.ac.uk/p206151-thom... and Google Scholar scholar.google.com/citations?us...
University of Sussex
profiles.sussex.ac.uk
drtnowotny.bsky.social
New paper on spike sorting www.sciencedirect.com/science/arti...
with Lydia Ellison, Georg Raiser, Alicia Garrido Peña and George Kemenes. TL;DR: SSSort software now handles overlapping spikes in addition to the extreme spike shape changes it was already good at. github.com/grg2rsr/SSSort
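For context on why overlapping spikes are hard: when two units fire close together their waveforms sum, so matching a snippet against single templates misassigns it. A common generic remedy is greedy template subtraction, sketched below in NumPy (illustrative only; SSSort's actual algorithm, and its handling of changing spike shapes, is described in the paper):

```python
import numpy as np

def resolve_overlap(snippet, templates, max_units=2, min_amp=0.5):
    """Greedy template subtraction (generic sketch, not SSSort's method).

    Repeatedly find the template with the largest normalised projection
    onto the residual and subtract it, so two overlapping spikes can
    both be assigned to their units."""
    residual = np.asarray(snippet, dtype=float).copy()
    found = []
    for _ in range(max_units):
        # Least-squares amplitude of each template against the residual
        amps = [residual @ t / (t @ t) for t in templates]
        best = int(np.argmax(np.abs(amps)))
        if abs(amps[best]) < min_amp:
            break  # nothing explains enough of the residual
        residual -= amps[best] * templates[best]
        found.append(best)
    return found, residual
```

For example, a snippet that is the sum of two unit templates is decomposed into both units, with a near-zero residual left over.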
Reposted by Thomas Nowotny
neworderofjamie.bsky.social
First paper by our excellent PhD student @mbalazs98.bsky.social on trying to unpick the relation between delay learning and structural plasticity in SNNs and visualise what gets learned www.frontiersin.org/articles/10....
Frontiers | Learning delays through gradients and structure: emergence of spatiotemporal patterns in spiking neural networks
www.frontiersin.org
Reposted by Thomas Nowotny
tyrellturing.bsky.social
💯

Hallucination is totally the wrong word, implying it is perceiving the world incorrectly.

But it's generating false, plausible sounding statements. Confabulation is literally the perfect word.

So, let's all please start referring to any junk that an LLM makes up as "confabulations".
cianodonnell.bsky.social
petition to change the word describing ChatGPT's mistakes from 'hallucinations' to 'confabulations'

A hallucination is a false subjective sensory experience. ChatGPT doesn't have experiences!

It's just making up plausible-sounding bs, covering knowledge gaps. That's confabulation