Günter Klambauer
gklambauer.bsky.social
Deep Learning researcher | professor for Artificial Intelligence in Life Sciences | inventor of self-normalizing neural networks | ELLIS program Director
Drug discovery benchmarks reveal a surprising result: the original 2017 SELU networks still dominate toxicity prediction (Tox21).
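For readers new to self-normalizing networks: SELU is a scaled ELU whose two constants are chosen so that activations are pushed toward zero mean and unit variance. A minimal NumPy sketch of the activation itself:

```python
import numpy as np

# SELU constants from Klambauer et al. (2017), chosen so the
# zero-mean / unit-variance point is an attracting fixed point.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """Scaled exponential linear unit."""
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

x = np.array([-1.0, 0.0, 1.0])
y = selu(x)  # negative inputs saturate toward -SCALE * ALPHA
```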
P: arxiv.org/abs/2511.14744
Measuring AI Progress in Drug Discovery: A Reproducible Leaderboard for the Tox21 Challenge
Deep learning's rise since the early 2010s has transformed fields like computer vision and natural language processing and strongly influenced biomedical research. For drug discovery specifically, a k...
arxiv.org
December 9, 2025 at 1:56 PM
GCN variants (e.g., GyralNet) increasingly replace ReLU with SELU for better convergence and robustness in noisy or shallow graph settings.
P: arxiv.org/abs/2503.19823
GyralNet Subnetwork Partitioning via Differentiable Spectral Modularity Optimization
Understanding the structural and functional organization of the human brain requires a detailed examination of cortical folding patterns, among which the three-hinge gyrus (3HG) has been identified as...
arxiv.org
December 9, 2025 at 1:56 PM
SELU made a comeback in RL: PPO-based systems like Pearl report smoother, more stable actor–critic updates when batchnorm is avoided.
P: arxiv.org/abs/2506.01880
Pearl: Automatic Code Optimization Using Deep Reinforcement Learning
Compilers are crucial in optimizing programs and accelerating their execution. However, optimizing programs automatically using compilers is not trivial. Recent work has attempted to use reinforcement...
arxiv.org
December 9, 2025 at 1:56 PM
The SELU transformer has resurfaced in NLP/tabular domains: TabTranSELU even drops SwiGLU entirely.
P: doi.org/10.54254/275...
TabTranSELU: A transformer adaptation for solving tabular data
Tabular data are most prevalent datasets in real world, yet the integration of deep learning algorithms in tabular data often garners less attention despite their widespread utilization in other field...
doi.org
December 9, 2025 at 1:56 PM
FlowState continued the trend: SELU activations bring stability to large-scale time-series models.
P: arxiv.org/abs/2508.05287
FlowState: Sampling Rate Invariant Time Series Forecasting
Foundation models (FMs) have transformed natural language processing, but their success has not yet translated to time series forecasting. Existing time series foundation models (TSFMs), often based o...
arxiv.org
December 9, 2025 at 1:56 PM
Time-series foundation models (FIM/FIM-ℓ, FlowState) all quietly adopt SELU, and TiRex topped the GIFT Eval leaderboard this year.
P: openreview.net/forum?id=NPS...
Zero-shot Imputation with Foundation Inference Models for Dynamical...
Dynamical systems governed by ordinary differential equations (ODEs) serve as models for a vast number of natural and social phenomena. In this work, we offer a fresh perspective on the classical...
openreview.net
December 9, 2025 at 1:56 PM
Tiny SELU MLPs (2–3 layers, width 64) became the standard in Conditional Flow Matching, thanks to smoother derivative behavior than ReLU.
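What such a network looks like, as a hedged NumPy sketch (shapes and the time-conditioning here are illustrative, not taken from the paper): a 2-hidden-layer, width-64 SELU MLP used as a flow-matching vector field v(x, t), with LeCun-normal initialization as SNNs assume.

```python
import numpy as np

ALPHA, SCALE = 1.6732632423543772, 1.0507009873554805

def selu(x):
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

def lecun_normal(fan_in, fan_out, rng):
    # LeCun-normal init: std = 1/sqrt(fan_in), the init SELU expects.
    return rng.normal(0.0, 1.0 / np.sqrt(fan_in), size=(fan_in, fan_out))

def tiny_selu_mlp(x, t, params):
    """Illustrative vector field v(x, t): two SELU hidden layers, width 64."""
    h = np.concatenate([x, t], axis=-1)  # condition on time t
    for W in params[:-1]:
        h = selu(h @ W)
    return h @ params[-1]                # linear output head

rng = np.random.default_rng(0)
dim, width = 2, 64
params = [lecun_normal(dim + 1, width, rng),
          lecun_normal(width, width, rng),
          lecun_normal(width, dim, rng)]

x = rng.normal(size=(8, dim))
t = rng.uniform(size=(8, 1))
v = tiny_selu_mlp(x, t, params)  # shape (8, 2)
```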
P: arxiv.org/abs/2302.00482
Improving and generalizing flow-based generative models with minibatch optimal transport
Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have been held back by limitations in their simulation-based maximum likelihood training. We introduce the...
arxiv.org
December 9, 2025 at 1:56 PM
Normalization-Free Transformers rediscover controlled signal propagation — a core idea behind SNNs.
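The self-normalizing property is easy to check empirically: with LeCun-normal weights, activations of a deep SELU stack stay near zero mean and unit variance without any normalization layer. A small sketch (illustrative, not from the paper):

```python
import numpy as np

ALPHA, SCALE = 1.6732632423543772, 1.0507009873554805
selu = lambda x: SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

rng = np.random.default_rng(42)
width, depth = 512, 32
h = rng.normal(size=(4096, width))  # standard-normal inputs

for _ in range(depth):
    # LeCun-normal weights: std = 1/sqrt(fan_in)
    W = rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
    h = selu(h @ W)

# After 32 layers, activations remain near the (mean 0, var 1)
# fixed point with no batchnorm/layernorm anywhere.
mean, var = float(h.mean()), float(h.var())
```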
P: arxiv.org/abs/2503.10622
Transformers without Normalization
Normalization layers are ubiquitous in modern neural networks and have long been considered essential. This work demonstrates that Transformers without normalization can achieve the same or better per...
arxiv.org
December 9, 2025 at 1:56 PM
Yes, ECFP plus some other descriptors... these should be in the model cards and the paper.
November 19, 2025 at 7:59 AM
You have access to the model predictions via API call.

Just type:

curl -X POST https://ml-jku-tox21-gin-classifier.hf.space/predict -H "Content-Type: application/json" -d '{"smiles": ["CCO", "c1ccccc1"]}'

in your shell.
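Or equivalently from Python, a sketch using only the standard library (endpoint and JSON body are the ones from the curl command above; the response format is not shown here):

```python
import json
import urllib.request

# Same endpoint and payload as the curl command above.
url = "https://ml-jku-tox21-gin-classifier.hf.space/predict"
payload = json.dumps({"smiles": ["CCO", "c1ccccc1"]}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```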
November 19, 2025 at 6:52 AM
Indeed.. :)
November 14, 2025 at 12:32 PM
How many errors do you find in the chemical structure above?
November 13, 2025 at 10:11 AM