Thomas F. Varley
thosvarley.bsky.social
Dual PhD: Complex Systems & Computational Neuroscience - Postdoc at UVM in the Vermont Complex Systems Institute.

Information theory, synergy, and emergence.

Connoisseur of collapse phenomena
For some reason I thought Emperor Penguins were like 6 feet tall. Learning that they're just a meter high is a bit disappointing.
January 30, 2026 at 11:46 PM
I don't disagree with the career implications. I'm just annoyed by people claiming that what they really care about is highfalutin principles about "art" writ-large when they're really just worried about their own careers.
Those are legitimate worries on their own - why dress it up?
January 30, 2026 at 2:09 PM
Has this kind of hyperfocus on "critique" always been a feature of American/progressive activism? Or is it new? It feels very academia-brained, like the best possible thing you can do is find new and creative ways to articulate "critiques" of how things are actually problematic.
January 22, 2026 at 4:19 PM
So far, when I've used this code for internal, day-to-day analyses at work, speed has never been an issue (although I use the Gaussian estimators the most, which are naturally very efficient - I haven't really stress-tested the KNN estimators as much).
January 19, 2026 at 4:32 PM
The neural estimators can almost certainly be improved w/ GPU integration (I just don't have one, so I never spent time with that). I'm not super familiar with Numba, so I don't know how much extra juice it could squeeze out.
January 19, 2026 at 4:31 PM
I do vectorization and hand off to NumPy/SciPy wherever possible for major bottlenecks (matrix operations, KNN network inference). There's some potential for further optimization if you can get parallelism over loops (esp. for the higher-order info measures), but Python makes that hard.
January 19, 2026 at 4:31 PM
I have no personal interest in Julia or MATLAB, but I do have Opus 4.5 and might try and see if it can automate the porting/translation process, just to see if it works.
January 19, 2026 at 3:42 PM
I am more than happy to continue expanding this package. If there's something you'd like to see, feel free to make a request, or roll it yourself. 6/N
January 19, 2026 at 3:30 PM
The goal is not to supplant other packages (e.g. JIDT/IDTxl are still my go-tos for information dynamics), but my goal was a pure-python package (i.e. no MATLAB, ever) to make everything as accessible as possible. 5/N
January 19, 2026 at 3:30 PM
I have classic measures (entropy, MI, CMI, etc), modern multivariate measures (O/S-information, TC, DTC, etc), and three different kinds of information decomposition (PID, PED, GID).
As far as I know, this is the most complete package on the market atm. 4/N
January 19, 2026 at 3:30 PM
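(For a sense of what a measure like the O-information involves in the Gaussian case: it reduces to a handful of covariance log-determinants, via the standard identity Ω(X) = (n−2)·h(X) + Σᵢ [h(Xᵢ) − h(X₋ᵢ)]. This is just a textbook sketch in plain NumPy, not the package's actual API.)

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a Gaussian with covariance matrix cov."""
    n = cov.shape[0]
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def o_information_gaussian(cov):
    """O-information of a Gaussian system, computed from its covariance matrix
    via Omega(X) = (n - 2) * h(X) + sum_i [ h(X_i) - h(X_{-i}) ]."""
    n = cov.shape[0]
    omega = (n - 2) * gaussian_entropy(cov)
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        omega += gaussian_entropy(cov[np.ix_([i], [i])])    # marginal entropy h(X_i)
        omega -= gaussian_entropy(cov[np.ix_(rest, rest)])  # leave-one-out entropy h(X_{-i})
    return omega
```

Positive Ω indicates redundancy-dominated interactions, negative indicates synergy-dominated; for independent variables (identity covariance) it is exactly zero.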
I've included estimators for different data types: discrete estimators, covariance-based estimators for Gaussian data, and two different non-parametric estimators (normalizing flow-based and KNN-based). Also mixed, for data with discrete and continuous components. 3/N
January 19, 2026 at 3:30 PM
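(The covariance-based Gaussian estimators mentioned above are essentially log-determinant formulas - e.g. mutual information is I(X;Y) = ½ ln[det(Σ_X) det(Σ_Y) / det(Σ_XY)]. A minimal sketch of that formula in plain NumPy, not the package's actual API:)

```python
import numpy as np

def gaussian_mi(X, Y):
    """Mutual information (in nats) between the columns of X and the columns of Y,
    assuming the joint distribution is Gaussian. X, Y: (samples, dims) arrays."""
    dx = X.shape[1]
    cov = np.cov(np.hstack([X, Y]), rowvar=False)
    # I(X;Y) = 0.5 * ln( det(Sigma_X) * det(Sigma_Y) / det(Sigma_joint) )
    log_det_x = np.linalg.slogdet(cov[:dx, :dx])[1]
    log_det_y = np.linalg.slogdet(cov[dx:, dx:])[1]
    log_det_joint = np.linalg.slogdet(cov)[1]
    return 0.5 * (log_det_x + log_det_y - log_det_joint)
```

For two unit-variance Gaussians with correlation r this converges to −½ ln(1 − r²), and the whole computation is one covariance matrix plus three determinants - which is why the Gaussian estimators are so fast relative to KNN or neural ones.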
This package grew out of my work in graduate school and my postdoc - inspired by the networkx package, I wanted a one-stop-shop for discrete and continuous multivariate information theory, all in Python for maximum accessibility. Years later, here it is. 2/N
January 19, 2026 at 3:30 PM
The way (some) professors will use rhetoric around "the environment" as a cudgel against AI and then thoughtlessly hop on a multi-hour flight to present a 15-minute talk (and let's be real, socialize with their friends from grad school) is so disheartening.
January 18, 2026 at 2:17 PM
This blows my mind living in MA all the time. The whole state is enthusiastically pushing out a whole generation of young professionals because it's impossible to afford to live here! Happily sending all that future growth (and tax revenue) to sunbelt states.
January 16, 2026 at 9:08 PM
Very excited to live in a world where everyone has some form of dementia because we've outsourced all of the cognitive demands of daily living to the hungry ghosts in our phones.
January 16, 2026 at 8:27 PM
I think they did the "H" in Northampton a bit too hard. We usually say "North-ampton". Ditto for Easthampton.
January 16, 2026 at 5:13 PM