Automatic Differentiation, Explainable AI and #JuliaLang.
Open source person: adrianhill.de/projects
neurips.cc/virtual/2025...
Joint work with Neal McKee, Johannes Maeß, Stefan Blücher and Klaus-Robert Müller, @bifold.berlin.
Unfortunately, this convolution is computationally infeasible in high dimensions. A naive Monte Carlo approximation of it yields the popular SmoothGrad method.
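The Monte Carlo approximation can be sketched in a few lines: sample Gaussian perturbations of the input and average the resulting gradients. This is a minimal illustration, not the paper's implementation; `grad_fn`, `sigma`, and `n_samples` are assumed names, and `grad_fn` stands in for whatever computes the model's input gradient.

```python
import numpy as np

def smoothgrad(grad_fn, x, sigma=0.1, n_samples=50, rng=None):
    """Monte Carlo estimate of the Gaussian-smoothed gradient at x.

    Averages grad_fn over inputs perturbed with N(0, sigma^2) noise,
    which is the core of the SmoothGrad method.
    """
    rng = np.random.default_rng(rng)
    grads = [
        grad_fn(x + rng.normal(0.0, sigma, size=x.shape))
        for _ in range(n_samples)
    ]
    return np.mean(grads, axis=0)
```

For a linear model the gradient is constant, so the smoothed gradient coincides with the plain one; the averaging only matters for the noisy gradients of deep networks.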
Unfortunately, gradients of deep NNs resemble white noise, rendering them uninformative:
However, if I had to create these diagrams from scratch, I think I would prioritize Typst's error messages and almost-instant compilation over TikZ's more concise syntax.
github.com/cetz-package...
#set par(justify: true, justification-limits: (
spacing: (min: 66.67% + 0pt, max: 150% + 0pt),
tracking: (min: -0.01em, max: 0.02em),
))