Élie Michel
@elie-michel.bsky.social
420 followers 230 following 110 posts
Research Scientist at #Adobe. PhD in Computer Graphics. Author of #LearnWebGPU C++. Creative Coding. Indie game. VFX. Opinions are my own. Writes in 🇫🇷 🇺🇸. https://portfolio.exppad.com https://twitter.com/exppad
elie-michel.bsky.social
It's not easy for sure, it's a chicken-and-egg problem, which is why our message is at least as much for reviewers as it is for authors! Nobody is to blame individually, it's just something we should collectively discuss.
elie-michel.bsky.social
I agree, but part of the problem is that what the "average PC in 5-6 years" looks like may indirectly depend on what we do in research. If we only test on this hardware, it will indeed naturally become the hardware people use. :)
Reposted by Élie Michel
emxtyu.bsky.social
Ever wondered how badly we're all addicted to buying new GPUs in graphics labs?

Come see our talk at #SIGGRAPH2025 to discuss how we can collectively move "Towards a sustainable use of GPUs in Graphics Research"

with @elie-michel.bsky.social @axelparis.bsky.social Octave Crespel and Felix Hähnlein
MOTIVATION
Graphics Processing Units (GPUs) are at the core of Computer Graphics research. These chips are critical for rendering images, processing geometric data, and training machine learning models. Yet, the production and disposal of GPUs emit CO2 and result in toxic e-waste [1].

METHOD
We surveyed 888 papers presented at SIGGRAPH (the premier conference for computer graphics research) from 2018 to 2024 and systematically gathered the GPU models cited in the text.

We then contextualized the hardware reported in these papers using publicly available data on consumer hardware [2, 3].
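As a rough illustration of what "systematically gathering GPU models cited in the text" can look like in practice (a toy sketch, not the authors' actual pipeline; the pattern and model names are made up for the example):

```cpp
// Toy sketch: scan a paper's plain text for NVIDIA GPU model mentions.
#include <iostream>
#include <regex>
#include <string>
#include <vector>

std::vector<std::string> findGpuMentions(const std::string& paperText) {
    // Hypothetical pattern covering common model names (RTX 3090, GTX 1080 Ti, V100, A100, ...)
    static const std::regex gpuPattern(
        R"((RTX|GTX|Quadro|Titan|Tesla)\s*[A-Z]?\d{3,4}(\s*Ti)?|[AVH]100)",
        std::regex::icase);
    std::vector<std::string> mentions;
    for (auto it = std::sregex_iterator(paperText.begin(), paperText.end(), gpuPattern);
         it != std::sregex_iterator(); ++it) {
        mentions.push_back(it->str());
    }
    return mentions;
}

int main() {
    std::string text = "All timings were measured on an NVIDIA RTX 3090; training used 8 V100 GPUs.";
    for (const auto& m : findGpuMentions(text)) std::cout << m << "\n";
}
```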

REFERENCES
[1] Crawford, Kate. The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press, 2021.
[2] Steam. Steam Hardware Survey. https://store.steampowered.com/hwsurvey
[3] Blender. Blender Open Data. https://opendata.blender.org
elie-michel.bsky.social
Note that 'naga' is the equivalent tool developed by Firefox. It can easily be installed using cargo (the Rust package manager and build tool): github.com/gfx-rs/wgpu/...
Left: WGSL, GLSL and SPIR-V are possible inputs of naga. Right: MSL, GLSL, HLSL, SPIR-V and WGSL are possible outputs of naga.
elie-michel.bsky.social
'tint' is the shader compiler developed by Chrome to implement #WebGPU. It has a nice command line interface, but so far there is no official build out...

Wait no more! I'm sharing precompiled binaries of the tint CLI here: github.com/eliemichel/d...
Left: WGSL and SPIR-V are possible inputs of tint. Right: MSL, GLSL, HLSL, SPIR-V and WGSL are possible outputs of tint.
elie-michel.bsky.social
Important notes:
🔹This rewrite is *WIP*, refer to the main section (w/o "next" in the URL) for further chapters.
🔹This only works with Dawn for now because it is closer to what WebGPU v1.0 will be.
🔹The accompanying code "stepXXX-next" is not up to date yet.
elie-michel.bsky.social
I've just realized something. It makes much more sense to have the "hello triangle" pointing upside down when learning #WebGPU!

👉 The ongoing "Next" rewrite of my guide reached the Hello Triangle chapter 🥳 eliemichel.github.io/LearnWebGPU/...
Screenshot of the Hello Triangle chapter
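For readers who haven't written one yet, here is roughly what a hard-coded "hello triangle" looks like in WGSL, embedded as a C++ string the way shader source is typically handed to wgpuDeviceCreateShaderModule (a generic sketch, not the guide's actual listing). Since WebGPU clip space has y pointing up, negating the y coordinates is all it takes to make the triangle point downwards:

```cpp
// Generic hello-triangle shader source (WGSL), stored as a C++ string.
// Flipping the sign of the y components turns the upward-pointing triangle
// into a downward-pointing one.
const char* shaderSource = R"(
@vertex
fn vs_main(@builtin(vertex_index) i: u32) -> @builtin(position) vec4f {
    var p = array<vec2f, 3>(
        vec2f(-0.5, -0.5),
        vec2f( 0.5, -0.5),
        vec2f( 0.0,  0.5)
    );
    return vec4f(p[i], 0.0, 1.0);
}

@fragment
fn fs_main() -> @location(0) vec4f {
    return vec4f(1.0, 0.5, 0.0, 1.0); // orange
}
)";
```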
elie-michel.bsky.social
🏅Honored to have been awarded at #Eurographics25 for our paper on #LipschitzPruning to speed-up SDF rendering!

👉 The paper's page: wbrbr.org/publications...

Congrats to @wbrbr.bsky.social, M. Sanchez, @axelparis.bsky.social, T. Lambert, @tamyboubekeur.bsky.social, M. Paulin and T. Thonat!
Wilhem receiving the award on stage
elie-michel.bsky.social
Nice writeup from @mattkeeter.com inspired by our recent work on #LipschitzPruning!
mattkeeter.com
new blog post: "Gradients are the new intervals"

www.mattkeeter.com/blog/2025-05...

If you've got Lipschitz-continuous distance fields, you can use single-point samples to do tricks that normally require interval arithmetic – like hierarchical evaluation and expression simplification!
The words "hello, world" rendered as a signed distance field, with nicely uniform field lines radiating out from their boundaries.
elie-michel.bsky.social
PS: I'll be at #Eurographics next week, feel free to get in touch!
elie-michel.bsky.social
New update post about the 🚧 Ongoing work! 🚧 in my LearnWebGPU C++ guide!

On patreon: www.patreon.com/posts/ongoin...
On Discord: discord.gg/2Tar4Kt564

Outline:
🔹 The LearnWebGPU guide
🔹 WebGPU-distribution
🔹 RenderDoc
🔹 WebGPU-C++
🔹 WebGPU spec
🔹 Dawn
🔹 wgpu-native
🔹 GLFW and SDL
🔹 Slang x WebGPU
Screenshot of https://eliemichel.github.io/LearnWebGPU
Screenshot of my custom version of RenderDoc
Screenshot of the README of Slang x WebGPU
Reposted by Élie Michel
wbrbr.bsky.social
I am proud to announce our Eurographics 2025 paper "Lipschitz Pruning: Hierarchical Simplification of Primitive-Based SDFs"! With Mathieu Sanchez (joint first author), @axelparis.bluesky.social, @elie-michel.bsky.social, Thibaud Lambert, @tamyboubekeur.bsky.social, Mathias Paulin and Théo Thonat.
Left: an input CSG tree and a much smaller pruned tree computed using our method.
Right: a rendered scene showing the number of active nodes per cell. Our method reduces the active nodes to less than 20 from the initial 6023 nodes of the input tree.
elie-michel.bsky.social
Starting to track down the usage of #WebGPU resources during a frame in my custom #RenderDoc driver!

(Don't mind the usage field, it's a placeholder value for now)
elie-michel.bsky.social
Yeah I totally had this issue as well ^^ I can add warnings indeed!
elie-michel.bsky.social
I don't think there is such a toggle, but that would indeed be useful here!
elie-michel.bsky.social
Starting to nest events in the #WebGPU driver for #RenderDoc. How do you think I should handle these "WriteBuffer" calls that occur while encoding a "RenderPass"?

Because chronologically they are submitted before the render pass, even though the API calls occur after.
WebGPU events in RenderDoc, with nesting of what happens between BeginRenderPass and RenderPassEnd
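For context, here is a hedged C++ sketch of the situation using the webgpu.h C API (setup, descriptors and the actual draw omitted; not the driver code itself). The wgpuQueueWriteBuffer call appears between the render-pass encoding calls in the source, but it is a queue-timeline operation, so it takes effect before the later wgpuQueueSubmit, i.e. before the render pass actually runs:

```cpp
#include <webgpu/webgpu.h>

// Sketch: a buffer write issued "inside" the encoding of a render pass.
void encodeFrame(WGPUDevice device, WGPUQueue queue, WGPUBuffer uniforms,
                 const WGPURenderPassDescriptor* renderPassDesc) {
    WGPUCommandEncoder encoder = wgpuDeviceCreateCommandEncoder(device, nullptr);
    WGPURenderPassEncoder pass = wgpuCommandEncoderBeginRenderPass(encoder, renderPassDesc);

    // The API call happens while the render pass is being encoded...
    float time = 42.0f;
    wgpuQueueWriteBuffer(queue, uniforms, 0, &time, sizeof(time));
    // ...but the write is ordered before the submit below, hence before the pass executes.

    wgpuRenderPassEncoderEnd(pass);
    wgpuRenderPassEncoderRelease(pass);

    WGPUCommandBuffer commands = wgpuCommandEncoderFinish(encoder, nullptr);
    wgpuCommandEncoderRelease(encoder);

    wgpuQueueSubmit(queue, 1, &commands);
    wgpuCommandBufferRelease(commands);
}
```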
elie-michel.bsky.social
Close-up on the captured WebGPU API calls
elie-michel.bsky.social
Who would be interested in a version of #RenderDoc that captures and replays calls to the #WebGPU API (rather than calls to the underlying DirectX/Vulkan/Metal API)?

This is an early test that only lists the API calls, but it's already promising! Will share when usable.
A screenshot of RenderDoc where one can see calls to the WebGPU API
elie-michel.bsky.social
It's very nice that you intend to talk about this with your students 🙏 Don't forget that fossil fuels are not the only issue; rare earth materials (and the rate at which we renew hardware) and water consumption are huge problems too!
elie-michel.bsky.social
A very compelling sign that something isn't going right is the divergence between compute providers' announced energy-reduction plans and reality.

The data comes from the providers' own annual reports (which they agreed to publish a couple of years back, transparency FTW), links ⤵️
elie-michel.bsky.social
This report from Wells Fargo is also very informative (and worrisome) and points to many interesting sources: www.wellsfargoadvisors.com/research-ana...
elie-michel.bsky.social
The most commonly cited source about the growing energy needs of compute is this IEA report: iea.blob.core.windows.net/assets/6b2fd... (screenshot of p. 31)
Global electricity demand from data centres could double towards 2026