mrale.ph
@mrale.ph
Uber TL of @dart.dev
Of course it would be welcome, though I think that's not necessarily where the low-hanging fruit lies. I think (a) making sure that developers can work with denser memory layouts and (b) making sure that developers have good raw SIMD APIs are probably way more important.
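For (b), `dart:typed_data` already ships a portable SIMD type, `Float32x4`. As an illustrative sketch (the function name `simdSum` is mine, not an official API; it also assumes the list starts at offset 0 in its buffer), a hand-vectorized sum might look like:

```dart
import 'dart:typed_data';

/// Sums a Float32List four lanes at a time using Dart's portable SIMD type.
double simdSum(Float32List data) {
  // View the same buffer as groups of four floats.
  final chunks = Float32x4List.view(data.buffer, 0, data.length ~/ 4);
  var acc = Float32x4.zero();
  for (var i = 0; i < chunks.length; i++) {
    acc += chunks[i]; // four additions per iteration
  }
  var sum = acc.x + acc.y + acc.z + acc.w;
  // Handle the scalar tail that does not fill a full group of four.
  for (var i = chunks.length * 4; i < data.length; i++) {
    sum += data[i];
  }
  return sum;
}
```

The same pattern carries over to `Float64x2` and `Int32x4`.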
June 2, 2025 at 7:08 AM
Dart VM does not do any vectorization - so the best way to ensure it happens is to do it by hand :)

Other than that it is on a case-by-case basis and requires looking at the generated code before and after.
May 31, 2025 at 4:32 PM
(Also, our compilers are focused on code size, so we avoid any optimizations that expand code size.)
May 31, 2025 at 8:34 AM
Currently we never unroll loops in any of our compilers.

These days unrolling is usually only beneficial if it enables other optimizations across loop iterations, because the increase in code size can hurt more than removing a well-predicted branch helps.
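As an illustrative sketch of the "enables other optimizations" case (the function name is mine): unrolling a sum by two with separate accumulators breaks the serial dependency chain between additions, which is a cross-iteration win rather than mere branch removal:

```dart
/// Unrolled-by-two sum. The two accumulators let the additions in a single
/// iteration proceed independently instead of forming one long serial chain.
double sumUnrolled(List<double> xs) {
  var a = 0.0, b = 0.0;
  var i = 0;
  for (; i + 1 < xs.length; i += 2) {
    a += xs[i];
    b += xs[i + 1];
  }
  if (i < xs.length) a += xs[i]; // odd-length tail
  return a + b;
}
```

(Note that reassociating floating-point additions like this can change the result for values with very different magnitudes, which is one reason compilers are reluctant to do it automatically.)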
May 31, 2025 at 8:34 AM
throw is an expression of type Never, so you can write

a ? throw B : c;

among other things
February 24, 2025 at 8:52 PM
(I want to stress here that function* is not the same as a single function of the Dart language - it is the function + all inlined stuff. E.g. if you write something like `Vector(1, 2).add(Vector(2, 3))`, the compiler can still eliminate `Vector(1, 2)` and `Vector(2, 3)` if `add` is inlined.)
February 8, 2025 at 8:00 PM
I think bsky.app/profile/mral... answers it. It has to be created and used and must not escape the confines of a function*.
It occurs if the compiler can figure out that it can do it. The compiler can do it if some object is fully confined to the function* the way the compiler sees it - this is not the same as a Dart function, it is the function plus all functions inlined into it.
February 8, 2025 at 7:58 PM
Speaking of collections: "some" should probably have been "any". What I was trying to say is that if you do something like

var v = Vector();
var l = [];
l.add(v);
for (var u in l) { print(u.x); }

then even though v does not really escape the confines of the function, it can't "explode" into its fields.
February 8, 2025 at 7:57 PM
There is no documentation but you can read articles like en.wikipedia.org/wiki/Escape_... and chrisseaton.com/truffleruby/... to get the idea of the theory behind it.
February 8, 2025 at 7:57 PM
But it is hard to answer this question in the abstract. It would be easier if you gave a bit more detail on what you are trying to optimize.
February 7, 2025 at 7:55 AM
In general if you want to optimize for cache locality you need to eliminate indirections and allocate things close to each other. So you need to pack your data into typed arrays.
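A minimal sketch of what that looks like in practice (the `Points` class is hypothetical): a struct-of-arrays layout over `Float64List`, instead of a `List<Vector>` of separately heap-allocated objects reached through a pointer indirection each:

```dart
import 'dart:typed_data';

/// Struct-of-arrays layout: coordinates live in flat, contiguous typed
/// arrays, so iterating over one field is a sequential scan of one buffer.
class Points {
  final Float64List xs;
  final Float64List ys;

  Points(int n)
      : xs = Float64List(n),
        ys = Float64List(n);

  double sumX() {
    var s = 0.0;
    for (var i = 0; i < xs.length; i++) {
      s += xs[i]; // no per-element object, no indirection
    }
    return s;
  }
}
```

Compare with `List<Point>`, where each element is its own heap object and a loop over it chases a pointer per element.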
February 7, 2025 at 7:55 AM
However, if you only use an object within a function (including all functions the compiler managed to inline into that function), and you don't put it into any collections but use it directly - then such an object will "explode": the compiler will eliminate the object and turn its fields into local variables.
February 7, 2025 at 7:53 AM
I assume the question is about Dart VM, not JS or Wasm.

In technical terms: there is no stack allocation, but there is scalar replacement of aggregates. In simple terms: if you have an object and you pass it to another function (which is not inlined), that object has to be heap allocated.
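A hedged sketch of the distinction (class and function names are mine, and whether the allocations actually disappear depends on the VM's inlining decisions):

```dart
/// Small value-like class; `+` and the field reads are easy to inline.
class Vector {
  final double x, y;
  Vector(this.x, this.y);
  Vector operator +(Vector o) => Vector(x + o.x, y + o.y);
}

/// `v` never leaves this function, so once `+` is inlined the compiler can
/// replace both Vector allocations with plain local doubles
/// (scalar replacement of aggregates).
double norm2(double a, double b) {
  final v = Vector(a, b) + Vector(1.0, 2.0); // allocations can vanish
  return v.x * v.x + v.y * v.y;
}

/// By contrast, passing `v` to a non-inlined function (or storing it in a
/// collection) makes it escape, forcing a real heap allocation.
```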
February 7, 2025 at 7:53 AM
What is your biggest gripe with build_runner? Would you be happy if it ran 10x faster and the code did not have all the pesky `$` and other boilerplate nonsense?
January 29, 2025 at 10:45 PM
Would you prefer to wait a couple more years for macros to ship? What specific use case were you betting on macros for?

We are hoping to address most gripes (e.g. data classes, serialization, build_runner slowness) on a much shorter time frame than what macros would have taken us.
January 29, 2025 at 10:24 PM
I think this must mean that the analyzer is busy with other tasks for some reason. I will return to this in the new year after the holidays.
December 20, 2024 at 12:08 PM
Can you also post details of the slow requests by clicking on them? They should have a breakdown of what took time. You can obscure names (but there should not be many private names there).
December 20, 2024 at 10:38 AM
When you hit long requests, could you open the Analysis Server diagnostic page and post the information about request latency from there?
December 19, 2024 at 2:39 PM
Is there a bug / more details for this?
December 17, 2024 at 11:20 PM