Jacobo Tarrío
@jacobo.tarrio.org
I write software for fun and for profit.
Radio amateur (AE2IT and EA1ITI).
He/him/爸爸 ("dad").
:-)
December 2, 2025 at 11:14 PM
:-)
December 2, 2025 at 11:02 PM
No, I saw that you were an engagement farmer so I deleted my contribution to your bullshit.

You are still an idiot.
December 2, 2025 at 10:56 PM
Not military, but I imagine they see the bus coming and they might be wondering if they'll feel a shove in their backs as it passes by.
December 2, 2025 at 2:40 AM
They're the same length, but the width might be different. Something to check when they arrive.
December 1, 2025 at 10:39 AM
I too was sorely disappointed after moving from Ireland, but then a few years later, my now-wife introduced me to hot water dispensers.

You can see one in pretty much every Asian restaurant and cafe because they dispense water at the appropriate temperature for tea, at the touch of a button.
December 1, 2025 at 1:40 AM
For the PLL, yes, my version was slightly slower. I guess it's faster (and definitely easier) to compute sin/cos a few times than to manage a few complex numbers in Cartesian form, multiply them together, and normalize them to modulus 1 when necessary.
November 30, 2025 at 2:23 AM
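The two approaches being compared can be sketched like this. A minimal sketch with made-up names (`nextByTrig` and `nextByRotation` are mine, for illustration), not the actual PLL code:

```javascript
// Approach 1: recompute sin/cos from the accumulated phase on every sample.
function nextByTrig(phase, step) {
  const p = phase + step;
  return { phase: p, re: Math.cos(p), im: Math.sin(p) };
}

// Approach 2: keep a unit phasor in Cartesian form and rotate it by a
// precomputed complex multiplier, renormalizing to modulus 1 so rounding
// error doesn't make it drift.
function nextByRotation(ph, rot) {
  const re = ph.re * rot.re - ph.im * rot.im;
  const im = ph.re * rot.im + ph.im * rot.re;
  const m = Math.hypot(re, im); // very close to 1, but never exactly
  return { re: re / m, im: im / m };
}
```

Both produce the same oscillator; the question in the post is which one the CPU runs faster.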
It looks like the atan2 issue was a benchmarking artifact. Turns out my version was still 3x faster than atan2. But when I was doing research I found another version that's just as fast but like 1000x more precise than mine, so I nicked it. I mean, I adapted it.
November 30, 2025 at 2:23 AM
Maybe I should build some self-contained benchmarks and run them on every computer and phone, modern or old, I have access to.
November 29, 2025 at 2:47 PM
My FM demodulator had this atan2 implementation I came up with, which was like twice as fast as Math.atan2 back then.

Well, the latest benchmark says it's like 1/3 as fast on my M3 Mac.

So either some of my benchmarks are wrong, or the M3 is that much better at trigonometry, or modern CPUs in general are.
November 29, 2025 at 2:45 PM
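The post doesn't include the implementation itself, so here's a generic sketch of how fast atan2 replacements of this kind usually work: fold the inputs into one octant, approximate atan with a short polynomial, then unfold. The coefficients below are a standard pair from the DSP literature, not the author's:

```javascript
// Generic fast atan2 approximation (max error on the order of 0.002 rad).
// Not the author's version; a common polynomial-plus-octant-folding sketch.
function fastAtan2(y, x) {
  if (x === 0 && y === 0) return 0;
  const ax = Math.abs(x), ay = Math.abs(y);
  // z in [0, 1]: ratio of the smaller magnitude to the larger.
  const z = ax > ay ? ay / ax : ax / ay;
  // Polynomial approximation of atan(z) on [0, 1].
  let a = z * (Math.PI / 4) - z * (z - 1) * (0.2447 + 0.0663 * z);
  // Undo the octant folding.
  if (ay > ax) a = Math.PI / 2 - a;
  if (x < 0) a = Math.PI - a;
  if (y < 0) a = -a;
  return a;
}
```

The appeal is that it's all multiplies and adds, no table lookups; whether it beats the hardware-backed Math.atan2 depends on the CPU, which is exactly what the benchmarks above are probing.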
I bet it's a case of "modern CPUs and GPUs are optimized for floating point because everybody wants to do local AI" and "doing one slowish operation on one number might be faster than doing six fast operations on two numbers."

And maybe "I have to review my data accesses to optimize the cache."
November 29, 2025 at 2:38 PM
The good thing about living in Galicia is that there's never any damp :-)
November 28, 2025 at 10:51 PM
Not the iron that goes inside, though. When it rusts, it expands!
November 28, 2025 at 9:36 PM
And yeah, Negroponte had a (likely syndicated) column in a Spanish magazine, and it was always a variation of "airlines will gather all data about you, know you absolutely need to take that flight, and then they are gonna price-gouge you specifically". An idea he seemed to admire a touch too much.
November 28, 2025 at 7:30 PM
For the @ sign, I distinctly remember a TV show where the presenters started reading their newfangled email address just to stop hard in befuddlement at that weird "a inside a circle" sign.

(I already knew its name from my typing lessons, but I had never seen it outside of a typewriter.)
November 28, 2025 at 7:27 PM
Myself, I like O(log(n)), which means that the run-time grows with the number of digits of the size of the input. So much faster than O(n)!
November 27, 2025 at 8:17 PM
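A minimal sketch of what that looks like in practice (the `steps` counter is mine, just for illustration): binary search halves the remaining range on every comparison, so over 1024 elements it never needs more than 11 steps, which is the number of binary digits of 1024.

```javascript
// O(log n): each iteration halves the search range, so the step count
// grows with the number of (binary) digits of the input size.
function binarySearch(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  let steps = 0;
  while (lo <= hi) {
    steps++;
    const mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return { index: mid, steps };
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return { index: -1, steps };
}
```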
But, in the end, whenever someone says "O(1)" they mean the algorithm takes the same time no matter the size of its input. O(n) means the time it takes grows at the same rate as the input. O(n^2) means double the input takes four times as long. And O(2^n) means "choose a different algorithm."
November 27, 2025 at 8:17 PM
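Those growth rates are easy to see with toy step-count formulas (hypothetical counts, purely for illustration):

```javascript
// How the step count changes when the input size doubles from n to 2n.
const n = 10;
const linear = (k) => k;           // O(n)
const quadratic = (k) => k * k;    // O(n^2)
const exponential = (k) => 2 ** k; // O(2^n)

console.log(linear(2 * n) / linear(n));           // 2: twice the work
console.log(quadratic(2 * n) / quadratic(n));     // 4: four times the work
console.log(exponential(2 * n) / exponential(n)); // 1024 = 2^n times the work
```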
And there are a lot of well-known algorithms whose complexities you should know off the top of your head, because most other algorithms are a composite of these.
November 27, 2025 at 8:17 PM
For example, if there is a loop on a number of elements, then it's going to be O(n) on the number of elements. If there is a loop nested inside another, then it's going to be O(n^2).

If an algorithm has a sequence of parts with different complexities, you keep only the largest one.
November 27, 2025 at 8:17 PM
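That shortcut can be made concrete with a toy step counter (illustrative only): a single loop contributes n steps, the nested pair contributes n², and the sum n + n² belongs to O(n²) because you keep only the largest part.

```javascript
// Count steps explicitly: one O(n) loop followed by an O(n^2) nested pair.
function countSteps(n) {
  let steps = 0;
  for (let i = 0; i < n; i++) steps++; // O(n) part
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) steps++; // O(n^2) part dominates
  }
  return steps; // n + n^2, which is O(n^2)
}
// countSteps(10) === 110
```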
So when they ask you "what's the complexity of this algorithm?" they are asking you to figure out the "shape" of the formula to compute the number of steps it takes to complete for a given input size.

We don't always need to do maths to do this. Sometimes we can take shortcuts.
November 27, 2025 at 8:17 PM
And the algorithm that took 10+10*4^n steps has a run-time that belongs to O(4^n).

Now, when we talk about algorithms we're often lazy, so we end up saying "this algorithm is O(n)" and such, which is not entirely correct for a mathematician, but it's too late now so they have to suck it up.
November 27, 2025 at 8:17 PM
So O(f(n)) is the set of all functions that, for large values of "n", return values smaller than C*f(n), as long as you're allowed to pick a C large enough.

So what about your algorithm that took 10+40*n or 5+70*n steps depending on how it was described? The formula for its run-time belongs to O(n) in either case.
November 27, 2025 at 8:17 PM
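That claim is easy to check mechanically from the definition: with C = 50, C·n dominates 10 + 40·n for every n ≥ 1 (and C = 80 works for 5 + 70·n). A minimal sketch:

```javascript
// Checking the Big-O bound from the definition: pick a constant C and
// verify C*n stays above the step-count formula for all n >= 1.
const steps1 = (n) => 10 + 40 * n; // one description of the run-time
const steps2 = (n) => 5 + 70 * n;  // the other description

for (let n = 1; n <= 100000; n++) {
  if (steps1(n) > 50 * n) throw new Error(`C=50 fails at n=${n}`);
  if (steps2(n) > 80 * n) throw new Error(`C=80 fails at n=${n}`);
}
// No error thrown: both formulas belong to O(n).
```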