Frogge
@froggge.bsky.social
Furthermore, OpenAI delenda est!
Not the comically large check too!
December 20, 2025 at 5:29 PM
Most non-GenAI datacenters don't belch diesel generator fumes because of power demands the local infrastructure is nowhere near prepared to meet, for starters.
December 17, 2025 at 9:37 AM
Mfw the exploded rock didn't unexplode on the way to the ground
December 15, 2025 at 1:12 PM
I don't think the extended warranty covers that.
December 13, 2025 at 6:19 PM
Nonono, it's "Science Process(tm)", much more trademarkable!
December 13, 2025 at 12:51 AM
@dieworkwear.bsky.social I think this is about you
December 4, 2025 at 2:24 AM
@hakeem-jeffries.bsky.social this is the man you just publicly praised. Neither your wealth nor your platitudes will protect you from him.
December 4, 2025 at 12:49 AM
They legitimately believe they're *fighting* children's book villains. Joanne herself compared Trans people and those fighting for their rights to Death Eaters in the nonsense "Witch Trials" podcast she was on. If one earnestly believes that, essentially any extremity can be justified. It's insane.
December 3, 2025 at 10:25 AM
@gork.it is this nonsense?
November 26, 2025 at 11:19 PM
Lmfao if I have to be functionally *deceived* into using it, it doesn't seem nearly as useful as you keep claiming!
November 26, 2025 at 4:30 AM
Working in data isn't what tells you that you shouldn't stare at the sun, silly goose! Nor is an LLM! Hallucinations are so core to LLM technology that you simply cannot make meaningful and responsible use of anything the things spit out!
November 26, 2025 at 2:19 AM
"I diagnose you with holes-in-your-eyes-from-staring-at-the-sun"
'What could have possibly caused this?!'
"..."
November 25, 2025 at 8:32 AM
That's very good! Now, how do you know this is a hallucination?
November 25, 2025 at 2:08 AM
Ah yes, truly a technological marvel!
November 24, 2025 at 4:59 AM
Funny how you cited it as "concise" and "well-written" with "sources", but you made no mention of actually following those sources. The LLM could have invented an entirely new concept in the subject you know nothing about, and your intellectually lazy self would have no idea.
November 24, 2025 at 3:35 AM
These things are true, but I'd wager we would be hard pressed to find a microscope that routinely talks its users into suicide!
November 24, 2025 at 3:32 AM
Let me know when an LLM can spit out a pancake recipe without guidance from the prompter or stealing one wholesale from a recipe blog.
November 24, 2025 at 3:27 AM