she literally gives the example of scooby doo villains as evil and incompetent in the tweet! like how is this a hard concept if you can already come up with examples!
December 23, 2025 at 11:46 PM
i’m enjoying it in kind of a yuri on ice way lmao. you think this is about two gay ice sport people? jokes on you, it’s actually ALL OF THEM WHO ARE GAY
December 17, 2025 at 3:49 PM
as someone who went to their rival school that loses all the time i endorse this lmao. they got too comfortable winning all the time and now they think they deserve the spot based on vibes
December 7, 2025 at 5:18 PM
in my experience, when my car slows down bc of someone in front, all the cars behind me will go around me before i’m able to speed back up and pass the car in front, which annoys the crap out of me lol. i’d rather maintain constant speed and brake when i actually need to, not when my car says to
October 12, 2025 at 5:26 PM
i mean, to me it’s like saying a math formula “knows” its solution. the equation itself doesn’t know anything, because it isn’t “thinking.” it’s just following rules and presenting a conclusion. LLMs are particularly bad at math bc there’s not a bank of “correct” answers, it’s just predictive text
October 3, 2025 at 2:09 PM
agree! LLMs aren’t capable of shit other than a predictive algorithm, like they’re just generating the next most likely set of words. they can’t know things, they can’t feel things, they can’t even do math! if you’re gonna die on an AI hill, don’t make it LLMs lmao. ML is so much more than chatGPT
October 3, 2025 at 2:03 PM
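To make the “generating the next most likely set of words” point concrete, here is a minimal sketch, not any real model’s code: a toy bigram predictor over a made-up corpus that counts which word tends to follow which and greedily emits the most frequent follower. Actual LLMs use neural networks over subword tokens, but the generation loop has the same shape: score candidate continuations, pick the next token, repeat.

```python
# Toy sketch of next-word prediction (assumed toy corpus, not a real LLM).
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows which -- a stand-in for a learned model's scores.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def generate(start, steps=6):
    words = [start]
    for _ in range(steps):
        candidates = followers.get(words[-1])
        if not candidates:
            break  # nothing ever followed this word in the corpus
        # Greedily take the most frequent follower: the "next most likely word".
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(generate("the"))  # e.g. "the cat sat on ..."
```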