Boxo McFoxo
@boxomcfoxo.bsky.social
Fox with a boxy muzz. Mid-30s. Scottish. 🔞 MINORS DNI 🔞

I'm meant to be gay but my orientation play kink messed that up uwu
see, look
October 22, 2025 at 2:05 AM
It is better than LLMs because it is at least trying to be accurate, while an LLM is not. You can't iteratively next-token-predict your way to accuracy.
October 17, 2025 at 10:34 AM
Also, even if they did start respecting robots.txt, the problem is that most of the time someone reads the AI slop version instead of the original content, it's via Google. Their scraping for Web indexing and for model training is the same scrape.
October 17, 2025 at 10:27 AM
A very technical definition would be that bisexuality is a significant level of both androphilia and gynephilia, significant not necessarily meaning high.
October 16, 2025 at 12:11 PM
Someone can be bisexual and have a low libido, or they can be bisexual and have a high libido but their attraction doesn't manifest as the physical act of sex itself, or they can be actively abstaining from sex for any reason, or they can be passively abstaining by just not seeking it out.
October 16, 2025 at 12:11 PM
Sexual orientation is about which sex you are sexually attracted to, not about what you do or feel like doing. So yes.

Your whole sexuality is bigger than just your orientation.
October 16, 2025 at 12:11 PM
Up and down
October 15, 2025 at 5:38 AM
Mister wolf are you sure you haven't mistaken me for another fox?
October 15, 2025 at 12:09 AM
Tone change is a valid use case for LLMs I suppose, but you have to be careful. Don't assume it will be easy to spot a hallucination putting words in your mouth.
October 10, 2025 at 9:12 PM
Some of the harms from AI come from well meaning people being unaware of the true nature of the technology.

This one does not.
October 10, 2025 at 9:09 PM
Why did you need an LLM to make a SAR? There is a form for it on the ICO's website.

Unless you are saying you needed help from the LLM with what to put in the form in which case... try cognitively offloading a little less. Or a lot less.
October 10, 2025 at 6:27 PM
Matches don't lie to people by telling them they're having a conversation with them.
October 10, 2025 at 6:22 PM
The thing about citations is they only really work properly when the person doing it has in good faith been trying to be correct. A manipulative person can very convincingly cite deceptively. LLMs have no intent to deceive but they are also not trying to be correct.
October 10, 2025 at 1:34 PM
It's hard enough for an expert in one of these areas to tell whether the citation is correctly used by the LLM, because 'hallucinations' can be so fluent. How do you expect a non expert to know just from following the link?
October 10, 2025 at 1:32 PM
If they did include inline URLs how often would a non expert be able to tell that they had been referenced correctly anyway? LLMs will cite a scientific paper and say it supports the absolute opposite conclusion. Even if the paper is real, the citation is not.
October 10, 2025 at 12:12 PM
Marc isn't on Bluesky and I am not going to post on Twitter, so I emailed it to his assistant, which was a reasonable course of action. It was very important that he was made aware of this.
October 10, 2025 at 3:40 AM
The "get fucked" at the end though
October 10, 2025 at 2:15 AM
"Humans can have faulty reasoning, so let's make things better with a system that has no reasoning at all" - you, being ridiculous
October 4, 2025 at 5:19 PM