Charles P
@stllegend.me
1K followers
1.7K following
420 posts
I am large, I contain multitudes.
Early 40s #stlmade MarTech Pro who’s into beatmaking and combat sports.
stllegendmusic.com for music and creative things
These are just my opinions, not representative of employers or affiliations, past or present.
Pinned
Charles P
@stllegend.me
· Nov 14
Reposted by Charles P
Terra Fied 🎃🏳️⚧️👻
@rainofterra.gay
· May 28
Charles P
@stllegend.me
· 9d
We learned the term "praise kink" from @catieosaurus.bsky.social, and it's not restricted to any one gender; no matter your partner's gender, the occasional "good job" will do wonders in bed or the dungeon.
https://f.mtr.cool/rfxzjsffkh
Charles P
@stllegend.me
· 23d
Those who are 35+, what advice do you have for people just entering their 30s?
Reposted by Charles P
Tim Eby
@timjeby.bsky.social
· Aug 20
Reposted by Charles P
The Guardian
@theguardian.com
· Aug 18
Chatbot given power to close ‘distressing’ chats to protect its ‘welfare’
Anthropic found that Claude Opus 4 was averse to harmful tasks, such as providing sexual content involving minors
The makers of a leading artificial intelligence tool are letting it close down potentially “distressing” conversations with users, citing the need to safeguard the AI’s “welfare” amid ongoing uncertainty about the burgeoning technology’s moral status.
Anthropic, whose advanced chatbots are used by millions of people, discovered its Claude Opus 4 tool was averse to carrying out harmful tasks for its human masters, such as providing sexual content involving minors or information to enable large-scale violence or terrorism.