Ian Miers
@secparam.bsky.social
510 followers 71 following 55 posts
UMD CS Prof. Security and applied cryptography.
secparam.bsky.social
Problem is once users have digital IDs, demands will shift. Instead of 'are you 18?', it becomes: prove you're human, prove you're not banned, prove you live here. Then you need programmable identity. Private IDs are just a start, as we looked at here.
eprint.iacr.org/2022/878
zk-creds: Flexible Anonymous Credentials from zkSNARKs and Existing Identity Infrastructure
Frequently, users on the web need to show that they are, for example, not a robot, old enough to access an age restricted video, or eligible to download an ebook from their local public library withou...
eprint.iacr.org
secparam.bsky.social
How do we do better?
Well, a simple solution to this particular problem is zk-proofs. Instead of giving Discord your ID, you prove you have one. We did some preliminary work on this in 2023, and Google is rolling out a version of zk proofs of IDs.But basic proofs aren't enough.
secparam.bsky.social
What's worse but predictable? Attackers get both IDs and messages. Every conversation you've ever had, every dumb comment or like, attached to your legal name and address. There's no evidence it happened here, but it will happen soon. We need better approaches to identity.
secparam.bsky.social
Discord user IDs getting leaked is the entirely predictable consequence of requiring platforms to do age verification. That data never goes away, it spreads. In this case, into appeals in a breached customer support database. And predictably, it can get worse. www.404media.co/the-discord-...
The Discord Hack is Every User’s Worst Nightmare
A hack impacting Discord’s age verification process shows in stark terms the risk of tech companies collecting users’ ID documents. Now the hackers are posting peoples’ IDs and other sensitive informa...
www.404media.co
secparam.bsky.social
The worst part of preparing a tenure portfolio is realizing you actually have to create that 'permanent record' your elementary school teachers threatened you with.
And it has pesky formatting requirements.
secparam.bsky.social
Isn't it worse than that? If your professional account is marked ChatControl-exempt, isn't that a giant gaping red flag telling adversaries to go look at the personal accounts of you, your spouse, anyone you might be having an affair with or owe money to?
secparam.bsky.social
Best cover for a stego system.
secparam.bsky.social
There's a very niche case where
1) you succeed at building the quantum computer
2) crypto does migrate to PQ
3) you can still sell recovery services on non-migrated addresses
4) those addresses don't get robbed by others, and FUD from competing PQ-secure chains doesn't claim they already were
secparam.bsky.social
What's the value of recovering X% of crypto, discounted by: legal risk it's deemed theft, the chance crypto migrates to PQ-resistant algorithms first, and the risk that BTC/ETH prices collapse the moment everyone realizes the same quantum tech makes ALL legacy crypto vulnerable?
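The discounts in the post above can be written out as a back-of-envelope expected-value model. Every number below is a made-up placeholder for illustration, not an estimate of anything real, and treating the risks as independent probabilities is a simplification.

```python
# Hedged back-of-envelope EV model for "quantum crypto recovery".
# All inputs are hypothetical placeholders.

def recovery_ev(total_value, frac_recoverable,
                p_not_theft, p_beat_pq_migration, p_price_survives):
    """Expected value = value at stake, discounted by each risk,
    treated here as independent probabilities (a simplification)."""
    return (total_value * frac_recoverable
            * p_not_theft * p_beat_pq_migration * p_price_survives)

ev = recovery_ev(
    total_value=100e9,         # hypothetical: $100B in vulnerable addresses
    frac_recoverable=0.10,     # the X% actually recoverable
    p_not_theft=0.5,           # legal risk it's deemed theft
    p_beat_pq_migration=0.3,   # chance crypto hasn't migrated to PQ first
    p_price_survives=0.2,      # chance prices don't collapse on the news
)
print(f"${ev / 1e9:.1f}B")  # 100e9 * 0.10 * 0.5 * 0.3 * 0.2 = $0.3B
```

The multiplicative structure is the point: even generous headline numbers collapse fast once each discount is applied.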
secparam.bsky.social
If true, this says more about VC funding fads than cryptography. It highlights how hard it is to find valuable applications that classical computers can't approximate well enough. And I have questions for the junior deal partner who modeled the ROI for pq crypto "recovery."
secparam.bsky.social
Interesting anecdote from a friend: quantum computing startups are now raising funds by pitching their ability to break cryptocurrency encryption (n=1 plus VC gossip, but still). Apparently other applications like quantum chemistry don't offer big enough ROI for investors.
secparam.bsky.social
Some "AI" on my phone is reading inbound Signal messages. I left predictive typing on, trading a little of my privacy for convenience. Yet something is generating responses using what others wrote in chats with disappearing messages, persisting or sharing them who knows where. Not a good default, Google.
secparam.bsky.social
The Brooklyn one is actually a waterfront park development and a vacant office space, at least as of 4 months ago. So even more on brand.
secparam.bsky.social
We've crossed a threshold. A paid subscription used to be the ultimate proof of humanity online; now it's not enough to allow a single link click inside the NYT cooking app. The next few years are going to be an interesting race to extract more and more invasive proofs of humanity.
secparam.bsky.social
The 2010s internet: Let's mock dissertation-length arguments about weird-ass fanfic tags.
The 2025 internet: 'dubcon' is an ancillary part of the financial privacy discourse.

The past was a better place.
acvalens.net
PayPal user in the UK lost their account after buying adult ebooks “about monsters and milking,” “some dubcon stuff”

“My account got banned a couple of days ago for making purchases which violate the ToS. Upon querying w/ staff over the phone I've been told that it was ebooks that I've been buying”
r/LegalAdviceUK
DariaDover
My PayPal account has been banned for buying smutty e-books. Have I done something illegal?
My account got banned a couple of days ago for making purchases which violate the terms of service.
Upon querying with staff over the phone I've been told that it was e-books that I've been buying.
I'm panicking a little bit right now. I haven't done anything illegal have I? The books were mostly stuff about monsters and milking. Some dub-con stuff.
If it is illegal, what do I need to do to protect myself? Do I delete my e-books?
If the e-books aren't illegal then is the company really allowed to tell me how I'm allowed to spend my money?
Like, is this any different to Halifax or Nationwide forbidding me from buying cigarettes or alcohol at Tescos?
I'm in England.
secparam.bsky.social
And now is when someone should point out that private compute for AIs and TEEs are not secure enough to make on-by-default chat monitoring a good idea. Because they aren't. They're terribly insecure, especially against hostile governments. They're just ... better than nothing.
secparam.bsky.social
Private AI needs to be the norm because opting out is impossible for many apps. Take messaging or photo sharing: even if you opt out, the recipient likely has AI enabled—maybe even on by default. Your data ends up in their app's AI cloud. Private compute for AI must be a default.
secparam.bsky.social
And before anyone says TEEs have imperfect security: the point is they're a massive improvement. And essential in a future where AI assistants get baked into your chat apps and browser, watching every move you make, video you view, and message you send.
secparam.bsky.social
This is doable today. Apple already has Private Cloud Compute, and Nvidia's H100 GPUs come with Trusted Execution Environments built right in. The pieces are there—your AI conversations could run where even the NYT, OpenAI, and hackers can't snoop.
secparam.bsky.social
Making LLM chats private is a good idea. We've accepted too much data harvesting already—this moment lets us reset the norm around who controls our data online. But let's go further: put LLM chats in private compute, so you get technical guarantees you control your data.

x.com/sama/status/...
secparam.bsky.social
Classic Google: an A/B test (a rare overt one)
Classic Google AI: it doesn't actually work (you can't submit)
secparam.bsky.social
Incidentally, if your Signal is now flooded with work chats, you can organize chats into folders:
Settings -> Chats -> Chat Folders. Looks like it's Android-only for now.
secparam.bsky.social
Friend messaged me: Signal's going mainstream. They've got 150+ active chats. Work life invaded their friend space.

It's not just Signal being in the news: people don't trust other apps. Too many places to half-ass privacy, be it backups, ads, or an AI reading over your shoulder.
secparam.bsky.social
It's the year 2030. AIs write all our sitcoms now, but they're just endless FRIENDS clones because the underpaid content moderators in offshore offices learned that's the pinnacle of American comedy.