Angie Boggust
@angieboggust.bsky.social
410 followers
250 following
11 posts
MIT PhD candidate in the VIS group working on interpretability and human-AI alignment
Reposted by Angie Boggust
Angie Boggust
@angieboggust.bsky.social
· Apr 14
Abstraction Alignment: Comparing Model-Learned and Human-Encoded Conceptual Relationships
While interpretability methods identify a model's learned concepts, they overlook the relationships between concepts that make up its abstractions and inform its ability to generalize to new data. To ...
arxiv.org