Brian Guay
@brianguay.bsky.social
3.2K followers 74 following 33 posts
Assistant Professor of Political Science @ UNC Chapel Hill | Public opinion, behavior, polarization, misinformation | Last name pronounced without the u | brianguay.com
brianguay.bsky.social
UNC Political Science is hiring in methods!

UNC has an amazing department and the triangle is a great place to live.

Tenure-Track Assistant Professor in Methods (Deadline Oct 24)

link to the posting👇
unc.peopleadmin.com/postings/307...

#polisky #psjobs #poliscijobs
Reposted by Brian Guay
conjugateprior.org
Every time this worry comes up (www.ft.com/content/d419...) I post some Landy et al. (2018).

People just answer questions about proportions (of anything) in a rather particular way. So I think it's unlikely that what they are being asked about is as important as you might expect it should be.
…issues, and the results on immigration are sobering. UK respondents dramatically overestimate how many immigrants there are and how many of them are Muslim. They overestimate what proportion of immigrants are from North Africa by a factor of 10, and from the Middle East by a factor of two, and underestimate how many are from North America. Respondents underestimate how many are Christian, and also underestimate the education levels and employment status of immigrants relative to the UK-born population. These misperceptions are not unique to the UK; they are common in rich countries. People seem to answer proportion questions in log odds space, not probabilities. And when you know that, there seems a lot less substantive interpretation worth doing.
Reposted by Brian Guay
brendannyhan.bsky.social
Zaller remains undefeated
brianguay.bsky.social
thanks so much Conrad, I'll fix this!
brianguay.bsky.social
Thanks to my fantastic co-authors @tylermarghetis.bsky.social, @david-landy.bsky.social, Cara Wong and everyone who gave us feedback over many years

Ungated earlier version of the paper here: www.brianguay.com/files/guay_2...
brianguay.bsky.social
Our findings suggest that the public knows more about politics than we give them credit for:

People make errors when estimating politically relevant percentages, but this is due to the format of the question, not underlying misinformation about what they are estimating
brianguay.bsky.social
Of course, characteristics of specific groups may matter, but only at the margins. We should first account for the domain-general errors people make *anytime* they estimate a percentage, then examine group-specific explanations
brianguay.bsky.social
The same is true of theories that people overestimate the size of groups that they have a lot of social contact with. Very little evidence of this!
brianguay.bsky.social
We also test popular theories that people overestimate the size of groups they fear. Not the case.

Again, misestimates result mainly from the psychological errors we make anytime we estimate %s, not from anything specific to the group being estimated
brianguay.bsky.social
We argue that this pattern of over-under estimation arises from 🧠Bayesian reasoning under uncertainty🧠: people often have uncertain ideas in their minds about the size of these groups, but when they convert these ideas to percentages they ‘hedge’ their estimates toward a prior
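The hedging idea above can be sketched in a few lines: report a weighted average of the true value and a prior in log-odds space, then convert back to a percentage. This is only an illustration of the general mechanism; the prior of 50% and the weight used here are made-up values, not the paper's fitted parameters.

```python
import math

def logit(p):
    """Convert a proportion to log-odds."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Convert log-odds back to a proportion."""
    return 1 / (1 + math.exp(-x))

def hedged_estimate(true_p, prior_p=0.5, weight=0.6):
    """Shrink the true proportion toward a prior in log-odds space.

    `prior_p` and `weight` are illustrative, hypothetical values chosen
    for this sketch, not parameters from the paper.
    """
    x = weight * logit(true_p) + (1 - weight) * logit(prior_p)
    return inv_logit(x)

# A small true proportion gets overestimated...
print(round(hedged_estimate(0.05), 3))  # → 0.146
# ...and a large one gets underestimated.
print(round(hedged_estimate(0.90), 3))  # → 0.789
```

Because the shrinkage happens toward the prior, the same rule produces overestimates of small percentages and underestimates of large ones, matching the over-under pattern in the thread.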
brianguay.bsky.social
And this is the same pattern of errors people make when estimating things like the percentage of dots on a page that are red 👇
brianguay.bsky.social
Here’s the key figure: people make the same estimation errors regardless of what they are estimating, both political and *entirely non-political* quantities.

These are 100k estimates of the size of racial and non-racial groups made by 37k people in 22 countries
brianguay.bsky.social
Instead, people are just really bad at estimating percentages

They systematically overestimate smaller %s and underestimate larger %s, including ENTIRELY NON-POLITICAL %s, such as the % of the population that owns an Apple product, has a passport, or has indoor plumbing
brianguay.bsky.social
We argue that journalists and academics are *wrong* when they interpret these misperceptions as evidence that the public is ignorant and misinformed 👇
brianguay.bsky.social
New paper on misperceptions out in PNAS @pnas.org

www.pnas.org/doi/10.1073/...

Why do people overestimate the size of politically relevant groups (immigrant, LGBTQ, Jewish) and quantities (% of budget spent on foreign aid, % of refugees that are criminals)?🧵👇
Reposted by Brian Guay
prowag.bsky.social
This is not a time for passive citizenship. Silence means approval. So what to do?
brianguay.bsky.social
I'm very happy to share that I'll be joining the Department of Political Science at UNC Chapel Hill as an Assistant Professor this fall. I'm excited for this next chapter and will always be incredibly grateful for my amazing experience at Stony Brook.
brianguay.bsky.social
I haven't been on bsky for almost a year. It's nice over here!
brianguay.bsky.social
But the MOST important thing is that researchers *justify* their research design & analysis approach on normative/theoretical grounds and *pre-register* it

Doing so will help prevent researchers from talking past each other and move toward tackling the problem of misinfo
brianguay.bsky.social
Key takeaway: choose the design that aligns with your normative claim about how people should interact with information

e.g., the normative claim that aligns with discernment is that people should maximize accuracy of the content that they believe and share
brianguay.bsky.social
We demonstrate these differences empirically by re-analyzing data from recent misinformation studies

Different research designs and outcomes = different conclusions about whether misinformation interventions work
brianguay.bsky.social
We recommend designs that measure discernment over those that measure only belief/sharing of false content

Importantly, these two designs can lead to different conclusions about whether misinformation interventions work👇