Thomas Costello, PhD, is an Assistant Professor of Psychology at American University and a Research Associate at the MIT Sloan School of Management. He received his PhD in Psychology from Emory University in 2022 and his BA in Philosophy and Psychology from Binghamton University in 2016, before completing a postdoctoral fellowship at MIT. He studies where political and social beliefs come from, how they differ from person to person, and, ultimately, why they change, using artificial intelligence and the tools of cognitive and political science. He is best known for his work on leveraging generative AI to reduce conspiracy theory beliefs and on the psychology of authoritarianism. He has published dozens of research papers in peer-reviewed outlets, including Science, Journal of Personality and Social Psychology, Psychological Bulletin, and Trends in Cognitive Sciences. Thomas has been featured in the New York Times, The Atlantic, and The Economist; interviewed on television programs such as NBC Nightly News and BBC World News and on radio shows and podcasts including NPR, CBC, and CNN; and has reached millions through social media. He is currently supported by grants from DARPA, the Long-term Future Fund, and Schmidt Sciences. Thomas developed DebunkBot.com, a public tool for combating conspiracy theories with AI. He has been awarded the Klarman Fellowship from Cornell University, the Heritage Dissertation Research Award from the Society for Personality and Social Psychology, and the J. S. Tanaka Dissertation Award from the Association for Research in Personality.
Talk: The new behavioral science of belief change?
Abstract: Our social institutions, including science, liberal democracy, and trial by jury, assume that humans change their minds in response to sufficiently compelling information, allowing us to access truth via deliberation. Yet across the behavioral sciences, interventions that seek to change minds (e.g., shift attitudes and beliefs, correct misinformation) by leveraging factual information are notoriously ineffective, especially for salient topics related to ideology, identity, and coalitional interests. I will argue that much of this inefficacy is attributable not to motivated reasoning (the typical explanation for humans' unwillingness to change their minds) but to the fact that individuals' belief systems are sufficiently heterogeneous and complex to confound one-size-fits-all attempts at persuasive argumentation. Mounting genuinely compelling arguments at scale is trickier than it appears. My work helps solve this problem by leveraging a novel pipeline for information-focused interactions between humans and generative AI models that (a) measures participants' beliefs in great detail and (b) delivers high-density factual argumentation (which proves crucial for effectiveness) that bears precisely on those beliefs. These interactions dramatically and durably reduce false beliefs, such as conspiracy theories (d = 1.1, with effects enduring for 2 months) and vaccine skepticism (d = 0.78), shift anti-immigrant prejudice (d = 0.20), and increase voting intentions (d = 0.85), among other promising findings. I will share these findings and articulate a vision for a new behavioral science of belief change that recognizes beliefs as high-dimensional, person-specific phenomena, using both computational cognitive science and emerging technologies to account for this complexity. This approach sheds new light on the human mind while helping solve enduring social challenges.