Organisers outline topics – including AI and deepfakes – for Cambridge Disinformation Summit 2025
The open call for submissions to the 2025 Cambridge Disinformation Summit runs from Sunday (25 August) to 25 October, with an agenda that is “intentionally open to afford the most latitude for creative and insightful approaches”.
The second-ever summit, which will be held at King’s College on 23-25 April, has been convened to allow thought leaders to develop a clearer understanding of the disinformation landscape and its systemic effects on global issues such as health, economics, democratic processes, and the environment.
The summit is intended “to help facilitate better interdisciplinary communication for research and policy support, and consider potential interventions to mitigate the harms from disinformation”.
Formal speakers for the summit dinner – the theme will be ‘managing threats’ – have been announced. They are:
- Nina Jankowicz, CEO of The American Sunlight Project, author of How to Lose the Information War (2020), and a Washington DC-based policy adviser who was named by Time magazine as one of the 100 Most Influential People in AI in 2023.
- Marianna Spring, the BBC’s first disinformation and social media correspondent and author of Among the Trolls: My Journey through Conspiracyland.
- Yoel Roth, VP of Trust & Safety at Match Group, the parent company of Tinder, Hinge, and more than a dozen other dating apps worldwide.
Roth was head of trust and safety at Twitter – now X – when the platform was acquired by Elon Musk for $44 billion in October 2022. By November he had left, dismayed by the new owner’s “lack of legitimacy through his impulsive changes and tweet-length pronouncements about Twitter’s rules”. He remains committed to “building technology that’s resilient to threats and helps people and communities connect safely and authentically”.
The Disinformation Summit is organised by Alan Jagolinzer, Professor of Financial Accounting at Cambridge Judge Business School.
Prof Jagolinzer told the Cambridge Independent: “We convened the prior summit to get a sense for how disinformation is deployed to exploit different audiences, how it leverages current technology, and how some policies have evolved to try to address the actual and potential harms.
“We learned that researchers and policymakers are examining and deploying many different harm interventions, for instance digital services legislation, student literacy and critical thinking education, or crowd-sourced or AI-assisted narrative flagging.
“What we want to encourage, with the 2025 summit, is more interdisciplinary collaboration on interventions research, more analysis into which interventions really help stem the harms from disinformation campaigns, more careful consideration about where interventions, themselves, might be harmful, and how to develop trust and accountability around information, more generally.
“So the 2025 summit will showcase two days of selected research that indicates promising potential and then one day of candid policy-oriented discussions, to support work in the field.”
The summit will also offer two sub-theme discussions: one on disinformation tactics deployed by crypto influencers, and another on managing threats in this field of work, including online harassment, threats of violence, and legal harassment similar to SLAPP tactics.
SLAPP – Strategic Lawsuit Against Public Participation – suits are brought by individuals and entities to dissuade their critics from continuing to produce negative publicity.
Papers submitted by 25 October will be selected for presentation at the summit by an interdisciplinary scientific committee. Research might consider policy, regulatory, enforcement, audit, fact-checking, sociological, psychological, religious, algorithmic, financial, or other frameworks for interventions, and should weigh the balance between free speech and other fundamental human rights.
Submissions on topics surrounding AI and deepfakes will also be considered by the committee.
Email CFRA@jbs.cam.ac.uk for details.