Fact-checking organizations, the media, advertisers, and even governments have taken note of false and misleading information on mainstream Internet platforms when that content is in the majority language of a country. Misinformation in other languages and on end-to-end encrypted platforms, however, often spreads unnoticed, leaving the people it reaches with fewer options to investigate the veracity of the information.
Moving countries no longer entails cutting ties with one’s former home. While studying or working in a new country, it is possible to continue to consume news and stay in touch with friends and family overseas. These connections can be a vital source of information, but they can also allow misinformation to reach new populations. Furthermore, disinformation can specifically target diaspora communities with culturally tailored messages.
This presentation will discuss two projects aiming to empower diaspora communities to identify and mitigate the most pernicious misinformation online. The projects combine large-scale data analysis with community tiplines—social media accounts to which community members can forward potentially harmful content and discover relevant context explainers, fact-checks, and media-literacy materials.
Misinformation claims often form around common themes, persistent stereotypes, and patterns of deception. By leveraging discourse analysis and community knowledge, we can build taxonomies of common types of misinformation and use machine learning to map new claims to this taxonomy. This enables us to move to a proactive model where interventions are available before misinformation claims spread widely.
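As a rough illustration of the claim-matching step described above, the sketch below maps an incoming claim to the nearest category in a toy misinformation taxonomy using TF-IDF similarity. The category names, seed claims, and similarity threshold are hypothetical, and the projects themselves likely rely on richer multilingual models; this is only a minimal sketch of the general idea.

```python
# Minimal sketch: map new claims to a small misinformation taxonomy
# via TF-IDF cosine similarity. Labels and seed claims are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical taxonomy: each category is described by a few seed claims.
taxonomy = {
    "health-cures": [
        "Drinking hot water cures the virus",
        "This herbal remedy prevents infection",
    ],
    "voting-process": [
        "Ballots sent by mail are not counted",
        "You can vote by text message",
    ],
    "migration-rumours": [
        "New arrivals receive free housing before citizens",
    ],
}

labels, seed_texts = [], []
for label, examples in taxonomy.items():
    for text in examples:
        labels.append(label)
        seed_texts.append(text)

vectorizer = TfidfVectorizer()
seed_vectors = vectorizer.fit_transform(seed_texts)

def map_claim(claim: str, threshold: float = 0.2) -> str:
    """Return the closest taxonomy label, or 'unmatched' if nothing is similar."""
    claim_vector = vectorizer.transform([claim])
    similarities = cosine_similarity(claim_vector, seed_vectors)[0]
    best = similarities.argmax()
    return labels[best] if similarities[best] >= threshold else "unmatched"

# Example: a claim forwarded to a tipline is routed to the nearest category,
# which can then be used to surface an existing fact-check or explainer.
print(map_claim("Forwarded message says hot lemon water kills the virus"))
```

In a tipline setting, a match above the threshold would let the system respond with context explainers or fact-checks already prepared for that category, while unmatched claims would be queued for human review.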
Dr Scott A. Hale is an Associate Professor and Senior Research Fellow at the OII, Director of Research at Meedan, and a Fellow of the Alan Turing Institute. He develops and applies techniques from computer science to research questions in the social sciences. His research seeks to promote more equitable access to quality information and investigates the spread of information between speakers of different languages online, the roles of bilingual Internet users, collective action and mobilization, hate speech, and misinformation.