Machine Learning Language Models: Achilles Heel for Social Media Platforms and a Possible Solution

Advances in Artificial Intelligence and Machine Learning

Any uptick in new misinformation that casts doubt on COVID-19 mitigation strategies, such as vaccine boosters and masks, could reverse society’s recovery from the pandemic both nationally and globally. This study demonstrates how machine learning language models can automatically generate new COVID-19 and vaccine misinformation that appears fresh and realistic (i.e., human-generated) even to subject matter experts. The study uses GPT-2, the latest version of the GPT model that is public and freely available, and feeds it publicly available text collected from social media communities known for their high levels of health misinformation. The same team of subject matter experts that classified the original social media input data is then asked to categorize the GPT-2 output without knowing of its automated origin. None of them successfully identified all of the synthetic text strings as products of the machine model. This presents a clear warning for social media platforms: an unlimited volume of fresh, seemingly human-produced misinformation can be generated perpetually using current, off-the-shelf machine learning algorithms running continually. We then offer a solution: a statistical approach that detects differences between the dynamics of this output and typical human behavior.
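The detection method itself is not detailed in this summary, so the sketch below is only a generic illustration of the underlying idea: compare the *dynamics* of a suspect account's output with typical human behavior, here via a two-sample Kolmogorov–Smirnov distance between inter-post time distributions. The synthetic "human" and "bot" gap distributions are illustrative assumptions, not data or code from the study.

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of samples a and b."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(0)
# Hypothetical inter-post gaps (hours): human activity is bursty and
# heavy-tailed, while an automated poster runs at a near-constant rate.
human_gaps = rng.lognormal(mean=0.0, sigma=1.5, size=500)
bot_gaps = rng.normal(loc=1.0, scale=0.05, size=500).clip(min=0.01)

d = ks_statistic(human_gaps, bot_gaps)
print(f"KS distance: {d:.2f}")  # a large distance flags atypical dynamics
```

A content-blind test like this is attractive precisely because the generated *text* fools human experts: the timing statistics of continually running algorithms can differ from human posting rhythms even when the words do not.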

Richard Sear, Rhys Leahy, Nicholas Johnson Restrepo, Yonatan Lupu, Neil F. Johnson

View article >>

New Math to Manage Online Misinformation

SIAM News Blogs

Social media continues to amplify the spread of misinformation and other malicious material. Even before the COVID-19 pandemic, a significant amount of misinformation circulated every day on topics like vaccines, the U.S. elections, and the U.K. Brexit vote. Researchers have linked the rise in online hate and extremist narratives to real-world attacks, youth suicides, and mass shootings such as the 2019 mosque attacks in Christchurch, New Zealand. The ongoing pandemic added to this tumultuous online battlefield with misinformation about COVID-19 remedies and vaccines. Misinformation about the origin of COVID-19 has also resulted in real-world attacks against members of the Asian community. In addition, news stories frequently describe how social media misinformation negatively impacts the lives of politicians, celebrities, athletes, and members of the public.

Neil F. Johnson

View article >>

Online Group Dynamics Reveal New Gel Science

A better understanding of how support evolves online for undesirable behaviors such as extremism and hate could help mitigate future harms. Here we show how the highly irregular growth curves of groups supporting two high-profile extremist movements can be accurately described if we generalize existing gelation models to account for the fact that the number of potential recruits is time-dependent and that humans are heterogeneous. This leads to a novel generalized Burgers equation that describes these groups’ temporal evolution and predicts a critical influx rate of potential recruits beyond which such groups will not form. Our findings offer a new approach to managing undesirable groups online (and, more broadly, to managing the sudden appearance and growth of large macroscopic aggregates in a complex system) by manipulating their onset and engineering their growth curves.
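The generalized Burgers equation itself is not reproduced in this summary, but the qualitative claim — that the influx rate of fresh recruits controls whether a dominant aggregate ever forms — can be illustrated with a toy Smoluchowski-style coagulation simulation. Everything below (the merge kernel, rates, and sizes) is an illustrative assumption, not the paper's actual model.

```python
import random

def largest_group_fraction(influx, steps=2000, seed=1):
    """Toy coagulation model of online group formation: at each step,
    `influx` new isolated individuals arrive, then two randomly chosen
    clusters merge. Returns the fraction of all individuals that end
    up in the largest cluster (the "gel")."""
    rng = random.Random(seed)
    clusters = [1] * 10  # start from 10 isolated individuals
    for _ in range(steps):
        clusters.extend([1] * influx)  # time-dependent influx of recruits
        if len(clusters) >= 2:
            i, j = sorted(rng.sample(range(len(clusters)), 2))
            clusters[i] += clusters.pop(j)  # j > i, so index i is unaffected
    return max(clusters) / sum(clusters)

print(largest_group_fraction(influx=0))  # no influx: all mass gels into one group
print(largest_group_fraction(influx=5))  # heavy influx: no dominant group emerges
```

The paper derives a sharp critical influx rate from its generalized Burgers equation; this toy only reproduces the qualitative trend that a larger stream of incoming potential recruits suppresses the dominance of any single aggregate.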

Pedro D. Manrique, Sara El Oud, Neil F. Johnson

Read preprint >>

A Public Health Research Agenda for Managing Infodemics: Methods and Results of the First WHO Infodemiology Conference

JMIR Infodemiology

An infodemic is an overflow of information of varying quality that surges across digital and physical environments during an acute public health event. It leads to confusion and to risk-taking behaviors that can harm health, and it erodes trust in health authorities and public health responses. Owing to the global scale and high stakes of the health emergency, responding to the infodemic related to the pandemic is particularly urgent. Building on diverse research disciplines and the expanding discipline of infodemiology, more evidence-based interventions are needed: infodemic management interventions and tools must be designed so that health emergency responders can implement them.

Calleja et al.

View article >>

Online hate network spreads malicious COVID-19 content outside the control of individual social media platforms

Scientific Reports

We show that malicious COVID-19 content, including racism, disinformation, and misinformation, exploits the multiverse of online hate to spread quickly beyond the control of any individual social media platform. We provide a first mapping of the online hate network across six major social media platforms. We demonstrate how malicious content can travel across this network in ways that subvert platform moderation efforts. Machine learning topic analysis shows quantitatively how online hate communities are sharpening COVID-19 as a weapon, with topics evolving rapidly and content becoming increasingly coherent. Based on mathematical modeling, we provide predictions of how changes to content moderation policies can slow the spread of malicious content.
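The mathematical model behind the moderation predictions is not given in this summary; as a hedged stand-in, a textbook branching-process calculation shows the generic mechanism by which stricter moderation slows cross-platform spread: it pushes the mean number of surviving reshares per post below the critical value of 1. All parameter names and values below are illustrative assumptions, not the paper's fitted model.

```python
def expected_cascade_size(links=4, share_prob=0.4, moderation=0.0, generations=20):
    """Expected total number of posts in a toy cross-platform cascade.
    Each post reaches `links` neighboring communities; each neighbor
    reposts with probability share_prob, and a repost survives
    moderation with probability (1 - moderation)."""
    m = links * share_prob * (1 - moderation)  # mean surviving reshares per post
    total, active = 1.0, 1.0
    for _ in range(generations):
        active *= m      # expected new posts in this generation
        total += active
    return total

# m = 1 marks the critical moderation level; above it, cascades die out.
print(expected_cascade_size(moderation=0.0))  # supercritical: explosive spread
print(expected_cascade_size(moderation=0.5))  # subcritical: spread fizzles out
```

The point of such a calculation is that moderation need not be perfect: it only has to push the effective reproduction number of malicious content below 1 across the whole multi-platform network, not on any single platform in isolation.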

N. Velásquez, R. Leahy, N. Johnson Restrepo, Y. Lupu, R. Sear, N. Gabriel, O. K. Jha, B. Goldberg, N. F. Johnson

View article >>

Hidden order across online extremist movements can be disrupted by nudging collective chemistry

Scientific Reports

Disrupting the emergence and evolution of potentially violent online extremist movements is a crucial challenge. Extremism research has analyzed such movements in detail, focusing on individual- and movement-level characteristics. But are there system-level commonalities in the ways these movements emerge and grow? Here we compare the growth of the Boogaloos, a new and increasingly prominent U.S. extremist movement, to the growth of online support for ISIS, a militant terrorist organization based in the Middle East that follows a radical version of Islam. We show that the early dynamics of these two online movements follow the same mathematical order despite their stark ideological, geographical, and cultural differences. The evolution of both movements, across scales, follows a single shockwave equation that accounts for heterogeneity in online interactions. These scientific properties suggest specific policies to address online extremism and radicalization. We show how actions by social media platforms could disrupt the onset and ‘flatten the curve’ of such online extremism by nudging its collective chemistry. Our results provide a system-level understanding of the emergence of extremist movements that yields fresh insight into their evolution and possible interventions to limit their growth.

N. Velásquez, P. Manrique, R. Sear, R. Leahy, N. Johnson Restrepo, L. Illari, Y. Lupu, N. F. Johnson

View article >>

A computational science approach to understanding human conflict

Journal of Computational Science

We discuss how computational data science and agent-based modeling are shedding new light on the age-old issue of human conflict. While social science approaches focus on individual cases, the recent proliferation of empirical data and complex-systems thinking has opened up a computational approach based on identifying common statistical patterns and building generative but minimal agent-based models. We discuss a reconciliation of various disparate claims and results in the literature that stand in the way of a unified description and understanding of human wars and conflicts. We also discuss a unified interpretation of the origin of observed deviations from power-law behavior in terms of dynamical processes. These findings show that a unified computational science framework can be used to understand and quantitatively describe collective human conflict.
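The "common statistical patterns" in conflict data are typically power laws in event severity. As a hedged illustration of how such patterns are quantified, the sketch below applies the standard maximum-likelihood estimator for a continuous power-law exponent (the Clauset–Shalizi–Newman form) to synthetic data; the generated "severities" are illustrative only, not data from the paper.

```python
import math
import random

def powerlaw_alpha(samples, xmin):
    """Maximum-likelihood estimate of the exponent alpha for a
    continuous power law p(x) ~ x**(-alpha), x >= xmin
    (the Clauset-Shalizi-Newman estimator)."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic conflict-event "severities" drawn from a power law with
# alpha = 2.5 via inverse-transform sampling (illustrative data only).
rng = random.Random(7)
alpha_true, xmin = 2.5, 1.0
data = [xmin * (1 - rng.random()) ** (-1 / (alpha_true - 1)) for _ in range(20000)]

print(f"estimated alpha: {powerlaw_alpha(data, xmin):.2f}")
```

Fitting the exponent is only the first step; the article's focus is on what systematic *deviations* from a pure power law reveal about the underlying dynamical processes of conflict.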

D. Dylan Johnson Restrepo, Michael Spagat, Stijn van Weezel, Minzhang Zheng, Neil F. Johnson

View article >>

As QAnon Conspiracy Theories Draw New Believers, Scientists Take Aim at Misinformation Pandemic

Newsweek

The first three “nodes” of the conspiracy-theory network known as QAnon arose in 2018 in the persons of founders Tracy Diaz, Paul Furber and Coleman Rogers. They had figured out how to profit from promoting the posts of “Q,” a mysterious figure claiming to have inside information on a mass arrest, undertaken with the blessing of Donald Trump, that nabbed Hillary Clinton and others for running a pedophile ring.

Read the full article >>

Facebook Pages, the “Disneyland” Measles Outbreak, and Promotion of Vaccine Refusal as a Civil Right, 2009–2019

AJPH

We categorized 204 Facebook pages expressing vaccine opposition, extracting public posts through November 20, 2019. We analyzed posts from October 2009 through October 2019 to examine whether the pages’ content was coalescing.

David A. Broniatowski, Amelia M. Jamison, Neil F. Johnson, Nicolás Velasquez, Rhys Leahy, Nicholas Johnson Restrepo, Mark Dredze, Sandra C. Quinn

View article >>