Silicon
Community Notes testing across Facebook, Instagram and Threads to begin next week in US, using algorithm from Elon Musk’s X
NPJ Complexity
Politicization of the COVID-19 vaccination debate has led to a polarization of opinions on this topic. We present a theoretical model of this debate on Facebook. In this model, agents form opinions through information that they receive from other agents with flexible opinions and from politically motivated entities such as media or interest groups. The model captures the co-evolution of opinions and network structure under similarity-dependent social influence, as well as random network re-wiring and opinion change. We show that attitudinal polarization can be avoided if agents (1) connect to agents all across the opinion spectrum, (2) receive information from many sources before changing their opinions, (3) frequently change opinions at random, and (4) frequently connect to friends of friends. High Kleinberg authority scores among politically motivated media and two network components that are comparable in size can indicate the onset of attitudinal polarization.
Mikhail Lipatov, Lucia Illari, Neil Johnson, Sergey Gavrilets
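The mechanisms the abstract names — similarity-dependent social influence, random re-wiring, and random opinion change — can be illustrated with a minimal agent-based sketch. This is a toy under assumed parameter values (`EPSILON`, `MU`, the re-wiring and mutation probabilities are illustrative), not the published model:

```python
import random

random.seed(0)

N = 100          # agents with continuous opinions in [-1, 1]
STEPS = 2000
EPSILON = 0.5    # similarity threshold for social influence (assumed value)
MU = 0.3         # step size toward an influencing neighbor (assumed value)
P_REWIRE = 0.05  # probability of random network re-wiring per step
P_RANDOM = 0.01  # probability of a random opinion change per step

opinions = [random.uniform(-1, 1) for _ in range(N)]
# sparse random network stored as neighbor sets
neighbors = [set() for _ in range(N)]
for _ in range(3 * N):
    a, b = random.sample(range(N), 2)
    neighbors[a].add(b)
    neighbors[b].add(a)

for _ in range(STEPS):
    i = random.randrange(N)
    if neighbors[i]:
        j = random.choice(list(neighbors[i]))
        # similarity-dependent influence: move closer only if opinions are near
        if abs(opinions[i] - opinions[j]) < EPSILON:
            opinions[i] += MU * (opinions[j] - opinions[i])
    if random.random() < P_REWIRE and neighbors[i]:
        # random re-wiring: drop one tie, connect to a random new agent
        j = random.choice(list(neighbors[i]))
        neighbors[i].discard(j)
        neighbors[j].discard(i)
        k = random.randrange(N)
        if k != i:
            neighbors[i].add(k)
            neighbors[k].add(i)
    if random.random() < P_RANDOM:
        opinions[i] = random.uniform(-1, 1)  # random opinion change

mean = sum(opinions) / N
variance = sum((o - mean) ** 2 for o in opinions) / N  # crude polarization proxy
print(round(variance, 3))
```

Raising `P_REWIRE` and `P_RANDOM` in this sketch corresponds to recommendations (3) and (4) in the abstract; both tend to lower the final opinion variance.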
News Central
Elon Musk has vowed to “fix” X’s Community Notes feature, arguing that it is being exploited by governments and legacy media.
Voice of Nigeria
Elon Musk, owner of X (formerly Twitter), has relied on the community rather than fact-checkers to monitor misinformation online since acquiring the social media company in 2022. He has championed the Community Notes feature as the best way to correct false posts.
CNBC
For X owner Elon Musk, the solution to monitoring misinformation online has been the community, rather than a group of fact checkers. Since buying the social media company formerly known as Twitter in 2022, he’s touted the Community Notes feature as the best way to correct false posts.
The National News
One of Meta’s first major announcements of 2025, the move to phase out fact checkers and replace them with a community notes-based system, is still fueling debate in the tech world.
Physical Review Letters
The global chaos caused by the July 19, 2024 technology meltdown highlights the need for a theory of what large-scale cohesive behaviors—dangerous or desirable—could suddenly emerge from future systems of interacting humans, machinery, and software, including artificial intelligence; when they will emerge; and how they will evolve and be controlled. Here, we offer answers by introducing an aggregation model that accounts for the interacting entities’ inter- and intraspecies diversities. It yields a novel multidimensional generalization of existing aggregation physics. We derive exact analytic solutions for the time to cohesion and growth of cohesion for two species, and some generalizations for an arbitrary number of species. These solutions reproduce—and offer a microscopic explanation for—an anomalous nonlinear growth feature observed in various current real-world systems. Our theory suggests good and bad “surprises” will appear sooner and more strongly as humans, machinery, artificial intelligence, and so on interact more, but it also offers a rigorous approach for understanding and controlling this.
Frank Yingjie Huo, Pedro Manrique, Neil Johnson
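The core aggregation idea — clusters of interacting species merging at rates shaped by inter- and intraspecies affinities, with the largest cohesive unit growing over time — can be sketched as a simple stochastic toy. The affinity values and species labels here are illustrative assumptions, not the paper's kernel or its exact analytic solutions:

```python
import random

random.seed(1)

# Assumed two-species affinity matrix (intra- vs inter-species merge rates)
AFFINITY = {("H", "H"): 1.0,   # human-human
            ("H", "M"): 0.5,   # human-machine
            ("M", "M"): 1.0}   # machine-machine

def affinity(a, b):
    """Symmetric lookup into the affinity matrix."""
    return AFFINITY.get((a, b)) or AFFINITY.get((b, a))

# Start from 200 singleton clusters, half of each species
clusters = [{"species": "H", "size": 1} for _ in range(100)] + \
           [{"species": "M", "size": 1} for _ in range(100)]

largest_over_time = []
while len(clusters) > 1:
    i, j = random.sample(range(len(clusters)), 2)
    a, b = clusters[i], clusters[j]
    # merge with probability proportional to species affinity
    if random.random() < affinity(a["species"], b["species"]):
        merged = {"species": a["species"] if a["size"] >= b["size"] else b["species"],
                  "size": a["size"] + b["size"]}
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    largest_over_time.append(max(c["size"] for c in clusters))

# "Time to cohesion" in this toy: steps until one cluster absorbs everything
print(len(largest_over_time))
```

In this sketch, stronger inter-species affinity shortens the time to full cohesion, which is the qualitative direction of the paper's claim that surprises "appear sooner and more strongly" as species interact more.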
PsyPost
Online hate communities are not confined to isolated corners of the internet. A new study published in npj Complexity shows how these groups are increasingly intersecting with mainstream online spaces.
Curious By Nature Podcast
The run-up to the 2024 U.S. presidential election has seen unprecedented levels of misinformation, division, and hate speech on social media. Even as election day comes and goes and the votes are being counted, the temperature of online discourse is only likely to rise. Online conversations about race, immigration, and other hot-button topics continue to attract extremist views that threaten to drown out anything resembling civil discourse. How do communities of hate operate? And how do they create their networks of users to infiltrate both the major platforms as well as the darker corners of the web? To understand complex systems such as this, one researcher at George Washington University is using his background in particle physics to map and analyze how hate speech flows on social media. We spoke before election day and before any of the votes were counted. So, without knowing the outcome, he gives a sobering warning that the biggest spike in online hate is likely to come after voters go to the polls.
Tech Policy Press
As the outcome of the 2024 US Presidential election hangs in the balance, it is worth looking at new academic research that explores the relationship between elections, political communications, and technology. This piece looks at three recent studies that provide insights into the efficacy of prebunking election misinformation using AI, the resilience and growth of online hate networks, and the shortcomings of political communication research in addressing threats of illiberalism from the far-right.