This year’s elections around the world are under fire from disinformation and deepfakes. Researchers from the USA have mapped the sources of harmful content.

Newseria

Upcoming elections in more than 50 countries, including the U.S. and Poland, are expected to encourage creators of harmful content to step up their use of artificial intelligence, with the largest number of deepfakes likely to appear this summer. Analysts have examined which places in the digital world serve as incubators for the activities of “bad actors” and have mapped them. Small platforms turn out to be the main source of harmful content production and dissemination. In this context, the EU’s Digital Services Act can be seen as misguided, since such small platforms fall largely outside its regulatory reach. The scientists recommend basing countermeasures on realistic scenarios, and eliminating the phenomenon entirely is not one of them; it is therefore better to focus on limiting the effects of disinformation.

Read the full article >>

Predicting the risk of bad-actor-AI

Scienmag

Bad actors are predicted to begin using AI daily by the middle of 2024, according to a study. Neil F. Johnson and colleagues map the online landscape of communities centered around hate, beginning by searching for terms found in the Anti-Defamation League Hate Symbols Database, along with the names of hate groups tracked by the Southern Poverty Law Center. From an initial list of “bad-actor” communities found using these terms, the authors assess communities linked to by the bad-actor communities.
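The mapping approach described above — starting from a seed list of communities found via hate-related terms, then assessing the communities they link to — can be sketched as a breadth-first expansion over a link graph. This is only a minimal illustration of the general idea, not the authors’ actual method; the community names and link data below are hypothetical:

```python
from collections import deque

def expand_bad_actor_map(seed_communities, links, max_hops=2):
    """Breadth-first expansion: start from seed 'bad-actor' communities
    (e.g. those matching hate-symbol or hate-group terms) and add every
    community they link to, up to max_hops away from a seed.
    Returns a dict mapping each discovered community to its hop distance."""
    discovered = {c: 0 for c in seed_communities}
    queue = deque(seed_communities)
    while queue:
        community = queue.popleft()
        hops = discovered[community]
        if hops >= max_hops:
            continue  # do not expand beyond the hop limit
        for neighbor in links.get(community, []):
            if neighbor not in discovered:
                discovered[neighbor] = hops + 1
                queue.append(neighbor)
    return discovered

# Hypothetical toy link graph (community -> communities it links to).
links = {
    "forumA": ["forumB", "channelC"],
    "channelC": ["groupD"],
    "groupD": ["forumA"],
}
print(expand_bad_actor_map(["forumA"], links))
# {'forumA': 0, 'forumB': 1, 'channelC': 1, 'groupD': 2}
```

Capping the expansion at a small number of hops keeps the map focused on communities directly reachable from the seed list rather than sweeping in the entire platform.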

Read the full article >>

‘Bad actor’ AI predicted to pose daily threat to democracies by mid-2024

New Atlas

A new study has predicted that AI activity by ‘bad actors’ determined to cause online harm through the spread of disinformation will be a daily occurrence by the middle of 2024. The findings are concerning given that more than 50 countries, including the US, will hold national elections this year, the outcomes of which will have a global impact.

Read the full article >>