‘Coordinated Inauthentic Behaviour’ on Facebook during Election Campaigns

The next speaker in this AoIR 2019 session is Fabio Giglietto, whose focus is on inauthentic coordinated link sharing on Facebook in the run-up to the 2018 Italian general election and the 2019 European election in Italy. ‘Coordinated inauthentic behaviour’ is a term used by Facebook itself, especially to justify its periodic mass account take-downs; the term remains poorly defined, however, and Facebook’s own press releases mainly point to a one-minute video that it has published to define it.

The term marks a shift from content to process (including actors, propaganda, and information cascades), but – surprise! – largely remains unaware of the volume of research on authenticity and coordination that is available in the scholarly literature. Some such work goes back as far as Lazarsfeld’s research in the 1940s, which showed that personal influence is critical in shaping opinions, not least in political contexts.

Fabio’s project examined this, first, in the context of the 2018 Italian election. Was there evidence of coordinated Facebook activity spreading political news during the campaign; were such activities undertaken by inauthentic, non-official political accounts; and did such activities have any measurable effect on the campaign and its outcomes?

The project developed tools to detect the rapid re-sharing of political news stories by the same Facebook and Instagram entities. It also drew on the CrowdTangle service provided by Facebook itself, which tracks posts by public accounts and groups, but not private profiles or groups, comments, or paid ads (unless those ads began as organic posts).
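The core detection logic can be illustrated with a minimal sketch: group shares by URL, and treat entities that repeatedly share the same link within a very short time window as candidates for coordination. This is only an illustrative reconstruction from the description above, not the project’s actual code; the input format, field names, and the 10-second window are assumptions.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical input: a list of (url, entity_id, timestamp_in_seconds) share
# records, e.g. exported from CrowdTangle. The 10-second window below is an
# illustrative threshold, not the value used in the project.
WINDOW_SECONDS = 10

def coordinated_pairs(shares, window=WINDOW_SECONDS):
    """Count how often each pair of entities shares the same URL within `window` seconds."""
    by_url = defaultdict(list)
    for url, entity, ts in shares:
        by_url[url].append((ts, entity))

    pair_counts = defaultdict(int)
    for posts in by_url.values():
        posts.sort()  # order shares of this URL by time
        for (t1, e1), (t2, e2) in combinations(posts, 2):
            if e1 != e2 and abs(t2 - t1) <= window:
                pair_counts[frozenset((e1, e2))] += 1
    return pair_counts

# Entity pairs that co-share many URLs within the window can then be linked
# into a network, whose connected components are the candidate coordinated groups.
```

In practice, the resulting co-sharing network is what gets partitioned into the components described next.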

The link-sharing networks identified through this work divided into overtly political, overtly non-political, and mixed components (which contained some political content alongside other material). Even the non-political components sometimes shared political content, at times from very dubious, ‘fake news’ sources. Where such link-sharing was coordinated across multiple pages, it significantly boosted dissemination and engagement, and several of the pages identified by Fabio’s project also appear in a list of spreaders of hate and disinformation published by Avaaz.

This shows that networks engaging in coordinated inauthentic behaviour did operate in these elections, and that these networks were effective. They shared links to problematic news sources, but their very coordination also provides opportunities for researchers to further identify and study such behaviours.