The spread of disinformation is one of the most pressing threats societies face today.

What makes disinformation so dangerous is that governments and private actors are often unable to stop its spread or to influence those who have begun to believe in it. Individuals often share disinformation because they genuinely believe it is truthful.[i] For example, they may have been manipulated into trusting a news source that maliciously spreads disinformation, or they may have been deceived by fake news shared by people they otherwise trust, such as family members or close friends. It is well recognised that once an individual comes to believe a certain piece of information, it becomes much harder to dissuade them later on.[ii] It therefore seems important to limit both the overall amount of disinformation people are exposed to and their susceptibility to it.

Here the phenomenon of micro-targeting comes into play. As people are beginning to recognise, digital platforms collect large amounts of information which they later use to tailor our online experience.[iii] When we discuss this, we often think of targeted advertising. However, social media platforms and similar venues also train their algorithms to show us posts based on our previous engagement and likes.[iv] This in turn leads to the creation of echo chambers. When an individual is inside an echo chamber, they are mostly shown content that reflects and reinforces their pre-existing views and beliefs.[v] For example, if an individual is an opponent of vaccines, the algorithm of the social media platform they are using will continue to show them content that is critical of vaccines, that criticises public health guidelines, and that helps them connect with other anti-vaccine individuals. This is extremely dangerous. If individuals engage with disinformation, they will continue to be shown similar posts and accounts, which will only make them more invested in the false narratives.

Even if individual posts are taken down or particular claims are debunked, users will continue to see more and more content of this kind. Measures aimed at limiting disinformation itself, such as taking down or flagging posts, may therefore be helpful to some extent, but they cannot resolve the underlying problem. Efforts must be made to understand micro-targeting and to limit its harmful effects.

For example, some have argued that micro-targeting could be in breach of the GDPR.[vi] Depending on how the process is carried out and what safeguards are implemented, such practices could violate principles such as transparency and data minimisation, and could even lead to the abuse of personal data by third parties.[vii] However, the best tool against data mining and micro-targeting is users themselves objecting to such processing of their personal data. The right to object to such processing and to automated decision-making is explicitly envisioned under the GDPR (provided under Articles 21 and 22).[viii] Furthermore, more effort should be made to educate the public on the harms that echo chambers and micro-targeting pose, so that they can regulate their online conduct responsibly. For example, individuals should be able to recognise when they are being targeted with one-sided content, and should seek out alternative sources and check the credibility of particular narratives, even when they find them relatable and persuasive.



[i] Ullrich K. H. Ecker, Stephan Lewandowsky, John Cook, Philipp Schmid, Lisa K. Fazio, Nadia Brashier, Panayiota Kendeou, Emily K. Vraga and Michelle A. Amazeen, ‘The psychological drivers of misinformation belief and its resistance to correction’ (2022) 1 Nature Reviews Psychology 13–29.

[ii] Ecker and others (n 1).

[iii] ‘Micro-targeting’, (Privacy International), available at: https://privacyinternational.org/learn/micro-targeting, last accessed 4 May 2022.

[iv] Matteo Cinelli, Gianmarco De Francisci Morales, Alessandro Galeazzi, Walter Quattrociocchi and Michele Starnini, ‘The echo chamber effect on social media’ (2021), available at: https://www.pnas.org/doi/10.1073/pnas.2023301118, last accessed 5 September 2022.

[v] Cinelli and others (n 4).

[vi] Cristina Blasi Casagran and Mathias Vermeulen, ‘Reflections on the murky legal practices of political micro-targeting from a GDPR perspective’ (2021) 11(4) International Data Privacy Law 348–359.

[vii] Blasi Casagran and Vermeulen (n 6).

[viii] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance), OJ L 119, available at: http://data.europa.eu/eli/reg/2016/679/oj, last accessed 5 September 2022.