Detection of fake news on social media platforms
Period: June 2021 – May 2024
Source of funding: Ministry of Economy, Industry, and Competitiveness (MINECO)
Programme and Call: European Interest Group (EIG) CONCERT-Japan – 7th Joint Call on “ICT for Resilient, Safe and Secure Society”
Project reference: PCI2020-120689-2
Project coordinator: David Megías (Universitat Oberta de Catalunya)
Digital media have transformed the classical mass-media model, which assumes a sender and a passive receiver, into a model in which users can appropriate contents, recreate them, and circulate them (Castells, 2009). In this context, online social media are a convenient channel for distributing "fake news" and spreading disinformation. In particular, photo and video editing tools and recent advances in artificial intelligence allow non-professionals to easily counterfeit multimedia documents and create deepfakes. Some online social media deploy methods to filter fake content and curb the spread of disinformation. Although such filtering can be effective, its centralized approach concentrates enormous power in the managers of these services.
Aligned with United Nations Sustainable Development Goals (SDGs) 9 (Industry, Innovation, and Infrastructure) and 16 (Peace, Justice, and Strong Institutions), this project aims to provide content creators with tools to watermark their creations so that any modification becomes easily spottable. The project will also give online social media users tools based on state-of-the-art signal processing and machine learning methods to detect fake content. Combining the watermarking and detection tools will empower users to discern original from fake multimedia content without relying on the assessment and control of a centralized service. Furthermore, the project will follow a user-centered design approach, conducting a comprehensive user experience study (WP5) to build the most effective tools and to account for the cultural dimension and the diversity of the final users.
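To illustrate the general idea of making "any modification easily spottable" through watermarking, the toy sketch below implements a *fragile* watermark: a hash of an image's content bits is embedded into its least significant bits (LSBs), so that any later change to the content invalidates the mark. This is only a minimal illustrative assumption, not the project's actual scheme (which would use keyed, block-wise, or semi-fragile watermarks robust to benign processing); all function names here are hypothetical.

```python
import hashlib
import numpy as np

def embed_fragile_watermark(img: np.ndarray) -> np.ndarray:
    """Embed a SHA-256 hash of the upper 7 bit planes into the LSB plane.

    Toy illustration only: a practical scheme would be keyed and
    localized so tampered regions can be pinpointed.
    """
    content = (img >> 1).tobytes()                 # everything except LSBs
    digest = hashlib.sha256(content).digest()
    bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    flat = img.ravel() & 0xFE                      # clear the LSB plane
    reps = -(-flat.size // bits.size)              # tile digest over the image
    flat |= np.tile(bits, reps)[: flat.size]
    return flat.reshape(img.shape)

def verify_fragile_watermark(img: np.ndarray) -> bool:
    """True iff the LSB plane still matches the hash of the content bits."""
    content = (img >> 1).tobytes()
    digest = hashlib.sha256(content).digest()
    bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    reps = -(-img.size // bits.size)
    expected = np.tile(bits, reps)[: img.size]
    return bool(np.array_equal(img.ravel() & 1, expected))

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
marked = embed_fragile_watermark(original)
assert verify_fragile_watermark(marked)            # untouched image verifies

tampered = marked.copy()
tampered[10, 10] ^= 0x80                           # flip one content bit
assert not verify_fragile_watermark(tampered)      # modification is spotted
```

Because embedding only rewrites the LSB plane, the marked image is visually indistinguishable from the original, while even a single-bit change to the content bits breaks verification; the project's detection tools would complement such watermarks with signal processing and machine learning classifiers for unmarked content.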