European Union lawmakers have approved draft rules requiring companies such as Alphabet's Google, Meta and other online services to identify and remove child sexual abuse material.
The draft child sexual abuse material (CSAM) rules, presented by the European Commission last year, have sparked a dispute between advocates of online safety and privacy activists alarmed by the potential for intrusion into personal data.
The EU executive proposed the CSAM rules because the current voluntary system of detection and reporting has proved insufficient to protect children. Lawmakers must still hammer out the final details with member states before the draft can become law, which could happen in 2024.
The proposed law would require online services, apps and internet access providers to report and remove known and new images and videos, as well as cases of grooming. It also provides for the creation of an EU Centre on Child Sexual Abuse, which would forward reports to the police.
"The European Parliament's position removes indiscriminate chat control and allows only targeted surveillance of individuals and groups reasonably suspected of being linked to child sexual abuse material, subject to a judicial warrant," said the European youth association LYMEC.