24 May 2022

On 11 May 2022, the European Commission published its new EU Strategy for a Better Internet for Children (BIK+), which finally tackles the question of children’s exposure to online pornography in a clear way. This focus is warmly welcomed, in a context of exponential growth in children’s online sharing and communication, at ever earlier ages and on an almost constant basis, exacerbated by the COVID-19 pandemic. Online activity can expose children to risks such as cyberbullying, access to age-inappropriate content, self-generated sexually explicit content (sexting), solicitation for the purposes of sexual exploitation (grooming), and child sexual abuse and exploitation.

By virtue of Article 3(3) and (5) of the Treaty on European Union and Article 24 of the EU Charter of Fundamental Rights, the European Union is competent to ensure the protection of children in the digital environment. The European Commission has demonstrated its commitment to a better internet for children over the past years, first with the 2021 EU Strategy on the Rights of the Child, which put a particular focus on a safe digital environment for children. The EU will also soon revise its 2011 Directive on the fight against child sexual abuse, to address its failed implementation EU-wide. Finally, the new BIK+ Strategy chose as its first pillar “safe digital experiences to protect children from harmful and illegal online content, conduct, contact and consumer risks and to improve their well-being online through a safe, age-appropriate digital environment, created in a way that respects children’s best interests”.

This publication follows a period of consultation by the Commission with experts and civil society organisations committed to these issues. FAFCE contributed to the public consultation and is pleased to see several of its points included in the Strategy:

“In response to the Commission’s consultation, the list of concerns identified by children themselves included seeing harmful content, which can glorify and promote self-harm, suicide, violence, hate speech, sexual harassment, drug taking, risky online challenges, eating disorders and dangerous dieting practices. Such violent, frightening or otherwise age-inappropriate content is within easy reach. Children report seeing pornography at early ages, affecting their views of what constitutes a healthy relationship.”

“Despite existing EU law (AVMSD and GDPR), age verification mechanisms and parental consent tools are still ineffective in many cases, with users often only required to enter their birth date upon registration.”

“However, as outlined in section 2 above, it is expected that the recently endorsed DSA [Digital Services Act] will significantly improve the safety of all users, including children, and empower them to make informed choices when online. In particular, as part of the DSA risk management framework, systemic risks relating to minors require specific attention. Very large online platforms will be obliged to consider how easy it is for children to understand the design and functioning of their service, as well as how children can be exposed to content that may impair their physical and mental health, and moral development. Such risks may arise, for example, in the design of online interfaces which intentionally or unintentionally exploit the inexperience of children or which may cause addictive behaviour. In this regard, very large online platforms will have to adopt targeted measures to protect the rights of the child, including use of age verification and parental control tools, or tools aimed at helping children signal abuse or obtain support. Cyberviolence, including non-consensual sharing of intimate content, is an example of content that requires rapid processing if flagged by users, including children, and appropriate adaptation of the content moderation practices.”

The Digital Services Act is a Regulation proposed by the European Commission, on which the European Parliament and the Council of the EU finally reached an agreement on 23 April 2022. By 2024, it will oblige online platforms to “identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union”, which includes the “intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security” (Article 26(1)(c) of the Regulation proposal).

Pornography is known to portray a fictive vision of sexuality. It generates misleading perceptions of sexuality, inadequate expectations of relationships and a vision of women and men as sexual objects. Moreover, for users who do not yet have the mental capacity to distinguish fiction from reality, such as children, the consumption of pornography has even more serious consequences. FAFCE will therefore continue to raise awareness of the harmful impacts of pornography as an issue of public health, with enormous consequences for the mental and moral development of children, and to advocate for effective mechanisms to prevent children’s access to online pornography.

For more information on FAFCE’s commitment against pornography: