European Parliament adopts proposal to set minimum age for social media use at 16

Citing the physical and mental health risks minors face online, the European Parliament has overwhelmingly adopted a proposal to harmonize the minimum age for using social media platforms at 16 across the EU. Alongside this proposal, Parliament also called on the European Commission to take several other measures to protect minors.

Children should be at least 16 to access social media, say MEPs | News | European Parliament

https://www.europarl.europa.eu/news/en/press-room/20251120IPR31496/children-should-be-at-least-16-to-access-social-media-say-meps

According to the report by rapporteur Christel Schaldemose of the European Parliament's Internal Market and Consumer Protection Committee (IMCO), 97% of young people are online in some form every day, and 78% of those aged 13 to 17 check their smartphones or other devices at least once an hour. One in four minors is said to display problematic, addiction-like smartphone use.

According to the EU's Eurobarometer opinion survey, over 90% of EU citizens consider protecting children online an urgent issue, calling for effective measures against the negative impact of social media on mental health, against cyberbullying, and for restrictions on access to age-inappropriate content.

Against this backdrop, the European Parliament adopted a report on properly protecting minors in the online environment by 483 votes in favor, 92 against, and 86 abstentions, proposing to unify the minimum age for using social media across the EU at 16. Under the proposal, children aged 13 to 16 would be able to use social media with parental consent.

The European Commission is taking the lead in developing an 'EU age verification app' and is also working on measures to realize a 'European digital ID wallet.'

EU launches prototype age verification app to protect children online, testing begins in France, Spain, Italy, Denmark, and Greece - GIGAZINE

While supporting the European Commission's efforts, the European Parliament strongly called for age verification systems to be accurate and to protect minors' privacy, and stressed that these measures do not relieve platform operators of their responsibility to 'ensure safety and age-appropriateness at the product design stage.'

To ensure compliance with the EU's Digital Services Act (DSA) and related regulations, the report also suggests that senior management could be held personally liable in the event of serious and persistent breaches of provisions on the protection of minors and age verification.

In addition to the minimum age requirement, the report calls for the following measures:

- Prohibiting addictive practices targeting minors and disabling other addictive features by default, including infinite scrolling, autoplay, pull-to-refresh, and harmful gamification.
- Prohibiting sites that do not comply with EU regulations.
- Measures under the upcoming Digital Fairness Act (DFA) to address manipulative techniques such as targeted advertising, influencer marketing, addictive design, and dark patterns.
- Prohibiting engagement-based recommendation systems aimed at minors.
- Applying DSA rules to online video platforms and prohibiting loot boxes and other randomized game features, including in-app currencies, gacha mechanics, and 'pay-to-progress' schemes (paying to speed up gameplay).
- Protecting minors from commercial exploitation, including a ban on offering financial incentives to child influencers.
- Urgently addressing the ethical and legal issues raised by generative AI tools, such as deepfakes, companion chatbots, AI agents, and AI apps that generate nude images.
