A recent decision by a Los Angeles court found Google and Meta liable for social media addiction among young people, following a complaint by a 20-year-old Californian woman who accused YouTube (Google) and Instagram (Meta) of causing her depression and suicidal thoughts, arguing that the design of the apps had made her addicted to them at a young age.
The significant innovation lies in the subject of the claim: rather than challenging individual videos or posts, the complaint targeted structural elements of the apps themselves, such as infinite scrolling, recommendation algorithms, the absence of filters, notifications, and personalisation mechanisms. This allowed the plaintiff to overcome one of the platforms’ main defences in the United States, namely that they are merely intermediaries and not publishers of the content posted.
For the first time, Google and Meta have been held liable not so much for the content they host as for the design of their platforms and services, which is deemed capable of influencing user behaviour and fostering addiction in minors, with potentially harmful effects.
The consequences could be significant, particularly in the United States, where an increase in litigation and a possible review of platform design models are anticipated. However, the outcome remains uncertain, as this is a first-instance decision against which an appeal has already been announced.
Within the European context, and consequently in Italy, there is a structured regulatory framework governed by Regulation (EU) 2022/2065 – the Digital Services Act – which clearly sets out the obligations of intermediary service providers such as the platforms in question. These obligations concern diligent conduct aimed at ensuring a transparent online environment, with general terms and conditions of use that include content moderation criteria and any restrictions applied to users, whilst respecting fundamental rights. Hosting services must also provide a system for reporting illegal content, together with complaint mechanisms accessible to users. Article 54 of the Digital Services Act further grants recipients a right to compensation for damages suffered as a result of a breach of the Regulation’s obligations by the intermediary service provider.
However, the provisions of the Regulation and the resulting possibility of compensation mainly concern the management of the content of such services and the ways in which they must operate to be considered transparent, effectively requiring a description of how they are designed, how moderation systems work, what criteria are applied and what tools (including algorithmic ones) are used.
Consequently, in Italy, a ruling similar to that of the Californian court cannot be excluded, but it would face significant obstacles. Existing legal instruments — in particular tort liability under Article 2043 of the Civil Code — would in fact allow for claims for damages, but would require rigorous and complex proof of a causal link between the platform’s design and the harm suffered.
As a consequence, whilst litigation based on platform design is also conceivable in Italy, developments on this issue are more likely to proceed through regulatory measures and public enforcement than through court rulings with a systemic impact comparable to those in the US.
In Italy, the relevant Senate committee is currently debating the Disegno di Legge (Draft Law) S.1136, ‘Provisions for the protection of minors in the digital environment’, a new draft of which was adopted in March 2026, with the aim of raising the level of protection of minors’ mental and physical health against the consequences of using online social networking services and video-sharing platforms.
The draft law introduces a series of measures to strengthen the protection of minors when using social networks and video-sharing platforms, adopting a rather restrictive approach compared to the current situation.
Firstly, it prohibits the creation of accounts on online social networks and video-sharing platforms before the age of 15, with the introduction of a strict official age verification system.
From a legal point of view, contracts entered into by minors under the age of 15 will be void; an account created before the law enters into force will remain valid only if the minor has reached the age of 15 by that time.
For minors over the age of 15 who work as influencers, it is envisaged that the Communications Regulatory Authority (AGCOM) will draw up specific guidelines with clear rules on transparency, advertising, the protection of rights and the recognisability of promotional content. Finally, state-funded public campaigns are planned to promote parental control and a more informed use of the internet, aimed at both minors and parents.
The debate is still ongoing and the legislation is likely to undergo further amendments before final approval.
