Kinga Sorbán, An elephant in the room—EU policy gaps in the regulation of moderating illegal sexual content on video-sharing platforms, International Journal of Law and Information Technology, Volume 31, Issue 3, Autumn 2023, Pages 171–185, https://doi.org/10.1093/ijlit/eaad024
Abstract
With the availability of broadband internet and a significant increase in storage capacity, people are sharing a dynamically growing amount of multimedia content on video-sharing platforms, including sexually explicit content. While sexual content on the internet is not illegal per se and remains in the domain of economic services, there are certain forms of sexually explicit content that are criminally unlawful, such as child pornography or non-consensual pornography. This article explores why the current business-oriented regulatory environment fails to provide effective tools to combat illegal sexual content: first, from the perspective of content moderation on video-sharing platforms, and then from the perspective of substantive criminal law. After mapping the gaps in policy at all levels of content regulation, the article offers recommendations to address these issues, open up a policy debate and influence policy makers.
Introduction
Back in 2018, during the revision of the Audiovisual Media Services Directive (the AVMSD),1 I was attending an expert group meeting in Brussels on video-sharing platform regulation. The main item on that day’s agenda was to set up a typology of video-sharing platforms (VSPs), which would help Member States implement the future directive in a way that results in a graduated system of protecting users. As the most popular platforms with a video-sharing function, like YouTube and Facebook, were mentioned early on, I suddenly sensed that something was missing. I asked whether adult platforms would constitute an independent platform category, as they tend to offer content that is and should be restricted for certain—and in the case of illegal content, all—audience groups. After a fleeting moment of somewhat awkward silence, another member of the expert group noted that we had all just noticed the elephant in the room.
There are adult websites on the internet that allow users to upload their own sexually explicit content, some of which receive more website traffic than the streaming giants Amazon and Netflix.2 As they fit the definition of VSPs set out in the AVMSD and the definition of hosting service providers set out in the E-Commerce Directive, the EU’s framework for regulating content does, in theory, apply to them. In spite of these rules, the problems with these VSPs are becoming strikingly evident. Take one of the largest adult VSPs, Pornhub, for example. In 2020, The New York Times published an exposé accusing the platform of aiding the distribution of child pornography and rape videos.3 Following the article, both Visa and MasterCard announced that they would stop processing payments for the provider because it hosts illegal content.4 While Pornhub made some promises and took a few measures to prevent the dissemination of such content, the next scandal followed shortly after. In 2022, The New Yorker published an article on how the service provider still failed to tackle the issue of non-consensual videos,5 leading to the resignation of Pornhub’s CEO and COO.6
Platform providers are in a good position to curb the harms caused by illegal content on the internet. Due to the gatekeeping nature of their platforms, they have the power to monitor, filter and moderate certain forms of content.7 Furthermore, under the E-commerce Directive (and, more recently, the DSA), they are obliged to remove illegal material in order to be exempted from liability for third-party illegal content.8 Even though platform providers bear secondary liability9 in cases where they fail to comply with their obligation to remove illegal content, the examples above show that illegal sexual content remains a significant issue on adult VSPs.
Clearly, there is a flaw in the system that prevents the sexual content regulatory regime from fulfilling its main objective: protecting vulnerable user groups and certain individual users. Is it a policy gap that renders the system of tackling illegal sexual content on video-sharing platforms ineffective? If so, how can this shortcoming be remedied on the EU or the Member State level?
In Europe, there have been several attempts to regulate the conduct of VSPs that label themselves as adult websites. In 2017, the UK passed the Digital Economy Act, colloquially known as Britain’s porn block, which required pornographic websites to introduce age-verification systems in order to prevent access by users under the age of 18. The act was never implemented because of the serious flaws and shortcomings in its framing:10 the nation-wide block could have been easily bypassed by the use of VPNs, while identification and verification raised serious privacy concerns.11
When the EU drafted the Digital Services Act (the DSA),12 there was a lively debate during the trilogues between the European Parliament, the Council and the Commission on how the EU should deal with online platforms where user-generated pornographic content can be published. The original regulatory idea13 was that if the primary purpose of an online platform was to share pornographic material, the provider would have to offer enhanced protection to its users through various technical and organizational measures. Among the technical measures, service providers would have been required to oblige content creators to identify themselves by providing their email address and phone number. This would have allowed for the rejection of registrants under the age limit and, in the event of an infringement, the easy identification of the infringer. As an organizational measure, operators would have had to employ human moderators qualified to identify image-based sexual abuse, including illegal sexual content. In addition to the general complaints-handling procedure applicable to all platform providers, these platforms would also have had to establish a qualified complaints-handling mechanism to swiftly identify and remove sexually explicit material posted without consent. These provisions were dismissed during the political negotiations.
The regulation of VSPs hosting pornographic content thus remains an open issue, which becomes more pressing as so-called adult VSPs proliferate. According to the MAVISE database set up by the European Audiovisual Observatory, about a third of the 72 VSPs registered in the European Union are pornographic.14
In this article, my aim is two-fold: first, to draw attention to policy gaps which render the regulatory framework inadequate for dealing with certain categories of VSPs, such as adult platforms. Second, I argue that the issue of sexually explicit content is not merely a matter of content governance once the possibility of criminal liability emerges. I aim to provide an overview of those EU rules that penalize certain forms of sexual content, highlighting that only some basic offences (eg child sexual abuse material) are regulated in an EU-wide context, while the remainder of illegal sexual content is criminalized by Member States’ laws.
To synthesize these parts, my final aim is to illustrate what happens when adult VSPs have to act on content that is only criminalized in certain jurisdictions. Generalist platforms, like YouTube and Facebook, have robust systems in place that filter all sorts of sexual content. Platforms offering specialized or niche services that claim to be ‘adult platforms’ are more prone to turning a blind eye and failing to moderate illegal content. I argue that such platforms often lack the initiative to complement overly broad EU rules and patchy Member State laws: even though their terms and conditions may comply with existing regulations, they take no extra steps to add additional layers of protection through private regulation, nor do they pay particular attention to enforcing the laws.
This last part also aims to formulate a set of policy proposals that could incentivize adult content VSPs to apply voluntary measures that would offer users more protection.
The place of graduated content regulation within the regulatory framework of video-sharing platform services
VSP regulation is a novel area of the European regulation of services. Before the amendment of the Audiovisual Media Services Directive in 2018, there were no specific European rules addressing VSPs. Prior to the AVMSD, only the E-commerce Directive offered a limited set of rules on hosting service providers, ie all services that provide storage for third-party information. In this section of the study, I will provide a brief overview of the current regulatory landscape of VSPs, outlining the broad regulatory framework that governs their functioning. This will be followed by a description of the laws on content moderation, paying specific attention to those policy areas that concern sexual content. Narrowing the scope of analysis in this way makes it possible to highlight the differences in regulatory attitudes to lawful and unlawful online sexual content.
The broader regulatory framework for video-sharing platform services
Video-sharing platforms are online services devoted to providing programmes, user-generated videos, or both, to the general public, while allowing users to interact with each other and the content. Online services in the EU are classified by the E-commerce Directive, which created three categories of internet intermediaries based on their involvement with third-party content: (i) mere conduits, (ii) caching providers and (iii) hosting service providers. The newly adopted Digital Services Act (the DSA) defines two types of hosting service providers: simple hosting service providers that merely provide storage for third-party information, and online platforms that allow for user interactions. Video-sharing platform service providers constitute a sub-category within online platforms; as such, their main purpose is to store and disseminate not merely information, but audiovisual content, which can be either user-generated or produced by professional media outlets.
In the EU, VSPs are perceived and regulated as services—one of the basic building blocks of the internal market alongside goods, persons and capital. Given their pivotal role in the internal market, their regulation falls within the EU’s competence: the EU is vested with the power to regulate them as economic activities, which results in a generally business-oriented approach drawn up along business interests and market considerations. The regulation of VSPs as services focuses on maintaining market integrity, protecting consumers in B2C relationships and supporting competitiveness. These objectives are promoted through various areas of legislation—consumer protection, data protection, competition law and content regulation—intended to boost economic growth, and it is these same areas of law that form the basis of the regulatory framework for online services.
As shown on the grid in Figure 1, the logic of regulations that apply to online services is the same: the above legal areas constitute the horizontal dimension of regulation and the service-specific rules constitute the vertical dimension, resulting in a complex regulatory environment. In academia, this complexity is attributed to their unruly status as internet intermediaries, occupying a liminal position pertaining both to their functionality and to the status of their operators.15

There is only one dimension where the above business-oriented approach falters: content regulation, which is rooted in the public interest obligations generally associated with the media.16 In the traditional media setting, the obligations of media outlets include protecting the audience, or certain audience groups, from harmful and/or illegal content. A tangible tension emerges when rules associated with these public interest objectives are embedded into a business-oriented framework, because the two approaches are inherently different. This tension is particularly evident in the moderation of pornographic content on video-sharing platforms. The business-oriented approach perceives audiovisual content as a marketable product and seeks to balance consumer and market interests. If, however, the given content constitutes a crime, this approach loses all relevance. The regulation of illegal content must achieve non-economic objectives: it must facilitate the prosecution of the perpetrator and the termination of the infringement (by deleting the content in question). The current European frameworks that govern the behaviour of online platforms do not pay sufficient attention to these aspects, as both the measures enshrined in the AVMS Directive and the notice-and-takedown procedure featured in the DSA are still balancing interests when regulating liability exemptions and user-platform disputes.
Graduated content-regulation for VSPs
The first AVMSD regulated all content services,17 including linear and on-demand services, irrespective of the means of transmission. When it was enacted, it replaced the EU’s outdated vertical regulatory framework, which differentiated services depending on the sector they fell into.18 Since then, the EU’s content regulation has followed a horizontal approach, which has integrated the provisions that apply to content uploaded to VSPs. Horizontal content regulation does not mean that uniform rules apply to all types of services. In the case of audiovisual media services (linear and on-demand services), where the service provider exercises editorial control over the content provided, the applicable regulations tend to be more detailed and robust than those that apply to other forms of online services, where the provider merely exercises an intermediary function between the users of the service and has significantly less influence over the content uploaded.
Peggy Valcke and David Stevens call this a graduated framework, yet it is only graduated at the level of services. I prefer to call it semi-graduated because, as illustrated in Figure 2, beyond the level of services it offers little to no room for manoeuvre to enact different rules for the sub-categories of services, even when specificities would justify different treatment.

The provisions to be discussed apply to all platforms that fit the definition of VSPs, irrespective of the nature of the content offered, ranging from sites that host children’s cartoons to hardcore pornography. In the case of explicit, yet not illegal, content, the Member States have limited options to tailor the rules that apply to VSPs established in their territories to national policies and specificities—an opportunity seldom taken.19 The DSA enacts a broadly structured framework for online platform regulation, as it places different obligations on different online platforms, including VSPs. However, because it is not an instrument of content governance, it differentiates rules on the basis of the size of the platform rather than the type of content offered. I argue that this ‘one-size-fits-all’ approach is not an effective way to tackle illegal content in cyberspace, because it leads to the proliferation of complementary national rules and a chaotic system of exceptions. This patchworked regulation is already noticeable in the case of hate speech,20 which global platform providers try to compensate for through private regulation.
Moderating third-party content on video-sharing platforms
VSPs in Europe have limited obligations regarding third-party content. These limited obligations are the consequence of the providers’ limited liability for third-party unlawful content, first introduced to EU law by the E-commerce Directive, which laid down a liability framework for hosting service providers in view of their limited control over the information distributed through their services. According to the E-Commerce Directive, hosting service providers (including VSPs and online platforms) are only liable for third-party content if they fail to remove it once they gain knowledge of its presence on their services. ‘Knowledge’, in the meaning of the E-commerce Directive, is not understood as an abstract notion pointing to any reasonable assumption that the service is being used to host illegal content, but is rather defined as the actual recognition of clearly identifiable pieces of information. The DSA maintains this framework of limited liability and complements it with a good Samaritan clause similar to the American CDA’s rules. Under this new clause, providers do not lose the liability exemption merely because they voluntarily search their platforms for illegal content. This clause has the unconcealed aim of incentivizing providers to actively engage in content moderation of their own accord, even where such an obligation is not set out in the law of the Member State having jurisdiction.
Content moderation is a platform governance tool, which involves the ‘organized practice of screening user-generated content (UGC) […] in order to determine the appropriateness of the content for a given site, locality, or jurisdiction’21 and which also involves certain interventions in the case of unacceptable content.22 A common element in existing definitions is that they mention inappropriateness, unacceptability or harmfulness among the main reasons for moderation, but never simply unlawfulness, leading to the conclusion that moderation concerns a significantly wider range of content. While illegal content is always law-infringing material, harmful content often does not violate the law and is only deemed inappropriate for certain sensitive or vulnerable audience groups. This dichotomy is also reflected in the AVMSD, which differentiates between illegal and harmful content at the regulatory level.
Under the AVMSD, there are four categories of illegal content: child pornography, terrorist content, hate speech and xenophobic content, and content that constitutes a criminal offence under EU law. This list is rather restrictive and does not cover a broad range of conduct that is criminalized in Member States’ laws. For example, there are no common European rules criminalizing acts such as revenge pornography or malicious deepfakes, while hate speech in the European context refers to racist and xenophobic speech (ignoring speech based on other characteristics such as gender, sexual orientation or political beliefs).
While the list of illegal content is considered exhaustive, the AVMSD does not set up such a list for harmful content. The scope of harmful content is the most difficult to define precisely, as the extent to which a particular type of content is considered harmful to a child in each jurisdiction depends largely on cultural background and ideology, but also on the prevailing scientific evidence on child psychology and development. Even taking all of these aspects into account, it is not possible to state clearly that the content identified by the rules as harmful is universally harmful to all children. The age of the child, his or her social background and personal sensitivity also have a major influence on the actual degree of harm.
The directive sets out the protection of minors from harmful media content as a policy objective whilst naming only two of the most harmful content types: content depicting violence and pornography. Member States and platform providers are given a broad margin of appreciation to determine content categories that they consider harmful. The Dutch Kijkwijzer system, which serves as a model for several European countries, classifies the following categories of content as harmful to minors:
- violence,
- sexuality,
- incitement to fear,
- alcohol or drug use,
- discrimination.23
The exact measures adopted to moderate content must be aligned with the harmfulness of the content in question. Thus, the most harmful—ie, illegal—content has to be subjected to the strictest measure: removal. The AVMSD suggests a handful of measures, such as informing users, providing effective tools for handling complaints, and setting up parental control and age-verification systems.24 Besides content removal, content moderation has a large range of tools available, such as (i) implementing age-verification systems, (ii) geo-blocking, (iii) applying labels, metatags and other content descriptors, (iv) imposing demonetization and punitive strike systems, (v) providing counterspeech25 and (vi) reducing the visibility or reach of problematic content.26 Member States enjoy a considerable amount of freedom to choose which measures to apply, while it is essentially up to the platforms to decide how to implement them. In theory, this allows protective measures to be tailored to fit the type of platform. In the meantime, a 2022 report by the European Regulators Group for Audiovisual Media Services (ERGA)27 notes that 25 NRAs in the EU declared that their national legislation merely replicates Article 28b of the AVMSD. While the AVMSD suggests a list of applicable measures, its rules merely set a minimum framework. The Directive does not set further requirements concerning the technical means of fulfilling the measures, benchmarking or measuring effectiveness. This means that setting up a simple security question asking whether the user has reached the age of majority may qualify as an age-verification tool under the AVMSD, in spite of its evident lack of efficacy.
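To make this graduated logic concrete, the short sketch below maps the three broad categories discussed here (illegal, lawful-but-harmful, lawful) onto progressively stricter responses. It is a minimal, purely illustrative sketch in Python: the classification labels and action names are my own placeholders and do not reproduce the wording of the AVMSD or the tooling of any actual platform.

```python
from enum import Enum, auto


class Classification(Enum):
    ILLEGAL = auto()   # eg child sexual abuse material or other criminalized content
    HARMFUL = auto()   # lawful, but harmful to minors (eg pornography, graphic violence)
    LAWFUL = auto()    # content requiring no restriction


def moderate(classification: Classification) -> list[str]:
    """Map a classification to a graduated set of moderation actions.

    The action names are hypothetical placeholders, not the wording of the
    AVMSD or of any platform's actual moderation tooling.
    """
    if classification is Classification.ILLEGAL:
        # Strictest response: removal, notification of law enforcement and
        # prevention of re-upload (eg by hashing the removed file).
        return ["remove", "notify_law_enforcement", "add_to_reupload_filter"]
    if classification is Classification.HARMFUL:
        # Restriction rather than removal: limit who can encounter the content.
        return ["age_restrict", "label", "exclude_from_recommendations"]
    return ["allow"]


print(moderate(Classification.HARMFUL))
# ['age_restrict', 'label', 'exclude_from_recommendations']
```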
The types of illegal sexual content on video-sharing platforms
Sexual content on VSPs can be either harmful to minors or illegal, and is thus subject to moderation. Generally, pornographic material is not prohibited: the creation and distribution of pornography is a million-dollar industry. I will call this lawful sexual content. Due to its harmfulness to certain vulnerable audience groups, typically minors, certain restrictions apply to lawful sexual content, yet these were not set up for the purpose of banning such content from the internet. Similarly to rules governing alcohol and tobacco advertising, these restrictive measures do not question the product nature of the content, but merely limit its free flow on the internet out of public policy considerations, such as protecting public morals or the healthy development of children. While lawful sexual content only needs to be restricted for certain audiences, illegal content must be removed from all platforms, simultaneously with notifying the relevant law enforcement authorities and preventing re-upload.
The issue to be discussed in this part of the article is that there are no common EU rules governing criminalized sexual content. While the most serious offences involving sexual content are subject to harmonized rules in the EU and to the Cybercrime Convention in a wider European context, this is not the case for sanctions applicable to other types of illegal sexual content, which fall under the jurisdiction of Member States. The following section will set out a typology of illegal sexual content found in cyberspace and point out the detrimental effects of not having clear and uniform rules to counter it. The typology of illegal conduct involving sexual content can be drawn up by examining the trends of recent policy discussions. While combatting child sexual abuse material remains a recurring topic in policy debates, there are two other notable areas where policymakers actively call for intervention: non-consensual sexual imagery and malicious sexual deepfakes.
Child sexual abuse material
Among all the unlawful conduct involving sexual content, EU law only recognizes child pornography as an EU crime in accordance with the Treaty of Lisbon. Although the criminalization of content depicting minors in sexually explicit situations seems straightforward, in-depth analysis of the existing rules reveals several elements that allow for different interpretations.
First, the terms ‘child’ and ‘minor’ are both used to define victims. Yet, the two notions have slightly different meanings, which may cause a misunderstanding, especially in relation to EU Member States and third countries. The EU Directive on combating the sexual abuse of children defines and uses the term ‘child’ and considers any person below the age of 18 years to be a child.28
The Cybercrime Convention, however, uses the term ‘minor’. According to the Cybercrime Convention, ‘minor’ refers to persons under 18 years of age, but State Parties are free to set a lower age limit (but not less than 16 years of age).29 This definition is in accordance with the UN Convention on the Rights of the Child, which uses the same definition but the term ‘child’. While the two terms appear to be synonymous, they are not. The term ‘child’ is perceived as a universal term, referring to all persons under the age of 18 years, while the term ‘minor’ should be understood as a legal notion. The Luxembourg Guidelines describe minors as persons who have not reached the age of majority.30 The age of majority is ‘the legally defined age at which a person becomes an adult, with all the attendant rights and responsibilities of adulthood’.31 The age of majority is generally set at 18 years, but there are some exceptions. Under some jurisdictions, persons under 18 can become emancipated (eg by marriage or by obtaining a court order), which means that children can legally attain the status of adults. It is generally presumed that minors cannot legally consent to the creation of pornographic material,32 since they lack sufficient discretion to assess the consequences of their actions. However, the status of being an emancipated minor expressly involves the capacity to make certain decisions (such as entering into contracts or managing property) on one’s own behalf. Being an emancipated minor may not, however, result in persons aged between 16 and 18 losing their status as children and becoming excluded from the protection that international legal instruments afford to them.
László Dornfeld notes that the biggest problem with the age-based approach is that there is a gap between the age of consent and the protection offered by laws aiming to protect children.33 For example, the age of consent is 16 in the UK, 15 in France and 14 in Hungary, or even as low as 12 if both partners are under 18.
Second, using the term ‘pornography’ for acts related to the sexual exploitation of children is deemed controversial by many,34 as it underplays the significance of the issue.35 The dictionary defines pornography as ‘the depiction of erotic behaviour (in pictures or writing) intended to cause sexual excitement’.36 Edwards argues that child sexual abuse material cannot be considered erotica, as this imagery represents the ‘rape, abuse and torture of children’.37 The Luxembourg Guidelines on terminology highlight that the term pornography is generally used to describe a commercial product in which consenting adults engage in sexual activity.38 Danijela Frangež et al. highlight that pornography is a commonly used, broadly accepted and trivialized term, which refers to a type of material that is legal in the majority of the countries of the globe; using the term for material depicting the sexual abuse and exploitation of children may, in fact, legitimize the phenomenon.39 The terms commonly used nowadays are child sexual abuse material and child sexual exploitation material. The latter is used by international police organizations such as Europol,40 while the former is used by several organizations and private entities, including the International Association of Internet Hotlines (INHOPE)41 and YouTube (Google). The EU still uses the term child pornography, but its Directive on combating the sexual abuse of children clarifies that child pornography consists of images of child sexual abuse.42 The EU has also acknowledged the inadequacy of the term: the European Parliament started communicating its effort to correct the terminology in its Resolution on Online Child Sexual Abuse in 2015.43 Child sexual abuse material and child sexual exploitation material are considered acceptable terms because they clearly indicate that we are in fact talking about violent, abusive conduct that takes place on a regular basis, often with the aim of realizing financial gain. These terms do not cover self-generated, sexually explicit content produced consensually (unless consent is a result of putting pressure on, or coercing, children), which helps avoid the unnecessary stigmatization of adolescents.
Third, the Lanzarote Convention only aims to criminalize offences that victimize real children: virtual child pornography and the conduct of adult actors mimicking minors are not addressed, while the Cybercrime Convention also covers material that depicts persons representing or appearing to be minors. As such, the two Conventions establish different levels of protection: those States that did not sign and ratify the Cybercrime Convention have to criminalize a narrower set of conducts. The different levels of protection give rise to merely theoretical debates and do not pose a problem in practice, because the Cybercrime Convention was signed and ratified by all State Parties to the Lanzarote Convention.44
Non-consensual pornography
Non-consensual pornography has emerged in recent years and stirred lively regulatory debates. Early definitions, however, mislabelled the phenomenon as ‘revenge porn’, which is inaccurate for several reasons. First, it presupposes that the act was committed with a specific intent to take revenge. Although this is often the case when someone publishes intimate imagery of a former partner after a break-up, the issue of non-consensual sexual imagery is much broader. In fact, the conduct referred to as non-consensual pornography turns on the issue of consent, or the lack thereof. This absence of consent can relate to the publication or dissemination of the material, in cases where all parties have consented to the sexual act and its documentation, just not to making it available online. Or, the lack of consent can relate to the production of such material, in which case the sexual act is consensual but at least one of the parties did not agree to documenting it. Voyeur videos are a popular manifestation of content made without consent. Finally, the lack of consent can relate to the sexual intercourse itself, where not only the creation and dissemination of the material capturing the sexual act, but the act itself—rape—is a serious crime. As, unfortunately, rape videos are circulated on social media platforms, the regulation of non-consensual pornography must address this issue as well.
The penalization of the remaining types of illegal sexual content rests within the discretion of Member States. While some—eg Italy and Germany—criminalize the non-consensual dissemination of sexual imagery, others do not. Member States’ laws regulating the issue are not uniform, as they tend to address different aspects of the problem. For example, Italian Law 69/2019 on Non-Consensual Pornography penalizes the illicit distribution of sexually explicit images or videos as a form of gender-based violence.45 According to Italian law, a mandatory element of the actus reus of non-consensual pornography is the violation of confidentiality, which implies that one of the parties expressly objected to the publication of the material. Some argue that this can be interpreted as meaning that if explicit consent to publish was not or could not be provided, the act is still in breach of the law.46
In the United Kingdom, the publication of non-consensual pornography is a crime under the Criminal Justice and Courts Act 2015, but only if evidence shows that it was committed with a specific intent, namely to cause distress. Critics of the act highlight that the requirement of this specific mens rea gives rise to a defence that is difficult to challenge: that the required intent to cause distress was absent.47 The UK is, however, not the only European jurisdiction where intent is a core element of non-consensual pornography laws. The French Digital Republic Act of 2016 criminalizes the violation of the intimacy of private life insofar as the act was wilful.48
There are EU Member States that do not have specific legal provisions to address non-consensual pornography, yet criminalize conduct that violates certain aspects of private life. This is the case in Germany, where Section 201a of the German Criminal Code criminalizes the violation of the highly personal sphere of private life, and in Hungary, where the Hungarian Criminal Code does the same. These provisions were not specifically created to tackle non-consensual pornography, yet as the image of a person is protected under personality rights in these countries, the dissemination of sexual imagery without consent is clearly a violation of those rights.
These diverse laws result, once again, in a patchworked regulatory environment, where victims are offered different levels of protection based on the country they live in. Platforms that have to abide by the laws of the Member State having jurisdiction sometimes level the regulatory landscape through platform-based private regulation, introducing bans or restrictions on certain content types regardless of the laws of their respective places of establishment.
Malicious deepfakes
In 2017, the famous actress Gal Gadot, noted for her portrayal of Wonder Woman in the DC Comics cinema franchise, appeared to star in a pornographic movie. Shortly afterwards, it transpired that the film was in fact fake and that Gadot’s face had been superimposed onto an adult movie star’s body.49 Wired magazine reported that pornographic deepfakes50 were more popular in 2020 than ever, with up to 1000 deepfake videos being uploaded to porn sites every month, each surrounded by advertisements that generate revenue for the providers.51
In regulatory terms, the phenomenon is a novel one, and there are thus no comprehensive regulatory concepts in the EU to tackle it. The issue of pornographic deepfakes is often at the centre of regulatory debates in the United States and, increasingly, in the UK. While some argue that a federal criminal statute is necessary to prohibit the publication of such deepfakes,52 others are of the opinion that deepfakes should not be criminalized, as the harms caused by them can be dealt with through existing categories of criminal, civil and administrative law.53 Although deepfake technology reportedly has legitimate commercial uses in the entertainment industry, a 2019 report showed that 96 per cent of deepfake content is non-consensual pornography.54
Two trends can be observed in policy initiatives that aim to control the flow of pornographic deepfakes. The first concentrates on the regulation of the technology itself, which is seen as a tool to prevent misuse in a broad context. Some new pieces of EU law reflect this idea: the DSA obliges very large online platforms to label deepfakes55 and the AI Act proposal, if adopted, will require those who create content using the technology to disclose that it has been artificially generated or manipulated.56
The second concentrates on conduct rather than technology and aims to tackle the misuse of the technology. This approach can be observed in the criminal legislation of Member States. Although pornographic deepfakes are not regulated by dedicated criminal law provisions in the EU, they are subsumed under existing criminal offences such as extortion, violation of privacy or sexual harassment, as the individual harms (sextortion, defamation, violations of personal integrity and reputation) and societal harms (media manipulation, undermining trust in institutions, increasing gender imbalances) give rise to criminal liability. The lack of dedicated provisions, however, leads to Member States having different takes on the issue. The Digital Republic Law of France provides protection from the unauthorized use of personal data, including non-consensual sexual imagery and deepfakes.57 A similar data protection-led approach is applied in Germany, Spain58 and Hungary.59
Platform private regulation as compensational mechanism
To compensate for the problems stemming from the gaps identified above, platforms introduce their own sets of norms, either in the form of community guidelines or bylaws. These forms of private regulation governing user behaviour are often more detailed and stringent than state laws. This type of platform-specific private regulation shows substantial differences concerning sexual content depending on the type of platform; the moderation of both lawful and illegal sexual content is thus intertwined with a video-sharing platform’s general attitude towards content depicting sexuality.
This part of the article will categorize VSPs based on their relationship with sexually explicit content and map industry-led initiatives applied by VSP service providers, and will argue that there is a higher risk of abuse on platforms that identify themselves as ‘adult’ VSPs. This hypothesis is backed up by qualitative research examining the terms and conditions of adult VSPs registered in the MAVISE database until 28 February 2023.
There are three types of VSPs based on their general attitude towards sexual content: (i) platforms that ban all kinds of sexual content, (ii) platforms that generally allow some forms of sexual content with restrictions and (iii) platforms whose main purpose is hosting sexually explicit content (adult platforms). The majority of the very large online platforms fall into the first category and completely ban content that depicts naked persons or features sexual content: YouTube, Facebook, Tumblr, Instagram and Twitch all forbid the dissemination of content featuring nudity. These platforms do not simply ban explicitly sexual content, but have their own standards for evaluating what is to be deemed sexual content. Instead of creating rigid definitions of pornography, sexual content or content depicting naked persons, the providers apply detailed descriptions of what types of content are not permitted on their service. Platforms applying a general ban not only prohibit pornography and explicit sexual content in a narrowly construed sense: they often ban sexuality and the depiction of naked persons in general. Facebook’s Community Standards prohibit the posting of images that contain real nude adults, sexual activities and content that is closely related to such activities (eg the presence of by-products of sexual activity, or the use of sex toys, even above or under clothing). The platform’s restrictions also apply to digitally created content, unless it is posted for educational, humorous or satirical purposes.60 YouTube seems to be somewhat more indulgent based on its Community Guidelines, as it only moderates videos containing sexual acts and nudity that are meant to be sexually gratifying (pornography). Videos that are sexual or contain nudity, yet are without the purpose of sexual gratification, are age-restricted by the provider instead of being deleted outright. Age-restriction may be applied to videos in which breasts, buttocks or genitals (clothed or unclothed) are the focal point of the video, in which the subject’s actions invite sexual activity (such as kissing, provocative dancing or fondling), or in which the subject wears clothing that would be unacceptable in public (such as lingerie). Violent, cruel and humiliating fetishes are prohibited by all these platforms even if they do not depict nudity. Content that justifies or promotes incest, cannibalism, necrophilia or animal pornography falls under the same judgement and is also prohibited from these platforms.
Platforms in the second category are not as dismissive of sexual content or content that depicts nudity. The platforms that fall into this category are still generalist platforms, and sexual content is only one of the types of content they allow to be hosted and shared through their service. On Snapchat and Twitter, users may be able to share explicit content and nudity by applying certain restrictive measures. Snapchat only bans the accounts of users who registered in order to share and popularize pornographic content, while users are not prohibited from sharing61 other types of sexuality and nudity, except content featuring children. Twitter’s rules and policies specifically state that the company is aware that some content may depict sensitive topics, including violent and adult content, but also that these are considered crucial elements of exercising freedom of speech on the platform. As such, this type of content is allowed on the platform if the uploader takes some precautionary measures to ensure that people who wish to avoid such content are not exposed to it. Sensitive content (such as violent, hateful and adult content) is not allowed within the most visible areas of Twitter, like live videos, profile images and banners. Users who wish to share such content must mark their accounts as sensitive. In order to facilitate informed decision-making on the part of users, a warning message has to be displayed before viewing sensitive content.62 Users’ freedom to share sensitive content is not unlimited, as Twitter explicitly bans some types of content, the majority of which involve forms of sexual content: graphic violence, pornography, violent sexual conduct, gratuitous gore and hateful imagery are banned from the platform. Given their restrictive attitude towards pornographic content, the risk of illegal sexual content spreading through these platforms can be considered low, as such content is likely to be deleted for infringing the community guidelines even if the country of jurisdiction has no legislation sanctioning it. In the case of these services, platform private regulation offers an extra layer of protection that can remedy the shortcomings of EU and Member State laws.
The third category of platforms consists of services that were specifically created to host sexually explicit content (such as Pornhub).63 Unlike platforms in the first two categories, these do not prohibit or restrict pornography, yet this does not mean that they are necessarily oblivious to rules aimed at content regulation. On the contrary, given that these platforms host sensitive content, there are some rules that should be applied and enforced with more rigour than on ‘general’ video-sharing platforms. The risk of illegal sexually explicit content being distributed through these platforms is significantly higher. Unlike generalist platforms, adult VSPs do not remove pornographic material just because it depicts sexuality with salacious openness, which means that when they moderate, they must assess the illegality of the content on its merits. Circling back to the lack of harmonized EU rules on certain forms of harmful sexual content (such as non-consensual sexual imagery or malicious deepfakes), adult VSPs may opt to remove only those forms of sexual content that are in clear violation of EU law or the law of their place of establishment. The majority of adult VSPs in the EU are established in Cyprus or Luxembourg, where regulatory burdens are considered relatively low. Once applicable, the DSA will allow the removal of content deemed illegal under the law of a Member State other than the country of establishment; however, in the absence of practical application, the impact of this provision is hard to estimate. As the measures that VSPs have to adopt to protect users from harmful and/or illegal content are not graduated in the majority of Member States, adult platforms are not particularly stringent regarding, for example, age-verification measures. On most of these platforms, age-verification tools are simplistic at best, usually involving a tick-box where users can declare whether they are above the age of 18.
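The difference between such a tick-box and genuine age verification can be illustrated with a short sketch. The following Python fragment is purely illustrative: the function names are hypothetical, it assumes an unspecified external age-assurance or identity provider, and it is not drawn from any platform's actual code. Its only point is that a self-declaration gate grants access on the basis of a claim the visitor alone controls, whereas a verification-based gate depends on evidence the visitor cannot simply assert.

```python
from typing import Optional

# A minimal sketch of the two age-gate models described above. All names are
# hypothetical; nothing here reproduces any platform's actual implementation.


def self_declared_age_gate(ticked_i_am_over_18: bool) -> bool:
    """Tick-box model: access turns solely on an unverified self-declaration."""
    return ticked_i_am_over_18


def verified_age_gate(verified_birth_year: Optional[int], current_year: int = 2023) -> bool:
    """Stricter model: access turns on an age attribute confirmed by an external
    age-assurance or identity provider, not on the visitor's own claim."""
    if verified_birth_year is None:
        return False  # no verified evidence of age, no access
    return (current_year - verified_birth_year) >= 18


# Any visitor, of any age, passes the tick-box gate simply by ticking the box:
assert self_declared_age_gate(True)
# The verification-based gate denies access when no verified attribute exists:
assert not verified_age_gate(None)
```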
As stated before, the AVMSD’s graduated approach rests on the harmfulness of the content offered and not on the type of service offered. This means that platforms where pornographic content is the key element of the service have the liberty to apply exactly the same content moderation measures as platforms where only a small percentage of the content offered is pornographic.
Conclusion
The aim of drawing attention to the shortcomings of the semi-graduated regulation, the lack of consensus in the scope of criminalized sexual content and platforms’ role in moderating sexual content was to expose the elephant in the room. Now that its presence is acknowledged, policy makers who recognize its existence can elaborate on strategies to deal with it.
In this study, I have identified four gaps that emerge at different levels of regulation, as shown in Figure 3.

Figure 3. Gaps identified on the levels of the content-regulation pyramid
In this concluding part of my paper, I wish to formulate some recommendations derived from the gaps identified in the sections above.
(R1) The semi-graduated framework needs to be more structured, so that the specificities of thematic services are addressed. The DSA’s approach of differentiating on the basis of platform size and addressing very large online platforms with specific rules is not an effective solution to the issues of sexually explicit content. One of the largest adult VSPs, with 33 million active users, does not qualify as a very large online platform, as it does not meet the 45-million-user threshold set in the DSA.64 To structure regulation better, a typology of VSPs should be created.
(R2) The measures that apply to content moderation should be tailored to better take into account the types of VSPs. While platforms that forbid the upload of sexual content do not require more detailed or stricter rules on moderating sexual content, platforms that describe themselves as adult platforms do. These measures may range from systematically searching for illegal content and setting up procedures to handle user complaints, to preventing minors from accessing lawful sexual content by applying effective age-verification measures.
(R3) Whilst the spread of content online is global, the reaction tends to be local. The jurisdiction under which a platform is established determines the applicable law and the country that proceeds in content-related cases. Enhanced harmonization of what is construed as illegal sexual content is necessary to unify action against non-consensual sexual imagery and malicious deepfakes. Enhanced harmonization would also lessen the weight placed on platform private regulation, which is often used as a correctional mechanism to compensate for non-existent or non-uniform Member State laws. In the meantime, emphasis should be put on the role of national authorities, which have the power to enforce EU law and national laws in a consistent and transparent way. Noticing the elephant in the room is only the first step towards making the online sphere a better space; more research and policy dialogues are needed to develop solutions that may serve as best practice examples for the future.
Footnotes
Research associate, Institute of the Information Society, Ludovika University of Public Service, Hungary. E-mail: [email protected]. I wish to thank Katalin Fehér (Associate Professor at the Ludovika University of Public Service) for her guidance and valuable insights during the preparation of the manuscript and Bernát Török (Director at the Eötvös József Research Centre, Ludovika University of Public Service) for his support.
Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive).
Lisa Baker E, ‘Pornhub Receives More Website Traffic than Amazon and Netflix, New Research Reveals!’ (Business in the News 20 July 2020) <https://businessinthenews.co.uk/2020/07/19/pornhub-receives-more-website-traffic-than-amazon-and-netflix-new-research-reveals/> accessed 1 May 2023.
Nicholas Kristof, ‘The Children of Pornhub. Why does Canada allow this company to profit off videos of exploitation and assault?’ The New York Times, 4 December 2020.
Robertson A, ‘Visa and Mastercard Cut off Pornhub after Report of Unlawful Videos’ (The Verge 10 December 2020) <https://www.theverge.com/2020/12/10/22168240/mastercard-ending-pornhub-payment-processing-unlawful-videos> accessed 1 May 2023.
Kolhatkar S, ‘The Fight to Hold Pornhub Accountable’ (The New Yorker 13 June 2022). <https://www.newyorker.com/magazine/2022/06/20/the-fight-to-hold-pornhub-accountable> accessed 1 May 2023.
Roth E, ‘Pornhub Owner’s CEO and COO Resign’ (The Verge 22 June 2022). <https://www.theverge.com/2022/6/22/23178427/pornhub-owners-ceo-coo-resign-antoon-tassillo-mindgeek> accessed 1 May 2023.
Laidlaw EB, ‘Myth or Promise? The Corporate Social Responsibilities of Online Service Providers for Human Rights’ in Taddeo M and Floridi L (eds), The Responsibilities of Online Service Providers. Law, Governance and Technology Series, vol 31 (Springer, Cham 2017) 136. https://doi.org/10.1007/978-3-319-47852-4_8
Miriam C Buiten, Alexandre de Streel and Martin Peitz, ‘Rethinking Liability Rules for Online Hosting Platforms’ (Summer 2020) 28 IJLIT 139,166. https://doi.org/10.1093/ijlit/eaaa012
Giancarlo F Frosio, ‘Why Keep a Dog and Bark Yourself? From Intermediary Liability to Responsibility’ (Spring 2018) 26 IJLIT 1,33. https://doi.org/10.1093/ijlit/eax021
Yar M, ‘Protecting Children from Internet Pornography? A Critical Assessment of Statutory Age Verification and Its Enforcement in the UK’ (2019) 43 Policing 183.
Blake P, ‘Age Verification for Online Porn: More Harm than Good?’ (2019) 6 Porn Stud 228.
Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
Compromise amendments REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC. <https://www.europarl.europa.eu/meetdocs/2014_2019/plmrep/COMMITTEES/IMCO/DV/2021/12-13/DSACA9_EN.pdf> accessed 11 September 2023.
European Audiovisual Observatory – MAVISE database: adult video sharing platforms registered in the European Union. <https://mavise.obs.coe.int/f/ondemand/advanced?typeofservice=4> accessed 10 June 202.
Van Dijck J, ‘Seeing the Forest for the Trees: Visualizing Platformization and Its Governance’ (2020) 23 New Media Soc 2801.
Sunstein CR, ‘Television and the Public Interest’ (2000) 88 CLR 499.
Valcke P and Stevens D, ‘Graduated Regulation of ‘Regulatable’ Content and the European Audiovisual Media Services Directive’ (2007) 24 Telemat Inform 285.
Squire, Dempsey and Sanders, ‘Study on Adapting the EU Regulatory Framework to the Developing Multimedia Environment. Main Report’ (AEI Banner 1 January 1998) <http://aei.pitt.edu/43249/> accessed 1 May 2023. p 289.
ERGA: Report – The implementation(s) of Article 28b AVMSD: National transposition approaches and measures by video-sharing platforms.
Ahn S, Baik JS and Krause CS, ‘Splintering and Centralizing Platform Governance: How Facebook Adapted Its Content Moderation Practices to the Political and Legal Contexts in the United States, Germany, and South Korea’ [2022] Inf Commun Soc 1.
Roberts ST, ‘Content Moderation’ [2017] Encyclopedia of Big Data 1.
Gillespie T and others, ‘Expanding the Debate about Content Moderation: Scholarly Research Agendas for the Coming Policy Debates’ (2020) 9 IPR.
Valkenburg P and others, ‘Kijkwijzer: The Dutch Rating System for Audiovisual Productions’ (De Gruyter, 27 June 2002) <https://www.degruyter.com/document/doi/10.1515/comm.27.1.79/html> accessed 11 September 2023.
AVMSD Art. 28b(3).
Goldman E, ‘Content Moderation Remedies’ [2021] MTLR 1.
Gillespie T, ‘Do Not Recommend? Reduction as a Form of Content Moderation’ (2022) 8 Social Media + Society 205630512211175.
ERGA, ‘Report – The implementation(s) of Art. 28b AVMSD: National transposition approaches and measures by video-sharing platforms’ (2022).
Directive 2011/92/EU of the European Parliament and of The Council on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, Article 2(a).
Cybercrime Convention, Art. 9(3).
Susanna Greijer and Jaap Doek, Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse (ECPAT International, Bangkok 2016) p. 8.
Id. p. 6.
Id.
Dornfeld L, ‘ICTs and Sexual Exploitation of Children in Europe’ [2020] Encyclopedia of Criminal Activities and the Deep Web 568.
See, for example, Alisdair Gillespie.
Gillespie AA, ‘Child Pornography’ (2017) 27 I&CTL 30, 31.
Merriam-Webster, Definition of Pornography. <www.merriam-webster.com/dictionary/pornography> accessed 11 September 2023.
Edwards SS, ‘Prosecuting ‘Child Pornography’: Possession and Taking of Indecent Photographs of Children’ (2000) 22 JSWFL 1.
n 30, p. 38.
Danijela Frangež and others, ‘The Importance of Terminology Related to Child Sexual Exploitation’ (2015) 66 Journal of Criminal Investigation and Criminology 291,299.
See the Europol’s website about child sexual exploitation. <www.europol.europa.eu/crime-areas-and-trends/crime-areas/child-sexual-exploitation> accessed 11 September 2023.
See at the INHOPE’s website on child sexual abuse material. <https://inhope.org/EN/articles/child-sexual-abuse-material> accessed 11 September 2023.
Directive 2011/92/EU of the European Parliament and of The Council on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, Recital (3).
European Parliament Resolution of 11 March 2015 on child sexual abuse online, 2015/2564(RSP), para. 12.
See the Chart of signatures and ratifications of the Cybercrime Convention at <www.coe.int/en/web/conventions/full-list/-/conventions/treaty/185/signatures?p_auth=Ie8q5VWb> accessed 11 September 2023; and the Chart of signatures and ratifications of the Lanzarote Convention at <www.coe.int/en/web/conventions/full-list/-/conventions/treaty/201/signatures> accessed 11 September 2023.
Beckman EM and Flora MGP, ‘Non-consensual Pornography: A New Form of Technology Facilitated Sexual Violence’ (2021) Rassegna Italiana di Criminologia, XV, 4, 317–328.
Gian Marco Caletti, ‘Can Affirmative Consent Save ‘Revenge Porn’ Laws? Lessons from the Italian Criminalization of Non-Consensual Pornography’ (2021) 25 Va JL & Tech 112.
Mania K, ‘The Legal Implications and Remedies Concerning Revenge Porn and Fake Porn: A Common Law Perspective’ (2020) 24 Sex Cult 2079.
Ryan D, ‘European Remedial Coherence in the Regulation of Non-Consensual Disclosures of Sexual Images’ (2018) 34 CLSR 1053.
Turton W, ‘‘Deepfake’ Videos like That Gal Gadot Porn Are Only Getting More Convincing – and More Dangerous’ (VICE 27 August 2018) <https://www.vice.com/en_us/article/qvm97q/deepfake-videos-like-that-gal-gadot-porn-are-only-getting-more-convincing-and-more-dangerous> accessed 1 May 2023.
Deepfake is a portmanteau of ‘deep’, referring to the deep learning techniques applied within artificial intelligence algorithms, which are capable of imitating the learning patterns of the human brain; and ‘fake’, meaning not genuine or deceptive.
Burgess M, ‘Porn Sites Still Won’t Take down Nonconsensual Deepfakes’ (Wired 30 August 2020) <https://www.wired.com/story/porn-sites-still-wont-take-down-non-consensual-deepfakes/> accessed 1 May 2023.
See, for example, Harris D, ‘Deepfakes: False Pornography Is Here and the Law Cannot Protect You’ (2019) 17 DLTR 99,127 and Delfino R, ‘Pornographic Deepfakes — Revenge Porn’s Next Tragic Act – the Case for Federal Criminalization’ [2019] Fordham L Rev 88.
Kirchengast T, ‘Deepfakes and Image Manipulation: Criminalisation and Control’ (2020) 29 I& CTL 308.
Ajder H and others, ‘The State of Deepfakes: Landscape, Threats and Impact Report’ (Deeptrace Labs 2019) <https://regmedia.co.uk/2019/10/08/deepfake_report.pdf> accessed 1 May 2023.
Barr K, ‘EU Demands Facebook, TikTok, and Google Start Labeling AI Content to Fight Deepfakes’ (Gizmodo, 6 June 2023) <https://gizmodo.com/ai-deepfake-facebook-twitter-tiktok-eu-1850509893> accessed 8 June 2023.
Proposal for a regulation of the European Parliament and of the Council Laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain union legislative acts.
Law No. 2016–1321 of 7 October 2016, for a Digital Republic.
Rigotti C and McGlynn C, ‘Towards an EU Criminal Law on Violence Against Women: The Ambitions and Limitations of the Commission’s Proposal to Criminalise Image-Based Sexual Abuse’ (2022) 13 NJCEL 452.
National Media and Infocommunications Authority, ‘How Is Revenge Porn Addressed by Different Legal Systems?’ <https://english.nmhh.hu/article/191027/How_is_revenge_porn_addressed_by_different_legal_systems> accessed 8 June 2023.
Meta Transparency Center: Adult Nudity and Sexual Activity. <https://transparency.fb.com/policies/community-standards/adult-nudity-sexual-activity/> accessed 11 September 2023.
Snapchat Privacy and Safety Hub: Community Guidelines.<https://values.snap.com/privacy/transparency/community-guidelines> accessed 11 September 2023.
X Help Center: Sensitive media policy. <https://help.twitter.com/en/rules-and-policies/media-policy> accessed 11 September 2023.
The study aimed to examine the policies of the biggest and most known pornographic video-sharing platforms. As the paper focuses on services that allow the upload of third-party content the examination of on-demand pornography services was omitted from the research.
Goujard C, ‘Brussels Gears up to Tame Unruly Porn Platforms’ (POLITICO 18 February 2023). <https://www.politico.eu/article/online-porn-websites-europe-regulation-age/> accessed 1 May 2023.