What should the Digital Services Act say about Instagram nudes?

Agathe R
Nov 7, 2020

Abstract: The regulation of content by platforms, increasingly chosen as a solution for law enforcement in the “online world”, is necessary. Nonetheless, many issues linked to the nature of platform rules already have detrimental effects. They need to be addressed before the importance of these rules is reinforced by increasing platforms’ responsibility.

For this new legislation to be successful, the legislative bodies reinforcing platforms’ responsibility must draw on established research in political science and law. They must ensure that the new rules are not only effective in removing illegal content, but also legitimate and accepted. To do so, they must guarantee two core legal principles: legal certainty and the uniform application of the rules.

Finally, the Digital Services Act package should limit regulation to necessary and proportionate means serving a justified objective, and should strengthen the accountability and transparency of platforms through improved dispute processes. Governance by data, if applied with the right tools, could help reinforce the legitimacy of platform regulation.

The Digital Services Act package is a European text aimed at regulating both the content published on platforms and the platforms themselves. Its first drafts are expected by the end of 2020, and the text will undoubtedly be at the heart of digital policy debates in 2021. In line with other legislative texts and the practice of European courts, the Digital Services Act is expected to strengthen the responsibility of platforms for regulating the content published on their networks. However, while this involvement is necessary in order to regulate illegal content on the Internet effectively, it should not make us forget that platform regulation, unlike European or national legislation, is constructed in an almost unilateral manner. This creates legal risks, and thus a lack of legitimacy, that could in the long term damage the effectiveness of laws supporting and imposing regulation by platforms.

So what are the risks, and how can we make sure that the European Digital Services Act package does not become a colossus with feet of clay?

The increasing responsibility of platforms in online regulation

In this study, I focus on platforms as defined by the OECD: a digital service that facilitates interactions between two or more distinct but interdependent sets of users (whether firms or individuals) who interact through the service via the Internet. In particular, I focus on platforms whose main activity is the exploitation of user-generated content (UGC), such as social media. The focus on social media is justified by the fact that they tend to host the most diverse content and already have moderation rules and tools in place. I will mainly consider the European context.

First of all, I am not criticizing the fact that platforms should be made accountable and should remove unlawful content from their networks. It is first of all a technical necessity. Even though European and national courts have increasingly targeted illegal online content since 2000, they have mostly been unable to have a significant impact or to control the publication of illegal content on social media. The amount of content posted every second is astronomical, and this massification has made it impossible to apply laws on forbidden content the way it was done offline.

Until now, Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 (the Directive on electronic commerce) has partially shielded providers from liability, even though states could ask providers “promptly to inform the competent public authorities of alleged illegal activities”. Nonetheless, the platforms seem to be the only actors with the power and money to regulate the content posted on their services. Moreover, they have better knowledge of their users, of how their networks are organized, and of the main problems encountered. Platform responsibility also seems to be a question of justice, as offline content producers are held accountable for the content they display, whether voluntarily or not.

In this context, many laws have already made platforms responsible for ensuring that the content published on their networks complies with the law, and therefore liable if this regulation is not carried out. This amounts to a partial transfer of regulatory competence to the platforms, while remaining under the supervision of national law. This regulation is generally organized along two axes: either an obligation of means, showing that the platform has invested human and economic resources in regulation, or quantified time objectives.

In Germany, the Network Enforcement Act (NetzDG) was passed in June 2017. It requires commercial social networks on the Internet with at least 2 million registered users to delete “obviously illegal” content within 24 hours, and to delete or block access to any other illegal content within 7 days after checking it. What is interesting is that failure to comply with these rules is an administrative offense, punishable by fines of up to 5 million euros.
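To make the idea of quantified time objectives more concrete, here is a minimal, purely illustrative sketch in Python of how a platform could track the 24-hour and 7-day deadlines described above. The function names and categories are invented for the example; this is not how NetzDG compliance is actually implemented.

```python
from datetime import datetime, timedelta

# Deadlines described above: 24 hours for "obviously illegal" content,
# 7 days for other illegal content identified after checking.
DEADLINES = {
    "obviously_illegal": timedelta(hours=24),
    "illegal_after_check": timedelta(days=7),
}

def removal_deadline(reported_at: datetime, category: str) -> datetime:
    """Latest time by which the content must be deleted or blocked."""
    return reported_at + DEADLINES[category]

def is_compliant(reported_at: datetime, acted_at: datetime, category: str) -> bool:
    """True if the platform acted within the statutory deadline."""
    return acted_at <= removal_deadline(reported_at, category)

# Example: content flagged as obviously illegal, handled 30 hours later.
reported = datetime(2020, 11, 1, 12, 0)
acted = datetime(2020, 11, 2, 18, 0)
print(is_compliant(reported, acted, "obviously_illegal"))  # False: 30h > 24h
```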

A similar law on the moderation of hateful content, passed in France in 2020, was largely struck down by the French Constitutional Council in its decision n° 2020–801 DC of 18 June 2020. The law engaged the platforms’ liability and gave them timed moderation objectives backed by penalties. It also aimed to introduce a “duty of cooperation” for operators, in a more preventive approach. This would have led to more transparency about the means used by platforms for moderation, as well as an obligation for platforms to justify certain resources invested in moderation.

The Digital Services Act package will likely confirm this platform responsibility and affirm it at the European level, at least for platforms above a certain size. Here is what is expected, although it may still change.

The challenges of the DSA for the regulation of online content, according to the Financial Times (2019-07-24).

As we have seen, platforms are increasingly expected to regulate their own content, and their responsibility is increasingly a legislative subject. We will now see that the legitimacy conferred by national and European laws should not be the only source of legitimacy for the regulatory texts and decisions produced by platforms.

The necessary legitimacy of platform regulation

Talking about legitimacy with regard to new technologies or the Internet makes sense. Indeed, these innovations, like all innovations before them, tend to go through a phase of rejection or contestation by part of the population. Carlota Perez describes how surges of new technology always “break and rebuild the foundations of society: economic structures, social norms, laws, and regulations”. In this light, a rejection of platforms is understandable. Nonetheless, as social media become essential in today’s world, for study, work or socialization, and as we are therefore more and more obliged to use them, their acceptance becomes an important subject for political leaders.

In The Rule of Law on Instagram: An Evaluation of the Moderation of Images Depicting Women’s Bodies, Alice Witt, Nicolas Suzor and Anna Huggins underline that “a growing body of literature recognizes that non-state actors, like Instagram, have become ‘the new governors’ of the digital age”. Indeed, if platforms play an increasing role in the regulation of online content, then their power over citizens’ freedoms and lives increases. In a 2018 essay, Kate Klonick notes that “Private online platforms have an increasingly essential role in free speech and participation in democratic culture”, yet how they moderate speech is “largely opaque”. What could appear to be a simple application of rules by the platforms, I see as an extension of existing national laws. What could be seen as simple control of the State over the platforms, I see as a transfer of regulatory competence over users, under the control of legislative powers and national courts.

Even if the legal obligation of platform responsibility gives platforms theoretical legitimacy in the regulation of content, it will not give them de facto legitimacy. For a regulation to work, the entity that produced it must be considered legitimate over that jurisdiction. In most states today, the legitimacy of the legislative power is ensured, in line with social contract theories, by the fact that legislators are elected directly or indirectly. In a business setting, a platform’s power is legitimate only because it created the platform. However, when platforms start to apply stricter rules, we can expect more contestation of those rules. A lack of legitimacy for a platform, and therefore for its regulatory model, could prevent the application of national or European content regulation law. It could even harm the legitimacy of national and European legislative bodies.

Having presented the need for better legitimacy of platform regulation, I will now show how the current regulation by platforms creates inconsistent sets of rules that are not applied evenly. This has harmful consequences for users’ freedom, generates contestation against what is perceived as unjustified censorship, and thus creates legitimacy risks.

Instagram creates regulatory hazards: the example of an inconsistent and harmful regulatory framework on nudity

Instagram moderates content to manage financial, legal and reputational risks, as well as to protect and retain users and their value in terms of data and advertising. Nonetheless, its regulatory framework does not respect basic principles of law, which creates legitimacy risks. As Nicolas Suzor underlines in Digital Constitutionalism, the rule of law is a good framework for evaluating the legitimacy of platform governance, and its core values of “consent, predictability, and procedural fairness are core liberal values of good governance”. I will use this framework to demonstrate how, in its regulation of nudity, Instagram creates legal uncertainty and a systematically unequal application of the rules, leading to discrimination.

The reason I chose to work on the regulation of nudity online is that nudity is not in itself illegal content. Indeed, French law only condemns nudity if it has a sexual intent or dimension, even though French courts have sometimes interpreted this more broadly. Many photos censored for nudity on Instagram are artistic photographs, with no sexual intention and no sexual body part visible, which most of the time do not even seem to disobey Instagram’s own rules.

First, Instagram’s guidelines on nudity create legal uncertainty, as they lack consistency and explanation when they are applied. According to the community guidelines, nudity is not allowed for “a variety of reasons”. This may be linked to the fact that the social network is open to “underage users”, starting at 13 years old. When a photo infringes the community guidelines, it can be deleted or “shadow banned” (made hardly accessible on the platform), and the user’s account can receive a “strike” and/or be deleted. According to the rare information available on the organization of Instagram’s moderation process, it uses both algorithmic and “human” moderation.
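As a rough illustration only, the enforcement outcomes listed above can be summarized in a small hypothetical sketch. The names, thresholds and two-stage structure (algorithmic flag followed by human review) are my own assumptions for the sake of the example, not a description of Instagram’s actual system.

```python
from enum import Enum, auto

class Outcome(Enum):
    """Sanctions mentioned in the guidelines and press coverage cited in this essay."""
    NO_ACTION = auto()
    DELETE_POST = auto()
    SHADOW_BAN = auto()      # content made hardly accessible on the platform
    STRIKE = auto()          # warning recorded against the account
    DELETE_ACCOUNT = auto()

def moderate(algorithmic_score: float, human_confirms: bool, prior_strikes: int) -> list:
    """Hypothetical two-stage pipeline: algorithmic flag, then human confirmation."""
    if algorithmic_score < 0.5 or not human_confirms:   # 0.5 is an invented threshold
        return [Outcome.NO_ACTION]
    if prior_strikes >= 2:                              # repeated violations escalate
        return [Outcome.DELETE_POST, Outcome.DELETE_ACCOUNT]
    return [Outcome.DELETE_POST, Outcome.STRIKE]

print(moderate(0.8, human_confirms=True, prior_strikes=0))
# [<Outcome.DELETE_POST: 2>, <Outcome.STRIKE: 4>]
```

The opacity described in the next paragraphs comes precisely from the fact that none of these thresholds or escalation rules are published by the platform.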

In their 2019 article, Alice Witt, Nicolas Suzor and Anna Huggins observe that “the rules and processes that moderators follow in practice significantly differ from publicly available terms and guidelines around content”. They also observe that the platforms’ decisions “lack transparent information and reasons to explain exactly why their content was removed nor why other users’ similar content was moderated differently”. Several times, users whose photos have been deleted have found very similar photos tolerated on the platform. For instance, the Australian comedian Celeste Barber posted a photo of herself parodying a post by former Victoria’s Secret model Candice Swanepoel, copying the photo and the model’s pose exactly. Her photo was taken down; the model’s was not.

When the rules do not seem to be applied evenly, this creates legal uncertainty that confuses users and denies them the predictability that legal frameworks are supposed to ensure, according to Nicolas Suzor. This is especially true when a breach of the community guidelines leads to concrete consequences for users. First, it increases the chances of misunderstanding and non-acceptance of the rule. Secondly, a shadow ban or account deletion can have a major impact on users with a large audience who make a living from it. This can lead to preventive self-censorship, and thus to a diminution of freedom of expression, even if the content would ultimately have been declared authorized.

Secondly, the uneven application and lack of transparency of Instagram’s guidelines lead to discrimination, which goes against the procedural fairness required by the rule of law. According to Duguay, Burgess and Suzor, in Queer Women’s Experiences of Patchwork Platform Governance on Tinder, Instagram, and Vine, “marginalized individuals and groups generally face a higher risk of arbitrariness in moderation when participating in online platforms”. Several contestation movements have emerged on Instagram after users observed that photos of fat and Black bodies were censored more often on grounds of nudity. In June 2020, the plus-size model Nyome Nicholas-Williams posted an artistic topless photo in which her breasts were covered, and it was deleted. Afterwards, the photo was systematically censored before publication. As the photo apparently did not violate the community guidelines, this led to criticism and contestation under the hashtag #IWantToSeeNyome. In France, the same contestation occurred after the image of a Télérama magazine cover featuring the DJ and activist Leslie Barbara Butch, addressing fatphobia, was censored, and the model’s account was closed when she tried to publish the image.

The Télérama cover with Leslie Barbara Butch, showing her upper body with her breasts hidden, was censored by Instagram even though it did not explicitly infringe the community guidelines.

These examples were partially addressed by CEO Adam Mosseri, who acknowledged the need to look at “algorithmic bias”, and by Vishal Shah, the company’s vice-president of product, who announced the creation of an internal equity team. Still, the legal uncertainty remains and, more importantly, the possibility of contesting the censorship carried out by the platform is very limited, which leaves no real way to question the regulatory decisions made.

As seen with Instagram, platforms’ regulatory frameworks are often inconsistent, opaque and not easily contestable. There are thus real threats to the legitimacy of legal frameworks when those frameworks are produced by the platforms. Legislators need to take this into account when building platform responsibility into online regulation laws. I will now show how legislation including data objectives, real transparency and easier challenge procedures could answer these legitimacy threats, by rebalancing an inequality of power and by promoting the transparency and accountability of platforms.

Legislation by data, actual transparency and easier challenge procedures as a partial resolution of platforms’ legitimacy risks

So, what should the Digital Services Act say about Instagram’s regulation?

Ask for more data

The idea of using data as a way to reinforce the accountability of big players has been explored by many researchers. For instance, Bruce Schneier, in The Battle for Power on the Internet, explains how data and transparency could rebalance the relationship between small and big players. He also emphasizes that transparency and oversight are necessary for trust in institutional power in the short term, but that trust in the long term will require a rebalancing of the distribution of power, which is necessary for the stability of our society.

The current laws addressing platform responsibility already require platforms to be more transparent with the public and/or with independent control authorities. Since the legitimacy risks of platforms need to be taken into account when writing laws, the control of the regulation carried out by platforms needs to go further than mere “efficiency objectives”. Legislators need to enforce broader access to data about how platforms operate, which will make the control of platform regulation and accountability possible.

Enforce a balance between the necessity of regulation and users’ rights

Regulation by the platforms should be bounded by the Digital Services Act. First, the regulation must serve a necessary and justified objective. Then, the tools and means used by the platforms should be proportionate and necessary to the achievement of this goal. If the regulation disproportionately damages users’ rights, users should be able to challenge the rules when their content is censored.

Clarify the rules

As seen earlier, the rules of regulation on social media often remain vague, which leads to uncertainty. To make the rules clear, we need to establish specific legal demands for transparency. Here is what could be asked of platforms. First, they could publish their internal moderation guides, which are used by moderators and are often much more complete and concrete than the rules visible to users. Then, they need to make information actually accessible to users by striking a balance between transparency and accessibility. This could be achieved by providing a more user-friendly presentation of the rules, with less ambiguous language, and by giving precise examples of things that are allowed and things that are not.
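To illustrate what such published rules and precise examples could look like, here is a hedged sketch of a machine-readable rule entry that a platform could be required to publish. The structure and field names are invented for the purpose of the example; they do not correspond to any existing standard.

```python
from dataclasses import dataclass, field

@dataclass
class PublishedRule:
    """A hypothetical published moderation rule, pairing the short text shown
    to users with the more concrete internal guidance and worked examples."""
    rule_id: str
    public_text: str                 # the rule as shown in the community guidelines
    internal_guidance: str           # the fuller guidance given to moderators
    allowed_examples: list = field(default_factory=list)
    forbidden_examples: list = field(default_factory=list)

nudity_rule = PublishedRule(
    rule_id="nudity-01",
    public_text="Nudity is not allowed, with some exceptions (art, health, protest).",
    internal_guidance="Non-sexualized artistic photos with breasts covered are allowed.",
    allowed_examples=["artistic topless photo with breasts covered"],
    forbidden_examples=["explicit sexual imagery"],
)
```

Publishing entries of this kind would let users, researchers and regulators compare the public rule, the internal guidance and the actual moderation decisions.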

Make accountability and dispute possible

Platforms must offer a fair and equitable process by which users can challenge the censorship of their content.

The European legislative text will have to describe the judicial process by which users can contest censorship, whether they consider that their content was censored without infringing the moderation rules, or whether they contest the necessity or proportionality of the regulatory tools or the legitimacy of the platform’s purpose.

Conclusion

Legitimacy and citizens’ acceptance are crucial to the application of regulatory frameworks.

As seen with the example of Instagram, platforms are currently creating inconsistent sets of rules that are not applied evenly. This has harmful consequences for users’ freedom, generates contestation against what is perceived as unjustified censorship, and thus creates legitimacy risks.

When enforcing platforms’ responsibility through legislation, we need to ensure that they apply proportionate and necessary regulation, while maintaining real transparency and accountability.

Otherwise, the European Digital Services Act will become a colossus with feet of clay.

This essay was written as part of the course Regulation and Digital Economy, taught by Bertrand Pailhès, Benoît Loutrel, and Doaa Abu Elyounes.

Are, Carolina. “How Instagram’s Algorithm Is Censoring Women and Vulnerable Users but Helping Online Abusers.” Feminist Media Studies 20, no. 5 (July 3, 2020): 741–44.

Facebook. “Charting a Way Forward on Online Content Regulation.” February 17, 2020.

Desmaris, Sacha, Pierre Dubreuil, and Benoît Loutrel. “Creating a French Framework to Make Social Media Platforms More Accountable: Acting in France with a European Vision.” French Secretary of State for Digital Affairs, May 2019.

Duguay, Stefanie, Jean Burgess, and Nicolas Suzor. “Queer Women’s Experiences of Patchwork Platform Governance on Tinder, Instagram, and Vine,” 2020.

Klonick, Kate. “The New Governors: The People, Rules, and Processes Governing Online Speech.” Harvard Law Review, April 10, 2018.

Madrigal, Alexis C. “Inside Facebook’s Fast-Growing Content-Moderation Effort.” The Atlantic, February 7, 2018.

Mas, Gabriella. “#NoFilter: The Censorship of Artistic Nudity on Social Media Notes.” Washington University Journal of Law & Policy 54 (2017): 307–28.

Perez, Carlota. Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages. Edward Elgar, 2003.

Roberts, Sarah T. Behind the Screen: Content Moderation in the Shadows of Social Media. New Haven: Yale University Press, 2019.

Schneier, Bruce. “The Battle for Power on the Internet.” Schneier on Security. Accessed November 7, 2020.

Suzor, Nicolas. “Digital Constitutionalism: Using the Rule of Law to Evaluate the Legitimacy of Governance by Platforms.” Social Media + Society, July 17, 2018.

Witt, Alice, Nicolas Suzor, and Anna Huggins. “The Rule of Law on Instagram: An Evaluation of the Moderation of Images Depicting Women’s Bodies.” University of New South Wales Law Journal 42, no. 2 (2019): 557–96.
