August 04, 2023
by Meri Baghdasaryan
Earlier in July 2023, a blog post by Jacob van de Kerkhof discussed the shortcomings of the recent Grand Chamber judgment in Sanchez v. France, in which the Court expanded its intermediary liability rules for hate speech posted by third parties to individual users of social media platforms. This piece will contemplate neither the majority position nor the (unsound) reasoning behind this extension of liability rules. Rather, it will reflect on the points raised by the dissenting voices on the bench, who offer interesting insights into the Court’s thinking, point out logical gaps in its reasoning and highlight the impact of the ruling on individual rights. The judgment is accompanied by four separate opinions: three dissenting and one concurring.
The applicant is a French politician who was running for election to Parliament in the Nîmes constituency. During his electoral campaign, he made a post on his public Facebook wall about one of his political opponents, F.P. While the post was not inflammatory in itself, two third parties, S.B. and L.R., targeted F.P.’s partner Leila T. in their comments on the post and expressed dismay at the presence of Muslims in Nîmes. Leila T. confronted S.B. and he deleted his comment later that day. Leila T. also lodged a criminal complaint against the applicant and the two commenters. The Nîmes Criminal Court found all three of them guilty of incitement to hatred or violence against a group or an individual on account of their origin or their belonging or not belonging to a specific ethnic group, nation, race or religion. The decision was upheld by the Nîmes Court of Appeal and the Court of Cassation.
The Chamber majority found that the applicant’s conviction for incitement to hatred did not violate Article 10. The Grand Chamber upholds that finding. The majority note that the interference with the applicant’s right to freedom of expression was foreseeable and pursued the legitimate aim of protecting the rights and reputation of others. With regard to the necessity of the measure, the Grand Chamber concludes that the comments in this case qualify as hate speech, targeting a specific group (Muslims) with insulting and hurtful language. Further, the majority define the applicant’s social media presence as ‘fora on the internet where third-party comments can be disseminated’ (Delfi v Estonia). The majority agree with the domestic courts that the applicant had started a ‘dialogue’ and should have been aware of the risk that an open profile would pose for the dissemination of illegal content at election time. He had then failed to monitor the content that was generated. The majority accept the domestic courts’ position that the applicant and the two commenters were not convicted for the same offence: the applicant had failed as a ‘producer’ while the commenters faced criminal liability for hate speech.
For a more detailed overview of the case facts and the procedural history, please refer to Jacob van de Kerkhof’s blog post.
Judge Bošnjak does not agree with the applicant’s arguments that the third-party comments constituted political speech and that the monitoring obligation would be an excessive burden on social media users. However, neither does he accept two points supported by the majority, namely, that the applicant’s conviction was foreseeable and that the conviction for one of the third-party comments was proportionate.
In this case, the applicant was held liable as a producer under a cascading criminal liability framework (paras 4-6). The framework is aimed at ensuring that criminal offences committed in the media do not go unpunished, and its application means that if there is no publication director or author, or they cannot be identified, the producer shall be prosecuted as the principal offender (para 8).
Judge Bošnjak disagrees with the majority’s finding that it was neither arbitrary nor manifestly unreasonable for the applicant to be convicted when both authors of the offensive comments were also prosecuted and convicted. He notes that the majority relied on domestic jurisprudence to assess the interpretation of the French provision on cascading criminal liability (paras 9-13). He further adds that the review of foreseeability should be stricter as a criminal conviction is at stake and that ‘the Court should not hide behind the fact that the novel character of the issue at the material time was not in itself incompatible with the requirements of accessibility and foreseeability’ (para 14).
In Judge Bošnjak’s view, the French domestic courts did not develop any standards to support the application of the principle of the independence or autonomy of criminal prosecutions, which allows for proceedings to be instituted against the various actors in the cascade chain. He states that the domestic courts’ case law does not support the view that the producer and the author can be held liable simultaneously (paras 9-13). Thus, the applicant’s conviction was not foreseeable and, therefore, not prescribed by law within the meaning of Article 10.
While this lack of foreseeability in itself would be sufficient to find a violation of Article 10, Judge Bošnjak also joined Judge Ravarani’s dissent with regard to the finding that the applicant should have deleted one of the third-party comments, considering the factual circumstances of the case as established by the domestic courts (para 18).
While Judge Ravarani agrees with the majority on several points, he disagrees that the applicant’s conviction for one of the third-party comments was lawful, as the applicant deleted it promptly, within less than 24 hours after it was posted, unlike the other comment in question, which stayed on the applicant’s page for months. He contends that criminal law should be interpreted strictly. In Judge Ravarani’s view, the majority seems to engage in ‘intellectual acrobatics and pure speculation’ by upholding the applicant’s criminal punishment for his failure to promptly remove the comment in question (para 5). The majority held that the messages ‘were responding to…each other,’ turning the exchange into ‘an ongoing dialogue.’ Judge Ravarani underlines that this is ‘an unacceptable extension of a criminal incrimination by an international court which constantly repeats that it is not a court of fourth instance’ (para 5). He also disagrees that the applicant was prosecuted only for failing to promptly delete the comment in question, as the domestic courts ordered the applicant to pay a fine jointly with the commenter (para 6).
In their joint dissenting opinion, Judges Wojtyczek and Zünd disagree with the majority that there was no violation of Article 10. In their view, the application of French law was not foreseeable, and the Court should have read Article 10 in light of Article 7, as the case concerns individual criminal liability for failure to ensure the prompt deletion of remarks made by third parties (para 2). The dissenting judges contend that the majority confused foreseeability with non-arbitrariness, noting that the fact that a decision to apply a law is not arbitrary does not necessarily mean that the law that has been applied is sufficiently clear. In addition, national case law interpreting criminal law must provide accessible and foreseeable explanations, and should not apply extensive interpretations or analogies to the detriment of the defendant (see Vasiliauskas v. Lithuania).
They further highlight that the Court should have reviewed not the quality of specific rules, but the body of relevant criminal law provisions as a whole (para 3). Moreover, the rules should be clear and foreseeable to an average person or the ‘man on the street’ rather than to a professional (para 4). Under the French regime, a person who starts a Facebook page that allows other users to submit comments would need not only to review the scattered legal provisions but also to conduct in-depth legal research into the national jurisprudence to understand whether they would be considered a ‘producer’ or ‘publication director’ and to ascertain the scope of potential liability. While national courts have established an obligation to promptly remove offending messages upon becoming aware of them, it is unclear what time frame users should act within to avoid liability (para 5).
Endorsing Judge Mourou-Vikström’s dissent in the Chamber judgment, Judges Wojtyczek and Zünd underline that the obligation to monitor numerous comments on a Facebook page is a heavy burden, especially in the context of a political discussion, potentially diminishing freedom of expression (para 6). Further, they express concern about imposing criminal liability that is in some way based on deeds of third parties. They note that the account holder should be given prior notice, allowed a reasonable time limit to delete the unlawful comments and only be held criminally liable if they fail to comply with these steps (para 6).
Finally, Judges Wojtyczek and Zünd state that the Court’s case law on a politician’s free speech is diverse and complex. If the majority’s ruling transforms politicians using social media platforms into ‘professionals in politics,’ then under the Court’s case-law ‘this would be a consideration militating in favour of an enhanced protection of his freedom of expression’ (para 6). In addition, the French authorities failed to demonstrate the necessity of prosecuting the Facebook account holder when they had already prosecuted the authors of the unlawful comments. In short, they consider that the relevant French rules are neither foreseeable nor proportionate.
Meanwhile, Judge Kūris voted in favour of finding no violation. In his concurring opinion, he explains that his vote tilted toward no violation when he viewed the applicant’s arguments in the light of the circumstances of the case, such as their time and place, and the politically and socially sensitive context (para 1). Highlighting the margin of appreciation doctrine, he notes that even though the domestic regulation of cascading criminal liability is disturbing, and may lead to the penalization of social media users for any ‘lack of diligence,’ the Court cannot review regulations in abstracto, as it is not a supranational constitutional court (para 3). He adds that the Court should have taken a tougher stance on hate-speech-inciting language in other cases too (see Perinçek v. Switzerland).
It appears that all dissenting voices were concerned about the foreseeability of the newly expanded liability rules on the prompt deletion of hateful comments. As Judges Wojtyczek and Zünd noted, ‘a field as important as social networks calls for legislation that is rather more accessible to those to whom it is addressed’ (Joint dissenting opinion of Judges Wojtyczek and Zünd, para 5). It is not a coincidence that the foreseeability and clarity of norms constitute the first prong of the three-part test under Article 10 of the European Convention on Human Rights. It is a fundamental guarantee that enables ordinary individuals without any special knowledge or professional skills to understand the scope and application of the norms regulating and interfering with their right to freedom of expression. In this case, the rules governing the imposition of liability are scattered across several domestic regulations and decisions. Interestingly, one of the laws in question was adopted 142 years ago, long before the Internet and social media, and even before the development of modern universal human rights standards on free expression. It goes without saying that regulations are unable to account for all possible relevant scenarios, and domestic case law is tasked with interpreting and clarifying concepts, their scope and application. However, the dissenting judges question the clarity of both the domestic norms and their interpretation by the domestic courts. Given that social media users face criminal liability for failure to comply with ambiguous rules, the stakes are high.
On the other hand, the burden imposed by the French system endorsed by the majority is heavy, as not all social media users have the resources to meticulously monitor and expeditiously remove unlawful content, most notably hate speech – a concept that has not even been defined by the Court. It is unclear what timeframe users need to comply with and who falls within the scope of liability – the post author, the commenters, or both. Moreover, these new rules impose criminal liability without a requirement of prior notification to users, be those authors or commenters. In this case, the domestic regulation did not contain such a requirement, leaving it to the domestic courts to conduct a case-by-case assessment. As argued by the dissenters, the Grand Chamber had to engage in ‘intellectual acrobatics’ to uphold the lawfulness of the French regime without assessing this element. Even though this is a judgment against France and a particular set of facts and domestic regulations is involved, the Grand Chamber’s position and reasoning may become a guiding authority for countries contemplating social media and content governance regimes. While the Court rightfully has consideration for the margin of appreciation owed to the State, it should not overlook the precedential nature of its judgments and their impact on legal systems across member states.
Finally, this judgment opens up an interesting debate about the scope of free expression rights for public officials in the online domain and, more specifically, on social media. In particular, the factual background of this case involved an electoral context, which in the Court’s case law carries special significance for free expression (see Magyar Kétfarkú Kutya Párt v. Hungary). But how do we define public officials? Would all citizens running for office be treated with the presumption that they have enough resources to monitor and manage their social media pages? Once again, given the criminal liability involved, this may deter potential candidates from fully using online platforms to communicate with their constituents, or even from running for office altogether. The stakes are high, again.
With its ruling in Sanchez v. France, the Grand Chamber introduces new responsibilities for social media users, including those using their social media pages for political campaigns. However, the dissenting judges underline the lack of foreseeability and the alarming ambiguity of the new intermediary liability rules with regard to how individuals should regulate their conduct, especially as criminal liability is involved. Yes, fighting hate speech online is a critical mission. But in the age of widespread social media regulation, the application of such a burdensome framework could have a chilling effect on freedom of expression.
Disclaimer: The author of this post, Meri Baghdasaryan, engages with the Strasbourg Observers voluntarily and in her personal capacity. The views and opinions expressed here do not reflect those of the organizations with which Meri is affiliated.