How to live with deepfakes? Legal issues

Reading time: 11 min

How can we live with deepfakes? For now, attempts to regulate political synthetic media and deep-porn dominate the debate. Yet the questions extend beyond the harm to democracy or to the victims of sextortion, and call for legal reflection on the issues synthetic media will raise in the future.

Since their public appearance in late 2017, the discourse about deepfakes in the public sphere has invariably followed two trends. The first takes the form of acute catastrophism about the effect of synthetic media on democracy. The alarm is sounded: deepfakes, a marker of the post-truth era we live in [1], would accelerate the erosion of trust in our democratic institutions and threaten people's safety. Two keywords to remember: infocalypse and deep-pornography.

The second takes the form of a candid chronicle of the efforts made by various actors in the technology world to develop, then market, solutions for detecting media manipulated by algorithms [2]. A very binary worldview ensues: on one side, the bad guys, creators and disseminators of malicious deepfakes, hone their algorithms to undermine society; on the other, the good guys, proud tech companies, scramble to flush them out and prevent an innocent society from falling victim to their actions. In keeping with American storytelling, everything plays out in the realm of private business, with the federal state intervening only to finance operations or provide administrative support.

There seems to be little room for nuance. Legislators, like many technologists and observers, frantically wave their muleta, feeding a sense of panic quite out of proportion to the reality of the immediate threat.

To date, the most obvious manifestations of the power of synthetic media to cause harm remain confined to a few news stories and a few allegations of fraudulent deepfakes. At the political level, Gabon, Russia, and some European partners have been the targets of suspected deepfakes [3]. Some companies have claimed to be victims of extortion using synthetic voice media, but without ever providing solid evidence to support these claims [4]. On the victims' side, the most spectacular cases, such as the American mother unjustly accused [5] or Dieudonné's pathetic alibi in France, have relegated to the background the complaints of real victims of sextortion [6]. In the future, however, the transformative power of synthetic media will undoubtedly force us to reconsider our social, digital, legal, and technical realities, to the point of profoundly transforming the societies we live in.

Perhaps we can question the limits of the space in which deepfakes will develop. Perhaps it is even possible to anticipate a number of problems by reflecting collectively on the development of synthetic media in our digitized societies.

Laws already exist to act on deepfakes

The approach taken by Western legislators toward synthetic media consists in heading off, in the urgency of major electoral campaigns, the potentially devastating effects of deepfakes. This was the case in the United States a few months before the 2020 presidential election, where some states enacted legislation prohibiting the broadcast of deepfakes in the months preceding a key election, but also in Australia, New Zealand, and England. Often, this legislative package was accompanied by provisions protecting the victims of deep-porn [7].

However, a number of texts, at the national or federal level (depending on the country concerned) as well as at the supranational level, already regulate different aspects of the life of a malicious deepfake. In France, in particular, texts govern the rights of individuals, notably respect for private life [8] and the right to one's image [9], and protect people against defamation and identity theft [10]. More recently, the anti-fake-news law promulgated in 2018 [11] aims precisely to counter the spread of fake news in an electoral context by reinforcing article 4 of the press law of July 27, 1849 [12], though its application promises to be tricky [13], since it covers only "manifestly" illegal false information, leaving room for the judge's interpretation.

Criminal law also covers extortion [14] and severely punishes "obtaining by coercion, either a signature, a commitment or a renunciation, or the revelation of a secret, or the handing over of funds, values or any property". Finally, it should be noted that the General Data Protection Regulation (GDPR) protects data [15] constituting an image or sound if that image or sound can be linked to a name; as such, it protects people's images and voices against non-consensual use.

The justice system therefore has, in the existing texts, the technical means to qualify an offense and convict the author of a malicious deepfake. The police, on the other hand, remain relatively ill-equipped to track down the perpetrators of cybercrime on the Internet, despite clear progress [16]. Even with improved resources and manpower, bringing a perpetrator to justice can be very complicated, and in some cases impossible, making any new legislation largely ineffective. Nor should the effects of political opportunism be neglected: a number of declarations have flourished here and there, aimed more at scoring political credibility points than at carrying out concerted action capable of producing an effective text [17].

The use of synthetic doubles

The short-term vision of the parliamentarians in charge of legislating on these subjects, and their limited understanding of them, prevent them from looking beyond the immediate issues to consider what is at stake in the future. The first question, it seems to me, is to determine the legal status of synthetic doubles, and more precisely of this "likeness", this "essence" or "soul" [18], which lies at the heart of the deepfakes issue.

Indeed, should we consider synthetic doubles as a simple high-definition reproduction of the physical person, or can we consider them an extension of identity in the digital sphere? Should we see in this replica a virtual avatar to which we would attach rights close to those attached to the original? The very current question of digital identity and its legal status, introduced into public debate in 2010 at the height of Second Life [19], gives legal scholars an opportunity to consider the link between synthetic media and identity. Rapidly evolving digital practices make it possible, with little extrapolation, to foresee how digital space will become, like tangible space, a key arena of our future lives. It is therefore not useless to consider now the legal consequences of the proliferation of synthetic doubles in this digital space, and of the interactions there could be between them. How can we be sure of a person's identity if their synthetic double does not resemble them? Should we require the use of a realistic synthetic double for transactions or official representations?

Moreover, what could the relationship be between a person and their synthetic double? Could one die while the other survives and carries on an economic, social, or cultural activity [20]? Can this digital double be transferred to a third party, or does it remain attached to the person? Could we even consider that a synthetic double, from the moment it was originally imagined and conceived by its creator (even if it does not resemble them in every respect), constitutes a projection of the digital self, and that it thus remains attached to the personality of the person who embodies it? If synthetic media degrades a person's e-reputation, what recourse will be possible to repair the lost credibility? The questions go on endlessly.

The conditions for the creation of synthetic media

The multiplication of our representations on social networks can now form a sufficient database to create a synthetic double, with or without our consent. To be convinced of this, one only has to browse the subreddit dedicated to the creation of deepfakes, or the forums of Mr. Deepfake, one of the many deep-porn sites where the community meets to discuss deepfake creation [21].

In the register of entertainment or pure content creation, what about the intellectual property of deepfakes made from existing content? Is synthetic media generated by an algorithm an original work, or a synthesis of several by-products whose rights must be recognized? Should the algorithm's training database be used as a reference to determine what percentage of it the final deepfake is composed of?

If an actor or actress appears in a deepfake, are they eligible for damages for the use of their image? Should we distinguish the prejudice suffered by an anonymous person whose body, face, or voice is used without consent from that suffered by a known person who exercises a public profession (actor, singer, politician) or derives substantial benefits from it? Is there an aggravating factor? If a professional actor decides to be digitized in order to preserve a copy of his or her appearance, who holds the rights to that digitization?

Does a person have an inalienable right to control their appearance, their likeness, or must they grant exploitation rights to digitization companies so that the synthetic copy can be integrated into a commercially exploitable catalog of avatars? How will these rights evolve over time? Should we provide a period during which the avatar can be exploited only by the person from whom the synthetic copy was made? For the creation of original deepfakes, what rights must be transferred, and for how long? For the propagation of non-consensual deepfakes, do we condemn the fabrication, the distribution, or both? What about data security, storage, and long-term conservation? Will there be an obligation to maintain the digital media containing the digital copy of an individual's appearance?

The work of legal experts has only just begun. Many existing texts can already protect society from the most obvious undesirable effects synthetic media could cause. It is therefore important not to approach the question only negatively, but also to consider the positive applications of synthetic media. A balanced legislative response, favoring mechanisms that allow the development of benevolent and useful deepfakes on the one hand, and reinforcing coercive tools against ill-intentioned productions on the other, seems necessary to clean up the digital space of tomorrow.

It is certain that doctrinal approaches will vary widely and that the United States will play a major role in setting international standards. But Europe certainly has a say in the development of these normative rules, if only to avoid the establishment of a new hegemony around digital issues.

In a second part, we will turn to the technical solutions for living with deepfakes. In the meantime, feel free to share this post if you liked it.

Notes :

1 Is post-truth a new phenomenon? The term dates from the 1990s but probably refers to something older. To learn more, listen to "The era of post-truth?" by Anastasia Colosimo on France Culture.
2 The "C2PA", or Coalition for Content Provenance and Authenticity, brings together Adobe, Arm, Intel, Microsoft, and Truepic through two other groupings. "Project Origin" includes the BBC, CBC/Radio-Canada, The New York Times, and Microsoft. The "Content Authenticity Initiative" includes Adobe, AFP, Arm, the BBC, Camera Bits, CBC, Gannett, Getty Images, Microsoft, The New York Times, Qualcomm, Synthesia, Truepic, Twitter, USA Today, the VII photo agency, The Washington Post, and the NGO Witness.org.
3 Was Leonid Volkov a victim of a deepfake?, journalism.design, 2021
4 The credibility of these allegations remains quite low, and for good reason: 1. at the time (2019), voice deepfakes were still rare and very expensive to produce; 2. even with a perfect synthetic voice, no system yet existed to conduct a long, improvised conversation; 3. why produce a deepfake when an impersonator suffices?; 4. no proof has ever been publicly put forward. Fake voices "help cyber-crooks steal cash", BBC, July 8, 2019.
5 Miscellaneous facts and deepfakes, journalism.design, 2021
6 Nth Room: deepfakes as a weapon of sexual exploitation, journalism.design, 2020
7 Deepfakes laws and proposals flood US, Malwarebytes Lab, 2020
8 In France, Article 9 of the Civil Code.
9 Civil Code: Articles 7 to 16-14, respect for private life (Article 9); Criminal Code: Articles 226-1 to 226-7, violation of privacy; Criminal Code: Articles 226-8 to 226-9, infringement of the representation of the person; and the summary procedure to stop a manifest disturbance, Civil Procedure Code: Articles 484 to 492-1.
10 The law of July 29, 1881 on the freedom of the press: Article 32 establishes the penalties for public defamation, and Article 33 the penalty for public insult. For non-public defamation, Article R625-8 of the Penal Code sets out the penalties, and Article R621-2 those for non-public insult. Article 226-4-1 of the Penal Code punishes identity theft with one year's imprisonment and a fine of 15,000 euros: "The fact of usurping the identity of a third party or of using one or more data of any kind allowing the identification of the third party with a view to disturbing his or her peace of mind or that of others, or to harming his or her honor or reputation". Law no. 2002-1094 of August 29, 2002, on the orientation and programming of internal security (LOPSI), last reformed by Law no. 2011-267 of March 14, 2011 (LOPSI 2), inserted this provision into the Penal Code, in Article 226-4-1, punishing identity theft on the Internet.
11 LOI n° 2018-1202 du 22 décembre 2018 relative à la lutte contre la manipulation de l’information (law on the fight against the manipulation of information).
12 Délits commis par la voie de la presse ou par toute autre voie de publication (offenses committed through the press or any other means of publication), Gallica (BnF), July 27, 1849.
13 Validated by the Constitutional Council, the law “Fake news” looks tricky to apply, Nextimpact, December 2018
14 Penal Code: Of extortion (Articles 312-1 to 312-9)
15 Right to image and personal data: two sides of the coin, Adrien Aulas, Aeon 2018
16 Berthelet Pierre, “Aperçus de la lutte contre la cybercriminalité dans l’Union européenne”, Revue de science criminelle et de droit pénal comparé, 2018/1 (No. 1), pp. 59-74. DOI: 10.3917/rsc.1801.0059. URL: https://www.cairn.info/revue-de-science-criminelle-et-de-droit-penal-compare-2018-1-page-59.htm
17 Is Olivia Grégoire right to fear that deepfakes will influence the presidential elections in 2022?, journalism.design, 2021
18 In English, “likeness” denotes the similarity between two things (“it’s alike”). But “likeness” carries an additional dimension: not simply physical resemblance, but the whole set of features that compose a person. Way of moving, tone of voice, gestures, expressions, facial micro-movements; in short, the criteria that make us unique. In Faegre & Benson, LLP v. Purdy, 367 F. Supp. 2d 1238 (D. Minn. 2005), the Minnesota court defined the term as follows: “‘Likeness’ means a visual image of the plaintiff, whether it is a photograph, drawing, cartoon, or other visual presentation. The visual image need not precisely reproduce the plaintiff’s appearance, or even show his or her face, as long as it is sufficient to evoke the plaintiff’s identity in the public eye.”
19 The right of avatars, a right in progress? Iteanu Olivier, Battisti Michèle, Manara Cédric, “Droit de l’information”, Documentaliste-Sciences de l’Information, 2010/1 (Vol. 47), pp. 22-27. DOI: 10.3917/docsi.471.0022. URL: https://www.cairn.info/revue-documentaliste-sciences-de-l-information-2010-1-page-22.htm
20 In this context, the law signed on November 30, 2020, by Andrew Cuomo, governor of the State of New York, establishes that a synthetic double can have an autonomous life after the death of the person on whom it is based, provided that person consented. New York’s Right to Publicity and Deepfakes Law Breaks New Ground, Matthew F. Ferraro, Louis W. Tompros, December 2020
21 ⚠️ RISK OF PORNOGRAPHIC CONTENT ⚠️ How to create artificial faces to use as deepfakes, MrDeepfake’s Forum
Gerald Holubowicz
http://geraldholubowi.cz
A former photojournalist and web documentary maker turned product manager specializing in editorial innovation, I study the impact of synthetic media (deepfakes) on the making of a digital visual culture. After 10 years of regular teaching at various journalism schools (EMI, CFJ/CFPJ, INA, Sciences Po Grenoble), I now teach at the École de Journalisme and the Centre des Médias at Sciences Po Paris.