3. Right to Privacy
Outline of the chapter
This chapter sets out some of the prominent challenges that new technologies and AI systems pose to the right to privacy. In particular, the chapter addresses:
- what the protective scope of the right to privacy encompasses (section 3.2)
- the restrictions that may be imposed on the right to privacy (section 3.3)
- the broad construction of ‘private life’ under the ECHR, including in a public context (section 3.2.2)
- the role of encryption as an element ensuring the confidentiality of communications (section 3.2.4)
- how the ECtHR addresses the risks ensuing from mass surveillance regimes (section 3.4)
- the conditions under which biometric recognition systems are compatible with human rights law and the AI Act (section 3.5)
(3.1) Relevant law
Article 17 ICCPR International Covenant on Civil and Political Rights
1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.
2. Everyone has the right to the protection of the law against such interference or attacks.
Article 8 ECHR European Convention on Human Rights – Right to respect for private and family life
1. Everyone has the right to respect for his private and family life, his home and his correspondence.
2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
Article 7 EU Charter of Fundamental Rights – Respect for private and family life
Everyone has the right to respect for his or her private and family life, home and communications.
(3.2) What is Privacy? The Elusive Protective Scope of the Right to Privacy
(3.2.1) Clarifying the scope of protection of Article 17 ICCPR
(3.2.1.1) Unlawful or arbitrary interference
UN Human Rights Committee, General Comment No 16, Article 17 (Right to Privacy), The Right to Respect of Privacy, Family, Home and Correspondence, and Protection of Honour and Reputation, UN Doc HRI/GEN/1/Rev.1 (1994)
1. Article 17 provides for the right of every person to be protected against arbitrary or unlawful interference with his privacy, family, home or correspondence as well as against unlawful attacks on his honour and reputation. In the view of the Committee this right is required to be guaranteed against all such interferences and attacks whether they emanate from State authorities or from natural or legal persons. The obligations imposed by this article require the State to adopt legislative and other measures to give effect to the prohibition against such interferences and attacks as well as to the protection of this right.
[…]
3. The term “unlawful” means that no interference can take place except in cases envisaged by the law. Interference authorized by States can only take place on the basis of law, which itself must comply with the provisions, aims and objectives of the Covenant.
4. The expression “arbitrary interference” is also relevant to the protection of the right provided for in Article 17. In the Committee’s view the expression “arbitrary interference” can also extend to interference provided for under the law. The introduction of the concept of arbitrariness is intended to guarantee that even interference provided for by law should be in accordance with the provisions, aims and objectives of the Covenant and should be, in any event, reasonable in the particular circumstances.
[…]
8. Even with regard to interferences that conform to the Covenant, relevant legislation must specify in detail the precise circumstances in which such interferences may be permitted. A decision to make use of such authorized interference must be made only by the authority designated under the law, and on a case-by-case basis […]
(3.2.1.2) Protected categories
(3.2.2) The notion of ‘private life’ under the ECHR
The ECtHR adopts a broad conceptualisation of the notion of ‘private life’ in its case law.
(3.2.2.1) The broad notion of ‘private life’
ECtHR, S. and Marper v UK, App nos 30562/04 and 30566/04, 4 December 2008 (Grand Chamber)
66. The Court notes that the concept of “private life” is a broad term not susceptible to exhaustive definition. It covers the physical and psychological integrity of a person […]. It can therefore embrace multiple aspects of the person’s physical and social identity […]. Elements such as, for example, gender identification, name and sexual orientation and sexual life fall within the personal sphere protected by Article 8 […]. Beyond a person’s name, his or her private and family life may include other means of personal identification and of linking to a family […]. Information about the person’s health is an important element of private life […]. The Court furthermore considers that an individual’s ethnic identity must be regarded as another such element […]. The concept of private life moreover includes elements relating to a person’s right to their image […]
(3.2.2.2) The social aspect of ‘private life’
Private life is a notion that is primarily intended to ensure the development, without outside interference, of the personality of each individual. In this sense, private life is primarily designed to protect the individual, within their private sphere or space, from State interference.
However, the ECtHR adds a social aspect to protecting private life, including the right for an individual to establish and develop relationships with others and with the outside world.
ECtHR, Von Hannover v Germany (no 2), App nos 40660/08 and 60641/08, 7 February 2012 (Grand Chamber)
95. The […] concept of private life extends to aspects relating to personal identity, such as a person’s name, photo, or physical and moral integrity; the guarantee afforded by Article 8 of the Convention is primarily intended to ensure the development, without outside interference, of the personality of each individual in his relations with other human beings. There is thus a zone of interaction of a person with others, even in a public context, which may fall within the scope of private life.
The Court, in the Von Hannover case, stresses that Article 8 also ensures the development, without outside interference, of the personality of each individual in his relations with other human beings. In other words, the Court acknowledges that individuals also develop their personality by approaching others in order to establish and develop relationships with them and with the outside world. One’s private life is protected in this social and public context too. The Court uses various terms to capture this, such as the right to lead a “private social life” or a “zone of interaction” of a person with others.
(3.2.2.3) When do private life considerations arise in a public context?
Having clarified that private life includes the right for an individual to establish and develop relationships with others and with the outside world, the question now is how we ascertain whether an interference with private life exists when we are in public.
The main criterion that the Court employs to assess whether there is an interference with private life in public is the creation of a systematic or permanent record of personal data.
Examples of creating permanent records: photographs, voice samples, fingerprints
Examples of creating systematic records: compiling data/datasets (in a manner or degree beyond that normally foreseeable)
(3.2.2.3.1) ECtHR, Glukhin v Russia, App no 11519/20, 4 July 2023
64. The […] concept of “private life” is a broad term not susceptible to exhaustive definition. It can embrace multiple aspects of the person’s physical and social identity. It is not limited to an “inner circle” in which the individual may live his or her own personal life without outside interference, but also encompasses the right to lead a “private social life”, that is, the possibility of establishing and developing relationships with others and the outside world. It does not exclude activities taking place in a public context. There is thus a zone of interaction of a person with others, even in a public context, which may fall within the scope of “private life”.
[…]
66. Since there are occasions when people knowingly or intentionally involve themselves in activities which are or may be recorded or reported in a public manner, a person’s reasonable expectations as to privacy may be a significant, although not necessarily conclusive, factor in this assessment. As to the monitoring of an individual’s actions using photographic or video devices, the Convention institutions have taken the view that the monitoring of the actions and movements of an individual in a public place using a camera which did not record the visual data does not constitute in itself a form of interference with private life. Private-life considerations may arise, however, once any systematic or permanent record of such personal data comes into existence, particularly pictures of an identified person. A person’s image constitutes one of the chief attributes of his or her personality, as it reveals the person’s unique characteristics and distinguishes the person from his or her peers. The right of each person to the protection of his or her image is thus one of the essential components of personal development and presupposes the right to control the use of that image. While in most cases the right to control such use involves the possibility for an individual to refuse publication of his or her image, it also covers the individual’s right to object to the recording, conservation and reproduction of the image by another person […]
67. The Court has previously found that the collection and storing of data by the authorities on particular individuals constituted an interference with those persons’ private lives, even if that data concerned exclusively the person’s public activities […], such as participation in anti-government demonstrations; […] recording by CCTV cameras in a public place and the subsequent disclosure of the video-footage to the media; […] and video surveillance of amphitheatres at a public university […].
(3.2.2.3.2) Protection of ‘private life’ extends to processing or publishing publicly accessible taxation data in a manner/degree beyond normally foreseeable
ECtHR, Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland, App no 931/13, 27 June 2017 (Grand Chamber)
134. The fact that information is already in the public domain will not necessarily remove the protection of Article 8 of the Convention. […]
[…]
136. […] where there has been compilation of data on a particular individual, processing or use of personal data or publication of the material concerned in a manner or degree beyond that normally foreseeable, private life considerations arise […]
(3.2.2.4) Exercise: Privacy as secrecy or privacy as protecting trust and confidentiality?
ECtHR, Rotaru v Romania, App no 28341/95, 4 May 2000 (Grand Chamber)
43. […] public information can fall within the scope of private life where it is systematically collected and stored in files held by the authorities. That is all the truer where such information concerns a person’s distant past.
44. […] [This information included] various pieces of information about the applicant’s life, in particular his studies, his political activities and his criminal record, some of which had been gathered more than fifty years earlier. […]
Partly dissenting opinion of Judge Bonello, in Rotaru v Romania
- […] I cannot endorse the applicability of Article 8.
- Article 8 protects the individual’s private life. At the core of that protection lies the right of every person to have the more intimate segments of his being excluded from public inquisitiveness and scrutiny. There are reserved zones in our person and in our spirit which the Convention requires should remain locked. It is illegitimate to probe for, store, classify or divulge data which refer to those innermost spheres of activity, orientation or conviction, sheltered behind the walls of confidentiality.
- On the other hand, activities which are, by their very nature, public and which are actually nourished by publicity, are well outside the protection of Article 8.
Concurring opinion of Judge Yudkivska, joined by Judge Bošnjak, in ECtHR, Benedik v Slovenia, App no 62357/14, 24 April 2018
It is argued that in order to protect privacy in the modern era we must reconsider the outdated understanding of it as mere secrecy, and move toward legal protection of trust and confidentiality and of the right to control how information is disseminated and used. As judges we are entrusted with the task of rethinking the privacy paradigm in cases such as the present one.
Questions:
- How do you assess Judge Bonello’s view that private life protects only what is ‘sheltered behind the walls of confidentiality’ and not in public?
- In contrast, Judge Yudkivska speaks of reconsidering the outdated understanding of privacy as mere secrecy and that we should move instead towards a new paradigm of protecting trust and confidentiality. What do you think she means?
(3.2.3) The relationship between privacy and data protection
ECtHR, Glukhin v Russia, App no 11519/20, 4 July 2023
75. The protection of personal data is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life, as guaranteed by Article 8 of the Convention. The domestic law must afford appropriate safeguards to prevent any such use of personal data as may be inconsistent with the guarantees of this Article.
[…]
77. In the context of the collection and processing of personal data, it is therefore essential to have clear, detailed rules governing the scope and application of measures, as well as minimum safeguards concerning, inter alia, duration, storage, usage, access of third parties, procedures for preserving the integrity and confidentiality of data, and procedures for their destruction, thus providing sufficient guarantees against the risk of abuse and arbitrariness.
(3.2.4) Confidentiality of communications and encryption
According to the ECtHR, confidentiality of communications is an essential element of the right to respect for private life and correspondence, as enshrined in Article 8 ECHR.
(3.2.4.1) Encryption as a technical tool to protect the confidentiality of communications
UN Human Rights Council Res 47/16, The promotion, protection and enjoyment of human rights on the Internet, UN Doc A/HRC/RES/47/16, 7 July 2021
Emphasizing that, in the digital age, technical solutions to secure and protect the confidentiality of digital communications, including measures for encryption and anonymity, are important to ensure the enjoyment of all human rights offline and online
(3.2.4.2) A statutory obligation to decrypt end-to-end encrypted communications weakens the encryption mechanism for all users and is a disproportionate restriction
ECtHR, Podchasov v Russia, App no 33696/19, 13 February 2024
- The applicant was a user of Telegram, a messaging application which is used free of charge by millions of people in Russia and worldwide.
- This was the first opportunity for the ECtHR to decide on the statutory requirement for Internet providers to give state authorities information necessary to decrypt electronic messages.
- The judgment is well-informed and well-articulated on legal and technical developments in this area.
7. On 12 July 2017 the Federal Security Service (“the FSB”) required Telegram Messenger LLP to disclose technical information which would facilitate “the decryption of communications […] in respect of Telegram users who were suspected of terrorism-related activities”. […]
8. Telegram Messenger LLP refused to comply with the disclosure order, arguing that it was technically impossible to execute it without creating a backdoor that would weaken the encryption mechanism for all users. […] The company was fined by the Meshchanskiy District Court of Moscow […]. Subsequently, […] the Taganskiy District Court of Moscow ordered the blocking of the Telegram application in Russia. Both judgments were upheld on appeal.
[…]
65. […] the Court reiterates that confidentiality of communications is an essential element of the right to respect for private life and correspondence, as enshrined in Article 8. Users of telecommunications and Internet services must have a guarantee that their own privacy and freedom of expression will be respected, although such a guarantee cannot be absolute and must yield on occasion to other legitimate imperatives, such as the prevention of disorder or crime or the protection of the rights and freedoms of others […]
[…]
76. As regards the requirement to submit to the security services information necessary to decrypt electronic communications if they are encrypted, the Court observes that international bodies have argued that encryption provides strong technical safeguards against unlawful access to the content of communications and has therefore been widely used as a means of protecting the right to respect for private life and for the privacy of correspondence online. In the digital age, technical solutions for securing and protecting the privacy of electronic communications, including measures for encryption, contribute to ensuring the enjoyment of other fundamental rights, such as freedom of expression […]. Encryption, moreover, appears to help citizens and businesses to defend themselves against abuses of information technologies, such as hacking, identity and personal data theft, fraud and the improper disclosure of confidential information. This should be given due consideration when assessing measures which may weaken encryption.
77. […] It appears that in order to enable decryption of communications protected by end-to-end encryption, such as communications through Telegram’s “secret chats”, it would be necessary to weaken encryption for all users. These measures allegedly cannot be limited to specific individuals and would affect everyone indiscriminately, including individuals who pose no threat to a legitimate government interest. Weakening encryption by creating backdoors would apparently make it technically possible to perform routine, general and indiscriminate surveillance of personal electronic communications. Backdoors may also be exploited by criminal networks and would seriously compromise the security of all users’ electronic communications. […]
78. The Court accepts that encryption can also be used by criminals, which may complicate criminal investigations […] However, it takes note in this connection of the calls for alternative “solutions to decryption without weakening the protective mechanisms, both in legislation and through continuous technical evolution” […]
79. […] In the present case the ICO’s statutory obligation to decrypt end-to-end encrypted communications risks amounting to a requirement that providers of such services weaken the encryption mechanism for all users; it is accordingly not proportionate to the legitimate aims pursued.
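To make the technical reasoning in paragraphs 77–78 more concrete, the following is a minimal, illustrative sketch of end-to-end encryption. It does not reproduce Telegram’s actual protocol; the library used (PyNaCl) and all names are assumptions chosen for illustration. Because only the communicating endpoints hold the private keys, a provider relaying the ciphertext cannot read it; a statutory duty on the provider to decrypt would therefore require changing the scheme itself (for example, escrowing keys or adding an extra recipient), weakening it for every user.

```python
# Minimal sketch of end-to-end encryption (illustrative only; not any
# particular messaging service's protocol). Uses PyNaCl's public-key Box.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# The provider relays only the ciphertext; without a private key it cannot decrypt.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at the usual place"

# A duty on the provider to decrypt would require it to hold copies of private
# keys or to add an extra recipient ("backdoor") to every message -- a change to
# the scheme itself that affects all users, not only a particular suspect.
```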
(3.2.5) Evolution of technologies and the changing threshold for the existence of an interference with the right to privacy
ECtHR, S. and Marper v UK, App nos 30562/04 and 30566/04, 4 December 2008 (Grand Chamber)
58. The applicants complained under Article 8 of the Convention about the retention of their fingerprints, cellular samples and DNA profiles pursuant to section 64(1A) of the Police and Criminal Evidence Act 1984.
[…]
67. The mere storing of data relating to the private life of an individual amounts to an interference within the meaning of Article 8 […]. The subsequent use of the stored information has no bearing on that finding […]. However, in determining whether the personal information retained by the authorities involves any of […] private-life aspects […], the Court will have due regard to the specific context in which the information at issue has been recorded and retained, the nature of the records, the way in which these records are used and processed and the results that may be obtained […]
Cellular samples and DNA profiles
[…]
71. The Court maintains its view that an individual’s concern about the possible future use of private information retained by the authorities is legitimate and relevant to a determination of the issue of whether there has been an interference. Indeed, bearing in mind the rapid pace of developments in the field of genetics and information technology, the Court cannot discount the possibility that in the future the private-life interests bound up with genetic information may be adversely affected in novel ways or in a manner which cannot be anticipated with precision today. […]
72. Legitimate concerns about the conceivable use of cellular material in the future are not, however, the only element to be taken into account in the determination of the present issue. In addition to the highly personal nature of cellular samples, the Court notes that they contain much sensitive information about an individual, including information about his or her health. Moreover, samples contain a unique genetic code of great relevance to both the individual and his relatives.
[…]
75. [As regards DNA profiles] the Court observes […] that the profiles contain substantial amounts of unique personal data. While the information contained in the profiles may be considered objective and irrefutable […], their processing through automated means allows the authorities to go well beyond neutral identification. […] In the Court’s view, the DNA profiles’ capacity to provide a means of identifying genetic relationships between individuals […] is in itself sufficient to conclude that their retention interferes with the right to the private life of the individuals concerned. The frequency of familial searches, the safeguards attached thereto and the likelihood of detriment in a particular case are immaterial in this respect […] This conclusion is similarly not affected by the fact that, since the information is in coded form, it is intelligible only with the use of computer technology and capable of being interpreted only by a limited number of persons.
See also discussion below (3.4.3) on the evolving technology of metadata and its legal treatment
(3.3) Restricting the Right to Privacy
A restriction on the right to privacy may be imposed as long as the restriction:
a) is provided by law;
b) serves one of the legitimate purposes exhaustively enumerated in the respective provision; and
c) is necessary in a democratic society.
- Read: Article 8(2) ECHR (see 3.1)
(3.3.1) Legislator’s obligation to provide a framework for reconciling the various claims which compete for protection even in the sphere of the relations of individuals
ECtHR, K.U. v Finland, App no 2872/02, 2 December 2008
40. […] the applicant, a minor of 12 years at the time, was the subject of an advertisement of a sexual nature on an Internet dating site. The identity of the person who had placed the advertisement could not, however, be obtained from the Internet service provider due to the legislation in place at the time.
[…]
42. The Court reiterates that, although the object of Article 8 is essentially to protect the individual against arbitrary interference by the public authorities, it does not merely compel the State to abstain from such interference: in addition to this primarily negative undertaking, there may be positive obligations inherent in an effective respect for private or family life […]
43. These obligations may involve the adoption of measures designed to secure respect for private life even in the sphere of the relations of individuals between themselves. […]
44. The limits of the national authorities’ margin of appreciation are nonetheless circumscribed by the Convention provisions. In interpreting them, since the Convention is first and foremost a system for the protection of human rights, the Court must have regard to the changing conditions within Contracting States and respond, for example, to any evolving convergence as to the standards to be achieved.
[…]
46. […] For the Court, States have a positive obligation inherent in Article 8 of the Convention to criminalise offences against the person, including attempted offences, and to reinforce the deterrent effect of criminalisation by applying criminal‑law provisions in practice through effective investigation and prosecution […]. Where the physical and moral welfare of a child is threatened, such injunction assumes even greater importance. The Court notes in this connection that sexual abuse is unquestionably an abhorrent type of wrongdoing, with debilitating effects on its victims. Children and other vulnerable individuals are entitled to State protection, in the form of effective deterrence, from such grave types of interference with essential aspects of their private lives
[…]
49. The Court considers that practical and effective protection of the applicant required that effective steps be taken to identify and prosecute the perpetrator, that is, the person who placed the advertisement. In the instant case, such protection was not afforded. An effective investigation could never be launched because of an overriding requirement of confidentiality. Although freedom of expression and confidentiality of communications are primary considerations and users of telecommunications and Internet services must have a guarantee that their own privacy and freedom of expression will be respected, such guarantee cannot be absolute and must yield on occasion to other legitimate imperatives, such as the prevention of disorder or crime or the protection of the rights and freedoms of others. Without prejudice to the question whether the conduct of the person who placed the offending advertisement on the Internet can attract the protection of Articles 8 and 10, having regard to its reprehensible nature, it is nonetheless the task of the legislator to provide the framework for reconciling the various claims which compete for protection in this context. Such framework was not, however, in place at the material time […]
50. The Court finds that there has been a violation of Article 8 of the Convention in the present case.
(3.4) Mass Surveillance
(3.4.1) Adjusting the victim requirement in cases of secret, mass surveillance
ECtHR, Roman Zakharov v Russia, App no 47143/06, 4 December 2015 (Grand Chamber)
The applicant alleged that the system of secret interception of mobile-telephone communications in Russia violated his right to respect for his private life and correspondence, and that he did not have any effective remedy in that respect. However, the applicant was not in a position to prove that he had been a victim (directly affected) and thus that an interference had taken place.
The Court held that the victim requirement had to be adjusted in cases of secret, mass surveillance. The Court instead examines:
- the scope of the legislation permitting secret surveillance measures by examining whether the applicant can possibly be affected by it; and
- whether the domestic system affords an effective remedy to the person who suspects that (s)he was subjected to secret surveillance.
163. The Court observes that the applicant in the present case claims that there has been an interference with his rights as a result of the mere existence of legislation permitting covert interception of mobile-telephone communications and a risk of being subjected to interception measures, rather than as a result of any specific interception measures applied to him.
[…]
164. The Court has consistently held in its case-law that the Convention does not provide for the institution of an actio popularis and that its task is not normally to review the relevant law and practice in abstracto, but to determine whether the manner in which they were applied to, or affected, the applicant gave rise to a violation of the Convention […]. Accordingly, in order to be able to lodge an application in accordance with Article 34, an individual must be able to show that he was “directly affected” by the measure complained of. This is indispensable for putting the protection mechanism of the Convention into motion, although this criterion is not to be applied in a rigid, mechanical and inflexible way throughout the proceedings […]
[…]
168. In other cases the Court reiterated […] that the mere existence of laws and practices which permitted and established a system for effecting secret surveillance of communications entailed a threat of surveillance for all those to whom the legislation might be applied. This threat necessarily affected freedom of communication between users of the telecommunications services and thereby amounted in itself to an interference with the exercise of the applicants’ rights under Article 8, irrespective of any measures actually taken against them […]
[…]
171. […] Accordingly, the Court accepts that an applicant can claim to be the victim of a violation occasioned by the mere existence of secret surveillance measures, or legislation permitting secret surveillance measures, if the following conditions are satisfied. Firstly, the Court will take into account the scope of the legislation permitting secret surveillance measures by examining whether the applicant can possibly be affected by it, either because he belongs to a group of persons targeted by the contested legislation or because the legislation directly affects all users of communication services by instituting a system where any person can have his communications intercepted. Secondly, the Court will take into account the availability of remedies at the national level and will adjust the degree of scrutiny depending on the effectiveness of such remedies. […] Where the domestic system does not afford an effective remedy to the person who suspects that he was subjected to secret surveillance, widespread suspicion and concern among the general public that secret surveillance powers are being abused cannot be said to be unjustified […]. In such circumstances the threat of surveillance can be claimed in itself to restrict free communication through the postal and telecommunication services, thereby constituting for all users or potential users a direct interference with the right guaranteed by Article 8. There is therefore a greater need for scrutiny by the Court, and an exception to the rule denying individuals the right to challenge a law in abstracto is justified. In such cases the individual does not need to demonstrate the existence of any risk that secret surveillance measures were applied to him. By contrast, if the national system provides for effective remedies, a widespread suspicion of abuse is more difficult to justify. In such cases, the individual may claim to be a victim of a violation occasioned by the mere existence of secret measures or of legislation permitting secret measures only if he is able to show that, due to his personal situation, he is potentially at risk of being subjected to such measures.
[…]
(3.4.2) Assessing mass surveillance regimes against human rights safeguards
In 2013, Edward Snowden – a former intelligence contractor and whistleblower – leaked classified documents revealing the existence of global surveillance programmes run by the National Security Agency (NSA) in the USA and the Government Communications Headquarters (GCHQ) in the UK. These programmes enabled access to global internet traffic, calling records, and huge volumes of other digital communications content. Since then, concerns regarding mass surveillance have been amplified.
In her 2014 Report on The right to privacy in the digital age, the UN High Commissioner for Human Rights stressed that covert governmental mass surveillance practices were ‘emerging as a dangerous habit rather than an exceptional measure’. Today, governmental mass surveillance practices and data retention regimes – whether overt or covert – are well established and mainstream, vindicating the point that they are not an exceptional measure but a dangerous habit.
(3.4.2.1) UN General Assembly, Res 68/167, The Right to Privacy in the Digital Age, UN Doc A/RES/68/167, 18 December 2013
4. Calls upon all States:
(c) to review their procedures, practices and legislation regarding the surveillance of communications, their interception and the collection of personal data, including mass surveillance, interception and collection, with a view to upholding the right to privacy by ensuring the full and effective implementation of all their obligations under international human rights law
(3.4.2.2) ECtHR, Big Brother Watch and Others v United Kingdom, App nos 58170/13, 62322/14 and 24960/15, 25 May 2021 (Grand Chamber)
The case concerned the bulk interception regime operated in the UK, including the programmes of surveillance and intelligence sharing between the UK and the USA. The applicants brought claims under Articles 8, 10 and 6, and under Article 14 in conjunction with Articles 8 and 10. The complaints concerned three different surveillance regimes:
- the bulk interception of communications;
- the receipt of intercept material from foreign governments and intelligence agencies;
- the obtaining of communications data from communication service providers.
A notable point is that the Court accepted that Contracting States’ bulk interception regimes are ‘a valuable technological capacity to identify new threats in the digital domain’ (para 323). The framing of a bulk interception regime as a valuable technological capacity normalises mass surveillance. Moreover, the ECtHR assesses compliance by reference to the existence of safeguards against arbitrariness and abuse, on the basis of limited information about the manner in which those regimes operate.
The Court’s assessment
Preliminary remarks
322. The present complaint concerns the bulk interception of cross‑border communications by the intelligence services. While it is not the first time the Court has considered this kind of surveillance […], in the course of the proceedings it has become apparent that the assessment of any such regime faces specific difficulties. In the current, increasingly digital, age the vast majority of communications take digital form and are transported across global telecommunications networks using a combination of the quickest and cheapest paths without any meaningful reference to national borders. Surveillance which is not targeted directly at individuals therefore has the capacity to have a very wide reach indeed, both inside and outside the territory of the surveilling State. Safeguards are therefore pivotal and yet elusive. Unlike the targeted interception which has been the subject of much of the Court’s case-law, and which is primarily used for the investigation of crime, bulk interception is also – perhaps even predominantly – used for foreign intelligence gathering and the identification of new threats from both known and unknown actors. When operating in this realm, Contracting States have a legitimate need for secrecy which means that little if any information about the operation of the scheme will be in the public domain, and such information as is available may be couched in terminology which is obscure and which may vary significantly from one State to the next.
The existence of an interference
[…]
325. […] the Court considers that the stages of the bulk interception process which fall to be considered can be described as follows:
(a) the interception and initial retention of communications and related communications data (that is, the traffic data belonging to the intercepted communications);
(b) the application of specific selectors to the retained communications/related communications data;
(c) the examination of selected communications/related communications data by analysts; and
(d) the subsequent retention of data and use of the “final product”, including the sharing of data with third parties.
[…]
330. […] Article 8 applies at each of the above stages. While the initial interception followed by the immediate discarding of parts of the communications does not constitute a particularly significant interference, the degree of interference with individuals’ Article 8 rights will increase as the bulk interception process progresses. In this regard, the Court has clearly stated that even the mere storing of data relating to the private life of an individual amounts to an interference within the meaning of Article 8 […], and that the need for safeguards will be all the greater where the protection of personal data undergoing automatic processing is concerned […]
Whether the interference was justified
- General principles relating to secret measures of surveillance, including the interception of communications
[…]
333. The meaning of “foreseeability” in the context of secret surveillance is not the same as in many other fields. In the special context of secret measures of surveillance, such as the interception of communications, “foreseeability” cannot mean that individuals should be able to foresee when the authorities are likely to resort to such measures so that they can adapt their conduct accordingly. However, especially where a power vested in the executive is exercised in secret, the risks of arbitrariness are evident. It is therefore essential to have clear, detailed rules on secret surveillance measures, especially as the technology available for use is continually becoming more sophisticated. The domestic law must be sufficiently clear to give citizens an adequate indication as to the circumstances in which and the conditions on which public authorities are empowered to resort to any such measures […]. Moreover, the law must indicate the scope of any discretion conferred on the competent authorities and the manner of its exercise with sufficient clarity to give the individual adequate protection against arbitrary interference.
[…]
335. […] In its case-law on the interception of communications in criminal investigations, the Court has developed the following minimum requirements that should be set out in law in order to avoid abuses of power: (i) the nature of offences which may give rise to an interception order; (ii) a definition of the categories of people liable to have their communications intercepted; (iii) a limit on the duration of interception; (iv) the procedure to be followed for examining, using and storing the data obtained; (v) the precautions to be taken when communicating the data to other parties; and (vi) the circumstances in which intercepted data may or must be erased or destroyed […]
336. Review and supervision of secret surveillance measures may come into play at three stages: when the surveillance is first ordered, while it is being carried out, or after it has been terminated. As regards the first two stages, the very nature and logic of secret surveillance dictate that not only the surveillance itself but also the accompanying review should be effected without the individual’s knowledge. Consequently, since the individual will necessarily be prevented from seeking an effective remedy of his or her own accord or from taking a direct part in any review proceedings, it is essential that the procedures established should themselves provide adequate and equivalent guarantees safeguarding his or her rights. In a field where abuse in individual cases is potentially so easy and could have such harmful consequences for democratic society as a whole, the Court has held that it is in principle desirable to entrust supervisory control to a judge, judicial control offering the best guarantees of independence, impartiality and a proper procedure […]
[…]
338. As to the question whether an interference was “necessary in a democratic society” in pursuit of a legitimate aim, the Court has recognised that the national authorities enjoy a wide margin of appreciation in choosing how best to achieve the legitimate aim of protecting national security […]
339. However, this margin is subject to European supervision embracing both legislation and decisions applying it. In view of the risk that a system of secret surveillance set up to protect national security (and other essential national interests) may undermine or even destroy the proper functioning of democratic processes under the cloak of defending them, the Court must be satisfied that there are adequate and effective guarantees against abuse. The assessment depends on all the circumstances of the case, such as the nature, scope and duration of the possible measures, the grounds required for ordering them, the authorities competent to authorise, carry out and supervise them, and the kind of remedy provided by the national law. […]
- The approach to be followed in bulk interception cases
348. It is clear that the first two of the six “minimum safeguards” [see paragraph 335] which the Court, in the context of targeted interception, has found should be defined clearly in domestic law in order to avoid abuses of power […], are not readily applicable to a bulk interception regime. Similarly, the requirement of “reasonable suspicion”, which can be found in the Court’s case-law on targeted interception in the context of criminal investigations is less germane in the bulk interception context, the purpose of which is in principle preventive, rather than for the investigation of a specific target and/or an identifiable criminal offence. Nevertheless, the Court considers it imperative that when a State is operating such a regime, domestic law should contain detailed rules on when the authorities may resort to such measures. In particular, domestic law should set out with sufficient clarity the grounds upon which bulk interception might be authorised and the circumstances in which an individual’s communications might be intercepted. The remaining four minimum safeguards defined by the Court in its previous judgments — that is, that domestic law should set out a limit on the duration of interception, the procedure to be followed for examining, using and storing the data obtained, the precautions to be taken when communicating the data to other parties, and the circumstances in which intercepted data may or must be erased or destroyed — are equally relevant to bulk interception.
349. […] In the context of bulk interception the importance of supervision and review will be amplified, because of the inherent risk of abuse and because the legitimate need for secrecy will inevitably mean that, for reasons of national security, States will often not be at liberty to disclose information concerning the operation of the impugned regime.
350. Therefore, in order to minimise the risk of the bulk interception power being abused, the Court considers that the process must be subject to “end-to-end safeguards”, meaning that, at the domestic level, an assessment should be made at each stage of the process of the necessity and proportionality of the measures being taken; that bulk interception should be subject to independent authorisation at the outset, when the object and scope of the operation are being defined; and that the operation should be subject to supervision and independent ex post facto review. […]
351. Turning first to authorisation, the Grand Chamber agrees with the Chamber that while judicial authorisation is an “important safeguard against arbitrariness” it is not a “necessary requirement” […]. Nevertheless, bulk interception should be authorised by an independent body; that is, a body which is independent of the executive.
352. Furthermore, in order to provide an effective safeguard against abuse, the independent authorising body should be informed of both the purpose of the interception and the bearers or communication routes likely to be intercepted. This would enable the independent authorising body to assess the necessity and proportionality of the bulk interception operation and also to assess whether the selection of bearers is necessary and proportionate to the purposes for which the interception is being conducted.
353. The use of selectors – and strong selectors in particular – is one of the most important steps in the bulk interception process, as this is the point at which the communications of a particular individual may be targeted by the intelligence services. However, while some systems allow for the prior authorisation of categories of selectors […], the Court notes that the Governments of both the United Kingdom and the Netherlands have submitted that any requirement to explain or substantiate selectors or search criteria in the authorisation would seriously restrict the effectiveness of bulk interception […]
354. Taking into account the characteristics of bulk interception […], the large number of selectors employed and the inherent need for flexibility in the choice of selectors, which in practice may be expressed as technical combinations of numbers or letters, the Court would accept that the inclusion of all selectors in the authorisation may not be feasible in practice. Nevertheless, given that the choice of selectors and query terms determines which communications will be eligible for examination by an analyst, the authorisation should at the very least identify the types or categories of selectors to be used.
355. Moreover, enhanced safeguards should be in place when strong selectors linked to identifiable individuals are employed by the intelligence services. The use of every such selector must be justified – with regard to the principles of necessity and proportionality – by the intelligence services and that justification should be scrupulously recorded and be subject to a process of prior internal authorisation providing for separate and objective verification of whether the justification conforms to the aforementioned principles.
356. Each stage of the bulk interception process – including the initial authorisation and any subsequent renewals, the selection of bearers, the choice and application of selectors and query terms, and the use, storage, onward transmission and deletion of the intercept material – should also be subject to supervision by an independent authority and that supervision should be sufficiently robust to keep the “interference” to what is “necessary in a democratic society” […]
357. Finally, an effective remedy should be available to anyone who suspects that his or her communications have been intercepted by the intelligence services, either to challenge the lawfulness of the suspected interception or the Convention compliance of the interception regime. In the targeted interception context, the Court has repeatedly found the subsequent notification of surveillance measures to be a relevant factor in assessing the effectiveness of remedies before the courts and hence the existence of effective safeguards against the abuse of surveillance powers. However, it has acknowledged that notification is not necessary if the system of domestic remedies permits any person who suspects that his or her communications are being or have been intercepted to apply to the courts; in other words, where the courts’ jurisdiction does not depend on notification to the interception subject that there has been an interception of his or her communications […]
[…]
361. In assessing whether the respondent State acted within its margin of appreciation […] the Court will examine whether the domestic legal framework clearly defined:
- the grounds on which bulk interception may be authorised;
- the circumstances in which an individual’s communications may be intercepted;
- the procedure to be followed for granting authorisation;
- the procedures to be followed for selecting, examining and using intercept material;
- the precautions to be taken when communicating the material to other parties;
- the limits on the duration of interception, the storage of intercept material and the circumstances in which such material must be erased and destroyed;
- the procedures and modalities for supervision by an independent authority of compliance with the above safeguards and its powers to address non-compliance;
- the procedures for independent ex post facto review of such compliance and the powers vested in the competent body in addressing instances of non-compliance.
(3.4.3) Communications-related data (metadata)
What is metadata? Watch the video Metadata Explained, by Privacy International
(3.4.3.1) Examples of metadata
Metadata is all data relating to a communication other than the content of the communication itself; the brief sketch after the lists below illustrates how such data can be combined.
Phone calls
- data to trace and identify the source of a communication and its destination
- date, time, duration and type of a communication
- users’ communication equipment
- location of mobile communication equipment
- name and address of a subscriber
- the calling telephone number and the number called
Telecommunication services
- IP address for Internet services
- websites visited
- connections to websites
- email addresses
- mapping of social networks
- location tracking
- mapping of communication patterns
- insights into who they have interacted with
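The following is a small, hypothetical sketch of why such metadata, taken together, can be revealing even though no content is collected. All records, names and cell identifiers below are invented for illustration; real analyses operate on vastly larger datasets.

```python
# Illustrative only: a handful of invented call records (no content) are enough
# to infer likely home location and closest contacts.
from collections import Counter

call_records = [
    # (caller, callee, start time, duration in seconds, serving cell)
    ("user-A", "clinic-line", "2024-03-01 09:05", 310, "cell-17"),
    ("user-A", "contact-B", "2024-03-01 21:40", 1250, "cell-03"),
    ("user-A", "contact-B", "2024-03-02 22:10", 980, "cell-03"),
    ("user-A", "helpline", "2024-03-03 01:15", 600, "cell-03"),
    ("user-A", "contact-C", "2024-03-03 12:30", 45, "cell-17"),
]

# Who does user-A talk to most, and for how long? (mapping of communication patterns)
talk_time = Counter()
for caller, callee, start, duration, cell in call_records:
    talk_time[callee] += duration
print(talk_time.most_common(2))  # closest contacts by total call time

# Which cell serves user-A late at night? (location tracking -> likely home area)
night_cells = Counter(
    cell
    for _, _, start, _, cell in call_records
    if int(start[11:13]) >= 21 or int(start[11:13]) < 6
)
print(night_cells.most_common(1))  # probable home cell, e.g. "cell-03"
```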
(3.4.3.2) The evolving technology of metadata and the human rights protection
- metadata as an evolving technology and seriousness of interference with privacy
Metadata and the technologies related to it evolve over time. The technological capacity to create metadata, together with growing storage capacity and processing power, enables the creation and analysis of very large amounts of personal information. Metadata can also be processed and analysed more easily than content because of its structured format. Due to the radical increase in digital communications and these technological advances over the past two decades, communications-related data, taken as a whole, now reveal an intimate portrait of a person (e.g., private life, behaviour, relationships, preferences, identity). The interference with the right to privacy can therefore be particularly serious; often as serious as (if not more serious than) an interference with the content of communications.
- response by human rights law
The evolving seriousness of the interferences posed by the collection, storage and analysis of metadata means that the human rights law standards for assessing compliance with the right to privacy must adjust accordingly. The material below gives a snapshot of this evolution from a human rights law point of view. In its 2014 report, the UN High Commissioner for Human Rights contests the argument regularly made by many states at that time that the interception/collection of metadata, as opposed to the content of the communication, does not on its own constitute an interference with privacy. In the same year (2014), the CJEU, in the Digital Rights case, emphasises that the interception/collection of metadata can qualify as a serious interference with privacy. The ECtHR reaches the same conclusion in its 2021 judgment in the Big Brother Watch case, and articulates the seriousness of the interference much more clearly in its recent 2024 judgments. Interestingly, the ECtHR accepts that the safeguards applicable to metadata do not have to be identical in every respect to those governing the content of communications.
(3.4.3.2.1) Report of the Office of the UN High Commissioner for Human Rights, The right to privacy in the digital age, UN Doc A/HRC/27/37, 30 June 2014
19. […] It has been suggested that the interception or collection of data about a communication, as opposed to the content of the communication, does not on its own constitute an interference with privacy. From the perspective of the right to privacy, this distinction is not persuasive. The aggregation of information commonly referred to as “metadata” may give an insight into an individual’s behaviour, social relationships, private preferences and identity that go beyond even that conveyed by accessing the content of a private communication.
(3.4.3.2.2) Court of Justice of the European Union, Digital Rights Ireland and Seitlinger and Others, C-293/12 and C-594/12, 8 April 2014 (Grand Chamber)
26. […] [The data which providers of publicly available electronic communications services or of public communications networks must retain, pursuant to Articles 3 and 5 of Directive 2006/24] make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.
27. Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.
[…]
29. The retention of data for the purpose of possible access to them by the competent national authorities, as provided for by Directive 2006/24, directly and specifically affects private life and, consequently, the rights guaranteed by Article 7 of the Charter. Furthermore, such a retention of data also falls under Article 8 of the Charter because it constitutes the processing of personal data within the meaning of that article and, therefore, necessarily has to satisfy the data protection requirements arising from that article […]
[…]
37. […] The interference caused by Directive 2006/24 with the fundamental rights laid down in Articles 7 and 8 of the Charter is […] wide-ranging, and it must be considered to be particularly serious. Furthermore, […] the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the minds of the persons concerned the feeling that their private lives are the subject of constant surveillance.
(3.4.3.2.3) ECtHR, Škoberne v Slovenia, App no 19920/20, 15 February 2024
133. […] telecommunications data […], when linked to a subscriber or a user, can reveal intimate pictures of his or her life through the mapping of social networks, location tracking, the mapping of communication patterns, and insight into who they have interacted with […]. In the Court’s view, the systemic surveillance entailed by the mandatory retention of telecommunications data presents an impediment to the enjoyment of the privacy rights of all users of telecommunication services. The existence of large collections of telecommunications data and the ongoing retention of such data could understandably generate a sense of vulnerability and exposure and could prejudice persons’ ability to enjoy privacy and the confidentiality of correspondence, to develop relations with others and to exercise other fundamental rights. In this regard the Court also refers to the observations made by the CJEU in Digital Rights Ireland and Others […]
134. […] the Court finds that the interference constituted by the data retention under consideration was of a serious nature. It must now determine whether the means provided under the impugned legislation for the achievement of the legitimate aims […] complied with the requirements of the principle of the rule of law and remained in all respects within the bounds of what was necessary in a democratic society.
(3.4.3.2.4) ECtHR, Pietrzak and Bychawska-Siniarska and Others v Poland, App nos 72038/17 and 25237/18, 28 May 2024 (in French; unofficial translation)
249. […] The acquisition of associated communications data in the context of bulk interception can be just as intrusive as the acquisition of the content of communications, and therefore the interception, retention, and searches of such data must be analysed in light of the same safeguards as those applicable to the content of communications, without it being necessary for the legal provisions governing the processing of such data to be identical in every respect to those governing the processing of the content of communications […]
(3.4.3.2.5) Not all metadata is the same: interference, while not trivial, rather limited
ECtHR, Breyer v Germany, App no 50001/12, 30 January 2020
The case concerned the statutory obligation imposed on service providers to store personal data of users of prepaid mobile telephone SIM cards and make them available to authorities upon request.
90. The Court acknowledges that pre-registration of mobile-telephone subscribers strongly simplifies and accelerates investigation by law‑enforcement agencies and can thereby contribute to effective law enforcement and prevention of disorder or crime. […] The Court reiterates that in a national security context, national authorities enjoy a certain margin of appreciation when choosing the means for achieving a legitimate aim and notes that according to the comparative-law report, there is no consensus between the member States as regards the retention of subscriber information of prepaid SIM card customers […]
91. The question, however, remains whether the interference was proportionate and struck a fair balance between the competing public and private interests.
92. […] The Court has to establish the level of interference with the applicants’ right to private life. […] Only a limited data set was stored. These data did not include any highly personal information or allow the creation of personality profiles or the tracking of the movements of mobile-telephone subscribers. Moreover, no data concerning individual communication events were stored. The level of interference therefore has to be clearly distinguished from the Court’s previous cases that concerned, for example, “metering” […], geolocation […] or the storage of health or other sensitive data […] Moreover, the case has to be distinguished from cases in which the registration in a particular database led to frequent checks or further collection of private information […].
[…]
95. In sum, the Court concludes that the interference was, while not trivial, of a rather limited nature.
(3.4.4) Exercise: In addition to the right to privacy, what other rights may also be affected by mass surveillance practices?
Report of the Office of the United Nations High Commissioner for Human Rights, The right to privacy in the digital age, UN Doc A/HRC/27/37, 30 June 2014
14. […] other rights also may be affected by mass surveillance, the interception of digital communications and the collection of personal data. These include the rights to freedom of opinion and expression […]; to freedom of peaceful assembly and association; and to family life – rights all linked closely with the right to privacy and, increasingly, exercised through digital media. Other rights, such as the right to health, may also be affected by digital surveillance practices, for example where an individual refrains from seeking or communicating sensitive health-related information for fear that his or her anonymity may be compromised. There are credible indications to suggest that digital technologies have been used to gather information that has then led to torture and other ill-treatment. Reports also indicate that metadata derived from electronic surveillance have been analysed to identify the location of targets for lethal drone strikes.
[…]
20. […] Even the mere possibility of communications [data] […] being captured creates an interference with privacy, with a potential chilling effect on rights, including those to free expression and association. The very existence of a mass surveillance programme thus creates an interference with privacy.
(3.5) Biometric/Facial Recognition and the Right to Privacy
(3.5.1) Explaining biometric recognition
- What is biometric recognition?
Biometric recognition technologies detect, capture, and transform physical characteristics or behavioural traits into machine-readable biometric data.
- physical characteristics include facial features (eye distance and size, nose length), fingerprints, iris, voice
- behavioural characteristics include keystroke, gait or voice recognition
- What is facial recognition technology?
Facial recognition technology (FRT) is a type of biometric recognition. It compares a digital image of a person’s face, captured either live or drawn from existing images and videos, against a database of faces.
- What is live recognition technology?
Live facial recognition (LFR) is a technology that uses AI to compare live camera footage of people’s faces against a database, typically a watchlist of individuals of interest. This allows for real-time identification of people and can be used in various contexts.
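To make the comparison step concrete, the following minimal sketch (in Python, using only NumPy) illustrates the core matching logic behind watchlist-based identification: a ‘probe’ face embedding extracted from camera footage is compared against stored reference embeddings and reported as a match only if the similarity exceeds a threshold. All names, vectors and the threshold are invented for illustration; real systems rely on trained face-detection and embedding models, far larger galleries, and calibrated thresholds whose error rates drive the accuracy and bias concerns discussed later in this chapter.

```python
import numpy as np

# Purely illustrative watchlist: in a real LFR system these vectors would be
# produced by a trained face-embedding model; here they are made-up placeholders.
WATCHLIST = {
    "person_A": np.array([0.12, 0.80, 0.35, 0.41]),
    "person_B": np.array([0.90, 0.10, 0.22, 0.05]),
}

MATCH_THRESHOLD = 0.9  # hypothetical cut-off; real deployments calibrate this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray):
    """Compare a probe face embedding against every watchlist entry.

    Returns (name, score) for the best match if the score exceeds the
    threshold; otherwise returns None, meaning no identification is made.
    """
    best_name, best_score = None, -1.0
    for name, reference in WATCHLIST.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= MATCH_THRESHOLD else None


# Probe embeddings that would be extracted from frames of live camera footage
print(identify(np.array([0.11, 0.79, 0.36, 0.40])))  # close to person_A -> match
print(identify(np.array([0.50, 0.50, 0.50, 0.50])))  # dissimilar to both -> None
```

The choice of threshold in such a system is not a neutral technical detail: it fixes the trade-off between false matches and missed matches, which is one reason the legal materials below treat the accuracy and supervision of these systems as central questions.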
(3.5.2) ECtHR, Glukhin v Russia, App no 11519/20, 4 July 2023
The case concerns the applicant’s administrative conviction for his failure to notify the authorities of his intention to hold a solo demonstration using a “quickly (de)assembled object”. During the investigation, the police used live facial recognition technology to process the applicant’s personal data. Glukhin v Russia is the first international judgment on facial recognition. The ECtHR:
- stressed that new technologies (live and automated facial recognition) are highly intrusive
- highlighted the chilling effect that the use of such technologies has on the effective exercise of human rights (e.g., freedom of expression or right to assembly)
Nonetheless, the ECtHR did not engage with the broader question of whether facial/biometric recognition systems may, in certain circumstances, be inherently incompatible with the right to privacy.
68. In the present case, during routine monitoring of the internet the police discovered photographs and a video of the applicant holding a solo demonstration published on a public Telegram channel. They made screenshots of the Telegram channel, stored them and allegedly applied facial recognition technology to them to identify the applicant. Having identified the location on the video as one of the stations of the Moscow underground, the police also collected video-recordings from CCTV surveillance cameras installed at that station, as well as at two other stations through which the applicant had transited. They made screenshots of those video-recordings and stored them. They also allegedly used the live facial recognition CCTV cameras installed in the Moscow underground to locate and arrest the applicant several days later with the aim of charging him with an administrative offence. The screenshots of the Telegram channel and of the video-recordings from the CCTV surveillance cameras were subsequently used in evidence in the administrative-offence proceedings against the applicant […]
72. […] the Court accepts in the particular circumstances of the case that facial recognition technology was used […].
73. The Court concludes that the processing of the applicant’s personal data in the framework of the administrative-offence proceedings against him, including the use of facial recognition technology – firstly, to identify him from the photographs and the video published on Telegram and, secondly, to locate and arrest him later while he was travelling on the Moscow underground – amounted to an interference with his right to respect for his private life within the meaning of Article 8 § 1 of the Convention.
[…]
75. The protection of personal data is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life, as guaranteed by Article 8 of the Convention. The domestic law must afford appropriate safeguards to prevent any such use of personal data as may be inconsistent with the guarantees of this Article. The need for such safeguards is all the greater where the protection of personal data undergoing automatic processing is concerned, not least when such data are used for police […], and especially where the technology available is continually becoming more sophisticated […]. The protection afforded by Article 8 of the Convention would be unacceptably weakened if the use of modern scientific techniques in the criminal-justice system were allowed at any cost and without carefully balancing the potential benefits of the extensive use of such techniques against important private-life interests […]
76. Personal data revealing political opinions, such as information about participation in peaceful protests, fall within the special categories of sensitive data attracting a heightened level of protection […]
77. In the context of the collection and processing of personal data, it is therefore essential to have clear, detailed rules governing the scope and application of measures, as well as minimum safeguards concerning, inter alia, duration, storage, usage, access of third parties, procedures for preserving the integrity and confidentiality of data, and procedures for their destruction, thus providing sufficient guarantees against the risk of abuse and arbitrariness.
[…]
83. The Court has strong doubts that the domestic legal provisions meet the “quality of law” requirement. It notes, in particular, that the domestic law permits the processing of biometric personal data “in connection with the administration of justice” […] This legal provision is widely formulated. […] The domestic law does not contain any limitations on the nature of situations which may give rise to the use of facial recognition technology, the intended purposes, the categories of people who may be targeted, or the processing of sensitive personal data. Furthermore, the Government did not refer to any procedural safeguards accompanying the use of facial recognition technology in Russia, such as the authorisation procedures, the procedures to be followed for examining, using and storing the data obtained, any supervisory control mechanisms or the available remedies.
84. The Court will further proceed on the assumption that the contested measures pursued the legitimate aim of the prevention of crime.
[…]
86. In determining whether the processing of the applicant’s personal data was “necessary in a democratic society”, the Court will first assess the level of the actual interference with the right to respect for private life […]. It notes that the police collected and stored the applicant’s digital images and used them to extract and process the applicant’s biometric personal data with the aid of facial recognition technology: first, to identify him from the photographs and the video published on Telegram and, secondly, to locate and arrest him while he was travelling on the Moscow underground. The Court considers these measures to be particularly intrusive, especially in so far as live facial recognition technology is concerned […]. A high level of justification is therefore required in order for them to be considered “necessary in a democratic society”, with the highest level of justification required for the use of live facial recognition technology. Moreover, the personal data processed contained information about the applicant’s participation in a peaceful protest and therefore revealed his political opinion. They accordingly fell within the special categories of sensitive data attracting a heightened level of protection […]
87. In the assessment of the “necessity in a democratic society” of the processing of personal data in the context of investigations, the nature and gravity of the offences in question is one of the elements to be taken into account […]. The domestic law permits the processing of biometric personal data in connection with the investigation and prosecution of any offence, irrespective of its nature and gravity.
88. The Court observes that the applicant was prosecuted for a minor offence consisting of holding a solo demonstration without prior notification – an offence classified as administrative rather than criminal under the domestic law. He was never accused of committing any reprehensible acts during his demonstration, such as the obstruction of traffic, damage to property or acts of violence. It was never claimed that his actions presented any danger to public order or transport safety. The Court has already found that the administrative-offence proceedings against the applicant breached his right to freedom of expression […]. It considers that the use of highly intrusive facial recognition technology to identify and arrest participants in peaceful protest actions could have a chilling effect in relation to the rights to freedom of expression and assembly.
89. In such circumstances, the use of facial recognition technology to identify the applicant from the photographs and the video published on Telegram – and a fortiori the use of live facial recognition technology to locate and arrest him while he was travelling on the Moscow underground – did not correspond to a “pressing social need”.
90. In the light of all the above considerations, the Court concludes that the use of highly intrusive facial recognition technology in the context of the applicant’s exercising his Convention right to freedom of expression is incompatible with the ideals and values of a democratic society governed by the rule of law, which the Convention was designed to maintain and promote. The processing of the applicant’s personal data using facial recognition technology in the framework of administrative-offence proceedings – firstly, to identify him from the photographs and the video published on Telegram and, secondly, to locate and arrest him while he was travelling on the Moscow underground – cannot be regarded as “necessary in a democratic society”.
91. There has accordingly been a violation of Article 8 of the Convention.
See also R (on the application of Edward Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058 (Ed Bridges)
- the Court of Appeal held that South Wales Police’s use of automated facial recognition technology was not ‘in accordance with the law’ for the purposes of Article 8 ECHR and that the force had failed to comply with the public sector equality duty
(3.5.3) Prohibition of the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purposes of law enforcement under the EU AI Act
(3.5.3.1) Article 5(1)(h) EU AI Act
The following AI practices shall be prohibited:
(h) the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purposes of law enforcement, unless and in so far as such use is strictly necessary for one of the following objectives:
(i) the targeted search for specific victims of abduction, trafficking in human beings or sexual exploitation of human beings, as well as the search for missing persons
(ii) the prevention of a specific, substantial and imminent threat to the life or physical safety of natural persons or a genuine and present or genuine and foreseeable threat of a terrorist attack
(iii) the localisation or identification of a person suspected of having committed a criminal offence, for the purpose of conducting a criminal investigation or prosecution or executing a criminal penalty for offences referred to in Annex II and punishable in the Member State concerned by a custodial sentence or a detention order for a maximum period of at least four years
Point (h) of the first subparagraph is without prejudice to Article 9 of Regulation (EU) 2016/679 for the processing of biometric data for purposes other than law enforcement.
(3.5.3.2) Rationale underpinning the prohibition in Article 5(1)(h) EU AI Act
Commission Guidelines on prohibited artificial intelligence practices established by the EU AI Act, C(2025) 884 final, 4 February 2025
(293) Recital 32 AI Act acknowledges that the use of real-time RBI systems in publicly accessible spaces for law enforcement purposes is particularly intrusive to the rights and freedoms of the persons concerned, to the extent that it may affect the private life of a large part of the population, evoke a feeling of constant surveillance, and indirectly dissuade the exercise of the freedom of assembly and other fundamental rights. Technical inaccuracies of AI systems intended for the remote biometric identification of natural persons can lead to biased results and entail discriminatory effects. Such possibly biased results and discriminatory effects are particularly relevant with regard to age, ethnicity, race, sex or disabilities. In addition, the immediacy of the impact and the limited opportunities for further checks or corrections in relation to the use of such systems operating in real time carry heightened risks for the rights and freedoms of the persons concerned in the context of, or impacted by, law enforcement activities.
(294) However, where the use of such systems is strictly necessary to achieve a substantial public interest, and where the situations in which such use may occur are exhaustively listed and narrowly defined, that public interest is considered to outweigh the risks to fundamental rights (Recital 33 AI Act). To ensure that such systems are used in a ‘responsible and proportionate manner’, their use is subject to the safeguards and the specific obligations and requirements laid down in Article 5(2)-(7) AI Act.
(3.5.3.3) Understanding the prohibition
Article 5(1)(h) AI Act prohibits the use of ‘real-time’ remote biometric identification (RBI) systems in publicly accessible spaces for the purposes of law enforcement, unless and in so far as such use is strictly necessary for one of the three objectives exhaustively listed in points (i) to (iii) (see section 3.5.3.4 below).
By contrast, the carve-out for the labelling or filtering of lawfully acquired biometric datasets in the area of law enforcement attaches to the prohibition of biometric categorisation systems in Article 5(1)(g) AI Act and does not form part of the prohibition in Article 5(1)(h).
(3.5.3.4) Exceptions to the prohibition
Article 5(1)(h)(i)-(iii) provides three exceptions to the general prohibition on the use of real-time RBI systems in publicly accessible spaces for law enforcement purposes:
a) the targeted search for specific victims of three serious crimes (abduction, trafficking in human beings and sexual exploitation of human beings), as well as the search for missing persons
b) the prevention of imminent threats to life or physical safety or a genuine threat of terrorist attacks
c) the localisation and identification of suspects and offenders of certain serious crimes (as listed in Annex II to the AI Act)
These exceptions:
- are prescribed exhaustively
- apply only where the use is strictly necessary for the objective pursued
- must fulfil the conditions and safeguards set out in Article 5(2) to 5(7) AI Act
- must have a basis in the national law of the Member State concerned
Where a Member State authorises the use of real-time RBI systems in publicly accessible spaces for law enforcement purposes for any of these three objectives, the rules for high-risk AI systems apply to that use.
Further reading
- DJ Solove, ‘A Taxonomy of Privacy’ (2006) 154 University of Pennsylvania Law Review 477
- DJ Solove and NM Richards, ‘Privacy’s Other Path: Recovering the Law of Confidentiality’ (2007) 96 Georgetown Law Journal 123
- A Rachovitsa and N Johann, ‘The Use of AI in the Digital Welfare State and Lessons Learned from the SyRI Case’ (2022) 22 Human Rights Law Review 1
- B Mittelstadt, ‘From Individual to Group Privacy in Big Data Analytics’ (2017) 30 Philosophy & Technology 478
- D Murray, ‘Police Use of Retrospective Facial Recognition Technology: A Step Change in Surveillance Capability Necessitating an Evolution of the Human Rights Law Framework’ (2023) Modern Law Review 1
- Have a look at the work of the UN Special Rapporteur on the right to privacy, including the annual thematic reports.
- On children’s rights in relation to the digital environment, read the Committee on the Rights of the Child, General Comment No 25 (2021).
- For a gender perspective on privacy and technology, see the report of the UN Special Rapporteur on the right to privacy, UN Doc A/HRC/40/63, 27 February 2019.
- On data colonialism, see N Couldry and UA Mejias, ‘Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject’ (2019) 20 Television & New Media.