Legal analysis of an individual’s situation is not their personal data, says Advocate General

December 18th, 2013 by Robin Hopkins

YS, M and S were three people who applied for lawful residence in the Netherlands. The latter two had their applications granted, but YS’ was refused. All three wanted to see a minute drafted by an official of the relevant authority in the Netherlands containing internal legal analysis on whether to grant them residence status. They made subject access requests under Dutch data protection law, the relevant provisions of which implement Article 12 of Directive 95/46/EC. They were given some of the contents of the minutes, but the legal analysis was withheld. This was challenged before the Dutch courts. Questions were referred to the CJEU on the application of data protection law to such information. In Joined Cases C‑141/12 and C‑372/12, Advocate General Sharpston has given her opinion, which the CJEU will consider before giving its judgment next year. Here are some important points from the AG’s opinion.

The definition of personal data

The minutes in question contained inter alia: the name, date of birth, nationality, sex, ethnicity, religion and language of the applicant; information about the procedural history; information about declarations made by the applicant and documents submitted; the applicable legal provisions and an assessment of the relevant information in the light of the applicable law.

Apart from the latter – the legal analysis – the AG’s view is that this information does come within the meaning of personal data under the Directive. She said this:

“44. In general, ‘personal data’ is a broad concept. The Court has held that the term covers, for example, ‘the name of a person in conjunction with his telephone coordinates or information about his working conditions or hobbies’, his address, his daily work periods, rest periods and corresponding breaks and intervals, monies paid by certain bodies and the recipients, amounts of earned or unearned incomes and assets of natural persons.

45. The actual content of that information appears to be of no consequence as long as it relates to an identified or identifiable natural person. It can be understood to relate to any facts regarding that person’s private life and possibly, where relevant, his professional life (which might involve a more public aspect of that private life). It may be available in written form or be contained in, for example, a sound or image.”

The suggestion in the final paragraph is that the information need not have a substantial bearing on the individual’s privacy in order to constitute their personal data.

The AG also observed that “Directive 95/46 does not establish a right of access to any or every document or file in which personal data are listed or used” (paragraph 71). This resonates with the ‘notions of assistance’ long established in the UK by Durant.

Legal analysis is not personal data

AG Sharpston’s view, however, was that the legal analysis of the individuals’ situations did not constitute their personal data. Her reasoning – complete with illustrative examples – is as follows:

“55. I am not convinced that the phrase ‘any information relating to an identified or identifiable natural person’ in Directive 95/46 should be read so widely as to cover all of the communicable content in which factual elements relating to a data subject are embedded.

56. In my opinion, only information relating to facts about an individual can be personal data. Except for the fact that it exists, a legal analysis is not such a fact. Thus, for example, a person’s address is personal data but an analysis of his domicile for legal purposes is not.

57. In that context, I do not find it helpful to distinguish between ‘objective’ facts and ‘subjective’ analysis. Facts can be expressed in different forms, some of which will result from assessing whatever is identifiable. For example, a person’s weight might be expressed objectively in kilos or in subjective terms such as ‘underweight’ or ‘obese’. Thus, I do not exclude the possibility that assessments and opinions may sometimes fall to be classified as data.

58. However, the steps of reasoning by which the conclusion is reached that a person is ‘underweight’ or ‘obese’ are not facts, any more than legal analysis is.”

Interestingly, her conclusion did touch upon the underlying connection between personal data and privacy. At paragraph 60, she observed that “… legal analysis as such does not fall within the sphere of an individual’s right to privacy. There is therefore no reason to assume that that individual is himself uniquely qualified to verify and rectify it and ask that it be erased or blocked. Rather, it is for an independent judicial authority to review the decision for which that legal analysis was prepared.”

In any event, legal analysis does not amount to “processing” for data protection purposes

The AG considered that legal analysis such as this was neither ‘automatic’ nor part of a ‘relevant filing system’. “Rather, it is a process controlled entirely by individual human intervention through which personal data (in so far as they are relevant to the legal analysis) are assessed, classified in legal terms and subjected to the application of the law, and by which a decision is taken on a question of law. Furthermore, that process is neither automatic nor directed at filing data” (paragraph 63).

Entitlement to data, but not in a set form

The AG also says that what matters is that individuals are provided with their data – data controllers are not, under the Directive, required to provide it in any particular form. For example, they can extract or transcribe rather than photocopy the relevant minute:

“74. Directive 95/46 does not require personal data covered by the right of access to be made available in the material form in which they exist or were initially recorded. In that regard, I consider that a Member State has a considerable margin of discretion to determine, based on the individual circumstances in each case, the form in which to make personal data accessible.

75. In making that assessment, a Member State should take account of, in particular: (i) the material form(s) in which that information exists and can be made available to the data subject, (ii) the type of personal data and (iii) the objectives of the right of access.”

If the legal analysis is personal data, then the exemptions do not apply

Under the Directive, Article 12 provides the subject access right. Article 13 provides exemptions. The AG’s view was that if, contrary to her opinion, the legal analysis is found to be personal data, then exemptions from the duty to communicate that data would not be available. Of particular interest was her view concerning the exemption under Article 13(1)(g) for the “protection of the data subject or of the rights and freedoms of others”. Her view is that (paragraph 84):

“the protection of rights and freedoms of others (that is, other than the data subject) cannot be read as including rights and freedoms of the authority processing personal data. If a legal analysis is to be categorised as personal data, that must be because it is related to the private interests of an identified or identifiable person. Whilst the public interest in protecting internal advice in order to safeguard the administration’s ability to exercise its functions may indeed compete with the public interest in transparency, access to such advice cannot be restricted on the basis of the first of those two interests, because access covers only what falls within the private interest.”

If the Court agrees with the AG’s view, the case will be an important addition to case law offering guidance on the limits of personal data. It would also appear to limit, at least as regards the exemption outlined above, the data controller’s ability to rely on its own interests or on public interests to refuse subject access requests. That said, there is of course the exemption under Article 9 of the Directive for freedom of expression.

Robin Hopkins @hopkinsrobin

Facebook fan pages: data protection buck stops with Facebook, not page owners

October 22nd, 2013 by Robin Hopkins

In Re Facebook, VG, Nos. 8 A 37/12, 8 A 14/12, 8 A 218/11, 10/9/13, the Schleswig-Holstein Administrative Court has allowed Facebook’s appeals against rulings of the regional data protection authority (the ULD), headed by Thilo Weichert.

The case involved a number of companies’ use of Facebook fan pages. The ULD’s view was that Facebook breached German privacy law, including through its use of cookies, facial recognition and other data processing. Weichert considered that, by using Facebook fan pages, the companies were facilitating Facebook’s violations by processing users’ personal data on those pages. He ordered them to shut down the fan pages or face fines of up to €50,000.

The appellant companies argued that they could not be held responsible for data protection violations (if any) allegedly committed by Facebook, as they had no control over how the data on those pages was processed and used by the social networking site. The Administrative Court agreed.

The case raises interesting questions about where the buck stops in terms of data processing – both in terms of who controls the processing, and in terms of where they are based. Facebook is based in Ireland, without a substantive operational presence in Germany. Earlier this year, the Administrative Court found – again against the Schleswig-Holstein ULD’s ruling – that Facebook’s ‘real names’ policy (i.e. a ban on pseudonymised profiles) was a matter for Irish rather than German law.

The ULD is unlikely to be impressed by the latest judgment, given that its head is reported as having said in 2011 that:

“We see a much bigger privacy issue behind the Facebook case: the main business model of Google, Apple, Amazon and others is based on privacy law infringements. This is the reason why Facebook and all the other global internet players are so reluctant in complying with privacy law: they would lose their main profit resource.”

For more on this story, see links here and here.

Robin Hopkins

Fingerprints requirement for passport does not infringe data protection rights

October 22nd, 2013 by Robin Hopkins

Mr Schwarz applied to his regional authority, the city of Bochum, for a passport. He was required to submit a photograph and fingerprints. He did not like the fingerprint part. He considered it unduly invasive. He refused. So Bochum refused to give him a passport. He asked the court to order it to give him one. The court referred questions to the Court of Justice of the European Union about whether the requirement to submit fingerprints in addition to photographs complied with the Data Protection Directive 95/46/EC.

Last week, the Fourth Chamber of the CJEU gave its judgment: the requirement is data protection-compliant.

The requirement had a legal basis, namely Article 1(2) of Council Regulation 2252/2004, which set down minimum security standards for identity-confirmation purposes in passports.

This pursued a legitimate aim, namely preventing illegal entry into the EU.

Moreover, while the requirements entailed the processing of personal data and an interference with privacy rights, the ‘minimum security standards’ rules continued to “respect the essence” of the individual’s right to privacy.

The fingerprint requirement was proportionate because while the underlying technology is not 100% successful in fraud-detection terms, it works well enough. The only real alternative as an identity-verifier is an iris scan, which is no less intrusive and is technologically less robust. The taking of fingerprints is not very intrusive or intimate – it is comparable to having a photograph taken for official purposes, which people don’t tend to complain about when it comes to passports.

Importantly, the underlying Regulation provided that the fingerprints could only be used for identity-verification purposes and that there would be no central database of fingerprints (instead, each set is stored only in the passport).

This is all common-sense stuff in terms of data protection compliance. Data controllers, take heart!

Robin Hopkins

Refusal to destroy part of a ‘life story’ justified under Article 8(2) ECHR

October 4th, 2013 by Robin Hopkins

The High Court of Justice (Northern Ireland) has today given judgment In the matter of JR60’s application for judicial review [2013] NIQB 93. The applicant sought to challenge the right of the two Social Care Trusts to keep and use various records generated when she was a resident of children’s homes and a training school between 1978 and 1991.

In most challenges to the retention of records, the applicant seeks to expunge information which suggests that they have done wrong. This application is interesting because it focused (though not exclusively) on what the applicant had suffered, as opposed to what she had done. In short, she wished to erase from the record a part of her life story which was painful for her to recall. The application failed: there were weightier reasons for retaining those records and, in any event, whatever her current wish to forget matters of such import, she might come to change her mind.

The applicant was described as having had a very difficult childhood, to which those records relate. It was not known who her father was. She had grown up to achieve impressive qualifications. Horner J described her as having “survived the most adverse conditions imaginable and triumphed through the force of her will. By any objective measurement she is a success”.

She wished to move on, and to have the records about her childhood expunged. The Trusts refused; their policy was to retain such information for a 75-year period. The applicant challenged this refusal on Article 8 ECHR grounds. Horner J readily agreed that the retention of such information interfered with her rights under Article 8, but dismissed her application on the grounds that the interference was justified.

The applicant had argued that (i) she did not intend to make any claim for ill-treatment or abuse while she was in care, (ii) she did not want to retrieve information about her life story, (iii) she did not want the records to be used to carry out checks on her, as persons who were not in care would not be burdened by such records in respect of their early lives, and (iv) she did not want others, including her own child, to be able to access these records.

In response to the applicant’s assertion that she did not want and did not envisage wanting access to her records, Horner J said this at paragraph 19:

“Even if the applicant does not want to know at present what is in her records, it does not follow that she may not want to find out in the future what they contain for all sorts of reasons. She may, following the birth of a grandchild, be interested in her personal history for that grandchild’s sake. She may want to find out about her genetic inheritance because she may discover, for example, that she, or her off-spring, is genetically predisposed to a certain illness whether mental or physical. She may want to know whether or not this has been passed down through her mother’s side or her father’s side. There may be other reasons about which it is unnecessary to speculate that will make her want to seek out her lost siblings. There are any number of reasons why she may change her mind in the future about accessing her care records. Of course, if the records are destroyed then the opportunity to consider them is lost forever.”

The Trusts argued that they needed to retain such records for the purposes of their own accountability, for any background checks on the applicant or related individuals which might become necessary, for (hypothetical) public interest issues such as inquiries, and for responding to subject access requests under the Data Protection Act 1998. Horner J observed that the “right for an individual to be able to establish details of his or her identity applies not just to the Looked After Child but also, inter alia, to that child’s offspring”.

In the circumstances, the application failed; the Trusts’ interference with the applicant’s Article 8 rights was justified.

Horner J added a short concluding observation about the DPA (paragraph 29):

“It is significant that no challenge has been made to the Trust’s storage of personal information of the applicant on the basis that such storage constitutes a breach of the Data Protection Act 1998. This Act strengthens the safeguards under the 1984 Act which it replaced. The Act protects “personal data”, which is data relating to a living individual who can be identified from data whether taken alone or read with other information which is in the possession (or is likely to come into possession) of the data controller: see 12-63 of Clayton and Tomlinson on The Law of Human Rights (2nd Edition). It will be noted that “personal” has been interpreted as almost meaning the same as “private”: see Durant v Financial Services Authority [2004] FSR 28 at paragraph [4].”

Robin Hopkins

Anonymity: publication and open justice

July 11th, 2013 by Robin Hopkins

The tension between transparency and individual privacy is part of what makes information rights such a fascinating and important area. When it comes to issues of high public interest involving particular individuals, prevailing wisdom has tended to be something like this: say as much as possible on an open basis, but redact and anonymise so as to protect the identity of the individuals involved. Increasingly, however, transparency is outmuscling privacy. See for example my post about the Tribunal’s order of disclosure, in the FOIA context, of the details of the compensation package of a Chief Executive of an NHS Trust (the case of Dicker v IC (EA/2012/0250)).

The Care Quality Commission debate is the highest-profile recent illustration: the health regulator published a consultant’s report into failings regarding the deaths of babies at Furness General Hospital, but withheld the names of the individuals being criticised (including for alleged ‘cover-ups’), relying on the Data Protection Act 1998. The anonymisation was not endorsed by the Information Commissioner, and attracted widespread criticism in media and political circles. Transparency pressures held sway.

In a similar vein, the BBC has come under great pressure over the past week – particularly from Parliament’s Public Accounts Committee – to reveal the names of approximately 150 departing senior managers who received pay-offs averaging £164,000 in the past three years. As the Telegraph reports, the Committee is threatening to use parliamentary privilege to publish those names. The BBC admits that it “got things wrong” by overpaying in many cases (as confirmed by the National Audit Office), but is concerned to protect the data protection and privacy rights of the affected individuals, as well as to safeguard its own independence. The Committee says the public interest in transparency is compelling; Lord Patten, chair of the BBC Trust, says there will be “one hell of an argument” about this.

Such arguments become all the more thorny in the context of open justice disputes, of which there have been a number in recent weeks.

In the matter of Global Torch Ltd/Apex Global Management Ltd (The Guardian, The Financial Times and others intervening) [2013] EWCA Civ 819 involved competing unfair prejudice petitions alleging misconduct in the affairs of a particular company. Two Saudi Arabian princes and one of their private advisers applied to have the interlocutory hearings held in private under CPR rule 39.2(3). The Court of Appeal agreed with the judge who dismissed those applications. It rejected the contention that the judge had elevated open justice above Article 8 ECHR rights as a matter of law. Rather, he noted that some general presumptions were valid (for example, open justice is likely to trump reputational damage) and applied those in the factual context of this case. Maurice Kay LJ said (paragraph 34) that there was sometimes a “need for a degree of protection so as to avoid the full application of the open justice principle exposing a victim to the very detriment which his cause of action is designed to prevent… If such an approach were to be extended to a case such as the present one, it could equally be applied to countless commercial and other cases in which allegations of serious misconduct are made. That would result in a significant erosion of the open justice principle. It cannot be justified where adequate protection exists in the form of vindication of the innocent through the judicial process to trial”.

Open justice is of course fundamental not only to freedom of expression; it is also the default setting for fair trials. This is illustrated in the regulatory/disciplinary context by Miller v General Medical Council [2013] EWHC 1934 (Admin). The case involved a challenge to a decision by a Fitness to Practise Panel of the Council’s Medical Practitioners Tribunal Service that a fitness to practise hearing should take place in private because it considered that the complainant, a former patient of the claimant, was otherwise unlikely to give evidence. HHJ Pelling quashed the decision; there was insufficient evidence for the Panel’s conclusion about witness participation, and in any event the Panel “fell into error at the outset by not reminding itself sufficiently strongly or at all that the clear default position under Article 6 is that the hearing should be in public. It failed to remind itself that Article 6 creates or declares rights that are the rights of the Claimant and that it was for the GMC to prove both the need for any derogation from those rights and for a need to derogate to the extent claimed” (paragraph 20).

Robin Hopkins

Prism and Tempora: Privacy International commences legal action

July 10th, 2013 by Robin Hopkins

Panopticon has reported in recent weeks that, following the Edward Snowden/Prism disclosures, Liberty has brought legal proceedings against the UK’s security bodies. This week, Privacy International has announced that it too is bringing a claim in the Investigatory Powers Tribunal – concerning both the Prism and Tempora programmes. It summarises its claim in these terms:

“Firstly, for the failure to have a publicly accessible legal framework in which communications data of those located in the UK is accessed after obtained and passed on by the US National Security Agency through the Prism programme.  Secondly, for the indiscriminate interception and storing of huge amounts of data via tapping undersea fibre optic cables through the Tempora programme.”

Prism-related legal complaints have also been made elsewhere on data protection grounds. Students belonging to a group called Europe vs. Facebook have filed complaints with the data protection authorities in Ireland (against Facebook and Apple), Luxembourg (against Skype and Microsoft) and Germany (against Yahoo).

European authorities have expressed concerns on these issues in their own right. For example, the Vice President of the European Commission, Viviane Reding, has written to the British Foreign Secretary, William Hague, about the Tempora programme, and has directed similar concerns at the US (including in a piece in the New York Times). The European Parliament has also announced that a panel of its Committee on Civil Liberties, Justice and Home Affairs will be convened to investigate the Prism-related surveillance of EU citizens. It says the panel will report by the end of 2013.

In terms of push-back within the US, it has been reported that Texas has introduced a bill strengthening the requirements for warrants to be obtained before any emails (as opposed to merely unread ones) can be disclosed to state and local law enforcement agencies.

Further complaints, litigation and potential legal challenges will doubtless arise concerning Prism, Tempora and the like.

Robin Hopkins

Google and data protection: no such thing as the ‘right to be forgotten’

June 28th, 2013 by Robin Hopkins

Chris Knight has blogged recently about enforcement action against Google by European Data Protection authorities (but not yet the UK’s ICO). I blogged last month about a German case (BGH, VI ZR 269/12 of 14th May 2013) concerning Google’s ‘autocomplete’ function, and earlier this year about the Google Spain case (Case C‑131/12). The latter arises out of complaints made to the Spanish data protection authority by a number of Spanish citizens whose names, when Googled, generated results linking them to allegedly false, inaccurate or out-of-date information (contrary to the data protection principles) – for example, an old story mentioning that a surgeon had been charged with criminal negligence, without mentioning that he had been acquitted. The Spanish authority ordered Google to remove the offending entries. Google challenged this order, arguing that it was for the authors or publishers of those websites to remedy such matters. The case was referred to the CJEU by the Spanish courts.

Advocate General Jääskinen this week issued his opinion in this case.

The first point concerns territorial jurisdiction. Google claims that no processing of personal data relating to its search engine takes place in Spain. Google Spain acts merely as commercial representative of Google for its advertising functions. In this capacity it has taken responsibility for the processing of personal data relating to its Spanish advertising customers. The Advocate General has disagreed with Google on this point. His view is that national data protection legislation is applicable to a search engine provider when it sets up in a member state, for the promotion and sale of advertising space on the search engine, an office which orientates its activity towards the inhabitants of that state.

The second point is substantive, and is good news for Google. The Advocate General says that Google is not generally to be considered – either in law or in fact – as a ‘data controller’ of the personal data appearing on web pages it processes. It has no control over the content included on third party web pages and cannot even distinguish between personal data and other data on those pages.

Thirdly, the Advocate General tells us that there is no such thing as the so-called “right to be forgotten” (a favourite theme of debates on the work-in-progress new Data Protection Regulation) under the current Directive. The Directive provides safeguards as to accuracy and the like, but Google had not itself said anything inaccurate here. At paragraph 108 of his opinion, the Advocate General says this:

“… I consider that the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests. The purpose of processing and the interests served by it, when compared to those of the data subject, are the criteria to be applied when data is processed without the subject’s consent, and not the subjective preferences of the latter. A subjective preference alone does not amount to a compelling legitimate ground within the meaning of Article 14(a) of the Directive.”

It remains to be seen of course whether the Court agrees with the Advocate General. The territorial issue and the ‘data controller’ question are of great significance to Google’s business model – and to those whose businesses face similar issues. The point about objectivity rather than subjectivity being the essential yardstick for compliance with data protection standards is potentially of even wider application.

“This is a good opinion for free expression,” Bill Echikson, a spokesman for Google, said in an e-mailed statement reported by Bloomberg.

Robin Hopkins

New CCTV Code of Practice: surveillance and the protection of freedoms

June 17th, 2013 by Robin Hopkins

Surveillance of the covert and digital variety has been dominating the news of late. The legal contours of the practices leaked by Edward Snowden (the NSA’s obtaining of internet metadata) and covered by The Guardian (most recently, GCHQ’s monitoring of certain communications of ‘friendly’ foreign allies) may be matters of some debate.

In the meantime, the legal contours of a more overt and physical variety of surveillance – CCTV – have been somewhat clarified.

Panopticon indeed.

As its name suggests, the Protection of Freedoms Act 2012 expressed the incoming Coalition Government’s commitment to keeping in check the state’s surveillance of ordinary citizens. By that Act (sections 29-36), the Home Secretary was to present to Parliament a Code of Practice governing the use of surveillance camera systems including CCTV and Automatic Number Plate Recognition (ANPR). Following a consultation exercise – the response to which can be read here – the Home Secretary has now done so. The Code was laid before Parliament on 4 June 2013. A draft order (the Protection of Freedoms Act 2012 (Code of Practice for Surveillance Camera Systems and Specification of Relevant Authorities) Order 2013) is currently being considered by Parliament’s Joint Committee on Statutory Instruments.

Pending its coming into force, Panopticon summarises the key features of the new Code.

To whom does the Code apply?

The Code imposes duties on ‘relevant authorities’, which are those listed at section 33(5) of the Protection of Freedoms Act 2012 – in the main, local authorities and policing authorities.

The draft order proposes to add the following to the list of relevant authorities:

(a) The chief constable of the British Transport Police;

(b) The Serious Organised Crime Agency;

(c) The chief constable of the Civil Nuclear Constabulary; and

(d) The chief constable of the Ministry of Defence Police.

The Code recognises that concern about the use of surveillance cameras often extends beyond these sorts of full-blooded ‘public’ authorities. It recognises that the list of relevant authorities may need to be expanded in future to encompass shopping centres, sports grounds, schools, transport centres and the like.

For now, however, only those listed as ‘relevant authorities’ are subject to the duties imposed by the Code. Others who use such surveillance systems are ‘encouraged’ to abide by the Code.

What duty is imposed by the Code?

The Code imposes a ‘have regard to’ duty. In other words, relevant authorities are required to have regard to the Code when exercising any of the functions to which the Code relates. As regards its legal effects:

“A failure on the part of any person to act in accordance with any provision of this code does not of itself make that person liable to criminal or civil proceedings. This code is, however, admissible in evidence in criminal or civil proceedings, and a court or tribunal may take into account a failure by a relevant authority to have regard to the code in determining a question in any such proceedings” (paragraph 1.16).

It may well be that the Code will also weigh heavily with the ICO in its consideration of any complaints that the use of surveillance cameras has breached the DPA 1998.

Remember that the Home Office Code sits alongside and does not replace the ICO’s CCTV Code of Practice.

What types of activity are covered by the new Code?

Relevant authorities must have regard to the Code ‘when exercising any of the functions to which the Code relates’. This encompasses the operation and use of, and the processing of data derived from, surveillance camera systems in public places in England and Wales, regardless of whether there is any live viewing or recording of images and associated data.

The Code does not apply to covert surveillance, as defined under the Regulation of Investigatory Powers Act 2000.

What about third party contractors?

Where a relevant authority instructs or authorises a third party to use surveillance cameras, that third party is not under the ‘have regard to’ duty imposed by the Code. That duty does, however, apply to the relevant authority’s arrangements.

By paragraph 1.11:

“The duty to have regard to this code also applies when a relevant authority uses a third party to discharge relevant functions covered by this code and where it enters into partnership arrangements. Contractual provisions agreed after this code comes into effect with such third party service providers or partners must ensure that contractors are obliged by the terms of the contract to have regard to the code when exercising functions to which the code relates.”

The approach

The guiding philosophy of the Code is one of surveillance by consent:

“The government considers that wherever overt surveillance in public places is in pursuit of a legitimate aim and meets a pressing need, any such surveillance should be characterised as surveillance by consent, and such consent on the part of the community must be informed consent and not assumed by a system operator… [legitimacy] in the eyes of the public is based upon a general consensus of support that follows from transparency about their powers, demonstrating integrity in exercising those powers and their accountability for doing so” (paragraph 1.5).

In a nutshell, the expectation is this:

“The decision to use any surveillance camera technology must, therefore, be consistent with a legitimate aim and a pressing need. Such a legitimate aim and pressing need must be articulated clearly and documented as the stated purpose for any deployment. The technical design solution for such a deployment should be proportionate to the stated purpose rather than driven by the availability of funding or technological innovation. Decisions over the most appropriate technology should always take into account its potential to meet the stated purpose without unnecessary interference with the right to privacy and family life. Furthermore, any deployment should not continue for longer than necessary” (paragraph 2.4).

The guiding principles

The Code then sets out 12 guiding principles which systems operators should follow:

(1) Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need.

(2) The use of a surveillance camera system must take into account its effect on individuals and their privacy, with regular reviews to ensure its use remains justified.

(3) There must be as much transparency in the use of a surveillance camera system as possible, including a published contact point for access to information and complaints.

(4) There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.

(5) Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.

(6) No more images and information should be stored than that which is strictly required for the stated purpose of a surveillance camera system, and such images and information should be deleted once their purposes have been discharged.

(7) Access to retained images and information should be restricted and there must be clearly defined rules on who can gain access and for what purpose such access is granted; the disclosure of images and information should only take place when it is necessary for such a purpose or for law enforcement purposes.

(8) Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.

(9) Surveillance camera system images and information should be subject to appropriate security measures to safeguard against unauthorised access and use.

(10) There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.

(11) When the use of a surveillance camera system is in pursuit of a legitimate aim, and there is a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.

(12) Any information used to support a surveillance camera system which compares against a reference database for matching purposes should be accurate and kept up to date.

Points to note

The Code then fleshes out those guiding principles in more detail. Here are some notable points:

A surveillance camera system “should not be used for other purposes that would not have justified its establishment in the first place” (paragraph 3.1.3).

“People do, however, have varying and subjective expectations of privacy with one of the variables being situational. Deploying surveillance camera systems in public places where there is a particularly high expectation of privacy, such as toilets or changing rooms, should only be done to address a particularly serious problem that cannot be addressed by less intrusive means” (paragraph 3.2.1).

“Any proposed deployment that includes audio recording in a public place is likely to require a strong justification of necessity to establish its proportionality. There is a strong presumption that a surveillance camera system must not be used to record conversations as this is highly intrusive and unlikely to be justified” (paragraph 3.2.2).

“Any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified and proportionate in meeting the stated purpose, and be suitably validated. It should always involve human intervention before decisions are taken that affect an individual adversely” (paragraph 3.3.3).

“This [the requirement to publicise as much as possible about the use of a system] is not to imply that the exact location of surveillance cameras should always be disclosed if to do so would be contrary to the interests of law enforcement or national security” (paragraph 3.3.6).

“It is important that there are effective safeguards in place to ensure the forensic integrity of recorded images and information and its usefulness for the purpose for which it is intended to be used. Recorded material should be stored in a way that maintains the integrity of the image and information, with particular importance attached to ensuring that meta data (e.g. time, date and location) is recorded reliably, and compression of data does not reduce its quality” (paragraph 4.12.2).

Enforcement

The Surveillance Camera Commissioner is a statutory appointment made by the Home Secretary under section 34 of the Protection of Freedoms Act 2012. The Commissioner has no enforcement or inspection powers. However, in encouraging compliance with the Code, he “should consider how best to ensure that relevant authorities are aware of their duty to have regard for the Code and how best to encourage its voluntary adoption by other operators of surveillance camera systems” (paragraph 5.3). The Commissioner is, or is to be, assisted by a non-statutory Advisory Council with its own specialist subgroups.

Given the limited remit of the Surveillance Camera Commissioner, it may be that the Code shows its teeth more effectively in complaints to the ICO and/or the courts.

Robin Hopkins

Damages under section 13 DPA: Court of Appeal’s judgment in Halliday

May 17th, 2013 by Robin Hopkins

I blogged a while ago about the ex tempore judgment from the Court of Appeal in a potentially groundbreaking case on damages under section 13 of the DPA, namely Halliday v Creation Consumer Finance [2013] EWCA Civ 333. The point of potential importance was that ‘nominal damages’ appeared to suffice for the purposes of section 13(1), thereby opening up section 13(2). In short, the point is that claimants under the DPA cannot be compensated for distress unless they have also suffered financial harm. A ‘nominal damages’ approach to the concept of financial harm threatened to make the DPA’s compensation regime dramatically more claimant-friendly.

The Court of Appeal’s full judgment is now available. As pointed out on Jon Baines’ blog, ground has not been broken: the ‘nominal damages’ point was a concession by the defendant rather than a determination by the Court. See paragraph 3 of the judgment of Lady Justice Arden:

“… this issue, which was the main issue of the proposed appeal to this court, is now academic as the respondent, CCF, concedes an award of nominal damages is “damage” for the purposes of the Directive and for the purposes of section 13(2) of the Data Protection Act 1998.”

Other potentially important points have also fallen somewhat flat. The question of whether UK law provided an adequate remedy for a breach of a right conferred by a European Directive fell away on the facts (“proof fell short in relation to the question of damage to reputation and credit”), while the provision for sanctions under Article 24 of Directive 95/46/EC was neither directly enforceable by Mr Halliday nor of assistance to him.

Still, the judgment is not without its notable points.

One is the recognition that compensation for harm suffered is a distinct matter from penalties for wrongdoing; the former is a matter for the courts in the DPA context, the latter a matter for the Information Commissioner and his monetary penalty powers. Such was the implication of paragraph 11:

“… it is not the function of the civil court, unless specifically provided for, to impose sanctions. That is done in other parts of the judicial system.”

Another point worth noting is Lady Justice Arden’s analysis of distress and the causation thereof. The distress must be caused by the breach, not by other factors such as (in this case) a failure to comply with a court order. See paragraph 20:

“Focusing on subsection (2), it is clear that the claimant has to be an individual, that he has to have suffered distress, and that the distress has to have been caused by contravention by a data controller of any of the requirements of the Act. In other words, this is a remedy which is not for distress at large but only for contravention of the data processing requirements. It also has to be distress suffered by the complainant and therefore would not include distress suffered by family members unless it was also suffered by him. When I say that it has to be caused by breach of the requirements of the Act, the distress which I accept Mr Halliday would have felt at the non-compliance of the order is not, at least directly, relevant because that is not distress by reason of the contravention by a data controller of the requirements of this Act. If the sole cause of the distress had been non-compliance with a court order, then that would have lain outside the Act unless it could be shown that it was in substance about the non-compliance with the Data Protection Act.”

The claimant had sought to draw an analogy with the guidelines and banding for discrimination awards set by Vento v Chief Constable of West Yorkshire Police [2003] ICR 318. The Court of Appeal was not attracted. See paragraph 26:

“In answer to that point, the field of discrimination is, it seems to me, not a helpful guide for the purposes of data protection. Discrimination is generally accompanied by loss of equality of opportunity with far-reaching effects and is liable to cause distinct and well-known distress to the complainant.”

Finally, Lady Justice Arden commented as follows concerning the level of the compensation to be awarded on the facts of this case: “in my judgment the sum to be awarded should be of a relatively modest nature since it is not the intention of the legislation to produce some kind of substantial award. It is intended to be compensation, and thus I would consider it sufficient to render an award in the sum of £750” (paragraph 36).

Lord Justice Lloyd (who, along with Mr Justice Ryder, agreed with Lady Justice Arden) did pause to consider a submission along the lines of ‘if you were so distressed, why did you not complain immediately?’, but concluded that (paragraph 47):

“I confess that I was somewhat impressed at one point by Mr Capon’s submission that it was a surprise, if Mr Halliday was so distressed by this contravention, that he did not immediately protest upon discovering, in response to his first credit reference enquiry, the fact of the contravention, and indeed he did not protest until about a month after the second report had been obtained. But I bear in mind, in response to that, Mr Halliday’s comment that he had had such difficulty in getting any sensible response, or indeed any response, out of CCF at the earlier stage, that it is perhaps less surprising that he did not immediately protest. In any event, the period in question is not a very lengthy one between his discovery of the contravention by his first reference request and his taking action in July. Accordingly, it does not seem to me that that is a matter that should be taken to reduce my assessment of the degree of distress that he suffered.”

Robin Hopkins

Google: autocomplete and the frontiers of privacy

May 17th, 2013 by Robin Hopkins

Unsurprisingly, the frontiers of privacy and data protection law are often explored and extended by reference to what Google does. Panopticon has, for example, covered disputes over Google Street View (on which a US lawsuit was settled in recent months), Google’s status as a ‘publisher’ of blogs containing allegedly defamatory material (see Tamiz v Google [2013] EWCA Civ 68) and its responsibility for search results directing users to allegedly inaccurate or out-of-date personal data (see Google Spain v Agencia Espanola de Proteccion de Datos (application C-131/12), in which judgment is due in the coming months).

A recent decision of a German appellate court appears to have extended the frontiers further. The case (BGH, VI ZR 269/12 of 14th May 2013) concerned Google’s ‘autocomplete’ function. When the complainants’ names were typed into Google’s search bar, the autocomplete function added the ensuing words “Scientology” and “fraud”. This was not because there was a large amount of content linking those individuals with those terms. Rather, it was because these were the terms other Google users had most frequently searched for in conjunction with those names. That, in turn, was due to rumours whose truth or accuracy the complainants denied. They complained that the continuing association of their names with these terms infringed their rights to personality and reputation as protected by German law (Articles 823(1) and 1004 of the German Civil Code).

In the Google Spain case, Google has said that the responsibility lies with the generators of the content, not with the search engine which offers users that content. In the recent German case, Google has argued in a similar vein that the autocomplete suggestions are down to what other users have searched for, not what Google says or does.

In allowing the complainants’ appeals, the Federal Court of Justice in Karlsruhe has disagreed with Google. The result is that once Google has been alerted to the fact that an autocomplete suggestion links someone to libellous words, it must remove that suggestion. The case is well covered by Jeremy Phillips at IPKat and by Karin Matussek of Bloomberg in Berlin.

The case is important in terms of the frontiers of legal protection for personal integrity and how we allocate responsibility for harm. Google says that, in these contexts, it is a facilitator, not a generator. It says it should be liable neither for what people write (see Tamiz and Google Spain) nor for what they search for (the recent German case). Not for the first time, courts in Europe have allocated responsibility differently.

Notably, this case was not brought under data protection law. In principle, it seems that such complaints could be expressed in data protection terms. Perhaps, if the EU’s final Data Protection Regulation retains the severe penalty provisions proposed in the draft version, data protection will move centre-stage in these sorts of cases.

Robin Hopkins