New from the Upper Tribunal: DWP work programmes, personal data. And security service algebra.

July 23rd, 2014 by Robin Hopkins

The Upper Tribunal has handed down a number of FOIA decisions in recent days. I refrain from comment or analysis, given my involvement in the cases (hopefully someone else from the Panopticon fold will oblige before long), but I post the judgments here for those who wish to read for themselves.

In DWP v IC and Zola [2014] UKUT 0334 (AAC), the Upper Tribunal dismissed the DWP’s appeal against this First-Tier Tribunal decision. The disputed information is a list of the identities of companies, charities and other organisations who host placements through the DWP’s work programmes for job seekers. Zola determination 21.07.14

In Farrand v IC and London Fire and Emergency Planning Authority [2014] UKUT 0310 (AAC), the Upper Tribunal dismissed an appeal concerning a report into a fire in a London flat, on the grounds that the requested information was the occupant’s personal data and no condition from Schedule 2 to the DPA was met. The decision discusses Common Services Agency and identification, legitimate interests, necessity and fairness. Farrand UT

Third, in Home Office v IC and Cobain (GIA/1722/2013), the Upper Tribunal has issued an interim decision allowing the appeal. This case concerns this problem: x + y = z, where z is a publicly known number, x is non-exempt information but y is exempt information (in this case, on section 23 grounds – security service information). Normally, the requester is entitled to non-exempt information, but here the automatic effect of disclosure would be to reveal the exempt information. What to do about this? As I say, an interim decision which I don’t analyse here. Have a go at the security service algebra yourself.
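To make the problem concrete, here is a purely illustrative example (the figures are invented and have nothing to do with the actual request): suppose the publicly known total is z = 100 and the non-exempt figure is x = 60. Anyone receiving x can immediately compute the exempt figure, y = z − x = 40. Disclosing the non-exempt information therefore automatically discloses the exempt (section 23) information as well.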

Robin Hopkins @hopkinsrobin

Academies and FOI

July 16th, 2014 by Robin Hopkins

The question of whether information is ‘held’ by a public authority for FOIA or EIR purposes can raise difficulties. This is especially so where the boundaries between public and private service provision are blurred: consider outsourcing, privatisation of services, public/private partnerships, joint ventures, the use of external consultants and so on. Legal separation and practical day-to-day realities can often point in different directions in terms of who holds information on whose behalf.

Geraldine Hackett v IC and United Learning Trust (EA/2012/0265) is a recent First-Tier Tribunal decision which addresses such issues – specifically in the context of academy school provision.

The United Church Schools Foundation Limited delivers schools through two separate trusts: the United Church Schools Trust (which runs 11 private schools) and the United Learning Trust (which runs 20 academies, and receives approximately £110m of its £129m of annual income from public funds).

Para 52A Schedule 1 FOIA brings within the scope of FOIA “the proprietor of an academy” but only in respect of “information held for the purposes of the proprietor’s functions under academy arrangements.”

Geraldine Hackett asked for information about the employment package of ULT’s chief executive (pay, pension contribution, expenses etc) and of the other members of the ULT senior management team.

ULT said it did not hold the information; the information was instead held by UCST (the private school provider). The ICO agreed. So did the First-Tier Tribunal, but this was overturned by the Upper Tribunal on account of aspects of procedural fairness which had gone badly awry at first instance.

On reconsideration by a fresh First-Tier Tribunal, the ICO’s decision was overturned. The Tribunal asked itself the questions which the Upper Tribunal had invited for consideration:

“Was it really the case that ULT had delegated day-to-day running of its charitable activities to a chief executive of whose duties under his contract of employment, ULT was ignorant? Was it permissible to avoid FOIA by the device of a contract of employment made by another body?”

It applied the leading case of University of Newcastle upon Tyne v ICO and BUAV [2011] UKUT 185 (AAC) and concluded that ULT did hold the requested information for FOIA purposes. This meant that “ULT would fulfil its obligations under FOIA by disclosing not the total sums involved but that proportion, calculated in accordance with the agreement, which relates to the academies; in other words excluding that proportion which can be attributed to UCST’s private schools.”

The Tribunal noted that “in 2006 both trusts entered into an agreement with each other to apportion the expenditure on shared services” and observed that “it appeared to us from the oral and written evidence that staff work together seamlessly for all three trusts”.

Those who grapple with held/not held questions in contexts like this will wish to note the key paragraph (19) illuminating the Tribunal’s reasoning:

“We were told at the hearing, and we accept, that the disputed information is held in hard copy in one of the filing cabinets at the United Learning Head Office. Those with access to it work seamlessly, we have found, for all three trusts. They have responsibilities to all three trusts. For these purposes, we are not attracted by artificial theories suggesting that staff hold these documents only on behalf of one or two of the trusts. Looking at actualities, and applying the plain words of the statute, in our judgment the disputed information is held by ULT, even if it is also held by UCST and UCSF. This finding is consistent with the obligations of the ULT accounting officer in respect of senior officers’ payroll arrangements…”

Robin Hopkins @hopkinsrobin

In the wake of Google Spain: freedom of expression down (but not out)

July 15th, 2014 by Robin Hopkins

The CJEU’s judgment in Google Spain was wrong and has created an awful mess.

That was the near-unanimous verdict of a panel of experts – including 11KBW’s Anya Proops – at a debate hosted by ITN and the Media Society on Monday 14 July and entitled ‘Rewriting History: Is the new era in Data Protection compatible with journalism?’.

The most sanguine participant was the Information Commissioner, Christopher Graham. He cautioned against ‘Chicken Licken’ (‘the sky is falling in’) alarmism – we should wait and see how the right to be forgotten (RTBF) pans out in practice. He was at pains to reassure the media that its privileged status in data protection law was not in fact under threat: the s. 32 DPA exemption, for example, was here to stay. There remains space, Google Spain notwithstanding, to refuse inappropriate RTBF requests, he suggested – at least as concerns journalism which is in the public interest (a characteristic which is difficult to pin down both in principle and in practice).

‘I am Chicken Licken!’ was the much less sanguine stance of John Battle, ITN’s Head of Compliance. Google Spain is a serious intrusion into media freedom, he argued. This was echoed by The Telegraph’s Holly Watt, who likened the RTBF regime to book-burning.

Peter Barron, Google’s Director of Communications and Public Affairs for Europe, Africa and the Middle East, argued that in implementing its fledgling RTBF procedure, Google was simply doing as told: it had not welcomed the Google Spain judgment, but that judgment is now the law, and implementing it was costly and burdensome. On the latter point, Chris Graham seemed less than entirely sympathetic, pointing out that Google’s business model is based heavily on processing other people’s personal data.

John Whittingdale MP, Chairman of the Culture, Media & Sport Select Committee, was markedly Eurosceptic in tone. Recent data protection judgments from the CJEU have overturned what we in the UK had understood the law to be – he was referring not only to Google Spain, but also to Digital Rights Ireland (on which see my DRIP post from earlier today). The MOJ or Parliament need to intervene and restore sanity, he argued.

Bringing more legal rigour to bear was Anya Proops, who homed in on the major flaws in the Google Spain judgment. Without there having been any democratic debate (and without jurisprudential analysis), the CJEU has set a general rule whereby privacy trumps freedom of expression. This is hugely problematic in principle. It is also impracticable: the RTBF mechanism doesn’t actually work in practice, for example because it leaves Google.com (as opposed to Google.co.uk or another EU domain) untouched – a point also made by Professor Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford.

There were some probing questions from the audience too. Mark Stephens, for example, asked Chris Graham how he defined ‘journalism’ (answer: ‘if it walks and quacks like a journalist’…) and how he proposed to fund the extra workload which RTBF complaints would bring for the ICO (answer: perhaps a ‘polluter pays’ approach?).

Joshua Rozenberg asked Peter Barron if there was any reason why people should not switch their default browsers to the RTBF-free Google.com (answer: no) and whether Google would consider giving aggrieved journalists rights of appeal within a Google review mechanism (the Google RTBF mechanism is still developing).

ITN is making the video available on its website this week. Those seeking further detail can also search Twitter for the hashtag #rewritinghistory or see Adam Fellows’ blog post.

The general tenor from the panel was clear: Google Spain has dealt a serious and unjustifiable blow to the freedom of expression.

Lastly, one of my favourite comments came from ITN’s John Battle, referring to the rise of data protection as a serious legal force: ‘if we’d held a data protection debate a year ago, we’d have had one man and his dog turn up. Now it pulls in big crowds’. I do not have a dog, but I have been harping on for some time about data protection’s emergence from the shadows to bang its fist on the tables of governments, security bodies, big internet companies and society at large. It surely will not be long, however, before the right to freedom of expression mounts a legal comeback, in search of a more principled and workable balance between indispensable components of a just society.

Robin Hopkins @hopkinsrobin

Surveillance powers to be kept alive via DRIP

July 15th, 2014 by Robin Hopkins

The legal framework underpinning state surveillance of individuals’ private communications is in turmoil, and it is not all Edward Snowden’s fault. As I write this post, two hugely important developments are afoot.

Prism/Tempora

The first is the challenge by Privacy International and others to the Prism/Tempora surveillance programmes implemented by GCHQ and the security agencies. Today is day 2 of the 5-day hearing before the Investigatory Powers Tribunal. To a large extent, this turmoil was unleashed by Snowden.

DRIP – the background

The second strand of the turmoil is thanks to Digital Rights Ireland and others, whose challenge to the EU’s Data Retention Directive 2006/24 was upheld by the CJEU in April of this year. That Directive provided for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers for a period of 6-24 months and made available to policing and security bodies. In the UK, that Directive was implemented via the Data Retention (EC Directive) Regulations 2009, which mandated retention of communications data for 12 months.

In Digital Rights Ireland, the CJEU held the Directive to be invalid on the grounds of incompatibility with the privacy rights enshrined under the EU’s Charter of Fundamental Rights. Strictly speaking, the CJEU’s judgment (on a preliminary ruling) then needed to be applied by the referring courts, but in reality the foundation of the UK’s law fell away with the Digital Rights Ireland judgment. The government has, however, decided that it needs to maintain the status quo in terms of the legal powers and obligations which were rooted in the invalid Directive.

On 10 July 2014, the Home Secretary made a statement announcing that this gap in legal powers was to be plugged on a limited-term basis. A Data Retention and Investigatory Powers (DRIP) Bill would be put before Parliament, together with a draft set of regulations to be made under the envisaged Act. If passed, these would remain in place until the end of 2016, by which time longer-term solutions could be considered. Ms May said this would:

“…ensure, for now at least, that the police and other law enforcement agencies can investigate some of the criminality that is planned and takes place online. Without this legislation, we face the very prospect of losing access to this data overnight, with the consequence that police investigations will suddenly go dark and criminals will escape justice. We cannot allow this to happen.”

Today, amid the ministerial reshuffle and shortly before the summer recess, the Commons is debating DRIP on an emergency basis.

Understandably, there has been much consternation about the extremely limited time allotted for MPs to debate a Bill of such enormous significance for privacy rights (I entitled my post on the Digital Rights Ireland case “Interfering with the fundamental rights of practically the entire European population”, which is a near-verbatim quote from the judgment).

DRIP – the data retention elements

The Bill is short. A very useful summary can be found in the Standard Note from the House of Commons Library (authored by Philippa Ward).

Clause 1 provides power for the Secretary of State to issue a data retention notice to a telecommunications services provider, requiring it to retain certain data types (limited to those set out in the Schedule to the 2009 Regulations) for up to 12 months. There is a safeguard that the Secretary of State must consider whether it is “necessary and proportionate” to give the notice for one or more of the purposes set out in s22(2) of RIPA.

Clause 2 then provides the relevant definitions.

The Draft Regulations explain the process in more detail. Note in particular regulation 5 (the matters the Secretary of State must consider before giving a notice) and regulation 9 (which provides for oversight by the Information Commissioner of the requirements relating to integrity, security and destruction of retained data).

DRIP – the RIPA elements

DRIP is also being used to clarify (says the government) or extend (say some critics) RIPA 2000. In this respect, as commentators such as David Allen Green have pointed out, it is not clear why the emergency legislation route is necessary.

Again, to borrow the nutshells from the House of Commons Library’s Standard Note:

Clause 3 amends s5 of RIPA regarding the Secretary of State’s power to issue interception warrants on the grounds of economic well-being.

Clause 4 aims to clarify the extra-territorial reach of RIPA in relation to both interception and communications data by adding specific provisions. This confirms that requests for interception and communications data to overseas companies that are providing communications services within the UK are subject to the legislation.

Clause 5 clarifies the definition of “telecommunications service” in RIPA to ensure that internet-based services, such as webmail, are included in the definition.

Criticism

The Labour front bench is supporting the Coalition. A number of MPs, including David Davis and Tom Watson, have been vociferous in their opposition (see for example the proposed amendments tabled by Watson and others here). So too have numerous academics and commentators. I won’t try to link to all of them here (as there are too many). Nor can I link to a thorough argument in defence of DRIP (as I have not been able to find one). For present purposes, an excellent forensic analysis comes from Graham Smith at Cyberleagle.

I don’t seek to duplicate that analysis. It is, however, worth remembering this: the crux of the CJEU’s judgment was that the Directive authorised such vast privacy intrusions that stringent safeguards were required to render it proportionate. In broad terms, that proportionality problem can be fixed in two ways: reduce the extent of the privacy intrusions and/or introduce much better safeguards. DRIP does not seek to do the former. The issue is whether it offers sufficient safeguards for achieving an acceptable balance between security and privacy.

MPs will consider that today and Peers later this week. Who knows? – courts may even be asked for their views in due course.

Robin Hopkins @hopkinsrobin

Some results may have been removed under data protection law in Europe. Learn more.

July 3rd, 2014 by Robin Hopkins

This is the message that now regularly greets those using Google to search for information on named individuals. It relates, of course, to the CJEU’s troublesome Google Spain judgment of 13 May 2014.

I certainly wish to learn more.

So I take Google up on its educational offer and click through to its FAQ page, where the folks at Google tell me inter alia that “Since this ruling was published on 13 May 2014, we’ve been working around the clock to comply. This is a complicated process because we need to assess each individual request and balance the rights of the individual to control his or her personal data with the public’s right to know and distribute information”.

The same page also leads me to the form on which I can ask Google to remove from its search results certain URLs about me. I need to fill in gaps like this: “This URL is about me because… This page should not be included as a search result because…” 

This is indeed helpful in terms of process, but I want to understand more about the substance of decision-making. How does (and/or should) Google determine whether or not to accede to my request? Perhaps understandably (as Google remarks, this is a complicated business on which the dust is yet to settle), Google doesn’t tell me much about that just yet.

So I look to the obvious source – the CJEU’s judgment itself – for guidance. Here I learn that I can in principle ask that “inadequate, irrelevant or no longer relevant” information about me not be returned through a Google search. I also get some broad – and quite startling – rules of thumb, for example at paragraph 81, which tells me this:

“In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.”

So it seems that, in general (and subject to the sensitivity of the information and my prominence in public life), my privacy rights trump Google’s economic rights and other people’s rights to find information about me in this way. So the CJEU has provided some firm steers on points of principle.

But still I wish to learn more about how these principles will play out in practice. Media reports in recent weeks have told us about the volume of ‘right to be forgotten’ requests received by Google.

The picture this week has moved on from volumes to particulars. In the past few days, we have begun to learn how Google’s decisions filter back to journalists responsible for the content on some of the URLs which objectors pasted into the forms they sent to Google. We learn that journalists and media organisations, for example, are now being sent messages like this:

“Notice of removal from Google Search: we regret to inform you that we are no longer able to show the following pages from your website in response to certain searches on European versions of Google.”

Unsurprisingly, some of those journalists find this puzzling and/or objectionable. Concerns have been ventilated in the last day or two, most notably by the BBC’s Robert Peston (who feels that, through teething problems with the new procedures, he has been ‘cast into oblivion’) and The Guardian’s James Ball (who neatly illustrates some of the oddities of the new regime). See also The Washington Post’s roundup of UK media coverage.

That coverage suggests that the Google Spain ruling – which made no overt mention of free expression rights under Article 10 ECHR – has started to bite into the media’s freedom. The Guardian’s Chris Moran, however, has today posted an invaluable piece clarifying some misconceptions about the right to be forgotten. Academic commentators such as Paul Bernal have also offered shrewd insights into the fallout from Google Spain.

So, by following the trail from Google’s pithy new message, I am able to learn a fair amount about the tenor of this post-Google Spain world.

Inevitably, however, given my line of work, I am interested in the harder edges of enforcement and litigation: in particular, if someone objects to the outcome of a ‘please forget me’ request to Google, what exactly can they do about it?

On such questions, it is too early to tell. Google says on its FAQ page that “we look forward to working closely with data protection authorities and others over the coming months as we refine our approach”. For its part, the ICO tells us that it and its EU counterparts are working hard on figuring this out. Its newsletter from today says for example that:

“The ICO and its European counterparts on the Article 29 Working Party are working on guidelines to help data protection authorities respond to complaints about the removal of personal information from search engine results… The recommendations aim to ensure a consistent approach by European data protection authorities in response to complaints when takedown requests are refused by the search engine provider.”

So for the moment, there remain lots of unanswered questions. For example, the tone of the CJEU’s judgment is that DPA rights will generally defeat economic rights and the public’s information rights. But what about a contest between two individuals’ DPA rights?

Suppose, for example, that I am an investigative journalist with substantial reputational and career investment in articles about a particular individual who then persuades Google to ensure that my articles do not surface in EU Google searches for his name? Those articles also contain my name, work and opinions, i.e. they also contain my personal data. In acceding to the ‘please forget me’ request without seeking my input, could Google be said to have processed my personal data unfairly, whittling away my online personal and professional output (at least to the extent that the relevant EU Google searches are curtailed)? Could this be said to cause me damage or distress? If so, can I plausibly issue a notice under s. 10 of the DPA, seek damages under s. 13, or ask the ICO to take enforcement action under s. 40?

The same questions could arise, for example, if my personal backstory is heavily entwined with that of another person who persuades Google to remove from its EU search results articles discussing both of us – that may be beneficial for the requester, but detrimental to me in terms of the adequacy of personal data about me which Google makes available to the interested searcher.

So: some results may have been removed under data protection law in Europe, and I do indeed wish to learn more. But I will have to wait.

Robin Hopkins @hopkinsrobin

GCHQ’s internet surveillance – privacy and free expression join forces

July 3rd, 2014 by Robin Hopkins

A year ago, I blogged about Privacy International’s legal challenge – alongside Liberty – against GCHQ, the Security Services and others concerning the Prism/Tempora programmes which came to public attention following Edward Snowden’s whistleblowing. That case is now before the Investigatory Powers Tribunal. It will be heard for 5 days, commencing on 14 July.

Privacy International has also brought a second claim against GCHQ: in May 2014, it issued proceedings concerning the use of ‘hacking’ tools and software by intelligence services.

It has been announced this week that Privacy International is party to a third challenge which has been filed with the Investigatory Powers Tribunal. This time, the claim is being brought alongside 7 internet service providers: GreenNet (UK), Chaos Computer Club (Germany), GreenHost (Netherlands), Jinbonet (Korea), Mango (Zimbabwe), May First/People Link (US) and Riseup (US).

The claim is interesting on a number of fronts. One is the interplay between global reach (see the diversity of the claimants’ home countries) and this specific legal jurisdiction (the target is GCHQ and the jurisdiction is the UK – as opposed, for example, to bringing claims in the US). Another is that it sees private companies – and therefore Article 1 Protocol 1 ECHR issues about property, business goodwill and the like – surfacing in the UK’s internet surveillance debate.

Also, the privacy rights not only of ‘ordinary’ citizens (network users) but also specifically those of the claimants’ employees are being raised.

Finally, this claim sees the right to free expression under Article 10 ECHR – conspicuously absent, for example, in the Google Spain judgment – flexing its muscle in the surveillance context. Privacy and free expression rights are so often in tension, but here they make common cause.

The claims are as follows (quoting from the claimants’ press releases):

(1) By interfering with network assets and computers belonging to the network providers, GCHQ has contravened the UK Computer Misuse Act and Article 1 of the First Additional Protocol (A1AP) of the European Convention of Human Rights (ECHR), which guarantees the individual’s peaceful enjoyment of their possessions

(2) Conducting surveillance of the network providers’ employees is in contravention of Article 8 ECHR (the right to privacy) and Article 10 ECHR (freedom of expression)

(3) Surveillance of the network providers’ users that is made possible by exploitation of their internet infrastructure, is in contravention of Arts. 8 and 10 ECHR; and

(4) By diluting the network providers’ goodwill and relationship with their users, GCHQ has contravened A1AP ECHR.

Robin Hopkins @hopkinsrobin

Fairness under the DPA: public interests can outweigh those of the data subject

June 18th, 2014 by Robin Hopkins

Suppose a departing employee was the subject of serious allegations which you never had the chance properly to investigate or determine. Should you mention these (unproven) allegations to a future employer? Difficult questions arise, in both ethical and legal terms. One aspect of the legal difficulty arises under data protection law: would it be fair to share that personal information with the prospective employer?

The difficulty is enhanced because fairness – so pivotal to data protection analysis – has had little or no legal treatment.

This week’s judgment of Mr Justice Cranston in AB v A Chief Constable [2014] EWHC 1965 (QB) is in that sense a rare thing – a judicial analysis of fairness.

AB was a senior police officer – specifically, a chief superintendent. He was given a final written warning in 2009 following a disciplinary investigation. Later, he was subject to further investigation for allegedly seeking to influence the police force’s appointment process in favour of an acquaintance of AB; this raised a number of serious questions, including about potential dishonesty, lack of integrity, and so on.

AB was on sick leave (including for reasons related to psychological health) for much of the period when that second investigation was unfolding. He was unhappy with how the Force was treating him. He got an alternative job offer from a regulator. He then resigned from the Force before the hearing concerning his alleged disciplinary offences. His resignation was accepted. The Force provided him with a standard reference, but the Chief Constable then took the view that – given the particular, unusual circumstances – he should provide the prospective employer with a second reference, explaining the allegations about AB.

The second reference was to say inter alia that:

“[AB’s] resignation letter pre-dated by some 13 days a gross misconduct hearing at which he was due to appear to face allegations of (i) lack of honesty and integrity (ii) discreditable conduct and (iii) abuse of authority in relation to a recruitment issue. It is right to record that he strenuously denied those allegations. In the light of his resignation the misconduct hearing has been stayed as it is not in the public interest to incur the cost of a hearing when the officer concerned has already resigned, albeit his final date of service post-dating the hearing.”

AB objected to the giving of the second reference and issued a section 10 notice under the Data Protection Act 1998. The lawfulness of the Force’s proposed second reference arose for consideration by Cranston J.

The first issue was this: was the Chief Constable legally obliged to provide a second reference explaining those concerns?

Cranston J held that, in terms of the common/private law duty of care (on the Hedley Byrne line of authority), the answer was no. As a matter of public law, however – and specifically by reference to the Police Conduct Regulations – the answer was yes: “the Chief Constable was obliged by his duty to act with honesty and integrity not to give a standard reference for the recipient because that was misleading. Something more was demanded. In this case the Chief Constable was prima facie under a duty to supply the Regulatory Body at the least with the information about disciplinary matters in the second reference.”

Note the qualifier ‘prima facie’: the upshot was that the duty was displaced if the provision of the second reference would breach the DPA. This raised a number of issues for the Court.

First, no information about AB’s health could be imparted: this was sensitive personal data, and the Chief Constable did not assert that a Schedule 3 DPA condition was met (as required under the First Data Protection Principle).

What about the information as to the disciplinary allegations AB faced? This was not sensitive personal data. Therefore, under the First Data Protection Principle, it could be disclosed if to do so would be (a) fair, (b) lawful, and (c) in accordance with a Schedule 2 condition.

The last two were unproblematic: given the prima facie public law duty to make the second reference here, it would be lawful to do so and condition 3 from Schedule 2 would be met.

This left ‘fairness’, which Cranston J discussed in the following terms:

“There is no definition of fairness in the 1998 Act. The Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995, to which the 1998 Act gives effect, contains a reference to protecting privacy rights, as recognised in article 8 of the European Convention on Human Rights and in general principles of EU law: recital 10. However, I cannot accept Mr Lock QC’s submission that the duty of fairness under the Directive and the 1998 Act is a duty to be fair primarily to the data subject. The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure.”

In conducting this balance between the interests of AB and those of others (including the public interests), Cranston J ultimately – on the particular facts – concluded that it would have been unfair to provide the second reference. There were strong fairness arguments in favour of disclosure – see paragraph 78 (my emphasis):

“… The focus must be on fairness in the immediate decision to disclose the data [as opposed to a wider-ranging inquiry into the data subject’s conduct in the build-up to disclosure]. In this case the factors making it fair to disclose the information were the public interest in full and frank references, especially the duty of the police service properly to inform other police forces and other regulatory bodies of the person they are seeking to employ. To disclose the information in the second reference would patently have been fair to the Regulatory Body, so it could make a rounded assessment of the claimant, especially given his non-disclosure during the application process.”

However, the balance tipped in AB’s favour. This was partly because the Force’s policy – as well as the undertaking specifically given to AB – was to provide only a standard reference. But (see paragraph 79):

“… what in my view is determinative, and tips the balance of fairness in this case in favour of the claimant, is that he changed his position by resigning from the Force and requesting it to discontinue the disciplinary proceedings, before knowing that the Chief Constable intended to send the second reference. That second reference threatened the job which he had accepted with the Regulatory Body. It is unrealistic to think that the claimant could have taken steps to reverse his resignation in the few weeks before it would take effect. Deputy Chief Constable CD for one had indicated that he would not allow it. The reality was that the claimant was in an invidious position, where in reliance on what the Force through GH had said and done, he was deprived of the opportunity to reinstate the disciplinary proceedings and to fight the allegations against him. This substantive unfairness for the claimant was coupled with the procedural unfairness in the decision to send the second reference without giving him the opportunity to make representations against that course of action. Asking him to comment on its terms after the final decision to send the second reference was too little, too late.”

Therefore, because of unfairness in breach of the DPA and because of AB’s legitimate expectations, the second reference was not lawful.

While Cranston J rightly emphasised the highly fact-specific nature of his overall conclusion, aspects of his discussion of fairness will potentially be of wider application.

So too will his reminder (by way of quoting ICO guidance) that, when it comes to section 10 notices, “Although this [section 10] may give the impression that an individual can simply demand that an organisation stops processing personal data about them, or stops processing it in a particular way, the right is often overstated. In practice, it is much more limited”. Again, in other words, a balancing of interests and an assessment of the justification for the processing is required.

With the ‘right to be forgotten’ very much in vogue, that is a useful point to keep in mind.

Robin Hopkins @hopkinsrobin

Section 13 DPA in the High Court: nominal damage plus four-figure distress award

June 13th, 2014 by Robin Hopkins

Given the paucity of case law, it is notoriously difficult to estimate likely awards of compensation under section 13 of the Data Protection Act 1998 for breaches of that Act. It is also very difficult to assess any trends in compensation awards over time.

AB v MoJ [2014] EWHC 1847 (QB) is the Courts’ latest consideration of compensation under the DPA, this time by Mr Justice Jeremy Baker. The factual background involved protracted correspondence concerning numerous subject access requests. Ultimately, it was held that the Defendant failed to provide certain documents to which the Claimant was entitled under section 7 of the DPA within the time frames set out under that section.

Personal data?

There was a dispute as to whether one particular document contained the Claimant’s ‘personal data’. Baker J noted the arguments from Common Services Agency, and he is not the first to observe (at his paragraph 50) that it is sometimes not a ‘straightforward issue’ to determine whether or not information comes within the statutory definition of personal data. Ultimately, he considered that the disputed document did not come within that definition: it “is in wholly neutral terms, and is indeed merely a conduit for the provision of information contained in the letters which it enclosed which certainly did contain the claimant’s personal data”.

Nonetheless, the DPA had been breached by virtue of the delays in the provision of other information to which the Claimant was entitled under section 7. What compensation should he be awarded?

Damage under section 13(1) DPA

Baker J was satisfied, having considered Halliday v Creation Consumer Finance Limited [2013] EWCA Civ 333, [2013] 2 Info LR 85 (where the same point was conceded), that nominal damage sufficed as ‘damage’ for section 13(1) purposes: “In this regard the word “damage” in this sub-section is not qualified in any way, such that to my mind provided that there has, as in this case, been some relevant loss, then an individual who has also suffered relevant distress is entitled to an award of compensation in respect of it”.

Here the Court was satisfied that nominal damages should be awarded. The Claimant had spent a lot of time pursuing his requests, albeit that much of that time also involved pursuing requests on clients’ behalves, and albeit that no actual loss had been quantified:

“Essentially the claimant is a professional man who, it is apparent from his witness statement, has expended a considerable amount of time and expense in the pursuit of the disclosure of his and others’ data from various Government Departments and other public bodies, including the disclosed and withheld material from the defendant. Having said that, the claimant has not sought to quantify his time and expense, nor has he allocated it between the various requests on his own and others’ behalves. In these circumstances, although I am satisfied that he has suffered damage in accordance with s.13(1) of the DPA 1998, I consider that this is a case in which an award of nominal damages is appropriate under this head, which will be in the conventional sum of £1.00.”

Distress under section 13(2) DPA

That finding opened the door to an award for distress. The Court found that distress had been suffered, although it was difficult to disentangle the Claimant’s distress attributable to the breaches of the DPA from his distress as to the other surrounding circumstances: “doing the best I am able to on the evidence before me I consider that any award of compensation for distress caused as a result of the relevant delays in this case, should be in the sum of £2,250.00”.

Until this week, Halliday was the Courts’ last reported (on Panopticon at any rate) award of compensation under section 13 DPA. That was 14 months ago. In AB, the Court awarded precisely triple the £750 distress sum awarded in Halliday.

For a further (and quicker-off-the-mark) discussion of AB, see this post on Jon Baines’ blog, Information Rights and Wrongs.

Robin Hopkins @hopkinsrobin

Privacy, electronic communications and monetary penalties: new Upper Tribunal decision

June 12th, 2014 by Robin Hopkins

Panopticon reported late last year that the First-Tier Tribunal overturned the first monetary penalty notice issued by the Information Commissioner for breaches of the Privacy and Electronic Communications Regulations 2003. This was the decision in Niebel v IC (EA/2012/0260).

The Information Commissioner appealed against that decision. The Upper Tribunal gave its decision on the appeal yesterday: see here IC v Niebel GIA 177 2014. It dismissed the Commissioner’s appeal and upheld the First-Tier Tribunal’s cancellation of the £300,000 penalty imposed for the sending of marketing text messages.

I appeared in this case, as did James Cornwell (also of the Panopticon fold), so I will not be offering an analysis of the case just now. With any luck, one of my colleagues will be cajoled into doing so before too long.

It is worth pointing out simply that this is the first binding decision on the meaning of the various limbs of s. 55A of the DPA 1998, which contains the preconditions for the issuing of a monetary penalty notice.

Robin Hopkins @hopkinsrobin

Google Spain and the CJEU judgment it would probably like to forget.

May 19th, 2014 by Akhlaq Choudhury

In the landmark judgment in Google Spain SL and Google Inc. v Agencia Espanola de Proteccion de Datos, Gonzales (13th May 2014), the CJEU found that Google is a data controller and is engaged in processing personal data within the meaning of Directive 95/46 whenever an internet search about an individual results in the presentation of information about that individual with links to third party websites. The judgment contains several findings which fundamentally affect the approach to data protection in the context of internet searches, and which may have far-reaching implications for search engine operators as well as other websites which collate and present data about individuals.

The case was brought by Mr Costeja Gonzales, who was unhappy that two newspaper reports of a 16-year-old repossession order against him for the recovery of social security debts would come up whenever a Google search was performed against his name. He requested both the newspaper and Google Spain or Google Inc. to remove or conceal the link to the reports on the basis that the matter had long since been resolved and was now entirely irrelevant. The Spanish Data Protection Agency rejected his complaint against the newspaper on the basis that publication was legally justified. However, his complaint against Google was upheld. Google took the matter to court, which made a reference to the CJEU.

The first question for the CJEU was whether Google was a data controller for the purposes of Directive 95/46. Going against the opinion of the Advocate General (see earlier post), the Court held that the collation, retrieval, storage, organisation and disclosure of data undertaken by a search engine when a search is performed amounted to “processing” within the meaning of the Directive; and that as Google determined the purpose and means of that processing, it was indeed the controller. This is so regardless of the fact that such data is already published on the internet and is not altered by Google in any way.

 The Court went on to find that the activity of search engines makes it easy for any internet user to obtain a structured overview of the information available about an individual thereby enabling them to establish a detailed profile of that person involving a vast number of aspects of his private life.  This entails a significant interference with rights to privacy and to data protection, which could not be justified by the economic interests of the search engine operator.  In a further remark that will send shockwaves through many commercial operators providing search services, it was said that as a “general rule” the data subject’s rights in this regard will override “not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name” (at paras 81 and 97). Exceptions would exist, e.g. for those in public life where the “the interference with…fundamental rights is justified by the preponderant interest of the general public in having…access to the information in question”.

However, the Court did not stop there with a mere declaration about interference. Given the serious nature of the interference with privacy and data protection rights, the Court said that search engines like Google could be required by a data subject to remove links to websites containing information about that person, even without requiring simultaneous deletion from those websites.

Furthermore, the CJEU lent support to the “right to be forgotten” by holding that the operator of a search engine could be required to delete links to websites containing a person’s information. The reports about Mr Costeja Gonzales’s financial difficulties in 1998 were no longer relevant having regard to his right to private life and the time that had elapsed, and he had therefore established the right to require Google to remove links to the relevant reports from the list of search results against his name. In so doing, he did not even have to establish that the publication caused him any particular prejudice.

The decision clearly has huge implications, not just for search engine operators like Google, but also other operators providing web-based personal data search services. Expect further posts in coming days considering some of the issues arising from the judgment.

Akhlaq Choudhury