Disclosing child protection information: make sure you ask the right questions first

June 1st, 2015 by Robin Hopkins

High-profile revelations in recent years illustrate the importance of public authorities sharing information on individuals who are of concern in relation to child protection matters. When inaccurate information is shared, however, the consequences for the individual can be calamitous.

AB v Chief Constable of Hampshire Constabulary [2015] EWHC 1238 (Admin) is a recent High Court judgment (Jeremy Baker J) which explores the implications of such inaccurate disclosures. The case is not only about inaccuracies per se, but about why those inaccuracies were not picked up before the disclosure was made.

Perhaps the most notable point from the judgment is this: if such a disclosure is to be justified as necessary, then the data controller must take care to ask reasonable questions about that information, check it against other obvious sources, and make the necessary enquiries before disclosure takes place.

In other words, failure to ask the right questions can lead to the wrong course of action in privacy terms. Here is how that principle played out in the AB case.

Background

In 2010, AB was summarily dismissed from his job as a science teacher for inappropriate comments and conduct with potential sexual undertones, as well as a failure to maintain an appropriately professional boundary with students. His appeal against dismissal failed. The Independent Safeguarding Authority, however, decided not to include AB on its barred lists. The General Teaching Council also investigated AB, but it did not find that the allegations of improper conduct were made out.

AB’s dismissal, however, came to the attention of a member of the child abuse investigation public protection unit of the Hampshire Constabulary. Enquiries were made of the college, and certain email correspondence and records were generated and retained on police systems.

Later the following year, AB was offered a teaching job elsewhere. This came to the police’s attention in 2013. There was internal discussion within the police about this. One officer said in an email, among other things, that (i) AB had also been dismissed from another school, and (ii) AB’s 2010 dismissal had involved inappropriate touching between himself and pupils. There was no evidence that either of those points was true. That email concluded “From What I’ve been told he should be nowhere near female students. I will put an intel report in on [AB]”.

The above information was passed to the Local Authority Designated Officer (‘LADO’) and in turn to the school, who terminated AB’s employment. He then made a subject access request under the DPA, by which he learnt of the above communication, and also the source of that information, which was said to be a notebook containing a police officer’s notes from 2010 (which did not in fact record either (i) or (ii) above). AB complained of the disclosure and also of the relevant officer’s failures to follow the requisite safeguarding procedures. The police dismissed his complaint.

The Court’s judgment

AB sought judicial review both of the disclosure of the inaccurate information in the email, and of the dismissal of his complaint about the police officer’s conduct in his reporting of the matter.

The Court (Jeremy Baker J) granted the application on both issues. I focus here on the first, namely the lawfulness of the disclosure in terms of Article 8 ECHR.

Was the disclosure “in accordance with the law” for Article 8 purposes?

The Court considered the key authorities in this – by now quite well-developed – area of law (Article 8 in the context of disclosures by the police), notably:

MM v United Kingdom [2010] ECHR 1588 (the retention and disclosure of information relating to an individual by a public authority engages Article 8, and must therefore be justified under Article 8(2));

Tysiac v Poland (2007) 45 EHRR 42, where the ECtHR stressed the importance of procedural safeguards to protecting individuals’ Article 8 rights from unlawful interference by public bodies;

R v Chief Constable of North Wales Ex parte Thorpe [1999] QB 396: a decision about whether or not to disclose the identity of paedophiles to members of the public is a highly sensitive one, and “Disclosure should only be made when there is a pressing need for that disclosure”;

R (L) v Commissioner of Police for the Metropolis [2010] 1 AC 410: such cases are essentially about proportionality;

R (A) v Chief Constable of Kent [2013] EWCA Civ 1706: such a disclosure is often “in practice the end of any opportunity for the individual to be employed in an area for which an [Enhanced Criminal Record Certificate] is required. Balancing the risks of non-disclosure to the interests of the members of the vulnerable group against the right of the individual concerned to respect for his or her private life is a particularly sensitive and difficult exercise where the allegations have not been substantiated and are strongly denied”;

R (T) v Chief Constable of Greater Manchester Police & others [2015] AC 49 and R (Catt) v ACPO [2015] 2 WLR 664 on whether disclosures by police were in accordance with the law and proportionate.

The Court concluded that, in light of the above authorities, the disclosure made in AB’s case was “in accordance with the law”. It was made under the disclosure regime made up of: Part V of the Police Act 1997, the Home Office’s Statutory Disclosure Guidance on enhanced criminal records certificates, section 10 of the Children Act 2004 and the Data Protection Act 1998.

See Jeremy Baker J’s conclusion – and notes of caution – at [73]-[75]:

“73. In these circumstances it seems to me that not only does the common law empower the police to disclose relevant information to relevant parties, where it is necessary for one of these police purposes, but that the DPA 1998, together with the relevant statutory and administrative codes, provide a sufficiently clear, accessible and consistent set of rules, so as to prevent arbitrary or abusive interference with an individual’s Article 8 rights; such that the disclosure will be in accordance with law.

74. However, it will clearly be necessary in any case, and in particular in relation to a decision to disclose information to a third party, for the decision-maker to examine with care the context in which his/her decision is being made.

75. In the present case, although the disclosure of the information by the police was to a LADO in circumstances involving the safeguarding of children, it also took place in the context of the claimant’s employment. The relevance of this being, as DC Pain was clearly aware from the contents of his e-mail to PS Bennett dated 10th June 2013, that the disclosure of the information had the potential to adversely affect the continuation of the claimant’s employment at the school….”

Was the disclosure proportionate?

While the disclosure decision was in accordance with the law, this did not remove the need for the police carefully to consider whether disclosure was necessary and proportionate, particularly in light of the serious consequences of disclosure for AB’s employment.

The Court held that the disclosure failed these tests. The crucial factor was that if such information about AB was well founded, then it would have been contained in his Enhanced Criminal Record Certificate – and if it was not, this would have prompted enquiries about the cogency of the information (why, if it was correct, was such serious information omitted from the ECRC?) which would reasonably have been pursued to get to the bottom of the matter before the disclosure was made. These questions had not been asked in this case. See [80]-[81]:

“… In these circumstances, it was in my judgment, a necessary procedural step for DC Pain to ascertain from the DBS unit as to, whether, and if so, what information it had already disclosed on any enhanced criminal record certificate, as clearly if the unit had already disclosed the information which DC Pain believed had been provided to him by the college, then it would not have been necessary for him to have made any further disclosure of that information.

81. If either DC Pain or PS Bennett had taken this basic procedural step, then not only would it have been immediately obvious that this information had not been provided to the school, but more importantly, in the context of this case, it would also have been obvious that further enquiries were required to be made: firstly as to why no such disclosure had been made by the DBS unit; and secondly, once it had been ascertained that the only information which was in the possession of the DBS unit was the exchange of e-mails on the defendant’s management system, as to the accuracy of the information with which DC Pain believed he had been provided by the college.”

Judicial reviews of disclosure decisions concerning personal data: the DPA as an alternative remedy?

Finally, the Court dealt with a submission that judicial review should not be granted as this case focused on what was essentially a data protection complaint, which could have been taken up with the ICO under the DPA (as was suggested in Lord Sumption’s comments in Catt). That submission was dismissed: AB had not simply ignored or overlooked that prospect, but had rather opted to pursue an alternative course of complaint; the DPA did not really help with the police conduct complaint, and the case raised important issues.

Robin Hopkins @hopkinsrobin

Google and the DPA – RIP section 13(2)

March 27th, 2015 by Christopher Knight

Well, isn’t this an exciting week (and I don’t mean Zayn leaving One Direction)? First, Evans and now Vidal-Hall. We only need Dransfield to appear before Easter and there will be a full red bus analogy. Robin opened yesterday’s analysis of Evans by remarking on the sexiness of FOIA. If there is one thing you learn quickly as an information law practitioner, it is not to engage in a sexiness battle with Robin Hopkins. But high-profile though Evans is, the judgment in Vidal-Hall will be of far wider significance to anyone having to actually work in the field, rather than simply tuning in every now and then to see the Supreme Court say something constitutional against a FOIA background. Vidal-Hall might not be the immediate head-turner, but it is probably going to be the life-changer for most of us. So, while still in the ‘friend zone’ with the Court of Appeal, before it all gets serious, it is important to explain what Vidal-Hall v Google [2015] EWCA Civ 311 does.

The Context

The claims concern the collection by Google of information about the internet usage of Apple Safari users, by means of cookies. This is known as “browser generated information” or “BGI”. Not surprisingly, it is used by Google to more effectively target advertising at the user. Anyone who has experienced this sort of thing will know how bizarre it can sometimes get – the sudden appearance of adverts for maternity clothes which would appear on my computer followed eerily quickly from my having to research pregnancy information for a discrimination case I was doing. Apple Safari users had not given their consent to the collection of BGI. The Claimants brought claims for misuse of private information, breach of confidence and breach of the DPA, seeking damages under section 13. There is yet to be a full trial; the current proceedings arise because of the need to serve out of the jurisdiction on Google.

The Issues

These were helpfully set out in the joint judgment of Lord Dyson MR and Sharp LJ (with whom McFarlane LJ agreed) at [13]: (1) whether misuse of private information is a tort, (2) whether damages are recoverable under the DPA for mere distress, (3) whether there was a serious issue to be tried that the browser generated data was personal data and (4) whether permission to serve out should have been refused on Jameel principles (i.e. whether there was a real and substantial cause of action).

Issues (1) and (4) are less important to readers of this blog, and I need only mention them briefly (#spoilers!). Following a lengthy recitation of the development of the case law, the Court held that the time had come to talk not of cabbages and kings, but of the tort of misuse of private information, rather than an equitable action for breach of confidence: at [43], [50]-[51]. This allowed service out under the tort gateway in PD6B. The comment of the Court on issue (4) is worth noting, because it held that although claims for breach of the DPA would involve “relatively modest” sums in damages, that did not mean the claim was not worth the candle. On the contrary, “the damages may be small, but the issues of principle are large”: at [139].

Damages under Section 13 DPA

Issue (2) is the fun stuff for DP lawyers. As we all know, Johnson v MDU [2007] EWCA Civ 262 has long cast a baleful glare over the argument that one can recover section 13 damages for distress alone. The Court of Appeal have held such comments to be obiter and not binding on them: at [68]. The word ‘damage’ in Art 23 of the Directive had to be given an autonomous EU law meaning: at [72]. It also had to be construed widely having regard to the underlying aims of the legislation: the legislation was primarily designed to protect privacy not economic rights and it would be strange if data subjects could not recover compensation for an invasion of their privacy rights merely because they had not suffered pecuniary loss, especially given Article 8 ECHR does not impose such a bar: at [76]-[79]. However, it is not necessary to establish whether there has also been a breach of Article 8; the Directive is not so restricted (although something which does not breach Article 8 is unlikely to be serious enough to have caused distress): at [82].

What then to do about section 13(2) which squarely bars recovery for distress alone and is incompatible with that reading of Article 23? The Court held it could not be ‘read down’ under the Marleasing principle; Parliament had intended section 13(2) to impose this higher test, although there was nothing to suggest why it had done so: at [90]-[93]. The alternative was striking it down on the basis that it conflicted with Articles 7 and 8 of the EU Charter of Fundamental Rights, which the Court of Appeal accepted. In this case, privacy and DP rights were enshrined as fundamental rights in the Charter; breach of DP rights meant that EU law rights were engaged; Article 47 of the Charter requires an effective remedy in respect of the breach; Article 47 itself had horizontal direct effect (as per the court’s conclusion in Benkharbouche v Embassy of Sudan [2015] EWCA Civ 33); the Court was compelled to disapply any domestic provision which offended against the relevant EU law requirement (in this case Article 23); and there could be no objections to any such disapplication in the present case e.g. on the ground that the Court was effectively recalibrating the legislative scheme: at [95]-[98], [105].

And thus, section 13(2) was no more. May it rest in peace. It has run down the curtain and joined the bleedin’ choir invisible.

What this means, of course, is a potential flood of DP litigation. All of a sudden, it will be worth bringing a claim for ‘mere’ distress even without pecuniary loss, and there can be no doubt many will do so. Every breach of the DPA now risks an affected data subject seeking damages. Those sums will invariably be small (no suggestion from the Court of Appeal that Article 23 requires a lot of money), and perhaps not every case will involve distress, but it will invariably be worth a try for the data subject. Legal costs defending such claims will increase. Any data controllers who were waiting for the new Regulation with its mega-fines before putting their house in order had better change their plans…

Was BGI Personal Data?

For the DP geeks, much fun was still to be had with Issue (3). Google cannot identify a particular user by name; it only identifies particular browsers. If I search for nasal hair clippers on my Safari browser, Google wouldn’t recognise me walking down the street, no matter how hirsute my proboscis. The Court of Appeal did not need to determine the issue, it held only that there was a serious issue to be tried. Two main arguments were run. First, whether the BGI looked at in isolation was personal data (under section 1(1)(a) DPA); and secondly, whether the BGI was personal data when taken together with gmail account data held by Google (application of limb (b)).

On the first limb, the Court held that it was clearly arguable that the BGI was personal data. This was supported by the terms of the Directive, an Article 29 WP Opinion and the CJEU’s judgment in Lindqvist. The fact that the BGI data does not name the individual is immaterial: it clearly singles them out, individuates them and therefore directly identifies them: at [115] (see more detail at [116]-[121]).

On the second limb, it was also clearly arguable that the BGI was personal data. Google had argued that in practice it had no intention of amalgamating the data sets, and that therefore there was no prospect of identification. The Court rejected this argument both on linguistic grounds (having regard to the wording of the definition of personal data, which does not require identification to actually occur) and on purposive grounds (having regard to the underlying purpose of the legislation): at [122]-[125].

As to a third route of identification, by which individual users could be identified by third parties who access the user’s device and then learn something about the user by virtue of the targeted advertising, the Court concluded that this was a difficult question on which the judge was not plainly wrong, and so it should be left for trial: at [126]-[133].

It will be interesting to see whether the trial happens. If it does, there could be some valuable judicial discussion on the nature of the identification question. For now, much is left as arguable.

Conclusion

The Court of Appeal’s judgment in Vidal-Hall is going to have massive consequences for DP in the UK. The disapplication of section 13(2) is probably the most important practical development since Durant, and arguably more so than that. Google are proposing to seek permission to appeal to the Supreme Court, and given the nature of the issues they may well get it on Issues (1) and (2) at least. In the meantime, the Court’s judgment will repay careful reading. And data controllers should start looking very anxiously over their shoulders. The death of their main shield in section 13(2) leaves them vulnerable, exposed and liable to death by a thousand small claims.

Anya Proops and Julian Milford appeared for the ICO, intervening in the Court of Appeal.

Christopher Knight

PS No judicial exclamation marks to be found in Vidal-Hall. Very restrained.

Google Spain, freedom of expression and security: the Dutch fight back

March 13th, 2015 by Robin Hopkins

The Dutch fighting back against the Spanish, battling to cast off the control exerted by Spanish decisions over Dutch ideologies and value judgments. I refer of course to the Eighty Years’ War (1568-1648), which in my view is a sadly neglected topic on Panopticon.

The reference could also be applied, without too much of a stretch, to data protection and privacy rights in 2015.

The relevant Spanish decision in this instance is of course Google Spain, which entrenched what has come to be called the ‘right to be forgotten’. The CJEU’s judgment on the facts of that case saw privacy rights trump most other interests. The judgment has come in for criticism from advocates of free expression.

The fight-back by free expression (and Google) has found the Netherlands to be its most fruitful battleground. In 2014, a convicted criminal’s legal battle to have certain links about his past ‘forgotten’ (in the Google Spain sense) failed.

This week, a similar challenge was also dismissed. This time, a KPMG partner sought the removal of links to stories about him allegedly having to live in a container on his own estate (because a disgruntled builder, unhappy over allegedly unpaid fees, changed the locks on the house!).

In a judgment concerned with preliminary relief, the Court of Amsterdam rejected his application, finding in Google’s favour. There is an excellent summary on the Dutch website Media Report here.

The Court found that the news stories to which the complained-of Google links related remained relevant in light of public debates on this story.

Importantly, the Court said of Google Spain that the right to be forgotten “is not meant to remove articles which may be unpleasant, but not unlawful, from the eyes of the public via the detour of a request for removal to the operator of a search machine.”

The Court gave very substantial weight to the importance of freedom of expression, something which Google Spain’s critics say was seriously underestimated in the latter judgment. If this judgment is anything to go by, there is plenty of scope for lawyers and parties to help Courts properly to balance privacy and free expression.

Privacy rights wrestle not only against freedom of expression, but also against national security and policing concerns.

In The Hague, privacy has recently grabbed the upper hand over security concerns. The District Court of The Hague has this week found that Dutch law on the retention of telecommunications data should be struck down due to its incompatibility with privacy and data protection rights. This is the latest in a line of cases challenging such data retention laws, the most notable of which was the CJEU’s judgment in Digital Rights Ireland, on which see my post here. For a report on this week’s Dutch judgment, see this article by Maarten van Tartwijk in The Wall Street Journal.

As that article suggests, the case illustrates the ongoing tension between security and privacy. In the UK, security initially held sway as regards the retention of telecoms data: see the DRIP Regulations 2014 (and Panopticon passim). That side of the argument has gathered some momentum of late, in light of (for example) the Paris massacres and revelations about ‘Jihadi John’.

Just this week, however, the adequacy of UK law on security agencies has been called into question: see the Intelligence and Security Committee’s report entitled “Privacy and Security: a modern and transparent legal framework”. There are also ongoing challenges in the Investigatory Powers Tribunal – for example this one concerning Abdul Hakim Belhaj.

So, vital ideological debates continue to rage. Perhaps we really should be writing more about 17th century history on this blog.

Robin Hopkins @hopkinsrobin

Googling Orgies – Thrashing out the Liability of Search Engines

January 30th, 2015 by Christopher Knight

Back in 2008, the late lamented News of the World published an article under the headline “F1 boss has sick Nazi orgy with 5 hookers”. It had obtained footage of an orgy involving Max Mosley and five ladies of dubious virtue, all of whom were undoubtedly (despite the News of the World having blocked out their faces) not Mrs Mosley. The breach of privacy proceedings before Eady J (Mosley v News Group Newspapers Ltd [2008] EWHC 687 (QB)) established that the ‘Nazi’ allegation was unfounded and unfair, that the footage was filmed by a camera secreted in “such clothing as [one of the prostitutes] was wearing” (at [5]), and also the more genteel fact that even S&M ‘prison-themed’ orgies stop for a tea break (at [4]), rather like a pleasant afternoon’s cricket, but with a rather different thwack of willow on leather.

Since that time, Mr Mosley’s desire to protect his privacy and allow the public to forget his penchant for themed tea breaks has led him to bring or fund ever more litigation, whilst simultaneously managing to remind as many people as possible of the original incident. His latest trip to the High Court concerns the inevitable fact of the internet age that the photographs and footage obtained and published by the News of the World remain readily available for those in possession of a keyboard and a strong enough constitution. They may not be as popular as last year’s iCloud hacks, but they can be found.

Alighting upon the ruling of the CJEU in Google Spain that a search engine is a data controller for the purposes of the Data Protection Directive (95/46/EC) (on which see the analysis here), Mr Mosley claimed that Google was obliged, under section 10 of the Data Protection Act 1998, to prevent processing of his personal data where he served a notice requesting it to do so, in particular by not blocking access to the images and footage which constitute his personal data. He also alleged misuse of private information. Google denied both claims and sought to strike them out. The misuse of private information claim being (or soon to be) withdrawn, Mitting J declined to strike out the DPA claim: Mosley v Google Inc [2015] EWHC 59 (QB). He has, however, stayed the claim for damages under section 13 pending the Court of Appeal’s decision in Vidal-Hall v Google (on which see the analysis here).

Google ran a cunning defence to what, post-Google Spain, might be said to be a strong claim on the part of a data subject. It relied on Directive 2000/31/EC, the E-Commerce Directive. Article 13 protects internet service providers from liability for the cached storage of information, providing they do not modify the information. Mitting J was content to find that by storing the images as thumbnails, Google was not thereby modifying the information in any relevant sense: at [41]. Article 15 of the E-Commerce Directive also prohibits the imposition of a general obligation on internet service providers to monitor the information they transmit or store.

The problem for Mitting J was how to resolve the interaction between the E-Commerce Directive and the Data Protection Directive: the latter gives a data subject rights which apparently extend to cached information held by internet service providers, for which the former apparently absolves them of legal responsibility. It was pointed out that recital (14) and article 1.5(b) of the E-Commerce Directive appeared to make that instrument subject to the Data Protection Directive. It was also noted that Google’s argument did not sit very comfortably with the judgment (or at least the effect of the judgment) of the CJEU in Google Spain.

Mitting J indicated that there were only two possible answers: either the Data Protection Directive formed a comprehensive code, or the two must be read in harmony and each given full effect: at [45]. His “provisional preference is for the second one”: at [46]. Unfortunately, the judgment does not then go on to consider why that is so, or more importantly, how both Directives can be read in harmony and given full effect. Of course, on a strike out application provisional views are inevitable, but it leaves rather a lot of legal work for the trial judge, and one might think that it would be difficult to resolve the interaction without a reference to the CJEU. What, for example, is the point of absolving Google of liability for cached information if that does not apply to any personal data claims, which will be a good way of re-framing libel/privacy claims to get around Article 13?

The Court also doubted that Google’s technology really meant that it would have to engage in active monitoring, contrary to Article 15, because they may be able to do so without “disproportionate effort or expense”: at [54]. That too was something for the trial judge to consider.

So, while the judgment of Mitting J is an interesting interlude in the ongoing Mosley litigation saga, the final word certainly awaits a full trial (and/or any appeal by Google), and possibly a reference. All the judgment decides is that Mr Mosley’s claim is not so hopeless it should not go to trial. Headlines reading ‘Google Takes a Beating (with a break for tea)’ would be premature. But the indications given by Mitting J are not favourable to Google, and it may well be that the footage of Mr Mosley will not be long for the internet.

Christopher Knight

Data protection: three developments to watch

January 15th, 2015 by Robin Hopkins

Panopticon likes data protection, and it likes to keep its eye on things. Here are three key developments in the evolution of data protection law which, in Panopticon’s eyes, are particularly worth watching.

The right to be forgotten: battle lines drawn

First, the major data protection development of 2014 was the CJEU’s ‘right to be forgotten’ judgment in the Google Spain case. Late last year, we received detailed guidance from the EU’s authoritative Article 29 Working Party on how that judgment should be implemented: see here.

In the view of many commentators, the Google Spain judgment was imbalanced. It gave privacy rights (in their data protection guise) undue dominance over other rights, such as rights to freedom of expression. It was clear, however, that not all requests to be ‘forgotten’ would be complied with (as envisaged by the IC, Chris Graham, in an interview last summer) and that complaints would ensue.

Step up Max Mosley. The BBC reported yesterday that he has commenced High Court litigation against Google. He wants certain infamous photographs from his past to be made entirely unavailable through Google. Google says it will remove specified URLs, but won’t act so as to ensure that those photographs are entirely unobtainable through Google. According to the BBC article, this is principally because Mr Mosley no longer has a reasonable expectation of privacy with respect to those photographs.

The case has the potential to be a very interesting test of the boundaries of privacy rights under the DPA in a post-Google Spain world.

Damages under the DPA

Second, staying with Google, the Court of Appeal will continue its consideration of the appeal in Vidal-Hall and Others v Google Inc [2014] EWHC 13 (QB) in February. The case concerns objections to personal data gathered through Apple’s Safari browser. Among the important issues raised by this case is whether, in order to be awarded compensation for a DPA breach, one has to establish financial loss (as has commonly been assumed). If the answer is no, this could potentially lead to a surge in DPA litigation.

The General Data Protection Regulation: where are we?

I did a blog post last January with this title. A year on, the answer still seems to be that we are some way off agreement on what the new data protection law will be.

The latest text of the draft Regulation is available here – with thanks to Chris Pounder at Amberhawk. As Chris notes in this blog post, the remaining disagreements about the final text are legion.

Also, Jan Philipp Albrecht, the vice-chairman of the Parliament’s civil liberties committee, has reportedly suggested that the process of reaching agreement may even drag on into 2016.

Perhaps I will do another blog post in January 2016 asking the same ‘where are we?’ question.

Robin Hopkins @hopkinsrobin

Monetary penalty for marketing phone calls: Tribunal upholds ‘lenient’ penalty

December 16th, 2014 by Robin Hopkins

A telephone call made for direct marketing purposes is against the law when it is made to the number of a telephone subscriber who has registered with the Telephone Preference Service (‘TPS’) as not wishing to receive such calls on that number, unless the subscriber has notified the caller that he does not, for the time being, object to such calls being made on that line by that caller: see regulation 21 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, as amended (‘PECR’).

The appellant in Amber UPVC Fabrications v IC (EA/2014/0112) sells UPVC windows and the like. It relies heavily on telephone calls to market its products and services. It made nearly four million telephone calls in the period May 2011 to April 2013, of which approximately 80% to 90% were marketing calls.

Some people complained to the Information Commissioner about these calls. The Commissioner found that the appellant had committed serious PECR contraventions – he relied on 524 unsolicited calls made in contravention of PECR. The appellant admitted that it made 360 of the calls. The appellant was issued with a monetary penalty under section 55A of the Data Protection Act 1998, as incorporated into PECR.

The penalty was set at £50,000. The appellant appealed to the Tribunal. Its appeal did not go very well.

The Tribunal found the appellant’s evidence to be “rather unsatisfactory in a number of different ways. They took refuge in broad assertions about the appellant’s approach to compliance with the regulations, without being able to demonstrate that they were genuinely familiar with the relevant facts. They were able to speak only in general terms about the changes to the appellant’s telephone systems that had been made from time to time, and appeared unfamiliar with the detail. They had no convincing explanations for the numerous occasions when the appellant had failed to respond to complaints and correspondence from TPS or from the Commissioner. The general picture which we got was of a company which did as little as possible as late as possible to comply with the regulations, and only took reluctant and belated action in response to clear threats of legal enforcement.”

The Tribunal set out in detail the flaws with the appellant’s evidence. It concluded that “the penalty was appropriate (or, indeed, lenient) in the circumstances, and the appellant has no legitimate complaint concerning its size”.

This decision is notable not only for its detailed critique (in terms of PECR compliance) of the appellant’s business practices and evidence on appeal, but also more widely for its contribution to the developing jurisprudence on monetary penalties and the application of the conditions under section 55A DPA. Thus far, the cases have been Scottish Borders (DPA appeal allowed, in a decision largely confined to the facts), Central London Community Healthcare NHS Trust (appeal dismissed at both First-Tier and Upper Tribunal levels) and Niebel (PECR appeal allowed and upheld on appeal).

The Amber case is most closely linked to Niebel, which concerned marketing text messages. The Amber decision includes commentary on and interpretation of the binding Upper Tribunal decision in Niebel on how the section 55A conditions for issuing a monetary penalty should be applied. For example:

PECR should be construed so as to give proper effect to the Directive which it implements – see the Tribunal’s discussion of the Marleasing principle.

The impact of the ‘contravention’ can be assessed cumulatively, i.e. as the aggregate effect of the contraventions asserted in the penalty notice. In Niebel, the asserted contravention was a specified number of text messages which had been complained about, but the Tribunal in Amber took the view that, in other cases, the ICO need not frame the relevant contravention solely by reference to complaints – it could extrapolate, where the evidence supported this, to form a wider conclusion on contraventions.

Section 55A requires an assessment of the “likely” consequences of the “kind” of contravention. “Likely” has traditionally been taken to mean “a significant and weighty chance”, but the Tribunal in Amber considered that, in this context, it might mean “more than fanciful”, i.e. “a real, a substantial rather than merely speculative, possibility, a possibility that cannot sensibly be ignored”.

The “kind” of contravention includes the method of contravention, the general content and tenor of the communication, and the number or scale of the contravention.

“Substantial” (as in “substantial damage or substantial distress”) probably means “more than trivial, i.e. real or of substance”. Damage or distress can be substantial on a cumulative basis, i.e. even if the individual incidents do not themselves cause substantial damage or substantial distress.

“Damage” is different to “distress” but is not confined to financial loss – for example, personal injury or property interference could suffice.

“Distress” means something more than irritation.

The significant and weighty chance of causing substantial distress to one person is sufficient for the threshold test to be satisfied.

Where the number of contraventions is large, there is a higher inherent chance of affecting somebody who, because of their particular unusual circumstances, is likely to suffer substantial damage or substantial distress due to the PECR breach.

The Amber decision is, to date, the most developed analysis of the monetary penalty conditions at First-Tier Tribunal level. The decision will no doubt be cited and discussed in future cases.

11KBW’s James Cornwall appeared for the ICO in both Amber and Niebel.

Robin Hopkins @hopkinsrobin

Above and below the waterline: IPT finds that Prism and Tempora are lawful

December 5th, 2014 by Robin Hopkins

The now famous revelations by US whistleblower Edward Snowden focused on US government programmes under which vast amounts of data about individuals’ internet usage and communications were said to have been gathered. The allegations extended beyond the US: the UK government and security agencies, for example, were also said to be involved in such activity.

Unsurprisingly, concerns were raised about the privacy implications of such activity – in particular, whether it complied with individuals’ rights under the European Convention on Human Rights (privacy under Article 8; freedom of expression under Article 10).

The litigation before the Investigatory Powers Tribunal

Litigation was commenced in the UK by Privacy International, Liberty, Amnesty International and others. The cases were heard by a five-member panel of the Investigatory Powers Tribunal (presided over by Mr Justice Burton) in July of this year. The IPT gave judgment ([2014] UKIPTrib 13_77-H) today.

In a nutshell, it found that the particular information-gathering activities it considered – carried out in particular by GCHQ and the Security Service – are lawful.

Note the tense: they are lawful. The IPT has not determined whether or not they were lawful in the past. The key difference is this: an essential element of lawfulness is whether the applicable legal regime under which such activity is conducted is sufficiently accessible (i.e. is it available and understandable to people?). That turns in part on what the public is told about how the regime operates. During the course of this litigation, the public has been given (by means of the IPT’s open judgment) considerably more detail in this regard. This, says the IPT, certainly makes the regime lawful on a prospective basis. The IPT has not determined whether, prior to these supplementary explanations, the ‘in accordance with the law’ requirement was satisfied.

With its forward-looking, self-referential approach, this judgment is unusual. It is also unusual in that it proceeded to test the legality of the regimes largely by references to assumed rather than established facts about the Prism and Tempora activities. This is because not much about those activities has been publicly confirmed, due to the ‘neither confirm nor deny’ principle which is intrinsic to intelligence and security activity.

Prism

The first issue assessed by reference to assumed facts was called the “Prism” issue: this was about the collection/interception by US authorities of data about individuals’ internet communications and the assumed sharing of such data with UK authorities, who could then retain and use it. Would this arrangement be lawful under Article 8(2) ECHR? In particular, was it “in accordance with the law”, which in essence means did it have a basis in law and was it sufficiently accessible and foreseeable to the potentially affected individuals? (These are the so-called Weber requirements, from Weber and Saravia v Germany [2008] 46 EHRR SE5).

When it comes to intelligence, accessibility and foreseeability are difficult to achieve without giving the game away to a self-defeating extent. The IPT recognised that the Weber principles need tweaking in this context. The following ‘nearly-Weber’ principles were applied as the decisive tests for ‘in accordance with the law’ in this context:

“(i) there must not be an unfettered discretion for executive action. There must be controls on the arbitrariness of that action.

(ii) the nature of the rules must be clear and the ambit of them must be in the public domain so far as possible, an “adequate indication” given (Malone v UK [1985] 7 EHRR 14 at paragraph 67), so that the existence of interference with privacy may in general terms be foreseeable.”

Those tests will be met if:

“(i) Appropriate rules or arrangements exist and are publicly known and confirmed to exist, with their content sufficiently signposted, such as to give an adequate indication of it.

(ii) They are subject to proper oversight.”

On the Prism issue, the IPT found that those tests are met. The basis in law comes from the Security Service Act 1989, the Intelligence Services Act 1994 and the Counter-Terrorism Act 2008. Additionally, the Data Protection Act 1998, the Official Secrets Act 1989 and the Human Rights Act 1998 restrain the use of data of the sort at issue here. Taken together, there are sufficient and specific statutory limits on the information that each of the Intelligence Services can obtain, and on the information that each can disclose.

In practical terms, there are adequate arrangements in place to safeguard against arbitrary or unfettered use of individuals’ data. These included the “arrangements below the waterline” (i.e. which are not publicly explained) which the Tribunal was asked to – and did – take into account.

Oversight of this regime comes through Parliament’s Intelligence and Security Committee and the Interception of Communications Commissioner.

Further, these arrangements are “sufficiently signposted by virtue of the statutory framework … and the statements of the ISC and the Commissioner… and as now, after the two closed hearings that we have held, publicly disclosed by the Respondents and recorded in this judgment”.

Thus, in part thanks to closed evidence of the “below the waterline” arrangements and open disclosure of more detail about those arrangements, the Prism programme (on the assumed facts before the IPT) is lawful, i.e. it is a justified intrusion into Article 8 ECHR rights.

The alleged Tempora interception operation

Unlike the Prism programme, the second matter scrutinised by the IPT – the alleged Tempora programme – involved the interception of communications by UK authorities. Here, in contrast to Prism (where the interception is done by someone else), the Regulation of Investigatory Powers Act 2000 is pivotal.

This works on a system of warrants for interception. The warrants are issued under section 8 of RIPA (supplemented by sections 15 and 16) by the Secretary of State, rather than by a member of the judiciary. The regime is governed by the Interception of Communications Code of Practice.

The issue for the IPT was: is this warrant system (specifically, the section 8(4) provision for ‘certified’ warrants) in accordance with the law, for ECHR purposes?

This has previously been considered by the IPT in the British Irish Rights Watch case in 2004. Its answer was that the regime was in accordance with the law. The IPT in the present cases re-examined the issue and took the same view. It rejected a number of criticisms of the certified warrant regime, including:

The absence of a tightly focused, ‘targeting’ approach at the initial stages of information-gathering is acceptable and inevitable.

There is no call “for search words to be included in an application for a warrant or in the warrant itself. It seems to us that this would unnecessarily undermine and limit the operation of the warrant and be in any event entirely unrealistic”.

There is also “no basis for objection by virtue of the absence for judicial pre-authorisation of a warrant. The United Kingdom system is for the approval by the highest level of government, namely by the Secretary of State”.

Further, “it is not necessary that the precise details of all the safeguards should be published, or contained in legislation, delegated or otherwise”.

The overall assessment was very similar to that for Prism: in light of the statutory regime, the oversight mechanisms, the open and closed evidence of the arrangements (above and below the “waterline”) and additional disclosures by the Respondents, the regime for gathering, retaining and using intercepted data was in accordance with the law – both as to Article 8 and Article 10 ECHR.

Conclusion

This judgment is good news for the UK Government and the security bodies, who will no doubt welcome the IPT’s sympathetic approach to the practical exigencies of effective intelligence operations in the digital age. These paragraphs encapsulate the complaints and the IPT’s views:

“158. Technology in the surveillance field appears to be advancing at break-neck speed. This has given rise to submissions that the UK legislation has failed to keep abreast of the consequences of these advances, and is ill fitted to do so; and that in any event Parliament has failed to provide safeguards adequate to meet these developments. All this inevitably creates considerable tension between the competing interests, and the ‘Snowden revelations’ in particular have led to the impression voiced in some quarters that the law in some way permits the Intelligence Services carte blanche to do what they will. We are satisfied that this is not the case.

159. We can be satisfied that, as addressed and disclosed in this judgment, in this sensitive field of national security, in relation to the areas addressed in this case, the law gives individuals an adequate indication as to the circumstances in which and the conditions upon which the Intelligence Services are entitled to resort to interception, or to make use of intercept.”

11KBW’s Ben Hooper and Julian Milford appeared for the Respondents.

Robin Hopkins @hopkinsrobin

In the wake of Google Spain: freedom of expression down (but not out)

July 15th, 2014 by Robin Hopkins

The CJEU’s judgment in Google Spain was wrong and has created an awful mess.

That was the near-unanimous verdict of a panel of experts – including 11KBW’s Anya Proops – at a debate hosted by ITN and the Media Society on Monday 14 July and entitled ‘Rewriting History: Is the new era in Data Protection compatible with journalism?’.

The most sanguine participant was the Information Commissioner, Christopher Graham. He cautioned against ‘Chicken Licken’ (the sky is falling in) alarmism – we should wait and see how the right to be forgotten (RTBF) pans out in practice. He was at pains to reassure the media that its privileged status in data protection law was not in fact under threat: the s. 32 DPA exemption, for example, was here to stay. There remains space, Google Spain notwithstanding, to refuse inappropriate RTBF requests, he suggested – at least as concerns journalism which is in the public interest (a characteristic which is difficult to pin down in principle and in practice).

‘I am Chicken Licken!’, was the much less sanguine stance of John Battle, ITN’s Head of Compliance. Google Spain is a serious intrusion into media freedom, he argued. This was echoed by The Telegraph’s Holly Watt, who likened the RTBF regime to book-burning.

Peter Barron, Google’s Director of Communications and Public Affairs for Europe, Africa and the Middle East, argued that in implementing its fledgling RTBF procedure, Google was simply doing as told: it had not welcomed the Google Spain judgment, but that judgment is now the law, and implementing it was costly and burdensome. On the latter point, Chris Graham seemed less than entirely sympathetic, pointing out that Google’s business model is based heavily on processing other people’s personal data.

John Whittingdale MP, Chairman of the Culture, Media & Sport Select Committee, was markedly Eurosceptic in tone. Recent data protection judgments from the CJEU have overturned what we in the UK had understood the law to be – he was referring not only to Google Spain, but also to Digital Rights Ireland (on which see my DRIP post from earlier today). The MOJ or Parliament need to intervene and restore sanity, he argued.

Bringing more legal rigour to bear was Anya Proops, who homed in on the major flaws in the Google Spain judgment. Without there having been any democratic debate (and without jurisprudential analysis), the CJEU has set a general rule whereby privacy trumps freedom of expression. This is hugely problematic in principle. It is also impracticable: the RTBF mechanism doesn’t actually work in practice, for example because it leaves Google.com (as opposed to Google.co.uk or another EU domain) untouched – a point also made by Professor Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford.

There were some probing questions from the audience too. Mark Stephens, for example, asked Chris Graham how he defined ‘journalism’ (answer: ‘if it walks and quacks like a journalist’…) and how he proposed to fund the extra workload which RTBF complaints would bring for the ICO (answer: perhaps a ‘polluter pays’ approach?).

Joshua Rozenberg asked Peter Barron if there was any reason why people should not switch their default browsers to the RTBF-free Google.com (answer: no) and whether Google would consider giving aggrieved journalists rights of appeal within a Google review mechanism (the Google RTBF mechanism is still developing).

ITN is making the video available on its website this week. Those seeking further detail can also search Twitter for the hashtag #rewritinghistory or see Adam Fellows’ blog post.

The general tenor from the panel was clear: Google Spain has dealt a serious and unjustifiable blow to the freedom of expression.

Lastly, one of my favourite comments came from ITN’s John Battle, referring to the rise of data protection as a serious legal force: ‘if we’d held a data protection debate a year ago, we’d have had one man and his dog turn up. Now it pulls in big crowds’. I do not have a dog, but I have been harping on for some time about data protection’s emergence from the shadows to bang its fist on the tables of governments, security bodies, big internet companies and society at large. It surely will not be long, however, before the right to freedom of expression mounts a legal comeback, in search of a more principled and workable balance between indispensable components of a just society.

Robin Hopkins @hopkinsrobin

Surveillance powers to be kept alive via DRIP

July 15th, 2014 by Robin Hopkins

The legal framework underpinning state surveillance of individuals’ private communications is in turmoil, and it is not all Edward Snowden’s fault. As I write this post, two hugely important developments are afoot.

Prism/Tempora

The first is the challenge by Privacy International and others to the Prism/Tempora surveillance programmes implemented by GCHQ and the security agencies. Today is day 2 of the 5-day hearing before the Investigatory Powers Tribunal. To a large extent, this turmoil was unleashed by Snowden.

DRIP – the background

The second strand of the turmoil is thanks to Digital Rights Ireland and others, whose challenge to the EU’s Data Retention Directive 2006/24 was upheld by the CJEU in April of this year. That Directive provided for traffic and location data (rather than content-related information) about individuals’ online activity to be retained by communications providers for a period of 6-24 months and made available to policing and security bodies. In the UK, that Directive was implemented via the Data Retention (EC Directive) Regulations 2009, which mandated retention of communications data for 12 months.

In Digital Rights Ireland, the CJEU held the Directive to be invalid on the grounds of incompatibility with the privacy rights enshrined under the EU’s Charter of Fundamental Rights. Strictly speaking, the CJEU’s judgment (on a preliminary ruling) then needed to be applied by the referring courts, but in reality the foundation of the UK’s law fell away with the Digital Rights Ireland judgment. The government has, however, decided that it needs to maintain the status quo in terms of the legal powers and obligations which were rooted in the invalid Directive.

On 10 July 2014, the Home Secretary made a statement announcing that this gap in legal powers was to be plugged on a limited-term basis. A Data Retention and Investigatory Powers (DRIP) Bill would be put before Parliament, together with a draft set of regulations to be made under the envisaged Act. If passed, these would remain in place until the end of 2016, by which time longer-term solutions could be considered. Ms May said this would:

“…ensure, for now at least, that the police and other law enforcement agencies can investigate some of the criminality that is planned and takes place online. Without this legislation, we face the very prospect of losing access to this data overnight, with the consequence that police investigations will suddenly go dark and criminals will escape justice. We cannot allow this to happen.”

Today, amid the ministerial reshuffle and shortly before the summer recess, the Commons is debating DRIP on an emergency basis.

Understandably, there has been much consternation about the extremely limited time allotted for MPs to debate a Bill of such enormous significance for privacy rights (I entitled my post on the Digital Rights Ireland case “Interfering with the fundamental rights of practically the entire European population”, which is a near-verbatim quote from the judgment).

DRIP – the data retention elements

The Bill is short. A very useful summary can be found in the Standard Note from the House of Commons Library (authored by Philippa Ward).

Clause 1 provides power for the Secretary of State to issue a data retention notice on a telecommunications services provider, requiring them to retain certain data types (limited to those set out in the Schedule to the 2009 Regulations) for up to 12 months. There is a safeguard that the Secretary of State must consider whether it is “necessary and proportionate” to give the notice for one or more of the purposes set out in s22(2) of RIPA.

Clause 2 then provides the relevant definitions.

The Draft Regulations explain the process in more detail. Note in particular regulation 5 (the matters the Secretary of State must consider before giving a notice) and regulation 9 (which provides for oversight by the Information Commissioner of the requirements relating to integrity, security and destruction of retained data).

DRIP – the RIPA elements

DRIP is also being used to clarify (says the government) or extend (say some critics) RIPA 2000. In this respect, as commentators such as David Allen Green have pointed out, it is not clear why the emergency legislation route is necessary.

Again, to borrow the nutshells from the House of Commons Library’s Standard Note:

Clause 3 amends s5 of RIPA regarding the Secretary of State’s power to issue interception warrants on the grounds of economic well-being.

Clause 4 aims to clarify the extra-territorial reach of RIPA in relation to both interception and communications data by adding specific provisions. This confirms that requests for interception and communications data to overseas companies that are providing communications services within the UK are subject to the legislation.

Clause 5 clarifies the definition of “telecommunications service” in RIPA to ensure that internet-based services, such as webmail, are included in the definition.

Criticism

The Labour front bench is supporting the Coalition. A number of MPs, including David Davis and Tom Watson, have been vociferous in their opposition (see for example the proposed amendments tabled by Watson and others here). So too have numerous academics and commentators. I won’t try to link to all of them here (as there are too many). Nor can I link to a thorough argument in defence of DRIP (as I have not been able to find one). For present purposes, an excellent forensic analysis comes from Graham Smith at Cyberleagle.

I don’t seek to duplicate that analysis. It is, however, worth remembering this: the crux of the CJEU’s judgment was that the Directive authorised such vast privacy intrusions that stringent safeguards were required to render it proportionate. In broad terms, that proportionality problem can be fixed in two ways: reduce the extent of the privacy intrusions and/or introduce much better safeguards. DRIP does not seek to do the former. The issue is whether it offers sufficient safeguards for achieving an acceptable balance between security and privacy.

MPs will consider that today and Peers later this week. Who knows? – courts may even be asked for their views in due course.

Robin Hopkins @hopkinsrobin

Some results may have been removed under data protection law in Europe. Learn more.

July 3rd, 2014 by Robin Hopkins

This is the message that now regularly greets those using Google to search for information on named individuals. It relates, of course, to the CJEU’s troublesome Google Spain judgment of 13 May 2014.

I certainly wish to learn more.

So I take Google up on its educational offer and click through to its FAQ page, where the folks at Google tell me inter alia that “Since this ruling was published on 13 May 2014, we’ve been working around the clock to comply. This is a complicated process because we need to assess each individual request and balance the rights of the individual to control his or her personal data with the public’s right to know and distribute information”.

The same page also leads me to the form on which I can ask Google to remove from its search results certain URLs about me. I need to fill in gaps like this: “This URL is about me because… This page should not be included as a search result because…” 

This is indeed helpful in terms of process, but I want to understand more about the substance of decision-making. How does (and/or should) Google determine whether or not to accede to my request? Perhaps understandably (as Google remarks, this is a complicated business on which the dust is yet to settle), Google doesn’t tell me much about that just yet.

So I look to the obvious source – the CJEU’s judgment itself – for guidance. Here I learn that I can in principle ask that “inadequate, irrelevant or no longer relevant” information about me not be returned through a Google search. I also get some broad – and quite startling – rules of thumb, for example at paragraph 81, which tells me this:

“In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.”

So it seems that, in general (and subject to the sensitivity of the information and my prominence in public life), my privacy rights trump Google’s economic rights and other people’s rights to find information about me in this way. So the CJEU has provided some firm steers on points of principle.

But still I wish to learn more about how these principles will play out in practice. Media reports in recent weeks have told us about the volume of ‘right to be forgotten’ requests received by Google.

The picture this week has moved on from volumes to particulars. In the past few days, we have begun to learn how Google’s decisions filter back to journalists responsible for the content on some of the URLs which objectors pasted into the forms they sent to Google. We learn that journalists and media organisations, for example, are now being sent messages like this:

“Notice of removal from Google Search: we regret to inform you that we are no longer able to show the following pages from your website in response to certain searches on European versions of Google.”

Unsurprisingly, some of those journalists find this puzzling and/or objectionable. Concerns have been ventilated in the last day or two, most notably by the BBC’s Robert Peston (who feels that, through teething problems with the new procedures, he has been ‘cast into oblivion’) and The Guardian’s James Ball (who neatly illustrates some of the oddities of the new regime). See also The Washington Post’s roundup of UK media coverage.

That coverage suggests that the Google Spain ruling – which made no overt mention of free expression rights under Article 10 ECHR – has started to bite into the media’s freedom. The Guardian’s Chris Moran, however, has today posted an invaluable piece clarifying some misconceptions about the right to be forgotten. Academic commentators such as Paul Bernal have also offered shrewd insights into the fallout from Google Spain.

So, by following the trail from Google’s pithy new message, I am able to learn a fair amount about the tenor of this post-Google Spain world.

Inevitably, however, given my line of work, I am interested in the harder edges of enforcement and litigation: in particular, if someone objects to the outcome of a ‘please forget me’ request to Google, what exactly can they do about it?

On such questions, it is too early to tell. Google says on its FAQ page that “we look forward to working closely with data protection authorities and others over the coming months as we refine our approach”. For its part, the ICO tells us that it and its EU counterparts are working hard on figuring this out. Its newsletter from today says for example that:

“The ICO and its European counterparts on the Article 29 Working Party are working on guidelines to help data protection authorities respond to complaints about the removal of personal information from search engine results… The recommendations aim to ensure a consistent approach by European data protection authorities in response to complaints when takedown requests are refused by the search engine provider.”

So for the moment, there remain lots of unanswered questions. For example, the tone of the CJEU’s judgment is that DPA rights will generally defeat economic rights and the public’s information rights. But what about a contest between two individuals’ DPA rights?

Suppose, for example, that I am an investigative journalist with substantial reputational and career investment in articles about a particular individual, who then persuades Google to ensure that my articles do not surface in EU Google searches for his name. Those articles also contain my name, work and opinions, i.e. they also contain my personal data. In acceding to the ‘please forget me’ request without seeking my input, could Google be said to have processed my personal data unfairly, whittling away my online personal and professional output (at least to the extent that the relevant EU Google searches are curtailed)? Could this be said to cause me damage or distress? If so, can I plausibly issue a notice under s. 10 of the DPA, seek damages under s. 13, or ask the ICO to take enforcement action under s. 40?

The same questions could arise, for example, if my personal backstory is heavily entwined with that of another person who persuades Google to remove from its EU search results articles discussing both of us – that may be beneficial for the requester, but detrimental to me in terms of the adequacy of personal data about me which Google makes available to the interested searcher.

So: some results may have been removed under data protection law in Europe, and I do indeed wish to learn more. But I will have to wait.

Robin Hopkins @hopkinsrobin