Time to End the Time Debate

July 23rd, 2015 by Christopher Knight

The apparently endless APPGER litigation has produced yet another decision of the Upper Tribunal for seasoned FOIA watchers which, amongst some very fact-specific issues, also contains two important clarifications of law: APPGER v ICO & FCO [2015] UKUT 377 (AAC).

As anyone who has ever done any information law ever will know, the APPGER litigation concerns requests under FOIA for information related to alleged British involvement in extraordinary rendition. Some information has been released, some has been released following earlier rounds of litigation, some remains withheld under various exemptions.

Following previous hearings staying various points, the present round of litigation concerned the application of section 23 (the security bodies exemption) and section 27 (international relations). There were two points of wider interest discussed in particular. One is the time at which the public interest is assessed (relevant to section 27), and the other is the breadth of the “relates to” limb of section 23.

The time point was one which only really arose because of the Upper Tribunal’s desire to throw a mangy cat amongst the pigeons by suggesting in Defra v ICO & Badger Trust [2014] UKUT 526 (AAC) at [44]-[48] that the correct time to assess the public interest might be the date of Tribunal hearing. As some wise and learned commentators have pointed out, this rather seemed to have been overtaken by the Supreme Court’s – technically obiter – reasoning in R (Evans) v Attorney General [2015] UKSC 21 at [72]-[73] that the time was at the point of the authority’s refusal.

The Upper Tribunal in APPGER (containing at least one member of the panel in Badger Trust) issued a mea culpa and accepted that Evans was right: at [49]-[57]. It did not reach any more specific decision on situations where, for example, the authority has been late in complying. Doubtless the difference in time will often not matter very much. But the principle of the point now seems resolved.

Section 23(1) was not a point answered by Evans, and an argument was run by the requester that “relates to” should be construed narrowly, as in the DPA. The Upper Tribunal disagreed: at [15]-[19]. The ordinary meaning of the language was broad; it was consistent with the aim of shutting the backdoor to the security bodies; it was consistent with authority; and it met the contextual aim of FOIA, whereas the contextual aim of the DPA was very different. The idea of requiring a “focus or main focus” was rejected.

Whilst agreeing that it should not attempt to gloss the statutory language, the Upper Tribunal nonetheless sought to assist future cases by indicating that asking whether the information requested had been supplied to a security body for the purposes of the discharge of its statutory functions (a test attributed to Mitting J) would have considerable utility. It would enable a clear explanation, it would allow differentiation within and without the scope of the exemption, and it was less likely to require a detailed line-by-line approach to redactions: at [33]. The language remains broad, but the practical application of it appears to have been ‘guided’ into a slightly narrower pigeon-hole than might have otherwise been the case.

The judgment as a whole is worth reading on the application of those exemptions to the particular information and the treatment of the evidence by the Upper Tribunal, but those two points of principle are the keys to take away. And about time too.

Timothy Pitt-Payne QC and Joanne Clement appeared for APPGER; Karen Steyn QC appeared for the FCO; Robin Hopkins appeared for the ICO.

Christopher Knight

Facebook, child protection and outsourced monitoring

July 22nd, 2015 by Robin Hopkins

Facebook is no stranger to complaints about the content of posts. Usually, one user complains to Facebook about what other users’ posts say about him. By making the offending posts available, Facebook is processing the complainant’s personal data, and must do so in compliance with data protection law.

More unusually, a user could also complain about their own Facebook posts. Surely a complainant cannot make data protection criticisms about information they deliberately posted about themselves? After all, Facebook processes those posts with the author’s consent, doesn’t it?

Generally, yes – but that will not necessarily be true in every instance, especially when it comes to Facebook posts by children. This is the nature of the complaint in striking litigation currently afoot before the High Court in Northern Ireland.

The case is HL v Facebook Inc, Facebook Ireland Ltd, the Northern Health & Social Care Trust and DCMS [2015] NIQB 61. It is currently only in its preliminary stages, but it raises very interesting and important issues about Facebook’s procedures for preventing underage users from utilising the social network. Those issues are illuminated in the recent judgment of Stephens J, who is no stranger to claims against Facebook – he heard the recent case of CG v Facebook [2015] NIQB 11, concerning posts about a convicted paedophile.

From the age of 11 onwards, HL maintained a Facebook page on which she made posts of an inappropriate sexual nature. She was exposed to responses from sexual predators. She says that Facebook is liable for its failure to prevent her from making these posts. She alleges that Facebook (i) unlawfully processed her sensitive personal data, (ii) facilitated her harassment by others, and (iii) was negligent in failing to have proper systems in place to minimise the risks of children setting up Facebook accounts by lying about their age.

The data protection claim raises a number of issues of great importance to the business of Facebook and others with comparable business models. One is the extent to which a child can validly consent to the processing of their personal data – especially sensitive personal data. Minors are (legitimately or not) increasingly active online, and consent is a cornerstone of online business. The consent issue is one of wide application beyond the HL litigation.

A second issue is whether, in its processing of personal data, Facebook does enough to stop minors using their own personal data in ways which could harm them. In her claim, for example, HL refers to evidence given to a committee of the Australian Parliament – apparently by a senior privacy advisor to Facebook (though Facebook was unable to tell Stephens J who he was). That evidence apparently said that Facebook removes 20,000 under-age user profiles a day.

Stephens J was also referred to comments apparently made by a US Senator to Mark Zuckerberg about the vulnerability of underage Facebook users.

Another element of HL’s case concerns Facebook’s use of an outsourcing company called oDesk, operating, for example, from Morocco, to moderate complaints about Facebook posts. She calls into question the adequacy of these oversight measures: ‘where then is the oversight body for these underpaid global police?’ (to quote from a Telegraph article referred to in the recent HL judgment). Facebook says that – given its number of users in multiple languages across the globe – effective policing is a tall order (an argument Stephens J summed up at paragraph 22 as ‘the needle in a haystack argument, there is just too much to monitor, the task of dealing with underage users is impossible’).

In short, HL says that Facebook seems to be aware of the scale and seriousness of the problem of underage use of its network and has not done enough to tackle that problem.

Again, the issue is one of wider import for online multinationals for whom personal data is stock-in-trade.

The same goes for the third important data protection issue surfacing in the HL litigation. This concerns jurisdiction, cross-border data controllers and section 5 of the Data Protection Act 1998. For example, is Facebook Ireland established in the UK by having an office, branch or agency, and does it process the personal data in Facebook posts in the context of that establishment?

These issues are all still to be decided. Stephens J’s recent judgment in HL was not about the substantive issues, but about HL’s applications for specific discovery and interrogatories. He granted those applications. In addition to details of HL’s Facebook account usage, he ordered the Facebook defendants to disclose agreements between them and Facebook (UK) Ltd and between them and oDesk (to whom some moderating processes were outsourced). He has also ordered the Facebook defendants to answer interrogatory questions about their procedures for preventing underage Facebook use.

In short, the HL litigation has – thus far – raised difficult data protection and privacy issues which are fundamental to Facebook’s business, and it has required Facebook to lay bare internal details of its safeguarding practices. The case is only just beginning. The substantive hearing, which is listed for next term, could be groundbreaking.

Robin Hopkins @hopkinsrobin

Journalism and data protection – new Strasbourg judgment

July 21st, 2015 by Anya Proops

There has been much debate of late as to how data privacy rights should be reconciled with journalistic freedoms under the data protection legislation. This is a difficult issue which surfaced domestically in the recent case of Steinmetz & Ors v Global Witness and is now being debated across Europe in the context of the controversial right to be forgotten regime. One of the many important questions which remains at large on this issue is: what degree of protection is to be afforded under the data protection legislation to those publication activities which might be said to be of low public interest value (i.e. they satisfy the curiosity of readers but do not per se contribute to public debate)?

It was precisely this question which the European Court of Human Rights was recently called upon to consider in the case of Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland (Application No. 931/13). In Satamedia, the Finnish Supreme Court had concluded that a magazine which published publicly available tax data could lawfully be prevented from publishing that data on the basis that this was required in order to protect the data privacy rights of the individuals whose tax data was in issue. The Finnish Court held that this constituted a fair balancing of the Article 10 rights of the publishers and the data privacy rights of affected individuals, particularly given that: (a) the freedom of expression derogation provided for under the Finnish data protection legislation had to be interpreted strictly and (b) the publication of the tax data was not itself required in the public interest, albeit that it may have satisfied the curiosity of readers. The owners of the magazine took the case to Strasbourg. They argued that the conclusions reached by the Finnish Court constituted an unjustified interference with their Article 10 rights. The Strasbourg Court disagreed. It concluded that the Finnish Court had taken into account relevant Strasbourg jurisprudence on the balancing of Article 10 and Article 8 rights (including Von Hannover v. Germany (no. 2) and Axel Springer AG v. Germany) and had arrived at a permissible result in terms of the balancing of the relevant interests (see para. 72).

There are three key points emerging from the judgment:

– first, it confirms the point made not least in the ICO’s recent guidance on data protection and the media, namely that there is no blanket protection for journalistic activities under the data protection legislation;

– second, it makes clear that, where there is a clash between data privacy rights and Article 10 rights, the courts will closely scrutinise the public interest value of the publication in issue (or lack thereof);

– third, it confirms that the lower the public interest value of the publication in question (as assessed by the court), the more likely it is that the rights of the data subject will be treated as preeminent.

Anya Proops


Right to be forgotten claim rejected by the administrative court

July 21st, 2015 by Anya Proops

So here’s the question: you’re an individual who wants to have certain links containing information about you de-indexed by Google; Google has refused to accede to your request and, upon complaint to the ICO, the Commissioner has decided that your complaint is unfounded and so he refuses to take enforcement action against Google under s. 40 DPA 1998; can you nonetheless secure the result you seek in terms of getting your data forgotten by mounting a judicial review challenge to the ICO’s decision? Well, if the recent decision of the Administrative Court in R (Khashaba) v Information Commissioner (CO/2399/2015) is anything to go by, it seems that you’ll be facing a rather mountainous uphill struggle.

In Khashaba, Mr Khashaba had complained to the Commissioner about Google’s refusal to de-index certain articles which apparently contained information revealing that Mr Khashaba had failed in his legal attempts to get his gun licences reinstated and had also failed to obtain placement on the Register of Medical Specialists in Ireland. The Commissioner concluded that Google had acted lawfully under the DPA 1998 in refusing to de-index the articles in question. Mr Khashaba was evidently unhappy with this result. Accordingly, he brought a judicial review claim against the Commissioner in which he contended in essence that the Commissioner had erred: (a) when he concluded, in exercise of his assessment powers under s. 42, that Google had acted lawfully in refusing to de-index the articles and (b) by failing to take enforcement action against Google under s. 40. By way of an order dated 17 July 2015, Hickinbottom J dismissed Mr Khashaba’s application for permission to judicially review the Commissioner’s decision. His reasoning was based on the Commissioner’s summary grounds, upon which the court felt itself unable to improve:

– first, permission was refused on the ground that Mr Khashaba had an alternative remedy because it was open to him to bring proceedings against Google directly in connection with its refusal of his application to be forgotten;

– second, the Commissioner had a wide discretion under s. 42 as to the manner in which he conducts his assessment and as to his conclusions on breach. He also had a wide discretion when it came to the issue of enforcement under s. 40. There was no basis for concluding that the way in which the Commissioner had exercised his powers in response to Mr Khashaba’s complaint was unreasonable or otherwise disproportionate.

All of which tends to suggest that: (a) the courts are likely to be very slow to impugn a decision of the Commissioner that particular information should not be forgotten; and (b) if you’re an applicant who wants your data to be forgotten, you may yet find that the regulatory route offers little by way of comfort in terms of securing the necessary amnesiac effect.

11KBW’s Christopher Knight represented the Commissioner.

Anya Proops


FOIA Under Review

July 17th, 2015 by Christopher Knight

An important rule of Government is to outsource anything difficult or potentially controversial to an independent body which can then deliver a report to be ignored or implemented as required or as the political mood dictates. The recent investigation into new runways at Heathrow was a good example, at least until it came up with an answer the Prime Minister didn’t entirely want to hear, and the Commission on a Bill of Rights was a superlative instance of a very learned study which achieved precisely nothing other than kicking a political football into the long grass.

Now it is the turn of the Freedom of Information Act 2000 to undergo scrutiny by the Independent Commission on Freedom of Information. Snappy title. It is chaired by Lord Burns (former senior civil servant at HM Treasury) and contains such luminaries as Jack Straw, Lord Michael Howard, Lord Carlile and Dame Patricia Hodgson (of Ofcom). Just in case anyone was suffering under the delusion that the Commission would be looking into widening the scope and application of FOIA, the terms of reference are set by the Cabinet Office as:

  • whether there is an appropriate public interest balance between transparency, accountability and the need for sensitive information to have robust protection
  • whether the operation of the Act adequately recognises the need for a ‘safe space’ for policy development and implementation and frank advice
  • the balance between the need to maintain public access to information, the burden of the Act on public authorities and whether change is needed to moderate that while maintaining public access to information

One would not, however, wish readers to think that the Government were anything less than fully committed to revealing information. On the contrary, the written statement laid by the Minister, Lord Bridges, opens by saying “We are committed to being the most transparent government in the world.” Well, quite. “We fully support the Freedom of Information Act [could there be a ‘but’ coming?] but [ah yes, there it is] after more than a decade in operation it is time that the process is reviewed, to make sure it’s working effectively.” The new Commission has a webpage here and is to report by November, which gives the grass limited time to lengthen… The Commission won’t, of course, be able to do anything about the EIRs.

Responsibility for FOIA has also been transferred to the Cabinet Office, which at least gives Michael Gove one less constitutional headache to deal with.

Christopher Knight

DRIPA 2014 declared unlawful

July 17th, 2015 by Robin Hopkins

In a judgment of the Divisional Court handed down this morning, Bean LJ and Collins J have declared section 1 of the Data Retention and Investigatory Powers Act 2014 (DRIPA) to be unlawful.

For the background to that legislation, see our posts on Digital Rights Ireland and then on the UK’s response, i.e. passing DRIPA in an attempt to preserve data retention powers.

That attempt has today suffered a serious setback via the successful challenges brought by the MPs David Davis and Tom Watson, as well as Messrs Brice and Lewis. The Divisional Court did, however, suspend the effect of its order until after 31 March 2016, so as to give Parliament time to consider how to put things right.

Analysis to follow in due course, but for now, here is the judgment: Davis Watson Judgment.

Robin Hopkins @hopkinsrobin

Secret ‘Practice Directions’ and Royal Wills

July 16th, 2015 by Christopher Knight

Mr Brown became a well-known figure in litigation circles when he sought to unseal the Will of Princess Margaret in the belief that it might reveal information showing him to be her illegitimate son. In the course of his unsuccessful litigation, it was revealed that there existed what had been described orally during the court proceedings as a “Practice Direction in respect of the handling of Royal Wills” (although there is dispute over precisely what form this document takes and whether it is really a Practice Direction at all), produced by the then President of the Family Division following liaison with the Royal Household.

Having failed to unseal the Will, Mr Brown requested a copy of the document from the Attorney General. He was refused, under section 37 FOIA. The First-tier Tribunal upheld that refusal (on which see Robin’s blog here). Mr Brown appealed to the Upper Tribunal on the grounds of inadequacy of the Tribunal’s reasons and a failure to properly apply the public interest test. He was refused permission, but then successfully judicially reviewed the Upper Tribunal for failure to grant him permission (on which, see my blog here).

Much happened subsequently. Having fought hard to prevent disclosure of the ‘Practice Direction’, the AG then released almost all of it to Mr Brown in advance of the substantive appeal hearing before the Upper Tribunal. The unreleased aspect was one paragraph, which was supplied to him in ‘gisted’ form. Nonetheless, Mr Brown sought disclosure of the outstanding paragraph. Perhaps not entirely surprisingly, Charles J in the Upper Tribunal has just refused to give him the final missing piece: Brown v ICO & Attorney General [2015] UKUT 393 (AAC).

The Upper Tribunal decision, in the light of the release by the AG, had rather less work to do than it might have done, and the judgment will be of correspondingly reduced wider interest. However, Charles J does roundly endorse the proposition that there is a very powerful public interest “against the creation of undisclosed principles and procedures to be applied by the court to an application to seal any will, and this is strengthened when participants in and the decision maker on that application (the court through initially or generally the President of the Family Division) and the normal guardian of the public interest (the Attorney General) have been involved in its creation on a confidential and undisclosed basis, and so in favour of the publication of the principles and procedure to be applied on any such application (particularly if initially or generally the application will be made in private)”. In other words, the AG was right to concede that the material should be disclosed. There was no further interest in the gisted paragraph also being revealed because the essential meaning had been conveyed.

Whether this brings Mr Brown’s campaign to an end is another matter, but whatever one might think of his view as to his parentage, his uncovering of a – to put it neutrally – highly unusual document agreed between the AG, the Royal Household and the President of the Family Division concerning court procedures is a worthy effort.

Robin Hopkins appeared for the ICO; Joanne Clement appeared for the Attorney General and Anya Proops appeared for Mr Brown at some of the earlier stages of proceedings.

Christopher Knight

Google and the ordinary person’s right to be forgotten

July 15th, 2015 by Robin Hopkins

The Guardian has reported today on data emerging from Google about how it has implemented the Google Spain ‘right to be forgotten’ principle over the past year or so: see this very interesting article by Julia Powles.

While the data is rough-and-ready, it appears to indicate that the vast majority of RTBF requests actioned by Google have concerned ‘ordinary people’. By that I mean people who are neither famous nor infamous, and who seek not to have high-public-interest stories erased from history, but to have low-public-interest personal information removed from the fingertips of anyone who cares to Google their name. Okay, that explanation is itself rough-and-ready, but you get the point: most RTBF requests come not from Max Mosley types, but from Mario Costeja González types (he being the man who brought the Google Spain complaint in the first place).

As Julia Powles points out, today’s rough-and-ready data is thus far the best we have to go on in terms of understanding how the RTBF is actually working in practice. There is very little transparency on this. Blame for that opaqueness cannot fairly be levelled only at Google and its ilk – though, as the Powles article argues, they may have a vested interest in maintaining that opaqueness. Opaqueness was inevitable following a judgment like Google Spain, and European regulators have, perhaps forgivably, not yet produced detailed guidance at a European level on how the public can expect such requests to be dealt with. In the UK, the ICO has given guidance (see here) and initiated a complaints process (see here).

Today’s data suggests to me that a further reason for this opaqueness is the ‘ordinary person’ factor: the Max Mosleys of the world tend to litigate (and then settle) when they are dissatisfied, but the ordinary person tends not to (Mr González being an exception). We remain largely in the dark about how this web-shaping issue works.

So: the ordinary person is most in need of transparent RTBF rules, but least equipped to fight for them.

How might that be resolved? Options seem to me to include some combination of (a) clear regulatory guidance, tested in the courts, (b) litigation by a Max Mosley-type figure which runs its course, (c) litigation by more Mr González figures (i.e. ordinary individuals), (d) litigation by groups of ordinary people (as in Vidal Hall, for example) – or perhaps (e) litigation by members of the media who object to their stories disappearing from Google searches.

The RTBF is still in its infancy. Google may be its own judge for now, but one imagines not for long.

Robin Hopkins @hopkinsrobin

Data Sharing between Public Bodies

July 10th, 2015 by Christopher Knight

The principal disadvantage, to the data protection lawyer, of the failure of Esperanto is that every now and then the CJEU hands down a judgment which is only available in French, and even Panopticon cannot blog every entry in Franglais. Such is the problem raised by the Opinion of the Advocate General (Cruz Villalón) in Case C-201/14 Bara v Presedintele Casei Nationala de Asigurari de Sanatate. Readers will have to forgive any failure to capture the nuances.

Bara is a reference from the Romanian courts and contains a number of questions, the majority of which concern the compatibility of national tax authority arrangements with Article 124 TFEU (which prohibits in most cases privileged access for public bodies to financial institutions). Those need not concern us, not least because the AG considered them to be inadmissible.

However, the fourth question referred was in the following terms: “May personal data be processed by authorities for which such data were not intended where such an operation gives rise, retroactively, to financial loss?” The context appears to be that people deriving their income from independent activities were called upon to pay their contributions to the National Fund for health insurance, following a tax notice issued by the Romanian health insurance fund. However, that tax notice was calculated on the basis of data on income provided by the National Tax Administration Agency under an internal administrative protocol. The complaint was that the transfer by the Tax Agency to the Health Insurance Fund of personal data, particularly relating to income, was in breach of Directive 95/46/EC because no consent had been provided for the transfer, the data subjects had not been informed of the transfer, and the transfer was not for the same purpose as that for which the data was originally supplied.

The Advocate General answered the fourth question by saying that the Directive precludes national legislation which allows a public institution of a Member State to process personal data that has been supplied by another public institution, including data relating to the income of the persons concerned, without the data subjects having first been informed of that transmission or processing. This was despite the fact that the AG recognised that the Romanian bodies had a legitimate interest in being able to properly tax self-employed persons; the informal protocol did not constitute a legislative measure setting out a relevant national exemption under Article 13. The AG stressed that the requirement of notification in Article 11 had not been complied with, and that the data subjects accordingly had been unable to object to the transfer. The data subjects had not given their unambiguous consent. Although Article 7(e) (necessary for the performance of a task) could apply to a transfer of income data, it had to be shown that it was strictly necessary for the realisation of the functions of the Health Insurance Fund. (This appears to be a higher test than the usual interpretation of ‘necessary’ as ‘reasonably necessary’, as per the Supreme Court in South Lanarkshire.) The AG did not consider that test met.

It remains, of course, to be seen whether the CJEU will take the same approach; but it seems fairly likely that Bara will produce a judgment which confirms the illegality of inter-institutional transfer of personal data without express consent or a carefully defined need which is prescribed by law. There is nothing ground-breaking in that conclusion, but it is an important reiteration of the need for data controllers anywhere in the EU to think carefully about the authorisation they have to hand over personal data to other bodies; informal agreements or policy documents are not sufficient without a legal underpinning (through the DPA) or consent of the data subject.

The forthcoming judgment in Case C-582/14, Breyer will also raise issues over consent in the context of IP information retained by websites, along with the vexed question of whether an IP address can constitute personal data when combined with other information available to a third party (issues similar to those raised in Vidal-Hall v Google, on which see here). When the final judgments in Bara and Breyer appear, so will the analysis of some intrepid blogger of this parish.

Christopher Knight

Do Young Thugs have Human Rights? The Supreme Court has a Riot

July 9th, 2015 by Christopher Knight

Following a period of considered reflection, or laziness depending on one’s view, it is worth noting the decision of the Supreme Court in In the matter of an application by JR38 for Judicial Review [2015] UKSC 42. The case is all about Article 8 ECHR, and is of particular interest because of the dispute about the breadth of the correct test for the engagement of Article 8. The context is also one which will be familiar to English data protection and privacy lawyers: the publication by the police of photographs seeking to identify a suspect. If anyone remembers that famous picture of a youth in a hoodie pointing his fingers like a gun behind an awkward-looking David Cameron, JR38 is basically that, but with Molotov cocktails and a sprinkling of sectarian hatred.

In JR38, the suspect in question was a 14-year-old child whose photograph was published by the PSNI as someone involved in rioting in an area of Derry in 2010 which had particular sectarian tensions. The judgment of Lord Kerr makes clear that JR38 has by no means been a well-behaved young man before or since the riots of 2010. Moreover, and amusingly, it is apparent that he and his father failed to correctly identify his own appearance in the pictures published, and originally sued on the basis of images which did not show JR38 at all. However, a correct image was eventually alighted upon.

The judgments contain a lengthy and detailed discussion of the domestic and Strasbourg case law on the engagement of Article 8, but there was a 3-2 split in the Court as to the correct approach. Lords Toulson and Clarke (with both of whom Lord Hodge agreed) considered that the overwhelming approach of the existing domestic law was to apply the touchstone of the reasonable/legitimate expectation of privacy test: see Lord Toulson at [87]-[88]. The test (originating, of course, in Campbell) focuses on “the sensibilities of a reasonable person in the position of the person who is the subject of the conduct complained about…If there could be no reasonable expectation of privacy, or legitimate expectation of protection, it is hard to see how there could nevertheless be a lack of respect for their article 8 rights”. The warning in Campbell not to bleed justification matters into the engagement analysis was stressed.

The difference between the majority and the minority of Lord Kerr (with whom Lord Wilson agreed) was explained by Lord Clarke at [105]. Does the reasonable expectation of privacy test provide the only touchstone? The majority thought that it did, it being the only test set out clearly in the cases, and it being a broad objective concept to be applied in all the circumstances of the case and having regard to the underlying values involved, unconcerned with the subjective expectation of the individual, be they child or adult (see at [98] per Lord Toulson and [109] per Lord Clarke).

In essence, the majority did not consider this context to be one which Article 8 was designed to protect. The identification of a suspect was not within the scope of personal autonomy, although publication of the same picture for a different purpose, other than identification, might be: at [98] (and at [112] where Lord Clarke did not consider the mere fact of criminal activity took the matter outside Article 8). Historic or re-published photos may alter the situation: at [101].

By contrast, Lord Kerr took a broader view, holding that the reasonable expectation of privacy test might be the ‘rule of thumb’, but could not be an inflexible, wholly determinative test. The scope of Article 8 was much broader and was contextual, requiring consideration of factors such as: age, consent, stigmatisation, the context of the photographed activity and the use of the image. The reasonable expectation of privacy test failed, in his view, to allow for these factors to be considered: at [56]. Rather than shoehorning such factors into the test, they should bear on the issue on a free-standing footing: at [61]. The focus must be on the publication – i.e. the infringement – rather than the activity the photo displays. For Lord Kerr, the fact that JR38 was a child, taken with the potential effect publication might have on the life of the child, was more than sufficient to engage Article 8 (in the way that it might not for an adult): at [65]-[66].

The debate is an interesting one, but the flexibility of the majority’s orthodox approach is likely to mean very little difference in substance between the two. It will, however, be worth emphasising the importance of context, particularly in child cases under Article 8.

The Court was, however, unanimous in agreeing that publication was justified in any event; rioters had to be identified (and other methods had been tried internally first), with the peril in which inter-community harmony was placed being particularly important in the fair balance.

Where, readers of this blog might ask, was the DPA in all this namby-pamby human rights discussion? Why is there no mention of schedules and data protection principles and all the other black letter statutory stuff that so gets the blood pumping? Well, it was mentioned, at [70], by Lord Kerr, who considered that compliance with the DPA would mean that the limb of proportionality which requires the act to be in accordance with the law would be met. In very brief reasoning, Lord Kerr concluded that this type of case was within section 29 because publication was processing for the purposes of prevention and detection of crime, and that the relevant condition met in both Schedules 2 and 3 (because he agreed it was clearly sensitive personal data) was that of the processing being necessary for the administration of justice. Unfortunately, there was no analysis of the way in which it was necessary for the administration of justice, or the extent to which this is the same as the prevention and detection of crime. Nor is it quite the same reasoning as adopted by Lord Woolf CJ in the well-known ‘naming and shaming’ case of R (Ellis) v Chief Constable of Essex Police [2003] EWHC 1321 (Admin), which, at [29], appeared to apply the conditions in Schedules 2 and 3 whereby processing was necessary for the performance of functions by or under any enactment (without further specification). Where the Supreme Court speaks, we follow, but it might have been helpful to detail this aspect a little more, although it is another example of a case in which Article 8 is presumed to do all of the work and the DPA is raced through in a paragraph to avoid having to think about it too much. That Article 8 and the DPA have been held to pull in the same direction is, however, a relief to us all.

Christopher Knight