Relativity Fest always generates a mass of material, from company and product announcements to sessions on law and legal practice. I can’t write about it all, and you wouldn’t thank me if I did. If I focus now on the International Panel, that is not just because I moderated it, but because the interaction between the panel members seems worth capturing before the video of the session is taken down on 5 November.
I wrote a bit about the subject in advance in an article called Relativity Fest 2021 – the pervasive effect of privacy and data protection. My theme was (as that word “pervasive” implies) that privacy is no longer a side-issue for narrow specialists but a factor which touches everything we do, from corporate data management to disputes and regulation, to employment law, to personal life. We still need the specialists, of course, and we like to gather some of them on our Relativity Fest panel every year. In introducing the panel, David Horrigan made the point that Relativity itself needs to know what is going on as well as helping to keep its users up to date.
The speakers were Jonathan Armstrong of Cordery in London, Meribeth Banaschik of EY in Germany, Karyn Harty of McCann FitzGerald in Dublin, and Steven Klimt of Clayton Utz in Sydney. We had no preset agenda beyond the broad “International” subject and a session description which emphasised the significance of privacy everywhere. My usual practice is to ask panel members to say what matters to them and then to ask them about it, which, among other advantages, gives us a wide range of topics.
I began with some quick points from the UK. We seem keen to launch our own equivalent of the GDPR for the rather stupid reason that the government wants to disown anything with an EU origin, and because its less thoughtful supporters see only restrictions in privacy where others see opportunities. We have had a run of ministers conveniently losing or breaking their private phones after conducting state business on them with their friends and donors (in each case with pitifully unbelievable stories of losing WhatsApp data). The Disclosure Pilot, which covers the courts handling larger cases, is to be revised again, in part to reduce the burden of disclosure on cases which do not warrant the mechanisms required by the pilot – the ability to modify the rules is a good reason for adopting a pilot.
I asked each speaker to tell us what was interesting in their professional world.
Meribeth Banaschik mentioned ephemeral messaging (the subject of a recent Sedona Conference commentary), and the implications of vaccination and getting back to work, referring to the “wonderful and complex privacy issues” of consent and data transfers. Top of Meribeth’s list, however, was the pending Artificial Intelligence Regulation, part of wider European proposals for data. The question “Who do you trust?” arises here, as with news reporting and medicine. There is plenty of room to argue about what constitutes AI, a debate which assumes yet more significance when we come to wonder what will actually fall under the new regulation.
Karyn Harty mentioned deepfakes as a cause of growing concern for her clients. GDPR fines are increasing and the Irish Data Protection Commissioner has just imposed a very heavy fine on WhatsApp. This reminded me of a panel in Chicago some years ago where Karyn and I expressed different views on the likelihood of large fines being imposed. I thought not, and I used our 2021 panel to concede that Karyn had been right to predict that fines could be very heavy.
For Ireland itself, Karyn mentioned current proposals for comprehensive reforms of Irish court procedures, including discovery procedures.
Karyn Harty also mentioned an Irish criminal case, that of Graham Dwyer. Dwyer would never have been suspected of murder but for the geolocation data on two phones recovered from a reservoir, and he is now challenging his conviction on the basis that the mandatory retention of such data was disproportionate.
Meribeth Banaschik recalled a panel session of years ago at which I had talked of the discovery significance of geolocation data, not least in photographs. I remember that session mainly for the nervous way in which everyone got out their phones to check what their phones were recording about their every move.
Steven Klimt said he had been working on the interaction between Australian privacy and court processes offshore – the US courts, for example, are not renowned for their respect for the processes of foreign jurisdictions. Return to work after three months of lockdown in Sydney had raised several issues where employment law and privacy meet, not least where mandatory vaccination (and proof of that) collided with privacy. More generally, Steven Klimt said, Australian businesses had coped quite well with the new constraints of Covid.
Turning to our main topic, Steven Klimt spoke about the review of the Australian Privacy Act; the next round of proposals is coming soon. The plans may take Australia closer to the GDPR, with a definition of personal information which includes such things as IP addresses, more prescriptive rules about consent, a possible right to erasure, and a fair and reasonable standard as to when personal information may be collected and used. These things, plus the removal of some exemptions, may enable recognition by other jurisdictions and easier transfer of data to Australia.
Jonathan Armstrong mentioned what he called the “smoke and mirrors of AI”, making the same point as had been made earlier about what is and what is not AI. A big theme had been the “hokey cokey of back-to-office” (that’s “hokey pokey” if you speak American). Organisations may have done a data privacy impact assessment when staff moved out of the office, but overlooked the possibility that a new DPIA may be needed on their return – the new normal may involve hybrid working, for example, which raises implications beyond the simple in-or-out approach.
GDPR fines were becoming substantial, with €1.2 billion in fines levied already. In some cases, fines were only the start, with further costs or losses raised by consequential recommendations or decisions – Jonathan Armstrong mentioned Facebook’s withdrawal from the lucrative online dating market as one example of a serious business decision being made in reaction to a regulator’s decision.
There has been a rise in “aggressive” subject access requests, either ancillary to litigation or employment claims, or by chancers who threaten claims unless a sum is paid to them.
Jonathan Armstrong talked of a crossover between AI and the GDPR in Spain and Italy, where online food delivery companies have been hit (in one case with a dawn raid) for automated decision-making which was unfair, lacked transparency, and was not disclosed to the delivery riders and drivers. They were micro-tracked – this allows the companies to predict arrival times, but the data (and too many data points) were used also to monitor staff, sometimes marking them down for things they could not help. The regulator went beyond fines, giving a to-do list of corrective measures. Again, future compliance was at least as burdensome as the fines.
With their main subjects covered, the panel turned to other things of importance.
Jonathan Armstrong spoke about the Luxembourg Data Protection Commission’s activity against Amazon, mainly over transparency as to the uses to be made of data. Transparency is a theme turning up elsewhere – the obligation to tell people about the uses of data may change e.g. from investigation to regulatory investigation to litigation (perhaps abroad), and each new phase may trigger renewed reporting obligations.
I invited Meribeth Banaschik to tell us about EY’s recent paper, which covered (among other things) privacy issues on going back to work, and the use of data entered into apps, e.g. for restaurant bookings – what happens to that data and what is it being used for beyond the actual booking? Most employers, she said, are trying to do the right thing, but it is legitimate to ask what will happen to data given in compliance with new circumstances. The law relating to records of vaccination status is different in Ireland and in Australia, but in both we find employment law and privacy intersecting.
Steven Klimt spoke about Facebook and the extraterritorial reach of Australian privacy law, and Karyn Harty talked of increasing active enforcement against Facebook and other tech companies in Ireland. Ireland, she said, is attractive to tech companies, not least because the Irish courts have a reputation for managing their cases with relative certainty. There is a committee (of which Karyn Harty is a member) whose role is to promote “Ireland for Law” and to capitalise on the Brexit car crash.
Meribeth Banaschik talked about the rise in Data Subject Access Requests, which she described as “mini eDiscovery projects” which must be done in a very short time.
The only approach to this, we agreed, is for organisations to work out how they will answer prospective DSARs which they don’t yet know about – an information governance point.
Businesses are brushing up their processes, increasingly using technology to manage DSARs which have a wide range of sources – prospective litigants using them for pre-discovery, and as preliminaries to defamation claims, or people simply wanting to find out what an organisation knows about them.
I asked each panel member to give their final short thoughts.
Steven Klimt said that Australia will learn from what is happening in the EU.
Jonathan Armstrong said that Max Schrems had not gone away, and that data transfers would remain a huge issue.
Meribeth Banaschik reminded us that Covid had created an immense amount of work, much of it in new ground, and praised the teams who have delivered all this work during Covid.
Karyn Harty urged people to think about privacy by design.
I really enjoyed doing this panel. My thanks to the panel members, and to Relativity and David Horrigan for once again giving me the opportunity to moderate it.