The flood of GDPR fines anticipated by the media arguably never materialised, but four years on from the GDPR's effective date, enforcement is ramping up. From Luxembourg's €746 million fine against Amazon to Ireland's €225 million WhatsApp action, 2021 and 2022 have been bumper years for GDPR penalties. In this session we will consider what we can learn from recent GDPR enforcement trends.
Hello, thanks for joining us back here at PrivSec Focus, GDPR Four Years On. We're having a great day so far, with some really great interaction from the audience, so please do keep that up. I'll go straight on to our next session now: GDPR Four Years On: Reviewing the Most Significant Enforcement Decisions. The GDPR was very strongly hyped as a regulation with a very powerful set of sanctions and fines available to Data Protection Authorities. Our panel is going to look at which of those have really made an impact. Our host for this session is Peter Molduano, who is Information Management & Governance Lead at Leeds City Council. Over to you, Peter.
Thank you for the introduction, Robert, and welcome to everyone on this call. Preparing for this, I had a bit of a reflection: my God, what was I doing exactly four years ago? And I thought, well, where did I wish I was? And, more importantly, who do I wish I had been able to talk to? I tell you, I definitely would've benefited from the guidance and the insights of the four lawyers we have on this call, so they could take us through the 99 articles of the GDPR and give us an idea of what the most important points are. I wish I could have travelled four years forward in time, perhaps, to look at some of the precedents that have since occurred, so that you could figure out what is meaningful and what is relevant based on what is actually being issued as penalties, as warnings, and so on.
And luckily, on this call we have the insight of some great legal minds. They're going to pull back the curtains, go beyond the headlines, and give us some really useful insight. At this point, I'm going to ask you, everyone on the call, to put on the hat of devil's advocate and feel free to ask questions after our legal minds go through their perspectives and their presentations, so that you can take from this what I hope to take from it: how to adjust the compliance posture and the data protection approach in the work you are currently doing.
So, as impressed as I am, I'd also like to be critical and learn as much from this as possible, because it's not often you get the chance to hear brilliant insight from legal minds as excellent as those on this presentation. We'll go through Mr. Rhodes, then Ms. Rzaca, then Ms. Kagan, and we'll finish with Professor M, and I'm just going to leave him as Professor M for this show. Mr. Rhodes, over to you.
Steven Marc Rhodes:
Thank you very much for that, Peter. I hope you can all hear me. I'm concentrating in this session on the Google and Facebook fines issued by CNIL in France. Now, I'm based in the UK, so why should I be concerned about French regulators? Why should you be concerned about French regulators? That's what I'm going to look at today. I'm going to look at the facts of the cases, examine some of the legal issues, look at the special status of the ePrivacy rules, or the PECR regs as they used to be known, and last, find out what it means for you as a data controller: how it's going to affect you, the sort of things you should be putting in your agreements, and the risks you should be closing off.
So, the actual facts of the case were pretty unremarkable, I think we can probably say. Everybody knows that cookies are dropped whenever you visit a website, and everybody is familiar with the cookie notice, which should let you take control of those cookies. The finding with both Google and Facebook, or Meta as they are now, was essentially that this wasn't being followed. Non-essential cookies, marketing cookies, were being deposited automatically, without the active consent of the data subject, which the GDPR requires. Cookie banners were ineffective: it didn't really matter what you clicked on, you were still going to get a whole load of cookies dropped on you. And the cookie declines, when you activated them, didn't decline anything; you pulled the lever and nothing happened. It was a pretty obvious breach, and the fine was levied against them by the French data regulator.
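To make those mechanics concrete, here is a toy sketch, not any real platform's code, of what compliant behaviour looks like: strictly necessary cookies may be set without consent, but non-essential cookies require an explicit opt-in, and an ignored banner must count as a decline. The cookie names and consent keys are hypothetical.

```python
def cookies_to_set(consent: dict[str, bool]) -> list[str]:
    """Return the cookies a compliant site may drop for one visitor.

    `consent` maps a purpose (e.g. "marketing") to the visitor's recorded
    choice. All names here are illustrative, not any vendor's real API.
    """
    # Strictly necessary cookies need no consent under the ePrivacy rules.
    cookies = ["session_id"]
    # Non-essential cookies require an explicit, affirmative opt-in.
    # An absent key (banner ignored) is treated the same as a decline.
    if consent.get("marketing") is True:
        cookies.append("ad_tracker")
    if consent.get("analytics") is True:
        cookies.append("stats_visitor")
    return cookies
```

The breach CNIL found corresponds to a site that appends the marketing cookies regardless of, or despite, the visitor's choice.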
Well, Google and Facebook appealed. They said, "We're governed by the Irish data regulator. The one-stop-shop mechanism under the GDPR means the Irish regulator should deal with this, and this is not a matter for CNIL to be involved with." The response sits alongside another case decided last year, one that Facebook brought against the Belgian Data Protection Authority. I'll deal with that briefly, because we're looking at this issue of cross-border enforcement. That case reached the European Court of Justice, and the court stepped back a little from the very hard reading, that there is only the one-stop shop, that Facebook was putting forward in its own case. It said that the lead supervisory authority is still the boss, but there are some exceptions.
There is Article 56(2), covering matters purely affecting an authority's own territory, but they can really only rely on that where they have had a discussion with the lead supervisory authority and the lead supervisory authority has failed to act. So if we were looking at the Facebook case, for instance, and applying the rules from the Belgian case, it would have been CNIL's responsibility to contact the Irish authority, establish with them that there was a matter affecting only French citizens, reach an understanding of the issues between those bodies, and then find out that the Irish regulator was not going to act. Only at that stage would they be able to go forward.
The European Court of Justice, in that case, also said there is an urgency procedure, and it dealt with a couple of other issues of mild concern, I suppose: when information requests have gone in to a regulator and the regulator has not complied with them, or when the matter has been referred to the European Data Protection Board. So the one-stop shop seemed pretty robust if you go by the European Court of Justice. Why, then, is that broad decision in any way compromised by the French decision? I'd describe CNIL's action here as: think global, act local.
So, in their statement in January 2022, when these fines were confirmed, there are a couple of interesting remarks. The first is the CNIL talking about its own enforcement practice. It says, "These two decisions are part of the global compliance strategy initiated by CNIL over the past two years with French and foreign actors publishing websites with many visits and having practices contrary to the legislation on cookies." So they're saying, "We are looking outside of France, we're looking internationally, we're cooperating with other bodies, but this affects French people." The CNIL's approach is: yes, the GDPR governs data generally, it governs the distribution of regulation throughout Europe, and there is a one-stop-shop mechanism. But the PECR, the then-regulations implementing ePrivacy, act specially and they act particularly in France. This is a breach of French law affecting French citizens. Vive la France.
So, alongside this idea that this is a local law for local people, it's also worth looking at the French judicial authorities. The Conseil d'État, the final body in France making these decisions, issued a statement saying that it judged the exclusion of cookies from the one-stop-shop mechanism to be sufficiently clear that it did not need to refer the matter to the Court of Justice of the European Union for a preliminary ruling, as the companies had requested. So the French courts have said, "This is French law and there's no right of appeal." Now, this is a brave decision by the CNIL, because litigation carries risk for them: political risk and cost risk. When the CNIL decides to take this step, it is not taking it lightly; it is taking on financial risks which it cannot cover with a profit margin, for instance. So it's quite an ambitious line of attack by CNIL. Why should it bother us? Because it concerns ePrivacy.
And for this last section, I want to concentrate on the Internet of Things and why it matters. The ePrivacy rules cover the Internet of Things, and that could be a whole host of devices you have in your office: certainly laptops and mobiles, but also handheld devices for building maintenance, or digital pricing guns. The one I want to concentrate on for this last section is photocopiers, because the latest photocopiers can gather a considerable amount of information, including very sensitive information, and they can talk to other devices in your office unless your cloud settings are extremely restrictive. That means cookies, or the equivalent of cookies, scripts on other devices, can potentially gather the data of data subjects without their giving any form of real consent, which would therefore be in breach, in France in particular, of the French privacy legislation they have enacted themselves. But following the French decision, that could apply in any country which has implemented domestic legislation dealing with these issues alongside the GDPR.
What should you do as a lawyer to deal with this? I think you need to pay very strict attention to the contract wording you get when you are purchasing new technology. If you were to look, for instance, at those photocopiers, you would want to ensure that the cloud services were fully controlled end to end, and that you had certification that the data-gathering capabilities of your devices were on restrictive settings. And you would want to make sure that your contract language attached those settings to the contract and made them part of it.
I recently negotiated a contract for the purchase of new photocopiers for our offices. The supplier was extremely resistant to any form of specific wording when it came to technical specifications, and very resistant to adding them to the contract. My advice to you is to hold a firm line on this and ensure that these details are tied down, because as we've seen, the ePrivacy rules, following the French decisions, may still bite even though you may be perfectly happy that your regulator is based in Ireland or somewhere else and not in a position to enforce. We're now dealing with direct-effect regulations, and they can affect your business and considerably increase your digital risk.
Thank you. That was really quite comprehensive as well as insightful. As a programme manager, I listened to you and thought, wait, hold on, do I need to capture this in a separate data protection impact assessment? Obviously, any time I approach a prospective supplier I need to take this into consideration, but from the whistle-stop tour you gave us, are there a top three points I could take to my chain of command and say, this is what we need to watch out for on procurement?
Steven Marc Rhodes:
I would say, bear in mind that procurement happens without your knowing it, so you will often be offered software as a service or platform as a service.
And you'll be asked to do a DPIA on that; everybody understands that that's how it runs, and that information and the request come from IS. But in fact, it's your facilities managers who are importing Internet of Things products into your workplace, not the IS department. In many cases, they will be importing a service from a third-party supplier that brings its own technology into your workplace. So it's not obvious, and you won't necessarily have assessed the risk with a DPIA, because you're thinking, well, we're not the controller of this, we're not purchasing these products, we're not using them.
But data can nonetheless be extracted, and unless the settings and the cloud service protocols are very strictly applied, and you can enforce them, and they're not going to change, then you run that risk. As I said, the suppliers of these products are not interested in strict controls over the settings on their devices, because they have their own commercial freedom and want to exploit the capabilities of these devices as much as possible for their own purposes. This is not a case of people deliberately setting out to cause you trouble; it is simply the natural desire of people to commercially exploit the technology they have developed. But unless it's restricted, you can fall foul of regulations not just in your own territory but in offices across Europe.
Right, that's very useful. That is definitely a takeaway for me for where I'm currently working as well. The floor is open to anyone on our panel for questions, because I'm sure they could take this in different directions from here. No? It seems we're going to be silent on this point, so clearly you've answered everything as necessary. Thank you very much, Mr. Rhodes, and over to Ms. Rzaca, who's going to give us a slightly different perspective from a slightly different area, but I'm sure one just as valuable and insightful. The floor is yours.
Thank you very much, Peter. I would like to focus on one of the biggest GDPR fines, imposed by the Irish Data Protection Authority in August 2021: more than €225 million imposed on WhatsApp, basically for not providing clear information about why and how data are processed by WhatsApp. I think it's important for many different reasons. Personally, I had been waiting a long time for the Irish Data Protection Authority to issue a fine. It happened in December 2020, but it wasn't a very significant amount, only €450,000 imposed on Twitter for not notifying a data breach in time. When it comes to WhatsApp, it's a really huge amount of money. I think that, for many companies, the ultimate reason to take care of privacy was the risk of being fined. And by analyzing the different Data Protection Authorities in different countries, we can clearly see that there is a very different approach in each country. So that is definitely one of the biggest and most important fines in terms of amount.
The other one I wanted to mention is the fine imposed by the Hamburg Data Protection Authority, more than €35 million, basically for processing too much data on employees. What managers did in specific H&M shops in Hamburg was collect a lot of different information about employees that is definitely not needed, for example religious beliefs, ethnic origin, et cetera. That is a sensitive category of data, and moreover, it's not really necessary for the employment relationship. So that's another important decision, because it highlights that companies mostly focus on external privacy, how we inform clients and customers about how we process their data, but the internal aspect of privacy compliance is also important.
And just recently, last week, there was another fine imposed on Google by the Agencia Española de Protección de Datos in Spain. The reason for imposing this fine was Google sharing data with third parties and not making it possible to exercise the right to be forgotten. That brings me to the point that the European Data Protection Board guidelines issued on the 12th of May, on harmonizing how fines are calculated, are very much needed. It also brings me to another point: at the European level there is a proposal for a regulation on artificial intelligence, and the limit there is even higher, because the AI regulation proposes fines of up to €30 million or even 6% of global turnover. This AI regulation also proposes the creation of AI authorities, something in line with the Data Protection Authorities in each country, so that will also be very interesting to see.
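The ceilings mentioned here follow the same "whichever is higher" formula: the GDPR's top tier, under Article 83(5), is €20 million or 4% of global annual turnover, and the proposed AI regulation raises that to €30 million or 6%. A minimal sketch of the arithmetic, with an illustrative turnover figure:

```python
def max_fine_eur(flat_cap_eur: float, turnover_fraction: float,
                 global_turnover_eur: float) -> float:
    """Upper bound on a fine: the flat cap or the turnover share, whichever is higher."""
    return max(flat_cap_eur, turnover_fraction * global_turnover_eur)

# For a hypothetical company with EUR 2 billion in global annual turnover:
gdpr_cap = max_fine_eur(20_000_000, 0.04, 2_000_000_000)    # EUR 80 million
ai_act_cap = max_fine_eur(30_000_000, 0.06, 2_000_000_000)  # EUR 120 million
```

For small companies the flat cap dominates; for large ones the percentage does, which is why headline fines scale with the size of the defendant.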
Great, thanks. That's really insightful and a great summary of everything coming through on this. So, with my devil's advocate hat on, I'm just going to say: so what? The Irish Data Protection Authority hasn't been exactly efficient at getting fines out there, and they have only so many people on staff, so they cannot penalize everybody, especially if they put their resources after the big players. So what would you say to me, if I'm willing to bet that my company, my organization, is not going to be penalized, and as a consequence I'm going to take a higher risk profile for my organization? Does it still matter that there are these big penalties, on which the Data Protection Authorities essentially focus all their resources?
Thank you very much, Peter. I think, in the first place, learning from someone else's lesson is a very cheap kind of learning. And I would really recommend that any organization, even if they have a privacy notice in place, revise it regularly, and also give it a sanity check, because those notices are mostly written either by privacy professionals or by lawyers. Just give it to anyone else in the organization and ask, "Do you understand this? What can you understand from this?" And of course, regarding the Irish Data Protection Authority, there is a lot of criticism that it's understaffed and doesn't have a proper budget to really deal with all the investigations. The fine on WhatsApp was the result of a three-year investigation, and still there is criticism that the investigation focused on very limited aspects and there is much more to be discovered.
But for global companies with a really high total turnover, I would seriously consider whether you really want to accept this kind of risk, because you can invest that kind of money in training or in a revision of your privacy program, and I think you will get much more value, because when we are discussing privacy, on the other end we are discussing reputation and trust, which is essential for any company. It's also important, with the decisions I mentioned, that WhatsApp claimed they had a privacy program, they had a privacy notice, but the fine that was imposed just highlights that privacy is a journey, not a destination. So constant revision is the way to go.
Right, thanks. I really appreciate that. We've had a question from one of our viewers, and since you're tracking the biggest penalties out there, the question was: what was the biggest fine applied to a US organization? WhatsApp comes to mind, and I think there was something from Facebook and from Google as well, but I wasn't explicitly tracking them, so over to you.
To you, me?
That could be you, Ms. Kagan, yes.
So, what I wanted to say about this question: I think you're right about the numbers, but the interesting thing is that the decisions you're quoting have been with respect to local activity, local arms of the US providers, whereas pure cross-border enforcement is something we haven't really seen much of. There is a Dutch decision on a Canadian company, about €500,000, for failure to appoint an Article 27 representative. There is a decision in Italy in connection with facial recognition, which was a pretty high fine. And I know there are pending decisions by Luxembourg; I know that because I've read that NOYB is in contact with Luxembourg, and Luxembourg said, "What do you want from us? We can't enforce against them, they are not in Europe."
So this question of cross-border enforcement is obviously one that comes up a lot with US-based organizations, with the sort of statement being, "Who's coming after me? I'm here, I don't have any presence in the EU. What do you want from me?" And the answer to that, I think, is twofold. Number one, and we've been saying this to clients for four years, happy birthday: okay, let's assume the regulator's not coming after you, right? They're not, okay? You're in the US, they're not coming after you. But if you're in the US, a lot of the time you are B2B, and you are not going to get the business, you are not going to get the clients. We see that a lot in the data transfer cases, right? I'm going to mention them a little in my talk, the Google Analytics cases and so on.
So the issue is, your B2B clients are going to be wary of doing business with you. And that business aspect, I think, is always more immediate than the potential, hypothetical regulatory enforcement. Then we'll see how those cases play out; I actually don't know, and maybe Marco knows what the Garante is planning in connection with cross-border enforcement, because we've seen a few judgments come out, but I don't know where they ended up on actual enforcement, so that would be interesting to follow.
Magda, do you want to respond? Go for it.
So, regarding fines on US-based organizations, I think most of them also have a presence in Europe, and for most of them their European headquarters is in Ireland. It's interesting that, of all the Data Protection Authorities, the Irish one is the most criticized, the most understaffed, and the most budget-constrained, so I would expect that to change soon, as the Irish Data Protection Authority is becoming more and more active. But also, looking at the Dutch Data Protection Authority, for example, there are no such high fines. The highest fine imposed so far by the Dutch Data Protection Authority was €3.7 million, and it was on the Dutch tax authority, for creating a blacklist of fraudsters, which is also interesting.
And that brings me to another point: whenever you are processing personal data, first you should ask yourself, do I really need to process this personal data? And second, if I do need to process it, what is the legal basis? And double-check that you actually have a legal basis.
Good. Thank you, really appreciate that. Before we move over to Ms. Kagan and her presentation, Steve actually volunteered a response to a question about Brave and a complaint it launched with the European Commission against 27 EU member states. I'm not familiar with this, so show us the light.
Steven Marc Rhodes:
I think it's very difficult to fix on a single figure and say that's what a good regulator needs. If there is any basis in this case in some sort of best practice, I'd be interested to see it. What I would say is that you can always pour money into a regulator and it will just find more things to do. Regulators have to be smart as well as properly resourced, and they should be properly resourced; I would need to see the details. What I would like to see, though, is much more specialization within individual regulators in Europe, so that they can form a centre of expertise in Italy for one matter, a centre of expertise in Belgium for another, yet another in Austria, and concentrate resources in terms of research, sponsoring local academic and research input into their specialist area. Rather than expecting every data regulator to concentrate on every area of operation, that would be a better use of the funds. It's always difficult to know exactly how much money, therefore, a regulator needs.
But what I would say, as I pointed out, is that the CNIL took a risk in taking matters to the Conseil d'État, the French Council of State. One very good way of ensuring effective regulation is to indemnify regulators for legal costs. You don't necessarily have to give them upfront funding, but if they feel they have a case which they need to fight and win in order to make regulation effective, giving the regulator a line of credit, if you like, to take that particular legal case forward is a good way to go. And again, to establish that, you can have different specializations with different regulators in different countries. So it's not simply a question of funds; you have to be smart with them too.
I didn't realize that public authorities, and especially Data Protection Authorities, could be indemnified. Thank you, that's quite insightful, I appreciate it. For me, that's almost a signal: if there's a case and the authority is publicly indemnified, it is a major priority, and anybody within this sphere should be watching that particular domain. So, I guess, over to Ms. Kagan, the floor is yours.
Thank you. And thank you everybody for joining us for this birthday bash. I will try to make it a little more festive, and what's more festive than talking about fines? Obviously. So my take here is, we're talking about the most significant enforcement decisions, and my takeaway is that GDPR enforcement decisions have significance even without a large euro number attached to them. There are two points I want to make. Number one, even the decisions without a high-ticket euro number are effective; there are other tools in the enforcement toolbelt that work, and I will demonstrate why. And the other is that one of GDPR enforcement's effects is a kind of ripple effect, where it serves as a benchmark for other data protection compliance and for other laws specifically. Since I'm already downgrading everybody's accents here with my American accent, I will talk about how it affects the US.
So, the first point is these non-high-fine decisions and why they're effective. One classic example is the cookie sweeps we've seen. We've seen CNIL do a number of cookie sweeps, and we've seen the Hellenic DPA just issue a report on the cookie sweep it did of 30 companies. CNIL's was larger; they're doing it in tranches, but I think it's at 90 companies at this point. NOYB, None of Your Business, the Max Schrems NGO, is also doing a cookie sweep. And in all three of those cases, the reports said that the majority of companies came into compliance without any need to initiate further proceedings. I think the NOYB and CNIL reports were at 80%, with further proceedings for the remaining 20%. And the Hellenic DPA said 29 out of 30 companies came into compliance without the need for further steps.
And I think that is interesting, because there is some sort of maybe unintended consequence where companies that can tolerate fines are less stressed about coming into compliance, whereas companies that are really trying carry a disproportionate burden. So this situation where, hey, if you're really trying and you're not a bad guy and you made a mistake and we pointed it out and you went and fixed it, that's a possibility, I think that's really important, very significant. I mean, if we're just doing second-grade math, which, as lawyers, we cannot do: 90 plus 30 plus whatever, that's, I don't know, 150. And if out of those, 130 came into compliance, that's a big effect, right? So I think that's one thing to keep in mind.
The second thing is that decisions can matter even without a huge fine. There was a decision out of IMY in Sweden and a decision by Datatilsynet in Norway with respect to privacy notices. In a lot of cases, we're all hunting down EDPB guidance and trying to get things to inform our practice, but some of these decisions do the same job. I'm a big fan of Datatilsynet in Norway; they write good decisions, and they are very specific. The decision on MOI, I'm probably pronouncing it wrong, was very specific as to how to draft, or how not to draft, a privacy notice; same with the Swedish decision on Klarna.
And that clarified a lot of things that, frankly, were not 100% clear to me, things I was making judgment calls on when advising clients, and it also gives me firepower with clients. Because in the MOI decision, one of the items was: you can't do a list of purposes and a list of legal bases and hope for the best that people can figure out the connection. No, you've got to connect them yourself. I've been telling clients this all the time, and there was nothing to point to; I'd say, "Well, the Article 29 Working Party said this six years ago," and clients were like, "Yeah, okay, but that's optional." So now, I think those decisions are helpful in that way.
And the other point is that there are other remedies being used. For example, we've seen this in the Google Analytics cases, from the DSB in Austria, the EDPS, and CNIL: those cases did not have a large fine attached to them, they had an injunction attached to them. They said, "Hey, fix your data processing within X time or you are in breach." That is not a high fine, but it is super effective, because companies need the data more than they mind the fine.
And then, finally, there are decisions requiring deletion. The FTC has done this, requiring deletion of algorithms, and I hadn't seen it in Europe until the decision out of the UK in connection with facial recognition just now, I think two days ago, which was: delete the data, the ill-gained data. That remedy, deleting data that you have used, deleting data that you have intertwined into your training algorithms, is much more significant than a fine, because it is severely difficult, if not impossible, to do, and now you need to recreate the data.
So my first point is, there are things other than a large euro-ticket fine, and they have consequences. And the second point is that the GDPR is not only an EU thing, besides the fact that companies do cross-border business, obviously, but even without it. In the US, we now have the new US privacy laws, of which there are now five, and some of them literally copy-paste language from the GDPR. There is a copy-paste of Article 7 on consent, and there are other pieces where the language is literal: automated processing, profiling. There's a lot of GDPR language, and controller and processor: in some of the laws we actually use those terms. So we have GDPR concepts and standards being used, and in the preamble, in the explanatory notes to the CPRA, the new California law coming into effect in January, there are all sorts of explanatory notes saying, "This is how they do it under the GDPR, and this has been the GDPR experience." So that's important.
And then the CPPA, the new California supervisory authority, in the informational sessions it held in preparation for the regulations that will come out, in the stakeholder sessions opining on the regs, and in the stakeholder comments, a lot of participants, not everybody, but a lot, allude to and rely on the GDPR: here is how we do profiling, here is automated processing, here is what a legal or other significant effect means, here is how you do DPIAs. The DPIA session had Gwendal Le Grand from CNIL explaining DPIAs to us Americans. So that is the second point: the significance of four years of GDPR enforcement lies not only in enforcing the GDPR itself but also in setting a benchmark, a frame of reference, for other laws, specifically in the US.
Ms. Kagan, thank you. Those two key points are a takeaway for me. I'd like to ask more questions, but we need to give Professor M a chance to finalize everything for us. And by the way, I do appreciate your American accent, it gives me comfort. Keep it up please.
Prof. Avv. Marco Martorana:
First of all, I have to apologize: my English is not so fluent and my Italian accent is very strong. I want to thank the GRC World Forum for the opportunity to participate in such an interesting and prestigious event, and to thank my colleague Roberta Savella, who helped me translate my speech for today. I also want to answer one of the questions raised here: I don't think there will be any change to the EU's one-stop-shop mechanism very soon.
Today, I would like to focus my contribution to this panel on the enforcement activity of the Italian Data Protection Authority and its most important decisions of the past four years. As of November 2021, our DPA was the second European DPA by number of sanctions issued: 75, for a total of 84 million euros. So I don't think our DPA has a budget problem. The problem is staffing: there is a law that caps the number of employees the DPA can have, so the problem is of a different kind. One of the fields in which our DPA has been strictest in GDPR enforcement is telemarketing. Since the GDPR came into force, there have been many sanctions in Italy against big companies in the telecommunications sector for unsolicited marketing calls to consumers without prior consent, and for various other problems regarding data storage, accountability, privacy by design, transparency, and fairness of processing. The telecommunications sector is the one in which our DPA has issued its highest sanctions, with fines reaching more than 27 million euros.
To answer another question, about what we do with firms that don't want to respect our law, we can look at the popular Chinese app TikTok. In that case, the problem was use of the app by children under the minimum age required to create an account, and therefore the processing of children's data by the app's developer. In 2020, the Italian DPA asked the European Data Protection Board to create a specific task force to investigate the data protection risks posed by the app.
A few months later, in May 2021, the Italian DPA asked TikTok to deploy additional measures to keep children under 13 off the platform, and the company undertook to remove, within 48 hours, reported accounts found to belong to users under 13, following the relevant checks. TikTok is now based in Dublin, Ireland, like the other big tech companies we just mentioned.
Also in the new technology field, last February our DPA issued a 20 million euro fine against the US company Clearview AI, as Odia mentioned before. The company processes biometric data for facial recognition using open-source intelligence software and web scraping techniques: it retrieves publicly available pictures of individuals and processes them to obtain biometric information, which is stored in the company's database. The Italian DPA's jurisdiction derived from, among other things, Clearview's processing of Italian citizens' data. Our authority issued the [inaudible 00:47:48] for violations of the following GDPR articles: Article 5, for violation of the lawfulness, fairness and transparency principle, the purpose limitation principle, and the storage limitation principle; Article 6, for violations regarding the lawful basis for processing the data; Article 9, for violations regarding the processing of biometric data; Articles 12, 13, 14, and 15, for violations regarding data subjects' rights; and Article 27, because Clearview AI did not designate a representative in the European territory.
Also relating to new technology and the use of algorithms, another important sanction was issued by the Italian DPA against Foodinho, part of Glovo. I don't think I have much time left, so I will close by saying that the goal of GDPR sanctions is clearly to ensure the effective implementation of all the regulation's data protection requirements and safeguards. The enforcement decisions of the various Data Protection Authorities are therefore one of the most important driving forces making companies comply. In Italy, we are very proud of what our DPA has achieved in the past four years, and we are confident that these actions will continue and guarantee better data protection in our country and, hopefully, around Europe.
Well, that says it all. I mean, the track record, the penalties, the comprehensiveness. And I think also, looking forward, the focus on facial recognition, which naturally dovetails into artificial intelligence, is showing us the way things are going to go. Thank you, that was both sweeping and comprehensive, and hats off to the Italian authority. We had a number of questions coming through, but because we need to summarize all of this, perhaps we can have a 360, a one-by-one response from everybody in the same order as we started. So, start off with Mr. Rhodes, follow through with Ms. Rzaca, then Ms. Kagan, and close off with Professor M. If there's one thing that I, as an ignorant program manager, can take away from you, what would that be? Looking forward, perhaps, into the next four years of GDPR. Steve, you're on mute.
Steven Marc Rhodes:
Good. Well, to gaze into my crystal ball.
I think regulators are going to have to be more focused in their actions. I was particularly struck by a couple of points that Odia made, one on cookie sweeps and how responses don't necessarily result in fines. I think that only goes so far; it's very effective initially, but of course, if companies know they will only be punished if they get caught, then it's very difficult to enforce generally. So I think you need a combination: that sort of activity, perhaps a warning the first time, combined with some fines to encourage the others. And it may well be that that's the direction we're going to go in — more targeted, large fines for the most egregious activity, combined with more low-level recommendations bringing people into line.
All right, thank you.
I need to be slowly moving on to Ms. Rzaca. Go ahead.
Thanks very much; we are really out of time. From my perspective, all the fines, large and small, highlight that the burden is now on you as the data controller to prove that you are accountable and to show that you have a privacy program and other measures in place. That's my take. And also, really focus on privacy by design, including when choosing vendors. Thank you.
Thank you. Ms. Kagan?
I mean, my recommendation would be: focus on the important pieces, on the meaning and the substance, not the form. There's a lot of talk out there about the GDPR not being effective, or being formalistic, et cetera. I think if you are serious about real disclosure, real choice, and, as Magdalena was saying, real privacy by design, that's where we need to go: substance over form. And that will also carry you for your cross-border business, right? If you do that correctly, it translates into 80% of what you need to do under any privacy law, so I think that's a good bet to focus on.
Great, thank you. And we do have some additional questions we're not going to get the chance to respond to; perhaps we can do that in the comments for the participants. However, Professor M gets the final word. Fire away, sir.
Prof. Avv. Marco Martorana:
Yes, very quickly: I think that in Europe we cannot look only at our own DPA; we also have to look at the European Data Protection Board, because after a few months or a few years its positions arrive in our country too, just as happened with cookies. So study and follow the European Data Protection Board, because it is very important for us as well.
Brilliant, thank you kindly. Thank you for the education; I'm sure I'm not the only one who's wowed by what I learned. And hopefully we'll get a chance to respond, in the comments, to some of the questions we couldn't speak to. So, that's it.
Thanks so much to Peter and the panel, a great discussion. So important, I think, to remember that the most significant enforcement actions might not be the biggest fines in terms of euros levied, but those stop processing orders and deletion orders can have more of an impact sometimes. I’m writing at the moment about the Clearview fine that the UK has just announced as well. It’s difficult to say how that will be enforced despite the GDPR’s very broad territorial scope. In reality, if there’s no EU representative, I’d be interested to see what-
PrivSec World Forum
Park Plaza Westminster Bridge, London: 7-8 June 2022
PrivSec World Forum is a two-day, in-person event taking place as part of the Digital Trust Europe series.
PrivSec World Forum will bring together a range of speakers from world-renowned companies and industries—plus thought leaders and experts sharing case studies and their experiences—so that professionals from across all fields can listen, learn and debate.
The event is a must-attend for data protection, privacy and security professionals who are keen to network, learn more, discuss and add expertise to how these sectors are interconnected.