This update aims to amend Canada's federal private sector privacy law to maintain its EU adequacy determination after the Schrems II decision in July 2020. This panel session will look at the existing legislation the bill intends to amend, turning the Personal Information Protection and Electronic Documents Act (PIPEDA) into the Consumer Privacy Protection Act (CPPA), and what this means for organisations.



This transcript has been edited for clarity and grammar.

[00:01:20] Andrew Menniss: Good morning, good afternoon or good evening, wherever on the globe you're joining us from, here at PrivSec Global. My name is Andrew Menniss and I am here with GRC World Forums. Just before I introduce this final panel of the US Privacy Focus, and the final one of the day: we've been live now for almost nine and a half hours.

[00:01:43] I just wanted to thank our headline sponsor Microsoft and our premier sponsor OneTrust. Visit the GRC World Forums page via the left-hand menu to register your interest in a number of our new initiatives for 2021, including GRC TV with Joe Tidy, who's been online a few times today. It's a new weekly on-demand digital TV series featuring news, in-depth interviews, discussions and debates on all things governance, risk and compliance.

[00:02:11] And there's also In Conversation With, a monthly series of hour-long conversations between our hosts and special guests from the world of privacy, security and financial crime, with Nina Schick and Oliver below.

[00:02:23] So it gives me great pleasure to introduce this final panel, which is sponsored by Helios Data.

[00:02:29] It gives me great pleasure to introduce Abigail Dubiniecki, who is a lawyer at nNovation LLP. The title of this panel is Canada's Proposed Updated Federal Privacy Law: What You Need To Know. Over to you, Abigail.

[00:02:44] Abigail Dubiniecki: Thank you. Right, just to clarify, it's Canada and the US in this section. All right.

[00:02:50] Yes, I'm very pleased to be here. We have a fantastic panel today to discuss the proposed update to Canada's private sector privacy law, which is Bill C-11, or the Consumer Privacy Protection Act (CPPA), to add just a little more confusion to the global privacy space. I'm joined by Dr. Ann Cavoukian, who needs no introduction, but just in case you have been living in a cave somewhere and are not aware of who she is: she's a global leader in privacy.

[00:03:23] She's been doing this for decades. She's a former Information and Privacy Commissioner of Ontario, which is where she came up with Privacy by Design, which has taken the world by storm, was first formally picked up in law in the GDPR, and has now been recognized as an important standard for embedding privacy proactively.

[00:03:46] She's also the Director of Global Privacy and Security by Design and actively involved in so many other things. It's such a pleasure to be hosting a panel with you.

Dr Ann Cavoukian: Thank you so much, Abigail.

[00:04:00] Next we have Luk Arbuckle. He is the Data Methodologist (have I got that right, Luk?) at Privacy Analytics, and he is the go-to person if you have any questions about anything related to data de-identification and anonymization, particularly in the health data space, but not exclusively. He does a fantastic job of turning these highly complex and potentially very technical concepts into something accessible and, dare I say, even fun to listen to. So he's going to help shed some light on some of those issues today.

[00:04:32] And finally, we have Mark Sward, who is the Vice President of Privacy for Sterling. If you've ever been onboarded as a consultant or applied for a job, chances are some of your personal data is with his company, because they do background checks and a number of other services in that vein.

[00:04:51] And so he's going to share his perspective as a global business grappling with a range of privacy laws, but specifically the CPPA coming forward. I did promise this would get a little spicy; we all have very strong opinions. But my previous career was as a teacher, and I was always taught to state the positive and the negative.

[00:05:09] So we're going to start with a little thumbs-up, thumbs-down exercise. I asked each of you ahead of time, without discussing with the others, to list three things that you like about C-11 and three things that you don't. So if you have those in front of you, you could flash them. If you don't, you could just tell me.

Have you brought the papers?

[00:05:31] Dr Ann Cavoukian: Okay. Do you want me to start?

[00:05:32] Abigail Dubiniecki: Sure. Just list them, and we'll expand on them later as we go; just bullet points.

[00:05:41] Dr Ann Cavoukian: There's really only one thing I like about C-11, and it's the fact that it finally gives the federal Privacy Commissioner of Canada order-making power, which the provincial commissioners already have.

[00:05:53] When I was a provincial commissioner for three terms, like the other provincial commissioners, we had order-making power, which strengthens your ability to do things. I always compared it to the carrot and the stick. The order is the stick: you have order-making power, and people know you can make them do what needs to be done when they haven't been in compliance with the privacy laws.

[00:06:18] But the beauty of having order-making power, and they know it, is that they're much more willing to negotiate with you. Most of my decisions were informal resolutions, which I worked out with the organization; that worked out much better, a win-win, as opposed to insisting on making them do things. So it gives you a great deal of strength and authority to have that.

Do you want to hear the things I don't like, or should I stop now?

[00:06:42] Abigail Dubiniecki: Yes, because we'll definitely talk about the enforcement provisions, so hold those thoughts. Let's hear the three things you don't like, just bullet points for now, and we can elaborate on them later.

[00:06:57] Dr Ann Cavoukian: I hate that there's a stupid tribunal; I'll describe that in a minute. I hate that the federal Privacy Commissioner, who now has order-making power, has it taken away from him, because a complaint can be filed with the tribunal, which can then do something about his decisions. I think that's outrageous.

[00:07:13] And I hate the fact that they have not included Privacy by Design, as they said they were going to two years ago. So those are the top three.

[00:07:21] Abigail Dubiniecki: Okay, great. Thank you. How do you really feel about it, Ann?

[00:07:30] Luk, three positives, please?

[00:07:33] Luk Arbuckle: Yes, of course. So, given my background, the fact that de-identification is actually expressly recognized and broken up into several sections, I think, is really good. And it's defined contextually, which is helpful as well. The second would be the idea that de-identification is not just mentioned and defined, but is actually used throughout.

[00:07:51] I think that's really powerful. One of the biggest challenges we have is getting organizations to properly de-identify data, so making it explicit really elevates the conversation. And the last one, maybe slightly controversial, which we'll get into later, is the fact that de-identification is described as a process and not just an end point. But I'll stop there for now.

[00:08:09] Okay.

[00:08:10] Abigail Dubiniecki: Mark, you’ve got some more to choose from.

[00:08:12] Mark Sward: Yeah, well, I'm with Ann, with everyone, on enforcement power. Without a way to enforce the rules meaningfully, there's no point in having any. One that hasn't come up yet that I really like is the clarity on service providers and the relationships between parties that process data together.

[00:08:30] That, I think, was something sorely lacking in PIPEDA. And coming from industry, while I'm a privacy person and I like good privacy laws, I also can't say I dislike a relatively low compliance burden, which honestly is what this gives us. I think we have a lot of criticism coming on that point, but it is a positive from an industry perspective.

[00:08:50] Abigail Dubiniecki: Okay. Wow. So that's really cool, and it gives us a really good level set. What I've probably failed to say is that this is the first major update since about 2000. So Canada went from being a leader at some point to a laggard as everyone outpaced us, especially after the GDPR.

[00:09:10] And now, finally, the big question, which leads up to my first question for Ann: are we leaders again, or are we laggards? Where do we stand? C-11 aims to update PIPEDA, and as you mentioned, we all expected to see Privacy by Design in there. Its cousin, Quebec's Bill 64, has absolutely done that, and even strengthened it in the course of their clause-by-clause analysis.

[00:09:34] We haven’t done that. So why do you suppose that is, and how do you think that’ll impact the effectiveness of the legislation, Ann?

[00:09:42] Dr Ann Cavoukian: Well, I don't think this legislation can pass anytime soon, because there are going to be so many concerns about it. It will be debated extensively. Let me take you back to 2017. The following year, 2018, the General Data Protection Regulation was coming into effect in the EU, and to my amazement they included my Privacy by Design, and privacy as the default setting, as Data Protection by Design. This was huge. So our federal Privacy Commissioner of Canada, Daniel Therrien, said to the government: look, we have to upgrade our law.

[00:10:20] PIPEDA is no longer going to be essentially equivalent to the new law coming into effect next year, the GDPR. Our law dates from the early 2000s; it's time for an upgrade. Plus, we need to add Privacy by Design into it, because they do that in the EU, and after all, it was created by a Canadian, Ann Cavoukian, who's here.

[00:10:40] So we need to do this as well. In response, at the beginning of 2018, the federal government put out a paper titled Towards Privacy by Design, about upgrading PIPEDA. So everyone thought this was the direction they were going to take, because after all, this was what the federal government said they were going to do.

[00:10:59] Do you think that's happened? No. Zero. And this is on the heels of Privacy by Design being embraced and adopted all around the world; it's been translated into 40 languages. Not a week goes by when I don't hear from some jurisdiction or organization about it. So I'm very, very disappointed that it hasn't been included. Now, quickly, do you want me to stop here?

[00:11:22] Abigail Dubiniecki: Oh no, that’s fine. Go ahead.

[00:11:24] Dr Ann Cavoukian: Well, then, the other things. As I mentioned, yes, the Commissioner now has order-making power. But none of the commissioners across the country, myself included when I was commissioner, who have order-making power have a tribunal that can change their decisions.

[00:11:41] Yes, a decision can be challenged in court, but that's what they have to do. You issue an order; it's binding. If they don't like it, they can take you to court. Instead, the federal government in C-11 creates a tribunal whose members, only one of whom is required to have any privacy expertise, can review the Commissioner. It's crazy. It makes no sense: it's like you give the Commissioner power and then take it away right away.

[00:12:07] It's just astounding, truly. I could go on and on, but I'm going to stop here. Those are the highlights.

[00:12:13] Abigail Dubiniecki: Okay. So we'll park the enforcement question, because I do have a question on that later. Mark, I'd like to turn to you, and I'd like to ask you about PIAs. But before I do, I want to pick up on something you said earlier: that there's a relatively low compliance bar, and as industry, that's promising. But if everyone else is doing more internationally, what does that mean for you in terms of maintaining a global privacy program?

[00:12:38] So, for example, as we know, the GDPR requires a data protection impact assessment in certain cases, and its core obligation is data protection by design and by default. Quebec has a mandatory PIA obligation that has to be proportionate, so it's not required in all cases, and there is privacy by design and privacy by default, which has actually been made even stricter.

[00:12:58] So you're based in Montreal: you would have to comply with Quebec. You're global: you'll have to comply with the GDPR, the CCPA, everything. What does this mean for you as a business that revolves around highly sensitive personal data, when your own home jurisdiction has set the bar a bit lower, but you still have those higher bars to meet?

[00:13:18] Mark Sward: Yeah. I mean, I think what it does is abdicate a leadership role for Canada. The big companies that need to be regulated when it comes to privacy, like mine and many others, are going to focus their compliance efforts where they're needed, based on what laws have been passed, and they're going to build those compliance efforts around those laws.

[00:13:36] And so, yeah, when I'm looking at data protection by design, privacy by design, I'm going to be looking to the GDPR. I'm not going to be looking to the federal legislation in Canada for guidance on that. And it's a missed opportunity, because as a global organization we're already doing these things.

[00:13:51] Canada could have a role to play in that conversation if its laws required these things as well. But if they don't, who's going to pay attention to Canada? As a global privacy practitioner based in Canada, I've found that since 2018 my attention has been focused on the US and Europe.

[00:14:08] My attention is not focused on Canada, and honestly, I see the same in my company's clients. Nobody's paying attention to Canada. I don't think this is going to change that, and I don't think that's good for Canada.

[00:14:22] Abigail Dubiniecki: Yeah. And we do have this tendency to wait until our superstars get famous outside of the country before we accept them here.

[00:14:30] But somehow that still didn't happen; we didn't get the memo. It happens with our hip-hop artists and so on, but it doesn't happen with our privacy champions. Okay. So, there is the appropriate purposes test, which you could argue is kind of like a legitimate interests assessment, but it doesn't quite get there.

[00:14:48] It's a bit flou, as we say in French. I want to turn to Luk, because interestingly, while both Ann and Mark are saying we've set the bar low and we're heading into laggard territory, there's been a lot of blowback on the de-identification issue. In fact, a lot of the biggest criticism I'm hearing is that even anonymized data can never really satisfy this bill: you can never anonymize data enough for it not to be captured.

[00:15:17] So the bill is effectively regulating beyond personal information. Luk, I think you mentioned an important point at the beginning about de-identification as a process or a journey, rather than a static thing or a destination. As a starting point, do you read it the same way, that anonymized data is subject to C-11? And for people who have a little difficulty understanding the risks around it, could you maybe speak a little to that as well?

[00:15:48] Luk Arbuckle: Sure. I mean, it's interesting: when I first read it, however many months ago now, I was excited. I thought, oh, this is really good. They've got a contextual piece, the reasonably foreseeable circumstances, and they've got it defined as a process, not as an end point. So to me, it sounded like that's what the regulation was intended to be.

[00:16:05] So, overseeing the process, right? Let's make sure we understand that de-identification isn't a one-and-done; that's been understood in the field for a long time. I've been doing this for 10 years now, and that's just been the standard, the best practice, because you don't just anonymize or de-identify and walk away and assume it's always going to be okay. No, you govern it like you do anything else.

[00:16:22] You keep managing it. You ensure that the purposes you're using it for are still appropriate, et cetera, all these things. And we do assessments on, say, an 18-month basis, where we keep rechecking the data and rechecking the circumstances in which it's being used. So to me, I thought, this is great.

[00:16:36] They're actually defining it and then saying when to use it, which I think is also very positive. So I read it the same way. But then, as you said, all the articles started coming out, and it's interesting: I've seen completely, diametrically opposed views. Some say this is great: it's providing a benchmark, it's providing a clear intention in terms of de-identification as a process, as I've described it. But then there are others who say, no, this sounds like it's personal information too. Well, I don't read it that way, because to me, the definition is personal information if it's identifiable, and it's de-identified when it's non-identifiable in the reasonably foreseeable circumstances.

[00:17:13] Okay. And again, it's defined not as an end point: not "de-identified" but "de-identification". I thought, okay, that's a good thing. And to me it sounds like, yes, there's stuff that has to happen: there have to be codes of practice and standards. By the way, there are already international standards being developed in this space, which will help.

[00:17:33] Again, you mentioned a lot of people are confused by this. It is its own field of study; I've been doing this for 10 years. You've got experts at Statistics Canada, for example, a national statistical organization. This is what they do day in, day out; they have teams that do this. There are good ways of doing it.

[00:17:47] And a lot of the guidance that's been written in books, et cetera, is now being turned into actual standard practice, where you can look at it and say: okay, you have to have a privacy model, and it's not the same depending on the type of data; different privacy models are suitable for different circumstances.

[00:18:01] And then you've got to look at the overall context in which it's being used. Okay, we understand there are residual risks, so we don't just de-identify and forget about it; we govern it. You look at who's going to have access, for what purposes, and what controls are going to be around the data, both technical and administrative or organizational.

[00:18:18] This is being defined in the standards as best practice. I think that's a good thing. Naturally, it's not in the legislation itself, but at least it's out there; it's going to be something people can look to.
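The "process, not end point" view Luk describes is often made concrete by measuring a property of the data, such as k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values. A minimal sketch in Python (the column names, records, and threshold here are illustrative, not from any standard the panel cites):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.

    A dataset is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records; a larger k means any one
    record hides in a bigger crowd.
    """
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(counts.values())

# Illustrative records: age band and postal prefix act as quasi-identifiers.
records = [
    {"age_band": "30-39", "postal": "K1A", "diagnosis": "flu"},
    {"age_band": "30-39", "postal": "K1A", "diagnosis": "asthma"},
    {"age_band": "40-49", "postal": "M5V", "diagnosis": "flu"},
    {"age_band": "40-49", "postal": "M5V", "diagnosis": "diabetes"},
]

k = k_anonymity(records, ["age_band", "postal"])
# Each (age_band, postal) combination appears twice, so k == 2.
```

In the governance process described above, a check like this would be re-run at each periodic reassessment, alongside a review of who has access and under what controls.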

[00:18:29] Abigail Dubiniecki: Right. But on that point, I want to tease this out a bit, because you mentioned it. Okay.

[00:18:36] You've got two camps. Some people are of the view that data can always be re-identified, so you should never have any kind of exceptions; they're extremely risk-averse. And then you have others who feel, as you said, that the bill is effectively casting a very wide net and saying all data is subject to it.

[00:18:52] The challenge is that they didn't benchmark it to anything, or pin it to any kind of standard, or include, say, a re-identification risk assessment obligation or a rebuttable presumption. Do you think something like a rebuttable presumption of identifiability (sorry if I'm being too much the lawyer), a presumption that the starting point is that it is personal information until you can demonstrate, to our satisfaction, that it's not, would help? Obviously you're not going to put all the detail in the law; you've got regulations, codes, and then the guidance, if you gave that power. For example, under the Quebec law, the commission still has the power to issue directives. On March 17th, in their clause-by-clause, they devoted almost five hours exclusively to this issue of de-identification.

[00:19:43] And, you know, to what extent should we, or should we not, include a requirement to do a re-identification risk assessment in the context of a PIA? They had an elaborate discussion; in the end it landed somewhere else, and I'll be writing about that, so I won't go into it today. But it feels like that piece may be missing. Do you think that would maybe assuage the uncertainty in the business community, but also the reticence amongst privacy advocates?

[00:20:11] Dr Ann Cavoukian: You know, I agree with what you're saying, Abigail, because the risk of re-identification has to be addressed. In the olden days, and Luk knows all about this, when I was Privacy Commissioner I worked with Khaled El Emam, who is one of the leaders in this field.

[00:20:31] And he always used to tell me: look, we need strong de-identification protocols combined with a re-identification risk framework. If we do that, we can minimize the risk of re-identification dramatically, which is great. No one is expecting perfection; I always talk about the myth of zero risk. But you do have to have something in there to address this issue.

[00:20:54] Otherwise people could foolishly think: well, I just stripped the name, so it's fine, it's de-identified. No, it's not; you have to do a lot more than that. You have to make some effort to reduce the likelihood, the risk, of re-identification, and I don't think that's spelled out at all. That's one of the things that concerns me. Luk, you'll have much more to say on this, of course.

[00:21:15] Luk Arbuckle: Well, I mean, this is the challenge, right? The Canadian Anonymization Network, for example, is a consortium of different people and organizations looking at this, and they currently have on their website some definitions and some use cases being developed that try to show where it could be used, how it can be done, et cetera.

[00:21:33] And we've avoided that to some degree: we talk about non-identifiable data, and we don't talk about the risk of re-identification so much, just because, as you said, Abigail, it tends to be inflated; people don't understand what risk means. Same thing with environmental protection or anything else: risk is a scary word for people. But really, this is about taking data and creating statistics out of it.

[00:21:53] That's how I like to think of it at the end of the day, because that's really what we want to do. We don't care about the individual people; we care about the statistics we can draw, which we can then apply back. Of course, I work primarily in healthcare, so there we're talking about treatment outcomes, et cetera. But that's the idea. So, whether or not it should be in the legislation...

[00:22:11] That's a tough question. I don't know. I look at HIPAA, for example, and it's very spelled out; that's probably the most explicit in the world. Now, some people don't like that, because maybe it's too explicit, and the same sorts of problems come up: you're going to have people doing it in different ways.

[00:22:26] It gets interpreted differently. But at least there you have something that says: look, we know national statistical organizations do this, they do these statistical assessments, so why not build the language in as well? That's why HIPAA is so very clear about that. No one else really has that, to be quite honest; even the GDPR doesn't go into that level of detail. But hopefully we'll get it out of a code of practice.

[00:22:49] Although, I'll be honest with you, I have some reservations even around codes of practice, simply because I think it puts the OPC in a really difficult position to accept what organizations are doing. That's a tough thing to ask of a regulator, right? To say: will you endorse this? Especially when you haven't talked about privacy by design. And I'm bringing this back to Ann here, but it's true.

[00:23:09] I mean, at the end of the day there's a win-win that has to be put forward, right? When we don't talk about that, it's just: let's protect privacy. And yes, we want to do that, but we also want to make the data useful in some way. That's the piece that's constantly missing.

[00:23:25] Abigail Dubiniecki: And don't worry, Mark, you will get a chance to speak. It's like you read my mind, or my notes, because that's what I was going to ask. Ann, you talk about Privacy by Design not just as something protective, but as a win-win, even for business and innovation. So can you speak to de-identification in that win-win?

[00:23:46] Dr Ann Cavoukian: Oh, totally. Privacy by Design is all about win-win, positive-sum results, where you can have multiple positive gains. It's not privacy versus security or privacy versus business interests; it's privacy and. I've been preaching for years about getting rid of that zero-sum mindset of one interest versus another, win-lose; that's so yesterday. But we still have to address some measure of the strength of the de-identification, the way in which you do it, and it has to be able to stand up. You can't just say, okay, we're going to strip the name and keep the same numbers; you can't reduce it to that level.

[00:24:26] And there are all kinds of ways of doing this. Luk does this, as do Khaled and other individuals who excel in this area, but it has to be addressed, along with the notion of conveying that. That's why I was so upset that they didn't include Privacy by Design: it is that win-win model, and it's predicated on strongly de-identifying the data using various methods, et cetera. I just think that's not addressed at all.

[00:24:54] Abigail Dubiniecki: Okay, I think that's a really good point. I do want to give Mark a chance to speak, because, Mark, the nature of your business is such that the data kind of needs to be identifiable if you're doing background checks on individuals, though I suspect there's room within your business for de-identified data, and feel free to speak to that.

[00:25:12] But my question is really this: for a company focused on background checks and a lot of sensitive data, how can companies strike a balance between being data-driven and data-focused, and privacy? And feel free to talk about de-identification as well, if that's something you'd like to address.

[00:25:29] Mark Sward: Yeah, and I think the two go hand in hand. When you're using identified data, you have to do it in a way that's transparent, you have to do it in a way that's proportional, and you have to just not be creepy; that's the key. And in a business where what you do is transparent, like employment background screening, it's very transparent.

[00:25:51] This is what's going on; this is what your employer wants to know about you before you get hired. It's very open, so that part is relatively easy. It's where you're getting into the stuff that potentially feels a little creepy, where you're trying to learn new things,

[00:26:05] trying to make connections and inferences, that there's an opportunity to use de-identified data. I don't want to repeat what's already been said; I think the other two have done a great job of covering it. But clarity is what's needed on what counts as personal data, what doesn't, and what counts as de-identified. The acknowledgement of de-identified data in the law, as Luk pointed out, is a big step forward. But the way this particular bill is drafted, it doesn't really give us clarity; I think it muddies the waters as to what's personal data, what's not, what's subject to the rules, and what you need consent for. I think there are honestly some drafting errors that don't help. But I think we're moving in the right direction, I guess.

[00:26:44] Abigail Dubiniecki: That's interesting, though, that you talk about the value of de-identified data. I don't want to jump too far ahead, but there is the whole notion of data trusts and being able to drive public good and public benefit from personal data.

[00:26:59] There's lots of brilliant work going on, particularly in the UK, when it comes to financial fraud and money laundering. Ann, you would like this: in 2019, the Financial Conduct Authority ran a Privacy by Design-themed tech sprint, and the solutions they came up with demonstrated that looking at trends in de-identified data was in some ways even more powerful at unlocking those issues

[00:27:23] than focusing sharply on one individual. So I'm going to jump ahead to that question. I was going to talk about enforcement, but again, this law seems to give and take at the same time. Now we've been given the opportunity to use de-identified data for socially beneficial purposes. But, Ann, who's allowed to use that data for socially beneficial purposes?

[00:27:44] It seems to only be government and government institutions. What are your thoughts on that?

[00:27:48] Dr Ann Cavoukian: Far too narrow. In my view, if you're going to go to those lengths and you truly have de-identified data, then this should be wide open; it shouldn't be just governments that benefit. And believe it or not, all of the companies I've worked with, which are extensive, go to great lengths to protect their customers' data, and they want to protect it. Whereas government, in my view, has been far more lax in its protection of data. So I don't take my hat off to government at all, and I don't think they should be the only ones who benefit from this.

[00:28:27] Abigail Dubiniecki: Right. And there was a really detailed piece, which I encourage our viewers to read, from another former privacy commissioner, Chantal Bernier, written for CIGI; she sets it out in a lot of detail and brings in a comparative analysis.

[00:28:41] Luk, you've also obviously spent a lot of time pulling together data and looking at this as an opportunity, particularly in the health field. Is this a missed opportunity? Is there a different way they could have approached it and still met the objectives of protecting privacy?

[00:28:59] Luk Arbuckle: You mean in terms of data trust?

[00:29:01] Abigail Dubiniecki: Yeah. And socially beneficial purposes.

[00:29:04] Luk Arbuckle: Yeah. I mean, I see a lot of opportunity for this outside of just government. In the health sector, for example, and I won’t name any organizations, but there are some that already do this. They already pool data, and it’s definitely for beneficial purposes,

[00:29:18] because they’re looking at health care outcomes. But we’ve also seen other organizations. Okay, it didn’t work out for Sidewalk Labs in Toronto, but that would have been another example where you probably would have wanted a data trust. You’ve got Uber, for example, and other similar organizations that have mobility data. Obviously there are big privacy concerns, but there may be a lot that cities could learn from that data.

[00:29:41] Wouldn’t it be great to have a trust that allows them to control it, govern it, but also share it for those socially beneficial purposes? That’s not government; it might be to the benefit of government, but it’s not just government that could use that data. So right now, at least, it seems like a missed opportunity.


[00:29:56] Abigail Dubiniecki: And Luk, you’ve dropped some nice little hints there, and I would like you to tease those out a bit. Ann mentioned earlier: don’t assume that because it’s the private sector, it’s a careless sector; they will take care of your data. Privacy Analytics has worked with Uber, with some financial services companies and with the IESO in the energy space, to leverage de-identified data collected from various sources. Can you speak a bit to these and give some examples of global companies that have used de-identification, either for a socially beneficial purpose, if you want to call it that, or even just to give themselves an edge?

[00:30:36] Luk Arbuckle: Yeah, well, you mentioned most of them there. I can’t necessarily disclose names unless it’s publicly available. Some don’t want their name out there because it’s actually a competitive advantage. They use this to share with certain partners and whatnot, and they don’t want their competitors to know, because it’s something that’s very beneficial to them.

[00:30:55] But you mentioned some of them: there’s financial, there’s Uber. We actually had a webinar about that, to give an example. That was one where they wanted to share data, but they found the risks were too high in the format that was wanted, basically. And this is where I think a data trust would be a great idea.

[00:31:11] I assume it wouldn’t be subject to something like a freedom of information request: you could ask for the data, but it’s controlled. That’s the key thing, and that says a lot. I mean, open data is hard; let’s not kid ourselves. StatCan does this, and other national statistical organizations do this.

[00:31:27] They do public data sharing, but they have full-on PhD statisticians working on this. There are whole teams of people that work on it. It’s not easy to do. It seems easy: we’ll just aggregate the data. No, that doesn’t work; it’s a false sense of security. There’s a lot more you have to do. Obviously I’m not going to get into that now, but the point is that something like a data trust, to me, seems like a huge opportunity.
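Luk’s warning that “we’ll just aggregate the data” gives a false sense of security can be illustrated with a toy differencing attack (an editor’s sketch with made-up names and numbers, not an example from the panel): two aggregate-only queries, each apparently safe on its own, can be subtracted to recover one person’s record.

```python
# Toy differencing attack: two "safe" aggregate queries leak an individual's value.
salaries = {"alice": 82000, "bob": 61000, "carol": 75000, "dave": 58000}

def total_salary(names):
    """Aggregate-only query: returns a sum, never an individual record."""
    return sum(salaries[n] for n in names)

everyone = total_salary(salaries)                               # sum over all four
all_but_dave = total_salary(n for n in salaries if n != "dave")  # sum over three

# Subtracting the two aggregates recovers Dave's exact salary.
leaked = everyone - all_but_dave
print(leaked)  # 58000
```

Guarding against overlapping queries like these is part of what the statistical-disclosure-control teams Luk mentions spend their time on.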

[00:31:47] Right. And you’re seeing that. Quite frankly, we work globally, so we work with a lot of organizations, as Mark said, looking at GDPR; that’s sort of the benchmark right now. So we look at GDPR, and there are conversations about data trusts there as well, about how we can de-identify or anonymize the data in a way that’s suitable, given all the huge controls they’re going to have around that data, while still making it publicly available.

[00:32:08] So yeah, like I said before, I think it’s a good opportunity, and there are a lot of different sectors looking into this. You mentioned a few, but there are many others as well. The techniques for how you protect the data may differ, but the ideas of governance and the controls around it are the same.

[00:32:22] Abigail Dubiniecki: And I think that’s another really important point, and Mark also hinted at it: it’s not just what you do to the data, it’s what you do around the data, and what you do with the people and the organizations involved in handling the data. So we’re never talking about just throwing it out into the wild, like in the Netflix case, you know, the famous ones that cause people to say you can never satisfactorily de-identify data. You can never 100% secure it either, but we do stuff online all the time.

[00:32:49] We do banking online; we do a lot of things. So you have to find that balance. And I think also of your cautionary advice: don’t try this at home, kids. How many times have I been advising clients who said they’re just going to DIY the de-identification, and I’m like, well, are you hoping to actually extract any value from it later?

[00:33:06] Or is this for a security purpose? Because you’ve just made it unusable; it’s not going to have a lot of value. Okay. So look, we’ve talked a lot about de-identification, and this isn’t the de-identification show, so I’m going to skip ahead past the enforcement question.
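Before the panel moves on, the “DIY de-identification” trade-off Abigail describes can be sketched with a toy k-anonymity count (an editor’s illustration with hypothetical records and field names, not from the panel): under-generalizing leaves people unique, while over-generalizing makes the data safe but analytically worthless.

```python
from collections import Counter

# Toy records: (age, postal prefix) act as quasi-identifiers.
records = [
    (34, "K1A"), (36, "K1A"), (34, "K1A"),
    (52, "M5V"), (53, "M5V"), (51, "M5V"),
]

def k_anonymity(rows, generalize):
    """Smallest group size after applying a generalization to each record."""
    groups = Counter(generalize(r) for r in rows)
    return min(groups.values())

# Raw records: some individuals are unique on (age, postal prefix), so k = 1.
raw_k = k_anonymity(records, lambda r: r)

# Banding age to a decade keeps some utility while raising k to 3.
banded_k = k_anonymity(records, lambda r: (r[0] // 10 * 10, r[1]))

# Stripping every field gives k = 6, but the data now carries no information.
useless_k = k_anonymity(records, lambda r: ())

print(raw_k, banded_k, useless_k)  # 1 3 6
```

The skill in real de-identification work is picking transformations in the middle of this spectrum, which is exactly why the panellists caution against improvising it.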

[00:33:23] We’ll park that for a second, because I want to hear from Mark about cross-border and international transfers. One of the big motivators for this law was that Canada wants to make sure it maintains adequacy, so that we get inbound data flows from the EU, for example. But we also wanted to clarify what happens to data originating in Canada that we then transfer elsewhere.

[00:33:45] Do you think the cross-border transfers issue, for a global company like yours, has been adequately addressed in C-11, to make sure that data can still flow and supply chains are still intact, but also safe?

[00:33:58] Mark Sward: I think it’s another example of a missed opportunity because there’s really not a lot there.

[00:34:03] There aren’t really onerous requirements for cross-border transfers. And I don’t think the GDPR is necessarily the right model; it’s quite onerous, and I don’t know if that’s the right model for Canada. But I wouldn’t have been surprised to see more, for two reasons. One, Canadians think their data is not allowed to leave Canada. They’re wrong,

[00:34:24] but it is a very common misconception among Canadians, and probably a political win to put some controls around it, which they really haven’t done, other than requiring some transparency and proper vendor management. And the other piece is that the frameworks are already there and they’re already being followed.

[00:34:40] So not just Europe but other countries, like Japan and Argentina and various others, have rules in place around cross-border transfers, where you have to follow certain requirements; those are already being followed by global organizations and are not that hard to put in place.

[00:34:58] So I was surprised it didn’t go a little further. Quebec took the other route and made it impossible to transfer outside the province.

[00:35:05] Abigail Dubiniecki: Even within the province, which will be subject to a constitutional challenge, we can be sure. But on that point, Canada still stubbornly has not signed on to Convention 108 or the additional protocol to Convention 108, which would have been a really fantastic way forward.

[00:35:20] It’s the one international treaty on this that actually exists; if there’s anything out there, it’s the one that actually has teeth. But we didn’t sign. I don’t know, Ann, if you have any insight into why we haven’t done that.

[00:35:35] Dr Ann Cavoukian: I wish I had some insight. It’s crazy.

[00:35:39] It’s absurd. It would have been such an easy thing to do, and it would have conveyed so much messaging. Honestly, I have so little faith in our governments, because they just miss the mark on these things again and again. So I apologize, Abigail; I have nothing else to offer you on that.

[00:36:01] Abigail Dubiniecki: We have other things to talk about, so don’t worry.

[00:36:03] But to be fair, we’re in a minority government and in the middle of a pandemic; it’s not easy to legislate. And some people will say that if everybody both hates it and loves it, you’ve pretty much found the right balance. So maybe there is some kind of secret sauce there; we’ll see.

[00:36:18] But can we take a minute? You did touch on enforcement. You were saying that C-11 finally gives power to the OPC, but then kind of takes it away by saying, well, actually, it’s subject to the supervision of a tribunal that has only one privacy specialist. But earlier, when you introduced your session, you raised a point

[00:36:40] that’s been in the back of my mind, which is: does the very threat of the tribunal give companies that added incentive to go along to get along anyway, as opposed to now, where we’ve worked under an ombuds model? Or are we back where we started?

[00:36:56] Dr Ann Cavoukian: You know, with due respect, I don’t believe the latter.

[00:37:00] As I said, I’ve worked with so many companies, and they understand that there are reasonable means of de-identifying data and of introducing strong privacy measures, that these are warranted, and that if you do it right, you can gain a competitive advantage, because your customers will love it.

[00:37:20] It builds trust, and there’s such a trust deficit right now. So I don’t understand it. I don’t know why they want the tribunal model, which doesn’t exist anywhere else in terms of privacy commissioners and privacy oversight. And I do think it chips away at the commissioner’s authority in terms of order-making power.

[00:37:38] That was the whole point of the exercise. So I’m sorry, but I can’t offer you any more on that, because I think it’s just the wrong way to go.

[00:37:47] Abigail Dubiniecki: Well, Mark, again, you’re based in Quebec, so I’m going to keep bringing it back there. Related to what Ann was saying: the Commission has had order-making power; in the context of biometrics, for example, it has a lot of power.

[00:38:03] So in the Clearview AI decision, although that was a multi-jurisdictional decision within Canada, Quebec had its own little paragraph or two, specifically on the fact that no advance notice was given to the commissioner, which people outside Quebec might not be aware of, but which you are actually required to give.

[00:38:20] And their powers are real. They have the powers of an inquiry, so they can actually stop you and say, sorry, we’re not happy with this; you’re not going to do it. Or they bombard you with questions to the point where you pretty much back off, because you know that you’re doomed.

[00:38:36] So there is a question, Mark, that I’m going to ask, which is this: you’re in a jurisdiction where there is stronger enforcement, at least in that area, as it stands even now, before we get into Bill 64. Having learned from the Quebec experience, is there a middle ground at least that we could have found?

[00:38:55] Because I don’t think our federal privacy commissioner has ever had anything close to what the Commission has.

[00:39:01] Mark Sward: I mean, maybe my view is warped, because I only see some of what goes on, but I don’t see the commissioner in Quebec exercising that power much, certainly not in a high-profile way.

[00:39:16] And if you have it but you don’t use it, I don’t know how effective it is; maybe they do use it to some degree. But I’m willing to bet there are a whole lot of companies all over Quebec using biometrics that have never even thought to notify the CAI, and have never been called out for it.

[00:39:32] And that’s where you’re seeing really high-profile investigations and eye-popping fines from regulators in France and Germany and elsewhere in Europe. That gets everybody’s attention, and you just don’t see that in Quebec. Although I think the dollar signs in both of these bills, in Quebec and federally, are certainly enough to get board-level attention,

[00:39:55] which is a good thing. But in practice, I just haven’t observed it.

[00:40:01] Abigail Dubiniecki: Okay, interesting. One thing it does sort of force you to do, if someone actually bothers to notify, is to build in privacy by design before you go ahead with the project.

[00:40:14] And actually now, with Bill 64, there’s a 60-day advance notification requirement, so there’s really no excuse not to have your PIA done, and you’re required to anyway. But I guess that’s the question, and maybe, Luk, you have some insights into this too: what good is enforcement if you can’t be policing everywhere? And I’m asking Luk because he’s had the experience of being in both worlds.

[00:40:32] He did spend some time with the OPC; I forgot to mention that earlier. It feels like, as a regulator, you’re always chasing after things anyway. So what else can we rely on? The goodwill of companies? Fear of other losses? You know, if you can’t be everywhere at once?

[00:40:52] Luk Arbuckle: Ooh, that’s a tough question.

[00:40:54] I mean, I think Ann’s covered it already for the most part. Enforcement powers are going to be important in that respect, but it’s not going to change the fact that there’s always going to be more that can be done. So, just for background, my position was director of technology analysis.

[00:41:09] So that’s the group that does the technology side of things: if there’s an investigation, if there’s research to be done looking at the technology, at de-identification and other things as well. The challenge there is that there’s so much, and technology is changing so fast, so of course you’ve got to constantly stay up to date.

[00:41:26] You’ve got a team of IT specialists and researchers. When I was there, we had a very good team with a very diverse set of skills: psychology, IT, you name it. But there’s always more going on. And so being involved in things like standardization was an opportunity to be part of that conversation with industry, to see what’s popping up.

[00:41:45] We had alerts coming up all the time, and the reality is, and I’m sure Ann could speak to this, because she was a commissioner, you have to be ready for any media call. So you’re constantly calling your technologists, saying, okay, let me know about this before I speak. It’s challenging.

[00:41:59] And then add to that investigations and the time that’s spent there. I don’t have a solution to this problem; it’s just the reality of the space. But, and maybe I’m biased because I’m a technologist, I do think there are things we can point to from a privacy perspective.

[00:42:15] I’m very deep in privacy engineering right now; I’m doing a lot of work in that area, helping companies. I think there’s a lot there, and there’s a lot of work happening. For example, NIST, the National Institute of Standards and Technology, is doing tremendous work on its Privacy Framework. You’ve got people developing technologies in this space, and you’ve got standardization.

[00:42:31] All of this has to evolve through standardization as well, making sure the best practices are there, so that we go beyond the principles and, as a technologist, I actually know what to make of those principles: how do I put them into practice? I think that’s a powerful thing. Actually, I didn’t mention this before, but that’s one area where I’m a little disappointed in the CPPA.

[00:42:49] There’s no mention of standards, no mention of industry standards. There’s mention of codes of practice and certification, okay, but no mention of standards. I think that’s a missed opportunity, because standards are something companies can rely on and go, okay, now I know what the expectations are, I know what the best practices are, and I can start implementing them.

[00:43:06] Otherwise, as a technologist, you’re kind of stuck. You don’t fully understand how to do this, and you have to work with the lawyers. Personally, I love working with lawyers, of course, but there’s a lot there. It’s its own field of specialization.

[00:43:19] Abigail Dubiniecki: Yeah. Those of us in the privacy community, even the lawyers among us, recognize that lawyers are good for certain things and not for others, and that it takes a village to run your privacy programme properly. I’m the first one to say it, and to say we need to be communicating across disciplines. Absolutely. And I think that has also been one of the challenges with privacy by design: people say, these are principles; how do you translate them into a set of requirements and dependencies for your software devs?

[00:43:47] And if you follow me on LinkedIn, I often share some of this; there are some great frameworks out there. You’re right, they haven’t been incorporated. I do think that the plan is to, sorry.

[00:43:57] Dr Ann Cavoukian: Abigail, if I can respond to that, because it drives me crazy. When I was at the commission, we produced a publication which was a compilation of 22 papers on privacy by design that we wrote with companies, all the big companies out there: Intel, IBM, Microsoft, PwC, think of any big company. We did papers on how you actually implement privacy by design. It is eminently doable; it’s not a pie-in-the-sky thing. Eminently doable: 22 papers we did on this. So anybody who says to me, well, it sounds good, but you can’t really do it:

[00:44:33] it’s nonsense. It is so easy to do; you just have to be proactive and address it. It’s a model of prevention: you want to prevent the privacy harms from arising.

[00:44:45] Sorry. That gets me going.

[00:44:46] Abigail Dubiniecki: No, that’s fair, absolutely. And you did a lot of work on this. I think one thing that was really unique about those white papers is that there was always an industry partner.

[00:44:55] Some of those papers still come up among the first hits when I search for things like smart energy or smart homes, for example; inevitably, one of them is among the first results. Maybe it’s my DuckDuckGo, I don’t know, or maybe you’ve rigged it. But I did want to say, for the audience:

[00:45:11] there is a huge body of work in this area, and a good starting point would be your website, which is GPS by Design. And then also, for de-identification, Privacy Analytics has, I think, a whole de-identification university with webinars and everything else. So there’s loads of stuff if you want to get started; that can keep you busy for a few weeks.

[00:45:29] We are getting close to the end, so I wanted to ask a question about consent, because that’s been another hot topic. One of the things is that there are 22 exceptions to the need for consent. A lot of privacy advocates and business leaders and a range of people from all walks of life are saying, can we just stop trying to focus on consent?

[00:45:50] Can we not look at other models, like legitimate interests or contractual necessity or exceptions and so on? Or can we focus more on privacy by design, so that we’re not in this position where we’re putting the onus on the individual: okay, yes; okay, yes; like a parent who’s being nagged at constantly by a child and finally gives in. So what do you think of these consent exceptions? Have they gone too far, or are they on point?

[00:46:15] Dr Ann Cavoukian: I’ll keep it really short, because I want Luk and Mark to be able to weigh in. I think there are way too many exceptions to consent. People say, well, we shouldn’t rely on consent, but the bottom line is you’re using people’s personal information, and you should have some type of consensual model. In terms of privacy by design, we do privacy as the default, which means that privacy is embedded as a default setting in your operations.

[00:46:41] You are only permitted to use the information for the primary purpose of the data collection that has been consented to. Then, if you require additional secondary uses, you obtain additional consent. It is ever so doable. And to take the individual out of the equation and just say, forget about consent,

[00:46:58] we don’t want to have to involve the individual: that’s ridiculous. It’s their personal information. Privacy is all about control: personal control over the uses of your identifiable data.

[00:47:08] Abigail Dubiniecki: Thank you. And I’m going to come to the two of you there. I think some might challenge, though, that in this current environment, where data travels at the speed of light in a smart-cities or Internet of Things sort of situation, it’s just not practical:

[00:47:22] there are far too many transactions happening at once. But some have argued that if you build in privacy by design and by default, then it’s sort of like food safety: you don’t have to ask people, are you okay with there maybe being listeria in here? No, there’s a certain set of standards, and you don’t go below them.

[00:47:39] And they’re verified, right? A food inspector can come in at any time. So some have argued for that sort of model. But Luk, I’d like to hear from you: there is a de-identification exception, which you probably appreciate. C-11 actually explicitly addresses a gray area: the act of de-identification itself now does not require consent.

[00:47:59] Frustratingly, in the five hours of debate on Bill 64, they never talked about that, so I still don’t know the answer there. But apart from that, how do you feel about the consent exceptions?

[00:48:10] Luk Arbuckle: Well, to follow on from what Ann was saying, I’ll point out a book that I think is really critical to this whole conversation, which is Privacy’s Blueprint by Woodrow Hartzog, because he focuses on design, right? Privacy by design. He talks about a need for reform, something like a competition bureau for design, where it’s not just about how it’s built: you could, for example, have consent, but it’s hidden somewhere, or it’s on a button that you don’t see and you just click next, next, next. These things are problematic.

[00:48:37] In the book he gives many examples of those design problems. But, and this is my summary, of course, if you look at it from sort of a competition-bureau perspective, you start to look at the intent. You start looking at the psychology, which is something we did when I was at the OPC.

[00:48:51] We had a researcher whose area of expertise that was, and we would look at the intent and at how things were being shown to the user, so that you could catch these things: yeah, okay, technically you may have a consent button, but it’s buried down here, and I didn’t see it because it said next, next, next. I think that’s a really interesting approach. It’s very different from the CPPA, but I think it’s very powerful, because again it brings us back to this conversation about privacy by design and privacy engineering: building it directly into the framework of how we look at privacy and legislation.

[00:49:26] Abigail Dubiniecki: There is something in the CPPA where they’ve actually explicitly prohibited deceptive design, or what we call dark patterns. And that’s where it’s leaning a little more towards the competition area. And remember, our law doesn’t exist in a vacuum: the competition authorities in Canada have actually explicitly called out this issue of deceptive practices.

[00:49:43] So I think we’re going to see that it’s not just going to be privacy czars going after these sorts of issues, but also the competition authorities. Mark, I’d like to come to you, and then I think there’s an audience question as well that we should probably get to.

[00:49:59] Mark Sward: I mean, for me, possibly my biggest disappointment in this bill is its not taking a more nuanced approach to consent.

[00:50:09] I think the problem is the heavy reliance on consent. Now, all the exceptions are great, although, and this might be digressing a bit, the exceptions sort of illustrate my theory, which is that in Canada we have a tendency to conflate notice and consent:

[00:50:25] to think that transparency equals consent. It doesn’t: transparency and consent are two different things. Telling people what you’re going to do with their information is absolutely crucial. You have to do that; it’s a key tenet of privacy. You can’t have privacy if you don’t tell people what you’re doing with their information. But consent suggests there’s a choice,

[00:50:43] and it suggests that the consent is freely given. I work in an industry where we’re handling HR data, and I don’t know very many employees who actually have choices when it comes to giving their employer data. Their employers want to know what they’re doing, and that’s it: take it or leave it. To me, that isn’t consent.

[00:50:58] And so I think it cheapens the concept of consent, and reduces it to a fiction, to just say, hey, if you told them about it and they still proceeded, then they consented, because I don’t think that’s true. And that’s where I think the GDPR has done something really great in putting accountability in place.

[00:51:15] The onus doesn’t sit with the individual; consent doesn’t cure all. You have to be accountable for what you’re doing. You have to make sure you’ve documented a lawful basis, whether it’s consent or something else. And then transparency is paramount.

[00:51:28] Abigail Dubiniecki: I would counter though, that there is an appropriate purposes test.

[00:51:32] I just want to give the poor legislators a bit of a break. There are some very hefty accountability provisions in here. You have to have an extensive privacy program that isn’t just on paper; you need to be able to demonstrate that it’s actually enforced in your business, and you can be called upon to show your work.

[00:51:50] There’s an appropriate purposes test. It’s not quite a PIA, and it certainly doesn’t go as far as a PIA, but there is this notion of what someone would reasonably expect, so it does kind of align with the Canadian reasonable-expectations standard. But again, it’s still very, as we say in French, flou: there’s so much room for interpretation, and maybe we’ll get more in the form of regulations or guidance on that.

[00:52:14] And I think you wanted to say something more about the consent, or no?

[00:52:18] Dr Ann Cavoukian: No, I’m fine. Thank you.

[00:52:19] Abigail Dubiniecki: Okay, so we’re coming to the end. I will just ask one last question. Someone raised a really good question here about where we draw the line between consulting with industry and being pressured by, or having interference from, industry, when it comes to lobby groups addressing a new law.

[00:52:39] We’ve heard, for example, that Facebook has been very pro appropriate legislation on the outside, but then in back rooms has really pushed for legislation that would suit its purposes. How do you draw that line as a legislator?

[00:52:55] Dr Ann Cavoukian: I personally think you have to have patience for considerable consulting with companies. They want to talk to you; you need to hear the impact it may have on their operations; they need to understand why you’re doing what you’re doing. I used to do that kind of interaction a lot, so that both sides can at least understand each other’s position. And often you learn, because you may not have considered certain things that a business needs to do; it just never crossed your mind.

[00:53:21] So I think there is an appropriate role for that.

[00:53:25] Abigail Dubiniecki: Is it more a question, then, of being more transparent about how those things happen? Is that maybe one way?

[00:53:30] Dr Ann Cavoukian: Oh, sure, transparent, and enabling you to gain in your understanding of an area that isn’t your first area, so you need to be exposed to it. But transparency is critical on both sides.

[00:53:44] Abigail Dubiniecki: Okay, now we’re at the end, so I’m just going to ask a parting question; each of you gets a turn. There are two pieces to it. Question one, which you need to answer: do you think this is going to help us maintain our adequacy? And question two: keep it or fix it? If you could choose one thing, just one, to fix or get rid of in this bill, what would it be?

[00:54:05] Let’s start with Ann.

[00:54:06] Dr Ann Cavoukian: I would get rid of it; I think there is so much you would have to do to fix it. Don’t get me wrong: I have been pushing for an upgrade to PIPEDA forever, so it’s not that we don’t want a new bill. I just think this was done so poorly. Who wrote this? I don’t know how it got put together.

[00:54:24] I would start from scratch.

[00:54:27] Abigail Dubiniecki: Okay, Luk.

[00:54:30] Luk Arbuckle: I mean, from everything I’ve read, as well as my own interpretation, a lot of good things have been added to it, a lot of clarity on some points. So there is some good, and I wouldn’t want to throw the baby out with the bathwater. If I were to change one thing, and it might be stealing Mark’s thunder here, there is ambiguity in a few places where even someone who’s done this for ten years looks at it and goes, wait, what do they mean here? Internal research is one: how is that defined? Or the use of de-identification, but then also saying “personal” in the same sort of paragraph or section.

[00:55:05] It makes me confused as to what they mean by de-identification sometimes. So there’s ambiguity in it that throws me off. But the one thing I would change, I’m going to say, is a bigger focus on privacy by design.

[00:55:18] Abigail Dubiniecki: And adequacy: will we keep it? Oh, sorry, I didn’t let Ann tell us.

[00:55:26] Luk Arbuckle: Oh, I mean, I’m not in a position to really say; I’m not a lawyer, so I can’t say. I hope it helps, but I don’t know.

[00:55:32] Abigail Dubiniecki: Okay, that’s probably not a fair question anyway. Mark, you get the last one.

[00:55:37] Mark Sward: Oh yeah. As far as adequacy, I’ll start with that. Do I think it’s good enough. I don’t think it’s good enough. But we’ll leave that up to the European commission. And then as far as whether to try and fix it, I mean, it’s not the bill I would’ve written, but but I’m willing to give it a chance, but I, you know, I would love to see a rebalancing of consent versus accountability, which of course takes into consideration privacy by design.

[00:56:01] Abigail Dubiniecki: Okay. So, Ann, we didn’t, I didn’t, we weren’t all selected because of our pro privacy by design stance, but I think everyone’s coming out at the same time. I think we’re just at the end of our time. Thank you so much for indulging us. I promised it would be a little bit spicy. Please do check out the Global Privacy by Design website.

[00:56:21] There is a wealth of resources there. Also check out Privacy Analytics. I’m not sure if Sterling has any resources in this area. At nNovation, I’ll be writing some things for the blog. And, a shameless pitch: Luk and I are going to do a session, one in French and one in English, specifically on demystifying de-identification. So watch for that; if you follow us on LinkedIn, it’ll be announced at some point.

[00:56:45] Dr Ann Cavoukian:  Great.

[00:56:46] Abigail Dubiniecki: Thank you very much everyone.

[00:56:48] Andrew Menniss: Thank you. Thank you, Abigail, Ann, Luk, Mark. That was a great panel and a fantastic way to finish an awesome day here at PrivSec Global. I’m also a fan of Privacy by Design.

[00:57:02] That was a panel filled with experience, intelligence and passion, and that’s what we love here at PrivSec Global, so that was fantastic. We also have an e-book available in the PrivSec Library, which explores the six most commonly discussed data privacy regulations, including Canada’s PIPEDA. You can download this e-book by visiting the PrivSec Library. Visit the GRC World Forums page via the left-hand menu and register your interest for a number of our new initiatives for 2021, including the Women in Governance, Risk and Compliance series.

[00:57:36] It’s a series of awards and forums, and it will honor female leaders and companies demonstrating excellence across GRC. And you’ll also be able to see details on future events being held in 2021. So that’s the end of stream two of PrivSec Global today. Don’t forget, there are more sessions taking place tomorrow and Thursday,

[00:57:56] Once again, we’d like to thank our headline sponsors, Microsoft and OneTrust, and we’ll see you tomorrow. Thanks so much.