WhatsApp’s announcement of a new privacy policy that no longer allows users to opt out of sharing their data with parent company Facebook has led to a wave of users moving to more privacy-friendly apps. This suggests the public is starting to take privacy seriously - but is it enough?

In this fireside chat with Carissa Veliz, author of Privacy Is Power, we will explore the current issues of privacy in society, the power that data gives to big tech and governments, and how it may lead to us sliding into authoritarianism.

Speaker:

Carissa Veliz, author of Privacy Is Power

Transcript

The transcript has been lightly edited for grammar and clarity.

Joe Tidy 1:32

Hello and welcome to PrivSec Global. We are well into day three of this privacy and security stream at the largest data protection, privacy and security event of 2021. In total, we have more than 200 subject matter experts sharing their experiences, ideas and knowledge across more than 64 sessions. If you’ve missed any of those sessions, you can get them on demand by following the instructions to the lower left of the screen, which I think is my right here; you can navigate and explore the agenda, find the sessions you missed and may want to watch again, and register interest for future panels as well. Up next we have a keynote speaker, Dr. Carissa Veliz, author of what The Economist named one of its best books of the year, Privacy Is Power. This session is sponsored by our premier sponsor OneTrust. And if you want to join the conversation on social media, you can use the hashtag #PrivSecGlobal. Over to you, Carissa.

Carissa Veliz 2:35

Thank you so much, Joe. It’s great to be here. Thank you for inviting me today.

Carissa Veliz 2:43

Today I want to talk about the relationship between privacy, power, big tech, and a bit about ethics. As a data protection expert, it’s sometimes easy to get caught up in the technicalities: the details of the law, the details of the definitions, and so on. And it’s easy to lose sight of the big picture. I wrote my dissertation on the ethics and politics of privacy, and at some point I went very much into the details of what it means to say that somebody has privacy, what the right to privacy is, what privacy-related duties are. And once I had most of the project done, I looked back on it and thought more about the relationship between politics and privacy. And I got extremely alarmed at the picture that I saw there. What first started as an academic project became very much a project to better inform the general population, but also to have different kinds of conversations within the privacy community and the privacy literature that I thought we weren’t having enough of. I think it’s a bit more common now, but for many, many years, most of the conversations around privacy were focused too much either on the law or on ethics, and not enough on politics.

Carissa Veliz 4:06

So one thing that I started to think about more closely is the relationship between data and power. For a long time, we’ve known that there is a very close connection between knowledge and power. Francis Bacon famously argued that the more knowledge you have, the more power you have over others. And that is extremely clear with big tech: the more knowledge they have about who we are, what we do, what we fear, what we hope, what our vulnerabilities are, the more they can predict our behaviour, influence it, and change the course of things. But the converse is also true, and it’s something that perhaps we think less about: the more power you have, the more you get to say what counts as knowledge. This is an insight from Michel Foucault. And again, it’s very clear in the context of big tech. The more power and data a company like Google has on you, the more they get to say who you are; they get to decide what counts as relevant information about you, what counts as part of your profile. The same goes for data brokers and other kinds of companies. And this is partly a problem because some of that data may be inaccurate, and some of that data may be accurate but unfair, in the sense that it’s data that is true about you but that shouldn’t be taken into account for certain kinds of decisions, or in certain kinds of contexts. We have, at the moment, a huge asymmetry of both knowledge and power that we need to address if we want to keep our democracy strong, if we want to defend our ways of life and our social fabric. And part of what we need to do is redress the balance.

Carissa Veliz 5:54

On the one hand, we need to make sure that big tech knows a lot less about us, because having too much data about us makes us too vulnerable to them. And on the other hand, we need to make sure that we know a lot more about them. Up until now, really only data protection officers and people in similar roles knew enough about what was going on with data. And even then, there are many obscure practices that you really need to push companies on, and sometimes investigate, to figure out what’s going on. One of the marks of data that is extremely relevant for ethics is how opaque it is, how obscure. It doesn’t feel like anything to have your data collected. If I harm you physically, it hurts, you can bleed. If you get a virus, it’s hard to breathe. There are all these physical markers of harm that allow us to know how important it is to be safe from harm. But with privacy it’s not the same. When your data gets collected, you don’t notice the absence; it doesn’t hurt, it isn’t hard to breathe. And the bad consequences come only much later, in the course of time, once it’s too late to recall the data.

Carissa Veliz 7:13

Basically, by then the ship has sailed. And so we as privacy experts have a really big job in making sure that society is aware of the risks involved in the kind of data collection we are doing, in making sure that we do whatever is in our power to avoid something really bad from happening, and in minimising risks in general. From this connection between data and power, at least a couple of things follow. One is that it partly explains why big tech got to be so big before we really noticed and did anything about it. Our litmus test for whether something was a monopoly was whether a company could increase its prices without losing any clients. But of course, companies like Facebook and Google are free, and that means our litmus test failed. Really, the test should have been something much more general, with the point about money just one example of it. The litmus test should be: whenever a company can impose exploitative practices, whether through charging more money, collecting more data, or anything else, without losing any clients, that is a symptom that it might be a monopoly, and we should look into it. That just goes to show how important it is to have power in the picture. Another implication is that whoever we give most of our data to will have more power.

Carissa Veliz 8:46

So if we give most of our personal data to companies, it shouldn’t surprise us that they have more power than countries, that they can essentially write the rules of the game, and many times even avoid paying taxes and bypass the law. If, on the other hand, we give too much of our personal data to governments, we risk sliding into some form of authoritarianism. For democracies to be strong, the bulk of power needs to be with the citizenry. It’s always been like that; it just happens to be the case that in the digital age, whoever has the data will have the power. So what I argue in my book is that we need to have control over personal data as a citizenry, not necessarily as individuals, but as the people, essentially. And there are at least two ways of thinking about personal data that I think are very illustrative, important and accurate, and that typically get ignored. The first is that personal data is not personal. Not really. I mean, it is personal, but it’s more than that: privacy is a collective enterprise much more than it is a personal one.

Carissa Veliz 9:57

Even if I’m super extra careful with my privacy and do everything right, if my friends and family are not careful, there’s nothing I’ll be able to do to protect myself. In that sense, privacy is a lot like ecology. Even if I recycle everything and I’m very careful with my ecological footprint, if the rest of society isn’t, then I’m going to suffer the consequences of climate change just as much as the next person. And that means we need a collective approach to regulating privacy and to taking care of privacy. So when a person says, well, you know, I have nothing to hide, so I have nothing to fear and no reason to protect my privacy, they’re wrong on many counts. First of all, they do have something to hide and something to fear, unless they are exhibitionists with no care for their own future, no worries about being discriminated against, and no concern about all kinds of data-related harms.

Carissa Veliz 10:59

Furthermore, even if that were true, even if they really didn’t care about themselves at all and were willing to take the risk, society has reasons to worry about privacy in a way that justifies banning certain kinds of practices. For instance, if today I went into a hospital and said, hey, I’ve heard about this coronavirus thing and it seems like something interesting to experience, can I just get infected, please? Even if I consented, and even if the doctors were to consent because they found it interesting, we wouldn’t be able to do that, because, among other reasons, I would be a risk to others: I could infect others. So unless it’s a very controlled experiment with very guarded limits, society has an interest in me not getting infected. In just the same way, something that is on the one hand concerning, but on the other hand makes me optimistic, is how much clearer the connection between personal data and national security is becoming.

Carissa Veliz 12:06

After 9/11, the intuition was that if governments could make literally a copy of the data that big tech was collecting, then they could keep the citizenry safer. It turns out that big data is not the kind of analysis that is good for preventing terrorism, among other reasons because terrorism is incredibly rare. Big data is very good at figuring out patterns in troves of data, like figuring out what you’re going to buy tomorrow, because there are billions of people buying things every day. Terrorism is not like that. But furthermore, it’s not only that personal data is not as useful as we thought it might be; it’s an incredible liability. Here are just a few examples to show it.

Carissa Veliz 12:56

About a year ago, the New York Times published an article by two journalists who describe themselves as not very tech savvy, but who nonetheless had access to data from a data broker. With that data, they could figure out the location of the President of the United States, by correlating his schedule with phones around that area and figuring out who was a Secret Service agent. If the President of the United States is not safe, obviously none of us are. But also, the whole country is not safe: that’s a huge national security risk. There was another article in the New York Times that showed how, with this same database, they could figure out details about military personnel, very senior people in public office, and so on. That is a huge risk. Another example is that the internet has largely been allowed to be unsafe so that we can collect personal data. And that means that as the Internet of Things proliferates, we are going to face many more risks.

Carissa Veliz 14:10

Just imagine a house where, instead of having one door and one window to protect, you have to protect a million windows; it’s a lot easier for somebody to break in. And that means that if hackers were to hack, say, 10% of electrical appliances and turn them on at the same time, they could bring down the national grid. And this is not hypothetical. During the pandemic, there have been many attacks on national grids, and it’s just a matter of time before one of them is successful. Just imagine being in lockdown during the pandemic with no electricity for days on end. It could bring a country to its knees. A third example comes from history, and I think it’s really important for understanding privacy and why privacy is a human right. When human rights were designed and agreed upon, we didn’t have the internet; we didn’t have all these data collection practices. So why did our predecessors think it was so important for privacy to be in there? Here’s something to think about.

Carissa Veliz 15:20

Well, I already said that one of the important ideas is that privacy is collective. The second is that personal data is a toxic asset. It’s really a liability, just like a ticking bomb. And here’s a story to illustrate that. During the Second World War, when the Nazis invaded a city, one of the first things they did was visit the registry, because that’s where the data was kept, and they wanted to know where Jewish people lived. There is an incredibly interesting comparison between the country that had the most data about its citizens, which was the Netherlands, and the country that had the least data about its citizens, which was France. And why these two countries had such different practices was not a coincidence.

In the Netherlands, there was a man called Lentz who was a fan of statistics; he was a pioneer of population statistics. He designed the first ID card, and he wanted a system that tracked people “from cradle to grave”. That’s an actual quote. France, on the other hand, had made a conscious decision in 1872 not to collect certain kinds of data, like religious affiliation, for privacy reasons. So when the Nazis went to France, the French said: we have no idea how many Jewish people we have, let alone where they live, or any other details.

And the result was that in the Netherlands, the Nazis found about 75% of the Jewish population and killed them, while in France they found 25% of the Jewish population. In France, the Nazis basically had to rely on Jewish people either turning themselves in or being turned in by a neighbour. And there are two particularly interesting illustrations, one from France and one from the Netherlands. In France, there was a man called René Carmille, who was the comptroller general of the army. He had Hollerith punch card machines, and he volunteered to create a census that collected all the data the French had decided not to collect.

The Nazis were delighted, and they accepted the offer. But months passed, and René Carmille didn’t hand over the data. So the Nazis started getting impatient and started making raids. But again, it was very inefficient, because they didn’t have an easy way of knowing exactly where Jewish people lived. More months passed, and the data was still not handed in. It turned out that René Carmille was one of the best-placed people in the resistance, and he had never planned to give up that data. He made the punch cards and he ran the census, but he never recorded the answers to the question of whether somebody was Jewish or not. And just by that decision, one person deciding not to collect sensitive personal information, he saved hundreds of thousands of people.

Carissa Veliz 18:22

Unfortunately, he was found out by the Nazis, and he died of exhaustion in a camp; I think he knew he was going to be found out. But it just shows the power of what one person who decides not to collect data can do. On the darker side, there is also a very important story about the registry in Amsterdam. The resistance decided to destroy as many records as they could to protect people. They went to the registry and set fire to the records, and they had a deal with the fire department, where they had some sympathisers: the fire department was going to try to arrive late, and to use more water than necessary, so as to destroy as many records as possible. Unfortunately, they were quite unsuccessful; they only managed to destroy about 15% of the records. The Nazis found 70,000 Jews and killed them, and the members of the resistance were also found out and killed.

Carissa Veliz 19:20

And in my mind, the Dutch made two mistakes. The first was that they collected a lot more data than was necessary. The second was that they didn’t have an easy way to delete that data in the event of an emergency. And we are making both of those mistakes on a grand scale never seen before. It’s dangerous, and it’s stupid, and we should back off.

So what I’m arguing is that we should end the data economy, essentially, at least when it comes to personal data. Personal data isn’t the kind of thing that should be for sale. It’s too sensitive, too important. And it’s not only too important for individuals, for the sake of equality, non-discrimination and democracy, as we saw with Cambridge Analytica and as we’re seeing every day with scandals about unfairness; it’s also just too dangerous for society. We have very important rivals in the world who are very good at hacking, and we are just setting ourselves up for a trap.

So I argue that we have to implement a series of things. And I can already imagine a lot of people asking: what do I think of the GDPR? This is a question I often get. I think it was an incredibly important law, and I don’t think we could have come up with something better at the time, because of the pressures, because of the limits of people’s imaginations, for all sorts of reasons. But clearly it’s not enough, and clearly it’s not enough because every day we’re seeing scandals, and data protection agencies are underfunded and understaffed. We need to do better.

One of the first things we need to do is make no data collection the default. Having data collection as the default creates all sorts of inefficiencies and unfairness; it means that every time I go to a website, I have to spend a minute saying no to cookies. And that’s ridiculous. If it were the other way around, if the default were no data collection, I wouldn’t have to do that every time. And if I do want certain kinds of data collected, data we allow, then I can opt in, and then the webpage can remember me for legitimate reasons, and it would be a lot more efficient. We also need to ban the trade in personal data.
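As a rough sketch of what “no collection by default” could look like in code, here is a hypothetical consent check; the names and structure are illustrative, not any real consent library. Absent an explicit opt-in, data is simply dropped, so there is nothing to say no to:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentState:
    """Hypothetical per-visitor consent record: empty by default."""
    opted_in: set = field(default_factory=set)  # categories explicitly allowed

    def allows(self, category: str) -> bool:
        # Opt-in default: no recorded choice means no collection.
        return category in self.opted_in

def collect(consent: ConsentState, category: str, payload: dict) -> None:
    """Store data only if the visitor explicitly opted in to this category."""
    if not consent.allows(category):
        return  # default path: the data is never collected
    print(f"storing {category}: {payload}")

# A visitor who opted in only to having their preferences remembered:
visitor = ConsentState(opted_in={"preferences"})
collect(visitor, "preferences", {"lang": "en"})      # stored
collect(visitor, "tracking", {"clicks": [1, 2, 3]})  # silently dropped
```

Flipping the default this way removes the cookie-banner friction she describes: silence means no, and the site only remembers visitors who asked to be remembered.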

Carissa Veliz 21:39

Personal data is not the kind of thing that should be bought and sold. Yes, you can collect it if you need it; yes, you can analyse it if you need it. But we have to raise our cybersecurity standards. Our cybersecurity standards are so bad that it makes me want to cry every time I peek into them. It’s really pathetic: we are so much better at collecting data than we are at keeping it safe. And that’s not fair. If you’re collecting such sensitive data and you can’t keep it safe, you shouldn’t be collecting it in the first place. We should also implement fiduciary duties. Fiduciary duties are duties of care that are relevant in a professional relationship of asymmetry. Just as your doctor has a duty of care towards you as a patient, whoever wants to collect or manage your data should have a duty of care towards you as a data subject. And this is only fair, because what we’re entrusting these professionals with is very important. In the case of medicine, it’s your body; in the case of finance, your financial affairs; in the case of your lawyer, your legal case; and in this case, it’s your personal data, which is incredibly sensitive.

Carissa Veliz 22:48

Professionals can have conflicts of interest. You can imagine your doctor wanting to perform surgery on you because they want to earn more money, or because they want to test new equipment, or practice their skills, or have an extra data point for their research. But they can only operate if it’s in your best interest. In the same way, you can imagine your financial advisor wanting to buy more stocks than is wise because they get a cut, a commission. They can’t do that unless it’s in your best interest. And if professionals aren’t willing to take on this duty of care, then they should have a different job. In the case of medicine, it’s not enough for somebody to be really good at opening people up and really curious about what’s inside; they need to have a duty of care, and they need to want to make people’s lives better. Otherwise, they shouldn’t be a doctor; they can be something else. In the same way, if you don’t want to have a duty of care towards people with their data, that’s absolutely fine, and understandable, because it’s a lot of responsibility. But then you shouldn’t be collecting data in the first place; you should look for a different kind of job. So those are the first things we should do. Another proposal that I think is very important is to ban personalised ads.

Carissa Veliz 24:15

The kind of advantage we get from them is really minuscule, we can get it in other ways, and the disadvantages are huge. So what’s the advantage? Okay, you get to see relevant ads. And that’s great, and people really value it; I can tell, because people often come up to me to talk about this. But of course we can get relevant ads without such grave violations of privacy. If you’re reading an article about bikes, or about roller skates, there’s a good chance that you might be interested in ads for sports gear, and it’s fine to have those ads attached to that article. But for that to happen, we don’t need to know your sexual orientation, your political tendencies, your health records or your purchasing power. At the moment, with real-time bidding, what’s happening is that companies are getting huge amounts of very sensitive information about people, many times even before they consent, and they get to keep this data. It’s just incredible to me that we still allow it to happen.
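A minimal sketch of the contextual alternative she describes: the ad is chosen from the page’s content alone, with no user profile involved. The keywords and inventory below are invented for illustration:

```python
from typing import Optional

# Hypothetical ad inventory: each ad is tagged with topic keywords.
AD_INVENTORY = {
    "sports-gear": {"bikes", "cycling", "skates", "helmets"},
    "mortgages":   {"loans", "property", "interest-rates"},
}

def pick_ad(article_keywords: set) -> Optional[str]:
    """Pick the ad whose tags best overlap the article being read."""
    scores = {name: len(tags & article_keywords)
              for name, tags in AD_INVENTORY.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

# An article about bikes and roller skates matches sports gear,
# without the system knowing anything about the reader.
print(pick_ad({"bikes", "roller", "skates"}))  # -> "sports-gear"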

Carissa Veliz 25:28

What are the disadvantages of personalised ads? Well, they’re huge. On the one hand, there’s the risk of polarising society, as I’m sure you’ve heard before. We don’t have unmediated access to reality; we can only access reality through our screens, especially during pandemic times. And that means that if we see completely different pictures of reality, it’s going to be very, very hard to have a rational conversation when we talk to each other. Personalised ads also create a risk of personalised propaganda, which is related, and of political firms trying to sway elections in ways that are very questionable. Whether they succeed or not is a different question, and a very important one, but that they’re even trying is already incredibly concerning. And you might think, well, Cambridge Analytica is gone. And yes, it’s gone, but there are around 300 firms that are very, very similar still working in political campaigns today.

Carissa Veliz 26:28

Another risk is discrimination. If you’re a man, you probably see ads for higher-paying jobs than I do. There’s a risk of all kinds of questionable targeting. But it also turns out to be an incredible financial risk. The people who created this market for ads are the same people who created the market for subprime mortgages that caused the financial crisis in 2008. And there’s very good reason to think we might be on the brink of a similar financial crisis unless we do something about it pretty soon, because there’s more and more evidence that targeted ads don’t work as well as we thought they might. They work a little bit, but they’re so expensive that they’re not worth the cost. Furthermore, there’s a lot of click fraud, and it’s very hard to assess, whenever you buy an ad, exactly who saw it, whether they clicked or not, for how long, what the effect was, whether they would have bought that product if the ad hadn’t been there, and so on. And there’s a fear that once advertisers really realise the extent of what they’re spending, essentially for possibly smoke, the bubble will burst and it’ll all come crashing down.

Carissa Veliz 27:48

So the advantages are very small and we can get them in other ways, while the disadvantages are huge. We should ban personalised ads; they’re really just not worth it. In the book, I have about 20 different measures. I also think the role of diplomacy is going to be very, very important. Now is the time for the US and Europe and the UK, Australia, New Zealand and Japan, and possibly democratic countries in Latin America, to come together and agree on how we are going to regulate AI and data. Now is the time, just as we had to have that alliance at the end of the Second World War.

Carissa Veliz 28:26

And just to end, because I’m mindful that we’re running out of time and I want to leave enough time for questions: it might sound radical, this idea of ending the data economy. But keep two things in mind. The first is that we have ended exploitative economic practices in the past because we thought they were too toxic for society, so this is not the first time we would do something like this and succeed at it. The second is that it only seems radical because we’ve gotten so used to the status quo, which is pretty far from ideal. If you think about it, what’s really radical, what’s extreme, is to have a business model that depends on the systematic and mass violation of rights. That’s unacceptable, and we shouldn’t get used to it. So I look forward to hearing what you think about this.

Joe Tidy 29:16

Thank you very much, professor. That was a really fascinating discussion. I was curious, before we get to the questions: how did you come to these conclusions in your life? Have you always been anti data collection and quite privacy conscious, even when you were growing up? Or how did this all develop for you?

Carissa Veliz 29:35

That’s an interesting question. I do have the intuition that some people are more privacy conscious just by personality, you know, more shy, more introverted. So maybe there was an element of that. But what really got me thinking about privacy was actually something very different. My family were refugees from the Spanish Civil War, and they never talked about the war; it was kind of a taboo topic in the family. So when they died, I went into the archives thinking, you know, maybe I’ll find something. And I found a huge amount about my family, things that they hadn’t told me. And it made me wonder whether I had a right to know these things.

Whether I had a right to tell the rest of my family about it, whether I had a right to publish about it; I really wanted to write about it. As a philosopher, I looked into philosophy to see what philosophers had to say about it. And I was so unsatisfied with what I found that I thought there was just a huge gap in the literature. And it just so happened that that summer, Edward Snowden came out with his revelations. And I thought, you know, we have to think more carefully about privacy, with a big-picture perspective, and I’m not sure enough people are doing that. So that’s what got me into it.

Joe Tidy 30:48

Right. That’s what set you on the path. Congratulations on the book, by the way; I’m definitely going to get it and have a read. So let’s go to the questions, if that’s okay with you. Laura will tell me in the messages, but I think we’ve got about five minutes or so. Question one: a few weeks ago, Mikko Hyppönen said that data is the new uranium, which is interesting, rather than data is the new oil, because uranium can remain explosive for decades. Do you like this analogy?

Carissa Veliz 31:20

Yeah, I like it, though I think I would need to do more research about uranium, because it’s not exactly my field of expertise. In the book, I use asbestos, and I find asbestos really helpful, because it’s something that is very easy to mine, it’s very cheap, and it’s incredibly useful. We put it in walls, plumbing, cars, all kinds of things, because it’s very hard for it to catch fire, and it’s very durable, very strong. It just turns out that it’s also incredibly toxic. There’s no safe threshold for exposure, and every year hundreds of thousands of people get cancer from it, because we put it in our structures and now it’s really tough to get out; every time you move it, you create risk. In the same way, data is very easy to mine, it’s very cheap, it’s very useful, but it’s incredibly toxic. And once it’s in the structure, it’s really hard to recall it in a safe way. It’s a ticking bomb that poisons people, even though it’s a very slow kind of poison.

Joe Tidy 32:24

And where do you stand on the whole “data is the new oil” idea? There’s a growing number of people saying that that is actually just completely the wrong way to look at it.

Carissa Veliz 32:32

Yeah, I think it’s the wrong way to look at it, for a variety of reasons. I mean, there’s something to it, in the sense that data is very valuable. And we also have to remember that oil is not very clean energy and has done a huge amount of harm, so in that sense there’s also a similarity. But in particular, I find it problematic to think about data as a kind of property. One of the reasons is that, as I mentioned, privacy is really collective. When I have a house and I want to sell it, unless there are other considerations in play, nobody has a right to tell me not to sell it. But when I sell my genetic data, I’m selling the data of my parents and my siblings, and my kids, and distant kin, who could be deported for it, who could be denied insurance; it can have all kinds of bad consequences. So I don’t think I have the moral authority, the authority of ownership, in the same way that I would if I had a barrel of oil.

Joe Tidy 33:31

Hmm. Okay, I hadn’t thought about it that way; it’s an interesting way to look at it. Question two: will the open data movement in the UK and EU impact the power of big tech in the coming future? And if so, why do you think that could be?

Carissa Veliz 33:48

That’s a very interesting question, and I think it depends on how it pans out, because “open data” is a bit vague. If we are very, very careful about making sure that the only data we consider eligible for open data is non-personal data, then that’s going to be very different than if we also consider personal data part of the package. And it turns out that it’s just really hard to distinguish between personal and non-personal data, because data that we have many times considered non-personal because it was anonymized turns out to be really easy to re-identify. And we don’t know what kinds of technology we might have in, say, five or ten years’ time that could make data that seems non-personal today become personal.

Carissa Veliz 34:33

Say quantum computing gets developed quickly, or any number of other developments; it’s really hard to tell. But in general, I would find it more promising to not allow big tech to mine our personal data, to fix that asymmetry, than to say, well, let’s just share all the data. Because if we share all the data, that doesn’t mean that institutions and companies will have equitable access to it. A company like Google has a lot more resources, both in terms of experience and in terms of people, than a government to analyse that data. So even if they have the same data, it could still be the case that Google has a lot more power than the government. The devil is in the details, and I don’t think we have enough information right now to give a meaningful answer to that question.
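Her re-identification point can be made concrete with a toy linkage attack: an “anonymized” dataset with names removed is joined against a public record on a few quasi-identifiers. All the data below is invented for illustration:

```python
# "Anonymized" health release: names removed, quasi-identifiers kept.
release = [
    {"zip": "0461", "birth": "1965-07-31", "sex": "F", "diagnosis": "..."},
]

# A separate, public register that does contain names.
register = [
    {"name": "Jane Doe", "zip": "0461", "birth": "1965-07-31", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth", "sex")

def reidentify(release, register):
    """Join the two datasets on quasi-identifiers to recover identities."""
    for row in release:
        key = tuple(row[q] for q in QUASI_IDENTIFIERS)
        for person in register:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                yield person["name"], row["diagnosis"]

print(list(reidentify(release, register)))  # [("Jane Doe", "...")]
```

Nothing in the release names anyone, yet a handful of attributes is enough to pin the record to one person once a second dataset exists to join against.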

Joe Tidy 35:30

What are the chances of social media companies being forced to change to a subscription-type service, where they charge in a normal fashion and stop commercialising data as their business model? What do you think of that?

Carissa Veliz 35:43

I think the chances are not bad. More than forcing social media to do this, it would be more like: okay, this business model of trading in personal data is banned now, so if you want to survive, come up with a different funding model. And then it would be up to these companies to come up with a different funding model, and I guess the most obvious one is just a subscription-type service. At the moment, for instance, Twitter is trying out something new, and we’ll see how it works: they’re trying to offer super users the possibility of charging people for their content. So you might have free content from more normal people, and if you want to access celebrities’ content, then you might have to pay something. I think it’s worth exploring different kinds of models and seeing what works, and there’s a good chance that might be at least part of the future.

Joe Tidy 36:33

Do you think, though, we’re only going to get there through banning? You know, if we were to wait for the big tech companies to make their own minds up and move to those models, would that happen? Or would we be waiting forever?

Carissa Veliz 36:48

I think we’d be waiting for a long time. Maybe with Twitter, we can already see how they’re trying something different. But I’m very, very, very pessimistic about Facebook. My sense is that they’re so entrenched in and convinced of the business model that they would never move. There would have to be so much pressure from users that we would have to organise a lot more than we have, and it’s not clear to me that we are capable of that to that degree.

Joe Tidy 37:19

Yep. Question four, will the rising awareness of privacy rights help to counter the threat posed by fraud?

Carissa Veliz 37:29

I’m not sure I understand the question exactly. What kind of fraud are we talking about?

Joe Tidy 37:33

Will the rising awareness of privacy rights help counter the threat. So I suppose what they’re saying is: if we’re more careful with our data, it might help solve, or go some way to solving, the problem of fraud. For example, if there’s less data out there about me, then there’s less data that can be used to impersonate me, or steal my identity, or hack me, essentially. Do you think there’s an argument there? Is that another benefit of privacy that we haven’t really thought about?

Carissa Veliz 38:02

Yeah, there might be, though I think a lot of things have to happen before that’s true. A simple awareness of privacy rights is probably not enough. But, for instance, think about something like Tim Berners-Lee’s project Solid taking off. The idea of this project is having pods in which you keep your personal data and then share it with whomever you want. You can always revoke consent, and you can know exactly who has your data and what it’s being used for. Then there would be not only a way of controlling personal data and making sure that it’s not just sold forever and ever from party to party, but also a kind of certification that it’s actually your data, and that you are certifying that it’s accurate and true.

And that might be really useful against fraud, of course. If you are the only one controlling that pod, then presumably it would be very hard for somebody to, say, take out a loan in your name, or something similar. But I think more would need to happen than just awareness; we need better systems in place to make sure that we can technically protect data.
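To make the pod idea more tangible, here is a very simplified sketch of data that stays with its subject and is shared through revocable grants. This is not the real Solid API, just an illustration of the access model she describes:

```python
class Pod:
    """Hypothetical personal data pod with revocable access grants."""

    def __init__(self, owner: str):
        self.owner = owner
        self._data = {}     # resource path -> value
        self._grants = {}   # resource path -> set of allowed parties

    def put(self, resource: str, value: str) -> None:
        self._data[resource] = value

    def grant(self, resource: str, party: str) -> None:
        self._grants.setdefault(resource, set()).add(party)

    def revoke(self, resource: str, party: str) -> None:
        self._grants.get(resource, set()).discard(party)

    def read(self, resource: str, party: str) -> str:
        # Data never leaves the pod without a live grant, and the owner
        # can inspect self._grants to see exactly who holds access to what.
        if party not in self._grants.get(resource, set()):
            raise PermissionError(f"{party} has no grant for {resource}")
        return self._data[resource]

pod = Pod("alice")
pod.put("/profile/address", "1 Example Street")
pod.grant("/profile/address", "bank")
print(pod.read("/profile/address", "bank"))  # allowed while the grant holds
pod.revoke("/profile/address", "bank")       # consent withdrawn: access ends
```

The design choice that matters here is that access is a standing, revocable relationship rather than a one-off copy: once the grant is revoked, there is no lingering duplicate of the data to be resold or leaked.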

Joe Tidy 39:05

Right. Question five: data fusion creates more value to solve some of the biggest global challenges. What is your solution to getting the balance right between data sharing or use versus privacy? And how do we unleash the power of data to overcome data asymmetry? That is, for me, quite a technical question that I don’t fully follow; hopefully you’ll understand that one.

Carissa Veliz 39:30

I love this question, because it’s true that data can do useful things, right? We wouldn’t want to not use data to solve the coronavirus pandemic, or any other kind of really big problem, because of concerns about privacy. So what is the right balance? It’s a combination of things. The first question is: do we really need this data? Too many times we give up data under the narrative that it’s really necessary when it’s actually not. Just to give an example, the first proposals for the coronavirus apps were really privacy-invasive, with a centralised database and all that, and it wasn’t necessary. So the first thing we have to make sure of is that we’re not giving up data unnecessarily.

The second thing we have to make sure of is that data has an expiry date, that it has to be deleted. There’s a really good book that I recommend, called Delete, which argues that for most of history, forgetting has been a really important thing. Not only does it make us more forgiving, with all these kinds of psychological benefits, it’s also a kind of filter: you only remember what you really need to remember, and what you don’t need to remember, you discard. This change in the digital age, in which the default is remembering everything because it’s cheaper, can have really bad consequences; among others, that we live in a very harsh society that never forgets or forgives your mistakes.
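As a sketch of what “data with an expiry date” might look like in practice, here is a hypothetical store in which every record must declare a time-to-live, so deletion is the default outcome rather than an afterthought. The names and the contact-tracing example are illustrative only:

```python
import time

class ExpiringStore:
    """Hypothetical store where every record must declare an expiry."""

    def __init__(self):
        self._rows = {}  # key -> (value, expires_at)

    def put(self, key: str, value: str, ttl_seconds: float) -> None:
        # No TTL, no storage: forgetting is built in, not bolted on.
        self._rows[key] = (value, time.time() + ttl_seconds)

    def get(self, key: str):
        entry = self._rows.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:
            del self._rows[key]  # expired data is deleted on access
            return None
        return value

store = ExpiringStore()
# e.g. contact-tracing check-ins kept for 14 days, then forgotten.
store.put("checkin:alice", "venue-123", ttl_seconds=14 * 24 * 3600)
print(store.get("checkin:alice"))  # "venue-123" until the 14 days pass
```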