Debbie Reynolds Consulting LLC


E55 - Wayne Cleghorn, Chief Executive Officer & Practice Director at PrivacySolved

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


E55 - Wayne Cleghorn, CEO at PrivacySolved (49 Minutes) Debbie Reynolds


SUMMARY KEYWORDS

privacy, data, people, directive, data protection, debate, world, big, law, laws, companies, agree, states, put, competition, understand, absolutely, question, EU, lawyers

SPEAKERS

Debbie Reynolds, Wayne Cleghorn


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on my show, straight from London, England, United Kingdom, Wayne Cleghorn. He is the Chief Executive Officer and Practice Director at PrivacySolved. Hello, Wayne.


Wayne Cleghorn  00:42

Hi, Debbie, it is a real pleasure to be with you.


Debbie Reynolds  00:46

So I'm happy to have you on the show. You're one of these clandestine people who don't put your photo up, so this was the first time I had actually seen you, on this podcast. You're super smart. I love your profile because of all the different locations that you work in. So you have experience in almost any country in the world related to privacy. That's probably the one thing that attracted me to your profile. You went to Northwestern University, the Pritzker School of Law, in Chicago, and I'm a Chicagoan, and I went to Loyola.


Wayne Cleghorn  01:30

Absolutely, neighbors. Absolutely, yes, yes, yes. Great town. I love Chicago, you know, I really do.


Debbie Reynolds  01:36

Chicago is amazing. And so I was attracted by your profile, even without your photo, because I like people who understand international data flows, and you definitely do. You're a very cosmopolitan gent; I can tell this from us chatting before the podcast. So tell me a little bit about yourself and your journey into PrivacySolved.


Wayne Cleghorn  02:05

Yes. So I really started as a lawyer, right. So I was at law school, and I remember in my third year, I took a course in medical law. And it was a very intense course because it was about ethics, and it was about consent, and it was about all the big issues where medicine and law intersect. And I remember in my final exam, they asked about the new UK Data Protection Act, which came out of the 1995 directive. And I was writing this paper in my final exam, and I just had this kind of out-of-body experience; I thought, my God, this is a really innovative piece of law. It's really clever. It's quite neat. It's not too long and complicated. There are big issues within it. And I looked around the world, and I thought, God, this is quite interesting. And then I tracked it through, and in the things I did coming up after that, I tried to make sure that there was some kind of data protection element. So I ended up working for a big commercial law firm in the personal injury, public liability, and commercial litigation section, and there were a lot of private medical records coming through there. And then I picked this up in my LLM at Northwestern. I wrote a big paper, my thesis, I suppose you could call it, about consent to non-invasive surgery in the UK and the US, the ethical rules around that, the legal rules around that, best practice, and all of that. And I also took a course in Biomedical Ethics. There's a lot of AI ethics discussion right now, but I was doing the medical side, what, 10 or 12 years ago. So I come to that debate from the medical side, and I'm able to really get into that debate quite strongly. So that's how I started. I started from a legal background, but I didn't stay just a lawyer.
I got involved in the IT side. I've got technical qualifications, IT qualifications, and also other relevant qualifications, you know, dealing with businesses, project management, etc. So I'm a mixed bag, but I started out in the law, and I'm still a lawyer, so I practice in the UK, and I've also practiced in Belgium as well.


Debbie Reynolds  04:52

You truly are an International Man of Mystery.


Wayne Cleghorn  04:59

I think I want to take that; actually, I'll take it, on the good side. Yes, yes,


Debbie Reynolds  05:03

Definitely take that. Yeah, so this is really interesting. I actually did a lecture for the University of Massachusetts, I'm sorry, the University of Maryland, about biotech ethics and privacy. So it's really interesting. So I like to talk to people who understand that type of privacy because, you know, almost anywhere in the world, you have to protect medical data in a certain way. So that gives people an idea of how we're working toward more consumer privacy and other things that aren't necessarily medical. Right?


Wayne Cleghorn  05:43

No, I absolutely agree. And it's one discipline where actually you can't, as I call it, stay in the gray. There are certain disciplines where you can kind of say, oh, well, you know, I'm not quite sure, and let's see how it works. In Biomedical Ethics, and in medical Data Protection and Data Privacy, you really have to come to a view, and you have to take a position. And you have to be able to argue that through, operationalize that through, and live that through, and I love areas like that, actually. Sometimes it's life and death, and it's good if you're able to actually come to a view and hold a good position.


Debbie Reynolds  06:31

Yeah, that's true. And then also, one thing about privacy playing out over the world, you know, I feel like some people have approached it as kind of an ivory tower issue, where, let's just debate it at, you know, the college level and at conferences, and not really do anything about it, whereas, to me, it's a human problem. And that's something where everyone needs to be in this conversation, not just the intellectuals and the scholars.


Wayne Cleghorn  07:11

No, I think you're absolutely right. I agree with you. Whenever I see a discussion about Privacy or Data Protection, and it doesn't center on the human, and it doesn't center on the individual, I get a little bit suspicious as to what the starting point is, where the power is, actually. If you're not talking about the person, then who has the power, and where is the power being exercised? And I suppose the story about that is that I set up PrivacySolved for that reason, because I thought, actually, what we need are solutions. And I love our academic colleagues. In fact, we have a Ph.D. on our staff, and he is an Identity Management guy, he's a doctor, and he gets brought into those kinds of discussions and organizations, and he brings so much talent to the team. But actually, I wanted a multidisciplinary team. So we have lawyers, we have a couple of CISOs, Chief Information Security Officers, right, board-level experienced people; we have a handful of lawyers, and I am one of them. And of course, we have the technical people, you know, a couple of people with engineering backgrounds as well. And it's all about the rights of the individual, but also finding solutions, or a place, or an outcome that actually balances all the rights as much as possible.


Debbie Reynolds  08:47

Yeah, I agree with that. I want to talk, and I think you're the perfect person to ask this question. Let's talk a little bit about Data Privacy versus Data Protection. It gets kind of mangled up in the US. I feel like people in Europe understand the distinction more than we do in the US. And I think part of that is that in the US, we're trying to achieve both: we're trying to get legal rights for Data Privacy and also Data Protection, whereas in the EU, as it's laid out in the GDPR, privacy is actually constitutional. So privacy is a fundamental human right and has been since the 50s in the EU, and we don't have that; we don't have privacy as a fundamental human right in the US. So a lot of times, I see a lot of eye-rolling from my European colleagues when people in America call the GDPR a Privacy Law; it's actually a Data Protection Law, because you already have privacy as a right. So when we're talking about privacy here, we're talking about wanting a right to privacy as a fundamental human right, which we don't have, and wanting to protect the privacy that we want to have. So I would love for you to talk a little bit about that distinction.


Wayne Cleghorn  10:18

No, that is really the heart of the matter, because actually, let's go back to 95, when the original directive came out, and also to the GDPR as well. They have three bases; they were brought about for three reasons, really. The constitutional right to privacy, this is the traditional kind of international human right to the home, to correspondence, to family, right, and that's known around the world. What the Europeans have done is that they've added to that a fundamental right to Data Protection, which is a subset of that right. So, in fact, if you have those two rights together, it's quite a powerful combination. And that's because of history. History was always rich in Europe: countries fighting each other for no reason, I'll go on record to say that, for hundreds of years; then we had a couple of world wars, you know, neighbors basically beating the hell out of each other. And, you know, state control and Communism, and right-wing Fascism in Spain; there was a lot of politics going on. So, you know, holding on to fundamental rights and Data Protection became a thing, and it continues to be a thing. And when the European Union came together, it was like, okay, so how are we going to glue ourselves together? How do we stop fights, and there are still fights, you know, Brexit, all that lovely stuff going on? How do we stop fighting? So that's important. But the thing that's often forgotten is that the 95 directive and the GDPR are also trying to allow data to rush around the European Union really quickly. That's often forgotten. Oh, these Europeans with their fundamental rights; no, no, it's actually quite a commercial, tactical thing of allowing data to flow from Budapest to Paris, to Dublin, to Naples, to Vienna, and all the way down to the islands of the Mediterranean and Spain. So it's trying to put those two things together. So let me summarize. Data Protection is a set of laws of European origin that are quite technical.
And actually, they're about protecting data from creation to where it ends, right, the lifecycle. Privacy, the fundamental constitutional right, is fused into that; it's kind of baked in, but they are separate. And if you talk about privacy, you should always indicate what you are talking about: do you mean the fundamental right, or are you really using privacy as a byword for the technical kind of Data Protection? I know that the US uses the word privacy quite broadly. So whenever I'm talking to a US audience, I'm always quite open-minded as to what they mean. But I think as the world begins to bring out laws, those two terms will come together, and we will see regions developing as it goes. So I'll pause there, but that's my little storytelling about the different definitions.


Debbie Reynolds  13:54

That's great. I love that you set that out solidly for the record. Wayne just set us all straight here, so everyone is clear on the difference between privacy and protection. So I'm glad you mentioned the 1995 directive. I separate privacy people, or people who are in our field, into two categories: people who understood this area before the GDPR came out, and then after. So you're part of that first group, and I love to talk to people about this. When the GDPR came out, it did a lot of important things, but the reason why we're even talking right now is because of the GDPR: the GDPR made privacy a C-suite issue because of the fines. Okay, so you and I know that the Data Directive isn't majorly different from the GDPR; it really isn't. They tightened up some things, and they added some things, obviously. But it was so funny to hear people complain about certain things in the GDPR. And I'm like, well, what were you doing from 1995 up until, you know, 2016? Because this was already in the previous Data Directive. What are your thoughts about that?


Wayne Cleghorn  15:21

No, I cannot say it better than you did. The truth is, here's what I did. I remember when the GDPR was coming out, there were a lot of drafts, right? There were a lot of leaks. But what I did is that at the start of 2016, when the law was finalized, I took some time away, because I used to lead a board, actually, who were looking at this, and I read the text from cover to cover. And my conclusion was exactly like yours; it was like, people are not doing half of this stuff. They're not handling automated decision making, all the stuff about, you know, making sure that international transfers are well managed, where you do more than just put a contract in place. So my conclusion in early 2016 was, we, and I put myself in that, were not even complying with the 95 directive. So then the regulation came in, and a regulation has to be the same across Europe; that's the difference. Each country had a little bit of scope as to how they interpreted and applied the directive, but that wasn't really working; that's why we needed a regulation, to make it a bit more detailed. You're absolutely right. A lot of organizations got caught short, and they said, oh, God, what's this big, oppressive thing that has come upon us? And I absolutely agree with you; it wasn't a big and oppressive thing. If you had high standards, and you were trying to be as compliant as possible with the directive, GDPR would just be an extension: a little bit of spending, a little bit of investment, a little bit more training. But it wouldn't have been the seismic kind of push and pull, and boards wouldn't have been so shocked, oh, my God, what's happening to us? So it's interesting that you have that view; I share it, I share it.


Debbie Reynolds  17:20

Yeah, one thing, so I do a lot of research. I remember when the directive came out in 1995, and the things that preceded it as well; I was, you know, very well versed in that. And so, when people talk about Data Privacy and Data Protection in the US, we have all these people trying to create these, you know, Hail Mary laws. It's like, okay, we haven't really done anything significant on this in 20 years, so let's try to create this huge, massive thing out of the blue, and it just doesn't happen that way. You know, it takes many, many years. It takes a lot of building blocks to do it. And so, in the US, I feel like we are where Europe was before 1995. We don't have comprehensive Data Protection or Data Privacy laws in the US. Before 1995, Europe was like us: all the member states had their own thing that they were doing. And so I was happy to see that someone decided, you know, let's get together and try to find things that we can harmonize on and that we can agree on, and then, over the years, build upon that. In the US, we don't really have, you know, an EDPB or a Working Party 29 whose job it is to look at privacy, or look at Data Protection in that way, and come up with these proposals. So for us, it's mostly like, okay, let's find a politician, let's have them draft a bill, put their name on it, and then, you know, especially in an election year in the US, nothing gets done. Privacy doesn't show up on anyone's top 10 list, you know, in an election. So it's like, okay, let's see if we can make a little bit of progress in a non-election year. So what we have now, in the US, is the states running amok doing their own thing. Instead of relying on states, it's like we have 50 different countries, you know, with a shared currency. What are your thoughts about that?


Wayne Cleghorn  19:47

Let me take this question from what I call a geopolitical level. The US has been very successful in exporting technology around the world. Big companies have a lot of data. And I genuinely think, at a federal level, that the US really should have some sort of Federal Data Protection Law. I think that's overdue. And the reason why it's important geopolitically is, I think, that the conversations the US has with the EU about, you know, Safe Harbor and data transfer would be better if the US was saying something to the world federally. It doesn't have to be exhaustive. It doesn't have to be complicated; you know, look at all of the countries that have gained EU adequacy. You can see the different approaches. But I think that's really important. The second thing is, I think, again geopolitically, China should not have a larger conversation about Data Privacy than the US. I think Data Privacy is deeply rooted in the US psyche. I mean, I'm a big fan of America, right, and I understand where the laws have come from. And in conclusion, I think the Commerce Clause covers this: get together on the Commerce Clause, all the states, and put a framework of privacy in place, and then that would be the federal law. And, of course, the states can then continue to do their own work. But at least there's almost like a minimum, directive-style standard on the federal level that speaks to the world, that's the, how can I put this, international, outward-looking standard, and then the states can come in. By the way, let me say this: the states have been very good at data breach, and you know this, Debbie. Around the world, the US understands data breach and has the most mature system going. But that doesn't work for an omnibus privacy law; that needs to be federal.


Debbie Reynolds  22:15

Yeah, I agree with that. I think that, you know, the reason we haven't made a lot of progress on Federal Privacy is two things, probably more than two, but two that I want to talk about that are stopping our progress. One is this whole idea of preemption: if we have a federal law, how will it preempt the State Level Laws, since states have their own sovereignty within the constitution? So that's kind of an issue. And then the Private Right Of Action: corporations don't want to see these laws; they don't want to see a GDPR in the US where you have these huge fines if you don't do X, Y, and Z. Those two things people are trying to solve are the hardest parts of it, and they don't want to do that. So in my view, it's like, let's create a standard, let's create a harmonization in the language. Something as simple as, let's come together and define what Personal Data is nationally in the same way, as opposed to differently in different states. And let's leave out the private right of action and leave out the preemption because, in a way, that's what Europe has, right? Even though you all have the GDPR, every member state also has additional laws that you have to follow based on where your customer is. So, you know, I think we can make progress. And it was funny that you brought up China. A lot of people thought, well, you know, we can't have a GDPR because we're so different, and stuff like that. In fact, in China, not only does China have a framework, they have people who are dedicated to privacy within their government, and we don't have that. We have the FTC, the Federal Trade Commission, in the US, and they handle everything; they handle all types of stuff. So not having a dedicated agency working on privacy issues, I think, is one reason why it's hard to make progress here.
And then some people are of the mind, well, let's just use these old laws. And a quick story that's funny: I actually dusted off the 95 directive and read it again recently, and I was just crushed that the US laws that were mentioned in the Data Directive have not changed. These are the same laws that exist right now, and there has not been any progress, literally, since 1995. So I was like, oh, my God, I can't believe it. So what are your thoughts?


Wayne Cleghorn  25:18

Well, I agree with you. And thank you, actually, for explaining all of that so eloquently; that's very well put. I heard a Chief Privacy Officer of one of the US big tech companies actually making an admission, and I agree with her. She said she fears that the US is losing the ability to really join in the global Privacy Debate. The Data Protection debate is happening worldwide. And I think, at the end of the day, we can kind of disagree about where we start from and where we go to, but I think the US and North America should be very much involved in the global debate, because it's moving very fast; technology is moving rapidly. And I think having, as you say, some sort of regulator, some sort of agreed cross-country standards and definitions, would definitely help, and, I think, actually be a bit more efficient. Because if you're trading in three states, and you have three different data breach and personally identifiable information definitions, that's an extra cost, right? Because you have to get the lawyers to say, oh, God, do we tell? Do we not tell? What do we do? So I think efficiencies are definitely to be had from coming together, practically, but also, on an international level, having a bigger voice, having a bigger say as to what privacy means. A final quick point about where privacy starts from. I think the world can be rich and storied in terms of where privacy comes from. For the Europeans, privacy comes from an individual rights perspective, and also the flow of data around certain areas and countries with equivalent standards. For the Chinese, it's very much a security law, and it very much is about protect the borders, protect the data, protect the people, not as individuals but as a country. And in other places, it's more about commercial empowerment; it's about allowing businesses to understand very clearly how to grade and risk the data they have.
There are a few kinds of individual rights in those regimes, but it's really about allowing the holders of data to understand and risk the data they have. They start from different places, but we have a commonality in terms of having a global conversation about these big issues.


Debbie Reynolds  28:11

Right, I agree. Yeah. And then China, you know, that law is consumer law, right? It's not human rights law, like the GDPR. And in the US, that's what we have: we have consumer law, so they are parallel. So yeah, the question was like, okay, China can create this framework, so what I guess we need is a strategy, not tactics. We have tactics now; we need someone to talk through a high-level strategy. You know, what is our plan going forward? Because when I see someone quoting a law that's like 70 years old, that's disturbing to me, knowing that computers didn't exist back then, and, you know, data is very different. And so I'm dismayed when I hear people say, well, you know, let's just dust off the 70-year-old law and apply it to this new thing. It doesn't apply at all, you know. So, yeah, I want to see some progress, definitely, on that front.


Wayne Cleghorn  29:24

Well, I'm hopeful. I'm, I suppose, an eternal optimist. I think there will be drivers. And it's really interesting. I was quite surprised at how Apple's technical changes to their operating system literally changed the ecosystem, where the competitors started saying, oh, my God, you know, people are now involved, people are now making decisions. It surprised me, the number of people who opted for the privacy-preserving technology when it was given to them. So I think a change like that, as well as political changes, as well as maybe a few more data breaches or cyber hacks; I think a collection of things will start a debate and allow legislators, and pressure groups as well, to actually start to create. And I think in that creative process, we will get there, so I hope.


Debbie Reynolds  30:40

I agree with what Apple did. I think people really didn't understand what Apple did. I did a video about this last year; I was like, hey, this app tracking transparency thing is going to be huge, so you all need to pay attention, and people weren't really listening. And now we see all these reports about these companies saying, oh, we lost so-and-so billion dollars because of app tracking transparency and stuff like that. So, you know, it's a good and bad thing. It's good that Apple did this, right, app tracking transparency. And the thing that it illustrated is that, you know, a company like Apple that has their iOS operating system, when they make a change like that, it can be global, right? We don't have any privacy laws that are global in that way. And so they can push a button and make a change that impacts everyone who has their product around the world. So companies that have operating systems that run smartphone systems have extraordinary global power, and it's great that Apple used its power for good. And unfortunately, not all companies will do that. But I think it is important. And I think, you know, you and my other friends really applauded that effort; even people in the US really liked it. And what they did was, you know, I think people are generally lazy, right? So what Apple did was make it so that not being tracked is the default, and if someone wanted to opt in, they had to take action. We know people are lazy, right? So very few people took that action. And so it did a lot to prove the point, because people kind of anecdotally said, okay, well, you know, we don't think that people really care about privacy because they're not up in arms about it. So now we see: when people have a choice, they choose what they want. Most people want their privacy, and they didn't want to share their data, or they wanted companies to get some consent from an individual about the data.
And then there's the fact that Apple or, you know, Google, or any company that manages the operating system of products that we use, has extraordinary power and influence over, you know, how we attack these kinds of data issues.


Wayne Cleghorn  33:35

Right, time will tell exactly, you know, how this iOS change really affects the ecosystem, and whether it is as privacy-preserving as Apple itself hopes, or as others hope. We will see; I think we're still in that phase. But I think you're absolutely right; there was a big PR line that says, oh, well, no one cares about privacy, and that debate irritates me, and I use that word deliberately. Oh, well, you know, if we're talking about security, beware, there has to be a trade-off in privacy. I've instructed all my team to really get into that debate with every board and every project we do, to say, can we have a single conversation about privacy, data protection, and security at the same time? In fact, I think this is the blood that runs through my organization. Because it's important, I think, to have a nuanced argument, a detailed discussion as to how Data Privacy and security can work together, and actually how they benefit each other, and can be in competition sometimes, and all of that. But back to the point about the power of technology: there's a competition issue there. A few years ago, there was a push to bring together competition policy with Data Protection and Data Privacy policy, and I haven't really seen a lot of movement on that. There is a movement in the UK; the UK has brought in its competition regulator and started to look into the likes of Facebook, or Meta as they're now called. But I'll still call them Facebook for the foreseeable future, until, obviously, we get used to the new name. So there are small changes in terms of competition policy and Privacy Regulation. But I think they are intimately linked, actually. It takes the will, again, of governments, and for the players to participate in work about to what extent a monopoly squeezes privacy, and how privacy has an impact on a monopoly, or on large players
that take up over 50% of a given market. It's a very rich debate, but not a lot of practical movement on the ground yet, really.


Debbie Reynolds  36:26

Yeah. I remember a couple of years ago, and I can't remember which company this was, maybe it was Facebook, some company was trying to acquire another. Maybe it was Facebook and WhatsApp. And when they were trying to do that, you know, they had to get approval from the EU, and the competition authorities were asking all these questions about privacy. It probably was Facebook and WhatsApp.


Wayne Cleghorn  36:59

Well, I'll jump in. There was also Google and Fitbit. Yeah, the health, exercise, and wellness app. Yes, yes.


Debbie Reynolds  37:02

So the competition folks that I know in the US went bananas about that, because, for them, that was a separate issue. They don't think about privacy as something that gets talked about in competition and antitrust. So the fact that that conversation even came up, you know, the purists that I knew in antitrust went crazy about it. They were like, oh, my God, I can't believe it; it was like, okay, privacy and those other things, let the tech people deal with that, or whatever. They didn't really see it as something that can literally stop or hinder a merger or acquisition. And I think that's something that's going to happen more in the future. Even though we don't have a federal privacy law, we're seeing the FTC ask more of those types of questions in situations where people want to merge or they want to acquire new companies. What are your thoughts?


Wayne Cleghorn  38:20

Well, I think it's definitely ripe for a discussion. I'm hoping that the Antitrust and Competition Specialists start looking across, because we know that mergers and acquisitions happen for a number of reasons. Either it's a complementary service or an expanding service. But also, we know that large companies look across and think, actually, there's an upstart, you know, well funded with VC capital, and they're going to take on a market, or they're going to become bigger than us, so we are going to take them under our wing and control them, so that we can continue kind of market domination and market leadership. So that's important. Here's a really interesting thing that I've been thinking about recently; this is one of my hopes and dreams that I think about before I fall asleep at night. I'm hoping that Privacy Tech, you know, technologies that center Privacy, will come into the market, and people will have a real choice, right? Whether it's social media, whether it's devices, wearables, you know, all of that IoT, there'll be a stream of privacy-preserving technology which actually gives people a real choice. The competition authorities need to look at that, just to make sure that the old players, who have a way of doing things, don't buy all of these small players up and turn them into business as usual. Absolutely important. But let you come in. Sorry.


Debbie Reynolds  40:00

Oh, wow, I hadn't thought about that. Yeah, that's right. You could make a movie about that. Evil overlords have, like, acquired all the key players. Okay, no one cares about privacy because we bought up all the privacy companies.


Wayne Cleghorn  40:18

Absolutely, but there's also, you know, there's also another piece; let me throw this in as well. There is a big debate about large companies, and I name no names, who sponsor a lot of academic research. Yep, and if you have a marketplace where academic research is heavily sponsored, and actually the larger players are funding research into new technologies, the question is: the technologies that come out of that, what do they look like in terms of privacy? What does that do to the marketplace? And I know that regulators can't get into the academic space, but they will catch it downstream, because all of that research will come out into startups and into the new ecosystem. And I think competition authorities need to have a look at that and say, well, actually, if this has been funded by incumbents, how do we look at the whole value chain? You know, it's funded, and then the startups start, and basically they're the same as the others. And look at the acquisitions, and look at who they work with, who shares markets, and all of that. I think it's a rich scene. But I think, in conclusion, it takes a lot of money, and it takes a lot of lawyers, and it takes a lot of willpower and political will to really get into that. And of course, in defense of big companies, and let me put this out in defense of the big tech players around the world, wherever you are, in every country, I think the meeting place is innovation. At the end of the day, we're trying to make sure, in our different ways, that innovation lives, choice lives, individuals can actually pick and choose and port their data, and we can live full lives. So I think we can all agree that all of this is trying to make the market better, protect consumers, and actually give us better tech, better products, services, tools, etc. So, you know, I hope that is a balanced view.


Debbie Reynolds  42:56

Yeah, very good. Very good. I like that. Well, what would be your wish? So if it were the world according to Wayne, and we did everything you said, what would be your wish for privacy and Data Protection anywhere in the world, whether it's law or technology?


Wayne Cleghorn  43:14

I would love a richer public debate on privacy. And when I say public debate, I mean everybody, right? Newspapers and the media, and kids and adults. I want the debate to be richer, and I want privacy and information and cybersecurity to be something that people talk about with knowledge and make choices about. And I think consumers are becoming more empowered, and companies are tilting towards those consumers, really listening, really innovating, and really giving us great kinds of products and services. And I suppose the other big thing for me, and it's something I've been going on about since I was in government, actually, is getting rid of dirty data. And I use that word deliberately. What I mean by dirty data is, you know, data that is not good quality, data that has running through it, how can I put this, discrimination, you know, really systemic biases. Because I think here's the thing. A few years ago, we were told, oh, you know, don't worry about stuff. We're all getting better as a society, right? We're all improving around the world. We're getting more aware, we're getting fairer. But the truth about it is that AI systems are looking back, as you say, 70 years, 100 years, 50 years. And a lot of that data is dirty data. So my message to boards is, let's get really militant about getting rid of bad data. We shouldn't collect it and hold it. And we should also go through data to make sure that it is good quality data, because with better data, we make better decisions, and we don't entrench the mistakes of the past. We don't learn those as lessons for the future. So I think those are the two kinds of, you know, apple pie views that I put across, really. Yeah, yeah.


Debbie Reynolds  45:53

I love that you said that. I was having a debate with one of my friends, a dear friend from New Zealand, Rohan Light. He's an ethicist, and he always puts up really interesting articles, always thought-provoking. And I was too tired to respond back, but I posted just a question to him, which is, you know, about AI: how can the algorithm be artificially intelligent if it's only predicting the future based on what happened in the past?


Wayne Cleghorn  46:37

It's a good question. Well, I think we've hit on philosophy now. This is good. It's a good debate. Hey, I can't even begin to answer that question. I see you laughing because it's such a fundamental question. And of course, there are different types of artificial intelligence. There's the machine learning kind, and there's the predictive type, all of that. But think about the input, and let's sit on that a minute, actually, and let's go back to where we started, about medical data. Medical law and advancement have developed so quickly, especially around COVID. You know, what we understand now about pandemics? Yeah, if we get a data set that's 10 years old, 20 years old, that uses, you know, flu and other different types of pandemic, does that actually make us fit for the future? Or are we just reminiscing about the past, right, old data and old notions? I love that point. I could sit in silence right now. I think that is so well put. Let me say that again. How can we look backward as a way of determining two things: what the current state of play is or should be, and secondly, what the future looks like? I think you've dropped the mic there. I love it. It's a good point. Very good point. I'd really like to say you dropped the mic. Very good. I think I'll feast on that. Yes.


Debbie Reynolds  48:24

We may have a session where we're talking just about that question.


Wayne Cleghorn  48:29

Absolutely. Bring a few friends to sit around, and we'll get into it in depth. Now you've got me very excited. Absolutely brilliant. I can't get over it. Very good.


Debbie Reynolds  48:45

Well, thank you. Thank you. This has been such a thrill to talk with you today. You're amazing. I love what you're doing with your company, and your voice is really very much needed. So I am happy to share the mic with you today, and happy to have other conversations. It's so much fun. It's great.


Wayne Cleghorn  49:09

No, thank you. It's a real pleasure. Thank you for blowing up the privacy podcast space. Your online work is very good, and your studio work is very good. Keep at it, and thank you, really, I'm grateful. We will talk for sure.

