E196 - Michael Clark, Data Scholar and Author of the Book “Data Revolution: The Rise of an Asset”
35:51
SUMMARY KEYWORDS
data, people, privacy, asset, today, created, ai, technology, part, systems, world, ownership, commodity, business, book, build, storage, delete, identity, means
SPEAKERS
Michael Clark, Debbie Reynolds
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello. My name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, all the way from Dubai. We have Michael Clark. He is a data scholar and author of the book Data Revolution, The Rise Of An Asset. Welcome.
Michael Clark 00:41
It's amazing to be here; really looking forward to it.
Debbie Reynolds 00:43
Well, you and I met through Joyce Hunter, who was also a guest on the show; she introduced me to you, and you and I had a call and a chat, and it was amazing just to speak with you and get to know you. I would love for you to tell me your data journey. You have a tremendous career in data, and I love data people. You and I had such a lively conversation, I thought this would be a great platform for you to be able to explain yourself and what you do.
Michael Clark 01:13
This is great, and I really appreciate the platform to do this. So my history with data, I guess, is probably a little bit different to most. A computer was probably my starting point: the first computer I ever owned, which was probably one of the first home computers, the Spectrum 80. So, I'm showing my age a little bit here. From there, I was entranced more by the data side than the technology, but the two go hand in hand. So across my career, I've worked for Fortune 500 companies building ontologies. I've helped build some of the biggest technology changes in terms of solutions, as well as some of the largest digital transformations. That's spread from large financial institutions to having the pleasure of working in some of the next generation Smart Cities. So, really thinking about how data needs to move around big new environments where people are going to live. It's really spanned the life cycle of data; in a way, I've worked in every job in a company, and data's followed me, either through cutting edge technology or applying it in a business context. I sometimes think of myself as a bit of a surgeon. I've done the product space, I've done the customer, I've done technology, even regulations. I've touched it all, really, and I think that's what's enabled me to have this bigger perspective on data, but more importantly, on how it touches all our lives, as well as the business world.
Debbie Reynolds 02:30
Well, I'm a data person, and you and I really connected on data because I feel like a lot of people think a lot about systems, but they don't think about data, even though data is the lifeblood of systems. So, you can't really talk about systems without talking about data. But tell me, what do people get wrong about data? What don't they understand?
Michael Clark 02:48
I think that's the greatest question. I would even argue none of us really understand it, and I also believe that none of us really appreciate it. We've been led to believe that data is just a number and a transaction or a byproduct of something that a system creates. The way I like to think about data is its potential. It's the potential to create new innovations. It's the potential to build relationships, but it's also the potential to solve problems. I think we've lost sight of that because, through time, it's become almost like a commodity that we all are trying to monetize. So, we forget that data is really unique. It's one of the few things that is a store of value and also utility at exactly the same time. I think we've really lost sight of that, and in some respects, we've become fearful of data because some of the events that have happened outside of people's control have almost created this fear of sharing when the reality is it's the power of sharing that will make data come alive. That's what data means to me. It means potential, and I will add something else into the mix. What data also means to me is actually people's memories and their stories. It's not a transaction. That's what we see at the end. The reality is, when someone created that, they were creating a memory, and the story, in effect, is our new cultural artifact, and I feel that we've lost sight of that through time.
Debbie Reynolds 04:07
I like the fact that you term it a cultural artifact, because in past history, other civilizations have recorded information about themselves, and because we're doing it in digital ways, I don't think we appreciate how fragile that can be and how much of it we can lose in those digital transactions. What do you think?
Michael Clark 04:29
I think that's, again, an amazing observation, which not many people will make. You touched on a really important point. Let's go back in time. Take today: 14 million people a year visit the pyramids in Egypt. Why? Because they are trying to learn about stories from the past, information that was left behind. People don't realize today, I think, that the cultural asset of the modern-day world is information and data, but we constantly recycle it and delete it at will, which means it's becoming even more difficult to piece together our past. So, let me give you a real-world example. From 1991 to 1995, the Internet wasn't really archived. Archiving technology didn't really come in until 1995; that means large parts of the Internet don't exist from its early origins. To make it worse, from 2009 to 2018, a large percentage of history was deleted through acquisition or just loss. GeoCities is the most famous of all: when that was acquired, 38 million user accounts were deleted. Now, if you add up all the data that we lost between 2009 and 2018, that is the equivalent of you and I standing outside the New York Library and pressing delete. It's also the equivalent of half of the Vatican archives being wiped off the face of the earth. So I don't think we realize that as time passes, it's going to become harder for us to piece together who we were and who we are. Then, when you throw AI into the mix, it starts to explain why we see gaps in some of our data today and some of the hallucinations. Actually, it's because there's a past it's trying to reconstruct that we've deleted a lot of.
Debbie Reynolds 06:10
Wow, oh my goodness, yeah, that's scary. We assume that important things are kept and stored or retained, and that's not necessarily the case. I want your thoughts about keeping what's important. Let me go back to the paper days. In the paper days, let's say you were in the office; people wrote memos, and they had documents that they exchanged with one another. Because it was harder to create those paper documents, I think there was a curation layer there that we don't see as much in a digital world, where now almost anything that you do is kept. Who's actually curating whether it's important or not? Someone can Tweet, for example, that they had a ham sandwich yesterday, but that's not important in the future. What are your thoughts?
Michael Clark 07:00
I'm not sure whose ham sandwich it is, but if you look backward, I think you've touched on something that's, again, really important: we don't actually understand the value in data itself. That's part of the problem. Before the 1940s, we actually appreciated intrinsic value, because things were built on their uniqueness, and we lost that as we entered the digital age, partly because items became intangible, whereas in the past, they were tangible, and I could touch them and I could see them. So, we entered the digital age, and I would even argue that when Windows entered the business world, we didn't really have an understanding of what data was because the attention was primarily on the technology and the software. So again, data was that byproduct. So you ended up in a weird phase where we were mindlessly storing things that we probably didn't need and probably still don't need. On top of that, you've got interoperability and portability problems because, primarily, no one really owned the data, per se, and it wasn't really needed. No one had thought about whether or not we should define some form of ownership around this, which, by definition, would have forced us to think about value. So we've reached this point now, particularly when we think about AI, and I can come back to that, where a lot of what we've built could have been avoided, particularly the energy consumption. If we were only keeping what we needed, if we truly understood the value of data in our databases and what we needed, we wouldn't need so much energy. That would have driven a lot of change.
Debbie Reynolds 08:33
Tell me a little bit more about the rise of data as an asset. I think we hear people say things like data, information is the new gold or the new oil. But to me, it doesn't really communicate the value of it. It truly is an asset. Tell me what that means to you.
Michael Clark 08:51
So when I hear that data is the new oil, I chuckle because it's an example, again, of how we try and put a commodity to everything. Clive Humby made the statement that data is the new oil in 2006. What people don't realize is what he actually meant. What he actually said is that data is the new oil because, like oil, data has to be refined, and its value lies in its potential. But we seem to take "data is the new oil" as: let's all run to the hills and make money from data, which is actually not what he meant. The reality is that data now has become an asset, very similar to water. It's become critical to everyday life, and like water, it's also a utility and also a unit of value. So the only way, in essence, to respect something is to finally recognize and realize that this thing called data is probably our greatest asset. The book is split into a number of parts. The first part is really the history lesson of data itself: some of the key milestones, how data changed through history, and some of the gaps that we see today, why those gaps have resulted in the way we see data today, almost the milestone moments in history that have given us what we know today. Then we give the reader the choice, in a way, to say: look, you can either stay where we are and carry on the cycle where data remains a commodity, it's not owned by people, and the things we value the most are productivity and efficiency. Or, actually, we can give ourselves a new narrative, in which data now becomes owned. It's treated as an asset, and we apply a completely different set of values, as well as productivity and efficiency. Part two of the book then effectively tells you how to turn data into an asset itself, not just digitally. So this is the why and the what, in terms of where you can use that asset. But more importantly, how do you operate that asset? How do you enable it?
Also, how do we teach it, which includes the curriculums in schools in terms of actually how to be able to read data. Then, you can't just tell someone how to do it without showing them the art of the possible. So we show use cases from within finance, within health and longevity, data as a community, even data as a utility, and then a possible future and an eight-point plan for government, as well as what's in it for everyone, whether it's you as the individual, a business, or even a government, but more importantly, the role of AI in all of this, and how that changes things for the better. Really, the book exists because the timing is right at this moment as a catalyst for change, because AI is giving us that opportunity, because it's reflecting the past back at us. So now we have an opportunity to maybe rethink privacy itself, to say, well, actually, how can we now look at the things that really matter today, things like security and ethics, for example? So the whole book, in a sense, exists for two reasons: part one is for the audience, to give the next generation a voice to demand change, and part two is for those who must change, a guidebook to actually make this possible and probably create the greatest asset of our time.
Debbie Reynolds 11:56
We talk a lot around artificial intelligence and privacy space, and how that can complicate privacy, because people really don't know how to track the lineage of data. They don't know how to handle it. But I want your perspective, something you just said was really interesting. How can we leverage AI to actually help in these data spaces around privacy?
Michael Clark 12:20
So we have all the technology today to do all of this. We just need the will. You and I couldn't have had this conversation maybe 10 or 15 years ago because we didn't have the technology to make any of the things that we know we can today possible. They were still in their infancy. What's happened is all of these technologies have converged: Blockchain, Artificial Intelligence, decentralized storage, and even traditional technologies have all converged to give us the art of the possible. So the question is, now, in the face of AI, how do we look at AI as a partner in the age of protecting people's data, but more importantly, in giving people ownership and choice? Because I think we're reaching a point in the story of privacy where it's going to become harder and harder to fight that battle, because technology is getting too intelligent and human capability isn't keeping up. Look, we need to be honest with ourselves. Data literacy is the worst it's ever been, and we've all become visual thinkers with an attention span of eight seconds. I believe the last two major cybersecurity incidents were because of human error. So we need a new capability to help us, but we also need to re-educate the next generation so they become critical thinkers again and can actually use data in the right way, in conjunction with AI. So I believe that we have the technologies to do this, but that's not the silver bullet either. That's why, if you treat it like an asset, nations would then focus on what's important and encourage, almost mandate, that people actually understand how to use this asset and read it on a daily basis. Then that's when you start to see change. But let's be honest, between ourselves and your audience, it probably takes two years to change the law but probably a decade to change the mindset, so we have to start the journey now and put the building blocks in place.
Debbie Reynolds 14:05
I agree with that. It's really hard to change, and you're right; I think data literacy is very poor. I've been concerned that we have people who are still using "12345" passwords, and then we're going into these really complex data systems, so we have to be able to bridge that gap.
Michael Clark 14:24
On that topic, I think it's worse than most people realize. I think 16% of people in the UK can't find their balance on a bank statement. So we've come to the point where we accept what we see, which is not a great place to be if you think about where the future is projected to go. But the point is, we can have a much better future if we can empower the next generation and almost give them a choice as to how private they want to be, because the technology will allow that to be the case, and the regulator then starts to play a different role. Where the issues are today with data and AI is really the ethics, the use, and the security around it. Particularly some of the ways that data is being taken, or the way that AI is actually created, you could argue is a breach of ethics rather than necessarily privacy.
Debbie Reynolds 15:11
So where do we start to make this change, to make this pivot?
Michael Clark 15:16
It's a great question. It's like eating an elephant: you don't want to attack it all at once; you take it one bite at a time, as they say. So I think you have to start at the very basics. A lot of the big conversation right now is around data ownership, which is the logical place to start. But beyond data itself, there are two other missing pieces of the jigsaw, which are identity and value. It's almost like there's a magic trident that has to be connected, and the exciting bit is that digital identity, in terms of its sovereignty, is now becoming a conversation globally. A lot of markets have already gone into that space. Even the UK Government has started to talk about it. So if you have a digital identity, the logical next step is to allow people to join data to it, because you can link it. There are two of the pieces of the puzzle. But we also need to realize that giving people ownership of their data is no silver bullet either, because you still have the problems I've just described. It really is a precursor to making something an asset, and that's why ownership is always a good place to start. I talk about this in the book: we've put in a model that actually enables this to work. Not everybody will take all of it; different countries and different organizations will be at different points of the journey, which is like any digital change. The point is you start from where you are, but I think you can't hide from the fact that you need regulation in place to enable you to get off the starting blocks. Then it's really about which bits of the model and other pieces of value you need to put into place. I truly believe that once you start owning data, you can start the conversation. But I keep saying it's not the silver bullet, because we need to learn from other types of regulation as well. I was part of open banking from the very beginning in the UK; I've been lucky to work on it for many years.
I would argue that it probably hasn't succeeded as well as it could have, primarily because we didn't really solve the value exchange: why people give their data up in the first place. So, in this model, we can't lose sight of the human element, because we need to realize that people are in a position where they're scared about their data or fearful for their privacy, so we have to create an environment where they feel comfortable to share. But equally, there's a fair value exchange between both parties, and sometimes that doesn't have to be mandatory either. So I think there's the regulation, but also meaningful exchanges that actually make people want to share data in the first place. And the main elephant in the room in all of this is trust: building trust based on an underlying set of values is going to be the key component to all of this.
Debbie Reynolds 17:32
So part of the challenge, I think, and I love the fact you mentioned identity, because I've always said, I spoke at a conference in Amsterdam a couple years ago, and I'm like, identity is going to be so central to what happens in the future with data, because you have to have the person somehow be able to have some type of agency over their data, and right now, in digital systems, they really don't. Right now, you have all these different identities and different systems based on the way that you interact with those systems. But I think the future will have to be the person will almost have to be their own bank or store of value of their information, and then there has to be some kind of brokerage of that value exchange between them and other entities. What do you think?
Michael Clark 18:18
Yeah, correct. You also need to add the fact that a machine is also a value owner and a unit of exchange as we go forward. Machines today are already customers. I heard something on a podcast that I totally agreed with: the machine orders your printer cartridges for you. So, in the context of a business, the machine is the customer. The goal-setter is the human. We're getting to a point where the machine will be the customer; they'll also decide the transaction, and they'll also make the payment, but I will be the end receiver of the goods, and I will be the one setting the goal. So we have to have an identity and also entitlements attached to that, because I'm going to be allowing that machine to do certain things on my behalf, and that includes sharing of data. So again, this trident of data, identity, and value is the cornerstone of where we're heading. Many people have said this in years gone by, but the problem has been, and you've alluded to it, that we've allowed the three to be fragmented and never connected. We have 100 passwords between us; most companies have 367 systems. On average, a staff member spends what, a third of their day, or even their week, trying to find stuff. Even with Active Directory, they probably have portals to log into. Now, the reality is, even with internal systems, some people are still using an external site to log in to help them in their day-to-day jobs. If you roll in virtual reality on top of that, we now have human emotion as a data set as well, which I don't think people realize, and I can have seven versions of me in a virtual world. At the end of the day, how do I know it was even you, and how can I trace it to you? Ultimately, if we ever want interoperability, we have to realize that you need agency, and I need one version of me only, but there can be many instances of me that can be traced and can be trusted. The two can be separated more cleanly than we think.
Debbie Reynolds 20:11
Yeah, a lot of people talk about interoperability, and so my concern, a lot, is around data fragments being in so many different systems and trying to figure out what's the best way to be able to either connect those things or reduce the problem that interoperability poses when you have data in all these systems. What are your thoughts about that?
Michael Clark 20:33
It's one of the elephants in the room. It's a problem that we've created because of the way we track data. We focused, again, more on the promise of technology than on the asset that was actually at the heart of it, which, ironically, made other industries be created. It's quite funny that we created an integration industry that's worth billions of dollars because we can't figure out how to connect data, because we approached it in such a siloed way. So today, part of the challenge, and some of the things that I help businesses with, and even in the book, is, first of all, you need to know where to look. You need to know what you need to keep and what you don't need to keep. The second part of the puzzle is that if data ownership actually becomes a thing, I would imagine, by definition, systems would have to be portable and interoperable. That's why ownership is so important. A lot of people talk about fair rewards and remuneration, and I think that's absolutely correct, but I also think it has another purpose, which is actually forcing us to think about the right standards. When we think about standards with technology, our first reaction is the user interface or all the features and functions that need to talk together across systems. We forget the fact that people own that data, or it's a memory or a transaction, or even a requirement for the business, and we start plastering things together to try and make these things work. When you have ownership as a legal right, by definition, then you have to be able to move data, and it has to be portable. That's why ownership will become powerful. But again, I keep saying it, it's not the silver bullet. We still need to help people define what value is, and that's what I had to do in the book.
So one of the big chapters in the book is redefining value itself, which is a combination of some, not all, of the values from days gone by, together with some of the values that we use today, to give us a compass to say, out of all of this data that I see before me, which ones have intrinsic value but also equally have extrinsic value, because that's what it'll boil down to.
Debbie Reynolds 22:30
Yeah, I want your thoughts about privacy. When I talk with people about privacy or data protection, sometimes I think companies think of it as a barrier. We can't use data in certain ways, where, to me, it's basically saying, this is how you can use data, not how you can't. What are your thoughts about that?
Michael Clark 22:48
That's a nice tee-up. So I'm going to throw in a statement that's like a big hand grenade into all of this. But first, let's tackle the question that you asked, which I think is a great question. I've seen it, in some cases in some countries, actually hinder innovation as well. I particularly look at America as an example, particularly with healthcare and HIPAA. I think that was done with the right intention, but in the end, it was the patient that suffered, because if you moved from State to State, there was no guarantee that the hospital would know who you were. It all depends on whether each State feels comfortable with and understands the laws within their own State to even share the data. It almost feels like back in the day with the early forms of cyber security in a business: don't tell them until the last minute because they're going to say no, and usually they do, because you left it too late. Data privacy has become a little bit like that. In projects I used to be involved in, it was trying to figure out the ways that we could use data, and it became so difficult that, in some cases, you avoided it because of the pitfalls. I also think privacy has become a bit of a deflection from what's really going on. I actually think what's going on today with the way we use data is unregulated banking. That's actually what's going on, and I'll give you an example. If I go to my bank today and deposit money, they will guarantee that money up to a certain figure by law. I think in the UK it's like 50,000 pounds, maybe more. In the US, they're also regulated as to how much of that money they can use, for what purpose, under certain conditions. But if I deposit my data into a storage provider and they delete it, it's "oops, I'm really sorry", and there are no guarantees.
I always use this example: it's got to the point where, I would argue, if you gave someone the choice, would you be okay if your bank lost $1,000, or if they lost 10 years of photographs of your family? Which one would be more important to you? That's what it's come down to. If you apply that logic to all of the cloud providers across the UK and the globe, they probably account for all of the Fortune 500 companies' storage and most of our storage. But none of it falls under any form of regulation, and the only thing we can tackle it with is privacy. So I think what's happened with privacy is that it's focusing, rightly, on protecting the citizen, but in doing so, it's almost deflected from the bigger picture. I go back to the days in banking in the UK when mergers and acquisitions created nine banks, and think about it. Today we have Google, Facebook, etc. What are they doing? Massive acquisitions of small tech players to consolidate an industry, almost a complete replica of the banking crisis, where banks became too big to fail and also had too much influence and left little choice for the consumer. So I think when we start looking at data through a different narrative, we start to realize that we've created a system that limits people's potential, but I would also argue it limits even the cloud providers' potential, because we've looked at data through just one lens, rather than as something that gives everybody massive potential. If we'd done it that way, maybe privacy wouldn't be the only lens we looked at this through, and I think once we start looking at data this way, regulators and governments will apply a very different lens to data and start to look at it as what it really is.
Debbie Reynolds 26:34
Wow, that's fascinating. I agree with you. I think that, and I'm sure you've heard these conversations as well, there are some jurisdictions who are trying to talk about data, or talk about people as almost fiduciaries of data, so they're trying to tie back financial language into how people use data. So do you think that's a stepping stone to getting people to think about data as an asset as opposed to a commodity?
Michael Clark 27:01
Yeah, I think so. It's bizarre. It's like human nature just wants to commoditize everything. We fall into the trap that when we see an object, it has to be monetized in some shape or form, and that's also, I think, part of the reason why ownership of data has often failed in the past: we've almost approached the same people who are monetizing data today to say, hey, we want our bit as well. Obviously, there's nothing in it for them to even want to do that, and in the past, there was not even a platform or a reason to do it, whereas I think now AI is building the case for ownership. I think it will happen. I think it's a guarantee it will happen. My hope is that by treating it as an asset, we start to change the narrative. When I speak to a lot of people, particularly people who are 18 and younger, when you explain to them the things that they could have had or the things that they could now do if it were possible, the response is always the same: we don't know any different. It's the world that we've grown up with. It's a necessary evil to share our data. Once you explain to them, look, you could have this and this, their mind is changed. So I think once it becomes an asset, and that's why I have a whole chapter around it, we'll appreciate it as a cultural artifact, as something that can be taught. This will change the narrative completely and consciously. On "data is the new oil", I'm always correcting people, because we take the word oil and we see a commodity out of the ground, traded on the stock market. Poor Clive Humby; we ignored his entire paragraph, which you can find on the Internet, and in which he himself calls data potential. I think we need to change the narrative and say that data is potential, and if we look at it through that lens, yes, you will get fair rewards. Absolutely, that's a cornerstone of this, particularly as people's data gets recycled in AI models many times over.
Absolutely, they should be rewarded, because we've been building this algorithm since 2003 without us knowing. The point is, if we are to move forward and change the way we think about data, we have to remove from our mindset the idea that its only purpose in life is to make money, which I think is wrong. It holds so much more value, and you almost devalue the concept, because, at the end of the day, we are data. That's what we're made of. We're a walking library of the universe, all of us, and there's so much untapped data inside all of us that none of us have ever seen. There's probably someone in Africa who's never been able to create data in their life, but who I'm sure has data that could solve a problem, if we allowed them to. So, if we stop looking at data as a commodity and start looking at it as potential, we may even start thinking about how we solve inclusion.
Debbie Reynolds 28:32
I have a concern, and I want your thoughts on this, and that is with Artificial Intelligence. Now we're creating massively more data than we ever created before, and to me, there has to be a value difference between data that is true and authentic and things that are fake. So back to the data-is-the-new-oil problem: not all data is valuable. We need to be able to separate what's authentic, real, and true and give it a higher value than things that are fake and made up. What do you think?
Michael Clark 30:09
Yeah, I understand that. You've read the book; we talk about this a lot, and there are models in the book that actually solve this very problem, because you have to know what came in versus what went out, and what's the same versus what's different. There are even capabilities today that make that possible, and that's a big part of the book, because we also have to understand that we duplicate an awful lot, yet we never build on anything either. It's very difficult for someone today to say, I'm a continuous learner. It was not the same when I was growing up, when I would buy 10 books and read each one, and each book would build on the last. Today we can't do that, because our data is siloed, or it was there two days ago and six days later it's gone. So we don't have this ability to build on data's value over time. We've got this weird paradox where the investment in storage keeps going up and up, and yet 97% of all health data doesn't get used. Out of all the storage we have today globally, 60-plus percent doesn't get used whatsoever. Yet storage keeps rising. So we are storing stuff without even understanding what's in it, let alone answering your question, which is: what should I keep and what shouldn't I? In the end, it becomes the equivalent of a wardrobe full of white t-shirts that you might need one day, but you don't know when, so you keep them anyway. In some cases, you're even forced to increase storage because your provider doesn't give you the option to declutter; they sometimes make it impossible because they want you to buy more storage. So we very much talk about this compare-and-contrast model in the book, helping businesses start to determine what they should and shouldn't keep.
And all of that starts from value. Again, you come back to the Trident: you have to know the value that's in there, but what came in versus what went out is also a huge component of the book, because this way you get it right, and compute actually becomes less needed, because now I'm running in real time on only the data I need at that moment. Rather than having to process thousands of records of data, which, as we know, takes a huge amount of energy and compute, and in most cases a lot of that data was irrelevant to the algorithm or to that compute anyway.

Debbie Reynolds
Well, Michael, if it were the world according to you and we did everything you said, what would be your wish for privacy or data anywhere in the world, whether that be regulation, technology, or human behavior?

Michael Clark
My overall goal in life is for a child to be born into the world of data and AI, because if they are, that means they can think critically, they can debate, they can reason. AI becomes a partner, and that means they can use AI to solve real-world problems. That's the first thing. The mission, really, is for everybody on this planet to have the human right to own, create, and share their data with anybody of their choosing in a way that's secure and protected. I think we do that by making data an asset that's priced and valued, and by valued I don't mean monetarily; I mean appreciated. If governments are listening to this, if they ever do: the way we solve all the problems we see through AI today is by finally realizing that data, not technology, is actually the most important asset. As for privacy, I would champion that. I'm lucky enough to be working with some really super open-minded regulators, particularly here in Dubai, who want to do something different and actually put the person right at the center of all of this.
I think sometimes we create privacy laws that even companies don't want, and I say that because if you look at all the terms and conditions that a poor customer has to read through, a lot of companies don't actually want to go to that level of detail; they're forced to by the regulator, who then wonders why people don't read them and why businesses are frustrated. So, I'd like to see forward-looking, progressive privacy laws. But I would also love to see a time when privacy people are actually part of research and development and are not playing catch-up, because if they were involved in AI from the very beginning, and we had privacy by design and sustainability by design embedded in all of these things, and possibly even an ethical mindset and an understanding of how all of these things connect, with all the right people involved, then, and this is wishful thinking, we wouldn't be talking about some of the issues we do today, about energy, about privacy, because we would have put all these things in by design. So that's my wish. It's a long laundry list. At the human level, I want people to be born into this and to appreciate it and think critically. I want regulators to be progressive, to look more at the ethical side and the security side, and to empower people with the right tools and technologies. Ultimately, I'm looking to governments to say, actually, we need to treat this no differently to water, since this is our greatest possible asset, because at the end of the day, data, in the very early days, built our world. It is the foundation of language, and we've lost sight of that. So I think governments need to address that, because once you do, everything we see today falls in line by design, because we defined it as an asset.
Debbie Reynolds 35:18
I love that. Thank you so much. I'm so happy that you and I were able to chat on the show, and I'm sure that the audience will love the episode as much as I do. I'd love to be able to continue the conversation and find ways we can collaborate in the future.
Michael Clark 35:32
Thank you.
Debbie Reynolds 35:35
Thank you so much, and have a good evening.
Michael Clark 35:39
You too.