E51 - Davi Ottenheimer, VP Trust and Digital Ethics, Technologist, Inrupt

 

Davi_Ottenheimer

Tue, 9/21 12:49AM • 39:45

SUMMARY KEYWORDS

people, privacy, data, regulation, integrity, slavery, erm, solid, law, world, teddy bear, technology, find, identity, colonialism, thinker, big, context, breaking, security

SPEAKERS

Debbie Reynolds, Davi Ottenheimer

 

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. And this is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that you need to know now. I'm very happy to have a special guest on the show, Davi Ottenheimer, who's the VP of Trust and Digital Ethics at Inrupt in San Francisco, right?

 

Davi Ottenheimer  00:38

A little south of San Francisco is where I'm located. But yes, Bay Area.

 

Debbie Reynolds  00:44

So I contacted you on LinkedIn. I was really fascinated by your profile. So you know, you have a background in, you know, economics, politics. I was a philosophy major, which my mother was horrified about in college. So I see you have a kind of philosophy and literature background. You've been a lecturer on technology issues, also, you know, an advisory board member for many different kinds of companies. So I thought, your profile really attracted me, like, this is a big thinker, somebody I need to really talk to. And I really liked the way you're kind of talking about and framing data issues. I think it's about trust and sovereignty and things like that. So tell me a bit about yourself, anything that I left out, the kinds of interests that you want to share with the audience.

 

Davi Ottenheimer  01:43

Okay, well, thank you for your kind introduction. It resonates with me when you say your mother was horrified about studying philosophy. My family is primarily engineers, applied scientists, you know, electrical engineering during the big boom, when electricity became so widespread, even the beginning of computers. So I studied social science as a black sheep, almost, in the family. Although my parents studied anthropology, they still worked with technology; my father even used computer modeling in the '70s in anthropology to do kinship analysis. So I felt like I was walking away from computers and technology and engineering and electricity and all that stuff to go explore this human side of the world. And I think "big thinker" is the right analysis. I lately have been trying to frame big data security in terms of cognition and cogs. There's so much emphasis in the United States on being a cog. If you can be a really essential cog, like hedging and cornering a market, people can't live without your cog. It's like being a coat hanger maker. Nobody can hang their clothes without a coat hanger. So you become a billionaire because you make coat hangers, and everyone has to hang their clothes. And that's such a different mindset from a lot of places in the world where people think, what is the big picture? Should we even have clothes to put on our coat hangers? Should we even have racks? Should we have closets? That's so fundamental; that's cognition. And I think what puts America at a disadvantage is that we emphasize this cog sense of success and progress so much that now that we're in the world of big data, AI, machine learning, we are unprepared, we're almost unarmed, because it's a world of cognition and the risks around the big picture. And so, unintentionally, I've ended up becoming both a technologist and a philosopher, and I try to bounce between the two. I recognize the value in both, but I most often end up in the engineering world. I mean, almost three decades of working with engineers, and I constantly find them unprepared to discuss things that require big thinking. So I always try to stay on my toes on that side of the fence, because it's the most important now, I think.

 

Debbie Reynolds  03:50

Oh, I totally agree. I totally agree. I feel like we're helpless, in a way, in how we're going into the future; we're not really looking at the big picture at all. You know, for me, just to give an example, I say people treat AI like a teddy bear, and it's a grizzly bear. So, you know, we're not thinking enough about the risks of what's happening with technology. And also, like you said, being a big thinker, not just being sort of one of the people in Santa's workshop, right?

 

Davi Ottenheimer  04:27

Yeah, I think there are two sides to this coin. I've been trying to write a book about this since about 2012, and it's been a real struggle, ten years now. My first book we cranked out in three to six months, a co-author and me, because it was highly technical, about how to break cloud environments. And then we were told to rename it to virtual environments, because people at the time didn't even think cloud would be a success. We were so early, they thought we should call it virtualization instead. But it really is about breaking the cloud. For the second book, I thought, well, okay, now that the infrastructure risks have kind of been laid out, let's talk about the data that lies on top of those risks. And the way I bifurcate this is, on one side you have easy, routine, and minimal judgment topics. That's a conservative class of thinkers that really wants to stick with things that have worked in the past. They really believe in it, to a point where, if you tell them to get off the old ways of doing things, they resist, sometimes violently; they really, really don't like the chasm. That's the ERM, if you will. And the other side is to identify things, collect and store lots of data, evaluate that data, and do an analysis of it so you can adapt. I call that ICEA. And I've literally talked about this for ten years, this ERM and ICEA map. I didn't come up with it; I really borrowed it from veterinary medicine, and every form of science I've ever studied has some version of this, even political science. You know, you talk about the fall of the Soviet Union: there were people who wanted to do things the old way, and there were people who wanted to make a change, Gorbachev being an ICEA thinker, and a lot of the Politburo being the ERMs. And so the chasm creates the conflict. And what I'm trying to do, most often, in my analysis of AI, or machine learning, as you say, teddy bears, is find the people who are on the ERM side, who look for a teddy bear because they want it to be easy, they want it to be routine, and they want it to require minimal judgment. Everyone loves a teddy bear. It's my stuffy, and I squeeze it, I feel comfortable, I can go to sleep at night. But the reality is, over on the ICEA side, the person who made that teddy bear put it on a MongoDB database that's open on the web with no authentication. It's recording everything you're doing. And so you've just put a teddy bear into your bed as a surveillance toy, and it listens to absolutely everything, your breathing, your heart rate, you know, and it's reporting all of that to the insurance companies, if not to somebody who's trying to assassinate you. And that really is where we get to the dark side of this stuff: a lot of the information leakage, a lot of the privacy leakage we're seeing, is driven by high-stakes individuals who are trying to gain or manipulate. And so they use the ERM blindness of people who just want something super easy, with minimal judgment, and they infiltrate that side. And people who aren't prepared to be an ICEA thinker, or able to handle the cognition of the big thought, are just totally open and vulnerable to exploitation.

 

Debbie Reynolds  07:12

Let's talk about ethics. I know that you probably know this better than most, because I've read a lot of your posts, right? Not all laws are ethical. And also, the law tends to react to things, and the problem with AI technology is the harm can be so catastrophic that there really is no adequate redress, or you can't wait for a lawsuit or whatever to go through. What are your thoughts about that?

 

Davi Ottenheimer  07:44

Yeah, there's the economic side to this; I think that gets underplayed. You wouldn't write code for any machine if you knew it was going to fail, because it would just be throwaway code all the time; you'd put yourself out of business. And the law is kind of the same; it's human code. So why write laws so fast that nobody ever follows them because they're broken? It would just be a huge waste of time. So laws tend to come slowly. They look in the rearview mirror more often than not, because they're trying to do something that they know would work. And that's a gamble. If you wait too long, a lot of people get hurt because you waited too long. But if you go too fast, you wrote a law that's useless; it doesn't really reflect reality. We see that a lot. So there's this constant balance. There's a middle road. I know people don't like to talk about the middle road, because it seems in America we call that waffling or wishy-washy. But that's the reality: you ride a bicycle to stay upright, not to fall to the left or fall to the right. So the law is slow by design, to be a better law. And the way you work around that is by having lots of little laws written, like in the states, so that eventually you can figure out the best one to use at the Federal level. There are all these mechanisms in economics and politics to slowly move towards better code so that it's more efficient. And I think that regulation is often undersold, because in the broadest sense, regulation is innovation. If the CEO of the company says, okay, everybody, we're changing tack, we're going to build this thing instead of that thing, we're going to go to the moon and we need a rocket to get there, that's regulation. Put aside all your other projects and work on this alone; you can't work on that other project. People frame regulation as holding them back. But in reality, it may be holding them back so they can go faster forward. It's brakes, in the sense of regulation. If you're driving really fast around a corner, you put the brakes on so you can come out of that corner even faster. And anyone who competes or races understands the importance of regulation in improving performance. And yet when we get into the space of laws and technology, so often I find really radical, extremist, anti-government, anti-regulation people getting into the mix, saying the only way to move forward is with no rules at all, which is absolutely crazy. It just holds us back so much to have no regulation. Part of it is because they're in the ERM camp, and they can't trust that someone's done a proper analysis, and so they destroy all analysis, which is terrible. And then we get no progress at all. We go back to, basically, the Stone Age. So I think laws move slowly, but they need to move slowly, deliberately. And I think they need to happen, because without them, we can't move forward.

 

Debbie Reynolds  10:18

Yeah. I mean, you're right; it's almost moving in different directions, in a way. So that balance definitely needs to be there. I think one of the challenges that has happened over the years with that coordination, right, technology versus law, or with the law, is that many organizations have these silos, right? Where, you know, for me, when I work in privacy, it's about really breaking down those walls and breaking down those silos and having those conversations and having those collaborations in a way that wasn't there before. So it can't just be Santa's workshop, where you don't know what I'm doing, and vice versa, and I just have kind of blinders on, doing my thing, and hopefully, at the end, something good comes out.

 

Davi Ottenheimer  11:14

Right, you have a balance between knowledge and privacy. I always try to frame that tension in those terms, because people often think of privacy as a good, an inherent value to have. But then when they think about the opposite of that, they think of it as a non-good. Whereas framing knowledge as the opposite of privacy makes it clear that they're both goods. And so you have to figure out how much good of one you want versus how much good of the other. And so, exactly as you describe, you've got to break down the walls, you've got to get rid of the silos, you've got to bring all the information together, which is a removal of privacy in some sense. But that's because you'll get better privacy if you can remove the right amount to give you the knowledge you need to protect people.

 

Debbie Reynolds  11:53

I think that's where we are on getting a Federal privacy law or legislation, where, you know, we seem to have a hard time finding where that middle ground is. And because we can't solve the harder-edged problems, which, in my view, are whether a Federal law will preempt what's happening at the state level, and then the private right of action, those are the two things that sort of stop us from being able to find that middle ground. What are your thoughts about that?

 

Davi Ottenheimer  12:30

One of the biggest problems is, like the EFF being this crazy, crazy libertarian, you know, slush fund that sort of pushes the agenda at the government level, I find the same to be true in the privacy space with a lot of different groups. And one of the things that shocked me was the lobby groups that said any kind of Federal regulation would reduce innovation. Now, they say this from think tanks; it's probably three or four think tanks, there's a University of Chicago scholar, there's a bunch of people I've seen say this, and it's actually not true. And it's difficult to argue, because they bring lots of facts, they'll do studies, and they'll say the European market had some regulation, therefore they have no innovation, which of course is not true. But those are the types of things we have to argue against. In fact, as a technologist, and this is why it's weird bridging the two sides, as someone inside companies, you know, I recently built end-to-end encryption at the field level because there was a bit of privacy regulation; GDPR came to bear, because larger, wider-scope geographic areas are under privacy regulation. We then had the approval from the board to put in quite a bit of engineering talent; we had like 15 people and $5 million invested over a couple of years. And I've been doing this stuff for decades. So I felt like the more Federal regulation you get, the more national or union-based regulation you get, the more justification you have for innovation and thinking about solving the hard problems. So the fact that the United States doesn't have it is holding us back. I absolutely think we're not solving the problems, because when I sit in these boardrooms and talk to people, they say, we don't have to do it, so why don't we spend money on another feature instead that's not related to privacy?
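For readers curious what "end-to-end encryption at the field level" can look like in practice, here is a minimal sketch, not the system Davi describes: individual sensitive fields are encrypted in the application before they ever reach the database, so the database operator only sees ciphertext. It uses only Node's built-in crypto module; the field names and the in-memory key are illustrative assumptions, since a real deployment would keep keys with the client or in a key management service.

```typescript
// Minimal sketch of field-level encryption using Node's built-in crypto module.
// Field names and key handling here are illustrative assumptions only.
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

const key = randomBytes(32); // in practice, fetched from a KMS or client keystore, never generated inline

// Encrypt a single sensitive field (e.g., an email address) with AES-256-GCM.
function encryptField(plaintext: string): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Store iv + auth tag + ciphertext together so the field is self-contained.
  return Buffer.concat([iv, tag, ciphertext]).toString("base64");
}

function decryptField(encoded: string): string {
  const buf = Buffer.from(encoded, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ciphertext = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}

// The record that reaches the database only ever contains the ciphertext.
const record = { name: "Ada", email: encryptField("ada@example.com") };
console.log(record.email, "->", decryptField(record.email));
```

The point of the design is that a regulation-driven requirement (protect this field) changes where encryption happens: in the application, per field, rather than only on the disk or the wire.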

 

Debbie Reynolds  14:17

I would love for you to expand a little bit on Solid for people who don't understand what that is.

 

Davi Ottenheimer  14:23

Yeah, so Solid is about linked data on the web. Obviously, you click on a link, and it takes you someplace. So imagine, back at the genesis of the web, you put up a website, you put your information on it, and then people can find a link, click the link, and get to your information. That's the genesis of the web: this linked ecosystem of data where we can all collaborate, share, publish, read. So Solid allows that linked data system to include a personal data store. It introduces a Venn diagram of three circles, essentially. You have the circle of stored data, which is the personal data store, or pod, where everything that you generate, create, or collect goes. It doesn't have to be a single physical place, but philosophically, it needs to be like your body, a place that is centered around you, right? People understand that; it's what defines you. Now, the other two circles are identity and applications. The identity is generated someplace: maybe it's by you, maybe it's by your parents, maybe it's by your workplace, maybe it's by the sports club you join. There can be lots of identity stores, or one that represents all of those; you know, the government keeps a lot of identities, your driver's license, your taxpayer ID, your birth certificate. So you get your IDs, and you connect those to your data store. Now you can write apps, and they always know where to put the data, and they always know which identity to use. So if you get those three circles just right, anybody in the world can write an app and access your data, read your data, write your data, based on the identities that you're using. So let me give you a real-world example. There's a simple one: when you play chess with me, we have a game, we use an app, and the moves are stored; my moves are stored in my pod, your moves are stored in your pod. Halfway through the game, we get rid of that app, get a new app that reads where we were, and we continue playing. That's the type of flexibility that allows an explosion of apps, because anyone can write apps that can read anyone's data in the true sense of a standard, right? That's how the Internet is so successful; we have standards that are open. So let's move on to another example. I go into a furniture store, and because I'm buying a present for someone, I choose all bright lime green furniture: I want a chair, I want a rug. I say, please keep this data while I'm shopping. And then I leave, and I come back the next day, and I say I want to shop for something else. In the current context, that data is stored on their servers, and they identify me when I come back by that trail of crumbs, or maybe I put my username in. I don't want the lime green data to be present; that was for somebody else. In the current context, it's very hard to get rid of that experience, that world I was just in, and say reset. But in the Solid context, with the pod, I can just say don't look at that data. I might even throw the data away or erase it, or maybe I keep it to use again, but I chose. And so now I look for more furniture, and I can choose an entirely different persona. I'm looking for this person, I'm looking for that person, I'm looking for me. So I can turn on and off what I share with people based on the fact that the data is centered around me, the pod. So that's how the retail experience would change. And I saw this recently with people saying they just throw away data.
When you're browsing the web, right, there's a browser saying, we just throw all the data away, and you can present it again if you want, or you can choose not to. That is the Solid concept: you present the data when you want. Now the complicated part comes. Let's say the power company is giving you power based on the information that's in your pod. You delete that information. Do you lose power? That gets back into integrity. Or do they keep a copy of the data? In that sense, it makes sense to have a copy of the data somewhere else. Here's another example. Mercedes makes me a car. It has 10,000 sensors in it. When they deliver the car, those sensors put their data into my pod, and I choose whether I share that information: like, is there a pothole, is the speed of my car over the speed limit. All that data gets shared by me, with my consent. That's very different from right now, where Mercedes is getting all that information and just sharing it with whomever. I mean, I literally read that Ford is sharing data about your driving habits with other car companies, regardless of whether you want it to or not. And you have no idea of that when you buy the car. You know, for all the new F-150 news, the amazing new electric truck news, I see very little privacy news about what that truck is going to be doing. You're going to be powering your home with that truck. That's amazing. Okay, but what is it going to be saying about you? People don't know this, but when you power your home off a device, it's making fingerprints of what you're doing. Every time you turn on the toaster, every time you run the fridge, those all have fingerprints, and you can tell exactly what's happening inside a home, exactly what brand of things people have and what they own. So the pod enables us to put all that data generation back into a place that actually is representative of us, and we have control, consent, and ownership over it. That's Solid in a nutshell.
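As a concrete illustration of the pod, identity, and app split described above, here is a minimal sketch of the chess example in TypeScript using Inrupt's open-source @inrupt/solid-client library. The pod URL and the chess "move" predicate are hypothetical stand-ins, and the authenticated fetch assumes a browser session that has already logged in with a WebID; this is a sketch of the idea, not a complete application.

```typescript
// Sketch of the chess example: each player's moves live in their own pod,
// and any authorized app can read or append them. The pod URL and "move"
// predicate below are hypothetical illustrations, not part of the Solid spec.
import {
  getSolidDataset,
  createSolidDataset,
  saveSolidDatasetAt,
  buildThing,
  createThing,
  setThing,
  getThingAll,
  getStringNoLocale,
  type SolidDataset,
} from "@inrupt/solid-client";
// Authenticated fetch tied to the user's identity (WebID) after login.
import { fetch } from "@inrupt/solid-client-authn-browser";

const MY_MOVES = "https://alice.example.org/games/chess/moves"; // hypothetical pod resource
const MOVE = "https://example.org/vocab/chess#move"; // hypothetical predicate

// Append one of my moves to my own pod; I decide who may read this resource.
async function recordMove(san: string): Promise<void> {
  let dataset: SolidDataset;
  try {
    dataset = await getSolidDataset(MY_MOVES, { fetch });
  } catch {
    dataset = createSolidDataset(); // first move: the resource doesn't exist yet
  }
  const move = buildThing(createThing()).addStringNoLocale(MOVE, san).build();
  await saveSolidDatasetAt(MY_MOVES, setThing(dataset, move), { fetch });
}

// Any app I authorize can read the same data back, which is why we can
// switch apps halfway through the game and keep playing.
async function readMoves(resourceUrl: string): Promise<string[]> {
  const dataset = await getSolidDataset(resourceUrl, { fetch });
  return getThingAll(dataset)
    .map((thing) => getStringNoLocale(thing, MOVE))
    .filter((m): m is string => m !== null);
}
```

The identity circle shows up only through the authenticated fetch: the app never holds the data itself, it just reads and writes against whichever pod the person points it at, under whatever access they have granted.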

 

Debbie Reynolds  19:18

So right now, people don't really have any control over their identity, right? Their identity is sort of these fragments of data that different corporations have and gather and buy or sell or trade with one another. So, you know, a lot of people think we have control over that, and we really don't. So I think the centralization is important in two ways: one, from a technical perspective, being able to have agency over how you share or don't share your data; but then also being able to create a more fulsome picture of yourself that you can share, one that's more true than what people infer from the fragments of data that are just sort of running around about you. What are your thoughts?

 

Davi Ottenheimer  20:16

Yeah, exactly. So part of living in the digital realm is like the physical realm; really, it's not that different. But in the digital realm, you give pieces of your identity away as you move around, right? You say, hello, my name is Davi, and I'm from the Bay Area, or San Francisco. And so people collect that information. But nobody in the physical realm really believes they have the absolute definition of you because you've shared something with them. They still think, well, if I need an update on that, I go back to that person to get an update. Are you still living in San Francisco? Where are you now? Is your name still Davi, or have you changed it legally? Stuff like that. When you get into the digital world, though, you give a piece of information away, and suddenly people think, well, I have this information, I have the absolute right definition of this person, because I've captured some piece of them, and they even try to sell it as though it's accurate and real. But we know for a fact, first of all, that the data is not accurate. We know that consent isn't being given to share it with other parties; right, there are repeated breaches and lawsuits over and over again. And so one of the questions I got asked recently, which is really interesting, was, if you use Solid to give people the right to control their data, as they should, what would prevent poor people from just selling out and rich people from getting privacy? And I thought, what an interesting question, because first of all, it presumes poor people would sell out. Right? Just because you're poor, you make bad decisions? That's not true at all. In fact, it may be the opposite. Poor people may recognize that because money isn't everything, they would only do it to help the greater good. Like, I'm going to give my information away because it's going to help solve cancer. It goes back to privacy versus knowledge: I'm going to share because I'm trying to help everybody; because I'm poor, money doesn't mean that much to me. On the flip side, we know that rich people don't always make the best decisions. I mean, look how extremely wealthy, I'm talking million-dollar-plus income, Facebook executives greedily gave away everyone's privacy, just dumped on Cambridge Analytica, was it 90 million people's identities or information, because they wanted to give access to 300,000 accounts. That's just exponential harm on top of harm, because being so wealthy, they couldn't make the right decision. Or, being so wealthy didn't help them make the right decision; that's a better way of putting it. So are people in a position to make good decisions about their privacy if you give them the capability? And I think the answer is, well, it depends on whether they're, you know, studied and learned in the art of when to share and when not to share, which is no different than the physical world. You know, who do you tell your name to? Who do you go visit? Where do you go? So we're approaching a place where people need to recognize that you could live in a world, you should live in a world, where the information you share out there isn't what defines you.

 

Debbie Reynolds  22:54

I guess it concerns me, for obvious reasons, right, that bias sort of seeps in here. So if you have a parent who doesn't know you, right, how can they know what's best for you? Or they may infer things that are not true, and then actions are taken against you that can harm you.

 

Davi Ottenheimer  23:14

Right, or even more troublesome is when you get into examples of child exploitation and prostitution, where basically women who are underage are groomed and then turned into permanent children with a pimp as their controlling authority. And their bodies are used against their will for things that make the pimp successful, right? Disposable lives; that's the Facebook model, essentially. It's a sort of enslavement. It's entrapment and exploitation of humanity. So when the prostitute kills the pimp, these are classic court cases; when they kill the pimp, they're exonerated, because they were under control. And in fact, consent is interesting, because with a child, people would typically say, well, they were promiscuous. They will come up with excuses: well, you shared your data, you gave away your password, you used a weak password; that's promiscuity in the digital sense, you did things that allowed you to be exploited by others who had dominance over you. But that is not consent. And in fact, under the law, you do not have the right to give consent to the use of your body as a child, right? The adult is always wrong, and they're always abusing you. In that context, you have not made a conscious choice. And that's where, in the digital realm, we need to get much more intelligent about when we have predators, and we have people who are controlling tyrants, pimps, and so forth. They're not just parents. They're enforcing dominance for self-gain, in a way that's exploitative, victimizing people who are trying to live in a world that they think is safe, but they have no knowledge of what they really could be living in, which is a world that gives them agency, that gives them definition of and rights to their own body.

 

Debbie Reynolds  24:48

Yeah. So now, there is a concept that you have, that you talk a bit about, which is digital slavery. Can you explain that a bit?

 

Davi Ottenheimer  24:59

For sure. It's not that different from slavery. I mean, that was one of the big eye-openers for me. As I dug into this research more and more, I found that a lot of the parallels to history were in the slavery concept. So what you find is this idea of aggressively taking: you take space, you take the people in that space, and then you make them a profit center. And you profit from these people's bodies, their labor, their generation of ideas, their generation of other bodies, right; all the assets and value and things that they create are yours to own. If you look at that model, that's essentially what I'm finding at Facebook. It even got to the point where I read about Facebook saying, well, the value of the American user is going down or staying flat because fewer and fewer are joining our platform, so we need to find a country, like Cambodia, that we can go into and make a pitch: we'll provide Internet, and then we'll get four Cambodians for every one American, because they're worth 25 cents and an American account is worth $1. That, to me, set off so many alarm bells. You know, I was a longtime student of colonialism and imperialism, and a lot of what I write about, I think, is shocking to Americans, because they're never taught the actual history. Right? It's shocking to people that Washington was an avowed slaver. What you're taught is that he really wasn't into it, he didn't really like it, and eventually he wanted it to go away. But none of the facts support that. What you see is that George Washington actively found loopholes in laws to make sure he didn't lose his slaves. Throughout his life, he conspired with his lawyers to get away with the exploitation of people. Even on his final day, he died after a terrible cold storm that was so harsh he couldn't survive it, but he had been out sitting on a horse looking over his slaves. No one ever talks about what happened to the slaves in that same storm. And so when I think about the context today of digital slavery, I think, okay, you have these executives at a company like Facebook who are getting value out of all these people they draw into their platform, and they're just looking at ways they can keep them from leaving. And they're looking at ways they can generationally continue to expand and get into more markets and more relationships, and if you leave, you're dead to them. That's like running away, right? And so there's so much pressure on preventing people from ever leaving, on always staying and always generating value for them. Yeah, I mean, I could go on forever about the historical comparisons. But I think it's so instructive that it's hard to have this discussion with people unless they really understand the history of slavery, and so many people do not. So that's the first problem. And then the second is, people need to understand that when you get into a cult, it's kind of a form of slavery. If you join a cult that you can't leave without being deprogrammed, where you feel like you can't live without them, you're getting into the mindset of colonialism. The power of colonialism was that it came in, in a way that you couldn't get it out. Right? So if we talk about colonialism at all in the digital space, we really need to recognize that it was a symbiotic relationship that was meant to invade your space in a way that you no longer had agency and control over your own destiny and your own body. You couldn't self-rule.
You couldn't articulate or express yourself, because of the way that it wouldn't leave you. You had to have it in order to even survive. And I find that, like in the Bay Area, I hate to use the stereotype, but I find young white men who come out of very fancy Ivy League colleges who come and say, my goal here is to find something I can get inside of that can't get rid of me. I'm going to buy up all the parking spots in San Francisco so that everybody has to pay me to get them back again. That's unethical. It's just absolutely illegal; you can't do that. You know, if you're in New York City, they recognize this, because they've had so much experience with the mafia and the mob, trying to get them out of the garbage collection industry or get them out of cornering the restaurant industry. But in the Bay Area, in Silicon Valley, somehow they don't have that instant mindset, the reaction of, well, that's just colonialism, that's just RICO, that's just unethical cornering of markets, monopolization. So that's where digital slavery comes from. It's an easy way for me to sort of characterize the business model of Facebook, and why you'll find Facebook executives leaving and saying, we're here to protect you. It's not uncommon. I mean, I went to school in England; it was not uncommon to find people who had left the colonial army and said, we were there just to protect people, we were there just to help people, we were there to make them better, and they couldn't do it on their own, they're not capable. And that's the colonial mindset that I find often coming out of Facebook executives: I'm here to help. I'm here to help you with privacy, because I was doing the absolute opposite for so long; I'm the person you want to talk to.

 

Debbie Reynolds  29:56

Yeah. I would love to talk with you about the way that, you know, organizations, governments, I see this a lot, try to have you choose privacy versus security, as opposed to talking about it in a way where they can exist together in some way. So, you know, what are your thoughts about that?

 

Davi Ottenheimer  30:25

Well, that's absolutely the problem with moving away from the privacy versus knowledge discussion. Security protects knowledge, but security also protects privacy. You can't put security in between privacy and knowledge; it doesn't fit, it goes into both. If you want more knowledge, you use security, usually integrity and availability. For example, availability is a big, big, big part of security, but that's opposed to privacy, because if I have super high availability, everything is known all the time; that's the best availability, and it can't be lost, ever. Now, confidentiality, whoa, that's also part of security, and confidentiality absolutely blows up all your availability: I lost this key, and it's gone forever; it's very confidential. So we have, in security, the balance between privacy and knowledge; we're supposed to do both. And so what people are doing, very disingenuously, is taking one part of security and saying it's opposed to privacy, but not the other part of security, which is also right there, the elephant in the room, which is completely in favor of privacy. I guess a lot of people don't want to be balanced. And that's probably the hardest thing: if you go all-in on one, availability, and just availability, it's super easy, because all you do is try to keep uptime and measure how many servers are on all the time. And then when somebody says, I need all my servers to be offline now, it blows up everything. So you have to be able to constantly balance between two completely contradictory approaches, opposed values. Now, integrity, which I left out there, also factors in here. It's a third circle in the Venn diagram. And actually, security is very bad at integrity; we've almost completely left integrity out of the room, because it's difficult to talk about. But a form of privacy is no integrity. If you don't trust the data, if you don't believe the data is accurate, then you don't use the data, because it's garbage. And so what you find, in real-world terms, is when you say, please put in your secret, people put pineapple. What's your mother's name? Pineapple. What street were you born on? Pineapple. So there's no integrity to the data, because you're generating privacy, right? Because you don't trust the people who are going to have that database of your information, you use garbage data, or you use tokenization, where you substitute: pineapple means, I don't know, Washington, and mango means cinnamon. Whatever system you use, integrity is a big, big, big factor in preserving privacy, so security is absolutely in favor of privacy. But on the flip side, if I have really good integrity, because I've really worked hard to make sure there's integrity, I lose a lot of privacy. So in between availability and confidentiality, integrity itself is constantly working against itself, because you're trying to do integrity for privacy and integrity against privacy. So it's complicated when people really understand security. They don't just jump into it and become famous. That tends to happen two or three years into a security career: people become famous because they found a bug or a flaw in something, and somehow that makes them famous, as opposed to spending 20 years really grinding in the trenches, where everything is contradictory and nothing makes sense, to have a very big-picture conceptual understanding of, hey, there's no perfect answer here.
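To make the tokenization idea concrete ("pineapple means Washington"), here is a tiny sketch: the stored record carries only an opaque token, so the data has no integrity for anyone who takes it, while the mapping back to the real value lives somewhere the owner controls. The token format and in-memory vault are hypothetical illustrations; a real system would protect that mapping separately.

```typescript
// Tiny sketch of tokenization: the database only ever sees opaque tokens,
// while the real values live in a separately protected vault (hypothetical).
import { randomUUID } from "crypto";

const vault = new Map<string, string>(); // token -> real value

function tokenize(realValue: string): string {
  const token = `tok_${randomUUID()}`; // "pineapple", in effect
  vault.set(token, realValue);
  return token;
}

function detokenize(token: string): string | undefined {
  return vault.get(token); // only callers with access to the vault can resolve it
}

// The stored record has no integrity for an attacker: the street name is garbage to them.
const record = { question: "What street were you born on?", answer: tokenize("Washington") };
console.log(record.answer, "->", detokenize(record.answer));
```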

 

Debbie Reynolds  33:34

So if it was the world according to Davi, and we did everything you said, what would be your wish for, let's say, Data Privacy, whether it be, you know, in ethics, law, or technology, anywhere in the world?

 

Davi Ottenheimer  33:53

What would I like to see? Or what would I change?

 

Debbie Reynolds  33:56

Or even the human space. I like to throw humans in there too.

 

Davi Ottenheimer  34:00

The human space? Well, let me start with Solid again, because I'm working on Solid with Tim Berners-Lee, which is a huge honor. It's just amazing; my whole career, my entire technology life, has been on the web, from the very beginning, since the web was created. I feel like I jumped into it in 1994, and everything since then has been the web. So to work on security with the guy who created the thing is mind-blowingly wonderful, refreshing, because I really believe in the web, and I really believe the change in Solid is going to make it better. It's like sitting down with the founder who says, it's working, but not as I had hoped, so let's improve it. And I'm all in for that: let's make a better place, a better world, in the digital space. When people talk about the metaverse, I go, yeah, Solid gives you the metaverse. When people talk about blockchain, I go, yeah, Solid gives you the blockchain. It is the infrastructure layer, like the web, that blooms a million flowers. It creates that better world, and it has all the balance built in. You want to over-centralize, because that's your deal? You can do that on Solid; you can over-centralize to protect yourself from being over-diversified. You want to diversify, because centralization is a huge threat to your existence? You can do that. And so to create a future that is that flexible, that diverse, and that representative of the human condition is what I'm trying to work on, because the human really matters. In the technology space, the technology does not matter as much as the human, right? The tractors, the plows, the fire, all that stuff is used by us, and we need to think of it in those terms, even in the digital space. And so that's the change. Solid is, in a sense, something other people have thought of that I'm all on board with, because I believe it's a reflection of the human condition that is most useful. Now, that being said, you know, Wollstonecraft is probably my favorite philosopher at the moment, and has been for a while. I used to like Hume extensively, but I realized no one's heard of Wollstonecraft, so I started focusing more on her. And I'm trying to understand why a lot of what she said is so helpful, because you're talking about a woman in the late 1700s who said women should vote, women are equal to men, Blacks are equal to whites. It's the 1700s. Now, in context, in 1735, the colony of Georgia abolished slavery. So there was a lot of abolition in America before the revolution, which brought slavery back. Americans have to understand that the American Revolution was about preserving and expanding slavery; it was not about the end of tyranny, it was about creating tyranny. And so when you look at Wollstonecraft in the context of the 1700s, saying, hey, Blacks are equal, women are equal to men, humans are equal, and it's all about education and learning, it's just kind of mind-blowing to think she's not someone anyone's heard of. And her philosophy is so spot on target. Her writing is so clear, and she just trashes Rousseau, as he deserves to be trashed. Rousseau was fundamental to fascism, for example; Rousseau's philosophies ended up being tyranny, and she just trashes him in the 1700s. And so I would like to see people working on projects that are in the spirit of Wollstonecraft, seeing the future. So often, people say, how can you see the future? How can you predict where we're going to go? We need to collect all the data in order to have a better picture of what to do.
That's been a big mantra in big data, you know, the data lakes and AI and ML: if we just have 10 million more crashes and 50,000 more fatalities, we'll finally figure out how to prevent one. It's such a broken mindset. And here you have somebody who did not have any of the resources, did not have the Internet, did not have keyboards, you know, let alone a printing press; she didn't have any of the modern conveniences, and yet she could see the future. And people really need to think about that, let alone the people who saw the future centuries before her, or thousands of years before. We have that capability right now. The way that we approach technology, the way we use technology, we need to think about how to improve other people's lives, pull them up, and make them equal. And so, if I were to change one thing, it would be to get in front of the people who believe that we need to be competitive and help them understand that humans are successful because we're compassionate and collaborative. It's what defines us as humans. We're not competitive by nature; we learn to be competitive. We are compassionate and collaborative by nature. And that would make our technology much more useful and successful, and our innovations would be much more impactful. So whatever it is we do, I would want that to be the change. And I believe Solid is a manifestation of that, that we can collaborate better, because this protocol allows collaboration. Now, we can compete also; competition is a small piece of that, but competition is within the collaboration. If you're on the same team, who's better on the team isn't meant to make the team less successful; it's meant to help you collaborate so that you can compete against a larger and larger total. Ultimately, you're all collaborating.

 

Debbie Reynolds  38:51

Well, it has been a great session. Thank you for illuminating all these different areas, and I think your work in ethics and trust is fascinating. And you know, I'm sure we'll have other chats in the future.

 

Davi Ottenheimer  39:05

Thank you so much for having me.

 

Debbie Reynolds  39:09

Thank you.
