Debbie Reynolds Consulting LLC


E126 - Roberto López Dávila, Legal Advisor, General Court Of Justice of Puerto Rico

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E126 - Roberto Lopez (37 minutes) Debbie Reynolds

37:18

SUMMARY KEYWORDS

privacy, began, people, data, data protection, regulation, inferences, principles, ai, law, fact, technology, europe, happening, thoughts, challenge, cross fertilization, importance, field, debate

SPEAKERS

Debbie Reynolds, Roberto López Dávila

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. So I have a special guest on the show all the way from San Juan, Puerto Rico, Roberto López Dávila. He is a legal adviser at the General Court of Justice of Puerto Rico; welcome.

Roberto López Dávila  00:42

It's a pleasure to be here. I'm finally having this great opportunity, mostly because you are someone that I admire so much. So it is really a dream come true to have a chat with someone that I admire so much in the privacy field. It's a great honor.

Debbie Reynolds  01:03

Thank you so much. I'm so excited to have you on the show. We've been trying for months to get this together, and we finally did, and I'm really excited. I love to talk with people from all over the world, and I know you have a very unique perspective. I follow all your stuff on LinkedIn; you always have smart answers. And then also, at some point, we're going to talk a bit about AI because that's an area that you're interested in as well. So tell me, how did you come to your career? And how did you develop an interest in privacy?

Roberto López Dávila  01:39

Well, I am a Gen X person, so I began my career just at the dawn of the 21st century. This was all dot-coms and all this enthusiasm with the web and so on and so forth, and then a public debate with regard to the possible relationship between law and technology. And I got fascinated, you know, at the very moment that this debate began with that important book by Lawrence Lessig; all the conversation that started with him just blew my mind. And I knew at that moment that I had not only to become part of the conversation, but in order to do that, I had to get the fundamentals. One of the most important aspects that this new relationship between law and technology was generating was with regard to privacy; major issues began to surface as important aspects to take into account, particularly, you know, when we deal with the rule of law, fundamental values, and the importance that we as a society, at least in the Western Hemisphere, give to privacy as a fundamental right, or at least as something important for a civilized society and a civilized life for all of us. So I began to follow the work that the International Association of Privacy Professionals was doing; at that moment it was just in its early steps. Then I began to take the certifications from the association in order to get really involved in the debate and the conversation, and also to help my current employer with all the privacy and cybersecurity issues that we were confronting as part of the modernization efforts at the judicial branch.

Debbie Reynolds  04:17

Yeah, I want to talk a little bit about that. So you've been with the General Court of Justice in Puerto Rico for almost 19 years. At what point did privacy start to creep up, where you said, okay, I need to really pay attention to this and to technology?

Roberto López Dávila  04:35

That began when we started to think about putting forward a digital justice initiative, not only because I was seeing all the problems and the gaps that we began to see while we were crafting all the projects and so on, but also because I was doing a lot of study of what was happening in other jurisdictions. In that way, I had the opportunity to see things that we had to tackle early on in order to avoid a lot of the problems that different jurisdictions were confronting within their justice digital transformation efforts. So it began there, while we were, you know, involved in all these projects. We currently have an electronic case management system running, along with e-filing and notification platform modules, that are working pretty well, not in all of the subject matters, but most of them, particularly in the civil field. Then things began to transform, and we began to see that we not only had to have privacy awareness, but we also had to develop some kind of practices within the way that the courts operate. In that way, we began to realize that we had to modernize or, you know, update our document management practices, and also the document and data retention process. Recently, during the past month, the Supreme Court, which is the main governance body of the courts in Puerto Rico, approved a new data and document retention regulation, which not only deals with the different retention schedules for documentation but also includes sound data management practices that we believe will be critical in order to have a preventive, more controlled approach to how we deal with data and documents, taking into account the importance that they have for the judicial process as a whole.

Debbie Reynolds  07:41

Yeah, right. That's fascinating. That's great; I love that you guys are doing that. So I guess there are two parts to that. One is, this is the way that we think people should handle themselves going forward. But then also, you're adding more of a proactive approach to it as well. So it's not just, this is the rule, and this is what will happen if you don't follow the rule. In order to give people guidance, you also added to your regulation, hey, this is the way that you should organize yourselves around data, correct?

Roberto López Dávila  08:20

Yes, and we have another challenge in that regard, and that is our jurisdiction: Puerto Rico is, as you know, a US territory, so some US laws apply to us directly and others don't. As part of the US, let's say we don't have the data protection tradition that other countries have; countries in Europe have a more robust data protection background, and we don't have that. That is not to say people don't know that privacy is important. In fact, privacy is a core value for Puerto Rican society and is enshrined in our Constitution as a fundamental right. But that hasn't translated, at least yet, into a comprehensive law that reflects the importance that data protection has as a projection of privacy as a fundamental value. So we have that challenge, in the sense that people perhaps do not have this tradition of guiding their behavior by certain principles of data protection, this culture, I guess, that perhaps other countries have. So the judicial branch has, you know, a tough job to do in that regard; it has to compensate for the lack of a robust culture with regard to data protection. And this is something that we as a society, as a Puerto Rican society, have to deal with; we have suffered a lot of cyber threats, cyber attacks, and cyber events, particularly in the public sector, to my mind because we haven't given privacy and cybersecurity the importance they require. Right now, the government is trying to do a better job in that sense. But what I wanted to say is that it is a little bit more difficult when you have a society that doesn't have this, you know, robust background from a data protection perspective, so we have to work even harder to make sure that we get this right, because data is everything when it comes to the judicial process. After all, we are moving data from point to point; you file an action, that action transforms into a process, we add information, the other parties in the case add more information, and then all the judicial operators also begin to add more to the process. So it is a lot of information that we have to deal with, and it is the core of all the judicial processes. So we have to adopt some sort of fiduciary attitude as custodians of that important data.

Debbie Reynolds  12:20

Oh, that's interesting. This is fascinating. Okay, so this is cool; I didn't know that Puerto Rico had privacy as a fundamental human right in your Constitution, which is great. I would love for the US to have that. But you're right, there are gaps there, right? It just can't be a feeling of privacy; you want to be able to have some meat on the bone around how people can protect privacy. And technology always outpaces the law, so you're always kind of playing catch-up. But I think you hit on a really interesting point, and that's around education, and also understanding that, you know, our country hasn't had the same experience as, for example, Europe, because I've heard people say, why don't we just take the GDPR and put it in the US? And I see you're laughing. Actually, you know, we're just different countries; we have different foundations. I think one really interesting thing, and I want your thoughts about it, is that you've done very well by talking about data protection, right? Because people in Europe sometimes get upset when we in the US talk about Data Privacy, because for them, privacy is a fundamental human right, so all they're trying to do is protect the privacy that they have. Whereas we're trying to get privacy and trying to protect it at the same time. So what are your thoughts about that?

Roberto López Dávila  13:55

Well, this is, you know, the traditional contrast: the US, liberty; Europe, fundamental rights. And this is something that, in some respects, I'm beginning to see as a little bit unjust with regard to the US, because a lot of the evolution of privacy as a concept, including as a human right, has been developed in the US. Besides the typical reference to the Brandeis and Warren law article, which is, you know, a seminal article that created a whole new field of law, which is a big event for law in general but particularly for privacy law, you also have to take into account that a lot of the principles that we take for granted were developed during the 1970s in the US in the form of the Fair Information Practice Principles, which are the basis of the OECD guidelines and even Convention 108 of the Council of Europe. So there has been a strong influence on a lot of the things that we consider right now as the standard from a privacy perspective. We also have to take into account that, despite the fact that, as of this moment, the GDPR seems to be the standard, and probably will remain the global standard, or the one that a lot of countries look to in order to craft their own statutes, the work that California has been doing has been, you know, the model for a lot of laws that have to do with technology. It's really amazing when you see that the CCPA, as amended, mirrors the GDPR in a lot of ways and even goes beyond it. Take, for instance, the breadth of the definition of personal data: they include inferences drawn as part of the definition, and this is a major development that, at least as of today, is not recognized within the GDPR. It remains to be seen how these are applied, because there is a lot of trouble in the event that the American Data Privacy and Protection Act is finally approved, and in the preemption debate that is currently going on about whether, in fact, it will preempt State regulations, including a robust one in California. But, you know, this is the laboratory of democracy: States have been doing a lot of work and creating a model for the rest of the nation in different areas, particularly when it comes to technological regulation, and this is another instance of that at work. So, in the end, I guess the US has to learn a lot from other jurisdictions, and in fact, I believe it has learned a lot; probably the momentum that we are sensing from the US at the Federal level has to do with the pressure that it is beginning to feel, not only from Europe but also from China, which has recently been very active in the regulatory field with regard to technology as well. But perhaps we have to be more just, to my mind, about what has been the role of the US and its contribution to creating standards and providing a basis for regulation all over the world.

Debbie Reynolds  18:51

Yeah, I think you're right. It's interesting that you mention that. I had another podcast guest recently, Cameron Kerry, who's at the Brookings Institution, and he said the same thing that you just said about the Brandeis article and the fact that, you know, it created this whole new theory around privacy in the US, and also about the US leading and being very influential in privacy in the early 70s, and how that accounts for where we are now. It's very interesting. Like you said, I love the way you put it, about these different jurisdictions developing privacy as laboratories of democracy. And so, you know, we do have an issue now in the US with States' rights, you know, rights in territories, as you say, Federal rights, and figuring out what we want to do on a national level. I think that's really interesting. The two things that always come up are preemption and private right of action, which people fight about; I don't think people are ever going to stop fighting about them. But one thing I do want your thoughts on, and this is another debate that people have a lot around privacy laws: US laws, in my view, tend to be very prescriptive, while the European privacy laws, in my view, tend to be, you know, less prescriptive, right? And this is so funny, because I have people who dislike either one for different reasons. Like I say, I like the GDPR because it has these broader concepts like Privacy by Design, and it's sort of incumbent upon the organization to decide how that works within their organization, right? Whereas we have something like California with the CCPA saying, like, put a button on your website that says this exact thing. So what are your thoughts about that dichotomy of, kind of, the broader concept versus, you know, we're just going to tell you exactly what you need to do and be very specific?

Roberto López Dávila  21:14

Yeah, well, I guess this is one of the, you know, instances where you see the different traditions reflected in the way laws are written on both sides of the pond. The technique employed by the Europeans reflects the human rights-based approach that they have, in this case the human rights roots of data protection. That's why they place a lot of emphasis on principles; you have all these principles: lawfulness, fairness, transparency, purpose limitation, storage limitation, and they began to create rules around those principles. In the US case, they believe that if you approach regulation in that way, it will hinder initiative, or people will not understand what they should do; it will be more difficult to translate those principles into actual practices. In fact, that's one of the, I don't know if we should call it a problem, but it's a challenge: to see how you translate those principles into actionable practices. When it comes to enforcement, the GDPR is very difficult; there are a lot of issues that are open. That's why there have been a lot of opinions and reports from the European Data Protection Board and the former Article 29 Working Party, because there are a lot of gaps in how we will translate this into something more concrete. So, as a community of professionals, we are trying to understand how to make this operable and how to enhance enforcement. If you notice, there has been some sort of cross-fertilization between the two blocs, not only because the US is drawing principles and aspects from the GDPR and other similar regulations into its own, but it is also reflected in the fact that, for instance, there is now a tendency to name at least some of the regulations within the European Union domain "Acts", as in the case of the AI Act. There has been some sort of cross-fertilization between the two, and to my mind this has been very fruitful and is the way to go, because we are seeing now the importance of it within the proposal of a new transatlantic data transfer mechanism. We need both sides of the pond to get together because there is a lot at stake here, not only from a commercial standpoint but also from a human being's standpoint. Right.

Debbie Reynolds  25:24

That's true. I think, obviously, there's a commercial impetus, because, you know, the European Union and the US are each other's biggest commerce partners; they do, I think, $7 trillion a year in commerce, so they really want to keep that going. But you can't ignore the human rights focus, which we don't exactly have a foundation for in the US. And so my thing is, you know, we're not going to be the same. I don't think we're ever going to have a GDPR in the US; that is not going to happen, right? But we have to find a way to respect each other's principles. In the US, I think our fine line is drawn around free speech. And in Europe, I think it is about privacy; privacy meaning not sharing everything, being more responsible, and being more transparent. So I think they're going to eventually work this out; I'm interested to see whether this is going to be like Schrems 3 or something, probably a challenge there; it's going to be really interesting. But I think one of the things happening in the modern day that you've touched on, which I think is great, is that in the digital age, we can see all these events happening in different countries, and we can kind of pick and choose and look and see, hey, this is what works, this is what doesn't work; maybe I'll take a part of this law or that law and see how it works in my jurisdiction. Being able to do that is very helpful for people who are working on privacy and privacy legislation. I do want to touch a bit on AI. I know that you are a Policy Fellow at the CAIDP, and I want to talk a little bit about what you see in the future as a challenge with privacy and AI. You talked earlier about how privacy seeped into your work with digital transformation, all the technology and so on. I think AI is kind of the next level of that. So tell me about your concerns with data as it relates to AI and privacy.

Roberto López Dávila  27:50

Well, this is huge. I believe what we call AI will challenge, at least to my mind, everything with regard to the way that privacy has been safeguarded and regulated, the approach that we have used to safeguard privacy and cybersecurity to this point. This is something that I have given a lot of thought to, not only with regard to what I read but also within the work that I did at the Center for AI and Digital Policy. When you look at how AI works, you see that it goes against all the principles that are the foundation of privacy to this point; data minimization, you know, is contrary to the way AI demands data. It is the complete opposite of what Data Privacy has been crafted for up to this point. That means that the way we have dealt with safeguarding the rights of individuals, with the typical notice, choice, consent, and purpose limitation, does not work as well in AI and algorithmic environments; it is totally different. I will give you an example: you provide consent for certain purposes for a controller to process your data, but using AI, they not only use that data to process different aspects of their operation; they are also mining all this data and getting a lot of information that probably has nothing to do with the purpose for which you gave your data in the first place. So you are defenseless, in a sense, because you are in the dark; you don't know what your personal data will be used for, or is being used for, by the controller. So in that sense, if you continue to approach Data Privacy with the control paradigm, in which you have the ability to decide what information you give or not, which has been the typical classical definition of privacy from Alan Westin and, you know, a very important part of the history of privacy, you fail to provide the appropriate safeguards that people demand. People will not understand what uses are finally given to their personal data and what inferences the controller has drawn from all the data they have given to it. So this is a really big challenge that we have as privacy professionals, because AI and all the different autonomous and algorithmic technologies associated with it are beginning to become ubiquitous, which is currently happening, and are not only becoming more powerful but also operating at a big scale. There are AI systems in a lot of places where you will never know that this type of system is being used, but your fundamental rights will be at stake. The challenge is not only to actually become aware that they are using these systems to process your data, but also how you will challenge the inferences when these systems are, in a way, black boxes; even if you can break into one, that doesn't mean you will understand how it reached the inference it is reaching. And another problem is that inferences are just that, inferences, not facts. If we do not counter this, we will have a lot of systems reaching conclusions or inferences about us. And this goes to the main aspect of human dignity: you will be judged not only on who you are, what you are doing, or what you will do, but probably on your proclivities, or what your tendencies are, based on a statistical system, which is in no way fact. So we have a very huge challenge on our hands as privacy professionals.
There's a lot of scholarship right now trying to come up with new information privacy principles for the AI age, because with the current instruments that we have, we will not be able to provide the protection and safeguards that people demand moving forward.

Debbie Reynolds  34:01

I agree with that 1,000%. So if it were the world according to you, Roberto, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be for humans, technology, or regulation? What are your thoughts?

Roberto López Dávila  34:21

I wish that privacy will still be a big thing in the next century. This idea that privacy is simply something you come to terms with losing, you know, I despise it. It is something that we have to avoid; it is a very pernicious concept, pushed by a lot of commercial interests, that makes people think they are helpless and cannot avoid it. But we can resist, and we shall resist, because privacy can and should adapt. Privacy is not the same as it was at the end of the 19th century or even during the 20th century; we need to adapt privacy to our current era. That's a given. But we cannot afford to renounce privacy, because it is something that goes to our very condition as humans. If we do not have privacy, we will not have those traits that make us human: our dignity, our liberty, our self-determination. In the end, democracy will become something of an illusion, and life will not be something of value, if you don't have a particular space of intimacy in which to become yourself and to share with others when you want to. So, at the end of the day, I want to think that privacy will remain not only a concept but a fundamental human right in the building of the next centuries.

Debbie Reynolds  36:14

Yeah. Wow, that's deep. Oh, wow. Thank you so much. I love your thoughts there. And you're not the only person who has said this, thinking in terms of the next 100 years. So, you know, it's not just this year, and it's not just this election season, right? We're looking at a very long game, long term, on this, which is important. So, thank you so much for being on the show. This is great; I love your work. I'll continue to follow what you're doing, and I know people are going to love this episode as much as I do.

Roberto López Dávila  36:47

Thank you very much. It has been really a pleasure.

Debbie Reynolds  36:50

Thank you. Talk to you soon.

Roberto López Dávila  36:53

Likewise.