E131 - Egil Bergenlind, Founder & Privacy Hero Sidekick at DPOrganizer

51:18

SUMMARY KEYWORDS

privacy, risks, organization, companies, tools, professionals, data protection, data, happening, ai, technology, laws, manage, market, customers, build, solution, people, world, lawyer

SPEAKERS

Egil Bergenlind, Debbie Reynolds

Debbie Reynolds 00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a special guest on the show, all the way from Stockholm, Sweden. This is Egil Bergenlind. He is the founder and Privacy Hero Sidekick at DPOrganizer. Welcome.

Egil Bergenlind 00:43

Thank you so much. Thanks for having me. Great to be here.

Debbie Reynolds 00:46

Yeah, this is great. I really like your product, and I've always liked the types of content and education that you put out. I would love for you to tell your story: why privacy is important and why you decided to create DPOrganizer.

Egil Bergenlind 01:06

Yeah, I'll be happy to do that. I've been a fan of your show for a long, long while as well, so it's great to have an opportunity to be here and share my story. I actually started my career as a lawyer, a data protection lawyer, at Bird & Bird here in Sweden. This was about 12 years ago. I did a wide variety of data protection work for local Swedish customers, but being part of Bird & Bird, that international group, I also had the opportunity to work on some really big, fun, exciting projects for world-famous brands around the world. So that was an exciting start to my career in data protection.

A couple of years later, I had the opportunity to move over to iZettle, a payment services company based in Sweden that was later acquired by PayPal, where I wore a couple of different hats. I was responsible for compliance and the overall second line of defense, because they were regulated, but I was also responsible for data protection. So I was the DPO there. I had the responsibility of building a new privacy program that would make us ready for GDPR and that would also make sure we had a good approach to this across the global organization we were becoming as we grew quickly. So we really needed to get this in place fast. This was around 2014, and it was my top priority at iZettle.

I actually went to market trying to look for a solution that could help me in my role there as a privacy professional. Because, as most privacy professionals have experienced, I used spreadsheets, pen and paper, manual processes, all of it, crayons in different colors, to keep track of: what are we doing as an organization? Where do we have the data flows? What chains of responsibility do we have? What exposure do we have? What risks do we have? How do we manage that? But this was before GDPR. Many organizations, at least in Europe at that time, knew about data protection laws, but they didn't invest heavily in them, so there wasn't a big market for it. At that point in time, there were no solutions for people like myself. So that's where the idea started: how would one build a software solution, an offering, that would cater to the needs of privacy professionals like myself trying to build up and manage a sustainable privacy program?

That's where it started, and that is still the same thing that we're doing now, eight years later. We founded the organization in 2015, and over the last eight years, we built our web-based solution that now supports thousands of privacy professionals from all around the world in doing data mapping and risk assessments, managing incidents and DSARs, running training programs, and keeping track of risks, compliance, regulatory developments, etc. That, in combination with developing our offering with professional services, is what I've been focusing on lately. So it's software, and it's professional services for all those companies that have the ambition to take their privacy program to the next level but may not have the necessary resources or know-how quite yet, where we can really step in and support them. So yeah, that's what I do: develop the product and the offering, think a lot about what's happening in the market and how we stay relevant, while at the same time trying to hold on to the things that our customers really, really love us for and where we're strong.

Debbie Reynolds 04:56

Yeah, well, you definitely have a very strong product, a very strong offering. I personally love to talk with people who've been doing data protection since before the GDPR, because you knew what it was like back then. Like you said, it was spreadsheets; it was a priority, but it wasn't a top priority, I think, before GDPR. And I think people got to a place where they found out that they could not really manage this through spreadsheets. In a way, some people still think a spreadsheet is a database, which it isn't. So finding the proper tools to use, I think that's really important.

Egil Bergenlind 05:38

It is, and I think that realization has grown as the market has developed over the last few years. One big difference that we see in the market is how companies have moved away from running data protection projects and GDPR projects toward trying to build something more sustainable. They realize that this can't be a project; we have to think about this long term, because how we process data, what we do, and what we're exposed to, that's going to change over time. So our privacy program also has to change and continuously live with us as a business. And that realization has really driven more companies to the decision to invest in something purpose-built that can be business-integrated instead of putting it all in a spreadsheet. So that's a good development.

Debbie Reynolds 06:28

Yeah. Tell me about the types of hair-on-fire moments that you have with clients. What type of client or what type of incident really drives people to say, that's it, I've got to get a tool, I have to reach out and do something different?

Egil Bergenlind 06:46

Yeah, I'd love to say that it's just the realization that there is a law and you need to comply with the law. But rarely is that the case; it's a driver, it's a factor, but it's not what takes them to that actual decision to invest. It's also rarely the changed expectations of individuals and data subjects. That is not happening as much as we would like to see, either. Still, a few years later, here in Europe, people aren't really aware of the risks that they're exposed to and the rights that they have, at least not to the level that we'd like to see in order for the rules to really have a strong impact on the market. But we do see a lot of companies putting pressure on each other; that's the biggest driver: if you want to do business with me, if we are going to be in this partnership, then you're going to have to prove to us that you take this seriously, that you work with privacy, and that you are compliant. So the pressure from business partners, from business customers, and from investors, that's really the big driver that we're seeing in the market right now.

Debbie Reynolds 08:07

I agree with that. You actually answered the next question I was going to ask, because it's true. I work with a lot of third-party companies, and they're like, this doesn't apply to me. Well, do you want this contract with this other company? You have to align yourself with that. So it does apply to you, right?

Egil Bergenlind 08:28

Yeah, it does. So that's an important factor. But I think what also really increases the momentum in the market right now is the regulatory development that we see worldwide. Until a few years ago, the main challenge for the European-based customers that we meet, at least so far, had been GDPR; that's what they worried about. But now we have regulatory development all over the world, and companies have exposure either directly or as a result of working with businesses in other parts of the world that will forward those requirements from their local jurisdictions. That is also something that adds a lot of complexity and increases the need for additional investment and helpful solutions to manage your privacy program.

Debbie Reynolds 09:27

I'm just curious, what do you think is the area of data protection that organizations struggle with the most? Is it doing things like records of processing activities, DPIAs, PIAs, and international data transfers? What are your thoughts?

Egil Bergenlind 09:47

It's a great question. And actually, I'll tell you, the big, big headaches that our customers came to us with four or five years ago, when the GDPR came into place, are pretty much the same key challenges that customers are coming to us with now. So yes, it's still those fundamental pieces that you need in place in order to run a valuable privacy program. It's data mapping; it's building that inventory, being able to show that you have your records of processing activities; it's doing the DPIAs; it's the risk assessments. It's building the internal routines for managing incidents and data subject requests. So it's still very much those basic things, but we see a different type of approach to them.

In 2018, companies wanted to get these basic things in place, but, as I said earlier, they did it on a project basis; they did it as a paper exercise. And then they thought, okay, now we did it, so now it's done. Now they're picking it up four or five years later, and it's the same challenges, but they're approaching it differently: building it into their business processes to make sure that it's going to stay up to date and actually be helpful for the organization long term, so they won't have to redo it in a few years.

In addition to that, I think what companies are really struggling with is moving away from a reactive approach into something that is more proactive. Maybe that's a result of still not having really gotten those basic pieces in place quite yet. But for the customers that have gotten through those first phases of getting the basic documentation in place, getting your house in order, getting to know what we have and what exposure we have, thereafter it's all about training, which is the right focus: getting the training out in the organization so everybody understands, not necessarily in detail, what the GDPR or other regulatory frameworks mean, but more importantly, how does this affect our organization? What do I do in my role in product, in marketing, in HR, etc., when different things happen? How should I think about these requirements? How do they affect my daily life? That's really what privacy professionals are trying to get to once they have the basics in place. And I think that's exactly the right approach, trying to build something that is more proactive.

Debbie Reynolds 12:48

Absolutely, and I think that's a challenge around the world, being proactive as opposed to reactive. In some ways, I feel like this is the result of a lot of education, especially in business schools, where they think, oh, when something bad happens, we just have to jump on it and have this action plan. But data protection has to be something that's more foundational to the organization; it can't be tacked on at the end. So being able to have people understand that you have to be more proactive. And I think what you're talking about is that people who have been very reactive in the past are finding that they're facing the same challenges over and over if they don't do something differently.

Egil Bergenlind 13:33

That's exactly what I was trying to say, but you put it better. That's right: if we keep doing the reactive stuff, then we're stuck in that work. But if we're able to help the organization understand what data protection means for them and what our approach is as a company, that will free up time, and it will enable you as a privacy professional to be more supportive in earlier stages, instead of coming in late in different business processes and creating issues. So it simplifies collaboration and helps the business overall in being faster and more agile.

Debbie Reynolds 14:15

So you touched on something I would love to talk about, and that's collaboration. What are you seeing around collaboration? Because privacy is an area where companies can really thrive if they learn how to collaborate. And a lot of times, when you have these tools, there may be one particular department or area within a company that's driving or wants the tool the most, but then they still have to find ways to break out of those silos and do that collaboration. So how have you seen that, or how does your tool help people collaborate?

Egil Bergenlind 14:50

Yeah, it's a very important topic, and that's also an area where we've seen good improvements over the last few years, actually. Back in 2017 and 2018, we saw privacy work being done in a more siloed way, just like you say. The privacy folks would do something, then operations would do something different, and information security would be another silo. Whereas now, we see more clearly that different stakeholders throughout the organization are involved in the work from day one.

Typically, what we see being successful is a structure of, they're called different things, but privacy ambassadors or privacy champions, something that we see very many organizations using, I think for a couple of reasons. So alongside a central privacy team, you would identify a few different individuals in different departments that you give some extra training and extra responsibility to be your extended eyes and ears on the ground, keeping track of what we are doing in this department. And maybe they know a little bit better than others when it's time to ask the privacy team, when it's time to escalate something, when it's time to conduct a certain risk assessment, etc. So it's that extension that gives you more insight into what is happening out there in the organization, which helps the different departments be proactive, rather than having to solve compliance requirements later in the process.

But it's also very important because most privacy teams out there are understaffed; they don't feel that they have enough resources, enough budget, or enough people to be everywhere. That is also why it's so important to identify those different individuals who can be helpful. They don't have to be privacy experts, and they definitely don't have to be lawyers or legally trained, but they need to have an understanding of what is happening in terms of data processing in different parts of the organization. And they need to have an interest in participating in the work and helping the organization improve its data protection and compliance culture. That is extremely important. And when that's done right, when you identify the right people and can give them the extra training, and they are able to give these matters some extra attention, we see that working out very well.

And DPOrganizer is really all about collaboration: helping our customers do this, helping the core team, the privacy folks, giving them access to the information and the features that they need in the application, but at the same time enabling them to collaborate with different people throughout the organization in carrying out different plans, or ad hoc tasks and activities that come up, but also in managing training activities, doing risk assessments, managing data subject requests, etc. For us, as a supplier, it's very important to realize that these different users have different needs, different skill levels, they need different kinds of support, and they will have different expectations of a solution like DPOrganizer. So we're trying to understand who they are, see their needs, and just be that spider in the web that can help all these different parties collaborate. It's a big challenge to make it all work, and it takes a bit of time for an organization to get it working.

But it's definitely worth the investment to try to find the right governance structure for you and take help from a tool like DPOrganizer to automate those workflows and keep it sustainable.

Debbie Reynolds 19:22

Very good. So one gripe I have about the technology space in data protection is that there are so many people out there, so many different tools. It's hard for someone who's looking to go to market to find a tool or differentiate one from another. So tell me, what is your specialty that really attracts people to DPOrganizer?

Egil Bergenlind 19:50

Yeah, when you start looking for a tool, when you're looking for a provider that can help you take your privacy program to the next level, I think it's super important to go out there and look at different vendors; there are different solutions out there. And I'll be the first to confess that DPOrganizer is not necessarily right for every single company. But where we are strong, and what customers really appreciate about us, is that, compared to many other solutions out there, DPOrganizer is really easy to implement. We're really fast in implementation; it goes really quickly. And it's really easy to use.

We hear some customers coming over to us now with experience from other vendors, and they will tell us stories like, yeah, we've been with this other vendor for a year or two. First, we were stuck in a configuration phase of six months, and we put so much money into it, but we never really got it to work. And now we just don't have the resources to manage the software, and we have to get trained and certified just to learn it. So you have to think about what kind of resources you have. Do we want something that works now, that we can get up and running? Or do we have endless resources to configure something that is extremely customized for our world? So DPOrganizer is really strong in that it's easy and fast to implement. We hear it's easy to use, and also that we as a company are easy to work with, and we're happy about that.

I think another important differentiator between us and some others is that we have a really strong focus on privacy management. I'm one of the founders, and I'm a data protection lawyer, so we've sort of always had privacy management in our DNA, and we're proud of that. We hear customers tell us all the time that it shines through, that this was built by people who understand privacy management, that it's built for privacy professionals. We're super excited to hear that. So yeah, we're proud of that strong focus on privacy management and an easy-to-use solution.

Debbie Reynolds 22:10

I agree and concur with that. So tell me, what is happening in the world right now in Data Privacy or data protection, anywhere in the world, that's concerning you? Something where you see it and go, oh, wow, I don't like what's happening here.

Egil Bergenlind 22:28

Yeah, you're probably going to get a similar response from many privacy lawyers these days. But it's fascinating what is happening; I mean, the fast and widespread adoption of the new tools that we see coming out. That, in combination with what appears to be, in many cases, an absolute disregard for compliance, privacy rights, and other rights from the companies behind some of these technologies, makes me very concerned. It's obvious that these technologies, these tools built in the AI realm, are very powerful and that the opportunities are great, which means everything is happening very quickly right now. We can see how users of all ages adopt this very quickly and how user expectations are changing rapidly. And at the same time, we see that companies, OpenAI is probably the best example, have developed their solutions without, it seems, in many cases, considering laws and privacy risks. I mean, what about privacy by design, data minimization, the legal basis for processing personal data? It doesn't seem that these things were considered at the right point in time when these technologies were developed. So that concerns me a lot. And also the fact that these companies are still not very open about how their models work in terms of collecting and using personal data. So this is concerning for me as a privacy lawyer and as a person, as a father, as just a member of society.

But even though these technologies have huge issues legally and ethically, I think it's a fact that companies will still use them, because of the operational benefits and the wins that are out there; these are just opportunities too great to miss out on, so companies will use them regardless of the risks and the ethical issues. And this will create very real and challenging issues for privacy professionals. They will have to deal with not only the theoretical question of whether this technology works in relation to privacy laws, fundamental rights, and ethics; their companies will still want to use these tools, and in many cases it will go very quickly. So privacy professionals will need to manage this at high speed in a very practical manner. Can we use these tools? Can we use some of these tools? And how do we do it? What risks does it actually mean for us? How do we feel about those risks as an organization? Can we manage them in some way? So it's going to be very challenging for many privacy professionals.

I also think it's interesting that many of those privacy professionals who, a few years ago, got the data protection issue in their laps, sometimes not very voluntarily, are now getting the AI issue in their laps as well. That's interesting. But in any case, they will have to support their organizations in addressing the risks related to AI. And I think privacy professionals should urge their colleagues not to jump on using new AI technologies too quickly without considering and aligning on the risks and how they can be managed. We need to talk about these things. We need to assess the risks and make conscious decisions that we can later justify and stand behind from legal and ethical perspectives. So yeah, I think this is what is happening now that causes a lot of concern, from a privacy perspective and professionally, for the privacy folks that we meet in our work.

Debbie Reynolds 27:24

I agree. I think generative AI creates a particular issue around privacy because these types of tools ingest so much data indiscriminately, so it raises the risk around data processing and data protection. I also think what these tools are doing is shifting the risk from the company to the user; what are your thoughts?

Egil Bergenlind 27:57

Well, I think you're absolutely right; that is what we're seeing happen. These companies are saying, well, we don't even understand ourselves exactly what is happening, and we won't tell you how it works; we can't explain it. So users will sort of have to use it at their own risk. And you can say that, but I think we always need to expect companies that develop and take products to market to take responsibility for, and be accountable for, what they develop and what they put on the market, especially when it's so easily accessible by anyone, including young folks who might have a harder time assessing what is happening with their data. What kind of responses are they getting? What sources does this come from? How do these models come up with their answers? How can you use that, and what different risks are involved if you interact with one of these tools and share your data, etc.? So I think we have to require and expect more transparency from these organizations, and they need to take responsibility for what they develop and what they put on the market.

Debbie Reynolds 29:19

You know, it's really interesting with OpenAI and some of these other companies, especially them, since they're the big dog on the scene right now. But the issue that we need to think about, that I want to talk about, is some of the ways that OpenAI operates in terms of how they scrape data on the Internet. That act is actually legal in the US, even though it's not legal in places like Europe. So what are your thoughts about that?

Egil Bergenlind 29:55

Well, I think that's right, and I think that's the reason why we ended up where we are. But we should also remember that OpenAI is not your typical small startup that operates from a garage and doesn't know about the different regulations from around the world. What do they have, a $29 billion valuation, with very large partners and investors? And they've been working on this since 2016 or something. So they should know this; we should expect that they know and respect not only US law but all the different laws from around the world that affect and protect users who might be exposed to their services. So the disharmony in regulation might be a reason why OpenAI did this, maybe, but I think the biggest driver is the market forces, the opportunity, which requires them to move fast and go big to get a superior model out in the market as quickly as possible.

But I really hope for more regulation and more harmonization in regulation across the world over time, and I hope for enforcement and global collaboration in regulatory development and enforcement. I think that's necessary in order to address the risks and challenges related to this widespread new type of technology. And the industry, OpenAI, etc., have to really be transparent. I mean, GDPR never intended to stop innovation. It never said, let's stop processing data and only focus on risks. It tries instead to create balance: let's continue sharing data, let's continue to innovate, but let's not forget about privacy risks while doing so. And I think we have to take that approach when we look at these technologies. Yes, let's remember the potential upsides, and let's try to make them happen. But we have to work together as a society, regulators, and the industry in order to move forward without jeopardizing important fundamental human rights. And the only way to do that is to come together and be transparent: what is happening with the technology? What risks does that lead to, and how do we manage that? That's the only way we can do it, really.

And I think the opportunities shouldn't be forgotten either. I hope that down the road, we will see clearer opportunities and benefits also from a data protection and privacy perspective. How can these tools, this new generation of tools built on generative AI, support privacy professionals? How can they reduce privacy risks and compliance risks? I think there are opportunities, but it's going to take some time before we actually see what that looks like. That's going to be exciting, so I'm cautiously excited about the future a few years from now. But right now, we need to focus on and understand the risks, and transparency is the solution there; I'm quite sure of it.

Debbie Reynolds 33:55

Yeah, what are your thoughts about ethics? To me, a lot of these tools are playing in that ethical gray space, or the ethical gap space, where they're saying, hey, there are no laws for this, so let's do it. In some ways, I feel like we need to hold these companies to a higher standard; just because something isn't against the law doesn't mean that they should do it, right? For things like scraping someone's personal data off the Internet, there may not be a law per se against it for certain reasons. How can we as consumers communicate what our ethical priorities should be, or what the industry should be doing from an ethical perspective, to, like you say, encourage innovation but do it in a way that doesn't harm people?

Egil Bergenlind 34:55

Yeah. Ethics is tricky. It's different to different people and in different parts of the world, isn't it? But I fully agree. I would like all companies, at all times, regardless of laws, to think: okay, what we're doing now, what we're developing, what we're putting on the market, what we're putting in the hands of consumers and different types of users, what does that mean? Is that good? Is that right? Is that in line with society's expectations today and in line with what consumers and users will expect from us in the future? Would I want my own family exposed to this? I like to use tests like the transparency test: if society, if consumers, knew 100% what we were doing, how we were doing it, and what risks they were being exposed to when using what we put on the market, would they be comfortable with it? Would they use it? Those kinds of tests, that kind of thinking, all organizations should do. But we don't see that.

And then sometimes, yes, we see companies in this space saying, well, it's not regulated yet. But we should remember that's not really true, if you ask lawyers or regulators in some of these countries. There may not be privacy laws preventing what OpenAI has done in the US, but we still have consumer protection laws, and we still have product responsibility laws. And we still expect and require companies to take responsibility for what they're doing, how they're acting, and what they're putting in the hands of consumers. I think saying this is not regulated is a weak defense that some of these companies are counting on. But yeah, I hear you. And I think it's a good thing that we're seeing movement in the AI-specific regulation area as well; it's going to be interesting to see when we get new AI-focused laws and what effect they have on the market.

Debbie Reynolds 37:26

You know, this is a tough one because you have market forces from outside pushing into the organization. So you may have someone, maybe a higher-up person within an organization, say, hey, I saw this cool tool; I really want to use it. And you have the privacy folks, maybe the Department of No, saying, hey, you need to really look at these risks. And I think that makes for a very awkward conversation, right? Because it's like, hey, everyone's using it; I'm using it at home; my kids use it; the firm down the street is using it; how can we really leverage this in a way that's responsible? So how do you think privacy folks can strike that balance, I guess?

Egil Bergenlind 38:14

You're right; this is a really tough one. And it's always a tough one for privacy professionals, because this is often their job: trying to get the business, maybe not to slow down or think differently, but at least to make sure that they understand what we are doing here. What risks are we talking about? What is our risk appetite? And how do we manage this situation? But now, it's moving so fast. We're seeing new tools and new use cases for this type of technology coming every day. And the business has this pressure on it to move quickly, to stay competitive, and to improve its efficiency by using these tools as quickly as it can. And management as well; I mean, this is the big topic. You want to tell your investors and your shareholders and your partners that, yeah, obviously, we're on the AI train. We're doing it; we're using it to maximum capacity and getting all the benefits from it. So the business has that pressure on it to move quickly in that direction.

And yeah, that makes it very difficult for privacy professionals to strike a balance and manage the risks. In some cases, you won't be able to manage the risks, and maybe instead the very least you should do is focus on making sure top management and decision-makers understand that, actually, this is regulated; there are things to consider. There are privacy laws that affect how we use this tool. And the fact that these tools may have been developed in violation of laws creates an ethical dilemma for us: even if we are not necessarily violating certain laws while using these tools, should we consider that the tools themselves, the underlying models, were developed in violation of laws? Putting that out on the table, making people aware of that, that's the very least that we should try to do as privacy professionals, as those trying to enable the organization to strike the right balance between innovation, growth, fast execution, and respecting the rights of people.

Debbie Reynolds 40:44

I want your thoughts about this, and this is what I think is going to happen in the very near future. Right now, a lot of times when we're talking about AI, we're talking about these external tools that people can use or not. What is happening right now, though, and probably will happen a lot toward the end of this year, is that a lot of the tools we use every day are trying to incorporate generative AI. So this is going to make it even harder for privacy professionals to make these arguments, because you're going to open up your tool, or you're going to open up your email, and bam, it's going to have that feature in it. What are your thoughts about that? Will we reach that point?

Egil Bergenlind 41:29

I definitely think we will reach that point. And again, that's why it's so important for us as privacy professionals to try to take that view from the top and understand, overall, what are the different issues that we might encounter with this type of technology? What is our approach to that? What kind of policy should we have as an organization in relation to generative AI and different types of similar tools in that realm that come with certain risks? We need to decide on that as an organization. Will we use those tools? In which situations can we, should we? How do we use them? And how do we justify our decisions around this?

You're right; it will come up in all different departments very quickly, and it will be an integral part of the tools that we're already using. In some cases, we won't even have a say in it; we won't be able to stop it; it will just happen as a result of what our vendors and existing business partners are doing. But I think that also requires us to have that conversation not only internally with decision-makers but also with our vendors and our business partners. What are you doing in this space? What do you think about these issues? What do you think it means? What is your approach to that? Speaking about it internally, trying to get a sense of the risks, and setting the policy to align around your approach to it, but also doing the same with vendors and business partners, I think that is crucial in order to keep pace with everything that is happening.

Debbie Reynolds 43:25

Well, it's definitely not a dull moment in privacy and data protection right now. With all the AI technology swirling around, it just gets more complicated.

Egil Bergenlind 43:35

Yeah, definitely. I think we all knew that this was coming. I mean, so much is happening in the space. We talked briefly about the regulatory development earlier, and in the US, so much is happening and so much will happen with different State laws in the next few years, which is super exciting. And I think, as a community, we knew that AI would pose new challenges. We've known that for quite a while, but just how quickly things have escalated over the last couple of months, I don't think anyone could really foresee that. So it has definitely stirred things up and made life a lot more exciting as a privacy professional. Yeah.

Debbie Reynolds 44:21

So if it were the world according to Egil and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world, whether it be technology, regulation, or human behavior? What are your thoughts?

Egil Bergenlind 44:38

Yeah, I think we've started on the right path, but it's moving slowly. I want more regulation, and I want more harmonization between different countries, because data processing, data flows, data risks, that's not something local; it's a global matter. So I would like regulation to be even more harmonized than it is. But also more enforcement. Yeah, it takes time for the different supervisory authorities to adapt to a new situation and to new frameworks, to adapt and grow their organizations, and to actually do supervision; it takes time. But more and faster enforcement will certainly help companies mature and will help individuals better understand what risks are out there and what rights they have.

And that's also something that I really wish for: more awareness among people in general about the risks that we are exposed to as consumers and users of the different types of tools and technologies out there, especially among young people. So definitely more awareness there. I would love to see people exercising their rights under the laws to put even more pressure on companies, which, obviously, are the parties that should know and take responsibility for the risks that they expose people to.

I would also definitely want more people working in the privacy space. That's still one of the big obstacles: so many companies, so many organizations want to do better, and they want to take their efforts forward, but there's a lack of experienced privacy professionals; there's a lack of people who want to work in this space or who have just had an opportunity to experience it. So more people from legal, from information security, from all different areas; I would love more people moving into privacy because there is so much work to be done here. And it's so important to be in that role, in a position where you're trying to enable innovation but at the same time doing it without jeopardizing the rights of individuals.

And finally, I wish for bigger budgets for the privacy professionals who are out there, so that they can invest in better privacy programs, so that they can build better and do more, and obviously so that we can support them even more in being those privacy heroes that we truly believe they are. That's why I have the title Privacy Hero Sidekick. We truly believe that these privacy professionals are the heroes, and they deserve the best tools we can give them. Yeah, we're proud to continue trying to do that.

Debbie Reynolds 47:38

That's great. Well, you had a three-pronged wish; that's wonderful. I love all of that: the harmonization, being able to have more privacy professionals. But one thing that's very important that you touched on, that I would love to chat with you a little bit more about, is bringing people into privacy from other areas of expertise. I think, definitely not the people that I know in Europe, but after the GDPR came out and people started getting more aware of its extraterritorial reach, privacy in the US started to go more lawyer-focused, where I think what we really need is a multidisciplinary group of people from different areas and different expertise in privacy, because we all see issues from different angles. What are your thoughts on that?

Egil Bergenlind 48:35

100% agree. 100% agree. To do it successfully, yes, you need to understand the law, but you also need to understand proper data protection, information security, etc. And just like you say, there are other perspectives than that, more data-oriented ones: what are we actually doing with data in our organization? What are the needs that we see now and in the future? How do we build our data strategies to be aligned with what we're required and expected to do by law and society? In Europe, traditionally, most privacy professionals used to be lawyers, but we see so many cases where DPOs and other roles are being filled by people who don't have a legal background or an information security background. But they understand and are interested in human rights; they are interested in data and what we do with data; and they are interested in ensuring a trustful relationship between the company and its customers and employees. At the end of the day, it's not necessarily more complicated than that. This is about respecting the rights of the individuals that we engage with in different ways. If we need more expertise in legal or information security or other areas, we can get that from other folks, either internally or externally. But it doesn't have to be the experts who have responsibility for and run these privacy programs. What you need is a keen interest in understanding what we do with data. How does that affect our relationships with our customers? And how do we really respect their rights and maintain good relationships?

Debbie Reynolds 50:28

That's amazing. Thank you so much. Well, it's been a pleasure having you on. I love having these chats with you about technology and AI and the things that are developing, and I'd love to continue to follow what you're doing at DPOrganizer. The things that you're doing are going to become so much more important, especially as we see this technology become more complex and people start to ask different types of questions about data protection.

Egil Bergenlind 50:55

Definitely, I agree. It was great being here. Thank you so much for having me.

Debbie Reynolds 50:59

Yeah, this is amazing. We'll chat soon. Definitely, thank you so much for being on the show.

Egil Bergenlind 51:04

Thank you, Debbie.
