Debbie Reynolds Consulting LLC


E177 - Jay Averitt, Senior Privacy Product Manager/Privacy Engineer at Microsoft

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here

32:19

SUMMARY KEYWORDS

privacy, data, information, collecting, delete, data minimization, started, company, law, privacy regulations, ai, app, comply, standpoint, organizations, storing, talk, prompts, retaining, transparency

SPEAKERS

Jay Averitt, Debbie Reynolds

Debbie Reynolds

Many thanks to the Data Diva Talks Privacy Podcast Privacy Champion, MineOS, for sponsoring this episode and for supporting our podcast. Data governance has become an even more difficult challenge with constantly evolving regulatory frameworks and AI systems set to introduce monumental complications. That's why I think organizations need MineOS. This platform helps organizations control and manage their enterprise data by providing a continuous, single source of truth. Start today with a free personalized demo of MineOS, the industry's top no-code privacy and data ops solution. For more information about MineOS, visit their website at https://www.mineos.ai. Enjoy the show.

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a very special guest on the show. His name is Jay Averitt. He is a Senior Privacy Product Manager and Privacy Engineer at Microsoft. Welcome.

Jay Averitt  00:39

Thanks so much for having me, Debbie. I've really enjoyed listening to your podcast over the years. It's great to actually be on here.

Debbie Reynolds  00:47

Yeah, I'm thrilled to have you on the show. This is a dream come true for me. I know the first time that you commented on something I'd written, I said, whoa, Jay listens to my stuff; this is great.

Jay Averitt  01:01

Likewise, likewise, I enjoy it when you comment on my stuff as well. So it's great to actually be able to talk live with you.

Debbie Reynolds  01:08

Well, in addition to that, you call yourself a privacy evangelist. I think that's a great moniker for you because I think that's what you do. So, you definitely champion privacy. You're very supportive of people who are trying to put out their information; I always look at your profile because you're always saying, hey, look at this, think about this. So I think that's a great title for you. But before we get started, I would love to know your trajectory into privacy engineering. So yours is a very unique path. A lot of times, I have a lot of people on who are lawyers, a lot of people who are in the technical part of privacy, and data people like me, and then you're sort of in that middle space. But I would love for you to tell me your trajectory and your path into more technical privacy.

Jay Averitt  02:00

Yeah, tech was always my first love; I knew at an early age that that was something I really wanted to do with a career. But at the same time, I was always interested in law. I got interested through reading; I read John Grisham’s books starting in middle school. My dad was an attorney, so I had a lot of contact with attorneys, and I always thought that would be something fun to do. But I started off my career in technology as a software engineer and really enjoyed that. Then, in the early 2000s, with the dot-coms going bust, I started thinking, oh gosh, there might not be software engineers in the future, which, you know, crystal ball, I was really wrong about. So I ended up going to law school, and I was like, okay, well, there's got to be a way I can bridge my love of technology with a law degree. I looked for ways to do that and started working as a software licensing attorney, and, you know, had my hands on technology. But that wasn't quite home, because it really just felt like I was working on contracts. Then cloud stuff started coming out, and privacy really started to come into the fray in the early 2010s, I'd say. I said, okay, this is really interesting. I got my hands on AWS, Azure, and GCP and started looking at cloud stuff, and I was like, you know, this is really a fascinating new world. Then when GDPR came out, I thought it was the perfect time; I was really not content just looking at contracts and looking at things from just the legal side. Maybe this was the time I could use my law degree, because obviously knowing the regs is important, but really go back to my roots in technology and find myself in a more operational privacy role. I started doing some consulting and then found my way; I didn't know privacy engineering really existed until about two or three years ago. I was like, oh, this is a perfect fit: you've got to know the regs, got to know what's important from a GDPR/CCPA perspective, but you can really hone in on talking to engineers and working with engineers on a daily basis. So I really feel like I found my home in privacy engineering, even though I got into privacy by a crooked path. It's fun to be working with folks from all different backgrounds on this path.

Debbie Reynolds  04:42

I love it. I love it. You're doing a job that didn't exist many years ago. So I think people who are like you, who maybe have different skills they can bring to this area, will really be interested in this path, because I think it's a really fascinating one. I want to talk to you a little bit about privacy by design. We talk about that a lot, right? Hey, you need to design in privacy. Part of privacy by design is really thinking about privacy at a fundamental, foundational level, not thinking of it as something that you can tack on at the end. Tell me your thoughts about how companies think about privacy by design. I'll give you my thoughts: a lot of times, I feel like the way organizations think about privacy is at the end, right? A lot of times, people think about it from a legal perspective, and legal is very reactive, right? Whereas we're saying, hey, think about it before it becomes a legal issue. What are your thoughts?

Jay Averitt  05:46

Yeah, I mean, I think that's completely right. I think there are a ton of companies out there that absolutely do consider privacy, but not until they look at it for the final sign-off and say, oh, and we considered privacy, because they do want to make sure there's no GDPR sting or CCPA sting that's coming to bite them at the end. But the problem with that is that if you're looking at it at the end, there's only so much you can do; you might be able to finesse something or make best efforts to make sure you're going to comply with the privacy regulations. But you've not considered privacy throughout the process of your design. So there are probably things that may aggravate your users that are not privacy-centric, or you may just miss something that is a GDPR problem because you're doing it at the 11th hour. Last but not least, you're probably costing your company more money, because if you're looking at the end of the release cycle and you're having to bake privacy in at that 11th hour, it is probably going to be expensive. For one, you're delaying a release that everyone has been excited about and put a bunch of time into; and it's just more difficult to rework privacy at the end than if privacy has been considered all the way throughout the process, which is the better way to do it. As a privacy engineer, as somebody who's a privacy professional, you should really advocate for privacy by design principles to come in as early as possible in the software development cycle. Before you've actually started coding something, privacy should be consulted: hey, we're planning on building this widget, do you see some issues with it from a privacy standpoint? That way, privacy can be built into the widget from the beginning. Then, in the end, when you're looking for final sign-off, there should be no problem; you're still going to need legal to sign off on it either way. But if you've got a good privacy engineer, and the engineers have listened throughout the process, then it isn't going to be a problem. You don't have that big-ticket expenditure at the end of the process, and your releases aren't delayed. So yeah, I think absolutely, if you can, as a company, really consider privacy early on in the process, you're just going to be much better off.

Debbie Reynolds  08:07

I agree with that 100%. Let's talk a little bit about privacy laws, especially in the US. What we're seeing is almost a repeat of what happened with data breach notification laws in the US, where California started out strong, other States liked what California was doing and decided they wanted to do their own thing, and now every State has its own special sauce, right? So we have a patchwork of data breach laws across the States, and I think that's what's happening with privacy regulations; we're going down that same path. But as an organization, you can't really lurch from one law to the next and get super excited just because, hey, New Jersey just passed something new. You can't upset the apple cart every time a new State creates a new law. Tell me, especially from a privacy engineering point of view, what you think about when a new law comes out. I'm hoping that you're thinking about it more from a foundational point of view, but tell me your thoughts about that.

Jay Averitt  09:13

Yeah, I mean, it's funny, that word you use, patchwork, because I was about to say it's basically a patchwork quilt of privacy regulations we have here in the US. Even if I wanted to, I couldn't be an expert on every single State's privacy laws, because that would take up all my time and I wouldn't be able to do anything else. That's not what I want to do, or I'd be a privacy lawyer, not a privacy engineer. So I think the best approach is: if you're complying with GDPR, you're probably going to be pretty good for most of the US laws. I mean, CCPA pretty much models the GDPR. If you can comply with the edicts that are in there, especially around timeframes for DSARs and things of that nature, you're probably going to be okay. You're obviously going to want to consult your attorneys to make sure there's no random stuff popping up, like when a new State law comes up and there are some things you have to look at that you hadn't considered. So there's always going to be something that you might have to look at that's different. But just from a design and good privacy program stance, if you build your privacy program and say, hey, we're going to comply with all aspects of the GDPR, then more than likely, whatever State law comes down the pike, or whatever other international law comes down, you're going to be okay. There's going to be some nuance that you're going to have to comply with, and you're hopefully going to have good privacy counsel to help you understand what that nuance is and make sure you're designing your program to comply with it. But there's nothing crazy that comes down where it's like, we've got to completely redesign everything because some renegade State is throwing something out there and you're having to figure out what to do. I don't know; hopefully, one day, we'll have a Federal law. But as the years go by, I wonder when that might happen.

Debbie Reynolds  11:14

I think we always wonder when, or if, that may ever happen. I have a question about AI. We're in a really interesting space now; there really aren't, quote-unquote, regulations yet, apart from what we see happening in the EU, and China has a Generative AI law. But how can we explain how AI impacts privacy?

Jay Averitt  11:45

Yeah, it is an interesting time to be in this space. AI has been around for a while, and machine learning has been around for even longer than that, but it wasn't something that was being injected into every single product. Now it seems like even products that you wouldn't think needed it have an AI component; it's a popular buzzword that's thrown into everything. So take data minimization: if we're looking at an LLM, for example a ChatGPT-like product, do we need to store the prompts or the customer data that has been provided to us, which can be relatively massive and include some pretty confidential stuff? Is there a reason we need to store that stuff? How can we minimize what we're storing of the customer's data? I think that's a big part of it. And also the models themselves: how are we building those models with the data that's being collected? Since we need to make these models better, is there a way to do that without actually storing customer data? Is there a way of anonymizing that data to the point that we feel comfortable we can build a model on top of it? We've also got to look at things like, from a responsible AI standpoint, is the model biased? Are we treating some classes unfairly? Is there a mechanism in place for an individual to say, hey, I'm seeing bias? And if there is that mechanism in place to say, hey, I see some bias in the model, maybe we should take a look at it, then how are we storing that information, and what are we doing with it? So I think there are a lot of interesting privacy problems that arise from AI, and I think a lot of what we've learned from privacy certainly helps. A fundamental thing in privacy is transparency, and that should be the same in AI. Data minimization is fundamental too, and fairness is another thing. If you want to be transparent and fair, that means not having bias in your model. How do you square that with data minimization, where you need the data to understand whether the model was biased, but you also don't want to collect more from your customer than you need to? These are just things that I'm thinking about a lot, and I think a lot of people are thinking about them. I don't think there are clear answers on how to solve these problems, but I think they're interesting problems to solve. I think the frameworks that we've used in the past have got to be our guideposts, and then we'll see where we go from there.
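[Editor's note: To make the prompt-minimization idea Jay describes a bit more concrete, here is a minimal Python sketch, under assumptions, of redacting obvious identifiers from an LLM prompt before it is stored or reused for model improvement. The regex patterns and the `store_for_model_improvement` sink are illustrative placeholders, not any vendor's actual pipeline; real systems would use far more robust PII detection.]

```python
import re

# Hypothetical, intentionally simple patterns; real pipelines would use
# NER models, dictionaries, and context rules rather than a few regexes.
REDACTIONS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
}

def redact_prompt(prompt: str) -> str:
    """Replace obvious identifiers with typed placeholders before storage."""
    for label, pattern in REDACTIONS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

def store_for_model_improvement(prompt: str) -> None:
    """Hypothetical sink: only the redacted text ever leaves this function."""
    cleaned = redact_prompt(prompt)
    print("stored:", cleaned)

if __name__ == "__main__":
    store_for_model_improvement("My SSN is 123-45-6789, reach me at jane@example.com")
    # stored: My SSN is [SSN], reach me at [EMAIL]
```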

Debbie Reynolds  14:40

I think two things make privacy challenging when I look at it from a technical perspective. One is that a lot of applications were not created to forget stuff, right? They were created to remember things; if they forgot stuff, that was bad, right? So forgetting, deleting, or minimizing things wasn't really part of the thought process around developing tools. And then transparency: a lot of these tools were never built to be transparent either, right? So it's like Santa's workshop; they put stuff in, and something comes out at the end, right?

Jay Averitt  15:27

Yeah, those are two very good points. I think we can learn from mistakes that were made early on, in the big data boom, just collecting everything for collection's sake, because, as you said, why delete it when one day we may need it for some kind of analytics or something? The more data we can collect, the more powerful we are. That was the mindset for a long time, until GDPR and CCPA came out and companies actually started thinking, oh crap, we actually have to consider privacy; we can't just absorb and vacuum up every single piece of data. We need to remember that when we're looking at algorithms, because obviously there's a huge amount of data you could be collecting that could be very valuable, since the more prompts we have, the better the models could be, particularly if you're getting feedback on them. We've got to learn from what we did in the past and not go back there. And yeah, the transparency part of it: almost all privacy policies, unless somebody's really skilled, are almost impossible to read. Even as a lawyer, I only look at them half the time, because I can't change them, so what's the point of reviewing them except to look for the big gotchas? I think we've got to learn from that also and make things as clear as possible, especially since we're giving people the opportunity to disclose a lot of stuff that could be extremely confidential and potentially touch on their intellectual property. So we've got to be really transparent about what's being collected.

Debbie Reynolds  17:10

So, Jay, what is happening in the world right now around privacy or data that concerns you? Something that's more top of mind for you.

Jay Averitt  17:21

I'm concerned about dark patterns. Every app you download wants to do things that just don't even make sense. Chick-fil-A is one of my favorite restaurants; I love their app, and it's a great app. But when I downloaded it, it said, hey, we want to scan your network to see if there are other devices on it. Why do you need to scan my network for other devices? I just want to use my phone to order my food; that should have nothing to do with any other device on my network. It's not a bad app; I think it's fine. But I think that just shows you what all apps are doing: collecting the information they want. Luckily, Google and Apple have made it more transparent, so when an app is doing these things, you at least know, sometimes, what it's attempting to do, and you get those warnings. But it's crazy the amount of stuff that's out there, especially around health-related apps; I think those are the most concerning. I've read a bunch about period tracking apps and the information being collected about women's menstrual cycles. They're collecting all kinds of information that they don't need. I don't understand how they can get around HIPAA, because of certain loopholes and such, but it's really concerning what's being done. So I think privacy is ignored in a lot of places, and users, unfortunately, are just not educated enough or just don't care about what's being done in the consumer space. A lot of times, big tech gets slammed for not considering privacy, but I can tell you, we actually are looking at it. In a lot of other places, though, it is just being ignored. I went and bought a car not too long ago and went to fill out an application for a loan; when I started typing in my information, it started pre-filling with 25 other people's names and Social Security numbers. I asked the guy, hey, have you thought about maybe putting this in incognito mode or something where it's not storing it? Or, for God's sake, please just turn off the autocomplete. He's like, oh, you're the first person that's said anything about it. So I think a lot of it is that consumers need to be better educated about the privacy pitfalls that are out there. And then, yes, there are some companies out there that really are doing nefarious things, and I wish they weren't. I wish privacy was better considered there.

Debbie Reynolds  20:03

I agree with that. I run across that a lot, just like you're saying about the Chick-fil-A app, right? And I'm glad that there are more prompts now, especially when you use an app on your phone and it asks, do you want to share your address book? Or do you want to share everything on Google Drive or OneDrive with this company? Like, no, I don't want to do that. So at least it gives you the sense that you have to make a decision to share more data if you want to, like I say, get your chicken nuggets. I think that's the positive part. The thing that concerns me the most, and you touched on it, is putting so much work on the consumer, right? I don't have time to read an 80-page privacy policy for every tool, or to go into every website and change all these different privacy settings. And a lot of those settings reset after a period of time. So you can go through and really diligently reset certain things, and they may say, hey, you know, in a year we're going to put the stuff back the way it was, and you have to go back and fight your way through it again. What are your thoughts?

Jay Averitt  21:19

I think you're right. It's another throwback to what we were talking about before: privacy should be baked in everywhere. Just throwing up a privacy policy with a laundry list of things you're collecting and making it impossible to read, no consumer is going to look at that. We can't expect to educate consumers to the point that they're experts at reading those; otherwise, we'd have way more lawyers out there than we need. I think making the prompts is helpful, but so is just making it as clear and easy as possible. Why do we need 50-page privacy policies? I've seen things that look like medicine labels for privacy; I think those are a good step. But it doesn't even need to be that. How about four or five bullet points: this is what we're going to do. If there are more than four or five bullet points of what you're doing, maybe you should reconsider what you're doing from a data minimization standpoint. Why do you need all of this information to order Chick-fil-A or to track your period? In the spirit of transparency, we've got to get away from convoluted messages to the consumer and make them more digestible.
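[Editor's note: One way to picture the "medicine label" idea Jay mentions is a handful of structured fields rendered as four or five plain-language bullets. This is a hedged sketch; the field names and values are invented for illustration, not any company's actual disclosure.]

```python
# Illustrative only: a short, label-style privacy summary instead of a
# 50-page policy. Field names and values are made up for this example.
PRIVACY_LABEL = {
    "What we collect": "name, email, order history",
    "Why we collect it": "to fulfill and track your order",
    "Who we share it with": "our payment processor only",
    "How long we keep it": "30 days, then deleted or anonymized",
    "Your choices": "delete your account and data anytime in Settings",
}

def render_label(label: dict[str, str]) -> str:
    """Render the label as plain-language bullet points."""
    return "\n".join(f"- {field}: {value}" for field, value in label.items())

if __name__ == "__main__":
    print(render_label(PRIVACY_LABEL))
```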

Debbie Reynolds  22:45

When I talk to tech people who aren't privacy eggheads like us, I tell them the two major privacy problems are data collection and data retention. On the collection side, you're absolutely right about the fundamentals: not collecting too much, and figuring out why you're collecting things, is really important. But then a lot of people get in trouble, or lose the plot, on the retention side. This is another area where I feel like organizations have traditionally never had a reason to delete stuff, so they keep it forever. What these laws are saying is that you shouldn't keep certain things forever. What are your thoughts about that?

Jay Averitt  23:32

Yeah, I think you're absolutely right. I'll tell you, that's the thing I talk to engineers about the most: okay, I see what you're doing here is fine from the minimization standpoint, but what is our mechanism for deleting data if there's a request for it? I think that's something they forget about, and I don't know why, because a lot of times, if somebody serves you with a data subject access request, you've got to be able to figure out a way to delete it. So there are two pieces of that. The retention piece you spoke about is an important part. Maybe you shouldn't be keeping things for more than 30 days that you don't absolutely need; maybe you should have something in there to delete at that 30-day mark. You can save yourself a lot of headaches by not retaining stuff for longer than 30 days that includes a lot of personal data, or by using some kind of GDPR-approved anonymization technique if you need to keep it longer than that 30-day period. But you've also got to think about: if somebody wants to delete this data, how can we do it? It's got to be easy. Especially in my space, where a lot of times the product isn't going to an end consumer, it's going to a business, that business needs to be able to figure out how to delete the data when an individual makes a request to them. If we haven't made it as easy as possible for the business, then we haven't done a good job. It can't be, well, they can do it if they go to root; no, you've got to make it easy enough that your common IT administrator, not some Linux hacker, can actually delete this information. So I think those are the two main spaces: one, really look at your retention periods and figure out how sensitive the data is, how long you're going to retain it based on that sensitivity, and whether you're retaining sensitive data for a real reason; and two, what mechanisms are there to delete that data if somebody asks. Even if the person doesn't ask, what if that person leaves the company? You need to be able to delete that data; if that person leaves, the company shouldn't be retaining all that information. So yeah, I think you're absolutely right.
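[Editor's note: As a rough sketch of the 30-day retention sweep Jay describes, here is what a scheduled purge might look like, assuming a SQLite table named `events` with an ISO-8601 `created_at` column; the schema, the retention window, and the database file are assumptions for illustration only.]

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # assumed window; in practice, tune per data sensitivity

def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete rows older than the retention window; return how many were removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    cur = conn.execute(
        "DELETE FROM events WHERE created_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cur.rowcount

if __name__ == "__main__":
    connection = sqlite3.connect("app.db")  # hypothetical database file
    print(f"purged {purge_expired(connection)} expired rows")
```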

Debbie Reynolds  26:08

I am very much a fan. I've heard you talk about this, about making sure organizations have playbooks for what they would do if these situations arise, as opposed to acting surprised when someone wants something deleted.

Jay Averitt  26:23

Right, right. Yeah, it's going to happen to businesses; you wouldn't think it would happen as much as it does, but I've seen even hotels get data subject access requests. So it's not just your standard tech companies; you're going to get data subject access requests all over the place. That's just the landscape we're in. Your solution to that should not be, hey, if you want your stuff deleted, email privacy at "The Data Diva" dot com, because then somebody has to manually monitor that inbox and make sure you're complying with whatever regulation applies, all through a manual process. You'd better have a streamlined process for that, because you'll more than likely fail if you're trying to do it manually.
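[Editor's note: Here is a minimal sketch of the streamlined, non-manual deletion flow Jay argues for, assuming personal data lives in a few known tables keyed by `user_id`. The table names and the `dsar_log` audit table are assumptions; a real system would drive this from a data inventory rather than a hard-coded list.]

```python
import hashlib
import sqlite3

# Assumed tables holding personal data keyed by user_id (illustrative only).
PERSONAL_DATA_TABLES = ["profiles", "orders", "support_tickets"]

def handle_deletion_request(conn: sqlite3.Connection, user_id: str) -> dict[str, int]:
    """Delete one data subject's rows everywhere and record what was removed."""
    deleted = {}
    for table in PERSONAL_DATA_TABLES:
        cur = conn.execute(f"DELETE FROM {table} WHERE user_id = ?", (user_id,))
        deleted[table] = cur.rowcount
    # Keep a minimal, pseudonymous audit trail to show the request was honored.
    pseudonym = hashlib.sha256(user_id.encode()).hexdigest()[:12]
    conn.execute(
        "INSERT INTO dsar_log (user_ref, tables_touched) VALUES (?, ?)",
        (pseudonym, ",".join(deleted)),
    )
    conn.commit()
    return deleted
```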

Debbie Reynolds  27:12

Yeah, I agree. So if it were the world, according to Jay, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, technology, or human behavior? What are your thoughts?

Jay Averitt  27:28

Yeah, great question. What I wish is that every company would really look at two things. One, is what we're doing from a privacy standpoint fundamentally fair to the user? Two, if I were the user, how would I feel about what we're doing from a privacy standpoint? Because I think that's where it gets lost. A lot of development is, okay, yeah, this is great from a business perspective, but how are we treating individuals and our end users at the end of the day? Ultimately, if you think about those two things, you may decide, okay, this may be expensive from a business standpoint, because we're having to put in a lot of privacy controls or do things we wouldn't ordinarily do without considering those two things. But what you are doing is building trust with your user base. If your user thinks, okay, they actually care about me, this company has really got top-notch privacy, then that's a company they're going to do business with. So if you've built that trust, I think at the end of the day you're going to find more business flowing your way. That's the way I look at privacy in my world. Maybe other people have different ideas, but I think if everybody looked at it through at least those two lenses, we'd be in a better spot privacy-wise than we are today.

Debbie Reynolds  28:59

All right. I agree with that. You hit the nail on the head with that because I feel like if you can't articulate how what you're doing with someone's data benefits them, then you're already off the rails, I feel.

Jay Averitt  29:12

Yeah, absolutely. Think of all the information you have to provide just to order a pizza. If you're thinking from the user's perspective: why do I need to fill out all this? I rented an electric car recently, which is not a good idea; they're fun, and I might buy one, but renting one is a hassle. You've got to find a charger, and when you're on a vacation or a trip, that's the last thing you want to be doing. So I was at this random charger in Birmingham, Alabama, visiting my parents and trying to charge the car, and it said, okay, you've got to sign up for this app to charge the car. It asked me what year the car was; I don't even know, it's a rental car. It was asking for all this information when all you need is my credit card to charge me to charge my car; I have no idea why they needed everything they were collecting. I'm sure it was for some kind of data broker or some kind of analytics they're running. But I was so frustrated filling out that information that if I'd had any choice in the matter, I would have gone somewhere else. That's what happens when you aren't thinking about the users: you run the risk of really ticking off your customers, and they may find someone else that actually values privacy.

Debbie Reynolds  30:35

So next time you rent an electric car, or maybe you won't rent one, you'll steer clear of that company.

Jay Averitt  30:42

Exactly, exactly.

Debbie Reynolds  30:45

Well, it's been great to have you on the show. I was really thrilled to be able to chat with you, and I know that people will find your insights really valuable. Definitely follow Jay on LinkedIn; he puts out some great information, and it's very educational. Thank you so much. We'll talk soon, and I look forward to being able to collaborate with you in the future.

Jay Averitt  30:59

Thanks so much for having me, Debbie. It's been a delight, and I follow Debbie as well. I've learned so much from her, so I'm glad to actually have talked with her. Thanks so much.