Debbie Reynolds Consulting LLC


E168 - Nandita Rao Narla, Head of Technical Privacy & Governance, DoorDash

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E168 - Nandita Rao Narla and Debbie Reynolds - (38 minutes) Debbie Reynolds

38:15

SUMMARY KEYWORDS

privacy, data, people, companies, india, deletion, laws, build, projects, ai, happen, cybersecurity, delete, rti, iapp, technical, information, prescriptive guidance, focused, corruption

SPEAKERS

Nandita Rao Narla, Debbie Reynolds

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Nandita Rao Narla. She is the Head of Technical Privacy and Governance at DoorDash. Welcome.

Nandita Rao Narla  00:41

Thanks, Debbie, for having me. I've been following your podcast. And I've really enjoyed it. So, I feel honored to be here.

Debbie Reynolds  00:48

Yeah, I'm happy to have you here. I love your content. The comments you make on LinkedIn are always very sharp and smart; it's evident that you know a lot about data and technology, and I'm a technologist, so that definitely appeals to me. I was excited that you said you'd be on the podcast; this will be a fun session. Let's talk a bit about your career journey. I think you're probably the only person I can think of who has been on the show who has had this particular trajectory. I would love to hear your background, definitely your path into privacy, but also your transition from security engineering to privacy engineering, because I know a ton of people who are in cybersecurity and interested in privacy, and they don't know how to make that transition. Hearing your story, I think, will help them understand how they can think about that as well.

Nandita Rao Narla  01:48

I'm happy to share my journey. As an undergrad, I studied computer science. Right after completing the computer science program, I enrolled in a master's program in cybersecurity. At that point, I was very focused on cybersecurity. It was at Carnegie Mellon, during my master's, that I found out there was such a thing as privacy, and that was very appealing to me. So, right at that time, even while looking for internships, I was very interested in doing something in the privacy space. But this was 12 years ago, before GDPR. There were no privacy engineering jobs, no jobs for technical privacy people; there were jobs for lawyers, but that career trajectory didn't exist. I remember going to the career fair, and there were exactly four jobs that even had the word privacy in them out of the hundreds that were available to new grads. So, I applied to all four. One was in healthcare, one was in the financial sector, and two were in consulting. I joined EY in their cybersecurity consulting practice just because it was an opportunity to do something very broad and pivot to doing more technical privacy things. During my time consulting in EY's cybersecurity practice, there were a lot of projects that aligned with privacy, focused on data governance, doing code reviews, building secure enclaves to keep data in one place, or building controls around data. For example, I worked on export control projects where certain IP can't leave the US; it's very similar to how we think about data localization and residency. So, as I was doing these cybersecurity projects, it was easy to apply the same learnings to the privacy realm. Methodologies like security by design, technical security evaluations, and architecture reviews translated into privacy engineering almost logically.
This was at the same time that GDPR happened, and a lot of companies wanted to be more proactive in the technical control space. They wanted consulting companies to come in, analyze their programs, give them roadmaps, do evaluations, and find gaps. So that was the time when I got really focused on privacy and wanted to build a career in it. After that, I joined the founding team of a privacy tech startup. Taking all the learnings from consulting, those years building different products for large Fortune 500 companies, we wanted to build the tool ourselves: a data classification tool, a data risk platform. We raised funding, it was great, and then at that point, the company decided to pivot to more security and data protection. I then joined DoorDash in my first full-time role doing only technical privacy, and that's been great.

Debbie Reynolds  05:01

That's a great story. I think for people who, like you, were in the data space, the work you were doing on privacy before GDPR was something no one really thought about. They're like, well, that's just part of the job, right? It's like the bottom bullet point on a long list of stuff: yeah, we did export controls or whatever. That's how I really got into it, too. I also have a personal interest in privacy, but I started to see how it merged with what I was doing technically. So that was definitely my path. You're probably one of the best people to ask this question: what is your vision of the difference between cybersecurity and privacy?

Nandita Rao Narla  05:44

I think we have to think about it as an overlapping Venn diagram; there are things that are common to both. You need to protect personal information, and those protections include cybersecurity controls. So that's the common space. Where it differs is that cybersecurity is more focused on risk to the company: you're protecting the organization, whether from reputational damage or from breaches that affect the financial bottom line. In the privacy space, you're more focused on risk to individuals, to end users. And sometimes it's the organization or the company itself that's causing the risk, the financial harm, or some other privacy harm to the individual. So that's a different mindset: when you're coming from cybersecurity to privacy, you need to think more about the people who are impacted, not the company's assets.

Debbie Reynolds  06:50

That's one of the best ways I've heard anyone describe the difference. A lot of times, people think of them as the same. It's like, no, it's not the same. I think they do overlap; I always say that privacy and cybersecurity have a symbiotic relationship. So they definitely have some overlap, and there's interplay there. Companies that have people like you who understand both sides of it, I think that makes you just a great asset, for sure. Let's talk a little bit about when you started your journey. When you got interested in privacy, there weren't a lot of jobs and there wasn't a lot of attention. Now there are tons of jobs and lots of attention, and that's definitely great. But what is the gap, do you think, that happens within organizations? I can tell you my thoughts. I think a lot of times, companies think about privacy as a legal issue that has data ramifications, as opposed to a data issue that could possibly have legal ramifications. So, I feel like your problem starts at the data level before it ever gets to the legal level. What are your thoughts?

Nandita Rao Narla  08:04

I 100% agree, and that's why anybody who is in data governance can be a good candidate for transitioning into privacy, because you need to know what you have, where you have it, and where it's flowing to be able to build any sort of controls around that. The biggest challenge when it comes to privacy is that just having access to the data or using the data is not enough to make decisions; you need to understand why the data is being used and how it will be used. And sometimes only engineers know that. Everybody in the engineering team has access to data and uses it for, say, identity verification processes, which is valid and required. But then they use the same data for targeting ads or doing something else without consent. A lot of times, they don't know the difference between those use cases, and there aren't enough guardrails. And when legal teams weigh in, they write policies and these guardrail documents, but they lack understanding of how it will be technically enforced. So, there is a disconnect between the two. I feel that's why we need these hybrid technical privacy program managers or product people who know both sides well enough to be the glue between the two and bridge the gap.

Debbie Reynolds  09:31

I agree with that. That's a role that I play because I'm very familiar with both spaces. So it's fine.

Nandita Rao Narla  09:38

I feel like lawmakers keep writing convoluted laws that are difficult to interpret, so there will be lots of jobs in the privacy space.

Debbie Reynolds  09:49

I never thought about the job creation from some of these laws, right? It's like, OK, I passed this law. And I was like, well, how are you going to do that from a technical point of view? Do you even understand how data works? It's true.

Nandita Rao Narla  10:04

When I read the law and the regulations just to be able to better understand what to build, it's not easy. It's at such a high level, and for any engineering build you need very specific requirements: exactly what needs to be built, what the expected result should be, and how you would test whether what has been built meets the standard. This is where we need some sort of standards, reference architectures, or best practice guidance that smaller companies can implement in a more lift-and-shift way.

Debbie Reynolds  10:41

I agree with that. Thoughts about data deletion? This is a topic I love to talk to privacy engineering folks about. Regular folks think of deletion as, oh, I press the button, and everything is gone. But we know in enterprises, deletion takes many different avenues, especially depending on how the data is used in different places; you can't press a button and, for example, delete stuff in 12 different places. Systems aren't really made that way. Describe what deletion entails and how complicated it is to actually achieve from a technical point of view.

Nandita Rao Narla  11:29

All the controversial questions. A lot of times, when we say that we want something to be deleted, it is not a hard delete, and that's not how it is enforced technically. Hard deletion is irreversible, and there are downstream impacts from an integrity perspective: you need to have a log of certain things that happened, and maybe that's required for bug fixes or some other legitimate uses. So, it's uncommon to do hard deletes, but there are cases when that's done. What we mean by deletion in most cases is that the data is de-identified to some degree. There's a whole spectrum of what counts as de-identified, and that goes into the legal definitions of what is anonymized data, what is de-identified data, what is pseudonymized data. Sometimes there are masking policies, which are reversible, and unfortunately, some companies try to use that as a replacement for deletion. Then you can apply hashing functions, which are irreversible; you've rendered the data unusable, it's gibberish, so you can essentially call it deleted. What we want to make sure is that when you say something is deleted, there is no way of re-identifying that individual. And that's where the focus should be: do some sort of threat modeling exercise on all the possible ways this can break, and address all those gaps.
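To make the masking-versus-hashing distinction concrete, here is a minimal sketch; it is not from the conversation itself, and the function names and vault design are purely illustrative. It contrasts reversible masking (a token vault that keeps the original value, so nothing is really deleted) with irreversible replacement (a salted hash whose salt is discarded, so the original cannot be recovered or re-linked across records):

```python
import hashlib
import secrets

# Reversible "masking": the original value survives in a vault,
# so this does NOT count as deletion.
MASK_TABLE = {}  # hypothetical vault: token -> original value

def mask(value: str) -> str:
    token = secrets.token_hex(8)
    MASK_TABLE[token] = value  # original retained; fully reversible
    return token

def unmask(token: str) -> str:
    return MASK_TABLE[token]

# Irreversible replacement: hash with a per-record random salt,
# then discard the salt. Without the salt, the digest cannot be
# reversed or even linked to the same value in another record.
def irreversibly_replace(value: str) -> str:
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + value.encode()).hexdigest()
    # salt goes out of scope here; nothing retained can reverse this
    return digest

email = "user@example.com"
token = mask(email)
assert unmask(token) == email  # masking is reversible

d1 = irreversibly_replace(email)
d2 = irreversibly_replace(email)
assert d1 != d2  # fresh salts make records unlinkable
```

Note that even irreversible replacement of one field does not by itself guarantee the individual cannot be re-identified from remaining quasi-identifiers, which is exactly why the threat modeling exercise described above matters.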

Debbie Reynolds  13:04

Thank you for that. I think people just misunderstand that, so I'm happy that you were able to explain it. Let's talk about AI. Everyone is so crazy about AI now; everyone wants to do AI, right? AI is not a new thing, but it's definitely gotten a lot more media attention because of ChatGPT and these other LLMs that are now in the news. Tell me your thoughts about how generative AI makes privacy more complicated, in your view.

Nandita Rao Narla  13:41

I think a lot has been said about generative AI and its privacy consequences. For me, the biggest challenge I see is doing technical evaluations for use cases, especially when the data is unreliable. A big piece of it is that when you don't have full visibility into a system, it's difficult to assess risk and difficult to make judgments about which use cases can be approved or eligible. We're going to be talking about this at the IAPP, so I'm happy to share some slides from that conversation; there's going to be a very interesting panel that talks about ML and AI and privacy risks much better than I can. So I'm going to make sure that I share those slides with you, Debbie.

Debbie Reynolds  14:34

That would be really helpful and really cool. Let's talk about end-of-life data. I think companies are very, very good at collecting data, truly, right? But at the end of life, that's where things get a little murky. Part of that is because of privacy. What privacy regulations have done is bring in more rigorous requirements around data retention: how long do you keep things? Before GDPR, before some of these other privacy laws went into effect, there really wasn't anything that said there needs to be an end date, or that you could not keep stuff forever, right? I think that's one of the big things for companies that are accustomed to keeping everything: we need it for this purpose, we'll keep it, it doesn't matter. Now they know that data they don't delete is a risk. So, what are your thoughts there?

Nandita Rao Narla  15:38

I'm sure for most companies, and I speak for a lot of the tech companies, it's very hard to implement deletion and end-of-life data controls. Again, when they were collecting data, it was a free-for-all; there was no requirement to enforce any sort of deletion. So, these companies were built with the assumption that this data was going to be retained forever, especially with retention schedules where law enforcement may require you to keep the data, so the focus was on making sure we have the data. And then, with businesses evolving to make more use of the data, using it for ML models and extracting more value from it, it became a much more important resource to keep hold of and keep secure. So all of these systems were never designed to delete data. And it became more complicated because most companies don't have a good data governance program in place; you don't even know where you have data that has sprawled into all these places where it shouldn't have been in the first place. And now, when companies are trying to put in some controls, like this data needs to be deleted after three years, or after one year, they're looking at a huge tech debt of a decade of free-for-all data. So it seems like a very big problem to solve, so big that it's almost a problem you don't want to start; it feels like there are other things that are better investments to make in privacy than solving this problem. I don't think any company has completely got a 100% handle on this problem yet. My advice to startups and young companies is to think about this ahead of time. Make sure you do privacy by design, instead of following all the other companies and trying to retroactively fix these problems. There are tools out there, but it takes a lot of investment even to build tools in-house, and a lot of it is based on how much data you have, scanning it and finding out.
So these costs run into the millions. It is a very expensive project for most companies.
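The retention-schedule idea mentioned above, deleting data after one year or three years depending on the dataset, can be sketched very simply. This is an illustration only, not how any particular company implements it; the dataset names and periods are made up, and a real system would also have to find the sprawled copies, which is the hard part described in the conversation:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule: dataset name -> maximum record age.
RETENTION = {
    "order_history": timedelta(days=3 * 365),  # keep ~3 years
    "support_chats": timedelta(days=365),      # keep ~1 year
}

def sweep(records, now=None):
    """Return records still within retention; the rest are candidates
    for deletion or de-identification."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        max_age = RETENTION.get(rec["dataset"])
        if max_age is not None and now - rec["created_at"] > max_age:
            continue  # past retention: drop from the kept set
        kept.append(rec)
    return kept
```

A sweep like this only works if the data map is accurate, which is why the guest's point about data governance coming first is the real prerequisite.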

Debbie Reynolds  18:06

I agree it's an expensive project for most companies. But a lot of companies, when they bring on new tools, spend a lot of that money up front, and a lot of the attention is up front. A problem with this end-of-life work is that these probably aren't high-profile projects; they probably aren't projects that people have budgeted for. So, there is not a lot of glory in end-of-life data.

Nandita Rao Narla  18:35

It's not cool. You could be doing more interesting projects as a privacy person, so why would you want to delete data? Wouldn't you rather do privacy by design on this new Gen AI thing?

Debbie Reynolds  18:49

Yeah, both of them are hard; at least the Gen AI is a little bit more interesting.

Nandita Rao Narla  18:54

And we have very lean privacy teams in most organizations, so you're being asked to do more with less, and companies are prioritizing whatever is achievable, where at least it looks like there will be some end to the project, versus this perpetual process of finding data and deleting it.

Debbie Reynolds  19:14

That's all true. I agree, I agree. What are your thoughts on the Digital Personal Data Protection Act in India that has passed? I've been watching this for a number of years. I had actually given up hope a couple of years ago, because I remember when India made privacy a fundamental human right, and I thought, oh wow, they did that, and they're going to pass this law really soon. And it took a couple of years and a lot of things, and when the previous bill didn't pass, I was very disappointed. But then this one just popped up, and one of my friends in India actually sent me a message a week before. He's like, hey, this is going to pass this time. I was like, I don't think so. He's like, yes. And here we are. I saw a post that you made where you explained it, and I feel like a lot of people don't really understand it. Before I have you go into it, I'll tell you why I think India is very important. First of all, India is the largest democracy in the world; there are more people in India than there are in the US and Europe combined. There is a lot of interest and a lot of investment in India, especially around digital transformation, more people getting cell phones, and things like that. So, I think a lot of tech companies have looked very deeply and closely at India and have watched this with a lot of interest, and what they're doing with their privacy act is going to be influential. They had a front-row seat to see how things happened in Europe and the US, so they were able to pick some of the best stuff from all those regimes and make it their own. But tell me about this act: why it's important and what's in it.

Nandita Rao Narla  21:06

Exactly what you said: it is a big market, and there are a lot of people who live in India. So, just by virtue of the population it impacts, this is huge. And this is the first nationwide bill that gives people privacy rights, which is incredible. I'm an Indian citizen, so I travel there very frequently and have been waiting for something like this for a long time, mostly because whenever I make a purchase at a store in India, they always ask me for a phone number, and I don't have a local Indian phone number. So I hold up the line, because the process is not built to let people decline anything, to the point that the store clerks will just put in their own phone number to get me through. And there has been no right to deletion or access, ever. This act empowers the residents of India to get a better handle on what data corporations are collecting about them and to be able to request deletion. It is still a very high-level framework; it doesn't have a lot of the details yet, so it's lacking very prescriptive guidance, but yes, it does provide rights to data principals, or data subjects. And there are very limited exceptions for private entities; irrespective of size, you have to comply. They haven't really specified the timelines yet; the recent news is that big tech is probably going to get six months to comply, whereas other companies are expected to get a more graded timeline. But it's all very high level at this point. We are still waiting for the rules and more prescriptive guidance on what needs to be built.

Debbie Reynolds  23:04

Well, I'm excited about what they're doing, and I'm excited to see the details they put in there. Thinking about India makes me think about the US. Why do you think we don't yet have Federal privacy regulation? I'll give you my thoughts too, but I want your thoughts first.

Nandita Rao Narla  23:23

I'm not a lawyer, so I'll say that first. I think that in the US, State laws have been entrenched for so long that it's difficult to align on one Federal law that covers all of the State laws. Does it have preemption, and does it make everyone happy, especially on things like breach notification? There is no reason why we need to have 50 different breach notification laws; I think that's at least an easy one to harmonize and fix. So, if we haven't been able to do that for so long, I'm not sure when or how we are going to come up with a GDPR-like bill that covers all of privacy.

Debbie Reynolds  24:12

I agree. I agree.

Nandita Rao Narla  24:14

I'm super pessimistic about this.

Debbie Reynolds  24:17

Right. I thought that was my thing. With data breaches, we already have 50 States that agree it's important to have a law. Why don't we just take that, harmonize it at the Federal level, and let that be a stepping stone to other things we want to harmonize at the Federal level that still have to happen? So I agree. I don't know; I'm not holding my breath, and I'm not making any bets on when this is going to happen. I'm hoping that it will. But as you said, I think it is hard because everyone has their own vision, and the States have taken so much of a lead now that it makes it even harder, I think, to do a Federal law.

Nandita Rao Narla  24:59

Do you think it'll happen sometime soon? There have been a lot of predictions.

Debbie Reynolds  25:06

Well, two things I will tell you. I'll tell you what I think is going to happen, but first, I'll tell you another reason why I think it's hard to do Federal legislation in the US. I've heard people say, why don't we just take the GDPR and implement it? It's like, our eyes are rolling; you just don't understand. In the US, for whatever reason, a lot of our laws are very prescriptive, and I think prescriptive laws are harder to pass, even though corporations like them. When the CCPA came out, I heard a lot of people say, we're not going to implement anything until we get the regulations. It was like, well, if you read the law, you don't really need the regulations; maybe they'll help you if you don't understand how to implement it. But those very prescriptive laws, I think, are harder to pass than the more general laws like you have in the EU. In the EU, they say things like: if the purpose for which you have the data has expired, you need to get rid of it, right? Whereas in the US, we have a California law saying, put a button on your website that says this. What do you think?

Nandita Rao Narla  26:23

Yeah, and then the whole opt-in regime versus the opt-out regime? It's crazy.

Debbie Reynolds  26:30

But yeah, back to your question: when do I think it will happen? Actually, I'll tell you when it won't happen; I don't know when it will happen. I don't think it will happen in 2024; maybe the next window of opportunity is 2025. What I've noticed over the years is that a lot of nothing happens on privacy in election years. Because 2024 is an election year, Congress will be trying to get re-elected, so they're going to put this on the back burner, unless something just ridiculous happens; unfortunately, sometimes something bad has to happen for legislators to focus their attention there. So I don't think there's going to be a lot of movement. And then, when the new Congress comes in, the newly elected people change the priorities, right? You don't know who's in, who's out, who thinks it's a priority or not. So, depending on who wins the elections in 2024, I think that will set the stage for what happens in 2025. I think that's our next window of opportunity. That's my view. What do you think?

Nandita Rao Narla  27:45

I'm actually not even sure if it will happen then, just because we were so close last time; there was so much conversation about how it was going to happen, and it didn't. So I'm not sure. It will make everybody's life easier if it does.

Debbie Reynolds  28:02

I don't know; at least they seem very excited about doing something on AI. So maybe something on AI will come out first, and then maybe we can hitch our wagon to AI somehow and tuck some privacy stuff in there.

Nandita Rao Narla  28:18

That's possible, because usually, I feel like the US mimics the EU. So yeah.

Debbie Reynolds  28:29

Yeah, I don't know; the legislators are very hot on AI right now. That's the hot thing in the news, and people are really talking about it, so maybe they'll do something about AI. I read articles in the news saying that one of the problems the US may have with trying to regulate AI is that they haven't done anything at the Federal level on privacy, so that building block is missing. We'll see; I think it just makes it more complicated. We see where the EU is going: they did GDPR first, and then they did this AI Act, which is something that sits on top of privacy, I feel. Right now, we don't have that foundation. But who knows; maybe they're going to jump over privacy, and we'll have an AI agency or something like that.

Nandita Rao Narla  29:23

That'd be cool. Yeah, we'll keep an eye out for any developments on that.

Debbie Reynolds  29:29

Any help we can get in that space would be totally great. When you talk with people about privacy as a career, which I think is a great career, what are some of the things people maybe wouldn't expect about privacy that you learned? Things where you thought, I have to think about that.

Nandita Rao Narla  29:52

When people are contemplating a pivot to privacy or growing into a privacy career, a lot of times they don't actually know what privacy engineers, in the various non-legal roles, do. So there's a lot of ambiguity, and there have been attempts to demystify what a privacy engineer is. The IAPP published an infographic that I worked on as part of the IAPP privacy engineering board to define the different types of people who could call themselves privacy engineers: software engineers who are building tools, let's say for DSARs or anonymization; designers and UX experts who are building interfaces that respect privacy or are free of deceptive designs; and architects who are building the infrastructure that supports privacy. There are a lot of different ways to be involved in a privacy career that are not legal. There are program managers who are managing large enterprise-wide projects to do some sort of data mapping, a GDPR readiness effort, a country launch, or compliance with CPRA or some other new law; there is a lot of flexibility. Depending on what skill sets and interests you have, you might be able to find a role in privacy that aligns with what you have and what you want to do. So getting to know the field better is a good idea, like talking to people who are in this space. I wish there were some nice chart or diagram that shows all the different roles possible and the skill sets needed for each one. The other complexity while transitioning is that companies have very lean privacy teams, so you have people who are doing everything. Sometimes the privacy team is a team of one, and those are lawyers who end up doing legal advisory work and also more hands-on things with implementation. Debbie, you're more in that space, where you're leaning into everything.
So there is an opportunity to do a lot and grow into something where you have interests. It's just a very welcoming field, which values diversity of thought and background as well.

Debbie Reynolds  32:39

That's a great way to put it; I love the way you put that, and I think it'll be very valuable for people, because I've heard a lot of people ask, how can I break into privacy? I feel like anyone with an interest in data, or some type of background in a data field, can enrich their knowledge by leaning into privacy; adding it to other specialties, I think, definitely helps. So, if it were the world according to you, Nandita, and we did everything you said, what would be your wish for privacy anywhere in the world, in any way, whether that be human behavior, regulation, or technology?

Nandita Rao Narla  33:21

Just because it's close to India's privacy act, and I'm working on a couple of side projects related to that, this is top of mind for me. It's great that it's been passed; we have something, and it's a good start. But there are several red flags in this act. My wish would be that we have some way to harmonize, for example, this new India Data Privacy Act with the existing Right to Information Act, which is geared towards transparency and government accountability and allows citizens to obtain information about various projects and how public funds are being spent, in order to uncover corruption. There is speculation that the new act weakens the RTI Act. Some context here: a lot of corruption in India lately has been exposed by journalists and citizens using the RTI to get information on who the beneficiaries of certain welfare schemes are, where money is being spent, who the loan defaulters are, and where infrastructure has been built. I remember reading a report about a maternity benefit scheme where the government of India gives women in economically weaker sections who are pregnant or have just had a baby a certain amount of welfare funds, to better support the nutritional needs of their babies. An RTI inquiry revealed that men were claiming these benefits, and the same women were claiming these benefits five times a year; that would never have come to light if it weren't for the RTI Act, which allows you to get this information. And with the new privacy act, there is speculation that it weakens this ability to question the government and uncover corruption, by claiming privacy as a reason the information is protected: these people did not consent to having that information released. Of course, anybody engaged in corruption will not consent to the release of the very information that would help uncover it.
So I'm a little worried about how such a well-intentioned act on privacy can now potentially be used to perpetuate corruption and negatively impact the country. I wish there were some way to harmonize and balance both.

Debbie Reynolds  35:55

I do know that a lot of times, these laws exempt the government in some way from certain things.

Nandita Rao Narla  36:05

Broad carve-outs; the government is exempt from pretty much everything. They also have some really awkward provisions where the government can enable a takedown request, which is very odd in a privacy bill.

Debbie Reynolds  36:20

Yeah, who else has takedowns? China. China has takedowns as well; they're pretty aggressive. I think they give about 15 minutes for certain takedowns; at least, that's what my friend from Nike told me. Oh, my goodness, we're so happy that you're on the show. This is so much fun. I love following your work and the things that you say; I definitely took a look at the things you were talking about in the India bill. I feel like a lot of people don't understand it or its implications, but I love that you're helping to educate all of us on that; we all need to know.

Nandita Rao Narla  36:58

Because the details don't exist yet, and the timelines may be aggressive, at least from a technical and operational standpoint, the response is: you need to start doing the things you should already have been doing, which are data mapping, data governance, and building consent flows. A lot of it is just table stakes; these are not groundbreaking compliance requirements. But yes, companies haven't done that, so for them it is something they need to prepare for and work towards.

Debbie Reynolds  37:31

It's a great goal and a great reminder. We're only going to have more privacy regulations, in my view. So, for companies that haven't gotten the memo yet: you need to really pay attention to this area. Well, thank you so much, Nandita. I'm happy to have had you on the show, and I'm happy to chat with you soon.

Nandita Rao Narla  37:53

Hoping to run into you at events sometime.

Debbie Reynolds  37:59

Absolutely, I would love it. Well, thank you so much, and I'll talk to you soon.