Debbie Reynolds Consulting LLC


E157 - Kim Emiru, Vice President, Privacy, Sony Pictures Entertainment

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E157 - Kim Emiru and Debbie Reynolds - (38 minutes) Debbie Reynolds


SUMMARY KEYWORDS

privacy, laws, people, companies, business, work, data, impact, learn, policies, hipaa, ai, regulations, apply, thoughts, year, great, feel, general, technology

SPEAKERS

Debbie Reynolds, Kim Emiru

Debbie Reynolds

In Episode 157, which is the beginning of Season Four of "The Data Diva" Talks Privacy podcast, I speak with Kimberly Emiru, who is the Vice President of Privacy at Sony Pictures Entertainment. During the episode, I predict that the Biden administration will likely do something major on artificial intelligence before the end of 2023. On October 30th, 2023, the Biden administration actually put out an Executive Order, which is quite extensive, called the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence. There's quite a bit in this Executive Order around privacy, which I'll be speaking about in the days and months ahead. Enjoy the show.

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Kim Emiru. She is the Vice President of Privacy at Sony Pictures Entertainment. Welcome.

Kim Emiru  00:40

Thank you, Debbie. Hi, everyone. Great to be here.

Debbie Reynolds  00:42

Yeah. Well, this is a great story of how we got to know each other. I got to know you at your previous privacy job. You had reached out to me, and we started doing some networking. I asked you to come onto a project and collaborate with me, and that was the XRSI privacy and safety framework. XRSI is an organization that works on safety around augmented reality, virtual reality, and mixed reality. This was before people went crazy about the metaverse, right? I think.

Kim Emiru  01:22

Yeah, I think it probably was.

Debbie Reynolds  01:24

Yeah, it was before that. I remember you were so amazing. Part of that program that I ran was compliance, so we mapped almost every imaginable privacy law against that framework. And the interesting thing was that, in those experiences, data was being collected that had never been collected before. So there were gaps in regulation, because regulators hadn't really thought about some of these things, and maybe now, with AI being the hot thing, we're going to fill some of those gaps. But I really appreciated all your work; you were super sweet, very smart, and very diligent. Sony got you before I did.

Kim Emiru  02:12

It was a really great project to be a part of. And I think one of the things that I've tried to do, more so later in my career, is reach out to people I see doing interesting work in privacy, and it opens you up to so many different doors. That was not an area that my prior company ever dealt with, or that I would have come across otherwise, and it still gave me the experience of applying skills I had in a new part of privacy that, at the time, was still cutting edge. So it was great.

Debbie Reynolds  02:44

Yeah, it was fantastic. Well, you're such a sweet, nice, smart person, so I'm really proud of you for what you're doing at Sony. When I saw you go over there, I was so excited and really happy. For people listening, they probably want to know, how do I shape my career to move into these higher-level executive positions? So, tell me about your trajectory. How did you get where you are now?

Kim Emiru  03:12

Well, I started in privacy through my law school summer internship programs. The summer of my 2L year, I interned at the Department of Health and Human Services Office for Civil Rights doing HIPAA breach investigations. That was where I cut my teeth, looking at a lot of privacy regulations and how they impact people, seeing it from a regulator's standpoint: the impact, complaint investigations, and what a breach really meant. After I finished law school, I stayed there for another two years. That was when HITECH was about to start getting enforced, and that became a really big deal for nontraditional entities that would be considered business associates, so there was a ton of activity around how to get compliant with HITECH as a business associate. I started attending some of those CLEs and learned about different sectors and how regulations can impact industry. I made my way to Workday at that time and again did privacy compliance, but more compliance work on not just HIPAA, which was my area of expertise, but all the other regulations that can impact companies and their work. I think the key to getting where I am today, and there's still a ton for me to learn and do, is being open to opportunities, not shying away from a challenge, and putting yourself out there, knowing that there are still things that you can learn as a privacy professional and as a lawyer; there is a ton to do out there. So, my career trajectory has just been a combination of being available for the opportunity but also showing that you will work and grind and learn. And I think people will open up doors for you.

Debbie Reynolds  05:18

I think that's true. I also want to talk to you about something; obviously, you're not this way, but I see a lot of women who are afraid to go for these higher positions for some reason. What are your thoughts about that?

Kim Emiru  05:30

Absolutely. I mean, I think you see that all the time; minorities especially, even if you do go for those positions, you may not necessarily be considered or, you know, have the same opportunities. So, I probably learned this later in my career than I would have liked, but my standard is: even if I think I have a slight chance, I'm going to shoot my shot. Of course, you need to prove that you can do the work, but don't limit yourself, and don't exclude yourself from the process. Let other people weed you out. Don't do it yourself.

Debbie Reynolds  06:07

Yeah, I think that's a lesson that all of us have to learn.

Kim Emiru  06:11

Yeah.

Debbie Reynolds  06:11

Right. I think a lot of people have impostor syndrome; they think, oh, I couldn't do this job or whatever. Then you have someone else who is not as qualified as you are, and they get the job because they have more confidence to just go for it. Let's talk about learning. Privacy is an area where you're constantly learning. It's not like, okay, I'm king of the hill, I know everything about privacy, and now I can just do my job and not read, not learn, not research. Tell me about that learning part of privacy.

Kim Emiru  06:45

That's one of the best things about being in privacy, I feel like: the continuous challenge that it presents. You may have better knowledge in one area, but it's never going to be, I know everything in this space. There are going to be areas where you need to consistently improve, and for me, that is the technology side, how the law applies to the technology; that is the continuous challenge that I normally face. Then there's the ability to relate what you know about the law to people, because you're not always interacting with the same type of people in the same roles, so you have to be able to communicate it as well. The laws are all over. When I first started out, I had a pretty good understanding of HIPAA because that was my area, but that quickly changes, and the way it's applied and the way you apply it to the business that you're in changes. So being open is a critical component of being in a privacy role: open to being challenged, open to learning and relearning.

Debbie Reynolds  07:59

You mentioned something I think is really key. The people that I know who work in corporations and privacy, the ones that are successful, have a deep understanding of the business, and they know how the larger technology applies to what they're working on, as opposed to having a very narrow mindset of, this is the law, and you all have to come over to my way of thinking. Businesses are in business to do business, and people who are successful, I think, have a deep knowledge and understanding of the business and how they fit in. So what are your thoughts there?

Kim Emiru  08:40

Yeah, I think you said it the right way, right? I'm in-house counsel, and I have been for a while, so my role is to understand the way in which the business wants to operate and where they're headed, and then figure out a way of doing that in a compliant manner. I'm a business-minded attorney for that reason. Otherwise, you could just get your advice externally, get that memo that tells you here's the black-or-white law, and then you figure out how to apply it. But when you're in-house, I think we take on a different lens: being able to enable the business to do what they want to do in a way that is compliant. That is our job.

Debbie Reynolds  09:22

Yeah, I want to say I've seen some people approach privacy as if risk is the only consideration. They're like, here's my advice, and if you don't take my advice, you're terrible, and different things like that. Privacy is a component of business, right? Companies make choices, and as a privacy person, you're not the king of business choices, right?

Kim Emiru  09:50

Yeah.

Debbie Reynolds  09:51

But I think our role is to always lay out the facts, understand the business, understand how it applies, and make a recommendation, but then the business really makes the decision. What are your thoughts?

Kim Emiru  10:05

Oh, absolutely; I think every business has a mission and goals that they're trying to accomplish, and so it's figuring out how you can be part of that. Privacy is one of a set of considerations; it's never just privacy, it's never just security, it's never just tax; there are various other corporate functions you can name. There are a multitude of things that need to be on the table for a decision to be made, and you are weighing that risk. The ultimate decision maker may be different at every company, but you need to be able to inform them of the position that you have from a privacy perspective, and they weigh that against every other consideration. It would make our jobs a lot easier, of course, if we could say, here's the privacy consideration, and this is the only thing you have to weigh, but that's never the reality. That's just not it. So it's important to keep that in mind to be successful, because you need to persuade, in some cases, for things to get done; we're salespeople for some of these pieces.

Debbie Reynolds  11:11

Oh, I hadn't thought about that. Yeah. You're influencing internally? Yeah.

Kim Emiru  11:17

Yeah, exactly. Some are going to be easier, and for others, we put in the work to influence and make sure that the decision that's reached is the one you think is best from the privacy perspective, knowing that that may not be the case. And, you know, I'm not speaking for any particular company that I've worked for, but what I can say is, I look at my job as being able to provide information that allows the right stakeholders to make the right decisions. If one of those considerations is, here's the law, and here are the fines, great, that's what you present, but it's better to approach the job not from the perspective of here's the fine if something goes wrong, but rather, here are all the benefits of doing this in a way that considers privacy first.

Debbie Reynolds  12:09

That's great insight for anyone in a job, or trying to go into that area, in-house. Let's talk about privacy globally. I've had experience with multinational corporations for longer than I care to admit, but the exciting thing about privacy and data protection for me is seeing all the developments in different countries, and especially when you're working with multinationals, you have to understand that not every country, region, or jurisdiction thinks of privacy the same way. I actually had a chat with someone today from Mumbai, and they were saying that sometimes people come to India with this very Western view of privacy, right? As you know, they have a new data protection law. But you have to consider the culture of whatever society you're working in, right? That's very important in order to make people feel like you have respect for them as people, but also to understand why privacy is very important to them. What are your thoughts?

Kim Emiru  13:23

Yeah, absolutely. I think it's understanding the emphasis of the law. The legislative history is helpful for that reason, to see what the intent was in coming up with the law and drafting it in that way, because it will, I think, shed light on how it might be regulated, but also who it impacts. One of the things that I'm seeing more recently is just the activity on the continent that I'm from; I'm Ethiopian. I'm seeing a ton of activity in Africa, in particular around data protection regulations; some have been in place for a while, others are up and coming. And it's really exciting to see, because I think Data Privacy is an important consideration for any culture, but individual expectations in the US for Data Privacy are very different from what Ethiopians may consider an important factor for themselves, right? I think the priorities are also very different for each jurisdiction and each culture. So, having that in mind when you implement a program or implement laws is really critical.

Debbie Reynolds  14:38

That's true. That's true. I know. I have friends in Africa, and almost everyone I know in Africa uses WhatsApp, right?

Kim Emiru  14:47

Yeah.

Debbie Reynolds  14:47

So that's just the way it is.

Kim Emiru  14:51

Yeah.

Debbie Reynolds  14:52

So if I want to talk with my friends there, I use WhatsApp, right? And so some people are like, well, you shouldn't use WhatsApp, you should use this, where I'm like, there are some cultural components here that you may be missing if you don't understand that culture. So that's just the way it is. Right?

Kim Emiru  15:10

Yeah, exactly. And I think this is even more important when companies have internal policies: okay, does this policy actually fit as a standardized policy across the board, or do you need to deviate based on the population that the policy applies to?

Debbie Reynolds  15:30

Let's talk a bit about diversity in privacy. Right. So I mean, we could just talk about diversity, period.

Kim Emiru  15:40

Yeah.

Debbie Reynolds  15:41

Not just diversity in privacy. For many years, a lot of times, I'm the only chocolate person in rooms, the only chocolate person on panels. It's just the way that it's been, right? So I get happy when I see more diverse people. I want to see more diverse people because we need richer perspectives. We need different ways of thinking that aren't the same. So, what are your thoughts?

Kim Emiru  16:10

Absolutely. I couldn't agree more. I think it is particularly important in this space, given the type of regulations that we see and the potential impact of data collection and processing broadly, on us as citizens of the world. I think it's really important where we see the impact of data generally; we can talk about data collection used for policing, and there's just a ton of disproportionate impact that can take place with respect to minority populations based on the data that governments use. There are so many areas in the US that we could go into. But in general, having diverse viewpoints has been consistently shown to improve the potential of businesses, so I don't think that's a debatable topic at this point. It's more around how you improve it. Companies may say, we have a pipeline problem. Well, where do you start to address it so you don't have a pipeline problem? I think privacy is actually a really, really great field for people to get into because you don't need to have a particular background to get into it. You don't have to be a lawyer or a tech person; there are a lot of transferable skill sets that you can apply to be a privacy professional. So it is an easier field to get into from that perspective, though I think privacy is generally a more closed-off field; it's really about networking and figuring out how to land your first job, and once you do that, I think you probably have it easier. So I think we as a community need to invest more to make sure that people of color, in particular, have those initial opportunities. When I see junior-level positions that say five or seven years of experience, what do you need that for? Which junior-level position requires five years of privacy-specific experience? I think it becomes a decision: are you okay with skill sets that can transfer, or do you need them to have a deep working knowledge of GDPR or CCPA, or whatever the case may be?

Debbie Reynolds  18:29

That's true. You know, what I would love to see, and I feel like I've seen this more, hopefully, is hiring managers being more involved in that first-cut process for people who may have a skill that doesn't match a keyword in the AI screening system, but where something catches your eye and you say, okay, I can work with this person; they have this or that skill set.

Kim Emiru  18:59

Yeah, absolutely. I think people tend to get weeded out. You know, we talked about this at the beginning around women in particular weeding themselves out of the process. Generally, if you're stuck on those kinds of key phrases, must have privacy experience, and not on the skill set, you tend to weed more people out of that review process. If you're looking for a person that is analytical, or you need assistance in reviewing something or what have you, that's probably a simplistic example, but I don't know the number of years in privacy required to do junior-level work. I think it becomes a question for people: do you need them to have done this for several years, or can they have done something else? Can they have been in an adjacent field or a separate field that may have required that attention to detail, that ability to communicate with stakeholders and explain processes, or to grasp concepts so that they can apply the law by working with an internal counterpart who may be more senior? Just being open to that possibility is important for getting more people into the field.

Debbie Reynolds  20:18

Yeah, I think that's vital. I feel like anyone who's been in a data field of any type can add privacy, and that just makes them that much more valuable, because privacy and data protection are a horizontal issue that all companies deal with at some point or another. Right? So being able to have those conversations, have that view, will definitely help you go forward in your career, regardless of what your number one specialty is. Adding privacy, I think, just turbocharges anyone's career prospects at any type of company.

Kim Emiru  20:58

Absolutely, I cannot think of a role that does not touch personal information. So, being able to say you have some background in how you should be handling that data is going to make you more marketable in the future.

Debbie Reynolds  21:12

Now, let's talk about AI. It's hard not to talk about this, right? Because everyone's going crazy about it. In entertainment, I feel like you're like, ah, been there, done that; a lot of the technology that we're seeing, especially in video, is old news, I think, for a lot of the entertainment companies. But Generative AI has caught so much attention, and I think it's just because it's something that has gotten out to regular people, not just companies; people are able to put their hands on it, touch it. The organizations that I'm talking to are like, we can't ignore Generative AI; we can't close the door and pretend it doesn't exist; we can create policies and procedures, we can have a statement about how we want to approach this, and we can educate people about the risks. But to me, people take three different stances. One is, shut the castle door; we're going to pretend it doesn't exist; let's shut it off. Then other people are like, we're not going to think about the risks because we think it's a great opportunity. And then I feel like most people are in the middle: let's come up with a common-sense way to do this, let's educate people about what they should do, because a lot of people may see the benefit but not the risk. I think that's very true of a lot of technologies, where people are very good at advertising a benefit, but they don't really think about the flip side, right? What are your thoughts?

Kim Emiru  22:43

Obviously, I can't speak on anything from the perspective of my employer, but as an individual, I think with any new item, you know, we've had a similar thing just recently around blockchain; it was the big buzzword for a while, and then it kind of fizzled out. So I agree with you that the accessibility to individuals, being able to pull up ChatGPT and ask questions, that interaction has raised the profile, and the capabilities embedded in the tool have made this top of mind for everyday people. I think that, in general, having some sort of policy is a recommended step for every business, or even individually for me; I would advise my friends to do the same thing. Right? Don't just share everything anywhere, even on social media; I think that's a general policy you should abide by. So I think having a similar posture from a business standpoint probably makes sense: we're going to have do's and don'ts, clear don'ts and potential do's, until we figure out where this is going, and having a realistic expectation of what your employee population is like is going to be helpful. Right? I think it would potentially be a difficult position for a cutting-edge technology company to say don't use these technologies, because that's what you're in the business of doing, but you would want policies around what is permitted. I can also see companies clearly saying no to things like that if what they're working on, or their general population, doesn't have the means or the experience with prior technology to be able to step into this, because, like you said, the risks can be so massive. The benefits can also be equally huge, but having that fine line and being able to guide people to figure out what is proper use and what's improper is helpful. Otherwise, you're kind of left to your own thoughts about what is okay to do.

Debbie Reynolds  25:05

That's true. That's true. Is there anything happening in privacy in the world now that either catches your attention or that you're concerned about? Something you see and think, oh, wow, I don't know if I like this.

Kim Emiru  25:19

I mean, as a privacy lawyer, we can always mention just the number of laws that we're seeing in this space, which are creating not just a hodgepodge of laws that we need to navigate through; from an individual standpoint, I don't think what we're driving toward is becoming clear. Right? What are we protecting against? And who are we protecting? I think that can get lost when you have the number of regulations that we do.

Debbie Reynolds  25:53

You know, I'll give you my impressions, and you tell me your thoughts. I think the US is the most complex privacy jurisdiction in the world, period, because we do have this hodgepodge of laws that are continuing to roll out; we have these States that are basically like their own little countries passing different laws. But I also think there's something unique about the West that makes it that much more difficult, and that is that a lot of our laws tend to be very prescriptive, whereas some of the laws in other jurisdictions are less prescriptive. So, for example, in the GDPR in Europe, they say only keep the data as long as necessary, right? Whereas in the US, it's, keep the data this amount of time, or put a button on your website. So even though I feel like some people really want prescriptive laws, where they're like, just tell us what to do, and we'll do it, I think that is what's creating an extra layer of complication at the State and Federal level. You probably know this better than anybody because you're a HIPAA expert, right? HIPAA is complicated in terms of all the stuff that you have to do, right? It's not just protecting people's health data; it's so many different policies and things you have to do, and it's very complicated. I had someone talk to me today, and they were like, oh, yeah, we want to be HIPAA compliant. I'm like, that's like putting World Peace on a to-do list. It's massive. I don't understand. So, what are your thoughts?

Kim Emiru  27:44

I agree; I think the complexity added is just the specificity required, and some of it may not necessarily align with what some other States or some Federal requirements are, so I do think it's an additional layer of complication. The positive is that, in general, they're all pretty similar in terms of what the privacy principles are, but the details create burdens that, quite frankly, I'm not sure how small and medium enterprises can work through or navigate. So again, the question to me is, what's the end goal? What are we trying to achieve? If we know these laws are so complex and require a level of expertise or time that businesses don't have the capacity or the money to implement, and they just don't do it, what have we achieved by just having the laws on the books?

Debbie Reynolds  28:50

That's true; if you make it too complex, people just won't do it, or they'll do it badly. If the goal is to protect people, we should make it as easy as possible for businesses to do this, but...

Kim Emiru  29:03

I think there's a way of doing that without taking away from the substance of data protection as a principle; I think that has to be the focus if that is the intended goal. But then you sometimes have regulations that are really directed, or at least seemingly directed, at certain types of businesses that end up impacting all other businesses in that space or more broadly. So it kind of becomes a question of how we regulate certain industries, if that's the goal, without impacting every other business that may be operating in that space.

Debbie Reynolds  29:41

I think you just made a good argument for why we need Federal privacy legislation or regulation.

Kim Emiru  29:49

I mean, I think it would be helpful to have something that is consistent, that you can apply, and that has a clear goal of what the intent is. I think we're probably far removed from that just based on the number of States that now have laws. That might be a moot argument, but...

Debbie Reynolds  30:13

I don't know; I feel like in 2023, there are five major State laws that went into effect, and in 2024, there will be at least 12, and that doesn't count the laws already on the books whose enforcement goes into effect. So I think people are going to start pulling their hair out, and they'll be like, oh, we definitely need a Federal privacy law; this is so hard, you know? So maybe that will push people to have more of a compromise, because two of the biggest sticking points that we have on Federal privacy legislation are private right of action and preemption. My view, and it's a controversial view, is that I think we should drop private right of action and preemption and just harmonize the definitions across the US. I think that will get us part of the way there, and maybe we can fight another day about preemption or private right of action. What are your thoughts?

Kim Emiru  31:16

I agree with you that those are the two sticking points that likely delay this, but I have no idea where it will go. I think it's hard to say at this stage, given where the other laws are. My take is that what will probably drive Federal legislation more than anything else is the Generative AI piece; that will prompt things into action more than what's happening at the State level. But I'm not taking anything to the bank at this point on Federal.

Debbie Reynolds  31:50

Oh, no, I'm not holding my breath there. You're right. AI has gotten people's attention and captured people's imagination. We have billionaires on Capitol Hill having meetings with people in Congress; if they wanted to do something, they probably could do something on AI before the end of the session going into an election year. And maybe privacy can hitch its wagon to that.

Kim Emiru  32:18

Yeah, potentially. Right. To your point, though, I think it would be really important to have clear definitions that hopefully are consistent with what we're seeing globally, so we're not creating our own lane where it's not necessary to do so.

Debbie Reynolds  32:37

Yeah. So if it were the world according to you, Kim, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology?

Kim Emiru  32:53

Do we only get one wish?

Debbie Reynolds  32:56

You can have more than one; I've had people with four or five wishes. You can have more than one.

Kim Emiru  33:05

I guess my primary wish, just on an individual human behavior level, is being more conscious about what you put out there. I have two little kids, so the concept of sharenting is a big deal for me, because I think that exposure may not land as well as you may want it to in the future. So that would be one thing that I would say: be conscious of what you're putting out there. Maybe you still do want to share that picture or item about your kid, but at least you have given it more thought than just the number of likes.

Debbie Reynolds  33:49

That's a good one. That's a good one. I think 2024 is going to be a hot year for children's privacy as well. We see regulators, you know, ramping up enforcement on existing laws, and new laws coming into effect. I think it's going to be a tectonic change for companies, especially with the California law coming into effect in 2024 and enforcement around raising the age of who is considered a child online. That's going to be huge because, before, a lot of companies were like, check the box, swear you're over 13, and you're off to the races, right? Whereas now they're saying, okay, you have to make sure you know the age of a person and have some type of mechanism so that you can show that you've done some due diligence about the age of the person. Companies now are collecting more data than they ever did before, so maybe they have more information. Like if someone says, I'm eight years old, I had my eighth birthday party, but they said they're over 13, that's a red flag for companies. I think that's going to change the Internet quite a bit within the next year, because it's going to be very different for people who are under 18. What are your thoughts?

Kim Emiru  35:09

I think you're right. I think the California Age-Appropriate Design Code is a significant law. But I actually brought up that comment more about human behavior, not even the impact of the laws on companies and how they will have to treat data. Right? I think that's a whole separate beast on its own. But yeah, I think children's privacy, in general, is just something that we haven't grappled with as a society in the same way that we will have to going forward.

Debbie Reynolds  35:41

Yeah, I agree. I think not only parents but people in general should just put less stuff on the Internet.

Kim Emiru  35:47

Yeah, exactly. There are just some things that you don't need to share.

Debbie Reynolds  35:52

Right. Exactly. Absolutely. Well, thank you so much. This has been so much fun. It's so great to see you, and I'm so proud of what you're doing. You're such a nice, kind, smart person, and I know that you'll always be successful in what you do. So it doesn't surprise me.

Kim Emiru  36:13

Thank you so much. This has been really a joy. I love geeking out about privacy with anybody, so I appreciate this. And I do want to give a shout-out to you for being available when I reached out. I think that's such an important thing: as you climb up, pull people up. You're such a great example of that, so I appreciate that.

Debbie Reynolds  36:39

That's so sweet. That's so sweet. Yeah, I'm always happy to chat. And, you know, I have people contact me from all over the world, and if I have time, definitely, you know, yeah. Sometimes you just need a cheerleader, don't you?

Kim Emiru  36:53

Absolutely. Right, especially in a field like this, to see somebody else that looks like you, or not even looks like you, but is just doing something where you're like, alright, how did you do that? So that's great.

Debbie Reynolds  37:07

Yeah, well, you're a good learner, a good student, and a good steward of privacy. And, you know, a lot of people look up to you and will really enjoy this episode; they'll want to learn from you.

Kim Emiru  37:21

Thank you, Debbie. Thanks for having me.

Debbie Reynolds  37:23

All right. Thank you for being on the show. We'll talk soon.