E119 - Antonette Igbenoba, Privacy and Ethics Associate Counsel, Upwork
SUMMARY KEYWORDS
privacy, people, facial recognition, ai, data, technology, breach, happening, processing, thinking, companies, organization, law, employees, users, impact, providing, relates, business, autonomous vehicles
SPEAKERS
Debbie Reynolds, Antonette Igbenoba
Debbie Reynolds 00:00
At the time that we recorded this podcast, we were discussing whether the pause on the employee provisions of the CCPA, the employee exemption, would be extended, and it was not. So those provisions went into effect on January 1st, 2023. But we had a very lively discussion about how we thought that the change in how employee data is handled in California would have ripple effects in other states. So thank you so much, and enjoy the show.
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, Antonette Igbenoba. She is Privacy and Ethics Associate Counsel at Upwork. Welcome.
Antonette Igbenoba 00:39
Thank you. I'm so excited to be here.
Debbie Reynolds 00:42
Yeah, well, it's fun. So I think we were collaborating together on an IAPP session; I can't remember exactly what we were talking about, which is so funny. But you were in a session, and you struck me as someone very down-to-earth, very knowledgeable, and pragmatic, which I really love. I very much like talking to lawyers who work at technology companies, because you have to have so many different business skills, have listening skills, and understand what's happening on the technology side of the business. And I just want to get your thoughts and perspective about privacy. But before we get into that, I would love for you to talk about your journey into privacy.
Antonette Igbenoba 01:41
Oh goodness, my journey into privacy is very untraditional, very unrefined, and very unapologetic. So, long story short, I moved to Atlanta with one suitcase after college for law school, with ambition in my heart and dreams in my head. Through law school, I had a few different types of internships. I interned for the IRS Office of Chief Counsel doing tax law, and at the EEOC, the Equal Employment Opportunity Commission, doing employment law matters. I had done personal injury. And then I got a technology internship, which I thought was pretty interesting. I was like, what is this about? I was doing legal research as it relates to autonomous vehicles. So, for instance, if autonomous vehicles were to get in an accident, who was liable? Or even thinking about the future of car ownership: since cars can move themselves autonomously, will people still park cars at home, or will we all begin to lease the same car on the same block? What's going to happen? Anticipating these future issues, proposing and crafting creative solutions, and just giving thought leadership. I fell in love with this. My background is in theater; I was a theater major. And so it kind of reminded me of art. It was so flexible, so fresh, so new. I got to do research and just do thought leadership. And so I was like, I need a career in privacy. What do I do? You network. I went to almost every event in the city of Atlanta as it relates to privacy, chasing people down afterward and meeting with people for coffee to learn more. I had an awesome mentor who told me to join the IAPP and get my certifications. Well, what happens when you have a law degree, you have your certifications, and you're newly barred, but you don't have experience? It's America; nobody's hiring you. So I was like, let me start my own business. I started a very small data protection consulting business working with e-commerce businesses to help craft their privacy policies and build their privacy programs. I've always been a firm believer that no matter what size your organization is, a mom-and-pop selling cupcakes on the street or a midsize or large organization, at every level an organization needs privacy, and there are privacy standards that you can implement. I used to do Privacy 101 classes, just information handling things. The whole premise was to just crank out, put into the universe, as much privacy work as I could. Half the people were not paying me, bless their hearts, but my goal was to gain experience in order to get a job, and it kind of worked out for me. I went to EY doing Data Privacy consulting for Fortune 100 businesses, from tech companies to finance companies to companies in the manufacturing industry. So a broad range of experience: helping them implement GDPR and CCPA, and in general helping them build and maintain a privacy program. And then I jumped to Upwork as In-House Counsel, building our global privacy program and all that fun stuff, working with cross-functional partners as it relates to privacy. You know, your marketing team wants to do an email campaign that's impacting the EU; well, we need to make sure that we have the proper consents, and on the contracts and DPAs, make sure that obligations are properly set out; do we need to include SCCs? So all that fun in-house stuff, dealing with privacy at every level, and also spreading awareness through the organization.
You know, most breaches happen due to internal negligence. So I'm thinking about creative ways to bring everybody on board, create a culture of privacy, and use privacy as a competitive advantage. So yeah, and here we are today.
Debbie Reynolds 05:44
Wow, you're really scrappy. I love that story. I know a lot of people who want to get into privacy will really appreciate that. And that's one of the things I recommend to people: if you don't have experience, find someone to volunteer for and help them with their work. That gives you valuable insight and experience that can definitely help further your career. As you were talking about autonomous vehicles, it occurred to me that the session we did together was about Smart Cities.
Antonette Igbenoba 06:13
It sure was, it sure was, yeah, centering on Privacy by Design.
Debbie Reynolds 06:19
Yeah, that's right, that's right. So I love that story. I love the way that you climbed your own ladder; you created your own ladder to get to where you wanted to be. And I think self-education, self-learning, being resourceful with what you can do, that says a lot. I talk to people all the time who are like, oh, you know, I want to get into privacy, but I don't know anybody. And you were like, let me get into these programs, let me learn stuff, let me read. And you really did. Excellent.
Antonette Igbenoba 06:59
It was very difficult. It was not easy at all, you know; it's awkward walking up to people, striking up conversations, setting up coffees, following up. And funny enough, this was back in 2015 or 2016, when this journey started, and all those same people, I still keep in touch with them and still have those same relationships. And even working with law students, as you mentioned, folks want experience and there's no opportunity. But giving them opportunities to do the research for the IAPP decks and the events that I put together, having them in the meetings with these speakers, meeting people, and encouraging them, hey, follow up with these people afterwards, get to know them, see how you can plug yourself in. You definitely have to put yourself out there in order to get what you want.
Debbie Reynolds 07:49
I'm thinking especially about companies that deal with HR data. I'm hearing a lot of companies are trying to do digital transformation around how they handle that type of data. And that puts you right in it, because you're dealing with humans, you're dealing with human data, so you can't get away from privacy in those types of careers and those types of jobs. So tell me about that experience.
Antonette Igbenoba 08:16
Gotcha. So you asked me about working with HR data?
Debbie Reynolds 08:20
Yes.
Antonette Igbenoba 08:22
Internally or externally?
Debbie Reynolds 08:24
Well, either, whichever one you want to talk about.
Antonette Igbenoba 08:27
Okay, okay, okay. Well, I believe, from the internal standpoint, it's best to have your employee privacy notice where you've listed out the data elements that you're processing, why you're processing them, and how you're processing them. And then that will guide what you do externally, because you're going to want to provide your team members with benefits. For instance, with a mental health service, you're going to have to share their information for them to be contacted by the vendor, to create their profiles and such. At a high level, I just think when it comes to HR data, it's really a new area that companies are thinking about more, especially in light of the CPRA and the exemption sunsetting. Now you have to think about providing your team members with more information about what's taking place on the inside. So having that notice there is a good starting point for your team members to understand what's flowing through the organization as it relates to them, and why. Now the questions get deeper: how do these new laws impact things like your focal content, feedback, even potential reports and things like that? How far do organizations want to go with compliance? And what is even necessary for compliance? Those are the questions we are working towards answering at this moment. I was literally on a call with outside counsel a couple of hours ago, and this was one of the things that came up. And actually, the outside counsel's perspective was interesting. He mentioned that some organizations are actually waiting, in hopes that maybe a bill comes and knocks it out and the sunset does not take place. However, some organizations are beginning to see their team members as consumers, as data subjects, and thinking about how to treat them in the same manner and implement the same processes.
Debbie Reynolds 10:50
Yeah. Now, for those people who don't understand the sunset, can you explain the sunset clause you're talking about?
Antonette Igbenoba 10:58
Yep. So within the CCPA currently, employees and applicants are not fully included; yes, you have to provide them notice, but they do not currently have those data subject rights. Come January 2023, that will change. And so those same rights that the consumers have, that the users have, the employees will also get. That includes the right to know what information you have on them and are processing about them, the right to delete such information where permitted under applicable laws, the right to opt out of any sale or sharing of their data, and also the right to limit the use of sensitive data if it doesn't meet the threshold for processing within the statute.
Debbie Reynolds 11:47
That's a pretty big deal for many reasons. One is that until we get a Federal privacy law, California is almost like the de facto standard in the US for privacy laws. And this is interesting, and I'm sure you have a good perspective on this, but I'm always talking with people in Europe versus the US, and each side is shocked about what is private and what isn't. A lot of people in Europe assume that we have the same employee privacy rights that they have, which is not true. And then people in the US think they have more rights than they actually do have.
Antonette Igbenoba 12:30
I love to tell people, you don't have any privacy rights. I mean, unless you come from certain States, right? But overall, there are none. And it's so interesting you bring up the Federal law, because I read something in the news, maybe yesterday or so, that California plans to push back against this Federal proposal, potentially. So as you said, California is that State in the United States that just gets to act that way, because they kicked all this off and they're more progressive as it relates to privacy.
Debbie Reynolds 13:05
Yeah, it's a catfight, for sure. Because California has been progressive for many years, and they've worked on this for many years, I can see how they don't want anything to curtail or minimize rights that have already been in place for a number of years for people. So it's going to be interesting to see how that all plays out. But I think people just don't know what is private and what isn't. In the US, you should assume that if you're an employee, you have no privacy; there's nothing that you do at work that's private.
Antonette Igbenoba 13:43
You know, you bring up such a great point, because, wow, I just thought about your work computer: your keystrokes, your finger movements, how long you were on a page. So I'm just wondering, at a high level, thinking outside the box, is this part of access requests? I guess not, from the standpoint that some platforms might leverage their users' keystrokes and all that type of stuff, but they're not providing that back in the DSAR report. So maybe that's a little bit too nuanced. But then somebody can argue that monitoring employees' keystrokes and devices is a bit more invasive than how you're monitoring your users, because at a certain level there's anonymization or aggregation taking place, but with your team members you're not necessarily aggregating. You need to know that Toni was on her computer flipping through these documents, as opposed to, say, 15 people from this area clicked on this specific part of this website.
Debbie Reynolds 14:59
That's true. Very true, very true. I've had clients in the EU that use Zoom, for example. And Zoom had a feature, I don't know if they still have it, on the admin version that did eye tracking. It tracks your eyes to tell whether you're paying attention. And so we had people turn that off in the EU, right? Because that's not something that they should be collecting over there, whereas over here it's buck wild, you know?
Antonette Igbenoba 15:32
Eye tracking? Oh, my goodness, oh, my goodness, that's intense.

Debbie Reynolds
Yes, terrible. It's terrible. Well, what is happening in the world right now that concerns you about privacy? Something that's being developed where you're like, oh, my God, I don't know what they're doing with this, I don't like it at all?

Antonette Igbenoba
I wouldn't say a thing, but I would say a notion. I'm a champion of algorithmic accountability and data ethics. I'm a champion of: what is your AI and ML actually doing? And outside of that, with all this data that you have, what exactly are you doing with it? What do people know about, and what don't people know about? So I would say that algorithmic accountability and data ethics are definitely areas that are always at the forefront of my mind. And you even see that the general trend is towards AI regulation, due to the wider impacts of this technology. We see the EU with its AI Act; recently, Canada proposed the Artificial Intelligence and Data Act. We're seeing nations consider the impacts of these AI systems and think about ways to identify, assess, and mitigate the risks and harms of the biases that can come from them.
Debbie Reynolds 17:04
What do you say to people who don't understand the potential harm of AI? I think, for example, say...
Antonette Igbenoba 17:13
Oh, I've got a good one for you. I tell those people, in fact, I'll tell you too, right now: Google "cute baby." And then you'll immediately see the impacts of AI if we do not do anything about it.
Debbie Reynolds 17:31
Well, I've got one for you. Go to Google and type in "lazy man."
Antonette Igbenoba 17:36
Ooh, man, no. Debbie, I don't want to do it. I tried to give you a nice one. Please, all podcast listeners, I tried to do this nicely, and Debbie took us there.
Debbie Reynolds 17:48
We had to go all the way there; I had to go all the way there. I think, you know, AI can be amazing, right? It can do amazing, wonderful things, but it can also do harmful things. I picture it like a sword that has a blade on both sides. It can powerfully do wonderful things, and it can powerfully do harmful things as well. So I think a lot of times when people are implementing AI systems, they get all excited about the benefits, and they don't really look at the harm, especially if the harm doesn't directly impact them. I think that's the issue.
Antonette Igbenoba 18:33
Absolutely, agreed. And really, a new study that I came across found that AI's benefits hinge on explainability with a human touch. Pretty much, this study was advocating for an end to AI ambiguity: we need to know exactly what is happening, looking at the ethical risks associated with AIs and how explainability helps to overcome those risks. And within explainability is how you got to your result; beyond that, looking into explainability gives you insight into whether your data sets are diverse enough: what do you need more of, what do you need less of? Because the AI can only do what you train it to do. With "cute baby," for instance, there's only one type of cute baby that you see. So when you hit that Google machine typing "cute baby," that's what it knows a cute baby is. How about the other people? Can we get more? Can we get a magnifying glass, so that, hey, we have some explainability of how it was making this decision? How can you program this AI to think more diversely?
Debbie Reynolds 19:48
Yeah. Say you go into a grocery store and it has a mat that you step on and it opens the door. If a person walks in before you and they step on the mat and it opens the door for them, and then you step on the mat and it doesn't open the door for you, that means there's something wrong with the system. The other person can't say, oh, there's nothing wrong with the system because it worked for me. No, it needs to work for everybody. If it doesn't work for everyone, then there's something wrong with it.
Antonette Igbenoba 20:18
Absolutely, absolutely. And I'm not sure organizations leveraging this technology are thinking that far; maybe the AI is solving an immediate problem, or maybe it's even causing the platform to be more lucrative. But as data continues to be processed, and as we see how algorithms impact people, it could be something as small as a young lady with an eating disorder constantly seeing the same type of content about food because that's what she looks up. Now, I don't have the answer for how to fix that, per se. But having these discussions is maybe even more important than the solution, because of awareness. I think a lot of people don't even know what an algorithm is, right? Some people will be on social media and think that social media is a view into the whole world, but it's actually not; it's just providing you items based on what you like. It's not what everybody in the world is thinking; you're just seeing everybody who thinks like you.

Debbie Reynolds
Yeah, and I think a lot of people assume that everyone is seeing the same things, when someone sitting right next to you, doing the same searches, could be seeing totally different content. Some people really need to understand that.

Antonette Igbenoba
Yeah. And I mean, I'm very pro-technology, too; I'm very pro-innovation. We cannot stop innovation; innovation has to happen. Innovation has brought us this far. Technology is going to continue to grow, data is going to continue to be processed, and companies are going to continue to develop very interesting technologies, whether it be, I read a while ago, technology to diagnose depression through your voice, or technology to decide whether you can repay your loans. I am not against innovation and development. However, within innovation and development, we need to think, or at least have a team thinking, about the wider impacts. Think about it at the beginning, the middle, and the end of the effort; I mean, Privacy by Design, thinking about it at every stage and how we can affect others. And have this conversation with a diverse group of people who all see life differently, because the issue becomes, if we all see life the same way, that's not helping either.
Debbie Reynolds 23:10
Right. Well, this is what is happening with a lot of these AI systems: they test it on a small population, and then they make the assumption that they can apply it globally to everyone and it's going to work the same way. That's just not true; that's a fallacy about how data and technology work. Well, I would love your thoughts about the future. We have a lot of these laws, but I feel like companies aren't prepared for the ways that they need to change to be in alignment with them. Some companies are like, everything's fine, we haven't had a breach, we haven't had a privacy issue, I'm not in Europe, or whatever. They think, if I just continue as I'm doing right now, sending out emails to anybody in the world without proper consent and stuff like that, it'll be fine. But we know that that's not true. So what do you see in the future about companies really getting the message that they need to change?
Antonette Igbenoba 24:19
Okay, companies will only change when it affects the dollar bills. When they start to lose money, that's when they will get it together, unfortunately. So I would say when there's a financial impact, because really, the bottom line drives everything: the organization wants to grow, wants to expand, wants to make more money, take care of its employees, take care of its shareholders, and all that fun stuff. But we're getting to a point where, if companies actually pay attention to privacy, they will see that it's an opportunity, an opportunity for a competitive advantage. Why? Consumer trust. When a consumer trusts you, they will give to you. When a consumer sees that, hey, you're transparent. I was on a call with outside counsel; they mentioned, and this is public, everybody knows about a breach that took place at Salesforce, and how Salesforce had a public website where they were providing transparent moment-to-moment updates, and they had never seen that before. That is something that will continue to happen in the future, because people will be like, wow, we feel included. There's no wool being pulled over our eyes, no scheme, no gimmick taking place. These people are letting us know exactly what is happening and how they're addressing it. People love that. People love to be informed, and people love to be included. Privacy is a way to inform users and include them in your process. And I believe that companies that truly take that to heart will not only have a great product but will also have even better consumer trust. And consumer trust equals more users. It could even equal people posting about your company on social media and your company potentially going viral, or not even going viral, just the notion of other people viewing that person's social media and saying, oh, hey, that's cool, let me check them out, and then it just becomes a thing. But definitely, privacy is important. And for companies that feel like they haven't had a breach and that they're safe, the question is not if a breach happens; the question is when a breach happens. No one should ever have the mindset that because they haven't had a breach, they're bulletproof against a breach. In this day and age, the mindset is thinking about when, and also, if possible, getting those bug bounty hunters, the ethical hackers, to crawl through your platform. I work for a technology company, so it's a bit different for us; we really prioritize that. We're also a remote environment, so we really leverage resources to help us get ahead of that: people who are paid to scour your site and look for vulnerabilities.
Debbie Reynolds 27:25
That's great advice. I think you're right. If you have a breach and you're prepared, you'll be able to limit the damage that happens, because you have a game plan; you're not running around like your hair is on fire, right?
Antonette Igbenoba 27:42
You've done a tabletop exercise; legal knows what their role is, information security knows what their role is, and marketing knows how to do their internal and external comms. HR has talked to the folks, the employees: hey, y'all, we're keeping you up to date. There are so many facets of a breach; it's beyond a breach happening and you responding. If you practice multiple, multiple times before it takes place, when it happens there will be less pressure. You have your outside counsel ready; you have all your ducks in a row. And so when it happens, you're ready to go.
Debbie Reynolds 28:19
Yeah, I agree. I agree with that wholeheartedly. I want to get your thoughts about facial recognition and what's happening with that. I have thoughts, but I want your thoughts first.
Antonette Igbenoba 28:34
Facial recognition. Of course, we know that in 2019, the National Institute of Standards and Technology did research revealing that facial recognition algorithms generally misidentify non-white folks far, far more than they do white people, especially white men, and that study is publicly posted. With facial recognition, we have a ways to go; I don't want to say for it to be fair, because nothing is ever fully fair, but we have a ways to go for it to be completely useful, right? I mean, we've seen instances of facial recognition misidentifying potential suspects, of people getting arrested in certain areas by police departments that leverage this technology, and it turns out these people weren't even the suspects. Due to poor programming and the poor measurements taken of certain faces, it's actually causing more stress to certain communities than others. And for a technology with such a long way to go, it's so widely used. Your iPhone will scan everybody's face in your photo album and tag, like, dad, mom, my best friend. So I think that facial recognition is a great technology and can be used in many great ways, but there are obviously considerations that we cannot deny: we need diverse data sets, we need to validate methods of collection, how is this even being collected? And also, if we're going to permit our law enforcement to leverage this technology, then we need to have solid boundaries and guidelines in place. We've seen cities ban facial recognition, the city of San Francisco, and I think Baltimore a year or so back. So you definitely see cities putting ordinances in place against it until there's a better structure for processing facial recognition data.
Debbie Reynolds 30:56
Yeah, the technology will continue to be developed, so it's not going to stop. You have to find some way to improve it, or at least try to minimize the harm of its use. I don't think facial recognition is ready to be used in high-stakes, liberty-related things. We saw a case, I'm sure you're familiar with it, where a guy was arrested because they looked at his photo from a license bureau and thought he looked like some guy who did a burglary at some store, and they arrested him based on that. They took him in front of a judge, and the judge even said, this is not the same person. But what if the judge did think he was the same person, right? There has to be more human thought, human judgment, in this. You can't just trust what a computer spits out. To me, technology is supposed to help you with the heavy lifting; it doesn't replace your human responsibility to have judgment.
Antonette Igbenoba 32:22
Agreed. And that same human intervention, just to throw back to algorithmic accountability, I forgot to touch on that piece too: ways to infuse human review into this process. Although everything is mainly powered by automation, we need to pepper in the opportunity to appeal or push back and then involve a human in that process. And the same for facial recognition, definitely.
Debbie Reynolds 32:47
Yeah, I agree. I agree. Well, if it were the world according to Antonette, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be human behavior, technology, law, regulation, or anything?
Antonette Igbenoba 33:04
Oh, my wish, really, this is a big ask. Well, okay, I'm going to do two things here. First, I want us to have a Federal privacy law in America. I am sick of patchwork legislation. As an in-house counsel, I'm sick of making spreadsheets for every State, breaking down the legal requirements and breaking down what our next steps are. Give us one law to comply with, okay? But outside of that, just back to algorithmic accountability, data ethics, and facial recognition: diversity in datasets, across the board. Because beyond the processing of people's data, and the collection and how people are using it, decisions are being made from this data. Let's put aside, hey, we need to make sure everybody has notice about what you're doing, we need to make sure everybody has a choice about what you're doing, we need to make sure everybody has access to the data; those are all beautiful and nice. But something more sinister can take place, even when people have notice, access, and choice, when people are not looking: those diversity considerations, and also those ethical data considerations, like how are you actually using people's data? So I would want more formal processes and procedures around what really happens with data after it leaves, I don't want to say the hand, because there's nothing physical, but after it leaves my device and goes to whoever is processing it. What's happening after that? Secondary uses. The whole Clearview AI situation is an example of something happening where they kind of gave people notice, access, and choice at some point, but then something else was happening that people did not know about. And that's my concern about privacy: it's not the shiny things that you can see; it's the things that are taking place that we don't see, and then what happens as a result of those decisions. All of that eventually feeds into AI and ML, and then you just have this whole cycle. So it's getting a better magnifying glass and better processes into what's happening once the data leaves the user. How is that going to happen? Ooh, child, now that's a separate discussion. But at least we have one discussion, okay?
Debbie Reynolds 35:57
Yeah, at least we're discussing it, right? We have to talk about it first before we can do anything about it. Well, thank you so much for being on the show. It was such a treat to have you, and I look forward to us being able to collaborate more in the future.
Antonette Igbenoba 36:12
Of course. I really had a great time. I'm going to have to come back, because I had a really great time.
Debbie Reynolds 36:20
This is amazing. Thank you so much, and we'll talk soon.
Antonette Igbenoba 36:23
Of course.