E38 - Dr. Mechie Nkengla, CEO at Data Products LLC, Chief Data Strategist, AI Advisor
38:31
SUMMARY KEYWORDS
people, data, ai, work, terms, build, responsibility, privacy, ethics, happen, individuals, technology, feel, create, women, responsible, algorithm, talk, humans, conference
SPEAKERS
Debbie Reynolds, Dr. Mechie Nkengla
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world and what information these businesses need to know now. I have a very special guest on the show, a fellow Chicagoan actually, Dr. Mechie Nkengla. She's the CEO and Chief Data Strategist of Data Products, LLC. So I'm happy to have you on the show, Dr. Mechie.
Dr. Mechie Nkengla 00:46
Thank you, Debbie. It's my pleasure. I'm very, very excited to talk to you today.
Debbie Reynolds 00:52
So we'll start a bit with your background. So the cool thing is we met on LinkedIn, you contacted me, and we ended up having a call, and I was super excited, you know, to see a woman of color, a black woman, in data, the data science area, and also you're a mathematician, which I was excited about as well. So we talked, and you asked me to speak at a conference that you had called DataYap, and I want to get into that as well. And then it's so funny, because we started talking about ways that we can collaborate, and then I said, I'm doing this other thing about post quantum computing, is that what it was? Yeah. So you ended up on a panel with me and two lawyers from New York. Actually, Ron Hedges, who's a former judge who's at Dentons right now, and then Gail Gottehrer, who has her own practice in emerging technologies. So this was an amazing session that we did together, and I was so happy that it was just very good timing. And then it's great for me, because I always like to have people who don't know each other meet each other, so now you're all plugged in with my New York friends at the New York State Bar Association. But I would love for you to talk more expansively about your career trajectory into data science, and then I would love to talk about where you focus and about your DataYap conference.
Dr. Mechie Nkengla 02:41
Thank you. So, great, how did I get into data science, data strategy? A couple of decades ago, I'm not going to say how long, when I was in school studying, I happened to do my bachelor's and master's degrees in math and computer science, which was strange, because when you graduated at the time, you weren't considered a pure mathematician, so you wouldn't get the juicy research jobs, and you weren't considered a pure computer scientist, so you couldn't get your hands on a coding job either. So you were stuck in this sort of cross limbo, looked-down-upon situation. But I enjoyed what I did, right? I literally went to school without a major, went eeny, meeny, miney, moe on the catalog to pick my first course, and fell in love; that was what I was going to do. And after working for a while after graduating, I set out to do a PhD in applied mathematics, and then did a stint at Lawrence Livermore as a postdoc, where I did some research using data, again with the idea that, when researching massive amounts of data, you can make them compact, small enough that you can run regular algorithms and computations on them and get the same accuracy as with the large amounts. That was the focus of my research. Then I decided that I wanted to get into industry officially; I wanted to work on real-world problems and tangibly see the effects of the work I was doing, right? And I worked for some startups. At that time, there was no data science, right? It was a sort of newly coined term. You were a quant, that's what you were, or an algorithm scientist. So I did that for a little bit, then got work at Groupon, moved on to some director positions, and then decided I wanted to do my own thing. And so I came back, and I work at this data science consulting firm, Data Products, where, you guessed it, we build data products for our clients.
And my journey has not been a straight path; it's sort of meandered, all within the same space, using my mathematical skills as well as my computer science skills to drive forward in this emerging field that we now call data science.
Debbie Reynolds 05:20
So, you know, for me, I was really excited to meet you, because people think that people like you don't exist. Black women in mathematics, black women in technology and computer science, and things like that; you kind of criss-cross a lot of different areas, right? But people don't think about you, or us, in those ways. So tell me a little bit about the importance of women in science and tech, and definitely about your experience of being a woman of color in that area.
Dr. Mechie Nkengla 06:05
That's a deep question, and one that's also very, very dear to me, and important, not just to me but to the community and our society at large. So while I was a student, I went to the University of Illinois Chicago, and back in the early 2000s, there in graduate school, I was the only black student in the math department. And not just the only black student; there were no black math professors in the department either. Of course I noticed that, but I guess I was very comfortable in positions where I was the odd one, the odd-looking one, so to speak, or I should say the different-looking one, and it never occurred to me; I thought, well, maybe black people just don't like doing this. But I couldn't fathom why, because I enjoyed it so very much. Before I talk a bit further, I would say one of the reasons I enjoyed it very much was that it was one way I felt I could fight any discrimination, warranted or not. Because science was science. No one was going to tell me that something was wrong where I could clearly show that it was correct. So I felt like my work could speak for itself in that manner. That was a strong pull, I guess one of the added benefits of why I loved what I did.
Debbie Reynolds 07:35
Yeah, that's amazing. I love that story. I like the empirical nature of data, so on that I can concur. But I fell into technology by happenstance. I just wasn't going to go to law school. My mother was diagnosed with cancer my senior year of college, and I thought I wanted to spend more time with her. So I thought, okay, I can't go away anywhere, so how can I find a way to make money, stay busy, and be with her? So I bought a computer and taught myself to use it. And I just fell in love with data, really just how data flows and moves, and that's kind of been my interest. So I think that sort of criss-crosses too; I'm a good reader, I understand law quite a bit, and I've worked with people in law for decades. So that's kind of how I got into my career. You have to explain to me kind of the spectrum; I feel like people can't really grasp AI, right? To me, I think people think about it like a movie, right? They think about it as, you know, this robot took over and did stuff to me that I didn't really want, or whatever. We have all these dystopian-future ideas about AI and technology. But can you talk to me about the realm of AI as a whole, and sort of where you fit into that AI realm?
Dr. Mechie Nkengla 09:21
Right. That's a really good point to bring up, and I spend a lot of time on it. Another passion of mine is data democratization, which is data literacy: everyday people understanding everything around data, because it really affects them. We live in a digital world. There's no one currently living who does not interact with data in some form. So it's important that they have at least a very high-level understanding of what these terms are and how they affect them. But not to digress too far, back to AI. So when you think of AI, a lot of people think of some futuristic movie, maybe The Matrix, right, and the sentient bots, or some sort of sci-fi stuff of that nature. But AI is really just artificial intelligence; all it means is you build a program, a process, heuristics that mimic the way a human thinks, right? So a very classic example I like to give doesn't involve high-level computing at all. If you're as old as I am, you remember the day when you would go to work and have to punch a card, right? To mark when you come in, and punch a card when you go out. That can be considered a basic-level AI system, right? Because it's sort of mimicking what you would have originally done, which is coming in, signing your name on a sheet, then writing the time you came in, and when you leave, signing your name and writing the time you left. By automating that task, where no one needs to sit there and you're just basically punching a clock card, that is automated; that is artificial intelligence. Now, the distinction I like to make is between artificial intelligence and machine learning, which people use interchangeably all the time but which really are not the same thing.
Machine learning is the idea of a computer system or program, I like to say, acting in a manner that reduces the errors it makes, in terms of reducing the probability of errors it makes in some sort of calculation. I'll give you a classic example. So as humans, imagine you're a baby, or you have a baby. The baby crawls and learns to stand, but the baby realizes that there's a particular corner of the room that's probably not so flat; whenever the baby crawls and tries to stand over there, it tumbles and falls, right? So the baby learns that when he or she is at that corner of the room, to hold on to something so they don't fall. That aspect of learning, right, could be by reinforcement, in terms of they did it one time and fell down, or someone told them not to do it, and they either followed that instruction or didn't. That's the learning aspect. So imagine taking that and implementing it within a system or program or algorithm, where you're teaching the algorithm: if you print out x and someone responds y, then you respond z. That is learning; that's sort of a trained learning system. On the other hand, you could have it go through examples and get feedback, where, if the machine printed x and someone responded y and it got positive reinforcement, the next time it would print y. If it printed y and got negative reinforcement, then the next time it will not print y; it'll print something else. That's all learning is.
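The print-something, get-feedback, adjust loop Dr. Mechie describes can be sketched in a few lines of code. This is a minimal illustration only, not a production learning algorithm; the responses ("x", "y", "z") and the reward values are hypothetical, chosen just to show how positive and negative reinforcement shift a program's future behavior.

```python
class FeedbackLearner:
    """Tiny reinforcement sketch: prefer responses that earned reward."""

    def __init__(self, responses):
        # Start with no preference: every response has a score of 0.
        self.scores = {r: 0 for r in responses}

    def respond(self):
        # Pick the response with the highest score so far.
        return max(self.scores, key=self.scores.get)

    def reinforce(self, response, reward):
        # Positive reward makes the response more likely next time;
        # negative reward makes it less likely.
        self.scores[response] += reward


learner = FeedbackLearner(["x", "y", "z"])
first = learner.respond()
learner.reinforce(first, -1)   # negative feedback: avoid this one
learner.reinforce("y", +1)     # positive feedback: prefer "y"
print(learner.respond())       # now prints "y"
```

The point is exactly the one in the example above: nothing is hard-coded about which answer is "right"; the program simply stops printing what drew negative reinforcement and repeats what drew positive reinforcement.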
Debbie Reynolds 13:19
Yeah, those are great examples, actually. Thank you for that. I would love to talk about ethics, AI ethics. This is like a super hot topic, right? And I would love to hear your thoughts about ethics and AI. For me, I think that humans bring ethics to AI; I don't think algorithms can be ethical on their own. They absorb the ethics of a person, or I feel like the human has to be the judgment brains, kind of the knowledge, because I feel like computers, even as our algorithms get more advanced and more sophisticated, can't think or reason like a human all the time, right? So I don't know, what are your thoughts about that?
Dr. Mechie Nkengla 14:22
Oh, my goodness, we're talking about all the hot topics, I feel. So ethics, I wouldn't just say it's important, I would say it's critical, right? And you can't really talk about ethics in a vacuum; if you talk about ethics, I feel it comes hand in hand with responsibility. One of the reasons I feel it's gotten a lot more press goes beyond the notion that it's simply a good thing to do; it's more to do with the fact that certain events have occurred, and there's additional responsibility, and therefore it lit the fire over this discussion of where the responsibility lies, and in what fashion. So let's talk about responsibility when you build a system. Let me give you the perfect example: you're an attorney. You have to go to school to study, you have to pass a licensing board, and you have to do continuing education to be able to practice law legally. If you were to make a mistake, or do something that had an effect, the onus of responsibility for that outcome lies with you. But we live in an age where everything is becoming digitized; AI is being imbued in all our lives and everything that we use. However, the companies that build the systems really don't want to bear the responsibility for the outcomes, and therefore they point to the user and say, you are responsible for using our system. But that's ridiculous. It's like going to an attorney, or a doctor, and the doctor performing surgery on you, and then the doctor saying you are responsible for the surgery that I just performed on you. There's a disconnect, a heavy disconnect, there. I feel, really, if you're responsible for building a system, or solution, or a program that affects people's lives in any single manner, you should really be responsible for that, whether there's a law or not.
Debbie Reynolds 16:47
You and I agree 1,000% on that. I hear a lot of people, mostly people who try to sell AI, unfortunately, I don't know if it's an automatic thing, but I've talked to companies and people over time who have brought in some form of artificial intelligence or machine learning to help them, and it's almost a knee-jerk reaction for them to think, oh, wow, well, if this does this, then that means I don't have to do anything. That's the opposite of what it means. I think it means that you have to step up and take responsibility and make sure that you understand what it's doing, because the algorithm is supposed to be doing things on your behalf and shouldn't be doing things that harm individuals. So I think it's very important to be vigilant. One thing that I touch on a lot is bias, bias in AI. And it's scary to me when I hear people say, I don't think AI is biased. I'm like, hold up; people are biased, and people may add a bias to, say, AI. The example I give is a parallel to what's happening sometimes now: people have different blood types, right? So let's say someone created this transfusion, and you're like, okay, this works really well; my blood type is O, and I'm not going to test on other blood types, because it works. But then I'm going to use it on people with other blood types. As you know, if you give someone a transfusion with a blood type they don't have, they could die, right? This is, to me, a parallel to what's happening with AI, where it's being tested in a smaller sense, unfortunately, and then it's being foisted on many different people just to see what happens. In a medical situation, that just cannot happen.
So I feel like we're treating technology differently, even though it can have as harmful an effect, if not a more harmful effect, on people over time. What are your thoughts?
Dr. Mechie Nkengla 19:13
Oh, my goodness, I completely agree with you. And this is prevalent, I think, not just in AI, though particularly in AI because of the rate at which it's being adopted into society, but in all aspects. It has been occurring in medicine, right? You hear about studies done in 2018 showing that the way a disease shows up in a smaller set of the population, whether brown or black people, or women, is not the typical way it's taught, so doctors trained on the typical presentation are not able to recognize it, because they have not been taught about this. Not to digress again, but I have this anecdote: there was a medical student who wrote a book of common diseases and how they show up in black people, just because medical students were not exposed to that. And funny that you should bring this up, too; I'm just reading this book, and it's incredible. It recounts hundreds of stories of how products are created where the model on which they're tested is not representative of 50% of the population, right? You hear about astronaut suits: NASA, for God's sake, didn't have a size that fit women, and they had to go recreate it, because the standard model used to create the suit was a man. You hear about testing in cars, right? The crash test dummy is a man, so the safety precautions built in, like the airbag system, are based on a male dummy whose proportions don't match women. You hear about building systems where the heating and AC levels are set based on the standard body temperature of a man, and that's why women are typically cooler in offices in the summer, because the temperatures are set in a way that's not really catering to their core temperature. But I digress. All of these examples are nothing new; they've been going on for a very, very long time. But with the rate at which we're adopting AI, this is just blasting off, right?
And it's incredible. And it's the same phenomenon when people experience a bias. For those listening, I am a black woman, if you hadn't picked up on that. If you recount an experience of something that happened to you, a bunch of people who are not of your color, or not women, will tell you they have never experienced that before, sort of discounting your experience: because it hasn't happened to them, therefore it has not happened. I think the same thing plays out with AI. When you actually talk about the ways solutions have been built without taking into account inbuilt discrimination and bias, people respond, well, I have never experienced it.
Debbie Reynolds 22:05
Yeah, that's probably one of my least favorite comments that anyone could ever make: oh, well, I don't have that experience. That doesn't mean anything, because I'm telling you that I've had it. I can't tell you what happened to you, but I can discuss what has happened to me. So I always like to say: you go to the grocery store, and you step on the mat, and the door opens automatically, and you walk in. But if every time you go to the grocery store the door doesn't open, you say, okay, there's a problem here, and someone says, well, when I walked through, the door opened, so there's not a problem. That's a fallacy in and of itself. Just because it didn't happen to you doesn't mean that it didn't happen. So I'm going to check out that book, Invisible Women; that's really interesting. So when we talk about AI, we talk about data, right? Data can be about many things. But when I'm thinking about data about humans, what is your concern right now about the privacy of individuals' data?
Dr. Mechie Nkengla 23:20
Oh, my word. From the three-year-old to the 90-year-old grandma, we are all using data, and our data is being used. Whether it's tracking through our phones, which is the most prevalent, giving an idea of where you are and where you go: I can basically know exactly where you are as a person just by tracking your GPS locations and where you spend your time, right, throughout the day, over a period of time. Or the monitoring of how you use your phone, or going into grocery stores and having cameras track your facial expressions. We hear stories about that, and everyone just goes, well, it's all right. There is this notion that if you're not feeling or seeing the impact immediately, it's something far removed, and therefore you can take a backseat, right? Now, this is where data literacy, in my mind, comes in; it's really critical. People need to understand what they're giving away, and I think we, as a society, do not understand the impact of that, right? We're just resigned: well, if they want me to sign that consent to use my data, then I have to. And because we're resigned, we're not really paying attention to the effects of this. And I cannot stress it enough; this is the only way I can explain it to you: imagine you have a camera, or a person, that follows you around every single day, every single minute, in the shower, everywhere. Everything you're doing, they're watching, they're recording you. That's really what your personal data is.
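The point about GPS tracking can be made concrete with a small sketch: even crude, timestamped location pings, tallied by hour of day, reveal where a person likely lives and works. The pings below are entirely made-up data and the street names are hypothetical; real trackers work with far finer-grained coordinates, but the inference is the same idea.

```python
from collections import Counter

# Hypothetical timestamped GPS pings: (hour_of_day, rounded_location).
pings = [
    (2, "Oak St"), (3, "Oak St"), (7, "Oak St"),        # overnight
    (10, "Main St"), (14, "Main St"), (16, "Main St"),  # working hours
    (23, "Oak St"),
]

def likely_place(pings, hours):
    """Most frequent location during the given hours of the day."""
    counts = Counter(loc for hour, loc in pings if hour in hours)
    return counts.most_common(1)[0][0]

home = likely_place(pings, hours=range(0, 8))   # where the phone sleeps
work = likely_place(pings, hours=range(9, 18))  # where it spends the day
print(home, work)  # Oak St Main St
```

A handful of counts is enough here; with months of real data, the same tally pins down routines, errands, and detours, which is exactly the "camera that follows you around" Dr. Mechie describes.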
Debbie Reynolds 25:04
Right. And then there's what I call meta-metadata, so data about data about data. I had a guest on the show, and we talked about this; it was really interesting. He was saying that there's a statistic that people scroll on their phones, like looking at Facebook or whatever, almost the height of the Statue of Liberty every day. And the things that you look at the longest tell marketers things about you that you may not have even uttered, right? You've not said anything, but they're looking at your actions. And the future, to me, is about the gaze of the individual. A lot of technologies are being developed that want to know what you look at, what you like, what you love, what catches your attention, because they want to give you more of that; they want more of your gaze. To me, that's a whole other category of data, right? Because, to me, that's personal data in some way, but you don't even know that it's being gathered, and you don't know what is being calculated about you as an individual as a result of your actions, activities, and the things that you look at. What are your thoughts?
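The "what you look at longest" idea above is mechanically simple: sum the time a user lingers on each item and rank by total attention. The view log and item names below are hypothetical, purely to illustrate that no click, purchase, or typed word is needed; dwell time alone produces the ranking.

```python
# Hypothetical view log: (item, seconds the user lingered on it).
views = [
    ("running shoes", 2.1),
    ("cameras", 45.0),
    ("running shoes", 3.4),
    ("cameras", 38.2),
    ("headphones", 5.0),
]

# Aggregate total dwell time per item.
dwell = {}
for item, seconds in views:
    dwell[item] = dwell.get(item, 0.0) + seconds

# Rank by total attention: what to show more of.
ranked = sorted(dwell, key=dwell.get, reverse=True)
print(ranked[0])  # cameras
```

The user never said they wanted a camera; the inference is calculated entirely from where their gaze rested, which is the category of unnoticed personal data being discussed here.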
Dr. Mechie Nkengla 26:49
I don't think there's anything for me to add; you said it all. It really is that nuanced in terms of how detailed they can get, and that's the name of the game. I myself build products of this sort, and I am well aware of the privacy issues, right? And I feel the alarm is not sounded strongly enough in terms of people understanding what they're really giving away. Because the name of the game is having users spend more time on your website, on your app, on your solution, even if they're not buying; you just want people to spend more time on your website, for several reasons. As a marketer, if they're on your website, they're not on your competitors'. If they're on your website, you're learning about them and hopefully can serve them better, which means they may buy in the future; you can build brand loyalty. There's a host of reasons. So how do you keep someone on your website or your app longer? You want to understand how they navigate and cater to the way they navigate, which is all the meta-meta monitoring of the data that you just gave an example of.
Debbie Reynolds 28:02
Yeah. And I think, to me, that kind of monitoring, and trying to tempt you with things that they think you want to see, a lot of times creates these fringes, right, where things are at one extreme or the other. I don't think anyone would be interested in me going to Costco; that's one of the things that I do. But you see people online being attracted to more extreme things, and even though those things may be bad, these companies like that, because if you spend your time looking at that, it tells them that maybe there's a product they could sell you. They want your attention. So for the time that they have your attention, they want to know what they can show you. So it is quite concerning.
Dr. Mechie Nkengla 29:00
Yeah. And also, I'll add this: I am a data strategist, so what I do is work with organizations and educate leaders to build these sorts of products. So it's not always nefarious; it's really about serving people. But you as a consumer, you as a person, should be aware of what it is that you're doing by engaging with a product. So you really need to be.
Debbie Reynolds 29:25
Yeah, I agree with that. I think education is really key. And the thing that is so hard with stuff like this is that you get the benefit almost immediately, right? You download an app because you want to do something right now, which is great, and then there can be a harm that is in the future. The harm is not immediate sometimes, so because the harm can be in the future, some people just click through and don't really think about it. And I think that's disturbing, and I agree that we should have more education in the area.
Dr. Mechie Nkengla 30:06
But then let me ask you this as a follow-up: who is responsible for instigating or pushing this discussion, right, this dissemination of knowledge about the importance of the privacy and ethics of all the apps that we're using? Who is ultimately responsible? Is it the government? Is it individuals? Is it the people that create the apps?
Debbie Reynolds 30:27
I don't know. The only answer I can give is that there is a responsibility; everybody has some part of the responsibility, right? But the problem is, when it's everybody's problem, nobody wants to step up and own it, right? So no one owns this problem altogether; there are just fragments of the problem. And then you have people like us who want to say, hey, don't touch that stove, it's hot, you know, watch out, don't do this. And I'm hoping that that advocacy will help people, companies, and individuals make better decisions. But there has to be shared ownership and shared responsibility, and more collaboration: more collaboration with businesses, more collaboration with individuals to understand what they want, people who are pushing for things like regulation. People like me, I'm more on the proactive side, as are you, since you're building products and such, because I feel like that's where I can have the most impact, instead of firefighting, right? So this bad thing happened, and now we'll see what we're going to do; whereas I think AI can damage people in ways that can't be resolved reactively. So I think it's really important that we try to get this right, and I love that you're in this space and talking about these issues.
Dr. Mechie Nkengla 32:04
Oh, yeah, absolutely. This all has not been figured out, right? There are issues out there, certainly. And there are some great people and great organizations doing great work in this area, including you, trying to advocate with the podcast and teach people all about this heavy, critical thing that's really in everybody's life. But I feel like more still needs to happen; however that more is packaged, who knows, but there's still something more that needs to happen. I feel like the alarm is not loud enough; we need to step up the volume in some fashion. Maybe it will be some incident that wakes everybody up, like, no, we need to come up with some regulations, or some sort of statutes, in terms of the people building this having some sort of license and schooling, whatever that may be. Look, for example, at what's going on in Europe, the way they're approaching it; I feel like they're taking it a bit more seriously than in America.
Debbie Reynolds 33:11
Yeah. Even if not everything is 100% successful, I feel like they're really going out there, and, you know, they're really swinging for the fences to try to say, hey, this is a problem, we need to do something about it. And somebody needs to take the reins and have the conversation. I would love for you to tell me a little bit about your DataYap conference, kind of why you created it, and what the purpose or goal of that conference is.
Dr. Mechie Nkengla 33:45
Thank you. So DataYap is really a knowledge platform for all things data; yes, it is what it sounds like. Really, our idea is to build an online community for professionals: practitioners, enthusiasts, students, business executives, and leaders, to talk about all things data. One of the key things you find if you're in the data space is that if you're looking for some resource, content, or a question, the first thing you do is pull up Google, type it in, and look through the top five results. There are podcasts, there's all kinds of content around data everywhere, groups everywhere; people post resources everywhere. But it's not curated in any fashion. There's no guidance in terms of what's best and what's not, what the comments are, what's voted up. So DataYap really is this platform with the vision of curating all this content that already exists. We're not creating new content; we're curating the content that's already out there, with people giving their opinions on it. Say you're an executive looking for some information around data strategy, and you want to know which analyst papers are really worth seeking out: go there, see other people's comments, and sort of gauge what works for you. Or if you're looking for events around data for professionals in the particular city you happen to be in on a particular date, you can scroll through; we're basically pulling together all these resources that already occur naturally. And to soft launch this platform, we had this conference to sort of push into that, right? So we had a DataYap conference where we brought together professors that run educational programs around teaching data; we had professionals like you talking about privacy; we had people from government talking about government data; we had industry leaders. It was a fantastic, fantastic conference.
Debbie Reynolds 35:54
Yeah, it was wonderful. I was really excited to participate, and I actually enjoy talking at conferences that aren't about privacy. I think that's fun for me, because I get to be with a whole community of people; data people are my people. So data, you know, I'm down. Definitely call me up for stuff like that. So if it were the world according to Dr. Mechie, what would be your wish for privacy, whether it be technology or regulation, anywhere in the world?
Dr. Mechie Nkengla 36:33
I would wish that people had the power to dictate what they want kept private or not, so you're not forced to sign away your privacy rights in a license and never use the application. That would be my wish: two things, to understand exactly, in clear layman's terms, what privacy I was giving away when using something, and to make the decision of whether I wanted to give that away without losing my right to actually use the product.
Debbie Reynolds 37:05
That was very succinct. I like that; that's really smart. I agree. And then, I don't know, for me, I want the right not to share, right? So I don't want to have to choose; it's like, okay, you're at this junction, you have to make this choice right this second. I want to not have to say anything. I want to just be aloof, I don't know. Maybe I'll have that in the future. I have no idea. But, well, this was a fantastic session. Thank you so much for sharing your knowledge and your expertise. I would love for us to collaborate more in the future.
Dr. Mechie Nkengla 37:53
Absolutely, I agree. Absolutely, my pleasure. Thanks very much for having me. I'm sure we will be collaborating, and I look forward to it.
Debbie Reynolds 38:02
Absolutely. Absolutely. Well, thank you so much, and we'll talk soon. Bye bye.