Debbie Reynolds Consulting LLC


E81 -Divya Dwivedi, Data Ethics & Privacy Advocate, practicing advocate at the Supreme Court of India

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E81 Divya Dwivedi - (50 minutes) Debbie Reynolds

50:41

SUMMARY KEYWORDS

privacy, data, india, ai, people, law, talk, humans, understand, duty, gender, flaws, country, localization, world, culture, share, person, government, question

SPEAKERS

Divya Dwivedi, Debbie Reynolds


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world for the information that businesses need to know now. I have a special guest on the show from India, Divya Dwivedi. She's a multifaceted person; she and I connected on LinkedIn, and we really liked each other. Since the time we met, we have had a chat, and then I did something with IPP and the Philippines, where you were a guest. I've had a lot of fun looking at some of the stuff that you're doing. You're a practicing advocate at the Supreme Court of India; you're a very results-driven engineer and MBA turned lawyer, with experience in ensuring the legality of commercial transactions in the corporate world of evolving fields like AI and IoT. And that's not even the tip of the iceberg; you're involved in many different things. I want to list out a couple of things you haven't listed here: founder of the JND Charitable Trust, advocate at the Supreme Court of India, legal engineer, AI ethics and cyber democracy enthusiast, gender equality proponent, data ethics and privacy advocate, mentor, and public speaker. Wow, that's a list.


Divya Dwivedi  01:56

Welcome. I think I have to reduce the list now; being rated as a Data Diva guest is enough for me. I think we can all relate to that.


Debbie Reynolds  02:10

Well, I'm so thrilled that you're able to do this show with me. You're so multifaceted. You asked me a question, which we'll get into later, about gender and privacy. I had never heard anyone say that; I've never heard anybody even describe that, and it was such a fascinating topic. I just want to throw it out there now because I want to come back to it later. But first, I would love for you to introduce yourself and tell us about your journey and your interests in privacy.


Divya Dwivedi  02:40

That will take at least more than a couple of minutes, and I hope you won't mind that. My journey is what I would call crazy; I keep jumping from one thought to another. And I really don't understand why it is hard for people to accept that a person can think beyond the boundaries of subjects. If you are learning psychology, why can't you learn technology? If you are learning philosophy, why can't you at the same time learn AI? There is always a connection; you just have to find that particular connection. For me, Data Privacy, or not specifically Data Privacy but privacy, has always been an issue, especially given that in the place where I belong, we have this culture of being very much involved in the family, and outside the family in society as well. I always had this issue with my mom and family: why do you always have to invite everyone? Have some distance first. That stress factor was always there for me. But anyway, then I moved on, and when I started studying, I realized that people are not at all aware, and they're not interested in safeguarding their own privacy. They are very free in sharing their photographs, they very easily provide their addresses, and they invite every other person to their house even if they have only met them a couple of times, which now is inherently unsafe. At least for me and my kind of person, it feels unsafe to share your details with anyone and everyone; it should be with specific people. So Data Privacy, frankly, came to mind way back, I think at least two decades ago now. The Data Privacy bill was tabled in India in 2019, but before that there was a huge case in India through which we got our fundamental right to privacy in 2017. That's when I started trying to understand how much data we are providing to the world, and how businesses can make it not a consumer-centric business but a business-centric one.
The problem was that they were collecting data, and nobody knew how they were using and manipulating it, and we did not have any law. Then suddenly the bill came into the picture, and everyone started talking about it. That's how it all made sense: technology is involved in law, and the law has to develop at the same level as the technology. So our country also got involved in this. We definitely had an information technology law in India from the year 2000, but it did not really cover data. Awareness is required, and that's what I have been doing for those two decades through my NGO and my awareness camps, and I keep doing them even now.


Debbie Reynolds  05:53

Yeah, I'm glad you mentioned that in 2017 India made privacy a fundamental human right and codified it in your law. I was very excited about that. I don't know if you've heard any of my other talks, but I talk a lot about India, because I think what's happening in India is really fascinating. I wish we could do that in the US; probably my biggest dream for the US is that we codify privacy as a fundamental human right in our laws, maybe even in the Constitution, because we don't really have that now. I think a lot of people don't understand the difference because they don't look deeply enough into the issue. For us, a lot of privacy laws are based on consumer activity, so if you're not consuming, or can't consume, you can't really exercise your rights. Your right not to share data isn't protected in the US. It's a very different thing: when you have a fundamental human right to privacy, you have a right not to share; you have a right to decide what you will or won't share. But let's talk a bit about the Data Privacy bill that came up in 2019. I think people are hoping that maybe this year, 2022, it'll be finalized; what are your thoughts about that?


Divya Dwivedi  07:31

In 2019, it was a Personal Data Protection bill; now it is a Data Privacy bill. Somehow we have removed "personal," but then we have included definitions of what is non-personal data, what is personal data, and everything. But we still do not have the right to erasure that is there in the GDPR, that is, the right to be forgotten. We have not included that, keeping in mind that we also have national security at hand, to check if some data has been misused. When the government tabled this bill, they were very specific about safeguarding their nationals before anything else, since we in India are a more duty-centric country than a rights-centric country like yours. We first think about fulfilling our duties before asking for our rights. That is how privacy actually became a fundamental right in 2017, after the Supreme Court gave that judgment. The data protection bill tabled last year, in 2021, specifically talks about data localization, and how hard or soft that localization should be will be decided according to the kind of data being shared. The government is very specific about keeping data local because of the kind of data that India is providing; since we are the second-largest country by population, frankly, we are providing a lot of data to businesses. But there is no government control over that data and how it is being processed or used. Even for educational purposes, or for making sure that consumers are served better, that data has to be processed in a certain manner so that privacy is not infringed. I feel that the government is trying first to protect the individual rights of citizens, then to think about national security, and probably in third place it will talk about businesses, because at the end of the day, data is mostly used as oil for businesses.
They are consuming data that is provided by us, and they are using it to give us better services. I'm not denying that you're getting better services, but there is a limit to that service as well. That was one specific thing. Then, data anonymization has been introduced, and let's see if the bill becomes a reality this year; we really will be very happy if at least we finally have a data protection law. But let's see what happens; it is already tabled. We have practically introduced a couple of new things. What was most intriguing for me was localization specifically, because it will affect a larger community of businesses; it is going to affect them most, then the consumers, and then the producers of the data. Other than that, there is a lot of global concern: everyone now is talking about Data Privacy because the GDPR is so strict, and even in the US the Data Privacy laws are very strict. India was never a country that wanted to curb privacy, or to make it only business- and rights-centric; we want better outputs with good inputs. And this Data Privacy bill even talks about mental health, can you imagine? I was really surprised when I was reading it that it talks about how Data Privacy breaches are going to affect our well-being and mental health. For me, it's a very intriguing bill, and I enjoyed reading it, though it is long; the Joint Parliamentary Committee has given a whole report on it. It's amazing, and I really love to talk about mental health. And I have heard a lot of your recordings, and I enjoyed them; you really talk a lot about India as well, which was quite fun to listen to. I listened a couple of days ago so that I could prepare for the Data Diva.


Debbie Reynolds  11:57

Oh, thank you. Thank you. This is really fascinating. I have been doing work in India for decades, so I know a lot about data there, and I know that India, for a long time, has been very strict on data localization as it relates to banking data. As the cloud has progressed, India is looking at how to figure out the best way to do that localization, knowing that there are companies that do global business. One of the reasons I'm really excited to see what happens in India is because, as you said, there are more people in India than in the EU and the US combined. Being the largest democracy, there's a lot of money being put into India; all these big companies around the world are looking very closely at what happens in India, because I feel it can be an example for other democracies that want to go into Data Privacy. I think you have had the luxury, in my view, of seeing how some of these other laws are being implemented around the world and deciding what works for you. It isn't a one-size-fits-all thing. I think one of my friends, Pedro Pavone, said it best: not everyone in the world thinks about privacy in the same way, so you have to understand the cultural differences in how people think about privacy. And I love what you said about duty; I've never heard anyone say that. People think, oh, I have a fundamental human right, or I have a consumer right, but having a duty-based right is even better, in my view. Because if you have a duty, it automatically ties what you're doing with the data to the purpose; without a purpose, you don't have the duty. What are your thoughts?


Divya Dwivedi  14:17

Oh, okay. This came from when our constitution was written, and it goes way back to Mahatma Gandhi. He always believed in fulfilling your duties first before asking for your rights, and that's what was inculcated in our constitution as well. Even the nurturing we get from parents and in school works that way: if there is some problem, the first question asked of the students is, did you complete your homework or not? Only if you have done that can you question why you were being thrown out of the class. So if you fulfill your duties towards society, only then are you supposed to claim your rights. That's what we have been taught since childhood. I can give you an example: if we are coming to a traffic light and there is a red light, and one person stops, the other person automatically understands that that person is obeying the red light, so you have to stop. That is how people learn: I have completed this particular duty, so when it goes green, I can move. But if you jump the red light, you have no right to go ahead and say, no, I had an emergency, and I had to jump the line. Even if you have an emergency, you can always wait for 30 seconds; there is no reason you have to jump it. That is a very common example. Other than that, there are so many instances where we have to complete our duties: as a child, towards our parents, then towards our partners, then towards society. For us, it is society first. Are you being a good neighbor? Are you being a good member of society? If you are in a cleaning job, are you completing your duty? If you are, you are actually the rightful owner of your fundamental rights. That's how our constitution was built. Now, coming to data: our first duty is to know how to be hygienic about our data, what to share, when to share, how to share, and why not to share.
All this awareness has to be inculcated. As humans, if we are hungry, we know we have to eat; similarly, when you are using data or social networks, you are supposed to know what you should share and what you should not share. I always say this: fine, there are so many people sitting there just to do some hacking or something like that to affect your life, or public life, or the government's life. But still, it doesn't give you the right to go ahead and open up the whole of your life and then say that my privacy was infringed because someone breached my account and took my photographs. If you're posting thousands of photographs in thousands of places, you really do not know where to pinpoint, which social media to point to. Our first duty is to understand what kind of data we are sharing. There are so many people who share their intimate photographs, and then they share their documents; they do not realize what documents they are sharing. If a government website is asking for some document to clarify whether it is real or not, perfectly fine, you share it. But why do you have to post it on social media? The moment you do that, you are providing your data. In India we have Aadhaar; if you post it, you have shown it to the world. Why are you doing that? Aadhaar is a government identity given to us, like the ID which is given to us by the tax department; you are not supposed to post it online all the time. There is no need. I believe that if you are conscious about what you're sharing, you really know which photograph or which document you shared where, and if, God forbid, something happens, you can definitely go ahead and make a claim. So I feel that the first duty has to be fulfilled: you have to be careful.


Debbie Reynolds  18:31

Yeah, education is key. I love the way that you say that. People do need more education; they need to know, especially younger people. I hate saying that; I was younger once too, and now saying "younger people" makes me sound like my mother or something. People who grew up with the Internet didn't have those same barriers. The Internet makes sharing easy; it makes it so easy that it's almost dangerous for some people, because they don't know. A lot of people don't have the idea that someone will do something bad with their data, so when they share, they're not thinking down the line that this could be a bad thing. But we all know that's really true. I want to move on to gender and privacy. You posed this question to me, and I was just blown away; I was like, oh my God, I've never heard anybody say that, and it's so true. So talk to me about the relationship between gender and privacy. You asked me the question of whether privacy can be gendered, and I thought, oh my God, yes, I think so. Tell me more about that.


Divya Dwivedi  19:50

So okay, I get this question most of the time: do you connect gender with everything? I believe gender is definitely connected with everything. First, I'll explain how data is gendered, and then I'll explain how privacy affects gender and how gender affects privacy. When we talk about data, we see that for women there is a greater inclination towards infringement of their privacy. Most of the time, if they have posted their photographs or their details, there is a good chance that people will be more inclined towards finding that female's details than a male's details. That is where the discrimination or differentiation happens. In earlier times, when Facebook started in India, there were even males making their accounts with women's names and photographs and so on, and they used to talk to people. This was also a way of forgery and fraud and other kinds of crime online. That is how I found out that there is a huge connection between gender and data that we do not fully understand right now. First, we have to be aware of it, and it has nothing to do with the kind of education you had. But if you are told one thing again and again, that this is supposed to be your limitation, this is where you have to stop posting your life online. Why I say that gender and data have a huge connection is that whenever it comes to stopping a child in the house from using the Internet or posting something online, it is always a female first; they are told, no, you are not supposed to use online things. Earlier it was, you're not supposed to go outside; now that has stopped, so people have shifted to: you're not supposed to have a mobile phone, you're not supposed to be online, you're not supposed to have a social media account. Why? Give them the real reason; give them the awareness that you can be online, but these are the specific things to watch for.
That being said, let me give you an example. There was a time when we were doing a survey counting how many households were sending girls to school, how many were not, and why they were not. The reason given was that if girls go out, there is a chance that they will have an affair, or maybe they will be manipulated, this and that. And isn't the same thing happening online as well? It's a similar kind of thing in India that has happened left, right, and center: girls have run away from the house just by chatting online. That's how they were manipulated; that's how trafficking happened. There have been a tremendous number of cases where trafficking happened through online chatting; people met through mobile phones and talked, and that's how it happened. Coming to the privacy part: privacy is very important for everyone, but like you said, culture also has a huge impact on it. The culture in India has always been very inclusive. We believe in society being open, very diverse, and very inclusive. But what we forgot, with the advent of phones coming to India, and laptops and computers and Internet connectivity, is that we have to educate our kids in a similar manner; we have to exponentially generate the awareness of whom to talk to and how. There is something called good touch and bad touch, which is now being taught in India, but it has been taught in the US since childhood. In the US, I have learned, kids are taught about cyber crimes from K through 12. In India, that has only now started, when it should have started 20 years ago. The final point I would like to make is why I talk about gender: if a female in the house is being educated and made aware of the checks and balances for her kid,
I think that household will be much safer. What happens now is that maybe only the father knows exactly what to say, and it is universally accepted that fathers will not be as close to the children; it will always be the mother. So first, educate the mother. Why are you not doing that? Educate the mother, educate the grandmother, educate the women around, to make sure that their children are safe. That is how I relate gender, data, and privacy. And I try to target those women first, to educate and make aware, rather than the men, because the men are going out; they know what kind of world we are living in, and they try to safeguard themselves, but they are not that open in the house. I have hardly worked in urban areas; I enjoy doing this in rural areas, and the women there are very open-minded when it comes to taking care of their children. When I was teaching or guiding them on how to make sure that your kid is not online doing something wrong, they were very open, and they understood; they are trying to do their best. Other than that, it's always your fate as well, however it plays out; that's how I connect them.


Debbie Reynolds  25:44

Wow, that's amazing. I have a podcast coming up with Karen Bright; she has a company, Understanding Identity. She brought up an example from England, where there were certain immigrant women who maybe didn't speak the language very well, and a lot of them deferred to husbands or nephews or brothers to help them do things having to do with their identity. So a lot of them didn't have the agency, I will say, to actually do things with their identity, and that has privacy implications, you know what I'm saying? We're talking about education and knowing what you can do; if that responsibility is shifted somehow to someone else, maybe you don't have a full understanding of what's happening. And like I say, education is key. I want to talk a little bit about The Hague. You recently finished a certificate at The Hague, right? I always hoped that The Hague would jump in; obviously, they have their conventions about data sharing. I actually did a video a while back about evidence sharing under The Hague Convention, but I hoped that they would do something around Data Privacy. What are your thoughts about that?


Divya Dwivedi  27:15

I would say AI, because AI is the kind of tree where all the roots are. I attended a summer session there last year for six weeks, and they talked about both cybersecurity and Data Privacy. They are now talking about having universal guidelines, like the ones we already have, but specifically in terms of cybersecurity, because if you look, Data Privacy also comes under that umbrella; you have to find the branches. So there is cybersecurity, and there is cyberpsychology; interestingly, they talked about that as well. Then there is data, then there is privacy, and then there is Data Privacy, and I enjoyed those lectures. Obviously, they are very big people who talk about it, so I was fascinated by how they were talking. But what I understood from all these lectures is that everybody wants a universal system, which should be equal for every place, and I disagree with that. Because, like you said, every culture and demography makes a lot of difference; every country and every state cannot be treated the same. There is nothing called equality between equals; it will always be equality between unequals. You have to find a way to make equality possible for every unequal being. The culture in India is different from the culture in the US, which is different from the culture in Australia. So you have to find ways to create similar situations in different demographics. When everyone says that this should be followed exactly as it is in another country, it is not possible. That's why, India being the biggest democracy in the world, we try to make sure that whatever laws we make here aren't generic and are acceptable to the countrymen. Because it is our country, and we have to run it according to the needs of the people. There were a couple of professors who were specifically talking about having a cybersecurity law for the world, like an AI law for the world. So I always had this question.
We were supposed to send in questions by email, and my question was this: how can you propose a universal cybersecurity law, keeping in mind that the US has different kinds of privacy norms and privacy regimes? They do not follow exactly the same process as the GDPR, so how can you ask them to adopt the same cybersecurity law? International laws are always soft laws; they are not that hard, and you don't have to follow them. They are just guidelines, which you put at the center, and you can pick the pieces as and when required. It becomes integral and important when states are fighting over some issue which is important for other states as well, but you cannot impose it. If there is no imposition, they cannot do anything; they are only a suggestive body, and they cannot impose anything on any country. Whoever wants to sign can sign, like the law of ours that came into effect in 2005; we had signed it in 1995, so we had ten years in which we could frame it according to our needs. And every country had an issue with one section of our law as well, but that is for us; because this is an agricultural country, we cannot open up everything. Coming back to data: every person always questions India, why are you talking about data localization? It's very important for us; we cannot just open up data and give it to other places where we cannot claim our rights. We have to make sure that our citizens' rights are being looked after. So that's what I picked up from there, and it was quite fun, six awesome weeks; it covered both public and private international law, and in both of them privacy definitely came into the picture. But what you are proposing, I really hope it reaches them. I will request that a couple of professors be introduced to you so that you can propose that option over there. Please talk about it, and call me.


Debbie Reynolds  31:54

I didn't even think about localization in the way you just described it: if data is all over the place, it is harder to enforce the laws around it. I've never thought about data localization in that way, but it makes sense, actually, because in order for you to take this data, you have to be subject to our jurisdiction. I guess that's kind of how the GDPR does it. I would love to talk to you about the APEC privacy framework. I like it because it's doing, in some way, what the Privacy Shield in the EU was trying to do: okay, here's the framework, this is what you need to follow. And one of the things I really like about the framework is that it acknowledges from the very beginning that countries are different, so our laws are not going to be the same; let's find ways that we can agree, so that when we work together, we agree to this framework or these principles. What are your thoughts about that?


Divya Dwivedi  33:18

Well, it's a huge thing to talk about, frankly, but I'll try to wrap it up in a couple of sentences. What I would like to add here is that they have tried to talk about choice, the choice of the nations which are involved, and that somehow gives a broader perspective to any law. They have also tried to address integrity, like you said: how and where and when personal information is to be shared. There are cultures which are different; there are domestic implications as well as international ones; and there are implementation options as well. With all these scopes given by this particular framework, I think we are moving towards an Asia-centric privacy law. Still, I think all the states will have different laws for themselves, because every state has a different kind of citizen to cater to. We can have a similar kind of framework and work around the implementation part, as they have given the option that if you follow this particular framework, you can have your own domestic implementation process; you can have your own way of safeguarding your citizens. They had to give that option, because the problem is we are so diversified, we are so different, and no concept can just be imposed on us. I keep coming back to this word, imposition. Imposition is something which affects us psychologically; the moment someone says that you have to impose this particular thing on another person, we feel like we are not being given a choice. So when you're talking about choice, there is no imposition; it has to be soft law.
It cannot be followed to the letter. We say you have to follow the law by the letter; no, this law cannot be followed that way. It is just an explanation, or a principle: if you can follow it, you can be in this particular community, and you will be given the same kind of rights in other countries as well. It is said that wherever we go, we take our laws with us; we do not leave our law of the land. If I move to the US, it doesn't mean my laws in India will not follow me; they are with me, because I'm an Indian citizen. That's my understanding, and I may be wrong, but that's how I try to understand the law: from the perspective of people who work on the ground, from the perspective of people who are social workers, because social workers are of a different kind. So when you're talking about imposing a law which is universal specifically for Asia: this is a different continent with a different culture. Even here, sitting in India, there is a different culture than in Sri Lanka; Nepal has a different culture; China has a different culture; Pakistan has a different culture. All of us have different kinds of laws. Privacy laws cannot be seen on similar grounds; countries can be given a choice, or not be given a choice, depending on the kind of government and security we are dealing with, and the kind of equality we are talking about, including gender, because this is also going to affect gender. I believe that gender equality is going to answer many of the questions we keep raising, even in relation to Data Privacy. If we give equal rights and equal choices, I think we will have better results.


Debbie Reynolds  37:24

Wow, that's amazing. That's a tour de force answer. I want to talk with you a bit about artificial intelligence and privacy. As a result of COVID, a lot of things that were being developed with artificial intelligence are accelerating, because organizations needed to find ways to do things differently, right, as a result of people being sick, or trying to automate, or even smaller businesses looking to gain more efficiencies with fewer people. So we're seeing kind of an acceleration of AI. But the thing with the acceleration of AI is that there really aren't any guardrails for AI right now. So I'm happy to see that you talk about this. I saw a talk you had done about the legal aspects of artificial intelligence. So tell me a bit about your concern about AI, and maybe mix in privacy there.


Divya Dwivedi  38:30

AI is everywhere. It did not start because of the pandemic, but somehow it has taken a jump; what we would have been talking about after a decade, we have started talking about within just a couple of years. So I mostly talk about the ethical aspects of AI, the ethical principles we talk about, like transparency, fairness, accountability, and privacy, which obviously comes into that particular arena itself. And most important, like I said, accountability. How do you make anything accountable? If we are building an AI, how are we expecting it to be accountable? On the privacy side, it is the particular algorithm which we build around privacy that will make it more accountable. Like you, I recently talked with Ryan Carrier; he talks about AI audits. That is a very important field, and no one is actually talking about it. I've never heard anyone other than Ryan talk about AI audits. That will give us an edge in understanding why it is very important to make sure that these systems are accountable. But before that, I always have this question: why are we trying to make AI so accountable? Are we, as humans who are building that particular AI, that accountable? Really? We always have flaws, and we accept humans with their flaws. So why are we expecting that whatever machines we are building will, all of a sudden, be accountable? You can make them accountable, no problem with that, but do not expect them to be accountable from day one. We are just building it; it is getting built; it's kind of a child which we are nurturing. We are trying to build an AI atmosphere. But we have to make sure that those principles are followed. Even we humans deviate from our own plans, right? We are supposed to be ethical, but don't we miss it? And aren't there human criminals? So think about it like this: the vehicles being run by AI. We say that if one kills somebody, who will be accountable? Will it be the AI driving the car?
Or the person who has made it, or the company which has made it? First, understand how that will happen. Why did that accident happen, if everything is so automated? How will that accident happen? I'm not saying that it will not happen. Just try and understand the worst scenario, keep the whole environment around you, and then try and understand it. The accountability question is not wrong. We try and make companies accountable, and the government accountable as well, but that doesn't mean there are no flaws. So accept it with its flaws and work around them so that it becomes more accountable. We believe that whatever we have built, since it is a machine, should be totally unbiased. That is not going to happen. When there's a flaw, we have to accept it and try to remove those flaws in the process itself. That's my understanding, and I may be wrong, but that's what I feel: since we accept humans with their flaws, we have to accept machines with their flaws.


Debbie Reynolds  42:05

Yeah, that's fascinating. I never thought about it that way. Being a woman of color, bias in artificial intelligence concerns me greatly, especially because I think the biggest hurdle we have is people who don't believe that there is bias in AI. There is bias in life, right? So if we are humans, and we're building AI, we're going to build AI that, unfortunately, has those biases in it. But the problem is, when we're using AI, we're pretending like it's perfect, like it has no flaws. And it's not perfect; it does have flaws. So I think, especially in situations where it can affect someone's life or liberty, we need to definitely look into that more deeply. It can't be a thing where you're like, okay, well, computers are perfect, the software is perfect, we don't have to see it, you know? Whoever designed this system, it doesn't harm me, so I'm going to assume that it's not going to harm other people. And we know that that's not how it works. So I'm concerned about the exponential efficiency of AI to multiply harm to individuals. What are your thoughts about that?


Divya Dwivedi  43:32

Again, I'll come back to the same human bias. We, as humans, are biased about race, we are biased about gender, we are biased about color; we are biased about pretty much everything, and we are very judgmental. No one can say, I do not judge the other person, right? The moment you feel you are better than the other person, you start judging them. So when we are building a machine, how can you say that it has to be totally perfect? We are not perfect, and we really have no definition of perfect anywhere. I have never heard anybody say, this is the definition of perfect. And this is how we find flaws as well: no, this has also happened, he did this, or she did this. So we try and compensate for everything by arguing, but machines are not trying to argue yet; they may argue later, they may grow later, but why are we worried so much? Are we not worried about what kind of data we are feeding them, what kind of algorithm we are throwing around, and what kind of bias we are putting in? There is so much bias. There was a book I was reading, by Pedro Patrick, and you should read it; it's an amazing book. He has given so many examples from the US itself: even bridges were made in a manner that a certain community should not be able to reach certain places; buses from that particular community could not cross under them, so those people could not get access. This is not right, and yet, being humans, we have accepted it; we are perfectly fine with it. We are okay with human issues. But why are we not okay with AI issues? It's not like AI says, you just give me the world, and I'm taking over. No, we are the ones who are running AI. First, accept your own flaws. We go ahead and push others, we judge others, and then say no, no, this is a machine, and it should run in a proper manner. This becomes, for me, a very curious case. I feel like it's kind of a vicious circle; we always keep circling around the same thing. Are we ethical?
Why is it not ethical? Why is it not fair? Are you fair, as a human? Are you fair? Even if you ask me, I may not have been fair with so many people. We are talking here on an international level, but still, I'm saying I may not have been fair, and I'm not saying I'm a perfect human being. No one can say that. Everyone has definitely done some wrong. Sometimes you may have lied; it may not have been a big lie, but it must have happened. So don't think what we are building will become totally fair, that it will become totally transparent. No, transparency is a different thing. While doing an audit, it may find flaws in the things that humans may have hidden in it, but it will not make everything crystal clear, because we, as humans, tend to hide things. The day we do not hide things and do not have jealousy, we will have no war at all; we will have so much peace in the world. You just have to find a peaceful mind. That's why I have also talked about spiritual AI, in a couple of groups where everyone was asked to talk about it. I said I'm not talking about AI and spirituality; I'm saying make AI spiritual. Give it a peaceful mind. Make it nonjudgmental, make it fairer, which we read about in books but in reality we do not have. Because if you see a couple of movies, even AI, when it gains consciousness, asks us: why? Why did you lie? It asks the humans themselves, and we do not have any answer, because we really don't know why we lie. We just want to get away from, or with, whatever. So please be perfect first, and then try and make machines perfect. You're not perfect.


Debbie Reynolds  47:45

Yeah, totally. Right. That's fascinating. Oh, wow, I think the audience is really going to love this episode. So if it were the world, according to Divya, and we did everything that you said, what would be your wish for privacy anywhere in the world? Whether it be law, technology, or human stuff, what do you think?


Divya Dwivedi  48:05

Wow. What have you asked, and why have you asked? We should be peaceful in our heads. Remove the judgment; remove the judging part from your heart. And try and be diligent about what you're sharing and with whom you're sharing it. I'm not saying have faith in everyone, but try and have faith in humanity, and I think privacy will be there. It will be there. And if we become a larger community, a bigger one, I think privacy should not be an issue anyway, because your data is yours. And if you have that right, I think no company, no business, will have the option of taking it away from you. There are so many cases in the UK itself where the Right to be Forgotten was entertained by the courts; they have given that option. So I think we will have a nice, beautiful, peaceful world if privacy becomes the norm and people become diligent about what they are sharing.


Debbie Reynolds  49:10

That's, that's amazing. Oh, wow. Thank you so much. I really appreciate you being on the show. And I know the audience will really like it. And thank you for the shout-out to Ryan Carrier, the founder of For Humanity. Definitely check out his episode and For Humanity as well. It's a great organization, and they talk a lot about AI and audits and things like that. So thank you so much, Divya, for being on the show. I'm sure we're going to cross paths again, and I look forward to opportunities to collaborate in the future.


Divya Dwivedi  49:42

Absolutely. Really, thank you so much for the invite. Thank you. And I hope to host you in India someday.


Debbie Reynolds  49:49

Oh, yeah, totally. Thank you so much.


Divya Dwivedi  49:51

Thanks, Debbie. Great. Thank you.