Debbie Reynolds Consulting LLC

E152 - David Hendrawirawan, Owner of Code Ninjas

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here

The Data Diva E152 - David Hendrawirawan and Debbie Reynolds (43 minutes) Debbie Reynolds

43:38

SUMMARY KEYWORDS

data, technology, privacy, companies, ai, education, people, schools, cybersecurity, coding, society, area, curriculum, students, scraping, teach, consent, work, industry, subject

SPEAKERS

Debbie Reynolds, David Hendrawirawan

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. Our special guest on the show today is David Hendrawirawan. He is the owner of Code Ninjas. He's also an adjunct professor at the University of Texas at Austin. Welcome.

David Hendrawirawan  00:49

Thank you very much, Debbie.

Debbie Reynolds  00:50

Yeah, those two things don't really encompass all that you are and all that you do. You and I met and connected on LinkedIn; you contacted me because you're teaching courses on technology, especially emerging technologies, and you asked me to come in and speak to your class. And I was really happy to do it. I think it's really cool because you have a mix of people: practitioners already working in technology, cyber, and privacy fields, and then also people who are just starting out. So I thought the session we collaborated on was very rigorous for those people, and for me, it's really cool to see what people are thinking about. And your students had really great questions as well. So why don't you introduce yourself? Tell me how you came to be the owner of Code Ninjas and about your foray now into education. Also, I want to mention, and I'm sure you're going to talk a little bit about this, you and I have had some chats, and you even invited me to a meeting with a lot of industry folks related to the work that you do in child privacy, like K through 12. So give me your background. How did you get here?

David Hendrawirawan  02:05

Yes, thank you, Debbie. So I have been practicing as a consultant with Deloitte, and I've also worked in Enterprise Architecture for Dell previously. I've always been in and around data, and that has been the theme of my career, whether it's cybersecurity, privacy, or analytics enablement. I connect the dots between the defensive and offensive sides of data and technology to optimize solutions for clients in heavily regulated industries and data-intensive fields such as financial services, healthcare, life sciences, and technology, and now everybody's pretty much in that bucket. After 20-plus years in the industry, throughout that time, I've also been actively involved in education: doing guest lectures at the university, sponsoring student projects, and at the children's level, I've been a Sunday school teacher and ministry director. All of that prepared me for when COVID happened. In the aftermath of COVID, we see that our educational system is very much vulnerable to disruption, especially technology disruption. So that motivated me, in the past two years, to make an intentional effort to put my focus more on education. I've gotten the opportunity to increase my teaching load at the University of Texas at Austin on subjects that I see are very much in a whitespace area, things like data governance, AI governance, privacy-preserving analytics, and advocacy. As you mentioned, I got the opportunity to acquire a coding school for kids, for K-12 children. And that put me in touch with local schools and with organizations trying to promote this. As I've gotten into this and learned to work even closer with them, I've been exposed to a lot of conversations with people who are in a position to try to make a difference. And I see that it is definitely overwhelming in terms of how we can provide our future generation with literacy about data and about privacy, the things that are seemingly commonsensical, and they need it the most, right?
Unfortunately, due to where we are now, oftentimes it's in the whitespace that people don't pay too much attention to, or it's kind of out of reach. So that's how I came to where I am. Now I'm focused on that, and I also still provide advisory to companies, leveraging my previous experience. I'm a board member of a corporation, and I'm also speaking at various industry events or any kind of board or executive meeting that wants to talk about this subject. I'm happy to do that.

Debbie Reynolds  05:22

Yes, your work is outstanding, and you definitely have a heart for education and for making sure that people have the knowledge. The thing that you touched on that I want to delve deeper into is data literacy. I wrote an article about data illiteracy, right? Because we are, as a society, data illiterate and digitally illiterate as well. In the old days, when you bought a computer, it came with a huge book that told you everything you needed to do. Now you get the computer with no book; you just start pressing buttons and all this type of stuff. And so it's not just the technological part of it. I think there is just such a gap in education at all levels, definitely including K through 12, but also among adults, right? A lot of us are adopting these new technologies; we think they do these cool things, and we don't really know what we're doing. We don't know what harm there is because it's a Wild West type of situation. You hear a lot about the benefits, but you don't hear about the potential harms, or you get a situation where people can't make a truly informed decision because they don't really know what to do. So tell me about what you see in the data literacy, or data illiteracy, space.

David Hendrawirawan  06:47

Yes, thank you. So data literacy, and let me just club in the term AI right now, right? Because that's what's really hot, and it's going to hit us unexpectedly and have a direct impact on education. Everybody's talking about it as, again, one of these perfect storm situations. For the longest time in the educational realm, there's been the area of digital citizenship, which is kind of the umbrella term for a lot of the K-12 curricula. It started off, of course, from: how do you understand computers? How do computers work? How do you develop computer programs and coding, right? That's what most people are thinking of when they ask, what do we need to teach our kids? Then later on, there was more of an emergence in the area of cybersecurity. So people are thinking about online protection for kids from the consumer side: how do you protect yourself from cyberbullying or just from misinformation and such? But then, toward developing the cyber workforce, there are a lot of articles and news about the crisis and talent shortages in cybersecurity, and so a lot of work is being done on that. Now, when it comes to data and privacy, this is where that whitespace is in a lot of people's minds; the term privacy or data is either clubbed into cybersecurity or clubbed into the more general coding, right? But if we look at the curriculum today, even in the coding curriculum, there's not a whole lot of data in it. Yes, you work with data, but not in the same way as working with, say, your traditional database and your business intelligence, even at a college level. Interestingly, there are actually a lot of computer science programs that don't require SQL at all. They learn coding in Java and Python and all that, but not so much SQL and RDBMS databases, which is very interesting, right?
On the privacy side, in the same way, a lot of people think of the risk factor of cybersecurity. So they think about how you protect with a strong password, with multi-factor authentication, with encryption, and so on. But a lot of what we're seeing today is that, because of the pervasiveness of data sharing and the pervasiveness of AI, even if you check all of the boxes under cybersecurity protection, that data may still be accidentally or intentionally misused, violating the privacy rights of individuals. And that, again, is something very subtle, even for most people practicing in the risk domain, because it requires you to understand not just your traditional cyber risk skill set, or even the legal aspect from a privacy compliance perspective, but also analytics. A lot of what is happening today when it comes to privacy risks and harms comes not from cybercriminals but rather from people who intend and are trying to do good things, without necessarily understanding the ethics and the context of privacy, right? OpenAI is now getting sued, because people say, hey, you scraped the public web. Yes, it's called public information, but you scraped all of it and then used it to train your model. Now there are a couple of huge newspaper media companies banding together and potentially coming up with a lawsuit against OpenAI, saying that all of our journalists' work is the reason why you've been able to train your models, so we need to benefit from the outcome of your product and your work, right? So again, a lot of this stuff is subtle and is not always addressed by the law beforehand. This is the reason why it is already difficult, as you mentioned, Debbie, for professionals to grapple with this; imagine how hard it is to influence the curriculum for our future generation.

Debbie Reynolds  11:05

I agree. I do want to delve deeper into the education part, but you touched on something about data scraping and OpenAI. This conversation happens a lot on the Internet and on LinkedIn, and people are really disturbed by it. Data scraping is legal in the US, right? That's one of the problems. LinkedIn actually had a notable victory not long ago in their hiQ case, where even though data scraping was legal, they argued that the way this company scraped data from LinkedIn was against their terms of service. So there's a little bit of hope there, I will say, in that regard. But I think what a lot of these companies are doing is playing in that gray area where there aren't really strong laws. Then we're seeing people like Sarah Silverman filing a lawsuit around her intellectual property, saying that these AI models have sucked in her book and are starting to regurgitate copyrightable information. So I think that's going to be a very interesting case, right? People always talk about that. Data scraping is illegal in Europe, so you hear a lot of people outraged about it, because it is outrageous, right? But then I would also say that a lot of these companies are using consent very broadly. The way consent works, there's almost no limit to what you can consent to, right? You probably couldn't sell your limbs on the Internet, but beyond that, they can ask you to do whatever they need, and if you say okay, that's considered consent. So to me, there needs to be some type of ceiling on what consent can actually mean, especially when it could be potentially harmful to the person consenting. I want to talk a little bit about your work in this K through 12 education space. The last time we chatted, you told me some really interesting things about the barriers.
So I would love for you to talk about that. When I hear someone say, okay, I want to teach kids coding and I want to teach them about technology, I don't know anyone who would be against that, right? But we know that there are challenges, and we know that there are barriers, because what you're doing is introducing something that hasn't been traditionally taught at schools, whether K through 12 or even at a college level, especially when you're talking about how people can protect themselves and gain that knowledge. All of these kids, all of us, are influenced by technology. We're all using technology at some point while not necessarily being educated about it. So can you tell me a little bit about what those barriers are, especially in K through 12?

David Hendrawirawan  13:57

So the barriers that I see are, first of all, environmental: this area of data and privacy moves very, very fast. A lot of our traditional curricula are pretty advanced when it comes to coding or cybersecurity or just networking and how computers work, but the area of Data Privacy is still very nascent. Now, in a normal situation, which we are not in right now, curriculum development is already challenging. But in the context of today, where a lot of teachers are leaving the profession, a lot of administrators are struggling to cope with the challenges that are currently pervasive in our education system, whether that's curriculum, the politicization of issues, or just the inability to cope with the learning loss that happened during COVID, all of it is slowing down progress overall, right? Schools are struggling just to keep up with the bare minimum expectations from the various State, local, and Federal governments around standardized tests. A lot of principals and teachers are just struggling to make sure that students pass their math and reading tests; other subjects like the sciences are tested, but they're not necessarily part of the standard accountability. But things like cybersecurity, coding, and privacy, those are falling by the wayside, right? So while yes, these issues are important to citizens, parents, and teachers as well, the system does not allow them to move fast enough on this right now. From the societal perspective, this is why, in my journey here, I definitely want to encourage practitioners, professionals, people who have exposure to this area, to band together and to help.
And I'm glad that through my journey, I was able to find a lot of people in the industry who are willing, like yourself, to be my guest speaker, to be my partner in trying to enhance this education; it really helps a lot. The other part of it is that our educational system was designed, especially on the public education route, as a nonprofit, as an institution that is not accustomed to working with private businesses. And because of that, it's not only the work culture but also the time cycle of people responding to things. I've had experiences trying to work with a school district, coming as the owner of Code Ninjas, a private coding school for kids, and we weren't able to do even simple things. For example, when we wanted to do fundraising for a school district, they were very wary, because they said, we cannot be perceived in any way as supporting any particular business, because then everybody else will say, hey, why not us, right? And so they shy away from that, because of these factors. So it's difficult, because even though we want to help, it's challenging. There are some companies that help: for example, Microsoft has promoted an organization that tries to matchmake professionals who are willing to volunteer and teach with K-12 schools. There's WeTeach_CS at EPIC, Expanding Pathways in Computing, at UT Austin. These are organizations dedicated to providing professional development for teachers. Now, at the university level, there are also things like the Computing Research Association and ACM; a lot of people are trying to build that bridge between academia and industry.
But again, a lot of this is difficult, because mostly you find teachers who learned technology; it's rare that you see people come from the industry. If somebody gets an education in computer science or MIS, they most likely will choose to go into a technology role in the industry, which pays, you know, a lot better, right? So how can we figure out a way in which they can contribute as much as possible back to education? This is one of the things that I'm trying to work on with various players in industry and academia.

Debbie Reynolds  18:45

What is your ask, so that people in the privacy community are really clear on it, if this is of interest and they want to help? What are you asking for?

David Hendrawirawan  18:56

Honestly, at this point, it is still very early, Debbie; I have a lot of aspirations. But I also know, in talking to people, that this is not necessarily an easy path. And there are a lot of things already being done. Fundraising, for example: yes, companies have provided funding. Guest speaking and supporting teachers: yes, there are people who have come to schools or universities and taught. There are also technology companies that have provided platforms, or organizations that make it easier to access computing education, curricula, materials, etc. What I see, though, is that despite all of these good efforts, the sense I get from speaking with many of the leaders in institutions, both industry and academia, is that there's still an insurmountable amount of challenges ahead of us, and it feels like we're not moving fast enough. We're definitely not currently able to outpace the development of technology such as Generative AI, which is really demanding almost a greenfield strategic effort to develop something that doesn't necessarily exist within, or depend on, the current institutions, right? I know that's kind of a non-answer, but that just reflects the reality: even as I speak with people who have spent a lot more time and have been in this game a lot longer trying to solve these problems, they have all said, yep, we are definitely drinking from the firehose, right? So I do hope, and this is why, out of this conversation, I appreciate you using your platform for us to voice this. I think it's going to take a little bit longer than what people probably expect, but at the same time, I think we have an opportunity to accelerate it. There are enough people.
So that's why I'm reaching out from this forum to say, hey, if you have a genuine interest in dedicating yourself to the education of future generations about privacy and data literacy, I would love to connect. I have a lot of networks that I've connected with, because I've spent almost half of my time in the industry, but also half of my exposure in the educational realm.

Debbie Reynolds  21:22

Well, thank you. And I know a lot of people contact me who are passionate about education or privacy in the children's space. One thing that's happening in the US that I think is going to help you is that one of the California laws is changing the age requirement for consent. We know that at the Federal level, COPPA, the online privacy law for children, currently covers children under 13. Some States are raising those ages, and I think there's talk of raising the age from 13 to maybe 16 or even 18. A lot of that will play itself out this year and next year. So in 2024, I think it's going to be a sizzling hot issue, because for a lot of the companies that are very popular, like, say, TikTok or Facebook, a lot of their bread-and-butter customers are those 13-to-18-year-olds, right? What these laws are trying to do is create a situation where companies can't just put a checkbox on their website that says, hey, you say that you're over 13, and that's it. We've seen the FTC really pushing enforcement, and it has had some successful enforcement actions against companies that ignore the fact that they may have other data about someone that lets them know the person is under a certain age, and that they don't have the proper consent, whether parental consent, informed consent, or opt-in consent, for individuals who are under 18. So I think it's going to turn things over on their head, especially for advertisers, because a lot of these advertisers have not been accustomed to getting that type of consent or being able to show it. And then the other thing that's happening, and I was talking to someone on the Internet about this today, is that this is not just privacy; it's also going to be cyber. A lot of companies have thought of compliance as a paper exercise.
So let's create a policy and put it on the website; let's create these manuals within an organization. What's happening is that these agencies are actively embracing technology as well. Just like the IRS knows if you didn't file your tax return, this is going to be a situation with cybersecurity and privacy where it's not just the Googles and the Facebooks getting these cases, right? Regulators are going to be able to tell electronically, via the digital systems that they're using, that you're not compliant. And so it's not going to be a situation where people can hide in plain sight; it's going to be evident, right? They're going to be able to look at your website; they're going to be able to bring technologies to bear to know that you're not in line. I think what people think is happening is, okay, someone gets a report on their desk, and they just go after the biggest companies possible, and that's it. Where we're going is a situation where these regulators can go after anybody, right, and everybody, so it's very easy for them to programmatically send out a letter to you, like the naughty-or-nice letter, and you have to basically get in line. So I think that all those things together are going to put more focus on the children's privacy area, because companies aren't going to be able to hide in plain sight anymore. What are your thoughts?

David Hendrawirawan  25:20

No, absolutely, I think those are definitely important. And as you mentioned, that's probably the one area where there's a bright light: it's a bipartisan issue, and everybody's in support of increasing the age limit for children to be exposed to these technologies. And the research shows, especially for young girls, how their mental health is negatively impacted by these tools and technologies, right? At the same time, I think we need to be careful not to look at these types of advancements as a silver bullet. And let's face it, I'm sympathizing with every parent out there; it's very difficult to keep our kids away from screens. I was just browsing last night through my research and noticed that even as early as 2000, there were organizations, the Alliance for Childhood or something like that, that had already discussed this; when computers started to come into the classroom, they were concerned. And there were people who felt like, hey, you know what, this thing could actually have a negative impact, just like today, when people are resisting Generative AI in the classroom, etc. I think history has proven to us that, despite what we feel about it, it is here, and it's not going to go back into the can, right? So we must progress beyond this. I think there are a lot of parents who feel exhausted trying to keep their children away from the negative harms of technology, and subconsciously, unfortunately, they succumb to more of these futile preventative attempts, such as, okay, let's limit the amount of time they're going to be on screens, or let's increase the age. So it's interesting what I've heard a lot lately. Among the coding schools in the network of Code Ninjas, which has almost 400 coding schools across the US, UK, and Canada, enrollment is generally down over the summer camp season.
When we look at that, it's kind of interesting. We noticed that a lot of families and parents chose to travel or do sports, to do anything other than this, so there appears to be a subconscious rejection of even trying to learn technology, right? So we argue that, hey, look, you can't fight it fully; I would rather teach my children to be productive and safe in the use of technology than try to prevent them from accessing it. So I do agree with these measures to increase the age limit, or perhaps even to give parents the ability to restrict access, etc. But I also hope that it doesn't give us a false sense of assurance that things are going to be okay without us, as a society, as parents, as the older generation, really investing ourselves in learning and understanding these things and having an open dialogue with kids about it. I'll use this analogy: you know the sex talk that parents have with their children as they come of age? We need to have a similar program when it comes to privacy, online safety, and digital ethics, if you will.

Debbie Reynolds  28:57

Yeah. And that education can't just be in schools, right? It has to be an effort from, like you said, society, from the community. And that's really the gap that I definitely see, and you're articulating what is happening in the world right now. What's concerning you, something you see where you might say, oh, wow, I don't really like this? What gets your attention right now? What concerns you about data or privacy?

David Hendrawirawan  29:27

Yeah, my concern is that we see a lot of these challenges, but all of these, whether you call it privacy, or the issue of diversity and inclusion in schools and the workplace, economics, geopolitics, all of these are symptoms of a greater, larger underlying problem, which is that we as a society have lost that connection, that interpersonal connection, the ability to trust; trust is really thin nowadays. The way in which we engage is much more on the digital front rather than face-to-face. And COVID has again exacerbated this; COVID has pushed us to a new plane in terms of how we interrelate with one another. And that generates habits, conscious or subconscious, of communication and relationships that have made it difficult for people to really trust one another, to take risks in building relationships and trying something new. We've heard a lot about how the majority of people in the workforce are not engaged; the loyalty between company and employees has pretty much gone down the drain with so many layoffs, even in fields like technology. Teachers and parents, students and teachers, parents and kids. Politics, right? 2024 is not an exciting prospect in the minds of many of us here in the United States. So there's a crisis of leadership; it comes down to the core root: we as a society have not seemed able to go back to these societal values. And because of that, a lot of issues, like the proliferation of AI, may impact the livelihoods of many people, may eliminate jobs, may create misinformation, etc. These are, unfortunately, in front of us, and we don't have the luxury of being complacent about it. All of the crucial things, education, health care, the workplace, seem to be facing an underlying greater threat: societal disruption due to technologies that drive us further apart rather than unite us.

Debbie Reynolds  32:06

Wow, that's extraordinary. I agree with that; I don't think I've been able to fully articulate it in that way. I think what technology has done is create a more individualistic thing; I can't fully explain it. So, for example, if you go on the Internet and search for something, and I go on the Internet and search for something, we may get totally different results, right? But in our heads, we think that the other person is seeing what we're seeing. So there's a level of narrowing there, where these companies are curating this content for you, but we have the impression that we're all seeing the same things, and it's creating more space and more distance between us. I remember growing up when my mother used to make us watch the news at night. You watched the news, and everyone saw the same news, right? Now it's not that way; not only are we not seeing the same news, we're having things made up that aren't true. Because a lot of these companies thrive on eyeballs, right? They know that controversy gets eyeballs; a lot of bad things that may impact the fabric of society, those things are popular, and they earn money, right? So in some ways, we're going in the opposite direction with technology: instead of bringing people together, it's sort of pulling people apart. For me, I see that as a lack of people caring about one another; we're entering an "it sucks to be you" phase of societal existence right now, where it's every man for himself. And so we're in a situation, especially with AI and all these advanced technologies, where my concern is that we're not doing well on the basics right now. If we move to this rapid growth in technology, with things computing a lot more, a lot faster, a lot more complex, and we're not handling the basics really well right now, we're going to be in trouble in the future.
So the educational gap is getting wider and wider. What are your thoughts?

David Hendrawirawan  34:34

Yeah, I completely agree. I think you know you nailed it there, Debbie. I think I know this sounds cheeky, and this word is not used a lot in the professional setting. But I would say one of the best quotes that I've had is from a book, Leadership Challenge by Kouzes and Posner right at the very end of that book. It's a really thick book about leadership. And there was an interview with one of the most prominent leaders, and they asked all of these leaders what is your secret to being effective as a leader, and one particular General former General said, continuing to stay in love. The word love is not something that you hear a lot in the context of a business. But it's true. If you love what you do, if you love your team, if you love your people, and you love the mission with a genuine care, and trust that you're trying to do your best for others, loving your neighbors, loving other people, right? And loving your planet and the society that we have. Now. We need to cultivate that sensitivity to that, right? Maybe I'll share this. I'm a little bit kind of sheepish sharing this. But last semester, I taught privacy and data governance in a class. And at the end of the semester, the students voted for me to receive an award, and I didn't know what kind of award it was, I wasn't even aware that there was such a thing as student voting, it turns out to be the EY Diversity and Inclusion Amplifier award. And that surprised me because I cared a lot about the diversity and inclusion issue, but I never actually being in any kind of DNI position. Nor is my subject in the class about that. However, I do talk about privacy, which they all care about. And I pour in all of my industry experience, I invited a lot of guest speakers that came from different perspectives. 
So the students said, hey, look, the way in which you engage with the guest speakers, you invite people who come at this issue from many different perspectives, and you don't even have to mention the socio-demographic dimensions of race, gender, or belief system or whatnot; the treatment of the subject makes them feel that this is a class that supports diversity. And as a professor, I am always open to helping them: when I found out they were struggling this year to find jobs, I offered to connect them to my network so that we could open more doors. That's what inclusion means to them, right? So to me, again, diversity and inclusion is an outcome of good leadership, good teaching, and taking responsibility for your people, and not necessarily the other way around. Right. So.

Debbie Reynolds  37:20

Oh, congratulations, very deserving. As a matter of fact, I totally agree; that's something that is very important to me as well. People have commented about how diverse my podcast is, because I go out of my way to make sure that I have different voices. The example I like to give to people is: if I gave a camera to 20 people and told them to photograph the same thing, they would all have different pictures, right? So you think, okay, they're photographing the same thing, but everyone has a different point of view; they have a different perspective, and they have a different lens to look through. I feel like privacy is an issue that impacts humans, so if we don't have humans of all shades, colors, creeds, and every type of diversity you can possibly think of, we're not going to be able to solve this problem. I think that's what your students recognize in you, and I congratulate you for thinking that way, because that's the way it has to be for us to be able to solve these big issues. So if it were the world according to David, what would be your wish for privacy anywhere in the world, whether it be human behavior, technology, data, anything, government, cyber? What are your thoughts?

David Hendrawirawan  38:46

So this is my dream science fiction superhero movie, right? In my mind, if I could play a role in making something happen, it would be like the X-Men's Professor Xavier, you know, the school of the misfits. He created his own school; it's not subject to any kind of government, it's private, but he provides shelter, a home, for students who have the gift and who have the talent. I feel that a lot of progress now is inhibited because it's very difficult to push the boundaries of privacy and data literacy in our society, because it's a big machinery, right? And look, first, yes, we have to make sure that whatever we deliver can be accessible to everybody. But at the same time, there's a balance between that and enabling faster progress for the people who are already there. And in the area of technology, the playing field is the most uneven playing field across all disciplines and subjects, right? But that does not mean that we should, in the name of equality, equity, etc., just wait. If you look at the CSTA standards, for example, the Computer Science Teachers Association standards for high school, they are so watered down that they are equivalent to what we are teaching our kids who are eight years old in terms of coding at the moment, right? So I have a dream that we can create a school that will rework some of these curriculums so that we can focus on the subjects and the topics that help students accelerate their journey to becoming data literate, AI literate, and privacy literate. The curriculum that we have today is primarily dominated by the engineering mindset. Calculus, for example, is kind of the pinnacle of the math track, because back in the 50s and 60s, we were all trying to get to the moon. And yes, calculus is also important for AI and data intelligence, but not as much as the way we're teaching it now. We need to expose kids to statistical concepts earlier on, right?
So it definitely needs a reorganization and a reframing of the goal of education. But looking at the way our institutions are today, I don't see that happening fast enough. So I'm hoping, at some point, if there are enough people who want to support that goal, to form a new entity that can make that happen. That would be kind of my dream state.

Debbie Reynolds  41:30

That's amazing. I love that. So tell me, how can people who listen to this podcast get involved with what you're doing with K through 12? What's the best way for them to reach out to you?

David Hendrawirawan  41:43

I would love for you to reach out to me, either via email or LinkedIn; my email is dhendrawirawan@dataintegrityfirst.com. I ask that you extend patience, because, again, a lot of people who have come to me in the past thought that there was already a large platform for us to make change happen now, but it does take time, right? And I'm still learning to pace myself; I'm among the most impatient people in this area, and I'm having to learn to exert greater patience. But if you want to be in this (it's a marathon), I would love to connect with you and to work with you to pursue this goal.

Debbie Reynolds  42:25

Yeah, I really admire not only what you do in education but the fact that you truly have the heart to help people. You could have just done your corporate-track thing with your blinders on, but instead you moved into this education area. The fact that you've done that says a lot about you and your care for society and for people, so that they can be educated.

David Hendrawirawan  42:54

That's very kind and encouraging, and I need to hear that a lot, Debbie, because to your point, it is definitely not an easy path. Oftentimes, you get a lot more nos than yeses. So I really appreciate you opening this opportunity for me to speak to your audience on a subject that we both care a lot about.

Debbie Reynolds  43:15

Well, we'll definitely talk soon. And you know, you always have my support. So yeah, thank you so much for being on the show. We'll chat soon, for sure.

David Hendrawirawan  43:23

Thank you, Debbie.

Debbie Reynolds  43:24

You're welcome.

David Hendrawirawan  43:25

All right.