E200 - Nicol Turner Lee, The Brookings Institution, Author, Digitally Invisible: How the Internet Is Creating the New Underclass
38:18
SUMMARY KEYWORDS
ai, privacy, people, book, issue, talk, kids, technology, data, digital divide, school, online, connected, point, podcast, digital, debbie, remember, society, space
SPEAKERS
Dr. Nicol Turner Lee, Debbie Reynolds
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello. My name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest; actually, in addition to being a special guest, this is a special episode, so I have Nicol Turner Lee on the podcast with me today. She is a Senior Fellow in Governance Studies and Director of the Center for Technology Innovation at The Brookings Institution, as well as the author of Digitally Invisible: How the Internet Is Creating the New Underclass, which is coming out in August 2024. Welcome.
Dr. Nicol Turner Lee 00:59
Thank you so much, and just to be exact, August 6th is the date the book is out, and you can get it at a bookstore near you. Thank you for having me, Debbie.
Debbie Reynolds 01:09
Yeah, well, this is going to be amazing to have you on the show. First of all, I've interviewed nearly 200 people on the podcast. You will be my 200th interview, and you're the only person who's ever been on the show twice.
Dr. Nicol Turner Lee 01:24
Oh, I feel so fortunate; that is so good to know, and I'm happy to be here with you because if anybody knows where to go for the types of conversations that they need to have on Data Privacy, you are that person. You are a rock star in this space, and I appreciate you having me on the second time.
Debbie Reynolds 01:41
Oh, sweet. Oh, my goodness, I recently listened to your other podcast episode again because you were recently appointed to the AI Safety Board with the Department of Homeland Security, and there are some well-known people on that board, like the guy from OpenAI. I think the head of Microsoft is on there, too. I can't remember, but as I looked down the list, I was like, oh God, please let it be somebody I know on this list, and when I saw you on there, I was like, ooh, okay, I'm okay. I know Nicol's there. She's having the right conversations, and I think you're a perfect person for that board. But talk to me about what you've been up to. I've just seen a flurry: I saw you on CNBC, I see you going to Martha's Vineyard, doing book signings. You're just all over the place. So tell me, what have you been up to?
Dr. Nicol Turner Lee 02:29
Well, first and foremost, let me just say this: I maintain the true and overarching responsibility of being my kids' mama. So this summer, when I'm not doing important policy work, those kids have gotten me very active with regard to their own activities, so I maintain that particular position as being the mom of the house. But other than that, I, like so many other people, have just been in a flurry of a variety of activities, whether it is tech policy, or some of the Presidential politics that are happening, or the Congressional conversations that are going on right now. It has been a flurry overall. I think all of us feel like it's maybe a snowstorm at times, because the flurries come down so strong and are so intermittent that we don't know whether we need to put on our snowsuits or just keep on our jackets. I think tech policy has been the same. Unlike other times and other periods, when we've had some consistency, I have seen at least some ebbs and flows when it comes to the issues that are touching the surface and whether or not Congress is actually going to do anything on those issues. That's not the story of the whole episode, but I've tried to maintain some central intelligence on all of those issues, as well as participating in Federal advisories. So yes, I'm on the AI Safety Board, as a person who was invited by Secretary Mayorkas of the Department of Homeland Security, alongside many of those well-known folks from the big tech companies, which has provided a great opportunity for people like myself, who represent research and civil society, to have a space there, particularly as a Black woman. I just finished up a report, believe it or not, I don't know if you saw this, Debbie, on AI and the financial services market, particularly global markets, with the Commodity Futures Trading Commission. It was the first time I've ever written about AI in that context.
But I think a lot of lessons learned are basically connected to our general conversations around Artificial Intelligence and its implications for critical infrastructure, and I've also been doing a whole lot of other stuff that's been interesting to me. One of them is writing a book. So the book that you just mentioned is "Digitally Invisible". It's been a labor of love. It's been a project. I think the first time I was on your podcast, I mentioned I was in the middle of writing it, and I'm happy to be talking about it with you. I mean, it's one thing to be an author, curled up in a corner with your laptop or a pen and paper, letting the words flow as you construct these narratives, these compelling, profound stories about your experience with this subject. It's another thing to have it in hand, and I just got it in hand. So, most recently, what I'm up to is just trying to figure out how to get this book's narrative out to as many people as possible. Because I think right now, we're in the middle of a movement, not necessarily a moment.
Debbie Reynolds 05:16
I think you're right. I think this is a time when those lesser-known voices who haven't been heard before need to be heard.
Dr. Nicol Turner Lee 05:24
Yeah.
Debbie Reynolds 05:24
Because there's so much happening in technology that could just run away with everything.
Dr. Nicol Turner Lee 05:29
Yes.
Debbie Reynolds 05:30
A lot of people, I'll just give you an example. I talked with someone about an app they had for their store, and they were like, oh, it's so easy, all people have to do is scan this QR code for the app. It's like, you act like everybody has smartphones, or you act like everybody understands how to use this stuff. That's a huge problem, especially since we're trying to use things like Artificial Intelligence and all these more advanced technologies, and if people don't really have access, you're talking about a digital divide, but I think it's almost like a caste system.
Dr. Nicol Turner Lee 06:06
Yes.
Debbie Reynolds 06:07
Where, instead of the haves and have-nots, there'll be the knows and the know-nots.
Dr. Nicol Turner Lee 06:12
Yes.
Debbie Reynolds 06:12
What do you think about that?
Dr. Nicol Turner Lee 06:13
I love the way you talk about that. I mean, for your listeners, I've been in this space for more than 30 years. I started in Community Technology when I was just a graduate student in Chicago in the 90s, and I essentially fell in love with an affordable housing building. They had this very small computer lab, 300 square feet, to be exact, and I talk about this in the book. From that experience, I not only connected the residents to, at that time, 386 computers, but I also connected the residents to this outer world which had forgotten them. That really showed up when jobs went online, particularly civil service jobs. I don't know if people remember, there was a surge in the 2000s when the civil service jobs went online, particularly after 9/11, and people had to apply online. They couldn't go in line to get those jobs. So I had the experience of being in the community, watching how technology either helped folks or amplified existing inequality, and then, to your point, I had the opportunity to do something about it. It's so interesting, because how I write about it in the book is that my experiences back in the late 90s and early 2000s are so similar to today. So when you give that example of a person who's like, oh, all somebody has to do is scan a QR code, it's just not that simple. One of the things I did in the book, before the pandemic, is I went around the country talking to everyday people, whether it was Garrett County, Maryland, which is predominantly a farming community, Amish in particular; Staunton, Virginia, where people want to live rural because they've got a great Main Street, but they still can't manage to maintain connectivity in a way that makes sense for them to live a quality life; or Syracuse, New York, where I visited a lady who, get this, in 2022, had the same type of computer lab I had in 1999: 386 computers and very archaic and outdated equipment. But here she was in 2022, sitting in public housing.
I just saw myself in her, and I can go on and on about the stories of the people. What I found, and the reason I called this book "Digitally Invisible", is that this is not a digital divide, right? This is not about connectivity. To your point, it is a caste system, on the one hand, because people who have access have the means to have access, and it's also an invisibility problem. It's about people who, unfortunately, starting with our transition from an analog society, when we were holding CDs and records, just cannot get online for whatever reason, and they're missing out on opportunities to build a good, credible quality of life for themselves and their families. To me, Debbie, that was the most disturbing, because I did this research before the pandemic hit, when 14 million people were not necessarily connected to the Internet. Then the pandemic hit, and all of a sudden, people like me were invited to the party because people found out about a digital divide. I really think today it's not about the haves and the have-nots. It's really about, and I use this analogy because I was such a reader in high school, I don't know if you remember Ralph Ellison's Invisible Man, where the protagonist, no matter what he did, he could stand up and get the best education, still nobody saw him. That's what I write about: the people who are out there that we just do not see because, for whatever reason, they just cannot be connected.
Debbie Reynolds 09:49
Wow, I love the way that you put that. I'll actually make a privacy parallel to that in the US, and I want your thoughts. Because the US has more of a consumer rights privacy system, basically, if you're not consuming, you really can't exercise certain rights, and to me, that's very parallel to what you're saying. What do you think?
Dr. Nicol Turner Lee 10:15
Oh, definitely, and that's one of the things that I talk about in the book. We have made the traditional digital divide, and this is a divide that dates back to probably Reagan, but more so President Clinton as a Democrat, about these haves and have-nots, and we've codified the digital divide in a way, to your point, where, back then, if you weren't on America Online, that was probably okay, because we were going outside, and we were still transacting in person. Getting on to check your email was not a big thing. But today, as I write about in the book, you can't do Cash App without a bank account. You can't transact with your physician without a phone. There is certain collateral that now comes with being connected, and the challenge is, to your point, there are inconveniences that come with that, as well as trade-offs. So it's really important that we move this conversation, and I mean, I hate to be harsh about it, but since I have almost 30 years, three decades, working across six Presidential administrations, it's almost as if we still treat people not being connected as a social service program, and we don't look at, and this is something I really stress in the book, how these trade-offs are also trade-offs for our global economy. Who's not getting the jobs of today? Which faces in this digital disconnectedness are being disserved because they have to pay a surcharge every time they go into a store or every time they get an operator on the phone? To me, all these things are part of this invisibility, which is why, Debbie, I call it the new underclass. This is a new group of people; we saw them in Covid. I hate to say it, but they couldn't even get their vaccinations. It's a new group of people, and I love your analogy that they have been cast out of our system simply because of their economic and social needs.
Debbie Reynolds 12:17
I feel like this is similar to what life was like in the mid-90s, when people were going from paper systems to digital systems.
Dr. Nicol Turner Lee 12:27
Yes.
Debbie Reynolds 12:29
The issue was that, with the things we wanted to do and the amount of data there was, it was just impossible to do it on paper. But we had to move towards these digital systems, and now we're bringing in more AI. So we're going to get to a situation where, in the future, we're going to need those advanced capabilities to be able to transact and do anything. So it's a huge issue. What do you think?
Dr. Nicol Turner Lee 12:53
I love that. Okay, so it's so funny; I consider myself to be interesting, and I write a lot of stuff, but I've never written a full book, so this is 200-plus pages of my writing. I think the longest thing I've ever written like that was my dissertation when I was a PhD student, and that was many, many, many years ago. So this was a new thing, but as part of it, I had to do some historical fact-checking and storytelling, and to your point, I do talk a lot about this transition from what we all know as this analog, very rotary society to something that's more digital. I still remember my phone number growing up, 636-2437, and if I really want to date myself, it was NE6-2437, okay, because back then, we used letters in place of straight numeric values. My point is, all of that has impacted where we are with digital invisibility and our global competitiveness in a very highly technical society. You are so right about AI. I did not go deeply into AI except in the last chapter of the book, because, again, it is very hard to imagine a society where we have this cognitive processing, and I do a lot of work, as you know, in Artificial Intelligence and equity, and yet we still have a digital divide before us. It means that whatever systems we create, or whatever efficiencies businesses or the government embed into eligibility determination or public benefit completion or getting your driver's license, will still disadvantage people who are not online. One of the things I remember from the decades that I've been working on digital issues, when I was working in this space at a nonprofit, is that my old boss used to say it's about getting people from in line to online, and this was in the mid-2000s. Let me say that again: from in line to online. Well, that very much holds true today. The challenge is, we still have not solved this as a competitive problem. We don't see this directly as a healthcare issue, an education issue, an employment issue.
We see it as: we need to accelerate these shiny objects and tools and make sure there's enough infrastructure for people to engage, and then we get people who come in and say, well, even if you gave it to them, it may not be relevant to them. My point is, as a society, as a country, as a global world, it becomes very difficult for us to reimagine what life looks like with digitally astute citizens, and it also becomes difficult for us to realize how much further we can push the digital envelope if we still have not created baseline access to some of the very fundamental tools that you need. You can't go to a library that doesn't have books. It's hard to drive a car without any wheels. It's the same thing, I think, with technology, particularly as things become more advanced.
Debbie Reynolds 15:47
Yeah, I agree, and I want your thoughts about privacy as being foundational to what's happening in AI. As you know, you've been in this work for many, many years, and we've been raising the alarm, saying, yeah, we need to do something in the US on privacy. Then, when AI burst on the scene, in terms of people getting more democratization, I would say, of AI and people talking about it more, the privacy people like us are like, well, what about us? We need some action on privacy. I feel like some people don't understand how privacy is connected to AI, and they don't understand how it has to be foundational. So tell me your thoughts about that.
Dr. Nicol Turner Lee 16:28
When I talk about this, also in the book, I think privacy is foundational to digital access, and I'll start there and then move to AI. Think about during the pandemic, when people were asked to report their vaccination or go to their local pharmacy to get a shot. We don't know where those records are. I don't know where my records are. I've got this card, which has become very flimsy, that has the dates on which I took all the shots. But for the most part, my data was going into the pharmacy's database and, in some way, was probably being matched with the database from the Federal government. There's been no mention of where that data is. So, even if we start to think about connecting people to the Internet, it is foundational that we also have some privacy concerns related to how we construct either good policy or effective best practices. I talk about in the book, for example, that I am particularly interested in seeing us move beyond a digitally invisible discourse to one that's rooted in universal service. What does it mean to be in a country where having access is a civil right? But I also, and in a minute I'll talk about AI, don't want us to be a country of trade-offs, where just because you're connected, you have to give up something. Again, just think about the urgency of the pandemic. Everybody was rushing: here's my information and my medical card, give me the vaccination. But half of us didn't know where our data was going, and if we knew, we'd know that it was going to pharmaceutical companies and others that were going to continue to cross-sell. That's where AI comes into the picture.
AI has this fundamental capacity to take our private information and our public footprint and recalibrate them into products and services that have a market incentive, or a profit incentive, for that matter. Privacy is so fundamental to that. I mean, when I started in this space, one of the first studies I did was the National Minority Broadband Adoption Study. I'll never forget, it was probably about 2008, 2009, something around that time, and my goal was to show how people of color used the Internet. Then, years later, we had our first round of the privacy debate, Debbie, and I remember writing a paper, at the time I was at the Joint Center for Political and Economic Studies, the only Black think tank in Washington, DC, and I wrote about the trade-offs in privacy that people of color made in order to get online. I really think that many of those challenges that existed before exist today, but AI has made them even more challenging, because we're not just talking about, I'm going to give you my email address so that I can get on the mailing list for a product or service, or so I'll be the first one to know about a local concert. Now, AI is implicitly, in some cases, extracting our information in ways that we don't even know. We're part of the model. We may be, like you said, the people who don't get online, but we're still a subject, because maybe there's some AI that assesses against people who aren't in the model. Or, as in some of the studies on biased AI that I work on at Brookings, ride-sharing, for example, won't go to certain communities because an AI doesn't designate them as a viable market in which to do business. My point is that privacy is fundamental to every aspect of the digital ecosystem. I don't care what part it is. We need some privacy guardrails, and you and I and so many other people have been beating the pavement for so many years.
I would say I've been in digital access issues as long as I've been in privacy issues, and until we have those rules of the road, we're not going to make progress. Compared with how I saw privacy in the digital divide age, privacy is going to be so much more significant now, because AI will extract that private information for its own means and, in some instances, cultivate it into biased outcomes for people. I don't know if I told you, Debbie, I'm running the AI Equity Lab at Brookings now, which is looking at nondiscrimination and anti-racism in AI models. A lot of that has to do with the fact that the data training these models, one, is open data without any type of privacy restriction, and two, it's traumatized data, as my dear friend Renee Cummings would say; it comes with so much historical legacy baked into it around racism and discrimination. So you've been hitting the pavement on this. I think it's really important that privacy become an overarching component of how we actually cultivate and craft the digital ecosystem. I'm starting to sound like a Black Baptist preacher. I'm so sorry.
Debbie Reynolds 20:59
That's okay, and I love that Renee says "data trauma", because I think that's a very apt term; not everyone experiences data the same way, right?
Dr. Nicol Turner Lee 21:09
Yes, and data has the ability to amplify and celebrate you, or it can harm and penalize you. I mean, I am doing the work of the Equity Lab to workshop AI use in some high-risk areas: education, criminal justice, housing, healthcare, financial services, and employment. The reason I started the Equity Lab is, one, I was probably testifying on just about everything. When you talk about all the committees I'm on, you ain't telling a lie, right? I'm one of those people who shows up. I think about so many things, and I would say to myself, I'm not even the subject matter expert; I'm a sociologist who just gets the general schema of things. What would happen if you put people with interdisciplinary backgrounds and cross-sector expertise alongside technologists? So that's part of the intent of the Equity Lab: to really workshop how you get to more inclusive, ethical, responsible AI, and the data is really the starting point. That data, in many respects, will help us to frame the extent to which there are vulnerabilities in these models, particularly in the context in which they're applied. So, just a quick example on that. It's one thing to develop an AI model in a lab and to suggest, well, hypothetically, this should work because we've tested it under these conditions. Then you deploy something like facial recognition in the world, and you find out that it misidentifies people with darker skin, or it picks up on outdated photos, or, because of the lack of privacy, it's scraping your social media, or it's looking at photos of you from 2015 databases. At the end of the day, these are incriminating decisions being made from some bad data and, I think, poorly interrogated systems.
Debbie Reynolds 23:00
Yeah, my concern there is that a lot of these systems, facial recognition, for example, don't say, okay, this is not that person. They're like, well, maybe it's this person.
Dr. Nicol Turner Lee 23:13
That's right, that's right. I mean, it's a problem. I spent a year and a half at the National Academies of Sciences. We were doing research that was blessed by the President as part of an Executive Order on criminal justice. It was a small group, and many of these people were the people who built the technology. Here I come in, knowing about it as a researcher, a sociologist, and I tell you, the technology is not fully cooked, and it's not optimized for circumstances where technologists may not see those variables play out in the lab, but it has very, very harsh consequences. Like Porcha Woodruff in Detroit; her case is public knowledge. She was picked up by facial recognition for robbing a store. She was seven to eight months pregnant, and clearly wasn't the person robbing the store, because that person wasn't pregnant, yet she was arrested in front of her children, brought down to the station, almost had a miscarriage, and had to hire a lawyer to get out. Now, she is basically suing the city of Detroit. Those things happen because we've put all of our trust in a technology that is not fully baked, and they shouldn't happen. I think, again, going back to my book, this is an evolution of technology that has not quite been equitable, inclusive, and fair to all populations. We have a lot more work to do to put a stake in the ground when it comes to digital technology, and the faster we deploy AI without solving digital access issues, the more problems we're going to have going forward.
Debbie Reynolds 24:44
Yeah, I agree completely with that. During your research, when you were doing the book, tell me about something that maybe surprised you, that shocked you, where, as you were getting deeper into it, you were like, oh my goodness, I can't even believe this is happening.
Dr. Nicol Turner Lee 25:01
I had a couple of those. Let me start on the good news side, because, you're Debbie Reynolds, and I don't want to sound like Debbie Downer, right? I gotta make sure I keep myself a little positive here, because I'm a tech enthusiast. I do remember, on a positive note, going into Garrett County, Maryland, which is about three hours outside of DC, a very small Amish community. Some of you may know it as the Deep Creek area. It's a beautiful farming community, very rural; in fact, so rural that there were more cows than people, which is what I call the chapter. I'm a city girl, so I felt a little out of place, and my GPS was not always working well, let's just say, in terms of cellular signal. But I remember going to a five-room school, and the math teacher of the 50 students in the school told me that they had the leading robotics team in the state of Maryland, and this was despite their religious proclivities; the Amish are really not big fans of technology, and there are some Amish sects that just recently took on electricity. But this one Amish school had this leading robotics team, and I found that to be so amazing, because, first, I couldn't imagine being in a five-room school; I'm a person out of New York. But two, I couldn't imagine a community that was a little averse to technology taking on robotics. The teacher mentioned to me that it doesn't mean they take it home, because of their family culture and cultural traditions, but in school, they rock it, and they are really focused. So, that was one story that I found to be interesting. On a similar note, with education, I went to Phoenix, Arizona, Maricopa County, to be exact, because I had heard about a school that had gotten tablets as part of President Obama's ConnectED program.
So I was really trying to target places in the educational space that had been part of some of the Presidential programs, because I knew they had access, and I could ask some questions about what that access was like. In this particular school in Phoenix, a primarily Latina elementary school, all the kids had tablets, but the school found that even though the teachers were engaging them around the tablets, the kids also had phones. Every single child in this elementary school had a phone. So I'm sitting there listening. They're like, yes, we got the tablets. We've restructured it, and I write about this in the book, because we want the kids to thrive, not survive. But they also have phones. Every single kid. You know the challenges that they have: they have phones, but their parents don't sign up for free or reduced-price lunch. There was a cable company that had a low-cost program for the home, and people weren't signing up. Here was the shock factor. Are you ready for this? These kids had phones because they also lived in a county where there was a high rate of deportation, and a lot of illegal deportations at that. So the phone in the hands of these Latina kids was really not for school, per se; it was to stay in touch with a parent, or an aunt or an uncle, who may have to deliver some news that Mommy was not going to be home. I can't tell you how much that shocked me. About 15 minutes later, after hearing that story, I went to the local library, because I was asking how far kids had to go to get to the library. Obviously, it was a five-mile drive for me, and while I was in there, another shocking story: a woman was yelling at the librarian. Her name was Francis; I talk about her in the book. She just wanted a library card. The librarian said, I can't give it to you; I gave it to your son. So, because my book was based on a lot of qualitative, anthropological research, I followed her out to the parking lot.
I said, excuse me, ma'am, I heard you're looking for a library card. For what reason? She was another one who took me by storm, which is why it's so important to have this conversation when it comes to healthcare. She had found out about a stage four cancer diagnosis through her phone, and she needed her library card renewed because the only way she could talk with a doctor was by going to the public kiosk and the public computers in the library. This was in 2019, when I actually spoke with her; the pandemic hit in 2020, and many people are still in those scenarios in 2024. I think I lose track of the timing sometimes, Debbie, because I see myself in many of those people, back when I was in Chicago in that small computer lab in the 90s. But those kinds of stories were shocking to me, because here we are, a very highly industrialized, technological society, and we're still dealing with the same issues that I saw as a graduate student in the 1990s, and this time it's so much more consequential.
Debbie Reynolds 29:56
So where do we go? How do we chart a new path, or a new way forward?
Dr. Nicol Turner Lee 30:01
I was told by someone that you must write a book with a path forward, because no one wants to read a book that makes them sad. So I'm telling all your listeners, there are some really bright moments in there. I just get nostalgic when I think about this book, and if anybody's ever written a book, you know it takes an emotional toll, particularly when you're taking on the stories of people that you meet for the first time. So in the book, I do lay out a pathway. My pathway is that, for us to get to a digitally just society, it's really important that we move away from making this a digital divide issue. This is not a new issue. It's an issue that we've dealt with for many, many, many, many decades, and it's really an issue that should be about the type of universal service that America gives to its people, right? The people who are residents of this country should have access to technology, and we have the means to do that. We have funds that are designed to provide and offset the cost of communication services. As you said earlier, for many, many years, we did so many things analog, and now here we are today, doing stuff online, like I said, from in line to online, and the country really has to rethink, from a policy perspective, what we do. But there are a lot of people who understand that policy perspective, and there are some who do not care. So I give another alternative. As we think about the just path forward, it is so important, with any technological issue, that we center people and their communities. This is about the faces of the digital divide. What does that mean? We're getting ready to go back to school. Many of you remember there were a lot of kids that didn't have laptops during the pandemic, and there will probably be a lot of kids that don't have laptops in this new school year. The pandemic is over.
We haven't seen this as a reason to connect our young people where they learn, and so my path forward is that no child should be left offline, the same way we did No Child Left Behind. We've got to do the same thing with young people. There should be no child that does not have in their backpack some type of technological device, Internet access, as well as a way forward into our society, which is becoming increasingly technical. No worker should be left offline. I think it's particularly important: when I was a kid, I worked at McDonald's; yes, I was in the back flipping burgers. Okay, I was that kid whose mom would tell me when I got off of work, get in the backseat of the car, girl, because you smell like a burger. You smell like you just left your job. Well, today, jobs exist where they're either remote, or there are a lot of jobs that just require what I would call intuitive technical experience, not even a computer science degree; you just have to know how to upload, download, and update things. But no worker should be left behind, and we need to carve out ways in which we're really thinking about how we do worker retraining to ensure that that's the case. The third thing I say in the book is no community should be left behind. It dawned on me when I went to rural areas that there are some people who want to live in rural areas. They should have that right. Not everybody wants to live in an urban area. There are some folks who want to open up a restaurant, stay on Main Street. I met this really cool man who had a one-chair barbershop who had decided he was going to put a Wi-Fi connection in his barbershop because the kids in Staunton could not do their homework, and he said, come in and get a haircut and get online and do your homework. He's like, I love living in this rural community because it gives me the opportunity to do stuff like that.
So I think we need to think about how we make sure that no communities are left offline. How do we make sure churches and community organizations and hairdressers and small businesses are part of the larger ecosystem for the many communities where maybe getting connectivity to the home is just not going to make sense? In rural Alabama, which I visited, a teacher basically told me that during the pandemic, the tablets she had gotten were books without paper, because they couldn't figure out how to get connectivity to kids unless they were in school. Then finally, I just say it's important for us to solve and address these issues before the AI cliff. AI is coming; it is actually here to stay, and before we start addressing the inequities and the challenges of ensuring that people have access to these highly advanced tools, I'm talking about you out there, Debbie, with Gen AI and all these AI tools that you're using, you've got to make sure people have basic connectivity. So that's the other thing I would say: if we're going to go towards a digitally just society, it's important for us to solve this issue. We've had enough time; we know how to do it. In fact, we've had six Presidential administrations that have had the will to do it. It's not just about a shiny object and the hope that you can get on somebody's network. Let's make this work once and for all. And again, I'm trying to bring credibility to the faces. With all the policy stuff, sometimes we get in our own heads, you know what I mean? We're talking to ourselves. It's so important to hear from everyday Americans, everyday residents of communities, people across generations, and really hear what they have to say.
Debbie Reynolds 35:25
Excellent, you're tremendous. I am so excited about the book. I'm glad for this, and I think I'm going to get a signed copy. So I'm really excited about that.
Dr. Nicol Turner Lee 35:34
Yes, I'm gonna send you one. I mean, listen, I cannot tell you, I'm just so fortunate to be on your podcast a second time. I always consider you to be a wise counsel on so many issues related to the technological ecosystem, particularly privacy. I put privacy in this book, girl, it's in here, because I do mention a lot of things that we talk about. I just think it's so important. We've never had a comprehensive book on technology access, and for those people who are like, oh man, this is going to be in the technology section, or it's going to be somewhere embedded in some highly business-focused section: no, this book is for everybody. I kept my aunt in mind when I wrote the book, because she told me, you'd better make it interesting. She tried to tell me to make the words a little bigger in terms of the font too, but I couldn't arrange that as well, so you must wear your glasses when reading this book. But yeah, I really appreciate being on here, because this is such an important topic, I think, for us, even going into this campaign. Everything is technical; it's all online. Let's bring some real credibility to how we're going to solve this once and for all.
Debbie Reynolds 36:46
I agree, and I concur, and I join you in that fight, for sure.
Dr. Nicol Turner Lee 36:50
So yes, this is a movement, not a moment.
Debbie Reynolds 36:55
People, definitely check out the book, Digitally Invisible: How the Internet Is Creating the New Underclass, and follow Nicol. She's everywhere.
Dr. Nicol Turner Lee 37:06
You can get the book at any bookstore that is available to you when you are listening to this podcast. You can go to www.NicolTurnerLee.com, Nicol without an e, and I actually have places where you can purchase the book, read a little bit about me, and stay connected to this movement that I'm hopefully creating alongside many esteemed colleagues like Debbie. And help me out: it's my first book. Buy it and tell me what you think as well. Take a picture when you buy it; I'm on LinkedIn, so post it on my LinkedIn or my Facebook or Instagram so I can see your response.
Debbie Reynolds 37:43
Yes, I would definitely do that. I would definitely do that. Well, thank you so much for being on the show. I'm always excited for all the things that you do, and I'll keep following you, and then maybe we'll have a part three at some point.
Dr. Nicol Turner Lee 37:55
Yeah, let's do it. I mean, that's gonna be my third book: the existential threat of the AI digital divide. Thank you so much. I appreciate you.