Debbie Reynolds Consulting LLC


E215 - Jennifer Pierce, PhD, Founder of Singular XQ, AI and Performance Anthropology

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


The Data Diva E215 - Jennifer Pierce and Debbie Reynolds (59:15 minutes)

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know. Now I have a very special guest on the show, Jennifer Pierce, PhD. She is the founder of Singular XQ.

[00:34] Jennifer Pierce: Welcome, Debbie. Thanks for having me on. It's quite an honor. I've been following you for, I think, over a year and I just love your podcast. I think your perspective is fresh and direct and I just love listening to you.

[00:46] Debbie Reynolds: Oh my God. Thank you so much. First of all, and I'm really excited to have you on the show. One of the reasons why I wanted you on the show is you are just ridiculously intelligent. And the comments that you make just really elevate any conversation that you're in. So you have to really put on your thinking cap. And I love the way that you think and the way that you write and explain yourself. But tell me a bit about your technology journey and how you got to be the founder of Singular xq.

[01:20] Jennifer Pierce: Well, my journey is strange, and it's hard to put me in a category, because there was no strategy to who I became or what I was. It was just the series of experiences that came across my transom. I am primarily an artist. I was a theater artist, a film artist, and I wrote and directed. I did a little acting, but mostly I was a writer and director. And as a kid, that was my overwhelming passion, from doing Carol Burnett skits in the living room to writing novels and stories and plays, and clubs with codes to crack for people to join. I was very much into fantasy: Lord of the Rings, Narnia, all of those geeky things. But I also had an uncle, a bachelor who didn't have any children of his own, who was a computer enthusiast and, I think in retrospect, kind of a genius. When I was 8, he gave me a subscription to COMPUTE! magazine and a VIC-20, and I taped over the corners of my parents' Captain & Tennille cassette in order to write my own video games. BASIC was the language I was learning then, and I was hooked. As a Gen Xer, it made me an early adopter, because it didn't really become a thing until a little after that; that was, what, 1981 or '82. But I pursued the arts and philosophy as my primary educational area. Because I came from a humble background, there were no professionals in my family. My parents went to college later in life because they valued education, but I didn't have parents with degrees, so I had to make money all the time. And I found that this hobby of computers gave me an income stream. But my real ambition was to become an anthropologist of the arts and performance, and to be an artist. So those two careers developed together, and it was very strange how they all came together when new media became a thing. My first consulting job was when I was working for a pharmaceutical company.
I was writing and directing industrial film scripts and making video scripts for the CEO of this German pharmaceutical company. And they said, you know, you also fix computers; we saw you under the secretary's desk. And I said, yeah. Well, could you figure out how to put all this stuff online in the new media? And that's how my life as a software developer started. I had worked in the IT department at my college, so I had hands-on skills there, but it never occurred to me that that would be my career. So those interests wove together, and I became an anthropologist of technology, helped to serve clients in all kinds of spaces, building innovations that humans would adopt and use and find useful, learned a lot of the tech stack behind it, and got very into agile software development. And here I am. About two years ago, I turned my for-profit consultancy into a nonprofit, because I'm really interested in how we can take back technology and give it to the many instead of the few. I think we're headed for a plurality, not a singularity. And I would like to open the field up, rather than see it closing down more and more over time, and let people play with this thing that allows us to discover all kinds of interesting things.

[04:38] Debbie Reynolds: Well, I want your thoughts on this. What do you think is happening right now? I don't know if it's a second wave or a third wave, but to me there are two waves that I want to talk about. And it's fascinating; I didn't realize that you and I have some similarities in terms of our geekiness and interest in computing and how we turned it into careers. I was a philosophy major as well, which my parents were very horrified about at the time, but I turned it into something interesting and good. I feel like we're at a stage now that feels familiar to me, because around the mid-90s, when the commercial Internet became a thing and people started putting more stuff into digital systems, there was all this excitement. People didn't know what to do; they needed help and things like that. And I feel like we're there again with artificial intelligence, where people are still trying to figure it out, trying to figure out what they need to do. But it's almost like not knowing enough, or knowing only a little bit, is truly dangerous in this day and age. I just want your thoughts on that feeling. That's just the feeling I get.

[05:58] Jennifer Pierce: I think so, yeah. Everything runs in cycles, right? And I think what we're being told right now is that we're running down the pathway to this singular, totalizing technology solution. But it's actually the opposite. The technologies that have erupted since that first burst of digital transformation in the 90s are so varied. AI, for example, as I'm always saying, has so many different categories of tech within it. But we're being given this image of it all coming down to ChatGPT and LLMs, and of course there are so many other possibilities and potentials for solutions in those spaces as well. So I think it's that moment when the means of production passes from the few to the many. For a while there was this very pulsing diversity of opportunities on the Internet, and then it slowly closed down. And the reason that I had a perspective on that is because, well, I have a uterus. True story. I have three children. When I had those three children, I was still teaching and doing artwork, but I also did a little bit of private consulting for doctors and insurance companies. And they didn't have platform as a service and software as a service then. When my kids were older, and I pushed my little redhead out the door for her first full day of full-time school, I went back into the work sector, and things had changed dramatically. There was so much to get interested in. But now it seemed like every place I went was working on the same solutions sold by the same platform. I'm going to be building a new CRM? Well, not really; I'm building a shell on top of the same platform that was at this other place and this other place. So it closed down for a while. And I think part of the fury that we're seeing now is that it's about to open up again, and people are fighting to stake their claims.
So that it won't happen, so that they'll be at the top of the heap. But I think it's inevitable that it's decentralizing and becoming very diverse once again, like that moment. So I think that's really astute. What area of philosophy did you study? Where did you get your degree?

[08:08] Debbie Reynolds: I went to Loyola University, so.

[08:11] Jennifer Pierce: Oh, nice.

[08:12] Debbie Reynolds: Well, I call myself more of a pragmatist, so instead of a leather patch on tweed jacket type of philosopher, I'm more like, hey, a spoon is shaped this way because it's useful.

[08:26] Jennifer Pierce: Yes. We were actually talking about the American pragmatists the other day on LinkedIn. There are these pockets of very smart, fun people on LinkedIn, and you find them. And we were talking about what a refreshing relief pragmatists can be sometimes. You know, where it's just: what works? What's basically happening here? Love it.

[08:45] Debbie Reynolds: Totally, totally.

[08:46] Jennifer Pierce: Thank you for that.

[08:47] Debbie Reynolds: That's so sweet. First of all, I love what you're thinking, and I think that's true. I think there's been a pendulum going back and forth, and it's been happening on many different layers of technology, where we go from, let's centralize everything, to, let's decentralize everything. It goes back and forth, and I think the future will be a combination of the two. Especially as we go into a situation like now, where we have these massive cyber threats that don't seem to be stopping anytime soon, I think it's going to force more decentralization as a way to protect people. But I just want your thoughts on that.

[09:31] Jennifer Pierce: A hundred percent. And it's an oscillation, right? Decentralization, or any kind of process in history or in a complex system, oscillates; complex systems theory is part of the approach we specialize in at Singular XQ. In a complex system there's an oscillation. It's never static, because a static system is fragile. If you're familiar with the term fragility as Nassim Taleb described it in Antifragile: the more stasis there is in a system, the more fragile it is. For example, I always use the heartbeat. It seems really stable and consistent, but there's actually a degree of chaos in it, which is not the same as randomness, and the chaos protects it. The lower the chaos, the more susceptible the heart is to potentially fatal arrhythmias. That's why heart rate variability is of interest to people. And so it is with any living system. And look, the tech ecosystem is made up of, what? Humans. Nothing in tech runs autonomously, as much as people want you to think it does; we sell it on the promise that it will run autonomously, and we sell fears of it running autonomously, and everything else. But it's an exponent of humans working together toward particular goals, in a kind of haphazard way sometimes. So yes, it's a living system, and it's going to oscillate between centralization and decentralization, and I think we are, absolutely. You see my Linky Love Thursdays; I do that as a tribute to that moment. It was when we went from what I call the Wizard Web to the Muggle Web. There was this period where geeks like you and me were on there, and your friends would be like, what do you mean, you met your boyfriend online? And then, when everything became graphical and user-friendly and accessible, thanks to the whole computing revolution that Jobs and Wozniak led, the Muggles came in and were able to use it the way us weird geeky kids were using it before.
And that moment was really lovely, because it was this wild west of people who were out in front. I remember blogging right next to people like Wil Wheaton, and you would just build little communities by sending links. Then Blogger and things like that came out and started platformizing the whole thing. You'd have your blog links on the side, but eventually those blog links became unnecessary, because you would sign up with a big platform like Facebook, or, if you were a writer, with a platform like Pajamas Media or HuffPo or wherever. So all that went away, and we started leaning on those platforms to do all that tribe-building for us. And they were doing it with algorithms. And, you know, the secret is that there are a lot of human actors in those algorithms too; it's not as magic as people say, though Meta has brought it to a new science. We started relying on that. We got a little bit lazy in our social, tribal coherence on the Internet; we let LinkedIn do it for us. So I'm seeing this future now, after everything's been scraped in the great scrape, where we have to start building those intentional communities, for lack of a better term, ourselves again: finding the people we want to listen to, and not blindly listening to what the algorithms are driving us toward, because that stops us from thinking and doing the work of building community in digital spaces. In fact, building communities that way, we're building fake communities that are being driven by actors other than ourselves. So I love that feeling of that time coming back. I get kind of excited, like, hey, who's in your tribe, right? Who's in your Substack? I'm not on Substack anymore; I'm on Ghost. But who's in your Substack or your Ghost newsletter group? And I'm aware that those places are filled with disinformation and things like that, but it's a chance to find good people again.
And I just love that. I think it's exciting.

[13:24] Debbie Reynolds: I agree. I'm excited about that as well. What's happening now in technology that concerns you, that you see and you're like, oh my God, I don't know if I like this, or it's going kind of off the rails? What do you think?

[13:38] Jennifer Pierce: Well, there's two things. One you're very interested in, so I'll save that for a second. The first thing that I'm super concerned about is the level of deception in technology. LinkedIn is like Narnia; I was joking with somebody who said, I just found you, and you're great, and I don't know how I found you. I said, well, sometimes you go to LinkedIn and it's just a wardrobe with stinky coats in it, and sometimes you go to LinkedIn and the back opens up and there's this enchanted forest with all these creatures and thinkers that you never saw before. And I said, how did that happen? What, did you switch algorithmic gods? But recently a group of us were talking about just this. And I said, thank God I read your stuff, because you studied AI at the PhD level at the same time that I did, and we know what's really happening. I need to see what you write so that I don't go out of my mind when people say things like, don't forget, these neural nets get better over time, and all of these kinds of magical things, whatever limitations you see in "AI" as it's being promulgated. And I put that in quotes, because AI means a whole bunch of things, so much so that when you say, AI will do this, it means nothing. What are you talking about? Are you talking about this tech, this tech, this tech? I'm just concerned that there's no truth out there, and that the people making the decisions are in the dark, as if they don't know that robotaxis actually need highly optimized roads, that they need geofenced locations, and that they have remote drivers. And if you say that, people will say, no, no, no, the remote driver has to be summoned by the car. Well, how often is it being summoned? And how many cars does one remote driver control? So what does semi-autonomous actually mean? What does semi-supervised actually mean? Well, we can't tell you.
That's proprietary information. And there's one scholar who's written about this; the idea has been around since ancient times, before Christ: the Antikythera mechanism. It's usually associated with some mildly racially stereotyped version of the Far East, bringing in a machine that will tell the future and therefore make you rich. The Mechanical Turk is the famous version of it, which later had a contemporary company named Mechanical Turk after it as a little wink-wink, nudge-nudge: that swami sitting in the box telling you the future is actually an actor pulling strings. And it's like Deep Blue with chess. In the 50s, when we had our first machine learning algorithms, it was checkers, and they predicted that within 10 years there would be chess. Well, it was much more than 10 years before a chess match was won by machine learning. And even there, if you look into it, there were human actors stepping in to do what we call fine-tuning. Right? So, the definitions of autonomous and semi-autonomous. And what does it mean when you see a robot dancing on stage? Do you know whether somebody is remote-controlling that robot or not? Do you know what's happening in the back end? It's on a stage; remember, I'm a theater artist. And we take these things to be truthful. I find it concerning how far it has actually gone, and what that means, what the risk is. When you ask people in the know who do these kinds of things at both small scales and large scales, they say, well, you know, we just need to convince the investors so we can get enough runway to actually build the thing. And you say, okay. I get it; everybody's been involved in a demo. If you're in software development and you say you've never been in a demo that had some portion of it that was, you know, faked,
you're lying. There's always that little bit of, I always call it the cardboard box with the slit: you're pushing the card through and going, boop, boop, boop, and here it is. There's always some degree of theatricality with that. But at scale, it's starting to get very scary when there are human lives at stake. Faking a demo in your garage, with a box with lights on it, to get an investor to write a check is kind of cute. Faking it when you're putting microchips into human brains is horrifying.

[18:16] Debbie Reynolds: I agree with that wholeheartedly. When you were talking about that, it reminded me of the movie The Prestige.

[18:22] Jennifer Pierce: Oh, I love that movie. Yes.

[18:24] Debbie Reynolds: Yeah, yeah. A lot of trickery. A lot of trickery going on.

[18:27] Jennifer Pierce: And that came out at the exact same time as the Tesla movie. People in Hollywood were thinking about that issue at the same time. And like in tech, Hollywood properties will copy each other; The Prestige and the Tesla movie were out at the same time.

[18:39] Debbie Reynolds: Oh my goodness. Well, tell me a little bit about what your organization does for companies.

[18:46] Jennifer Pierce: So, first of all, we are a nonprofit. We are fiscally sponsored, and we're about to get our own 501(c)(3); we're going through the paperwork process for that now so that we can be independently our own charity, but for now we have a fiscal sponsor. Our idea is that failed digital transformations, and now even failed AI products, happen because we approach as business problems what are really human problems. And it's not the digital transformation of businesses we should be worried about; it's society and culture. Because a society and culture that does not understand the tech cannot meaningfully adopt the tech, and it becomes a kind of mutually gaslighting game of imposing technology on people, telling them they need it, and then having them figure it out. We have to evolve society and culture to understand tech so that tech can actually flourish. So we focus on that, rather than on the things that people think are going to make money. Having said that, we provide education, open-access research, and open-source development in public-facing, pro-social innovation. And I know everybody says they do that, but we don't take money from big tech. The thing I was trying to explain to somebody is that there's something called NASA's Technology Readiness Levels. We are research-driven; we have a lot of PhDs and engineers involved, and we focus on levels 0 to 3, the levels of research and discovery that ensure you're creating something that's going to create more good than harm, that you're not being duplicative of other products that exist in the market, and that you actually have a feasible, testable concept or a proof of concept. It's our experience that people have been skipping over these steps, that certain algorithms in business want you to skip over these steps, because speed to market has become the sole metric.
There was a reason we had these steps, and a reason that having those steps in place accelerated American innovation to the top of the food chain for a while: we did what a friend of mine I just spoke to might call due diligence before we built. And we feel like we're pretty safe there, because nobody wants to do that work as a for-profit consultant. I can't tell you the number of times, and I'm not going to name any names, that people have come to me for either an SBIR grant, you know, the federal grants, or an EU innovation grant, and they want me to fabricate, or pretend that they have done the 0-to-3 work, so they can get right to the money for the go-to-market strategy. And I say to them, even if ethically I did not have a problem with that, I would tell you: yes, this work is expensive, and the idea has been that if you skip over it and get to market, and can get far enough on attention and investment, it doesn't matter; it'll make up for it. But the work we do at levels 0 to 3 saves so much money in the long run, because it's all on paper; it's all model, theory, hypothesis. We're finding the best hypothesis to test, and the best way to structure your testing and validation: how are we going to know this works? That's the biggest thing I see in these test results on AI tools: the metrics they provide don't test the things we really want to know. Anybody can get a 98% accurate reading on a single point of information. What you want to know, over time, is: is the model stable? You want to know whether it's been stress-tested, means-tested, edge-tested; how long does the model consistently reproduce those results? Because there's an inherent model instability in stochastic systems.
So it's about making sure you're testing the things that actually measure the performance and the outcome. That's the kind of research we do, and the kind of education we do supports people who want to do that kind of work. We do not support any businesses that are currently taking venture capital. You may decide that after working with us you will take venture capital, but we're not interested in that. We're interested in doing that due-diligence, research-driven, ethical, responsible work that tests your idea before you actually bring it forward. And we do that in six different areas of business.
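Jennifer's point about single-run accuracy versus stability can be sketched in a toy example. This is purely illustrative: the threshold "model" and the synthetic data below are invented for this sketch and are not Singular XQ's actual methodology. The idea is simply that one training run can report a flattering accuracy number, while retraining across many random seeds shows how much the result actually varies.

```python
# Illustrative only: a trivial stochastic "training" loop, to contrast a
# single-run accuracy figure with the spread observed across many retrains.
import random
import statistics

def train_and_score(seed, n_train=200, n_test=200, noise=0.15):
    """Fit a threshold classifier on noisy 1-D data and return test accuracy.
    Randomness in the sampled data stands in for the stochasticity of real
    training (initialization, shuffling, dropout, etc.)."""
    rng = random.Random(seed)

    def sample(n):
        data = []
        for _ in range(n):
            x = rng.uniform(0.0, 1.0)
            label = 1 if x > 0.5 else 0
            if rng.random() < noise:   # flip some labels to simulate noise
                label = 1 - label
            data.append((x, label))
        return data

    train, test = sample(n_train), sample(n_test)

    # "Training": pick the threshold that best separates the training labels.
    best_t, best_acc = 0.5, 0.0
    for t in [i / 100 for i in range(1, 100)]:
        acc = sum((x > t) == bool(y) for x, y in train) / len(train)
        if acc > best_acc:
            best_t, best_acc = t, acc

    return sum((x > best_t) == bool(y) for x, y in test) / len(test)

# A single run yields one (possibly flattering) number...
single = train_and_score(seed=0)

# ...but stability only shows up across many retrains.
scores = [train_and_score(seed=s) for s in range(30)]
print(f"single run accuracy: {single:.2f}")
print(f"across 30 retrains: mean {statistics.mean(scores):.2f}, "
      f"stdev {statistics.stdev(scores):.2f}")
```

The spread (standard deviation) across retrains is the kind of signal a headline single-run metric hides; real evaluations would additionally stress-test on shifted or adversarial inputs and track drift over time.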

[23:11] Debbie Reynolds: That's amazing. Thank you so much for that. Well, let's talk a little bit about privacy, because I feel like some people don't understand where privacy fits in with AI and all these digital systems, especially when you're talking about wanting to do things that reduce, greatly minimize, or at least recognize potential human harm. I want your thoughts there.

[23:34] Jennifer Pierce: Yes. There are so many ethical problems in tech, as we all know. Anytime there's an innovation eruption or a disruption, there are ethical issues in the fallout. We've seen this multiple times in history, and we're in that corrective arc now, thinking about the ways to correct it. The most important thing for us is that we don't make the cure worse than the poison, right? Because sometimes the things we administer to solve ethical problems create the exact opposite of their stated intent. For example, you might want to regulate so that people don't engage in anti-competitive practices, but the regulations end up hurting the people who are the competition, as opposed to the people who are closing the competition down. So we decided: what are the three issues, the three hills we're going to die on? Because we have a lot of very passionate, focused people at Singular XQ with a lot of social concern. Our three issues are these. Number one is data sovereignty. We feel that your data is your private self; it is yours to do with as you choose. And actually, you should be paid for the fruit of your labor: if you're creating data trails that benefit a business, they should be paying you for that. We believe that privacy is of paramount importance to any free society. So that's number one, data sovereignty. Number two, we believe in humanity-centeredness in the design of complex technical systems: human flourishing is the most important outcome. We used to call this human factors research, which I was very involved in, and we have a lot of people in our network and community who are human factors engineers. Human flourishing is the center of any tech innovation agenda. And finally, number three, human rights in the global technology supply chain.
And that affects not just conditions in faraway places, but toxic workplace conditions that impact technology today. You can call us bleeding hearts, but we really like the idea, and believe it is true, that the more varied the people working on your hive mind, the better the outcome. For that to happen, there needs to be a mindset shift in how we do work, and unfortunately we haven't made that mindset shift. People are being harmed in the workplace because it's not set up for them, and when we try to correct it, that creates more harm for those people. So we're very keen on belonging, as we're calling it, and psychological safety as part of that initiative for humanity-centeredness in the workplace. You're going to be seeing more from us about psychological safety, because we're about to start season five of our podcast, and the data is in: our psychological safety and toxic workplace content is very popular. We have a lot of psychologists on our board and in our network, and leaders who are about psychological safety and self-care. People in technology have been hurting lately. They're being forced to sign confidentiality agreements, NDAs, and non-competes, and they are scared to speak out. They are being bullied, and they feel like something very big is happening that they can't even really talk about in everyday life, especially people working in the larger companies. We're interested in being a safe place for those people and also in providing an alternative. You don't have to sign up with a team like it's Game of Thrones. There are smaller ways that you can innovate and do good in the world, and there's a plurality of opportunities for you. You may not be resting and vesting and making huge money in your Patagonia vest, but you might. And we love Patagonia; that's no knock on them.
But you might find a decent living doing decent work that does good in humble ways, and you get to solve tough problems for ordinary people. That is kind of the world we want to create; it's the job that many of us have come together and wished we had. Where can I go where the only thing that matters is: was a human helped here? Did we create something with zeros and ones that actually made life better for people? That's what we are interested in. Yeah.

[28:17] Debbie Reynolds: Well, I'm glad that you're thinking that way; I echo your concerns in that area. I tell people I love technology, but I don't love everything that people do with technology. And I feel like there's been a shift in some way, and I want your thoughts about this. Cars could not have become as popular, and car adoption would not have been as deep, if we weren't able to do it in a way that was safe. Right? But I feel like with a lot of these emerging technologies now, people are like, well, let's just build it. And if it does harmful things, well, it probably won't do harmful things to me, whoever's in this Patagonia vest; but maybe it will to someone in the Global South who is doing this background work that you're talking about. Maybe they aren't being paid proper wages. Maybe we are creating harm because we're not thinking in terms of human centricity. I want your thoughts there.

[29:18] Jennifer Pierce: Yeah. And the car is our favorite. Just to do a shameless plug: we actually have a couple of pieces of information out there. Our website is under development; we're in residence at Cornell Studio Lab with students that we get assigned. It's awesome, it's egalitarian, it's very diverse, but it's slow, so our website is kind of a placeholder right now. But soon things are coming together in a digital magazine where you'll be better able to access our materials in a logical way. We did a series on Ford and the car, because it is a great analogy, I think. And I'll give you that note and you can look into it yourself, because a lot of people in technology are actually modeling themselves on Ford, let's put it that way: how Ford created an America where everybody had a car, for good or for ill, and then what trajectory followed. What kinds of horrors happened when the car first emerged and there were no highways or regulations? For example, it took pretty much three quarters of a century to even get a lemon law on the books, so people could sell exploding cars and kill people with no repercussions. So there is a very good case study there, and then you can watch what happens now, as we walk back a lot of the harms that were created with the car. But to your point about the global supply chain: even though we are very focused on helping people here, of course we're thinking most about people in areas that are vulnerable and have the least access. One of the saddest things about my career was when I realized how much of what I was working on was enabling those kinds of work conditions.
When you go into global international consulting, for example, you realize that your teams are not having equivalent experiences at work, that your team might be divided across four different countries and the people in one sector have a very different experience than the others. And you get to know those humans, who their families are and what their situations are, and you say, my goodness, how is this even fair that we're sitting at the same virtual conference table right now, when this person is dealing with that and we're just going home and having a cocktail, you know? I think people really need to look long and hard at this, if you consider yourself a person of conscience. Some people might say that's just the way the world is. But think about the data labelers in Kenya; I think there are some in Ethiopia too. Think about those grocery store checkout clerks: a large e-commerce platform was doing automated checkouts, and it turned out to be a thousand virtual cashiers in another part of the world. And ask yourself not only about the fraud, that illusion of autonomy where none actually exists, just the displacement of labor from one country to another, but then ask: what do you think the conditions are there? Now, there are some people that will tell me. I have done a lot of work in India, not only professionally but as an anthropologist and artist. I'm what they call a dramaturg of dance and theater, so I do documentarian work, and I follow an Indian dancer who I adore, named Kastabe Sakar; I document her work, and I've published about her and photographed her. I love Indian culture; as an anthropologist I studied it, studied the Natya Shastra and other ancient documents of creativity and philosophy in India. I've never been to India myself and wish to go. But I will ask a variety of people in business, in theater, in academia, in the arts, who are from India:
When these technology companies come here and you get a job, is it better than what you had before, or what? And they'll say, it depends. And in a lot of cases, it's better than not having a job. But I asked one person, a very honest human, whose story started in one of these places in India; she came over here on a visa and is now a citizen and a very productive and successful professional. And we have worked in situations where the workplace was very toxic and abusive, and we're like, our health is suffering. And I said, well, is it just meet the new boss, same as the old boss? And she said, you have no idea, as much as I'm suffering here and miserable with this situation. And so I said to this person, I won't use a name, but I said: on a scale of 1 to 10, 1 being this boss is the same as my old boss, same stuff, different day, right? And 10 being, oh my gosh, I'd never go back to the situation I had in my home country, what is your answer? And she said 12. And mind you, in the situation we were in, we were being forced to work very long hours for a very psychologically abusive person. One of us had been in the hospital from their health failing. And we realized that we were dealing with very dishonest business practices. I mean, it's about as bad as it gets. And she still said it was a 12. So I think we really need to take a long, hard look at not just what we're being sold as autonomous, which it truly is not, but what that means for the global South. Tech has been creating improvements all over the place, and that's true. But if we make it so that the EU and North America have all these great regulations and all these labor protections, that just means that business will travel to the places where they don't have to deal with that. And it creates a two-tiered society that I think is already here, but that we really need to look long and hard at.
That's why, so far, we don't get involved in policy. We want to be educated about policy, we want to make some observations about policies, but we don't write policy or advise policy advisors. We may in the future, I don't know; certainly locally we're going to have to get involved as we become a South Carolina-based state nonprofit in technology. But we want to be super careful that we're not actually creating the exact opposite of our stated intent. So if you regulate XYZ here, where are people going to go to get that need met, and what happens there? That's kind of what we think about. And when we say human rights throughout every link of the global supply chain in technology, that's what we're thinking about.

[36:05] Debbie Reynolds: That's tremendous. Thank you for that. I want to talk a little bit about artificial intelligence in general. So I find that a lot of people like us, nerdy, geeky people who've been in data systems and technology for a long time... I don't know about you, but some of the things that I see people post are somewhat appalling, in that they're very misguided and often not true.

[36:30] Jennifer Pierce: Yes.

[36:31] Debbie Reynolds: If you had time to clear up a misconception that someone, or a company, may have about artificial intelligence or using it or working in digital systems, what would it be?

[36:43] Jennifer Pierce: I posted about this the other day. Demand clarity and specificity in what people say. And when you hear something like "AI is changing the world," demand specificity of yourself. So, for example, you know how algorithms work and social media works, and if a certain messaging emerges, now that people are using AI tools to deliver their counter-message, they have formulas. They train these things on agitation propaganda and psychological operations manuals. People don't understand that advertising actually came from psychological operations. That's the whole story of Mad Men: Don Draper, from the Cold War, reemerges with this newly fashioned personality and arrives at Madison Avenue to create contemporary advertising. That's very truthful. The people who were working in psyops in the Cold War did help create the contemporary advertising agency. Now, that's not some vast conspiracy. It's just: we know this works for this purpose, it's going to work to sell things too. But it does mean that there's a formula for selling messages. So you'll see that formula: it will be, AI is not hype, here's what AI is doing, rocket ship this, arrow this, bullseye this, right? And then you look at it. Now ask yourself, what do they mean when they say AI is not hype? Because AI covers so much. Nobody agrees on the taxonomy. We're working on our own taxonomy here, and by taxonomy I mean categories and subcategories and everything else. There's no accepted taxonomy for these technologies, and there are at least 20 super-categories for AI and machine learning. And within those super-categories, there's any number of subcategories. And then, for what purpose? What happened at DeepMind for visual AI versus what's happening in LLMs and what's happening in generative AI? What's happening in the kinds of deterministic AI systems versus more stochastic systems, multimodal systems?
All of these things have different applications, different outcomes, different use cases. So saying AI is going to change the world is a nonsense statement, because it has been changing the world since 1950, arguably before that. The fact that you're on LinkedIn reading about it was created by what could be called artificial intelligence. So start to get more specific, okay? And then the second thing is, demand specificity of the people who talk to you. I had a great conversation with a potential partner the other day when he came to me. I greet people with a kind of curmudgeonly skepticism, because when we first got started, we got approached by all of these bad actors and these crazy people who wasted our time; some of them stole our IP. Now we vet people really hard. So they'll start telling me, we have this product. And I said, really? So these limitations exist with this technology; how are you addressing them, and how is that different from the way other people are addressing them? Because there are limitations to this tech, so you need to know what they are. And if you're going to accept that it's actually innovative, you need to know how it's addressing those issues differently than other people. Be specific. AI is a general term that almost has no meaning. I have a little recording on that on our podcast, if anybody's interested in hearing it, where I go over some of those categories. And it is so diverse that there are any number of other kinds of solutions for analysis and driving better outcomes and decisions that could emerge that look nothing like what we're looking at right now. They're already being developed. So be specific in how you speak. Demand specificity of the people you speak to, and demand specificity of the language that you read on a daily basis. "AI is going to change the world" is, in analytic philosophy, what we call a performative. You're not able to analyze it as true or false.
There's no content in it to analyze. And I'll tell you one thing: if people understood that concept a little more, the LLMs would be working a little bit better than they are right now. So, you know, be specific, and make sure that what people are saying or advancing actually refers to something in the world. Because what predicted protein folding ten years ago is not the same thing that we're being told is changing the world today. And we need to understand why they're different.

[41:25] Debbie Reynolds: I think that's true. I slap my head every time I see someone say, hey, I've used this particular AI and it couldn't do this. And I'm like, well, it's not built to do that. Why are you running into a brick wall for something we already know it doesn't do, right?

[41:41] Jennifer Pierce: Yeah, because we're pretending that they've created this generalized intelligence. That's not there. There is no intelligence in this yet. They're claiming that they can point towards it, but there's no agency or autonomy in these systems. It can only recapitulate information it already has. And when it reaches the bottom, it needs new data produced by fresh human minds. I mean, that's the problem we're in right now: there's not a lot of non-synthetic data left. And so we keep waiting for that moment when it's going to break through and start creating new knowledge. It's not. And that's kind of my joke: I don't think this tech is useless. I think it has a lot of uses, but I think that it's not generative, it's degenerative. We are deconstructing the knowledge that already exists with it, and when we deconstruct it down only a few layers, it becomes useless, because reprocessed synthetic data is poison to the systems that produced it.

[42:38] Debbie Reynolds: Right. And people are finding that out. I'm also talking a lot with people about model collapse and things like that, especially when people are thinking about building something foundational for a particular data system. You want to be able to have a data system, especially in businesses, that will do the same thing a million times if you want it to, right? Not different things.

[43:01] Jennifer Pierce: Right.

[43:02] Debbie Reynolds: Yeah.

[43:02] Jennifer Pierce: And I mean, a lot of us have theories about this. And by the way, model collapse is something we've been talking about for decades, and like nine months ago some of the people around are like, we've been screaming about it. It's like the recording of a recording of a recording, and it's going to create a problem. I don't know if we're going to get to the dead Internet or not, but you know that theory, that there's so much synthetic data that the Internet ceases to make sense, right? But we knew these things, and that's that zero-to-three research: you could have read a few books and seen small-scale models that produce those kinds of results before we had these super jacked-up processing cards from the gaming industry, before we had the ability to get as much data as we did, before we had somebody willing to actually break every known law and scrape that data on the idea that it was fair use. Sure, why not try it, by the time they catch me. And maybe it is fair use; that's being adjudicated, I don't know. But until all that happened, we had smaller versions of these models that we could test, and we knew what the problems were. And the idea, I think, has generally been, as far as I can see, that the volume and size of the system will overcome those limitations, without any evidence of that being the case. But if we just add more parameters... First of all, in data management, you probably know this, I'm just going to say it again: more parameters can mean a bad system. It means model overfitting, right? So we don't want more parameters, we want fewer, because more parameters means more data. And that's what makes me sometimes suspect, in my darker moments, that this was all a pretext to get data, to hoard it for influence, and never about creating a useful piece of tech.
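The "recording of a recording" dynamic Jennifer describes can be sketched with a toy simulation (an illustration added here for readers, not something from the episode): each "generation" fits a simple Gaussian model to samples drawn from the previous generation's fit, so no fresh human data ever enters the loop. The sample statistics, function name, and parameter values below are all invented for the demonstration.

```python
import numpy as np

def collapse_demo(generations=200, sample_size=10, seed=0):
    """Toy model-collapse loop: each generation fits a Gaussian to
    samples drawn from the previous generation's fit, so estimation
    error compounds and the distribution's diversity (std) decays."""
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0  # the original "real" data distribution
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, sample_size)  # "train" on synthetic data
        mu, sigma = synthetic.mean(), synthetic.std()   # refit on that output
    return mu, sigma

final_mu, final_sigma = collapse_demo()
print(final_sigma)  # typically far below the original sigma of 1.0
```

With a small sample per generation, the fitted spread shrinks toward zero over repeated fit-and-resample cycles: the model ends up recapitulating an ever-narrower slice of what it started with, a miniature of the degenerative effect described above.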

[44:49] Debbie Reynolds: I agree with that, and I always wonder. So when people say, oh, we need these huge language models, we have all these parameters, we have all this data, I'm like, for what purpose? It's not useful for me. Why do I need a million cat videos, right? That doesn't do anything for me. That doesn't mean anything for me. So my thought was that a lot of this move was to try to create almost like the new Google, the new superpower company, where we say, okay, well, you can come to us because we have all this information. We want to try to hold onto this kind of centralized way of thinking. But really, like you say, it needs to be much smaller: fewer parameters, more things that are purpose built, right?

[45:36] Jennifer Pierce: Yes.

[45:37] Debbie Reynolds: More manageable. Right?

[45:38] Jennifer Pierce: Yeah, it's the quest. It's like the ancient fantasy novel trope, the quest for the one ring to rule them all. The quest for the large platform that everybody's forced to use. Because the excitement for some people, the geeks like us, was, ooh, I have this friend on the IRC over in Australia and we can get up at weird hours of the night and talk about Kurt Vonnegut, right? That was the geeky joy for us. But for business people, the joy was: my market just went from down the street to the entire globe. Right. And so the size of that market is what everybody's after, and that's where everybody gets into trouble. Creating a small self-hosted language model is cheap and easy for a very specific purpose, if you're not trying to do all the things. And I think that's where we get into big trouble, when we want the big totalizing thing. That's the totalitarian urge, the one platform to rule them all and in the darkness bind them. That's really the issue. But what I know about living systems is that diversity wins out. And I don't mean that in the DEI sense, of course; I do support DEI, even if it becomes illegal to say it in my state. But diversity wins out because it's just the nature of things: the way moss grows over everything, every building that we try to make, or the way things eventually break down to the lowest levels and then endlessly vary themselves like a kaleidoscope. That's just the nature of our existence in our world. And so you can't stop the pluralism of solutions and decisions and cultures that are going to come to disrupt the ones in place. I think what feels so oppressive right now is this one's about to break down again, and nobody wants to let go of the goodies they got while it was consolidating. It's a massive redistribution of wealth, not through an economic system like socialism, but just through the pure force of innovation on the planet.
And if we could lock it down and keep it in one place, then I can stay rich and I can stay on the privileged side of things and you can stay on the deprivileged side of things. But that's not the way it works. Yeah, that's true.

[47:58] Debbie Reynolds: Oh, my goodness. So if it were the world according to you, Jennifer, and we did everything that you said, what would be your wish for technology, privacy, modernity in the future, whether that be regulation, technology, or human behavior?

[48:17] Jennifer Pierce: Oh, my God. That's a big question. I'm going to take that down a notch. I don't know if I want that big a wish, because that's too much power. But what I want people to understand is that you have three things to enforce the good that we've decided on as humans, the way that we make sure that we protect ourselves and our planet and keep the human race (because race outside of the human race is a fiction) moving forward and healthy and surviving. We have these three things here in the West, and also now in emerging governments. It's legislation and regulation, but there's this third one: public outcry. Do not accept what is unacceptable. And it may feel like you're wasting your breath or your energy or your typing to leave a comment, but don't continue to go to the vendors that are doing terrible things to your digital wallet just because they can. Like if your cognitively impaired mother adds her credit card to your family digital wallet on an e-commerce platform: some of those e-commerce platforms are now going to every digital wallet card you have in there without your permission. It used to be they had to ask, and now they just do it. And then your 16-year-old kid who's been working a summer job comes and says, e-commerce platform X just took $1,000 out of my account, or whatever it was. And you say, what? And you go in there: how did that happen? Well, I think the game is, I'm just going to do it until somebody tells me otherwise, because I'm going to bank on the fact that people don't understand that a wallet made out of zeros and ones is the same as a wallet made out of leather. They're confused in their mind: because this is made up of numbers, nobody owns it, so I can do whatever I want, and you gave it to me. Terminate your service. It's inconvenient, I've got to tell you.
I don't order online anymore at all, for that reason: because I will not, for my own convenience, empower this terrible way of doing business, this terrible way of treating your employees, and this terrible way of creating a gig economy that oppresses the most vulnerable. I won't do it. So I get in my car and I go down, I see my neighbors. It's been a good thing, right? It's really not been that hard. I actually spend less money. I don't buy things I don't need. I don't order a gross of something and say, oh my gosh, I only wanted one of those, now what am I going to do with that? It really hasn't given me as much as I thought. It seemed initially exciting to get my favorite mascara without leaving the house, in a big box filled with waste just so that it survives the transport. But I either go out without the mascara or go down to the store and run into my friend. That's just something that we all have the power to do. And it seems small, but little things like that add up over time. Protest. And I don't mean by that a formal protest; that's an option too, but protest by your actions. Say what you think, and don't give in to this idea: who cares, our data's out there already anyway, what can we do, that's just the way it is. Don't give in to that, because it won't always be like this. It is unsustainable and it will change. So realize that public outcry is the most important piece of that triangle with regulation and legislation. I think we're leaning too hard into regulation right now, because all the regulation I'm seeing is going to have the effect of just putting an obstacle in front of competition at the lower levels. And they're just going to buy their way around the regulations that they themselves are actually advocating for. I'm not inclined to listen to Meta until they expose their training data.
They're in the right direction with giving out the weights and the open source. Show us your training data, and don't equivocate on how much of our generated content is in your training data. They equivocate every time: it's publicly owned data, you know, and we're using your data to train our AI models, but not the open source one. Oh, so I feel better about you using my data for your closed source ones? I'm not sure. Make the training data available and I'll listen to you. Otherwise, whether you advocate for regulation or against it, I don't care to hear it. So that's basically where I sit. But public outcry: get mad, get fed up.

[53:03] Debbie Reynolds: I agree with that, and I've actually told people this. That is what I'm seeing: a lot of the behavior change that we're seeing in the marketplace has not been from regulation. It has been from public outcry, right? So, for example, General Motors stopped selling their car data to certain data brokers. And it was legal, quote unquote. But customers of GM said, I don't trust you anymore, I'm not going to ever buy your car. And because of that, they decided to change their behavior. So I think we can't discount how important it is to be able to make your voice known. Because what businesses don't want to do is lose business. If they feel like they're endangering their business, or that customers will turn away from them and go to someone else, those things do get listened to.

[54:00] Jennifer Pierce: They do. And look, this thing with the data and them running out of data: they need our content. They need the traces of data that we leave behind us. They need us more than we need them. They need the data we produce that have made them the data dragon lords of the, you know, future feudalistic society that's emerging for us, or whatever nightmare landscape you want to imagine. But we generate the data that they use, and they're nothing without us. We've allowed them to be this for convenience, for connectivity. And so, trust me, I know. I've been trying to extricate myself from any number of platforms, and there are certain things that are built into my way of doing business and my way of even operating my family. Like, Facebook became my lifeline to my octogenarian and nonagenarian relatives that want to see pictures of my kids. And it's easy to check in on them that way, like if there's a hurricane. These are beautiful things, and it's hard knowing what to do. There are some platforms that I'm using as a necessary evil, but not without being vocal and putting signals back into the system about what I think about it. And I've seen some exciting things this week: some exciting innovators being given grants to do AI that is data sovereign, that has transparency in the global supply chain, and that is fully transparent and open about their training corpus. Because if it had stayed truly open source, it probably could have been a boon to humanity. But they manufactured this idea that it had to be closed for our own good, you know, and of course that is not the case. And then surveillance: let's make sure that we don't allow this to become a massive global surveillance state. That is probably my biggest fear about all of it.

[56:06] Debbie Reynolds: Yeah, I share your concerns, absolutely. Well, thank you so much, Jennifer. This is great. I'm so happy to be able to meet you in person, digitally, and be able to talk with you, and I'm excited about continuing conversations with you online, for sure.

[56:21] Jennifer Pierce: Me too. I would love to have you on our podcast soon. We're taking a little hiatus, and then I'm planning season five, and we're definitely going to have some cybersecurity shows. So I definitely want to hit you up then, because you would be an amazing guest.

[56:37] Debbie Reynolds: I'd be so excited. Thank you for the invitation.

[56:40] Jennifer Pierce: You're welcome. And thank you for being you. I just love your voice. I love the sound of your voice, for sure, but I love the fact that you speak so candidly and directly from your very, very impressive experience.

[56:52] Debbie Reynolds: Oh, thank you. Oh, my goodness. Well, I feel the same way about you, so you're amazing, and I love what you're doing. So let us know: what's the best way for people to get in touch with you if they want to use your services?

[57:05] Jennifer Pierce: Yeah. So we're at www.singularxq.org. We have a business page on LinkedIn, and I'm pretty prominent on LinkedIn; you can follow me at Jennifer Pierce, PhD. A pretty easy email to access and remember is info@singularxq.com or admin@singularxq.org; we have two sites. You can subscribe to our newsletter on Ghost. It's all on the website, and it definitely helps. We're a nonprofit, so, and we're not very good at this because right now we're building, but our paid tiers and our free tiers are the same, because we want to be open. A lot of people subscribe at $5 a month or more just to keep us going as we build up. When we get the 501(c)(3) paperwork, that's when we can apply for some serious funding, and that's been keeping us going. So if you can subscribe, it really helps us. And know that we make a special effort in who we employ: most people inside the company now are volunteers, but for the people we do employ, we tend to favor people who have experienced barriers to access to careers in tech or have been harmed in the technology workplace. So we work with people with disabilities and people who are recovering from workplace trauma. And the money that we do bring in, the very small revenue that we're bringing in right now, does go there. So help us out if you like what we're talking about.

[58:36] Debbie Reynolds: Excellent. Thank you so much, and I look forward to chatting with you more, for sure.

[58:41] Jennifer Pierce: You too, Debbie. You have a great night.

[58:43] Debbie Reynolds: All right, you too.

[58:44] Jennifer Pierce: Bye.