Debbie Reynolds Consulting LLC

E96 - Katy Pusch, Lead Director of Product & Integration, Cox2M

Find your Podcast Player of Choice to listen to “The Data Diva” Talks Privacy Podcast Episode Here


SUMMARY KEYWORDS

data, people, iot, companies, laws, user, problems, business, wearables, terms, tracking, privacy policy, collected, product, continue, transparency, complexity, case, important, world

SPEAKERS

Debbie Reynolds, Katy Pusch

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very, very special guest on the show, Katy Pusch. She is the Lead Director of Product and Integration at Cox2M. Welcome, Katy.

Katy Pusch  00:41

Hello, Debbie, thank you so much for having me.

Debbie Reynolds  00:44

Well, this is really going to be a fun episode. You and I have known each other, we met a number of years ago, and we had a chance to collaborate on some work stuff, which is great. And I just think what you're doing is really fascinating. I like your point of view, you're a very smart executive woman, very technically skilled as well. And I would love for you to explain, first of all, your journey into technology and what you're doing with Cox2M? 

Katy Pusch  01:16

Yeah, absolutely. I guess I'll give you a brief rundown of how I made it here to this point in time. I started out in data analytics; actually, my undergrad degree is in economics, not in technology, and I started out trying to help companies make better decisions about their markets and their customers and how to support those. So I started out in data analytics, then several years later transitioned to product management, but specifically for data science products, so that data thread has been with me throughout my career. So, focusing on data science product management, and then finally moving into now leading the department of Product and Integration for IoT product management. I really relate to IoT as just being an additional data collection mechanism; really, the core value that you want to deliver with IoT is still data. And there are a lot of very interesting data management aspects associated with IoT, in particular, as a data source.

Debbie Reynolds  02:23

You are definitely a smart cookie, and I enjoy collaborating with you. I also love your point of view because I think sometimes when people think about going into these emerging data or emerging technology areas, they're like, let's just develop products and put our blinders on and not think about the risk or that thing about, you know, other things. So I feel like you're very balanced in that way. It's like, yes, let's have technology innovation. But let's also consider all the other things that you need to think about when you're trying to develop a product, whether it's for consumers or business.

Katy Pusch  03:01

Absolutely. My very early career experiences were in market intelligence and competitive intelligence as well, in terms of creating the market strategy and competing against competitors. But there are some really clear ethical guidelines involved in competitive intelligence, for anyone who's been in that space before, with not lying being the golden rule, right? And that early experience of brushing up against ethics very early in my career, I think, was helpful because I absolutely continue to view all of my business cases and market approaches through the lens of not only is this good for the company, but is this good for society? And to put that second question, I guess, a little bit more practically for companies: you know, if this shows up in headlines, are we going to be embarrassed? Are we going to be proud? And that's sort of the litmus test that has stayed with me throughout my career that I think results in really good market strategies, really good product approaches and keeps you on the right side of things where you want to be.

Debbie Reynolds  04:08

IoT is such a big space. That can mean almost anything. But what can you tell us at a high level about the types of IoT stuff that you work on?

Katy Pusch  04:21

So absolutely, you're correct. There are so many different flavors of IoT. The IoT that I focus on at the moment is primarily asset management. So can I provide valuable information to companies about where their assets are located and the status of their assets so that they can manage and track those assets more effectively? Primarily, the types of use cases that I'm thinking about, again at the moment, are related to efficiency gains, right, can we save money and operate more effectively as an organization, and of course, that results in the downstream business impact of getting to market sooner and higher customer satisfaction, right? That really impacts the bottom line for our customers.

Debbie Reynolds  05:05

I read a statistic recently that said the average company has more IoT devices in its environment than it has computers. That's kind of shocking. People don't really think about how many Internet-connected devices they encounter when they're at work or even at home. So what are your thoughts about that?

Katy Pusch  05:29

Absolutely. It's less shocking to me, maybe. But I love that stat. And I love bringing that awareness to people, especially if you think about it in terms of asset management, right? If I'm tracking inventory at my company that I intend to sell and process through to send to market, versus the number of computers that my employees are using, I have much higher inventory than computers, right? And so the ratio sort of works out in that favor. We have many more pieces of equipment, I would say, generally, in our organizations than we do people these days. So that metric is, I think, a great educational step that makes a ton of sense in the world that I see. Cox Communications is part of Cox Enterprises, as is Cox Automotive, and so we brush up against the automotive space a bit. And the same is true for vehicles, right? So if you think about asset tracking in terms of tracking vehicles, there are so many vehicles on the road, so many vehicles that businesses employ to help their employees be mobile and do their jobs more effectively. And there's so much data out there to assist in solving these questions and answering them.

Debbie Reynolds  06:50

So as you're working in this emerging space, with all this new technology, is there anything that you've come up against that just really surprised you, that you just hadn't thought about in the past, when you started working on these types of products?

Katy Pusch  07:11

You know, I think about it a little bit in the context of the Gartner Hype Cycle, in the sense that I think anytime you're learning a new industry, you have this preconceived notion of what it could be, right? You're on that hype cycle at the peak of inflated expectations of, oh my gosh, this can revolutionize the world. This was true for me with data science as well, right? When you get involved in the mechanics of data science and really solve problems with data science, you realize that there are fewer business cases that really require those really fancy, advanced techniques. And so I think it's interesting to think about IoT in that same way; there are so many problems that IoT can solve, and in a lot of ways, right, we're on the forefront of revolutionizing. But I think in any industry that you really get into the nuts and bolts of, you run into some common use cases that you want to solve problems with. And you develop a playbook. And then you learn how to improve the way that you deliver that playbook. And I think that's something that I continue to learn about each industry, just that you can learn the patterns of the use cases and the business value that need to be delivered. And you continue to try to push the envelope of that, of course, but there's a set of value that you can deliver with 80% value for 20% effort, and then you continue to work on flavors of that. For IoT in particular, with asset management, I wouldn't say that the amount of data we can process to help solve problems is shocking, though. But it is a different experience actually being able to interact with that data and be involved in the really complex data pipelines that go from the embedded device to the network to the application layer, and how all of that needs to be processed in order to ultimately deliver value for a customer, coming previously from a world of just software application development and the data science related to that. A lot of the data you're gathering from software application management is either directly input by the user or collected through data streams, like if you think about something that you're doing with Twitter data or things like that, but it's still user-generated data. And the scale of data and the complexity of pipelines for IoT is just immensely vaster. So I enjoy talking about IoT to data science professionals because I think it's a really interesting industry to get into at the moment.

Debbie Reynolds  09:56

I really love this area. It definitely gets me excited because I love technology, and I love Data Privacy problems. So to me, this is really, really interesting. One thing I want to talk about, and this is something that I've talked to other people about, and we've seen some cases internationally about stuff like this. So, for example, let's say a company had a CCTV camera, right, that they used in their business traditionally, and they decided they found a new one, right, that has some type of Internet connectivity and some other type of analytics and different things it does. And a lot of times what we're seeing are companies that just say, okay, this is the newfangled thing, let's just install it, and it'll be great. But what they're not thinking about is that these more advanced tools are collecting more data and doing more things on the back end. So you really have to think about them differently. You have to analyze and assess them differently. But what are your thoughts about that switch for people?

Katy Pusch  11:08

Let me know if I'm answering the question that you intended or if I'm going in a bonkers direction. In my opinion, disclosures are just the absolute most important thing about Data Privacy. And I think when you're dealing with an application layer and you are displaying data back to the user, they can clearly see what you've collected about their assets. So if you're tracking asset location and you're displaying that on a map in the application, the user sees that; of course, that's disclosed, and they realize that you're tracking the location of that asset. But I think that it gets even more important with things that aren't as transparent as that, right; not everything can be immediately put back into the display that the user is interacting with. And so there may be areas that are a bit more opaque in terms of what data is being collected and what's being done with it. So I think that finding ways to be transparent about data collection, what it's being used for, and why it's being processed is very important. I think sometimes that can be done in the application, and sometimes it has to be done through privacy policies or training, right, in a B2B context, which is a lot of the space that I'm in right now. Training can play a huge role, especially with management and employees. But even more interesting, one more layer of abstraction, is the different combinations of data and what insights are derived from that. Right? If you process the messages coming from an IoT sensor in a different way, it can lead to different insights; if you apply a timescale to certain position messages, you get a track of where things have been over time. So again, I just think there are so many different levels of abstraction, and if you adjust the data processing, then that is still another moment in time to reflect and say, has something meaningfully changed about the information we're gleaning from this data? And is that something we need to incorporate into our privacy policy or into our training, or into our application directly? So I think continuous reflection on those questions and incorporating them into various methods of disclosure, depending on the situation, is what's really key to me.
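To make that reprocessing point concrete, here is a minimal sketch, with entirely hypothetical message fields and asset names rather than any real Cox2M schema, of how the same stream of IoT position messages can be processed two different ways: keeping only the latest ping per asset for a map view, or applying a timescale to build a movement history. The second output is a meaningfully different insight from the same data, which is exactly the kind of change that might call for an updated disclosure or privacy policy.

```python
# Hypothetical illustration: the same IoT position messages, processed two ways.
# Field names (asset_id, lat, lon, ts) are assumptions, not a real product schema.
from collections import defaultdict
from datetime import datetime

messages = [
    {"asset_id": "truck-42", "lat": 33.749, "lon": -84.388, "ts": "2022-06-01T09:00:00"},
    {"asset_id": "truck-42", "lat": 33.772, "lon": -84.363, "ts": "2022-06-01T09:20:00"},
    {"asset_id": "truck-42", "lat": 33.790, "lon": -84.320, "ts": "2022-06-01T09:45:00"},
]

def current_location(msgs):
    """Processing A: keep only the latest ping per asset (what a map screen might show)."""
    latest = {}
    for m in msgs:
        if m["asset_id"] not in latest or m["ts"] > latest[m["asset_id"]]["ts"]:
            latest[m["asset_id"]] = m
    return {asset: (m["lat"], m["lon"]) for asset, m in latest.items()}

def movement_history(msgs):
    """Processing B: apply a timescale to the same messages, yielding a track over time."""
    tracks = defaultdict(list)
    for m in sorted(msgs, key=lambda m: m["ts"]):
        tracks[m["asset_id"]].append((datetime.fromisoformat(m["ts"]), m["lat"], m["lon"]))
    return dict(tracks)

print(current_location(messages))   # one point per asset
print(movement_history(messages))   # where the asset has been, and when
```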

Debbie Reynolds  13:32

Yeah, one case that came up in Australia last year, it's pretty well known, it was 7-Eleven, I believe. But basically, they had customer feedback. They installed a tablet that gave people the capability to rate the store and different things like that. What they didn't tell people, in a way that was legal in Australia, was that this was doing face captures, collecting their facial data, comparing it to certain databases and stuff like that. And so 7-Eleven is in a lot of hot water for this. And actually, there's another set of cases in Australia for almost exactly the same thing. So I think this is a ditch that companies are falling into, where it's like, okay, maybe they were doing it in some different way before they found this newfangled technology. But then, I think there really has to be a lot of education with companies about how they use it, because a lot of these tools have a lot more capabilities than they had before. So they have to really think through, am I infringing on someone's rights by using this tool in this way? And then, too, I think a lot of companies traditionally hadn't ever had to think about that: okay, I collect data, it's mine, I can do whatever I want with it. But now we're seeing these rights and these privacy laws and regulations rise up, sort of giving people more of a stake in what's happening with their data. So what are your thoughts about that thing that's happening?

Katy Pusch  15:19

Oh, gosh, I think a lot about human data interaction. And I don't have it right in front of me, so I'm going to screw this up. But there are principles of human data interaction in terms of advancing the rights that the user the data is about should have with regard to their data. And at the lowest level, it's transparency, and that's largely where we're at. And then it advances to additional levels beyond transparency. At the moment, for the most part, our choices as users are to completely opt out of the experience that is collecting the data, or consent to the collection of data and the use of that data however the company sees fit, effectively. And so where I would really like to see this industry proceed to, I mean the industry broadly, all of data science, is towards a model that allows more granular choices by users on how they can opt in and out of certain data uses. I think that there's a tendency for technology to want to move so quickly because of this premise that we're delivering value to businesses and to users. And so let's move quickly, we need to get it out to market, we need to be the fastest to market. And I think there's a lot of value in moving quickly and iteratively. I also think there's a lot of value in being intentional with data use, especially in this field that is growing so rapidly, you know, data science brushing up against AI, and all of the very important implications of what an algorithm can do with your data outside of what was initially intended with the data collection. So I think that it's a very difficult problem to solve in terms of how to execute that consent and help users understand more granularly. For instance, what I would love to see is, if a company changes its privacy policy to incorporate additional sets of data that weren't previously used before, that a user can choose to continue using that experience under the previous privacy policy. Now, this introduces a ton of governance headache and overhead for the companies administering it, in a way that could really inflate the cost of the solution downstream. So this is a very complex ecosystem of considerations, and I'm not trying to minimize that. But in my opinion, it's a very positive goal to be striving for: not only transparency, but also more granular inputs from the user on how they want their data to be used as a result of this interaction; and if there are certain things they're okay with, but also if there are certain things where they want to continue to be able to engage with your product but opt out of that element of it.
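As a rough sketch of the granular, policy-versioned consent described here, the example below models a consent record that ties per-purpose opt-ins to the specific privacy policy version a user accepted. The field names and purposes are illustrative assumptions, not any company's actual implementation.

```python
# Illustrative sketch only: granular, policy-versioned consent.
# Purpose names, field names, and structure are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    policy_version: str                           # the policy text the user actually agreed to
    opted_in: dict = field(default_factory=dict)  # purpose -> True/False

    def allows(self, purpose: str) -> bool:
        # Default-deny: a purpose introduced after this policy version, or one
        # the user never opted into, is not covered by the consent they gave.
        return self.opted_in.get(purpose, False)

consent = ConsentRecord(
    user_id="u-123",
    policy_version="2021-03",
    opted_in={"asset_location_display": True, "aggregate_analytics": False},
)

print(consent.allows("asset_location_display"))        # True: disclosed and accepted
print(consent.allows("facial_recognition_training"))   # False: never disclosed or consented to
```

The default-deny lookup is the design choice that would let a user keep using the product under the terms they originally agreed to, while any newly added data use requires a fresh, explicit opt-in.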

Debbie Reynolds  18:33

Yeah, I agree with that. I call it incremental consent. So I think in the future, there's going to be a situation where, based on what you're doing or how you're using something, there may be sort of a stopping point to say, hey, do you want to consent to this thing, this or that? So I think that's kind of where the technology has to go, especially as we're thinking about the metaverse and people interacting with the objects around them.

Katy Pusch  19:02

Yeah, absolutely. And I think that that poses a particularly difficult challenge for some of the, it's kind of funny to call Google a legacy player, right? But their infrastructure has built up over decades at this point. And there have been reports, and I believe they have acknowledged it as well to some extent, I'll have to look up the news reports on that speculation, sorry, I don't have it in front of me again. But, you know, they can't say with 100% confidence where the data has all been sent in their ecosystem, right. And I think that's very common for some of these legacy players that have been processing data for a long time, before all of these considerations were really at the forefront of public discourse or even of technical community discourse. And so it's a lot to untangle and go back from, but I think it's a really important problem to solve. I also think this is a space where the public sector has sort of lagged behind the private sector, and because of that, there is an opportunity for the public sector to learn and implement things even more successfully in some ways than the private sector has been able to. And so I really look forward to the continued implementation of these emerging technologies in the public sector, but in ways that reflect all of these more convoluted, let's say, interactions.

Debbie Reynolds  20:40

Yeah, I think, just in the way that technology has been built in the past, it's like, let's create a new bucket for everything, and let's take the data and replicate it in all these buckets. And then these buckets aren't made to be interoperable; they often aren't made in ways where you can easily take the data out. It's sort of a one-way-in type of thing. And the world is changing, where you have to know where this data goes, who touches it, what you're doing with it. Yeah, so I think we're trying to turn a battleship around over time, in a way.

Katy Pusch  21:24

So there's a lot of love that data science gets in the industry, right? And I also love data science, it's amazing. Specifically within data science, though, I just want to say data engineers are the superheroes of the future, right? And that skill set of ensuring that we understand, well, I'm going to make it a combination of data engineering and data governance, right, that we understand where this data originated, we can do some statistical analysis on how accurately we believe it reflects the population as a whole, we understand how it was processed to get from point A to point B, to point C, to point D, to point E, and all of the stops along the way, all of the processing that happened. I just think there's so much interesting and important work to be done in data governance and data engineering in particular. I think it's another one of those examples, though, where in industry the really important work is not often the flashy work, like I was talking about earlier with data science or IoT. But I just have a lot of love and respect for data governance and data engineering professionals.

Debbie Reynolds  22:36

I agree, they do a lot of the hard work and the heavy lifting and think about these problems in different ways. You know, there was a recent case out of Illinois with Google Photos. A lot of companies get tripped up on this law, the Illinois Biometric Information Privacy Act, and from being in Illinois and watching this over many years, actually, I've talked about this law a couple of years ago. I'm like, hey, you guys need to watch out for this, you're going to be in trouble. So this recent case has to do with Google Photos. For a period of time, they were saying that Google may have taken biometric data of individuals that were in Google Photos without their consent; under BIPA, you have to let people know what you're collecting that data for and how long you're going to retain it. And the retention part is where a lot of companies trip up. I think that's probably what happened in this case. So actually, I think I have a check coming from Google, I don't know, as a result of this, because it is a class action, and they have to make a payout to people in Illinois. So what are your thoughts about this case?

Katy Pusch  23:54

I saw that as well, and I found it really interesting. Also, as you mentioned, the complexity of the state laws; that was part of what you and I had an opportunity to collaborate on recently, the differences in all of the state laws. I thought it was extremely interesting to find this case, where I always kind of have this in the back of my mind that Google's probably using the data fairly indiscriminately. That's just my sort of personal consumer bias; I just have that suspicion. But it was still surprising to me to see that Google had used what I consider to be personal, private information that I store on Google Photos to enhance biometric face recognition, right? And maybe I needed to read that privacy policy really, really closely, but that certainly wasn't something that I would have expected Google to be doing with a private data set that, in my mind, I own. The service that they're providing is storing that data set for me, but that certainly doesn't give them, again, in my mind as a private consumer, access to use that data for whatever they deem appropriate. So it was a really interesting case to me from that perspective, but also from the perspective of these complicated state laws that differ in so many ways. Because if Illinois didn't have that law, then I, as a consumer, may have never found out that my data had been used that way. So from a business perspective, with my business hat on, it's sometimes frustrating that there are so many different state laws to try to comply with; it can really increase the complexity and the stress level of wanting to make sure that you're doing the right thing according to all of the different state laws, especially if you're operating in a national capacity. But from a consumer perspective, this was a case where I was pretty appreciative of this odd little law from Illinois, because it's the only reason that I became aware that something like that was happening. And I think that kind of awareness matters in different arenas, especially because it is an emerging space, and so there continue to be questions about which of these things are most important, like what laws are going to be most effective at managing the patchwork. I think it's a really helpful way to run, if I put it in business terms, iterative experiments, almost, on what types of cases does this cover? What types of behavior is this going to uncover that companies are engaging in? And I hope that over time, it's going to result in more of a consensus on what is most important and what laws really need to cover, versus what is maybe handled outside of literal legal methods. But I found it to be really intriguing for those reasons.

Debbie Reynolds  26:58

Well, I'm hoping that instead of going through the legalistic route, people come to more of a human-centered route and kind of think about it like, what would someone reasonably expect as a customer? So if I give my pictures to Google, do I reasonably expect that they will be used for some other purpose beyond storing them for my own use? You'd probably say no, right? Most people would probably say no, because, first of all, they don't know what it's being used for. And then they don't have any kind of ability to opt out without not using the service.

Katy Pusch

So I think that those things are really important. To build on a few things from our conversation, another layer of the human data interaction model is actually compensation, if I recall correctly, right? And so that feels like a perfect combination with the incremental permissions that you were talking about. But also, if I pay for a service, and then as a result of me paying for a service, that company is going to take the data that I use in their service and make money off of it, right, then that feels like a perfect opportunity to say, one, are you okay with this? And two, if you are okay with this, we're deriving value from using your data set, and so we're going to compensate you for that. And I would love to see that type of compensation come more to the forefront in terms of transparency as well. I'm always very suspicious of free applications, because if you're not paying, then you are the product, is my thought. And I don't think I made that up; I think I read that somewhere originally. But I do believe that, generally, as a rule of thumb, if there is a free product, then they are getting value in some other way besides me paying for it, and what is that way? Very commonly, it's through selling data. But I think there's an opportunity to be more transparent about that monetary flow as well, right? What is our business model? Is our business model to sell the data that you provide to us? And is there a way that you can pay us for our service to avoid us using your data in that way, and then you can still enjoy the service? I think there are just lots of options that haven't been fully explored yet.

Debbie Reynolds

I agree with that. I know our friends in Europe, they are horrified about the whole data monetization thing and say, oh, data shouldn't be sold. And it's like, well, data is sold. It just is, so let's not pretend like it's not happening. But I think people should have more transparency as to what their data is worth, and maybe they'll think a little bit harder about, maybe reconsider, the data that they give to people as a result of that.

Katy Pusch  29:57

You know, we haven't even talked about data brokers and that additional layer of complexity, right? If a company is selling data to a Data Broker, then what happens? Right? It's not very well understood or tracked, in my opinion.

Debbie Reynolds  30:15

Well, I know that regulations around the world, and in the US, are really trying to focus on this third-party data transfer. So if a person gives you data for one reason, and you want to transfer it or use it for some other reason, they're trying to create mechanisms where there has to be some type of notification to the person. And in some places, you have to consent to that third-party data sharing. So I'm hoping that that process will bring a level of transparency, so that people can say, no, I don't want you to sell my data to these unnamed people, when I don't know who they are or what they do with this information.

Katy Pusch  30:58

Exactly. Yeah. Well, I'm glad that is forthcoming. Data brokers keep me up at night.

Debbie Reynolds  31:05

Yes, frightening. It's frightening how many there are and what they're doing with the data. You know, actually, let's talk about health, about wearables.

Katy Pusch  31:16

Yeah.

Debbie Reynolds  31:17

Wearables are interesting; they're a bit of a loophole and a way around the data broker data collection stuff, because I think if people have a wearable, especially one that's collecting this type of health information, they have a misconception that the data collection may be covered by HIPAA, and HIPAA in the US is basically about the patient-provider transaction. So if you're not in a patient-provider situation but you're giving out health information, it has a much lower level of protection. And those things are sort of up for grabs currently by whoever wants to share that information. So give me your thoughts about wearables and this kind of gray area.

Katy Pusch  32:07

Absolutely. So wearables are part of IoT, certainly, and this also extends to other types of health tracking applications; there are a lot of them, whether you want to track personal goals related to fitness, or whether you're tracking a chronic condition that you have as a patient. If that is not a direct conversation with your health care provider, then, exactly, I think about that. As someone who's been active in the chronic care space for a little bit, right, there are a lot of implications of having data out there that is not governed by HIPAA, but where the users may have an unclear perspective on what rights they really have with regard to that data. I think about it a lot because I do think that there are ways to improve the patient experience and improve the quality of care delivered, through applications that help patients manage their own care, through wearables that help patients monitor their own health. I think it's a really important piece of the puzzle as we think about how emerging technology can improve the lives of people, including, in this case, patients and people managing their health. That's one area where, and you're the policy expert, right, so you can help tell me whether you feel like policy or that human approach is the better approach, especially when we think about startups that are bought and sold by different investors and go through many management hands. It feels like an opportunity to look for ways to apply some consistency to how that data is managed, and ensure that the people tracking their health and trying to improve their lives are really aware of how that sometimes really sensitive data is being used.

Debbie Reynolds  34:05

Right, exactly, because what we're seeing is data brokers taking their predefined risk profiles of people, based on what they either collected or things that they're inferring about individuals. In the US, the FTC has a health breach notification rule, which is basically supposed to handle this loophole; it's saying, okay, even though you're not covered by HIPAA, you still have to respect the fact that you have kind of personal health information about someone, and you still have to protect it in a certain way. The problem with that has been that this rule hasn't been very well enforced, so it's kind of a free-for-all. I think, as a result of COVID and all these COVID apps that popped up that weren't sanctioned by a health provider or whatever, this really got on the radar of the FTC, and I think they're going to try to start more enforcement in that area. But, you know, I hope that companies try to go more towards the human approach in terms of what they think is reasonable and what they should be doing with people's data. I don't think people expect it; you know, I wear a watch that checks my blood pressure, and I don't expect that data's going to be sold to an insurance company or whoever, right?

Katy Pusch  35:30

Absolutely. It reminds me of a study I read about GDPR, because GDPR came out, and I was one of the people saying, yay, more privacy laws. And then I read this report that really opened my eyes about how it really favored large companies and stifled innovation for some smaller companies that couldn't afford all of the analysis and trying to understand how to be compliant with the law. And so, gosh, we just keep getting into layers. Intellectually, for me, that's why the Data Privacy and data ethics space is so interesting: there are so many layers of complexity that you need to think about. And, of course, innovation and allowing more value, products, services, and healthy competition to make it into the market is a value of many people in the United States. Generally, I would say we love innovation and startups and market competition. And it's a really complicated balance to try to make sure that the right things are in place to protect people while not unnecessarily stifling that innovation. And so that's just another layer. I think that startups can end up providing some really important value to, I don't want to say to our markets, right, to people's lives more broadly; they create products that can help improve the way that we navigate the world. But it's almost like that funding model even could be reevaluated in terms of what incentives we're driving, and are we encouraging looking at the human aspect versus looking solely at the return to investors. I would love, in the startup space, to see more sustainable growth paths instead of this emphasis on, you know, the 10x unicorn. And I know that's the existing model.

Debbie Reynolds  37:40

Well, I think that we found in our work together that respecting privacy and individuals can be a business benefit. You know, it could definitely be a driver because it creates less friction and fewer problems for a company that wants to use your tool. It lets them know that you're really thinking about that. And then, as a consumer, if you have a choice, you're going to choose the one that's going to protect your data and take your privacy seriously.

Katy Pusch  38:13

How do you feel about all of this? I was thinking about this in the back of my head earlier, but what are your thoughts about decision fatigue in all of this? Right, I think about all of the things that an average human has to make decisions about or think about deeply. And with all the transparency in the world, I sometimes worry about how realistic it is that someone won't have hit their decision fatigue limit for the day when it comes time to make a decision about whether they're okay with Google using their photos for biometric data or not.

Debbie Reynolds  38:45

Yeah, I think something should happen on this point. Decision fatigue is real. It's psychological manipulation, in my opinion. It's basically like, let's give this person 1,000 choices, and then they're going to just give up, and they're going to say okay to everything, right? That's what we're seeing when people say, okay, accept our cookies, and then they have the other button, more choices, and more choices. And there are so many that most people aren't going to do that. So to me, that's kind of a way to pretend like you're complying with the law, but not really. And also these 50- or 80-page privacy policies, that's a decision fatigue problem as well. So I think we need more simplification. You know, I'm like, let's have a privacy policy on one page or something. So, you know, do you sell my data, yes or no? Something very simple like that. I think it is in the best interest of companies to make it simple, and I think the ones that I've seen that have been very, very successful at getting more people, like Apple, their policies are pretty simple. Very simple, actually. So you don't really need 80-page privacy policies; I think you need to make it very clear what you're doing with people's data. And then if they feel like it's a benefit, they'll give it to you, right, and they'll give you better data. More data is not necessarily good data, because people lie, they won't tell the truth about stuff, you know, so a lot of the data that you think you're getting surreptitiously may be wrong and not good quality. So I think the companies that have the trust of people, and not this decision fatigue model, will have better quality data and relationships. So if it were the world according to you, Katy, and we did everything you said, what would be your wish for privacy, data, data science, IoT, anything, anywhere in the world?

Katy Pusch  41:12

It's a good question, and there are so many complicated layers that I have trouble answering it. I feel that I am an eternal student in this space, and I'm actually going back for a master's program to continue to explore this, because there are so many different layers to untangle to advance this discourse. I think there's so much good work being done in all of the areas that we talked about today, but those are all very separate topic areas that can be investigated on their own, and they all have interrelated complexities. If I want to be pie in the sky and sort of ignore those complexities for a moment, I would love for each user to understand what data is being collected, what the implications are to them as a human of that data being collected, right, how it could be used in different ways, and how it actually is being used by the company collecting it. I would really love to see the human data interaction paradigm be advanced. And I think a lot of what I continue to be a student of and learn about is that we can always continue to optimize that human data interaction model, but also, how do we advance on that continuum? There are so many factors, and we get stuck on some of the easier steps as a society, right. And so how we advance that more holistically is really what I want to see happen, where users don't just have the opportunity to opt out, but really, truly have more control over what's being done.

Debbie Reynolds  42:54

I agree with that, more agency.

Katy Pusch  42:57

Agency is the word I couldn't find this whole conversation. Yes, agency.

Debbie Reynolds  43:04

That's what we need. And I agree with that wholeheartedly. Oh, wow. Well, thank you so much. This has been fun. I've been excited that you were able to come on the show, and I look forward to speaking with you about other fun things together in the future.

Katy Pusch  43:18

Thank you so much. This was a blast.

43:22

Thank you.