E101 - Collin Walke, Attorney and State Representative, State of Oklahoma
47:46
SUMMARY KEYWORDS
data, privacy, people, state, bill, oklahoma, passed, legislation, law, point, committee, companies, federal level, problem, uniform, consent, agree, read, third party, technology
SPEAKERS
Debbie Reynolds, Collin Walke
Debbie Reynolds 00:00
Welcome to our show. And thank you so much to our sponsor today, Relyance AI. Relyance AI helps organizations manage all privacy operations on a single intuitive platform. Relyance AI uses advanced machine learning technology to automatically generate always-live data maps and inventory, a universal record of processing activities, managing the organization's data subject access requests and data protection assessments. So visit www.Relyance.AI. That's Relyance.AI; privacy is in the code. Enjoy the show.
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy Podcast where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, all the way from Oklahoma, Collin Walke. He is an attorney and State Representative for the State of Oklahoma. Before we get started, I wanted to rattle off some of your accolades. I'd love to chat with you about this. You authored and passed the country's first opt-in Data Privacy bill out of the House by a vote of 85 to 11. You're a member of the Data Privacy working group of the National Conference of State Legislatures. You served on the following committees: Appropriations and Budget; Utilities; Judiciary Civil; Insurance; Banking, Financial Services and Pensions; Administrative Rules; and Transportation. Welcome.
Collin Walke 02:02
Thank you so much for having me. I truly appreciate it. And like I told you before, I feel like I should be interviewing you because I'm just so impressed with your background. And I'm glad that I've had the opportunity to listen to your podcasts now. And so I appreciate the chance to be here.
Debbie Reynolds 02:18
Yeah. And actually, you and I have a friend in common. Robin Meyer was on the podcast. She's amazing, isn't she?
Collin Walke 02:26
She absolutely is. And she's been somebody that's been willing to help me along my Data Privacy path as well because she has more experience than I do in it.
Debbie Reynolds 02:34
Yeah, I thought this would be a great episode to do because, especially in the US, people don't understand how difficult it is to get anything passed anywhere, even something like getting a dog catcher ordinance passed at the local level in your city. So doing it on the State level is a really big deal. And we've seen for many years that the States have really taken the reins in terms of trying to pass privacy legislation within the US. So just give me a bit of your background, how you came to this process and why you decided you wanted to champion privacy.
Collin Walke 03:18
Yeah, so first and foremost, to your point about the difficulty of passing legislation. You know, I think that's actually kind of a philosophical intention within the way our framers founded our Constitution, which was, we want it to be difficult to limit people's liberty, right? So I think that it's just kind of a general theme through legislation. And sometimes it's a good thing. And sometimes it's a very bad thing. But I got interested in Data Privacy because I've been in the House for three years. And I read the book "Zucked" by Roger McNamee, and he's a hedge fund investor and former adviser to Sheryl Sandberg and Mark Zuckerberg. And when I read the book "Zucked", I went, oh, my God, my eyes were opened, the future is here, but we live in Oklahoma, where we grow it, feed it or pump it out of the ground. So data was the furthest thing from any of our thinking. And so I read that book, I wrote Mr. McNamee a letter, and I said, I want to do something about this in the State of Oklahoma. And then over the past three and a half years, he's been kind enough to guide me and to make suggestions about whom to talk to in preparing and drafting legislation. Because to your point about the difficulty in passing legislation, I think all too often legislators run bills that they don't actually understand. And that becomes very problematic when you're talking about Data Privacy; what's the difference between pseudonymization and analysis of data? Right? What does it mean to transfer data? What is an API? All of those sorts of things play into this, and it is a highly technical area. And so the consequence of meeting with Mr. McNamee and deciding to try and delve into this meant that I had to learn about Data Privacy from scratch. As an attorney, I represented healthcare entities in relation to HIPAA compliance issues. And so that certainly helped. But it certainly is quite different from just Data Privacy in general. So over those three years, I boned up on it: I read the GDPR and a bunch of other legislative policies, including the CCPA, because I needed to know, do we need a data protection officer in every State, you know, in every office, just like they have in the GDPR? Lots of questions and lots and lots of learning.
Debbie Reynolds 05:31
Yeah, it's fascinating. I love that you said you read that book and decided that you wanted to get involved. That's my genesis in Data Privacy too; I read a book in 1997, actually it may have been 1995, called "The Right to Privacy", and actually, it's a book my mother read, and she was fascinated by it. And so that got me interested. And I was so shocked about what we didn't have in terms of rights. So I thought, you know, we're the land of the free and the home of the brave and all that freedom stuff. You know, privacy has to be in there somewhere. So when you go looking for it, it is not there. You're like, wait a minute, wait a minute.
Collin Walke 06:16
Yeah, yeah, yeah. And to your point, about a year ago, I emailed a data brokerage company called Sift. And I did that because I had never heard of them. But a journalist who lived in California had emailed Sift and said, hey, give me all of the data you have on me. And it included things, unsurprisingly, like the last time she changed her iPhone password, or orders on Postmates, etc., that this data brokerage company had. So I sent an email to them, and I said I want to know, and they sent me an email back and said, sorry, you don't live in California, you don't have a Data Privacy law, you don't get to know what we have on you. And that, to me, just seems fundamentally unfair and discriminatory.
Debbie Reynolds 06:55
Wow, that's eye-opening. Right. Using the lack of laws to say, oh, well, I don't really have to, because no one's come out and said it. They think that they don't have an obligation to give people that data. Oh, wow.
Collin Walke 07:15
Irrespective of the ethics behind it, right? Clearly, they don't care about that. Right? They're like; we don't care because we're selling to other companies, so.
Debbie Reynolds 07:24
Yeah, right. So tell me about the legislative process that you have to go through and tell me about this bill that you got passed. And what were the steps that you had to take to get this going?
Collin Walke 07:38
Yeah, so just as a refresher for everybody, and you know, every State's a little bit different. But as a general rule, a bill starts out in either the House or the Senate, starts out in committee, then if it passes committee, it goes to the floor of the House or the Senate, depending on which side it started on. And then if it passes that house, it goes to the other house, and it goes through a committee there, and then it goes to the full floor, and then, assuming it passes there, it goes to the governor's desk. So there's essentially four areas where your bill can fail. And so with my bill, the first step was to do what we call an interim hearing, and that's very much like a committee hearing at the Federal level. And so at the interim hearing, we had Mr. McNamee come in, we had Internet service providers come in, and we had the tech industry come in and speak about how Data Privacy legislation would affect them. And at this time, there was only Colorado on the books. And I'm sorry, California was the only one that was on the books. And so the consequence of that was everybody pointed to the AG opinion that said it costs $55 billion to implement the CCPA, which is, to be absolutely clear, not what the AG said. During that interim hearing, though, one of the Republicans actually asked the question, could we tax the data that's being collected so that the consumers could actually receive something for their data? And when you hear a Republican talk about taxing something, you know you're onto something, right? They do not like to tax things. And so once that happened, I took a version of the CCPA and modified it because I wanted our bill to be meaningful. And I'm not saying the CCPA isn't meaningful. It absolutely is. Even if you just have the rights to access, correct, and delete, those are meaningful. But I wanted to do something that really put the control back into the people's hands because I believe it's your data. And so I took the CCPA and lowered the thresholds for applicability. So for example, in California, I think it's $25 million gross income before the bill applies to you. In Oklahoma, I cut that all the way down to $10 million. I wanted it as restrictive as possible. And then the real kicker was doing the opt-in because this is something that data companies do not want, whether for technical reasons or for profit. And quite frankly, I think it's for profit. And the proof is kind of in the pudding when you look at Facebook coming out six months ago saying they lost $10 billion in revenue as a result of iOS going to opt-in. And my response to them is, that wasn't your data in the first place. It'd be like me stealing your car and then being pissed off that I didn't get how much I wanted when I sold it. Right. And so I ran the bill, and it caught everybody off guard because, as you know, in Oklahoma we live in the 19th century still; we're not caught up yet. And so for the first committee hearing, I went and talked to all of the committee members of the technology committee, including the chair, and answered any questions they might have. And it passed out of committee, I think unanimously. And the reason is that at that time, people were really starting to wake up to this; "The Social Dilemma" had just come out on Netflix, etc. And so after meeting with the committee, it passed out of there, but it passed with huge opposition from the State Chamber of Commerce, various tech companies, Amazon, Microsoft, etc.
The next step then was to take it to the House floor. And I've got a war story from that if you ever want to hear it, but long story short, AT&T tried to kill the bill and pulled some shenanigans. Fortunately, that didn't happen, and we were able to get it passed out of the House at 85 to 11. During that, I answered questions for about an hour and a half on the House floor from members, because they were on the side of the State Chamber, thinking this was just going to cost tons of money and not actually do anything. Fortunately, we were able to pass it at 85 to 11, which means that it then got sent to the Senate for their committee hearing. The Senate Judiciary chair did not want to hear the bill because she is aligned with the State Chamber as well. So when you talk about the difficulty in politics, it comes down to that one person. If you think 85 representatives each represent 38,000 people roughly, you know, you're silencing 3 million voices when one person says, I'm not going to hear that bill. So we decided, okay, we'll give it another run. So this session, we ran a modified version of the bill, still opt-in, etc. But this time we included limitations on dark patterns and some other things. And at that stage, we were supposed to get a hearing because we had a new Senate Judiciary chair; unfortunately, politics got in the way. He got removed from the chair, and then the old chair got put back into place. So again, politics killed the deal on that side of the building. But I cannot underscore and emphasize enough for your listeners how many lobbyists are out here trying to kill this bill. And lobbyists aren't necessarily bad, and I don't want to make that the point here. Lobbyists can be good or bad; that doesn't matter. But we need people's voices, specifically people in privacy and tech, talking to their legislators so that they can appreciate and understand the reality of this legislation. Because otherwise, they will just believe what they're told. Fortunately, I had the credentials, and I had the trust of the Republicans; my best friend in the House is the majority leader of the Republicans. I mean, I'm a liberal, urban Democrat. He's a conservative, rural Republican, and we don't agree on much. But we agreed on this because we agree that privacy is fundamental to a Democracy. And I think a lot of the problems we're seeing today are a result of the lack of privacy. And that's why we're having problems in our Democracy. So that was kind of the shorter version of how that bill got passed out of the House and killed two years in a row by the Senate.
Debbie Reynolds 13:49
What is the status of the bill right now?
Collin Walke 13:52
So the way it works in Oklahoma is our legislative sessions run for two years. And so both of those bills are now officially dead. The one I passed two years ago was still alive for this session. But both of them are dead now; however, Representative Josh West, the majority leader for the Republicans, is going to continue to run it, even though I won't be here as his wingman next year, but he's going to keep pushing it. Hopefully, maybe the Feds will intervene between now and then. And if they don't, we're going to keep going down this road. Because what we don't want to have happen is for States to pass weak privacy legislation, like what we see in Utah, and then, you know, you have 29 States, and the tipping point has happened where it's all weak legislation, and so when the Feds come in, they pass weak legislation as well. We think that you need to have a balance so that the Feds would have a justification to say, well, yeah, that's a weak State law, and that's a strong State law; which one do we prefer to try and make uniform? Plus, other States are looking around too, and in fact, that's what the lobbyists are trying to do with my bill. Between the time that it passed the House and went to the Senate, they were trying to get us to gut our bill and basically make it Virginia-lite. And their goal was then to take that bill and go to every other State legislature in the country and say, look, Oklahoma did it this way. This is a good bill; just run this. So they're doing a race towards the bottom while we're trying to do a race towards the top.
Debbie Reynolds 15:16
Wow. This is staggering. I'm so glad that you explained this process. Yeah, it's difficult to get these bills passed. And so I see people like you; I consider you a citizen advocate. So we see people who come out of left field, and they decide they want to make this a passion project and really push it and learn how to talk with people in the legislature, explain and answer their questions in a way that makes them understand why this is important. But what do you think about what's happening on the State level in different States? So we know that California has always been pretty progressive on privacy; they've worked on it for many decades. Their laws are very complex, actually, all the different privacy laws they've passed over the years. But what we have seen previously, for example, with California, was that it was the first State to pass data breach notification, right? And now all 50 States have their own data breach notification laws, even though they're different, not the same. Do you think that's what will happen with other privacy legislation? And I'm not talking about the Federal level, just on the State level.
Collin Walke 16:42
Yeah, I mean, absolutely. I think that's what you would ultimately see happen if you had a tipping point of States that passed Data Privacy legislation that varied, because I think, for example, the CCPA and Utah's legislation are not necessarily congruent. And so what's going to have to happen, I think, ultimately, is you'll have what's called a uniform law come up, kind of like we have the UCC, the Uniform Commercial Code, and the Uniform Trust Act; we have all sorts of State-by-State uniform laws that they end up passing. And I think you would see something like that come out for the States as well. Because otherwise, if you have a State like California that's really strong and another State that's really weak, that's going to make California an economic pariah, and businesses aren't going to want to do business there. And so I think that you have to have some form of uniformity. I'm sure you've probably seen the draft that came out of the Uniform Model Legislation Committee, which was just trash; it was abundant trash. So that is obviously a concern anytime you're trying to do something, either nationwide or State by State, that's uniform: you've got to be careful about what type of law you're actually passing. I don't know that you would actually see too many disparate laws just because of the way commerce works and the way that data travels. I don't think you can have that disparate of State laws and still create a uniform functioning data system.
Debbie Reynolds 18:11
Yeah, I mean, I recently went through an exercise looking at State laws. And I'll probably do a presentation before the end of the year on that. And the differences are concerning, to say the least. I almost feel like some States want to have their own bespoke thing, like, we're in Oklahoma, so we're going to put our Oklahoma thing on it or whatever. So, to me, I don't necessarily see a trend towards States being uniform unless they have to. You know, it is clear that a lot of the laws that have come after the CCPA in California have borrowed from it. So that definitely helps. But I still feel like every State in some way wants to stand out in its own way. What are your thoughts?
Collin Walke 19:05
Yeah, I mean, I think that's natural because it's politics, and egos get involved, and not just egos, but also the actual politics of it gets involved, right? I mean, I think part of the reason why Virginia's law passed with so much ease is that Microsoft and Amazon got behind it. Microsoft and Amazon did not get behind my legislation here. You know, if I had a less willing Speaker of the House, that bill would have never made it to the floor. But the Speaker, you know, understood the issue. He understood why it needed to happen. And so I was very lucky in that regard. It's not as though I'm just that persuasive. I mean, well, I think I'm a persuasive speaker, but at the end of the day, politics really does trump a lot of this. Which, going back to your point about the Federal level and concerns about whether it would ever actually become law, politics and money and influence are extremely important. And especially because there is a practical consequence to this, right? I mean, if you think about the small mom-and-pop shops, if I didn't have an income threshold or something along those lines, then, you know, even if it's $20,000 to come into compliance, to hire a lawyer or download the software that they need, whatever it might be, that's a hit to people's pocketbooks. And so, you know, I think that State politicians especially want to be cautious about gouging and hurting their consumers. And so I can see every State wanting to do something different. But I don't know whether or not it would be business-friendly or consumer-friendly, the way they're wanting to do it. I mean, certain States that are more conservative, I think, are going to lean business-friendly, and those that are more progressive are going to lean consumer-friendly. But to your point, at the end of the day, you're going to end up getting form fatigue because you're trying to figure out, what should I do in Mississippi? And what should I do in Colorado? I don't know. And it also creates mistakes. Anytime you don't have uniformity, you're opening up the gate for a mistake to happen unintentionally. But I think a lot of what you're seeing in the GDPR makes a lot of people hesitant to make a mistake because you're going to pay for that if you do.
Debbie Reynolds 21:15
Right, right. Wow, there's so much to think about. Well, two laws that I feel have taken a successful path that I would love to see more of are New York's and Illinois's. So New York, when they passed their SHIELD Act, which is part of an update of their data breach notification law, even though that's a bit of a mess in terms of, you know, all these bills sort of melded together. But the focus of that bill really was cybersecurity and data protection as the way we think about it. And so it had very few exceptions. So basically, if you're a company, you have some type of financial transaction, and you have data of a person in New York, then this applies to you. And then they have what I consider some common-sense, reasonable standards, you know, not saying that you need the Rolls Royce version of a data plan or cyber plan, but something that fits your company; at least you're thinking about it, not, like I say, throwing your hands up and not doing anything. So to me, I'm not seeing a lot of pushback on New York because I feel like the things that they were asking for were relatively reasonable. It's not as prescriptive as California, like, put a button on your website that says this; it's more like, look, you need to have a plan, you need to protect people's data, or, you know, the AG is going to get involved. That's probably something that I think has a longer shelf life and something that people can probably swallow better. And then the Illinois BIPA law; I am from Illinois, so, you know, to me, I'm really happy to see this. And I've seen the legislative process in Illinois; it is bananas, like they pass laws, like, nothing, like bam, bam, bam, or whatever. So this one came out, and no one cared about it for many years. And now it's like this hot thing. But the thing that they did in Illinois with this BIPA law, and it's probably one reason why a lot of companies don't like it, is that they really focus on the harm, you know, what harm can come to an individual when their biometrics are breached? So that's kind of their focus. And that's the reason why that law has stayed on the books and has been pretty fresh, because it's not saying, you know, we're against X technology or whatever. They're like, look, people have biometric data. If you want to use their biometric data, you have to follow these things. And unfortunately, even though it's relatively simple, I think it's less than a page, a fifth grader can literally read it, a lot of big companies trip up on that. What are your thoughts?
Collin Walke 24:21
No, I agree. I mean, there's a legal concept known as the law of the horse, which basically says, you know, back in the day when we all had horses instead of cars, we could have designed all kinds of legislation about horses, like, what happens if your horse goes too fast? Who's responsible if you do this to a horse? Whatever, as opposed to having a general law about property. Right? What is a horse? It's property, you know, theft, whatever. And I think it's the same thing with technology. So for example, going back to the flexibility of New York's law, I think that's really important. In my legislation, I didn't include anything specific either about data security. And part of my thought was, well, I would certainly think as a lawyer that if somebody is hacked or breached, and they were negligent, I've got negligence there. I don't necessarily need to tell them how to do it. Maybe that would be a good idea for sensitive information, right? Make sure it's encrypted, whatever it might be. But I think the legal concepts are already there for enforcement through negligence and things like that. And to your point about the simplicity of BIPA, I agree that the more we can make these laws simple and understandable, the more beneficial it is for the companies, for the consumers, and for the legislators, because most legislators don't understand most of the bills that they read. I mean, I had a bill recently that came up on the House floor dealing with cattle; I don't deal with cattle. So what did I have to do? I had to talk to my Republican friends that deal with cattle. So if you can keep it simple and flexible, that's the best way to do it. Because what are we really trying to achieve here? Two simple things: one, control, and two, accessibility. That's what we're trying to achieve. And so I don't think you have to spell out every single thing. And in fact, I think the more you do that, the more pushback you're going to get, and the more expensive it's ultimately going to be, because you're actually going to have to hire experts to do each and every one of these aspects of the bill. So I agree wholeheartedly with you; the simpler and more flexible you can make a law, the better. We don't need to reinvent any wheels, which was part of my reason for opposing the model legislation that came out of that committee, because it reinvented the wheel; the terms were different, the definitions were different, and nothing about it was similar. You talked about most of these laws starting from the CCPA, and that's right, because why should we reinvent the wheel? Most legislation, in fact, starts somewhere, and we go, oh, that's a good idea. Let me rob it and make it okay for Oklahoma. So I agree wholeheartedly with you on that.
Debbie Reynolds 27:04
Yeah, I feel like we in the US are where the EU was in 1995, maybe even before that, like pre-1995. So in 1995, the EU passed its Data Protection Directive, and then that got updated by the GDPR. But what they wanted to do was address the situation they had at the time: look, we have all these technologies coming out, these new ways that data is being transferred; we need to have some rules, we need to have some harmonization around Europe, you know, the EU. And so that is kind of the genesis of that legislation, and it started out as a directive. And then after many years, they decided they were going to make it a regulation. So I feel like that's where we are right now. Like, there needs to be some type of harmonization of something on a Federal level. To me, you know, the two things that people get hung up on, and the reason why we don't have any Federal legislation right now, is that people argue about the private right of action and then the preemption of States. So my view is to forget both of those. Like, get a dictionary, write some definitions, make the definitions of sensitive data, personal data, and breach notification, and make those all harmonize across the States. And then you can leave the private right of action at the State level, in my opinion, and not even have to deal with preemption. What are your thoughts?
Collin Walke 28:47
Yeah, it's interesting because, you know, on the one hand, you think about a law like HIPAA that preempts, or really at least sets a floor, let's put it that way. And I think that almost has to be the case; we have to have a uniform law. And that probably does preempt most States, or at least sets a floor for it. So if States want to be more secure, they absolutely can. To your point about the PRA, I originally had a private right of action in my bill. But I took it out fairly quickly because, even as a lawyer, I don't think that that's actually going to be the way to work out, you know, grievances over this issue. So I wholeheartedly agree with you on the private right of action. We shouldn't get stuck on those sorts of conversations. We need to look at other avenues, personally. But I will also say that one of the frustrating aspects, and at times a good aspect, of living in America versus the EU is that they're much more communal. They're looking ahead at potential problems and what we can do to prevent these sorts of things. In America, we rely so much more on the market. You know, we believe in these free market ideals. And it hasn't been until very recently that the legislature at the Federal level has realized and woken up to what's actually going on and the harm that's been occurring as a result of a lack of legislation, right? I mean, they're finally realizing children's location data is being sold, you know, and everything else under the sun. So it took a bunch of bad things happening, like it always does. In Oklahoma, for example, there's a photograph of our State Capitol; there used to be 50 oil and gas wells right outside our capitol, right on top of one another. And they realized about 10 years into it that you're wasting oil because you're reducing the pressure needed to pull it up. So they put in regulations to require spacing between oil and gas wells. And that was a consequence of seeing all of the bad things. So in America, I think we're way far behind on legislation and technology because of that free market belief. And so, you know, I don't like the concept of Federal preemption all the time. But when you're talking about data, I think you have to have that sort of uniformity, as you do with HIPAA.
Debbie Reynolds 31:10
Yeah, exactly. Yeah, it's tough. It's not easy. I feel like, you know, it takes many years to build a GDPR or a data directive. So it's not something that can be thrown together before the midterm election with the assumption that it's going to pass and everything's going to be great, right? It happens in phases, and it happens in layers over many years. So instead of trying to do this Hail Mary thing, we need to figure out, I don't know, maybe it's like baking a cake. So instead of saying, okay, we're going to bake the cake on the national level for everybody, you know, someone agrees to the egg, someone agrees to the flour and the butter and the milk and all that stuff, and then maybe once we have those things agreed to at a fundamental phase, we can make a cake? Who knows?
Collin Walke 32:07
No, and you're right, because as I was saying before, you know, everybody brings different concerns to this. So for example, one of the concerns with my legislation was, well, what if I'm an oil and gas company, right, I make more than $25 million, etc., so, you know, your bill's going to apply to me? How does this affect the oil and gas industry? How does this affect the airline industry? How does this affect, you know, because we have Boeing here, you know, so there's a lot of different industries that get affected as a result of this. And so to your point, I think maybe targeted, industry-specific legislation might be a way to start getting there. I mean, there's a reason why the Federal legislature has not simply gone, oh, the GDPR is there; we're going to do the GDPR. It's because businesses don't like the GDPR. And so I do think that you're right, we've got to look at small bites. At the State level, we can do, you know, a lot more, because we're not dealing with 500 members of Congress; we're dealing with 100 or so. And so it's a lot easier to get it done at the State level. But I think at the Federal level, we may have to look at it piecemeal.
Debbie Reynolds 33:14
Yeah. I'm very concerned because a lot of the proposals they have when they think about privacy at the Federal level come out of the FTC, which is focused on commerce; nothing wrong with that. But I think in order for us to move from a consumer slant to more of a human right, I think it needs to be almost like its own agency, its own thing. Commerce doesn't regulate every industry, but we are all humans, right? So in the US, if you're not consuming, you can't really fully utilize the rights that exist now.
Collin Walke 34:01
No, you're absolutely right. I mean, just like with the IRS, you know, using biometric information and scanning your face, facial recognition technology, etc. You really do see where, even though I'm not on the Internet, Data Privacy still matters. When I go through the airport, and they scan my face, it still matters. I'm not consuming anything, but I'm walking about, and my privacy may be invaded, unbeknownst to me. And I think that that's really the scary part. You know, this is a consumer aspect, but just think about it. Most people buy their TV, and they don't think anything about what's being recorded on that TV or what's being picked up from that TV. And so I think the surreptitious theft of privacy is a big problem, even if you're not a consumer, because it's happening all the time. And it's not just you; it's your friends. How many of your friends post your pictures on Facebook? How many of your friends post them on TikTok? Right? I mean, your information is getting out there even if you've got all the control in the world over yourself.
Debbie Reynolds 35:03
I don't know. I have a philosophical point of view about the way we handle privacy now and the way things are going. I feel like, maybe from when I was growing up to now, there's kind of a lack of cohesion, to say the least, around national things, you know, human things, as opposed to me, me, me, you know, the whole individualistic thing. So it's like, okay, I feel like Warren Buffett isn't concerned about his privacy like I am, and he shouldn't be, right? Because he can do whatever he wants and have as much privacy as he wants. Right. And I feel like we're entering almost a caste system around who can protect their privacy and who can't. So even something like Apple rolling out their privacy changes, which is great, right? I'm an Apple user as well. But I'm concerned about people who can't afford Apple devices. Like, what about those people?
Collin Walke 36:17
No, you're absolutely right. In fact, there was a New York Times op-ed a few years ago that really struck me. And it was basically making the argument, about social media specifically, that, you know, the number one indicator of poverty right now is whether you smoke. If you're a smoker, that's a pretty good indication of whether you're in poverty, for socio-economic reasons. And the argument being made was that eventually it'll be social media. Are you on social media? Or are you not? That's going to be an indicator of whether or not you're in poverty. And that is a severe concern. And to your point, you know, when you think about, well, okay, what about people who want to consent to sell their data? Are we going to make it so that only low-income individuals are doing this because they can't afford to pay for the paywall? It is a huge problem. And I think that gets back to the original point: in America, we're very libertarian, our culture is very libertarian, and it's not communal in nature, unlike Europe. And I think that that's a sad thing. I think we all need to realize, and I have a saying about this, that all of us need all of us to make it. And I do truly believe that if we leave anybody behind, exploitation is open and available. I completely agree with you on that. And we have to be concerned about that.
Debbie Reynolds 37:31
I'm happy about a lot of changes that companies like Apple are making, and they are championing privacy, which is great. But I would love it to be a right that I have, as opposed to something provided to me by a company.
Collin Walke 37:50
And that was the selling point in Oklahoma; I mean, from my perspective, and especially with the interpretations of Roe vs. Wade and everything else, you have a Constitutional right to privacy. That's why the cops can't just walk into your house without, you know, a search warrant. That's why you don't have to testify against yourself; our founders felt like we need to have this right to privacy. But in practicality, that's not the way that this has ended up playing out. We don't have that right, so to speak. And I think that's a shame because, without privacy, you don't have autonomy. And if you don't have autonomy, you can't very well have a functioning Democracy.
Debbie Reynolds 38:30
I love the tie you're making between privacy and Democracy; I think it is very, very true. Well, I will talk to you a little bit about the third-party doctrine; you just said something that made me think about that. This is something I've talked about in the past, something I'm very concerned about. The third-party doctrine, as a legal concept, has been the idea that if you give data to a third party, it doesn't have the same protection that it would have if it was in your vicinity, like in your house or whatever. Even though, as we know, in the technological age a lot of us use third parties to store data and share data and do different things. What are your thoughts about that?
Collin Walke 39:12
No, I agree, in large part because you can't do anything today without kicking off a trail of data. It's a simple impossibility. And so, you know, it's not like we're talking attorney-client privilege, right? But even if we were talking attorney-client privilege, at least in Oklahoma, there are exceptions. So for example, if I needed somebody in a meeting to help me understand what was happening or to help advise me, the privilege extends, even though there's a third party in the room with me and my lawyer. Typically, a third party in the room breaks confidentiality. But here, because we're all trying to go towards the same goal, we're able to unite, and it doesn't break the privilege. And I think it's the same thing here. If I transfer my data to a third party, it's because I have a goal in mind. I'm trying to achieve an end of some sort. That doesn't give them free rein over my data, because I think of the data as me; that is me, you are taking portions of me, and you're using it in some way, either for your own profit or for your own benefit. And so at the end of the day, I really do believe that we have to have legislation that addresses this core concept that data is mine, it belongs to me, and I have control over it one way or another, irrespective of whom I transferred it to, unless I consent.
Debbie Reynolds 40:31
Yeah, and that consent thing is a big, big deal and a big issue. You know, in the example that you gave, of contacting that data broker and them saying, you know, you're not in California, we don't have to do X, I'm hoping to see more legislation, and we are seeing more legislation, where they're really putting a finer point on third-party data transfers, especially selling the data. So, you know, would the person reasonably expect that their data will be sold to 100 different companies for some reason that they don't benefit from? What are your thoughts?
Collin Walke 41:09
Yeah, and I can't speak to any other State because I didn't grow up in any other State. But I can tell you, in Oklahoma, there's a massive amount of ignorance around data, right? People just do not understand it; they don't think about it, and they don't have it at the forefront of their minds. And so I absolutely think that it's important that we continue down the path of ensuring that third parties aren't able to dispose of this information. It's befuddling to me why this is, and Joshua, as the majority leader, feels the same way; it is so confusing to us how it is that anybody has a problem with the concept that your data is yours and that you have the right to control it. You know, it's just like, if I had a camera in my house that I set up, recording me 24/7 onto an old VHS cassette, that's mine to do with as I please. Well, today, if I've got, you know, a Ring doorbell or whatever, that information is being kicked off somewhere else, and I don't know where it goes. But most people don't know that; they don't appreciate that. And so I do think that we have to begin an educational process with the citizens, especially in Oklahoma, but I assume in most other States as well, about what this is really all about. And that's why I applaud you, and I applaud, you know, "The Social Dilemma"; I applaud all of these organizations that are coming out to try and educate people, because that's where it starts; it starts with education and communication. If you don't try and educate the populace, if you don't try and communicate with them, then you're just going to be banging your head against the wall all day long. And so I really think that's where we need to head, en masse, on a large scale.
Debbie Reynolds 42:47
Yeah, you just touched on something that's really interesting, and I'd love to get your thoughts on it: consent and choice. So right now, take the example of the Ring doorbell: when you decide you want to use it, you have to consent, and no one reads the 80 pages of privacy policies or whatever. So you're consenting to a lot of this just so that you can use it, and you don't really have a choice to opt out. So you either don't use it, or you consent to all these things. And I think the problem with consent is that you can consent to things that aren't in your best interest.
Collin Walke 43:24
All of this, we all do it because we all want to use the functionality. Yeah.
Debbie Reynolds 43:29
Right. That's the problem. A lot of that is about education or transparency. And, you know, I would love a law where a privacy policy has to be one page. Yeah, I mean, do you sell my data? Yes or no?
Collin Walke 43:43
There you go. And I also think, to that point, you know, they always say, with third-party vendors, or third parties, whatever, you still don't know who they are. And so I almost think, to your point, a one-page policy, whatever it might be, but with a hyperlink where you can also see who all the vendors are, who these people are that you're selling it to. Because I don't think it's informed consent if you just say, we're going to give it to people, right? You've got to tell me whom you're giving it to; then it's informed consent. And I do think that more people would read it if you made it simplistic like that. You know, Cox Communications' privacy policy is huge. And I went through it, and oh, my God, keystrokes and everything else. But I think that if people could read it and go, oh, my God, I don't know these companies, why would I give them this information? Because if you just say a third party, they may think it's somebody that's actually trying to help in the transaction itself, which is usually okay, not a problem. But I'd like to know that; I'd like to know who it is. So I think that we can make clear policies. I think lawyers get in the way because they try and think of every possible thing that could go wrong, and I live by the KISS principle: keep it simple, stupid. That's the way I've always practiced law. And that's the way I try and communicate. Because these things get in the weeds real quick, and most people are in a hurry. And so they don't want to take the time to do it. But if you make it simple and in a format that they can easily follow, then people can give informed consent. And I think it's possible.
Debbie Reynolds 45:19
Yeah. Wow. Wow, fantastic. So if it were the world according to Collin, and we did everything you said, what would be your wish for privacy in the US, anywhere in the world, human legislative stuff, law? What are your thoughts?
Collin Walke 45:36
So I would really like to see something like the Data Collaboration Alliance, or some of the stuff that Jaron Lanier has been working on around control over data, where somehow there's a centralized way for you to control your data. But that's my dream world, in which you could also maybe be paid a return on the use of your data, right? I think that those are good ideas. But ultimately, what I would want to see at the end of the day is my ability to consent before you even collect a thing from me, and then to be given the right to access, correct, and delete, and the right to be forgotten on top of that. Because I think that the onus needs to be on the business: if they're going to use my information for profit, then they should also have the responsibility that when I say quit it, they stop, and they notify everybody else to stop as well. You don't just get a free bang for your buck in this world. And unfortunately, we've let data companies and data brokerage companies get away with this for way too long. And I really hope that something happens in the next year or two to keep this from happening ever again. I really can't tell you how much I think this has had an influence on our politics today. And not for the better, not for the better in the least, because we're all in our echo chambers. The way our districts are drawn up through gerrymandering, those also are echo chambers. And so we've got to get out of the echo chambers. And the only way to do that is by giving people control and preventing specific targeted advertising.
Debbie Reynolds 47:12
Excellent. Well, this is such a great show; you bring such energy and passion to this discussion. I love that you mentioned the Data Collaboration Alliance; I love them over there. Chris McClellan, I know him quite well. They're doing really great work. So I know we're in good hands with people like you, Collin; you're fighting the good fight and helping to advocate for everyone, not just yourself.
Collin Walke 47:36
Well, thank you so much for having me. And thank you for doing what you're doing, because now I'm a follower, and I appreciate you keeping people informed. So I do thank you so very much for this opportunity.
Debbie Reynolds 47:46
Yes, yes, we'll talk soon. Hopefully we have ways we can collaborate in the future. I would love it. Excellent.