E21 - Jay Glasgow, Founding Partner of The Privacy Co-op, Information Rights Executive


 

Jay Glasgow

40:20

SUMMARY KEYWORDS

privacy, data, law, authorized agent, tort, rights, regulations, op, podcast, people, happening, year, state, businesses, apply, publicity, largely, called, affiliates, private

SPEAKERS

Debbie Reynolds, Jay Glasgow

 

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva Talks Privacy" podcast, where we discuss privacy issues with industry leaders around the world to tell businesses things that they need to know now. I am joined today by Jay Glasgow, who is the CEO of The Privacy Co-op. He always talks about the value of your rights. Jay and I met on LinkedIn. I think you sent me a message, and you and I had some chats back and forth, and we had a call recently. I am rarely ever at a loss for words, but some of the things that we talked about left me speechless. I was saying, oh my god, we have to put you on this podcast, because some of the things that you're talking about are so spot on about the way privacy laws stand now, or what we lack in the US. I feel like you have such a deep understanding of how that works, and also of the privacy rights people have and what they don't have, and how to work within that system in a way that I feel a lot of people don't understand. So maybe you can give just a little bit more background about yourself, what you do, and why you're interested in, or why you're passionate about, privacy in your work.

 

Jay Glasgow  01:41

Sure, thank you, Debbie, for inviting me to the podcast. I've listened to several of them, and your show is always exciting. I can see why I think you got awards recently for being in the top 10 podcasts, so it's an honor to be here. As far as how I got into all of this mess and why I'm passionate about it, I wish I could say that I charted out my career, because it would make me sound a lot smarter than I actually am, but I think there were a lot of, you know, convenient blessings and opportunities that just kind of fell my way. For the past 30-plus years, I've been in the telecom industry largely and have had the benefit of being in the room when a lot of innovation is happening. I'm on either 15 or 20 patents, all in the area of convergence and data use and big data. When you talk about convergence, that's when everything was coming together around the millennium, around the year 2000, where telco and television and mobile communications and everything was merging. So, you know, we just about couldn't turn around without having some sort of innovation, and early on, we realized that privacy was a serious concern. The phone company is certainly heavily regulated by the FTC and FCC, and other authorities worldwide, so we had to be very careful about data use and privacy, and we looked at different ways we could help companies. I was key in the publication into the public domain of a standard that's been adopted by many of the larger companies, called Personal Levels of Assurance, about ten years ago, and then from there went into big data with AT&T.
So six years ago, I was brought in as the architect for consent management in big data. When you're dealing with petabytes of data a day, and potentially seven to nine different uses of that data, some of them are opt-out, some of them are opt-in, and in some of them you can't have a say, like fraud. We had to figure out how to deal with this when we're dealing with very, very large data sets as well as transactional consent. Then I was very fortunate, about a year ago, to join The Privacy Co-op as the CEO. It was, again, a great opportunity to come in. So I guess I've just had a great view into some of the complexities that you deal with on your podcast.

 

Debbie Reynolds  04:22

Yeah, that's excellent, that's excellent. I think, maybe similar to you, I sort of rode the wave as being, you know, data adjacent, I would say. So if you're someone who's been dealing with data over the years, this privacy thing has come up, and it was probably a bullet on a longer list of things, where now there's so much more focus, because there's so much more data being generated in different places. And then, too, you have the experience of working in regulated industries that have to get this right, where industries that aren't as regulated have been sort of, you know, devil-may-care about this, and they can't be anymore. So, you know, I always say, great wealth precedes regulation. So I think we're on the cusp of it.

 

Jay Glasgow  05:21

Oh, yeah. Oh, yeah. I mean, you know, to that point, in 2009, I was asked to be the tech lead and Project Lead for Project Fulcrum, which was identity across all of AT&T. So if you can think about the different forms of identity that a very large telco might be engaged with, and the laws that apply to that, we were learning all kinds of things, like that the age of majority is totally different from state to state within the United States. I thought it was 18, but in some states, it's 20, and in some states, it's 21. And then from country to country, you have countries out there where the age of majority is different for a boy and a girl, right? And it's kind of crazy. You see ages from 12 to, you know, 24 in the wild. So just looking at the laws and how they apply, not just to privacy, but identity. And so again, I was very fortunate to have made that track. It was painful at the time, but I'm glad now for all the work we've done. I'm sure you're in the same boat.

 

Debbie Reynolds  06:20

Yeah, definitely. Yeah. So I run into that all the time about identity. Identity, to me, has been sort of in the background for many years, but I feel like it's going to be in the foreground for probably the next ten years, because it's going to be at the center of a lot of what's going to be happening.

 

Jay Glasgow  06:38

Yeah, Phil Windley and Kaliya Hamlin will be glad to hear that. They've been running the Internet Identity Workshop, I think it's twice a year now, on the East Coast and the West Coast. And they've been doing that for, oh my goodness, more than ten years. So they'll be glad to hear that; I'm sure their audience and attendance are going to be going up.

 

Debbie Reynolds  07:00

Yeah, I tend to be right about these things. So I have a good feeling about this, and I've been talking about it for a couple of years. I love to see that, for sure. Let's talk about, you know, obviously, The Privacy Co-op; you work with people all over the world, but I feel like the US has peculiar privacy issues, in my opinion. A lot of that is based on something you testified about, which is how things are different from state to state, and how we have some laws that are very sector-specific, and there are some gaps in there. I'd love you to give a fuller overview of what you are doing at The Privacy Co-op, and then get your view about what's happening in the United States as it relates to laws and consumer or human rights and privacy.

 

Jay Glasgow  07:56

Oh, wow, what an excellent question. First of all, privacy is an individual right. It's a personal right everybody has, and typically personal and individual rights are secured at a national level in most countries. Certainly, here in the United States, the right to privacy is a national, Federal law issue. And if your audience thinks about it, you can't sell an individual right. There's no way to buy and sell individual rights. So the right to privacy cannot be bought or sold, just as a basic legal theory, and that's true in most countries that we work in. Now, publicity is something totally different. Publicity is contractual law, and contractual law is largely settled at the state level, or the local, regional level in other countries. Publicity can be considered a contractual right, and it can be bought and sold. Now, in most countries around the world, they've done a pretty good job of keeping those two things separate, publicity law from privacy law. But about 120 years ago, with the advent of Hollywood here in the United States, and the fame of celebrities, and the legal clout that was forming in Southern California, you saw a kind of conflation of those two things within the laws here in the United States. So it gets lumped together, and the same law that would, you know, protect Cary Grant from having a photographer snoop up and do Peeping Tom-type tort violations by taking a picture through the kitchen window was largely applied to the fact that you could put, you know, Cary Grant's photo on a box of Wheaties, which would be misappropriation, which would be more of a publicity law or contractual law issue. We just lumped all of that together and called it privacy.
So there's some confusion now in the United States, as people are wrestling with the fact that we're all on this track for individual fame. We want to publish videos on YouTube, we want to get likes on LinkedIn, we want to put ourselves out there, and yet we're dealing with these things and calling them privacy laws when, in fact, they're really probably more contractual publicity law, and those are settled state by state. Now, where The Privacy Co-op comes in is that we started realizing that there's nothing new about individual rights being secured at the Federal level, and there's nothing new about contractual law being established at the state level. Just because we put the word tech in front of it, or high tech, or we put an AI in front of a word, or an e in front of the word, we feel like we have to go and write totally new law. But the truth is that Benjamin Franklin was trying to secure and reduce risk and liability for new, innovative products back in 1753, before the country was founded. There's a set of common law around cooperative associations that has evolved into Co-ops and nonprofits, really two different classes of businesses now. The Privacy Co-op came along and said, can we do this? Can we take all of this existing infrastructure, or framework as we understand it, and just apply it to data? And we found success in that through the use of two pieces of paper. It doesn't get much more low-tech than two pieces of paper.
We have a piece of paper for our members to sign, and you can become a member; hopefully all your listeners will become members of the Co-op. Here's a shameless plug: I think it's $25 for a lifetime membership, so it's not like I'm out to make a ton of money here off of memberships. You can become a member of the club; that's buying a share in the Co-op and signing a piece of paper where you're asking us to manage your affirmative express consent and the licensing of your information rights, much like Cary Grant might have had an agent representing him for his face being on a box of Wheaties. So this is very much, largely, the publicity side of this. We guard the privacy side, but at the same time, we help our members manage the publicity side, and that's one piece of paper. The other piece of paper is for our affiliates, for businesses that come to The Privacy Co-op and say, hey, we have all these pressing global regulations around privacy bearing down on us with these heavy fines. There was an outfit that was fined $6.5 billion under the EU's GDPR a year and a half ago, right? So this isn't chump change, and it's only increasing as these governments see the possibility for increased, maybe even unlimited, revenue through privacy fines and violations by businesses. They're setting up all of these tripwires for businesses to stumble across and say, oh, you violated another privacy law, you need to pay us a billion dollars. Well, all of that money, is it going back to your listeners who are generating the data to begin with, you know? Did the government say, hey, the value that was added to the system was added by the common person using Facebook?
No, they're just using that money for something else, right? So at a certain point, it starts to look and smell a lot like a tax, or taxing, right? If you're generating the value for the platform, and the value of the platform is making that platform billions of dollars, and a government comes in and sets up some regulation and takes a big percentage of that money, it starts to look an awful lot like a tax, not on the business, but on you. It's your information that's generating the value, and the government is reaping the reward. So as we looked at this, we said, hey, for the affiliates of The Privacy Co-op, for businesses that are trying to find some sort of cover from these ever-increasing regulations, perhaps they become an affiliate of The Privacy Co-op and license the use of our members' information. That's what we're brokering; what we're managing is the information rights for our members, right? So this is a perfect union of two pieces of paper that are helping businesses come into compliance with regulations around the world, including, you know, CCPA and US 74 here in the United States, and others, as you've mentioned. There are 50 states now that have various forms of some regulation around data. We're also helping our members secure, you know, those picture-on-the-box-of-Wheaties royalties, or dividends, that are due to them. So here's your opportunity: if you're generating the value by creating data, some of which is information, then here's your opportunity to find an agent that will represent you in that and make sure that you're getting your fair share, while we guard the publicity, excuse me, the privacy side of the house, right? So that's, in a nutshell, what we're doing as a nonprofit, as The Privacy Co-op. There are other authorized agents in the United States. Fairly famously, Andrew Yang ran for President this past cycle, and after stepping down from that effort, he formed the Data Dividend Project, which is a nonprofit doing largely the same kind of work.
There are others that are stepping into this. Why this is happening: I heard a lawyer one time say, new law begets new business, right? There's a new business opportunity when you have new law. As the regulations carve out this role for an authorized agent, you're seeing more and more authorized agents pop up, but we like to think The Privacy Co-op is the best.

 

Debbie Reynolds  15:45

Very good, very good. I would love to talk about, I think before the call, we were talking a bit about one of the previous podcasts that you were listening to, with Memme Onwudiwe, where we sort of touched on the prospect of a Federal Data Privacy law. You also brought up one of these COVID laws that I would love to talk about, and then also the idea or the potential around a Federal Data Privacy law as it relates to potential preemption of things that states are passing right now, and then also a private right of action. That's a big question.

 

Jay Glasgow  16:28

It's a big question, and we can peel that onion. First of all, you know, let's start with everybody admitting that COVID tracing is reprehensible. Worldwide, we haven't found a great application of it yet. I would say probably the best place around the world that we've found at handling privacy rights and data rights and human rights, and still doing a good job of COVID tracing, is Alabama. Remarkably, they have a General Counsel out of the Attorney General's office there in Montgomery who has been working on this specific problem since last spring, and she told us, as we were exploring how different states and regions around the world are doing this, that they were so concerned about it that when they realized they needed more people, and, you know, 35 states, I believe, have outsourced COVID tracing, in Alabama, instead of outsourcing, they insourced. They went to UAB, which is a medical college in Birmingham, and had them come on board to help, because they understood the HIPAA regulations that apply to patient data. Now contrast that to the State of Texas. They're on the other extreme of this scale. There was a bid process with some pretty heavy hitters like AT&T and Cisco and others that were under Federal regulations, under FTC and FCC regulations, and we totally consider that they would have done an excellent job of maintaining data security. Instead, an outside outfit called MTX Group from New York came in.
My understanding is that they hired two lobbyists in Austin, Texas, for $50,000 for three days of work, and they ended up with a $295 million contract where the health department was turning over all COVID testing records to this private organization. Now, you look at their website: what are they doing with the data? Well, they have ten products listed on their website. Nine of them are largely data collectors, and then there's one that sells data, and it's really interesting to see what the one that sells data does. It looks like it sells insights to politicians that are running for office. And, you know, we are a year into a three-year contract, and there's not been a single audit. There's no ability for people to opt out that we've been able to find, so this organization is still collecting live data on our citizens. The Houston Chronicle said that the language was meant to "track us down and monitor us," right? So it's very scary what's happening with COVID tracing. At a national level last year, five Republicans on the Commerce Committee co-sponsored a bill, S.3663, the COVID-19 Consumer Data Protection Act of 2020. The regulation is awesome. It looks as buttoned-up as, and maybe even more innovative than, the CCPA, and it does a really great job, and it's very narrowly focused. So when Memme, on your podcast, said he didn't see an appetite for a sweeping Federal Data Privacy regulation, he's right, but I think it can be carved out one topic at a time. This S.3663 stalled in the Senate Commerce Committee last year, and it was largely over something you've pointed out in your previous speeches and podcasts, and that is that the two hills that these politicians fight and die on are known as preemption and private right of action.
Preemption is the Federal government being able to preempt a state government's law, and private right of action is sometimes couched as flooding the courts with frivolous lawsuits. So naturally, you see partisan wrangling on those two hills, and it's become so politicized and so polarized that you will have politicians arguing over a word as a knee-jerk reaction, even though they later hypocritically argue the exact opposite position on something else. The very politicians that are saying that we should not clog the courts with frivolous lawsuits are the ones that are complaining most about being de-platformed or censored by social platforms, and they think the platforms should be held liable. In fact, they want to get rid of Section 230 so that the platforms can be held liable. They're the loudest voices for a private right of action, and yet, when it comes to privacy and data regulations, we see them very vehemently arguing that, you know, we should have no private right of action. So with this bill stalled last year, there's this really incredible opportunity. If I may go into one other vector for you, since we're trying to peel that onion of a very large question: with COVID came mandated distance learning. What happened around the world? Schools shut down. Kids were forced by law, by government mandate, to stay at home and do distance learning. We've had conversations with a number of technical directors and CTOs for school districts around the world, largely focused on New York City and also Texas, to find out what's going on specifically in those regions, but also in other countries and other states. And what we're hearing across the board, Debbie, is another, I hate to use the word, reprehensible situation. It just seems to keep coming up. Some of our friends in the World Economic Forum told us an interesting statistic as we started researching this. Did you know that COVID-mandated distance learning is impacting 1.2 billion children in 186 countries?
Now, I don't think there are very many more countries than 186, so we can say this is happening in virtually every country. And what's happening, from these technical directors' point of view, is that when COVID hit, they naturally had already started some distance learning, some software platforms, some education through automation, but they didn't have the whole suite. They had gaps, and they're trying to fill the gaps for the children immediately, and they're not given any additional budget. So, candidly and largely off the record, they're telling us that they are signing contracts with software manufacturers and platforms where they are bargaining away the children's information rights contractually to these educational outfits. And we've been talking to some of the chief executives of these educational outfits to try to piece together what they are doing with this data. Well, you know, I don't know if you've seen the documentary "The Social Dilemma." I bet some of the folks that are listening to this podcast have. And I don't say that they're going down this road with malicious intent. I'm saying that they're attempting to make their educational software the best that it can possibly be so that they can maximize their profits. But if your goal is to increase the attention of a child and get good marks from the teachers, then you start to see the software play and dabble in an area which used to be considered unethical by teachers, and that's an area called tracking. So, you know, it's totally possible that we could see an era coming up here where, and I'm just going to be very specific, if a young girl in the fifth grade doesn't seem to be doing very well in math, the artificial intelligence of the education might be skewed to keep her attention by introducing other topics and other subjects. They may start weighing that attention time so that the product is made better for the profitability of the company.
Well, this could mean, you know, potentially: you're not very good at math, so let's do Home Ec instead. That's something I'd be interested in, right? What are we doing to the next generation, right? So we have this massive problem that we need to address worldwide with children's data rights, and The Privacy Co-op is very, very active in working on this. Well, if we realize that that's a problem, and we look at that bill that's in the Senate, S.3663, which is around COVID tracing data privacy, you know, we're almost one word away from saying children's data rights, or educational, you know, mandated distance learning, right? The bill, as it's worded, could easily apply and be broadened with a sentence or two to apply to children's data rights. Now, what are we doing here? What am I suggesting? I'm suggesting giving parents the ability to opt their children out of these contracts that are being formed. And what does that do for the school board? Does that hurt the school board or help it? Well, the school boards are doing these negotiations without any leverage whatsoever. Their backs are against the wall. But if the parents are now starting to talk about opting out versus opting in for the data use, the school board can say, hey, how about if we work on the licensing through the guardians in a transparent way for you, the software company, so that you can become more compliant with regulations around the world? Now the school board has leverage in that negotiation. By working through an authorized agent, like The Privacy Co-op or the Data Dividend Project, they're doing a great job of helping to almost commoditize or bundle information rights and guard them in a co-managed way. Now they've got a different chip in that negotiation, right? This is a different gambit that they're bringing to the table. This ultimately helps the children, and it ultimately helps the guardians.
And it can be done in an easier way than throwing the guardians 17 contracts that are 250 pages long and telling them to read through them and figure it out for themselves. Right. So this is a great solution. Where this comes back into Federal law is that we just had a change in the dynamics of the politics in DC, with Democrats taking over both houses and the Executive branch, right? They already had the House, but now they have the Senate, and they have the Executive branch. Well, here's a bill where, if we can just simply re-evaluate our perspective on preemption and private right of action, and say that these authorized agents can help modify the thinking around those two firebrands, and say this isn't about preemption, this isn't about the private right of action, this is about organized nonprofits working to fight on behalf of parents, then we take the rest of the bill as it's worded and apply it to children's data rights. I bet we would get four or five Democratic senators that could sign off on that very easily, and I think you would see a signature-ready bill that could hit Biden's desk next week. This is something that's not going to take very long, and the impacts of rolling this out are readily available. We don't need any more law. We don't necessarily need any more, you know, money being spent on this. This is just simply giving people the opportunity to sign up with an authorized agent and applying their information rights right away to solve a very critical problem.

 

Debbie Reynolds  23:42

Yeah, I've seen it. Wow, that's fascinating. I like the fact that you brought up the, I don't know, I will use the word bias. But the idea that, and this is happening in a lot of AI, right, AI is used to make decisions about people, where it can put someone on a track that may damage them in the future. So just like you said, if Sally's not good at math, you know, let's let her bake cakes instead or something. And then how does that information follow Sally through her life? You know, maybe she won't get into, you know, more advanced classes in the future, because somewhere, some AI said that she wasn't good at math or something.

 

Jay Glasgow  29:03

No, you're absolutely correct. And it's okay to use the word biased. I don't know where the politically correct idea of algorithms being objective came from. Algorithms have never been objective. I've been in computer science, you know, for the better part of my entire life, and algorithms are not objective. They're very subjective. Every single time I created an algorithm, I was trying to derive some point, right? I was trying to arrive at some value at the end. Of course it's subjective. I don't know any algorithms that are innately objective. They would be almost meaningless to businesses if they were, and to computer scientists as well. So we've got to start there and just say, let's throw that out with yesterday's news. Algorithms are very subjective. And we can go further to say that the intent of that subjectivity isn't necessarily malicious. I don't know a lot of, you know, evil-scientist software developers out there wringing their hands together with that really great laugh where they go, haha, yeah, I'm gonna get them. You know, I don't know anybody that does that. I think every data scientist and every software developer I've ever met is really after some really cool new innovation, and they really feel like they're helping people. It's just the unintended consequences of this thing, particularly with people not understanding their information rights. And those are the two words that you can put together: information rights. You put those two things together, and you wrap up all of that publicity law that we were talking about earlier into a simple noun, a proper noun. And if people start to understand their information rights and take action, that's what's really key. Yeah, I heard an interesting statistic just a couple of days ago. Did you know radio stations were more profitable last year than they have ever been? They had revenue of $1.6 billion last year.
And that was surprising to a lot of people, because radio is sometimes thought of as kind of, you know, an antiquated communication medium. And yet, the average artist whose music is played on the radio makes, well, guess how much they get paid every time their song gets played?

 

Debbie Reynolds  31:18

How much?

 

Jay Glasgow  31:20

Zero. That's the average artist, on average. Now, artists that are represented by BMI or ASCAP or some other authorized agent, you see what I did there, authorized agent, those artists that are represented by some authorized agent average 12 cents per time that their songs are played. And then you have artists that have really great agents; they get 19 cents per play. Now, at The Privacy Co-op last year, we had members that bought a membership in the Co-op, which costs $25 for a lifetime membership, right? So they hired us to be their authorized agent. We went out and negotiated contracts and worked on their behalf. We had some affiliates that came on board just because they loved what we're doing. Right? That's great news. We had affiliates that said, hey, we're not going to license the use of the information, but we still want to be an affiliate because we want to support your efforts. So as a nonprofit, what did we do with all that leftover income? We distributed it as dividends to our members, so our members made an average of $28.45 last year. You know, so we'd like to say, hey, we actually beat a lot of the musicians that were being played on the radio last year. A net of $3.45 over the $25 membership doesn't sound like a lot until you find out that most musicians are only paid 12 cents per song, and that's if they have good representation. We want to get to the point where, you know, we're negotiating the color of the M&M's in your green room for you as you're getting ready to perform your data output for the day.

 

Debbie Reynolds  32:48

That's really cool. Oh, that's a good example. That's a great example. Could these laws apply? I know there are some lawsuits coming up; they want to use the Wiretap Act in a case about incognito mode on Google. What's happening? I don't know. It just made me laugh, whatever it was. But yeah, I actually think that we need a more comprehensive framework of laws, and then to figure out where all this stuff fits in. And I think there's definitely some updating and some pruning; we definitely need it. But I feel like we're just trying to contort our way through, trying to apply some of this old stuff in ways that just don't matter anymore.

 

Jay Glasgow  33:37

Yeah, and there's a pun in the word tort, from the Latin, that our lawyers might chuckle at: if I baked a cake that's many layers high and called it a torte, it would be a tort torte. But if I take the torts that are applicable to what you just said, there are really four key torts that are involved. There's Peeping Tom, there's publication of private facts, there's misappropriation, and lastly, there's defamation of character. Well, if you remember what we were talking about a little while ago, with the difference between privacy law and publicity law, not all of those torts are privacy, right? I think you can make a pretty big argument that Peeping Tom and publication of private facts are two torts that lean heavily towards the privacy side of the house. But misappropriation? No, that's taking, you know, Cary Grant's photo and putting it on a box of Wheaties, right? That's misappropriation. And then defamation of character, well, look at what's the opposite of that. Do we have laws around the formation of character? Right, it's contractual law, where you can make a person famous, you can use their likeness for profitability, and they can license the use of their imagery, of their likeness. So when you think about those torts and how they apply to what you were just saying, it starts to answer those questions. But we really need to tease apart privacy law from publicity law, because on the face of it, you know, who doesn't want their incognito mode not peeped at? Is that a Peeping Tom issue, or is this really a contractual right of publicity? When you start piecing that together and realize that, well, actually, this might be more on the publicity side, we don't get our hackles all up about, oh, I think you need to get a court order, a subpoena, and it has to follow these regulations in order for us to find out a piece of evidence from a crime.
But this does raise a lot of questions. I think you recently talked about a Fifth Amendment argument around passwords, and I thought your coverage of that was remarkable and thought-provoking. So I really appreciate your thoughts on that, and I encourage your listeners to go back and check out your article on it.

 

Debbie Reynolds  35:57

These things just really fascinate me. I think my interest in privacy was more personal at first, from, you know, what is private to me, what are my rights? That still sort of drives me a lot, even though I work with businesses as well. So it's just fascinating to me. I love to peel the onion back and talk about this stuff.

 

Jay Glasgow  36:21

One last thought on that peeping Tom piece. Do you know that that really is trespassing? It's an interesting area of law that hasn't really been applied well to the digital age. If you think about it, several hundred years ago, this was eavesdropping. I was talking to somebody the other day, and I said, well, you know, that's literally a thing. And they said, no, what? And I said, yeah, absolutely, eavesdropping. Do you know where the word came from?

 

Debbie Reynolds  36:49

The eaves?

 

Jay Glasgow  36:50

Yes, in medieval times, you would trespass on somebody else's land, climb up on their roof, and hang down from the eaves to listen in at the open window, because they didn't have glass at that time. And you could hear what their plans were against your kingdom, right? So one fiefdom could avoid sending a spy into another fiefdom. So in early law, they were saying, we don't want people hanging off our eaves, and it became a trespassing issue. Peeping Tom is really that: there are some things you want to stay off the internet. If you're just writing a letter to your mom and you're saying something personal, that's a created work of the mind, right? You clearly own the rights to it. It is concomitantly data and information: data plus meaning equals information. So when you're writing an email, the content is a creation, something very clearly covered by rights. Then there's all the other data, the metadata wrapped around that email: that it's going from point A to point B at this time, that the subject is blah, blah, blah. That's all metadata, that's produced data. It has rights too, but a different set of rights. So right there, within one email, you start to see the benefit of teasing apart privacy law from publicity law. That's a great example where a person might not want anyone to monetize the content of their email, but they may be totally okay with monetizing the produced data around the email, all of the metadata, and therefore it falls into publicity law.

 

Debbie Reynolds  38:23

Right? Oh, my goodness, I'd be happy to have you back on the show. We have so much more to talk about. This is fascinating. Thank you so much. Well, before we end, I would love for you to tell people how they can contact you or get involved with The Privacy Co-op.

 

Jay Glasgow  38:40

Yeah, sure, they can go to PrivacyCoOp.com; that's our website. Based on the subject matter we discussed today, I would probably direct people to PrivacyCoOp.com/good. That'll take you to our social good page, where we talk about children's data protection. We also talk about a great subject, wrongful police profiling, which we didn't have time to get into today; maybe we can follow up on that at some point in the future. And we talk about COVID tracing and the exploitation happening there, so you can follow up on that. For those that are interested in looking into us and supporting our work as a business, you can go to PrivacyCoOp.com/business. And for everybody else that wants to become a member, just land on that main page. There's lots of content there for learning your privacy rights, your information rights, and hopefully joining as a participant or a member in The Privacy Co-op.

 

Debbie Reynolds  39:34

Wow, that's wonderful. Thank you so much. I'm so happy that we were able to have this conversation. I know people are going to really love this, because you really went down into the weeds on this one. This was great.

 

Jay Glasgow  39:48

Excellent. Thank you so much for having me on. It's been a delight.

 

Debbie Reynolds  39:51

Well, thank you so much, and we'll talk soon.
