E114- Alexandra Ross, Senior Director, Senior Data Protection, Use and Ethics Counsel, Autodesk

41:52

SUMMARY KEYWORDS

privacy, companies, people, data, happening, security, law, support, autodesk, organization, legal, esg, customers, teams, thinking, attorneys, called, terms, business, ways

SPEAKERS

Debbie Reynolds, Alexandra Ross

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show. This is Alexandra Ross; she is the Senior Director, Senior Data Protection, Use and Ethics Counsel at Autodesk. Welcome.

Alexandra Ross  00:44

Thank you. Thanks for having me, Debbie. 

Debbie Reynolds  00:46

That's a lot of big hats. We have to talk about this.

Alexandra Ross  00:50

I know like my title is a bit of a mouthful, and I'm happy to explain what it is I do. 

Debbie Reynolds  00:55

Yeah, I would love for you to talk about your career trajectory and how you ended up wearing so many hats in this field of privacy.

Alexandra Ross  01:07

Sure. So I'll start with my current role at Autodesk. I am the director of a department within our legal team; we call ourselves data legal. My team and I support our privacy, security, and data ethics programs, so the global scope of data, essentially, at the company, and we provide strategic governance and legal support for the operational programs themselves. I've been at Autodesk now for about seven and a half years. Prior to that, I worked as a consultant. Prior to that, I was privacy counsel at Walmart, based in the Bay Area at Walmart.com, the internet division. And then prior to that, I started out at a law firm called Thelen Reid & Priest as an intellectual property attorney. So I started out originally doing intellectual property licensing, trademark, and a little bit of privacy, and then got more and more interested in privacy. And as the privacy realm grew and there were more and more opportunities for lawyers, I just saw that as a great niche for me to enter.

Debbie Reynolds  02:28

Well, you caught my attention. You and I collaborated with Lourdes Turrecha at the Rise of Privacy Tech, and you're one of the innovator evangelists there. I see a lot of her postings, and you caught my attention, and I was like, why don't I know this lady?

Alexandra Ross  02:51

Thank you, I appreciate you reaching out to me; I really enjoy your podcast and all that you do. And let me tell you a little bit about the Rise of Privacy Tech, because I think it's a really interesting project that Lourdes has undertaken. What she's trying to do, and what this group is trying to do, is put more attention on the privacy tech ecosystem: the vendors, tools, and solutions that are helping companies manage compliance and find the value in the data they collect, and ways to bring together experts in the field, the companies themselves, the startups that are looking for funding or for advice on how to get started, as well as the investors. So there are three tracks, or three ways, to get involved in the Rise of Privacy Tech. I've been involved now for about a year and a half, working with startups in advisory roles and working with Lourdes and some of her colleagues on putting together white papers and other materials to help people in the privacy world, people like us that are privacy professionals, whether they're attorneys or advocates or academics, learn a little bit more about what is on offer from a privacy vendor perspective.

Debbie Reynolds

Yeah, Lourdes is doing tremendous work. I love what she's doing, because I think a lot of times when people think about privacy, they think about the legal and regulatory part of it, and they don't really think about the tools; the operational side, unfortunately, doesn't get enough attention. I feel like she is smart enough to be able to pull all those pieces together and then throw investors in there as well. So I think it's a brilliant move on her part to try to wrangle all these different people. I would love your thoughts about the technology and the legal part of privacy. Obviously, in your role, you traverse all different areas within your organization: people who are working in operations and strategy, the legal, regulatory, and compliance folks, the privacy engineering people. So how do you manage that relationship? Because I don't think it's a natural relationship; you have to kind of work at it. I feel like there are a lot of people who say, okay, I'm legal, I grew up in this legal bucket, or I grew up in this technology bucket, and it's hard for those to gel and mesh together. So it really takes a special person to be able to traverse those two big groups. What do you think?

Alexandra Ross

It's a great question. It's what I find interesting about being in-house counsel, being within an organization and supporting the operations functions. Tracking the regulations and doing the legal analysis and counsel part is one part of my job. But as you've pointed out, a lot of the work is interfacing with the different operations teams that are actually running the program. So I support a wide range of programs: I work with our security team and our CISO on things that are related to protecting the data that we are stewards of for our customers and our employees. So incident response, threat and response programs, all the things that are strategic in terms of the security program. And what I've found, in particular with the security and privacy operations folks, is they have the desire to understand the legal underpinnings of what it is they need to do. So when you're speaking with them, it's always helpful to put some context around the legal requirements: there's legal requirement X, or customers are asking for a certification. Why? Where does that come from? What's the philosophy or emotional component behind that?

And I think if you work from that perspective of why we are doing this, you can really have a baseline for your relationship with your operational teams and help speak their language in terms of functional requirements. Because if you just go with a list of legal requirements with no context, or no basic understanding of how your organization actually operates, it doesn't land. I'm not an engineer; the technical expertise that I have has all been on-the-job learning and talking to colleagues. I actually have a theater background, so I come from the other end of the spectrum in terms of the sciences versus the humanities. But I have tried to bridge that gap just by being really curious and asking a lot of questions. When I talk to the engineers, or the product managers, or the data scientists, I want to understand what they're trying to accomplish and what business goals they're trying to achieve, and then I bring to them the legal analysis, but I try to frame it in a business context and make it as practical and as understandable as possible.

Debbie Reynolds  08:33

Well, you're obviously succeeding at this. How do you get champions within the organization, in different areas, to champion privacy? How do you find the people you can turn to in these different groups so that, let's say, a new product or feature comes up, and they'll be able to say, ah, this is something I think we need to pull Alexandra in on? How do you do that?

Alexandra Ross  09:05

There are a couple of different touchpoints. So we actually have a champions program across our company, where we've identified particular individuals within various business teams, product teams, and so forth, and they are called privacy and security champions. They are a cohort; it's an additional role, not their full-time job. They are engineers or product managers or whatever, but they're also tapped by their teams to be the privacy and security champions, and we give them specific training. They have meetings, and they are responsible for helping us complete privacy impact assessments or meet particular controls and requirements that we've established as part of our program. So that's one way that we get that sort of groundswell of support: by identifying particular individuals within each product or business unit that's actually processing data. The other way that we try to find some touchpoints is with our legal colleagues. The team that I manage are the subject matter experts in privacy, security, and data use compliance, but we interface on a daily basis with employment attorneys, marketing attorneys, product attorneys, compliance attorneys, and litigation attorneys. We are providing them information through playbooks and training and escalation paths, but also engaging with them to understand what types of issues they're coming across and how we can help them, say, with an employment initiative, supporting our people team, but also helping them flag privacy or data use issues. We'd like to empower them as much as possible, and then they can come to us if needed, if a playbook or a process doesn't answer their questions. And then finally, we have a lot of executive support. There are governance committees that I sit on, and our General Counsel and executive staff are very invested in privacy and security compliance. And we, like other companies, have branded that as trust.
So we have organizations and departments within the company that are invested in trust, and that comprises privacy, security, data use, resiliency, availability, and other things that are customer-facing. And we feel that it's really important not only to be compliant with what the laws and regulations require but also to meet our customers' expectations, to be seen as a leader in privacy and security and a company that our customers trust. And that adds value when they do business with us.

Debbie Reynolds  11:59

You touched on a couple of really great points that I'd love to get your thoughts about. And this is trust. So trust is part of your brand. Trust is not about reacting, because I think customers expect that if something bad happens, you're going to react, right? So that's not a special-sauce type of thing. But companies that can step out at the forefront and say, we're thinking about these issues in proactive ways, we're building these programs, not just reactive but proactive, I think that definitely helps your brand reputation outside. And I can tell that yours is one of the most forward-thinking organizations I've seen on this issue, on how central trust is to your bottom line. I think it enriches the organization internally as well. What are your thoughts?

Alexandra Ross  12:59

I think that's right. And it's not an original concept; there are a lot of progressive companies talking about data trust, being customer-focused, and making sure that they are not only doing what's required by the law but potentially going above and beyond to meet customer expectations. It's something that customers expect. Customers are more and more sophisticated now; they hear the horrible stories in the news about companies or governments that aren't doing the right thing with their data. And I think people have a higher expectation about what companies are going to do with their data, how transparent they are, and also what sort of value you get from disclosing your data to a particular company. So that's definitely part of our culture; there are culture and values that are important to Autodesk in terms of integrity and being a customer company. So we're starting on the journey, and we hope to do more and more external messaging about trust and communicate more to our customers about what it is that we're doing, so that we can meet them where they are and give them the information they need to feel confident in doing business with us.

Debbie Reynolds  14:21

And I want to talk a little bit about your roles, or the multiple hats that you wear. I think it's fascinating, partially because, personally, and if you've listened to my podcast you know I say this a lot, my favorite lawyers to talk to are lawyers who are in-house at tech firms, because I feel like you get such a rich experience, not only of different technology but of different people. You also have to understand the business, right? And the business you're in isn't legal; your business is technology and its people and software and things like that. So I think it's fascinating, with your roles, the way that Autodesk is thinking about these areas and putting people in leadership positions that way. And I'm sure when you went to law school, you probably wouldn't have imagined that you'd be on this type of path, right?

Alexandra Ross  15:25

No, I mean, when I went to law school, there was not a privacy course. I took a lot of intellectual property classes, and I may have taken a cybersecurity class, but the privacy experience that I have gained over the years has been on the job or through getting certifications and various things. Now there are whole departments at universities where you can get more exposure to privacy concepts as a lawyer, but no, when I first started law school, I went to UC Hastings in San Francisco, and I thought I was going to be an IP litigator. And then I found this thing called transactional work and licensing and trademark work, and that's what I ended up doing in my first few jobs at the law firm. And it was only when I started working in-house that I realized there was this emerging domain called privacy, and a lot of the people early on were former IP attorneys who found a way to migrate over to privacy. Now you see a wide range of people coming into privacy: they're coming from a technical background, they're engineers or data scientists, and they go to law school and become privacy attorneys, or they become privacy program managers, and so forth. So I agree with you; I would never have known this is where I was going to end up when I went to law school. I thought, originally, I was going to be a litigator. And then I thought, oh, I'm going to be a trademark and licensing attorney. And then I thought, oh, there's this thing called privacy, and this is pretty interesting. And this whole world of data and technology was just really fascinating to me, and I saw it as having growth potential. And I've been able, throughout my career, to find different opportunities within various companies to expand my scope and continue learning, because that's what I really appreciate about my job: the fact that I'm not pigeonholed into one particular area of the law. I love all of the teams that I support.
And it's fun to be able to go from having a meeting about data use and ethics and AI to a meeting with our privacy program managers about consent and permissions or California regulations. And then I have a meeting with someone about a security project. And that variety is what keeps me going. And it also helps me because I see different aspects throughout the company. And I have this broader view, and I can take the learnings that I have from one area and bring it into another.

Debbie Reynolds  18:16

Well, you've certainly succeeded at that. And you're a shining light and an example for a lot of people who are trying to find paths to privacy. You're an example of what people can do if you are someone who enjoys learning and new opportunities and things like that. So I think we're only going to see more people, even from different walks of life, move into privacy; I feel like anyone in a data job, anyone that has data anywhere, can benefit from learning about privacy. One thing I would love your thoughts on, something I talk about a lot, is ESG, right? Environmental, social, and corporate governance. ESG, in my view, is an acronym that now has a life of its own; you see it in the news now, and I think Elon Musk even tweets about it; I think he doesn't like it or something. But I would love for you to talk about ESG and how it relates to privacy, because I feel like the term is almost like the cloud: it exists out there, people talk about it a lot, but they don't really describe what it means and how it's important to how organizations attack this problem. There is a connection, I feel, between privacy and ESG, and I want your thoughts on it.

Alexandra Ross  19:44

Yeah, sure. So ESG stands for environmental, social, and governance, as you said, and it's a concept that's been around for several years. It stems, I think, from two different places. One is compliance and reporting obligations for public companies to say, this is what we're doing in these particular areas, especially around corporate governance or in your financial disclosures. And the other is connecting with stakeholders and customers and explaining what it is that the company stands for: what they're doing from a sustainability perspective, how they're working on diversity programs for their employee population. And it's in the governance realm that privacy and security fit in. What we've seen in the past couple of years is this idea of corporate governance expanding to include how companies are treating your data. So not just the traditional model of corporate governance, but a larger framing of governance in terms of what your privacy, security, and data use programs look like. And we're starting to see more and more regulations that require companies to disclose what they do from a privacy and security perspective; there are some draft SEC disclosure requirements, for example, around cybersecurity, and I wouldn't be surprised if we saw something similar related to privacy. As for the way that I've been involved in ESG programs at the companies that I support: we have a group that manages our ESG program and puts out impact reports, and we complete Pertino surveys that rank companies based on their disclosures around ESG topics. And I'm part of a steering committee, and we think strategically about what the different components within the ESG framing are that we think are appropriate for Autodesk and where privacy and security fit in.
So I meet with a group of stakeholders, people representing different parts of the company, and think about the appropriate discussion that we want to have externally, with customers and our shareholders, about what we're doing in these particular areas. You can look at our last Impact Report and see a lot of great things that we're doing with sustainability and employee diversity, and we're starting to say more and more publicly about privacy and security. So for me, it's an outgrowth of what I've already been doing in terms of supporting the programs, and now we're thinking about ways to be more public-facing about it and fit it into that ESG framework.

Debbie Reynolds  22:54

That's great. Thank you for that. What is happening in the world right now in privacy that's concerning you most?

Alexandra Ross  23:06

Yeah. Well, I'd say two things, and they're things that I'm tracking as a privacy professional because they impact the work that I do at Autodesk and the teams that I support. One of them is Federal privacy law: whether we will actually get a US Federal privacy law this year, and what's going to happen with preemption and the private right of action. I think there are a lot of different opinions out there about whether this is the one that's going to happen; the fact that Senator Cantwell isn't supportive of this latest proposal doesn't bode well. I'd say, as a privacy practitioner talking to my peers at other companies, companies want to do the right thing. We want to be compliant with the laws, and it's very challenging when there are multiple laws in multiple jurisdictions that are not consistent and not interoperable. So, in my personal opinion, and this is not necessarily an Autodesk opinion, I support a Federal privacy law, but I think there needs to be a way that the Federal privacy law is what companies are working towards, rather than having to also deal with all the complexities of state law. So I would prefer to see Federal preemption rather than a Federal privacy law plus a handful of state laws, because I just think that increases the compliance burden on companies, and ultimately I don't think that's beneficial to individuals and the end customers of those companies. So, tracking the legislation in the US, and then also what's happening in Europe. A lot of digital regulatory schemes are coming out: the Data Governance Act, the Data Act, the AI Act, the Digital Markets Act, the Digital Services Act. There's a whole range of regulatory proposals, some of which have been enacted and some of which are still being debated, and I'm tracking those very closely.
Luckily, we have a wonderful Government Affairs and Public Policy team at Autodesk that I rely on heavily and that's tracking these as well. So how are those European regulations going to impact global companies like Autodesk? What are the things that we need to be looking out for? And how is that going to impact the compliance programs that we already have in place?

Debbie Reynolds  25:55

Yeah. I guess I support it; I think we should have some type of Federal privacy legislation. I just think we're in a tangle right now about preemption. The thing is complicated for a number of reasons. One is, in my view, that this law is being proposed as a consumer law, and human rights laws are broader than that; if states pass human rights laws, the Federal government couldn't preempt those. So I think it would be complicated, because, for example, if a Federal law had come out before the CCPA came out, we probably wouldn't even be talking about preemption right now. But many states passed laws, and now it's, hey, we have these rights, and we don't want to curtail the rights that people have already been enjoying for all these years. And then say we have a Federal law, and the Federal law never gets updated; I think that's going to prompt states to want to pass different laws that aren't covered by the FTC. I think people don't realize the FTC doesn't cover all types of industries. So there are definitely some gaps, and unfortunately, I think it will embolden some states to pass these other types of laws, which makes it worse.

Alexandra Ross  27:27

More complex. It's true, it's true. And I'm based in California, and I support California as a progressive state; we are always at the forefront of progressive legislation. And so, while I value what the CCPA and CPRA are trying to do, I think at the end of the day the Federal law is going to apply to all citizens in all states, and I think it's actually a fairly strong protection. So there's a balance, right? If you're giving certain things up that you may have gotten used to under California law, are you gaining more by virtue of the Federal privacy law? We'll see. We'll see how it all gets sorted out in Washington, right?

Debbie Reynolds  28:12

Yeah, we'll see. I'm not going to hold my breath on this one, because it is such a complex thing, and it's hard to do on such short notice, in my view. I feel like these things you almost have to do in layers over time, so it's hard to do a big, huge omnibus thing. Who knows? It may reach the finish line; we'll see. But I think it's just difficult to do it that way.

Alexandra Ross  28:41

Yeah. One other thing I'll mention has nothing to do with legislation but more to do with people, and we touched on this a little bit in terms of our paths to the privacy field: getting more and more people interested in privacy and finding opportunities for a diverse range of people to get into privacy and security. How do we get more classes into colleges and universities? How do we expand the way that we're recruiting and find more diverse candidates, women, people of color? I feel really strongly about that, and I try to reach out to people. People contact me through various social media and want to talk about my journey and get personal advice. So there are ways that I'm trying to give back to the younger generation, or those people who are interested in getting into privacy, because we need people, right? We need people who are going to be privacy professionals, security engineers, and privacy attorneys, because there are a lot of problems to solve, and we need more people who are interested, well educated, and ready to come to work.

Debbie Reynolds  30:03

Yeah, this is a huge problem, and I think it's only getting bigger as we go forward. I think there needs to be more education around what's private and what's not. Because I think a lot of the slow pace of privacy, whether in regulation or in individuals being really passionate about privacy, has come about, especially in the US, because people thought they have more rights than they actually have. So as we're seeing different things happen in the public and the press, with different precedents being set by cases and things happening in the Supreme Court, I think there's a wake-up-call moment for people about what's private and what's not, and what they need to fight for personally as well as what they expect companies to do. Which comes back to all of this: what are people asking companies for now that they probably weren't asking for before? What questions are they coming up with now that maybe they hadn't thought about before?

Alexandra Ross  31:17

Yeah, I think that's right; I think there's definitely an opportunity for data literacy, or privacy literacy, or whatever you call it, just to make sure that we all, as citizens of the world, understand what our rights are and what some of the risks are, and how to engage with companies: what are the things you should be looking out for when you're reading terms of service, and all those things. And I agree with you; I think some of what's happened in the United States with the Supreme Court has really driven home how important personal information and our bodily autonomy and privacy are, and what could be taken away if we're not being mindful.

Debbie Reynolds  32:01

What bit of advice can you give someone listening to this who is maybe legal, maybe technical, and wants to wear a privacy hat? How do they get involved in privacy? What is maybe one bit of advice you can give someone just starting out, exploring privacy as a career choice?

Alexandra Ross

Yeah, I mean, I think it probably depends where you are. If you're already working in an organization, you can engage with your privacy program or your legal department; I think they'd love to have an advocate or a champion who's willing to learn, whether it's an informal or formal mentoring or job-shadowing type of relationship. If you're in school, look for those dedicated classes or courses or certifications related to privacy and security. There are a lot of internships available with privacy advocacy groups and think tanks, where you can get a job right out of school and have something on your resume that's going to be attractive to companies that are hiring either privacy attorneys or privacy professionals. So that's more than one recommendation, but those are a couple of things I would suggest to people who are interested in privacy: talk to people who are in the field and find ways that you can add value.

Debbie Reynolds

Yeah, I would, I guess, add two more in there. One is reading: figuring out what's happening in the world around privacy and adding that as a feather in your cap. Because if you're someone who can keep up with the happenings, or even anticipate them, oh my God, a new iPhone came out with this feature, there's going to be a privacy concern, you could be a go-to person in that regard. I think that'll be helpful. And then also, for people who are fledgling, just starting their privacy career, I say maybe there's a small mom-and-pop shop around that needs help; volunteer to help them with their privacy policy, give them a bit of advice or some type of direction. Both things couldn't hurt.

Alexandra Ross  34:25

Yeah. And you prompted me to think of one other: I have friends who have volunteered at organizations or with their school districts to review privacy issues that are impacting their families or their children. They may or may not be privacy specialists by training, but they get wind of something that's happening, and they're concerned or want to help put something in place that's going to benefit that community while also being mindful of privacy and security.

Debbie Reynolds  34:56

That's a good one. I know a lot of people who are parents who maybe got a notice from the school and were really upset about it, and that spurred them to get more involved with what was happening with their children's data, for sure. So, if we think about companies: in my view, a lot of times, especially when you bring in security, some companies think about cybersecurity in reactive ways. And we know that privacy and cybersecurity aren't the same, but they have a symbiotic relationship. I think companies need to be thinking proactively about privacy; it really isn't something where, oh, we did this bad thing, and then we can fix it. Sometimes you can't fix a privacy problem that you created, because it goes to someone's human rights, right? So how do you get corporations, where sometimes companies are like, well, if I don't see an immediate problem, maybe I don't want to invest the time or talent or skills, to think about privacy even when they don't have an immediate issue coming up? What are your thoughts about that?

Alexandra Ross  36:23

It's challenging, because I do think there are a lot of issues and problems to solve for from a security standpoint. The way that I've seen it done most effectively is to take a look at all the different risks that are relevant to your particular company and then try to rank them in some sort of priority. You're not going to get everything done right away, but you can look at it through that lens of risk and impact and address the ones that are the highest impact and highest value first. And I agree: sitting on your hands and being completely reactive, that's not the best practice. I mean, I think that's why we have threat response teams that are looking at and testing systems to try to find the ways the hackers can get in and proactively address them, not waiting for the incident to happen, but doing our own testing or relying on third-party organizations or information sharing with law enforcement. So I think there are ways to both look at the risks and rank them in order of priority, and also be proactive and make sure that you're using your resources as effectively as possible, because we all wish we had more resources to devote to things. And I think you try to make the case that resources are important because you can be proactive, and being proactive is oftentimes much better: you spend the money now, and you prevent something bad from happening that would cost you much more in the future. And it's not just a financial cost; it's a reputational cost, a cost to trust, and all of that. So you try to make those arguments in order to get support for being proactive rather than reactive.

Debbie Reynolds  38:27

That's great. That's great executive advice, so I know everyone will love that. Rate your risk. And then also, sometimes I feel like companies go the other way. So instead of being reactive, they think, okay, this law came out, and every single thing applies to me, and then I'm here on fire and going crazy, where not all laws apply to everybody. So like you say, you need to figure out, first of all, do these regulations apply to you? What are your risks? What type of data do you have? Different types of data have different types of risk, and maybe all those risks are not articulated in the way you do business. So I think companies really need to think about that before they go crazy.

Alexandra Ross  39:18

Yeah, well, it also doesn't help your own mental health, and it doesn't help you to be effective, if everything is a fire drill and everything is the same level of importance. You really have to pick your battles.

Debbie Reynolds  39:34

So if it were the world, according to Alexandra, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it be regulation, technology, human stuff, or anything?

Alexandra Ross  39:49

Oh gosh. Well, you know, I'm concerned about government surveillance and the use or misuse of certain technology by authoritarian and autocratic governments. So I think the more people can be aware of that tendency or be aware of what's happening in certain countries, the better. That would be my wish, that we address that more head-on and that we have more of a groundswell of indignation or resistance to that, because I don't want to live in that kind of society. That's a sci-fi movie that I don't want to live in. That would be one of my wishes.

Debbie Reynolds  40:36

Yeah, yeah, we definitely don't want the dystopian future, you know, super surveillance and stuff like that. It's terrible. Yeah, I agree with that. I agree with that. Well, thank you so much for being on the show. You've been fantastic. I know that people will really, really love this, especially those who are either trying to get into positions or maneuver in their career. But then also, you gave us really great executive advice for people who are in roles and thinking about different ways to attack privacy.

Alexandra Ross  41:09

Thank you. Thank you so much.

Debbie Reynolds  41:11

Talk to you soon. Thank you.
