Debbie Reynolds Consulting LLC


E190 - Stéphane Hamel, Strategy, Privacy, Ethics, AI and the Future of Digital Marketing & Analytics (Canada)



The Data Diva E190: Stéphane Hamel and Debbie Reynolds (42 minutes)


SUMMARY KEYWORDS

data, privacy, customers, marketing, regulations, business, marketers, work, organization, law, analysts, company, cookies, europe, principles, perspective, understand, ad blocker, first party data, conversation

SPEAKERS

Stephane Hamel, Debbie Reynolds

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest all the way from Canada, Stéphane Hamel. His work spans strategy, privacy, ethics, AI, and the future of digital marketing and analytics. Welcome.

Stephane Hamel  00:40

Thank you, Debbie, for having me.

Debbie Reynolds  00:43

Well, it's exciting to have you on the show. I've seen you on LinkedIn for many years. You work with so many different areas of data, so I thought you'd be just a great person to have on the show. But you're also a lecturer. You teach in graduate studies on digital marketing analytics, which is really cool, but give me your trajectory in your career; I'm just fascinated by how you got interested in data and all these different areas of data.

Stephane Hamel  01:12

Yeah, for sure. My background initially was in computer science. Even when I started my internship, I had the opportunity to work on data related to healthcare, and even at the time, we were very careful about privacy, even if we didn't call it in those terms; we were very careful about the way we were using data to do our analysis. Healthcare data is very sensitive, and that was like 35 years ago. Eventually, the Web came out, and I was a bit of a nerd. I was working on what was really a big data project before we called it big data; it was so big that we had to spread the data across multiple Oracle databases, which was huge for the time. Because it was a research project, we had access to the Internet. There was no Web yet, but we had access to file transfer, email, chat, newsgroups, and things like that. So when the Web came out, it was just natural to document the research project we were working on. I created a first website, and then things moved in that direction: creating websites, working in security for a while, and eventually doing an MBA specializing in e-business. I guess they liked what I was doing, because they asked me to teach analytics to bachelor's students. It turned out to be a little too advanced, so they decided to offer it to MBAs instead. I've been teaching analytics for 14 years, and eventually picked up marketing classes as well. I've also been a consultant for a long time, working with my clients on data analytics, and I've done some coaching of agencies, kind of train-the-trainer, so that agencies could develop their own analytics practice. I developed a few tools that people might know; almost 15 years ago, I built the first tool of its kind to actually detect which tags were running on a website.
That was the first of its kind, a Chrome browser extension. I also created a tool that improved Google Analytics, Data Studio, and Google Ads a little bit, basically fixing the issues we had with the Google Analytics interface. I eventually sold it, but I wish it were still around to fix issues with GA4. Other people have since developed great browser extensions to improve or fix the little things that are annoying in the GA4 interface. So, a bunch of different things: technical consulting, strategy, data analytics, and now, of course, AI. I leverage AI in my classes, for example, which is pretty interesting. In a nutshell, that's what I'm doing. I'm closer to retirement, so I decided to share my knowledge, experience, and craziness with students, and I really enjoy it.

Debbie Reynolds  04:17

That's really cool. I love the fact that you were, as I was, involved in data before the Internet. We saw it grow day by day, and it's very different now. Today, almost anybody can create a website, but back then, you needed someone with special skills to do it.

Stephane Hamel  04:37

Yeah, coding HTML by hand with a text editor, those were great days. It was fun. It was building something new, being very creative, and solving issues, bugs, and stuff like that. So it's a useful skill, I would say.

Debbie Reynolds  04:55

Tell me a little bit about your thoughts about how privacy got into the mix with data analytics and websites and stuff like that.

Stephane Hamel  05:05

It's interesting. Maybe because I was coming from a technical background, I remember going to marketing conferences, and the thought of the day back then was really: okay, big data, storage is now so cheap we can store everything. Let's gather all the data, dump it into a data lake, and figure out later what we want to do with it. I thought that was a bad idea, probably because of my technical background: when you work with data, you want to know what kind of data you are working with. Even the data flow is something that, in computer science, we learn very early on, so that was not really new for me. I had a sensitivity for privacy, of course, but there were a lot of things we didn't know or didn't realize at the time. In terms of digital analytics, the party was on. Basically, there was pushback against anything that brought rigor to the processes; those things were perceived as showstoppers, a pain to work with, and so on. Marketers were happily playing with their 3D pie charts whenever they wanted because they had tools to view the data on demand. But the party is over, and now, with data science and privacy, the pendulum is swinging back toward more rigorous processes, data flow, and privacy. Procurement, too: for many years, marketers could go online, put in their credit card, 10 bucks a month, 20 bucks a month, 200 bucks a month, fine, and get all the tools they wanted. We don't have to talk to procurement; life is good. But that doesn't work anymore. As an industry, we have matured to a point where things are more serious now. So, yeah, I think there was a big transition, coming back to something that maybe is how it should have been from the start.

Debbie Reynolds  07:16

Yeah, you and I are old enough that I can tell this story. I remember back in the day when we had meetings about putting servers online and tried to decide whether a server was going to be connected to the Internet or not.

Stephane Hamel  07:28

Oh yeah.

Debbie Reynolds  07:32

Now it seems like that's the default. Some companies don't even have physical servers anymore; so much is in the cloud, so just thinking about that is different. But I think one of the things you mentioned, and I think it's true, is that somewhere in the process we lost control of the data, because all these innovations made it easier to obtain data, easier to create data, and easier to act without calling IT. Remember, a lot of times you had to call someone from IT to install something, and now you can, just like you say, put your credit card in, go to the website, download it, and no one knows anything about it. That's probably one of the reasons why we have such a problem with privacy in organizations, especially, to me, around marketing. Marketing becomes a target for two reasons. One, a lot of what marketing does is the public face of an organization, so mistakes made there are probably going to be caught more readily because they are public-facing. Two, marketing wants to be creative and wants to use different data within the organization to capture more customer engagement and customer trust, but in doing so, you may be using data that someone didn't agree to have used in that way. Tell me your thoughts about that.

Stephane Hamel  08:56

Yeah, there is some misunderstanding, I would say, between marketing and other departments, IT and others. Marketing wants to be creative. They want to move fast. They don't need perfect data; imperfect data is enough to make a decision. But as soon as you look at IT and the back-office systems that really are the core of the business, it needs to be precise. It needs to be accurate. It needs rigorous processes because, given a certain input, you want to make sure you get a predictable output. That requires strict processes and strict controls, in terms of privacy, for example, and that makes things go slower. So, on one end, you have marketing moving very fast; the wheel is spinning, and they adapt all the time to a changing environment. On the other end, you have almost the rest of the business that wants stability and predictable outcomes. In marketing, you move fast, do great things, and can adapt; especially in digital marketing, you can make mistakes, and maybe they are easier to overcome or fix. But if you take the wrong order at the wrong price for the wrong client and ship it to the wrong location, there's no excuse for that, so it needs to be much more rigorous. Maybe that explains a little bit of the different perspectives and directions. Now, when we look at privacy, sadly, I see that some organizations perceive privacy as if it were the Department of No. We hear that all the time: privacy is a pain, privacy is an expense, privacy will limit innovation, and so on, and I disagree with that. It's such a bad perspective on how important privacy is, not only for the customers themselves but also for the long-term value of the organization. That's something I strongly believe. Let's put aside all the regulations, the law, and those aspects. Just think of it in terms of principles, in terms of ethics.
What do my customers want? Building trust? It's the basis of marketing in the long run, caring for the environment, for example, caring for employees and caring for being a good corporate citizen. Privacy becomes one of those values where customers will look at option A and option B. One company has a really bad reputation. I don't trust them. I'm not sure what they are doing with my data and so on. Of course, there is the other one, where while things are good, they respect my privacy, and I can build trust. I want to do business with them, obviously.

Debbie Reynolds  11:46

Yeah, I agree with that. I do hear people say privacy is about telling people no all the time, and I also don't agree. I want businesses to thrive and do business, but what they also need to understand is that you will lose customers if they feel like they can't trust you with their data. If you're not building that trust, there is a bottom-line impact, because when customers feel they can trust you, they'll give you more data, they'll give you correct data, and they'll engage with you in different ways. So that argument, that somehow we're trying to block people and their innovation, is not right, because we want people to be able to do business with companies in safe ways.

Stephane Hamel  12:33

Yeah, I don't like it when I see or hear people saying, oh, customers need to educate themselves. It's their responsibility. Let's put the burden of their privacy on their shoulders. I don't agree with that, and one analogy I like to use is seat belts in cars. When you look at the history of seat belts, at first, manufacturers didn't want to put them in cars because it was an additional cost, and customers didn't want them anyway. Eventually, it took regulations to force the car manufacturers to actually install seat belts, and still, consumers were saying, no, we don't want them. They were cutting the seat belts out, saying, yeah, but if I have an accident and I end up in a river or a lake, I won't be able to get out. Manufacturers, and lawyers, I guess, were saying, well, if someone is driving impaired or driving without the skills and they end up in a lamp post somewhere, it's their fault. We see exactly the same thing in privacy, where, right now, we're at the stage where, okay, we've got the regulations, but there's a lot of resistance. At the time, the manufacturers were also saying, no, no, this is going to slow down innovation, because while we work on that, we're not working on something that might be more valuable, and if I do it and other manufacturers don't, I will lose a competitive edge. Exactly the same thing. So it took regulations. It took years and years of advertising and education of the market saying, okay, buckle up. The current state is actually that we have safety by design and by default: get in a car, you buckle up. It took 25 years. It took a whole generation to educate the market so that when you get in a car, you buckle up. Now we don't only have seat belts; we have air bags, we have stronger frames, and so on, but it took such a long time.
So, hopefully, it won't take 25 years for privacy to become privacy by design and by default. There's no excuse for a website where tracking is just on by default. They don't ask for permission. They find sneaky ways to still collect a lot of data because, oh, we're not using cookies, or, oh, we're not doing this, and I have a legitimate interest. Well, if it's so legitimate, be transparent about it. Ask your customers what they think about it. If they say yes, as you said, customers will be willing to provide even more information, with more quality, and the value of the data is going to be so much better. I came up with a hashtag, and I think I'm going to stop using it. I was using no consent, no tracking, and sadly, I think either I miscommunicated the goal or people simply don't understand. For me, no consent, no tracking simply means that if, as a marketer, you're going to do something, take a step back, think about it, and wonder: what do my customers really want? Will they be happy when they click reject all on a consent banner and then eventually find that Google is still tracking some data? Try explaining to that customer, no, no, don't worry, it's anonymous. Don't worry, there's no personal information. Yeah, but you just asked me if I accepted being tracked, if I accepted cookies. I said no. I explicitly said no, but you still track in some way, because legally you can, and there's no personal information, or it's not really a cookie, or whatever. The point of no consent, no tracking is simply to think in terms of ethics and principles rather than strictly in terms of the law. The law says you can do this or you cannot do that, so okay, I'm good, check the box. I asked my customers if they wanted to be tracked. They said no, but legally, I can still track them, and of course, I hear all the excuses.
If I don't do it and my competitors do, I lose a competitive edge. I need the data. I have a legitimate interest in doing it, and so on and so on. We're still at the seat belt analogy, where people are saying it's the customer's fault if they don't understand, it's the customer's fault if they don't take their own privacy into their own hands. I just don't agree with that. I think we can do better.

Debbie Reynolds  17:31

People don't know. People don't understand what you're doing, so you're taking advantage of the fact that they don't know, they don't understand, or they can't really opt out or consent in any meaningful way. That's a problem.

Stephane Hamel  17:47

When you go on a website, and it says, oh, yeah, sure, you can opt out. Go figure it out in your browser and find out how to block cookies. That's not the way it should be. The funny thing with the law is this: what does it say when you have a global company whose privacy statement reads, we care about your privacy, it's really important, blah, blah, blah, and so on? Because they are in Europe, GDPR applies, ePrivacy applies, so they ask for consent. They do what they have to do to be compliant. But then, for the same company in Canada, the regulation is slightly different, and in the US, the regulation is different again; they don't have to ask for any permission, but their privacy policy still says, your privacy is very important to us. What signal does that send? Your privacy is very important to us, but it depends on where you live. To me, from a marketing and brand perspective, does that make any sense? Why would privacy be less important in the States than in Europe? The only difference is the regulation. If you take a principled approach, an ethical approach, an organization should have the same privacy standards globally if it really cares about the privacy of its customers; at least, that's how I see it. I know a lot of people will disagree with that, but that's okay.

Debbie Reynolds  19:14

I agree with that 100%; actually, it makes me laugh that you mention it. I've had these conversations with companies that complain when the US moves toward having better privacy regulations. Some of their complaints are that it's too hard to comply, and it's like, well, you do business in Europe. You do business in other countries that have these laws. I had this same debate with someone about biometrics; they were upset about the Illinois biometric law, and it's like, don't they do business in Europe? They already have to follow similar rules there.

Stephane Hamel  19:54

From a compliance perspective, from an operational standpoint, it would be a lot easier if you applied the same principles throughout, instead of saying, okay, there are 50 different regulations in the States; there's the GDPR in Europe, but the UK is not part of Europe anymore, and eventually they might take a slightly different approach; Canada has different provinces with different regulations, and some of them don't have any, so the Federal regulation applies; and then you go to Asia, to other countries, Brazil, and so on. Everywhere, the terminology is different. The law is slightly different. In some places, the approach leans much more toward companies having a right to do business, so they can do pretty much anything. The other extreme is, say, Europe, where privacy is a human right, and it's the individual person who has the stronger control over the way regulations are expressed, the way the law is written, and so on. It's a mess. It's very complex. You look at a map of the evolution of regulation around the world; who can understand all of that? It's impossible. Of course, if you're not a multinational organization, maybe it's simpler; you're limited to your own little part of the world. But still, I think making privacy a brand value becomes an investment. It becomes an opportunity; it becomes real long-term value, rather than an expensive pain and so on.

Debbie Reynolds  21:27

Yeah, I know. Every time a new law comes out, people ask me about that law, and now I've started saying, let's just go beyond regulation. At this point, your company needs a data strategy rather than trying to jump from one law to the next. If you have that fundamental foundation, those policies and procedures, and get how you're going to handle data into your company culture, then complying with these other regulations will become easier. As you say, I think it's more difficult to try to dissect different laws and say, hey, because you're in Canada, this applies to you; because you're in the US, we don't have to respect you as much. I agree with that. For me, as a data person, whenever a company asks for more data, I'm immediately suspicious, because I'm like, what are you doing with the data you already have? Why do you need more? You're creating more risk for me without really telling me what the benefit is to me.

Stephane Hamel  22:12

One of the things that analysts and marketers complain about all the time is silos in the business, and now we're creating silos. From a privacy standpoint, it doesn't make sense. It's an interesting evolution. I can't count the number of times I was given access to way too much data I never asked for. I've lost track of the number of times my data has leaked, some pretty serious, from a financial institution and the like. Just this week, I got an order confirmation from Apple, a real one. The order exists and everything, but I never made the purchase. I'm in Canada, and the order is in the UK, so, okay, what's going on? Is it just a matter of someone billing something to my credit card? So far, I don't see anything. Or is it more serious, identity theft resulting from one of the numerous data leaks? What I hear very often is first-party data: because of third-party data going away, regulations, privacy, and all those aspects, organizations and agencies entice their own clients with, oh, you need to collect more first-party data. Flip it around and think of it from a customer perspective: the more interactions we have with different brands, each one of them trying to collect more first-party data, what happens? The surface of risk just increases. If every one of them is pulling at us to provide more data, even if I trust each organization, the end result, from a privacy perspective, for a consumer, is that we're just increasing the surface of risk, the likelihood that somewhere, one of those companies is going to have a data leak, or won't manage my data in the right way. That's a scary thought. There was all that crazy approach to big data, as if it would, by itself, solve everything, and now we're coming back to, okay, small data is pretty good.
Small data, when it's collected in the right way, with trust, with consent, with control, is worth so much more than big data that we just dump into a data lake, hope for the best, and try to decipher. Most of the time, you already have very accurate customer data; you have so much data you can tap into before trying to collect even more. It's fascinating. It's a very interesting evolution, and for me, the transition, shifting from so many years of creating tools, doing advocacy for analytics, teaching, consulting, and so on, and jumping the fence, in a way, to thinking more about privacy, is the result of data leaks. It's the result of my own experience with clients and agencies. It's also the result of having a conversation on stage with Christopher Wylie from Cambridge Analytica, where I did a Q&A, and maybe especially the backstage conversation I had with him, which was eye-opening about the way Cambridge Analytica emerged as a scandal. What I did during that conference, before he came on stage, I like to use as a test. I ask, well, you're an analyst, or you're a marketer, and I offer you a job where you can have a whole data science team. You can have millions and millions of records of super-rich data about your customers and potential customers, with hundreds and hundreds of attributes about them, and I want you to create the best campaign you can to sell T-shirts. Will you accept the job? Of course, everyone's going to say, oh yeah, that's awesome. A full data science team, rich data, bring it on. Yes, that's a good challenge. I want to do that. And then I say, well, change T-shirts for an election. Will you do it? Then the picture is totally different, but it's the same thing, selling T-shirts or selling an election.
You want to create the best marketing campaigns using all the data you can to optimize your campaign, and if it works, great, you get a big check and a big bonus. So where are the limits? Where do you stop? I know analysts who simply won't work in the adult industry, or won't work in the military industry, or won't work in insurance because they feel insurance is not an ethical industry. Everyone has their limit, but it's very tempting. I won't lie: I've been offered work on data in the adult industry, and I always refused. That's my limit. I won't go there. So, yeah, where's the limit on the data we collect, and for which purpose? Where does it stop? The pay might be very tempting, but we have personal values also.

Debbie Reynolds

Actually, it is interesting that you mentioned Christopher Wylie. He was the whistleblower in the Cambridge Analytica scandal. I read his testimony; it's fascinating. One of the things he mentioned, and I want your thoughts on it, is, to me, the dark side, the bad thing that can happen with data, beyond just the election, changing people's minds or their behavior. He spoke about a thing at Cambridge Analytica called the Kit Kat project, and what they found, just because they had so much data, is that people who liked anti-Semitic ads or messages on Facebook also liked Kit Kat bars. So the problem with that is, okay, if you like a Kit Kat bar, are you an anti-Semite?

Stephane Hamel

Yeah, causation and correlation. But one other thing you said is that to change politics, you need to change culture, and culture is everything from fashion to what you eat. The fact that you eat a Kit Kat bar versus something else is, in a way, part of your culture. When we were kids, our tastes developed; it's part of our culture to say, oh, we like that, or we don't like that. So, is there a direct correlation between Kit Kat and anti-Semitism? I hope not. But what if there is? What does it say? Of course, we can make the data say anything we want. But when does marketing become manipulation, propaganda, and so on? Where is the fine line between marketing that educates your potential audience, builds trust, and shows you have the best offer for whatever they want to do, versus misinformation, propaganda, and so on? One of the things we also discussed backstage a little bit was the notion that propaganda is always fine until people realize they are the victims of propaganda. As long as people don't realize they are being manipulated, everything seems to be right. Everything seems to be good, and we see that with fake news. We see that with everything. Essentially, until people realize they've been manipulated, it's good for the business. It's good for everyone because they don't know what's going on. That's scary, also.

Debbie Reynolds  30:35

It is scary. So, what's happening in the world right now in data or privacy or marketing that's concerning you now?

Stephane Hamel  30:47

Speaking of misinformation, all the things I see about the cookie apocalypse, and they don't even call it the third-party cookie apocalypse. That message said, oh, cookies are going to disappear. No, cookies are going to be there. It's only third-party cookies that won't be there. It's not an apocalypse; it's far from it. Think about it from a customer perspective: if third-party cookies disappear, what is going to change in their life? Nothing, nothing at all. I've been using Brave and ad blockers, so third-party cookies have been blocked for me for a very long time, and Safari and Firefox block third-party cookies too. Okay, Chrome has the biggest share of the market, but it won't be an apocalypse for the customers. What about the companies themselves, the organizations themselves? Well, if their marketing still relies on third-party cookies, maybe it's time they reviewed their martech stack. They need to do their own work and ask, are we doing the right thing? Probably not. So there's a solution for them, and the solution might not be to collect more first-party data, hopefully. What about agencies? For agencies, I'm sorry to say, propagating that FUD, the fear, uncertainty, and doubt, creates a lot of business. Oh, Mr. Client, Mrs. Client, you need to do something because the cookie apocalypse is coming. It's scary. You need to do something, and by the way, we can solve it. Just give us your money, and we'll figure it out. I don't want to go after agencies, of course, but I think there's a balance there. There's an ethical way to tell your clients that they need to improve their marketing. Then, if you think about the last layers, which are ad tech and martech, some of them are going to have a very hard time because of their dependence on third-party cookies. Sadly, I think a lot of them are trying to find ways around third-party cookies, be it fingerprinting or other methods.
The biggest impact is on the ad tech industry, but as a consumer, as a marketer, and as a data analyst, it's not an apocalypse for us. It's up to them to figure out what they want to do, and maybe, again, the party is over, and that's a good thing.

Debbie Reynolds  33:10

Yeah, that's true. Very true. So, if it were the world according to you, Stéphane, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be technology, human behavior, or regulation?

Stephane Hamel  33:26

Oh, the challenge we have is that when you look at privacy, there's an intersection, and we like to use Venn diagrams because we're analysts and so on. Imagine a Venn diagram where you have marketing, you have legal, and you have technology. Nobody can be at the center of that Venn diagram. It's impossible; it's way too complex, and each of those has its own perspective. Oftentimes, legal doesn't really understand marketing or technology, and technology doesn't understand the legal aspects, and so on. I don't pretend to be at the center of that Venn diagram either, but over time, coming from a technical background and being interested in the legal side, I was honestly somewhat intimidated for a long time going into discussion forums and seeing people answer questions about marketing and technology from a legal perspective, arguing, oh, regulation X, Y, Z, paragraph two, subparagraph such-and-such says this, and so on. I don't know all the legal terms by heart, but what I can understand are the principles: transparency, control, when data should be deleted, and things like that. I think marketers and analysts need to understand the principles more than they need to memorize the legal text, and that's not what I see right now in the market. What I see is a schism being built, where the legal perspective says, I have the truth, it says so in this paragraph, versus everyone else, and that's my worry. I wish it were different. Maybe over time, that's going to change.

Debbie Reynolds  35:20

Yeah, I think it has to change because there are just too many laws and too many changes, and I think companies are getting tired of getting slightly different advice on different laws, because they're trying to figure out how they're going to operate as a business. So in order to operate as a business, like you said, they need to think more about principles and try to get data protection into their culture.

Stephane Hamel  35:48

Exactly.

Debbie Reynolds  35:49

I've heard people verbally gasp when I say the data they collect about other people doesn't belong to them, and they're like, what?

Stephane Hamel  35:59

Yeah, you see all the different perspectives. Because at the end of the day, the law is based on principles, of course, but it's also based on values, values within, typically, a geographic area, collective values about what people think, what their understanding and their limits are. Of course, that varies a lot from region to region, so we have different perspectives, but the principles remain the same wherever we are. One of the things I noticed also is that if you go into discussion forums and you ask a question about a specific scenario, oh, I have a website that does this and that and so on, most of the time you won't get a clear answer from a lawyer on those forums. Lawyers are trained to say, no, I will not commit myself to saying something publicly without having all the details and all the context. So basically, you never get a clear answer to your question, and then you get all the other people who are not lawyers who sometimes will comment, and they will sometimes provide good advice, and sometimes it won't be as good. I include myself in that. I like to use the acronym IANAL, I Am Not A Lawyer, but I supplement that with: I have a lot of opinions. So even if you're not a lawyer, you are entitled to your opinions. You are entitled to state what you think is right or wrong, not from a legal perspective, because you cannot provide legal advice. But I think we would all benefit if there were more open conversations about different scenarios, saying, okay, collectively, as marketers and analysts who have certain expertise and are in a discussion forum, do we want to build our collective wisdom? We need to share our opinions and knowledge and so on. Just go ahead and do it. I think we would all benefit. Because if, as marketers and analysts, we say, oh, such a thing is right or wrong, it will help our customers in the end.
Just as an example, ad blocker usage in the general population depends on the source. It might be 30% for some populations; I saw something recently in the US that would be close to 50%. I was at a conference a couple of years ago in Europe, and let's say, at the time, ad blocker usage was 30%. I asked the audience of about 1,000 people who was using an ad blocker, and it's not a super scientific approach, but I swear I saw maybe 70 or 80% of the hands go up. So, right there, I see an issue. As marketers and analysts, we are a segment of the population that uses ad blockers more than the general population does. To me, it's like, what's going on? There is something wrong in just that number, that difference between what we impose on our customers versus what we accept ourselves. I think if we had several hours and maybe more people in the conversation, we would find several cases where organizations are doing things that would make us say, oh, wait a minute, you're really doing that? Is it the right thing? Do you think it's the right thing for your customers? That would make for a good conversation, maybe over a beer or something like that, to openly discuss what we, as a group, think is right or wrong.

Debbie Reynolds  39:39

Yeah, I love that because all of us have a part to play here, and I think when I see companies get in trouble for privacy stuff, it's not because of the law, it's not because they don't understand the law, it's because they don't know how to change the way that they operate. So having people from different areas of the business have those conversations really can help organizations. I don't think it's a money issue; we see front-page news every day about huge, multi-billion-dollar companies getting in trouble. I'm hoping to see more dialogue and more roles for these different people who offer their expertise. As you say, it's a great insight.

Stephane Hamel  40:32

Privacy is everyone's responsibility. We hear that very often, but what do we do about it? Do we really create an environment where different ideas and perspectives can be heard? The thing I like to highlight also is that we're just a tweet away, or whatever we call them now, an X post or whatever, from people saying this brand is doing something evil or doing something that is not right. As a brand, if we're afraid of that, or if we're not proud of our marketing tactics, there's an issue. We should be proud. We should hope that people will say, on a forum or anywhere, this company has great marketing. They do this and that, and it's wonderful. I like it. Rather than seeing stories about, oh, this company had a data leak because they were too careless to change a password on something. There was a data leak recently, and I just went on their website and did a very quick audit. There are so many tags on that website, it's crazy, and some of them are from companies that are not even in business anymore. That tells you something about the governance of that organization when it comes to their martech stack. You know, there's more we can do. Yes, it's a matter of resources. Yes, money. At the end of the day, we need money to make it happen. But still, I think on a smaller scale, there's always something we can do to improve things.

Debbie Reynolds  42:09

Well, thank you so much for being on the show. This is great. I love this conversation, and I am always excited to interact with you on LinkedIn. I'd love for people to follow you and connect with you on LinkedIn as well, so keep up the great work.

Stephane Hamel  42:23

Yeah, thank you, Debbie; yeah, great conversation. I really enjoyed it. Thank you.

Debbie Reynolds  42:27

Yeah, we'll talk soon. Thank you so much.