S5 #6 Navigating the complexities of data privacy and cybersecurity


Guest:

Joe Dehner

A graduate of Harvard Law School, Joe Dehner is an international attorney based in Cincinnati, Ohio, who has traveled to eighty countries. His portfolio includes NGO and bar association work in Ukraine, Jerusalem, Ramallah, Colombia, North Korea, and Erbil, Kurdistan, Iraq. He retired from a major U.S. law firm on January 1, 2024, and now provides legal counsel to select clients. Joe uses his expertise in global data privacy as the host of over 175 episodes of the podcast "Data Privacy Detective." He authored the novel "The Seventh Trumpet," numerous short stories, and many legal publications.

In this episode, Kristina is joined by Joe Dehner, an international attorney with decades of experience in privacy, cybersecurity, and international law. The conversation dives deep into the challenges businesses face as they operate in a global digital environment where privacy laws and cybersecurity threats constantly evolve.

Joe reflects on his career, sharing stories that illustrate the complexities of cross-jurisdictional data privacy regulations, and highlights the growing importance of cybersecurity as companies become more data-driven. He argues for the need to create privacy-centric infrastructures and offers practical advice for companies on developing global policies that align with company values and applicable law across jurisdictions.

Keywords:
data privacy, cybersecurity, international law, data protection, global privacy regulations, AI and data security, blockchain in privacy, cross-jurisdictional compliance, digital policy, privacy-centric infrastructure, data breaches, business strategy in data privacy, cybercrime, AI governance, legal frameworks in technology
Season:
5
Episode number:
6
Duration:
31:42
Date Published:
September 20, 2024

[00:00:00] KRISTINA: Get ready for an enlightening episode of the Power of Digital Policy. In this episode, we have with us Joe Dehner, a leading expert in international law, data privacy, and cybersecurity. Tune in as we delve into the evolving landscape of data privacy, the challenges businesses face in maintaining compliance, and the future of digital policy.

Whether you're a seasoned professional or just starting your journey in the tech world, this episode is packed with valuable insights you don't want to miss.

[00:00:26] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.

[00:00:47] KRISTINA: Welcome back. Today, we're lucky to be joined by Joe Dehner, an international attorney based in Cincinnati, Ohio. Joe is a graduate of Harvard Law School, and his work portfolio includes various NGO and bar association work across Ukraine, Jerusalem, Ramallah, Colombia, North Korea, and the list goes on and on. He's got it all. Joe retired from a major U.S. law firm in January of this year, 2024, and he now provides legal counsel to select clients. To me, what's fascinating is Joe has a world of experience, probably some battle scars we can hear about. And I think more than anything, he's been around the block enough times that he can teach us all a thing or two about privacy: how to get it right and balanced, what we need to think about in terms of international law, and certainly paying attention to cybersecurity. So Joe, welcome, and thanks for joining us today.

[00:01:40] JOSEPH: Well, Kristina, it's just a thrill to be with you, and thanks for what you do.

[00:01:45] KRISTINA: Well, you know what? It's always exciting to have somebody on who's really seen it all, I say, because there are some things that we can read about, some things that we can infer, but some things you just have to live through in order to know what it's all about. So you've lived it all, it seems, or at least a lot of it. What are some of the biggest challenges that you've been seeing throughout your career that companies are facing? And how has that evolved into where we are today in terms of data privacy and cybersecurity?

[00:02:15] JOSEPH: Right. Well, you know, there are really two interconnected topics here. One is data privacy and the other is cybersecurity. And they're like a married couple; they go together. But, you know, when I started, when I was in college, we were lucky if we got half an hour of time on the mainframe at the university, and we'd get these green and white hard copy printouts. And that was about it. Here we are some 50 years later, so in that sense, I've seen it grow. So maybe we take these one at a time. Data privacy: we're really talking about personal information, although, you know, company privacy is important, too. And the fact is, even today, we have no global approach to what is truly private and what isn't private information about us.

There are just vast differences in how countries and sub-national governments regulate data privacy, and in how cultures view privacy. It's very, very different. So organizations, not just businesses, face enormous challenges dealing with this. We can get into examples as we talk today. And then the other side of it, the partner here, is cybersecurity. Now that's really about the infrastructure. I would tell you this: we've been too focused on data breaches, and then penalizing companies that suffer data breaches as though they're the villains instead of themselves some of the victims. You know, it's a little bit like where food safety was 100 years ago. Today we take individual cows and pigs, we use barcodes, and we can trace where any food came from. And look, when I was in college, Ralph Nader published Unsafe at Any Speed, about cars. Today, we have seatbelts; when I was a kid, we didn't have seatbelts. We haven't built the infrastructure that cybersecurity needs. So these are the two things that go together, and it's just been fascinating to be on this journey for 50-some years.

[00:04:14] KRISTINA: So it's interesting that you say we haven't gotten there yet, because sometimes it seems like, after all these years, we should have gotten this right. But I think it's more complex when we start thinking about, you know, data privacy regulations and cybersecurity laws across multiple jurisdictions. What has been your approach, and what lessons have you learned that you can share with us around best practices or things that organizations can do to get this right at this very moment, even though we may not have gotten to the final destination yet in terms of maturity?

[00:04:44] JOSEPH: Yeah, that's a great question. And sometimes this is best understood through stories, so let me tell you two. Here's the first one. I have a global client, this is a number of years ago, and they have a workforce scattered all over the world. Okay. And the company wants to have a company culture, right? With a company handbook. And so it provides its IT equipment and policies that are supposed to be common to everybody around the world who works for the company. Good idea, right? And it has a policy that lets the company insist that its employees only use its equipment for business purposes, and it allows the company to monitor messages that are sent on work time and in the computer system. Now that all sounds perfectly normal in the United States, doesn't it? Okay. So here's what happens. A French employee is fired because he was browsing and doing other activities that I think everyone, anywhere, would consider very inappropriate and that clearly violated the company's handbook, right? That's not exactly what happened, but it was almost as bad; just to take an example, let's say the employee was planning to bomb a superstar's concert. Well, wouldn't the company be expected to take action, like firing the employee? So that's what the company did and terminated the employee.

Well, then we got a claim by the employee that the company had no right under GDPR and French law to monitor the person's online activities while at work using a company computer. And that's an example because France, for a lot of cultural and historical reasons, has a very different culture and set of rules that put individual privacy, even in this instance, on a much higher level than we would have it in the United States. So just think about that. Any organization has a problem creating a globally common culture unless it very clearly designs its data privacy policy so that it works across cultures. I can give you a second example, but what did you think about that one?

[00:06:59] KRISTINA: I thought it was an interesting one, because I think it sums up really well where we are today and the challenges we face going from jurisdiction to jurisdiction, and the fact that we are an increasingly interconnected world where everything is digital, where there aren't boundaries and borders, and yet we are stuck in these legal and regulatory schemas that very much have borders. How does that translate into virtual worlds? Or do you think we have even gotten to the point where we have a point of view on that, right? What happens when I'm in a virtual world? Does that French law still apply to me as a worker if I'm working in the virtual world? Is what we're defaulting to where the employees are physically, or where the customers or clients are physically based? Is that really where we're heading going forward?

[00:07:47] JOSEPH: Two quick points. Data, once it leaves somebody, is global. This isn't like creating a work of art or writing something on a notepad, which stays where it is. As soon as you turn it into digital data and it gets released, it can go anywhere, anytime, and it's virtually impossible ever to say, oh, we've retrieved it and we've deleted it. Impossible to do. So by definition, this is a global problem. That's the first point. And maybe I'll give you a second example here to build on your question, as to spear phishing and cybercrime. We have a client, a small to mid-size securities broker in the United States, and one of its wealthy customers travels a lot globally and sends an email saying, would you wire $500,000 to a Mexican bank? I'm trying to buy a piece of property in Mexico. So please do it. Okay. At least that's what it appeared to be. But within a few hours, it turned out the message was not from the client. It was from a thief, who then took the money and ran within hours. Okay. So even after very prompt efforts to get the U.S. and Mexican banks that sent and delivered the money to recover it, the money was gone in just a matter of hours. And hard to imagine, but there really are no international guardrails for this. And the Mexican authorities aren't really very interested in pursuing justice for a U.S. company that might have prevented it in the first place; they're not in Mexico. Now, is that a one-off? No, not at all. This is a matter where there are no set statistics, but cybercrime is rumored to be approaching the level of illicit international drug cartels. So we're in a world where criminal syndicates anywhere, especially in countries that aren't particularly interested in pursuing cyber criminals, are thriving through cybercrime because of the global nature of data. Now, Interpol is pretty good at global cooperation against murderers and thieves of tangible property. Not so good at dealing with data and with cybercrime.

[00:10:06] KRISTINA: So if Interpol can't get it right, what do you think organizations need to do, especially multinational organizations? What can they start to do, and how do they improve their posture to get this right?

[00:10:17] JOSEPH: I would start not with what the law is at a given moment. It's evolving too fast, and that'll continue to be true. I'd start with: does a company want to be privacy-centric? And I'd be very careful about what a company promises it's going to do and what it can't promise it's going to do. That would be the start of the policy. And this should come from the top of a company. This is not just saying, well, lawyers, write us a policy, whatever you say goes. No, no. This is about how a company deals with its customers, its suppliers, its contractors, and everybody else. I'd start with that.

[00:10:54] KRISTINA: That's an interesting point that you just made. Maybe if we just kind of dialed back for a moment what you said and played it back to ourselves slowly on repeat; I think it's a very nuanced and important point that we don't think enough about, right? You said it's not enough to have a lawyer write a policy. It needs to go back to the top of the company. And it sounds like it's really about the culture and the credo of the company, and thinking about: what is your stance? Who do you want to be? How do you want to approach your business as a whole? And that business includes data, and it includes privacy. Is that a right way of thinking about it?

[00:11:34] JOSEPH: That's right. I mean, not to veer too much on a detour here, but take Boeing right now: some accidents happened, some people died, a terrible thing in a year. But the mistakes made there weren't made by the CEO. They were made by somebody way down below. What you can criticize Boeing, or anybody in that situation, about is: why wasn't a policy, a set of standards, monitoring, reporting, and everything else created from the top saying, we don't want these things to happen? You see, that's really the way data has to be thought of here. It's not just a legal compliance issue. It is that, or you could suffer enormous fines, especially in Europe. But it's not about that, really. It's about a company's reputation. It's about what a company can responsibly promise. And if you start with that, the legal compliance aspect becomes a subset, which is what it ought to be.

[00:12:30] KRISTINA: That's really a crucial conversation to have right now, it seems to me. I think this morning I saw at least two different people post on LinkedIn asking who should own AI governance in the enterprise. And I was a little bit miffed, right? Because it feels like we had that same conversation 30 years ago around who should own your website. And then we said, well, who should own your mobile applications? Who should own your IoT? Who should own your AI? Who should own your fill-in-the-blank of the next new technology? Are we creating just too many silos, you think, around technologies in organizations? And how have you seen that handled from your vantage point? Should we put everything with legal? Should we continue to build silos? Is that the best approach?

[00:13:16] JOSEPH: Well, that's a great question, and here's my response to it. I mean, how did I get into data privacy? I grew up in Cincinnati, Ohio. We're headquarters to Procter & Gamble and GE Aviation, but we're not thought of as an international city. As a lawyer, my specialty became international, and I became the business development chair of one of the world's leading law firm networks. It became very clear to me 25 years ago that data privacy by definition was going to be a major international legal matter. So we set out to do that. And that led to the creation of the world's first alliance, not of law firms, but of law and tech firms together, devoted strictly to data privacy. It's called Privacy Rules. I'm emeritus now; I don't have anything to do with it. But this is the point: you've got to combine lawyers with tech experts who then work with the top of a company to say, what kind of policy do you really want to have? And the tech people have their job, the cybersecurity part, and the lawyers have their job, the legal compliance side, but they don't decide the policy. That's got to be set by the company.

[00:14:27] KRISTINA: Who will trigger those policies, though? From your perspective, what you're describing makes a lot of sense to me, but somebody still has to trigger it: I don't know, we need a gen AI policy, we need a data localization policy. Somebody has to recognize that. Is there a natural steward, you think, in the enterprise?

[00:14:46] JOSEPH: Well, if the CEO says, look, I'm not an expert at this, I want so-and-so to work on this, that's good. But it shouldn't be siloed, which was the phrase you were asking about. There's more than one silo here. There's certainly the tech, cybersecurity silo, and there's the law silo, but they have to work together to then create a policy that is global by definition. Now, from the law side, it can have a sentence in it that the company will comply with any applicable law. Almost doesn't need to be said, but that's fine. But what is the real policy? I mean, I've seen so many policies, Kristina. The current ones would say, here's our policy, and if you're a California resident, you have extra rights, and if you live in state X, you have other rights. Well, wait a minute now. I'm in Ohio; we don't have a state code. You mean I don't have the same rights as somebody else? How's that fair? And what's the company telling me? You can see how that is driven by good legal advice: for California, here are the rules, and elsewhere it's something else, and if you're in Europe, it's GDPR, and so on and so forth, without saying, wait a minute, shouldn't our customers, our consumers, have similar rights? The applicable law may augment that; that's fine. But you see what I mean. I think that's how it has to start.

[00:16:14] KRISTINA: Yeah, I see your point, and I think it's a really good and interesting one. And I wish in a way that people would maybe look at it from a different angle. In fact, yesterday I actually sent somebody an updated version of their privacy policy, and I stripped out all of the "if you live in California, if you live in Colorado, if you live in Virginia," and I was like, okay, enough of this nonsense. And I call it nonsense because I've worked on breaches with clients in the past. And what's fascinating is, when a breach happens, it's really, really hard to determine where the user is located for reporting those breaches. So in a way, it really harks back to your point, which is it might just be better, easier, probably more effective and efficient to just treat everybody as equal. Give them the same rights. Otherwise, the devil's in the detail. If a data breach occurs, how do I decipher where you were, where you are now, what rights apply to you? Is it not best just to say, okay, these are the rights that everybody across the U.S., or even across the globe, has? It seems simpler, but is there a drawback to that approach?

[00:17:20] JOSEPH: Yeah, I have a lot of clients who have three passports now. They're citizens of three different countries. That wasn't true 50 years ago; generally, you could be a citizen of one country. Not true anymore. We have green card holders. Well, they're not citizens of the U.S., but they're permanent residents. Okay, well, what law governs them? Is it GDPR? If you want to get into the weeds like that and say we're going to treat people differently, you're going to be paying enormous legal bills. And maybe to some extent it's inevitable. But if you start with saying, you know, here's our policy, and if we have a data breach, we're going to treat everybody pretty much alike, what's the real harm in that, compared with trying to figure out where you're really domiciled or resident? Those are two different things, you know. You get into the weeds quickly here, and it's a nightmare.

[00:18:09] KRISTINA: From a corporate perspective and a corporate legal perspective, is there a drawback to doing that? Like taking that blanket approach and just saying, look, we're going to do the right thing and this is what we're going to define the right thing to be and everybody is equal?

[00:18:20] JOSEPH: Well, there's certainly a benefit: it's efficient. Now, there are differences. You can't treat everybody quite the same. There are laws, particularly in more totalitarian countries, that are different from United States laws or the GDPR and so on. I'm not saying there are no differences, but start with how you're going to treat people in common. That's how I would start writing your code. I mean, I left a large law firm, a wonderful law firm with a nice privacy policy, pretty clearly worded. Mine is worded very differently. I have a solo practice. You won't see a reference to law, except that we're going to comply with any applicable law. It says: here's how we try to protect people's data, here's what we promise, and here's what we can't promise. Very plain English.

[00:19:11] KRISTINA: I love that. I love plain English. I love the idea of plain English.

[00:19:15] JOSEPH: You won't find many worded like it.

[00:19:16] KRISTINA: I love the idea of plain language that anybody can read and actually understand. Maybe we'll start reading more of these privacy policies, right? I'm curious: as you've written these plain language policies and made them practical and still comprehensive enough, because just because they're practical doesn't mean they're light; they can still be very comprehensive. How do you adjust for new technologies like AI or blockchain? And how do those actually change the landscape in terms of data privacy and security?

[00:19:47] JOSEPH: That's a great question, and let's take them one at a time; they're different. Blockchain: sometimes people think of that as, oh, that's cryptocurrency. No, no, no. Blockchain is just a tool. It's a technology that can organize exchanges; that's really what it's for, with a public ledger that in essence is more privacy-centric. So it's used in a lot of spaces far beyond crypto, but that's one of the spaces it's used in. Now it has an underlying advantage and a weakness in regard to data privacy. It aims to disguise personal identity. That's an advantage, but it's also its weakness, and we know that its users also can fall victim to data breach, phishing, and other losses. But blockchain is quite different when it comes to knowing who's involved on the two sides, behind the two keys that are out there, which is what data privacy is about. So that's one set of things that is not well addressed, by definition, because blockchain can be global and independent, on its face, of who lives where. Now, AI is really one of the giant issues here. And this is much more than just, does it work well or not; like everything we work on as humans, it'll continue to get better and better and better. But it all rests on data sets. Sometimes those are non-human data sets, like the safety of a workspace or something. To the extent it relies on personally identifiable information, this is a critical topic. And let's take an example. The French medical health system, I think, is way ahead of almost anybody else as a public health system.

And it encourages French residents to allow their very sensitive medical information to be shared within this system run by the government. And it could be done in an anonymous or a pseudonymized manner, and that's how they approached it. Now that has allowed a massive French database to be gathered about the residents of France and their health. And what that means is, if you're in this system and you're in a car wreck, and the first responder comes and needs to know your blood type and your allergies, that'll be in the system in a privately protected way. A great advantage compared to, you know, what we have in the United States, let's say.

At the same time, more than 50 percent of French residents are now within the database. So it's accomplishing the public health purpose. But that took a lot of work developing the rules around it that encourage people to put their information into the system in the first place. And so they've got a very good data set, at least for French residents, for medical purposes. That shows you how this isn't just what government decides. And that's why I so applaud the work that the Data and Trust Alliance and others are doing in the private sector to develop standards that have nothing to do with politics or corporate advantage or anything else. They're really saying: look, we are all in this global data world; let's have standards that make sense, sector by sector. So, Data and Trust Alliance, keep up the good work. I hope that responds a bit to your question.

[00:23:14] KRISTINA: Thanks, it does; appreciate that shout-out. And what you're pointing to is something that I frequently see working with various global clients. Their heart is in the right place, right? Folks just don't seem to have all the tools yet. And what's interesting to me is, if you remove the big tech organizations and just focus maybe on the global 1000, people come into work really trying to do the right thing day in and day out. They're not out there trying to monopolize personal data, do something evil or bad with it, or exploit it. But what I do think is interesting is that sometimes those same people who are trying to do the right thing tend to have a really hard time balancing out the risks and the rewards of personal data in the context of business. What I mean by that is, for example, a lot of the legal teams or privacy professionals inside the organization will say: no, you can't put that information or the client's information into the consumer data platform, we will not allow that. Or: we're not going to share that with our sister operating company.

That's not okay; we didn't get that consent. And they will die on that hill. I'm wondering, from your perspective, how do we start to get people to the right place, where they understand the risks and the rewards and are properly balancing them? It seems like a change management issue rather than a tool issue.

[00:24:37] JOSEPH: Well, to me, I think the focus has to be on the infrastructure of data, because we just can't expect individual consumers to take an hour to read privacy policies before they, you know, buy a product or a service. It's just self-defeating. AI is so powerfully important for our future, and data sharing is so convenience-driven. The one-click of Amazon is one of the keys to its success, because it's easy to go buy something there, right? So we're not going to slow it down and say, okay, now you all as consumers have to go out and read all these policies before you decide to do X, Y, or Z. Instead, it has to be infrastructure that, from its inception, is privacy-centric, that can assure someone who's going to use it that they aren't accidentally sharing their personal data, so they're not going to have their money stolen, or have their health information shared, or be told that the information about their browsing history means they didn't get a job because it was a company that didn't like gay people. I can go on and on and on. Or a woman who wants to visit a pregnancy clinic: she doesn't want that information shared with people who might get $10,000 in Texas for turning her in. Unless we build the infrastructure that is privacy-centric to begin with, we're not going to achieve the great benefits that AI can provide, and we're not going to provide the level of privacy protection that consumers really expect.

[00:26:23] KRISTINA: Who should be leading that? Or how do we actually start moving in that direction, Joe?

[00:26:28] JOSEPH: Well, you already see the private sector leading. It's not the lawyers; we're the janitors of the world, in general. Now, we can write some good laws, and we can do stuff on the preventive side, but it's not going to be the lawyers. It's going to be the people who run businesses, and for two reasons. First, they want to do the right thing. You said it exactly right: that's what businesses want to do, at least decent businesses. I'm excluding criminals and gangs. All right. But businesses and organizations, they want to do the right thing, and they want to have a good reputation. And secondly, they don't want to lose money to government fines for violating data laws, or to the costs of data breaches, which are just going up and up and up in overall cost. So for those two reasons, it's going to be the private sector that drives it. And then government will come in, country by country, state by state, until we get a federal code, which we should have had a long time ago. And it's going to evolve; even then it's going to be changing. Law follows practice. That's what happens. And we shouldn't overreact. What I mean by that is, you get some terrible thing that happens, and government's instinct is, oh, let's regulate in a heavier way just to solve the 1 or 2 percent problem. We can't risk that in the data world. That's the real thing that's going on in the politics of data privacy: the struggle between a wonderfully evolving technology and the need to prevent disasters.

[00:28:05] KRISTINA: I love that. That's a really great way, I think, of summarizing the entire risk-reward formula that we're all trying to get right in our digital operations, so I really appreciate you sharing that. I don't know if I agree with you that lawyers are the janitors of the world, though, because honestly, as you said that, I thought, well, gosh, you know, I remember maybe 15 or 20 years ago, when I would run into lawyers at organizations and we talked about digital, they would sort of just shy away or look at their shoes. About maybe seven or ten years ago, I started running into more and more lawyers who are fluent in digital and in law. And I would say that they're anything but janitors of the world. I tend to call them the purple unicorns, which are more common now than unicorns, or at least becoming more common. How are you seeing the legal profession change, and how are we best to partner with our legal colleagues inside of enterprises to make sure that digital operations are done right?

[00:28:59] JOSEPH: Yeah, well, that's a good question. And lawyers aren't only janitors. Janitors means we clean up messes, meaning those are the litigators and the dispute resolvers; you know, a mess is made and people have to figure out what to do about it. But at their best, lawyers are peacemakers. That's really what they are: at their best, preventing misunderstanding, setting agreed expectations so that commerce can proceed and humans can get fair and relatively equal treatment. That's what lawyers do at their best. And they're, you know, very terrific at settling disputes, but they can be excellent at framing laws and regulations that are sensible. And what this means is what you just said: law is becoming so specialized since the time I went to law school over 50 years ago, and lawyers today are more and more trained in digital technology. And this is great. But digital technology also requires attorneys who understand the law and how the law interacts with digital technology. And that's what you're seeing. So data privacy lawyers are there really to be the peacemakers, on the preventive side of this. There will still be disputes, and lawyers will still need to kind of sweep up the dust of past mistakes, but at their best, digitally intelligent attorneys, and there are more and more of them, are going to help us move to a better data world.

[00:30:31] KRISTINA: Well, Joe, thanks for helping all of us move to a better data world. We could certainly talk for a number of other hours, but we're out of time for today. Still, I appreciate you coming on the Power of Digital Policy podcast, and your insights into many things, but especially into how lawyers act as peacemakers. I love that. And certainly, making sure that the right expectations are set so that commerce can thrive, which is what business is about, is really a good credo to have as we move forward. A really enlightening conversation, and I appreciate you being here today to have it with us.

[00:31:04] JOSEPH: Thanks for what you do, Kristina, nice to be with you.

[00:31:06] OUTRO: Thank you for joining the Power of Digital Policy. To get access to policy checklists, detailed information on policies, and other helpful resources, head over to powerofdigitalpolicy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.
