Jonathan Joseph
Jonathan Joseph is the Head of Solutions and Marketing at Ketch, a platform for programmatic privacy, governance, and security. Passionate about innovation, he has focused his career on disruptive technology and organizational change. He serves on the Board of Directors at Reel Works, which builds opportunities for diversity and inclusion in media through a platform that empowers underserved NYC youth to share their stories through filmmaking, creating a springboard to successful careers.
Building sustainable solutions for privacy requires building privacy programs that can scale effectively across existing and future laws. Privacy must also be a foundational layer of the data ecosystem, so that people's privacy choices are respected and enforced everywhere. Your UX, legal, marketing, and IT teams should all participate in delivering that experience. Jonathan Joseph discusses how these teams can become your organization's growth drivers.
[00:00:10] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.
[00:00:29] KRISTINA: Life before May 25th, 2018, seems simple, but since GDPR, CCPA, and other data privacy laws came into effect, we've been talking about privacy and data collection in the context of design and especially user experience. To continue this conversation today, we have with us Jonathan Joseph. Jonathan is the Head of Solutions and Marketing at Ketch, a platform for programmatic privacy, governance, and security. He's passionate about innovation, focusing his career on disruptive technology and organizational change. Jonathan, thanks so much for coming by to geek out a bit here on privacy and thoughts on design, and definitely lending us your marketing lens. Welcome.
[00:01:08] JONATHAN: Hey Kristina, thanks for having me. I appreciate you having me on your podcast.
[00:01:12] KRISTINA: Jonathan, what is privacy UX and its purpose?
[00:01:17] JONATHAN: It's basically the first thing a consumer sees when it comes to privacy. It's the mechanism by which you collect consent, and consent is such a funny word, but what we mean by that is permission, really. The permission to do what you want to do with people's data: collect it and process it. And so privacy UX is really that first line where a consumer can understand what you're doing with data. It's the first place where they can give you a sense of what they'll let you do with their data. And this whole idea that consumers own their data is important. Hawaii, for example, has recently been talking about enshrining the right to data privacy in its constitution and actually treating people's data privacy as property. That gives you an idea of how important this is. When you started your intro, Kristina, talking about, you know, where are we, is it dead, is it whatever — the consumer is now a very important motivator, more than compliance, I would say, as people think about privacy, and that's why marketing's involvement in privacy projects is so important.
[00:02:24] KRISTINA: How aware do you think consumers are? We talk a lot about CCPA, GDPR, and lots of different acronyms that most people can't remember, but how aware is the everyday consumer?
[00:02:35] JONATHAN: It's a really great question. So they're absolutely aware. Here's the thing: we actually did a study on this. If your listeners are into it, we can send the link. We did a study across the UK and the US of 5,000 people. They may not know the specific legislation. They're not going to know the acronym CCPA, and maybe GDPR, because it's been around longer. But what they do know, and what they do see, is the privacy UX, and they do see the regulatory action — the FTC enforcement, the enforcement of GDPR, some of these data breach cases. So privacy is very much front and center for consumers and for people, even though they may not know the acronyms. One of the questions we asked in our study was: how highly do you value your privacy? And we also asked how highly you value sustainability, and how highly you value diversity and inclusion. The reason we asked about those is that they're typically the values that brands like to share with consumers, right? They like to say, hey, we care about these issues as well. And what was interesting is that about half said they highly valued sustainability, and half said they highly valued diversity and inclusion. But data privacy was 75 percent. That's not to say it's more important than those other two issues; it just seems to have a broader appeal. And we've wondered why people answered that way. I think data privacy is just such a deeply personal thing. But to your question on awareness: they're out there, and they're thinking about it.
[00:04:16] KRISTINA: So, what do marketing teams need to be thinking about right now in terms of privacy and this consumer awareness?
[00:04:23] JONATHAN: Well, firstly, I don't think marketing teams are showing up enough in privacy projects. Some of the data on that: talking to Forrester about this roughly two years ago, something like 5% of privacy projects had marketing involved. This year it's something like 15%, so it's on the way up, but 15% is nothing compared to how important this is as a brand issue. So I think what marketing teams need to do is first show up. Show up in the context of privacy UX; show up in the context of articulating the value around privacy, sharing it as a brand value — like you do, by the way, with sustainability and diversity and inclusion, where you show up caring about the environment and caring about diversity. What's important there is you can't just maintain a Hollywood facade of caring about these things; you actually need to do something about it. You can't say you care about the environment and not reconfigure your supply chains, and in a similar way, you can't say you care about diversity but not hire that way in your recruitment programs. For data privacy, it's the same: if you're saying, I care about data privacy, and you're transparent about what you're doing with data, and you give consumers control over their data, you need to make sure that when they make a choice, it gets reflected across your systems. So, we like to say privacy is a team sport. It's not just a legal thing. It became a tech and IT thing. And now I think it needs to start to become a marketing thing. Between marketing and IT and legal, you figure out privacy, and you figure it out in a way that's more than just ticking the box on compliance. It's: I'm doing this to build brand value. I'm doing this to build a relationship with my consumers. I'm doing this to rethink my data strategy. Which is important for marketers.
[00:06:21] KRISTINA: You've worked a lot in this space, do you have a favorite project involving privacy UX that you can talk to us about, just so we can understand a little bit more what that looked like and what did you learn from it?
[00:06:32] JONATHAN: Yeah, it's funny. I think there are some principles that the folks building privacy UX need to think about. One of those is transparency: just being super clear about what you're doing with data and why you're doing it. We haven't given people enough credit for how much they understand the value exchange with brands, and that data is kind of the currency for that. What we found in that study is that people understand: hey, of course, I understand my data is the currency in this relationship between me and you. As I share that data, I know I get benefits in return — product recommendations, personalization, maybe discounts, or whatever the case may be. Consumers get that, and they've always understood it, all the way back to the soap opera, when those big consumer goods companies would sponsor soap operas. The idea was: we give you this delightful content, and in return, you're going to watch a few ads. So there was this social contract, if you will, and that hasn't changed. I think consumers have always understood that. But brands, to some extent, went on a wild west of collecting as much data as they could and storing it for as long as they could. Those days are gone. So principles like transparency and principles around responsible data use, I think, are critical to privacy UX projects. Some of the other principles, in addition to transparency: have an adequate and appropriate retention policy when you collect data — use the data for as long as you need it, don't just store it indefinitely. Be transparent about how you're sharing data. And then collect and use data that's appropriate for the thing you're doing. In legal terms, they call that data minimization. Marketers like to think of it as data relevancy.
Here are the data points I need — really thinking about, and restricting yourself to, the data that matters for a specific purpose. Those projects matter to me a lot. Sometimes in privacy UX, you can get stuck in: well, what does it look like, what color is it, and what position on the webpage does the privacy UX show up in? I think those things move the decimal point, but the big change comes from really engaging with consumers, articulating your privacy values to them, and then actually doing something about it. That's what I feel moves the needle.
[00:09:03] KRISTINA: I'm happy to hear you say that. One of the things that has bothered me lately: I used to see a lot of websites saying, hey, give us your email address, and we'll give you 10% off. Now it's like, hey, give us your email address for 10% off or $10 off. And then, the moment you enter your email address: give us your phone number so we can text you. And I tend to wonder how many people give up their phone numbers and then immediately opt out of text messaging. Do you still think we're seeing sort of a spray-and-pray model out there? Or is this really marketers trying to get closer to me and understand me more?
[00:09:39] JONATHAN: I think they're trying to get closer to you and understand you more, but I think what's driving it is that there are different types of data — your listeners will know this — first-party data, second-party data, third-party data. First-party data is what I collect directly from consumers. Second-party data is somebody else's first-party data that I've merged with mine to get a fuller view of the buyer journey, or audience overlap, or whatever you need to do. Third-party data comes from inferences — from folks looking at what you're doing across the web — and essentially, cookies have driven third-party data to a great extent, and they're going away. So I think what you're seeing, Kristina, is this realization that this big piece of the data strategy around third-party cookies is going away, and I need to replace it with something. The thing I replace it with is first-party data and this direct relationship with the consumer, and the way that manifests is your email address and your phone number. But look, I experience it at a personal level every time I fill out a form; I mean, I've got a fake number that I give out, because why do you need to text me about this? My email is fine. I don't need you texting me. And the times that I have received texts, especially in the context of political contributions — you just immediately regret it when you see the volume of texts you get, and then it doesn't stop, and it starts coming from different numbers. Anyway, I could go on about this forever, but it's almost like an immediate erosion of trust when you blow up people's phones like that, you know? So I think marketers need to be super careful about it; it's not just about, okay, great, I can have this relationship with my consumer, and I need their email.
I don't know if rational is the right word, but you've got to be almost reasonable in what you do after that, because consumers are now super sensitive to it, right? How many spam calls do you get, and whatnot? So that should be part of the equation. It's actually one of the reasons why we're seeing a big convergence between privacy portals and marketing preference centers. So this idea of consumer control — mandated by privacy laws, maybe. But now we're seeing a lot of clients saying, well, I want to ask my consumer questions that go beyond privacy. I want to ask: how often do you want to be communicated to? Which channels do you prefer? What would you like to be communicated about? And I think that's a really important step in giving the consumer a little credit.
[00:12:15] KRISTINA: So, back up for a moment. What is a privacy portal? Do we all need one?
[00:12:20] JONATHAN: Yeah, so define need. In some cases, you need one because it's basically a legal obligation. You have to have consent, the right permissions, and that could be opt-in or opt-out. It could be just the fact that you're delivering a disclosure on what you're doing with data. So if you're running analytics, if you're doing targeted advertising — in most states where there's a data privacy law, you need a portal where people can come in and change their preferences. So, you need one. But the case I'm trying to make is that you also want one, whether you're legally mandated or not, because as we were saying earlier, this is an important issue for consumers. So I would say, yeah, absolutely, you need one. And to add to that, it's merging with this idea of marketing preferences and communication and what you want to hear about.
[00:13:11] KRISTINA: Is that something that organizations need to start introducing to consumers or is it something that can just appear? I'm wondering how digital teams, or how even marketing teams introduce the concept of a privacy portal. Do they need to, or do consumers just expect this to appear? Do we have to have a journey that we take people through in terms of education? What does that look like?
[00:13:31] JONATHAN: I think there is a plug into the journey. Over the last however many years, marketers have put a lot of effort into this perfectly curated digital journey for consumers. The website is carefully considered — the colors, the graphics, the journey people take, the content they see are all very carefully considered. And then we said, I'm going to slam this cookie banner up in the midst of this carefully curated journey. I always thought that was crazy. It's this legally mandated thing that you've just got to put up — I don't care about it; I'm just going to put this weird step in there. So I think marketers need to think about how to integrate privacy into the digital journeys they've already carefully curated, including the look and feel. The privacy portal — you can think of it as a module that comes up when people first show up on your website, and that's where you have the opportunity to say: we care about your privacy. We're transparent about what we're doing. We're giving you a choice in what we're collecting, how we're using that data, and what we're doing with it. Then, of course, that goes away, and people start to engage with your website. But to maintain that, there should be a place they can come to — and this would be the privacy portal — where they can come back and change those preferences if they need to. And if you do a good job of that, you can collect more data and build this relationship with your consumer. Because they'll come back and say, you know, I like what this brand is doing. So now, yes, they can have my phone number. Yes, they can reach out to me about these other things. That's the idea: if you carefully curate the privacy experience in addition to your digital journey, then you build that relationship with the consumer. You get better data. They trust you more.
[00:15:26] KRISTINA: As a consumer, Jonathan, you're making my heart go pitter-patter. That's exactly what I want. I want a relationship with a brand. I want to trust them. I want to be on this lovely journey. And then I put on my consultant hat, and one of the things that I often hear is: but we need to push things to consumers. We need to get them to convert. We need higher conversion rates. So it's almost like, pounce on them all the more. There's tension there. There's a dichotomy there. How are you getting organizations to take a quick breath, pause, and say, no, really, the way you've described this is the way to go? It's building trust and getting consumers to engage. You have that relationship, and they will buy — if they're anything like me, they will buy. But maybe on their own time.
[00:16:12] JONATHAN: First, there's spray and pray, and spray and pray has worked — because you could annoy a thousand people, and two of them might say, okay, I was actually interested in that. And when it works, you say, well, spray and pray on another thousand people. Then there's this idea of, how do we get more specific about who's actually into this product, and how do I get more targeted? But for me, the holy grail is how you move to more of a pull model, you know? For me, the dream is where consumers just tell you: here's what I want from you, can I pull it down? And for the most part, they say that 80% of your revenue comes from people you already know. So you should be able to build that motion where they pull down what they need, and you just give them the content and the context. At the same time, you do have to build a mechanism for new customer growth. But I think there's a better way to do that than to spray and pray.
[00:17:14] KRISTINA: How do you see that working across channels? As we've been talking, I've been thinking about websites and portals and sort of more of the web world. Does privacy UX differ across channels? Like if we're talking about websites versus mobile apps, even virtual reality, and some of the new frontier technologies? Or is it consistent across the board? What does that look like from an organizational perspective as well as a consumer's perspective?
[00:17:37] JONATHAN: I think the important piece is how you connect all of the channels — what's the multi-channel experience? What I mean by that — and this is going to be a big theme in 2023 — is this idea of identity. It comes up in some of the laws, specifically in California's law. The idea is that if somebody opts out on a website and then shows up in your mobile app, and you know that's the same person, then you should reflect those choices in the mobile experience. So if somebody makes a choice on one device — because the part that's been difficult with some of the data privacy laws is that they talk about people and honoring the data privacy rights of people. But we don't show up as people on the web. We show up as a collection of devices. I show up as my mobile advertising ID. I show up as my email address. I show up as a cookie, right? I show up as all these different things, but the collection of those is basically me. I shouldn't have to consent to the processing on each one of my devices. At some level, I want you to know it's me. If I've said no on a mobile device and then I jump on your website, you should know it's me and just not ask me again. Right? There's such a thing as consent fatigue. So I think that's what's important here: yes, there are different experiences, but what matters is being able to connect them, reduce consent fatigue, and treat people like people, not like devices. Now, the new frontier technologies, I think, raise a different set of challenges. For example, in virtual reality, and in some of the thinking that's happening around the metaverse — I mean, they can track your eye movements, right?
[00:19:22] KRISTINA: Not creepy at all.
[00:19:23] JONATHAN: Crazy. Not creepy at all. Not creepy at all. It kind of reminds me of — remember A Clockwork Orange, where the guy is forced to watch TV with his eyes held open? Yeah. So now we're starting to talk about where we draw the line on some of these analytics, or some of the biometrics that you're collecting. There's a real need for this privacy experience to morph into some kind of ethical guidelines and frameworks, and we're not there yet as a world. But you saw Microsoft shelve some facial recognition work and come out and say, we shelved this because we're not sure how to think about the ethics around it, and the world needs to do some work on that. So around the new frontier technologies, that's absolutely something we need to think about: what are the ethics of what we're collecting, and how do we remove bias, and all that?
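The cross-device idea Jonathan describes — one person, many identifiers, one honored choice — can be sketched as a small consent store keyed by a resolved identity rather than by device. This is an editorial illustration, not Ketch's implementation; all class and identifier names here are made up.

```python
# Sketch: device identifiers (cookie, mobile ad ID, email) resolve to one
# person, so a choice made on any device is honored on every device.

class ConsentStore:
    def __init__(self):
        self.device_to_person = {}  # device identifier -> person id
        self.person_consent = {}    # person id -> {purpose: allowed}

    def link_device(self, device_id, person_id):
        """Record that this device identifier belongs to a known person."""
        self.device_to_person[device_id] = person_id

    def set_consent(self, device_id, purpose, allowed):
        # Store the choice against the person, not the device.
        person = self.device_to_person.get(device_id, device_id)
        self.person_consent.setdefault(person, {})[purpose] = allowed

    def is_allowed(self, device_id, purpose):
        person = self.device_to_person.get(device_id, device_id)
        # Privacy-by-default: no recorded choice means no processing.
        return self.person_consent.get(person, {}).get(purpose, False)

store = ConsentStore()
store.link_device("cookie:abc123", "person-1")   # web visit
store.link_device("maid:xyz789", "person-1")     # mobile app
store.set_consent("maid:xyz789", "targeted_ads", False)  # opt-out on mobile
print(store.is_allowed("cookie:abc123", "targeted_ads"))  # → False
```

The opt-out made in the mobile app is reflected on the web cookie without asking again, which is the "treat people like people, not like devices" point — and the consent-fatigue fix — in code form.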
[00:20:25] KRISTINA: You mentioned privacy and identity will be a hot topic in 2023. You've been out in the world talking to different folks at different conferences, doing a lot of speaking. What other things are on the table for digital operations teams or even marketers in '23? What should everybody be thinking about?
[00:20:43] JONATHAN: It's the data strategy, for sure. How do we rethink our data strategy in the context of some of these privacy laws? How do we rethink our data strategy in the context of the consumer and their awareness of data privacy and its importance as a shared value? I think that's a real opportunity for marketers. You're looking at regulators who think — who are openly out there saying — that what you're doing in digital marketing and advertising is surveillance capitalism. And as much as we might disagree with that, maybe there's some atonement that needs to happen, because we did have kind of a wild west of data collection and use, and we deliver a programmatic ad in milliseconds. The infrastructure to do all that is actually pretty impressive. But through a regulator's eyes, that's surveillance capitalism, and where do you draw the line? I don't think it's that polarized, and I don't think it's about capitalism. I think there's a middle ground where you can respect people's data dignity and still participate in a data-driven economy. Achieving that balance, I think, is what's important. It's not all surveillance capitalism. Consumers totally get it. They understand the value exchange; they understand that there's value in sharing data with brands — brands who do that responsibly. I think that's going to be a big thing for this year. I was just at the IAB conference in Florida, and it was basically a privacy conference. It was folks asking about privacy: how do we do this? How do we think about it? How do we engage with consumers around it? It's such a hot topic, and it has implications for data strategy and for UX. So I think that's going to be a really big thing this year. And we've got most of the US state laws becoming enforceable in 2023, so you can't ignore that. All throughout the year — California, Virginia, Utah, Colorado; January, July, and December — there are different layers of enforcement coming that people need to consider.
[00:22:52] KRISTINA: I had the privilege last week of talking with a group of folks on a digital operations team, and I asked them why they were collecting so much data on users just for the sake of it. I'd gone and signed up for something on their website, and they asked for my first name, last name, zip code, email address, physical mailing address, and date of birth. And all I wanted to do was download the equivalent of a checklist for caring for the elderly. So nothing really exciting or sexy — I could probably have found it without exchanging my data for it. But I asked, why are you collecting this much information? And what was fascinating to me is that the head of UX was in the room, and I looked at him and said, you know why? And he's like, I can't tell anyone anything; other people are driving this boat. So I'm wondering, how do we get the voice of UX experts — the people closest to the consumer — to be louder in this privacy conversation? And from your experience, how do we set them up for success, to help us achieve engagement and build that trust relationship with the user?
[00:24:02] JONATHAN: Yes, such a great question, Kristina. When privacy was just a legal issue, I think the attitude toward it was: okay, how much money do I need to throw at this thing so I can have some minimal level of compliance and check the box? Since then, what we hear a lot is legal teams saying to us, I need to do a whole lot more in privacy. For example, one of the big trends now in Europe, in the mid-market — GDPR has been around for a long time — is teams looking at, how do I actually optimize my privacy program? I need some level of automation. In the US, given that there are five state laws in 2023 and others are coming, folks are saying, this isn't a manual process; this is something I need to approach programmatically. So they're asking for optimization. What's happened is legal teams have said, I can partner with IT and tech teams: A, it gets me what I want; B, it gets me more budget. In a similar way, inviting in marketing and UX teams gets you what you want, and it also unlocks budget. In that study I mentioned, one of the key findings was about what happens if you do data privacy right — if you have responsible data practices. We wanted to quantify what that meant rather than just saying, hey, consumers care about privacy, good things will happen. The good thing we found is that it drives 23% more purchase intent. And purchase intent, as you know, is a really important marketing KPI. So 23% — how does that translate to revenue? Depending on what type of company you are — durable goods, long sales cycle, short sales cycle — 30 to 75% of that translates to revenue. So we're looking at 5% revenue growth at a minimum if you get responsible data practices right. The first time we showed that slide, we showed it to a group of lawyers who just wanted to understand how marketing could think about privacy.
And they were taking photos of the slide. I remember asking why they were taking photos of the slide, and they said, well, I think it's going to help me get budget, because now I can go to the marketing team and say: hey, you care about this, right? You care about building value. You care about revenue and growth. If I can handle the risk-mitigation side of things, and our tech team can handle collapsing the cost of the compliance side and get automated processes for privacy, and marketing can use this as a growth driver — doesn't that sound like a great mix? Collectively, we can get the budget for everything we want to do and elevate privacy to a board-level issue, which it is starting to become more and more, especially in the context of Sephora. Who wants to be on the front page of the newspaper talking about this stuff, you know? It's become a big reputational thing. So that's what I would suggest: be involved in these privacy projects. Find your privacy counsel. I bet your privacy counsel will thank you for finding them and being so engaged in this as a marketer.
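For readers who want to trace the numbers Jonathan cites, the study arithmetic works out like this. The figures (23% purchase-intent lift, 30-75% passthrough to revenue) are from the conversation; the script itself is just a back-of-the-envelope check.

```python
# Translate the study's purchase-intent lift into a revenue-lift range.
intent_lift = 0.23  # 23% more purchase intent from responsible data practices

for passthrough in (0.30, 0.75):  # share of intent that becomes revenue
    revenue_lift = intent_lift * passthrough
    print(f"{passthrough:.0%} passthrough -> {revenue_lift:.1%} revenue lift")
```

Even at the low end of the passthrough range, the result lands in the neighborhood of the "5% revenue growth at a minimum" figure quoted above.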
[00:27:22] KRISTINA: I'm taking notes on that exact recipe. It's actually a perfect recipe, and it's probably the most succinct version I've heard anybody deliver. So thank you for that.
[00:27:30] JONATHAN: I just made that up now, so...
[00:27:31] KRISTINA: You might want to patent it before we go and release this episode. But no, it is great.
[00:27:37] JONATHAN: Thanks. I'll re-listen to the podcast, and make sure I've got it all right. Yeah.
[00:27:40] KRISTINA: There you go. We'll experiment. You said a few minutes ago that GDPR has been around for a while, and it's hard to believe we're almost at five years. It seems like such a long time. How do you see privacy evolving in the US, especially in the next five years?
[00:27:57] JONATHAN: What are some of the trends? Because of the complexity of US laws across different states — and all the states are different in some annoying little way — you have to put privacy on autopilot. You have to stay flexible in your systems. So one of the big trends we're seeing is toward automation. My theory is that the US will start to drive a lot of that technical innovation because of this state-by-state complexity. In the US, I think we've always leaned toward: what's a programmatic solution for this? How do I take out the manual processes? My experience with Europe is that they're actually much more focused on the legal side. What are the compliance checks? How do I think about cross-border transfers? How do I think about UK or European citizens' data being stored with US cloud providers? They think through those legal issues. They have Max Schrems and other — let's call them privacy agitators, in a positive way. So they're thinking through those legal issues. But because the US has had to deal with the state-by-state complexity and work through the technical innovation to address it, that's now flowing back to Europe, where they're saying, yeah, we should optimize our stuff as well and make privacy programs more efficient, more automated. So I think that's an interesting trend — a regional difference. Maybe that's one way to answer your question, Kristina.
[00:29:30] KRISTINA: I'm excited about that thought. The image going through my head — I don't know if you ever watched I Love Lucy — is that episode where Ethel and Lucy are at the chocolate factory, and more and more candies keep coming out, and they're trying to box them, and it's going too fast. So Lucy starts eating the chocolate, and Ethel says, I think we're losing at this game. So it's not just about figuring out the automation. It's also getting the automation right. It's probably a combination of the EU and the US and everything coming full circle to make us all more privacy-aware and get us onto this hopefully happy path you're talking about — really thinking about UX around privacy and getting that part right.
[00:30:09] JONATHAN: The chocolates are just coming down the conveyor belt here, and the chocolates are all these different state laws. So people are like, I can't keep up. Is there one platform, one way to do it? And unfortunately, there isn't going to be a federal law in the US that bails you out of this. Politically, the chances of it happening are almost zero, and most people have said our best shot was the federal law that got drafted last year. So you won't get bailed out. You have to find an automation solution.
[00:30:41] KRISTINA: There you go, folks. You heard it from Jonathan. He's been right so far on many different topics; I think he's right on this one too. That's where I'm siding. We'll come back in five years, Jonathan, revisit your forecast, and see how right you were. I'll sign my name under that one, too, and revisit this. But for now, it's been a pleasure having you with us today. Appreciate the insights and the talk. And if you wouldn't mind sharing your report with us, I think it would be well received.
[00:31:06] JONATHAN: Yeah. Thanks for saying that. Thanks, everybody.
[00:31:10] OUTRO: Thank you for joining the Power of Digital Policy. To sign up for our newsletter, get access to policy checklists, detailed information on policies, and other helpful resources, head over to thepowerofdigitalpolicy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.