S3 #06 The crusade to protect digital privacy

Guest:
Rob Shavell
Rob Shavell is CEO of DeleteMe, The Online Privacy Company. Rob has been quoted as a privacy expert in the Wall Street Journal, New York Times, The Telegraph, NPR, ABC, NBC, and Fox. Rob is a vocal proponent of privacy legislation reform, including the California Privacy Rights Act (CPRA).

The pandemic has accelerated the adoption of technology and the tendency to spend more time online. It has also fueled cheap access to consumer personal data, which has flooded the marketplace. Information abuse is on the rise and presents a serious problem for individuals. It is also a growing threat to business security, as the data is used for social engineering and for breaching protocols. In this episode, we discuss these trends and how individuals and organizations can fight back with digital policy.

Keywords:
social engineering, data abuse, data privacy, online privacy, data safety, data breach, data deletion, personal privacy
Season:
3
Episode number:
6
Duration:
28:51
Date Published:
April 14, 2022

[00:00:00] KRISTINA PODNAR, host: Your shopping habits, your family members' names, and even your salary are out there for anyone to see, but you can take control.

[00:00:08] INTRO: Welcome to The Power of Digital Policy, a show that helps digital marketers, online communications directors, and others throughout the organization balance out risks and opportunities created by using digital channels. Here's your host, Kristina Podnar.

[00:00:25] KRISTINA: Rob Shavell is the CEO of DeleteMe, The Online Privacy Company. Rob has been quoted as a privacy expert in many different outlets, including the Wall Street Journal, the New York Times, Fox, NBC, ABC, NPR, and all the other acronyms you can think of. Today he's with us to talk about data privacy and enlighten us as individuals and as digital workers. So, Rob, welcome.

[00:00:48] ROB: Thank you for having me.

[00:00:55] KRISTINA: Rob, when I think of privacy, I think of not having my name or age or address or phone number, salary, and other information available on the internet. Is that an outdated view of privacy for today's world?

[00:01:03] ROB SHAVELL, guest: I think privacy means so many different things to so many different folks. It's hard to place a definition on it. And if you go back to the Supreme Court, there is precedent where even the smartest definers, justices, and lawmakers in our country have had problems defining, you know, what it is. So, I don't presuppose anything, but I do think that the notion of privacy that maybe many of us share, which is that we can find spaces that are independent and free from surveillance, whether it's our names and addresses and information about our homes and families being widely available, or many other considerations. I think that a lot of the notions that, you know, our parents, or certainly our grandparents, might have had about those things are now quite outdated.

[00:02:04] KRISTINA: A lot of colleagues in the EU, of course, are thinking GDPR. And sometimes, when I talk to them about what comes up online here in the U.S., they're appalled. The fact that anybody can look up where I live, because housing information is online if you own a house; how much I paid for my house; how much it's taxed. Lots of personal details are available out there. Do any of us in the U.S. right now have a good way to opt out, or is it just a foregone conclusion that the information is out there and it's over and done with?

[00:02:34] ROB: Like many things in privacy, it's a little bit nuanced, and it's changing a lot. I think that context matters. And you know, we talk a lot about that at DeleteMe. What I mean by that is that, for housing, for example, there's a long history of house sales and other kinds of information, such as voting information, not just in the U.S. but in GDPR countries and across the world, having the status of a public record. And that means that governments and citizens have said, hey, you know, some of that information should be public. It's much the same concept that underlies the blockchain: there's value in having anyone be able to check on a price or a history of something. And those notions have proven useful and powerful in terms of transparency and other things. What I'm saying is I'm not necessarily against the notion of having house records or tax records or marital records, all kinds of things you could go on and on about, be publicly available. Because there's a precedent, there's a history, and there are reasons why transparency can be helpful to society. What is much more problematic, especially in the U.S., where we don't have GDPR, to your question, is that it's all become highly correlated with the rest of our individual personal information. And by doing that and putting it up easily available on Google, in profiles for sale for as little as 99 cents, it's really, in my admittedly biased opinion, a privacy nightmare.

[00:04:16] KRISTINA: So, talk to me a little bit about that, because when we look at DeleteMe, you're not really focused on trying to delete my record of homeownership. You're trying to focus on deleting potentially other types of information that I don't want on the internet. So, what really do you focus on, or what is the type of information we should all be thinking about not having publicly available?

[00:04:38] ROB: At DeleteMe, we'd love to be able to delete all the information that you want. So, I'd just say that our ultimate goal is to be able to handle any data requests that you might have, you or your family, or you as an employee of your company, any requests about any data that's out there held by a third party that really relates to you, that one might say is your data. We want to be able to facilitate the request, whether that request is for removing that data, closing your account, constraining that data so they make sure that they don't sell it to marketers and advertisers or what have you, or correcting it, or just being able to see it. We aspire to be the partner that people can rely on to make that easy, because right now, companies do not make it easy. Third parties that hold your data do not make it easy. And it's clearly not in their best interest to make it easy. But with that point made, what is it that we do today? And how does it relate to the fact that some of this information, for example house prices and things like that, is widely available online and you can't remove it under today's laws? What we do, and, you know, I think it can be highly effective, is we opt you out from all the data brokers that collect that information and correlate it to all the other personal information that's out there about you. So, we, in effect, can stop the aggregation and resale of a bunch of the information that gets out about you.

[00:06:17] KRISTINA: Can you give us an example of that? Because some of the things I'm always worried about are certainly my information, but also my child's information, and I'm a little bit shocked when I see online what I can actually find out about my child. What are the types of information floating out there that we should be worried about, that maybe we want to get rid of, or that we want to contain?

[00:06:37] ROB: Yeah. A very simple example is when you Google yourself, and any one of your listeners can obviously do this. What will often come up, not in every case, but often, in the first page or pages of Google is a litany of sites purporting not only to sell your information but to allow you to see details of your family, your child: their name, your cell phone number, your criminal background history, your employment history, how much your home is worth, photos of your home from the street. There are so many; they're marketers, throwing out any possible salacious detail that they can. And these are advertisements from the top data broker sites, like whitepages.com and Spokeo, and the list goes on and on. Unfortunately, the list changes a lot because these data brokers are multiplying like rabbits. So, when you click on those things, you can do a search, and you can find profiles about yourself that include incredibly detailed information. Oftentimes, not just the names of your children, but their ages, the relationships between them, and not just your children, but your parents, grandparents, aunts, uncles, their addresses, past addresses, things like that. And obviously, that information put to misuse, or even just purchased by somebody you're not comfortable knowing that much about you, is not where I think a lot of Americans, and a lot of citizens in any society, would be comfortable. It's just too much detail. And that's, unfortunately, where we are right now today, March of 2022.

[00:08:31] KRISTINA: I did that. I went and googled myself. And to be honest with you, I was overwhelmed. Is there such a thing as a do-it-yourself data removal process? What does that even look like?

[00:08:41] ROB: Well, there is. And I'll be the first to say, and we do hear this from some of our customers, hey, can you remove me from the internet? Delete me from the internet? Unfortunately, we have to educate them and say, no. That's very, very difficult, and achieving anonymity in the true sense and definition of anonymity has always been difficult. What you can do, to your question, is do this yourself. You don't need a service like DeleteMe or one of our competitors. You can go and perform these opt-outs yourself at many different websites. And you can also change the settings on a bunch of your social media platforms and things like that to constrain some of the ways that this information leaks out. And in fact, if you go to our website, DeleteMe.com, at the very top of the home page, you'll see a DIY, do-it-yourself, guide that breaks down in very simple steps, usually between one and five steps, how to do these removals and opt-outs. So, it is possible to do it yourself. And indeed, I think there are tens of thousands of individuals visiting this guide every week or month and doing it themselves. So, it is possible. It is simply time-consuming.

[00:09:56] KRISTINA: To know where to go and request the removal of data, obviously from data brokers, et cetera, you have to go and collect it first; DeleteMe seems to be doing that really well. But how do you deal with the data that you collect? Is it anonymized, or can employees see it? Is that another thing to be thinking about? Because you are kind of the ultimate broker: you delete the data, but you also have to collect it in the process of deleting it, it sounds like.

[00:10:22] ROB: We have that issue, and it is somewhat ironic, and maybe even a little bit hypocritical, that we have to ask for your data in order to go find it and remove it. And that means we have to have a database with your personal information that tells us where to go find it. We're creating yet another data set about you that ostensibly could be, I suppose, misused by us if we were a bad actor, and could be stolen by hackers in a data breach or something like that. Our answer is, look, in some ways, it's a necessary evil. In other ways, we would say that we don't have any more information in our databases than is out there about you already. So, in a way, it can't be any worse, and we need that information to do our job. It's not the most satisfying answer in the world, but it is an answer. Lastly, I would say that we take all the precautions that we can, and we go through security audits and everything else. But I tell everybody who asks this question: any company that says they know very, very well that your data is secure and you can trust them, and nothing will ever happen, no mistakes will ever be made, no hackers, well, forget it. That is simply not being truthful, because anything can be hacked.

[00:11:49] KRISTINA: One of the things that I've been thinking a lot about is our evolution toward the metaverse. At various organizations that I participate in, we argue about whether or not we're already in the metaverse. My stance is we're heading in that direction. We're not quite there yet, but obviously, we're moving from a two-dimensional to a three-dimensional world, or a web, at least. And with that comes exponentially more information that's going to be floating out there. Things like biometric data that maybe we don't want to have out there are, at least in my mind, presenting the next level of privacy information floating around. How are you preparing for that at DeleteMe? Or are you thinking about that yet?

[00:12:30] ROB: Yeah, we are, and I think the metaverse is problematic for data and data privacy in many ways, because, as you said, there's much more data, and much richer data, that can be collected if you're living a fully immersed second life in virtual reality, or across different virtual realities. So, you know, there's a set of tools and things that we're working on in terms of pseudonyms and recommendations for people and kids that are participating in these communities and worlds, and we expect that to increase. But also, we're looking at the privacy practices and the ability to opt out of things like biometrics, speech recognition, and other forms of biometric data that are going to be used, in our opinion, far too broadly and potentially in dangerous ways. We've never been a company that scolds users for being part of technology. We don't advocate not participating in social media. We're simply trying to give people a way to have somebody they trust look out for their data when they do the things that everybody's doing in the modern world. And that includes the metaverse.

[00:13:45] KRISTINA: I love what you just said there. People ask me, oftentimes, are you against technology? And I'm like, no, I'm the first one to raise my hand and say, I want cooler technology. I want better technology. I want life to be easier, more exciting, and more engaging. I just want it to be safe, and I want some aspects of privacy that make sense. And so I'm wondering about that: how are you dealing today with the data that's collected, or should I even be worrying about data that's collected through devices such as the Oculus? I know we have an Oculus device in our household. And it's fascinating, because once I went through the setup and gave permission to Meta to collect my information, I've never been prompted to give permission for data collection again. Even though that device has switched heads, right, from mine to my son's, to my father's, to my friend's, to her husband's, and anybody else who's wanted to check it out when they come on over. So how does that work in terms of privacy? Because that's another layer now, right? It's no longer about public records around my homeownership. Now it's about who's put that device on their head and what type of privacy I can expect in that context.

[00:14:53] ROB: Yeah, I mean, it's a great question. I don't have a good answer. These are new technologies. It's a good example of one of the many new problems and new opportunities being created by these technologies and the communities that are experimenting with them all over the world. There are complex use cases like you just said, device sharing in virtual worlds and the data that's being generated across different people, which actually may be a privacy opportunity rather than a privacy problem, because I think one of the things people are going to have to do is mix the signals that they're giving to big technology companies more than they're doing today. But I think there are also examples that are much more prominent, like the Echo and Alexa being out there in so many households now, and the potential use of facial recognition that's coming down the road. All of these things are going to require a whole lot of discussion, but ultimately, they fall back on the principles and the frameworks that have been pretty well laid out in the GDPR and the CCPA. And if you're not familiar with those acronyms, that's basically privacy legislation that's been sweeping across not just the European Union and California, but really all over the world. Even China, a year and a half ago, passed a personal information law. So, this is a global phenomenon, and the laws have very common structures in place. And the structures are simply ones that allow somebody whose data is being collected by third parties, which is basically a hundred percent of us, to do several things: one, get access to that data, and that could include data from your Oculus, data from Alexa, and whatever; two, correct it or constrain it from sale or from mistakes or whatever; and three, opt yourself out or delete it, if you decide that's the course you want to take. And I think citizens having those rights can apply, and this is what the laws are trying to do, to many future technologies and any use cases that we're going to see. So, the question is whether we can get those rights broadly here in the U.S. and in the rest of the countries across the world where they don't have them yet, and then how they are applied.

[00:17:23] KRISTINA: Do you think we need to have a digital ID for that to happen?

[00:17:25] ROB: No!

[00:17:27] KRISTINA: Okay. I was thinking more along the lines of, I'm thinking about it from a practical perspective, once we go into virtual reality or into sort of biometric data. But I was also actually, in the back of my mind, thinking about a more practical, today use case at DeleteMe. So in my household, we're known for having a gazillion different email addresses. My son has what I call throw-away email addresses, and we try to blur things all the time, depending on what he's signing up to do. I'm wondering if we need to go to a more centralized digital ID, or is it okay to continue to create a whole bunch of junk data out there that's just kind of floating around?

[00:18:10] ROB: I am a huge, enormous proponent of no single digital ID, and of lots of what your son's doing.

[00:18:21] KRISTINA: Okay, good. So I'm going to get a little star for that, at least?

[00:18:25] ROB: Gold star. Look, compartmentalization, to use a fancy term, is what your son is doing: using one email address for one game he's playing, another for another world, and another for another thing he's registered for. It's fantastic because it allows you to be somebody in one context and somebody else in another context, and it doesn't give marketers, spammers, and nefarious actors that can do much worse harm the easy ability to correlate data across the domains you're participating in in the digital world. And that's very powerful, and it's very simple for people to do.

[00:19:06] KRISTINA: So you've been doing that; I think that's a service that DeleteMe offers. We've also seen Apple offer that as well. What should people be thinking about? Is that sort of the number one hygiene thing that we should be doing? Is that kind of the easiest thing?

[00:19:21] ROB: Well, you know, it's a good question. I think the simple things that people can do are, first, go to your key accounts; just list for yourself the top 10 things that you spend the most time in: your email inbox, your most used social media accounts, your credit card, your cellphone. Go to the settings and choose the privacy settings that constrain the resale of that data. So that's one important thing. Another is, yes, use your same email address and your same phone number as little as possible across different contexts, or use throw-away ones when possible. And then, three, whether you're doing it yourself or using a service like us, go out there and make sure that your data isn't easily available with a single simple Google search, so that the digital footprint about you is one that you're okay with and isn't easily accessible to anyone and everyone. So I think those are the three things that pretty much anybody can do. It doesn't take a tremendous amount of time or effort, it's relatively simple, and it can truly help give you better privacy and reduce your digital footprint.

[00:20:45] KRISTINA: Organizations have been all about data all the time, and I'm wondering, do we really need data brokers? And for anybody who's listening, they're either part of a digital team or part of a marketing team. What do you think organizations can do, or rather what should they do to ensure the integrity of the privacy game?

[00:21:05] ROB: We need data; this is clearly not a binary fight. It's not like a regular, good marketing team in a company that is using lists to try to, you know, send targeted offers to a group should be run off the planet for aggressive, bad behavior. We are an economy, and the economy needs data to function. So, where to draw the line is a good question, a difficult question, but I would start with the fact that consumers' rights should be respected. So if somebody goes through the effort of telling you, I don't want my information in your database, or I don't want you to use my information for anything other than the service that I have agreed and paid you to deliver to me, whether that's paid in the form of attention from an ad I click or paid from a subscription I have with you, that should be respected. And that is a fundamental principle that I think marketers and companies collecting data of all types should implement. And I think in large part they are, or they will be: they will either be proactive about it, or they're being forced to do it by the CCPA and other privacy legislation that is coming soon. The thing that they're not doing is making it easy. And that is really where the debate over the next five or ten years, I think, is going to occur. Whether or not they should do it, they're going to have to, and many of them already are, allowing people to somehow get access to their data and opt out or constrain it. What they're not doing is making it easy. And I would even say most big tech companies are spending tremendous time and effort making it, what is the right word, intriguingly confusing for the average person, so that they don't take the options that they might otherwise take if they were presented clearly.

[00:23:21] KRISTINA: That's a very eloquent way of putting it. As a consultant who's also an individual, one of the things that I see oftentimes, and I'm a little bit appalled by, is clients who are trying to do the right thing and are willing to delete the data they have, because I think like 90% of the companies that I see out there are actually trying to do the right thing. They're not all trying to smuggle bad data from brokers and do really evil things, at least not on purpose. I see more organizations that are trending toward trying to respect user privacy, who are trying to give users choices. But inevitably, I see data that still slips into things like Google Sheets, spreadsheets that are emailed around to people inside or outside of the organization. So it's really this unintentional seepage of private data that's making its way out there. And that's the thing that makes me cringe, because, in some ways, I think, at least if you're deliberately handling the data, if you're deliberately doing something with it, you know that that data is at a specific place or part of a deliberate process, and you can potentially go back and delete it. It's really when you forget that you shared a spreadsheet that has my information, such as my name and my email address and my dietary preferences, with a conference organizer who's going to be catering the event, and you forget that it's out there on Google Sheets and it stays out there for the next decade. And Google now has my information about dietary restrictions, which arguably isn't the most private of data, but it still is private data. And so I'm wondering, from your perspective, how do we actually get at the data that's seeping out unintentionally? How do we actually clean that up, or does it even matter?

[00:24:55] ROB: I mean, I know, it's frustrating. But on the list of concerns that I have about unwanted exposure of our personal information, it's actually rather low. Because the bigger issue is the aggregation of the data that's out there about us and the correlation of it: the new data that we're generating in contexts like the metaverse, all of it collected from disparate sources, including your dietary preferences on one hand, and then your mobile data on where you've been on another, and then what you've purchased in the past, who your social network is, and how much your house was worth, all this stuff being fed into recursive sort of machine learning and AI engines to make decisions about us that we're not privy to. That, combined with something we haven't talked about yet, which is the doxxing and harassment angle on personal information, which is not necessarily just companies using your data, but unwanted individuals going from a place where they may have seen you in one context and then using this exposed information to basically harass you in unwanted ways, is another big, big issue where we see a huge and increasing frequency of incidents. So there's a whole lot, really, about the aggregation of data that I think is more troubling than someone accidentally leaking a spreadsheet in one case or another, simply because statistically, there's a lot lower chance of that affecting any one of us than these other broader things happening at the aggregate level of the data.

[00:26:47] KRISTINA: This is helpful, because I guess I'm feeling a little bit better, Rob. I'm also feeling more hopeful. The takeaway is to think about context and the data that's actually being put out there in terms of context. So don't worry about the little stuff, it sounds like, because you can't get a hundred percent of it. Worry about the big stuff that's going to impact you, and the rest will take care of itself.

[00:27:08] ROB: Yeah. And I think the other reason to be hopeful, and it's always good to end a data privacy conversation on a hopeful note, is that there's a lot of, I think, bipartisan support for laws that empower us as individuals to have more control of our data. I think everyone, whether you're, you know, far-right, far-left, or middle of the road politically, recognizes that what's happened over the last 20 years, which has been the wild west of data, is not what society ultimately wants. And I think those laws that help us get the rights back to control our data, if they're not already working their way through legislatures, are coming. And I think that's something we can all look forward to.

[00:27:59] KRISTINA: As you said, Rob, always good to leave everything on a hopeful note. So thank you so much for giving us hope, and thanks for sharing so much great information with us today. Appreciate you taking the time to come and hang out.

[00:28:15] OUTRO: Thank you for joining the Power of Digital Policy; to sign up for our newsletter, get access to policy checklists, detailed information on policies, and other helpful resources, head over to the power of digital policy.com. If you get a moment, please leave a review on iTunes to help your digital colleagues find out about the podcast.
