#12 Digital ethics for a long-term, competitive advantage

Guest:
Pernille Tranberg

Pernille Tranberg is an independent speaker and advisor in data democracy, data ethics, data understanding, and digital self-defense for companies, authorities, and organizations. She is the co-founder of the European think-do tank DataEthics.eu.

She is a former tech and investigative reporter and editor-in-chief. She has written seven books. The latest, DataEthics - the New Competitive Advantage (2016), co-written with Gry Hasselbalch, is available for free at dataethics.eu/book. The one before that, ‘Fake It’ (2012), was co-written with Steffan Heuer.

Just like organic milk, digital ethics will become a global standard to which more countries, businesses, and individuals will subscribe. We are all at the beginning of this journey, and Pernille Tranberg shares what smart organizations are doing to lead the way.

Keywords:
Season:
1
Episode number:
12
Duration:
28:54
Date Published:
April 16, 2020

KRISTINA PODNAR, HOST: Welcome everyone to today’s episode of The Power of Digital Policy. Whenever the topic of ethics, whether broadly in digital or more specifically in data, comes up in conversation, my mind immediately goes to Pernille Tranberg, and in a few minutes you will understand why.

Pernille is an independent speaker and advisor in data democracy, data ethics, data understanding, and digital self-defense for companies, authorities, and organizations. She is the co-founder of the European think-do tank DataEthics.eu. Pernille is a former tech and investigative reporter and editor-in-chief. She has written seven books: the latest, DataEthics - the New Competitive Advantage (2016), co-written with Gry Hasselbalch, is available for free at dataethics.eu/book. Her previous book, ‘Fake It’ (2012), written with Steffan Heuer, is available at digital-selfdefense.com. All of these links are easily available in the resources tab of the podcast, so you don’t need to worry about jotting them down. Pernille, thank you for taking the time to speak with us today. Can you start us off by telling us how you got your digital ethics legs?

PERNILLE TRANBERG, GUEST: I was in media, a former investigative journalist and editor-in-chief of a consumer magazine, until around 10 years ago. I ended up proposing a new business model for Danish media, which they did not want, and then I gave up. I didn't want to be in a business which was just going down, down, down. So I decided to do a book called “Fake It: Your Online Identity Is Worth Gold” about how businesses were collecting data and how you could take control of your own identity, and how you take care of it. I did that in 2012 with a German journalist living in the US, and that started the business I'm working on now. It's partly about individuals’ privacy, helping individuals try to control their own data as much as possible, and partly about helping businesses and organizations treat other people's data in a responsible, ethical way.

KRISTINA: Why do consumers need to fake it? Shouldn't we all just not be online, especially in this age of surveillance and privacy concerns?

PERNILLE: Well, I think you should; when I say this, I'm talking to individuals. And kids under 13 should never use their own name. That's what I believe. My strategy, and what I teach, is that you should decide what's online about you. So let's separate your professional and your private identity. What you consider private, keep away from the public and only use privacy-focused tools for, and then optimize your professional identity. So I use my own name on Twitter and on my own website, and when I go out and participate in discussions online, I always use my own name, and I'm very, very visible online.

But when you Google me, you get more or less what I want you to get on pages 1, 2, 3, 4, and 5 on Google. And that's because that's your CV today. That's your resume. That's your online identity. Whereas I don't want people who don't know me to see, you know, stupid pictures of me and my dog and stuff like that. So on Instagram I have my dog and I post a lot of stupid pictures, and on Facebook, which I don't trust at all, I have had another name for 12 years, and all my friends on Facebook know it's a pseudonym. You know, before Facebook, everybody had this thing we called a pseudonym.

KRISTINA: For people who are trying to do that, I think that sounds great, but sitting in the United States, I know that if you Google me, for example, you're going to see when I purchased my home. You're going to see my home phone number, which by the way has been disconnected for years but is still in the search engine results. There's a lot of information out there that I personally, and others, don't have control over. I know that the EU is so much kinder in that way because there is more control over your data under GDPR, etc. But what do you say to people who aren't in areas where they can actually control some of that data, which is simply put into the public by our government or by other entities?

PERNILLE: Well, then you really have to work for a law. That's the only thing you can do. We have a right to be forgotten, so we can tell someone: “We don't want to have a public phone number. We don't want to have our house out there. We don't want to be on Google Maps.” A lot of people don't mind and don't do anything about it. But if you live in the States, you have to work to get better laws, like GDPR, so you get the same rights. If I lived there, I would try individually to take that fight and go to the provider and say you would like to have that data pseudonymized, so it can't be seen as yours. They might see the address, but not connected to your identity, if you don't want it out there. So in the States you have to do everything on an individual level. You have to take the fight on your own.

KRISTINA: Yeah, I have tried that. I've been very successful in changing, or really pseudonymizing, our data, but only to an extent. And, interestingly enough for those people listening in the United States, if you create a trust or some other type of legal entity, the county-level or city-level title offices are actually allowed to change the name on a title from your individual name to the trust's name or to some other type of legal entity, like a business. That's not exactly the right solution, I think, but it hopefully gets us one step closer to removing our personal names from those records. But what do you think about things like IoT? Given the IoT revolution, more data is now collected not just from individuals but from devices. Is this starting to be a good thing, how can we benefit, and can individuals keep fighting for their privacy in that arena?

PERNILLE: Yes, I think what we will see here is that we will have the good guys and the bad guys. For example, I would never use Google Home or Alexa, the one from Amazon. I would use Snips, out of France, Snips.AI, which is a voice assistant based on privacy by design. So we will see the difference here, and those who can afford it, unfortunately, and who are aware of it, can go after these products. That's also a way of showing that we want to be in control of our own data. So I think that companies going in that direction will get the consumer groups, the target groups, who are wealthy enough and aware enough about this. We really need to have that in IoT products. Otherwise, we're losing control completely.

KRISTINA: Do you see people starting to become more aware, though, of the IoT aspect, especially the voice assistants? I know that some people might say, “Hey, I have an Alexa in here,” as you enter their home, but is society on the whole moving in that direction? Where there is awareness, and warnings for people who are coming maybe to drop off a child for a play date or just coming for a cup of coffee, raising awareness of the fact that you're in an environment where you might be listened to?

PERNILLE: I think that awareness is bigger in the US than in Europe, actually. The problem is that there's very little you can do about it. And that's what we're waiting for: ways to act on this. But the awareness is definitely rising, also with a lot of different series like Black Mirror or Years and Years, and so many movies coming out showing the dystopia of this. So I think more and more people are becoming aware of this, and you can compare it to the environment in the 60s. You just threw stuff out when you were walking around Central Park; I remember seeing the Mad Men episode where they're picnicking at the beginning of the 60s and they just throw things away on the lawn, and that's sort of a picture of how we treat data today.

KRISTINA: What about the company side? I believe that organizations can create a competitive advantage by defining sound digital policies, especially in this new arena of IoT and voice search. By including those aspects of ethical data usage, they can free workers to innovate and do other things. You also wrote about this in the book Data Ethics: The New Competitive Advantage. From your perspective, what's the most important advice for companies to incorporate if they want to become data ethical? What does that really mean for them?

PERNILLE: That they have to think long term. If you are only in this to make a lot of money, it's not working. What I see is that those companies or corporations who are very much into being green and responsible with the environment and humans, you know, no child workers, we're paying our taxes, for those companies it's much easier to go into this now. And I'm working with some big companies who are working with us as well. They have the privilege of thinking long term, because it's not something you win on tomorrow. It will grow slowly, like environmental awareness. We are becoming more and more aware of this, and then you'll start choosing the companies that are treating you well and treating your data responsibly. And in Europe, we have laws which are backing this up as well.

KRISTINA: So how do we get that to trickle down from these privileged few large companies that actually have the funds to do it, to everyday people? How do we get everybody to sign on to this data ethics concept?

PERNILLE: Yeah, that's very, very hard. I mean, you haven't gotten everybody on the buy-organic-food ride; that's probably still 30% or 40% in Europe, or in Denmark. All the others are buying cheap, non-organic food. So of course, we have to work towards trying to get everyone on. We see Apple distinguishing itself from Google. Usually the people buying Apple products are in the wealthy end, and they are the first movers here. But hopefully the more people buy these privacy-focused products, the cheaper they will become. Just like organic milk: in the beginning it was very expensive; now it's at the same price level as non-organic milk, and that's why it is taking over. We can hope that this is the way it will develop with privacy-focused products as well.

KRISTINA: Does that mean that those that are disadvantaged in the world are the ones that are going to suffer the most and is that something that you see flattening out over time?

PERNILLE: Yes, yes, they will suffer the most in the beginning, as with everything else, because it will be those who are aware of it. In socialist countries, like some of the countries in Europe or Canada, there might be a possibility of trying to enforce it and make everybody use privacy-focused products. But in countries like the US, where the winner takes it all, there will be a big divide between rich and poor. And I'm really sad about that, but that will probably happen all over.

KRISTINA: So it's already happening, right? I think we're starting to see it with GDPR and a little bit with CCPA. I was more hopeful, I guess, at the beginning of this year, before I started to see all of the modifications and the newly proposed changes.

You're helping companies use data ethically. How does today's landscape, with GDPR, with CCPA, and with Brazil's similar law coming into effect later this year, help your mission? Are these laws really changing the world, or are they moving the needle a little while we watch horrors still happening?

PERNILLE: Well, they are just at the beginning of this, you know. With the corporations I work with, I'm not a consultant; I'm just giving them my knowledge and starting a network of big corporations working on this. So this is the very beginning. But if we look at this over a 10-year period, I think they will become role models and help spur change. It will take some time. If, for example, a big company stops buying YouTube advertising; some already changed on that a couple of years ago, I think when there was all this extreme content on YouTube. So we are already seeing the change, slowly. And if we see a big company use Signal, like you see the EU institutions using Signal instead of Facebook Messenger, that will make a change, because Signal is much safer and more secure than Facebook Messenger or WhatsApp. So with Signal or Wire, you know, we are seeing that change coming. And if those big companies can walk the talk themselves, that will change things, because we don't get privacy-focused products if we don't use them.

KRISTINA: That's a good point. We have to actually use those products if we're asking for them to be developed. Otherwise, it doesn't work, right?

PERNILLE:  Exactly.

KRISTINA: So you're developing these data ethics standards, which I think is just incredible and I'm very grateful. I'm sure a lot of people are as well, because it takes somebody to start a huge revolution. And so you're developing these data ethics standards for technology, for business. You're trying to educate the individual as well. Can you tell us more about the project and the progress? How is this going?

PERNILLE: It’s going very well. We're going out to schools and school teachers and individuals and companies. So it's moving; the interest is there and it's really growing. What I think is important as a next step is how we make sure that companies are anonymizing data in the right way, and how we distinguish between good and bad products out there. So I believe that we will have to develop some very serious, trustworthy certification schemes. Just like when you go to the supermarket and buy an organic cheese, you trust the certification scheme that says this is actually organic. This is what we need to help consumers and businesses buy ethical products, because nobody can sit by themselves and decide what is good. And that's what we are trying to do right now. Not that we are becoming an auditor or something like that, but we want to help that movement so we can develop that.

KRISTINA: Are you seeing corporations jump on that bandwagon? Because it seems like that would be a competitive advantage, just like the example you gave of organic. The organic standard, for example, in the United States, or perhaps non-GMO, has become such a big marketing draw. Do you see organizations coming to you and saying, “Look, I do want to distinguish myself. I want this label”?

PERNILLE: I think some companies are doing it just because, oh, it's going to be a competitive advantage, and they're thinking of profit. But what I actually see now out of Europe is some big companies doing this because it's the right thing to do. They simply have to do it, and they know they're not going to make money on it to begin with. One example, and I'm not working with them, is Lego. They are quite a nice company.

They work with your children, right? They've done this for the past 10 years, and they are up against Disney and Mattel and Pixar, or manufacturers from China. But they're doing it the right way because they had to, and they decided, well, we might not make money on this. For example, they don't use third-party cookies. They haven't been using them for 10 years or something, knowing that they will lose money on that, because they said no to the ad industry before everybody else did. So what I see is mostly companies doing this because it's the right thing to do.

KRISTINA: So it's the right thing to do, but it also places them at an advantage in a way, right? Because now that we're talking about third-party cookies going away, companies like Lego are ready to continue doing business as usual, and they don't have to go back and refactor their technology.

PERNILLE:  Yeah. That's why it's going to be a long-term competitive advantage.

KRISTINA: So if individuals want to look to these corporations and have them be certified around data ethics, what do you think that program looks like? Are there criteria for the certification?

PERNILLE: Yes, I think there will be some really cool certifications, like we have in all other fields. That's what we are pushing with our data ethics standards. My colleague is also part of the high-level expert group on AI, so we're pushing for these standards to be used for certification.

KRISTINA: That's very helpful. I’m hoping that a lot of other people actually come on board, especially from the US and even from Asia, because I don't see a lot of companies in Asia paying much attention to privacy or data ethics. It seems like Asia is a little bit behind the curve, but that might just be my perception. What is your take on that, compared to the EU and the North America region?

PERNILLE: Yeah, I do think that a lot of US companies are going this way, like Apple and Mozilla and some other really responsible US companies. I don't see that many in Asia, actually. I can't think of one.

KRISTINA: How do we bring everybody together? How do we really unite this around the globe, or is that just a pipe dream for the time being?

PERNILLE: I think it's a dream. I mean, take universal human rights: who is living up to that in China? So I think we have to start where it's most realistic, in the Western democratic world where we have democracies. That's where we should start. And all of this data ethics is also about creating a data democracy. It's about saving our democracies and our humanity.

And if Asia wants to come along… Actually, I think in Japan they're doing a really good job. Japan is also Asia, and Japan is an exception; they have big companies working with this as well. I just thought of that. So Japan is really cool on this as well. But it has to grow from below. It has to be the population in that country fighting for it. Just like we are seeing young people in Hong Kong demonstrating to keep their own democracy and not be part of China. That's what we have to see.

KRISTINA: So we're seeing more individuals step up, I think. I'm thinking about you, obviously, in this space, and your colleagues and partners. In the United States, I don't know if you're familiar with Parry Aftab; she's a lawyer who has focused for many years on children's privacy rights, and she's really taking it upon herself at this point in her career to fight for children's online rights to privacy. How do we come together, or do we even need to come together as individuals so that there's more of a force behind data ethics and the right to privacy? Or is it enough that we're working at this organic level, on an individual basis?

PERNILLE: No, you're very right about this, because the problem now is that some people see, oh my God, there's money in this. So there's a kind of competition in this area already, and a lot of people don't want to work together on it, which is really sad. I think we have to unite in a way; I don't actually know how to unite on this. I'm giving away everything I know. I'm trying to walk the talk myself and give away everything I can, of course. I'm also making money on this, but I've been doing this for 10 years and I have ethical pricing. I could actually raise my prices a lot because I'm very busy, but I'm not doing that, because I want to walk the talk and I want to share as much as possible. So I'm sharing everything I know.

It's the companies we charge, because I make enough money from them. But that's why we have to work at this as nonprofits. DataEthics is a nonprofit organization, and we work with a lot of other nonprofits. In the U.S. we see the ACLU, we see IEEE. We work with Mozilla; that's also a nonprofit. We are a big network of nonprofits working together, and I think that's probably the way to do it.

KRISTINA: That's how, I think, a lot of other initiatives have gotten traction, so hopefully you're right. So for people listening who want to get involved: I think there are plenty of people who say, you know what, I might not be able to do this full time like Pernille does, for free or for a very low fee, but I still want to get involved. I still want to give back. I can do something, because we can all do something. How do they get involved? Can they volunteer with you? Can they volunteer with another organization? What should we individually do today?

PERNILLE: Yeah, well, I think you should join forces with other people at school, or at your job, who know about this. Some of us are publishing tools, for example, and if you join forces with other people who understand how to use these tools, which is actually pretty easy, you can form communities. There is something called cryptoparties, which came out of Australia a couple of years ago. I don't know if that still exists, but those were people who voluntarily joined crypto parties where they showed each other how to install tools and services on their gadgets and helped others do that as well. So in a way, just using those tools is also a good thing. Use Signal instead of Facebook Messenger; get away from these bad tools and use the others. If you're in the U.S., I would say Signal instead of Facebook Messenger or WhatsApp. That would be a really good thing to do. And just read about them and try to use the alternative tools, even though they are not as convenient as the big, so-called free tools.

KRISTINA: Nothing's ever free, though. That sounds like good advice. And for organizations out there who are thinking to themselves, “You know what, I can't change everything that I'm doing because I don't have the resources, or it's going to cost too much, but I want to start doing something toward data ethics; it sounds like something I should be doing”: what are the little things they can do to get some momentum in order to take the really big steps?

PERNILLE:  So in the U.S. I would sign up for the new tech publication The Markup.

And then, as an organization, instead of enforcing the Chrome browser on my staff, I would use the Firefox browser or Apple's Safari browser. Those two browsers are really good, and they also block tracking by default, which is even more than GDPR asks you to do. That is data ethics: blocking third-party cookies by default. Or they can use Brave, which is an even better browser; Brave is also fighting for GDPR-like rights in the US.

KRISTINA: I love this, because you're demonstrating that we can take these small steps. It doesn't have to be an all-or-nothing proposition, and businesses and individuals can all do this together, and the outcome is going to be the best for everybody.

PERNILLE: Yeah. Small steps are important, because otherwise you just give up beforehand. It's just like saying, well, I'm not going out to vote in this election because it's not going to help anyway. It does help to take small steps. And it will take some time. But just take small steps and you are on the right track.

KRISTINA: Wonderful. Well, thank you so much for providing all of this insight. I'm very grateful, and I know that my listeners will be as well. I can't say enough how much we appreciate what you're doing, because, as somebody with a twelve-year-old in my household, I'm very keen on having privacy, not so much even for myself as for future generations. And a lot of my clients struggle with this issue. So for personal and for business purposes, you've given us a lot of great insights, and thank you so much.

PERNILLE:  Thank you so much for your podcast and the newsletter.

KRISTINA: Thank you to everyone for tuning in for another episode of The Power of Digital Policy. I hope everyone has enjoyed this conversation with Pernille. I know that for me, I am reenergized to take action around data privacy and ethics in my personal life, but I am also really energized to continue working on bringing this issue to the forefront inside organizations. Digital ethics is one of the new norms in our business, and as we heard today, it doesn’t have to be an all-or-nothing proposition. We can start small and build. We can make a difference from a business perspective and do the right thing. It will pay off, but we have to lay the groundwork.

Thanks again to Pernille for being with us. Links to the resources Pernille shared, as well as a link to download her book and sign up for her newsletter are on my website under the podcast tab. With that in mind, stay well and keep doing great digital policy work. And Pernille, I hope we can have you back again soon.
