Episode 23

Published on:

29th Sep 2021

A Data Privacy Paradox

Canada's Top Data Protection Expert Reveals The One Area Where She Is Completely Torn

Sharon Bauer dedicated her career to privacy based on her strong belief that we all have a right to privacy, we all should be in control of our information, and free from interference in our private lives, especially from government and law enforcement.

Sharon explains how the very tools that privacy and security pros get excited about, such as end-to-end encryption, are the same tools that child abusers rely on to share images of child sexual abuse.

Sharon sheds light on Apple's announcement that it plans to scan iPhones for images of child sexual abuse and match them against known images in a database.

She explains how Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure.

Whilst we support a proactive approach to preventing such heinous crimes and commend Apple for trying to solve a problem that too many people are scared to talk about, Sharon explores whether this is a slippery slope to a surveillance state, whether the system will be accurate, and whether innocent users will be implicated due to incorrect predictions.

Discover:

  • What tech companies are doing to combat Child Sexual Abuse
  • Why privacy and security professionals are outraged by Apple
  • How to deal with this ethical dilemma

And so much more…

Ready to become a World Class Privacy Expert? Book your call to join the World's Leading Privacy Program

Sharon Bauer is a privacy consultant and lawyer, and the founder of Bamboo Data Consulting.

Her firm specialises in privacy, security, data strategy and a range of cutting-edge technology ethics work.

She provides strategic risk management and privacy compliance advisory services. Sharon works with diverse companies, from startups to multinational corporations, in industries such as technology, financial services, telecommunication, healthcare, sports, marketing, and retail.

Sharon implements global privacy programs and acts as a virtual Chief Privacy Officer for various companies where she provides ongoing privacy advisory services.

Before founding Bamboo, Sharon was a litigator for 10 years and then worked at KPMG in the national privacy team. Sharon is a sought-after speaker on privacy matters and emerging technologies and frequently publishes on privacy issues.

Listen Now...

Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/

Connect with Sharon on LinkedIn: https://www.linkedin.com/in/sharonbauerlawyer/


Subscribe to the Privacy Pros Academy YouTube Channel: https://www.youtube.com/c/PrivacyPros

Transcript
Sharon:

Apple has introduced a new software technology whereby they're going to scan your personal device, your iPhone, and search for child sexual abuse material, which is also called CSAM.

Intro:

Are you ready to know what you don't know about Privacy Pros? Then you're in the right place. Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts, the podcast to launch, progress and excel your career as a privacy pro.

Intro:

Hear about the latest news and developments in the world of privacy. Discover fascinating insights from leading global privacy professionals and hear real stories and top tips from the people who've been where you want to get to.

Intro:

We're an official IAPP training partner. We've trained people in over 137 countries and counting. So whether you're thinking about starting a career in data privacy or you are an experienced professional, this is the podcast for you.

Jamilla:

Hi everyone and welcome to the Privacy Pros Academy podcast. My name is Jamilla, and I'm a data privacy analyst at Kazient Privacy Experts. I'm primarily responsible for conducting research on current and upcoming legislation as well as any key developments by supervisory authorities. With me today is my co-host, Jamal Ahmed, Fellow of Information Privacy and CEO of Kazient Privacy Experts. He is an established and comprehensively qualified privacy professional with a demonstrable track record solving enterprise-wide data privacy and data security challenges for SMEs through to complex global organizations. He is a revered global privacy thought leader, world-class trainer, and published author for publications such as Thomson Reuters, The Independent and Euronews, as well as numerous industry publications. He makes regular appearances in the media and has been dubbed the King of GDPR by the BBC. To date, he has provided privacy and GDPR compliance solutions to organizations across six continents and in over 30 jurisdictions, helping to safeguard the personal data of over a billion data subjects worldwide. Welcome Jamal. Thanks for joining.

Jamal:

Good morning to you Jamilla. How are you?

Jamilla:

It's:

Jamal:

No, it's morning for Sharon.

Jamilla:

So we're delighted to have Sharon Bauer with us, who is a privacy consultant and also a lawyer. She is the founder of Bamboo Data Consulting, a consulting firm specializing in privacy, security, data strategy and a range of cutting-edge technology ethics work. She provides strategic risk management and privacy compliance advisory services, and she works with diverse companies from start-ups to multinational corporations in industries such as technology, financial services, telecommunication, healthcare, sports, marketing and retail. Sharon implements global privacy programs and acts as a virtual Chief Privacy Officer for various companies, where she provides ongoing privacy advisory services. Before founding Bamboo, Sharon was a litigator for ten years and then worked at KPMG in the national privacy team. Sharon is a sought-after speaker on privacy matters and emerging technologies and frequently publishes about privacy issues. Welcome, Sharon. Thank you for joining us.

Sharon:

Thank you so much for having me. This is going to be so much fun.

Jamal:

Yeah. Welcome, Sharon. We're so delighted that, with such a busy schedule, you managed to make the time to share some value with our valuable listeners. Sharon is going to talk about the latest Apple updates, what they mean for us and why we should be really worried. And I asked Sharon to jump on this interview having read a LinkedIn post that she wrote that really resonated with me. So we're going to get into that. But before we do, I'm going to let Jamilla ask Sharon an icebreaker.

Jamilla:

What is your favourite job that you've ever had?

Sharon:

What I'm doing right now, of course. It's funny, because I never really thought that I would absolutely love what I do. And for a long time I stayed at a law firm, for nearly ten years. And I think I was always kind of brought up to think, well, if you have a really great, steady job, just stay with it and be happy. But I wasn't. And then I think I was looking to try to figure out what would make me happy. And it took me a while to get to a point where I am so happy with my job, but I curated it, and that's what is, I think, so amazing, that I get to do what I want to do, not because I have to do it or it's expected of me. I get to curate this work environment and the work product that I end up delivering, and that is what makes me so happy. So the answer is what I'm doing right now at Bamboo Data Consulting. It's a privilege that I get to do something that I love to do. If you love what you do, you don't work a day in your life.

Jamal:

That's so inspiring to hear and I can completely resonate with you. I'm in a similar situation where I've set up my practice and I enjoy the freedom, and I enjoy choosing what I want to do. Not because someone has told me I have to do it, or because I'm expecting a pay check at the end of the day, but because I love to come and help and create value for my clients, both the corporate ones and the individuals that we're helping. And of course, all of the amazing things that we get to do, like this podcast, right? I can choose to do this and you can choose to make time to do this. Amazing.

Sharon:

Yeah. And I mean, something that you said about the freedom really resonates with me: the autonomy that you have and the decisions that you get to make that feel right for you. And you don't do it because you have to, it's because you really want to. And that makes such a huge difference in what you do every day.

Jamal:

Exactly. You know what I like most is I get to not only pick my team and my colleagues and the people I get to work with, like the really lovely Jamilla, but I also get to pick my clients. And I can pick and choose, and say, yes, I'd love to work with you, or no, your values don't align with ours, so unfortunately, we won't be working with you.

Sharon:

Absolutely. It's a lot about the values and the culture that you're creating around you.

Jamal:

All right, awesome. So we're going to learn a lot more about your career, how you got to that stage, the journey you're on, and why you really wake up each day raring to go in another episode. But for this episode, we're going to get down to business and learn a little bit more about what is happening with Apple, what's got you so upset, why you actually chose to get into privacy, and why this is something you're making a stand against.

Sharon:

You know, for me, privacy was always a paradox. I always enjoyed the convenience that technology provides to me, but then I started to very organically realize what is happening with my data and how my data may be used in ways I never thought of, ways that may impact my decision making and the way that I view the world. So that was really how my passion for this whole privacy universe came to be. A very similar paradox is happening with Apple at the moment. So for those listeners who are not aware, although I'm sure most of you are, Apple has introduced a new software technology whereby they're going to scan your personal device, your iPhone, and search for child sexual abuse material, which is also called CSAM. And I'll be speaking about CSAM, so that's what I'm talking about when I say that. Now, I've actually been following tech and CSAM for several years. The New York Times released an investigation several years ago about the responsibilities that tech companies have when it comes to CSAM. And that really intrigued me, maybe because it pulled on my heartstrings. I think it pulls on everyone's heartstrings when we talk about children and their rights and the fact that they're being exploited and not much is being done about that. And so it was something that always resonated with me. And I felt like this is almost the one and only area where I felt so torn, because of course we want to protect the children and we want to make sure that those predators are caught. But at the same time, we want to protect our privacy and citizens' privacy. And there seems to be this paradox. So one such paradox is Facebook and Messenger. Facebook, surprisingly, is the company that reports the highest amount of CSAM out of all tech companies. And that is because they are able to identify CSAM in their Messenger app. When Facebook decided to encrypt Messenger, it meant that they would no longer be able to identify those images, which meant that those predators could use that platform to send each other images and not be caught. Now, there was a huge push for encryption because citizens and privacy professionals were pushing for it. Right? We don't want Facebook to read what we're talking about with our friends, with our family, it's none of their business, that's private information. And so Facebook was listening to us and trying to accommodate what our needs were, what our desires were, but at the same time, they were enabling these predators to continue to send messages to each other. And it's really interesting. Mark Zuckerberg once said, encryption is a powerful tool for privacy, but that includes the privacy of people doing very bad things. And so all of this resonated once again when Apple decided that they're going to install this software on iPhones in the US. And automatically, without seeking your consent, without asking you to opt in or giving you the ability to opt out, they are going to scan the photos on your device irrespective of whether you've uploaded them to iCloud. So that was a big issue for me. When I first heard about what Apple was doing, I was actually impressed that a tech company was being proactive about trying to do something about this horrible thing that is happening, something a lot of tech companies try to ignore because they don't want to draw attention to the fact that their platform is enabling this kind of behaviour.

Jamal:

Hold on a second, Sharon. Isn't Apple known for, and don't they market themselves on, having all of these privacy and security settings that no one can get into? And wasn't there this big incident where the FBI was forcing them to let them get in, and they were saying, we don't have a back door? Is Apple now actually creating a back door into everybody's devices?

Sharon:

Absolutely. And so what I was going to say is, initially my first gut reaction was, this is amazing, someone's actually doing something to proactively deter or prevent predators from collecting and sharing CSAM. But of course, as a privacy professional, my mind goes to privacy and what the implications are for privacy. And I think that we're lucky in the area that we work in, being privacy professionals, that our minds do think this way. But your average individual is thinking, Apple is doing this wonderful thing, they are protecting children. And this is not my quote, it was a quote that I read somewhere, and I wish I knew exactly where it came from, but it said that Apple is using areas of moral certainty to get people to buy into areas of moral uncertainty. So look, I don't want to say they're using this topic that is so sensitive and pulls at all of our heartstrings to try to open this back door into our data, but it certainly feels that way. And remember the case where law enforcement asked Apple to break into a shooter's phone to get some information about the shooter. Apple's position was, we actually cannot break into this phone because of the way that we've designed our technology, there's no way that we can do that, it is private and there's nothing that we can do, there's strong encryption protecting this shooter's privacy. And Apple was successful in arguing that in court. The difference now is that Apple cannot use the argument that we cannot. With this new technology, they can only use the argument that we will not. Right? So this new technology does in fact allow Apple to get into your phone. So the next time law enforcement or government agencies ask Apple to go into users' phones for any reason, forget about CSAM, for dissidents, for protesters they're trying to target, for any sort of reason, Apple cannot actually use the argument, we cannot do this. They'll say, we won't, which is quite a weaker argument. And because Apple is, for example, so reliant on China to produce its products, the concern for a lot of privacy professionals is, when will Apple cave in? If China, for example, asks them to surveil users of iPhones, will Apple be strong enough to say, no, we won't do it, without being able to say, we cannot do it? And that's the back door.

Jamal:

Yes. I think that is hugely concerning, because once you create that back door, okay, we might have started off with great intentions, and the argument is, why would anybody want the possibility for children to be abused and for the perpetrators of these horrible, horrible crimes not to be detected? But the other question is, okay, let's start off by saying we want to use it for CSAM, but then where do we draw the line? You just said that Apple can no longer say, we cannot do that. So pretty soon it might be, okay, you're going to do this for CSAM. It might also be acceptable to do this for other sexual crimes. And if we're going to do it for sexual crimes, why not do it for other horrific crimes, for murder, for actual bodily harm? And then why not start doing it for burglary? Let's see where they were at the time, they don't have an alibi, they say they were there, let's check the device. So where does it stop, where do we draw the line, and how can we actually mitigate or put protections in place for the fundamental rights of individuals, their rights to their freedoms and privacy?

Sharon:

That's right. And it also becomes an ethical question. Why is child sexual abuse more serious than a murder? And what about individuals who are protesting, or the LGBTQ community, whom some governments may want to target? So it becomes an ethical question as well as a human rights question. Where do we draw the line, and what policies does Apple have to draw the line of what they will or will not do? But again, it's the we-cannot-do-it versus we-will-not-do-it. By the way, we should also tell our listeners that Apple last week said, we're putting a pause on this. And I think it's because of the major outcry from all these privacy and security professionals, among other citizens and individuals across the world, saying you cannot do this. So we don't know if they're actually going to do this anymore. But if they are, what policies do they have in place to make sure that there are safeguards to stop them from having to comply with law enforcement? What if there's a court order? Apple is going to have to do something. At this point, they've kind of opened the floodgates, because they've shown the world that they can do this, that they can get into our devices. Before this, we always thought that they couldn't do that, that it was not possible. Now we know they can.

Jamal:

This really reminds me of the whole reason why the European Union decided to strike down the Privacy Shield. It was because American companies have to obey FISA, the Foreign Intelligence Surveillance Act, which means the US authorities can knock on any data controller's or data processor's door within the United States at any time and say, hey, hand us your data on anyone who is not on US soil, and they would have to do that. So if Apple are creating that back door, that means that the NSA and other intelligence services in the US can actually go to Apple at any time and say, hey, give us this data on this population, or these individuals, or this community. And how is Apple going to put protections in place? And where would that leave them from a competitive point of view? Because one of the features that they really sell themselves on is the privacy and the security. And now if that goes away, where will that leave them from a commercial point of view?

Sharon:

I think you nailed it. Absolutely. I think a lot of people don't realize, we all hold Apple in high regard when it comes to privacy and security, and to some extent we should. But what a lot of individuals don't realize is that Apple is already scanning your photos that are uploaded to iCloud. Right? So long as any photos are in iCloud, they're still scanning them, it's not completely encrypted. Any attachments to emails, they're scanning. And they are scanning for CSAM. And for me, going back to the original question of, am I for this or am I against this, I have to weigh the benefits and the risks. And originally, again, my initial reaction was that the benefits outweigh the risks. This is so good, someone is being proactive and we're going to get to the bottom of this. But then when you start thinking about it, these perpetrators are very tech savvy and they understand the intricacies of where they can save these images, how they can share them on the dark web, encrypted, via VPN. And part of the way that Apple is going to detect these images is by comparing a hash of an image to an already existing hash in a database. So are these perpetrators not going to be able to make a slight change to an image so that the hashes are not identical? Are we really going to identify that many more of these perpetrators and be able to catch them? I don't know. I don't think so. I don't think that most of them are just leaving the images on their device and that's it, we're just going to catch them all. I don't think that the benefits outweigh the risks.
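A minimal sketch of the matching concern Sharon describes here, assuming an exact hash comparison. This is not Apple's actual system, which uses a perceptual "NeuralHash" designed to tolerate small edits; the stand-in function and data below are purely illustrative.

```python
# Illustrative only: a stand-in for hash-based image matching. It shows
# why naive exact matching is fragile: altering a single byte of an image
# file yields a completely different digest, so a trivially edited copy
# slips past the check.
import hashlib

def digest(image_bytes: bytes) -> str:
    # Stand-in hashing step; a real system would use a perceptual hash.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known illegal images.
known_hashes = {digest(b"bytes-of-a-known-image")}

original = b"bytes-of-a-known-image"
altered = b"bytes-of-a-known-image."  # one byte appended

print(digest(original) in known_hashes)  # True  -> exact copy is flagged
print(digest(altered) in known_hashes)   # False -> slightly altered copy is missed
```

Perceptual hashes narrow this gap for small edits, but the cat-and-mouse question Sharon raises about larger alterations remains.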

Jamal:

I think I would be minded to agree with you. And I think there are additional risks that we haven't even thought about yet. What about our grandparents, uncles, family members who have innocent photos of their loved ones, of their children, grandchildren, nephews and nieces having a bath or playing in the park or on the beach, and they're sharing them amongst the family in a perfectly innocent way? What safeguards are there to stop those people being treated as suspects, to stop them from being locked out of their accounts, and to stop them from suffering any detriment?

Sharon:

Yeah, and I've thought a lot about that. So I'm a parent of two kids and I certainly have all these pictures of my children in the bath. And one of the first things I thought about was what would happen to those photos. I have to say Apple has thought about that. What they've done is create a safeguard whereby you must have at least 30 images that match the database of CSAM that they currently have. And if you have at least 30, then Apple starts looking into those images, locks you out and reports you to law enforcement. So they have thought about it. The images that you have of your children, grandchildren, nieces, nephews, whatever, once turned into a hash, have to match the images that are currently in the database. Now, hopefully they don't, because they're different pictures, they're obviously not CSAM. But the other issue is, okay, we're relying on machine learning, we're probably relying on AI, and we know that they're not 100% accurate, right? So what is the accuracy rate when we're talking about a system like this? Have they tested it? What are the results? And what are the chances that someone can innocently, like you said, be locked out of their phone and, even worse, be reported to law enforcement? What if someone decides that they have it in for you and they start sending you images so that they can frame you? Like, that's kind of scary.
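To make the safeguard Sharon describes concrete, here is a hedged sketch of threshold-based flagging: no single match triggers anything, and an account is only escalated for review once the number of database matches reaches the reported figure of around 30. The names and structure are illustrative, not Apple's implementation.

```python
# Hedged sketch of a threshold safeguard: an account is only flagged for
# human review once the count of matches against the known-image hash
# database reaches the threshold (reportedly about 30 in Apple's plan).
from typing import List, Set

MATCH_THRESHOLD = 30

def needs_review(device_hashes: List[str], known_hashes: Set[str]) -> bool:
    # Count how many on-device photo hashes appear in the known database.
    matches = sum(1 for h in device_hashes if h in known_hashes)
    return matches >= MATCH_THRESHOLD

# A family's ordinary photos hash to values that simply are not in the
# database, so the match count stays at zero and nothing is flagged.
print(needs_review(["aa11", "bb22", "cc33"], {"ff66", "ee55"}))  # False
```

Even with such a threshold, the accuracy questions Sharon raises remain: the safeguard only helps if the underlying matching itself is reliable.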

Jamal:

Yeah, that is really scary. The other thing I was thinking of is the English expression about the tail wagging the dog. What we're saying is there is a minority of the population who do some of these most disgusting, unimaginable things. There's no other way of describing it. And out of the minority of the population who are doing those things, I'm not sure how many of them are likely to be an Apple user. I don't know if Apple has done any research and said Apple users are more or less likely to be, but we've got the minority of people. Does it justify violating the privacy rights of the majority to identify this minority? And even then, the detection is only based on the machine learning and the limited catalogue of pictures in the database. What's stopping these abusers from learning what's already being detected and doing one or two very simple things to get away from being detected by that machine? So if there isn't really much it can actually do, and the detection isn't actually proven and is actually quite weak, how are they still justifying that this is the right thing to do by pulling on people's heartstrings, saying, oh, who wants to hurt children? Nobody wants to hurt children. But that is not the argument here. Nobody is saying, let them hurt children. We're saying what you're trying to do is not justified based on what you're trying to achieve and how you're trying to achieve it.

Sharon:

idea of how crazy this is. In:

Jamilla:

When you were initially talking about what Apple wanted to do, you said it was only in the US, or starting off in the US. Were there plans to move it to the UK or the EU, or was that kind of hindered by things like GDPR?

Sharon:

That is such a good question that I wish I knew the answer to it. I wasn't able to find anything to suggest that they were going to move ahead and implement it in other jurisdictions. I would imagine that at some point that would have been their intention, but I couldn't find anything in that regard. And so at the moment, it's only in the US. If this even happens.

Jamilla:

Yeah, it's really interesting. And I've seen kind of, I used to work in human trafficking and there was a really interesting use of technology that we came across.

Jamal:

Hold on a second, Jamilla. I just want to clarify. When you say you used to work in human trafficking, you weren't involved in human trafficking.

Jamilla:

I used to work with victims of human trafficking and modern day slavery in the UK. And there was an app, an online site, where if you went to a hotel room, you could take pictures of your hotel room and upload them onto the website, and then it could scan the background of the images and possibly help find victims who are trafficked into hotels. I always think of that as a really good use of technology that doesn't violate anyone's privacy rights, because that's me opting in to do that.

Sharon:

That is really interesting. But absolutely, I mean, technology can be used for lots of amazing things, but it can also be abused in so many ways. And it's about trying to find that right balance, but also holding technology companies responsible. If they know that they're enabling this kind of behaviour, then to what degree are they responsible for preventing it from happening or reporting it? So currently, tech companies are not required to search for these images. However, if they do come across them, they must report them. So do we need to change the law so that we are requiring them to look for these kinds of images? And even if they do, how are we going to prevent it? Okay, so we know that your platform has X number of images, and maybe more than this platform, but what are we doing about it? How are we going to prevent this from happening? It's complicated.

Jamal:

Yes. The other challenge with passing a law forcing tech companies to do this is, what's stopping us from passing another law, or leaving this law so generic that it can actually be abused by people who should be using it for one purpose but end up exploiting it for other purposes, really violating those basic human rights that we hold so dear?

Sharon:

Sure, you're absolutely right. And then the other argument could be, well, tech companies don't want to spend all their time searching for these images. So instead what they will do is encrypt and say, well, we cannot, kind of like Facebook did with Messenger. So now that's only enhancing the opportunity for the predators to use that platform. This is really quite a paradox, how you try to give something here, but then you take something there. It's a fascinating discussion and debate and I'm not sure how we're going to solve it, but it is interesting to see what Apple is trying to do. It's interesting to see where this is going to go.

Jamilla:

Yeah, and it'll be interesting to see if other technology companies follow suit and try to implement some of these privacy decisions. But that was a really interesting insight into Apple's decision regarding CSAM, so thank you for that. Stay tuned everyone, we will be hearing more from Sharon Bauer in the next part of this podcast. We'll be asking her a little bit more about how she got into privacy and how you can also become a world class privacy professional.

Jamal:

Yes, Sharon, thank you very much for joining us and sharing all of that great information about Apple and CSAM and about all of the things that we should be thinking about, not just as privacy professionals, but as individuals and human beings using smart devices. So thank you so much.

Sharon:

No, and thank you for bringing this to your podcast and bringing attention to this really important discussion.

Outro:

If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released. Remember to join the Privacy Pros Academy Facebook Group where we answer your questions.

Outro:

Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world class privacy pro. Please leave us a four or five star review. And if you'd like to appear on a future episode of our podcast, or have a suggestion for a topic you'd like to hear more about, please send an email to team@kazient.co.uk. Until next time, peace be with you.


About the Podcast

Privacy Pros Podcast
Discover the Secrets from the World's Leading Privacy Professionals for a Successful Career in Data Protection

Data privacy is a hot sector in the world of business. But it can be hard to break in and have a career that thrives.

That’s where our podcast comes in! We interview leading Privacy Pros and share the secrets to success each fortnight.

We'll help guide you through the complex world of Data Privacy so that you can focus on achieving your career goals instead of worrying about compliance issues.

It's never been easier or more helpful than this! You don't have to go at it alone anymore!

It’s easy to waste a lot of time and energy learning about Data Privacy on your own, especially if you find it complex and confusing.

Founder and Co-host Jamal Ahmed, dubbed “The King of GDPR” by the BBC, interviews leading Privacy Pros and discusses topics businesses are struggling with each week and pulls back the curtain on the world of Data Privacy.

Deep dive with the world's brightest and most thought-provoking data privacy thought leaders to inspire and empower you to unleash your best to thrive as a Data Privacy Professional.

If you're ambitious, driven & highly motivated, and you're thinking about a career in Data Privacy, a rising Privacy Pro or an experienced Privacy Leader, this is the podcast for you.

Subscribe today so you never miss an episode or important update from your favourite Privacy Pro.

And if you ever want to learn more about how to secure a career in data privacy and then thrive, just tune into our show and we'll teach you everything there is to know!

Listen now and subscribe for free on iTunes, Spotify or Google Play Music!

Subscribe to the newsletter to get exclusive insights, secret expert tips & actionable resources for a thriving privacy career that we only share with email subscribers https://newsletter.privacypros.academy/sign-up

About your host


Jamal Ahmed FIP CIPP/E CIPM

Jamal Ahmed is CEO at Kazient Privacy Experts, whose mission is to safeguard the personal data of every woman, man and child on earth.

He is an established and comprehensively qualified Global Privacy professional, World-class Privacy trainer and published author. Jamal is a Certified Information Privacy Manager (CIPM), Certified Information Privacy Professional (CIPP/E) and Certified EU GDPR Practitioner.

He is revered as a Privacy thought leader and is the first British Muslim to be awarded the designation "Fellow of Information Privacy" by the International Association of Privacy Professionals (IAPP).