Episode 77

Published on:

6th Jun 2023

Ethics and Children's Privacy: The Next Big Thing

Can Age Appropriate Design Drive Business Success?

Brace yourself for an eye-opening episode as Jamal Ahmed and Jeffrey Kluge dive deep into the thought-provoking world of children's privacy and its ethical implications, exploring how age-appropriate design can drive business success.

Uncover:

  • The vital role of the age-appropriate design code in safeguarding children's privacy
  • The imminent wave of stringent penalties and regulations concerning children's privacy
  • The influence of AI on children's privacy and the unique challenges it poses
  • Why privacy professionals must prioritise children's privacy and ethics as the next big frontier

This is a must-listen episode for Privacy Pros who want to stay ahead of the curve!

Jeff Kluge, a distinguished Fellow of ForHumanity and a certified auditor in Foundations of Independent Audit of AI Systems (IAAIS), Ethical Choice, the Children’s Code, and more, plays a crucial role within its priority drafting team.

Through an engineering-driven methodology that utilises business language, he effectively translates the legal principles of Age-Appropriate Design Codes into actionable criteria, enabling teams to ensure not only compliance but also the creation of robust and morally superior products.

Jeff’s expertise and findings affirm the notion that adopting methodologies like Age-Appropriate Design Codes can prompt businesses to develop customer-focused products that inspire enthusiasm and purpose. This contrasts with prevailing systems driven by fear, anger, and anxiety solely for the purpose of maintaining user engagement. His firm’s product offering facilitates the creation and updating of the Impact Assessments and Age-Assurance solutions for all those looking to innovate in children’s technology, ethically.

Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/

Follow Jeff on LinkedIn: https://www.linkedin.com/in/jeff-kluge/

Get Exclusive Insights, Secret Expert Tips & Actionable Resources For A Thriving Privacy Career That We Only Share With Email Subscribers

 https://newsletter.privacypros.academy/sign-up

Subscribe to the Privacy Pros Academy YouTube Channel

► https://www.youtube.com/c/PrivacyPros

Join the Privacy Pros Academy Private Facebook Group for:

  • Free LIVE Training
  • Free Easy Peasy Data Privacy Guides
  • Data Protection Updates and so much more

Apply to join here whilst it's still free: https://www.facebook.com/groups/privacypro

Transcript
Jeff:

Google had a saying, "Don't be evil," that they later dropped. You wonder, well, why would someone drop a code, a mantra of "don't be evil", in the creation of systems? There's a fear in the boardroom or at the executive level that if we don't make money for shareholders, if we don't find a way to monetize this data, we are somehow in trouble. The codes now shift that to say: take that thought, but now it must be balanced against the best interest of the child.

Intro:

Are you ready to know what you don't know about Privacy Pros? Then you're in the right place.

Intro:

Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts. The podcast to launch, progress and excel your career as a privacy pro.

Intro:

Hear about the latest news and developments in the world of privacy. Discover fascinating insights from leading global privacy professionals and hear real stories and top tips from the people who've been where you want to get to.

Intro:

We're an official IAPP training partner.

Intro:

We've trained people in over 137 countries and counting.

Intro:

So whether you're thinking about starting a career in data privacy or you're an experienced professional, this is the podcast for you.

Jamal:

Good morning, good evening, and even good afternoon, wherever you're listening to this podcast today. My name is Jamal, and I am the founder and lead mentor at the Privacy Pros Academy, where we help driven professionals to unlock their potential, build thriving privacy careers, and future proof those careers. Today we're going to be taking a deep dive into children's data. And as you all know, I've become so much more passionate about protecting children's data ever since I became a father to my Amy, about 15 months ago now. And on today's episode, we have an amazing guest. We have Jeff Kluge, a distinguished Fellow of ForHumanity and a certified auditor in Foundations of Independent Audit of AI Systems (IAAIS), Ethical Choice, the Children’s Code, and more. He plays a crucial role within the priority drafting team. Through an engineering-driven methodology that utilises business language, he effectively translates the legal principles of Age-Appropriate Design Codes into actionable criteria that we can apply pragmatically. He enables teams to ensure not only compliance but also the creation of robust and morally superior products. Jeff’s expertise and findings affirm the notion that adopting methodologies like Age-Appropriate Design Codes can prompt businesses to develop customer-focused products that inspire enthusiasm and purpose. This contrasts with prevailing systems driven by fear, anger, and anxiety solely for the purpose of maintaining user engagement. Jeff’s firm’s product offering facilitates the creation and updating of Impact Assessments and Age-Assurance solutions for all those looking to innovate in children’s technology, ethically. Welcome to the Privacy Pros Podcast, Jeffrey.

Jeff:

Thank you, Jamal. Pleasure to be here.

Jamal:

So first question for you is what's your favourite Disney movie?

Jeff:

Favourite Disney movie? Ratatouille.

Jamal:

I love that one. I get a little bit hungry every time I watch it, and then I get a bit concerned about what's happening in actual restaurants, so I try not to watch it too often. How did you become interested in the intersection of children, technology and ethics? It's a very niche area.

Jeff:

Jamal, I think it started back in my early tween years. My family brought home an Apple II computer. Actually, mom and dad brought it home, put it on the table in our family room, and over time I taught myself how to reprogram the game. This was well before chat rooms, YouTube or any of the manuals. It was just a learning experience, and I loved it. As I went through middle and upper school and into college, I didn't have that worldly a view of what was available, and I went into finance. I had a very nice career in finance; my primary clients were founders, so I was sitting side by side with them as they built their businesses. Then I had a pivot in life and came back to technology. There was a book by Shoshana Zuboff, The Age of Surveillance Capitalism, that really captured my interest and attention around AI-based ethics. And then I got involved with a group, ForHumanity, which is looking to build trust and accountability in AI spaces. And I found the Children's Code. So it almost seemed this really fortuitous arc: to come back to technology, come back to helping protect kids and give them a really good experience in their learning, their own learning arc.

Jamal:

Okay, it's interesting you said protecting kids there. I'm curious, what are some examples of harmful practices you've come across targeting children?

Jeff:

I think a lot of technology today has not only been written by adults, but written for adults. There was a professor at Stanford, B. J. Fogg, who 20-plus years ago literally wrote the book on captology, which is the use of computers to influence behaviour. And people began to view that and use technology to manipulate people. That's a strong word, but to influence and to direct people in what turn out to be very intimate, very close-contact ways. Madison Avenue in the US has been doing this for 50-plus years in advertising, in normal hard print and TV-related spaces. And now technology is really in the palm of our hand, and children have it at very young ages. And I think that is part of the cause for alarm: there haven't been many people thinking about the messages to kids along that arc.

Jamal:

And was there an aha moment or a specific incident where you decided, hey, I'm going to do something about this, enough is enough, or something needs to be done about this?

Jeff:

It was the opportunity I had to become part of the priority drafting team at ForHumanity. We were looking at GDPR, and we were writing these criteria in business language for how a business could uphold those legal principles, and we moved on to the GDPR Children's Code. And I started going through these criteria and thought, nobody is doing this, or very few. I shouldn't say nobody, that's strong language; very few companies are doing this. But if you were able to do this, you would build amazing technologies, amazing products, amazing online-related services, if you had that focus. And that was the light-bulb moment for me that said, well, wait a minute, if nobody is doing this, here's the opportunity to teach, train, and help businesses learn this, and come on podcasts like yours to help educate a whole new generation of privacy professionals on how to start thinking about kids differently.

Jamal:

Absolutely. And in our privacy pros network, we have so many people who are specifically interested in children's privacy, children's security, and how they can help businesses and organizations actually build those solutions in a way that you've been talking about and you've been sharing. So for those privacy professionals who might not yet be familiar with the age appropriate design code or who aren't as versed on it as they'd like to be, can you tell us a little bit more about what this is and what it requires and why it's important?

Jeff:

Well, I think there's part carrot and part stick. The stick is that there are penalties if you begin to process, use, or collect data from children without parental consent, or without the express consent of that child given in child-friendly, age-appropriate language. The carrot is that if you do begin to think about it in that regard, you will build trust. You will build products that are now designed for children. I'm sure we'll get onto it a little bit later: some of the major social media apps, how those have been skewed and how they're damaging to mental health for kids.

Jamal:

Yeah, tell us more about that. We know TikTok recently had an interesting decision against them, with a fine, for some of those things. But yeah, tell us more about those social media practices and what's really concerning you.

Jeff:

When you look at TikTok as an example, there's one side of the equation that says TikTok's algorithm is really focused on what the person likes, what they're engaged in, and it delivers more content to them on that topic. There was a young girl in the UK by the name of Molly Russell who wound up taking her own life at the age of 14. Subsequent review of her phone and her computer showed a tremendous amount, well over thousands, of images and instances of self-harm, restricted eating, what people would call damaging mental health topics. And the algorithms that are written today see maybe a child exploring that topic, and then they begin to deliver more and more material in that space. It's okay for a child, perhaps, to explore a new topic such as their body image. But how can we then discern some positive effect from that, talk to them about their own view of themselves, maybe present more questions, a little bit more thinking, maybe direct some positive psychology back to them, as opposed to funnelling them into this negative feedback loop of what many would call detrimental material? And I think that's really a central point of age-appropriate design: kids have questions. Can you give them good information, quality information, quality data, help them learn through that process, and deliver something at the end that can have them leaving in a better mental state than when they got on?
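[Editor's note: to make the feedback loop Jeff describes concrete, here is a minimal sketch in Python. It is not any platform's actual algorithm; the Item fields, the safety and supportive flags, and the per-page cap are all illustrative assumptions. It simply contrasts ranking purely on predicted engagement with a wellbeing-aware variant.]

```python
# Hypothetical sketch of the feedback loop described above: an engagement-only
# ranker keeps serving more of whatever a user lingers on, while a
# wellbeing-aware reranker caps exposure to sensitive items and promotes
# supportive content on the same topic.

from dataclasses import dataclass


@dataclass
class Item:
    topic: str          # e.g. "body_image"
    engagement: float   # predicted probability of interaction
    sensitive: bool     # flagged by a (hypothetical) safety classifier
    supportive: bool    # flagged as positive, help-oriented content


def engagement_only(feed: list[Item], k: int) -> list[Item]:
    # Rank purely by predicted engagement: distressing content a child
    # lingers on scores highly, so more of it is served (the negative loop).
    return sorted(feed, key=lambda i: i.engagement, reverse=True)[:k]


def wellbeing_aware(feed: list[Item], k: int, max_sensitive: int = 1) -> list[Item]:
    # Same engagement signal, but supportive items are ranked ahead of raw
    # engagement, and sensitive items are hard-capped per page.
    ranked = sorted(feed, key=lambda i: (i.supportive, i.engagement), reverse=True)
    out: list[Item] = []
    sensitive_count = 0
    for item in ranked:
        if item.sensitive:
            if sensitive_count >= max_sensitive:
                continue  # skip rather than amplify
            sensitive_count += 1
        out.append(item)
        if len(out) == k:
            break
    return out
```

The structural point is the one Jeff makes: with the first function, a child exploring a distressing topic is fed ever more of it; with the second, the same ranking signal is constrained so exposure is capped and supportive material is surfaced instead.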

Jamal:

Who is going to be responsible for making decisions on those topics? What is good and what is not so good for the children? And how do we kind of understand how to weigh those things up so we can factor those in when we're thinking about some of the clients that we're working with?

Jeff:

Fortunately, there's the Children's Code, and a copy of it was passed in California last September, which becomes enforceable in July of next year. There are a couple of lines that say the commercial interests of the business must be weighed against the best interests of the child where conflicts of interest exist. And in some of the large social media platforms, some games, perhaps products and online services, there's a fear in the boardroom or at the executive level that if we don't make money for shareholders, if we don't find a way to monetize this data, we are somehow in trouble. The codes now shift that to say: take that thought, but now it must be balanced against the best interest of the child. A business that really got involved in this would have a child data oversight and protection committee: people specifically charged and trained, child psychologists perhaps, around what is in the best interest of the child. And there's an outline of the topics in the United Nations Convention on the Rights of the Child, the UNCRC, so there's a guide that can begin to give people a framework to look at. But then, for those that are parents: how would we want others to treat our children? I think for many years we've had parents that have just handed their technology over to the child. They've given them an iPad or an iPhone to entertain them, not knowing the full extent of what's happening behind the scenes. And I think parents are now becoming far more aware and better educated about, okay, what are the tools at work? What data is being collected, and how do I mitigate that? How do I arrest some of this stepping into privacy, for yourself, for your children, for those that are around you? And I think that's all part of this learning arc that many people are going through today.

Jamal:

Definitely. And I think what's also really positive to see is that it's not all hot air. It's not a case of regulations coming in, guidance coming in, and nothing being done about it. We are seeing social media companies especially being held to account. The recent decisions against TikTok and against YouTube are good examples of the fact that this is something that is going to be taken seriously, and it needs to be taken seriously. And although it might have been forgotten in a previous paradigm, where it was all about how do we monetize this data and give a return back to the shareholders, we're now saying we need to shift that way of thinking. That is no longer good enough. Whenever we're touching children's data, whenever we're designing platforms, whenever we're creating algorithms, we need to make sure there are checks and balances against what is in the best interests of the child; and not what the engineers think is in the best interest of the child, but what actual professionals who are qualified to understand the interests of the child can offer in advice and guidance, with a committee there to support you in understanding that. And if all of that fails, what you're saying, Jeff, is: how would you want your own child to be treated? When you were a child, how would you have wanted to be treated? And the cost of not having those thoughts, the cost of not having those discussions, is the unfortunate cases we see, like that poor girl Molly, may God rest her soul in peace. It is not uncommon for children, especially teenagers, to get sucked into the rabbit hole of seeing misery after misery, self-harm, suicide. Molly's case wasn't an isolated incident; it's just the one that was widely reported.

Jamal:

But we know from the work that we do that there are lots of children who are detrimentally impacted, whose mental health is impacted, because they're sucked into this world where they believe that the world around them consists of self-harm and suicide and negativity. It's because the algorithm shapes their digital world in such a way that they don't see that there is anything else. And sometimes they might mistake that reality for all that there is. And if that's how we're shaping a young mind, if that's how we're developing someone as they're really trying to understand and make sense of the world, while going through lots of hormonal changes as well, what hope do we have for these children to grow up and make this world a better place if all we're giving them is doom and gloom? So we need to balance those things so that we can really fulfil our vision, really fulfil our mission: making sure that every woman, every man and every child enjoys freedom, not just over their personal information, but over how that personal information is used to shape their reality and their understanding of the digital world and the metaverse, and in turn, how they then internalize that and act in the actual real world.

Jeff:

Yes.

Jamal:

In your experience, what are some common misconceptions or blind spots that you see among privacy and other professionals when it comes to protecting children’s data?

Jeff:

Jamal, I think the root of the opposition today is that people view protecting children's data, and the regulations that are coming about, as all negative, all downside to the business. And I hope to be able to change that by having these types of conversations, because it's just not true. Or it's not true enough to the point where we should just give up. As I mentioned earlier, in looking at how one would go about living up to those legal principles, let's look at the social media case again. A child comes along and they're uncomfortable with their body image; kids are trying to figure out who they are. Someone calls them a name on the playground, and that sticks in their mind, right? It hurts. So you go onto social media, you look up the word, you begin to ask questions of whatever system it is that you're engaged in, and you get into this spiral. But a really dynamite technology would acknowledge that, and it would begin to cue: okay, well, why are you asking these questions? What happened? Now, a parent being involved at that moment in time would be spectacular; an adult, some good, trusted friend that really had that child's interest at heart, to have that conversation around, well, maybe they didn't mean it, or what was the context of it. You'd have a good conversation on the bed at night or over the dinner table or wherever you could. I think technology, in the virtuous sense of what it can be, would begin to pick up on that nuance, would begin to pick up on the language, and could begin to deliver something of that positive message to the child, to give them hope, to give them another side. To maybe say: okay, what you did might have been the improper thing to do, but here's how you can make it better. Everything today has just been skewed into the negative, because they looked up that one topic and it just got worse. So, Jamal, for privacy professionals, children today are the start. But if you think about engineering your algorithms, engineering how you collect and process data, if we did that for adults, it would be amazing. And I think that's the message I really want to speak to, and to help the engineers and finance and advertising and boards of directors and the attorneys and lawyers involved in these businesses understand: there's really a dynamite pathway forward, to play good offense in business around developing something that is excellent.

Jamal:

Absolutely. I completely agree with that. And we're going to do everything we can in the network to help you amplify that message and get it to as many people as we can, and also help them understand why age-appropriate design isn't as difficult as it might initially seem, and how you've got solutions to start making some of those things more accessible and easier, so that everyone can benefit. Because when we get this stuff right, it's not just the business that benefits. The business benefits, the user benefits, the parents benefit, and hopefully society also benefits, because we have more balanced children, people with more balanced views, young people growing up who haven't been negatively impacted. And it's interesting, what you said at the beginning: we have to think of children as people, not something that is half a person or less than a person; they are actual individuals in their own right. And we always need to remember, as privacy professionals, as technical staff, as engineers, as chief privacy officers, as legal counsel, that we're dealing with small people here, and we have to treat them with the same respect we would treat everyone else. Just because they're under a certain age and we call them children, and they might not have rights like being able to drive a car or vote or take out a mortgage, it doesn't mean that they are any less valued or valuable as individuals, or to society, or to the benefit of mankind. And here's the thing: we are all going to expire at some point, and what's going to continue that legacy, what's going to continue making the world a better place, is the messages and the work we're doing now that we're going to pass on to the future generation; what we're doing to shape the way people are growing, the way people are thinking, the ethics they're adopting, and what is acceptable and what isn't acceptable.

Jamal:

And in this early era of AI, what technologies do you think can be harmful when it comes to children and their privacy?

Jeff:

Well, the topic du jour is generative AI. And I mentioned a bit earlier that when we think of AI systems or generative AI, there's a better term, I think, that we could use, and that's a triple-A system: an artificially intelligent, algorithmic or autonomous system, all designed, in a way, by math. We're seeing AI learn and grow on the data it has been fed, and I question, and I think rightfully so, the transparency of how those systems were created and whose data is in those learning models, because I don't think the businesses fully understand how those models have been trained with the data that they've collected. In California, you need to be registered as a data broker. I pulled up the spreadsheet at one point; there were over 400 businesses in California registered as data brokers. I think New York is doing something similar where you have to register. That's a lot of places that data can be sold. I've noticed on a few websites I've gone to, when you click the reject all box and it shows that other pop-up box of the vendors you're rejecting, there are over 100 different places that want to pick up and collect my data, and I have no idea who they are. You mentioned what technologies can be harmful. I think it's how we've collected data, how we've used it, and how we use that data to train models for these triple-A, artificially intelligent systems.

Jeff:

k was very profound. From the:

Jamal:

Every time.

Jeff:

So being a privacy professional today, having a watermark, if you will, of this is now a standard that we eagerly uphold, and having age-appropriate design embedded in our corporate ethos, I think, will become a calling card for people over the next five years.

Jamal:

So on that note, for privacy professionals who are looking to future proof their careers, what areas or skills do you think are going to become increasingly important in tackling children's privacy? And what should we start upskilling in now so we can be ahead of the curve and really future proof our careers?

Jeff:

It's a great question. Learning the laws, learning those legal principles, is a great first step. Second would be understanding what a business could do to be able to uphold those legal principles. So what does that look like for your engineering department? I think privacy professionals will become very good teachers throughout the organization, going to engineering, to finance, and perhaps even to the board level, and talking about how they can implement privacy-centric processes throughout the business, and not just think of it as this afterthought of, oh, we have to keep these legal criteria in the back of our heads. So I think privacy professionals will be good stewards throughout the entire business. They'll be good educators. Being able to communicate in any form or fashion will be a very helpful skill to develop; but also having good introverted people that really understand, and can be either good support or good leaders, will be valuable as well.

Jamal:

Awesome. Three key things I take away from that. Guys, if you're listening and you're thinking about future proofing your career, especially as you're thinking about the future and children's data, then what we first need to get to grips with is what the laws and regulations actually require: learn the principles. Once we've understood those things, the next thing we need to figure out is: okay, I know what this says, I know what it means, but how do we apply it? What is the pragmatic way we can actually realize and achieve these things from a business point of view? And the third thing I took away was that we need to grow our advocacy skills, so we can advocate a privacy-centric approach throughout the entire lifecycle, from the beginning, rather than tackle things as an afterthought. If we implement those three pieces of advice, then we have a very good chance of making a positive contribution to the well-being, the welfare, and the digital world of children all over the world. And it can also help us enhance our careers and really stand out as thought leaders moving forward in this really exciting, fascinating, responsible and precious space. Jeff, what’s one piece of advice you wish you'd been given at the beginning of your career that you'd like to share with our audience?

Jeff:

If I look back to when I was that young child learning about technology, part of it would be that understanding of who you are, what makes you tick, why you like what you like, being able to try a number of different things. And I think we're seeing the younger generations, the millennials, and certainly Generation Z here in the United States, beginning to live more of that. And I think that's very exciting, because you're seeing a change where ethics matter, where people are not going down certain paths that might have seemed very easy before; they're challenging themselves and doing challenging work that is more purpose-driven. So the advice is: find your passion, find something that you really enjoy. And maybe you don't have it yet, but take a look at the things that have really engaged you as a child, and even as an adult. Things that just innately come quickly to you, that you like and enjoy, where time passes by very quickly. That may be something you've found a passion in; see how you can foster it, uncover it. Is there a way to support yourself doing that? Can you make money at it? Can you make it a calling? Can you get paid for it? And if you can't, then how do you balance it: okay, I need to be able to make money, so how do I find a way to incorporate that passion into my life? That's, I think, somewhat profound advice that I wish I had been given; I think it would have proved very worthwhile.

Jamal:

Yeah, I was going to say, I was not expecting that answer. Very deep, very meaningful, very thoughtful, and, as you said, very profound. So it's all about taking a step back and having a deeper think: what am I doing? Why am I doing this? Why do I like this? What is it about this that I actually like? How can I translate that? What makes it a passion for me? Is it profitable? Can I make a living from this? Can I get paid to do this? Can I leave my mark and legacy? And it's when you have that really thoughtful conversation early on, or even now; if you think it's too late, it's never too late. The best time would have been 10, 15 years ago; the second best time is now. And we're building a time machine at the Privacy Pros, but until we do that, we've got to take action, and we've got to take action now. That's very profound advice that any young person could benefit from, and it's advice that we can all benefit from.

Jamal:

Because I remember when I was thinking about what I wanted to do with my career, what kind of things I enjoy, it wasn't as thoughtful a process. And it probably sounds a bit silly admitting this, but there wasn't that much thought in it. It was like, oh yeah, that seems like a good idea, that seems cool, I know someone who seems to be doing quite well, and you just go off on that tangent. And there are so many young people who, I believe, if someone just sat us down, or even took us out, and had that conversation to start thinking about these things, it would have really shaped a more colourful, interesting future, a little bit more in line with our passions. Another thing you mentioned there, Jeff, is ethics. I was in Lithuania last week speaking at the International Data Protection Conference, and the topic that kept coming up was ethics, ethics, ethics. And it was really refreshing and empowering to hear, and be part of, so many conversations where people are actually talking about ethics, because two years ago everyone was talking about data protection, privacy, security, all those things, and ethics was hardly mentioned. More and more, we're seeing forward-thinking companies creating ethics functions; they're actually hiring people to focus on ethics, and ESG seems to be really growing. So for those privacy pros who want to get ahead of the curve, who want to be current, I would say start looking at ethics. Start looking at the ESG function, and if there isn't one, think about how you can create or add to one, or, even without creating one, start thinking about some of those principles and bring them in. Jeff, I know we're short on time, but there's one other thing I really wanted to ask you about, and it's a pickle we come across with our clients all the time. We've got children, young people, trying to use these platforms, and one of the things we're required to do is verify their age. What suggestions do you have for assuring that someone is as old as they say they are?

Jeff:

Great topic to discuss. We could probably do a full episode on that one itself. I believe what we're going to see coming forward are privacy-centric age assurance solutions that are also globally harmonized to the laws and legal principles that are there. For example, and this is one that I happen to have as my own pet project that I'm working on today: have something that can keep the child's private information, personal information, sensitive personal information, whatever word you want to call it in whatever jurisdiction you are around the world, safe and private. When a child comes to a new website, think of it like a bouncer at a club or a bar. We don't let 14-year-olds just walk into a bar, right? There's someone at the door that says, how old are you? And if the child says, well, I'm 18, and the answer is, oh, come on in, you're fine, that's self-declaration, which is the predominant case today. There are some systems where self-declaration is okay, because there's very low risk in the data processing or very little data being collected. More likely it won't be. So you want a system that acknowledges the privacy of the child, but then also assures that age, whether it's via a parent, carer or guardian that has approved them in some capacity. We've seen a couple of businesses to date, very large in terms of their scale and scope, that are using facial detection. So, my face, does it seem old enough? And there are some biases there, around the colour of the face, that still need to be fine-tuned.

Jeff:

There's also trouble with the accuracy, the mean absolute error, that is produced from that estimation of age. For young children, it can still be over a year and a half. So a six-year-old or six-and-a-half-year-old may show up at the door as either seven and a half or four and a half. That's not acceptable, right? That's still too big a gap. 13-year-olds could be 14 and a half, or they could be eleven and a half. That will get fine-tuned, and I think some companies have figured out, well, you have to read as 14 and a half to get through the gate, and that will mitigate some of what's coming through. But if I were designing the blueprint, I'd have something that a parent would set up: the parent produces the account for the child, so you're assuring their age. The child would go to a website, and, think of it as a knock on the door, the website would say, how old are you? The child would then use facial detection on their phone or device to verify that it's them. The device would say, yes, this is them, and it would produce a signal that goes back verifying the age, now with certainty. Then the company would have a log or a record showing what's happened. I've thought about this many a time; that's roughly how I'd build something new. So for privacy professionals, going back to the original part of the question: how is the organization looking at the certainty of age? What level of risk is there in the data processing occurring behind the scenes? Do we have high risk? Are we using geolocation? Are we using this for behavioural recommendation engines? Do we have advertising models that are pushing material to that user? A lot of that is going to come through on your data protection impact assessment, your impact assessments, before you even begin to build. It all comes together in a holistic way, but you do have to think about it holistically, as opposed to just, well, we need to plug in an age assurance solution, or we need to find the age of a child. Some of these problems are more holistic in how they need to be solved, and privacy professionals would be well advised to think about them on a bigger scale.
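[Editor's note: Jeff's blueprint maps naturally onto an on-device attestation flow. Below is a minimal Python sketch of how it might hang together. Every name here (DeviceWallet, answer_door, the signing key) is a hypothetical illustration, not a real product or API; a production system would use proper key management and signed, standards-based credentials.]

```python
# Hypothetical sketch of the age-assurance handshake described above: the
# birth date lives in a parent-provisioned profile on the child's device, the
# website only ever receives a signed yes/no claim it can log, and any
# fallback to facial age *estimation* is buffered by its mean absolute error.

import hashlib
import hmac
from datetime import date

DEVICE_SECRET = b"device-held key, never leaves the phone"  # illustrative only


class DeviceWallet:
    """Parent-provisioned profile holding the child's birth date on-device."""

    def __init__(self, birth_date: date):
        self._birth_date = birth_date  # never transmitted

    def answer_door(self, site: str, min_age: int, face_match: bool) -> dict | None:
        # The "knock on the door": the site states its minimum age. The child
        # confirms presence via on-device biometrics (face_match), like
        # unlocking a banking app; the biometric never leaves the device.
        if not face_match:
            return None
        today = date.today()
        age = today.year - self._birth_date.year - (
            (today.month, today.day) < (self._birth_date.month, self._birth_date.day)
        )
        ok = age >= min_age
        # Only the claim ("over min_age") is disclosed, signed so the site
        # can keep an auditable log without ever learning the birth date.
        payload = f"{site}|over_{min_age}|{ok}".encode()
        return {
            "claim": f"over_{min_age}",
            "result": ok,
            "signature": hmac.new(DEVICE_SECRET, payload, hashlib.sha256).hexdigest(),
        }


def estimation_threshold(min_age: int, mae_years: float) -> float:
    # If a site falls back to facial age estimation instead, the MAE point
    # applies: with ~1.5 years of error, a 13+ gate must demand a higher
    # estimated age so under-age users don't slip through on estimator error.
    return min_age + mae_years


# e.g. estimation_threshold(13, 1.5) -> 14.5, matching the "14 and a half to
# get through the gate" rule of thumb in the conversation.
```

A site gating at 13 would call answer_door(site, 13, face_match=True) and store only the returned claim and signature in its log; the birth date itself, like a fingerprint enrolled on a phone, stays on the device.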

Jamal:

Yes, that sounds fascinating. It sounds like there are lots of opportunities there to go and be a part of shaping what that could look like and to find some really creative solutions. One of the things I was thinking about as you were mentioning that: when I take my phone and want to log on to my bank, for example, it'll ask me for my fingerprint, and that fingerprint is something that stays on that one device. I wonder if we could have something where that passport, that age verification, stays on the device, and then when a site says, hey, if you want to come in you've got to be a certain age, it's already there on that one device, without being shared, without being scanned by lots of different companies; a little bit like the chips we have on our passports. And then it says, oh yeah, you know what, that checks out, or no, it doesn't. I think solutions around that sound pretty exciting going forward, because there's less collection of the child's data, less could go wrong, and it's already secured there, just like my fingerprint stays on my mobile and doesn't go anywhere else. What are your thoughts on that, Jeff?

Jeff:

Brilliant. Without giving it all away, that's a business plan I've written around age assurance, quite honestly.

Jamal:

All right, definitely. Great minds think alike, but you've been in this space, you are the expert. And if that's something you're actually building a business plan on, then as a privacy professional I think there are definite legs there, and we will do everything we can to support you. Because the more quickly we see things like that being adopted and implemented, the more assurance children will have when they're going to these websites and these apps, and we can make sure they're actually receiving content, and actually being targeted, in a way that balances their interests as well as the business's interests. Not just a, hey, let's see how we can monetize this child's data, and we don't really care how old they are; as long as they said they're a certain age, that's absolutely fine, let's see how we can profit from that. So I'm super fascinated by where this specific niche area of privacy is going, and it's great to see the regulations coming in. Just to sum up the podcast, ladies and gents: we started by speaking about the age appropriate design code, and Jeff informed us about its 15 principles. Then Jeff gave us lots of advice about what we can actually do. He said, look, go and learn the principles; get familiar with what they are. Once you've familiarized yourself with them, understand how you can pragmatically and practically implement them. And always remember: you are the voice within the organization, you are the voice with your clients, so advocate for privacy-centric approaches in everything the business does, so you can take care of those things from the beginning rather than as an afterthought. Then Jeff spilled some secrets, some really top tips, on how you can enhance your privacy career and future proof it. And finally, we discussed his future business plan and other pragmatic, privacy-friendly solutions that will enhance children's privacy as we move forward into assuring their ages, so they are actually as old as they say they are. Jeff, it's been an absolute pleasure having you on the podcast, but before I let you go, I'm going to extend you the same courtesy we extend everyone else. What question would you like to ask me?

Jeff:

How are you going about as a new father, thinking about protecting the identity and the online presence for your child?

Jamal:

That is a fascinating question. It’s something I think about more and more every single day. And here's the challenge: my daughter is very fussy when it comes to eating food, right? And we were these parents that would say, you know, no electronic devices until she's seven years old, no TV, none of this stuff. But the only way we can get her to have her food is by letting her watch something on a mobile phone or a tablet. And oftentimes when we're doing that, there's YouTube Kids. Then sometimes someone will put a device in front of her that isn't YouTube Kids but is showing a children's program. Now YouTube thinks, hey, you're not on YouTube Kids, this is an adult, and they start doing whatever they do with that data and with the algorithms. And I'm really concerned about what kind of adverts they might throw in there and how we can get away from those. So the solution might be, do we just pay not to see the ads, so she's not targeted with irrelevant ads, for example? These are the kinds of questions I'm thinking about every day. It might sound very strange to some, but as I'm walking around, as I'm thinking, as I'm going somewhere, I just have these really random thoughts. And although they sound random, they actually fascinate me, because I look at them from the privacy angle, at what's possible and what could be done, and then what would be the future-proof solution. And it's like, okay, now I need some more clients that focus on that specific side of things, so I can really grow, develop, and add value there. It's been such a pleasure having you here. If people want to get in touch, if businesses want to get in touch and they need a little bit of help and assistance, or they just want to continue the conversation, what is the best way to get in touch with you, Jeff?

Jeff:

LinkedIn is the primary place at this point. Search for Jeff Kluge; you'll see FHCA behind my name, which stands for ForHumanity Certified Auditor, so you know you have the right Jeff Kluge.

Jamal:

But what we will do in the show notes, folks, just like we do with every single guest, is link Jeff's LinkedIn profile. If you're listening to this podcast and you're part of the Privacy Pros network, then make sure you share your takeaways and tag Jeff on LinkedIn so he can come and have a look at your thought processes, add to the conversation, and continue the conversation. Because it's when we are part of the conversation that we matter. It's when we are part of the conversation that we can make things happen. It's when we're part of the conversation that we can actually advocate. 90% of people on LinkedIn will not like or add to the conversation, but they will find what's being discussed between you, Jeff, and other privacy professionals valuable, and they may take that away and act on it. So never think that just because people are not responding to you, and you're not getting active, affirmative reactions, the content you're creating, the posts you're writing, and the takeaways you're sharing are not adding value, because they really are. And one day someone will message you and say, hey, I've been reading your comments and watching you for the last 5, 10 years, and I think you're amazing. You realize how many other people there must be with those same patterns, who never say anything and never interact, but they're there, they're learning, and they're benefiting from you. Guys, that's everything we have time for today. Thank you very much for joining us again, and if you're interested in upskilling, getting certified, and really moving your career forward, going beyond just the certification to a really thriving career as a world-class privacy professional, get in touch. Until next time, peace be with you.

Outro:

If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released.

Outro:

Remember to join the Privacy Pros Academy Facebook group where we answer your questions.

Outro:

Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world class privacy pro.

Outro:

Please leave us a four- or five-star review.

Outro:

And if you'd like to appear on a future episode of our podcast, or have a suggestion for a topic you'd like to hear more about, please send an email to team@kazient.co.uk

Outro:

Until next time, peace be with you.


About the Podcast

Privacy Pros Podcast
Discover the Secrets from the World's Leading Privacy Professionals for a Successful Career in Data Protection
Data privacy is a hot sector in the world of business. But it can be hard to break in and have a career that thrives.

That’s where our podcast comes in! We interview leading Privacy Pros and share the secrets to success each fortnight.

We'll help guide you through the complex world of Data Privacy so that you can focus on achieving your career goals instead of worrying about compliance issues.
It's never been easier or more helpful than this! You don't have to go it alone anymore!

It’s easy to waste a lot of time and energy learning about Data Privacy on your own, especially if you find it complex and confusing.

Founder and Co-host Jamal Ahmed, dubbed “The King of GDPR” by the BBC, interviews leading Privacy Pros and discusses topics businesses are struggling with each week and pulls back the curtain on the world of Data Privacy.

Deep dive with the world's brightest and most thought-provoking data privacy thought leaders to inspire and empower you to unleash your best to thrive as a Data Privacy Professional.

If you're ambitious, driven & highly motivated, and thinking about a career in Data Privacy, a rising Privacy Pro or an Experienced Privacy Leader this is the podcast for you.

Subscribe today so you never miss an episode or important update from your favourite Privacy Pro.

And if you ever want to learn more about how to secure a career in data privacy and then thrive, just tune into our show and we'll teach you everything there is to know!

Listen now and subscribe for free on iTunes, Spotify or Google Play Music!

Subscribe to the newsletter to get exclusive insights, secret expert tips & actionable resources for a thriving privacy career that we only share with email subscribers https://newsletter.privacypros.academy/sign-up

About your host


Jamal Ahmed FIP CIPP/E CIPM

Jamal Ahmed is CEO at Kazient Privacy Experts, whose mission is to safeguard the personal data of every woman, man and child on earth.

He is an established and comprehensively qualified Global Privacy professional, World-class Privacy trainer and published author. Jamal is a Certified Information Privacy Manager (CIPM), Certified Information Privacy Professional (CIPP/E) and Certified EU GDPR Practitioner.

He is revered as a Privacy thought leader and is the first British Muslim to be awarded the designation "Fellow of Information Privacy" by the International Association of Privacy Professionals (IAPP).