Episode 87

Published on: 5th Sep 2023

The Real Cost of Ignoring Experts: Why A DIY Privacy Strategy Is A Ticking Time Bomb

From hidden costs that lurk like an iceberg to DIY disasters, we discuss why cutting corners today could cost you dearly tomorrow.

Tune in and find out why when it comes to data privacy, ignorance isn't bliss—it's a ticking time bomb.

In this eye-opening episode, we have a conversation with Sean Falconer, a former postdoctoral researcher at Stanford Medical School turned leading expert in privacy!

In this episode Sean shares:

  • His fascinating journey from competitive programming and medical research into the world of privacy
  • The common pitfalls companies encounter with a DIY data privacy approach
  • How AI is rewriting the cybersecurity playbook and what it means for Privacy Pros
  • Why investing in yourself isn't just smart—it's essential.

Tune in to level-up your privacy game and avoid the pitfalls of going it alone.

Sean has a lofty goal: to make the digital economy safer.

And he's doing so one API at a time. A former competitive programmer, entrepreneur, and expert storyteller, his many accomplishments include designing the software used to create ICD-11 at the World Health Organization, founding Proven.com, and leading developer relations teams at Google. When he’s not interviewing industry insiders as one of the hosts of the popular Software Engineering Daily podcast or diving into all things data security and privacy on Partially Redacted, you can find this world-class tech talent serving as Head of Marketing and Developer Relations at Skyflow, the world’s first and only data privacy vault delivered as an API.

If you're ready to transform your career and become the go-to GDPR expert, get your copy of 'The Easy Peasy Guide to GDPR' here: https://www.bestgdprbook.com/

Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/

Follow Sean on LinkedIn: https://www.linkedin.com/in/seanf/

Get Exclusive Insights, Secret Expert Tips & Actionable Resources For A Thriving Privacy Career That We Only Share With Email Subscribers

 https://newsletter.privacypros.academy/sign-up

Subscribe to the Privacy Pros Academy YouTube Channel

► https://www.youtube.com/c/PrivacyPros

Join the Privacy Pros Academy Private Facebook Group for:

  • Free LIVE Training
  • Free Easy Peasy Data Privacy Guides
  • Data Protection Updates and so much more

Apply to join here whilst it's still free: https://www.facebook.com/groups/privacypro

Transcript
Sean:

The thoughts and opinions expressed by guests on this podcast are solely their own and do not necessarily reflect the views of their employers or any other individual or organisation.


Intro:

Are you ready to know what you don't know about Privacy Pros? Then you're in the right place.

Intro:

Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts, the podcast to launch progress and excel your career as a privacy pro.

Intro:

Hear about the latest news and developments.

Intro:

In the world of privacy, discover fascinating insights from leading global privacy professionals, and hear real stories and top tips from the people who've been where you want to get to.

Intro:

We've trained people in over 137 countries and counting.

Intro:

So whether you're thinking about starting a career in data privacy or you're an experienced professional, this is the podcast for you.

Jamal:

Hello, and welcome to another episode of the Privacy Pros podcast. I'm your host, Jamal Ahmed, founder, and lead trainer at the Privacy Pros Academy, and I'm thrilled that you've tuned in to listen. If it's the first time you're listening in, make sure to hit subscribe, like and share wherever you get your podcast from. Today, I'm joined by another awesome guest. Today we have Sean Falconer. He has a lofty goal: to make the digital economy safer. And he's doing so one API at a time. A former competitive programmer, entrepreneur, and expert storyteller, his many accomplishments include designing the software used to create ICD-11 at the World Health Organization, founding Proven.com, and leading developer relations teams at Google. When he’s not interviewing industry insiders as one of the hosts of the popular Software Engineering Daily podcast or diving into all things data security and privacy on Partially Redacted, you can find this world-class tech talent serving as Head of Marketing and Developer Relations at Skyflow, the world’s first and only data privacy vault delivered as an API. Welcome to the Privacy Pros, Sean.

Sean:

Thank you for having me, and I feel like I have a lot to live up to with that description. Starting off with another awesome guest, hopefully I can live up to the former guests that have been on this awesome podcast.

Jamal:

I love your humility and your modesty, Sean. It's actually me who needs to be on my toes. You've got two podcasts there. You've done so many amazing things, but I have a question for you which you weren't expecting. If you were a flavour of ice cream, what would you be and why?

Sean:

Oh, that's a good question. It's not only probably my favourite type of ice cream, but I think that it relates to me as an individual and also as a father of young children. I would think I'd go coffee flavoured because I basically survive on caffeine most of these days. I think that I'd end up being a coffee flavoured ice cream.

Jamal:

All right. Love it. When you said father of young children, I was just about to say my nephew loves bubble gum. Is Sean going to say bubble gum?

Sean:

My son loves vanilla with sprinkles, I think it’s his favourite.

Jamal:

Nice. So, Sean, you're a former competitive programmer. For someone listening who's not really sure what a competitive programmer does or is, could you explain what that is?

Sean:

Yeah. So there's a few different competitive programming competitions out there. The one that I competed in the most while I was in university is the ACM International Collegiate Programming Contest, or ICPC, and these are worldwide competitions amongst different universities. So the way these work is you actually compete on a team of three, but you have one computer, and then essentially you have 5 hours that you're competing. And at the beginning of that 5 hours, you're handed a stack of eight to twelve problems in no particular order. So you don't necessarily know that the first one that you see is easier than the last one that you see. And through those 5 hours, basically the team that solves the most problems in the least amount of time wins, and you get penalized every time that you get a wrong answer. And essentially the way these problems are structured is they have some sort of story that describes what the situation is. So maybe it's like someone's lost at sea, but you have certain radio signals that tell you approximately where they are. How do you plot a course to rescue the person in the least amount of time while also avoiding, say, the locations of various pirates? And then they'll give you a description of how the input will be structured and how you need to structure the output. And then they give you a couple of samples. And then once you figure out how to solve that problem, you write the code for it and you submit it.

Sean:

It runs through an automated system of a bunch of test cases that you've never seen before, and all they do is report back whether you got it correct or it's the wrong answer, or maybe you exceeded the time limits or the memory limits of the problem. Very little information to go on. And essentially you get penalized 20 minutes every time that you get it wrong. I started competing in those in my third year of computer science at the University of New Brunswick. Essentially, it starts out with tens of thousands of different universities around the world, and you compete in different stages. So I started out competing against essentially all of Eastern Canada. And then the best schools from there, the ones that win or finish at the top of those competitions, go to the next round, which is essentially Northeastern United States and Northeastern Canada. And then the top two teams from there go to essentially the World Finals. So I was lucky enough to go to the World Finals twice while competing in those. And I believe the teams that I competed on are the only two teams from that area of Canada that have ever made it to the World Programming Finals. So hopefully that'll change over time. But generally, what ends up happening in the area that I competed is in the second round, you're basically going against MIT, Harvard, Princeton, the full gamut of the Ivy League, and they kind of come in as this buzzsaw and knock out all the Canadian schools. But when I was doing these competitions, we ended up, you know, beating out MIT in the first year, and then we beat out Harvard in the next year, and we were able to go on as this essentially school that no one had ever heard of, beating these giants.

Jamal:

Congratulations. That's amazing. So how did you transition from that into data privacy?

Sean:

Yeah, so I would say, like many things in my career, it wasn't necessarily by design. I started out on a path to being an academic researcher. That's where I worked with the World Health Organization. So I was a postdoc at Stanford Medical School, working on designing new tools, sort of modernizing the way that we build the International Classification of Diseases, and a lot of work in biomedicine, bioinformatics, ontologies and so forth. And there you're dealing with a lot of sensitive data, obviously: medical health records, clinical trial records, things like that. So that's kind of where I first started getting some experience with what we would classify as sensitive data or regulated data. And as a sort of academic and researcher and engineer, I didn't really have that much concept of essentially what sensitive data is, how you protect it, what the regulations are. And then from there, I ended up founding a company that was in the job and HR space. So then again, we were dealing with very sensitive information, such as people's job applications, their history of work, other employee information, as well as information about our customers, which were companies. So there we had to deal with things like PCI compliance and also just general data protection. But again, I would say I didn't have a really in-depth understanding of why any of these things were important. I think that's consistent, really, with most people who have engineering training. My entire security training, even though I spent a decade in university, was one undergrad course in encryption, and that was about it.

Sean:

And that's, I think, sort of par for the course for most people that go into engineering: we don't get a lot of in-depth coursework around privacy and why it's important. And I think that's a reflection of some of the challenges that we have as an industry today. And then at Google, I did a little bit around HIPAA and GDPR, as you would expect, but all of this was sort of basic knowledge. Google was my first time ever working with privacy engineers, and I didn't really understand the function. And then eventually I came to Skyflow. I was recruited to join Skyflow by the CEO and founder, who was also an investor in the company I had started. And what really attracted me at the beginning was this idea of creating an API for something as challenging, as complicated as data privacy. Because as someone who had built some of these types of products that touch sensitive data, as an application engineer, I don't really want to have to be a privacy expert or a security expert. I just want to build my application and serve my customer. But I also want to do it in a way that is going to comply with whatever the rules and regulations are, as well as do the right thing and actually do my best to protect people's identity and their sensitive information. And I love the idea that we could abstract that away as a product and make it as simple as plugging in an SDK or hitting an API. Just as an engineer, that was a really attractive proposition. And of course, now that I've been at Skyflow leading various functional areas there for a year and a half, I've ramped up my understanding significantly in the privacy space. But that was really the original attraction: hey, I just want to offload this to something. And it sounds amazing to be able to do that.

Jamal:

I'm seeing the connections from your early days competing under different circumstances with different challenges. So here you are trying to create a piece of code or trying to achieve something, and some of the constraints you have are: it has to be compliant, it has to win the customer's trust, and we have to be doing the right thing by them as well. So how do we build something that's easy to slot into all of that? And you go back into that competitive mode and you just go, here we go, this solves all those problems.

Sean:

Yeah, I think there's definitely a degree of that. I think a lot of times, especially in privacy and security, if you make something too complicated, then people generally find a way to sort of circumvent it. And that's a lot of times when companies, I think, run into problems: if the security measures or the process to make something secure or private is too complicated for the team, then they're going to find some way around it. Or maybe they don't understand it and they don't even know that they're actually working around it, because they just see it as a blocker to them being able to deliver something. So if you can make it really simple and that's the default experience, then people are actually more likely to follow those processes. I think that's some of the advantage, speaking more broadly, of things like the public cloud: if you can build a lot of these services on AWS or Google Cloud or wherever it is, secure by default, then I don't have to think about it as an engineer. That's the default experience, and it makes it really easy. And then I have that stuff sorted out of the box.

Jamal:

I completely agree with you, Sean, about making things more complicated and not making things simple enough. Because what I found in my experience is when something doesn't have clarity, people don't have any confidence. They don't know what they should do. They don't know what they shouldn't do. So they do one of two things: either they do nothing, or they just ignore it and find a way around it. When we overcomplicate things, rather than focusing on simplicity, it creates additional problems, and you end up with bigger problems than the one you tried to solve to begin with. And what I can understand, when I'm speaking with engineers and the technical guys, is you're right: you didn't go to law school, you didn't have to study privacy laws. It's something that has just come along. You haven't been given proper training on it, so how do you understand things? And that's where making things really easy peasy comes in. And that brings me on to the whole reason I wrote this book, The Easy Peasy Guide to the GDPR. What I've done is taken out all of the complexity and legalese from the actual regulations and just made it very simple. In fact, when I wrote this book, I had my eleven-year-old niece in mind, and I was saying, if she can understand it, then any engineer, any privacy pro, any CEO, any executive can understand what they need to do. And if they're clear on what they can do and what they can't do, then they should have the confidence to go about and actually make those things happen. So what you're saying there about simplicity, I love it. In fact, in my book I have a quote from Albert Einstein, who says that if you can't explain it simply enough, it means you haven't understood it well enough to begin with.

Sean:

Yeah, I believe there's a similar quote by Richard Feynman, another famous physicist, about how if you can't teach a concept at a high school level, then it isn't understood at a deep enough level. And I agree with that a lot. And I think a lot of the challenges that we've had as an industry around data protection, and all the breaches that we see, it's not that I think there's any malicious intent behind companies not dealing with the problem. It's almost as if a lot of times they just don't know what they don't know. As an engineering organization, if you're Uber, the thing that you're focused on is: how do I optimize the experience of someone ordering a ride and get that driver to them as efficiently as possible? That is what you're spending all your thinking cycles and your problem-solving cycles on. And that's what you're hiring your talent for. It's not really their job to be thinking about data protection and data privacy at a deeper level, and they don't even have the training and know-how to be asking those types of questions.

Jamal:

Absolutely. Very insightful. Thank you for sharing that, Sean. So from what you've seen and from the people that you speak to, what are most companies getting wrong with their data privacy strategy?

Sean:

The basic one is not having one, I think, is the major problem. And again, it goes back to kind of not knowing what you don't know, or, as you mentioned earlier, just being hit with all this complexity, and then you're almost like a deer in headlights, and you're like, I don't know what to do here, so I'll just kind of ignore it until it becomes a real problem. And that's unfortunate. The situation that a lot of companies find themselves in is kind of like the backups problem, where everyone knows that they need to back up their data and they need a disaster recovery plan. But it's one of those things that's not going to burn you tomorrow, so you can keep pushing it off and pushing it off. And then companies find themselves in this really bad place someday when they actually do need those things and they haven't tested it, or maybe they didn't do it at all. So I think besides not having a strategy, the other challenge is just starting late. And I would also say thinking that you can kind of do it yourself: it's not a challenge that you want to take on unless you have the expertise and resources to really do it. And that's where companies like Skyflow come in, where you can offload a lot of those challenges and complexity, because this is all we do. We have 60-plus engineers that spend all their time thinking about this stuff. It's just like, how do I do encryption key rotation right? How do I make sure that it's performant, that we're not going to get locked out of the system, and so forth. And I think people generally underestimate the cost of DIY, and that's true of not just data privacy, but most kinds of services. A lot of times engineers think about what the initial cost is to get something working. But the actual bulk of the cost is like an iceberg.

Sean:

That 20% that's above the surface is the cost of getting something kind of working. And then there's the 80% that's below the surface, which is all your maintenance, updates, upkeep, and building out a team to monitor all this stuff. It gets really expensive over time. And the way that a lot of people start out is they think, oh, I'll do some database encryption and that will be fine. The records are encrypted, so the database is secure. So what's the worst that could happen here? But what ends up happening is you have to support all these new use cases that come along, and whatever you're doing in this bespoke system, you have to adapt. It's not a platform that you necessarily designed from the beginning. You're just solving one particular problem. So now suddenly your company needs to be able to show, like, the last four digits of a Social Security number or a phone number. And how are you going to do that? Well, then you end up in a situation where you need to grab the encrypted record, decrypt it, dynamically mask it. Maybe you need to share it with a third party, so then you need to decrypt it. So in those places in your infrastructure where you're decrypting it, how are you protecting that? Who is responsible for that? And then in a larger engineering organization, where you're going to have multiple teams working on multiple products, running various microservices, how do you do that holistically across all these different teams?
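To make the dynamic-masking step above concrete, here is a minimal Python sketch of the pattern: fetch the encrypted record, decrypt it, and hand back only a masked view, never the plaintext. All names are invented for illustration, and base64 stands in for real encryption purely to keep the sketch self-contained.

```python
import base64

def toy_decrypt(token: bytes) -> str:
    # Stand-in for a real decryption call; base64 is NOT encryption,
    # it just keeps this example self-contained.
    return base64.b64decode(token).decode()

def mask_ssn(ssn: str, visible: int = 4) -> str:
    """Hide everything but the last `visible` digits of an SSN."""
    digits = ssn.replace("-", "")
    masked = "*" * (len(digits) - visible) + digits[-visible:]
    # Restore the conventional XXX-XX-XXXX grouping.
    return f"{masked[:3]}-{masked[3:5]}-{masked[5:]}"

def fetch_masked(record: dict) -> dict:
    """Decrypt the sensitive field, mask it, and drop the plaintext."""
    plaintext = toy_decrypt(record["ssn_encrypted"])
    safe = dict(record)
    del safe["ssn_encrypted"]
    safe["ssn_masked"] = mask_ssn(plaintext)
    return safe

record = {"name": "Alice", "ssn_encrypted": base64.b64encode(b"123-45-6789")}
print(fetch_masked(record))  # {'name': 'Alice', 'ssn_masked': '***-**-6789'}
```

The point of the sketch is the shape of the problem Sean describes: every place that calls something like `fetch_masked` is a place where plaintext briefly exists and has to be protected.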

Sean:

What you end up seeing a lot of times is different teams solving the same problem in different ways. So maybe someone is relying on encryption at the database level. Someone else is solving it in some other way, or maybe they're not doing it at all. And you end up with situations where people are logging some of this information, and the log files aren't encrypted, and they get compromised. So companies just end up buying all these tools and building a number of sort of bespoke or DIY solutions. And altogether it's just not very flexible or adaptable. And that leads to a whole bunch of different problems that companies face. And I think that's why we continue to see companies who spent millions of dollars on various solutions still suffering from breaches or even compliance issues.
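One small mitigation for the unencrypted-log-file problem mentioned above can be sketched as a scrubbing filter that redacts PII-shaped strings before messages ever reach a log handler. The patterns and names here are illustrative only; real PII detection needs far broader, tested coverage.

```python
import logging
import re

# A couple of common PII shapes; a real deployment would need many more.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSN
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
]

class RedactPII(logging.Filter):
    """Scrub PII from log messages before any handler writes them."""
    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        for pattern in PII_PATTERNS:
            msg = pattern.sub("[REDACTED]", msg)
        record.msg, record.args = msg, None
        return True

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("app")
logger.addFilter(RedactPII())
logger.info("signup from alice@example.com, ssn 123-45-6789")
# The written log line contains "[REDACTED]" in place of both values.
```

This is exactly the kind of thing that, done per team in a bespoke way, drifts apart across an organisation; doing it once, centrally, is the argument Sean is making.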

Jamal:

I think you're absolutely right. And it's something that I can definitely say I've seen with a lot of clients: they decide, because of a cost or perceived cost at the beginning, that they're going to try and figure it out themselves. And then as they start touching it, they realize, oh, we need this and we need that, or we need this solution, we need that solution. And then they have so many different solutions, and they create their own entry-point risks with whatever they bring along as well. And suddenly, what you thought was saving money by doing it yourself is actually, over the longer run, causing more expense. First of all, it's taking more time. It's causing more problems, which means people aren't focusing on their core area, which is what they're experts in; they're trying to mess about, trying to learn and figure something out. With all of the opportunity cost of that, it actually makes more sense just to get the experts like Skyflow to come and do that to begin with and do the best job, because, like you said, that's all they're sitting there thinking about. So I'm a big fan of getting experts, getting mentors, getting the right people in the right places, because I believe you either pay with pain or you pay with attention. And I'd rather pay with attention at the beginning than have to pay with pain and have a much bigger problem.

Sean:

Yeah, and I think the other challenge, too, is that generally when you're hiring people, they're attracted to whatever is customer-facing within your business. And engineers typically want to work on customer-facing features because they want to feel like people are using their stuff. So then when you start to build out teams to handle problems that aren't necessarily core to your product area, what is the depth of that talent? Are you putting your best people on that, or are you putting your best people on things that return ROI to your business? And historically, if you look at something comparable to data privacy and security, like backups: a lot of times in the backup world, you're putting your junior person on backups because you're like, it's not that complicated, it's not that important, it's very rarely ever going to be a problem. And then, of course, it becomes a major problem. And I think we see that all the time in the privacy and security space, too, unless you're a big enough company to really build out the expertise. And even then, it's still a huge cost centre to try to take on all these challenges yourself versus relying on experts that live and breathe and think about this stuff 24/7.

Jamal:

Exactly. And the other thing I've found is sometimes what businesses actually need isn't just encryption of the database. They actually need to encrypt at the file level or even at the field level, especially when we're dealing with what you said about medical data and when we need to share it. Do you have the capabilities in house to be able to do that? How difficult is it going to be for you to figure that stuff out? How many different things do you have to figure out to make sure everything talks together, and what are all of your third-party entry risks, and what is your surface area? Just let the experts deal with it. That's what I always advise. So it's great to hear Skyflow is offering really pragmatic solutions where we can outsource all of that and know that it's taken care of by the experts. Tell me a little bit about Skyflow. What kind of businesses are you most suitable for?

Sean:

rent nations. Essentially, in:

Sean:

And then the last big one that we're starting to see a ton of momentum around, which is something that everybody is talking about right now, is generative AI and large language models. And the big challenge there for companies is that lots of different companies want to invest in these different models. Everyone is super excited about the potential of what these things can do for your business. But it represents a completely different paradigm in the way that we think about data. In the old world, the way we've been dealing with data for the last 40-plus years is that data sits in a file, it sits in a database or something like that. We could essentially point to the data, and even though these systems get really complicated and we end up with lots of copies of the data, conceptually, we understand that if I need to get rid of it, I just need to find those different locations and delete it. I delete a row in the database and the data is gone. But in the context of a large language model, that doesn't exist. There's no row of data to delete. Essentially, once a model learns something, you can't really make it unlearn it, unless you're keeping snapshots of all your different models and you blow away the new model, which is not really feasible in terms of the cost to do that and the size of these models and so forth.

Sean:

So how do you solve that problem? This is why we've seen temporary bans in countries like Italy, and Samsung banned ChatGPT for a while because employees were copying internal source code into ChatGPT, and then it's like, well, what happens to that? In the context of ChatGPT, they can keep that material around for training new versions of the model. So it's not guaranteed, but it's possible that it could show up in a prompt response later down the road in some new version of the GPT model. So this is something that I think is very front of mind for lots of different companies. And we recently launched a version of our Data Privacy Vault technology that's specifically designed for working with large language models, which gives you full data protection through the entire lifecycle of a large language model, from training, inference, and prompting to things like fine-tuning and embeddings. It gives you all the value and utility of using a large language model while protecting either your customer information or even your internal company information. So that's really what we're about in terms of the technology that we provide all these different companies.
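Vault-style protection for LLM workflows is often built on de-identification: swap sensitive values for placeholder tokens before a prompt leaves your boundary, and swap the real values back into the model's response. A toy sketch of that round trip follows; the regex detection and token format are invented for illustration, and a real vault product does real detection and secure storage rather than an in-memory dict.

```python
import re

# Toy detector for one PII shape (email addresses).
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def deidentify(prompt: str):
    """Replace each email with a placeholder token; keep the mapping."""
    vault, counter = {}, 0
    def swap(match):
        nonlocal counter
        token = f"<EMAIL_{counter}>"
        vault[token] = match.group(0)
        counter += 1
        return token
    return EMAIL.sub(swap, prompt), vault

def reidentify(response: str, vault: dict) -> str:
    """Restore real values into the model's response."""
    for token, value in vault.items():
        response = response.replace(token, value)
    return response

safe_prompt, vault = deidentify("Draft a reply to alice@example.com")
# safe_prompt == "Draft a reply to <EMAIL_0>"; only this leaves your systems.
```

Because the model only ever sees `<EMAIL_0>`, nothing sensitive can be memorised during training or leak into someone else's prompt response later.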

Jamal:

Thank you very much for that insight. And if you're listening and you resonate with any of those challenges, or you can foresee some of the upcoming challenges that Sean's been speaking about, then get in touch with Skyflow, speak to Sean or a member of his team, and just have a listen to see what kind of solutions they could help you out with. There's no obligation. That doesn't mean you actually have to sign up with them or anything, but just go and have a discussion and see where that takes you. And it might be that they just give you some ideas and you go away with that. And you do the DIY model, you make lots of mistakes, and you go back to them later on anyway, which is what we see most clients do with us. They say, okay, thanks, but no thanks. And then they come back when big problems really hit them. But you can get ahead of the curve. We can see these problems coming, so let's just deal with them now while we can, rather than having to deal with them when they become larger, more expensive and more urgent. Now, one thing that we both agree on, Sean, is when it comes to data breaches, it's not a matter of if, but rather when. Why is that so?

Sean:

I think there's a couple of things. One is that if someone or some group of people is determined enough, then they can probably get into any system if they're really focused on it. And what companies do have control over is how easy or hard that is, and also what the potential impact of that breach is. So if you look at something like the Robinhood breach that took place a couple of years ago, what happened there was a customer support agent was socially engineered into giving up their credentials. And what those credentials gave the attacker was access to 6 million records in their database, all in plain text. So you can't really stop something like the social engineering, or it's very hard; even if you train your employees, people still get tricked. It happens to humans. But you don't need to give your customer support agent access to 6 million records. There's no reason for them to have access to 6 million records at any given time. Realistically, they should only have access to whoever's in their customer support queue. And even then, it should be only the limited information they need to do their job. So a lot of this comes down to, and you mentioned this earlier, that it's not just about whether you're encrypting the data at the database level or wherever you're storing it, but how you control access to it.

Sean:

That is the really hard part. You can buy point solutions to do encryption. You can even buy point solutions for doing tokenization. But how do you combine all these privacy enhancing technologies with something like comprehensive fine-grained data access control, so that Susie in accounting sees a different level of information, has a different level of access, a different number of rows she can access, versus, you know, Joe in customer support? That is really the hard problem that I think most companies struggle with. And the other big challenge, and why I think it's very difficult for most companies to prevent something like a data breach, is that most of these companies just have a huge PII sprawl problem, and the data ends up everywhere. So we collect the data, let's say, in the front end of an application, and then it's passed downstream through our API gateway and a bunch of backend services, and it ends up in a database or warehouse somewhere, and we think, okay, that's where we need our data protection. But all those touch points along the way are potential attack surface areas. And a lot of the time, those different services have log files that are written, maybe for debugging. We might accidentally dump someone's PII into a log file that's unencrypted, and that becomes a compromise point. There are all these different touch points that we aren't thinking about, and then the backups of all those systems also end up in a situation where we need to be thinking about protection and deletion and so forth. So it's not just a matter of protecting that single entry point; there's no single source of truth, essentially. It's like the difference between having one copy of your passport that you keep in a safe location in your home, or making 10,000 copies of your passport that you distribute all over the place and then try to protect all those locations.
That's a much harder problem to solve.
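[Editor's note: Sean's idea of fine-grained, field-level access control, where each role sees only the fields its job requires, can be sketched in a few lines. This is a hypothetical illustration, not Skyflow's API; the roles, fields, and policy names are invented.]

```python
# Hypothetical sketch of role-based, field-level access control:
# rather than handing every internal user raw rows, each role gets
# a policy naming which fields it may see in plain text, which are
# masked, and which are withheld entirely.

FIELD_POLICIES = {
    "customer_support": {"name": "plain", "email": "masked", "ssn": "redacted"},
    "accounting":       {"name": "plain", "email": "plain",  "ssn": "masked"},
}

def mask(value: str) -> str:
    """Keep a small hint of the value, hide the rest."""
    return value[:2] + "*" * max(len(value) - 2, 0)

def view_record(record: dict, role: str) -> dict:
    """Return only the fields this role may see, masked per policy."""
    policy = FIELD_POLICIES.get(role, {})  # unknown roles see nothing
    out = {}
    for field, mode in policy.items():
        if field not in record:
            continue
        if mode == "plain":
            out[field] = record[field]
        elif mode == "masked":
            out[field] = mask(record[field])
        # "redacted" fields are omitted entirely
    return out

record = {"name": "Jane Doe", "email": "jane@example.com", "ssn": "123-45-6789"}
print(view_record(record, "customer_support"))
```

In this sketch a compromised customer-support credential never exposes a social security number, because the plain value is simply not returned to that role.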

Sean:

One company I talked to last year, they have:

Jamal:

This is true. If you don't know where it is, if you don't know what you do with it, if you don't know what other people are doing with it, how will you even protect it? There's no clarity. So how can you confidently say, yes, all our PII is protected, when you have no visibility on it to begin with? And you're right, this is one of the biggest challenges that businesses are facing. And if you've been listening to Sean and he's scaring you, well, good. That's exactly what we need to be aware of, because without that awareness, we have this false sense of security. And when something happens, we act surprised. But why should we act surprised? It's because we never had that visibility or that awareness to begin with. Here is Sean, who has given up his time to come and say, hey, these are the vulnerabilities, this is where the weaknesses are, these are the challenges. And now that you're aware of those, you've solved half the problem. Having awareness is solving half the problem. The other half is there, and Skyflow can help solve a lot of that for you. So don't be put off; there are solutions out there. First of all, we need to become aware of the problems. We need to become aware of the extent of the challenges. Once you've identified that and got clarity on that, then we can start looking at credible, affordable and pragmatic solutions that can help us keep the reputation of our organizations, and also keep the trust people have given when they shared their data with your organization to begin with. Sean, how is AI, artificial intelligence, changing the rules of cybersecurity, and following on from that, what steps should privacy professionals and businesses be taking right now as a result?

Sean:

Yeah, it's a really good question, and we touched on a little bit of this around some of the large language models I was talking about earlier. I think there are both good things from a security perspective in AI and also potentially bad. If you look at something like fraud detection as one area of security, a lot of the first practical applications of AI were in the fraud realm: how do we do pattern recognition to be able to differentiate between a good actor and a bad actor? And I know there's a lot of work now incorporating these new techniques around deep learning models, as well as generative AI, into the fraud world. But at the same time, it also gives the attackers, or the fraudsters, or whoever's doing these types of attacks, new tools to leverage. And in many ways, the information sharing in the bad-actor criminal world is better than the information sharing in the non-criminal world. A lot of companies that figure out some way of protecting themselves aren't necessarily readily sharing that information with the world; you don't want to let people know how you're doing it, or that you had a vulnerability and then filled the gap. Whereas in the criminal world, there are all kinds of different locations and chat groups and so forth where people readily share this information and you can learn. So now with things like large language models, you could actually build and fine-tune a model that could be, essentially, your attack assistant, where you're able to leverage it to help you find these things. Because a lot of the vulnerabilities that people exploit are due to old systems and known exploits.
So people share that, or you can learn it, and then with access to something like a large language model you could actually figure out where those vulnerabilities are and potentially target vulnerable systems more easily and at scale. At the same time, on the flip side, we can also use this technology to help monitor systems and detect anomalies, and make that more scalable and easier. So in a lot of ways this becomes sort of an arms race, I think, to some degree. And then, as I mentioned earlier around large language models, the big challenge that companies with genuinely good intentions are facing is just the privacy and security challenges. This is really a new paradigm, and people are trying to figure it out.
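[Editor's note: the "monitor systems, detect anomalies" idea Sean mentions can be illustrated with a toy example. Real systems use far richer features and models; the z-score rule, the threshold, and the login counts below are all invented for illustration.]

```python
# Minimal anomaly-detection sketch: flag values that deviate strongly
# from the baseline of a series, using a simple z-score over per-hour
# login counts.

from statistics import mean, stdev

def anomalies(counts, threshold=2.0):
    """Return indices whose value sits more than `threshold` sample
    standard deviations away from the mean of the series."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:  # a perfectly flat series has no outliers
        return []
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]

logins_per_hour = [40, 42, 38, 41, 39, 43, 400, 40]  # one suspicious spike
print(anomalies(logins_per_hour))  # flags the spike at index 6
```

The same shape of check, baseline plus deviation rule, underlies much of the fraud-detection work Sean describes, just with learned models in place of the hand-written statistic.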

Sean:

And there's such a huge rush and hype cycle right now around this. As somebody entering this space who maybe doesn't know that much, but is interested in the potential, it's hard to differentiate between, I don't know, the snake oil salesmen and the people who actually know what they're talking about. So I think as a company, or as an engineer working in this space, it's very important for us to educate ourselves about these new technologies and also understand the privacy and security challenges, but make sure we're getting that from a reliable source. Don't rely on tech Twitter as your information source, is essentially what I'm suggesting. Go to the people who really know what they're talking about. And then the other big challenge, which we mentioned earlier, is that it's not just about securing the data. It's really: how do you control access to all this stuff? How do I make sure that the right people have access, and that they have access to the limited set of information they need to do their job? Because some of the ways we're approaching the challenges around privacy and security for generative AI are essentially, let's create a private space, let's do it in a private cloud. But that doesn't really solve the access control problem. So if I build my own large language model based on GPT, a private version of GPT for my company, that doesn't solve the challenge of who within my company can see what. And that is the bigger, more fundamental challenge that needs to be solved, I think.
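[Editor's note: Sean's point that a "private GPT" alone doesn't solve internal access control can be sketched as an entitlement filter that runs before any document reaches a model's context window. The document store, role labels, and prompt wiring below are invented for illustration; no real LLM API is called.]

```python
# Hypothetical sketch: filter retrieved documents by the asking user's
# entitlements BEFORE they are placed in a prompt, so a company-private
# model still cannot leak HR-only material to a sales user.

DOCUMENTS = [
    {"id": 1, "text": "Public holiday calendar", "allowed_roles": {"everyone"}},
    {"id": 2, "text": "Q3 salary bands",         "allowed_roles": {"hr"}},
    {"id": 3, "text": "Customer churn report",   "allowed_roles": {"hr", "sales"}},
]

def retrieve_for(user_roles: set) -> list:
    """Return only the documents the user's roles entitle them to see."""
    return [d for d in DOCUMENTS
            if d["allowed_roles"] & (user_roles | {"everyone"})]

def build_prompt(question: str, user_roles: set) -> str:
    """Assemble the model context from permitted documents only."""
    context = "\n".join(d["text"] for d in retrieve_for(user_roles))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What changed last quarter?", {"sales"}))
```

The design choice matters: the filter sits in the retrieval layer, so even a fully self-hosted model never sees data the requester isn't entitled to, which is the access-control gap Sean says a private cloud by itself leaves open.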

Jamal:

Absolutely. Now, other than listening to podcasts like Software Engineering Daily or Partially Redacted, what other sources are there that you could say, like, are really great sources for people who are privacy professionals that are more interested in the privacy engineering side of things?

Sean:

That's a good question. I do listen to a ton of podcasts. This one isn't specific to privacy, but I think it's really valuable for anybody who wants to learn about what's going on in the industry, and they do, of course, sometimes dip into these hot-button topics, whether that's privacy or generative AI: the SaaStr podcast. They have a lot of C-level folks talking about how they solved specific challenges; I think that's a fantastic business podcast that's valuable to listen to. In the engineering space, I also know MongoDB's podcast is great, and Software Engineering Radio is another one. There are a lot of great engineering-focused podcasts that often touch on specific privacy use cases. And one of the reasons I started Partially Redacted was that when I first looked at podcasts focused on privacy, there were fewer that were technical; a lot of them focused more on the legal issues. So I wanted to create something that could help educate technical practitioners, so we could solve some of these challenges around, "I don't know what I don't know, so I essentially ignore the problem."

Jamal:

I love what you said there: I don't know what I don't know. And this is one of the challenges we see with a lot of people in the industry; they believe they can self-study and teach themselves everything, or that because they did really well in exams when they were going through their academic studies, that somehow translates into the actual practical world. But you're only limited to what you know. Your best level of thinking got you to where you are, and you don't even know what you don't know. And that's why you need to find mentors. You need to find experts who have a different perspective, who have a wider understanding, to help you get that clarity and to help you start asking the right questions. That shift in perspective is, I think, the difference between the average, mediocre professionals and the really great professionals in an industry. Would you agree with that, Sean?

Sean:

Yeah, I definitely think that's the case. I remember when I was doing my master's degree, the professor who was my supervisor said something that always resonated with me: the problem with people with PhDs is they think that they know something. Meaning, essentially, that once they've achieved their doctorate in whatever field, they kind of stop learning. They're like, okay, I've hit the apex now, I'm done. But really, even if you have a PhD in something, one, your level of expertise is usually super narrow, it's one very specific problem. But the other thing is that's really just the beginning of the journey. It says something about the commitment you've had to achieve the degree, but it's just the beginning of your educational experience, and I think you can never afford to stop learning. And a lot of the time, as you mentioned, you need enough information to even be able to start asking the right questions. Because if you have no information, you don't even know what questions to ask.

Jamal:

Exactly. That's super powerful. And one of the things I teach my mentees is the quality of your life, the quality of the results that you get is going to be based on the quality of the questions you ask yourself and you ask your clients and your team and those around you. And if you don't even know what question you should be asking, well, that's the real problem, isn't it?

Sean:

Yeah, absolutely.

Jamal:

So in addition to what you've just shared there as a super valuable lesson, what other valuable lessons have you learned over your career so far that you can share with our listeners?

Sean:

A big part of what's worked for me is being open to new opportunities and being adaptable, and that's allowed me to skill-stack along the way. I've been able to compile a variety of different skills that end up making me more unique than if I had focused on one specific thing. As an example, in my undergrad I focused on theory and computation, so it was very mathematical, very conceptual. Then for my master's, I focused on artificial intelligence. And for my PhD, I did human-computer interaction and software engineering. So even within the degrees that I did, I went in different directions along the way. I was good at theory, but I probably wasn't in the top 10% in the world. But by the time I'm completing my PhD, I'm someone who can run a user study, prove a theorem, and also build an AI model. And that combination makes me more interesting and unique than someone who purely focused on one of those areas. Then I went into bioinformatics and then started a company, and as part of starting a company I had to learn marketing and business and sales and so forth. And that essentially led to the job and career that I had at Google and then eventually at Skyflow. I would never be in the position I am today if I had focused on one particular thing. And I don't know what I'll end up doing next, but I wouldn't be surprised if it's something in a different direction as well. It really comes down to what we've been talking about: continuing to be open to learning new things, learning new skills, educating yourself, acknowledging that you don't know everything, and just having that curiosity to deep-dive into new topics and new areas, and allowing your interests and passion to steer you a little bit. A lot of these things I've done, I didn't do with some objective in mind of, hey, I want to get to this place in my career.
I did it more because I was interested in a thing and just wanted to learn about it, and I would have learned about it whether someone was paying me or not. I just allowed that to steer things, and eventually those things led to opportunities in my career. But it wasn't completely by design; it was really just a genuine interest in those particular topics, and I've been able to leverage that throughout my life and throughout my career.

Jamal:

Wow, that is super powerful, Sean. That is super powerful. It really resonates, because one of the things we do on our twelve-week accelerator program is talk about skill stacking. We have five pillars, and the first pillar is all about mindset. What you've described there, that passion for learning, is what we call a growth mindset. Not being afraid to try something new, not being afraid to pick up new skills, knowing we're going to have to do it badly at first. But the more we do it, the better we get at it. The more we learn, the more we improve. And one of the things you said there was really powerful: you said, I wasn't looking to see where I could get in my career, I was looking to see how much I could grow. And that's exactly what we teach our mentees at the Privacy Pros Academy. Stop asking what you can get; start looking at what you can give. And when you start looking at what you can give, you first have to ask yourself, what do I have to give? If you have a very limited skill set, then what you have to give is very limited, and the value you bring to the table is very limited. But as soon as you start stacking those skills, just like you've been doing and just like you've been encouraging our listeners to do, you become super valuable, you become super versatile, and you create more options. And everyone talks about freedom. Freedom is having the ability to choose between opportunities, and to be able to say no to them. If someone only has a very limited skill set, their opportunities are so limited there is no way they can say they have freedom. They're trapped by the limits of what they have. Whereas when you have more diverse skills, when you bring more diverse value to the table, you have more doors open to you, you have more opportunities, and you can really take advantage of those and see where they get you.
And as long as you're driven by your passion, driven by your principles and the things you hold dear, and you're doing things that energize you, then you're going to have a great career, you're going to have great freedom, and you're going to be able to bring a lot of value and leave a true legacy in the world.

Sean:

Yeah, absolutely. And there's a great book called Range, I think the author is David Epstein, that I always recommend, and I think the subtitle is something like "the generalists will rule the world". They give all these examples of how over-specialization too early in your career, whether that's in sports or academics or whatever it's going to be, makes you a more rigid problem solver; you can't adapt to new situations. There are all these examples where some really hard problem in chemistry ends up being solved by someone with a background in geology, because they were able to look at the problem in a completely different way and connect it to something new. Whereas if you've been really focused and narrow on one specific thing, you might be an expert in that thing, but it's hard to see outside of the one tool you're used to using as your tool of choice. What's the saying, if you're a carpenter, every problem looks like a hammer, or something like that? You can probably do a better job of interpreting that than me, but essentially you end up using the same thing over and over again and not really being adaptable and flexible. I really agree with you that that's where the freedom comes from. With all these downturns and changes in the market, if you have that level of freedom because you're adaptable, then you can change with the market as well. It's not, oh, suddenly my job's been deprecated; you can change and do something different. You can contribute in a new way.

Jamal:

Exactly. And this is one of the challenges we're seeing in the tech market right now: large language models are coming in, AI is coming in, and a lot of people are worried they're going to lose their jobs. And the only reason they're worried, rather than excited, is because they have such a specific, limited skill set; that's all they've been doing for the last 5, 10, 15 years. The thought of doing something new scares them, and that's the fear they're living in. So a lot of people are retraining and pivoting to things like privacy. But if we think about these things now, and we always make sure we're constantly adding to our skill set, stacking those skills and stacking the value we bring to the table, we should never be in a situation where we have to be worried about something new coming in. We should be excited about changes; we should be excited about embracing them. And for that you need the growth mindset, where you're learning about these things, where you're actively seeking opportunities, whether someone's paying for it or not. And one of the things that really upsets me sometimes is how little people value investing in themselves. I can't understand why anyone would think that you are not your own best investment. I can't think of anything better to invest in than myself. The challenge I see sometimes is people say, oh, my boss is not going to pay for me, or this isn't going to happen. Well, why are you waiting for someone else to invest in your education? Why don't you value yourself enough to invest in yourself?

Sean:

Yeah. I spent basically my entire 20s in university, and I'm not a professor or an academic today. Some people ask me, if you could do it again, would you have done it that way, or, what skills did you get from your PhD that you're using today? And I think there are things that were transferable. But to me, education isn't purely about career. You have your entire life to work and to make money, but education is about bettering yourself. It's the same thing you get from deep-diving into something you're not necessarily being paid for, and not everybody has that gear; it could be something else, maybe physical training or something like that. But I think investing in yourself is really important. And the other thing you mentioned, around people ending up with a fear of change: I think we can all get into that mindset, and I've had different points in my life where I felt that way as well. But if you don't continually challenge yourself, it's easy for that fear to build; it becomes a bigger hurdle to get over. Whereas if you're constantly doing things, even at a micro level, that make you a little bit more uncomfortable, pushing yourself outside your comfort zone, whether that's travel or taking a different path home than you normally do, all these little things you can do to push your boundaries, then it gets easier. Suddenly it doesn't seem that scary, because, oh, I've done a million things like that, or you think, millions of people have done this thing, of course I can do it. And I think it helps build the confidence so you can have that growth mindset.

Jamal:

Absolutely. I couldn't have put it any better myself. Sean, before I let you go, one of the things we always let the guests do is to ask me a question. So feel free to ask me a question, any question you like.

Sean:

Well, I guess I'll flip the ice cream question back to you. What kind of flavour of ice cream would you be?

Jamal:

I would have to be a very exotic flavour of ice cream. So exotic that I don't even know what it is. And the reason I say that is because if you look at my background, if you look at the kind of skills I've been stacking, it's a very peculiar blend. It's a very peculiar blend. But the only reason I get to do what I get to do is because I've got such a strange and varied stack of skills and experience behind me. So it would have to be something exotic with lots of different bits inside.

Sean:

Excellent. Yeah, that's a great answer.

Jamal:

Sean, thank you so much for coming onto our podcast. It's been an absolute pleasure speaking with you. Until next time, folks. Peace be with you.

Outro:

If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released.

Outro:

Remember to join the Privacy Pros Academy Facebook group where we answer your questions.

Outro:

Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world class privacy pro.

Outro:

Please leave us a four or five star review.

Outro:

And if you'd like to appear on a future episode of our podcast or have a suggestion for a topic you'd like to hear more about, please send an email to team@kazient.co.uk

Outro:

Until next time. Peace be with you.


About the Podcast

Privacy Pros Podcast
Discover the Secrets from the World's Leading Privacy Professionals for a Successful Career in Data Protection
Data privacy is a hot sector in the world of business. But it can be hard to break in and have a career that thrives.

That’s where our podcast comes in! We interview leading Privacy Pros and share the secrets to success each fortnight.

We'll help guide you through the complex world of Data Privacy so that you can focus on achieving your career goals instead of worrying about compliance issues.
It's never been easier or more helpful than this! You don't have to go at it alone anymore!

It’s easy to waste a lot of time and energy learning about Data Privacy on your own, especially if you find it complex and confusing.

Founder and Co-host Jamal Ahmed, dubbed “The King of GDPR” by the BBC, interviews leading Privacy Pros and discusses topics businesses are struggling with each week and pulls back the curtain on the world of Data Privacy.

Deep dive with the world's brightest and most thought-provoking data privacy thought leaders to inspire and empower you to unleash your best to thrive as a Data Privacy Professional.

If you're ambitious, driven & highly motivated, and thinking about a career in Data Privacy, a rising Privacy Pro or an Experienced Privacy Leader this is the podcast for you.

Subscribe today so you never miss an episode or important update from your favourite Privacy Pro.

And if you ever want to learn more about how to secure a career in data privacy and then thrive, just tune into our show and we'll teach you everything there is to know!

Listen now and subscribe for free on iTunes, Spotify or Google Play Music!

Subscribe to the newsletter to get exclusive insights, secret expert tips & actionable resources for a thriving privacy career that we only share with email subscribers https://newsletter.privacypros.academy/sign-up

About your host

Jamal Ahmed FIP CIPP/E CIPM

Jamal Ahmed is CEO at Kazient Privacy Experts, whose mission is to safeguard the personal data of every woman, man and child on earth.

He is an established and comprehensively qualified global privacy professional, world-class privacy trainer and published author. Jamal is a Certified Information Privacy Manager (CIPM), Certified Information Privacy Professional (CIPP/E) and Certified EU GDPR Practitioner.

He is revered as a privacy thought leader and is the first British Muslim to be awarded the designation "Fellow of Information Privacy" by the International Association of Privacy Professionals (IAPP).