
Creating a culture of privacy matters for GDPR and CCPA compliance

What you'll learn on this podcast episode

The world of data privacy and protection continues to evolve at a rapid pace. From the growing number of US states adopting privacy laws to the growing list of rulings under GDPR, the EU's General Data Protection Regulation, it's a lot to keep track of. What can organizations do better to adapt to these regulatory shifts and adopt a greater culture of privacy? In this episode of LRN's Principled Podcast, host Aitken Thompson talks with Andrew Lachman, Head of Legal and Data Protection Officer at Contentstack, about data privacy and protection and how to create a privacy culture in the modern workplace.

Learn how you can get involved in today's conversations around data privacy and protection with these organizations mentioned in the episode: the International Association of Privacy Professionals (IAPP), TechGC, the Association of Corporate Counsel, and the Future of Privacy Forum.

Where to stream

Be sure to subscribe to the Principled Podcast wherever you get your podcasts.

Listen on Apple Podcasts, Spotify, Audible, Google Podcasts, TuneIn, Amazon Music, iHeartRadio, Podyssey, Listen Notes, or Player FM.

 

Guest: Andrew Lachman


Andrew Lachman has nearly 19 years of experience in the privacy space, having founded the privacy practices committee at Move.com and co-founded the Congressional Tech Staff Association while serving as Legislative Director for Congressman Ted Lieu, who represents most of the Silicon Beach area. He is currently Head of Legal and Data Protection Officer for Contentstack after running his own firm for a number of years working with startups and growing companies. Andrew is a co-founder and chair of the LA County Bar Association's Privacy and Cybersecurity Section, a member of TechGC and the California Lawyers Association Privacy Section, and has been a member of the International Association of Privacy Professionals since 2007, when he received his Certified Information Privacy Professional certification.

Host: Aitken Thompson


After starting his legal career at Kirkland & Ellis, Aitken became interested in the then-nascent field of educational technology. He left law firm life and co-founded Thompson Educational Consultants and, subsequently, Taskstream, LLC. Taskstream quickly became a leading company in assessment and accreditation for higher education. Aitken served as Chief Operating Officer, leading the legal, human resources, and finance functions of the business. Beginning in 2016, Taskstream underwent a rapid expansion, merging with five other ed-tech companies in a span of 18 months and, in the process, becoming Watermark, LLC, and creating the "Educational Information System" category of ed-tech. During this period, Aitken's legal and HR focus expanded to encompass private equity investment and the transition between primary sponsors, cultural and process integration amongst the various merged entities, and the management and harmonization of legacy client and vendor contracts.

 


 

Principled Podcast transcription

Intro: Welcome to the Principled Podcast brought to you by LRN. The Principled Podcast brings together the collective wisdom on ethics, business and compliance, transformative stories of leadership, and inspiring workplace culture. Listen in to discover valuable strategies from our community of business leaders and workplace change-makers.

Aitken Thompson: The world of data privacy and protection continues to evolve at a rapid pace. From the growing number of U.S. states adopting privacy laws to the growing list of rulings under the EU General Data Protection Regulation, it's a lot to keep track of, and that doesn't even include following your own company's data privacy policies. What can organizations do better to adapt to these regulatory shifts and adopt a greater culture of privacy?

Hello, and welcome to another episode of LRN's Principled Podcast. I'm your host, Aitken Thompson, Chief Legal Officer at LRN. And today I'm joined by Andrew Lachman, head of legal and data protection officer at Contentstack. We're going to be talking about data privacy and protection and how to create a privacy culture in the modern workplace. Andrew's a real expert in this space. He's been working on the topic of data privacy his entire career, consults on public policy, and he's actively leading conversations about this with GCs in tech. Andrew Lachman, thanks for joining me on the Principled Podcast.

Andrew Lachman: Oh, thank you. It's my pleasure to be here with you and with LRN.

Aitken Thompson: Your legal career in tech goes back to the earliest days of internet technology. You've been on the front lines of data privacy and protection for more than 20 years. We often talk at LRN about an ethical culture in the workplace, but you also talk about privacy culture. What does privacy culture mean?

Andrew Lachman: Privacy culture means making privacy decisions for the benefit of your customers as a part of the operation of your company and ingraining that in your culture. It's the difference you see, for instance, with companies like Apple, which have really made privacy, and privacy by design, a part of everything they do, and which understand that maintaining customer trust is important. There's a lot of allure out there in data, but it also presents a very big target for hackers and for abuse, as we've seen with some of the headlines that have come out recently, and a lot of the decisions that have come out of the European Union about technology such as Google Analytics, for instance, and a variety of others.

Aitken Thompson: Well, you touched on it right there: the regulatory environment around privacy is constantly evolving and changing on a month-to-month basis, it would seem. So thinking about privacy culture as an attribute of ethical culture is worthwhile. How can we best prepare a company's leaders, product developers, business analysts, et cetera, to keep privacy in mind on a day-to-day, moment-to-moment basis?

Andrew Lachman: Well, I think first of all, it's important to always review with leadership and everyone what the cost is of not engaging in a culture of privacy. It can affect the trust in your company, it can do reputational damage, it can do financial damage. If you think about GDPR, we're starting to see some very large fines out of a number of jurisdictions. It can be as much as 20 million euros, or as much as 4% of your gross revenue, if they're able to establish, especially after repeated engagement, that there's no longer a good faith effort to try to comply with the law. So there's the financial damage, and there's the operational damage in terms of the morale of the company. So there's a lot of reason. And then, of course, there's dealing with investors. If investors don't have faith, and if you're dealing with activist investors at a public company, that kind of damage can take up a lot of resources as well. So I think that's the number one thing to do: review what the cost is.
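
To make that fine ceiling concrete, here is a small worked example in Python, assuming a hypothetical revenue figure. GDPR caps administrative fines for the most serious infringements at 20 million euros or 4% of annual worldwide turnover, whichever is higher; the turnover number below is invented purely for illustration.

```python
# Worked example of the GDPR fine ceiling mentioned above.
# The turnover figure is invented purely for illustration.

annual_turnover_eur = 750_000_000  # hypothetical global revenue
fine_cap_eur = max(20_000_000, 0.04 * annual_turnover_eur)
print(f"Maximum possible fine: EUR {fine_cap_eur:,.0f}")  # EUR 30,000,000
```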

And then the second thing you can do is empower these various leaders to understand your analysis, how you look at things, and what the benefits are. And I think the third is doing a real financial analysis, and not just a financial analysis but also a data analysis. Everyone wants to hold on to data because it could be useful at some point, but with GDPR, with the CCPA and CPRA in California, and with other laws around the world, in Brazil, even the Philippines, and China, which just passed a law as well, you can only hold the data for the purpose for which you collected it, and you can't hold onto it forever. You can only hold onto it in an identifiable fashion for the period in which you're empowered to use it in relation to the service being provided. If you don't do that, then you expose yourself to investigations and audits, and those all take a lot of time and resources that could be spent on building your customer base.

And so when you do that analysis, you can really look at it and say, okay, is this particular data something we want to collect? What is the actual use? Is it related to the purpose for which we collected it? Recently, the French authority said that you can't even use data to improve products that aren't related to the original product the data was collected from. So you really have to ask yourself, "Why am I using this data? Why do I need it? Is it related? And are there other ways in which we can collect or separate out this data without building profiles or things that would run afoul of various regulatory authorities?" So you've got things like pseudonymization, where instead of throwing together a bunch of data on an individual in one place, you separate it out into different places and use a token that can't be directly linked to the various pieces of data without some sort of exceptional effort. And that can, at a very low cost and a low operational impact, protect the company.
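
As a concrete illustration of the pseudonymization approach described above, here is a minimal sketch in Python. The store names, fields, and sample data are hypothetical; the point is simply that neither store identifies the individual on its own, and only a random token ties them together.

```python
import secrets

# Illustrative sketch of pseudonymization: direct identifiers live in one
# store, usage data in another, and only a random token links them.
identity_store = {}  # token -> direct identifiers (kept under tighter access control)
activity_store = {}  # token -> usage events with no direct identifiers

def pseudonymize(email: str, name: str) -> str:
    """Create a random token for a person and file the identifiers separately."""
    token = secrets.token_hex(16)  # random, not derived from the identifiers
    identity_store[token] = {"email": email, "name": name}
    return token

def record_activity(token: str, event: str) -> None:
    """Record usage against the token only."""
    activity_store.setdefault(token, []).append(event)

token = pseudonymize("jane@example.com", "Jane Doe")
record_activity(token, "viewed_pricing_page")
# Re-identification requires access to identity_store, which can be locked down
# or purged once the retention purpose expires.
```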

Aitken Thompson: Got you. Since I'm a GC, I understand the balance between business goals and these privacy concerns; I worry about that, I think about that. How do I get the rank-and-file people, the product developers, the marketing department, to keep this front of mind, from a logistical or educational standpoint, or however you want to take that question?

Andrew Lachman: So GDPR requires that you do annual privacy trainings. And of course, companies will, generally speaking, do privacy training, sometimes at a broad level, and say, "Hey, this is personal data." What you're really better off doing is breaking it down and doing sub-trainings as well, and talking with your sales team, your marketing team, and your product teams, and by the way, also your engineering and technology teams, to make sure that each of them understands how this applies to them. So you're making them partners. If you're in a situation where you're the only one asking these questions, then you're making your job much, much harder. You are much better off empowering the teams you work with, with some of the tools of analysis, so that they can ask you these questions and it becomes a part of what they do.

And that makes a big difference. And that can be a challenge, especially in a high-growth environment, because everyone wants to get it out right away and get it done. But if you build in what's called privacy by design, which is mandated by GDPR, early in the product development process, when they're starting to put things together, you can help answer those questions early on, and they're not rushing to clean things up. Proactivity is often a lot easier said than done, especially in this world of, what was it, "ready, shoot, aim" and failing fast and things like that, but the whole purpose behind privacy by design and the principles around it is to get you involved in that process early, so that you're not in that situation.

Having regular meetings with your teams to talk about these issues and being a part of every product development process early on makes a difference. Maybe it makes sense, if you have a product, to just base the entire hosting out of Europe, so that you're not spending the money you'd have to spend to separate out the data. And frankly, as GDPR and American standards continue to get closer and closer together, it might make sense just to go for the highest common denominator as opposed to spending the money to separate everything out.

Aitken Thompson: Got it. And it's interesting. I agree that we're seeing the American, or the U.S., version of GDPR embodied in the CCPA, or it's actually not called the CCPA anymore, but [inaudible 00:08:21].

Andrew Lachman: It's going to be called the CPRA as of 2023. And in 2023, it gets more GDPR-like, so a lot of the standards and a lot of the rights are more similar as well. Including, by the way, that most of the other states have a business-to-business exception with respect to data that's being provided in connection with a service. Right? You work for a company, you have to provide your work email and your login and your history of the use of the product in connection with that; under Colorado and Virginia, and the law that's coming out of Connecticut as well, that is excepted and is not considered personal data. And there are exceptions in Canada as well. But in California and in Europe, it's all treated the same and you don't have a business-to-business exception.

Aitken Thompson: Gotcha. So all these new regulations come out and then coalesce around what will probably be a global standard. Have they gotten the balance right? There's certainly an argument to say, "Look, the regulations have gone too far." Certainly the gathering and analysis of data has created an incredible amount of interesting business insights, and certainly value to customers and companies alike. Do you think regulators are getting the balance between privacy and commerce correct, as a general matter?

Andrew Lachman: It's always going to be a constant battle. I think there are a couple of challenges, and I remember this from my days on Capitol Hill; I followed very closely the commenting process around the CCPA and the appointment of the folks on the Data Privacy Agency board as well. Challenge number one is that regulations move much slower than the technology changes. And so governments have gotten better at regulating from a set of principles as opposed to a set of strict standards. That gives them flexibility, and then they can adapt accordingly. GDPR did a very good job with that. The CCPA, and the CPRA I think, did a better job with that. We'll see as the regulations come out, because it's still a push and pull. But the other big issue that comes up is that sometimes privacy advocates don't have an understanding of how the technology works and how the data flows.

And so there is a tendency to put a set of abstract principles out there without thinking about how they go through a user experience and how data actually flows with the technology that exists. A good example is the universal opt-out button right now. The CPRA says you're not required to have it, but the attorney general really wants one and has been pushing very hard for it. Getting universal technologies like that adopted, and working them into your code, et cetera, takes a very long time in the private marketplace. So I think in that particular case, you have this disconnect where these theories exist, but actually understanding how they can be put into practice creates a problem where the aspirations can be very difficult, if not impossible, to reach. It's not like environmental regulation, where there are technologies out there and people are constantly working to meet certain standards.

And there's a strong impetus in the marketplace to do those kinds of things, because people can make money from it. That's not quite there yet with privacy. And so you've got really good organizations that want to protect your data and protect your rights to privacy, but they have never been in the place of actually seeing how the data flows work and understanding how to take these principles and plug them into the real-life application of data and how companies will work through them. And I think if we can bridge that gap, we'll see a much better set of regulations that meets those goals and also takes away the excuse of some companies whose approach is, frankly, "Well, we just can't do it." And no one has asked them, "Why? Let's walk through it. Let's see where we can plug into your process."

A while back, Google was saying, "Oh, we can't ever change Google Analytics. We have to collect IP addresses." And the regulators in Europe went through a very long process of discussion with Google, and believe me, it wasn't an easy slog, but they did it. And now Google Analytics, in its next version, will not collect IP addresses, which is going to change the entire advertising space and a lot of other areas. But it takes a lot of discussion to make that happen. I know that was a very long answer, but it's a very, very complex issue.
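
For readers wondering what honoring the universal opt-out signal Andrew mentioned might look like in code, here is a minimal, hypothetical sketch. Browsers that implement Global Privacy Control send a "Sec-GPC: 1" request header; how a site is legally obliged to respond is exactly the policy question being debated, so the handling below is illustrative only.

```python
def wants_opt_out(headers: dict) -> bool:
    """Return True if the request carries the Global Privacy Control opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

# Example request headers; the browser name is made up for illustration.
incoming = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if wants_opt_out(incoming):
    print("Opt-out signal present: suppress third-party tracking for this request")
else:
    print("No opt-out signal: fall back to stored consent preferences")
```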

Aitken Thompson: So certainly data privacy and protection is an issue for essentially all companies these days, because everyone uses computers and gathers some sort of data. But does the regulatory burden, or the changes people should be looking for, get significantly larger depending on what industry a company is in?

Andrew Lachman: Oh yes, very much so. If you're in the financial space or the healthcare space, there's a much more extensive regulatory regime and environment. As an example, we've got Gramm-Leach-Bliley and the New York financial services regulations around financial services, but the banking industry itself has come up with its own set of separate regulations about cybersecurity, and if you're working with banks, for instance, you need to adapt to those. If you're in the healthcare space, you have to deal with HIPAA and ransomware and HITECH, and then even the Cures Act now, which has some changes as well and updates a lot about electronic health records and ePHI. Where I work, at Contentstack, which is a content management platform, we have to deal with all kinds of customers, making sure that we're adapting and working with the various companies out there on the special models that they have to deal with as well.

And finally, I forgot to mention children as well, which is a very highly regulated area. You've got COPPA, and California has its own separate regulations around children, but so do GDPR and the other privacy regulations out there, which affect marketing. It really can affect your outreach to children. So you need to know who you're serving and make sure that you've got those particular kinds of risks in mind as well when you're dealing with regulations, based on your market and the particular areas that you're serving.

Aitken Thompson: Absolutely. These are very [inaudible 00:14:37] issues. We at LRN had a product launch delayed because of the Google Analytics issue in France, or Austria, I believe that was the country. Do you want to give listeners just a brief sketch of exactly why we're talking about IP addresses and Google Analytics?

Andrew Lachman: Your IP address is considered personal data in California, in Europe, and elsewhere. And the reason is that even if you're using a dynamic IP address, like on your phone, where your IP address isn't static, it can be traced and used to identify you as an individual. Under European law in particular, their constitution gives individuals a right of control, not just privacy, but a right of control as well, over data that's collected about them. And that's why GDPR has set standards for rights of correction and that sort of thing. And that carries over to IP addresses and other things that can be used to create a profile on you, which can be used in all kinds of different ways by advertising companies. And Google Analytics, how they make their money is that they build profiles and use those profiles to increase their advertising dollars.

So there's been a real loggerheads between the European data regulatory authorities, which are saying you have this information that can really be used to identify you as an individual, especially on a computer, even more so than on a phone, and companies that are relying on this construct of collecting IP addresses to build these kinds of profiles. And so the authorities finally said, look, you've been told over and over again, and we've been asking you questions about your industry to understand it better: this IP address information is just too personal. It tells too much about you and really can help to identify you. And so you need to find other means, other than IP addresses, if you want to collect information, because you're putting people in a situation where they can't turn off the IP address and not share it. And that's a fundamental of European data regulation, and it will be in California as well.
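
One common technical mitigation, and roughly what "not collecting IP addresses" can mean in practice, is truncating the address before storage so it no longer points to a single household or device. The sketch below is a minimal illustration of that idea, not a statement of what any regulator requires; the prefix lengths are assumptions.

```python
import ipaddress

def truncate_ip(raw: str) -> str:
    """Drop the host-identifying portion of an IP address before it is stored."""
    ip = ipaddress.ip_address(raw)
    if ip.version == 4:
        # Keep the /24 network, zero out the last octet.
        return str(ipaddress.ip_network(f"{raw}/24", strict=False).network_address)
    # For IPv6, keep only the /48 prefix.
    return str(ipaddress.ip_network(f"{raw}/48", strict=False).network_address)

print(truncate_ip("203.0.113.42"))    # -> 203.0.113.0
print(truncate_ip("2001:db8::1234"))  # -> 2001:db8::
```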

Aitken Thompson: Gotcha. So there actually are other tools. We were using Google Analytics just for internal page counts, to understand usage of our site and parts of our site, and there are other ways of doing that, other pieces of software that don't collect IP addresses. You said Google's going to be changing their analytics package to not track IP addresses. Do you know when that's going to happen?

Andrew Lachman: I think people were saying whatever version four is, so it's going to be coming out in the next couple of months. I don't recall an exact date.

Aitken Thompson: Got you.

Andrew Lachman: But they are responding to it. And also I think what Apple has done as well, in terms of really moving toward privacy and disabling some of the tools that Facebook and Google have relied on to collect information on the users of their services, has forced people to move as well. So sometimes when one major market player decides to embrace privacy as a core value, it can ripple out and affect other partners, and force them to the table. Big companies don't turn on a dime, so it takes a lot to move them in a direction, and I think what Apple did made a big difference. Tim Cook recently spoke at the Global Privacy Summit that the IAPP does every year in Washington, D.C., and talked a lot about those values.

Aitken Thompson: Got you. So the conversation we're having, like every conversation I have about privacy and data protection, illustrates that this is a complicated, fast-moving area of law that you really need to keep up to date with. I understand you're also, among your other activities, a founding member of TechGC. Can you tell me about the mission of that organization?

Andrew Lachman: There are a number of great organizations that you should be involved with if you're a GC, and I would say TechGC is certainly one of them. The Association of Corporate Counsel is also great, but TechGC particularly deals with the tech industry, and it's an association, I'm one of the relatively early members, that allows GCs to talk to other GCs and share their experiences. Sometimes it can be a good therapy session too, by the way. It's a great place to find resources, to ask questions, and to learn from each other about your experiences and the issues that you face as a GC in the tech area. I would also very much recommend, if you're interested in privacy, joining the International Association of Privacy Professionals; I've been a member since 2007.

They have certifications now, which are pretty much the standard in the field. There's also an association for data governance, which is another group that's come in, but the IAPP frankly really set the standards. And when you go to the conferences they do, and they have them all over the world, you always pick up something really useful that you can take back. At the last one I was at, we had the top folks from Commerce and the European Data Protection Board coming and talking about what changes were coming with respect to a successor to Privacy Shield.

As you know, it was thrown out about a year and a half ago, and we've all been dealing with the tremors from that, which have made transatlantic data flows very, very difficult. The Biden administration has been very good in terms of really embracing this and making sure that those avenues remained open, and they've come up with a process. By June or July, they're supposed to have all the rest of the details worked out. But this is the kind of stuff you can pick up at the IAPP that you can't pick up anywhere else.

There are also organizations like the Future of Privacy Forum, which is, I think, a smaller group, but a really good resource and a good place to learn and keep on top of all the changes happening in this particular field. So those are a couple of really good resources, both from a TechGC perspective, a general counsel perspective, and also from a privacy perspective, to make sure that you're really staying on top of this very rapidly moving field.

Aitken Thompson: Got you. And going back to a culture of privacy in a corporation or a company, I would see the stakeholders being the GC and HR. Are there any other people you would suggest bringing together? For someone who's implementing or strengthening the privacy culture at their company, how would you go about that from a stakeholder standpoint?

Andrew Lachman: That's just the tip of the iceberg. HR, obviously, because you've got regulations around employment privacy as well, but your product team, your marketing team, your sales teams, and your engineering teams all need to be a part of this discussion, because that's what makes a difference. I'll give an example from very early on in my career, when I had my own firm. I had a company come to me and say, by the way, our engineering team just decided that they're going to implement this one technology out there that tracks everything people do on our site. And it's like, well, that kind of has an impact on GDPR.

So I would recommend you pause that and make sure that you review it. But also, frankly, you need to make sure that your engineering team understands what it is they need to do so that they don't get the company in trouble, because those are the things that will catch you. You can have a privacy policy, but it's those little details on the marketing and technology ends. If you're not making sure that they match up to what you say you're doing on paper, whether you've got SOC 2 compliance, ISO 27001, or just your general privacy policy, all of those will open up your company, or potentially open up your company, to lawsuits and costs. And they can be catastrophic. They can be company-enders, notwithstanding the effect on trust in your company.

So you really need to make sure that all of those teams are involved. A lot of the big fines that have come out of Europe have been against companies, but not marketing companies, regular companies. It's been telecommunications companies that were tracking information on the back end and didn't realize they were doing it, and then very suddenly they find out there's a complaint because someone found out.

There are a couple of other really good examples we've seen out there, from a data breach perspective and not. Target had a catastrophic data breach that happened because they kept their HVAC software on the same server as their credit card information. Some guy who was fixing the HVAC system plugged in his USB and ended up downloading a virus into the database, into the computer, and because there was no separation, it ended up being one of the largest thefts of credit card information in the world at the time as a result. But there's also the example of a family where a daughter who was worried that she was pregnant was doing searches on her own about pregnancy tests, and her parents were on Google, I think it was, or something.

I want to say it was Google, but it was somewhere, and very suddenly ads started popping up asking, are you pregnant? Obviously, the privacy issues, not just for children but between adults who live in the same household and who may use the same computer, are pretty scary. And so it's just an example of how trust in your company can really be hurt as a result. And when the trust in your company is hurt, it attracts more regulatory scrutiny. When you have more regulatory scrutiny, there are more legal bills, because now you're having to deal with all these audits and investigations that weren't there before.

Aitken Thompson: Oh, absolutely. Well, those couple of stories just show how much more we could cover here, but we're running out of time, Andrew. So thank you for joining me for this episode. My name is Aitken Thompson, and I want to thank you all for listening to the Principled Podcast by LRN. Thanks, Andrew.

Andrew Lachman: Thank you for having me.

Outro: We hope you enjoyed this episode. The Principled Podcast is brought to you by LRN. At LRN, our mission is to inspire principled performance in global organizations by helping them foster winning, ethical cultures rooted in sustainable values. Please visit us at lrn.com to learn more. And if you enjoyed this episode, subscribe to our podcast on Apple Podcasts, Stitcher, Google Podcasts, or wherever you listen. And don't forget to leave us a review.

Ready to upgrade your ethics and compliance program?

We’re excited to give you a personalized demo of the LRN solution. We’ve been a trusted ethics and compliance partner for over 25 years. With over 30 million learners trained each year, we optimize ethics and compliance programs across the globe to help save your team time, increase engagement, and align with regulation.