What you'll learn on this podcast episode
How do you know if your ethics and compliance program is successful? How are you capturing data and comparing it to industry benchmarks, or tracking your own company's trends over time? In this episode of LRN's Principled Podcast, host Emily Miner, director of Advisory Services at LRN, talks about benchmarking E&C data with her colleague Derek Clune, product manager of Data & Analytics. Listen in as the two explore how benchmarking practices come to life and the role AI plays in LRN's new Catalyst Reveal solution.
Where to stream
Be sure to subscribe to the Principled Podcast wherever you get your podcasts.
Guest: Derek Clune
Derek Clune has been working in the ethics and compliance space for over five years with an emphasis on data and analytics. As a Product Manager at LRN, Derek is responsible for the vision of LRN's new data and analytics platform, Catalyst Reveal. His main goal is to provide E&C professionals with more actionable data so they can better understand the effectiveness of their E&C programs. Derek's team works to create products that offer best-in-class prescriptive interventions to improve E&C programs and ease the administrative burden.
Host: Emily Miner
Emily Miner is the Director of LRN’s Ethics & Compliance Advisory practice. She counsels executive leadership teams on how to actively shape and manage their ethical culture through deep quantitative and qualitative understanding and engagement. A skilled facilitator, Emily emphasizes co-creative, bottom-up, and data-driven approaches to foster ethical behavior and inform program strategy. Emily has led engagements with organizations in the healthcare, technology, manufacturing, energy, professional services, and education industries. Emily co-leads LRN’s ongoing flagship research on E&C program effectiveness and is a thought leader in the areas of organizational culture, leadership, and E&C program impact. Prior to joining LRN, Emily applied her behavioral science expertise in the environmental sustainability sector, working with non-profits and several New England municipalities; facilitated earth science research in academia; and contributed to drafting and advancing international climate policy goals. Emily has a Master of Public Administration in Environmental Science and Policy from Columbia University and graduated summa cum laude from the University of Florida with a degree in Anthropology.
Principled Podcast transcription
Intro: Welcome to the Principled Podcast, brought to you by LRN. The Principled Podcast brings together the collective wisdom on ethics, business and compliance, transformative stories of leadership, and inspiring workplace culture. Listen in to discover valuable strategies from our community of business leaders and workplace change makers.
Emily Miner: Gone are the days of checklist ethics and compliance programs, where one simply goes down a list of program features and elements. Now, regulators, employees, customers, and leaders are asking: are our ethics and compliance programs effective? Are they successful? Well, how do you know? Hello, and welcome to another episode of LRN's Principled Podcast. I'm your host, Emily Miner, director at LRN. And today I'm joined by my colleague Derek Clune, product manager of Data and Analytics at LRN. We are going to be talking about ethics and compliance benchmarking and how organizations can track their own trends over time, as well as compare themselves to industry peers. We're going to talk about how all of this data comes together in technology environments like LRN's new Catalyst Reveal solution, which is launching soon. Derek is a real expert in this space. He's been working in the data and analytics vertical at LRN for a number of years, and is a key architect behind our product innovation, incorporating the insights of our industry collaborators at major corporations around the world. Derek, thanks so much for joining me on the Principled Podcast.
Derek Clune: Absolutely, Emily. Pleasure to be here.
Emily Miner: So before we get into it, maybe just some definitions and level setting. So what is benchmarking? The way that we think about it, it typically means comparing what you do as an organization to a number of comparable organizations or individuals. And usually this is done in a quantitative way, a more numeric, data-based way, as opposed to a qualitative way. And benchmarking is helpful just for comparative purposes. It can also help to identify best practices in the industry, best practices referring to those behaviors, practices, and systems that research shows the very top firms use, maybe to a greater degree than other organizations. So why do organizations benchmark, or want to benchmark? Derek, I know that you have a lot of conversations with our client partners around their benchmarking requests and their needs. But as an overarching point, why are companies interested in benchmarking? What's the value to them?
Derek Clune: Yeah, I think there's a number of reasons why we see it. In my conversations with our partners, obviously regulators are looking at ethics and compliance programs with much higher scrutiny than they ever have. And so organizations want better visibility into the wider space, whether that's how their ethics and compliance program measures against others within their industry, or how it measures against others of a similar employee size or geographic footprint. Organizations really use two sets of benchmarks. The first is internal company benchmarks: their own data and organizational assessments, benchmarked quarterly and year over year to measure their own program. But they also want a broader audience to compare themselves to, to really see where they rank, for lack of a better term, within the pack, so to speak. And so a lot of this, we see, is all around measuring ethics and compliance program effectiveness.
How do I know my program's effective? I have the parts and the components, the codes of conduct, the policies, the disclosure certifications, but how do I know that those are effective? And we're seeing more and more that data is being used as a key component in that measurement of program effectiveness.
Emily Miner: Yeah, I'm reflecting on some conversations that I've had with our partners where they've said, our calls to our hotline are X percent, is that good? We can collect data on ourselves and measure it, and certainly that's where organizations have been heading for a while, this increased data collection and analysis. But sometimes doing that in a vacuum, you're left wondering, okay, well, the number is four, is that good? Should it be five? Should it be one? Should it be 20? What does this mean? And I think that's where that comparison is helpful, because you used the term broadening the pool, or broadening the lens. I don't remember exactly what you said, but that idea of broadening your viewfinder. And I think that's a large part of where this desire to benchmark comes from. And just also as humans, we like to compare ourselves to others in so many parts of our life. So there's maybe a human nature component to it too.
Derek Clune: Yeah. No, absolutely, you took the words right out of my mouth. None of these organizations, while they are all unique, operates in a vacuum. And so they need to have some sort of comparison just to know whether they're below, above, or equal to a number, because we know the regulators don't give specifics. So the next best thing that we have really is this benchmarking tool of, in our case, all of the LRN partners, which is over 2,000 partners across a number of different industries, the Fortune 500, et cetera.
Emily Miner: Yeah. And so Derek, I know that you partake in a lot of voice-of-the-customer conversations, and you are the recipient of a lot of requests for information from others within our organization. What are some of the top requests or data questions that you hear from our partners? You talked about wanting to measure program effectiveness. How are people thinking about program effectiveness? What do they want to measure? What do they currently have, versus what do they want but not yet have? What are some of the general themes?
Derek Clune: At a high level, we know that all of these questions typically start with a risk assessment. A company will do a risk assessment with a third party to get at maybe their blind spots, or to tell them some things that they already know. In most cases that serves as the initial roadmap of different topics to consider around benchmarking and around these data questions. And from there, we see organizations typically focused on the course data. That's the most popular one. We're rolling out mandatory training; how are my employees performing on that training? It has some sort of test in it; are my employees performing better or worse than I expected, or right on par with the requirements? And within that there's a lot of different sub-contexts. So is a particular business unit outperforming or underperforming relative to the average or the median?
Is there regional confusion around a question? So I would say the initial focus that people immediately go to is the mandatory training that is being assigned and the course performance metrics, I'll say, how employees are performing within those courses. There are a lot of tertiary components that are critical to measuring program effectiveness. What we also see is culture being a critical component of ethics and compliance, and larger initiatives at an organization aimed at measuring overall learner sentiment around the communications and courses being rolled out to learners. So not only are we looking at the performance aspect of those, but also the sentiment and learner feedback on what they think. All of those kinds of surveys where you're getting additional feedback from employees are another great metric. Overall investment is another. Of course, senior leaders and maybe chief ethics and compliance officers want to see the return on investment in the ethics and compliance program.
So what metrics can we look at to demonstrate that there is an ROI there? And then another piece that we see is the communication strategy: how do we take those metrics to identify the best time to roll out a campaign? Frequency of reminders, those types of things, is another point we can look at and improve upon year over year.
Emily Miner: Yeah, that's interesting. I actually don't know the answer to this: do we do any type of A/B testing? I'm kind of fascinated by that idea that you were just describing of what's the optimal time to roll out a training and the frequency of reminders. And I was actually just having a discussion with one of our colleagues about whether she's the type of person who takes her training right away, which she is, or whether she waits till the last minute, which I confess I am that type of person. So we were just talking about different types of people and how their personality, their characteristics, inform their behavior with respect to taking training. But anyway, I'm just thinking about, okay, we have these two types of people, and how can we optimize attention? What's the right cadence of reminders to get the laggards like me, or the right time of year?
So with that, how do we know? Do we do any type of A/B testing or comparison? We tried one reminder a week last time; let's do two reminders a week. How do we know?
Derek Clune: Yeah. So at LRN, if a partner is using the LRN platform, we do have the overarching data of when they're sending reminders and when they're rolling out campaigns. We are just at the forefront, with this new Catalyst Reveal dashboard, of being able to look at that data and make prescriptive recommendations for organizations. And what I've seen is that there's really no one-size-fits-all for any of these organizations and their communication strategies. You have some people who are rolling out training and communications once a year, some on a quarterly basis, some biannually. And so we're just at the forefront of being able to look at that data, look at the time of a completion, how many people completed before a due date and after a due date. And we're expecting sometime mid next year to be able to make those prescriptive recommendations.
And the exciting thing about this is that the more people we get onto this tool and using the tool, the more accurately we can prescribe different methods. So potentially we could say, for a specific industry like tech, we see that Friday afternoons are a better time. This is very normal in sales and marketing strategies for emailing: when's the most optimal time to send an email so that somebody will look at it? 11:00 AM on Tuesday, or something like that, the last time I checked. So we want to be able to do that. But also, within each of those communications, which pieces of collateral are the most effective? Is it a quick, short video from the CEO that Emily is going to click on? Is it the email spoofed to look like it's coming from the CEO? That might get everyone's attention. So there's a lot of things that we're looking at currently, and we'll be doing that sort of A/B test, as you mentioned.
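For a concrete flavor of the A/B comparison Derek describes, here is a minimal sketch that compares completion rates under two hypothetical reminder cadences using a two-proportion z-test. The cadences, counts, and function are invented for illustration; they are not LRN data or code.

```python
import math

def two_proportion_z_test(completions_a, sent_a, completions_b, sent_b):
    """Compare completion rates between two reminder cadences.

    Returns the z statistic and a two-sided p-value (normal approximation).
    """
    p_a = completions_a / sent_a
    p_b = completions_b / sent_b
    # Pooled completion rate under the null hypothesis of no difference
    p_pool = (completions_a + completions_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical campaign: cadence A (1 reminder/week) vs. B (2 reminders/week)
z, p = two_proportion_z_test(412, 800, 465, 800)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value would suggest the two cadences genuinely differ in completion rate rather than by chance.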
Emily Miner: Okay. Well, that's really fascinating. So it sounds like you and I should have another conversation in June or July of 2023, and you can tell us what you found with all of this. I love it. I think that's so interesting. And you're right that it's about building up that data pool, and it's only as good as the size of the data pool, just like genetic testing: what percentage of me is Irish? So we talked a little bit about... You've mentioned Catalyst Reveal, and you're talking about LRN's platform. So I want to turn to that now, because this is a really exciting product launch for LRN. We're launching a whole new platform in just a few weeks, in mid-October, that will dramatically increase the ability of our partners to benchmark against some of those metrics that you were talking about before. And I know that you've played a lead role in designing what that looks like, the features and functions, how it works, and making those choices. So one, can you tell us what Catalyst Reveal means? And then two, what will it enable our partners to do?
Derek Clune: Sure. Yeah, so Catalyst Reveal is the name that we've given to our new data and analytics platform. We want to reveal actionable data and insights to our stakeholders: mostly ethics and compliance program administrators, who are really in the day-to-day nitty gritty of an ethics and compliance program and the data around it; secondly, chief ethics and compliance officers; and thirdly, leadership, boards of directors, et cetera. So the name Reveal comes from the idea that we want to provide our partners with more actionable data so that they can get deeper insights into their employee populations, but also be able to use those insights to take action accordingly. So that's where the Reveal comes from. [inaudible 00:15:10] itself really is that we've revolutionized our data and analytics platform to allow administrators to do a whole lot more than they ever could through LRN.
Number one is the organizational aspect of the data: benchmarking just the single organization's data. So I have all the LRN employees; I want to be able to compare and contrast sales with finance, with marketing, and see how those test scores are on a quarterly basis, year over year. That's something that's going to be within the tool. Additionally, as we're talking about now, being able to benchmark those pieces of data against a larger LRN audience. So within a particular industry, how do we compare? Within an employee size band of 5,000 to 10,000, how do we compare with organizations of that size? How do we compare with organizations with a revenue between 500 million and a billion dollars? And so, going back to the beginning of our conversation, this allows our partners to benchmark internally and externally.
So they have the numbers and the data, and they're not in a vacuum, because with one click of a button they can look at the benchmark and see how they compare. And the main aspects of our initial launch in October are going to be the course data, so the course performance metrics that I mentioned, which we know organizations are keenly attuned to; company culture and measurements around that; and finally, the overall learner sentiment on the courses themselves. In the future we'll continue to add, but those are the three core dashboards and benchmarking capabilities that partners will have come October.
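To make the internal-versus-external comparison concrete, here is a minimal sketch of the kind of roll-up such a dashboard might compute, assuming a flat table of per-learner course scores. The column names, figures, and the industry_benchmark value are hypothetical, not the Catalyst Reveal schema.

```python
import pandas as pd

# Hypothetical per-learner course results (not the Catalyst Reveal schema)
scores = pd.DataFrame({
    "business_unit": ["Sales", "Sales", "Finance", "Finance", "Marketing"],
    "course": ["Harassment"] * 5,
    "test_score": [72, 81, 90, 86, 78],
})

# Internal benchmark: each business unit against the company-wide average
company_avg = scores["test_score"].mean()
by_unit = scores.groupby("business_unit")["test_score"].agg(["mean", "count"])
by_unit["vs_company"] = by_unit["mean"] - company_avg

# External benchmark: company average against a (hypothetical) industry figure
industry_benchmark = 80.0
print(by_unit)
print(f"Company avg {company_avg:.1f} vs industry benchmark {industry_benchmark:.1f}")
```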
Emily Miner: Yeah, just a quick note on the potential benchmark data pool, because I hopped over to our director of communications to get some insight into our partner base. We're looking at over 1,000 partners from around the world with a combined 28.3 million employees, including a big chunk of the Fortune 500. So that's the potential universe of comparative data that we hope our partners will have access to, which is really exciting.
Derek Clune: My eyes light up when I hear that amount of possibility with this tool, because it really is: the more you put in, the more you get out. And so as the product manager of the tool, we'll be adding more capabilities, such as disclosures and certification management and everything around that. So in the future you can see the possibilities of, okay, if we identify a risk through a disclosure or a certification, and we see that an employee or a region is scoring low on, say, the conflict of interest course and they're not disclosing anything, we could hypothetically see potential for a high-risk environment. There's a lot of really exciting things, and I know we have a bullet point to talk about what the future of this tool looks like. But it's going to be a game changer for LRN, and I think for our partners as well.
Emily Miner: Absolutely. And thinking about the culture data, the ability to drill down into that one particular business unit or location that's scoring way below, seeing what is raising that red flag, and going in and comparing that with some of the other data that you've mentioned that partners can collect on our platform. And triangulating those and rolling out some early intervention, or refresher training, or leadership coaching, whatever it might be. Being able to have those different data sources, those data feeds, pulled together into one place so that you can look at them... We've been talking about vacuums; being able to look at them not in a vacuum is really exciting. I can't wait to see how our partners use it.
Derek Clune: Yeah, a single source of truth to be able to start the triage process, whether that's for a high-risk issue, or even for triaging, okay, is this specific question in this specific course too difficult? Do we need to change the wording? Okay, we've changed the wording; do we see an improvement in performance? So we want to create that tool to really track the entire ethics and compliance life cycle and make the administrators' lives a little bit easier with that single source of truth, so that they have one place designed specifically for them with the appropriate metrics. We've found from our 1,000-plus partners that these are the metrics that are most important, so that they can build a world-class program.
Emily Miner: Yeah, so this is all really exciting. And I know that this is an area that you and I both have some personal passion around. But we would be remiss not to also acknowledge that there are limits to benchmarking; it's not a be-all and end-all. And we should be thoughtful to guard against what's sometimes referred to as blind benchmarking. So I want to spend a little bit of time talking about where benchmarking isn't helpful, or what it can't do, or what we shouldn't use it for. I guess just to start, you mentioned earlier that one size doesn't fit all. And I think that we know that to be true, and the regulators acknowledge that as well. The Department of Justice, in their Evaluation of Corporate Compliance Programs guidance document, talks about how one size does not fit all with respect to ethics and compliance programs and that organizations need to consider their risks.
So you talked about risk assessment as well. They need to consider their specific risks, their size, their industry, their geographic footprint, their resources, et cetera, when designing and implementing their ethics and compliance programs. So because all organizations are unique, even within a given industry, there are some limitations. To give a concrete example that one of our colleagues in the advisory practice, Susan Deva, shared with me: we conducted a program evaluation for two companies around the same time, looking at their program maturity and effectiveness. Both companies happened to be in the same tech manufacturing sector, and they even produced really similar products. So one might be forgiven for thinking that our evaluations and our recommendations would be structured the same. But despite these companies' similarities, they had really different risk profiles. One company was a major exporter to the Chinese tech company Huawei, which was sanctioned by the US government in 2019, whereas the other company had a different customer base.
So comparing policies and procedures around trade controls, for example, would not have been appropriate in this case. So that's one example of, again, using that term, blind benchmarking. We just have to be careful about what we're choosing to benchmark and recognize that not everything is benchmarkable, or even if it is, should be benchmarked. I'm just curious about your thoughts around the limits of benchmarking, or where we want to take it with a grain of salt. Obviously it has a lot of really positive uses that we've already talked about, but what are some of the limits that we need to keep our eyes wide open about?
Derek Clune: Yeah. I think you really want to make sure you understand the benchmark itself. So if you're looking at industry, what is that makeup really, or employee size? Two organizations of 10,000 employees can be wildly different, as we know. One could be in retail, one could be in manufacturing, and those have completely different risks. And so when you look at the numbers associated with the benchmark, like the average test score for this harassment course in your industry is 80% and you're at 70%, the immediate response is, well, I'm below the benchmark. But those could be wildly different organizations. And so I think understanding the benchmark itself is certainly critical when organizations are looking at this. And even when that benchmark is, as you correctly pointed out, correctly defined, each organization is still very unique. And so I think it is a great data point to use to orient yourself and navigate from, not the end-all, be-all solution.
Emily Miner: Yeah. And really, by default, the benchmark, if it's a complete data pool benchmark, meaning it includes all the possible data points, is by definition an average. And the average isn't always good. My beat at LRN, if you will, or one of my beats, is culture. And I've worked with a number of organizations in helping them to understand and evaluate their ethical culture and improve upon it. And we typically do provide industry benchmarks related to that ethical culture data, which is helpful. But I've been in a number of conversations with chief ethics and compliance officers where they say, we don't want to be average; we want to be better than average. So the benchmark is a helpful orientation of where we are, but it's not necessarily something to shoot for. Maybe we want to shoot higher because our standards, what we expect, are higher.
Just another little example: a few months ago, I guess it was, we did a benchmarking effort related to codes of conduct, where we evaluated, I think it was, nearly 150 publicly available codes of conduct from the top listed companies in the US, UK, France, and Germany. And what we found was that over 70% of the codes we assessed had a Flesch-Kincaid grade reading level above 9.5, meaning the grade level one would typically need in order to understand the content. But a reading level above 9.5 is actually commonly accepted to be too high. What we typically want to shoot for is an eight and a half to nine and a half, and that's a standard range not just for codes of conduct but for material on a company website, any sort of content that is being consumed by people.
It's a generally accepted, appropriate reading level to be accessible to the majority of your audience, whatever the audience might be. So anyway, in this case, we have the vast majority of codes reading at a very high reading level. But again, that's not necessarily what a company should shoot for, and in this case we would argue that they should shoot for something lower. So I completely agree with you: one, just understand what the benchmark is. Is it a valid benchmark that we're comparing ourselves to? And then, assuming that it is, how do we put it in the context of our own organization, our own goals for ourselves, our internal comparisons year over year? I think that's really important. And I've had the privilege of working with a lot of companies for many years with respect to their ethical culture, where we do these recurring assessments, and so we're able to track progress over time.
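For reference, the Flesch-Kincaid grade level Emily mentions is a standard readability formula: grade = 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. Here is a minimal sketch, using a rough vowel-group heuristic for syllable counting; real readability tools use dictionaries or more careful heuristics.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: each run of consecutive vowels counts as one syllable
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

sample = ("Employees must disclose any conflict of interest. "
          "If you are unsure, ask questions early.")
print(f"Grade level: {flesch_kincaid_grade(sample):.1f}")
```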
And what I have observed over about 10 years of doing this is that the industry benchmark tends to be important the first time. But as companies do this the second time and the third time and the fourth time, I've even seen the industry benchmark sort of decrease in its relevance, because at this point the company is competing with itself. How do we improve versus last year and the year before that and the year before that? And that's where they're setting targets: we want to increase people's willingness to speak up by X points; how do we do that? As opposed to, well, here's the benchmark, how do we shoot for it? So that's just been an interesting trend that I've observed: the benchmark is so helpful for setting that baseline, but the external benchmark can become less useful. The internal benchmark is always useful, but the external benchmark can become less relevant. I don't want to say less useful, but less relevant, less informing of our goals as we go on making investments in certain areas.
Derek Clune: Yeah, right. The benchmark becomes a trend internally, and so you have the trend analysis, and the tool that we're building is really helping administrators identify those trends. You mentioned context is key, and you're so right there. And in talking with some of our account executives, they're very excited to get their hands on this data and share it with their partners so that they can make better-informed decisions around our recommendations as an organization to our partners. We know that these ethics and compliance professionals are busy; they are juggling multiple jobs, if you will, at once. And so our goal is to make their lives easier and, again, to prescribe best-in-class initiatives and actions that they can take. And so I'm super excited to be able to take those benchmarks, take this data, and within the context of the organization in a specific environment, be able to consult and add value.
Emily Miner: Yeah. So Derek, to close us out, let's talk about the future. Let's talk about what's next. What is on the roadmap for Catalyst Reveal? We're launching in October with a lot of great features and functionality and the ability to reveal insights to our partners. What comes after that? What are we adding on?
Derek Clune: Yeah, so ending on a high note here, this gets me very excited. So I mentioned that Reveal is about revealing insights and providing actionable data, and I've touched on the prescriptive aspect of having the tool work for the professional. And that's really what we're going to be focusing on in the next year, and out into two, three years into the future. What this looks like in practice is taking all of the data from the entire LRN product ecosystem and bringing it into this single source of truth. If you think of a spider web, we can pull from different parts of the web, whether that's a high-risk disclosure, or a knowledge check score, or an unsigned policy, or, if we're bringing in hotline information from a partner, a hotline call. We want to have this network of an ecosystem that we can pull from in different places, into a single source of truth, to provide that data.
And then the next step is to prescribe specific action: okay, this is what we've identified within the tool, and the tool suggests that you should do this. A lot of the capabilities around that involve AI and ML, artificial intelligence and machine learning. There are four key aspects that we're looking at in 2023. The first will be natural language processing and search. So, similar to Google, you could go into this tool and type a full-sentence question, and the tool will provide you with the answer. What's the average knowledge check score for my harassment course? It's going to populate that answer for you instead of you clicking through to get to it. That provides quick answers to the administrator, or maybe even the senior leader or the chief ethics and compliance officer who's walking into a meeting with the board and needs to know that information quickly. So that's one aspect.
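As a toy illustration of the natural-language search idea, and not LRN's actual implementation, here is a naive keyword-matching router that maps a question to a stored metric; a production system would use a trained language model rather than string matching. The metric names and values are invented.

```python
# Hypothetical metric store; not the Catalyst Reveal API
METRICS = {
    ("average", "knowledge check", "harassment"): 84.2,
    ("completion rate", "harassment"): 0.91,
}

def answer(question: str) -> str:
    """Naive natural-language search: match known keywords in the question."""
    q = question.lower()
    best = None
    for keywords, value in METRICS.items():
        hits = sum(1 for k in keywords if k in q)
        if best is None or hits > best[0]:
            best = (hits, keywords, value)
    if best and best[0] > 0:
        return f"{' / '.join(best[1])}: {best[2]}"
    return "No matching metric found."

print(answer("What's the average knowledge check score for my harassment course?"))
```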
The second aspect would be auto notifications around the data, giving administrators the ability to program the system so it works for them. Meaning, if I know the average score for a course is 70% and someone's scoring 30%, I need to know that. I want to get an automatic notification that lets me know this business unit, this location, or a gift of this amount was given; I need to know that information. So being able to have those automatic notifications and have the tool work for you is another aspect, as sketched below. Two more aspects that we're working on are around auto narratives, so that the tool, again, works for you and prescribes action. Based on the data, here are the high-risk topics that we see in your organization; based on this benchmark within this industry, here are the trainings that most people are rolling out.
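A minimal sketch of the threshold-based auto notification just described, assuming a simple list of score records; the record fields, names, and thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ScoreRecord:
    learner: str
    business_unit: str
    course: str
    score: float  # percent

def low_score_alerts(records, course_average, margin=25.0):
    """Flag learners scoring far below the course average (e.g. 30% vs 70%)."""
    threshold = course_average - margin
    return [
        f"ALERT: {r.learner} ({r.business_unit}) scored {r.score:.0f}% "
        f"on {r.course}, below the {threshold:.0f}% threshold"
        for r in records
        if r.score < threshold
    ]

records = [
    ScoreRecord("A. Lee", "Sales", "Harassment", 30.0),
    ScoreRecord("B. Cruz", "Finance", "Harassment", 82.0),
]
for alert in low_score_alerts(records, course_average=70.0):
    print(alert)  # a real system would trigger an email or dashboard flag here
```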
So having auto narratives around the data that change based on your filters or on the data that's coming in. And then the last piece is going to be forecasting. Forecasting will allow us to do some predictive analytics in terms of where we want the program to be in the future. So if we touch back on the campaign data, we can see that, okay, out of the 10 reminders you've rolled out five, and you're at 30% of the way to 100% completion. Here's the forecast, here's the trajectory that we expect you to reach by your 10th reminder. And so we'll be able to forecast different components of the ethics and compliance program. All of these capabilities go back to those two points that I've [inaudible 00:34:16] on: providing the actionable data and having the tool work for you, and then the prescriptive part of what you should do. We've identified this; now what do we suggest that you do?
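To illustrate the forecasting idea in its simplest form, here is a sketch that fits a straight line to completion rate per reminder sent and extrapolates to the tenth reminder. The data points are invented, and a real model would account for saturation as completion approaches 100%.

```python
# Hypothetical campaign history: completion fraction after each reminder sent
reminders = [1, 2, 3, 4, 5]
completion = [0.08, 0.14, 0.21, 0.26, 0.30]

# Ordinary least-squares fit of a line: completion = slope * reminder + intercept
n = len(reminders)
mean_x = sum(reminders) / n
mean_y = sum(completion) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(reminders, completion)) \
        / sum((x - mean_x) ** 2 for x in reminders)
intercept = mean_y - slope * mean_x

# Extrapolate to reminder 10, capping at 100% completion
forecast = min(1.0, slope * 10 + intercept)
print(f"Forecast completion after 10 reminders: {forecast:.0%}")
```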
And so those are all key initiatives that we have for 2023, with the overarching idea of making this tool as self-service as possible. We want admins to be able to go in and do all of this on their own. Obviously they can rely on LRN if they need to, but we want to give them the power to do everything themselves.
Emily Miner: Wow. Derek, I'm struck by how valuable these types of insights are going to be for our partners and ultimately their organizations. Because what is this in service of? This is in service of helping employees around the world know what the right thing to do is in any given situation, know how to behave in alignment with their company's values and their code, and inspiring principled performance. It's so exciting to hear about the future. And 2023 is not that far away; sadly, this year has flown by, but wow, I can't wait to hear more. Thank you so much, Derek, for coming on and sharing your insight with us, and sharing these exciting updates for our company and for all the companies that we have the honor of working with. I look forward to coming back and speaking with you in, what is it, about eight months or so, so we can hear the answers to some of these questions we asked. Thank you so much, Derek.
Derek Clune: Yeah, likewise. I look forward to it, Emily.
Emily Miner: All right. Well, my name is Emily Miner, and I want to thank you all for tuning in to the Principled Podcast by LRN.
Outro: We hope you enjoyed this episode. The Principled Podcast is brought to you by LRN. At LRN, our mission is to inspire principled performance in global organizations by helping them foster winning ethical cultures rooted in sustainable values. Please visit us at lrn.com to learn more. And if you enjoyed this episode, subscribe to our podcast on Apple Podcasts, Stitcher, Google Podcasts, or wherever you listen, and don't forget to leave us a review.