In this episode of the Learning at Large podcast, I talk to Paul Goundry, Head of Learning at Utility Warehouse. We talked about what CLOs can learn from product designers, how to show the value of L&D by linking it to business impact, and Paul shared a number of resources and insights that have helped him develop his quite unique perspective on learning at scale.
About Utility Warehouse
Utility Warehouse is a multi-utility provider based in London, England. Paul and his team support over 45,000 salespeople with personal learning programs, adopting a strong design approach and a focus on tangible learning outcomes.
Listen to this episode now
Or search for “Learning at Large” on your podcast player of choice.
Simon: So, Paul, just to get started then—can you tell us a bit about yourself, what you’re doing at Utility Warehouse, and your role there?
Paul: Yes, so I am responsible for learning at Utility Warehouse. We’re a utilities company, but we operate in a very different way; whereas a traditional utility or telecoms company will market through traditional advertising, we actually sell through word of mouth. So, we have thousands of independent people who, alongside whatever else they’re doing in life—whether that be another job or being a full-time parent—recommend our services to people. My role is to essentially understand that group of people and find ways to best support them, whether that be online or through other forms of learning and development.
Simon: And how big is that group of people?
Paul: In total, we have about 45,000 of these independent salespeople.
Simon: Paul, I know I’ve always admired the way you innovate in learning and do that at the scale you’re working with up there to support so many people. Could you pick out an example of a project you’re particularly proud of, or initiative you’ve taken in your career that’s delivered that impact to that audience?
Paul: Within our distribution network, people grow teams. But they might be a hairdresser during the day, and in the evening they’ve got a large team to run. Well, they’re not experienced leaders, they’re not experienced managers, they’re not experienced trainers. But for me, one of the most exciting projects I’m working on is how we use Elucidat, how we use our training platform to provide what we’re calling home training toolkits. For example, one of the projects we’re looking at is creating a board game—we’re actually working on this right now—and this includes a garden version with inflatable dice. This person will facilitate a group activity where they roll the dice, they face challenges, they have to overcome those challenges—and it’s essentially taking some of the principles of scenario design into board game design, and then into creating a sort of framework that an inexperienced person can deliver as an actually quite sophisticated, facilitated training experience with their team.
Simon: You’ve kind of mentioned that you are empowering people to become trainers on a much bigger scale. What’s the balance of face-to-face training versus digital training that you do and see as the future of what you’re doing?
Paul: It’s interesting—if you imagine one of our distributors, who might be anywhere in the country, we try to make our training as available as possible. We deliver training in about 60 venues across the country, we deliver about 100 training events a month, so we do make it available. But the truth is that people need access to learn at a time that is suitable to them, or to refresh their memory. So, there’s a real balance—I would almost say 50-50—between providing options to learn in training sessions as well as online. What we’re doing through these home training toolkits is beaming experts into someone’s lounge—experts who can’t actually be there in the room, and who can’t even be live on Skype or Zoom calls in everybody’s lounge—but we can bring them into our film studio, capture best practice, sequence that in scenario designs, and then facilitate it through activities led by a team leader or manager. Then, we’re getting the best of both worlds.
Simon: I think your take on blended learning is kind of like a 3-D version of it, isn’t it? And social learning as well—that’s kind of coming back into what you’re doing there. In a kind of real, tangible basis. Have you developed your own kind of methodologies, Paul, as you’ve gone through your career and worked in these organizations?
Paul: I’m a big believer in performance consulting as a way of trying to understand business problems and trying to come up with interesting solutions for those problems. So, in a way, I’m not really a believer in having set training methodologies; but instead, to an extent, always starting with a blank canvas. Saying, “Do we actually understand the genuine problem?” and “Do we genuinely understand the system in which that problem exists?” and “Do we genuinely understand what it takes to bridge the gap to solve that problem?”
And when you can answer “yes” to all of the above, the solutions actually fall out of it. I believe a problem that exists across other organizations is that central training functions operate almost independently. You send your employees off to a training function, and then they come back to their desk and they get on with their job. In most cases, rather shockingly, their manager actually has no clue what they’ve been off learning and in no way supports it. To a large extent, it’s a completely redundant exercise. So, one of two things happens. Either it’s not embedded into their actual role or their actual skills because no one even knows what they’ve learned, or, worse—it’s completely contradicted because their managers have got their own ways of doing things, which the training isn’t supporting.
This idea, for me, has really fallen out of that recognition. If we think of the 70:20:10 model, it suggests that, of course, most learning happens on the job—and my realization through working with the teams at Utility Warehouse is that our job should surely be to better enable and equip managers to effectively support, lead, and develop their teams.
Simon: What kind of skills do you think a chief learning officer needs, because I think you approach it from a kind of design point of view, don’t you?
Paul: Well, I think first and foremost, for me—and it’s a subject I’ve already mentioned—it’s a subject I’ve studied and since taught to people in various organizations. It’s performance consulting, and I’ve been doing that for now about 15 years.
I originally trained as a product designer, and I remember reading a book called Design for the Real World, which was written in the 70s by a product designer called Victor Papanek. That book really hammered home that there is a social responsibility on anyone trying to solve a problem. So, we work in learning and development—but the truth is, all we’re really trying to do is understand and solve problems and add value. My training as a product designer taught me that there are processes, there are ways to look at the world, that help us better understand problems. And then, in the last decade, learning more and more about performance consulting really helped to put a kind of business focus on that. So, how do we understand business problems?
I think chief learning officers, people working in learning development, would really benefit from being able to see themselves as product designers, as people who are actually waking up in the morning to solve problems.
A mistake that we sometimes make in learning and development is assuming that the problem we’re trying to resolve is a capability problem. We’re looking at knowledge and skills, and therefore the solution is automatically training. Well actually, true problem solving is more intelligent than that. It says, well, there are all sorts of things that affect performance and capability, and a nice framework that I tend to use to help keep my mind open is an NLP framework called logical levels. It encourages you to look right from the top: what is this individual’s purpose? If they don’t wake up in the morning wanting to do the role they’re in, or believing they are the right person to do it, it doesn’t matter how many skills they’ve got—they’re never going to do it. Then it goes right down through their mindset and beliefs, but also the environment. I mean, it’s amazing how many times we try to solve environmental problems through training and development.
Simon: For a rookie chief learning officer, that responsibility to look at an audience of 45,000 people in that way could feel quite daunting. Where would you start from that, to pull out those people that represent the high performers, the nonperformers? How do you make that task simpler and break it down for yourself?
Paul: It’s actually much simpler than it probably sounds, and I think in a way, it’s because what we all need to do is take one step at a time. Solving problems and adding value is an iterative process. So, another very useful principle is to learn from techie colleagues and the whole world of agile software development, that what we’re always doing is trying—to the best of our ability—to understand a problem, come up with a solution that we think is going to add value, and then get into testing that solution as quickly as possible. And be prepared to constantly receive data and information that tells you whether you’ve got it right or wrong.
Don’t try and solve the problem 100 percent. Try and understand enough of the problem to make one step forward, then measure where you are, re-evaluate, and take the next step. First of all, ask, “What are we trying to achieve as a business?” Then identify the high performers. In my world, for example, a lot of that is sales performance, and identifying who our top salespeople are. Do we understand what they’re doing, how they’re doing it, why they’re doing it, the environment in which they’re doing it?
So, I will then usually convene or bring together typically anywhere between 12 and 18 of our high performers and run a one- or even sometimes two-day workshop to fully explore what it is they’re doing, and map out what is leading to their success. And again, as I mentioned before, using logical levels as the framework for that to really look at all aspects of what they’re doing—not just their knowledge and skills. Then, following performance consulting methodology, do a gap analysis to try and define what’s different about what they’re doing to the people that aren’t necessarily doing it, which will often then include doing workshops, interviews, etc. with people that aren’t performing quite so well to try and describe where they’re at.
Simon: Something you have said before is around personality types, to cater for them and that kind of style of learning, or delivering training.
Paul: Yeah, it’s interesting with learning styles, because of course we need to understand how people learn. But it’s also often necessary and valuable to challenge people’s learning styles—to challenge people’s personality types—because people will often default to something that they believe is their preference, their comfort zone. That isn’t necessarily helpful for performance. We see a far greater shift in someone’s capability when they’re challenged, because what ultimately happens is they learn they are actually more resilient, more capable, than they thought they were. And they’re able to stretch into new behaviors that they’ve otherwise held themselves back from trying.
Simon: And that leads me on to a question I was going to ask you about ROI and how you measure the effect you’re having. And, of course, you’re using logical levels framework to support 45,000 people, to understand what’s going to really support them effectively. How do you then play that back to look at how those kind of rapid tests are working, and to iterate what you’ve got? And how does the way you report on that relate to how you might report ROI on a business level?
Paul: I was very keen when I came to Utility Warehouse to be considered and be measured on the business metrics that were already being measured. I’m not a big believer in training and development metrics—measuring the so-called effectiveness of training. Of course, we’ve got all sorts of different frameworks, but let’s use Kirkpatrick’s as an example.
Simon: I’m just dropping in here to quickly explain what the Kirkpatrick model is. The Kirkpatrick model, first published by Donald Kirkpatrick in 1959, is one of the most commonly used methods to evaluate the effectiveness of learning solutions. The model is composed of four levels: reaction, learning, behavior, and results. For more information about the model, look up the Kirkpatrick model on the Elucidat website.
Paul: In my mind, the only metric in the four levels of Kirkpatrick’s evaluation that is of any interest to anybody, frankly, is level 4—genuine, real-time results. So everything we do is measured at level 4, and then, if we need to, we’ll work back through the other levels to say, “Okay, do we fully understand why?” The aim is to really speed up that evaluation period and not get bogged down in lots of lengthy traditional training and development evaluation processes, which, in my mind, rarely have value in truth.
Simon: How do you break down what’s working and what isn’t working, when you get those successful results through sales?
Paul: Now, this might be quite controversial, and some people listening might think I’m completely barking mad—but I’m of the opinion… and I’m quite fortunate to work with an organization that is also of the opinion… that we shouldn’t try and kid ourselves about data and stats and what we can learn from them. There’s an interesting book by Daniel Kahneman called Thinking, Fast and Slow, and in that book, he essentially distinguishes between our tendency to make generalizations in life and in business versus being genuinely analytical.
One of the things he advises us to be very careful of is to not believe that we’re being genuinely analytical—we’ve genuinely got data—when in truth, all we’re doing is making generalizations. So, as an example, one of the recent changes we made last year was the main presentation of how our salespeople present what we do as a business to prospects. We realized through evaluations that we had to make the presentation simpler. So, we did that, and we then measured results and we saw a shift. But at the same time, all sorts of other changes were made in the business.
You know, we do a lot of incentives and holiday promotions. And there were changes in the energy markets, etc. The idea that we could say with absolute certainty that the shift in sales performance came directly from the changes we made was impossible when so many factors affect performance. So instead, we ran Brinkerhoff’s success case methodology with about 40 of our senior leaders. We brought them into a session, and without any real direction—we weren’t in any way trying to lead it—we worked through a series of activities to explore what was having the greatest positive effect on their business. At the end of the day, we were judging whether or not we had reasonable evidence and reasonable justification to say that yes, the change had been successful.
Simon: So it’s a much more holistic view, then, of performance improvement and measurement in the organization.
Paul: It’s really frustrating to me in organizations where the only metric they’re interested in is how many days of training have been delivered. There’s an assumption that the day’s training delivered value in some way—no further discussion at all, no further investigation. Really, what it suggests is that training and development is a separate entity; it delivers something, and we assume what it delivers is valuable. And I’ve seen other organizations where the studies aren’t actually asking, “Is training valuable? Can we see value in the training we’re delivering?” but instead set out to prove that training is valuable.
So they’re very selective over the data they choose—it’s literally pulling the wool over the business people’s eyes. There’s all sorts of really peculiar behavior going on, which isn’t good for business and isn’t good for training and development. I genuinely believe the only way we solve this is by no longer treating training and development as a separate organization within a business, but as a partner—genuinely working with, consulting with, reflecting with, evolving with the business.
Simon: How can we—where does that shift need to happen, do you think? Is it from the business, or is it from L&D functions around the world stepping up and being more accountable for what they’re delivering on?
Paul: Partly, it’s in the hands of learning and development professionals to recognize that your organization is at risk of becoming completely redundant if you’re not genuinely adding value, and the only way you can be genuinely adding value is if you’ve got the right contract with your organization. But also, it’s on the wider organization to see that when you’ve put a load of training and development professionals in a room, and they’re good at solving problems, then actually they’re an incredibly valuable asset.
I’m kind of passionate about this—if a learning and development team has essentially just delivered training… often they’re delivering training because someone in the operation has thrown a requirement over the fence; they’ve picked it up, delivered it, and moved on to the next one. There’s very little dialogue—very little actual problem solving—just delivering stuff and moving on. Then why would the business see them as a genuinely valuable asset in driving business performance and solving problems?
Simon: You can see the effects of what you’re doing through sales, but many organizations are working with, say, compliance-based topics. How could they apply a similar style to yours—working with the business and solving problems, rather than just delivering hours and hours of so-called training?
Paul: In that scenario with compliance training, what we’re talking about is behavior—exactly the same principles apply. It’s how much do we understand about the behaviors that are actually taking place in our organization, because in order to be compliant, we want people to behave in the right way. And we often assume that—really, all we’re actually doing with compliance training is trying to cover our backs against some kind of regulator. At some point, they might step in and say there’s evidence for the wrong behaviors that have been demonstrated by your employees, and therefore you’re in trouble. And businesses think they can roll out a spreadsheet proving how people sat through some compliance training and that is an acceptable level of effort on their part, and that they’ll get away with it.
And I think there are precedents now in court where that isn’t acceptable—it’s not seen as a reasonable effort by businesses simply to put people through training and assume they’ve done what they need to do. So, I think the old mentality of just rolling out training to tick the compliance box is risky. But it also doesn’t help if you’ve still got behaviors in your organization that you don’t fully understand—what’s motivating and driving those behaviors—and therefore you’re not really doing anything to resolve them.
Simon: Well said. I think it’s also about the use of someone’s time, given the scale of some of these programs being rolled out. Every second counts for productivity, and if we’re not using that time well, there’s just no point to the exercise.
Paul: Yeah, and I think what this ultimately leads to—in the world of compliance—is that we as business people need to step up and invest properly in resolving some of these behavioral problems. If we simply invest in a few days of learning design to throw together some learning modules, and then a few days chasing and hounding people—it could be weeks of time, of course, it could be much more—then, given the number of people whose behavior we’re often trying to affect, it’s a tiny investment, and frankly, it’s not enough. Really, it comes back to the same approach—the same as at Utility Warehouse, the same as in agile software development, the same as in genuine product design: genuinely solving problems. You’ve got to invest time, mostly in discussions, taking people on a journey. And over a period of months you, together, explore, test, and solve the problem you’re trying to solve and move toward the behaviors you’re trying to see.
Simon: Totally agree. I’m looking forward to that future being a reality. If we can have that big of an impact to make sure everyone is supported properly in the way you’re doing, I think we’re on to a really good thing. And, you know, shifting the compliance to be a personal responsibility for doing the right thing—not just a box-ticking exercise to get people off the hook.
Simon: Thank you very much, Paul—that was a really enjoyable conversation, and I’ve learned a lot from it myself. Loads to take away, and thank you for sharing so much.
Paul: When you asked me to do it, I wasn’t quite sure what I was going to talk about—and then you haven’t been able to shut me up! So that’s great.
Simon: I want to say thanks again to Paul for sharing so much today, and to say that we really want to hear your thoughts on today’s chat too.
Join the conversation!
You can join us on Twitter @Learn_at_Large to keep the conversation going using the hashtag #LearningatLarge, or email me at email@example.com with thoughts, suggestions, and questions. We’d love to hear from you so don’t hesitate to get in touch.
And finally, don’t forget to subscribe to “Learning at Large” in your favorite podcast app and subscribe to our newsletter for updates on new episodes.