Today, I’m joined by Jon Kaplan of Corvantus Consulting. In this week’s episode, we take a deep dive into the skills that companies must develop in order to keep up with change, why digital learning itself needs a change of pace to keep up with the learner, and how you can use xAPI to build new insights into performance.
Prior to working as a consultant, Jon was CLO of Discover, managing a team of over 200 to build the skills of a workforce of over 17,000 people. Jon’s background as a teacher shines through in his corporate career, where he rolled out a game-changing education assistance program that has now helped over 500 employees gain a bachelor’s degree.
Listen to this episode now
Subscribe to Learning at Large:
Or search for “Learning at Large” on your podcast player of choice.
Simon: [1:40] So, Jon, just to get us started, can you just introduce a bit about yourself and your background in L&D?
Jon: [1:46] Sure. So, I’m Jon Kaplan; I’ve spent my entire career, 25 years plus, in the learning industry. I started out as a high school teacher—I thought I was going to go into academia, and ended up hating the whole pursuit of more and more specific and less and less usable knowledge. I ended up leaving to go and teach economics and government in inner-city Oakland. [2:17] I taught high school seniors, and I bounced around a bit and went to a couple of different schools, always teaching in social studies. I ended up opening a charter school with Arthur Andersen executives—they came into my school district and they put in a bunch of money to create the school of the future. I had a fabulous experience, and really learned a lot, but decided that I had greater ambitions to impact a larger number of people and that the school was probably something that wasn’t scalable.
[2:50] So, I decided to go into the corporate learning sector and I went into a technology company and I wrote training manuals for software developers. I bounced around a little bit, and clawed my way up the corporate hierarchy. One day, a recruiter called me from Discover Financial Services, and I went to Discover to run all of their learning programs for their call centers. That ended up working out pretty well, and six years ago, they made me the Chief Learning Officer. [3:18] I had a wonderful time running all the learning for Discover, and then one day I realized that there were probably more opportunities to impact more people outside of Discover than inside Discover. So, I left Discover and I founded my own consulting company that is focused on upskilling and reskilling people who are subject to displacement because of AI, automation, advanced analytics, and the digitization of work.
[3:45] So, that’s what I’m doing now.
Simon: [3:47] Brilliant, thank you. So, you’ve taken an interesting route, haven’t you? You’ve gone from teaching a class—you’re passionate about teaching and the effect that learning can have—and then you’ve gone into an organization, from a classroom to 17,000-plus people. Is teaching a class the same as teaching an organization?
Jon: [4:06] That’s an interesting question. I think in a classroom, you think of yourself as a facilitator of learning. You know, if you’re a good teacher, you probably shouldn’t be thinking that everything you say is what people should be learning—you should be thinking about architecting experiences from which people can learn. [4:28] I think throughout my career, I’ve focused on that. After a while, when you gain more responsibility and you have a big organization working for you, you think about how you can architect an organization that creates these learning experiences. So, you become one step removed, but it’s still all about creating those experiences that can impact people and create greater capabilities when they’re done than they had when they started.
Simon: [4:58] As you say, kind of one step removed—so how are you checking that that impact is resonating, that you’re having the same effect that you’d be having in a classroom, at scale?
Jon: [5:06] That’s a really good question, and maybe that’s the biggest challenge. Right now in my consulting practice—I actually just thought about this—as a consultant, you’re actually two steps removed, because you’re working with learning organizations that are trying to create those experiences. So, with each step away from the learners, it’s more difficult to assess the impact you’re having. [5:30] When you’re a teacher in a classroom, you get a really good sense of how much people are learning just from that authentic experience you have with individuals. I mean, you see them, you talk to them… I think a lot of being a teacher is constantly checking understanding. It becomes more difficult when you’re running an organization and you have people who are doing that, but how do they articulate how people are learning and what the impact is?
[6:03] Maybe you have other stakeholders who are looking for not only whether or not people learn, but whether or not they could apply the learning in ways that advance the business goals of the organization. One of the reasons why I came to Discover was because in call center training, you have an opportunity to really assess whether or not people have made gains from their learning experiences. So, in a call center environment—call centers are one of the most measured parts of the economy, I think—people really measure exactly how productive humans are, what they’re doing, what they’re not doing. [6:46] You know, calls are recorded, there’s advanced analytics—everything is measured about the calls. In most call centers, the customer behavior after that customer interaction is measured. So, you can tie the customer’s behavior to the employee, both pre- and post-learning experience.
[7:03] So, we got pretty good at assessing how effective our training was. For example, most of our work was in new-hire training. We’d hire people—they would come from all walks of life, but with relatively little professional experience—who had a high school degree, maybe a little bit of college. Very few people with college degrees would join the call centers. And they had very little awareness of the financial products they would be supporting. I mean, this was the first time they’d dealt with credit cards or deposit products or personal loans or student loans. [7:38] So, we would put them in a new-hire training class. It would last somewhere between three weeks at the minimum and, I think, eight weeks for the longest programs.
But what we found is that our employees graduating from these programs, if you measured their performance in the first month after they graduated, were performing somewhere between 95 percent and about 105 percent of the performance of their tenured peers—people who had been in the business for a while—measured on a bunch of business metrics. [8:14] So, you measure things like the length of the call and the engagement of the customer—there are a lot of different things you measure. You create a basket of metrics, you take a weighted average of them, and you compare experienced people to people just graduating. They really were performing almost as well—95 to 105 percent. So, these were enormously effective learning programs. That’s something we were really proud of.
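As a rough sketch of the comparison Jon describes, a weighted basket of call-center metrics might look like this. The metric names, weights, and scores here are hypothetical stand-ins, not Discover’s actual measures; each score is normalized so that higher is better.

```python
# Compare new-hire graduates to tenured agents on a weighted basket of
# metrics. All names and numbers below are illustrative placeholders.

def weighted_score(scores, weights):
    """Weighted average of normalized metric scores."""
    total_weight = sum(weights.values())
    return sum(scores[metric] * w for metric, w in weights.items()) / total_weight

# Hypothetical basket: weights reflect how much each metric matters.
weights = {"handle_time": 0.40, "customer_engagement": 0.35, "resolution_rate": 0.25}

tenured = {"handle_time": 0.82, "customer_engagement": 0.78, "resolution_rate": 0.90}
new_hire = {"handle_time": 0.80, "customer_engagement": 0.75, "resolution_rate": 0.88}

# Ratio of new-hire performance to tenured performance (1.0 = parity).
ratio = weighted_score(new_hire, weights) / weighted_score(tenured, weights)
print(f"New hires at {ratio:.0%} of tenured performance")
```

With these made-up numbers the ratio lands around 97 percent, inside the 95-to-105-percent band Jon cites.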
[8:47] It’s harder to measure performance on things like leadership and executive development; you have to have proxies for it, because that data is difficult to find and there’s a really long tail to it. But I think that most learning leaders are looking for how to measure the effectiveness of their learning.
Simon: [8:54] That’s really interesting. I know from your background that you’ve had some really great successes through skill-building, and I’d love to ask you a bit more about that as we go, but also about investing in educational assistance programs. I was keen to ask you about the balance between upskilling people to do their jobs and formal education’s place in the workforce. Do you think degrees are still necessary in such a fast-changing world? Or is skill-building, and a blueprint for supporting it across an organization, the more important thing for organizations to invest in?
Jon: [9:28] It depends. I think if you really understand an employee—what role they’re currently in, what their skill levels are—and you have a very good sense about what the skills and what the behaviors and what the tasks of their future job is going to be, you don’t need to waste your time with a degree. You don’t need to waste your time with anything more than very focused training on the new job role. For technical jobs, that’s very easy to do.
For example, AT&T, I believe, has a very big reskilling program under way. [10:06] A lot of what they’re doing is reskilling people from fourth-generation wireless to fifth-generation. Because they’re implementing the infrastructure themselves, they have a really good understanding of what it takes when a technician moves from old technology to new technology. Because they can map out exactly what the job is, it’s actually a relatively straightforward thing to retrain someone.
[10:37] For most jobs in the economy—well, I won’t say most jobs, but most of the jobs that I’ve spent my time around, it’s actually harder to know what the new role is going to be in eighteen months or two years or three years. It’s just, things are moving fast enough that a lot of rollouts are a little bit more iterative. So, jobs change, but you don’t really fully understand how jobs are changing until you get employees in the jobs who are doing the jobs. And then, oftentimes, companies do that and they find out, “Well, these employees really aren’t successful in these jobs. What we’d rather do is have a layoff. We’ll lay off these employees and go hire new employees and now that we know what they’re supposed to do, we can define the job descriptions well; we can hire specifically for the need.”
[11:26] I think there’s an opportunity for companies to be more foresightful and more proactive in trying to understand what the jobs are like and having more effective reskilling programs. And to the extent that they can’t have those targeted reskilling programs, I think putting in place a university program, some sort of upskilling—and I define upskilling as just trying to increase the base platform of skills that are generic and transferable across multiple jobs—I think these upskilling programs are really helpful, because they help people develop the learning skills, develop the collaboration skills, develop the problem-solving and critical thinking skills, that are broadly applicable. [12:13] So what you do is if you don’t really know how the jobs are changing, you can have more generic programs in place that give people a really good chance to learn new jobs much more quickly than they would if they were coming with little academic background.
Simon: [12:30] Really interesting. So that’s the way for organizations to future-proof their workforce—to make them equipped and ready for change, and able to absorb information quickly.
Jon: [12:43] And I don’t think we fully understand why degree programs do that. If we did, we probably could create academies and we could create some other way of getting those skills to people, but I think it’s a combination of the grit and resilience around a learning program—really focused learning, where you have to actually sit down and you have to focus and have to learn something new. The content has something to do with it, but I think the content is less important than the process of learning. I think that’s probably what a degree program does more than anything else; it creates a rhythm in which learners have to learn. [13:28] That exercise, it’s like just building a muscle—learning to learn skills, I think, is incredibly important. And I think a degree program actually builds that muscle of learning how to learn.
Simon: [13:38] Interesting—learning to learn; upskilling trumps reskilling in a future-proofing exercise.
Jon: [13:44] It’s very difficult for companies to really understand how job roles are changing with enough specificity to put in targeted reskilling programs. I mean, I think there are some—and AT&T is probably a good example—but it’s probably a minority of job role changes where the company really understands how the jobs are going to change.
Simon: [14:04] Okay, brilliant. You told me that you’re interested in data and excited about the behavior tracking that xAPI now allows—tracking that SCORM could never handle at all. I was wondering if you could elaborate on that: the types of behaviors you can see through xAPI and what interests you the most. This is actually something I’ve been personally grappling with recently, in a learning experience platform we’re rolling out.
Jon: [14:41] Yeah. So, I think there are sort of two ways of thinking about this. Let’s talk about how we develop data and what xAPI allows us to do. xAPI has the capability of monitoring a wide range of someone’s activities on their computer that SCORM could never do. So, for example, if you’re training someone to use a computer system, you can just embed some code in the back of that system that will connect with your learning record store. They got trained; we know when they got trained; five minutes after this training, they went and they did these things. You can track that.
[15:05] Then, you can track things like how long it takes them to complete a task on that particular system. There are lots of different applications; I think we’re just at the beginning of those applications. There are some really big holes in that—maybe you’re training people in things that have nothing to do with their computer, or you’re training them how to have a good conversation with their direct report or manager. [15:25] You can’t really use xAPI to track that. You can use xAPI to track how they interact with a learning asset, or whether their attention on their computer wanders from one place to another—there are a lot of different things you can track. I think probably for xAPI, the best thing that learning technology teams can do is keep it simple: start tracking a few things that you can’t track through SCORM. Meanwhile, probably the best proxy for how effective learning is, is simply to ask the learner. Say, “Was this learning valuable? Do you intend to use this in the future? Would you recommend this?”
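To make that concrete: the records a tracked activity sends to a learning record store (LRS) are xAPI statements, which are just actor-verb-object JSON. The sketch below builds one; the actor email and activity identifier are illustrative placeholders (the verb ID is a standard ADL vocabulary entry), and posting it to a real LRS is left as a comment.

```python
# Minimal sketch of building an xAPI (Experience API) statement.
# Actor and activity values are hypothetical examples.
import json
from datetime import datetime, timezone

def build_statement(actor_email, verb_id, verb_name, activity_id, activity_name):
    """Assemble an actor-verb-object xAPI statement as a dict."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = build_statement(
    "agent@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/activities/system-training", "New system training",
)
print(json.dumps(stmt, indent=2))
# In practice, this JSON would be POSTed to the LRS's statements
# endpoint with an X-Experience-API-Version header.
```

Because the statement is free-form JSON keyed by URIs, the same shape can record “completed a task five minutes after training” just as easily as “completed a course”—which is the flexibility SCORM lacked.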
[16:00] Those smiley sheets aren’t great measures, but as a first-order approximation, they’re probably not the worst thing in the world. Really measuring whether or not people change their behavior is always incredibly expensive and time-consuming—especially if you’re doing soft skills or leadership skills, what we call professional development. [16:25] Those are things that are just difficult and time-consuming to study. That’s not to say you shouldn’t do them, but one thing you can do is run a few of those studies, correlate them with the results you get on the “would you recommend this training?” kind of questions, and see if you can use that question as a proxy for how your studies assess changes in behavior.
Simon: [16:49] Interesting, thank you. I asked you in a survey what books about digital learning are you reading right now. And I thought your response was very interesting—you said it often feels too slow, which was actually something… I was surprised by the answer. I just wanted to check what you meant by that—why do you think digital can be slow, and what can we do to make it faster?
Jon: [17:09] Yeah—so, interesting. How we consume information is always one step ahead of where learning professionals are delivering information. There’s a book I read probably 15 years ago called Everything Bad Is Good for You. It’s by a guy named Steven Johnson, and essentially his premise is that your parents always told you that watching TV would turn your brain to mush and video games are not good for you, and it turns out there’s really good data to suggest that, because all media is getting richer and more complex all the time, sifting through popular media actually makes us smarter. [17:54] In this book—it’s a wonderful book, I recommend it to everyone—there’s a thought experiment… maybe it’s not an experiment, it’s more of a challenge.
[18:04] The author says, “Try and go back and watch a full rerun of Happy Days,” which was a sitcom in the 70s. It’s insufferable—you can’t do it, because it is simplistic and it’s slow. He compares that TV show, just the plot dynamics, how slowly the plot evolves, and how overdramatized everything is, and how heavy-handed the dialogue is, to—and it came out years ago—he compares it to The Sopranos, in which there are many, many plotlines. They’re all completely independent but intersecting at various times where the motivations of characters are not clear, people are saying one thing and meaning another. [18:49] And you can compare that to something like Game of Thrones, which had hundreds of characters. That process actually helps hardwire your brain. Your brain just gets stronger from constantly having to think.
[19:05] Media actually is ahead of where learning is, and media is constantly getting richer, more detailed, more nuanced. And I think that’s why it often feels like learning is too slow: because it doesn’t keep up with other popular forms of media. The goal is to get more and more information to you, because that’s what drives future conversations and Internet traffic and posts online, like when people start debating the Game of Thrones episode. [19:38] They intentionally create more complexity in order to drive that conversation. That actually makes people smarter. And that’s one of the things we haven’t really figured out: creating a level of ambiguity and complexity in learning that makes it harder, that forces people to use those muscles to learn.
Simon: [19:58] That’s a great thing to take away from this, actually, Jon. That we need to make digital learning more like Game of Thrones and less like Happy Days. Thank you.
Jon, for people listening, how can they find out more about you and connect with you?
Simon: Brilliant, Jon. Okay, well, thank you so much for your time today. It’s been really great to kind of go through some of these new ideas about learning at scale and learning integrity as well. Thank you very much and have a great day.
Jon: Alright, Simon, thank you very much.
Thanks again to Jon Kaplan for sharing so much. It’s been a fantastic episode.
Join the conversation!
We’d love to hear your thoughts on today’s podcast, so feel free to get in touch on Twitter @learningatlarge with any questions or queries. You can also email me at email@example.com. As always, don’t forget to subscribe to Learning at Large in your favorite podcast app and leave us a 5-star rating if you enjoyed it. Thank you for joining us, and see you next time.