Mind the Skills Gap

Evidence-Informed Learning Design

Stellar Labs

Mirjam Neelen and Stella Collins discuss the best ways to change behaviour, improve performance and increase return on investment to make learning more effective and enjoyable, as well as how to ensure employees are equipped with the skills they need for today's challenging and demanding market.

Speaker 1:

Welcome to the Stellar Labs podcast: the skills of tomorrow, trained today. At Stellar Labs, we've found a way to solve the skills gap, backed by science and research, to achieve the best possible performance and return on your investment. Our AI-driven personal learning assistant and upskill platform is fuelled by neuroscience. Specialists in your organization share their knowledge effectively, so that learners are energized and engaged to master high-demand skills and capabilities. In these podcasts you'll hear from industry experts and practitioners from the world of technology and training, who share their experience, insights, inspiration and their visions for the future with you. Keep listening to start your future learning here today. Hello and welcome back to the Stellar Labs podcast. I am really pleased today because I have a guest with me who I've been trying to talk to for ages; one thing led to another and it's been really hard to have this conversation. So welcome, Mirjam Neelen. I'm very happy to have you here today.

Speaker 2:

Hi, Stella. I'm happy to finally be here.

Speaker 1:

So Mirjam is currently head of global learning design and learning sciences at Novartis. She's got more than 15 years of industry experience and has worked in many, many different companies. Currently, I think you're responsible for global learning design strategy and planning, and I'm kind of curious what that really means. I think it's her mission that's the most interesting, which is to build learning design capability across organizations. She's a huge advocate for evidence-informed — I think you'd call them evidence-informed — practices in learning, and has a great blog with Paul Kirschner, 3-Star Learning Experiences. They also have a very, very good book that I recommend to everybody, Evidence-Informed Learning Design; it's on my bookshelf and I open it up regularly to check things in it. She presents her work regularly at conferences all over the world, and she's an amazing advocate for evidence-informed learning and a very strong critic of those who perhaps purport to do things that Mirjam doesn't think fit that pattern. So, Mirjam, welcome to the podcast. It's really great to have you on here today.

Speaker 2:

Thank you. I'm happy to be here.

Speaker 1:

Perhaps you can tell us a little bit, first of all, about, you know, how you got to where you are — how you got to your senior role in L&D.

Speaker 2:

Yeah. So I've always been really interested in how the brain works, as in how it processes things. So how it started was, I actually started studying Dutch language and literature, and then my master's was in psycholinguistics. So that was around, you know, how people process language. And I got really interested in aphasia, and then I thought, okay, this is great — theoretically it's all very interesting — but then I thought, okay, now I would like to work with people, you know, who suffer from aphasia or who have had a stroke or whatever. So then I had to do a bachelor's after my master's, in speech therapy, because in the Netherlands there was no other way to become a speech therapist. So that's what I did. And I never ended up working with people with aphasia. Instead...

Speaker 1:

What you might not know, Mirjam, is I also did a master's in speech therapy.<laugh>

Speaker 2:

Oh,

Speaker 1:

Really? But never ended up working with people in that field either.

Speaker 2:

No, I did work as a speech therapist for five years, but I worked with children with neurological disorders who had communication and speech and language problems because of it. So I did that for five years, and I liked a lot of it, but I didn't really like the treatment side of it. I liked the analysis and the planning and all that, but I didn't really like the therapy itself. And one of my biggest challenges at the time was that speech therapy was not very evidence-informed. So I've always had that drive — I was looking for the evidence and why we're doing things. Then I did some kind of coaching path, because I thought I wanted to be an academic at that point, but all the tests that I did showed that that might not be the best choice for my career. So anyway, after long thought I ended up studying learning sciences, and I was very lucky because I was dating my now husband at the time. And he said, why don't you just quit your job? We'll go to Mexico, I'll work, you can do your master's, you know? And I was like, whoa, this guy's great.<laugh> Oh, nice.<laugh> So yeah, I did my master's, and while I was doing my master's I moved to the US and started doing internships in instructional design, because one of the professors at the university in the Netherlands — her name is Olga Firssova — recommended it. She said, if you are in the US, you need to make sure you learn authoring tools and stuff, because otherwise you can never get a job as an instructional designer. So I started to do internships while I was studying, and then slowly I moved from being an instructional designer doing a lot of e-learning stuff to more senior instructional design roles, more like leading projects. When I moved to Ireland, I got more strategic roles as a contractor. Then I worked for Google for a year, and then I worked for about three years for an applied research centre called the Learnovate Centre in Dublin. That was a really nice sweet spot for me, because I could do a lot of research, yet it had to be in a practical way and I was still working with real clients, so it was still quite practical. Then I moved to Accenture, where I was more in a learning design advisory role, overseeing multiple projects. And then after a couple of years I moved to Novartis. This is my first truly global role — as in, I've had global roles before, but this is where my remit is to change things at a global level, which is quite a challenge at a company with about 130,000 people.

Speaker 1:

It's a massive, massive organization. And throughout this journey, were you always trying to bring in this evidence-informed practice?

Speaker 2:

Yeah, yeah. I've always been interested in doing that. Although I must admit, about 10 years ago I was very bought into 70:20:10 and that stuff, and at the time I believed that adults were very good at self-directed learning. So I've had my gaps — and I'm sure I have gaps now; you never know everything, and the more you know, the more you realize you don't know. But it was mostly when I started working with Paul on the blog — we've had that blog for about six years now — I think the combination of my work at the Learnovate Centre and my work with Paul has really sparked it.

Speaker 1:

I've always been curious: how did you and Paul meet and come to write a blog together?

Speaker 2:

Our versions are different.<laugh> Paul has a different version of this than I do, but my story is that I followed him on Twitter. He was one of the professors at the Open University in the Netherlands. He never was my professor, but I knew his name and I had read Ten Steps to Complex Learning because that was part of the master's, so I was familiar with his name. So I followed him on Twitter, and at some point — this is what I remember — I said to him, why are you not tweeting and writing more in English? Because I feel your work needs to spread more widely. And he said, well, I don't have time to do that; would you like to translate my work? And I thought, well, why the hell would I do that? You know, what's in it for me? But then I thought about it and I thought, well, this guy is fantastic and I can learn loads from him. So what if I start translating his work, but at the same time I can start initiating blogs and he can give me feedback? So it kind of became this partnership. We basically Skyped one day, six years ago, and we said, will we just give it a go? And we gave it a go, and we're still going. So, okay.

Speaker 1:

That's great. Yes, you know, I read it regularly, but I've always wondered how you came upon each other. So was there something very specific that made you want to adopt an evidence-informed approach?

Speaker 2:

It was just that I noticed that people had such strong beliefs. Because I did my master's, I had quite a solid foundation in the research that is out there and its foundations — learning design theory and learning theory and all that stuff. And I just noticed that people didn't know it even existed, and that we made a lot of decisions based on what we believe or what we prefer, or just on language. I was like, something is missing here, and this is not a professional practice as such, you know.

Speaker 1:

I used to be in sessions where people would bring up the Mehrabian myth, because I was often working in the communication area, and people would bring it up as if it was absolutely, you know, a hundred percent true. And I'm not even going to repeat it, because I think there's a danger that in repeating it people will hear it again. But you only had to ask some very simple questions to make it clearly obvious that it couldn't possibly be true. The piece of research was a valid piece of research, but the extension of that research to all communication was utterly ridiculous. And yet people believed it — and still do; you still see it written. And I kind of made it my mission at one point, if I was in a session, to actually raise the question and address it, which was a very scary thing to do initially. You get kind of more used to it.

Speaker 2:

Yeah. I no longer find it scary, but I find it frustrating sometimes.

Speaker 1:

So what's the hardest thing about adopting an evidence-informed approach?

Speaker 2:

For me, it's that it just uses up a lot of my energy, you know? You just need to read a lot and compare a lot, and also be open to being challenged, which is exhausting sometimes. I see something that challenges my thinking and I'm like, oh, now I have to read that thing as well<laugh> — that's sometimes how it feels. It can just be quite overwhelming, because there is of course a lot of research out there and you can't read it all; at least, I don't have the time to do so. So, yeah, to make that selection, and also to find some focus — because sometimes I'm at risk personally that I start something on a topic and then I see something else and I start reading that, and then in the end nothing really happens with it because I'm doing too many things at the same time. Also, of course, to find a way to translate what you read into practice, and to be careful with that, because nothing works all the time — to know when it might make sense to implement it. And influencing stakeholders and peers is definitely a challenge, although I must say that more and more people seem to be open to it and interested; I do see a shift there. And then choosing your battles, which I'm terrible at.

Speaker 1:

<laugh> you want to battle everything you see?

Speaker 2:

I don't know. I was reflecting on that a while ago, but I can't really find the answer to why I seem to have a passion for battling sometimes.

Speaker 1:

So you've said there's a challenge, and very much that challenge about picking your battles — knowing when you should be challenging and knowing maybe when it's okay just to let somebody else do it, perhaps.<laugh> What's good about it? What do you find either easy or, you know, the most rewarding thing?

Speaker 2:

For me, the most rewarding thing is when, through conversations with stakeholders or peers, they kind of start to see why I'm saying what I'm saying and how that then impacts practice — when the light bulb moment happens and they go, oh, okay, yeah, so then we need to focus on this and not on that. Because often, in my experience, it is not so much that people do things wrong necessarily; it's just that they focus on the wrong angle. And then if you can have that conversation and say, okay, but let's take a step back and really think about what people need to achieve here — then what is important in that specific context? Because people in learning, at least in my experience, have a tendency to throw things in as a given: oh yeah, we need social learning. Okay, but why? Well, because it's almost like it's always better than individual learning. Well, no, it's not — it really depends, right? So we seem to have these strong beliefs about things like gamification; we focus on these things fairly quickly and we jump on them. And if I'm able to say, okay, well, let's take a step back and really think about what people need, what they need to learn, how we break that down, what components it has, and what people need for these specific components to learn that stuff — then what do we end up with? And then if they go, oh yeah, okay, that kind of changes things. Also, I recently had an experience with a stakeholder at Novartis where I was able to show them, by visualizing what learners need, how to pivot their thinking from a training focus to more of a capability-building focus, where they really think about the end point: okay, we're not training people for training's sake, we're training people so that they can change something in what they do. So to me, those are the moments where I'm like, yes, this is why I'm doing it — because in the end, yes, I'm very passionate about changing practice, but also about making sure that people have what they need to do their jobs well, or better, and have a job in the future, all that type of stuff. So when I'm successful in explaining the rationale and the why, and people go, oh, okay, I've never thought about it that way — then that makes me happy.

Speaker 1:

<laugh> Yeah, yeah. I think it's really important. I think one of the reasons L&D people jump on these new ideas is that we're quite open to new ideas, which is great, but it's like we throw away all the old ideas when a new one comes up. It's like gamification is going to fix everything. We're always looking for the holy grail, and it's kind of ridiculous to think there is a holy grail. There isn't one. There's never going to be one solution, because, as you are saying, it completely depends on who's learning, what they're learning, when they're learning, why they're learning — all those questions need to be asked. And one of the things that frustrates me is the fact there's an enormous emphasis on content and knowledge.

Speaker 2:

Yeah. Which

Speaker 1:

Is — you know, we need content and knowledge, that's fine, but it's not enough for somebody, as you say, to do their job, become better skilled, be a better colleague at work, better able to develop their organization and the people within it. And I think it's that question of learning — what is learning? And I think a lot of people don't really know what learning is.

Speaker 2:

Yeah. And I think also we're not very good at breaking this down into the different types of needs we have in organizations and what it takes. Like when we think about informal learning — super important, but what does it take? Right? Well, that takes a certain type of structure, a certain type of culture, you know, good managers who are able to support it, community, breaking silos, blah, blah, blah, all that stuff. But we have a tendency to focus mostly on — you know, 'scalable' is now praised to high heaven; we need to scale, we need to scale. Well, maybe some things, but maybe we can also just not do a lot of these things that we think we need to scale. Right? Because then you end up with — this is my opinion, and I don't have any evidence apart from how people learn, which is maybe quite enough, but...

Speaker 1:

Quite an important piece of

Speaker 2:

Yeah. Because I feel that, as you're saying, we focus a lot on content libraries and content curation, and then we give people all this content and they need to figure out which content is relevant to them and how it applies to them. I'm like, how is this useful to people? It's so overwhelming for people. And then if they don't succeed, it's their fault, because we gave them everything they needed. And, like, can we not just stop a lot of the things that we're doing and only focus on the things that really require thoughtful design — really supporting people to do their jobs better? And then the other part can be, how can we put other structures in place to support people to do the informal learning bit? I think if we were to do those two things well, we would be pretty successful and impactful.

Speaker 1:

And I think that informal learning thing is becoming such a big thing at the moment. So many organizations we speak to are saying, you know, we want self-directed learners, we want learners who can... yeah.

Speaker 2:

But people don't realize what it takes.

Speaker 1:

No, yeah — you need to know what learning is. As a self-directed learner or a self-paced learner, you need to know what learning means. You need to know that it's not just going to a content library and watching a whole pile of videos. I mean, the way I always explain it to people is: can you ride a bike simply by watching a video? And they always say, well, no. And it's the same with any other kind of skill or expertise you need at work. You can't do it simply by having the information about it. You need guidance and support and structure and frameworks.

Speaker 2:

And I think people don't realize that self-directed learning is not an activity, right? Self-directed learning is a whole front-end process: setting your goal, figuring out what your learning process needs to be, figuring out what activities you need to tackle, how to prioritize those activities, how you can find the support to do those activities well, you know, setting up your feedback structure to get the feedback that you need, tracking your progress... man, that is quite

Speaker 1:

That's a skill in itself.

Speaker 2:

A skill.

Speaker 1:

Yes, yes.<laugh> Indeed. Now, one of the things I know — we've sometimes been pitted against each other, I think, at conferences and things — is the idea that you appear to say you don't believe neuroscience has any value for L&D. I know it's more subtle than that, but I'd like you to explain a little bit more about your stance there, and perhaps, you know, what's your definition of neuroscience? Because I think that's probably where our biggest difference

Speaker 2:

Is. Yeah. So my definition is that it's the field that studies the structure and function of the brain. And I think neuroscience is a bit of an umbrella term, in the way that you can study the brain at different levels, right? Molecular — how do you pronounce that in English?

Speaker 1:

Molecular? Yeah.

Speaker 2:

Molecular, cellular — like, I don't know — and then how these parts work together. But to me it is fairly abstract, as in, yeah, studying the brain at the chemical, cellular level.

Speaker 1:

So it's that really kind of deep scientific stuff, where you need a degree in physiology or chemistry or biology or...

Speaker 2:

Neuro

Speaker 1:

Neuro or in neuroscience as well. Yes.

Speaker 2:

But those subtleties are important to me, because I remember when we were at Learning Technologies — when was it, February? — Itiel Dror was there, and he explicitly said, I am not a neuroscientist. Mm-hmm<affirmative> I am a cognitive neuroscientist. And that is a really important distinction, because that is more about, you know, the brain versus the mind — what we understand about the brain versus how we then think — and those are completely different levels. And yes, a cognitive neuroscientist then looks at both, but yeah, that's how I think about neuroscience. It's more about the brain, the

Speaker 1:

The actual brain and how it works. And for me, I'm now really interested in not just the brain, but how it connects with our bodies — so the science of how our brains and bodies are so well connected. We are not just brains floating around in jars; we're humans with many complex physiological and biological needs. But I suppose I'm still curious as to why you jump on it so much, because for a lot of people, I think it's an interesting entry point. It helps them get interested in the science of learning.

Speaker 2:

To me, this is really important, because I am interested in it as well — I think it's super fascinating. I'll give one example; I actually wrote it down. There was, imagine, a post where somebody explains — and here it comes — dopamine magic: neurons respond not only to rewards, but also to things that are novel and surprising. Okay, that's interesting. Right, now the next thing is that this person implies that this can then inform how we design for learning. And this is the type of leap that I constantly see when people talk about chemicals, neurons, transmitters, whatever, and then jump to, oh, this is so interesting, and then start to talk about behaviour — or even worse, how we then should design things. And that, to me, is not acceptable. And that's what I see. The whole dopamine thing is crazy; it's getting out of hand. I remember when I interviewed Daniel Ansari for our book, he said dopamine is just a neurotransmitter, that's all it is. It is a value-free neurotransmitter.

Speaker 1:

Yes. It's neither good nor bad. Exactly.

Speaker 2:

But that's not how the people who are so passionate about neuroscience in learning talk about it. And so I have not seen anything coming out of it where I'm like, okay — I mean, I'm sure I've overlooked things. But to me, the most problematic part, and what I really honestly don't understand, is that we have our own science. It's called the learning sciences<laugh>, and it's an interdisciplinary field, and it's focused on the questions: how do people learn? How do we need to design to help people learn? What do we need to do to support people to learn? We have our own science, it's an interdisciplinary field, and it uses things from

Speaker 1:

Neuroscience, of course. We

Speaker 2:

Have our own field. Why would we focus on another field? It's like being a data scientist and then focusing on statistics — why would you do that? I don't understand. So maybe you can answer that, because you do it.

Speaker 1:

<laugh> Well, I mean, I'm obviously very, very interested in neuroscience, but I think I define it more broadly. I'm talking about the neuroscience of learning, but I'm talking really about the science of learning. And the reason my book was called Neuroscience for Learning was because the publishing company said that would sell better. I wanted to call it the psychology of learning — the psychology of how... Having said that, you

Speaker 2:

Would and you did. Okay. Yeah,

Speaker 1:

Definitely. I think it's subtitled the psychology of learning. Having said that, I'm completely fascinated by how our brain works and understanding how that connects to the learning process, and how that connects to cognitive psychology and social psychology and all those pieces. I totally get the whole thing about how we tend to label dopamine as good and noradrenaline as maybe bad, because it makes us feel tense, but actually we can't say those things are good or bad. We need them in order to function as humans. And sometimes, you know, too much dopamine is really bad for you and too little dopamine is really not good for you, but not

Speaker 2:

In the same way. You're scared. So yeah.

Speaker 1:

Yeah, yeah. And too much dopamine — you end up being schizophrenic, you know, or you have schizophrenic tendencies. And I think the whole complexity of all those neurotransmitters and the way they work — there is so much more to learn about them that I am utterly fascinated by. But where the interest comes in for me is that I find it's quite a nice way for people to start to understand themselves, because everybody has a brain and they like to know a little bit about their brain. The challenge comes when it becomes, as you say, over-generalized — it goes from being 'this is a fact about your brain' to 'and this means X, Y or'

Speaker 2:

Z. Yeah. So I think that is a really good point. There are two points that you've made that kind of nail it for me. One is the selling — I think somehow neuroscience sells better than the learning sciences. And I don't know why, but probably because people... anyway, I don't know.

Speaker 1:

Probably because people understand it less, so it's seen as more scientific, more clever — because actually it's much more complex, or you need to be more of a specialist in it, specifically in neuroscience itself.

Speaker 2:

Yeah. It's, there's not really a way to understand it as a, as a layman, I think.

Speaker 1:

No, no, you really need to have a really strong scientific background.

Speaker 2:

I don't think there's anything wrong with the second bit that you were saying. If people find it interesting and understand a little bit better why we might behave or respond in certain ways or whatever, of course there's nothing wrong with that; that's perfectly fine. I just think that that's about the level that we're at, you know — it's more like fun facts. Mm-hmm<affirmative> type stuff. I mean, I remember one talk that you gave at Learning Technologies once where you explained why sleep is so important for learning, and that is interesting. It's interesting. But that's it for me — oh, that's interesting, but it doesn't really help us. I mean, it maybe helps a little bit in the sense that when...

Speaker 1:

That's where design can help. Yeah.

Speaker 2:

But not for our practice that much — that's what I'm trying to say. More for people themselves, to manage their own learning.

Speaker 1:

I think for me, that one in particular did influence my practice to some degree, along with a whole pile of other things. So I really recognize that if you split learning up into — I mean, there are many other reasons why you should — but if you split learning into chunks, bite-size pieces, and allow people to sleep on it before they come back to it again the next day or a couple of days later, that is more effective than trying to cram everything into their brains all at once in one day. Sleep is part of that. It's not the only reason, but I actually think it is a really good reason to start to split learning into pieces. And it's a really easy way to explain it to people: this is why we should split learning into pieces, because if you don't sleep, you really won't learn anything.

Speaker 2:

But to what extent do we need the neuroscientific explanation for that? You know, so again, to me it goes back to: okay, it's interesting — and I'm not denying I'm interested in it. I just constantly think, where should we focus our attention? And of course there is this part — and I think you are better at it than I am — about how to sell things to people so that it's easier for them to accept or buy into it. I just think a bit differently, in the sense that, okay, fine, to me that's the bucket of 'nice to know, interesting for the sake of interesting', and that's fine with me; there's nothing wrong with that. It's great — people can be interested in loads of things. And then there's the part of: okay, but what do we really need to focus on to change our practice? So I think that, to me, is the distinction that I make.

Speaker 1:

To address that specific one: for me, seeing how our brains actually function during sleep — that is the convincer for me. For years our mums and our grannies always said to us, you need to sleep, it's good for you. That was just what my granny said. But when you can actually understand what's physically going on in the brain during sleep, it's like, okay, that really is why we need to sleep. So for me, it's the convincer; it is the evidence that proves we do need to sleep. So that's a good place to perhaps be thinking about why, and it may depend on your level of scientific inquiry, or where you are in your knowledge about the brain and learning, as to what you need. For me, I need those convincers that say somebody has actually seen the transmission of memories from the hippocampus into other parts of the brain during that sleep period, which doesn't seem to happen at other times of your daily life. That, for me, is the convincer that we really do need sleep. So it probably depends where you are in your journey.

Speaker 2:

Yeah. And, you know, I also remember when I was interviewing Daniel Ansari that he was saying, well, on one hand it's okay that people use neuroscience to sell things, if it's for a good reason or a good cause. I still have problems with that, in the sense that I'm like, why do we need to call something what it's not?<laugh> We have a name for it, you know? But yeah, and that's what we do in learning a lot, I think — we want to use terms because somehow people need... I just think it says so much about our field that we

Speaker 1:

Constantly... It's another bandwagon, isn't it?

Speaker 2:

It is. We constantly are looking for these shiny things that we can then... And I remember having a conversation with somebody about worked examples, and this person said, I think you should change the term, because that's not the term that will resonate with people. I'm like, well, I'm sorry, but this is what it's called. It's been there for 40 years and it's been researched; this is what it's called. Why would I need to call it something else?

Speaker 1:

That's a really interesting question, Mirjam, and one that we debate regularly: shall we call things by their genuine scientific name, or shall we call them something that we think our clients are going to understand? And I think that's a question that many organizations have — do we use the internal jargon name or the scientific name, which isn't really jargon, you know, but it feels like jargon? Do we teach our clients that, or do we have to go with the words they're already using in order to get access? And I kind of think we sometimes need to work with the words they use in order to introduce them to the more complex stuff. Otherwise you're putting up barriers.

Speaker 2:

Yes, but at the same time, I think the fact that we are not really a professional field is one of the reasons why we have to do this. Because it also causes confusion, right? It uses up a lot of our energy. We need to really understand what language other people use and how we can then tweak our language to match — and I know that's all political and it's all important and all that. But I think it's kind of a shame, because it also causes a lot of confusion, in the sense that in one division they might call it this and in another division they might call it that. So then we need to work on alignment between divisions, because otherwise nobody understands. And it just becomes messy. Like, yeah...

Speaker 1:

That's a whole new conversation that we'll have to have at some point. Yeah, I know.

Speaker 2:

<laugh> That's more like evidence of the craziness of corporate...

Speaker 1:

...or whatever. And yeah, corporate communication. So, have you got any kind of guilty secrets about evidence-based practice — you know, something you've had to change your mind about? You've already mentioned 70:20:10, perhaps you've shifted on that. Is there anything else?

Speaker 2:

Yeah — and I constantly doubt things, by the way<laugh>. I mean, as I said, the idea that adults are good self-directed learners: I used to think that was the case. That's also one of the reasons I did kind of believe in 70:20:10, because I thought, well, just give people stuff and they'll figure it out. And that's just not true when you look at the evidence. So definitely that. Also, Paul and I recently reworked a blog about learning strategies that we had posted previously. You know, we had this distinction between effective and ineffective learning strategies, and under ineffective we had things like highlighting and summarizing — I don't remember all of them. But anyway, my point is that we rewrote it, because now we've written it as: well, it actually depends. Highlighting is not necessarily bad; it's bad if you don't have enough prior knowledge, and if you're just wildly highlighting things and then, you know, rereading and thinking, oh, I got this — because then your brain kind of...

Speaker 1:

Recognizes it.

Speaker 2:

Yes, recognizes what you've read before, and then it thinks it gets it, which is not enough to learn. So I like that type of work, in the sense that I think that's also what science is, right? It's not like we have the answer and then it never changes. I actually see that a lot lately on LinkedIn, where people say, yeah, we don't have to use the science because it constantly changes. And I'm like, dude<laugh>, that's the point.

Speaker 1:

But I think that is a challenge that scientists have with public perception — that, you know, there was this theory that existed, and maybe it was well tested, but then science does change, and that's the point of science. That's the beauty. But it can leave us, like you just said, sometimes doubting things, because you think, well, I thought I knew that five years ago, but actually the evidence has shifted, and now maybe I don't know as much about it. Or maybe there's something to...

Speaker 2:

Say, like, there is no evidence, because the evidence constantly changes, so therefore we don't have to take an evidence-informed approach, because nothing matters — you know, nobody knows. And like, well, that's...

Speaker 1:

That's not true.

Speaker 2:

That's not true, because — well, at least my very strong opinion is that you should use what you have now and what you have the strongest evidence for now, because that allows you to make better-informed decisions with as much confidence as you can that you've done your due diligence. And then of course we need to evaluate in our particular context. But otherwise it's so inefficient, right? Because then we keep going with trial and error and reinventing the wheel. Yeah.

Speaker 1:

Yeah, I totally agree with you. We use what we know now, with the idea that this may change in the future, but right now this works for us. Otherwise we wouldn't have vaccines, we wouldn't have digital technology — we wouldn't have any of those things if science hadn't helped us create them. I've told this story a number of times and it really resonated with me. I was listening to Professor Higgs, who predicted the Higgs boson, and it was when they were looking at CERN to see whether — recently they were checking, weren't they — whether the Higgs boson actually existed. And just before they announced the results, an interviewer asked him, well, what'll happen if you discover it? And he said, well, these amazing things, you know, we'll know all this. And the interviewer said, but what happens if you find there isn't one? And I think he was expecting him to say, I'll be devastated, it's my life's work. Instead he said, oh, that's even more exciting, because that opens up a whole field of things that we hadn't even considered before. So, you know, he was a true scientist who really got the idea that you research and you find out and you get to where you are, and then if the results come out differently, well, that's just as exciting and just as interesting. And I think that's where real science is important.

Speaker 2:

I remember I had that conversation with — I dunno what you call it in English — the person who was supervising my thesis. My thesis supervisor?

Speaker 1:

A supervisor. Yes

Speaker 2:

<laugh> And I had this discussion with her where — and I was completely naive that way — I said, I don't understand why it's better when a study has successful results, and why you never see publication of studies that completely failed. Is that not, you know, interesting? And she said, yeah, you would think so, but it's just not how it works. And I was so disappointed by that, because...

Speaker 1:

That's human nature, I guess, isn't it — not wanting to be seen to publish a failure.

Speaker 2:

Yeah. But to me it's like, okay, I had this hypothesis, I tested it, it turned out not to be true. Is that not just as important to know?

Speaker 1:

Absolutely.

Speaker 2:

No. Yeah,

Speaker 1:

Yeah, yeah. But scientists, unfortunately, are human too.

Speaker 2:

<laugh>

Speaker 1:

Mirjam, you work in this enormous company right now, and I sometimes imagine you sitting there in your kind of little space where you are. How are you actually able to influence evidence-informed learning across such a huge global company? What do you do? Because I think that would be really useful for lots of us — you know, what works and perhaps what doesn't work.

Speaker 2:

Yeah. I mean, it's really hard to know to what extent things really land, and to really change things at scale — so it's more of a pebble-in-the-pond type thing, I'm thinking. I can tell you what I do; I don't know how successful everything is, because it's a matter of relearning. But anyway, I'll tell you what I do. So what I do is a lot of explanation around what evidence-informed learning experience design is. And I really work with the three buckets: the stakeholders and systems bucket, our expertise, and then the science. And I think in an organization it's really important to start with certain terminology and keep repeating it, everywhere, in everything that you do, because that's the only way that things are going to stick and that people start picking up on it, and it becomes part of the language. Now, the language is only one part of it. The other thing I do is try to work out loud as much as possible — so really purposefully designing opportunities where we bring together other people involved in learning design and really work out loud and think out loud. As an example, a while ago we were piloting our learning experience platform — it's now implemented — and at the time, what I did was design a learning journey on worked examples on that platform, and then bring together a group of people who went through the learning journey. So it was a bit of a meta thing, as in, I wanted people to learn about worked examples, but I also wanted to discuss why I designed it the way I designed it. So we came together once a week, or once every two weeks, sat down for an hour, and every time we discussed one phase of the learning journey. We discussed what they learned, whether they had any questions, but then we jumped up a level and talked about the design. Because to me, the rationale behind the design is something that we need to talk more about and dig into — if we get better at that, then we are able to have these conversations with our clients. I often see that learning designers or learning professionals have conversations with their stakeholders, and it's very difficult for them to convince their stakeholders because they don't start at the right end of the spectrum with the conversation. They might start with: I think we need gamification, or I think we need social learning, or I think we need whatever — and then they need to convince their stakeholder. While I think you need to start with: okay, when we look at what people need, that implies they need to do something together, because without it they won't be able to achieve X. So anyway, my whole point is, I'm really trying to implement that way of working. Another mechanism I have for that is learning design jams. I got a lot of requests from teams across Novartis asking for consultation and feedback on designs, and I couldn't keep up — it's just too many requests. So I've put together a mechanism where we organize a learning design jam.
So I'll bring together, again, a core group from across the organization. The client who requested it prepares for the jam — they present a little bit on their project — so it's a win-win, right? They get feedback, and the peers who are involved get to learn about other projects. And then we use evidence-informed learning design principles to structure the conversation. It kind of depends at what level the design is: you can stay quite high level — how clear are the needs, and to what extent do these needs really require learning — and we also have really detailed principles, like Mayer's multimedia principles: okay, let's look at this e-learning, and to what extent does it meet these criteria? So that's the way I try to work. And one other example is that I work with one or two really key projects, where I really partner with the business — and I need that, because I need to understand what's going on and help people in their work. So I provide advisory support and help them improve their design, but then my team and I also design things out of that — case studies, templates, methods, tools — that other people can then use to improve their capabilities.

Speaker 1:

Okay. So you've got four really strong tactics there. You've got the explanation — I guess that's kind of keynotes, conversations, sharing...

Speaker 2:

Webinars, or meetups with cohorts, and that type of...

Speaker 1:

Stuff. Okay. Then you've got the working out loud, which I think is hugely important for people — it's that kind of explaining what's going on in your head when you're doing stuff. You've got the jams, where you're bringing more people together, but again you're doing that meta-learning: you're learning one thing, but then you're learning on top of it. And then you're actively working in projects, because that keeps you up to date with what's really going on, but also helps people learn from you as you're working in there. That's really helpful, because I think that's...

Speaker 2:

Yeah,

Speaker 1:

Yeah, yeah. That's really, really helpful, thank you. Okay, well, we've got a couple of minutes left, so last of all: if you could wave a magic wand and everyone followed evidence-informed practice — I'm so with you here — what would be the three things they could do that would have the biggest impact?

Speaker 2:

Yeah, one of the biggest changes I think we need to make is that we need to spend more time unravelling the need. And I don't even mean the performance consulting bit; I mean that at the level where we have decided it needs some kind of learning intervention, I think we really need to unravel what that thing is made up of. Like, is it knowledge? If so, is it declarative knowledge, is it procedural knowledge? Is it skills? If yes, then is it a simple skill, is it a complex skill, is it more about habit? And the reason is that only if we break it down that way do we know where to focus our design attention and which evidence to use. So as an example, when it's knowledge — I always get confused with the terms — if it's the factual, declarative stuff, then you need to explain concepts and give examples, right? If it's more procedural, well, then you might need to explain it also, but you also need to demonstrate it. And then, as a solution, you don't necessarily need things like spaced practice; you might be able to get away with just performance support aids, right? It kind of...

Speaker 1:

So spaced repetition and things — so they actually... well,

Speaker 2:

Spaced repetition is more for when you need to remember. But when it's something like a how-to, a procedural knowledge thing that you don't really need to understand that deeply, you might be able to teach people just by providing a job aid. Again, it's nuanced; it depends. But I think that way of thinking just doesn't happen. It's never really broken down that way. We talk about things at way too abstract a level.

Speaker 1:

Yeah. We talk about learning at this very high level, instead of asking, in a very detailed way, what is it people are learning?

Speaker 2:

Yeah. And I think it's because, when I talk about this, people are worried that it's kind of analysis paralysis, but it's not. I think if you're good enough, you can relatively quickly do a task analysis type of thing and break it down. And even if your client gives you a bunch of content, you can have an initial conversation and say, okay, tell me what your people need and why, and then go through the content. I mean, yes, it takes a couple of hours, but then you can show them: okay, this is what you told me; when I look at your content, this is what I see. Right? Mm-hmm<affirmative> So here's the gap — you need to show them, and usually they go, oh, oh yeah, okay. Or: yeah, well, we don't have time for that. Okay, but then you need to track back and adjust your expectations. You need to compromise, right? Mm-hmm<affirmative> Because your people are not going to learn this if you offer them that.

Speaker 1:

So indeed

Speaker 2:

<laugh> Yeah. So I think we need that. For me, that would actually be the one, if we were able to do that. That's...

Speaker 1:

Because that would change everything after it.

Speaker 2:

Yeah. But then of course the million-dollar question is, how are people going to get that knowledge? Because I actually think that L&D is pretty okay when it comes to skills. I think the knowledge is where the gap

Speaker 1:

Is. Ah, that's interesting. I kind of think it's the other way around, but people,

Speaker 2:

People don't know all the research that's out there. I had a conversation recently where people said that everything around skills is new, and I'm like, okay, people, there are methodologies that have been around since, like, the fifties.

Speaker 1:

Yeah. But just because it's been around doesn't mean people know it. Well...

Speaker 2:

Exactly, this is exactly my point. We need to stand on the shoulders of giants. Yes, that's the other thing. Yes, yes.

Speaker 1:

Mirjam, thank you so much for what's been a really, really fascinating conversation. I think we could probably have gone on all day, but both you and I have jobs to do, and I suspect our listeners probably have jobs too. Thank you so much — it's been really fascinating, and I really look forward to our next conversation.

Speaker 2:

Really a nice conversation. Thank you. Great.

Speaker 1:

And I look forward to seeing you again soon.

Speaker 2:

Bye.

Speaker 1:

Thank you for listening to today's podcast. Please share it with your friends and colleagues, and visit our website, www.stellarlabs.eu, to learn more about how we help you reach for the stars. Tune in to the next episode.