The Learning & Development podcast is hosted by our Chief Learning Officer David James. Featuring L&D leaders from across the globe, each conversation focuses on hot topics in the profession. This transcript is from the conversation between David, Danny Seals and Tim Brind on design thinking.
David James: Welcome to The Learning & Development Podcast. I'm David James from Looop. In each episode, I chat with guests about what lights them up in the world of people development. In this episode, I'm speaking with Danny Seals and Tim Brind about design thinking. This is a topic requested by listener Catherine de la Poer.
Before we get into it, if you're enjoying this podcast, please do give us a five-star rating on your podcast app of choice to help others to find us, and thank you if you've done so already. Now, let's get into it. Danny, welcome back to The Learning & Development Podcast.
Danny Seals: Thanks for having me, David. It feels like-- Is it three times the charm? This is the second time down.
David: Yes, second time down. Welcome for the first time, Tim.
Tim Brind: Thanks, David. Great to be here. Yes, first time, go easy on me.
David: Now, to start us off, and for clarity, perhaps Danny, could you explain to us what design thinking is and how it applies to L&D?
Danny: Yes. What a great question to start off on. Design thinking, so I think the first thing to call out is we should be saying design as a way of thinking. That's how we should be looking at design. It isn't just design thinking. The best way to try and describe what this is is to imagine I was building a house. Within that house, you've got some core elements.
You've got the frame of the house, you've got the bricks, the plumbing, the electrics. You've got the people who are connecting the electrics and stuff like that. The best way to describe this design as a way of thinking for me is the frame of the house would be human-centered design. The areas of design would be the bricks, the waterworks, the power supply.
They would be your service design, your experience design, all these different types of design methodologies, but it's not this one thing. It's a collection of things. I guess, for me, it isn't a five-stage process. I think what we'll probably hear about throughout this conversation is me saying it's not a five-stage process, it's not.
Design as a way of thinking in a nutshell, depending on who you follow or whatever Google brings up, is either a five-stage process or often it's a process, and then it's a funnel, and then it's something else, and then it's something else. Often, it's really, really murky as to what it is, but stripped right back, it's an approach that people can use to look at solving big problems.
That can range from creating products and services. It's also kind of-- We use it to fix big organisational problems. I'm not sure if that answers that. Tim, if you want to add anything on there?
Tim: Yes, I guess, mine is not as abstract as yours. Two things for me, whether we should do something and whether we can do something. Then I guess it's about pressure testing that to work out. Like Danny says, if people want the product, the service, the feature, whatever it is. I guess L&D have got-- they've got the need for that type of scenario, even more as we try and compete with new tech, new ways of working.
I guess for L&D, it should be a way to identify what those challenges are, hearts and minds of users, or the cause and effect of a product, whatever it is that's going to work. For us, do we use it on every project? Probably we do in some shape or form to get us through those obstacles.
David: We've got design thinking or design as a way of thinking, as you mentioned there, Danny. You threw into the mix human-centered design, which I know that you're a huge advocate of. How is human-centered design different and similar to design thinking?
Danny: This is where it gets juicy. For me, you need to take one step backwards, I guess. You have areas of design. You've got service design, experience design, UX, UI. In fact, it probably makes sense for me to say what they are. Service design focuses on interactions between people, touchpoints, the backstage and frontstage of it. Then they all tie up into a kind of-- It's not about tying it all up into one thing, a product.
They look at how we can create better services. Service design looks at a bigger picture. Then you've got experience design, which looks purely at the user's touchpoints and interactions across their journey. On the last podcast I was on, we talked quite a lot about experience design. That's kind of one layer. Service design is another layer. Then you've got the UX and UI.
For me, this is where design thinking comes in- for me, it's more of an approach. Now, depending on which approach you take, it can range. It can be five steps, it can be four steps, it can be a collection of principles and loops and keys and lists. It just gets murky. How one place does it is very different to how IBM does it. How IBM does it is very different to how I do it. But they've all got core elements that are the same.
Actually, what I talk about when I think about human-centered design is the problem. We're designing for and fixing the problem from the user's point of view, the human's point of view. If we can keep that human front and center, and the job that they are trying to do, we'll be able to design better for them. I don't know if Tim can testify to this, but when it comes to designing anything, it is anything but a five-stage smooth process.
It's not four days in a room together, start to finish. It can be, but what you never really hear about is the eight weeks of work done before we even get to that stage, the weeks and months before that. That's human-centered design. That's really understanding your people and bringing in these elements from service design, and these elements from experience design and UX.
For me, human-centered design is much broader. Actually, what human-centered design does is it allows you not to get fixated on this five-stage approach or this four-stage approach. For me, it's less about the approach and more about the job that needs to be done. That pulls from that service design and that experience design.
David: I suppose, crudely speaking, we're already at the stage where we are pitting this against a traditional approach to Learning & Development. If I can crudely perhaps make a comparison, in the past, we've been much more topic-centric. We isolate particular skill sets, whether that be about presenting or communicating, managing time, managing people. We then turn that into--
We've got our topic, and then we create our content. Then we've got our content looking for people who that might help. There's quite a lot of hope within that. Of course, these programmes are heavy, they're expensive. They are time-consuming. Quite a lot of that is because we haven't actually stood in the shoes of the end-user. We don't know their challenges. We don't know the unfamiliar situations that they're facing. We don't know the work involved at all.
We'd give them this all-encompassing educational experience in the hope that some of it sticks. Now, many of us have been in those training rooms, and said, "If you change one thing, if you make a 5% difference," because at this stage, once we've done our jig, once we've done our dance, and the event's over, we're then hoping that what we've assumed could be the solution will have any resonance at all.
Now, when I talk about that compared with what you've just described, Danny, mine sounds like a hit-and-hope. Whereas yours seems like it's analysed, and then evidence-based, and then experimented. Am I anywhere near?
Danny: Yes. I guess there's a few things with that. We don't do fingers in the air, let's have a guess. My team isn't a learning team. The team that I created at GP Strategies isn't a learning team. We do things such as user manuals. Today, I'm looking at the user manuals with my team because we're doing a bit of a refresher on it.
I was looking at how many of us talked about learning in our user manuals, and in our team's manual it's mentioned once in like a 12-page document, but problem-solving, understanding people, understanding the performance, and understanding the business is spread throughout this document.
I guess, Tim, you can probably jump onto this in just a second. One of our key principles is data-informed design. We have seven principles: led by compassion, informed by data, experience focused, behaviourally applied, simplification over innovation, enabled by tech, and advice collaboration. They are what we do. At no point in our principles is learning mentioned. There's times when we've had to have really difficult conversations with customers to say, "Right. I'll tell you what? You don't want us to do research. How much does it cost to do your solution?" That's the question they ask. Like, "How much would it cost to do that?" I'm like, "£5 million." They're like, "What? £5 million? What do you mean?"
I'm like, "You want a finger in the air price to fix a finger in the air problem that you can't identify. £5 million." It's very tongue in cheek but the idea is, if you think it's costly to do research at the start, look at the cost of not doing it at the end when you never hit or fix a business problem. It's 10 times more expensive. Tim, I guess you could probably jump on here, on that.
Tim: We have those good conversations with clients. Sometimes it's well received. Sometimes it's not. Danny's not joking about the £5 million. He has actually said that. I guess I see design thinking as a process, a means to an end. We use it with lots of pit stops along the way to get to the right place. Like Danny says, it might take two weeks, it might take two years.
It depends how much iteration, it depends how much the org wants to invest, how big the problem is. Human-centered design, I see it as a toolkit. It's a state of mind almost. Once you begin thinking like that in design sense, you open up Pandora's box, and you need to shape those ideas. Like Danny says, again, you might use experience design, you might use service design, you might use journey mapping, but you're starting to shape what that looks like.
I guess I question how much design thinking is overused. I think we're seeing it lose impact because of that. I think people almost bastardise it. They misunderstand the approach. It's abused like a Radio 1 playlist. That's what design thinking has turned into.
Danny: It's so true.
David: Overplayed, overexposed?
Danny: Yes. Everyone looks at it in five years' time and goes, "I was listening to it before everybody else was." It is. It's exactly that, right. Often there are times when people say, "We're doing design thinking." You go, "Okay, show me what that looks like." “Oh, well, we've got together for two hours in a room.” No, that's not design thinking. That's not. Often design thinking isn't design thinking. What is it you're actually doing?
It sounds quite brash, it sounds quite boring. People want to sell the term without the understanding. It's an investment to really understand how to do design as a way of thinking or human-centered design or whatever you want to wrap it up as. It's an investment.
It's an investment in time, it's an investment in understanding. People don't want that. We want to piggyback on a label quickly, and then jump off when the wheels fall off. It's a tricky one.
David: It's endemic in our profession that an emerging term is bastardised to just label existing practice. We've seen it with the word experience: all of a sudden, instructional designers are experience designers without actually changing what they do. All of a sudden, the term means absolutely nothing. Change can't travel through our profession because of the misbranding, the misappropriation of emerging practice that has the potential of transforming what we do, not as a set of activities, but as outcomes.
That's why I want to spend some time nailing this down. We've talked about what this is from a high level. We've talked about what this isn't. I'd love to know how it works. What do you do? I know that this is probably coming from the wrong way. The question is, what do you do when employing design thinking in an L&D context? You probably don't go around trying to apply design thinking in an L&D context.
You're probably, I'm guessing, seeking to address real business problems and performance problems in the most appropriate and efficient ways, for which design thinking may or may not be part of the toolkit. For the purposes of this, and nailing either design thinking, design as a way of thinking, or human-centered design, whichever way you want to apply it, I'd love to ask, how does it actually work?
Danny: This is the golden question. For me, with our team, we go through nine design processes. I think I shared this on episode one. The origin story. It goes through a couple of key points. You've got sensemaking and sense-checking. That falls within actually just understanding what is actually going on.
What is the problem? Tim can probably talk about this with a project that we did recently, where they came to us saying they've got a learning problem. Then we said, "No, you haven't. You've got seven problems over here. Learning isn't even one of them." Anyway, you've got this whole sensemaking phase, and you go through into the next phase, which is the experience.
This pulls in experience design and service design. In there, you've got this really big macro picture. You've got the micro and nano, which we went through on the last episode. You move over into the third phase, which is tech as enabler. Tech does not fix your problems, it enables you to fix them. I think often people don't look at it that way. It's a blend.
When I look at our design process, I think I can share a link to it so you can see it. It's iterative, really iterative. It's a multilayer design process. Tim, I'm not sure if you want to take him through what we did with the large consultancy we work with?
Tim: I guess, like Danny says, we were approached with a learning problem, but we dug into it. This is the thing. This is where it gets quite complex quite quickly. We map a lot of what we do out visually. We'll use things like systems thinking. That's generally down to the size of the problem, the size of your org, or how big and complex the problem is in terms of dynamics or behaviours. We've got these reinforcing behaviours which either impact the problem, make it worse, or reduce it. If we map all that out, we get quite quickly to the root of the problem. Again, back to that data-informed decision, a lot of people miss that out, as if it's not important. That's the whole sensemaking piece for us. That's the bit that takes time. We could be stuck in that for up to 16 weeks, depending on how big it is.
It sounds really intense and involved. It is, but it's also just about data. It's discovery, it's analysis. For us, it's just a way of managing that data. If we can get it down, we can map it out, and then it's not learning. We'll present that back to a client in a really lean format and they get it straight away. They might like it, they might hate it, but they get it straight away. I think, again, that's where maybe L&D falls over; they're not part of that process.
That's probably where they're resigned to becoming order takers, I guess. We'd rather be problem solvers than order takers. I think what's refreshing about Danny's approach is that it allows us to challenge that from day one. We're having those conversations on day one. We don't wait for a two-year relationship where we've built up something long term with the client. You've still got to have these hard conversations, but it comes from the right place.
David: To give us an example then, you mentioned there about some work that you've done. Could you talk to us about that level of abstraction? What was the problem? What was your approach? What went on behind some of those closed doors? Who was involved? What was your approach to actually addressing what you then analysed as the core problems?
Danny: I'll probably jump in first. They came to us. The original problem was we need some learning, the usual. We need some learning. Actually, we want to do learning differently, and it came through via experience design at first. We want to do something more experiential. Then we were like, "Cool." Our first touchpoint is we do something called design a relationship.
Design a relationship, ultimately, is that personal approach right at the very start. For us to fix your problem, we're going to come at you, and we're going to ask you, "Give us the keys to the castle. Let us really understand what that problem is, and who's the stakeholder, who are the SMEs?" Actually, let us sit down with 300 people in your business and ask them what's getting in their way.
Now, usually, that's the first hurdle, because people don't like looking at what's under the bonnet a lot of the time, because it might expose bigger things. They were like, "Look, come on board, do it." It wasn't as easy as that, we weren't best friends from day one. We had to work out the relationship, and we use design a relationship to do that. That is design. We get in the room and we map it out. Then we also start reverse engineering. Actually, what is the performance you want to see, what's the business impact, what does that look like now, what is it you want to see? Then we go into this research stage which we did-- I think we did, was it eight weeks on this one, Tim?
Danny: The eight weeks consisted of, obviously, qual and quant insight. That was audience insight, one-to-one interviews, shadowing. We did user journals, so we could track people's journeys from the collective. We also did lots and lots of data, so we talk about digital breadcrumbs in our team. What is it they're doing, where are they bouncing across? All that data starts creating a picture for us, and then we start identifying.
We've got a research team within my team, so we'll use them a lot to start looking at, actually, where are the trends and patterns, what are we seeing come out here? They ended up pulling out, I think it was six key core things that came up time and time again, Tim? You could probably share what they were, actually.
Tim: Yes. We chopped the solution into six different pieces. What was it? There was a big comms piece. Again, they thought there was a learning problem. There wasn't. It was around comms, marketing engagement. It was almost giving people something to stand behind. They lacked that message, I guess, that tone of voice, that message. They didn't understand what it was.
Let's sort that out. Let's use marketing comms and sort that piece out. Once we've got that, we can create some content. It was a really small piece of content because it was about enablement. It was like this is what you need to know, this is what it means to you in your job, go out and do it. It wasn't 10 hours of e-learning. It wasn't three different videos. It was just, bang, watch this, off you go.
There were some really nice pieces in there. There were some diagnostics in there, joining the disconnect between two different areas of the business. Getting sales to talk to delivery, because they didn't. Sales can't serve because they don't understand, and vice versa. We're just bringing them together in a nice little app. We had some A/B testing, just to get to the root cause of platform adoption, and some issues around that. There was one more.
Danny: Yes. There's that tone of voice piece. When I think about it, we asked over 200 people what does this product mean to you, and we received 200 different opinions. We're like, "You who are selling this product, you don't even know what that product means to you as an organisation, to your people." Actually, part one is bringing that to life. What does that product even mean to you as an organisation? Going back to Tim's point, the A/B testing.
One of them was like, "Oh, we've got all these. These people have got to be certified and stuff like that, and all this other stuff." We're like, "Right, well, how do you know that isn't happening already?" Actually, people are getting certified in what matters to them, not what you say. They couldn't track it, and they couldn't keep tabs on how many people were even doing it.
This isn't a learning problem. Your problem is your people can't find a way to track the solution, so we did A/B testing. We moved this thing around on various different sides, and we could instantly start tracking. This is why it's not that people don't want to do it, it's that some people couldn't find something. That's it, job fixed.
Now if you went to learning, you'd probably get, I don't know, 20 pieces of content, all saying the same thing one way or another. Maybe there's e-learning in there, maybe not, I'm not sure. Actually, we didn't need to do any of that. We just moved the square around on a page.
David: We've talked there about some of the stuff that you did. I think you've insinuated here as well that everything you did was a check of progress towards achieving what they had defined as their outcome, or what in those earliest conversations you defined as the desired outcome. Is that the case?
Danny: Yes, exactly. That's where it comes back to that reverse engineering, right? Reverse engineer backwards. This is your current state now, what is that state? Okay, reverse engineer backwards on performance and business. Going back to design a relationship, where we do that at the very start. The first thing we do is we go through all that. We go through identifying actually what is performance in business, not learning, performance in business.
What is it? By doing that right at the very start and identifying that, it makes it frictionless to get there. We'll use tools such as Hotjar for user tracking on pages. Actually, where are the heatmaps, where are the recordings of X, Y? We'll use stuff like that. We'll use tools like Miro and Dovetail to do audience insights. It doesn't need to be learning, and I'll probably say- and this is bold, but I'll probably say, out of every 10 learning projects, eight or nine of them aren't learning projects at all, which is bold.
David: You’re leading nicely to my next question. If design thinking rarely produces a learning problem, then should L&D really be involved, or how does L&D take it on?
David: Remember, this is The Learning and Development Podcast. It's too late to say no.
Danny: Tim, you can probably tell your story of your journey into this in just a sec, I guess. L&D, I've been saying this probably for what feels like 100 years now, and it's an unpopular opinion. You either upskill your people to be problem solvers, or remove your L&D team, which is quite savage to say, but it's not. As soon as you can start moving that bias of everything being a learning problem, you start to see problems very, very differently.
It's a hammer and nail thing. If the only tool you've got is a hammer, everything's a nail. That's why our team is transformation and performance, that's it. There's no learning in the title. I guess, Tim, maybe you can talk a little bit about your journey, your transition maybe.
Tim: Sure, yes. I've been in operations for, what, the last 10 years, and moved back to a more design role, consulting in Danny's team only recently. I was in the tech and engineering sector, working with aerospace, manufacturing clients. Whilst we were addressing learning needs, it was way further on that performance support scale.
Not fluffy learning, but digital tools. Augmented reality, AI, technical manuals, or interactive job aids. Stuff that would be used out in those aerospace, manufacturing, engineering environments, by guys in the field. Very specific to me, but it definitely speaks to performance over learning. That's where we're coming from. We want users to learn, and that might be through training, but we were enabling them to do that whilst on the job.
That's probably the first point of where I'm coming from, which is, again, very different to where I am now. That's mainly where I think L&D does fall over. It's jumping into a solution before understanding the problem. The only way we get to do that is by doing, like Danny says, that discovery, that sensemaking phase. It's so simple, but it's probably the single biggest activity we do in that whole process, whether you call it design thinking or human-centered design. That's the integral part.
There's also a big learning curve for us, talking about my journey, it's probably-- I've always had that hunger, I guess, to learn. Research has been a big part of that learning curve. When we look at human-centered design, there are so many tools. There are so many methodologies, there are so many disciplines it calls on. Psychology, behavioural economics, all these kinds of things. We bake them into our designs, but that takes time to use, to learn, to know.
It's overwhelming, just the wealth of information and stuff. I guess, how did I manage to do it? I started to create playbooks. That's become part of our design system within the team. Just to visually get that down and map it out. Just thinking about something, I guess, Danny pointed out to me. He sounds like a wise old man, doesn't he? It's like we do so much research on our clients, and then we learn from the observation piece.
We also do the same to ourselves, which I thought was quite an interesting way, a bit more reflective. Again, it's a new way of looking at how we do things and measure ourselves every time. A bit like practicing what you preach. I'm not sure, I think we probably agree, L&D doesn't do that too well.
David: No, I think you've got a good point there. I mean, one thing I'm picking up from what you're saying here is, look at the other side of the coin from what you're describing. If you don't know enough about the problem you're trying to solve, any "solution" will do, and solution there is in inverted commas. I think that's the place that Learning & Development comes from a lot of the time.
As you've just described, there aren't actually a lot of learning needs, but there are a lot of performance needs that might come up as a result. The more you know about that, the more targeted that you can be with any intervention that is aimed at progression towards the desired outcome. One that has been defined because it is being discussed and debated and challenged and experienced by the people in the room, or the people that you're learning about the work and the challenges that they actually face.
Tim: Spot on. It's uncovering the real problems as opposed to the learning problems. Not all clients are ready for that.
Danny: I'm just thinking then. When I think about, actually, when it goes through design, and really understanding and caring about what's going on, and talking about compassion versus empathy. When you break down what the problem often is, you can usually break it down into 4Ps really. It's people, process, props, or place. It's very, very rarely learning. It's your processes, your people, and actually how do we help them do what they need to do?
Often, it is just simply the place in which they work and how we can remove the friction and touchpoints in that. Maybe actually it's simple because I've been through my own journey and I'm doing that. Sometimes I'm aware that I always oversimplify things. There's times when we have conversations and Tim looks at me like, what's that? That's my learning, right? That's me learning.
I think the one thing I will probably say is, in the time of setting up this team, I've had three large organisations reach out and ask me if we can help shift the shape of their team. They're very big organisations. What I'll probably say is that it's changing a little bit. I think the people who are having this conversation and want to shift the shape of their team are people who are moving from that traditional approach to, actually, let's just fix business problems and get on with our job.
David: To more predictably and reliably fix problems rather than “deliver” learning solutions. Now, despite the benefits that you describe, there is some criticism of design thinking. I've seen plenty from instructional designers because they argue that employees don't know what they need to know or learn. How do you counter that?
Danny: Right. I guess my take on this is probably quite straightforward. To a certain point, if design as a way of thinking was done right in the first place, the result of that would never ever, ever, ever be e-learning or instructional design. With that said, I guess instructional designers are pretty lucky right now. In fact, they're very lucky, because if people did design as a way of thinking correctly, they would all be out of a job.
That sounds quite harsh, but it's true. If they've done the proper research and really understand that this is a business problem, the outcome is never, ever, "Hey, here's some instructional design." I don't know. Tim, maybe you've got a nicer, more direct approach.
Tim: I'm a bit scared how to tackle that one. We do hear it, though. I think instructional designers don't understand it. I think it's combined with the abuse of design thinking. Often, it's those two things together that drives that criticism. I'm not sure you'll ever counter that. I don't know if we need to counter it either. I think when you start looking at it from a different place, that's what makes sense. You can argue about it for as long as you want. If you're only going to look for a learning problem, you're only going to find a learning problem.
David: I think, from what I'm hearing, you're coming at problems from a diametrically opposed position to an instructional designer. You're looking at a problem as in, what is it you're experiencing and how can we help? Instructional designers come at it from, we need you to know this. It's almost as if they don't need to fully understand the unfamiliar situations, the challenges that people are facing in the context of their work. It's just a case of, we need you to know this, which has got to be the hardest way to do Learning & Development.
Tim: It's a question about what's the pedagogical use. Do you even hear that anymore? Do you need to know that? If you understand why an individual makes those psychology-based choices, whatever it is, if you understand the economics behind that, you don't need a piece of e-learning. You don't need something that's written to Kirkpatrick Level 1 and 2. It doesn't work like that.
Danny: A piece of me dies inside every time I go on LinkedIn and see design thinking for instructional design. Often, I just feel like I need to be taken out back and put down when I see stuff like that. It's because what they've done is jump on a buzzword and gone, right, how can I stay in my comfort zone, but look like I'm being really proactive and really progressive in what I'm doing?
I'll just lift this term, drop it into my little comfort zone, and just pretend I do it. It's a shame. It's a shame because if you're in "Learning & Development", you're meant to have this consistent beginner's mindset. If you've got a beginner's mindset, I will give you absolutely all of my time and everything.
The people who are doing instructional design, it's very clear they haven't got that beginner's mindset, because they're not always looking at new things and applying new things. They're doing it, but in the comfort of their little safe environment.
David: Welcome back to the How to Win Friends and Influence People podcast, with Danny Seals and Tim Brind. As we look to wrap up, there are going to be people listening to this and thinking this really resonates. I can feel the tide turning. But not everybody will feel the tide turning. There are plenty of people in organisations who are delivering to the expectation that Learning & Development still looks like school; people are still asking for courses and programmes.
There is a very real and very urgent skills gap that some are very much experiencing and need to tackle. Some aren't feeling the urgency and buy an LXP full of content. There are people who are feeling that things are changing and what you're saying is really resonating because it affects performance. There's a degree of analysis.
There's bringing people with you. Ultimately, it's aimed at working and addressing that challenge. How can L&D become more skilled or begin their journey as far as design thinking or design as a way of thinking or human-centered design is concerned?
Danny: I'll jump in, Tim, if you like, first. I know you've got quite a bit to say on this. I think for me, it's one, challenge yourself. On your next project, go and get research. Go and get your qualitative and quantitative insight. Create hypotheses and problem statements. Challenge your hypothesis and acknowledge your own bias to this as well. Do that first.
You're going to have options where you can map out the journeys, map out the people in there, map out what that front-stage experience looks like, but also what that backstage experience looks like. Just get in the weeds of things. You've got archetypes and stuff like that. They can be beneficial, they can. Ultimately, no matter what you do, you can't just follow a tick list of exercises.
For me, the biggest thing, and I say this to the team, and maybe Tim has heard it said too many times, but you need to do time served. You need to do it. You can't just do a tick list of exercises and go job done, it doesn't work that way. You need to be able to know when to turn left and when to turn right. It's a constant for me, because the design way of thinking is a constant battle between order and chaos. It's friction at every single touchpoint.
You've got to get comfortable with that. You've got to get comfortable with not knowing the answer straight away, not knowing a solution, and being able to acknowledge your bias along the way. Time and time again with my team, and my team has done it to me as well, we've gone: you're jumping to a solution. Does this break or does it confirm my hypothesis? Both outcomes are equally valuable, right? If it breaks it, if it challenges it and says actually the hypothesis is wrong, great, there's value there. Just don't jump into that solution, because there's one thing worse than not fixing a problem, and that's spending loads and loads of money on a problem that doesn't exist. Tim, maybe you've got some more.
Tim: Yes, I’ve got my top five for how L&D can become skilled in design thinking, I guess. Don't create solutions, solve problems, that's my first. Second, data is your friend. Go off, grow a beard, become a data scientist. Get excited about proper user experience. Number three, learn something about behavioural economics or choice psychology or neuroscience and apply it. Apply it next time you create something. Number four, try out a small project, just get involved and do it. Make the jump. Then number five, use Miro or Mural, create a board and get involved. This is one of the best things I've done. Timmy's top five.
David: That's wonderful. Thank you very much, guys. Now, Danny, I know for sure that you've written and published a great deal of your thinking around this. If people wish to follow your work or connect with you, starting with you, Danny, how best can they do so?
Danny: Best thing-- I've slowed down putting out articles on LinkedIn at the moment, just because I'm having an ongoing conversation about a book. I do try and give little snippets away in posts, so LinkedIn is the best way for me, for sure.
David: Fabulous. And you, Tim, how can people follow you and connect with you too?
Tim: Best place for me is LinkedIn. Happy to connect.
David: Fabulous. We'll put links to your profiles in the show notes. Thank you very much, guys. All that’s left for me to say is thank you very much for being guests on The Learning and Development Podcast.
Danny: Thank you. It's been a pleasure.
Tim: Thanks, David.
David: As you heard, whether it's design thinking, design as a way of thinking, or human-centered design, approaching problems from the perspective of the user and being informed by data is a way to solve real business problems and track progress towards just that. It's a discipline worth exploring, investing in, and employing when change is imperative.
If you'd like to get in touch with me, perhaps to suggest topics you'd like to hear discussed, as Catherine did, you can tweet me @DavidInLearning, and connect on LinkedIn for which you'll find the links in the show notes. Goodbye for now.
Danny Seals is Director of Design, Experience & Innovation at GP Strategies, and the purpose of his work is to create impactful change through good design. He pulls from various design disciplines and mindsets such as human centred design, design thinking, service design, systems thinking and experience design.
With a background in the engineering and aerospace sectors, Tim Brind understands ROI and how to add value; an approach that combines performance consulting with human centred design to ensure the production of innovative, user-centric learning experiences and creative problem solving.
Connect with Danny on LinkedIn
Connect with Tim on LinkedIn
Chat to our L&D experts and find out how Looop gives you your days back