The Tech Includes The Humans with Tim Lockie and Tracy Kronzak

Episode 6 | April 06, 2022 | 00:38:54
Why IT Matters
Show Notes

Tim Lockie and Tracy Kronzak take on the mythology of technology as it should work for impact organizations and the realities of why it doesn't. In their first one-on-one conversation, they explore how the mythos of technology failure is supported by wide-scale marketing to under-resourced organizations, what changed both of their perspectives on their work, and why Now IT Matters as a business now works the way it does. This is for leaders looking for new ways of engaging with the clients they serve, and an informed take on the journey that making these changes as a business entails (spoiler: it's a complete rebuild). Tune in for a professional "Why IT Matters," part of a series we hope to offer that lets you take advantage of our understanding of the ways in which the technology world is changing.

Episode Transcript

Speaker 1 00:00:07 Welcome to Why IT Matters. This episode is The Tech Includes The Humans. Hi, Tracy.

Speaker 2 00:00:12 Hey, Tim. Hey, everybody. Thanks for listening. Today, Tim and I are opening up a new vector of conversation for Why IT Matters, where we have conversations about topics that are important to us.

Speaker 1 00:00:27 We know that there is a lot of failure in technology, and what we thought is, if we get the technology right, then we'll succeed. And then when that didn't happen, we had to start asking questions. This episode is about what happens when you see failure, even when projects succeed. And this starts with us seeing a friend of ours, Brian Komar, put up a statistic that really burrowed into my brain especially, and that is: 90% of nonprofits collect data, but less than 40% use that data to make decisions. And we don't think it has to be that way. We really dove in to ask, how could this be different? And this episode is an intro conversation into why we think things could be different, and what that difference looks like.

Speaker 2 00:01:22 It's also what happens when you ask too many questions. Enjoy.

Speaker 1 00:01:26 It is what it is when you ask too many questions. So here we are. I've got my fancy new mic. I'm so excited.

Speaker 2 00:01:37 The microphone.

Speaker 1 00:01:38 I know. Okay, so you did not get there first, because I've owned this mic for like two years, and there's a big, long story about it. It's not even a long story. I just never set it up.

Speaker 2 00:01:52 That's the big, long story.

Speaker 1 00:01:54 That's it. That's the only difference. I know. Okay.

Speaker 2 00:01:57 You didn't set it up, and you were like, hey, if you really want to record stuff, I have this mic, I'm sending it to you. And I spent like a day setting it up.

Speaker 1 00:02:06 I know. Exactly. And then I just never did that.
Speaker 2 00:02:09 It's amazing what reading the manual can produce in life.

Speaker 1 00:02:12 I am actually learning to start reading manuals, and it's embarrassing to me. I feel like a failure as a Montanan. Like, you know, you put those in the outhouse; that's all you do with manuals.

Speaker 2 00:02:26 There's no indoor plumbing in all of Montana.

Speaker 1 00:02:29 Exactly right. Yeah, exactly. I'm really going back to my roots on that comment. Okay.

Speaker 2 00:02:36 One thing I'm going to add to that: you know this ongoing saga with our solar installation company. I've completely befuddled them, because I was like, where is the schematic for this system, and why don't I have it? And they're like, who asks for that? And I'm like, a former material scientist and physics major who knows how to read this stuff and needs it for system maintenance. And they were like, that's really interesting. I was like, okay, so that's a no.

Speaker 1 00:03:06 They said they had one of those clients.

Speaker 2 00:03:08 One of those people.

Speaker 1 00:03:10 Anyhow.

Speaker 2 00:03:11 We could digress all day. Let's get to it.

Speaker 1 00:03:14 Absolutely. Okay, here is the topic, and I want you to launch it, because I think you got there before I did. The topic here is: why do nonprofits fail at tech? And I don't know if this is a one-time conversation or multiple conversations; that's yet to be determined. But what I do know is that you went to an NTEN conference back in the day and had kind of this revelation. And then for me, it was a long time later than that. But I want to start there for both of us, because it is actually personal.

Speaker 2 00:03:58 It's very personal.
It's been personal to the point of, honestly, putting me into weird places in my career that didn't end well, because I became fascinated with technology failure, maybe going on eight years ago now. I can't really give you the exact date, but what I can say is that when I became a consultant, the promise of that consulting career was: if you implement the technology well, it runs itself, and the organization is better. And we believed that. And I will never discount the power and promise of what well-designed technology can do for organizations, because if you do spend enough time making technology simple, it does become easily adoptable. Now, my bar for simple technology is an iPhone, and I already know that Apple has spent almost 20 years of work to make it that way. And we, as an aggregated system of business partners, have not.

Speaker 2 00:05:16 So what's the delta there? At NTEN, we were presenting for a long time on how to use technology, how to capture better things with technology. And eventually a bunch of us got together one year and were like, none of this is working for people. None of this is actually getting them to where they need to be. And every year we run into people in the lobby, in workshops, who were saying, we did all of these things that are supposed to make it better, and it got worse. So five of us, including some really cool people like Dan Goldstein and Robert Weiner (I can't remember the whole crew), got together and did a full-day workshop on failure. And it began with the five of us walking out, I think, in t-shirts that said: we're all failures. We're failures, but let's talk about why. And the data that was coming out at the time, which I don't really believe has been revisited much, indicated that of the top 10 reasons why technology fails, something like seven or eight of them had no relationship to technology whatsoever.

Speaker 2 00:06:36 It wasn't like the technology was bad.
It was things like crappy project management, poor expectations management, resentment, poor change management, poor training, poor ability to convey the value proposition of the technology. And that last point, conveying the value proposition of the technology, is much more than just saying, if you build it, they will come. It was really understanding that people's lives and roles and work were centered around the tool, and by changing the tool, you were disrupting their lives and their work and their buy-in to the mission of the organization that was implementing it in the first place. So that began a fascination of mine with technology failure. And I've occasionally been lured over to go work for people on that promise of: I want you to be a strategist for our customers, not an implementer. And where those engagements inevitably ended was this:

Speaker 2 00:07:48 Strategy doesn't accelerate. It slows things down, and it pays attention to things as they come up. And what those past roles were actually asking me to do was, can you convey a sense of technical assurance to this work so that the sale can accelerate? And can you convey a sense of strategic importance to our attention so that we can just look like we're ticking the box on these projects? Both of those engagements, with two different employers, ended poorly, because I was like, you're missing the point. And I think that's where your study of this kicked in, because you witnessed me go through that, but you were also on your own journey.

Speaker 1 00:08:37 Yeah. And if I go back to one of the early signposts for me on this: I mean, all of us implementers know something's kind of wrong, and at the same time it's like, well, what are you going to do about it? This is the way it is. In some ways it feels like there's something wrong with gravity because people keep falling.
Like, no, these are the conditions that we live in; what are you going to do about it? And so it took me a long time to get to the place where I felt like any action past analysis was even worthwhile to consider. So to your point, eight years ago, you sat down and talked with five brilliant people, and I can imagine those conversations were both hilarious and insightful. And coming out of them, it was like, okay, we understand this. And then what?

Speaker 2 00:09:36 This is...

Speaker 1 00:09:36 The shape of the market, exactly. So I looked at that as an economist for years. I'm like, okay, it makes sense to me: somebody has to pay for the strategy. And if there's so much impatience about getting technology rolling that customers don't want to pay for strategy, then what are you going to do about that? That's just baked into the economics: this is not what customers actually want. And so that was just kind of in my thinking for years, and everybody else's. So I think it's fair to say there have to be economic underpinnings for this kind of work. And it's also fair...

Speaker 2 00:10:10 To say, like, simplicity...

Speaker 1 00:10:12 People do want simplicity, even when it's not possible. Exactly. People do want to drive toward simplicity. And the other thing, and I feel like this is a much later discovery, but it's one of the reasons I think the analysis is easy to not get right, is that collaborative back-end systems function entirely differently than simple, one-off, one-person engagement. A good example of that is, nobody has to teach... well, some people probably have to be taught how to use Instagram or Facebook or socials or whatever, right? But they're pretty straightforward.

Speaker 2 00:10:54 In that bucket.
Speaker 1 00:10:57 But my point is, not on the point-and-clicks. Like, yeah, you don't need to be told, okay, here's how you log in, here's where your password goes. People just kind of...

Speaker 2 00:11:06 Is what you're pointing to why we're here in the first place?

Speaker 1 00:11:10 Well, partly. But also, when it comes to, for example, donor management or volunteer management, we're talking a completely different magnitude of complexity. As soon as you're talking about systems where multiple people are required in a supply chain of data, that's a very different kind of thing. And part of the reason I'm saying that is that it's really easy to compare "technology should be simple" to a back-end system and not realize you've just crossed oceans. So I do want to at least say that. But you came to a conference in Montana, it was 2018, Big Sky Dreamin', and you presented from something... you got slides, I think, from Brian Komar. Do I remember that?

Speaker 2 00:12:01 Potentially. Actually, I was still working at Salesforce at the time, although I was in between teams there, and I was like, I want to speak at this conference, and I want to speak about this topic. And I grabbed none of the slides that I was given internally, which were our sort of go-to-market sales deck. I instead pinged a colleague on the inside, Brian Komar, whom we both know and really very much admire, and said, I need to talk about what's really going on with technology. And he gave me a whole bunch of slides around impact and data utilization and all of these beautifully put together things that somehow weren't making it into the side of the organization where I was, which was closely aligned with the sales teams, but were exactly the explanation I needed for that presentation.

Speaker 1 00:12:57 And so that was the impact revolution deck, right?
Speaker 2 00:13:01 That was the impact evocation deck.

Speaker 1 00:13:04 The same summer. So...

Speaker 2 00:13:07 Before the word "revolution," please: let's not add another numbered revolution, like the Third Industrial Revolution and the Fourth Age of Man and the fifth criteria for the ring system that, you know, we shall build as the Dyson sphere.

Speaker 1 00:13:22 I mean, that makes sense. They're in Middle-earth.

Speaker 2 00:13:25 Exactly. Stop defining these things by the eras of Middle-earth and get to something more real. Yeah.

Speaker 1 00:13:32 Okay. So I encountered that deck before you... well, at the same time as you.

Speaker 2 00:13:38 It was presented at a partner...

Speaker 1 00:13:40 Partner summit.

Speaker 2 00:13:40 Yes. Yeah.

Speaker 1 00:13:41 Okay. So then, first, on that deck. What happened for me is that Brian Komar showed, in that deck, a slide that came out of an impact study that said 90% of nonprofit organizations collect data, but less than 40% of those organizations use that data. And it broke my brain. I looked around the room at other partners, like, what are we doing here? How can we just sit here, look at that, and go on with our sales lives? And I don't know why that was the moment, because there were other things out there, like, you know, 70% of CRM projects fail, whatever, that all of us know. But that one burrowed in, and it wouldn't go away.

Speaker 1 00:14:41 And then you came and reinforced it a couple of months later. You showed this one graph that has red lines and blue lines (we've got to find those for the show notes), and that was so powerful, because it's exactly what you're saying: out of these 10 different criteria, almost all of them, and percentage-wise too, it's all people stuff. It's not the tech stuff.
Like the tech stuff works really well. And so, um, I, a few months later just said, I'm not doing that anymore. Like, I won't keep doing that. And I do think you, I think you are right. Like I took it to action and it really, it really caused a lot of damage in our business model. Uh, you know, it, I still have really good relationships with former staff and, but I changed all of the rules on our business, you know, in the course of like six months. Speaker 1 00:15:41 And, and, um, and it was really frustrating for a key staff who had worked here for years, you know, Angela and Justin and Joni, you know, we're, we're doing so well at hanging in there and trying to follow this crazy leader who is trying to find his way. And basically it boiled down to, I'm not taking money from nonprofits for something I know doesn't work. Like I won't do that anymore. And we, and there has got to be an economic way to solve this through a different kind of service. And so, yeah, we spent the last three, three years, like just wandering around, not wandering around, being very intentional about, is it, this is it, you know, like how does this work? Um, you know, and I do feel like we got to a place on that. And for the, and for the sake of time, I'll just summarize where we landed on what that looked like next. And by the way, thank you very much, Brian Kumar for ruining my life. Um, but it has, Speaker 2 00:16:38 I have a funny story to follow up on that. Uh, Speaker 1 00:16:41 Well now all Speaker 2 00:16:42 The back end of it, the back end of it was, um, we had visited you in Montana in either I think it was, uh, Speaker 1 00:16:53 2019. It Speaker 2 00:16:54 Was 2019. It had to have been, it Speaker 1 00:16:57 Was, Speaker 2 00:16:57 Yeah, because I have distinct memories of a couple of things, obviously the whole sort of government shut down under Trump and going, uh, snowmobiling in Yellowstone. Speaker 1 00:17:08 Gosh, that was amazing. 
Speaker 2 00:17:09 But you and I, during two different visits (I think one that Amy and I made, and then one I made separately to ski, because Big Sky), I remember having conversations with you on the chairlifts at Big Sky, and you saying, like, I think I just nuked my own business. I mean, that was how real that was for you. You were like, oh God, I think I just nuked my entire business.

Speaker 1 00:17:39 And to throw on top of that, I was talking with somebody who was referencing a book, or an idea, that there are some people who can't not do something. And it was like this thing crawled inside my brain, or more like it crawled inside my soul, and was like: I don't believe in what I'm doing. And I don't know how everybody else is wired, but I can't continue moving forward if I don't believe in it. And I think most people are shaped that way; I don't think I'm different. But this one grabbed me in a way where I was like, I know this is solvable, and I'm not working on something that is just taking money from nonprofits and not delivering the kind of results that I think are appropriate. And I also don't resent, or think anybody else is wrong, for doing it the way that it is. I think I'm crazy. I just couldn't not do it.

Speaker 2 00:18:42 Repeating the same thing at scale is actually what makes money, right? And doing things exactly how they've been done is what actually makes money. So you're holding these two things: sustainability of business versus desired outcome of vision. Right. And the scene that comes to mind is from Zoolander. It's Mugatu at the end of Zoolander, where he's like, am I taking crazy pills? All of these things look alike. I don't understand. And then he sees Blue Steel and he's like, oh, it's beautiful.

Speaker 1 00:19:17 Right.

Speaker 2 00:19:17 Right.
Like, that's the scene that comes to mind; that's where this put you.

Speaker 1 00:19:22 Yeah, it is. I cannot handle the fact that this went to Zoolander, because now I'm like...

Speaker 2 00:19:29 The files...

Speaker 1 00:19:30 Are in the computer. Well, I use male models. But yes, exactly, it is like that. And I do feel like part of the mythology around us has been that this kind of work is one-off, and the technology work is at scale.

Speaker 2 00:19:52 And it rests on...

Speaker 1 00:19:53 That assumption, and that assumption is absolutely at fault. Partly it's because humans are not crazy and unpredictable; humans are extremely predictable. Our behavior follows patterns, and this is what economists do, so maybe this is why it hit me: if you present people with costs and benefits, you can predict very reasonably what they're going to choose. And so I think the thing that happened to me in the process is that I stopped believing that users were crazy and started to say, the behavior is the thing to pay attention to, because it's the most unpredictable part of the model right now. It makes sense that we'd focus on the most predictable part, the technology, except that the breakdowns weren't on the technology side. And what I saw over and over again, and what I see now, because this really reshaped my brain as I look at what's happening in technology, is what the problems boiled down to:

Speaker 1 00:20:51 We never really grasped the idea that technology is part human and part machine. What I mean by that is, "technology" has just always meant the computers. And I don't think that's true. I think our minds, the way we think, are part of the technology. And so when I think about technology, and I know this sounds crazy, I think about it in terms of DNA. DNA is like a twisted ladder, right?
And so one side of the ladder is the tech stack, the other side of the ladder is the human stack, and the processes are all the stuff in between, the rungs, because you've got people, process, technology. I still think those things are relevant, but they're not equal. Process is actually the way that technology talks to itself,

Speaker 1 00:21:41 talks to people, or the way that people talk to themselves and to technology. So the processes are just connectors, but the two real stabilizers in that equation are always the people and the technology. We've always thought of the technology as the only stack, and so the only thing you could ever do is evolve the technology. And this is where our mindset, the way we frame things in our brains, limits or expands what options are available. Once I saw a model where you can evolve the humans and you can evolve the technology, suddenly it stopped being a one-dimensional model, because evolving the humans is part of the technology. That was what became, as you know, the human stack for me. And if we think about what stops progress, what halts projects, or what blows them up: it's emotions. It's not CPU limits, it's not governor limits, it's not

Speaker 1 00:22:44 infinite loops. On the tech side, this has been solved for with careful construction. What it is, is fear of losing your job, resentment of being asked to do more work. It's all of these other pieces that exist that need their own methodology to implement. Just as there's agile and waterfall for the tech stack, we need some kind of methodology to shape the human stack, and that has to be considered part of the technology. And I think that realization happened about 24 months after I started this process.
And it was just like the lights went on. I was like, now I can see it a lot more clearly.

Speaker 2 00:23:30 Well, also to that point, Tim, one thing I do want to call out is the other thing that happens, particularly for nonprofits using technology. This is a methodology derived from your experience looking at it, and certainly my intense focus on it has also been derived from my experience doing it. What I would say is that a large part of how impact organizations are sold technology is: come to us, stay with us, because we are eternal, and become part of who we are, meaning X, Y, or Z, or A, B, and C technology product or platform. That not only makes us look good, but the longer you stay with us, the more you will shape your work to us, and you will never need to leave. And I think the problem herein is that we all intrinsically know that technology changes, and it's changing even faster now than it was over the past decade.

Speaker 2 00:24:48 So the promise that the technology is eternal, and it's just the humans who are going to keep changing out while utilizing this same system that provides all the structure they need for their work, is fundamentally false, because with every changeout of those humans come new priorities, new mission re-evaluation, and new programmatic actions that make technology obsolete, right? So where I've gotten in trouble with this, and what this formulates for me, is my underpinning when I say that impact organizations always benefit from choice. Always. Because what we also know to be true is that when you have humans who are actually empowered as humans to do their jobs, they will make the right choices from a technology perspective.
But that might not always be with the folks who are saying, come to us and stay with us forever and we'll shape you.

Speaker 1 00:25:51 Well, yes, I agree with that a lot. One thing I will say is that there is efficiency in platform. That's just true.

Speaker 2 00:26:04 The definition of that has substantially changed.

Speaker 1 00:26:07 Oh, absolutely. But it is... so, I have this off-grid property, and I'm building, like, the stuff on it.

Speaker 2 00:26:15 Your dome.

Speaker 1 00:26:16 Yeah, exactly. I'm building this dome, and it's really awesome. But I started buying tools for that, and I've bought a lot of tools. And guess what? They're all yellow and black; they all say DeWalt. It's because I can use the same batteries on all of them.

Speaker 2 00:26:40 Yep.

Speaker 1 00:26:41 You know? And I think this goes back to common data model questions, interoperability questions. What I will say is that it makes sense for me, partly just at the decision level, to not even ask the question; I'm just going to buy DeWalt, because it's simple. There is something to that that's helpful. What I will also say, though, is that we've ended up in a world where the marketing is basically... all of us, as SI partners, have found a platform to act as a dealership for.

Speaker 1 00:27:07 And there are a couple of large car manufacturers out there that are all selling CRMs. There are some partners that support more than one CRM platform, and a lot that are just a dealership for one platform, and they're selling those cars. And I think nonprofits go in all the time and say, I'm having a hard time driving this. And I think what the dealerships say (and they have a lot of marketing and a lot of flashy brochures to back this up) is: well, this car is so easy,
It basically drives itself. And here's what I want to say about that. I don't think that's evil; I think that is good marketing, first of all. Secondly, it is true, in that cars are getting easier to drive, and CRMs are getting easier to drive.

Speaker 1 00:28:00 That just is true. The idea that they're self-driving is not true. And I'll say this is where the nonprofit market has lagged behind in professional development, because for so long every dollar went to a kid and no dollars went to professional development. There's been such an underinvestment in professional development around technology and skills and digital literacy that what nonprofits want is a car that drives itself, but what they actually need is a driving school. And if I were to name what I've developed, what our methodology does, it focuses on how to create a driving school for whatever team it is, in whatever technology. And I've seen it happen enough times that I feel confident saying this, like our tagline on it: we can create measurable change within four months for any team that has a culture of authenticity and transparency, and any technology that is cloud-based and collaborative.

Speaker 1 00:29:14 And so there are conditions around that, and those conditions are partly human stack and partly tech stack. But I do know that there's enough predictability around that that you can actually see change in the existing technology with the existing team. And that right there, I think, was the promise that I didn't feel comfortable making earlier, when I started on this, and that I definitely wouldn't have made when we were just doing implementations, because that wasn't the deal.
The deal was: I can do this technical deliverable; if you don't use it, that's not on me, because we've got a contract to make this technical deliverable. And it also goes back to this idea that consultants are very, very nervous about taking on the outcomes. So we all want to write contracts around what's predictable, which is the technology.

Speaker 1 00:30:12 And when you think about, for example, a personal trainer: I love what they do, because they are committing to the results of someone else. That is such a different thing. It depends on their behavior; it depends on their ability to say they're going to do it and follow through. And I totally understand why more consultants don't want to take on that responsibility. But for me, that became the central thing. I want to find the clients who are really ready to change, and then I want to have the program ready for them, so that when I find the ones who are saying, I've been through it, I've tried it on my own, I can't do it, what does it look like? We've got an answer.

Speaker 1 00:30:56 And I will say, that is what we developed. I also will say, I think Now IT Matters has really struggled to get to a place where it can deliver, because of all the ups and downs of business. But I'm still so passionate about the diagnostic we've developed around that, and about the work we've done around that. So I'm hoping to talk a little bit about what we've seen, for at least a couple of episodes where it's just you and I talking about what we've done, what we've seen, why we think that tech failure and digital illiteracy can actually be undone, and why I'm actually so hopeful.
And I know Katie Gibson is somebody who really brings me a lot of hope around this as well. I'm really hopeful about digital transformation for nonprofits in a way that I wasn't when I first started. So, yeah.

Speaker 2 00:31:51 Yeah, I agree with that. I mean, it's funny: it wasn't where I was taking the self-driving car analogy in my mind, but where you took it was fine. Where I was going to take it was: self-driving cars work until you see the myriad newspaper reports that we get out here in the Bay Area about people who fell completely asleep while their car was still moving, and then had to be awakened by a police officer after their car had been pulled over. It's not a technology problem, right? The technology, wherever you look, particularly now in 2022, is good enough. It is so clearly a human problem. Yep. The other place my mind was going: one of the things that I want to do before I hit the half-century mark in my life, which is unfortunately closer on the horizon than I care to admit sometimes, is ski the Big Couloir at Big Sky, right?

Speaker 2 00:32:54 What is the Big Couloir? It is a narrow, 55-degree pitch down the side of the mountain, where I have stood at the tram entrance and watched somebody miss the first turn and slide all the way down, repeatedly. There was one day at Big Sky where a bunch of us watched a guy wipe out, and he hit rocks in a way that silenced the crowd, because we sincerely thought he was dead. But why I'm saying this is because I'm not just going to drive over to Big Sky and do the Big Couloir tomorrow. What am I doing? I'm getting the right equipment.
I'm training, I'm exercising, I'm working on technical skills in steep-pitch environments at mountains where I'm super familiar with the terrain, I'm developing lesson plans for myself, and I'm engaging with teachers. Right? And what you're talking about is that kind of training for organizations, and making that training a priority, so that when they go through that gate, they don't just sort of fly everywhere and wipe out. And I think that's super important for folks to understand: this is a new methodology. Speaker 1 00:34:16 First of all, when it comes to humans and decision-making, the fact that you're even considering that is so crazy to me. I'm like, what possible benefit is there in all of that? But then, I think that's what makes me so interested in humans: that spark. The question is, what drives that benefit for people? But to that end, one of the ways that humans are predictable is that we know they are not a single-event configuration the way that technology is. You can set a workflow rule and it will just run the right way every time from then on. And we take that approach and overlay it on humans, in an end user training with a list of like 50 things to go learn. Speaker 1 00:35:18 In a two-day sitting, you know, in a room, completely out of the environment where they're going to use the technology. And humans can learn like five things in one day, and one of them is their password. So in a two-day training, you get nine things and a password, right? And there may have been 50 things per day. I've been to those trainings. That is not going to stick. And we already know that, yet we don't walk into the contract saying, your end user training will not accomplish what you think.
And neither will your documentation; that is not going to get it. What will? How you configure humans is to have them practice what they need to do, accomplish it, and then say, yes, that worked. And if it didn't, have them say, here's what didn't work, and what do we do about that? If you don't have a methodology that includes that, a series of learnings and practices around it, they don't know it. Especially in collaborative tech, it just will not work. Speaker 2 00:36:22 I mean, it's like saying to me, here, kid, you want to ski the Big Couloir? Watch a Warren Miller film. That's the analogy here, really. Speaker 1 00:36:32 It is a perfect analogy, right? Yeah. I just love that, because it's: watch someone else do it, now go back to your desk and completely change everything you've always done. And by the way, don't export this to Excel, which is exactly what's going to happen. Not to mention all of the frustration and fear: if I do it wrong, it messes it up for everybody else; I feel unprepared to do this; I'm worried that if I don't do it, I'll lose my job, so I can't tell anybody that I don't know how to do it. So yeah, it is so similar to that. And then the fear that comes next. If you go to the Big Couloir and you're afraid, A, you're rational, but B, if you're afraid because you're unprepared, that is rational. You should not go forward. Speaker 2 00:37:22 That's your job. Speaker 1 00:37:23 If your job makes you go forward anyway, how is that a responsible implementation methodology? It's not. All of us know why that's going to fail. And I just think we all feel it; it's the gravity in the room, and you can't challenge gravity. Right?
Speaker 2 00:37:42 Well, you can, but only in the Hitchhiker's Guide, where they say the art of flying is to throw yourself at the ground and miss. Speaker 1 00:37:52 That's a great place to land it, because, I mean, Hitchhiker's Guide to the Galaxy. So if you're wanting to hear more about this, we're going to keep talking about it for probably a couple of episodes, just me and Tracy, and then we are going to switch topics. So grab your towel, because if you've read anything about the Hitchhiker's Guide, you know that you need the number 42 and a towel, and you're good to go. And join us for a couple of other discussions around why tech fails, what we've learned about that, and why we think there's a different, better way to do it. I'm Tim Lockie. Speaker 2 00:38:33 I'm Tracy Kronzak, and you've been listening to Why IT Matters. Speaker 3 00:38:38 Why IT Matters is a thought leadership project of Now It Matters, a strategic services firm offering advising and guiding to nonprofit and social impact organizations. Speaker 2 00:38:47 If you like what you've heard, please subscribe, check out our playlist, and visit us at nowitmatters.com to learn more about us.
