Privacy For All with Joshua Peskay & Kim Snyder

Episode 16 August 31, 2022 00:42:36
Why IT Matters

Show Notes

Digital security has always been something that nonprofits and the greater impact economy need to consider. As the world of technology has grown and evolved, so has the need for organizations to consider the why, how, and where of their data, and, more importantly, what it means for the constituents they serve. This discussion is eye-opening and essential for everyone in nonprofit leadership who needs to understand these evolving needs. The most important thing for us all to understand, however, is that we can no longer live in a reactive space, and there are attainable, understandable means by which this can become proactive and beneficial. We're joined today by Joshua Peskay and Kim Snyder from RoundTable Technology for a discussion on making these shifts across the impact economy.


Episode Transcript

Speaker 1 00:00:07 The name of today's episode is called Privacy for All, or maybe Privacy for All, depending on what area of the country you're from. Hey Tim. Speaker 2 00:00:16 Hey Tracy. And welcome everybody to, um, another episode of Why IT Matters. We are so glad to be back. We've been on a recording pause while we figure out our lives. So Tracy, it is so great to see you again, and I'm so glad to be, uh, recording. It just, it just feels so good. So hi. Speaker 1 00:00:38 Yeah, it feels so good to be back in the saddle. And I mean, we've done some recordings that are gonna hit before this one, but this is the first one where it's like, as Peloton's instructors would say to me on a Peloton ride, it's like, I fixed my crown and my wig is fully attached, and we are back with our beautiful sort of Why IT Matters recording kind of perspective on everything. So I'm, I'm, yeah, it feels great. It feels great. And for today we're talking about digital security, and that has always been something that nonprofits and the greater impact economy need to consider. But as the world of technology and technology tools has grown and evolved, so has the need for every organization to consider the why, how, and where of their data. And I think more importantly, what it means for the constituents that everyone is serving. This discussion, I will say flat out, is eye-opening. Speaker 1 00:01:43 And I think at moments terrifying, and for everyone in nonprofit leadership who really needs to understand these evolving needs, something that we hope you can reference later as a bookmark. But I think the most important thing for us all to understand is that we can no longer live in a reactive space when it comes to security. And there are attainable, understandable means by which this can become a proactive and beneficial thing for your organization.
We're joined today by Joshua Peskay and Kim Snyder from RoundTable Technology for a discussion on making these shifts across the impact economy. And it is gonna be awesome. Speaker 2 00:02:27 Uh, welcome to the show, Joshua and Kim. Tracy and I have been friends for almost 10 years, and that's a, I feel like that's a pretty long time to know somebody. So, I Speaker 1 00:02:37 Mean, this feels like 50, but Speaker 2 00:02:39 I know, exactly. That's it, only, not even 10. Um, I count Speaker 1 00:02:45 My friendship with Tim in dog years. <laugh> Speaker 2 00:02:49 I, so I wanna get into the math on that, cuz the dog years math keeps changing. Yeah. But anyway, um, my question here is, you two are almost three times that, that you've been working together for a long time. So we will get to the tech conversation. There's a lot to cover about cybersecurity and all, all sorts of other good stuff. But I would love to have you introduce each other and say something about the other person that you've learned in nearly 30 years that maybe isn't on the website and that maybe not everybody would know about the other person. Uh, and Kim, do you mind going first? Speaker 3 00:03:28 Sure. So I'm introducing Joshua Peskay. Josh and I started working together 28 years ago. We were working at Fountain House, and he, um, did a radical thing when we were working together. And that is, no, Josh, that's not bringing Windows NT in overnight, but it was over a weekend. You created an Access 2.0 database. I dunno if you remembered that, but that, and that was early in our working together. And my big takeaway from that was, this is someone who really listens and really solves problems for people, besides the fact you did it over the weekend. Speaker 1 00:04:14 And Access, which was, wow, just, there's a lot to dig into. Speaker 3 00:04:19 That was fantastic. Speaker 1 00:04:20 Yeah. I did an Access database too. In the late nineties.
It was fun. Speaker 4 00:04:25 Didn't we all? I think it was like, it was all the rage. That was part of the problem. Made it too easy to make all these Franken-databases <laugh> rolled out in the world. Speaker 1 00:04:33 48 custom fields. Sure. Speaker 4 00:04:35 Why not? Yeah. You know, so Speaker 2 00:04:38 All right. Joshua, your turn. Speaker 4 00:04:39 All right. So yeah, so I've been working with Kim for, for over half my, my life now, which is kind of amazing. And, uh, we've been at four different organizations together. So we started at Fountain House. Uh, four years later, we formed our own business, providing IT services to nonprofits. Uh, we effectively got acquired by a foundation called the Fund for the City of New York. For a brief period of time, we went our separate ways. Kim went to Pearson, where she, uh, led an agile transformation of the entire company, um, as they, uh, transformed. And then, uh, we rejoined at RoundTable Technology, where we have been, uh, I've been here for 10 years and Kim for eight. And, uh, the thing, I mean, there's so many things I have learned from Kim, but the biggest one I will say is the absolute power of being really, truly a lifelong learner, because Kim is someone who, when I first met her, seemed like she was smart and knew everything, and just has continued to get smarter and know more things while always remaining completely humble and open to learning new things at all times. So Speaker 3 00:05:40 Thanks. So Speaker 1 00:05:45 I love, you know, some of the things that you all are passionate about most explicitly, like privacy, data access. How do you sort of manage, how do you manage the world around the people using the technology? And I think from a very high level perspective, you know, when I was working in IT for nonprofits, these were some of the issues I was confronting.
And I think the biggest shift at, at the hundred-thousand-foot level that I've noticed over the past 20 years in particular has been, you know, we seem to have come from this place of, you know, data security and privacy 20 years ago felt a lot like, hey folks, please don't do dumb things. Like, please don't put your passwords on sticky notes. Please don't, you know, give away things that you shouldn't give away. Like, it has moved radically from this very sort of reactive, please don't do dumb stuff, to what actually I think we're learning needs to be a super proactive approach to security, as the sophistication of security threats has increased manyfold and the politicization, or the polarization, excuse me, of our world around us has only increased. So I'm curious if you spot that same trend, and I'm curious if you all believe that the impact economy is equipped to keep up with what I think now needs to be a very proactive approach. Speaker 3 00:07:25 Well, and in addition to that, the third factor is the immense amount of data that we're collecting that we hadn't been, right. And so while it was protecting our passwords, you know, a short while ago, that's what privacy meant. I think the GDPR regulation Speaker 1 00:07:48 Yep. Speaker 3 00:07:49 overturned how we think about information. Information is something that belongs to a person. It is not ours. And so that's required a mindset shift for a lot of organizations thinking, well, our data is a resource, right? Data is the new oil. Well, Joshua actually has a really good saying around this: data is both an asset and a liability. And we really need to think of it as such. And GDPR, which we're seeing come to life through various state laws, uh, a patchwork of state laws, GDPR really protects the, the individual's right to privacy. And I think that's a, a way of thinking about data and personal privacy that would resonate with a lot of nonprofits. Speaker 1 00:08:44 Yeah.
I mean, it's funny, you, you, you quote, you know, you know, the perspective that data is the new oil, right? I, I will never forget, 15 years ago, when one of the investment places that I had some money with tried to pitch me on fracking. And they said every word but fracking. They were like, would you like to invest in a new methodology of oil extraction that involves high-pressure water and the ability to float oil on top of that in a manner that is much more environmentally sound than just digging a hole? And I was like, hmm, that sounds a lot like fracking <laugh>. Um, and aren't there people lighting their tap water on fire in the Midwest right now because of this? And they were like, well, we don't call it fracking. Um, where I want to go with that, Kim, is you said patchwork. And I think that's a really important thing to hammer on for data privacy. It's not one thing that we're dealing with. Europe has its own regulations, and now in the United States, every different state is developing its own regulations, because we're moving into that new era of, here we are, a system of tiny republics. What does this mean for nonprofits trying to do their work? Like, how can they keep up if this is what's going on? Speaker 3 00:10:05 Well, it actually changes the game on data management. So data management may have been something that was nice to do, a good thing to practice, but not really a requirement. We really wanna think about data in terms of data responsibility. And I think what it does is it puts an ownership of data, or it, it puts the onus on data ownership, um, within an organization. Without that, it doesn't really work. You can't maintain privacy, you can't do the practices.
And I wanna shift soon to Josh talking about really protecting privacy, cuz that's a, a different dimension. But without, you know, data management across the life cycle, when you collect information, when you store it, when you use it, when people access it, when you share it, when you delete it, without management across that life cycle, you really can't do privacy. And in today's world where we want to protect, right? Speaker 3 00:11:14 That's an important word. The data of our constituents or our donors, people who give to our causes, which may be controversial, may not, but we wanna protect their privacy in that context. Uh, we can't really do that without some level of data management, which then necessitates data ownership. And in a lot of nonprofits that we've worked in over the years, the many years, you can go into an organization and, and hear, who owns the data? We all do. And that doesn't really work anymore. So it, it needs to have a guardian. But, uh, but data privacy is, um, the data management is only one part of the equation, cuz then we need to speak to data protection. And what we're talking about there Speaker 2 00:12:08 One, um, what does, what does data management look like for data as a liability? I don't, I've never thought about, I've only thought of data as asset. So that's really intriguing to me. What are the risk points on that? And maybe, Joshua, the question's to you on that, but what are the risk points on that? And like, how does it go wrong? Like, what, what are the implications here? Speaker 4 00:12:35 So the simplest, uh, way to describe it is, is liability around, you know, privacy laws that carry fines. So if you, Tim, and you, Tracy, give me your data, and you're New York state residents, and that data is breached and I fail to notify you, and I haven't taken reasonable safeguards, then I'm subject to fines. I believe it's $750 per data record. So, you know, I could be out $1,500 for the two of you.
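Kim's life-cycle framing a moment ago (collect, store, use, access, share, delete) is concrete enough to sketch. The snippet below is an editor's illustration, not a RoundTable tool: the dataset name, owner title, and stage notes are invented, but the structure shows how a simple data inventory gives every dataset a named guardian and an answer for each stage of the life cycle.

```python
from dataclasses import dataclass, field

# The stages Kim names in the data life cycle, in order.
STAGES = ["collect", "store", "use", "access", "share", "delete"]

@dataclass
class DatasetRecord:
    """One entry in a simple data inventory (illustrative structure)."""
    name: str                  # e.g. "donor contact list"
    owner: str                 # a named guardian, not "we all do"
    contains_pii: bool         # does it hold personal information?
    notes: dict = field(default_factory=dict)  # per-stage answers

    def unanswered_stages(self):
        """Stages of the life cycle nobody has documented yet."""
        return [s for s in STAGES if s not in self.notes]

inventory = [
    DatasetRecord(
        name="donor contact list",
        owner="Development Director",
        contains_pii=True,
        notes={
            "collect": "web form, with consent checkbox",
            "store": "CRM, access restricted",
            "share": "never sold; mail vendor under agreement",
        },
    ),
]

# Flag any personal-information dataset with undocumented stages.
for rec in inventory:
    gaps = rec.unanswered_stages()
    if rec.contains_pii and gaps:
        print(f"{rec.name}: document these stages -> {gaps}")
```

The loop surfaces exactly the gap Kim describes: a dataset whose use, access, or deletion nobody has written down is one you cannot honestly claim to manage.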
Uh, you can imagine if I had a thousand records, right, um, what that could look like from a liability standpoint. GDPR, you know, fines of up to 4 million euros, I think it is. Um, you know, and, uh, so that's the easiest one. Is that, you know, to your point, Tracy, if I, if, if, if I think of data as the oil, right? If you lose the oil, right, Speaker 4 00:13:27 or if I lose my gold bricks or my hundred-dollar bill, um, I've lost that asset and I've lost the value of it. But think of if, in addition to that, right, the oil came back and said, you owe me, you know, another thousand dollars because you lost me, right, because I got stolen, right. Or the gold bar that I lost said, you owe me another gold bar because you lost me. So you're out not just the one you lost, but another one as well. That's kind of what it looks like. Um, and then, Tim, the, all the other things, uh, that create liability are around the kind of cybersecurity risk. So you've got ransomware. I've got the data, I need access to the data, and if I lose that, then I might have to pay someone to get it back. You think, okay, I've got backups and all that good stuff. Speaker 4 00:14:15 Well, now there's ransomware 2.0, where I'll say, well, not so fast. You've got the backups, but I'm going to sell all of this data of all of your constituents on the dark web unless you pay an extortion fee. And then ransomware 3.0: I'm gonna take these thousand records that I lifted from your donor database, and I'm going to use that data to specifically target all of these individuals for cyber crime activities, right, unless you pay me some, you know, additional fee. And they're probably gonna do that anyway. So there's all of that liability that's created. And Tracy, I think you were talking before about how, you know, 10 years ago, 15 years ago, cybersecurity was kind of the nice-to-have. And a lot of people were like, eh, we'll get to it.
If we feel like it, we'll not write our passwords on Post-it notes, it's whatever. And you kind of asked the question, you know, is the nonprofit sector ready and able to really take security seriously? Speaker 4 00:15:11 And I guess my, hopefully it doesn't sound glib, answer to that is, it doesn't matter, cuz they're gonna have to, or they're going to suffer terribly. And what that's going to look like is, and we're seeing this already, is you're not going to be insurable. So right now, anyone who's listening to this, and you've got your cyber liability renewal coming up, and you don't have multifactor authentication implemented on all of your key systems: you're not getting insurance. It's not that your insurance policy is gonna be bad or more expensive. You will be denied, right? And you're gonna have lots of other requirements that are coming up. Your partners, anyone who's sharing data with you, is going to be, if they're not already, they're gonna be sending you questionnaires saying, what are the, you know, 20-question questionnaires saying, what are your security practices? And if they don't like the answers, they're not going to be sharing data with you anymore. They're not going to be a partner of yours anymore. So these are problems that are happening today at nonprofit organizations that we are working with. This is not speculative. Um, this is literally stuff we're dealing with day in, day out, and have been for a couple of years now. Speaker 2 00:16:22 The, the interesting, is it interesting, disturbing? Like, um, it can be, yeah, it sure, sure. But it's like Stranger Things, right? That's interesting. It can be both. You're exactly right. In fact, there's probably a relationship between disturbing and interesting, but that's for another podcast, I'm sure. Um, Netflix makes a lot of money on that, on that. I also have no idea where I was going with something that was interesting and disturbing. No, I do.
Um, I think what, uh, is disturbing about that is looking down, like, from the experience I've had with nonprofits, which, I've been in nonprofit work since I was 18, so it's a long time for me, there is clear evidence to me in past behavior that nonprofits respond more to risk than opportunity. And so, Kim, what you are talking about here in data management could be done well holistically, or it could be done as a risk assessment. Speaker 2 00:17:27 And only, only looking at how to buttress up their data, instead of saying, we need to manage both the asset of data and also the liability of data. And those are actually pretty much the same functional actions, that, you know. Um, but my concern is that instead of doing both, they'll only do the liability and not actually use data, but they'll protect data they're not gonna use. And it's not even high quality in the first place. That just, like, the, um, that just is very hard for me to wrap my head around, because I can see that as a reality that could unfold really easily if nonprofits are not thinking about this, um, together. And I think some of that is, consultants need to be clear: the actions to preserve data. And this is why I love what you're saying, Kim. The actions to preserve data is data management. And if you do that part well, and you know how to do those actions, you can also take the extra steps on protecting it. Do you agree with that? Is that a trend that you see? Is it a concern you have as well? What are your thoughts? Speaker 3 00:18:30 It can be hard, um, Tim, to sell, like, risk. And this is another Josh-ism I'm gonna borrow, but he's right here: that risk ain't sexy, you know? It's, it's, it's, it's, it's just, you know, because it's, you're paying for something that then you don't get exposed to. So yeah, yeah. And Speaker 4 00:18:49 No one's looking to fund the most secure nonprofit, right? Speaker 3 00:18:52 Yeah. <laugh> Speaker 2 00:18:56 Yeah, totally. That makes sense.
Speaker 1 00:18:58 There's a Simpsons thing that I quote in moments like this, and, and then honestly, it's, of Speaker 2 00:19:04 Course, of course there is, of course there is. Speaker 1 00:19:06 Of course there is. Uh, it's Lisa Simpson saying to Homer, you know, it's like, she's like, they're talking about spurious logic, and she says, you know, it's like me saying, this rock keeps tigers away, right? You can't prove what doesn't exist. And Homer's like, oh my God, gimme the rock, right? Speaker 3 00:19:25 So Speaker 1 00:19:26 Like, it's really hard to talk about it. And I mean, I'll be honest and say, I haven't looked into cybersecurity since the era of, like, how to not get a national VPN, like, hacked. But, you know, are you seeing trends? I guess my follow-up question is, is, are you seeing trends in classes of nonprofits and the needs that they're going to encounter in this? And, and, and what I mean by that is, you know, when I moved out of that era of my career and into the cloud computing era of my career, you know, I didn't worry about security as much, because the businesses that were selling cloud platforms had to do that worrying for me, Speaker 3 00:20:12 Oh, Speaker 1 00:20:13 To a greater and lesser extent of success, I will caveat that. But the, are there sizes of nonprofits that are gonna encounter different kinds of problems based on, you know, what the data is they're working with and what their overall, you know, panoply of tools looks like? Speaker 3 00:20:38 Well, data sensitivity and the type of data that you happen to work with, that you collect about people, and how you use it, that actually speaks to the whole data ethics, data privacy, security side of the world, which I do think resonates a whole lot more. A lot of the questions that you ask to do a data inventory are, are, some of them, similar to the ones that you would ask if you were doing a data ethics review. Mm. Why and how did we collect that?
And so I think framing it that way, I mean, there's, sure, there's fear around the cybersecurity side, because we hear about hacks and bad things happening. And there are horror stories, and they, they happen all the time, right? And so people, or nonprofits, maybe five years ago were, oh, well, that won't happen to us, or, oh, we're, we're, we're too small. Speaker 3 00:21:39 We don't, you know, but that possibly is fading. But data privacy, I think, at, at the current place we're in, I don't think it resonates yet. It's like, oh yeah, but that's California's law, and that's, oh, California, and it's all commercial businesses. Yep. GDPR applies to more, more organizations than realize it, because it's, you don't have to collect their whole lives. It doesn't have to be really sensitive. It's basically personal information. So people should pay attention to GDPR. It does set up a good foundation. But I think thinking about it and reframing it to data ethics and data security and protecting your organization is, can resonate. Speaker 4 00:22:27 Yeah. I think the ethics part is huge. Cuz, you know, if I ask you, Tim, and you, Tracy, to give me information about yourselves, you know, either as a constituent or as a donor, you know, I, I'm really making a pact. And I'm saying, you know, I'm asking you to trust me with that information, that I'm going to steward it well. I'm not going to, you know, share it with people that I haven't asked you permission for. I'm not going to use it in unethical ways. I'm not going to combine it with other data to get more information about you than you would have agreed to. I'm not gonna sell it. I'm not gonna lose it because I didn't protect it, you know, in any meaningful way. And I'm gonna get rid of it when I no longer need it. So I'm not holding onto your information and exposing you to risk that doesn't need to be. And that's a lot, but that's ethical to me. Yeah.
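Joshua's last commitment, getting rid of data when it is no longer needed, is also the easiest one to make routine. Here is a minimal sketch of a retention check; the 730-day window and the record fields are assumptions for illustration, not a recommended policy:

```python
from datetime import date, timedelta

# Assumed policy: review roughly two years after last legitimate use.
RETENTION = timedelta(days=730)

def past_retention(records, today=None):
    """Return records held longer than the retention window, i.e.
    data that exposes people to risk without serving a purpose."""
    today = today or date.today()
    return [r for r in records if today - r["last_used"] > RETENTION]

records = [
    {"id": 1, "last_used": date(2019, 5, 1)},   # stale: flag for deletion
    {"id": 2, "last_used": date(2022, 6, 15)},  # recent: keep
]

for r in past_retention(records, today=date(2022, 8, 31)):
    print(f"record {r['id']}: review and delete")
```

Even a periodic report like this turns "we'll delete it when we no longer need it" from a promise into a practice.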
If I'm going to ask you to give me your data. And I think that, to Kim's point, might be a stronger argument to make to the nonprofits than the you're-gonna-get-burned, although you are gonna get burned. Speaker 2 00:23:32 <laugh> Yeah, I think that that's, I think that's well said. And, um, it, it, you know, it's more holistic. The, the deeper I get into my work in the Human Stack, the more I work with nonprofits around technology, not just, you know, thinking in the box, outta the box, but really thinking about the box itself. The more I get into that, the more I realize you can't parse this into one thing. You need a holistic framework that, that makes all of it make sense, because without, without all of the interdependencies on it, you pick one thing and you invest in that, and you missed another five things that could have been part of that investment. And I just think that we're, we're now getting to a level of sophistication with nonprofits where more and more leaders are seeing that, but don't know how to execute on it. Speaker 2 00:24:27 And more and more consulting firms, I feel like, have been taught so clearly, you know, how long the sales cycle is, how hard it is to do business with nonprofits, that they're very, very hesitant to include anything that might not get them in the door to do the work that they need to do. Um, yeah. And so I, you know, it's, it feels interesting to think about this as part of the holistic solve on, on some of that. And I know that you do a lot with, you know, technical strategy, and you look at just solving problems for people. That's what you're passionate about. What does that look like? Uh, what does that holistic kind of framework and tech strategy look like in your work? Speaker 4 00:25:09 Can I take this one? Yeah.
And I wanna circle this back to something Tracy asked a while ago about the kind of classes of nonprofits, Tim, because I think this connects to that a lot. And, and, and I'll, I'll want everybody to sort of comment if you observe this too. I think there's this idea called, you know, that, that's referred to sometimes as technical debt, and that can exist in the form of, you know, we've got a bunch of 10-year-old computers that we don't have the money to replace, and so now we have to spend more on maintenance, and our staff are quitting cuz the computers are so slow. And I've, I've got an article about that we can share that kind of describes it. It can happen in, in applications as well. It can happen in staff skills and in culture. And what I am seeing, uh, Tracy, to touch on your point about classes, is that leadership can make just an enormous difference in a fairly short amount of time, because in our roles, Kim and I have wound up, especially in probably the last five years, working with organizations that come to us with enormous amounts of technical debt on every level. Speaker 4 00:26:13 But leadership, that part of their choice to work with us is, even though we are serving nonprofits, we have matured a lot as people and as an organization, where we've said, we're not just trying to get in the door at the nonprofits. We're saying, look, if you wanna work with us, we need a commitment to, like, make changes and get better. And so the executives that we're working with are committed to those changes. And if you have the right either internal staff at the nonprofit, um, or you're working with the right outside consultants, you know, uh, uh, a Now IT Matters or a RoundTable or someone who's high quality, and the leadership is committed, then, you know, Tracy, to your point, whether you're five people, 500 people, whether you're already in pretty good shape technologically or you're terrible, you can really get a lot better fast.
And if you're not doing those things, then to me, you're in a class that is falling, really, like, behind at a rate that we have not seen before. Speaker 2 00:27:18 That's so interesting, you raising that. Um, I don't know if you pay attention to the work that, uh, Katie Gibson is doing in Canada around digital resilience, uh, with, with Canadian nonprofits specifically, but, um, there was a, a LinkedIn explosion of comments to a question that she asked a couple of months ago. That was, what, you know, consultants that work with nonprofits, what do you see when you're working with executives? Have you seen an increase in their interests? And there is so much around what you're talking about, and, um, you know, really, really great thread there. Um, and so I think that this is emerging. And in the work that I've found, part of what's, part of what's in the methodology I'm promoting on the Human Stack is that the, the relationship between that executive that's holding, uh, strategic accountability and somebody working on the tacticals that's holding tactical accountability, that relationship between those two people is the most critical part of digital success in the long term. Those roles can even swap out, but when they swap, it isn't that they perform those actions as much as they create the right kind of relationship to keep that moving forward. So I, I love what you're promoting about that. I think that's, that's really helpful. And I know I was talking over you, Tracy, and that you probably have stuff to add in too. So Speaker 1 00:28:46 I always have stuff to add, but you're not talking over me. I mean, you know, I'll just insert myself however I need to, you know that. Um, you know, Josh, I actually have two follow-ups for that. Uh, one is, also one of my follow-ups is an observation, and, and it connects to everything that you and Tim were just talking about.
And that is, you know, in my day job, I work in business partnerships, and I will say there has been, particularly in the past couple of years, a notable shift in the types of business partners working with nonprofits on the implementation side, where many more of them are saying to me, you know, I'm not interested in just shotgunning your technology. I'm interested in working with your technology, but my real core, you know, observation and interest is leaving an organization better equipped to work with technology and work with itself. Speaker 1 00:29:41 So that is one of the trends. And, and I've said this out loud on many occasions: what we need to do next in the world of business partnerships is create a landing zone for that more strategic work with organizations. And that has everything to do with the sales cycle that Tim's talking about, because that does not accelerate a sales cycle. You know, it actually slows it down, but it leaves an organization in a better place. And I think that's a trend that's emerging everywhere. My actual follow-up question connects to some of the things that you and Kim were both talking about in terms of regulations and privacy expectations across the United States. And this is the first recording we've done post-Roe. Uh, and I am wondering, are you at all concerned that the trend of eroding the constitutional right to privacy is going to start undermining data privacy as well? Speaker 3 00:30:45 Well, yeah. When Roe was overturned, you know, the privacy community, of which I'm a part, had a strong reaction. It is, this is a privacy issue. Speaker 3 00:30:58 Uh, you know, one of the things, well, and this is getting to that topic, though, of data protection, right? All of a sudden that takes on an elevated meaning, especially if you're working in an area where, wait a minute, privacy, basic privacy and the right to privacy can't necessarily be assumed. Never should, but now there's some real consequences, right?
So I'd say, yeah, that is, that is changing substantially. And depending on where people are working, and what level of risk they may be unwittingly exposing themselves or other people to in an effort to help them, is, is something that people need to be thinking about, um, a lot more carefully. And protecting one's self, protecting my communications with another, right? So if I'm, I'm, if I'm talking to Tim, and we're talking about something that may be controversial, that may, uh, a court may wanna, uh, hear something about, how do I make sure that those conversations can be protected? Am I even aware of the risks I'm in? So I think, yes, that that has, um, changed things, um, pretty substantially. There is one thing that I wanna get back to, though. Um, but it, it may, can I pick up on that first? Speaker 4 00:32:36 Yeah, and then we'll, yeah, I'll try to be quick, though. But, um, you know, I've been working with kind of at-risk organizations and individuals, um, around cybersecurity risks for the last decade. And so I work, you know, with journalists that work internationally on human rights and civil rights. And, um, and interestingly, a lot of the climate change activists around the world deal with this problem a lot. And it's interesting now that it's come to America. But so, I've been dealing with what I'll, I'll refer to as the legal subpoena threat model for quite some time, which is, if I'm doing climate change work and I live in India, right, and I'm working with a US NGO or a nonprofit on climate change, well, that's potentially a crime in India, and the Indian government can just legally subpoena Google for the data. And that's happened, right? Speaker 4 00:33:26 Uh, you can look up a woman named, uh, Disha Ravi, I believe, um, you know, was, was arrested for just accessing, uh, a Google document shared by Greta Thunberg. Uh, and so that threat model, with, I believe, the Dobbs decision, is now officially here in the United States.
And if, to Kim's point, if you are communicating on Google or on Dropbox or on Facebook or on any of these platforms about something that is potentially a crime in a US state, you should think of that information as being just as available to law enforcement as it is to Google or to Facebook. It's there. You, you think that there's a distinction there, you may, but there really isn't. And that's really important for people to understand as part of threat modeling their, you know, privacy and cybersecurity risk. So I just wanna state that really quickly. And Kim, thank Speaker 1 00:34:22 You, Joshua, for giving me, yeah, absolutely, something else to keep me awake at night. I hadn't even gotten there. Sorry. Like, most of my public policy work is, you know, actual public policy work now, not, you know, data security. Oh, God. Wow. The whole context of that just sunk into me while you were talking. And there, thank you for adding to the Tracy fears list. I'm sorry. Thank you. No, it's okay. Thank you. Okay, go. Speaker 4 00:34:53 Tor and Signal. Tor for browsing, Signal for communicating, on anything you wanna search or talk about that you might be worried about. That's the quick and simple advice I can give. That's not all of it, but, Speaker 3 00:35:07 And they're available technologies, actually. They just require an extra step. Um, you know, and, and that's, that's something that, depending on, you know, the, the thing I wanted to get back to was this idea of incremental change and reasonableness. A little while back, Josh was saying, you know, reasonable safeguards. So, Tracy, if I get your data, am I gonna have reasonable safeguards, right? So if we hold onto this idea of reasonableness, because it can be really scary, like, all of a sudden, like, oh, maybe you shouldn't collect any data, right?
I'm not saying that at all. But incremental change toward building better practices and management and accountability is what the privacy community is looking for. Not that a nonprofit with a staff of 20 has the security apparatus of a Fortune 500 company overnight, right? That's not it. So it is manageable. And that's one of the things that, I feel, Josh and I have in our working together with nonprofits: the power of kind of long-term, incremental, sustained growth, as opposed to big-bang digital transformation and you're done. I think that leads to meaningful, responsive ways to deal with things like GDPR, or, depending on the area that you work in, the need to really lock down some of your communications and change your culture around that internally. Speaker 2 00:36:54 Kim, thanks for that. And, um, thank you also for not just talking about this here, but having created free resources on your site, you know, that promote this, that give tips around this, that allow people to dig into what you're talking about practically. Because I agree, this conversation has been like a can opener with my brain on all of these things that I sort of don't even wanna know about. And that's kind of the point. There's a motivation to not know and live in, you know, blissful ignorance, and we just don't have time for it anymore. It's just too risky for the work that we do, for the impact that we have in our communities and around the world. We just can't live in that level of ignorance. Um, thank you so much. We really love what RoundTable's doing. It's been so amazing to get to know you a little bit and to talk about these issues. It's so clear that we could hang out and talk about this all day long, and maybe someday, cuz that would be really interesting. Um, thank you. Thank you for that.
Um, any last words or tips from each of you to our listeners before we have to go? Speaker 4 00:38:17 Well, I'll say, if some of the things we've said have, you know, Tracy, given you another thing to stay awake at night, we have a really great free guide on our site called the Tabletop Facilitator's Guide. A tabletop is a really simple, easy thing you can do at your organization where you just take a scenario or two that you're worried about, you write something up, and then you play it out with your team and say, okay, if this happened, right, if, you know, someone in Texas subpoenaed Google for the information that we were communicating about, what would they get, what risks would that have, what would we do about that? Right. Um, and so that is something I would encourage people to check out. And, you know, if you're interested in engaging with us, a very inexpensive and easy engagement is that we can facilitate a tabletop for you. So we'll do a quick threat model, write up a couple of scenarios for you, and do that with you as an activity. That can be a really quick, easy way to get to know us and also help take a look at some of your concerns. Speaker 1 00:39:17 Yeah. Thank you. I think the silver lining here truly is: this is manageable. Don't walk away in fear. This is manageable. And I mean, I remember that from consulting on technology years ago. People were afraid of it because it felt unmanageable, and then it just turned into something else we needed to educate ourselves about. And I think that's an amazing way to sort of wrap this up with a beautiful silver lining and say, hey, look, don't be afraid. This is something else to manage. That's all it is. Any last thoughts, Kim? Speaker 3 00:39:49 So I have one kind of tip, and it's another resource, actually, on our website.
And that is, we wrote an ebook and have various resources on the New York SHIELD Act. And, you know, not everyone listening is from New York and cares, right? But the thing the New York SHIELD Act does, and people don't necessarily consider it a data privacy law as much as a cybersecurity one, is that, unlike a lot of these other regulations, it really provides an almost clear, understandable outline of what reasonable cybersecurity practices look like. And so what we've done is translated that into something manageable for a nonprofit, and we have that as an ebook. And that's a good place to start in thinking, like, oh, what does reasonable cybersecurity mean? What does that mean? You can read about it and learn about it. And again, it's reasonable, not perfect. Speaker 2 00:41:00 Thank you for that definition. Thank you for those resources and for this conversation. We really appreciate it. Um, and keep up the good work, please. Yeah, keep this messaging going. I think it's really critical for those of us in this space to better understand that. Speaker 1 00:41:17 Yeah. This whole dog really appreciates learning new tricks. So thank you for giving us something else to learn. Thank you. Speaker 3 00:41:24 <laugh> And us Speaker 4 00:41:26 As well. I'm ready to get to meet your actual dog, though, <laugh> at some other meeting. Speaker 1 00:41:33 Thank you so much. This is great. Speaker 3 00:41:34 Yeah. Thanks. Yeah. Thank you. Love what you're doing. Thank you. Speaker 1 00:41:39 This is Tracy Kronzak, Speaker 5 00:41:40 And I'm Tim Lockie, Speaker 1 00:41:42 And you've been listening to Why IT Matters, an independent production that captures our passions, personalities, and purpose for technology as applied to the impact economy. Speaker 5 00:41:52 All of that's important, but even more important:
We are here to have fun and introduce some of the people and ideas that keep us up at night and get us out of bed in the morning. Speaker 1 00:42:02 We are so grateful that you've been listening to us. We have no idea why you'd wanna do that. Maybe you lost a bet, maybe you're stuck in a car with someone else controlling the sound system, or maybe you are truly interested in what we have to say. Speaker 5 00:42:17 Whatever the reason, whether it's a bet or you're a believer, would you hit subscribe? Or, if you've already done that, would you mind leaving us a review? And if you're really brave or wanna punish someone, please recommend this podcast to your friends, enemies, and family. Speaker 1 00:42:32 And all kidding aside, thanks for tuning in. And we are so glad that you're here.
