
The Secure Developer | Ep 98

Security Education with the Code Doctor

with Jet Anderson

About this episode:

In episode 98 of The Secure Developer, Guy Podjarny speaks to Jet Anderson from Nike to discuss education, specifically security education, why it matters, and how to get it right. Jet is a secure software architect, writer, speaker, and evangelist of DevSecOps. A former software engineer on a mission to teach today’s developers to write secure code as part of modern DevOps pipelines, at speed and at scale, he is also the host of a weekly podcast and training program at Nike, known as Code Doctor. Tuning in, you’ll find out why Jet considers himself a developer advocate at Nike, why he chose to invest in security education for developers, and some core principles for training success, as well as the value of informal learning and whether or not gamification is a game changer.

Tags:

Application Security
AppSec
Open Source
Secure Development
Security Transformation

Episode Transcript

[INTERVIEW]

[00:01:38] Guy Podjarny: Hello, everyone. Welcome back to The Secure Developer. Thanks for tuning in. Today, we’re going to talk about education, specifically about security education and why does it matter and how to get it right. To guide us through this journey, we have Jet Anderson, who is the Code Doctor at Nike, which we’re going to understand a bit more about what that means in a moment. Jet, thanks for coming onto the show.

[00:02:00] Jet Anderson: Thanks, Guy. Appreciate it. I’m glad to be here. It’s a topic near and dear to my heart, indeed.

[00:02:05] Guy Podjarny: Yeah, for sure. Jet, before we dig in, tell us a little bit about what is it that you do and maybe a bit about the journey. How did you get into security and kind of the path you took through it?

[00:02:16] Jet Anderson: I am primarily responsible for developer security training and education at Nike from a fix-it standpoint. So how do we solve problems in software that could lead to vulnerabilities? I was a software engineer for 20-plus years. In my last role, I was the engineering manager for a team at a big bank in the United States, and we managed a platform that was responsible for processing close to $3 billion a month in check payments, so just lots of pressure for five-nines availability, that sort of business.

I got in a fight one day with InfoSec. They brought me a report over 300 pages long with thousands of what they called “vulnerabilities,” air quotes, and just said, “Fix it.” No prioritization, no guidance, and, oh, by the way, you have 30 days to fix all of these or you go on a report that goes to the CEO of the company, and you stay there until they’re all fixed. So I said, “No, I quit.” And I seriously did quit. I knew that there had to be a better way to do secure software development, and I wanted to find out how and be part of the solution, instead of this throw-it-all-over-the-wall, adversarial relationship between information security and the development community.

I spent the last eight years gaining an in-depth knowledge of security testing practices, really trying to fit security into the model of DevOps and continuous integration and deployment, practices that pose so much of a challenge to that testing process. What I found very quickly is that one of the biggest deficiencies we have isn’t in our ability to test code or applications, but in our inability to educate the developers who are responsible for delivering value for the company. So I see myself as a developer advocate at Nike and within the community at large, and I really want to try to bridge that gap and create a partnership for delivering fast but delivering the most secure product you can.

[00:04:26] Guy Podjarny: I definitely love that approach. Code Doctor is an interesting kind of thesis, and it doesn’t have security in the name. It doesn’t say Security Code Doctor or the like. Is that intentional? Is that like security is sort of separate or –

[00:04:40] Jet Anderson: No. That’s a great question, Guy, and I think it really gets at the ethos of the program in general, which is that security is just another element of code quality. It’s not something off by itself that we need to worry about. If you ask a developer to make their application the highest quality they can, they will do what they can to make it high quality, right? We’re going to do linting. We’re going to do functional testing. We’ll do unit testing. We do all the testing we possibly can. Security is one element of that, and we can do it better or faster in various ways, but it’s just one piece of it. Architecture is another piece of it. Training and education is a piece of it. All of these things lead to better software, better experience for our customers, better bottom line for our company. All of these things are part of that.

So Code Doctor was a desire to bridge that gap, to really help be the advocate. I work really hard to try to change the culture within Nike so that we collaborate and respect each other and assume best intent, those sorts of practices, where it’s not us versus them. It’s not adversarial. There’s not a tug of war. But instead, it’s how can we do this the fastest that we can, delivering the best quality for the business as a partnership, and it’s all of those things together. It’s education. It’s architecture. It’s training. It’s collaboration. It’s testing. It’s fast. It’s give and take. It’s accepting risks sometimes. All of those things are part of the journey.

[00:06:18] Guy Podjarny: Yeah. Well said, and I think that fully resonates with me. Security has its own kind of competency in it, but it has to be seen as part of the proper ethos. So I guess before we dig into how, and what’s working and what’s not working, maybe let’s take a moment and talk about why. So why invest in security education for developers?

[00:06:40] Jet Anderson: At least up until a few years ago, I had been scouring the university ecosystem here in the United States, looking for computer science degree programs that had anything to do with security as a requirement for graduation. I couldn’t find any. I know that there are some now; Johns Hopkins, Columbia, MIT, those kinds of places, the upper echelons, if you will. But in general, in the public university systems, we don’t set that expectation. So we graduate entire classes of developers, or computer scientists, if you will, who aren’t really considering security as a part of their application development process, of their workflow, of anything. So they’re generally thrown into a bath of cold water when they enter the enterprise, because we have these expectations for security. We start giving them these static analysis results and software composition analysis results. They’re like, “I don’t – What do I do with this? I’ve never heard of cross-site request forgery or cross-site scripting or SQL injection. I don’t know what this means. How do I fix it?”
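The SQL injection defect Jet names can be made concrete with a short sketch. The table, function names, and payload here are illustrative only, not drawn from any real codebase:

```python
import sqlite3

# In-memory database with a sample users table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # DEFECT: string concatenation lets crafted input rewrite the query.
    query = "SELECT role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # FIX: a parameterized query treats the input strictly as data.
    return conn.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload that turns the WHERE clause into a tautology:
payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # returns every row, not just one user's
print(find_user_safe(payload))    # returns no rows: no user has that literal name
```

The point of the fix is exactly the "security defect, not vulnerability" framing Jet uses later: until someone sends that payload, the concatenation is just a flaw waiting in the code.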

If we don’t educate developers – I mean, if you think about the software development lifecycle, we love to use this phrase, ‘shift left’, right? That is, do all the security things as early in the SDLC as possible so that we don’t have to do them when we get to prod. NASA did a study back in 2012, looking at their own software development practices, and found that it’s between 40 and 1,000 times more expensive to solve security defects, or any kind of defect, later in the SDLC than if you just did it at the beginning. I couldn’t think of a place earlier in the SDLC than a developer’s brain.

So if I can train them, “Hey, here are some of the classes of defects you should be most familiar with, here are some simple ways that you can get after those, and here’s a vast set of resources that you can use to learn more. And by the way, I’m your advocate. I’m your help. I’m your ally. If you want help, come to me directly, and I will work with you, walk with you, strive with you to help you learn and grow and be the best and most secure developer you can be.”

[00:08:59] Guy Podjarny: Yeah. You had a statement there, “If you want some help.” So let me start from maybe the most popular objection, which is that developers don’t care. They don’t really want security education, and you kind of need to want to learn something to internalize it. What’s your perspective when people come along and say, “Well, developers don’t care about security. They just check the box when you throw a training class at them, rather than trying to learn something, so you might as well do the minimum.”?

[00:09:32] Jet Anderson: To put it bluntly, that’s horseshit. I don’t find that to be true at all. In fact, I find developers to be intimately aware of and desirous of good quality, and they want to do the right thing, right? That’s one of the maxims here at Nike: do the right thing. If all you do is slap people on the hand and shame them for not knowing something, then you’ve immediately lost the opportunity to create a trusted adviser relationship with them and to collaborate on doing something great. I think that’s largely the reason why folks in InfoSec have felt like developers don’t care. It’s not that developers don’t care. It’s that often folks in information security don’t necessarily have the deepest knowledge of software development, and so they may lack the credibility, or even the language sometimes, to accurately explain the risk or even accurately explain the flaw.

For example, information security is fond of using the term vulnerability. In static analysis especially, and also sometimes in software composition analysis, secrets detection, and so forth, what we’re finding are not necessarily vulnerabilities. We are finding security defects that may represent a flaw that could, if an exploit is found, lead to a vulnerable bit of our application. So I’m really intentional about using the term security defect when I talk to developers. It’s just a flaw, not any different from any other flaw that you find in Jira, described with some sort of understanding, even a proof of concept sometimes.

I don’t call anything a vulnerability until it is discovered by a pen test, or potentially a DAST scanner or something like that. Those are actually vulnerable pieces of code. But prior to that, they’re security defects. So it’s that small sort of minutiae that is part of the change in culture, understanding developers. I’m a software engineer. I’ve been writing code for a long time. I understand how software gets built. I was very intentional, in fact, when I started my security journey eight years ago, not to choose the standard certification, the CISSP, the Big Daddy, if you will, of certifications.

I went for the CSSLP, the Certified Secure Software Lifecycle Professional, because that’s the bit that I’m passionate about, and I didn’t want the mile-wide, inch-deep sort of certification. I want to go deep on software security and let other people worry about the network and file integrity monitoring and all that other stuff that I know is important. Malware reverse engineering is definitely a killer field. I applaud the people who do it, but I’m going to stay in my lane and deal with software.

[00:12:42] Guy Podjarny: Well, it’s a pretty big problem, and it’s hardly solved. I think we’ll be plenty busy. So now we’ve established the passion for security education, and I fully relate to the fact that developers absolutely care. But generally, as an industry, we’ve made it too hard. We can only do so much if caring has to be ratcheted up so high that it clears the barrier to entry created by the terminology and the technology and all of that. So let’s dig into how we make it work. I imagine you’ve got some core principles, some methodologies. Tell us a bit about how you approach security education for developers.

[00:13:25] Jet Anderson: The first thing that I did was to make an assessment of what kind of tools we had in place to do security training and how effective they might have been. So we had a catalogue of computer-based training in our learning management system that was available for folks to consume, if they wished. We looked at that engagement and found it to be almost non-existent. Unless somebody has it assigned by their manager, they’re not going to take these classes. They just don’t. That was the first problem.

The second problem is there’s not really an appetite in many enterprises to make security training a mandate, right? You don’t mandate this sort of training. You might mandate security awareness training, certainly, or anti-phishing-campaign sort of training. Please-don’t-click-that-link training, absolutely. Those are the kinds of things that we want to mandate. But as far as training developers, there’s not really an appetite for that. So I found that I really had to go to each of the leaders within the major organizations here at Nike and talk to them about what we’re seeing in terms of the kinds of defects that exist, the ones we see most frequently, and really point to the opportunity to train developers.

The first one that I worked with said, “Absolutely, let’s do it. Whatever you want, we’ll buy it. We’ll train the developers.” I said, “No, no, no. I don’t want you to buy anything. I’m going to create a class and I’m going to teach them, because I’m passionate about this, and it’ll be hands-on. We’ll do some demos. We’ll show them how scary cross-site scripting can really be. It’s not just alert(1) tests, right?” Then he said, “Okay, that’s great. Go ahead and do that, and I’ll make it mandatory for everybody in my org.” So in three months, we trained 250 developers.
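The alert(1) demo Jet mentions is normally shown in a browser; as a rough server-side sketch of the same cross-site scripting defect (the function names here are hypothetical), the flaw and its fix look like:

```python
import html

def render_greeting_unsafe(name):
    # DEFECT: user input is interpolated straight into HTML markup.
    return "<p>Hello, " + name + "!</p>"

def render_greeting_safe(name):
    # FIX: escape user-controlled text before it reaches the page.
    return "<p>Hello, " + html.escape(name) + "!</p>"

payload = "<script>alert(1)</script>"
print(render_greeting_unsafe(payload))  # script tag survives: a browser would run it
print(render_greeting_safe(payload))    # &lt;script&gt;... renders as inert text
```

Real templating engines (Jinja2, React, and so on) do this escaping automatically, which is part of why "use the framework's output encoding" is a staple of classes like the one described here.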

I created the Code Doctor training program, the introduction to secure software development. It’s a three-hour class. I actually joke it’s four hours’ worth of me talking in three hours, because it’s drinking from a fire hose, I’ll admit. I mean, it’s the OWASP Top 10. It’s the OWASP Top 10 Proactive Controls. It’s a flyby of the Application Security Verification Standard. We talk about attack-driven logging. We really try to boil the ocean in three hours. It doesn’t result in people who know everything. It results in people who’ve been exposed to a lot and now have the language, the opportunity, and the resources to dive deeper, and who know where to find answers when problems are found.

[00:15:50] Guy Podjarny: This is kind of the 101, so everybody should undergo this, so they have a little semblance of understanding of what it is you’re talking about, some basic taxonomy.

[00:16:00] Jet Anderson: Absolutely. It’s the university class they never had in three hours.

[00:16:06] Guy Podjarny: Three hours. That’s –

[00:16:08] Jet Anderson: Exactly. We saw some positive results from that. We saw some developers addressing stories in their backlog that had the potential for being problematic, and we were able to work with the business to put those on hold until they could be re-architected in a way that would be more secure. All simply because they now had a framework by which to think about these things, and the word got out. That was just with the one org that I was working with, and that VP went to others and said, “Hey, we did this Code Doctor thing, and our people think it’s amazing, and you should do it too.”

That quickly went from, “Hey, Jet, teach this one org” to “Hey, Jet, by the way, now your full-time job is teaching all 5,500 developers at Nike how to write secure code.” I was like, “Hey, challenge accepted. No problem. I got this, right?” I’ve been teaching the Code Doctor 101 class now for two years. We’ve taught about 1,000 of the 5,500 developers, right? So we’re continuing that effort, and now we’re adding some additional learning modalities, simply because an in-person class doesn’t necessarily work for our global org. I do get up at four o’clock in the morning and teach it for my folks in EMEA, and I stay up till nine o’clock at night and teach it to my friends in Greater China and Shanghai. I’m willing to do that, but it’s probably not sustainable, so we’ve expanded the Code Doctor program beyond just in-person learning to additional forms of content that might be helpful.

[00:17:47] Guy Podjarny: All of this is sort of the core, the basics of security education, and it sounds like you’re also scaling it to, I guess, those people who can’t afford or just don’t have the opportunity to hire you – who don’t have a Jet to run those programs for them. What’s the 202, if you will? If people do the 101, is it important? How do you think about continued education after that session?

[00:18:09] Jet Anderson: Well, that was part of the reason why we went beyond the initial 101. Everyone immediately out of the class was like, “This is great. I’d love to go deeper on X,” and everybody’s X was different, the thing they wanted the deeper dive on. So what I realized is, yes, people want more deep-dive training. So what form should that be in? I did some research and looked at various training offerings and found that a lot of what we’re proposing to people is still the same computer-based training stuff, which is largely PowerPoint click-ware. It’s not particularly interesting. Sometimes, it might have video components in it.

But the way it’s structured, with quizzes and things, it requires a real investment to get through. Then I thought about how I learn. One of the ways that I’ve learned a lot about the work that I do is through informal tools. I watch YouTube videos. I watch OWASP talks like crazy; they happen all around the world, and I can consume them after the fact, and there’s some really great content out there if you just go looking. So that was kind of the birth of what became the Code Doctor Podcast at Nike.

I host a weekly conversation with subject matter experts on secure software development from all around the world; Jim Manico, Dr. Philippe De Ryck. Jimmy Mesta did some talks on Kubernetes. We’ve gone deep-dive on OpenID Connect [inaudible 00:19:35]. We’ve talked about Kubernetes and container security. We’ve talked about API security, REST security, how to handle tokens securely, how to handle secrets securely. Really, pick any topic that a developer might want an answer to, and we just record the conversation. It’s just Jet acting the part of the developer with an expert who is smarter than me, and I can at least intelligently ask some questions that are likely to be the same ones developers would ask.

We have about an hour-long conversation. Sometimes, there’s PowerPoint involved. Often there’s some Google dorking involved. We’ll go ask the oracle for an example of JWT decoding, find the most egregious Stack Overflow sample we can, and then deconstruct it: why it may not be the best approach, and what would be a better approach, that sort of thing. That has been explosive. We found that developers at Nike are consuming that at a huge level. They watch it when they want to watch it, when they have some time. Sometimes, they’ll put it on in the background while they work on other things. Then if they hear something interesting, they’ll pause it, rewind, and re-listen to that portion, or watch what’s on screen.

But there’s a very low barrier to entry. There’s no penalty if they get it wrong. That kind of informal learning, we’re finding, is actually sometimes even more effective than me being in the classroom.
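The JWT pitfall Jet alludes to – decoding a token's claims without ever checking its signature – can be sketched with only the standard library. The secret and helper names here are illustrative, not from any real episode:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical signing key, for this sketch only

def b64url_decode(part):
    # JWTs use unpadded base64url; restore the padding before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def b64url_encode(raw):
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def make_token(claims):
    # Minimal HS256 JWT: header.payload.signature
    header = b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url_encode(json.dumps(claims).encode())
    signing_input = (header + "." + payload).encode()
    sig = b64url_encode(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return header + "." + payload + "." + sig

def decode_unsafe(token):
    # DEFECT: the common copy-paste answer reads the claims and ignores the signature.
    return json.loads(b64url_decode(token.split(".")[1]))

def decode_safe(token):
    # FIX: verify the HMAC signature before trusting any claim.
    header, payload, sig = token.split(".")
    expected = b64url_encode(
        hmac.new(SECRET, (header + "." + payload).encode(), hashlib.sha256).digest()
    )
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    return json.loads(b64url_decode(payload))

token = make_token({"sub": "alice", "role": "user"})
# An attacker can swap the payload but cannot forge the signature:
parts = token.split(".")
forged_payload = b64url_encode(json.dumps({"sub": "alice", "role": "admin"}).encode())
forged = parts[0] + "." + forged_payload + "." + parts[2]
print(decode_unsafe(forged)["role"])  # "admin": the tampering goes unnoticed
```

In practice you would reach for a maintained library rather than hand-rolling this, but the defect being deconstructed is exactly the `decode_unsafe` pattern.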

[00:21:04] Guy Podjarny: And is this an internal podcast, broadcast to Nike developers? Or can people go on Spotify and look for Code Doctor?

[00:21:11] Jet Anderson: Yeah, good luck. No, it’s internal to Nike, and it’s video-based, in fact, because a lot of the content happens on screen. You get to see the terminal output from examples and that sort of stuff, or PowerPoint, as we talk through these things. I produce this myself. It’s similar to what we’re doing now, right? Get on a Zoom call, hit record, introduce the topic, have a conversation about it, and end at the end. Then I post-produce a little bit and make it available for streaming for folks at Nike.

The great thing about that is I don’t have to be present for them to watch it, and it can happen at any hour of the day, in any country, in any language. They can watch it slower, they can watch it faster, they can do whatever they want, and they can go find the archives. So there have now been thousands of hours of training in just the last eight months or so of it being live, simply because it happens whenever they need it.

[00:22:07] Guy Podjarny: Yeah. No, that’s excellent. All of this is really watch-and-learn: you’re looking, you’re hearing. There’s a whole class of security education that is more interactive, ranging from next-next-next type tutorials with some engagement to full-on CTF labs, sometimes called ranges, right, cyber ranges and the like. How do you think about their value or place in this exercise?

[00:22:39] Jet Anderson: It’s funny you ask, because I have this Venn diagram as part of my mission statement on the Code Doctor website that describes all of the parts of training that need to be available to provide a holistic learning experience for developers. The first one, obviously, is the classroom; it’s with Jet. The second is informal learning, and the third is hands-on experience, right? So we have made an investment in a platform that allows developers to do exactly what you described. We host a number of teams that want to drill down on this, who are making an investment in security, to compete with each other on this platform to solve coding challenges, to fix security defects. I’m sure you can guess what that platform might be.

We also host tournaments for the larger Nike audience that take place over the course of a week, where everyone from across the enterprise can participate in the same sort of exercise to solve these sorts of challenges for bragging rights and prizes. But really, I think it takes all three of those. Not every developer is going to learn the same way. This is true of all of us, right? Each of us has a primary learning method. Some are more auditory. Some are more visual. Some need to be more hands-on. So it’s only by providing these sorts of opportunities in a variety of ways that each individual developer will be able to self-select the path that’s going to lead them to the best outcome.

But it does require that we have that mentality that, yes, they do, in fact, want to do the right thing. They do, in fact, have a thirst for learning like we do. If we provide them an environment where they don’t feel ashamed, where they have the opportunity to play in a safe place without penalty and without that impact on their psyche of the stick that we use so often, they will, in fact, go after it. And they are. We’re finding that that’s taking place. Developers are learning and having conversations about security without me even being present, right? We see it in Slack channels that aren’t even security-related, where they’re having security conversations. People are becoming passionate about it, and that warms my heart.

[00:24:56] Guy Podjarny: That’s great, to see engagement embedded. When you talk about education, how do you see the mandated versus opt-in element? How much of it should be, “Look, I don’t really care if you want to do this or not. You all have to,” versus, “If you want to learn more, here are the things you can do.”? I mean, the solutions we talked about, the informal ones and the teams, I think are pretty obviously opt-in. But do you think that’s the right line? Is it really just about saying, “Look, everybody should have the basics, and everything onwards should be an opt-in.”?

[00:25:33] Jet Anderson: I don’t know where the right line is there. I have opinions, but they aren’t necessarily based on any scientific facts, so take them for what they’re worth. I believe that there should be some requirement that, before you can git commit, git push, you should be able to attest to your level of knowledge at the very lowest level. In the past several years, I’ve run an admittedly very anecdotal survey – this is not a scientific survey, just a survey of developers taking the Code Doctor class. Rate your level of knowledge before you take this class, 0 to 10. Rate your level of knowledge after you take the class, 0 to 10. We find that there’s a relatively low level of knowledge in a lot of cases, and there’s an increase from the class, so that’s helpful. I also do a survey during the middle of the class, using the poll feature of Zoom, which is really cool. Hey, who here has ever heard of the OWASP Top 10? Sadly, the number is lower than 20%. So developers just aren’t being exposed to this stuff.

While I think it would be nice to not have to mandate some of these things, I think there should probably be some mandated requirement that you at the very least have a basic knowledge, even if it’s a 10-minute thing. Watch this 5-minute, 10-minute video. Here’s the OWASP Top 10. Here’s where to get some resources inside of Nike. Here’s how to get help. By the way, here’s a link, an email address, and a phone number for the SOC, so you can at least report things when they go badly, and then kind of start from there. If that’s 10 minutes, and that’s all we do, it’s still better than not doing anything, right? You should have at least that level.

Do I think everyone should have to take my three-hour in-person class? Probably not. Do I think everybody should have to do at least a yearly survey demonstrating understanding of best practices and principles, or at least where the resources live? Yeah, I think that should happen.

[00:27:38] Guy Podjarny: Makes sense. I guess another big name in the world of security education as a whole is gamification. You talked about teams competing; that’s one aspect of it. What are your views around using gamification? Is it good education, important or not important? I don’t know if there are any practices you’ve seen. You’ve been describing the Nike program well here, but do you apply it, and what results are you seeing?

[00:28:04] Jet Anderson: We do apply it here at Nike. We’ve had a program for the last few years that focuses more on the breaking side, giving people a kind of CTF experience. It’s more of a means of engagement with our community, to help them understand what security is trying to protect us against, and that’s been one aspect. The other side, as we began to develop the Code Doctor program for fixing these things, has been to try to achieve that same level of fun and experience, which is why we’re heading in this direction of tournaments on the gamified competition platform.

Frankly, though, I’m not convinced that that is the way forward. Part of the reason is there might be some psychological resistance, where folks are like, “This is cool. This is fun. But am I really supposed to be having fun during work hours? I’ve got this thing to work on. I really shouldn’t be in here playing,” even if they’re being encouraged to do so by management or their leadership. The other part is I’m not really sure how committed folks are to playing games that way. Some people are committed to playing games. I personally am not a gamer. I don’t get a lot of joy out of sitting down and chipping away at problems, competing with someone else. I do it because I just want to learn. I’ll sit down and attempt some new bit of learning, simply because it’s the thing I want to learn at the moment. I don’t care if anyone else got more points than I did. I’m not convinced that gamification is going to be a game changer.

Some companies are doing this better than others. I’m sure if you have a team that wants to promote this thing and is excellent at marketing, engagement, and communications within your organization, you may be able to make it successful. But I don’t believe it’s something that’s going to be successful without a lot of work, and I don’t know that it’ll necessarily result in a huge lift without that effort. I’m just not quite sure of the ROI there.

[00:30:11] Guy Podjarny: Well, I think it’s hard to indeed find a one-size-fits-all, so [inaudible 00:30:15]. One theory that came out of past conversations, and actually a little roundtable we’re hosting, was around the separation between the developer and maybe the security champion. Not security champion as in the program, but rather someone who is a security-curious developer, who is keen and hungry and interested in security as a field, versus a developer who isn’t – who cares about software, cares about software quality, and this isn’t to their detriment, but security doesn’t specifically catch their attention.

So the theory was that there’s a type of security education you want to provide to the latter, which is maybe a little more basic, a little less gamified, a little more in-and-out, versus the one for the former, which is more CTFs, competitions, and the like. I’m just curious, does that resonate? Do you think that’s a decent mental model?

[00:31:11] Jet Anderson: What I see is often we have different ideas of what the word security champion means. Especially in a company as big as Nike, like we’d argue about what every term means. I don’t think we often agree on much of anything. That kind of phrase I think gets lost pretty quickly, but you are correct that there are developers out there who are more security-minded. They’re thinking about encryption. They’re thinking about architecture. They’re thinking about validation and the frameworks required to do that validation and so forth. They’re concerned about the total picture.

My job as Code Doctor has been to identify those folks and make friends. I’m a big advocate of the developer community anyway. I spend time in people’s stand-ups. I go to their community meet-ups, talking about whatever they’re interested in. I think that’s where you find those developers, but it also requires a cultural change, right? You have to have developer-minded security professionals in order to go find the security-minded developer professionals. I’ll repeat that, in case anybody got lost: you have to have developer-minded security professionals to go and engage with the security-minded development professionals, in order to be able to find them and really adopt those security champions, if you will.

Then, what do you do with them once you have them, right? What do the conversations look like? What kind of resources do you provide for them? It starts with this great, passionate idea to go and engage these security champions, but I’ve seen it lose steam because we’re not really sure what to do with them when we find them. We’re not really providing any value other than face time, and face time means time taken away from them doing something else. If we’re not intentional about cultivating their knowledge and giving them the information that they want at a deep technical level, then we’re not adding value through those champion programs or relationships.

I think the better way to go in that case is to simply participate with developers in their own programs and in their own communities, and find those folks and engage them at a deep level, or don’t try to engage them at all.

[00:33:31] Guy Podjarny: It’s fascinating. It is true that security champions, you don’t see like as many programs. You see some. It might be privacy champions [inaudible 00:33:39], but there aren’t like quality champions or –

[00:33:43] Jet Anderson: Some people are interested in, “We’ll do it on our own,” and other people are not. I think that’s true, without regard to security necessarily. You’re going to have developers who want to learn and grow in their capability, and what happens to them? They end up becoming architects or your principal engineers, who provide guidance and oversight in technology. I often find that those folks are already your security champions. So if you just engage with the people who are self-selecting to increase their knowledge in general, security will be one element of that. Don’t necessarily try to make it the only thing they’re thinking about.

[00:34:18] Guy Podjarny: Yeah. I mean, I think that that’s nurturing a security community there. You build on these security education programs. How do you measure them? How do you know that you’re doing well? You make an edit to the program. How do you know if it did better? What do you use to measure the success of the program?

[00:34:35] Jet Anderson: The way that I’m measuring success is through the survey that we do after the class. Obviously, we have what we call a lift score. Where did you start? Where did you end up? Then there’s the Net Promoter Score, right? Like how likely would you be to recommend someone else take my class? We rate the success of the class based on those two numbers: the lift and the NPS. Then as far as informal learning goes, we track viewership and engagement of all of the videos, right? So we’re able to see analytics based on that. We know which episodes are the most popular.
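The two numbers Jet describes are simple to compute. Here is a minimal sketch of what that might look like; the survey scales and the exact "lift" definition are assumptions for illustration, not Nike's actual instrument:

```python
# Hedged sketch of the two class metrics described above.
# Assumptions: knowledge is self-rated before/after on a numeric scale,
# and the recommendation question uses the standard 0-10 NPS scale.

def lift(pre_scores, post_scores):
    """Lift: average self-rated knowledge after the class minus before."""
    return sum(post_scores) / len(post_scores) - sum(pre_scores) / len(pre_scores)

def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    from 0-10 answers to 'How likely are you to recommend this class?'"""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

print(lift([3, 4, 2], [7, 8, 6]))  # knowledge moved from avg 3.0 to 7.0 -> 4.0
print(nps([10, 9, 8, 6, 10]))      # 3 promoters, 1 detractor of 5 -> 40.0
```

Tracking both matters: lift captures whether the class actually moved knowledge, while NPS captures whether attendees will spread it by word of mouth.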

No surprise that the episode on cross-site scripting has been viewed thousands of times, compared to some of the other deeper-dive topics. But it’s the one we hope people will get the most value from anyway. Then we’re just beginning the hands-on portion that we call Code Doctor’s Laboratory, and starting to track engagement with that, but initial insights look pretty good. Really, I think there are some elements of this that can’t be directly quantified. We’re listening for how much discussion of this there is among leadership teams that are beginning to consider security a concern. The response from our board of directors, which is definitely interested in seeing security at Nike be top notch, has been encouraging. So we’re really positive on that, but some of this just can’t be measured in numbers.

[00:36:05] Guy Podjarny: It’s complicated. It’s qualitative material. I think a survey and NPS are good measuring techniques; they’re imperfect, but they’re valuable. So, Jet, this has been excellent, tons and tons of great insights around how to drive a security education program and why. One last question that I like to ask every guest on the show: imagine someone in your seat, driving security education, in five years’ time. What do you think would be most different about their reality?

[00:36:36] Jet Anderson: That’s a fantastic question, and I’ve been thinking about it. One of the things that I see being the greatest change in our technical infrastructure landscape right now is the explosion of data. With it comes the explosion of fields like data science, AI, and ML, leveraging this data to gain insights and train models to help us make better decisions as companies and reach our customers in better ways and that sort of thing. With that comes a huge set of data privacy concerns that are off-the-charts scary. Couple that with the fact that we haven’t been training software engineers. Now, are we even thinking about training data engineers and those who are involved in creating these models, the security of those models, and the security of the training data that goes into them?

I think that landscape will only get more complicated. My job as an educator, serving everyone who produces computed insights and applications, is to really try to adjust to that changing landscape and provide value for everyone writing code, whether it’s R, or Python with NumPy, or a frontend developer using React or Vue or Angular. Everyone deserves an equal chance at writing the most secure software.

[00:38:07] Guy Podjarny: Well, well-said. Jet, this was excellent. Thanks a lot for coming onto the show and sharing your learnings here.

[00:38:15] Jet Anderson: Been a pleasure. I had a lot of fun. I’m very passionate about the topic.

[00:38:19] Guy Podjarny: If anybody is trying to find you on the Internet, where’s the best place to find you?

[00:38:25] Jet Anderson: I’m on Twitter, @thatsjet, T-H-A-T-S-J-E-T, that’s jet. You can find me there, and hit me up on LinkedIn. I’m a passionate advocate of helping the community at large as well, so you can find me on GitHub too, because I have some repos. I’ve been making some contributions to OWASP projects. In fact, I recently released a new logging vocabulary standard. It would be really cool to have folks check it out and make commentary on it, so definitely look me up. I’m happy to help if I can.

[00:38:55] Guy Podjarny: We didn’t get to talk about the great cheat sheet there. We’ll definitely put some links to it in the podcast notes, and I recommend people go and learn from another treasure trove of good practices on how to build secure code. So thanks again and thanks, everybody, for tuning in. I hope you join us for the next one.

[END OF INTERVIEW]

[00:39:16] ANNOUNCER: Thanks for listening to The Secure Developer. That’s all we have time for today. To find additional episodes and full transcriptions, visit thesecuredeveloper.com. If you’d like to be a guest on the show or get involved in the community, find us on Twitter at @DevSecCon. Don’t forget to leave us a review on iTunes if you enjoyed today’s episode.

Bye for now.

[END]

About Jet Anderson

Jet is a Secure Software Architect, Writer, Speaker, and Evangelist of DevSecOps. A former software engineer on a mission to teach today’s software developers to write secure code as part of modern DevOps pipelines, at speed, and at scale. He’s also the host of a weekly podcast and training program at Nike called “Code Doctor.”

About The Secure Developer

In early 2016, the team at Snyk founded The Secure Developer podcast to arm developers and AppSec teams with better ways to upgrade their security posture. Four years on, the podcast continues to share a wealth of information. Our aim is to grow this resource into a thriving ecosystem of knowledge.

Hosted by Guy Podjarny

Guy is Snyk’s Founder and President, focusing on using open source and staying secure. Guy was previously CTO at Akamai following their acquisition of his startup, Blaze.io, and worked on the first web app firewall & security code analyzer. Guy is a frequent conference speaker & the author of O’Reilly “Securing Open Source Libraries”, “Responsive & Fast” and “High Performance Images”.
