Profound

S3 E7 - Tracy Bannon - DevOps, DevSecOps, SRE, Risk, and Generative AI

March 28, 2023 John Willis Season 3 Episode 7

I geek out with Tracy Bannon in this episode. Tracy gave us a little history of MITRE and her work there. We also discussed DevOps, DevSecOps, and SRE, from words to movements, and explored QA and risk controls as well as modern governance. Our conversation ended with a great discussion about generative AI, specifically ChatGPT. Tracy can be found on LinkedIn here:

https://www.linkedin.com/in/tracylbannon/


John Willis:

Hey, this is John Willis again, and we've got another episode of the Profound Deming podcast. I have a guest that I've admired for a while now; I've gotten to see her speaking at RSA conferences and DevSecOps events. Tracy, do you want to go ahead and introduce yourself? You betcha. Thank you, first of all, for inviting me. We've had a hoot talking on a couple of other panels and other things, so I was really excited, because it's just you and me right now. Let's see, background: a couple of decades as a software architect and engineer, and I keep my hands dirty in the code. I subscribe to Gregor Hohpe's approach of being the elevator architect: talk to the boardroom, talk to the engine room, or the boiler room, whatever you prefer. I've spent a lot of my time, especially over the last eight or nine years, focusing on the federal government, but I spent a lot of time before that with commercial clients, state and local governments, and some international governments. So definitely looking at big problems more recently. Brilliant. You know, I noticed you've been in this business for a while, though I've been in it a little longer than you, I think. Just a little bit — just a little more on the age thing for me. Tell us about MITRE. Yeah, you know, it's one of the things that was interesting for me, not being a security professional. I've told some people my story: probably the only reason I can even pretend to talk about security is because of Shannon Lietz and Josh Corman. Those two have been incredible mentors to me, particularly Shannon, because she coined the whole DevSecOps thing, and when I met her I just said, teach me, teach me. That's how I came to it. And the things I kept running into were the MITRE ATT&CK framework and how involved MITRE is with NIST and all of that. So tell us — I think a lot of people who are pure DevOps folks, who are really trying to understand security better, don't know, if we're not security professionals, the extent of what MITRE does. Yeah, so it's interesting. I've been with MITRE for just over three years, and in MITRE terms that's like counting in dog years, like

Tracy Bannon:

reverse dog years — there are people who have been there for decades, working to support the federal government. So, background: MITRE is what's called an FFRDC, a federally funded research and development center. These are chartered by Congress, and the entire concept came out after World War Two. They realized they didn't have the technologists, the engineers, the scientists. Where were they going to get them? At first the answer was, well, we'll just ask industry — and they realized there's a balance, because industry has to make money selling things. So will you always get an objective opinion? Maybe not. Not that anybody is being nefarious, but if I have something really cool and I want you to buy my cool widget, I might try to help you make it fit into that situation. That's what MITRE was chartered to do: we provide objective guidance to the government. One of the big focus areas is cybersecurity. So I live in an organization that is the advanced innovation center for our cyber and software engineering group, under a woman by the name of Wen Masters — fantastic. And she actually said, you know, we have these two divisions, we've got software engineering and we've got cyber, and that's wonky; we need to pull those together. So instead of emulating what's been going on in the industry for so long, we've actually pulled them together. The ATT&CK framework is one of many, many different things that are done on behalf of the public. You're probably familiar with CVEs — guess who manages and maintains that registry on behalf of the public good? Yeah, that's MITRE. So they're very interesting from that perspective. Probably the strangest thing for me, having been with global consultancies like Deloitte, is that they don't sell — they don't go and sell themselves, and they don't compete with industry. So if an agency puts an RFP out and says, we need somebody to build us a cool thing, I can tell them how it could be built, I can give them ideas for it, I can help them architect it, but I can't compete if somebody else can already do it. We're supposed to be that leg ahead, that step ahead, especially with everything related to security. — So what's your day job there, then?

John Willis:

Great, great question. The answer is all of the above.

Tracy Bannon:

At this moment, I am all in, actually, helping the Army. We are working at the Pentagon, and we are looking at policy, process, and technology in the Army. And nothing I'm saying is secret or top secret; you can Google all of this. The way the Army procures software is the same way they procure hardware: project starts, project ends, it gets delivered, then you warranty it for a year. Warranty. Bad. Think about that. And then you pick it up and hand it to another organization altogether, whose job is to sustain it, just like you would a hammer or a physical gun. And there are all kinds of ramifications that come with that. How can you do continuous anything? Well, you can't. If I want to deploy something, it needs to not take 15 months to get through the different processes, for example — and depending on who it impacts, it could. So right now I'm spending my time there on digital transformation: getting after how you actually decompose the architectures that are there, to be able to adopt things like data mesh — all kinds of trend-setting, leading-edge approaches — to enable us to act at the speed we need to. We've got to deal with those peer adversaries. Another thing I'm focused on right now is generative AI. Everybody needs to understand what's coming. Everybody needs to understand where it's been; it's not new, but it certainly is trendy right now. So how do we help people not rush to adopt things just because they heard the word? They heard ChatGPT, they got an account, they said give me a little bit of Python code, they saw something they didn't understand, and now they're enamored. That's another area where I'm leaning in, so that we can better understand its effect across the whole SDLC. — Pretty cool. I mean, we're both geeks on generative AI — we were recently on a panel together — and I think that might consume the rest of this podcast.

John Willis:

You have some brilliant ideas about all of this. But let's go back to basics. It was funny for me — early in the DevOps movement we tried to guard against anybody changing the name, right? Don't mutate it; it's one thing, it's a metaphor. And then Shannon coined DevSecOps, and I'm like, okay — in fact, when people tried to argue with her, I'd say, go argue with her and tell me how successful you are after that conversation. But then the thing for me was, I had to flip from "please don't mutate the name, because it's only a name" to standing up and saying, I believe in DevSecOps. And part of it was — we both attend Alan's DevSecOps event that runs at RSA, and I think it was the first year they named it DevSecOps instead of DevOps, and I noticed there were just a lot more security people showing up. So I did an informal questionnaire: whenever I met somebody who didn't seem like the normal DevOps crowd — which, for a couple of years at that event, really was just the same old DevOps crowd — they all basically gave me the same answer, which was, we didn't really think we were invited when you called it DevOps; now that you call it DevSecOps, I figured I should probably stick my head in. All we had to do was add a phoneme and we could get a whole other population in the door. It's all a hack. But I guess: how do you view DevSecOps as it stands right now? Is there too much hype? Is it effective? Just a general Tracy view. — Not too much hype; very necessary. Although, the term — I actually don't like the term DevOps anymore. I don't like the infinity symbol, even though you see it behind me there,

Tracy Bannon:

because it implies that every bit, every case, every mission, every business is going to be able to do the same things. If we take a step back and start talking about continuous change — how do we adapt to enable continuous change, and what does that mean across the whole SDLC — that's a better mindset. Now, I've been involved with DevOps since before it was called DevOps, back when we were trying to figure out what unit testing was and what continuous integration meant, when we defined "continuous" as once a day, at the end of the day, but early enough that we could run things before we went home. So it is not overblown to say that people were running too fast. It's the same thing with the Agile crowd. Do you know Linda Northrop? She's from Carnegie Mellon, from the Software Engineering Institute. Somewhere around 2013 or 2014 I was listening to her at one of the SATURN conferences, and she said something approximately like: the worst thing that ever happened to software was Agile. Where she was going with it was that there had been a spike — and we're still trying to pull ourselves out of this — the idea that we don't need architecture, that all teams are completely autonomous. And we left security behind. You're about the same age I am — I think we're actually pretty close — and I would guarantee that security was always part of what you did. It wasn't always a bolt-on: authentication, authorization — you were always thinking about them, about your environment, about the transition from environment to environment, about different threat vectors. Somewhere along the line we got a little bit obsessed with going fast and having autonomous teams, and some of what's happening right now is actually just recoil from having probably seven or eight years of crazy.

John Willis:

Yeah, that's my thought too. I think you're spot on — same thing. I mean, DevOps has been very, very good to me, but I think there's a world where I'd love to be able to say we don't need to use that name anymore. I used to do this in some of my presentations for dramatic effect: I'd talk about the history of DevOps, and about how proud we all are, and then I'd throw up a slide that says "security," take my hat, throw it on the ground, and go, gosh darn it, we forgot about security. And then I'd get yelled at by other people: I didn't forget! And exactly — I didn't forget, what do you mean? That's right. But there was this emphasis on speed. If you could go back in time, would we have set a whole different set of primitives for maybe the first seven or eight years of DevOps?

Tracy Bannon:

I think so. I think we may have. The world I was in at that point was working with different states. I was working on child support, and I was working on integrated eligibility, so I was working in domains where you had to be hypervigilant about the type of information and who could gain access to it. So I never had the opportunity, personally, to let my guard down. However, I was riding sidesaddle with a lot of areas where they did, and it was in the interest of moving fast: you security people protect the perimeter, and we'll be fine with our security inside. Right? And that was the mentality for quite a long time. You might like this, John. I've actually, to the extent possible, stopped using the terms DevOps and DevSecOps, and I've been successful with the Army on one of the efforts. All we're saying is: you need to adopt modern software practices. What's modern? It means it's moving. You need continuous change, and you need to adapt to and adopt continuous change. So we're working through it. But we as a big group — not just the Army — have got to work toward that mindset. The interesting thing right now is that we have roles called DevOps engineers. Tell me what the hell that is. I've been arguing against it for a while. What's a DevOps engineer — the person who builds your pipeline, or the one who writes the software? And how many levels abstracted from it are they? I know there are lots of surveys that say it's a great thing, and that it's supposedly where you get your money if you're watching the industry, but that wasn't supposed to happen. Creating another team to pull the work away — that was not supposed to happen, by definition. But hey, now we have platform teams; that's our biggest shebang, right?

John Willis:

No, that's another whole area — and I'm enjoying this conversation, so let's stay on the positive stuff rather than go down that hole. One of the things you said I thought was really cool. Probably around 2017 or 2018 I met Topo Pal — he was the first Fellow at Capital One, and we all know Topo is brilliant and an awesome person — and I got to know him through Gene Kim's group, the one where we write these papers together. In fact, probably the main reason I got interested in security was the interviews I do at companies. Somebody — the Chief of Staff for a particular CIO — would tell the CIO, you need to bring this person in. And they'd say, why? And he'd say, well, he wrote the book on DevOps. And they'd say, they all say that. And he'd go, no, no, he wrote the book on DevOps. And I'd go in — I was independent at the time — and I'd say, here's the deal: I'm not trying to do any long-term work. My gig is I'm going to interview all your people who have fingers on keyboards, and I'm going to tell you the truth, and you're not going to like it, but you're going to pay me anyway. And I'd interview hundreds of people. There was one bank, one of the largest banks in the world, where I interviewed something like 400 people over a summer, and the stories I would hear were soul-crushing: "we don't tell auditors things they don't already know," or "we reuse the same screen prints we've been using for years." So I went to Topo and I said, this can't be so terrible — this just can't be the way it is in large banks. And he's like, it absolutely is. So in 2019 we wrote a paper called DevOps Automated Governance, and it has since become a book, Investments Unlimited. But the long story short — and I'm getting to my point here — is that we called it DevOps automated governance, and I was having a conversation with Shannon at one point, and she said — I'm totally paraphrasing, because I don't remember her exact words — something to the effect of: that phrase stinks, "DevOps automated governance." What she meant was, I know what you're talking about when I see you present, but all my peers in security think we've already done this — we've got Archer, or whatever. And we brainstormed, and she helped me come up with — I don't know if it's the right word — modern governance. To your point: can we stop saying DevOps and say something like modern software practices, modern governance? So I've been trying to tease that out, although I haven't really —

Tracy Bannon:

I'm behind it. You say "modern" — the first time somebody ever said it to me, I thought, modern? What the hell does that mean, 1955? And then we have the 80s, which are postmodern. But I actually adopted it and swallowed my pride on that one, because I didn't like it at first. It really does reflect continuous movement. Modern means now; modern means the direction we're going. So modern software practices are continuously changing — okay, great. Modern governance will continuously evolve — okay, great. I'm actually kind of getting behind that term right now.

John Willis:

Yeah, no, I think it is the right way to describe it. I mean, I could have gotten punched in the nose if we'd been in person — I said, what about cloud native GRC? And if there hadn't been a virtual screen between us, I would have been punched in the nose by Shannon. Like, no, don't go there. But the point being, I think there's a world in which there is a line — maybe it's something like cloud native, and not that you can't do security for anything that isn't cloud native — but there is a world where, to your point, things are changing rapidly, connected up to change and speed. — Can you imagine that there are still places where the cloud is not the best option? I know that's scary — scary to everybody — but not all software is cloud based, right? We forget that there is an amazing world of embedded software. It's incredible, and we forget about it. And can you apply the principles we espouse as DevSecOps to that? You bet your bottom dollar we can. But again, it gets to this: we have so many hype phrases right now. The hype word since November is AI. All you have to do is say AI — that's the hype word. I think it was Dave Linthicum this morning or yesterday who posted something cute, saying that it used to be, when cloud was the word, every product and every service a business provided was suddenly cloud — I think he called it cloud washing. So you cloud-washed everything. Right now we're AI-washing things. But tracking back: the words matter, and the concepts matter more. I believe we've caused a lot of almost mental division for people, because now it's: I've got my architect and my developer — or I don't need architects, that's a whole other conversation for later. So I've got my developers, and then I've got my DevOps people, and because my DevOps people aren't really operations — they're more dev people, but they're only doing the pipeline — now I need SRE. We are so hung up on labels that we're not taking a step back and saying, what the hell are we trying to do here? What are we trying to accomplish? What's the value we need to deliver? Do we have organizational constraints? Great — what kind of humans do we need to solve these things? If I look at the SRE material and say, yeah, that's what I need to apply here — awesome. But we're getting so caught up, again, on labels. It's always been that way. The one thing I'm happy about is that it's not as tool-first as it used to be. We actually say the word culture now — we just say the word, we don't actually do anything about it, but we say it — and we talk about people more. So at least it's not tools first all the time, because that's been an epic failure for us for decades. Right. Yeah. And the thing, too, is — I think you're right, it is better, but, not to get too meta, it's a human condition that we work around labels and then figure our way in, and that's because we don't communicate really well, do we?
So that's why we can take SRE — somebody sees a talk about SRE, and then they see the vision. The biggest thing I see with SRE — and I think we discussed this — is people who are resistant to change, part of a classic ITIL service management group, who are constantly told they need to do DevOps, and they say, just go away. And they keep saying, go away. And then finally they see "SRE inherits DevOps," or whatever that phrase was — not really nonsense, but something that got used improperly — because then the service management groups come back to the DevOps folks and say, hey, we're SRE now. What do you mean? Wait a minute, I was here two weeks ago; you weren't DevOps.

Tracy Bannon:

No, no — we're definitely DevOps now.

John Willis:

Yeah — "we are SRE now," right. That's a terrible offshoot of what was supposed to happen, just like the DevOps teams you mentioned earlier. Sometimes it goes like the word Agile — I love the thing you said about Linda Northrop: the worst thing that ever happened to software was Agile. I hope that's not what we'll be saying about DevOps someday, but —

Tracy Bannon:

I think it will be. The word will die away, and the benefits we're gaining will continue on. There are a lot of bad habits that we still have to peel away; we're going to continue to be crunchy with all of this. And we have to get after the next generation — I think that's really important right now. We don't have enough people in technology. We need engineers, yes, we need developers, but I also need what I call tech-adjacent people: people who understand, who live close to it, who are doing the work — they're going to be enabled in different ways. And I'm finding, on that triangle — you know, the one where you've got all these junior people at the bottom and some graybeards like us at the top — there's a middle section where I'm not seeing as many people growing into that mid-level technology leader, and I want to see more of that. That's probably my concern.

John Willis:

This is the area where I'm like, okay, the old fat white male wants to talk about women in tech. From my perspective, it breaks my soul when I sit in a room — and it's 2023 now, but within the last three to five years — and the room is 96, 97% white male. In a different life I would literally just sit with tenth and eleventh graders and try to convince young women — I try to tell my relatives, their daughters, their granddaughters — I cannot tell you how prosperous this industry is and how much we need better leadership and better diversity. And I know that, with your posture in the industry, you're of course involved in women in tech. Help me — I mean, I don't know, there is no right answer, but I've been thinking about this problem for way less time than I'm sure you have. At least since the beginning of DevOps, where I got to meet all these incredible, brilliant young women, I've heard terrible stories about how they can be treated in this industry. And even today, when I look out at a room — speaking to a thousand people — I get a pit in my stomach.

Tracy Bannon:

I've been taking notes, I've got all kinds of things.

John Willis:

Like, when does it change? I mean —

Tracy Bannon:

Well, consider this — I was just talking to Rosalind Radcliffe about when she was going into college: of the engineering degrees, I think it was somewhere around 35 to 38% women, and it was on the rise. It's now down to 18% female. Why is that? Is it really because we're redirecting women down a different avenue? Is it the workplace environment and some of the toxicity that happens there? Is it that we've pushed for longer hours, that everything goes rapid? I don't know what that piece is. I do know that, as much as I'm involved in women in tech, I generally don't put out there that I'm a woman in tech. And that sounds kind of strange. My older brother and I are best friends. I was always — it's an old term, I'm going to use it — a tomboy, always hanging out with my brother and his friends, always comfortable with them. It never dawned on me when I was in college, and it didn't dawn on me when I was moving through the initial ranks, that I was the only woman — I was too busy releasing stuff, getting stuff done. I didn't notice it. It was about a decade ago, when younger women, newer in their careers, started asking: how did you navigate this? How did you navigate that? That's where I had that whiplash moment and realized, hey, we're no longer emphasizing it the way we did in the 90s; in the 2000s we started to emphasize it less. We're now seeing it come back up a little bit more. It's interesting — I have that little sign behind me that says "real technologist," and there's a reason it's there. Somebody asked me specifically, could you put your pronouns in your signature, so they show when your email goes out? And I said no. What? I said, I am wholly supportive — love everybody, I'm all for every kind of diversity you can talk about: gender, race, educational diversity, all of those things matter to me. But I spent 30 years not being "the woman at the table," so why are you making me out myself? Instead of my pronouns, instead of you asking me about my adjectives, how about I control my adjectives: I don't want to be a woman in technology, I'm just going to be a real technologist. That's the backstory. I think one of the things we have to do is continuously look past the trappings of what we see. If you and I were just talking, you wouldn't know if I'm green or blue, and you still don't know how I truly gender identify — you might be able to guess, but does it matter? It doesn't really matter. So part of it is the next generation coming along and helping to educate them — and I don't necessarily mean just the 20-year-olds; I mean leaning in with that 40-year-old millennial too: what do they think, and what are their perceptions? So there's a lot of work for us to do. I'm also one for advocacy — lifting other people up, regardless. I'm looking for the idea, lifting people up, and making sure there's a really brilliant tapestry of all these diverse thoughts at the table.

John Willis:

You know, I don't know you that well, but just from a couple of the longer conversations we've had — beyond shaking hands at a conference — I can see you're that kind of person. That's always been mine too: help everybody. And again, it sounds like platitudes — yeah, of course this guy is going to say that — but in my career, and I tell my young boys the same thing, I'm like, just get out there and help people, and it comes back.

Tracy Bannon:

It comes back in spades — that's what my dad used to say: it comes back in spades.

John Willis:

No doubt. All right, well, I guess we've got to move on to the dreaded ChatGPT discussion. I'm looking at the clock — do we have an hour, or maybe only five minutes?

Tracy Bannon:

I can take it. Yeah, we can meander down all kinds of other topic avenues, any we want to.

John Willis:

No, I think — we were recently on a podcast together, and I think we had a shared mindset on what this is all about: what does it really mean, what's the hype? So I'm just going to turn the mic over to you. The people listening are probably mostly DevOps people, some DevSecOps people, and I'm starting to get some Deming people poking in as well. What is the separation between the reality and the hype? What's the good stuff, what's the bad stuff?

Tracy Bannon:

At the end of the day, it's a large language model. You know what that means? It's a model. It speaks with authority, which is why people are so tickled with it: they're playing with a toy that appears to have this human quality of talking back to you. Start with the mentality that it's a language model — a really cool language model, ever growing, with multiple variations — and that's really cool, but we're still in control. It is not sentient, it does not think on its own; we have to remember that. I do a little mini podcast with Mark Miller called It's 5:05 — about a two-minute entry every couple of days. On Friday I decided my entry would be to have ChatGPT write my segment. It's only two minutes of text, and it only takes me about five minutes to write. It took me the better part of an hour to create a prompt that understood my target audience and my voice — by giving it samples of my writing style — in order to get out something that worked well for me. I say that because a model is only as good as the prompting you give it. So we're going to see a lot of emphasis on prompt engineering. If you're a marketing person, you can get awesome cheat sheets right now; not as many yet for software architects and software engineers. I'm working on it. So how is it going to affect things? I'm not going to talk about production environments right now; for the moment I'm thinking about the whole creation process. The model has to be trained, the model has been trained, models will continue to be trained — and it's only as good as the information that's put into it. Okay, whose code was put into it? If you're using Copilot — that's Microsoft's GitHub — they turned it loose on all of the public source that was out there. They didn't necessarily analyze it for quality, didn't necessarily train the model for quality. So the model itself is only as good as the data going into it. It'll get better. Where do you see it being used? It isn't going to take your job soon, but it is going to impact everybody's job. I've been playing with it for generating user stories. I can take somebody who's not so good at translating into that Mad Libs format — "as an X, I need to blah blah blah, so that..." — and they get a leg up, because they can say in natural language, I want to be able to do this, and then say, generate a user story for me. So we're going to see that type of transition. If you think about low code, no code, and democratizing — coming up with citizen coders — you're not going to quite see that, but you are going to see requirements and other types of functional outputs where people can just speak natural language to get at it. It's across the full SDLC. I watched a really interesting proof of concept yesterday generating 3D models. It was beautiful and wonderful, and I couldn't zoom in close enough to get at that code, but it made me want to go put in those prompts and see what I could get. So we are going to waste more time playing with it up front and learning about it.
What I'd like to see is smaller groups actually focusing on that: let's quickly do that research and start to surface it — here are some leading practices, here are the gotchas, where it could work and where it could bite you. Actually, when we're at RSA, one of the things I'll be talking about is exactly that: some of the gotchas, some of the things that are good, some jump starts, and some places where you have to be thinking, am I crossing a security boundary? Am I putting information in, when we don't own the ultimate model, that could be leveraged against me?
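(To make the prompt-engineering point above concrete, here is a minimal Python sketch of the user-story workflow Tracy describes: the audience, the voice, and the desired output format are spelled out in the prompt, and the plain-English request is wrapped in that context. This is an illustration, not code from the episode; it assumes the OpenAI Python client of the ChatGPT era, and the model name, system prompt, and helper function are all hypothetical choices.)

# Hypothetical sketch: turning a natural-language request into a
# well-formed user story via a chat-style LLM. Assumes the OpenAI
# Python client (pre-1.0) and an API key in the environment.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

SYSTEM_PROMPT = (
    "You are helping a product owner on a government modernization team. "
    "Rewrite the request below as a user story in the form "
    "'As a <role>, I need <capability>, so that <outcome>', "
    "followed by three testable acceptance criteria."
)

def generate_user_story(natural_language_request: str) -> str:
    """Ask the model for a user story; the prompt carries the audience,
    the voice, and the output format -- the prompt-engineering part."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",   # illustrative model choice
        temperature=0.2,         # keep the output predictable
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": natural_language_request},
        ],
    )
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(generate_user_story(
        "I want case workers to see eligibility status without opening "
        "three different legacy screens."
    ))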

John Willis:

No, you know, I think you just hit on something — the AIOps-to-MLOps stuff. And I'll say this: I haven't dug deep enough into those things to really make a judgment, and I'm going to sound judgy here — God bless all of them and all the work — but a lot of it seemed like, and I might get some of my friends mad at me for this word, false starts. To your point about leading practices, though: this generative AI is real, and you're right that people should be thinking about how to get ahead of the good stuff. That probably should be a deeper conversation driven by practices. One of the things you said was interesting. A gentleman I work with, Bill Bensing — he's done a lot of work with the government, he was at Red Hat with me, and we work together now on this, quote unquote, modern governance, if that's the right phrasing — said the same thing you did. He was talking about prompt engineering being this new thing — and I think everybody who's been paying attention knows that — but the thing he said that I thought was really cool, and you touched on it, is this: he's a software architect as well, like you, that's his background, and he says we've kind of been terrible at telling software how to do things, and then we're so surprised when the business, or whoever, doesn't get what they asked for. And maybe — not low code, no code, excuse me — maybe these tools can help us get a lot better at describing what we want built: software specifications with some quality. The biggest thing I see today is the gap between the people who understand quality from an organizational perspective and what it means for brand protection, the people in the middle who do the arbitration — the three lines of defense — and then the people who have to deal with it, who don't really understand the why. The classic example is all these tragic stories of your build getting broken because of a possible SQL injection finding — "and personally, I don't even make database calls" — and they conclude those people are all nuts: why do they have these rules?

Tracy Bannon:

Some of those things have been around for a bit, though. If I think about cognitive load — and we've been starting to talk about that more — there have been tools around for a couple of years now that, for example, do security scans and don't just hand you a report: they sit in your IDE and suggest changes you should make. That was already here before ChatGPT. Same with generating tests — I do not like ChatGPT for generating test harnesses yet, for generating the actual coded tests; there were already tools that do that. So why would I bother training a model for that right now? I'm just trying to make logical use of my time. But when you need annotated test plans and annotated test cases — not the automated ones, the annotated ones — it's fantastic for that. And some of the things you can do from an education perspective — I actually believe this is going to turn education on its side. Because if I'm able to give you the prompts — I don't know whose model it is, but let's say I'm in control of that model — you can start to be more interactive: try this, here's how this works. I think we'll be able to help people understand more quickly. I do like the virtual pair-programming concept. It also scares the bejesus out of me, because the model is only as smart as the model, and the person providing the prompt can miss out on the real understanding. I'm sure that somewhere back in your career you wrote something that generated code for other people — code generation, like T4 templates, something along the line. Well, I did that too, and what I found was that people would generate the code, say "okay, I've got my code, it works great," test it based on what they understood, and it would get propagated forward. We're going to have some quality-control challenges until people understand this better. The thing I like better about low code, no code for democratizing access to building is that it has constructs in it — you're constrained. Somebody sitting at a generative model is not constrained in the code that comes out; we don't have constraints around it yet. So, have you tried this sometime? Take a piece of code that you think has a security vulnerability and feed it in. I did this — it was kind of fun, it was only three or four lines. Tell me what the security vulnerability is — bang, here's the security vulnerability. Okay, can you create a testing harness so I can execute this and better understand what the exploitation looks like? Shebang — guess what, you've got a way to test that bad code. Of course, that's being used by the bad guys as well as the good guys; the bad guys are learning just as quickly right now. But I will continue to barrage you with all my thoughts around generative AI.
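(A minimal sketch of the experiment Tracy describes: hand a few suspect lines to a model and ask it to name the vulnerability, draft a harness that demonstrates it, and propose a fix. The vulnerable function below is the classic string-built SQL pattern John alluded to earlier; the snippet and the prompt wording are invented for illustration and are not what either speaker actually ran.)

# Hypothetical sketch: asking a chat model to explain a suspected
# vulnerability and to draft a harness that demonstrates it.
import textwrap

# A deliberately vulnerable snippet (string-built SQL, the classic
# injection pattern) used purely as the payload for the prompt.
VULNERABLE_SNIPPET = textwrap.dedent("""
    def find_user(db, username):
        query = "SELECT * FROM users WHERE name = '" + username + "'"
        return db.execute(query).fetchall()
""")

PROMPT = (
    "Review the following Python function.\n"
    "1. Name the security vulnerability it contains.\n"
    "2. Write a small pytest harness that demonstrates the exploit "
    "against an in-memory SQLite database.\n"
    "3. Show a corrected, parameterized version of the function.\n\n"
    + VULNERABLE_SNIPPET
)

if __name__ == "__main__":
    # In practice this prompt would go to whichever model you trust with
    # the code -- which is exactly the security-boundary question Tracy
    # raises: does this snippet belong in someone else's model at all?
    print(PROMPT)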

John Willis:

Going back to the risk-control thing: one of the things I've been playing with is ComplianceForge, and it's overwhelming when you look at that spreadsheet — I think there are about a thousand controls mapped against something like 250 or 300 different frameworks. I just did some work with a client involved in FDA compliance, so they have all the Title 21 CFR stuff, and then they've also got to do SOC 2, and more. When I look at just this one pharma supply-chain company and put up a screen of all the different compliance regimes they have to deal with — and any other industry is the same — my perception is that people approach them like a bunch of separate shields: get around this one, get around that one, without really understanding the context. So I've been spending a lot of time on it — my brain doesn't immediately jump to "can GPT-4 do this," that's not the first place I go — but I do think: if I can get a better sense of whether there are, say, 30 risk controls common to 80% of all the different frameworks, and start there, then I can do a better job of describing what each risk control is and how it's expressed for NIST versus, whatever, PCI DSS or something else.
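(John's idea — start from the controls shared by most frameworks before worrying about each framework's wording — reduces to a counting exercise over a control-to-framework mapping. The sketch below uses invented placeholder mappings purely for illustration; a real pass would start from a published mapping spreadsheet such as the one he mentions.)

# Hypothetical sketch: counting which controls are shared across
# compliance frameworks. The mappings here are invented placeholders,
# not real framework content.
from collections import Counter

framework_controls = {
    "NIST 800-53":    {"access-control", "audit-logging", "encryption-at-rest", "change-management"},
    "PCI DSS":        {"access-control", "audit-logging", "encryption-at-rest", "vuln-scanning"},
    "SOC 2":          {"access-control", "audit-logging", "change-management", "vendor-review"},
    "21 CFR Part 11": {"access-control", "audit-logging", "e-signatures"},
}

def common_controls(mapping: dict, threshold: float = 0.8) -> list:
    """Return controls that appear in at least `threshold` of the frameworks."""
    counts = Counter(c for controls in mapping.values() for c in controls)
    needed = threshold * len(mapping)
    return sorted(c for c, n in counts.items() if n >= needed)

if __name__ == "__main__":
    # With this toy data, access-control and audit-logging appear in all
    # four frameworks, so they clear an 80% threshold.
    print(common_controls(framework_controls))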

Tracy Bannon:

So let me ask a question about this. Let's say you're using a generative AI tool. Who made the model for you? Where did the data come from? That, right now, is my biggest point of pain. There are some interesting efforts — I don't know if you know who Nicolas Chaillan is. He was the Chief Software Officer for the Air Force.

John Willis:

Anybody who's been following this for a while knows of him.

Tracy Bannon:

So he has stood up a generative model. He calls it, I believe, Ask Sage. It's private, though — a private corporation. It's not a bad idea. I actually believe we need to be thinking about the quality of the model itself, because, to your point, if crap about NIST was fed in, you're going to get crap back out. So who has audited the quality of what went in? If you ever get a chance, try Perplexity. I love it — even though it's generally only giving me links to websites rather than white papers and IEEE pubs and such, I love getting my simple answer back with the footnotes so I can see where it deduced those things from. We need that kind of toggleable auditability if we're going to use this to build. It doesn't have to be on all the time, but I need to be able to turn it on and figure out where it got it from.

John Willis:

No, you know — I started playing with this when I was writing my Deming book. A friend of mine turned me on to one of the earliest adopters of GPT-3, something called Jasper — I think it was originally called Jarvis. I thought, oh my god, this is going to be great. And then I realized, to your point about the data: if it's a well-known subject, it will write you pages and pages of accurate stuff, but on lesser-known material — or material that little was written about — it falls down. In the first couple of paragraphs I tested, it kept coming back with things where I'd go, no, he wasn't born in Africa, you know — and there's a Jim Deming who writes some other kind of novels. And I realized you've got to be really careful. Then I did the research on OpenAI and saw that GPT-3 was effectively closed because of the investment from Microsoft, and my first concern was: okay, this is never going to be as great as I'd hoped until I get an open model. We talked about this on that last podcast — there are now competing open models, and GPT-4 actually accepts about 40 pages of input, so it's gotten a little more malleable. But at the end of the day, if we want to do the kinds of things you're talking about, or I'm talking about, I need the source for the model — I call that the canon. I think there's one from Stanford that somebody turned me on to the other day, a competitive version of an OpenAI model that's supposed to be much cheaper. My concern on that last podcast was that maybe GPT has jumped the shark and we're stuck with a closed model forever — and you were the more positive one: no doubt there will be competing models that can do the things we really want to do.

Tracy Bannon:

They may have. All you have to do is question that, and I don't know the answer. Look at what just happened with Bard — how well accepted is that? People are still going to GPT. There are lots of competitors out there right now, lots of interesting tools, but you have to say "ChatGPT" before somebody will sign up for whatever you're selling, because there's a massive hype cycle around it. I do believe we have to have some private models. Putting on my US-citizen hat, I think that for the good of the US we need some US-centric models. If I'm working with defense, if I'm building software or trying to understand analytics for the tactical battlefield, we need to have those models, train those models, and own those models — and that's an okay situation. I don't believe all models should be open, and I don't believe all models should be closed. But we're not thinking about this yet; we're just rushing to play with the models.

John Willis:

Yeah. And it sort of goes back to what we talked about earlier, about rushing to do the DevOps — we tend to do that, it's human nature. But in the end, the positive is that there are positives: we do accomplish things, right?

Tracy Bannon:

So here's where I think it's going to be fun — and I don't know if anybody's playing with this yet, only because it just dawned on me this morning. I was thinking back to some work I did with the IRS, where we were looking at modern software practices and helping them. They have massive programs and program offices, and they still have mainframes. Some of that stuff is amazing and excellent and good; some of it they've been trying to modernize for 25 years. I do think we're going to be able to accelerate modernization of true legacy software — and we'll have to be careful, because some of it is really effective and efficient and doesn't need to be changed. Do we need more COBOL developers? Yes, I actually said that — no, I'm not going to be the person to do it. Visual COBOL — do you remember when that came out? Fujitsu published that probably two decades ago. I think we're going to see some interesting use cases around modernization like that. There are bits of software out there where I've looked and asked, okay, where's the documentation? There is none. At least now I can feed the code to a model and say, tell me what this does — and explain the context to the model: how the parameters are defined, the coding standards they had at some point; pump that in, pump the other things in — and not do what we so often have to do today, which is crack open the code, study it, and put tracers in it. I think we'll be able to do some pretty cool stuff to at least explain and document this old code.
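(A minimal sketch of the legacy-documentation idea Tracy describes: bundle the old code with whatever context survives — coding standards, parameter conventions — and ask a model to explain and document it. The COBOL fragment, the standards note, and the prompt wording are all invented placeholders, not any real program or agency code.)

# Hypothetical sketch: building a documentation prompt for a legacy
# program. The COBOL fragment and "coding standards" are invented.
LEGACY_CODE = """\
       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALCINT.
       PROCEDURE DIVISION.
           MULTIPLY PRINCIPAL BY RATE GIVING INTEREST ROUNDED.
           ADD INTEREST TO BALANCE.
           DISPLAY 'NEW BALANCE: ' BALANCE.
           STOP RUN.
"""

CODING_STANDARDS = "Amounts are held in packed decimal; RATE is an annual percentage."

def build_documentation_prompt(code: str, standards: str) -> str:
    """Assemble the context the model needs: the code itself, the shop's
    conventions, and an explicit request for plain-language documentation."""
    return (
        "You are documenting a legacy COBOL program for a modernization team.\n"
        f"Coding standards in effect: {standards}\n"
        "Explain, in plain English:\n"
        "1. What the program does, step by step.\n"
        "2. Every input and output, with its assumed format.\n"
        "3. Anything a Java developer would need to know to reimplement it.\n\n"
        + code
    )

if __name__ == "__main__":
    print(build_documentation_prompt(LEGACY_CODE, CODING_STANDARDS))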

John Willis:

No, I think you're right. It's funny, because Scott Prugh at CSG has been at this a long time — there are certain people who have spoken at almost every DevOps Enterprise Summit from the beginning, and Scott is one of them. The interesting thing is you can go back and watch him in 2015, see what he said about his company then and what they were planning to do the next year, and then '16, '17, '18 — it really helps people. I think it was around 2017 when they brought in a company of people who were incredible Java programmers but also knew mainframe assembler. And they proved the case — because, again, it's a kind of plague: many companies are stuck with a large system-of-record written in COBOL or mainframe assembler. They successfully pulled it off using the concept you described: the few last surviving mainframe assembler programmers would explain what the code was doing, and the Java folks would say, here's how we do that in Java. So yeah, I think you're right. At the very least, we'd end up with a better documentation system.

Tracy Bannon:

We can pick our poison. It doesn't necessarily mean we have to do a code conversion, though that would be great. I know the millions that have been spent on writing conversion frameworks — what was it, JCL conversion? — probably 20 years ago, all these different companies promising: feed it in this way, it'll convert it to that. Well, large language models offer that, right? Because it's a language model: we give it one language and have it convert to another. Great — that's a great use for it. If I had enough time, if I didn't sleep —

John Willis:

You say that a lot these days.

Tracy Bannon:

I would love to dive into that exact use case, because there is so much legacy code. We're training people on how to write it because we have a generation aging out: the boomers are only going to be coding for so much longer, Gen X is trailing behind and they don't want to touch the COBOL code — and if they did have to, they'd keep their heads down — and you'd have to pay the youngins a pretty penny to even be interested. So we have to make sure we can manage and maintain this stuff. At a minimum, get after that documentation, so that when somebody does have to make a change, I don't have to go shake Tracy Bannon awake at three in the morning because she was the last one to touch it 15 years ago — which is what happens.

John Willis:

Early on, a bank in Europe invited Damon Edwards and me to do an internal DevOpsDays, and Damon made the mistake of mentioning that I started my career as an IBM mainframe assembler coder — and I hadn't touched mainframe assembler for at least 10 or 15 years at that point. My day was over; I was —

Tracy Bannon:

"Won't you please come draw it out for us, tell us, help us" — you know. I totally get it. There are ways we can use this for good — there really are ways we can use it for good.

John Willis:

How do people find you? I mean, you're out there — you're pretty easy to find. But for people who, like me, are fascinated by your conversations and the way you think about these problems: if they wanted to reach out to you, how would they?

Tracy Bannon:

Sure. Professionally, they can hit me up on LinkedIn — just search for Tracy Bannon and you'll see the pink hair. You can also hit up my email — not my home email, I have one just for this: trace, T-R-A-C-E, at Tracy Bannon dot tech. Hit me up by email. I'm always glad to hop on a call with folks; I have my calendar online and I set time aside so we can just have these conversations and noodle on things. That'd be the main way.

John Willis:

I'll definitely see you at RSA. Yes — that'll be good. And yeah, this was great; I really enjoyed it. So —

Tracy Bannon:

Thank you so much. I appreciate the invite, for sure.