Profound

S5 E11 - Diane Kulisek – Engineering Across Industries

John Willis Season 5 Episode 11

I have a conversation with Diane Kulisek in this episode. Diane, a veteran in quality systems and regulatory affairs, shares her journey from Gillette to Rocketdyne to Johnson & Johnson, weaving in the principles of Deming and the realities of complex, high-stakes industries. We dive into W. Edwards Deming’s seminal perspectives on quality and how they’ve shaped Diane’s extraordinary career across aerospace, consumer products, and medical device manufacturing. 


We start with Diane’s early work at Gillette, where she first encountered military-grade quality standards, and move into her groundbreaking experience at Rocketdyne. There, she managed space shuttle main engine avionics and led self-managed teams. Diane highlights the power of elected management and the deep cultural dysfunctions she observed, drawing sharp analogies to adult children of alcoholics and the normalization of deviance in corporate environments.


Our conversation then pivots to regulatory complexity. Diane explains how compliance efforts in medical device manufacturing must transcend minimum standards to uphold the priceless value of human life. She critiques the profit-centric motives of insurance companies and exposes the structural misalignments that can compromise quality in favor of greed and speed.


We also explore the limitations and potential of AI in auditing, with Diane emphasizing the importance of human experience in identifying risk and systemic failures. She proposes the provocative idea of creating an “AI Deming,” using Deming’s extensive body of work to model principled decision-making.


Diane’s reflections bring a critical eye to regulatory frameworks, the ethics of risk management, and the potential of technology to augment human insight. Through it all, she remains grounded in Deming’s enduring vision that quality is a moral imperative and a societal good.

This is Diane's LinkedIn Page:

https://www.linkedin.com/in/dkulisek


Show Notes:

1: https://www.scribd.com/document/451480272/MIL-Q-9858A-Quality-Program-Requirements-pdf

2: https://adultchildren.org

3: https://www.acquisition.gov/far/16.305

4: https://www.wsj.com/articles/SB991862606575154843

5: https://www.nasa.gov/wp-content/uploads/2015/04/708066main_Shuttle_Bibliography_2-ebook.pdf?emrc=c67e14

6: https://www.irvinestandard.com/2023/johnson-johnsons-innovation-irvine-roots-and-credo-to-give-back/

7: https://asq.org/-/media/ASQ-Supplemental-Media-Import/1/3/9/2/6/ar_1106_105018.pdf

8: https://www.dol.gov/sites/dolgov/files/oalj/PUBLIC/WHISTLEBLOWER/REFERENCES/STATUTES/SARBANES_OXLEY_ACT_OF_2002.PDF

9: https://store.pda.org/TableOfContents/Risk_Assessment_Ch01.pdf


10: https://www.healthcarefinancenews.com/news/class-action-lawsuit-against-unitedhealths-ai-claim-denials-advances


11: 

John Willis: [00:00:00] Hey, this is John Willis, another podcast. I think we've done a couple of podcasts about the In2:In Meetup. Unfortunately I wasn't able to make it this year. I know I raved about it, I tried to get everybody to go, 'cause I was there last year, and the people that I met there, and I've talked about this, were just incredible.

Careers that are sort of outside what we're used to in DevOps, DevSecOps, whatever we sort of talk about these days. But this year I did a virtual presentation, I had a family obligation, and my next guest had really, like, amazing questions at the end. In fact, I think the Q&A went longer than my presentation, and the bulk of the Q&A was Diane.

She was at the In2:In meeting, and then we had a call and it was just, oh my God, what a fascinating background. So, Diane, you want to go ahead and introduce yourself?

Diane Kulisek: Sure. Great to be here on this podcast with you, John. I gotta confess, this is my first podcast.

John Willis: Oh, there you go. Oh, well, yeah. With [00:01:00] your career too. That's crazy, right? But like, that's about talking. Well, you

Diane Kulisek: know, in, in my career actually, we have to be a little bit quiet because Oh, that's right. Yeah. We've got so many, yeah. Slings and arrows aimed at us. You sure? 

John Willis: Yeah. You gotta be really careful what you say, what you can say, what you, you know, what you slip.

I've had a lot of podcasts where people slip and then they'll call me after the thing. I'm like, hey, remember that thing? No problem, we can cut that out very easily. So,

Diane Kulisek: well, I'm in a profession that is actually accountable to a large extent for finding errors. At least that's been the traditional approach.

You know, we were the checkers, you know, the last place things go before they go to a client or a customer, and that is the quality assurance profession that has evolved over time. I'm now more of a quality systems subject matter specialist. And also, I've stepped into the regulatory compliance portion of this, or what's [00:02:00] often referred to as regulatory affairs.

So I bill myself as a quality systems and regulatory affairs consultant, currently focused on the medical device manufacturing industry, which includes the development phases. But before that, I worked about eight years with the Gillette company, which is a commercial consumer products company. Specifically, my area of focus was in their stationery products group, which was Paper Mate, Parker, Waterman, and Liquid Paper products.

But we also oversaw Duracell batteries, and we also oversaw Oral-B hygiene products.

 we oversaw Braun, which is responsible for electric razors and a lot of electric household appliances. And then lastly, Buxton, which was a, a leather accessory manufacturing company. I think we also had a cutlery company [00:03:00] in there somewhere, because I, I remember buying cutlery at the employee store at the time.

But that was my first massive corporation that I worked with. And, you know, it is the foundation for a lot of my approach to things, because frankly, in consumer products, at that time, quality assurance was not a requirement. 

But they had chosen to parallel, , the military requirements for a quality management system.

It was referred to as MIL-Q-9858A at the time. And I learned about the value of a quality system through Gillette. Hmm. So, that kind of set the tone for a lot of my forward approach to doing things. I have worked for probably 30 companies at least, maybe 40. And that's a combination of my consulting gigs as well as full-time regular employment.

But there only have been three major companies that I've [00:04:00] worked for. Gillette was the first. The second was Rocketdyne, which was part of Rockwell International when I joined, and previously Atomics International, which my grandfather and my mother had worked with.

John Willis: Oh, wow. Wow. 

Diane Kulisek: So I was a third-generation Rocketdyne employee in Canoga Park.

John Willis: Is that where you met Bill Bellows then? Is it?

Diane Kulisek: It is. We started very close to each other. Okay.

John Willis: Okay. Cool.

Diane Kulisek: Dr. Bellows came in as a physicist working with engineering and I was working in quality assurance and systems safety. Okay. I started out as kind of a generalist working on all of Rocketdyne programs, which were both civilian for the space shuttle main engines and the space station as well as military.

So they had missile rockets as well. 

John Willis: Okay. 

Diane Kulisek: Peacekeeper was, you know, one of the big programs, which is probably one of the worst named missile rocket systems. 

John Willis: Yeah. But 

Diane Kulisek: anyway you know, one of our nuclear rockets, but, you know, I started working [00:05:00] there first in procurement quality, which is, you know, working with suppliers.

Then with what they would call systems quality, but it was primarily corrective and preventive action, frankly. So we did internal audits as well. So I, I became a, a systems quality manager for the first time at Rocketdyne, and then that evolved into program management. So I ended up being the program manager over the space shuttle main engine avionics and controls.

John Willis: Oh, wow.

Diane Kulisek: It was about an 80-person cross-functional team. And one of the unique things about that structure is that the managers were elected by the employees that would be reporting to them.

John Willis: Oh, wow. So 

Diane Kulisek: there were six candidates for the quality management position, and I was one of them. As you can imagine, there weren't many women in management in aerospace at that time.

And I beat out my five [00:06:00] male counterparts to win that position. Electorally. 

So that was my, my last position I held there. It was a fun one. 

John Willis: Is that, that kind of, I guess, boy, we could go, this could take five hours. 

Diane Kulisek: Oh yeah. 

John Willis: Like a harder questions, but like, what are your thoughts about sort of an electoral management structure?

We don't, I wanna go too long into it, but I'd love to. 

Diane Kulisek: I am a firm believer in self-managed employee teams. 

John Willis: Yeah. Okay. Okay. 

Diane Kulisek: I've seen them succeed in ways that, you know, employees that management tries to motivate with management by objectives.

John Willis: Sure. And 

Diane Kulisek: performance assessments never could. 

John Willis: Right, right. 

Diane Kulisek: I do believe in, you know, a broader type of performance assessment.

I think we all like to be, you know, acknowledged for our contributions as individuals, as well as as a member of a team. I do think it's important to have trained facilitators for teams, in a non-decision-making role, [00:07:00] if you will, but I do believe that self-managed teams are much more effective and satisfying to work with than the traditional type of employment.

And that is very much, I believe, along the lines of what Deming advocated.

John Willis: And so one other question, because I wanted to continue. You said there were three major companies, but you were talking about the shuttle. Was any of that work with sort of parts or engines for which shuttle, or was it anywhere near either the Challenger or the Columbia?

Actually yes, 

Diane Kulisek: of course. We built the engines for all of the space shuttles. Okay. There were three engines per shuttle, and my particular team was accountable for all of the sensors that fed into a computer, as well as the software that, you know, facilitated that communication. So we were the first to be blamed when something went wrong.

It was never the, the hardware, the mechanical stuff. It had to be a sensor mystery, right. But frankly, during the time [00:08:00] I was there, we demonstrated high reliability in the output of our sensors. 

John Willis: Yeah. I'd love to circle back. In my crowd, there's a woman, Diane Vaughan, who wrote one of the books about the Challenger, and she sort of uncovered all the human factors that went into the disasters, you know, how NASA delays and speeds up, and anyway, so I'd love to sort of a lot

Diane Kulisek: of backroom politicking going on both within the manufacturing.

Yeah, 

John Willis: she called it normalization of deviance. That's what it was. And it was sort of deviant organizational behavior that just got normalized and normalized and normalized to the point where you know, things that happen on lift off or like, you know, the, all that stuff. And it's, I, I should, well, just 

Diane Kulisek: like any, any other situation, you're always balancing cost.

What I like to call greed. 

Against 

quality, you know, which, which I like to refer to [00:09:00] as what's needed and the requirements for on-time delivery or schedule performance. Right. Which I refer to as speed. So you're always trying to balance Right. You know, need with greed and speed and those factors. I don't think it's outside of the realm to say that many of the big companies I've worked for are structured like a dysfunctional family. You know, if you've ever heard of the Adult Children of Alcoholics program, ACA, you know, there, there's a certain pattern to the way people behave in that kind of a family

The person who is the most out there, the most aberrant, the looniest, you know, is the "child" of the family, even if they're in their seventies or eighties, and everybody else in the family works to enable that individual to thrive at their own expense. So when you look at a major corporation that [00:10:00] is deviant, as your colleague would say, oftentimes they're paralleling that model.

So you have a couple of executives that are out of their minds driving decisions that don't make any rational sense.

John Willis: Right, right, 

Diane Kulisek: right. But if you look at the underpinnings of those, they'll often be, you know, inspired by greed or, or a need for speed. 

John Willis: Right, right, right. Rarely 

Diane Kulisek: does need enter into the equation.

John Willis: That's right. That's right. Yeah. No, there's plenty. And, you know, we can circle back how we sort of Yeah. That triangle and software. I will 

Diane Kulisek: say that NASA incentivized quality through their award fee program, which is the way that cost-plus programs make a profit.

John Willis: Okay. 

Diane Kulisek: So everything is paid for upfront.

That's an expense. And a company has to demonstrate that it's met certain goals to achieve and fulfill the needs of the customer, in this case, NASA, [00:11:00] in order to receive a stipend that is essentially a huge bonus that qualifies as the profit for the company. So when they say it's a cost-plus program, that's what they're referring to.

There's an award fee that's assigned. 
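As a rough sketch of the cost-plus-award-fee arithmetic Diane describes: costs are reimbursed with no markup, and the contractor's entire profit is the fee the customer awards based on a performance evaluation. The dollar figures and score below are entirely made up for illustration.

```python
# Hypothetical cost-plus-award-fee contract: all allowable costs are
# reimbursed dollar-for-dollar, so no profit comes from the cost line.
allowable_costs = 100_000_000   # reimbursed as expenses
award_fee_pool = 8_000_000      # maximum fee available for the period
performance_score = 0.92        # customer's evaluation of goals met (0.0 - 1.0)

# Profit is only the portion of the fee pool the customer awards.
earned_fee = award_fee_pool * performance_score
total_paid = allowable_costs + earned_fee

print(f"profit (earned award fee): ${earned_fee:,.0f}")
print(f"total paid to contractor:  ${total_paid:,.0f}")
```

The incentive Diane points at falls out of the last two lines: missing quality goals shrinks the only term that is profit, which is how NASA made quality pay.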

John Willis: Yeah. Alright. Circle back. You're at Rocketdyne. And you said there were three companies that were

Diane Kulisek: Yeah. The third major company that I worked with, and by the way, I should mention, I have pensions from all of these companies. Oh, 

John Willis: that's awesome. Yeah. 

Diane Kulisek: Is Johnson and Johnson.

John Willis: That's a, that's a good one. That's a nice one. So, 

Diane Kulisek: you know, Gillette went bankrupt in the nineties (Edit: Gillette WOULD have needed to file for bankruptcy in the nineties, however, it instead appointed a new CEO in 2001 and Procter & Gamble acquired it in 2005. The new CEO, James Kilts, described the challenges facing Gillette as "The Circle of Doom")

NASA lost funding for the space shuttle program, which led to NASA laying off thousands of engineers. I was kind of left without a chair in a game of musical chairs. And I looked around and said, where is the next industry I should focus on? I was and remain thoroughly convinced that the skills I've learned to become a quality and regulatory specialist are highly transferable. Right. Despite the [00:12:00] complexities of each industry.

And so I saw the healthcare and medical device industries as my next horizon, right? And I began consulting for Johnson & Johnson in 2010, after having worked for a number of smaller companies, including in the high-tech and filtration industries. But ultimately, I went to work for Johnson & Johnson as their quality systems manager after a couple years of consulting.

John Willis: Okay. 

Diane Kulisek: So, all told, I was only with them for about three, four years. Okay. Only one and a half, two years of that as a regular hire. But it was a meaningful few years. It was an opportunity to see how the FDA applies its influence to a massive corporation. It was an opportunity to observe how a massive corporation [00:13:00] manages every facet of the business.

Because in quality assurance, you touch all of it. Right, right. You know, whether it's, whether it's the design and development activities or whether it's, you know, dealing with a failure that's been reported in the field, you, you have to touch all of it and everything in between. So it was a, a great learning experience for me.

And you know, they say, you know, experience is what you get when you don't get what you wanted. Well, 

John Willis: wow. That's right. Yeah. Well that's 

Diane Kulisek: ultimately Johnson & Johnson consolidated the majority of their medical device manufacturing in Irvine, California, where I was based at one of their divisions, and they ended up with multiple quality system managers.

So I was given an opportunity to take a severance package, which included early retirement benefits. 

John Willis: Got it. So there's a lot to unpack here. But I'd be remiss if we didn't at least cover my favorite Deming story of all the stories I've heard so far on this podcast: how you first met Dr. Deming.
Demings. 

Diane Kulisek: Yeah. In person. 

John Willis: Yeah, yeah, yeah. 

Diane Kulisek: I had a mentor, when I was working as a volunteer, what they now call a member leader, for the American Society for Quality, who was able to get me into a suite where they were hosting a reception for the so-called quality gurus. You know, and this included Deming, Juran, and, I'm trying to think.

It wasn't Taguchi... who was it? It was Ishikawa.

John Willis: Oh, wow. Wow.

Diane Kulisek: And so they had these very notable quality role models there for me to meet. My mentor, Ray Goldstein, was a fellow of the society and a frequent author in the society magazine on topics like statistics. He likes statistics best, and so he allowed me to go with him. And it's interesting, because I [00:15:00] did meet Taguchi the same way, through Bill, even though Bill and I were kind of peers. Yeah. He had an inside track to take me to a meeting where I met Dr. Taguchi. Wow.

John Willis: Wow. 

Diane Kulisek: But Ray took me to meet Deming, Juran, and Ishikawa. And

I was very young. I think it was in the eighties (May 1986). I graduated from high school in '74, so I would've probably been in my twenties, late twenties, early thirties. Okay. I would say late twenties. Okay. All right. It was back when women were still wearing these miniskirts. Yeah, yeah. Yeah. And they had these micro miniskirts that had like ruffled panties that were real cute.

And I didn't know anything about. You know, appropriate attire for a conference like that. It was hot, you know, and I was, I was looking to stay cool. So I showed up in one of these little outfits. 

And you know, I was just agog at this crowd of distinguished professionals. And Ray took me over to shake hands with the people that were lined up [00:16:00] in a scenario where they would be taking a photo that would appear on the cover of Quality Progress magazine, which was ASQ's flagship magazine.

And as I was working my way from left to right, I think I sent you a picture where Deming's sitting in front of the group on the left. 

John Willis: Yeah. 

Diane Kulisek: Deming grabbed me and sat me in his lap. And I was quite surprised, despite knowing I probably wasn't dressed appropriately for the occasion, 

John Willis: Uhhuh. 

Diane Kulisek: And he was laughing and, and looking at me and behind him was Juran.

And even though on the magazine cover Juran is smiling. Yeah. I saw an outtake of the photos and he was definitely scowling in it.

John Willis: Yeah. Didn't you say that, you were saying that you think you felt that maybe Deming did it as a jab to Juran? 'cause he's so, 

Diane Kulisek: I do believe so. He was scowling 

John Willis: at you and Okay. Yes, yes.

Was like, what are you doing? Because Juran was, I mean, a brilliant man. Right. And like he's worthy of his own book, you know, [00:17:00] or somebody else writing a biography of him, but sure. Probably not me. But by all appearances he was a jerk. I mean, he was just a jerk. And so I think it's a great story of Deming, like just, yeah.

Sort of, 

Diane Kulisek: I never saw the photograph with me in his lap, but I know it was somewhere. They probably shredded all those negatives, but,

John Willis: oh no, 

Diane Kulisek: I would've loved to have had one. It would've been a great legacy for me.

John Willis: Yeah, no kidding. That's a great story though too. I love the whole Juran angle too, of him scowling, and yeah.

Diane Kulisek: What are 

John Willis: you doing on stage, you know? And did Deming notice? Yeah. I don't

Diane Kulisek: know what I was doing at the time, but I, I don't, I really don't recall being at work at the time. 

John Willis: Ah, yeah. So 

Diane Kulisek: I may have been between jobs, you know, to be able to afford to take time off to go to that. 

John Willis: So I think this, where it gets a little interesting is I wanted to give you a little bit of background about how a lot, a fair amount of people who listen to this podcast think about audit.

And I won't say exclusively, so I apologize to the listeners that don't fall into this, [00:18:00] but a bunch of us, about nine of us, wrote a book called Investments Unlimited. And we did a paper two years before that that we called DevOps Automated Governance. And one of the things that we started seeing is the way internal audit was working in software delivery, mostly in banks, but pretty much any industry that delivered software.

You know, as a sort of core business, and I can't think of a business that isn't kind of software-driven as a core business, that the audit process was really just theater. I mean, it was, you know, like people writing in documents or tickets saying, hey, we did this, we did this, we did this. You know, and good luck finding the logs in internal audits.

The chaos would begin in the internal audits. You know, these banks, for six weeks a year, it would just be pure chaos, where, you know, in banking they call it the three lines of defense. So there'd be this, first off it was already messy because it was three different groups.

The middle group [00:19:00] supposedly being the translator between the two groups, all this nonsense. And so the thing that we kept seeing in the industry is all this wasted time, you know, sort of negative-ROI stuff, on like this six-week period of internal auditors saying, well, can you tell me why you did this?

And then like, phone calls and emails. And so we started thinking about, could we build a system that creates digital attestations and assets. So instead of somebody just writing in a record that we did this, we did this, we covered this regulatory control by doing this, this, this, we said, well, since the system's all automated in software anyway, why don't we just timestamp and create sort of an encrypted, you know, what we call digitally signed immutable evidence, so that at the end of the day, the auditor should be able to look at it and say, unless they believe that digitally signed evidence is no good, we're [00:20:00] done.
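The attestation idea John describes can be sketched in a few lines. This is a minimal illustration, not the system from Investments Unlimited: the record layout, the field names, and the use of an HMAC with a shared demo key are all assumptions for the example (a real pipeline would use an asymmetric key held in a KMS).

```python
import hashlib
import hmac
import json
import time

SIGNING_KEY = b"demo-secret"  # hypothetical; real systems would use a managed key

def create_attestation(control_id: str, evidence: dict) -> dict:
    """Bundle evidence with a timestamp and sign the canonical JSON bytes."""
    record = {"control_id": control_id, "evidence": evidence, "timestamp": time.time()}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_attestation(record: dict) -> bool:
    """Recompute the signature over everything except the signature field."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

att = create_attestation("change-approval", {"pipeline": "ci-run-123", "tests_passed": True})
print(verify_attestation(att))   # an untampered record verifies
att["evidence"]["tests_passed"] = False
print(verify_attestation(att))   # any edit after signing invalidates it
```

The point is the one John makes: instead of chasing emails and tickets, the auditor checks one signature per control, and trust reduces to trusting the signing key.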

Not done, but like done, done. And so, you know, banking is where we started, but, you know, medical devices are interesting, consumer products. And very little, you know, at least of the people that I've worked with, were in sort of aerospace. And I guess the thing I would think, based on all of that, and like pure freedom here, don't feel obligated to dive into our world.

I think since you've been in regulatory control in three very interesting different industries, like, and I don't wanna get this wrong, but what's the commonality that may help us learn a little more? And I'll add one caveat, like I think I said this before we started recording: one of the companies I worked with was sort of responsible for the delivery of the cold chain for Pfizer.

And we had to understand, from a regulatory control standpoint for their software delivery, all the [00:21:00] FDA requirements. And it sort of opened up a whole world of, wow, why aren't we looking more here? We were looking at banking regulatory information, you know, in general, like, okay, that's a good model to understand banks, and they're pretty specific, you know, don't mess with people's funds, but when you get into medical devices, now you're messing with people's lives, right?

And, and you start, well, 

Diane Kulisek: there's two ways to look at people's value 

You know, one is that it's priceless, it's irreplaceable, it's unique, and, and therefore you can't put a monetary value to it. The other one is that it's worthless. You know, that people are expendable, that there are necessary losses.

And if you wanna make money, you have to endure those. And it's always a, a balancing act between those two philosophies. I personally think human beings are priceless. 

You know, I, I am appalled by how many decisions are, are based on money. One thing that somebody asked me about [00:22:00] recently is I would imagine that the FDA makes their decisions based on, you know, what's most cost effective.

Right. And I said, actually no. In fact, when you're being inspected or investigated by the FDA, it's almost a given that if you bring up money with regard to any decision that's been made, you will be found in violation of some tenet of the FDA's regulation. Oh,

John Willis: okay. 

Diane Kulisek: Because they have deeply ingrained the concept that human life is invaluable.

It's, it is priceless. 

Therefore money should never enter into the discussion. However, I will tell you that many of the choices made about, you know, whether people benefit from something or not, go back to cost, go back to greed. And, you know, some of that has to do with, you know, the deviant behavior that you mentioned earlier, that, you know, not everybody is, [00:23:00] you know, good-hearted, generous, you know, appreciative of anything other than profit.

John Willis: Oh yeah, no, come on. You're saying drug companies are, are in it for profit. Yeah. 

Diane Kulisek: Nothing shocking there, right? Yeah, yeah, yeah. But, but you know, even though the, the FDA, you know, will hold people accountable for that if it happens, 

John Willis: right. 

Diane Kulisek: You know, there's this pervasive philosophy that the clients of medical device companies are not patients. They are not even doctors or healthcare providers like hospitals or clinics. They are the insurance companies. And so, you know, who pays for the things these companies develop? It's very rarely an individual. It's always another huge institution. And so if your shareholders are people like UnitedHealthcare or, you know, Cigna or Blue Shield, and you have to decide whether to invest another, you know, several million dollars to raise the bar on quality for whatever it is that you're providing to, you know, the healthcare industry, so to speak, which includes their insurance companies.

It's probably not gonna happen. Insurance companies, even when everything else is tanking, stay profitable.

John Willis: Right. That 

Diane Kulisek: is their number one priority. I don't think there's anybody that, 

John Willis: and this is a very naive question, but shouldn't there be overlap with the insurance companies? Because, you know, to take a real cynical view, insurance companies don't really care about sort of the priceless concept other than, you know, their profit or loss.

Right. Well, 

Diane Kulisek: let me mention one thing about insurance companies in relation to the quality professions, 

And 

regulatory professions that you may not realize. [00:25:00] You mentioned, you know, that your first involvement with auditing was through finance. 

John Willis: Right? 

Diane Kulisek: There's a reason for that and it's called Sarbanes Oxley 

John Willis: right?

Diane Kulisek: It's a financial regulation. 

John Willis: Right. 

Diane Kulisek: It's not about people's health. And where would the US government be most interested in applying regulation? It would be the finance industry because it runs our government. 

Now here's another thing that goes even further back. Where do you think the concept of risk management arose?

It was the financial industry. It was insurance companies. So we put a lot of emphasis on failure modes and effects analysis, failure modes, effects, and criticality analysis, you know, detectability of problems with quality. But the bottom line is that whole science emerged out of the financial industry, and specifically out of insurance companies.[00:26:00]
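The FMEA-style scoring Diane mentions is commonly reduced to a risk priority number, RPN = severity x occurrence x detection, each scored 1 to 10. A small sketch with invented failure modes and scores (the names and numbers are illustrative, not from any real device):

```python
# Hypothetical failure modes, each scored 1-10. A higher detection score
# means the failure is harder to detect before it reaches the customer.
failure_modes = [
    {"mode": "sensor drift",      "severity": 8, "occurrence": 3, "detection": 6},
    {"mode": "connector fatigue", "severity": 6, "occurrence": 5, "detection": 4},
    {"mode": "firmware hang",     "severity": 9, "occurrence": 2, "detection": 7},
]

# Risk priority number: RPN = severity * occurrence * detection
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Highest RPN first: these get corrective-action attention first
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]:18} RPN = {fm["rpn"]}')
```

The detection factor is the actuarial fingerprint Diane is pointing at: a severe failure you cannot detect is scored as a bigger risk than a severe one you catch every time.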

So it doesn't surprise me that they are also running healthcare, and, you know, frankly, if they can't profit by whatever it is that needs to be done to address your needs, at least from a public relations perspective, they probably won't do it. I recently learned that UnitedHealthcare has now begun applying AI to more quickly decline claims.

John Willis: Well, and yeah, I've been reading articles where not only that, now claimants are using AI, and now it's this sort of like ridiculous

Diane Kulisek: war of AI.

John Willis: Yeah, it is, it is. I've already sort of posted a couple articles here where not only are they using it, but the sort of lawyers, or the representatives of clients, are using it to sort of outsmart the providers, and the providers are trying to outsmart them.

You know, it's like, it's literally like, you know, the radar guns on highways, you know, so, oh, yeah, yeah. But I guess my, my other question, and just more for me [00:27:00] is that, is there, shouldn't there be some overlap that insurance companies, like, even though they don't really per se in in a grossly satirical way care about life, the fact that they care about losing money because of losing life should overlap with what 

Diane Kulisek: Yeah.

To the extent that it fuels their public relations. 

John Willis: Well, but isn't it, there's some financial overlap too, right? In other words like preventing creating risk control on a new drug or a new device. 

Diane Kulisek: There's no profit in somebody living longer or being healthier. The sicker you are, the more money they make. Okay.

John Willis: Yeah.

So yeah, so's the more 

Diane Kulisek: terrified you are of dying, the more money they make

John Willis: I was thinking about the cause of death, but yeah, you're right. The overall systems of the is better. Yeah. That they wanna 

Diane Kulisek: keep you sick. 

John Willis: Yeah. Yeah. Got it. Okay. Alright. So, but then circling back then, like sort of like. Like, so the FDA has brilliant teeth.

We can probably trace risk management. I mean, there, [00:28:00] there are people who talk about actuarial tables going back, you know, to World War II, right? But so, let's sort of bleed it into, what does Deming tell us, given the story I told you about what we are trying to solve in all industries, like the commonality you've seen, and then, what's sort of a Deming overlay on this?

And I guess the last question, am I making sense right now? 

Diane Kulisek: No. Yeah, you're making sense. I, you know, I guess to rephrase your question, to make sure I answer it correctly. Sure. It's, you know, would automation of the audit process at a broad level be of value in our society today, as opposed to spending enormous resources doing these things?

Over a longer period of time with human resources. Is that the question? 

John Willis: That's... it's not exactly the question I was asking, but I think it's a better question than I was asking. Yeah, let's go with that. Well,

Diane Kulisek: I, [00:29:00] I think the answer is yes, and in fact I've already seen people working on that.

John Willis: Okay. 

Diane Kulisek: The question is, will it have enough information to ask the right questions? One of the things that I became fairly well known for as an auditor... I actually was a lead auditor for Roche briefly, in their in vitro diagnostic area, specifically on CAPA, corrective and preventive actions.

And they sent a representative from their facility in Germany out to partner with me as we went from facility to facility doing audits of this particular activity. And it was internal audits, but driven at the corporate level. We could go in and look beyond what was written, look beyond the evidence that was being presented, look beyond [00:30:00] whatever public information might be available, for instance, on the FDA website about, 

John Willis: you know, 

Diane Kulisek: various issues that had been noticed, and assimilate an opinion about where the vulnerabilities were so that they could get in front of them.

Okay. The financial motivation behind that is that it's extremely expensive for a large corporation to respond to a non-conformance issued during a regulatory audit. So they spend an enormous amount of effort and resources to prevent that from happening. That's why you were seeing what you were seeing at the banks.

Okay, 

John Willis: so both on the pre and the post, right? 

Diane Kulisek: Yep. Yep. 

John Willis: Preparation. You do, which in the end is low efficacy for the post. Yeah. It's human developed. It's sort of, it's at the whim of So, 

Diane Kulisek: [00:31:00] So basically a regulatory investigator or inspector is like a cop coming in with free access to the vast majority of your business.

John Willis: Right? Right. 

Diane Kulisek: And, and anything that they might find that is in violation, believe me, if they look hard enough, it's just like a cop following you through town. They're gonna find a problem at some point. 

John Willis: Yeah. Yeah. 

Diane Kulisek: Okay. So you wanna be on their good side. You wanna put on a, a really great first impression.

You usually start with a tour of the facility that's guided by somebody in management, and you wanna document everything that's going on. But, going back to, you know, asking the right question. 

John Willis: Right. And 

Diane Kulisek: somebody that knows where to look. What we used to call it was knowing where the bodies might be buried.

Right. Right, 

John Willis: right, right. 

Diane Kulisek: You know, I used to say every time I, I found a problem and just kind of shoved it to the side to see what was behind it, I'd find a dozen more problems. 

John Willis: Yes. Right. That's right. The 

Diane Kulisek: deeper you dug, the bigger the problems were and the higher up they originated, [00:32:00] until you eventually end up with, you know, "the president of the company caused this."

You can't address that to prevent recurrence unless you're willing to go with a dark horse, a new president, and believe me, these guys have been vetted six ways from Sunday. They're not gonna do that lightly. Okay. So you have to find the root causes of the battles you can win.

You may not win the war, but you have to deal with the battles you can win in order to be reasonably successful in preventing that regulatory finding. You have to be able to ask questions that go beyond the evidence you're seeing, go beyond the written text, because you could frankly go on the internet, even 10 years ago, and find 50 examples, or even entire quality systems. You don't need to have a lot of knowledge to create the documentation. 

John Willis: Well, yeah, right. I mean, a lot of people are trying to build AI now to sort of do that for you almost [00:33:00] automatically, 

Diane Kulisek: but you need experience and you need deep experience and broad experience to be able to get in front of that train.

I can't give you any specific examples, sadly, without violating nondisclosure agreements.

John Willis: I think one of the ways the banks have dealt with this reasonably well is that idea of three lines of defense, right? And in theory, that second line is supposed to be technical enough to go deeper into the left-hand side, which is the first line, the owner of the risk, and then be able to translate that to the third line, which is the pure audit.

Again, the problem with all this, I think you just pointed out: you know, do you pick the battles you can win? And that's on all sides. Yeah. That was sort of our problem. The inertia of that sort of concept gets to a point where you're not really protecting anything.

You know, and I always [00:34:00] say, the reason, for me, the way I think about it in most of the businesses I work with, we're not usually life or death, like medical devices or hospital or patient care. It's protect the brand. 

Diane Kulisek: Yeah. 

John Willis: At the end of the day, like a bank, that's a PR piece.

Yeah. So, but I mean, at the end of the day, what puts a bank out of business? A bank has a couple of ways of going outta business, right? They can literally lose their license. Unlikely. Like, you'd have to do something disastrous if you're JPMorgan Chase to lose the license, you know. Yeah.

Pretty hard. But you could destroy your brand, you know. Like, if I go three miles from my house, there are four different banks I can choose. Or if I go online, there's a hundred different banks I can choose to put my money in or do my transactions with.

Sure. But yeah, so I think what most people lose sight [00:35:00] of is why the corporation, from head to toe, is doing all these things. Not necessarily the "what we have to do to not get fined," or "what we have to do to pass an audit." And unfortunately, in most organizations I dealt with, they sort of take the "what I have to do," and then inertia becomes acceptable.

Like you said, the idea of giving the auditor a tour of the facility and, you know, sort of the big smoke screen of the data center, the information center, the 

Diane Kulisek: dog and pony show for the 

John Willis: trust me, look at this room. We've got it under control. Right. 

Diane Kulisek: Well, you know, it's interesting you bring that up, because one of the things that has emerged over the last 20 or so years for me is that compliance is not enough. 

And when you deal with the [00:36:00] regulatory field, that's really all that matters. You know, what is the law and do we comply with it? 

From my perspective, and I think from that of many leaders, even outside of the quality profession, complying with the law is the minimum we should be doing.

Now there's also the philosophy that some laws are written to be broken, you know, so, so which laws do you comply with? If one makes no sense and does more harm than good, should, should you comply with that or not?

So, so there's a point where you have to have somebody that is willing to cut the baby, so to speak.

Right? 

John Willis: Well, there's definitely... I've heard stories of arbitrage even on Sarbanes-Oxley, right? In other words, a trading organization that literally decided at the CTO level... and it's the vagueness of the laws too, right? That's a whole... yeah, yeah.

They're written by lawyers, so, right, right. Yeah. If they're written purposely vague, I'm gonna interpret it as this, you know. But like, even then, the cost of trying to [00:37:00] even do what the generally interpreted version of a sort of regulatory control, maybe even a law,

Yeah. It exceeds the amount of fines I would get. Even if I get fined, I'm better off not complying. Right. And 

Diane Kulisek: that applies in the healthcare industry as well. It 

John Willis: does. That's scary. But yeah. Yeah. I mean, it is 

Diane Kulisek: scary and I've seen some really horrible things over the years. 

John Willis: So, I think what I was trying to get at: what is the Deming take?

I know I did a little search while you were talking. I remember looking up, not to get too deep into it, but the thing that really blew a couple of us away when we started looking at the FDA: it was FDA 21 CFR, and it just seemed almost like Deming wrote it. And, you know, more than anything I've seen in the banking industry in terms of regulatory control, from the FDIC or the OCC or all those financial regulators, excluding Sarbanes-Oxley, which is sort of general, it [00:38:00] looked like the FDA was far ahead. And it would make sense, 'cause its intent is life and death.

But, like, I guess in general: what's the Deming take, across your career, on how you see it from regulatory control?

Diane Kulisek: Well, you know, I think it goes back to. What drove Deming to focus his life the way he did. 

John Willis: Right 

Diane Kulisek: on, you know, how to return the pride of workmanship to employees on how ensuring quality can help companies be profitable and stay in business.

You know, there are lots of bits and pieces of what Deming's philosophy conveys: not demeaning people, you know, by using arbitrary objectives, or ones that are based on profit when their job has very little to do with profitability, which is the case for a lot of quality professionals.

I think there is a certain level of [00:39:00] altruism that is inherent to people that choose quality assurance as a profession. It evolves into quality systems and it, it can encompass regulatory affairs. But I think the bottom line is that people in the quality profession endure an awful lot of slings and arrows because it's an altruistic kind of profession.

You're trying to elevate the quality of life by ensuring the quality of goods and services, and it's, it's a global concept. 

John Willis: But isn't that, that's part of what's been wrong with quality too, right? Is the non Deming approach. Right. Which, 

Diane Kulisek: well, and, and there's a lot of gray, there's a lot more gray area than there is black and white.

Yeah. And I think a lot of new quality professionals struggle with that. 

Aren't I supposed to be? If I comply with a specification, that's the black and [00:40:00] white, right? But there's so much between that and doing good for the world. That's right. Right. So, are you willing to take risks in order to raise the bar, in order to improve quality for all of mankind, versus improving, you know, benefits for shareholders and company owners, beyond bonuses for executives?

Are you willing to take a stand, even though you may fail numerous times, even though you may have to change companies numerous times? Are you willing to be humiliated in public because of information that wasn't shared with you when you made your choices, you know, in order to pursue that mission? 

And I think Deming was the kind of guy who knew that he had to get to a place where he didn't care what other people thought of him, especially those in power. 

John Willis: Yeah. In order [00:41:00]

Diane Kulisek: to have that impact, 

John Willis: that was clear. Yeah. I mean, that was clear at least from, again, I'd never met the man and so 

Diane Kulisek: I, I don't think it's completely out of line with what Deming was portraying.

I think Deming actually started being a hard line quality assurance pro. I mean, he was involved with Taylorism, right? 

John Willis: Right. Yeah. 

Diane Kulisek: And then he evolved into the more enlightened being that we consider our role model, our mentor, today. In his writings he's saying, look, you cannot care about certain things if you're really going to make a difference.

If you really want to make sure that you don't become a victim of a traditional performance assessment, you have to defy the traditional performance assessment. If you happen to have everything lined up correctly when you do it, and you have an alternative solution, you may survive it. If you don't, you may find yourself out of a job, and it may not [00:42:00] be a good time to be out of a job.

John Willis: Right. 

Diane Kulisek: The chances that any of us will ever become the guru Deming was are pretty slim. 

John Willis: Yeah, yeah. Well, one of the problems with Deming, and I think this is worth saying, is that, you know, I'm gonna get slammed. I did this podcast with Lonnie Wilson and there was a lot of flak. Lonnie tells it as it is. And I think one of the criticisms of Deming, you know, I might be flamed for this:

It's easier said than done, right? Like you said, we might not all be Deming. But the problem is, most of us have to live in a world of chaos and resistance and layers upon layers, and pay the bills and take care of the people we live with, right? And then at some point... 

Diane Kulisek: are quality of life, you know?

Right? 

John Willis: Yeah. Right. And then, like, I have to make those risk decisions myself. How many times have we made that decision sitting in the room with, you know, the oak desk? Yeah. Don't say it. Don't say [00:43:00] it. Don't say it. Right. I gotta say it. Don't say it. You know, the devil and the angel on your shoulders.

You know, so yeah. But no, I think, in a perfect world, most people listening to this podcast, and certainly anybody who was at In2:IN, believed that Deming was spot on. And one of the things I was gonna ask you about, I probably asked Bill this too: Bill's very big on Taguchi, and I learned a lot from Bill about Taguchi. And I always forget the quote, I need to put it on my wall; it's one of my favorite quotes.

But basically Taguchi is saying that sort of a, a system is how it sort of helps society. 

Diane Kulisek: Yeah. 

John Willis: It's a 

Diane Kulisek: contribution. The Taguchi function is about loss to society. 

John Willis: Yeah. Yeah. That's exactly right. That's, 

Diane Kulisek: and so the alternative of that is benefit to society. Right? Right, 

John Willis: right. Exactly. Right. Which is the implied, right.

I wonder, 'cause I don't recall Deming... I think Deming talks about joy in work, and he talks about pride of [00:44:00] workmanship. And I'm terribly paraphrasing him now, but, like, if you get quality, you'll get profit. That kind of stuff.

Right. But I don't recall him ever going all in, like Taguchi, on society. I'm sure he believed that, though. Well, 

Diane Kulisek: the whole Out of the Crisis book is talking about... That's true. ...economic recovery. That's the 

John Willis: Yeah, yeah. In fact, that book, you're right, wasn't about how managers can fix corporations.

It was about how America needs to fix itself. Yeah. So you're right. That's good. Thank you. So he, 

Diane Kulisek: that was his altruistic mm-hmm. Contribution, I think 

John Willis: of what, what he meant about like, the world. And I, I have a feeling that, that at that point, he might have thought that was his last hurrah too. You know, he didn't realize he was gonna have another 10 years of, was like the most significant part of his career.

Diane Kulisek: love that he was in his eighties when he put that together, you know? 

John Willis: Yeah, yeah. Like, I, I always say that it 

Diane Kulisek: gives me another 10 years, you know? Ah, 

John Willis: Yeah, yeah. So I'm [00:45:00] in a similar boat to you. But yeah, no, I think you're right. There is something to be said about him

writing that book, thinking about him being fed up with everything that was going on in America, and rightfully so, because they listened to him, then they didn't listen to him, then they listened to him again, you know? And then, so what... 

Diane Kulisek: let, let me, let me talk a little bit about what we've been saying with Sure.

Yeah, yeah. I mean, in relation to this AI audit approach, what you and I were having our conversation about during the Q&A at the recent In2:InThinking forum, oh 

John Willis: yeah, that's right. Correct. Thank you. Yeah. 

Diane Kulisek: was about, what would Deming do? It kept coming up over and over again at different presentations, and I said, well, why don't we create a Deming AI and try to compile it from all of his trademarked and copyrighted intellectual property, as well as whatever's in the public domain, and see if we can come up with what Deming would do, [00:46:00] utilizing AI.

And I think the same thing could be said for creating any kind of intellectual persona that emulates a real-life character. Could you do it for Jesus? I mean, that's where the "What would Jesus do?" concept comes from. Yeah, yeah, 

John Willis: yeah, yeah. Based 

Diane Kulisek: on the New Testament, could you compile something that emulates what Jesus would do in a given situation when it responds to a question? You know, Deming could be viewed, not as Jesus, but as an entity that we might want to emulate a response from, based on all of the work he has left for us to appreciate in this day and age.

And weaving that into an AI of some sort. But it goes back to the concern about, you know, not knowing what you don't know. And right now, we as human beings do not understand adequately [00:47:00] consciousness. 

John Willis: Yeah. 

Diane Kulisek: We do not understand adequately what people lightly call the spirit or the soul. They're not necessarily the same thing.

John Willis: Right? Right. 

Diane Kulisek: We don't understand, you know, even our most basic thought processes; scientists are still grappling with them and have nowhere near approached what I would consider a level of competency. So if we build an AI, whether it's to emulate Deming or any other heroic figure for any profession, would it bring to bear on a response to a question the same type of spirit, if you will,

Okay. 

John Willis: The, the and 

Diane Kulisek: consciousness that a true human being would? 

John Willis: No. And so, I sort of cover this in the epilogue, and I kind of wish we had this conversation before I wrote the [00:48:00] epilogue, 'cause I think I would've added this. So the thing is, what we're failing to realize... the answer is no.

The answer is, we can do everything we possibly can to throw into sort of a model, if you will. I do this for clients all the time: I take all their documentation or all their press releases, and I make it so you can have a chatbot about whatever they do, right? And you can do that, right?

And you can take everything, and in a sense it's kind of all there anyway, because what they're doing in building these models is scraping literally what's probably 10 years of everything that's out there about Deming, and putting it in the model. The difference is,

in the algorithms that give you your answer, there's a lot of, sort of, geometry, quite frankly, that might get you not the right answer, or the wrong direction. But what you could do is take all that Deming data that [00:49:00] was scraped and put it in its own model.

Yeah. And it would be much tighter. In other words, it wouldn't look for an answer elsewhere. When there's vagueness, it's all probability-based, right? So if you asked a question, it's gonna try to answer it. Maybe Deming didn't really answer it this way, but it found a better answer from somebody else and gave it to you as a Deming answer.

If you used only Deming data, well, even then you have the variability of who wrote about it and what was their knowledge. Like, today you ask about Deming and PDCA, right? Most people who know Deming pretty well know Deming never condoned PDCA; it was PDSA. But you could do all that, and the answer would still be just what's in the model.

And that's one of the points that's missing from the conversation about AI: the most useful mechanism in AI is not treating it like a human, it's treating it as a sort of research collaborator on steroids. So if I had a Deming model that literally took everything known, [00:50:00] maybe even DemingNEXT, and everybody put in all the proprietary stuff, and we had this real tight model,

then at the end of the day, just like when you're using ChatGPT or any of the models right now, the most effective way to use it is to think of it as this unbelievable library of resources. Yeah. And start asking questions against that library for what you are trying to formulate

your opinion on, or your research on. And, you know, then you can challenge it too. And I think what people miss is the point that... what I kind of focus on in the epilogue is that these are cognition engines. They're alien cognition.

Like, we cannot deny that these things reason. They sort of emerge answers, they sort of converge, they have emergent behavior. But [00:51:00] they're not human. And to your point, all the things that make a certain human, we don't understand; we may never understand. So they can't... like, if I put Deming and the greatest model with the most advanced reasoning against each other, 

Diane Kulisek: yeah.

John Willis: You know, Deming's gonna have that sort of what made him unique. Above most humans, when you ask the question, he's gonna have some cognitive nature or some abductive or some heuristics in his life that's gonna give you another answer. But that doesn't say that this alien cognition that we've created can't be incredibly helpful.

Diane Kulisek: Oh yeah. 

John Willis: Right. That's when I 

Diane Kulisek: was working in aerospace, we talked about capturing lessons learned from those that were retiring. There was a... Oh, yeah. Yeah. Totally. ...a two-generation gap in the workforce. 

John Willis: Yeah, yeah, yeah. 

Diane Kulisek: Because of financial issues. And [00:52:00] so all of my mentors within the industry were retiring faster than we were hiring new employees.

John Willis: Yeah, yeah, yeah, yeah, yeah. 

Diane Kulisek: And I said, you know, we need to capture this. I mean, it was something as simple as: yes, we have all these written instructions, but what is not written there, and never will be. That's right. Is that before you apply this part to that part, you need to leave it sitting on this counter for two hours, and then you have to put it through this cleaning process, and grab this specific tool, with this identification on it, from that location, in order to fit it correctly.

John Willis: Okay. No, and I can think of another industry. I went back for a wedding; my wife's family are all farmers, basically. And they were talking about how the people my wife's age, my age basically, their fathers were the last generation that lived between two generations of farming. I mean, farming now is so advanced and electronic and digital and [00:53:00] Yeah.

robots and all that stuff. And then the world before that, and that's all being lost. Like, there's gotta be some insane knowledge transfer needed between people who worked the farms before automation. 

Diane Kulisek: Yeah. And 

John Willis: the advantage of spending the, the, well the 

Diane Kulisek: same thing is happening with computer skills.

So I'm gonna tell you this quick story. I had two roommates briefly because their house had burned down 

John Willis: Uhhuh 

Diane Kulisek: and I had a big home. So, you know, each one of them took a room in my home. They were 20-somethings. Okay? Okay. I was in my forties. And I was staying in another city while I was working for a particular company as vice president of quality.

John Willis: Okay. 

Diane Kulisek: They were staying at my home and feeding my pets and taking care of, 

John Willis: yeah. 

Diane Kulisek: But they also were gamers, and one of them was kind of a sysops guy, and the other one was a programmer. 

John Willis: Okay. 

Diane Kulisek: And they had lots of friends in other roles [00:54:00] within the gaming industry. And so I came home for the weekend one time, and I woke up and came downstairs, and the two of them were sitting with their laptops at my dining room table, looking totally distraught, totally browbeaten.

And I looked over, and they explained to me that they had been infected with a worm and they were losing everything on their hard drives. 

John Willis: Right. Right. 

Diane Kulisek: So I grabbed the first computer and I entered, you know, a backslash-dot command, went to their main string of commands, found the worm, which I recognized, deleted it, patched the thing, and said, okay, restart your computer.

And everything was back to normal.

And the other guy sat there with his mouth open, looking at me, and said, how the heck did you know to do that? Yeah. They thought of me as a dinosaur. 

John Willis: Yeah, yeah. Sure. And 

Diane Kulisek: I said, sometimes it helps to know 

John Willis: Yeah. 

Diane Kulisek: How these computers [00:55:00] were originally developed to work.

This is DOS basics. Yeah. 

John Willis: Well, you know, you just made an advertisement for my book on the history of AI. Because, I mean, I've already had people tell me they better understand what they're using, 'cause I go back a hundred years. And I said, ChatGPT wasn't just a moment.

I mean, it didn't just happen out of the blue in 2022, right? It was, in some ways, 75 years in the making, but you could go back a hundred years. And I think there's some knowledge where you get to understand what happened in 1940, 1950, 1960, '96.

Yep, 

Diane Kulisek: yep, yep. 

John Willis: Which also helps you. I had 

Diane Kulisek: started working with computers in, I would say, the early eighties, before personal computers were really 

John Willis: Sure. Yeah. 

Diane Kulisek: And, you know, and so I was there for, you know, the emergence of

using shared consoles where you had to book time. [00:56:00]

John Willis: Yeah, I remember that. I started at Exxon as a mainframe systems programmer in 1980. So yeah. Alright, we're about an hour. This was great; we probably should do it again. I think there's a lot more. I'm, like, fascinated, as are a group of us.

One of the things, I've spun way too many plates, this has been my whole career. You know, I'm like an expert in nothing, but I understand mostly everything. But one of the things I keep my fingers in is regulatory control. It's a lot around software, but, like I said, like the whole FDA thing.

There's just beautiful overlap with your knowledge of regulatory quality assurance in general, because there's a world that's happening right now, it's already been happening, and it's gonna get worse, where the bleed between software delivery and the physical is just... there are no lines anymore, really.

[00:57:00] So, like, everything is reality, 

Diane Kulisek: and, you know, how we manipulate reality, right. 

John Willis: Right. But then, what isn't regulatory control? Everything that is software, and everything that isn't software, is under regulatory control. You know, again, a lot of what we see in the banking industry is very focused.

I'm not saying it's the only thing going on in the banking industry, but an important part is the software delivery aspects: how the FDIC or, more importantly, the OCC in the US applies audits and regulatory control, you know, which is all about people's money. Like, you know, 

Diane Kulisek: well, my goal with regard to AI is to aggressively pursue how to achieve a synergy between AI and human beings.

Yeah. I don't know to what extent it's feasible, but I see it as the only route by which our mutual existence might be assured. [00:58:00]

John Willis: Yeah. And I think, you know, the optimistic view of this is that it just becomes another evolution of technology enablement, and all the hype... I mean, you were there. Yeah.

With the original computer hype, and it was very similar: the scary thoughts about computers, thinking that the computer... I mean, WarGames. Remember the movie WarGames? 

Diane Kulisek: can't do that. HAL, from 2001 a Space Odyssey, that's where I started with that. 

John Willis: I mean, me too, but that's the post right there.

Yeah. But even, like, WarGames, where that computer... yeah, it was just gonna start nuclear wars. And I'm not saying that stuff hasn't almost happened, but the point is, it didn't play out that way with computers. That movie scared everybody to death about computers before anybody really had their own computer.

And look how it played out. The optimistic view of AI is that ten years from now, five years from now, we'll just think of it as another technology enabler. And I hope that's true. 

Diane Kulisek: I hope that's, 

John Willis: I think it is, but alright. Just shortly: where [00:59:00] do people find you, if you want them to find you or have conversations?

Though I would warn you about the answer to this question. I always wonder if Bill Bellows would've said, you know, "I probably shouldn't answer that question," 'cause a lot of people who listen start reaching out. Bill's actually made a whole lot of new friends, like Rob. But they will listen to this and they'll be reaching out to you.

So, to answer, probably I would 

Diane Kulisek: say LinkedIn is a good bet. Okay. 

John Willis: We'll put that up on the show notes. So, 

Diane Kulisek: yeah, 

John Willis: I knew this was gonna be a lot of fun. I, I guess the only sort of downside is there's just so much to cover based on your background. That it was just gonna be hard for me all those questions in, so I think we'll just have to do it again.

So well, 

Diane Kulisek: I think we took a good cut at it, we did.

John Willis: Yeah, I think, like you said, we have the general ballpark now.

I like to just have those first conversations and let it flow. But to go a little deeper now, we could figure out where we'd want to dive in. Well, again, it's been a blast for me, and it's [01:00:00] great meeting you. 

Diane Kulisek: It was great talking with you. Same 

John Willis: here. Bye-bye. 

Diane Kulisek: Talk to you again. Bye-bye.