In this engaging conversation, Robert Vescio shares his unique journey from horticulture to cybersecurity, emphasizing the importance of economics in understanding cyber risk. The discussion highlights the value of learning from mistakes, the need for transparency in cyber risk management, and the cultural challenges within the cybersecurity field. Vescio advocates for a compassionate approach to cybersecurity, encouraging professionals to embrace failure as a learning opportunity. He also introduces X Analytics, a platform designed to simplify cyber risk management and provide organizations with a clear understanding of their cyber risk condition.
Chapters
00:00 Navigating the Conference Landscape
02:53 From Horticulture to Cybersecurity: A Unique Journey
06:09 The Importance of Economics in Cybersecurity
09:00 Learning Through Mistakes: A Personal Journey
12:05 The Culture of Mistakes in Cybersecurity
14:54 The Need for Transparency in Cyber Risk
18:06 The Role of Boldness in Career Growth
21:14 Embracing Failure: Lessons from NASA
24:00 Understanding Cyber Risk Management
26:58 The Impact of Cyber Incidents on Businesses
30:01 The Importance of Compassion in Cybersecurity
33:13 X Analytics: Simplifying Cyber Risk Management
#podcast #techsecurity #informationsecurity #cybersecurity #ai
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Robert?
00:00:00
It's great to get you on the podcast.
00:00:03
We've been working towards getting this thing scheduled for
00:00:06
quite some time now.
00:00:07
At this point, I'm really excited for our conversation.
00:00:10
Speaker 2: Same same.
00:00:11
Speaker 1: Sorry, it's taken a few attempts to get here, but
00:00:13
glad we finally made it. It's hard to get people to do hour-long calls like
00:00:27
this during the summer because, like all of our conferences kick
00:00:29
off, you know, April, May timeframe, and they go all summer
00:00:31
long, especially if you start doing, like, B-Sides conferences
00:00:34
and everything else right, like that's all the time.
00:00:39
Speaker 2: It's, uh, do you enjoy the conference thing?
00:00:43
Speaker 1: You know, DEF CON, right? I don't like all the
00:00:46
vendor stuff.
00:00:47
You're just getting sold all day long and it's really
00:00:53
frustrating.
00:00:53
So I'd rather go to like DEF CON or a B-Sides conference.
00:00:59
You know, something that's way less vendor-y. Substance. You're
00:01:04
looking for the substance, right? Yeah.
00:01:05
Yeah, I'm not looking to be sold something.
00:01:08
You know I went to RSA a couple of years ago and I hope I never
00:01:11
go back.
00:01:13
Speaker 2: Yeah, I haven't been to an RSA conference since, uh,
00:01:16
the last one was before COVID happened and uh, honestly I
00:01:21
gotta say I don't miss it at all.
00:01:23
Speaker 1: Yeah, I went to the one.
00:01:30
I think it was like the one in 2021, right when they came back
00:01:32
from COVID or whatever, and it was like a super spreader event.
00:01:34
Like you know, I only walked the vendor floor maybe two times
00:01:42
and I was done. Like, one day I did half of the room and then
00:01:46
the next day I did the other half of the room, and it was
00:01:49
just, it was just pointless for me to be there.
00:01:52
Speaker 2: honestly, yeah, I've always found the best times at
00:01:55
RSA are when you get people to meet you at places outside of
00:02:00
the Moscone Center.
00:02:01
You know, you can meet at, like, a cool coffee shop or one
00:02:05
of the hotel lounges.
00:02:06
I just always felt that that was where the real action
00:02:09
happened at RSA.
00:02:11
Speaker 1: Yeah, yeah, that's what I would prefer, honestly,
00:02:15
right, like, if I'm going to make the trip out to California,
00:02:18
I already don't like going there.
00:02:19
Right, like, might as well, like, show me around a little
00:02:24
bit, you know, take me to a restaurant or something.
00:02:25
I don't want to meet you at the conference room, you know.
00:02:28
Right, right, because I think it provides a lot of value for
00:02:54
people to hear everyone's background and say, maybe I have
00:02:57
a similar background, right, and if he did it, maybe I can do
00:03:01
it too, right, so what does that look like for you?
00:03:03
Yeah?
00:03:04
Speaker 2: So you know, my background in cyber is a
00:03:06
strange one.
00:03:06
I started off as a horticulturalist out of all
00:03:09
things, right, which is plants, if you're not familiar, and when I
00:03:13
graduated I started working for an environmental company that
00:03:18
was doing a lot of irrigation systems in Southern California,
00:03:21
San Diego mainly, and they were in the process of moving all the
00:03:26
irrigation systems from analog to digital.
00:03:28
So, of course, being one of the fresh guys out of college, they
00:03:32
were like hey, can we send you to some computer classes to
00:03:35
learn how to make these swaps on our behalf?
00:03:38
And so I just kind of fell into technology.
00:03:41
You know, a lot of my training in college was in the field of
00:03:45
science, obviously, but there were a lot of design aspects,
00:03:49
which, surprisingly, has led into where I'm at today.
00:03:52
Right, so from those initial technology classes that I was
00:03:55
jumping into in the mid-90s to where we are today, I have found
00:04:00
that I continue to pull back on things that I learned in
00:04:03
college, like in the School of Agriculture.
00:04:06
I went to Virginia Tech.
00:04:07
Horticulture is in the School of Agriculture.
00:04:08
You have to spend a lot of time in economics classes, because
00:04:12
small-business finance and economics are a big part of agriculture,
00:04:16
and you have to understand how that works.
00:04:17
So a lot of the economic principles I've applied to the
00:04:21
job that I have now and what we do at X Analytics.
00:04:24
So it's actually worked out in a really sort of strange way.
00:04:27
You could never sort of predict this path, but even the way
00:04:30
like viruses propagate isn't too dissimilar from how
00:04:33
plants propagate, right? The way that viruses work in the
00:04:38
plant system isn't unlike
00:04:40
how viruses work in computer systems.
00:04:43
So there's all these strange correlations that I have found
00:04:46
from where I started to where I'm at today, that, you know, again,
00:04:50
you can never sort of plan a path like this, but I have learned
00:04:52
to take advantage of every little bit of knowledge that I
00:04:55
have and combine it in a unique way that really sort of created
00:04:59
the career that I have in front of me.
00:05:02
Speaker 1: That's really fascinating.
00:05:03
You know I've done over 200 episodes and horticulture is not
00:05:08
one of those backgrounds that I've gotten.
00:05:10
You know I've had opera singers on.
00:05:12
I've had, you know, cyber warfare mercenaries on. Musicians,
00:05:17
probably, right? Yeah, yeah, it's really interesting.
00:05:21
You know, you said that when you were in school, right,
00:05:25
studying horticulture, you had to study economics as well.
00:05:29
Is that because maybe the industry in horticulture or
00:05:33
agriculture overall is more small business focused?
00:05:38
Right, like, you're not going to go work for.
00:05:39
You know, a really large company, right? Like here in technology,
00:05:44
you can go work for Apple or Google.
00:05:45
You know a household name worldwide, right, but maybe in
00:05:50
that industry it's more common to go the small business route.
00:05:54
Is that like what it is, or is there another reason behind it?
00:05:58
I think that's part of it.
00:06:00
Speaker 2: You know, Virginia Tech is one of the few
00:06:03
land-grant universities in the country.
00:06:05
That's where I went to school.
00:06:05
Virginia Tech. And Virginia Tech's really built on the
00:06:10
foundation of many of the founding fathers of our country,
00:06:14
and something that I think a lot of people don't understand
00:06:17
about the founding fathers is that many of them were part of
00:06:20
this Enlightenment philosophy. Even Catherine the Great of
00:06:25
Russia, right? She was in this Enlightenment philosophy, which
00:06:29
was to intersect science, technology, math and the arts
00:06:32
together. And Virginia Tech,
00:06:34
in order to graduate, at least at the time that I graduated,
00:06:37
they wanted to make sure that you were well-rounded by the
00:06:40
time you completed your four-year degree, and so that
00:06:43
well-roundedness included that you had to be part of the arts,
00:06:47
you had to be part of science, you had to be part of technology
00:06:49
you had to be part of mathematics, and obviously for me,
00:06:53
in my field, some of the mathematics led directly into
00:06:57
economics classes, micro and macroeconomics. But I think, on
00:07:01
the big picture too, if you also think about agriculture
00:07:05
especially since you live in Chicago, right?
00:07:08
Agriculture has been something that's been traded on the
00:07:10
Chicago stock exchange for a long time, and so I just think
00:07:14
there's a direct association between how agriculture works
00:07:18
and how the stock market works.
00:07:20
And obviously you're right, a lot of at least historically, a
00:07:25
lot of the farms, a lot of the horticulture businesses, the
00:07:28
nurseries, were small businesses.
00:07:30
Today that's changed a lot.
00:07:31
Right, they're part of mega corporations, but back in that
00:07:34
time, absolutely, they were part of small businesses.
00:07:37
So, having that foundation in finance, having that foundation,
00:07:40
understanding how to balance the books in an organization,
00:07:43
pay your liabilities, but then also weaving it all into the
00:07:46
bigger picture of macro and microeconomics which is
00:07:49
something that was part of the philosophy of Virginia Tech at
00:07:52
the time. I hope it's still there.
00:07:54
I have a strange feeling it's probably not there anymore, but
00:07:57
I hope it's still there.
00:07:58
Speaker 1: Yeah, I really feel like everyone should take some
00:08:02
economics classes.
00:08:03
You know either in high school or you know in college, right
00:08:07
like, because that information is so much more
00:08:12
valuable than, like, learning,
00:08:17
you know, how to write a paper in English class. Like, I mean,
00:08:20
honestly it really is, and that's coming from someone
00:08:22
that's getting their PhD right.
00:08:24
Like you can learn the things of like how to write a paper
00:08:28
properly through a couple drafts, you know, if you have
00:08:31
a patient professor.
00:08:32
It's like, oh okay, you need to structure it like this, you
00:08:35
need to use this terminology or whatever it might be right.
00:08:38
Like you can learn those things really on the fly.
00:08:41
But economics... I find myself, you know, I grew up in a
00:08:46
poor family, right, I mean, we didn't realize that we were poor
00:08:49
but, you know, looking back on it, it was like, wow, we
00:08:53
were pretty poor, and so, like, money wasn't
00:08:58
discussed.
00:08:59
Of, like, how it works, right? How it can work for you and against
00:09:02
you, of how, you know, these things all matter,
00:09:07
right? And, like,
00:09:09
that was the most challenging part for me when I became an
00:09:12
adult.
00:09:12
I had to, then, you know, learn that right and teach that to
00:09:17
myself, and that took me.
00:09:19
It took me a couple of years, actually, to, you
00:09:24
know, learn it how you're supposed to, like, actually know
00:09:28
it. Like, oh no, this is what a bad loan looks like.
00:09:30
You know, my very first.
00:09:32
I look back at my first car that I bought, right?
00:09:35
I never
00:09:37
should have leased it.
00:09:38
I should have financed it, right?
00:09:40
Should have put more money down on it, right?
00:09:42
Shouldn't have accepted the insane interest rate that I got
00:09:46
because it was my first ever car loan and whatnot.
00:09:49
All of those things.
00:09:51
I had no clue that they were mistakes, right?
00:09:56
Speaker 2: But I do feel like you have to learn through those
00:09:57
mistakes, and sure you could learn some of that academically,
00:10:03
but sometimes I think the best opportunity for learning is the
00:10:07
mistakes that we've made.
00:10:07
And clearly you've made those mistakes so you probably
00:10:11
wouldn't approach a car loan in the same way.
00:10:12
You know what I mean? Because it stuck. What I've learned in real
00:10:15
life, especially in the field that we're both in, is that you
00:10:18
can read about something, you can learn about something
00:10:29
through a lecture, but until you actually experience it
00:10:32
firsthand, I don't think it really sticks. You know what
00:10:36
I mean? Like, it doesn't really resonate in how you make
00:10:38
decisions moving forward.
00:10:40
Speaker 1: Yeah, especially for me.
00:10:42
You know, how I learn is by doing, right? And if I don't
00:10:46
understand that something is wrong, or, you know, like it
00:10:52
shouldn't be a certain way, right, I don't realize
00:10:56
it until I do it.
00:10:58
You know, like I think about even like my current, like like
00:11:03
my sports car that I have, right? Again, it was a bad situation,
00:11:08
and I learned, oh, I can't go into it, you know,
00:11:10
excited, right? Like, I have to be a better salesman
00:11:14
when I'm excited about the car. I'm, like, a closet car guy, you
00:11:19
know, and so, like I just like started getting into cars and
00:11:23
the guy showed me, you know... right? Like, he knew exactly
00:11:27
what he was doing.
00:11:27
Obviously he does it for a living, and so it's like I got
00:11:30
to learn this lesson again, or I got to learn it a different way
00:11:33
you know. But it's the same thing in technology, and there's
00:11:37
, there's so many people out there that are afraid to mess up
00:11:43
you know. And like I talk about it on my podcast a lot,
00:11:47
right, when I was fresh out of college, I mean I very
00:11:52
embarrassingly like destroyed a bank's database of our products.
00:11:57
Right, just very inadvertently, very innocently, you know, ran
00:12:01
the wrong command, had too high of privileges and permissions
00:12:05
than I should have had, right, and I went and destroyed their
00:12:09
database and I'm sitting here like man.
00:12:11
I just started this job, I'm about to get fired, like this is
00:12:15
terrible, you know?
00:12:16
Yeah, but the VP gave me the opportunity to learn through
00:12:22
that mistake.
00:12:23
He's like well, you know, I hired you because I knew that
00:12:26
you would make mistakes, and when you did make them, I knew
00:12:28
that you would solve them.
00:12:30
Right, what a great boss, though, right?
00:12:32
Right, not many people are going to give you that
00:12:34
opportunity, and that's probably why people are so worried about
00:12:39
making mistakes now.
00:12:40
Right, because they don't want to get fired, but you have to
00:12:46
make the mistakes to really learn it.
00:12:49
Speaker 2: This is.
00:12:49
This is one of the things that I find fascinating, and for
00:12:52
some colleagues that you and I have that overlap, I have these
00:12:55
discussions with them, and, uh, I always struggle,
00:12:59
especially for somebody that's new as a CISO. They don't want to
00:13:04
share their findings directly with their boss or bosses, and,
00:13:10
whether that's the CEO or corporate directors or whoever
00:13:13
it happens to be, there's this hesitance like well, I know,
00:13:17
that's my cyber risk condition, but I really don't want to share
00:13:21
it.
00:13:21
And I find that to be the strangest thing.
00:13:23
It'd be like a CFO saying well, we know what our tax rate is,
00:13:27
but I'm not going to share it because there's going to be an
00:13:30
adverse reaction to it.
00:13:31
You know, the CFO is just going to share it, right.
00:13:33
Or if the sales numbers came in poorly for the quarter, the CFO
00:13:37
is just going to share that revenue went down, right? Because
00:13:46
sales numbers came in poorly.
00:13:47
It is what it is, and I find it so odd in the world of cyber
00:13:48
that there's this hesitation to share the reality of the
00:13:50
circumstance.
00:13:51
And I think it gets to what you're saying, where people
00:13:54
don't feel like they can make mistakes.
00:13:55
By the way, I don't think the cyber risk condition is a
00:13:57
mistake of the CSO.
00:13:58
But there's this natural sort of feeling of, I can't share that
00:14:04
because it's a reflection of who I am, or I can't make a
00:14:06
mistake because it's a reflection of who I am and I
00:14:10
think that in itself is a huge mistake in our overall industry.
00:14:14
Speaker 1: Yeah, yeah, it's an unfortunate consequence
00:14:19
of, I feel like, punishing too harshly, right?
00:14:23
You know?
00:14:28
I remember when I was working for a credit bureau, and the
00:14:29
culture on the security team was, you know, if you cause
00:14:33
an outage, you're done by the end of the day, right?
00:14:36
There were people that caused outages, you know, in the middle
00:14:41
of the night during a change window, sure, and they were let
00:14:45
go by the morning.
00:14:47
Speaker 2: Wow. What about,
00:14:49
like,
00:14:49
somebody running a vulnerability scan, which is a
00:14:51
requirement that could cause an outage.
00:14:53
Speaker 1: Same.
00:14:56
Do not let it go down.
00:14:57
You know, and that was just the culture, and that was a
00:15:01
terrible culture because there was a lot of pressure with it,
00:15:05
right.
00:15:05
And one day, you know, our solution, we had just recently
00:15:09
upgraded it and our solution created an outage.
00:15:13
That was quick, it was quickly contained, but the
00:15:19
damage was very significant.
00:15:20
It took us, you know, a week or so to actually recover from
00:15:25
the damage
00:15:26
that it did. Nothing technically went down, but, you know, it was
00:15:32
in a state where, you know, you're resetting 10 service
00:15:36
accounts and you're resetting, you know, 40 other accounts,
00:15:40
right, and all that sort of stuff.
00:15:43
And you know, my intern was the one that made the mistake.
00:15:47
It wasn't even a mistake, it was a business as usual test.
00:15:50
The product literally had a break in it that we didn't know
00:15:54
about.
00:15:54
You know, she did the job that she was supposed to do, she did
00:15:58
everything that she was supposed to do and it caused this issue,
00:16:01
right, and she was nervous about reporting it because she
00:16:07
immediately thought, okay, this is my last day here, I'm not
00:16:11
even out of college, this is my last day here, I'm screwed.
00:16:14
This is off to a terrible start for my career and my boss, or
00:16:19
my boss's boss actually.
00:16:20
He tasked me with doing an on-the-spot, like, forensic
00:16:25
analysis of what happened, who did what,
00:16:28
Because his boss was going to say did you fire whoever did it
00:16:32
right?
00:16:32
Speaker 2: So that was the first...
00:16:33
Speaker 1: Yeah, that was the first question he was going to
00:16:35
get, and so he tasked me with that and I showed the proof that
00:16:39
it was her and he started walking away and I had to stop
00:16:42
him, like, literally mid-stride to, you know, firing her,
00:16:48
and say, hey, this wasn't her fault.
00:16:50
Like, this is what happened: the product, you know, broke. The
00:16:54
product
00:16:54
let us down, right? This vendor let us down.
00:16:58
It wasn't her fault, she did everything that she normally
00:17:01
does.
00:17:01
I mean she did it two days ago, right before the upgrade.
00:17:07
But, like, it brings me back, because that culture,
00:17:09
that mentality, like really carries forward in a significant
00:17:13
way.
00:17:14
Fast forwarding a bit, every time I find, like, a
00:17:18
glaring vulnerability or a glaring hole in an environment,
00:17:22
I now don't really care, right? Like, if I,
00:17:28
well, I don't care in terms of like telling people about it
00:17:33
because I'll find the issue.
00:17:35
And then, like, my CSO will ask me, you know, the question: well,
00:17:37
what did you find?
00:17:38
I was like well, do you want the full detail?
00:17:40
You want like a, you know, a cherry-picked version of it?
00:17:43
Right, and you know, he'll ask for, like, the whole
00:17:47
truth and whatnot, and I'll be like, okay, I found all of
00:17:50
this.
00:17:50
You know, I found this stuff that we've been, you know,
00:17:53
hiding under the rug, or I found these dead bodies over here,
00:17:55
like we need to figure this out.
00:17:57
You know, that sort of thing. And that's just my
00:18:00
mentality, right?
00:18:03
But I know other people that went through that same incident
00:18:06
that I went through, and now they're in a situation where
00:18:09
they constantly feel under pressure to not mess up.
00:18:13
Speaker 2: Yeah, you know it's a strange thing because I, in the
00:18:16
story that you gave, always think does that hold true for
00:18:19
other departments in the business?
00:18:21
But at the same time, there are mistakes made in marketing,
00:18:34
there are mistakes made in sales, there's mistakes made in
00:18:37
product development and product execution, there's mistakes made
00:18:41
in how the CFO and the accounting team does their job.
00:18:44
And ultimately, it's always to get to the truth, right, it's
00:18:47
always to learn from past mistakes and it's always to try
00:18:50
to figure out how do we solve our problems and get better.
00:18:53
And the reality is that a business is always a series of
00:18:55
problems that need to be solved, right, nothing's perfect in
00:18:58
business, all considering the world's changing around you,
00:19:01
right, at the same time.
00:19:02
And so, you know, I just get this sense that IT and cyber is
00:19:08
in this unique position inside of corporations today, where the
00:19:14
other departments just kind of operate differently, almost
00:19:20
organically, in a way where they can adapt and maneuver and make
00:19:24
mistakes and overcome mistakes.
00:19:25
I find it very strange, and so I don't know if it's
00:19:28
self-inflicted, you know, as in it's a culture thing, or if it's
00:19:33
a real thing.
00:19:34
And sometimes I think it's self-inflicted, right, it's the
00:19:36
culture inside those departments that continue just to
00:19:39
perpetuate that, and I'm not so sure that that same feeling
00:19:42
sits with the CEO, or sits with the corporate directors of the
00:19:46
business, who are risk takers by default, right.
00:19:50
Speaker 1: Yeah, I, you know, I think it is.
00:19:53
It's a bit twofold.
00:19:54
I think it is the culture within security and the
00:19:59
mentality that we're all taught, right.
00:20:01
I mean like you're taught.
00:20:02
You're taught this not even in school, you're taught it from
00:20:05
peers.
00:20:06
Right? On this side of security, you have to be right every
00:20:10
single time, 100 percent of the time, and that one time that
00:20:13
you're not right it could lead to the entire company being
00:20:16
breached and us being out of a job.
00:20:19
Right, like, having that mentality means that you're
00:20:22
having a no-fail mentality, right, and then I kind of go
00:20:26
back to, like, NASA's mentality of no-fail. Right, and what that
00:20:31
means for them is, no, we're going to fail.
00:20:34
We're going to fail in controlled ways, right, we're
00:20:37
going to fail in as many controlled ways as we possibly
00:20:40
can.
00:20:40
We're going to think of literally every single thing
00:20:44
that can go wrong and we're going to try and prepare for it.
00:20:47
And then having, you know, the coding done in a way where, I think,
00:20:52
it's, like, fault-tolerant or error-tolerant
00:20:56
coding. I mean, it's probably a different term, right? I'm not a
00:20:59
developer so I don't know it offhand, but it's this type of
00:21:02
coding that, even when errors happen in the code, the critical
00:21:07
systems are still running because they're all segmented
00:21:10
out and they're so well protected from each other that
00:21:13
there's no you know there's no stopping the engine before you
00:21:16
want to stop it.
00:21:17
Right, there's no stopping the navigation before you reach your
00:21:20
destination or whatnot.
00:21:22
And those things, those things all matter.
00:21:24
Right, and they're applicable in our world too.
00:21:28
Right, because you have to.
00:21:34
You have to approach security from the mentality of if this
00:21:36
endpoint were to be breached.
00:21:37
Well, what's the blast radius of that right?
00:21:39
Are we giving up everything because someone clicked on an
00:21:42
email, or are we giving up 1%?
00:21:47
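For readers curious what this fault-isolation idea looks like in code, here is a minimal sketch in Python. It is purely illustrative, not NASA's flight software or any specific product: a fault in a non-critical telemetry step is caught and logged, while the critical navigation loop keeps running.

```python
# Minimal, illustrative sketch of fault isolation: a non-critical subsystem
# can fail without stopping the critical loop. All names here are hypothetical.
import logging

logging.basicConfig(level=logging.INFO)

def navigation_step(t: int) -> None:
    # Critical work that must keep running every cycle.
    logging.info("navigation update %d", t)

def telemetry_step(t: int) -> None:
    # Non-critical work; simulate an occasional fault.
    if t == 2:
        raise RuntimeError("telemetry sensor glitch")
    logging.info("telemetry sample %d", t)

def run(cycles: int) -> None:
    for t in range(cycles):
        navigation_step(t)        # critical path, kept outside the guard
        try:
            telemetry_step(t)     # isolated: its failure is contained here
        except Exception as err:
            logging.warning("telemetry degraded, continuing: %s", err)

if __name__ == "__main__":
    run(5)
```

The same containment thinking is what the blast-radius question above is getting at: a failure, or a breach, of one segment should cost you that segment, not the whole system.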
Speaker 2: Yeah, you know, Joe, to dig into that.
00:21:49
Take NASA, right.
00:21:52
I mean, obviously, if you go to the race to the moon, right,
00:21:55
you know, John F. Kennedy, they made a lot of
00:21:58
mistakes.
00:21:59
They couldn't get rockets to launch.
00:22:01
Unfortunately, astronauts lost their lives in that process
00:22:05
through testing, right, but it was, you know.
00:22:07
And then we almost lost Apollo 13 on its mission to the moon.
00:22:12
Thankfully they got them back, but there were a lot of mistakes
00:22:15
that were made in the race to the moon.
00:22:18
But they learned a lot through those mistakes, right.
00:22:23
And now you're right, we're in this age of NASA, especially
00:22:26
like from the two shuttle explosions, right, in the 80s, and
00:22:30
I think the last one was in the 90s, where they became really
00:22:33
risk averse.
00:22:37
But then, all of a sudden, all the successes and progress at
00:22:39
NASA stopped as a result of being risk averse, right? Like,
00:22:41
it had to take a company like SpaceX, Elon Musk, to sort of
00:22:46
fuel them and perpetuate them into the future, which even
00:22:49
SpaceX was on the verge of disaster because they couldn't
00:22:52
get their rockets to work, right?
00:22:53
And so I feel that all great things happen through failure.
00:22:57
You have to be willing to fail and you know there's all sorts
00:23:01
of concepts in engineering like fail fast, right, so that you
00:23:05
learn from it.
00:23:05
But, you know, to dig into cyber,
00:23:08
I find this interesting because, Joe, I'm not sure if you're
00:23:11
familiar with what we do, but we help organizations understand
00:23:16
their cyber risk condition but then, ultimately, we help give
00:23:19
them options so that they can decide what to do with that
00:23:22
condition.
00:23:22
We basically simplify cyber risk management for them.
00:23:27
But what I really wanted to get into was that I'm analyzing tons
00:23:31
and tons of data on a regular basis related to losses inside
00:23:35
of cyber.
00:23:35
And you know the losses really aren't that bad.
00:23:39
Sure, nobody wants a loss to happen, right, they don't want a
00:23:43
data breach situation or a ransomware situation.
00:23:45
But if you really look at the full volume of all things that
00:23:50
have happened, it's really not that bad, right.
00:23:54
I mean, take the biggest IT outage in the history of IT,
00:23:57
which was CrowdStrike this past summer.
00:23:59
No outage has ever been as systemic as that outage.
00:24:02
And you know, in the Fortune 1000, just over a quarter
00:24:05
of the Fortune 1000 were directly impacted by that
00:24:09
outage.
00:24:09
Of course, you know we hear about Delta, right?
00:24:11
They lost half a billion dollars from that outage, but in
00:24:16
the big scheme of things, companies continue to go on,
00:24:21
right.
00:24:21
I mean, think about it. Like, out of the Fortune 1000, did any of
00:24:24
those 250 or so go out of business because of the
00:24:28
CrowdStrike outage?
00:24:29
No, they've continued, right? Even CrowdStrike themselves,
00:24:34
who caused the outage.
00:24:35
Their stock took a hit, but clearly CrowdStrike's on full
00:24:39
recovery mode right now.
00:24:40
They'll probably shake this off. In two years,
00:24:43
we'll be like, oh, whatever happened with that CrowdStrike
00:24:45
outage and we'll be laughing about it, right?
00:24:46
Yeah, the largest fines inflicted on data breaches go to
00:24:53
Meta, right?
00:24:53
Facebook.
00:24:54
One of those fines alone was $5 billion.
00:24:56
Sure, some companies would be crushed by a $5 billion fine,
00:25:01
but Meta continued on, right.
00:25:04
The only real damage that I see is happening to small and medium
00:25:08
businesses.
00:25:09
Right, when they, not the large corporations, but when small and
00:25:12
medium businesses, have too many events, cyber being part of
00:25:17
those events that take place together in a short period of
00:25:21
time, they tend to be in a situation where they can't
00:25:24
recover.
00:25:24
Obviously, isn't that where you live? Lincoln College, Lincoln
00:25:28
University, right? After COVID and then the ransomware incident,
00:25:31
they just had to shut their doors.
00:25:32
Right, that was a university that was open for, I think, more
00:25:34
than 100 years, and they had to close their doors, right?
00:25:37
So you do see those circumstances, but generally
00:25:41
it's compounded situations.
00:25:43
It's not just the cyber event all by itself.
00:25:45
The reason I bring this up is you can fail in cyber. Most
00:25:51
organizations can fail in cyber, have an incident, deal with the
00:25:55
consequences of it.
00:25:56
It's not ideal, right?
00:25:57
Sometimes it's bad for shareholders, but you can deal
00:26:00
with the consequences of it and continue to move on.
00:26:03
It's not detrimental, it's not catastrophic to the business and
00:26:08
I think if more people realize that, then maybe this culture
00:26:12
that you and I are talking about would start to correct itself.
00:26:15
Speaker 1: Yeah, that is, that's really fascinating, because
00:26:18
that culture is very different in, like you said, in other
00:26:22
parts of the business.
00:26:23
Right, right, I mean for you to be the CEO of really any
00:26:29
company.
00:26:29
You've got, like, a few screws loose, you know. Like,
00:26:34
talk about pressure, talk about stress and risk,
00:26:38
you know, yeah, um, and those guys are typically like very,
00:26:42
very big risk takers in some ways, you know, and they have to
00:26:46
be, right? That's the only way a business will survive, because
00:26:49
that's how they got there. Play it safe, you'll just eventually
00:26:54
evaporate as a business. Yeah, that's how they got there, you
00:26:57
know, that's the only way. Like, and with, you know, Elon Musk,
00:27:01
right like, I mean, he has bet everything that he has owned
00:27:06
several times over.
00:27:07
You know he's.
00:27:09
He's not even worried to do it anymore, you know right, which
00:27:13
is.
00:27:13
It's a lesson that everyone can really learn from.
00:27:16
Speaker 2: I think, yeah. You know, do you watch Bill Maher at all
00:27:19
on HBO?
00:27:20
Not very much.
00:27:21
You know, I like to tune in, not every week, but every now
00:27:24
and again.
00:27:25
You know, have a glass of wine on Friday night, tune in just to
00:27:27
see what's happening, see what his guests are saying.
00:27:29
But he does repeat something quite often on the show where,
00:27:33
when people on a show are risk takers, they're taking a chance,
00:27:36
they're being bold, regardless of whether everybody hates their opinion
00:27:40
or not.
00:27:40
He always celebrates their boldness and he always says that
00:27:46
he believes that for the people that are bold and make bold
00:27:49
decisions, it will always work out in the end for them.
00:27:52
And I think there's truth to that statement, you know,
00:27:56
like I really do, I think that you know, in general, if you're
00:28:00
bold and determined, you continue to have that motivation
00:28:02
to move forward, it will work out for you.
00:28:05
It's the people who give up, it's the people that
00:28:09
are afraid of making decisions.
00:28:10
That indecision, I think, generally leads to dire
00:28:14
consequences.
00:28:14
Speaker 1: Yeah, yeah, that is, that's very true.
00:28:17
You know, and like when, whenever, you know, whenever
00:28:21
people are making like a career change, right, or they're trying
00:28:25
to, just for the sake of this podcast, you know, they're
00:28:28
trying to get into cybersecurity from something else.
00:28:31
I mean, that takes a level of boldness to think I don't know
00:28:35
anything about this area and I'm going to get into it.
00:28:38
You know, like, that really takes some guts.
00:28:43
Speaker 2: And you're doing it, Joe, right? You're doing it.
00:28:45
You know, you started this and now you're doing it, and
00:28:48
you've done, what did you say the other day, like more than 200
00:28:52
episodes already, right?
00:28:53
So that's the boldness, right?
00:28:55
That's what I'm talking about.
00:28:57
Speaker 1: Yeah, Well, you know, I also look at it.
00:29:00
I remember when I was deciding to do the podcast or not, right,
00:29:05
I was looking at it from the angle of, well, what happens if
00:29:08
I'm like 60 or 70 years old?
00:29:10
and I look back, will I regret not doing it? And I thought that
00:29:15
I would.
00:29:15
You know, because I like connecting with people.
00:29:18
You know, me and you, like, we'll be talking, you know,
00:29:22
fairly regularly, right, Like once a year, like we'll talk and
00:29:26
see how each other is doing.
00:29:27
Now, without this platform, that wouldn't be possible at all
00:29:32
in any way, shape and form, right, Like I would be nervous
00:29:36
to even just reach out to you.
00:29:37
But now I don't care who I'm reaching out to, I'll reach out
00:29:39
to them, you know, Right right.
00:29:42
Which, yeah, I totally would regret it.
00:29:45
And I have a personal rule too, where if that answer is yes, I
00:29:49
will regret it, then I absolutely must do it and there
00:29:53
is nothing that can stop me.
00:29:54
Same thing with when I was trying to get into security.
00:29:57
I thought I could be successful at it.
00:30:00
Right, I didn't know how successful I would be at it, but
00:30:03
I knew that if I didn't try I would regret it.
00:30:06
And so then, like by default, I literally had to give it
00:30:10
everything that I had, and I couldn't stop until I gave
00:30:14
everything, and I was.
00:30:20
I was just about to stop too. Like, I gave it everything that I
00:30:21
had and I was just about to stop.
00:30:22
And then I go and I get two offers in the same day,
00:30:24
like, okay, there's something here.
00:30:27
Speaker 2: That's the boldness, right? That's the reward.
00:30:28
It just, it's kind of like magic, right?
00:30:31
It just works out.
00:30:33
It happens.
00:30:34
You know, I feel like too, like when you're right at the bottom
00:30:37
is when sometimes, the best things happen.
00:30:40
Right, like right when you're ready to lose everything or
00:30:43
you're ready to give up, but you just have that little bit of
00:30:46
perseverance.
00:30:47
I feel like that's always where, like, those amazing things
00:30:51
happen, is at that bottom.
00:30:58
Speaker 1: It's at that bottom.
00:30:59
Yeah, yeah, it's.
00:30:59
It's interesting how I have found that when I go to you know
00:31:01
new levels.
00:31:02
It's like a new level of anxiety too.
00:31:05
Right, like, you know, my wife and I, we built our
00:31:12
first house.
00:31:13
Right, this is our first house.
00:31:15
I'm in it right now, our first house.
00:31:17
We built it, which is no small.
00:31:19
Speaker 2: That's a hard thing to do, by the way.
00:31:21
Speaker 1: Yeah, it's no small feat.
00:31:22
I don't think that we really realized that, right, we were
00:31:26
trying to buy but buying didn't really make sense because the
00:31:29
market was so inflated at the time.
00:31:30
Right,
00:31:33
you're going to spend $700K for literally the house that I
00:31:38
have right now, and then you're going to be spending another
00:31:40
$250K fixing it up, making it livable and whatnot.
00:31:46
It just didn't make any sense.
00:31:48
And when we finally moved in here, we obviously reached a new
00:31:56
level together, you know, with getting the house and
00:31:58
whatnot.
00:31:59
But then I had like a new level of anxiety, right, like
00:32:02
something that, like, crippled me for, like, a week.
00:32:05
You know, it was just like I'm so nervous I don't know if I can
00:32:08
make this, make this payment.
00:32:10
You know, what did I do here?
00:32:11
Like I'm, I'm a failure just going through all of this stuff
00:32:15
and I had a.
00:32:16
I had to stop myself.
00:32:17
I think I actually had a friend that like stopped me and was
00:32:20
like, hey, you know, like you're fine, it's going to be okay,
00:32:24
it's new, it's different, but you're going to get used to it.
00:32:27
And now you know it's totally different.
00:32:29
Like, I'm not even worried about it.
00:32:31
I'm more frustrated that my mailbox is crooked as hell, that
00:32:35
I put in, than anything else.
00:32:38
Speaker 2: Right, but you've relaxed into it.
00:32:39
Yeah, yeah, yeah.
00:32:42
Speaker 1: It's going to be uncomfortable in the beginning,
00:32:44
you know, but as you get used to it, as you get used to that
00:32:47
level, um, you know it gets easier.
00:32:50
I think that's something that people forget about, or they
00:32:54
miss often.
00:32:54
Speaker 2: Yeah, it's almost like you have to accept, right
00:32:56
your own reality and just contend with it for whatever it
00:33:01
is.
00:33:01
Joe, I have a special needs daughter.
00:33:05
She's gonna be 20 on Sunday, by the way. Well, and, uh, you know,
00:33:09
she's one of those kids that, um, after she was born, the doctors
00:33:13
were, like, telling you she's probably not going to make it,
00:33:16
right. And we heard that throughout her childhood.
00:33:20
Of course, the doctors were wrong, but she's with me all the
00:33:25
time.
00:33:25
I take care of her.
00:33:26
She's with me all the time and I get these people who come up
00:33:29
to me and they're like, oh, you know, God bless you, or you're
00:33:32
such an amazing dad.
00:33:34
I don't even think about it that way.
00:33:35
I think about it as that's my reality and I accepted it very
00:33:40
early on.
00:33:40
I didn't fight with it, I wasn't angry about it and, of
00:33:43
course, like back when Olivia was younger, people would ask
00:33:46
like are you angry?
00:33:47
Do you wish that you knew, so you could have aborted the child?
00:33:50
And I'm like, no, I'm happy, like this was a gift in
00:33:58
my life. And so, the point I'm trying
00:34:01
to get to is, I think perspective allows us to really
00:34:04
operate in a way that is normal and anxiety free and allows us
00:34:09
to really find the joy and the beauty in the things that we're
00:34:13
doing. Like,
00:34:14
I personally love cyber risk, right?
00:34:15
I enjoy the space that we're in.
00:34:18
I love being a father to Olivia.
00:34:19
I have two other children too, and I love being a father to
00:34:21
them.
00:34:22
It's all different, every one of those circumstances is
00:34:25
different, but I just accept them for what they are and love
00:34:28
them, and it allows me to just operate with a certain amount of
00:34:33
peace and an anxiety-free attitude.
00:34:36
That could be totally different if I was full of anxiety, right
00:34:40
Like.
00:34:40
If I was angry and anxious, then I'm not good to them, I'm
00:34:45
not good at my job, I'm not good to anybody, but interestingly,
00:34:49
there's a lot of people, I think, that focus more on the
00:34:52
negative than the positive, and part of it maybe, is a little
00:34:57
Buddha-like, but you just sort of have to, I think, let go of
00:35:02
the suffering, right? Like, just let go of it, right?
00:35:21
Speaker 1: Oh, you know, do you wish that you would have known
00:35:22
that unknown or whatever might have been right?
00:35:25
Yeah, it's like, you know, people are, they're coming
00:35:30
at it from.
00:35:31
I don't know about that unknown in my life.
00:35:33
I don't know how that's going to change me.
00:35:34
I don't know how that's going to impact everything else around
00:35:37
me.
00:35:37
You know, and yeah, that's a really incredibly tough
00:35:42
situation.
00:35:43
You know, like you mentioned that,
00:35:46
you know, doctors were telling you that,
00:35:47
you know, she wasn't going to last very long and whatnot.
00:35:51
And, you know, I think back. When my first kid, I only
00:35:56
have one kid right now, but when my first kid was born, she
00:35:59
had a pneumothorax, right? And one of the doctors, one of the
00:36:03
doctors I really didn't like.
00:36:05
I didn't trust her.
00:36:06
Speaker 2: I had some I didn't like either of them, if that
00:36:08
makes you feel better.
00:36:09
Speaker 1: Yeah, I really did not like her and I met her for
00:36:13
maybe 20 minutes, right, and that was the last time I ever
00:36:17
saw her or spoke to her.
00:36:19
And I mean, like literally, you know, I just tore into this
00:36:22
person because they were, you know, they were treating my kid
00:36:25
almost like a, like an experiment, right, and I'm
00:36:29
sitting here.
00:36:29
I'm like you guys.
00:36:31
You guys literally don't understand who you're dealing
00:36:34
with.
00:36:35
Like I can reverse engineer this thing on the fly.
00:36:38
You can't like, you literally can't tell me that you don't
00:36:42
know when she's going to run out of morphine, for instance.
00:36:45
Right, right, there should be no shortage of morphine in this
00:36:51
room. Like, and I literally said, from this day forward, I
00:36:56
expect there to be a bag of morphine until she's discharged,
00:37:00
sitting there, and if she needs it, she gets it immediately.
00:37:04
Yeah.
00:37:04
Speaker 2: You don't want to see your daughter in pain, right?
00:37:06
Speaker 1: Right, yeah, she's what alive for three days and
00:37:08
she's in, you know, excruciating pain.
00:37:10
And, you know, I guess there was a benefit to what I
00:37:16
went through younger, right? Because my sister
00:37:19
went through renal failure, right, and I ended up donating
00:37:24
my kidney to her.
00:37:25
She's fine today, she's living a great life and whatnot.
00:37:28
Totally fine. But seeing how my mom had to
00:37:34
navigate this world, right, as we're not a wealthy family,
00:37:37
we're a poor family, right, my dad actually, in fact, a few
00:37:41
months beforehand, lost his job, you know, and so that was an
00:37:45
extraordinarily stressful time.
00:37:47
But what my mom learned, and I learned in return, is that, in
00:37:52
that situation, the social worker and the nurse actually
00:37:55
have the most power.
00:37:57
In that situation, right, if you want a doctor removed, if
00:37:59
you want a team removed, or if you want them transferred to
00:38:03
another unit, or whatever it is, those two people, they're tasked
00:38:07
with making it happen, no matter what.
00:38:09
And so you know, in this situation, right, when that
00:38:13
doctor was basically trying to use my kid as an experiment, you
00:38:18
know, when it's a relatively minor issue that she was going
00:38:24
through, right, it was a pneumothorax, a little hole in
00:38:26
the lung she had.
00:38:27
You know, she was a little bit early, right, so they just had
00:38:31
to wait for it to heal.
00:38:32
You know it wasn't like anything crazy, but for a new
00:38:36
parent that's extraordinarily stressful.
00:38:38
Scary, yeah, super scary.
00:38:39
That was by far the most scared I've ever been in my life, by a
00:38:43
long shot.
00:38:44
But going through everything that I went through, I just went
00:38:47
to the nurse and I said I don't want that doctor ever seeing my
00:38:49
kid again.
00:38:51
If she's in this room, she's seeing other kids, she's not
00:38:53
allowed to cross this threshold of the room and she's not
00:38:55
allowed to have any input.
00:38:56
And they literally said well, what if she's the only one?
00:38:59
I'm like you better call in someone else.
00:39:01
Like I don't, I don't care, she is not allowed to touch my kid,
00:39:05
she's not allowed to treat my kid, and I was very clear with
00:39:08
them.
00:39:08
I was like I want only these three nurses to be on her
00:39:10
nursing team, right?
00:39:11
So we know the nurse during the day, we know the nurse in the
00:39:13
afternoon, we know the nurse at night.
00:39:17
And that doctor is not allowed.
00:39:19
And it was done, right? Everything that I requested was
00:39:22
done.
00:39:23
But if I went and started a fight with that doctor,
00:39:26
now they're going to have problems with me.
00:39:29
You know, and I can't even remember how I got down this
00:39:32
path right, but it's that unknown that you kind of have to
00:39:35
dive into and embrace.
00:39:38
That's when you actually make the real progress.
00:39:41
That's when you actually make the real change in your life and
00:39:44
everyone else's life.
00:39:47
Speaker 2: I agree with that.
00:39:47
If you were to think back on that, Joe, that particular doctor,
00:39:52
do you think part of it was attitude?
00:39:58
Speaker 1: Yeah, 100%. Because she was the only one that was just, like,
00:40:03
openly smiling at me as she was saying what she was going to do.
00:40:07
You know, like there was like no empathy, right, and I'm
00:40:12
sitting here, I understand you may have a positive personality,
00:40:16
right, but there are six other doctors that see my kid every
00:40:21
single day and none of them approach me in that manner.
00:40:24
Right, like when my sister was sick and she had dozens of
00:40:29
doctors, none of them approached us that way.
00:40:31
Right, it was very serious, concise, to the point, very
00:40:36
exact about what was going on.
00:40:38
There were no questions about what was going on and what was
00:40:41
going to happen.
00:40:41
There was none, right, and it was.
00:40:44
It was totally her attitude.
00:40:47
Because as soon as she approached me that way,
00:40:50
I was like, oh yeah, she is not ready to handle my
00:40:55
kid because she's not ready to handle me, because I am not.
00:40:59
I'm not going to be all like jolly with you, like no, get out
00:41:03
of the room, let the adults.
00:41:05
Let the adults handle it, you know.
00:41:07
Speaker 2: Yeah, yeah, you know, it's interesting.
00:41:10
And to tie it back to cyber, you know, I think this
00:41:15
is part two of like the mistake that a lot of the vendor and
00:41:18
the consulting community makes.
00:41:19
They want to talk at the CSO or they want to talk at the
00:41:23
security people.
00:41:25
And I see some people that are out there that are big voices
00:41:28
and you know they're always like CISOs should get fired for this
00:41:31
or they say, like, you know, they just need to get in line
00:41:35
and adapt, and, you know, I don't think any of that's useful.
00:41:39
I think it's all like that doctor you're referring to,
00:41:42
right? It's just that sort of thing, and they're just
00:41:45
alienating the community, and I think we are in a place where
00:41:49
there needs to be more compassion, right, we're in a
00:41:51
place where there needs to be more of like I'm relating to you
00:41:55
your job sucks, right, like it sucks, it's hard and you know,
00:42:03
mainly you're responsible for a problem that you partially own,
00:42:05
right, lots of other people own the problem and maybe you know,
00:42:08
like, if people learn from your example and approach others in
00:42:13
the field with that sort of grace and empathy, then I think
00:42:20
we would see incredible changes and maybe some of that
00:42:24
frustration and that anxiety that's prominent in our
00:42:28
community would start to go away a little bit.
00:42:32
You know, it's just everybody being a little bit nicer with
00:42:36
each other and relating to each other, and yeah, I'd love to see
00:42:39
that personally.
00:42:42
Speaker 1: Yeah, you know, it's fascinating in security, when
00:42:47
you approach things from, like, the customer-obsessed mentality.
00:42:51
Yeah, right, you're, you're not looking to get through whatever
00:42:56
the end user is talking about so that you can prescribe them a
00:43:01
solution.
00:43:01
Right, You're not like being solution oriented in that way,
00:43:06
but you're more focused on actually hearing them out.
00:43:10
Right, ask people.
00:43:12
When I'm about to ask something of someone, right, I ask them.
00:43:18
Well, why don't you tell me why it is this way?
00:43:20
Right, Because I'm probably missing something here.
00:43:23
You know, why was it like this 20 years ago, or whatever
00:43:26
it is, right?
00:43:27
And learning what the decisions were, why they made it, the
00:43:31
evolution of a system or you know a domain and the
00:43:35
environment and whatnot, right? Coming at it from that
00:43:39
perspective also opens up the recipient of what you're going
00:43:44
to say.
00:43:44
Right, like the recommendations. Because there have been
00:43:49
many times where people will be very
00:43:52
attached to a system or you know a part of the network that's
00:43:56
critical.
00:43:57
Right, Because they feel like, hey, I really contributed to the
00:44:00
success of the IT team here at this company and it's going to,
00:44:05
you know, be like this forever.
00:44:06
Right, Well, when you're meeting them in the middle there
00:44:09
and saying, hey, you know, you created a great system, you
00:44:14
created a great environment, like it's top of the line for
00:44:18
sure.
00:44:19
You know, 10 years ago it was top of the line.
00:44:21
Right? Now, with different, you know, advents of, like, zero trust
00:44:26
and whatnot, we should restructure it just a little bit
00:44:29
make it even better.
00:44:30
Right, it's really great.
00:44:31
We're just making it a little bit better, and it fits our
00:44:34
future endeavors, right? That's how you approach it.
00:44:38
But if you approach it from the perspective of, oh, you're
00:44:42
wrong.
00:44:42
this is an antiquated technology, you've already lost
00:44:45
them.
00:44:46
You've already lost them.
00:44:47
You're going to entrench them and they're never going to.
00:44:50
Speaker 2: You want to hear a great story.
00:44:51
Over the weekend I was having an issue with one of my websites.
00:44:54
It's hosted on the Wix platform and it was a weird issue I
00:44:58
couldn't figure out, like what was going on.
00:45:00
Normally I can sort of figure those things out and fix them on
00:45:02
my own.
00:45:02
And so I reached out to the support channel, and the support
00:45:07
channel is normally a phone-based channel.
00:45:10
You can get a human on the phone, but on the weekends it's
00:45:13
through messaging, right?
00:45:14
It's a messaging platform and this woman over on the support
00:45:19
team at Wix, she picks up and she starts interfacing with me.
00:45:22
I'm like, by the way, thank you so much for helping me on a
00:45:24
Saturday morning.
00:45:26
And then we just started this nice dialogue back and forth
00:45:29
with each other and it got to a point where she said you know,
00:45:34
Bob, normally I have to elevate this to somebody higher up than
00:45:38
me, but I really want to help you and I think I know how to
00:45:42
help you with your problem and I'm not going to elevate it.
00:45:44
But let's try something.
00:45:45
And so she gave me this thing to try.
00:45:47
She walked me through it step by step.
00:45:49
We got through it.
00:45:50
It actually fixed the problem right, but normally that would
00:45:53
have been escalated to somebody else.
00:45:54
I would have had to wait for somebody else to contact me
00:45:57
later in the day.
00:45:58
But I think that politeness, that polite exchange between the
00:46:02
two of us, a little bit of gratitude.
00:46:04
She was probably like you know what, I'm going to help this guy
00:46:07
and she did and it was great.
00:46:10
My experience with Wix support was outstanding, and I just feel
00:46:14
like those little things matter to your point.
00:46:18
You know what I mean.
00:46:19
Those little pleasantries matter.
00:46:25
We're all humans at the end of the day.
00:46:25
We all want to be thanked for the work that we're doing.
00:46:27
We all want to feel good about what we're doing.
00:46:28
One of the reasons I love horticulture is because it has a
00:46:30
lot of instant gratification.
00:46:32
You plant a bunch of plants in a field.
00:46:34
You get to see what you just did.
00:46:35
Right, it looks great, the field's been tilled, you can go
00:46:38
back and see the perfectly straight lines.
00:46:39
So I love instant gratification.
00:46:42
But I think most of us really in general want instant
00:46:45
gratification in the things that we do, um, and add in some
00:46:49
pleasantries, and I think all of a sudden you've got this success
00:46:52
thing happening. Yeah.
00:46:55
Speaker 1: Yeah, absolutely.
00:46:56
Well, you know, Robert, you know I apologize, we didn't
00:47:01
really dive into X Analytics, but why don't you
00:47:04
tell my audience, you know, a little bit like an overview of
00:47:07
like what you guys do, what you specialize in, yeah sure, and
00:47:10
how you help other companies?
00:47:12
Speaker 2: Yeah, by the way, Joe, this has been a fun
00:47:14
conversation, so thank you for having me on today.
00:47:16
I really appreciate it.
00:47:17
Yeah, absolutely.
00:47:18
You know, just summary-wise, with X Analytics, I'll
00:47:24
tell you, like, where the idea came from and where we are
00:47:27
today.
00:47:28
So years ago I was in a board meeting for a large bank.
00:47:31
I was assisting the CISO in the board meeting, and for the
00:47:35
stuff that we prepared, you know, we did a great job preparing
00:47:38
the materials, but it was not received well by the board, and
00:47:42
it wasn't because they were upset about the information.
00:47:44
They just didn't know what we were talking about, right?
00:47:48
Like clearly, they had no idea what we were talking about, and
00:47:51
so I left that board meeting.
00:47:52
I said, you know, there has to be a better way, there has to be
00:47:57
a way to communicate cyber in a way that people can understand
00:48:00
and a way that they can make sound decisions from.
00:48:01
And so about a year later, I left that job that I had at the
00:48:06
time and joined up with some partners at X Analytics and we
00:48:10
created the concept of X Analytics. And the idea was, could
00:48:14
we build something that simplifies cyber risk management
00:48:18
and could we build something that allows people, whether
00:48:22
they're novices in cyber or not, to understand what the risk
00:48:26
condition looks like and then where they can make decisions
00:48:29
with ease.
00:48:30
Right, that was the concept, and so we achieved that concept.
00:48:33
We built what we wanted to build.
00:48:35
We continue to iterate on that idea as time
00:48:38
goes on and we continue to advance our capabilities.
00:48:41
But fundamentally, that's what X Analytics is.
00:48:44
X Analytics is a cloud-based platform that helps folks
00:48:48
simplify cyber risk management, and the way that we do that is
00:48:53
we have a really simple structure to help them build a
00:48:56
profile for their business.
00:48:58
That profile gets married with back-end data, which is
00:49:01
historical loss data, historical threat data, historical
00:49:04
probability data, and it serves up a really easy-to-understand
00:49:10
concept of their cyber risk condition. And they can see, if
00:49:18
they're a NIST CSF organization, they can see what the world
00:49:19
looks like in NIST CSF.
00:49:20
If they're CIS CSC, the Critical Security Controls, they can see
00:49:22
what the world looks like under that context.
00:49:25
But then we take it further and we weave in the elements of
00:49:28
governance.
00:49:29
We weave in the elements of optimized transfer, optimized
00:49:33
mitigation, so that, ultimately, organizations can see that the
00:49:38
decisions that they're making are leading to an improved outcome
00:49:43
in their overall cybersecurity posture and the goal there is to
00:49:47
give the CISOs a pat on the back, right?
00:49:49
The CISOs are doing all these wonderful things.
00:49:52
From the very beginning,
00:49:53
they can see those wonderful things by looking at the
00:49:55
difference between inherent risk and residual risk and the
00:49:58
current state that they're in.
00:49:59
But then as time goes on, they continue to show those trend
00:50:03
lines and how those trend lines are improving based on the
00:50:05
wonderful projects that they're implementing within their
00:50:08
organization.
00:50:09
So it's really to serve up not only an honest perspective of
00:50:13
their business but also to make sure that the CSO is getting the
00:50:16
compliments for the hard work that he or she is doing for
00:50:19
the business that they work for.
00:50:21
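For readers unfamiliar with the inherent-versus-residual framing mentioned above, a small sketch may help. It is illustrative only, with made-up numbers, and is not X Analytics' methodology: residual risk is what remains of inherent risk after control effectiveness is applied, and tracking it over time produces the kind of trend line described here.

```python
# Toy inherent-vs-residual risk calculation with made-up figures; not a real model.
inherent_risk = 10_000_000  # assumed expected annual loss with no controls

# Assumed control effectiveness per quarter (fraction of inherent risk removed).
control_effectiveness_by_quarter = [0.40, 0.50, 0.60, 0.70]

for quarter, effectiveness in enumerate(control_effectiveness_by_quarter, start=1):
    residual_risk = inherent_risk * (1 - effectiveness)
    print(f"Q{quarter}: residual risk ${residual_risk:,.0f} "
          f"({effectiveness:.0%} of ${inherent_risk:,.0f} mitigated)")
# The downward trend in residual risk is the improvement a CISO could point to.
```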
Speaker 1: Wow, yeah, it's really.
00:50:22
It's interesting.
00:50:23
It's like providing context, right, where there
00:50:28
wouldn't normally be, in a very tangible way.
00:50:31
Speaker 2: Yeah, and it's represented through a financial
00:50:35
lens.
00:50:35
Oh, and that's not the only way. You can look at, like, how much
00:50:38
of NIST have I achieved?
00:50:40
What is my NIST tier achievement?
00:50:41
Between one and four.
00:50:43
So it serves up other sorts of metrics from which they can draw that
00:50:50
perspective.
00:50:50
But the perspective that we do put forward is a financial one,
00:50:51
right?
00:50:52
So they can say my cyber risk problem is equivalent to 1% of
00:50:55
revenue.
00:50:55
Now, is that a big deal?
00:50:57
Maybe, maybe not right, that's unique to every company.
00:50:59
But it allows them to take that understanding and also compare
00:51:03
it to other operational risks inside the business.
00:51:05
So, you know, in this past year, if you're comparing your cyber
00:51:09
condition and it's 1% of revenue, but inflation is 7% of revenue,
00:51:13
well, you're probably going to focus more on inflationary
00:51:16
problems, right?
00:51:17
Going back to the economic stuff that we were talking about, if
00:51:20
you're dealing with a company that has a lot of shoplifting
00:51:23
and if shoplifting is 5% of revenue, well then that's going
00:51:27
to be more important to the business to address shoplifting
00:51:29
than the cyber problem.
00:51:30
On the other hand, if cyber is the thing that is most
00:51:33
significant, it's 2%, 3%, 4% of revenue, then it allows you to
00:51:37
compare that with the other operational risks, to have a
00:51:40
very honest conversation with the leadership, to say you know
00:51:43
what, maybe we need to invest more in cyber.
00:51:44
It's our number two or number one problem in our company.
00:51:47
How would you guys like us to proceed?
00:51:49
Right, and that's just a very open and honest conversation.
00:51:54
So that's the goal.
00:51:55
Right? It's to really sort of simplify it, put it in a
00:51:58
language and a context that everybody in the business can
00:52:00
understand, compare it to other things that are happening in the
00:52:03
business and then ultimately make the right types of
00:52:05
decisions.
00:52:08
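To make the comparison Robert describes concrete, here is a minimal sketch with hypothetical figures chosen to mirror his example (1% cyber, 7% inflation, 5% shoplifting). It is not X Analytics output, just the arithmetic of expressing each operational risk as a share of revenue and ranking them.

```python
# Hypothetical illustration of comparing operational risks as a share of
# revenue; the category names and dollar amounts are made up.
annual_revenue = 500_000_000  # assumed revenue in dollars

estimated_annual_losses = {   # assumed expected losses per risk category
    "inflation": 35_000_000,
    "shoplifting": 25_000_000,
    "cyber": 5_000_000,
}

# Express each risk as a percentage of revenue and rank largest first.
risk_share = {
    name: loss / annual_revenue * 100
    for name, loss in estimated_annual_losses.items()
}

for name, pct in sorted(risk_share.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:12s} {pct:4.1f}% of revenue")
```

With these assumed numbers, inflation (7.0%) and shoplifting (5.0%) outrank cyber (1.0%), which is exactly the prioritization conversation described above.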
Speaker 1: Yeah, it makes a lot of sense.
00:52:09
It's definitely an area that's needed, for sure, in the
00:52:13
industry.
00:52:13
Well, you know, Robert, I really enjoyed our conversation,
00:52:17
but before I let you go, how about you tell my audience you
00:52:20
know where they can find you if they wanted to connect, and then
00:52:23
where they can find your company if they wanted to learn
00:52:24
more?
00:52:25
Speaker 2: Sure, I mean, you can easily find me on LinkedIn,
00:52:28
Robert Vescio.
00:52:28
There might be more than one.
00:52:29
There's a doctor out of Los Angeles that also has a Robert
00:52:32
Vescio name, and there's also an author of children's books.
00:52:34
So there's three of us out there that I know of, but,
00:52:37
Robert Vescio, you'll see me because I'll have the
00:52:39
cybersecurity tag associated with my LinkedIn, and then our
00:52:43
web address is xanalytics.com.
00:52:45
It's x-analytics.com, so we're really easy to find. Joe,
00:52:50
This has been an absolutely wonderful conversation.
00:52:52
Thank you so much for having me on today.
00:52:54
Speaker 1: Yeah, absolutely, I really enjoyed it.
00:52:56
I'll definitely have to have you back on you know in the
00:52:59
future.
00:52:59
Absolutely, I look forward to it.
00:53:01
Thank you, Joe. Awesome.
00:53:03
Well, thanks everyone.
00:53:04
I hope you enjoyed this episode.