Have you ever gazed into the depths of a cybersecurity expert's mind? Prepare to be captivated as we sit down with Oliver, a virtuoso in the realms of IT and cybersecurity, whose tale unfolds from a childhood enchanted by math and sci-fi to the frontlines of digital defense. In today's episode, we peel back the layers of cybersecurity, from the bedrock of IBM mainframes to the latest in AI-driven security strategies, through the eyes of someone who has seen it all. Oliver's insights paint a vivid picture of the hacker's mindset and the relentless progression of cybersecurity challenges.
Oliver doesn't shy away from the personal, either. He lays bare his struggles with imposter syndrome, reminding us that even the most seasoned professionals harbor self-doubt. This candid talk traverses the landscape of technological leadership, contrasting the role of yesterday's CTO with today’s, and emphasizes the transformative journey required to shepherd teams through decades of tech evolution. With Oliver's narrative, you're invited to witness the metamorphosis of an industry and the professionals within it, rooted in the principle of lifelong learning.
As we explore the shifting sands of network security, Oliver guides us through the sophisticated use of AI and machine learning in detecting cyber threats. We probe the vital nature of identity security in a boundaryless digital world and the adaptation of cybersecurity strategies to protect networks, clouds, identities, applications, and endpoints. Diving into the ethical quandaries of AI in security, we uncover the importance of safeguarding privileged access against the burgeoning capabilities of AI. Join us for this enlightening episode that promises to arm you with a deeper understanding of the complex, ever-changing theater of cybersecurity.
Affiliate Links:
NordVPN: https://go.nordvpn.net/aff_c?offer_id=15&aff_id=87753&url_id=902
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: So how's it going, Oliver?
00:00:01
You know it's fantastic to have you on.
00:00:04
I know that we've been planning this thing for a while now, and
00:00:07
I'm really looking forward to our conversation.
00:00:10
Speaker 2: Yeah, I'm glad to be on.
00:00:12
These conversations tend to go sometimes in predictable ways
00:00:15
and sometimes in unpredictable ways, but we'll keep it as
00:00:17
organic as possible.
00:00:19
Speaker 1: So this podcast is completely organic, right, I may
00:00:25
have a few questions on the side that I jotted down, right
00:00:28
that I'm interested in.
00:00:29
I want to make sure that I get to.
00:00:30
But really, you know, this time is for you to tell your story
00:00:34
how you want it to be told, right, and I pride myself a lot
00:00:40
on how the podcast is structured.
00:00:41
So I feel like a lot of people appreciate that.
00:00:44
Yeah, absolutely so.
00:00:45
You know, Oliver, why don't we get started with, you know, you
00:00:48
diving into your background, right, like how you got into IT,
00:00:53
what made you want to get into IT?
00:00:54
You know, was there a certain you know show, movie, book that
00:01:00
you read, something like that, that kind of sparked that
00:01:03
interest to make you go down this path?
00:01:05
Speaker 2: Yeah, it's interesting.
00:01:06
As a young kid I was always the math kid, the one whose
00:01:15
parents and grandparents would throw numbers at me: multiply these
00:01:19
numbers and come up with a result.
00:01:21
My uncle, who was not much older than me, was going to
00:01:26
university and studying computer science.
00:01:29
And so I remember from the early days, when I was a kid, he
00:01:35
would come back with printouts from an IBM mainframe,
00:01:39
you know, the old green-bar printouts, and I remember
00:01:43
always being kind of fascinated by that. I was certainly a science
00:01:47
fiction person too, so, you know, things like 2001: A Space
00:01:51
Odyssey, which of course has HAL and the notion of a
00:01:54
malevolent AI, all of that was very much a thing back
00:01:58
then.
00:02:01
So I always kind of felt that I was predestined to go do
00:02:06
computer science and math, or some combination of the two, and
00:02:09
I ended up studying a degree in both. I always
00:02:15
found the complexity of turning problems into symbolic logic and
00:02:22
running code, in the early days, fascinating, and
00:02:25
I was always drawn to the creative process. I think there may be a lot
00:02:31
of people that think computer programming is a purely technical thing.
00:02:32
I knew otherwise before I was in the industry at all.
00:02:40
If you've seen good code and you've seen bad code, you
00:02:45
understand the difference between them, and so for me
00:02:46
programming was always as much craft as engineering.
00:02:47
I was also driven, from the early days, toward really
00:02:48
understanding systems well, to the point that, you know, when I
00:02:51
joined IBM, I was still working on mainframe operating
00:02:53
systems in assembly language, and so, you know, a low-level
00:02:58
understanding all the way down to the hardware: interrupts,
00:03:02
vectors, those kinds of things.
00:03:04
So that's how I cut my teeth, and the notion of wanting to
00:03:10
know how things work
00:03:12
is an intrinsic thing for me that isn't always there for
00:03:17
others.
00:03:17
Right, I mean, even bread:
00:03:19
I bake bread, and for me it carries into that too.
00:03:22
I do these rye sourdoughs, salt and pepper and
00:03:29
everything.
00:03:30
But I need to have an understanding of how that works,
00:03:34
how it's intended to work, and how it can be made to
00:03:37
actually not work.
00:03:38
That's where I was headed:
00:03:39
you saw more of the hacker ethos.
00:03:41
This was around the mid-90s.
00:03:45
I kind of got into security, and cryptography.
00:03:49
This was all before your time, right.
00:03:52
And one of the first things I learned in security was basically
00:03:54
the management of shared secrets on systems and the value of
00:03:58
the key.
00:03:58
How do you have two parties have an exchange without
00:04:02
necessarily saying, you know, here's the password, and get
00:04:06
proof to each other that they have access to the same symmetric
00:04:08
key material, or that you have asymmetric key material, and
00:04:12
apply stuff like that to secure communications?
00:04:14
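The exchange Oliver describes, two parties proving they hold the same key material without ever sending the password itself, can be sketched as a simple challenge and response. This Python sketch is illustrative only, not any specific protocol from the conversation, and the function names are invented:

```python
import hashlib
import hmac
import os

def prove_knowledge(shared_key: bytes, challenge: bytes) -> bytes:
    """Prover returns an HMAC tag over the challenge, demonstrating
    knowledge of the key without ever transmitting the key itself."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify_proof(shared_key: bytes, challenge: bytes, tag: bytes) -> bool:
    """Verifier recomputes the tag; constant-time compare avoids timing leaks."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Both parties already hold the same symmetric key material, established
# out of band (e.g. via a key-agreement protocol such as Diffie-Hellman).
key = os.urandom(32)

# The verifier issues a fresh random challenge; the prover answers with a tag.
challenge = os.urandom(16)
tag = prove_knowledge(key, challenge)

assert verify_proof(key, challenge, tag)                  # same key: accepted
assert not verify_proof(os.urandom(32), challenge, tag)   # wrong key: rejected
```

A fresh random challenge each time is what prevents a replayed tag from passing verification later.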
Ultimately, I got acquired.
00:04:16
I got acquired into Juniper Networks, where I kind of ran
00:04:26
into firewalls and IPSs and SSL VPNs and other things.
00:04:32
So there's this entire arc of, you know, getting into what I
00:04:37
thought of as either securing communications or authentication or
00:04:41
preventative stuff.
00:04:42
And the epiphany at the end of that journey, and it crystallized
00:04:48
when I came to Vectra, was that, you know, all this preventative
00:04:52
stuff that we're doing is just like a filter.
00:04:55
It's the first line of defense.
00:04:58
Ultimately, the reason people have incident responders and
00:05:01
they have SOC teams is that, yes, stuff still gets through.
00:05:04
And there's also this interesting thing: when
00:05:07
you're on the preventive side, oftentimes you create these
00:05:10
preventive controls that, if operated perfectly, might, you
00:05:17
know, really save you.
00:05:18
But, you know, if you looked across the whole customer base and were
00:05:22
honest, maybe 2% of them were capable of running this thing that way,
00:05:26
particularly because it required a degree of skilled hands and
00:05:29
mental modeling that was just not economically feasible for
00:05:34
them to have.
00:05:35
And so oftentimes, I think, in my retrospective view of that
00:05:40
world, the delta between the theoretical efficacy
00:05:47
of a solution and the actual, as-deployed efficacy of a
00:05:51
solution was the large amount of human capital that you as the
00:05:55
customer were expected to put into the system.
00:05:58
And then, even if you got it all right, you know, something like
00:06:01
Log4j comes along, or the latest kind of
00:06:04
backdoors and vulnerabilities come along.
00:06:06
It's just like, oh well.
00:06:07
Or, you know, Microsoft gets one of their signing keys stolen, and
00:06:11
so you still kind of deal with the back
00:06:14
end of it. And so then, you know, I kind of got to this point of:
00:06:18
at some point there's a law of diminishing returns on the
00:06:21
preventive side, right. It's like, yeah, you can put another
00:06:24
million dollars in to gain another percent of efficacy in your
00:06:27
prevention, but then there's
00:06:29
this balance of trade of how much are you working on
00:06:33
resilience and detection and response.
00:06:35
That's how it got me to Vectra at the beginning of
00:06:38
Vectra. To say, okay, I'm going to assume the preventions have done either a
00:06:44
perfect job or an imperfect job, but whatever job they have done,
00:06:48
they filter out as much unwanted stuff as possible, and
00:06:52
then there's the problem that remains on the far side of that, which is, okay,
00:06:55
people are still going to get in. Maybe your defense depends
00:06:59
upon thousands of users and phishing training, and, well,
00:07:05
let's not assume that they're all going to get that right.
00:07:07
There was a customer that we had in the early stages.
00:07:09
I remember talking to them, this was 10-plus years ago at this
00:07:12
point, where I was visiting them in their offices, and the CISO
00:07:16
basically said to me: yeah, I know exactly how many
00:07:21
vulnerabilities I have in our environment, and their cars are
00:07:24
out there in the parking lot.
00:07:25
Because, I mean, as long as you have humans that can be
00:07:32
engineered in different ways, they will make mistakes,
00:07:35
and attackers will assist them in making them, and so stuff will get in.
00:07:38
And now, really, for us, the mantra has been: how do we
00:07:43
actually make customers resilient, recognizing that
00:07:46
there will be incursions into their environment?
00:07:50
How do we stop those from becoming breaches that may
00:07:54
devastate the organization and end up being
00:07:58
reported in the news cycles?
00:07:59
And so that is really our mission at the highest level:
00:08:03
to lead in the detection and response space. And then we
00:08:08
can kind of talk about the peculiarities of our
00:08:11
approach and the direction we've chosen.
00:08:16
Speaker 1: So you've really covered a lot there and I kind
00:08:21
of want to break it down a little bit.
00:08:23
You know, when you were at IBM, it sounds like you weren't just
00:08:27
drinking from the fire hose, right, it sounds like you were
00:08:30
trying to drink the ocean.
00:08:31
Almost, right? For you to cut your
00:08:35
teeth in that sort of environment, you know.
00:08:38
Can you talk a little bit about the dividends that it paid down
00:08:42
the road?
00:08:43
Right, because I'm sure that that pays massive dividends.
00:08:46
Because you know, even now, as a CTO, right, like there's a lot
00:08:50
of CTOs or C-level executives out there that you know are not,
00:08:56
you know they know how to turn on their laptop, get to their
00:09:00
email, maybe Teams, right, and that's really it, they rely on
00:09:04
the rest of the team to tell them what's going on.
00:09:07
I feel like with your experience , it's kind of the other way
00:09:11
around.
00:09:11
It's kind of like, okay, you tell me what's going on and I'll
00:09:15
probably actually be able to figure it out.
00:09:18
Speaker 2: It's interesting. I think getting a grounding in
00:09:23
how systems work at their core, rather than coming
00:09:27
to the industry in an age where it's all about high-level code
00:09:30
and there's some magic by which that code gets executed,
00:09:34
I think that's a difference that's
00:09:37
hard to replicate.
00:09:38
The other thing is, I just learned how to learn, and along
00:09:46
the way I met a number of people who were very smart, people who
00:09:49
were very good at simply saying when they didn't
00:09:52
understand something. I would try to explain something to
00:09:54
them and they didn't understand it,
00:09:55
and they'd just say: I don't get it.
00:09:57
So yeah, I forced myself to do the same, and this is my technique even to
00:10:02
this day.
00:10:03
I have security researchers reporting to me.
00:10:04
I haven't really been a security researcher.
00:10:06
I have user experience people reporting to me.
00:10:09
I haven't really been a user experience designer.
00:10:11
I have data scientists that I talk to on a daily basis.
00:10:13
I'm not, I mean, I'm math-trained, but not heavily in data
00:10:17
science. And yet I've learned how to ask enough questions to
00:10:22
construct a mental model for myself that is reasonably
00:10:26
accurate.
00:10:26
Right, I can have meaningful conversations with these various
00:10:30
parties within our company and then also
00:10:33
outside the company. And the one skill that I
00:10:39
seem to have, and again, when you have a skill yourself, you don't think
00:10:41
of it as quite exceptional until you go through an entire
00:10:44
career and find that not a lot of people seem to have it,
00:10:48
is seeing the essence, the core essence, of a problem,
00:10:53
and then being able to talk to it at various altitudes.
00:10:58
And so I can talk to business people and still capture the
00:11:02
core essence of it, but have it abstracted to a level where it's
00:11:05
not so overwhelmingly detailed that it doesn't really
00:11:10
bring anything to the table.
00:11:11
And yet if I'm going to go talk to a security researcher or a
00:11:15
data scientist, I need to be able to speak at a low enough
00:11:18
level that it still feels like it's within their universe,
00:11:21
capturing the essence of what's hard and what's easy.
00:11:23
So I think those are the skills.
00:11:25
Ultimately for me, when I think of my skills as CTO, it is to be
00:11:30
the acquirer of enough information to construct
00:11:35
reasonable mental models, and then the ability to describe
00:11:38
those models at a variety of altitudes appropriate to the
00:11:42
conversation that's being had.
00:11:44
Speaker 1: It's a really good point that you bring up there.
00:11:46
And you know, earlier on in my career when I was just starting
00:11:51
out right and I was drinking from the fire hose, not the
00:11:55
ocean, I kind of figured out how I learned right In that
00:11:58
environment you really figure out what works best for you.
00:12:01
And when I was trying to figure out how these Linux systems
00:12:05
would work and how it interplayed with the emergency
00:12:09
911 system and how that, you know, got relayed to national
00:12:13
call centers and stuff like that , me understanding that system
00:12:17
at length made it a lot easier for me to be able to talk about
00:12:23
it at higher levels, engage the room and, you know, break down
00:12:28
what I'm saying into consumable terms to people that are
00:12:32
non-technical.
00:12:33
And it pays dividends now, because that's like primarily
00:12:37
what I do, which is really interesting how that plays,
00:12:41
because you know, like yourself, once I was able to get that
00:12:44
mental picture or maybe even a physical picture right, maybe I
00:12:48
drew it out.
00:12:49
Once I was able to get that, it all started to kind of make
00:12:52
sense to me.
00:12:53
It all started to, you know, really open my eyes and whatnot.
00:12:57
But I found that even after you know that role and I had moved
00:13:02
on into more interesting, more, even more technical roles, I
00:13:06
still had a little bit of imposter syndrome.
00:13:09
Did you ever encounter that?
00:13:11
Maybe you encountered it at IBM , but did you ever encounter it
00:13:15
after IBM?
00:13:16
Speaker 2: All the time, every day, right, it's like I spend my
00:13:20
time talking to people who are much more gifted at what they do
00:13:24
than I am.
00:13:25
Right, I'm an aggregator of information and as such, there's
00:13:32
always kind of a question.
00:13:33
It's like am I, you know?
00:13:34
Do I really know what I'm doing ?
00:13:36
And I've talked to other kind of high-functioning people.
00:13:38
A lot of them, the reason they're high-functioning is
00:13:40
because largely the imposter syndrome problem kind of keeps
00:13:45
them on their toes.
00:13:45
It's like well, I don't feel comfortable talking about
00:13:48
something until I actually grok it and understand it, and so you
00:13:52
have to go to a certain threshold of learning and then I
00:13:55
feel okay.
00:13:56
But up to that point it's just like I know a bunch of other
00:14:01
people who understand this, who spend time on it, and I don't
00:14:03
understand it.
00:14:04
Imposter syndrome, I think, is a good thing.
00:14:07
In many ways it's kind of an evolutionary trait, right,
00:14:10
because I think a lot of our ancestors, as long as they
00:14:13
didn't get too confident in their understanding of, like
00:14:18
yeah, I can deal with anything.
00:14:19
It's like, yeah, well, a new, faster animal has just shown
00:14:22
up and it's going to run me down, right.
00:14:30
And so, yes, imposter syndrome to this day is a thing.
00:14:31
Speaker 1: Yeah, it's fascinating to me because I feel like it has the
00:14:35
stigma, right, that, okay, eventually this will go away.
00:14:39
And I think what it does is it just changes.
00:14:42
It kind of changes with the role that you're in and whatnot.
00:14:47
Like you know, for instance, I used to have imposter syndrome
00:14:51
a lot when I was getting started in security.
00:14:52
Right, I don't really have it anymore for the most part, or at
00:14:57
least I get over it pretty quickly now.
00:15:00
Um, but you know, I was just confronted with an opportunity
00:15:02
right to create, essentially, you know, something that
00:15:07
communicates with satellites and it's regarding securing
00:15:11
satellites in space in a different way, with new
00:15:14
technologies that are just now emerging, and this is like a
00:15:18
side project.
00:15:19
And even with that, and I'm getting my PhD in securing
00:15:22
satellites, right, I still thought to myself
00:15:25
there's no way I'm going to last more than two weeks, right, they're
00:15:29
going to figure out
00:15:30
You know I'm an imposter or whatever, right, like I
00:15:33
literally dealt with this yesterday, you know, I was
00:15:36
doubting myself and everything, and it's important, I think
00:15:39
, for people to understand, like, hey, this is something that you're
00:15:43
very likely to continuously have
00:15:47
going on, and it's important to ask questions, to learn more, to
00:15:50
become, you know, a forever learner.
00:15:53
Speaker 2: It is.
00:15:54
The healthy perspective is simply a philosophical one.
00:15:58
There's so much more you don't know than you do know.
00:16:01
You just need to get to a point of
00:16:03
confidence that you can learn and you can acquire it.
00:16:07
It's almost like the Matrix version, right?
00:16:10
It's just like, hey, can I download the flying-the-Apache-
00:16:13
helicopter thing?
00:16:14
It's not quite like that.
00:16:15
But we're talking about confidence.
00:16:17
I think as you grow, there's a kind of confidence
00:16:22
that offsets that imposter syndrome, which is the: hey,
00:16:25
I've done this enough times where I didn't know something
00:16:29
and three weeks later I understood it.
00:16:32
Having been through that enough times,
00:16:36
you have this offsetting instinct, which is: yeah, I don't
00:16:40
know it now, but I will know it in the end.
00:16:44
And what people are valuing me for is not the static knowledge
00:16:49
that I have about a specific topic, but my ability to kind of grab
00:16:53
a hold of something that I don't know today and power
00:16:58
through and get to the heart of it and distill it, connect it,
00:17:02
aggregate it, explain it to others.
00:17:04
That is the role that I provide and, almost by definition, it
00:17:10
means going through periods of uncertainty where you don't
00:17:12
understand something.
00:17:13
Because that's the value of the role.
00:17:15
Nobody else does it, but it's my job to do it.
00:17:18
And a number of times people just send me a website and
00:17:20
it's like, hey, here's this particular piece of intellectual property,
00:17:22
probably a competitor, you know, and I'm just looking and squinting
00:17:26
at it and going, yeah, they're like a blend between X and Y and
00:17:29
Z, with a little bit of this or that, and it's nothing really
00:17:32
totally new.
00:17:32
Speaker 1: Yeah, it's really.
00:17:33
It's a fascinating rabbit hole that we could definitely go down
00:17:37
for the remainder of the time.
00:17:39
But you know to kind of, I guess, move along right down
00:17:46
your career line or your career background, right?
00:17:48
I just did a quick look on LinkedIn, right, and it says
00:17:52
that you were a CTO starting in the 90s and I'm wondering how is
00:17:57
it different back then being a CTO compared to today?
00:18:01
What's the difference?
00:18:03
Because I feel like, personally, CTOs back then were a lot less
00:18:08
technical, right, they were maybe not well-versed
00:18:14
in all areas of technology like you were.
00:18:17
They didn't come from the same background, whatnot and this is
00:18:20
me just kind of spitballing, right, like this is me thinking
00:18:24
without having that actual knowledge of what it must have
00:18:27
been like.
00:18:27
So, in that situation, you know what skills helped you stand
00:18:32
apart and how is it different than from today?
00:18:34
Speaker 2: I think there's both the time element of it.
00:18:38
It's hard to make a distinction from the scale
00:18:41
element.
00:18:41
In those days, you know, I was a CTO of a company of 25,
00:18:47
then a CTO of a
00:18:50
company of 100.
00:18:50
And I progressed to suddenly being kind of CTO of a part
00:18:55
of an organization at Juniper that had like a thousand people
00:18:58
in it, right. And so there's both the time element of, you
00:19:01
know, how was this different in the 90s versus the
00:19:05
2000s versus the 2010s?
00:19:07
It's hard for me to kind of disconnect that from the reality
00:19:10
that I was doing it in a very much smaller organization in the early
00:19:14
days and in a much bigger one in the later days.
00:19:16
So in the early days, on the small side,
00:19:22
I was basically the alpha developer, right, and I learned
00:19:27
how to lead small groups of a few more people, as opposed to
00:19:30
being kind of a VP of engineering, which is more:
00:19:32
you run the entire machinery.
00:19:34
So I was always kind of an overlay across a few savants,
00:19:39
with the overlay collected into engineering, and that
00:19:43
progressed until I kind of got to Juniper, and then at Juniper
00:19:46
it felt like, okay, now I have this large R&D
00:19:50
organization and relationships with others.
00:19:54
You're trying to figure out where you can most
00:20:08
impactfully move the needle, because there's this big machine
00:20:13
kind of moving along, and there's also more of an external
00:20:16
aspect to it.
00:20:17
At that point there's an expectation: okay, we've got all
00:20:19
our professional services, we've got sales.
00:20:21
The other half is the customers that are going to ask,
00:20:25
hey, what's your technical vision?
00:20:26
Because customers aren't just buying your product,
00:20:30
they're going on a journey with you, and they want to know that
00:20:33
your journey and their journey align reasonably well.
00:20:35
So it's a much more complicated thing, unfortunately,
00:20:40
oftentimes depending on the machinery and the bureaucracy
00:20:45
and the existing business.
00:20:47
But you also get further removed from actually building
00:20:51
cool stuff, as CTOs in larger organizations often do.
00:20:56
And so, you know, coming back to Vectra:
00:21:00
in the very early years we had no product, we had no customers,
00:21:04
we had a blank sheet of paper, we had a problem statement, and
00:21:07
so it was good to shed that skin, right.
00:21:11
And now, you know, we're at a much bigger scale
00:21:14
in the company.
00:21:16
So I'm kind of mashing up some combination of the
00:21:19
skills of, you know, how to build things, how to go from
00:21:22
not much to something, how to steer from a blank sheet of paper, right,
00:21:26
because oftentimes people only know the incremental
00:21:29
building phase, so being able to go from a blank sheet of paper, and, as
00:21:32
the organization has grown, all those tools that I kind of built
00:21:36
up later in my career: how you impact your organization,
00:21:41
how you get the word out, how you work with
00:21:44
customers, with field people, with other parts of the
00:21:48
organization, and ultimately kind of get people onto a common vision for
00:21:53
the future, all of those things.
00:21:56
So I find that I'm now operating with this combination.
00:22:00
I still hold on to this notion that we can build cool stuff,
00:22:04
and so I have small teams of people building cool stuff in a
00:22:08
cleaner form, right, where I can still build cool stuff and get it out.
00:22:11
But I also have these other responsibilities that are
00:22:15
much more, yeah, in service of the existing business.
00:22:20
Speaker 1: Yeah, it's always a fascinating journey for me,
00:22:24
right, to identify the different skills that
00:22:29
different roles and different organizations demand, and it's
00:22:32
always interesting to see how people you know fit into that
00:22:36
right, because you know, obviously, I see myself, you
00:22:41
know, going and growing in that direction as well, and so it's
00:22:45
out of out of my own curiosity as well.
00:22:48
It's like, hey, what skills should I be working on now to
00:22:51
prepare myself five, 10 years down the road, right?
00:22:55
So it's always just fascinating .
00:22:57
I feel like it's always beneficial for my listeners too,
00:23:00
because a lot of them have that same mentality as well.
00:23:03
So let's talk a bit about Vectra.
00:23:08
What's the purpose of Vectra?
00:23:11
I'm sure a lot of people are just gonna say, oh, it's another
00:23:14
AI company, right?
00:23:15
That's what they're saying to
00:23:17
themselves right now. Yeah, what is it?
00:23:20
Speaker 2: So what is it?
00:23:21
This journey for us, just to put it in perspective,
00:23:24
really began in 2012, 2013, right.
00:23:26
When we were talking about AI then, it was a very different kind of
00:23:29
AI than what everybody talks about now, which is, you know,
00:23:34
everything is about GenAI.
00:23:35
The notion at the time, and it kind of came out of Juniper, was that
00:23:42
attackers at some point want to kind of extend their
00:23:45
foothold on the inside further, before they do you
00:23:49
any harm.
00:23:49
And we were
00:23:51
networking people, not endpoint people.
00:23:54
So, you know, while obviously CrowdStrike
00:23:57
and Cylance and others were
00:24:01
looking at this from an endpoint perspective, we said,
00:24:04
hey, let's start with the network, because that felt like much
00:24:06
more of an open field at that point.
00:24:09
Yes, you've got IDSs and things like that trying to do
00:24:12
signature stuff, but we're like, no. This was before
00:24:21
TensorFlow and a lot of the other grounding
00:24:24
elements of neural nets that you could build on.
00:24:29
We could actually use what I was then referring to as
00:24:32
data science to go look for patterns that are indicative of
00:24:34
attackers.
00:24:34
And so in the early days we said, okay, we're going to
00:24:38
basically put sensors on the network, extract data, and
00:24:42
we're going to run that data through a
00:24:44
series of different kinds of models.
00:24:46
We're going to look for things that look indicative of
00:24:49
attacker behavior, and then we're going to aggregate those
00:24:52
attacker behaviors based on which entities they were
00:24:56
attributed to, and then we're going to score those entities
00:24:58
and say: this
00:25:00
thing, this machine, this account, whatever,
00:25:04
is misbehaving in these different ways and you should go
00:25:06
deal with it.
00:25:07
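The pipeline Oliver outlines, detect behaviors, attribute them to entities, aggregate, then score, can be sketched in a few lines of Python. The behavior names, severity weights, and entities below are invented for illustration; a real system derives its detections from network metadata and learned models:

```python
from collections import defaultdict

# Hypothetical severity weights for a handful of behavior models.
SEVERITY = {
    "c2_beaconing": 8,
    "internal_port_scan": 4,
    "suspicious_admin_login": 6,
    "data_staging": 7,
}

# Each detection event: a behavior model fired and the behavior was
# attributed to an entity (a host or an account).
detections = [
    {"entity": "host-17", "behavior": "internal_port_scan"},
    {"entity": "host-17", "behavior": "c2_beaconing"},
    {"entity": "svc-account", "behavior": "suspicious_admin_login"},
]

def score_entities(events):
    """Aggregate distinct behaviors per entity, then score each entity
    by the summed severity of everything attributed to it."""
    behaviors = defaultdict(set)
    for ev in events:
        behaviors[ev["entity"]].add(ev["behavior"])
    return {
        entity: sum(SEVERITY[b] for b in seen)
        for entity, seen in behaviors.items()
    }

scores = score_entities(detections)
# host-17 combines scanning with beaconing, so it rises to the top
# of the analyst's queue.
worst = max(scores, key=scores.get)
```

The point of the aggregation step is that a single low-severity behavior is noise, while several distinct behaviors converging on one entity is a much stronger signal.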
That was kind of the early days, but it set the ground for a
00:25:14
few things that hold true to this day, which is: ours, at its
00:25:17
heart, is an applied approach.
00:25:19
We firmly believe in the efficacy of applying AI, ML, and
00:25:23
other kinds of models, mathematical models, to the
00:25:26
problem of finding attacker signal, and we don't believe it to be
00:25:30
entirely an anomaly detection problem, which is oftentimes how
00:25:33
people treated it in those days.
00:25:34
And if you look at the UEBA market, you know, user and entity
00:25:38
behavior analytics, it became this hell of, well, you know, this
00:25:42
person is doing something weird.
00:25:43
Well, it turns out that human beings do
00:25:45
weird things all the time, and
00:25:48
you're going to trigger a lot of signal and overwhelm the
00:25:51
SOC teams.
00:25:51
And, on the other hand, the attackers
00:25:55
that have gotten good enough in terms of flying below the radar
00:25:58
are actually not going to look weird enough in a system
00:26:01
like that.
00:26:02
So we became much more attuned to saying if I was an attacker,
00:26:07
what would I want to do?
00:26:08
And then we started organizing these techniques, and this was
00:26:12
before MITRE ATT&CK and all of that stuff was out there.
00:26:15
So we were effectively coming up with our own taxonomy to classify
00:26:19
the different attacker techniques and saying, well, I
00:26:22
want to find that technique, I want to find the tool associated
00:26:25
with that technique, I want to find the behavior itself,
00:26:29
so that whatever tools they're using,
00:26:34
whether it's living off the land or something
00:26:38
like that, at any given moment I'll see a manifestation of
00:26:40
it.
00:26:40
So we built this corpus of detections and ultimately kind
00:26:44
of built what was initially an NDR business, a network
00:26:47
detection and response business, with data science and AI/ML
00:26:52
as the underpinning for finding that signal. Things
00:26:54
then migrated and went in different directions in terms of
00:26:58
attack surfaces, right? So you basically start with detection
00:27:01
for the data centers and for the campus networks.
00:27:06
You know, people started moving a bunch of stuff into the cloud,
00:27:09
into SaaS applications, and so we just have to deal with that
00:27:12
reality.
00:27:12
What was an Exchange server and an on-prem AD server is now Entra
00:27:17
ID and, you know, Microsoft 365, but the problem is
00:27:22
still the problem.
00:27:23
In fact, the problem has increased, in the sense that your
00:27:26
identity, which was under lock and key, is now federated on the
00:27:29
internet.
00:27:29
All of your tools and collaboration tools are kind of
00:27:32
set up to be internet-ready and outside the firewall, and so lots
00:27:37
more attacks can occur. So inevitably we follow our North
00:27:41
Star, which is: you want to find attacker signal wherever it
00:27:45
resides, so we follow the data.
00:27:47
So we started acquiring logs as of like four or five
00:27:50
years ago.
00:27:50
We pull more and more logs into the system, but again,
00:27:54
unlike a SIEM, which has very little opinion about those logs,
00:27:58
we're very careful about how we pull those in, how we
00:28:00
represent them, how we stitch them, how we aggregate them in
00:28:04
service of finding that attacker signal and effectively
00:28:07
prioritizing it.
00:28:07
But that's kind of the breadth of the mission now.
00:28:10
So, whether it's on-prem or public clouds, and even within public
00:28:13
clouds: is your network being attacked or is your control
00:28:16
plane being attacked?
00:28:17
Those are two different attack vectors.
00:28:19
You've got identity out there, federated.
00:28:22
You've got Microsoft 365 and all your assets and Teams and
00:28:27
other things out there.
00:28:28
You've sloughed off much of your traditional network, and now
00:28:34
you keep trying to reinvent your campus
00:28:36
network with this flashy new thing.
00:28:37
You have a Zscaler or a Netskope or a Palo, right?
00:28:41
So you kind of follow the modern approaches that
00:28:44
everybody else has created.
00:28:46
And yet you want that visibility across all these
00:28:48
distributed attack surfaces, and this is where it becomes kind of
00:28:52
XDR-ish, which is the term of art nowadays. It's like, well,
00:28:56
we're covering these different attack surfaces: your public
00:28:59
cloud, your data-center networks, your identity, the apps
00:29:06
that you have that are the ones that you care about. And at
00:29:10
customers' behest,
00:29:10
we've now started integrating EDR alerts in, which is kind of
00:29:15
the one other major attack surface that people tend to care
00:29:18
about.
00:29:18
So if we're pulling all these things together and we're
00:29:22
aggregating them and we're exposing them to you as a, hey,
00:29:25
this looks like one attack, go investigate it, then XDR is the
00:29:30
best term for that.
00:29:30
In this day and age, even though XDR is still being defined, I would
00:29:34
say that's ultimately the mission of XDR.
00:29:36
It's trying to be:
00:29:39
can we take all these different signals,
00:29:42
some of them native to us, some of them third-party,
00:29:44
stitch them together, produce great detections, and get them to
00:29:49
the R, which is response?
00:29:51
But it isn't trivial.
00:29:52
Because if you look at an account in this day and age:
00:29:57
on-prem, it's going to be a domain\account. In Azure
00:30:03
AD,
00:30:03
it will be your email address. Within Azure itself,
00:30:10
there are SIDs and RIDs and UPNs and other kinds of things.
00:30:14
They all refer to the same object.
00:30:16
So a SIEM's problem is that people dump all the log stuff
00:30:21
in, and the same principal gets called different things, and you
00:30:25
have detections related to that account in there as well.
00:30:28
Trying to stitch it all together
00:30:30
is a non-trivial problem.
00:30:32
This is where SIEMs have struggled, quite frankly, and
00:30:36
this is why I call it the DIY problem with SIEMs and SOAR
00:30:39
and all that. And so we have to do a lot of heavy lifting for
00:30:44
our customers and connect the dots, so that we can
00:30:48
prioritize things based on what we know and what they know as a
00:30:53
collection.
00:30:53
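The stitching problem described above, where one principal shows up as DOMAIN\account, as a UPN/email, and as a SID, can be illustrated with a toy normalizer. The formats handled and the SID lookup table are assumptions for illustration, not any vendor's real resolver:

```python
import re

def canonical_identity(raw, domain="corp.example.com"):
    """Map the many spellings of one account to a single canonical key.

    Handles three assumed formats: 'CORP\\jdoe' (on-prem AD),
    'jdoe@corp.example.com' (Entra ID UPN / email), and a SID resolved
    via a lookup table that a real system would populate from directory data.
    """
    sid_table = {"S-1-5-21-1111-2222-3333-1001": "jdoe"}  # assumed mapping
    if raw in sid_table:
        return f"{sid_table[raw]}@{domain}"
    m = re.match(r"^[^\\]+\\(?P<user>.+)$", raw)  # DOMAIN\user form
    if m:
        return f"{m.group('user').lower()}@{domain}"
    return raw.lower()  # already a UPN / email address

# Three aliases of the same person collapse to one entity key.
aliases = ["CORP\\JDoe", "jdoe@corp.example.com", "S-1-5-21-1111-2222-3333-1001"]
keys = {canonical_identity(a) for a in aliases}
```

Once every detection is keyed on the canonical identity, the per-entity aggregation and prioritization described earlier becomes possible.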
So AI is deeply intertwined with how we deal with identity.
00:30:58
When we talk about attacker behaviors, a lot of these models
00:31:00
are ML models and AI models.
00:31:01
Again, AI is a term of art, but pre-gen-AI, my preferred
00:31:08
description of AI was something that looked like it had some
00:31:12
pixie dust in it.
00:31:12
I mean, you couldn't easily understand it. And so if you went
00:31:16
back two or three years ago, this would be a lot of neural
00:31:19
nets and deep learning and stuff like that.
00:31:21
It was really at the forefront of what we might have called AI
00:31:23
in those days.
00:31:24
Obviously, in the last 18 months, gen AI has taken a lot
00:31:29
of the limelight.
00:31:30
Gen AI is generally not terribly good at finding attacker behavior.
00:31:34
It doesn't have that kind of reasoning
00:31:39
system, but it is useful.
00:31:41
It has a number of security-related uses,
00:31:46
which is, it can be useful for defenders. If you
00:31:52
look at things like Microsoft Security Copilot,
00:31:55
CrowdStrike's Charlotte and others, there's a notion of, hey,
00:31:59
can you lower the bar and make it a little easier for those
00:32:03
security analysts: basically give them this corpus of data
00:32:07
that we're exposing them to and have them just deal with it in
00:32:11
natural language?
00:32:12
That's an edge.
00:32:18
On the attacker side, clearly, there's any number of applications to effect
00:32:23
social engineering, be it writing something in the voice
00:32:28
of your boss, to actually recording a
00:32:32
voicemail in the voice of your boss,
00:32:37
to ultimately deepfaked videos and other kinds of things like that.
00:32:40
So there's a way, from a social engineering perspective, that
00:32:45
attackers are going to get use of this.
00:32:46
And then there's one other part that people haven't quite
00:32:51
internalized, which is that the LLM becomes an attack surface
00:32:55
of its own.
00:32:55
So if you think about WAF and AppSec as this problem whereby,
00:33:00
you know, nefarious people would attack my application stack and
00:33:05
figure out how to do SQL injection and script injection.
00:33:07
Well, now you can talk to an LLM.
00:33:10
Now there's an LLM in the
00:33:11
middle of that path.
00:33:13
And now you go from the degrees of freedom that we've
00:33:15
all had with an application with HTML forms to, hey, anything you
00:33:20
can express in the English language, in any form that you
00:33:23
might want, gets processed, right? And then there's this notion
00:33:27
that the direction of this world is to allow LLMs not just to
00:33:31
answer your questions but to act as agents,
00:33:37
conducting actions on your
00:33:39
behalf, but with more privileged identities.
00:33:42
That is going to get broken into.
00:33:44
And so I think there will be a little bit of a mea culpa over
00:33:48
the next two or three years as people keep rolling in more
00:33:52
LLM-enabled things without thinking a lot about it, because
00:33:56
if you're a CTO nowadays, you're being asked by your CEO:
00:33:59
hey, what are you doing with LLMs?
00:34:02
Well, take our sales
00:34:07
team, right?
00:34:08
So let's say you want to build a quote system for our sales team
00:34:12
so we can just talk to a chatbot, right?
00:34:15
Okay, so we create something where a salesperson can just go
00:34:18
up to a chatbot and say: I want a quote for these products and
00:34:22
for this hospital, and it should be similar to the quote that we
00:34:25
previously did for customer Z.
00:34:27
Speaker 1: It's like, okay, great.
00:34:29
Speaker 2: You go, okay, great: we launch this self-service quote
00:34:31
system and everybody's cheering, and so I think it's just a
00:34:37
little bit of, well, yeah, there's one twist here. That system will
00:34:41
have a lot of guardrails to understand you, but it's
00:34:45
self-guided.
00:34:45
Well, in the back end it will
00:34:50
have an agent that is privileged.
00:34:54
That agent has access to Salesforce and the financial
00:34:55
systems, because it has to work on behalf of any of
00:34:58
the sales guys that come to the front desk, not just this one
00:35:01
sales guy, right?
00:35:02
And it has to fully
00:35:06
understand the quote system, because it creates quotes
00:35:09
on behalf of any of the sales guys.
00:35:12
So now you're talking about an LLM connecting in a privileged
00:35:16
way to these systems in your environment. And I'd be like,
00:35:20
okay, now an attacker gets inside and steals the
00:35:24
credentials of that one sales guy, which is not a privileged
00:35:27
account, and so you say, well, how much damage can they do?
00:35:29
Well, by going through
00:35:34
this amorphous blob, getting over the guardrails or exploiting
00:35:37
some inconsistencies in how the agent is implemented,
00:35:41
they can dump the entire Salesforce database or the
00:35:44
entire quoting system and stuff like that.
00:35:46
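One common mitigation for the over-privileged agent scenario described here is to enforce per-caller authorization in the tool layer that sits between the LLM and the back-end systems, so a prompt that talks the model past its guardrails still cannot read other salespeople's data. A minimal sketch, with hypothetical names and a toy in-memory CRM standing in for Salesforce:

```python
# Hypothetical tool layer for an LLM quoting agent. The agent process holds
# broad CRM credentials, so authorization is enforced in code on every tool
# call rather than trusted to the model's guardrails. All names are illustrative.

CRM = {  # toy stand-in for the privileged back-end data
    "acct-hospital-a": {"owner": "alice", "quote": 120_000},
    "acct-customer-z": {"owner": "bob", "quote": 95_000},
}

def fetch_quote(caller: str, account_id: str) -> int:
    """Return quote data only if the authenticated caller owns the account.

    The check runs on the caller's identity, not on anything the LLM says,
    so a prompt-injected request still cannot dump records the caller is
    not entitled to see.
    """
    record = CRM.get(account_id)
    if record is None or record["owner"] != caller:
        raise PermissionError(f"{caller} may not read {account_id}")
    return record["quote"]
```

The design point is that the blast radius of one stolen, unprivileged credential stays bounded to that one user's data, instead of the agent's full privileges.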
So those are the issues that we're trying to look at. But
00:35:49
again, our perspective on this stuff is: okay,
00:35:51
if we believe those attacks will occur, where do we need to
00:35:55
insert ourselves to get the data necessary to see them?
00:35:58
And then what theories can we form as to how we would find
00:36:01
that attacker signal at high fidelity and relatively low
00:36:05
noise? Because otherwise the noise overwhelms
00:36:08
the entire system.
00:36:10
Speaker 1: There's a lot there.
00:36:13
It sounds like the solution is closer to a next-gen SIEM, XDR,
00:36:21
SOAR solution.
00:36:22
I mean, it's doing a lot.
00:36:25
And something that you mentioned earlier on is catching
00:36:30
the attackers as early as possible.
00:36:34
Um, you know, a lot of the time security technology is catching
00:36:38
attackers after the fact, right, or it's piecing things
00:36:41
together after the fact.
00:36:42
And now here you are: a breach happened, someone got into your
00:36:46
network, someone extracted data, whatever it might have been,
00:36:50
right, and you're trying to piece together how they did it.
00:36:52
You're trying to figure out what they even took, right? And
00:36:57
it's almost like the rest of the industry kind
00:37:01
of gave up, right, on trying to identify as soon as that bad
00:37:07
actor entered the environment.
00:37:09
Yeah, and I've seen solutions where, you know, they
00:37:14
base the identification of an attack off
00:37:20
of a user's interactive activity within the
00:37:25
environment, right.
00:37:25
So you know, I always kind of tell this story, right where we
00:37:31
had this really cool piece of technology, had it in the
00:37:34
environment.
00:37:34
It was looking at the activity of what engineers are doing,
00:37:38
where they access different resources, where from and all
00:37:41
that sort of stuff and if anything was out of the norm it
00:37:45
would alert you to a potential attacker, and it
00:37:49
was meant to not give you so many alerts.
00:37:52
That was like the big selling point.
00:37:54
And then I talked to one of the engineers, one of the system
00:37:57
engineers of our company, you know, one of the
00:38:01
bigger ones, one of the guys that had been there for, you
00:38:04
know, 20 years, that has access to things that you
00:38:08
probably wouldn't expect, right? And I brought up
00:38:13
that solution.
00:38:14
He goes, oh yeah, I have to log into,
00:38:14
you know, 25 different systems
00:38:17
once a month, just to make sure that I
00:38:23
maintain access, because, you know, there's always that one time
00:38:25
where you haven't logged in for a couple months and, you know, the
00:38:28
solution pings you when you're up at 2 am trying to
00:38:31
do work, right? And I'm just sitting here like, man.
00:38:34
This guy just defeated what was supposed to be the crux of our
00:38:39
security program.
00:38:39
He just completely defeated it.
00:38:41
Speaker 2: Yeah, you kind of think about, look, what do I have
00:38:43
to do for those systems to remain silent?
00:38:46
Well, I'm going to game it, effectively, right?
00:38:51
And you know, the way I'm going to kind of think about this is,
00:38:56
again, I think oftentimes people design security solutions
00:38:59
for like this idealized view of the world as to how the world
00:39:01
works.
00:39:01
Again, you know, the standard saying is like, oh, you know,
00:39:03
if you observe users for a week, a month, you
00:39:06
know, you can pretty much figure out their patterns and then you
00:39:08
will only trigger a handful of anomalies
00:39:13
a year. Okay, great, you've got 100 users and a
00:39:17
handful of anomalies a year each;
00:39:18
multiply that out and
00:39:22
it's not actually workable in real life. And so part of the
00:39:27
epiphany that we had was that if you're trying to perfect any
00:39:32
individual signal, one individual signal, and have it
00:39:35
be the end-all, be-all for detecting the threat, you're
00:39:39
going to invariably fail, because at that narrow scope,
00:39:44
trying to get that one signal perfect, you'll
00:39:46
draw a threshold and customer A will
00:39:50
say it's too noisy, and so
00:39:52
you increase the threshold so you have less noise,
00:39:54
and then customer B will say, why did this take so long?
00:39:56
And so you find yourself constantly going back
00:39:59
and forth dealing with this tension between noise and
00:40:02
coverage.
00:40:03
For us, I think, the answer was ultimately that you need a lot
00:40:07
of lower-fidelity signals connected in the system, and the job was
00:40:11
really to see, as quickly as
00:40:16
possible, how the accumulated evidence builds into something.
00:40:20
If you alert the moment any one weak signal fires,
00:40:39
before the evidence has distinguished itself sufficiently, you're going to have 2,000
00:40:43
alerts in front of you.
00:40:44
If I let this story play out for two or three hours, the ones that
00:40:52
are clearly not threatening effectively split off from the
00:40:56
ones that were more threatening, right? And now you have the
00:41:00
tractable problem of dealing with the ones that actually look
00:41:03
like they're progressing on a threatening path.
00:41:04
It means that you have delayed by one or two or three hours,
00:41:09
you know, trying to get on top of it.
00:41:11
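The accumulate-weak-signals idea Oliver describes, letting evidence build per entity and alerting only once it crosses a bar rather than alerting on every individual anomaly, can be sketched as follows; the threshold and signal strengths are illustrative assumptions:

```python
from collections import defaultdict

ALERT_THRESHOLD = 10  # assumed bar for "worth the SOC's time"

def alert_when_evidence_builds(events):
    """Accumulate weak, low-fidelity signals per entity and emit an
    alert only once the cumulative evidence crosses a threshold.

    `events` is a time-ordered list of (entity, signal_strength) pairs;
    most entities never cross the bar, so the SOC sees few alerts.
    """
    totals = defaultdict(int)
    alerted = []
    for entity, strength in events:
        totals[entity] += strength
        if totals[entity] >= ALERT_THRESHOLD and entity not in alerted:
            alerted.append(entity)
    return alerted

# One entity keeps accumulating evidence and crosses the bar;
# the others trip weak signals but never become alerts.
events = [("u1", 3), ("u2", 2), ("u1", 3), ("u3", 1), ("u1", 5), ("u2", 1)]
triggered = alert_when_evidence_builds(events)
```

The deliberate trade-off is the one he names: detection is delayed until enough corroborating evidence arrives, in exchange for far fewer false alarms.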
So this is, I think, for us, sort of the same
00:41:15
thing we try to give to our customers. The best
00:41:17
customers, the customers who are realistic about, you know, what
00:41:21
resilience means,
00:41:22
get it.
00:41:23
A lot of customers coming early in the journey from a kind of
00:41:26
preventive mindset, they're like what do you mean?
00:41:28
Something is in my environment.
00:41:29
We should try and detect it immediately.
00:41:31
I'm like, yeah, but it's not a tractable problem.
00:41:33
There might be a thousand things in your environment.
00:41:36
You can't take them all out without thereby kicking off
00:41:39
legitimate users, or sending the SOC a whole bunch of, you
00:41:43
know, signal to triage and deal with. Sometimes, you know,
00:41:49
it's better to just chill,
00:41:51
let them move around the environment, you know, keep an eye
00:41:54
on them, and see what they do next before we decide
00:41:57
whether they're benign or evil.
00:41:58
Speaker 1: This is a rabbit hole that I feel like we could go
00:42:03
down for another couple hours, right, I mean, that just means
00:42:07
that I definitely have to have you back on and we'll do a part
00:42:10
two of this.
00:42:11
But you know, oliver, we're unfortunately at the top of our
00:42:16
time here that we have.
00:42:18
But before I let you go, you know how about you?
00:42:20
Tell my audience where they could find you if they wanted to
00:42:23
reach out and where they could find Vectra AI if they wanted to
00:42:27
you know, learn more about the product.
00:42:29
Speaker 2: Yeah, sure. Basically, Vectra AI, easy enough: it's
00:42:32
Vectra, and the product.
00:42:35
Basically, I mean, you can reach out, kind of like,
00:42:39
info at Vectra AI.
00:42:40
Obviously, you know, we have teams in all the regions
00:42:45
across the world,
00:42:46
so people can kind of reach out to us through our website.
00:42:50
In general,
00:42:50
we try and do a good job of providing you not just the standard
00:42:53
marketing texture.
00:42:57
Right, you can go to our blog posts and other kinds of stuff
00:43:00
like that.
00:43:01
If you're trying to reach out to me directly,
00:43:03
you know, I'm one of the OGs, so you can just reach me
00:43:06
directly.
00:43:08
Speaker 1: Nice, awesome. Well, thanks, Oliver.
00:43:10
I really appreciate your time and thanks everyone for
00:43:13
listening.
00:43:13
I hope you really enjoyed this episode and we'll most likely,
00:43:17
you know, have Oliver back on and do a part two.