Brad's journey from sci-fi enthusiast to cybersecurity expert is an unconventional path filled with unexpected twists and valuable insights. Hear firsthand how his initial pursuit of engineering took a dramatic turn following 9/11, leading him to the military and into the Signal Corps, where his foundation in cybersecurity was forged. Discover how his experiences at SecureWorks highlight his dedication to diversifying the cybersecurity workforce by recruiting and training talent from varied backgrounds, making this field accessible to all with a passion for tech and a willingness to learn.
Step into the high-stakes environment of cybersecurity as Brad shares gripping tales from mission deployments where every second counts. Feel the adrenaline of operating in high-pressure situations and the critical role certifications play in carving out a successful career in this field. Brad sheds light on the diverse backgrounds of cybersecurity professionals, illustrating how police officers and others transitioned into this field, proving that aptitude and determination often outweigh traditional education in achieving success.
In the face of rapid AI integration, organizations encounter new hurdles with shadow IT and unsanctioned applications. Explore the intricate landscape of AI security threats and the pressing need for secure implementation, as Brad outlines the challenges posed by AI's rise. With over 92% of organizations running unsanctioned apps in their environments, the urgency for robust security measures is palpable. Concluding with ways to connect with Brad and Morphisec, this episode promises a treasure trove of insights and a peek into future conversations on emerging AI threats.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Brad?
00:00:02
It's great to get you on the podcast.
00:00:04
We've been working towards this thing for a while and we've had
00:00:09
some hiccups along the way, of course, but I'm glad you're here.
00:00:11
Speaker 2: Yeah, it's an honor and a pleasure.
00:00:14
I appreciate it.
00:00:15
Yeah, absolutely.
00:00:16
Speaker 1: So, Brad, why don't you tell my audience how you got
00:00:21
into IT, what made you want to get into security, into the
00:00:24
security space?
00:00:25
Right, and I start everyone off there is it kind of fills you
00:00:29
know two purposes, right?
00:00:31
I have a lot of CISOs, directors, managers that listen
00:00:35
to this, a lot of experienced professionals, right.
00:00:37
So it kind of opens up who Brad is to them.
00:00:41
And then it also shows my younger audience that is trying
00:00:45
to get into IT, maybe trying to get into security.
00:00:47
It shows them your background, right.
00:00:50
So they can maybe match that up and say, hey, if he did it, I'm
00:00:54
coming from a similar background, maybe I can do it
00:00:56
too, right?
00:00:57
Speaker 2: I will kind of Tarantino this and say, you know
00:01:00
, start with the ending first and then we'll go back to the
00:01:03
beginning.
00:01:03
So I will say that if I can do it, anyone can do it.
00:01:06
I did not originally start on the IT path by any means.
00:01:09
I always loved sci-fi, I always loved technology.
00:01:12
I was always a tinkerer, like going back to when I was a kid,
00:01:17
you know, I had the GE clock radio that everyone had, the
00:01:21
brown one with like the little buttons, and I would basically
00:01:24
take it apart and put it back together again.
00:01:25
My dad would yell at me and be like what are you doing?
00:01:28
And I would overclock my Apple II and everything else.
00:01:32
I would say I was maybe an above-average tinkerer.
00:01:34
Took apart my first computer when I was a teenager,
00:01:39
just wanted to know about technology and was always just
00:01:41
super curious about everything and how the world operated.
00:01:45
But I did not know what I wanted to be when I grew up and
00:01:49
when I, you know, in the 90s, and all that when I was, you
00:01:52
know, going through doing, you know, applying to colleges and
00:01:56
trying to figure out okay, where , where do I go?
00:01:58
Do I go in the military?
00:01:59
Do I not go in the military?
00:02:01
What career do I pick?
00:02:02
Um, what do I do?
00:02:03
Ultimately, my dad convinced me to go into college and then I
00:02:07
ended up, uh, starting in engineering because I wanted to
00:02:12
do math things and do engineering, hated it, switched
00:02:15
to business, ended up going into the military because of 9-11
00:02:20
and ended up getting a military scholarship, went in the Army.
00:02:23
And as for what they assigned me —
00:02:24
I originally wanted to be a pilot, took three flight
00:02:27
physicals and failed the third for vision, and then got
00:02:31
assigned the Signal Corps, which is basically where
00:02:34
cybersecurity now sits, and IT and all that, and got thrown
00:02:38
into it.
00:02:38
And then, as a young 22-year-old lieutenant and
00:02:42
officer, I was in charge of basically one of the first, what
00:02:46
was known as a quick reaction force, or basically a rapidly
00:02:50
deployable team to go around the world and basically respond to
00:02:53
natural disasters and dealt with the uh influenza scare.
00:02:59
We'll call it a scare now, but the influenza outbreak that
00:03:02
almost turned into a pandemic back in 2007.
00:03:05
I was there in Thailand during Cyclone Nargis, which was a
00:03:09
Category 4 storm that devastated the region and claimed hundreds of thousands of
00:03:13
lives during that time period.
00:03:16
While I was learning all that, there were no real books, I was
00:03:19
very limited in terms of certifications and training and
00:03:22
there weren't any real degrees out there to learn all this
00:03:26
stuff and we were responsible for securing our networks.
00:03:29
I mean, it was basically a mobile deployment center.
00:03:32
So if you know how the Federal Emergency Management Agency
00:03:35
works, or FEMA, or the Red Cross, they typically have their own
00:03:39
command centers.
00:03:39
I used to run those command centers and eventually I ended
00:03:42
up getting reassigned or an expanded role, not really
00:03:48
stopping what I was doing, but basically doing cybersecurity
00:03:50
for the military intelligence.
00:03:51
While doing all this — and that basically summed up the
00:03:55
first four, really six, years of my career in the Army —
00:03:59
and eventually moved on to what's now SecureWorks and built
00:04:04
out a lot of their different capabilities.
00:04:06
But I had no idea and really I just go on deployment after
00:04:09
deployment.
00:04:09
I always had a book with me and always had.
00:04:12
You know, google really wasn't a thing back then.
00:04:15
So whenever I can get my hands on you know, through other folks
00:04:19
and when I say books I mean like manuals, really boring long
00:04:22
Cisco manuals on like how routing works, and then you know
00:04:26
it's pretty limited in terms of you know what the army had to
00:04:29
offer in terms of training.
00:04:30
You know I could basically take a computer and put it back
00:04:34
together again and do some basic routing when I started, and
00:04:38
configure, like, Outlook.
00:04:39
That was about it.
00:04:41
And then, you know, basically fast forward over the past 20
00:04:44
years and, uh, 20, you know, going on 20 plus, and it's been
00:04:49
an interesting journey for sure, uh, and worked with a lot of
00:04:53
different others in terms of kind of building all that out,
00:04:56
um, at SecureWorks.
00:04:58
You know, it was kind of really interesting because, because
00:05:02
one of the things that we talked for the podcast was, like, you
00:05:05
know, kind of work with others, helping them get into the, the
00:05:08
space, and kind of what we did at SecureWorks was we were
00:05:11
trying — you know, I was the SOC manager, uh, and I was basically
00:05:14
working with my peers and trying to figure out like we
00:05:17
need to hire people and expand and grow.
00:05:21
SecureWorks, you know, was one of the leading MSSPs or
00:05:25
managed security service providers at the time and it was
00:05:28
really difficult, like we would find people outside of college
00:05:32
that didn't really have any experience.
00:05:34
I was hiring, you know — some of the people that I hired,
00:05:37
this was when I was in Rhode Island at the time.
00:05:39
I hired someone out of Florida that was working at a Publix as
00:05:42
a bagger, and they were basically, you know,
00:05:46
learning cybersecurity themselves.
00:05:47
I was like, if you're willing to relocate up here, we'll train
00:05:49
you, we'll teach you how to be an expert.
00:05:54
You know, they became one of the top threat researchers in
00:05:57
the industry and arguably one of the top 1%.
00:06:00
And then another was, you know, a general manager at Uno's who
00:06:05
just wanted a career change and saw the opportunity.
00:06:08
What a lot of people don't realize is you can actually make
00:06:10
20% more doing a similar job in cybersecurity than outside it —
00:06:15
whether you're doing marketing in cybersecurity or another applicable field.
00:06:21
You actually can make 10%, 15%, 20%, if not 30 percent more in
00:06:25
this field, just because it's in such high demand.
00:06:28
Speaker 1: So it's been an interesting journey.
00:06:30
Speaker 2: Certainly a lot more.
00:06:31
We could probably do a podcast just on the early days.
00:06:33
But um, yeah, that's kind of how I got thrown into it.
00:06:36
Speaker 1: It wasn't by choice, right? So tell me about
00:06:43
what that was like on that QRF team.
00:06:45
What was the workday like?
00:06:47
Right, Are you going from nine to five or are you going until
00:06:52
you hit a certain point and then you're studying, you're reading
00:06:56
a book, right to learn those different skills and whatnot.
00:06:59
What does that look like?
00:07:04
Speaker 2: Yeah, I don't think I slept.
00:07:06
The first four years I was in the military, the short version.
00:07:08
Luckily I did it when I was in my 20s and not my 30s and 40s,
00:07:10
but um, it was.
00:07:10
I was in charge of six different teams and basically
00:07:15
over 100 soldiers in terms of different, you know, billions of
00:07:18
dollars of equipment, and basically two teams were on
00:07:21
standby, two were on rest period and two were basically in
00:07:25
training or basically ramping up for the next cycle.
00:07:28
So at any given moment, two teams were ready to go, and we
00:07:34
basically were trained, like, whether it was boat, rail,
00:07:38
vehicle or air, like we would basically figure out how to.
00:07:43
From a logistics perspective, we have to be anywhere in the
00:07:45
world within 54 hours, and so you have to figure out I mean
00:07:49
literally measure it down to, you know, bringing printers and
00:07:53
measuring out all of the paper that you need toilet paper,
00:07:57
because you're going out into really remote, completely
00:07:59
disaster area.
00:08:00
You're not going to have power, you're not going to have clean
00:08:05
water — the internet is going to be what
00:08:11
you bring with you because we're deploying with satellites,
00:08:14
and so basically what it was was, um, uh, how the vehicles,
00:08:16
we, we had basically our trucks and on the back would have like
00:08:19
a shelter unit and in that you'd have, you know, pretty much a
00:08:22
data mobile.
00:08:23
It was a mobile data center is what it was.
00:08:25
And then we had basically different satellite systems that
00:08:29
would connect to different military networks and, if needed
00:08:32
, some of the deployments as I got into more of the top secret
00:08:37
levels — we used commercial bands as well, and we worked alongside
00:08:43
three-letter agencies, depending on the mission.
00:08:47
And you know, especially if they're an allied nation and
00:08:49
there's a natural disaster element, intelligence becomes
00:08:51
incredibly key, as well as cyber security elements, and people
00:08:55
don't realize that when there's a natural disaster that happens,
00:08:57
so part of it is being on standby. And until the
00:09:02
stuff hits the fan,
00:09:03
you know, we're kind of on easy street, so to
00:09:07
speak, in terms of you know, it's kind of a what we call
00:09:11
being in, like basically the garrison, and we would do like
00:09:14
our day-to-day training.
00:09:16
We'd do maintenance.
00:09:17
Everything had to be tip-top, like ready to go on the plane,
00:09:20
like that day, if needed. My bags were packed and ready to go,
00:09:23
like I'd leave it in my trunk, kind of scenario where,
00:09:27
like at any given moment, I had to be at the airport to go where
00:09:30
I needed to be.
00:09:30
Really, what it breaks down to is every free moment that you
00:09:34
get.
00:09:34
You gotta remember, like the internet wasn't what it is today
00:09:38
, like you're not sitting on YouTube.
00:09:41
YouTube didn't exist.
00:09:42
Google wasn't like mainstream, like people actually sat around
00:09:45
and like read books.
00:09:46
You know, there wasn't a phone in sight, we were just
00:09:48
living in the moment.
00:09:48
I mean, handhelds really didn't come out until the later
00:09:51
2000s, if that makes sense.
00:09:56
But you know, the first couple of years, you're dealing with,
00:09:57
you know, basically, uh, remote areas and, you know,
00:10:13
basically rubble.
00:10:13
You know, you don't want to bring your nice stuff — most of
00:10:16
the clothing and gear that I had got torn up. Um, yeah, it's
00:10:20
interesting to kind of see that. And you fit it in, like,
00:10:23
you're on the plane for, you know, six, like eight hours.
00:10:27
You know, I was stationed in Hawaii.
00:10:29
You know, just to get to Japan or to get to Thailand or get to
00:10:33
Korea — South Korea — or anywhere, or the Philippines, you know, it
00:10:37
takes 18 hours and people don't realize that.
00:10:39
Nine to twelve hours direct flight to Japan, and then it's another
00:10:44
nine to twelve to get to Southeast Asia, depending on
00:10:47
where you are.
00:10:48
And all that — sure, you can get on a plane — and we didn't always have
00:10:52
the ability to get on a military aircraft because, believe it or
00:10:56
not, despite us having a primary mission, we didn't have
00:11:00
dedicated aircraft and things were constantly being sent over
00:11:04
to Iraq and Afghanistan.
00:11:05
We were ramping up and supporting the two wars, so
00:11:11
sometimes it was there when we got there, and sometimes we got
00:11:14
there before the equipment did.
00:11:15
But at least we could be liaisons, we could be advisors
00:11:18
and advise folks.
00:11:21
Speaker 1: Yeah, yeah, I asked that because, uh, you know,
00:11:28
there's a lot of people, a couple friends of mine actually
00:11:30
right, they always say I'm so busy, I can't do it, I don't
00:11:33
have time and this and that right, um, and I'm just sitting
00:11:36
here thinking, well, you don't want it bad enough, you know.
00:11:39
I mean, like, plain and simple, like you just don't want it bad
00:11:41
enough.
00:11:42
Like, like you, you want, you want the money, right, you want
00:11:45
the flexibility, you want the ability to do something else,
00:11:48
right, but you don't, you don't actually want it right, because
00:11:52
if you actually wanted it, like you would figure out how to
00:11:55
optimize your sleep so that you're sleeping less hours and
00:12:00
that you're, you know, spending more time studying or doing
00:12:03
whatever you need to do to get this done right.
00:12:07
Like I can give you the roadmap, I can give you the plan, but at
00:12:10
the end of the day, if you're not going to do the work, if
00:12:13
you're going to keep on claiming you don't have time, you know
00:12:16
it's never going to happen, right?
00:12:17
And so, like it's always important, I think, to point out
00:12:27
, right when I have people like yourself on right that are very
00:12:28
much, you know, self-starters, that you, at some point in time,
00:12:31
you saw and you were like, hey, I want to get into this right.
00:12:34
Like I want to learn, I want to know as much as I possibly can
00:12:37
about this area.
00:12:38
You know, when you saw that, when you identified that, you
00:12:42
went for it.
00:12:42
You know, and a lot of people are missing that part.
00:12:46
Speaker 2: Yeah, and I have a lot to say about that.
00:12:49
So, like when people would say, oh, I didn't get a good night's
00:12:52
sleep last night, I'm like every mission that we had we
00:12:56
didn't sleep the first 72 hours and it's like you're on the
00:13:00
plane you're going.
00:13:01
We would sleep on the plane getting there because there's
00:13:03
only.
00:13:04
But even then, like sometimes we didn't because, depending on
00:13:06
the mission, we didn't have all the like.
00:13:08
We were reviewing the material on the plane, like like we would
00:13:13
get the maps, we would get our kit, and it'd be like, here are
00:13:16
the maps of the island.
00:13:17
That's what it looked like before the natural disaster.
00:13:19
You can see the river and all that's not there anymore.
00:13:22
But uh, yeah, at least figure it out when you get there yeah,
00:13:26
you figure it out when you get there, but like part of it's
00:13:28
like you're writing the operations order and the
00:13:31
execution plan and everything like on the plane we would have
00:13:34
our standard like packing list and like we would do like last
00:13:37
minute packing and and all that you know.
00:13:39
But you know, the thing is —
00:13:43
You have to be there in 54 hours.
00:13:45
So that means that like every second counts leading up to that
00:13:49
moment when you hit ground.
00:13:50
That's when the fun begins and then from there that's when the
00:13:54
like we kind of cut it off at 72 hours because you start to get
00:13:59
loose, you know.
00:13:59
Then you start to become a safety hazard, so and you get
00:14:03
micro sleep and all that.
00:14:04
But really it's you're pretty sleep deprived in the first
00:14:08
couple days and once you get the internet flowing and phones
00:14:11
going and people can operate, then you can kind of take a
00:14:14
breather.
00:14:15
But yeah, it really is typically something that that
00:14:18
that ends up happening.
00:14:19
I mean your adrenaline's pumping so hard because you know
00:14:22
it's life and limb. So, sparing the gory details, not everybody's
00:14:26
alive when you get there.
00:14:26
But it's that scenario I would say I have.
00:14:30
You know, a couple of people a month could reach out to me and
00:14:33
they're like you know, you helped me break into
00:14:35
cybersecurity.
00:14:35
So, like a little known fact, I've actually worked with three
00:14:41
different police officers over the past couple years to help
00:14:44
them transition, because there's been a lot of pressure on
00:14:47
police officers — people haven't been so nice to them over the past
00:14:50
couple years. A lot of good friends of mine,
00:14:52
They're like yeah, I gotta get out of here.
00:14:54
It's not safe.
00:14:55
Defund, the police is killing me, everyone hates me.
00:14:58
I want a purpose, you know, I want to make money for a living
00:15:01
and you know I'll give them the answers to the test.
00:15:03
And they're not.
00:15:04
You know, I'd say I love helping those types of individuals
00:15:07
because they you know they wouldn't because they have kind
00:15:10
of that, that military mindset or that warrior mindset in terms
00:15:13
of like, I'll do anything it takes to accomplish my mission.
00:15:17
Now I'd say nine times out of ten I talked to somebody in
00:15:20
college or you know, know, they're a younger demographic,
00:15:25
they're really in there.
00:15:25
You know people that like, hey, can you help out my son?
00:15:28
Or can you help out my friend?
00:15:29
All he does is play video games on his couch.
00:15:31
And it's like OK, well, I'll talk to him.
00:15:35
And it's like, hey, this is what you got to do to succeed,
00:15:38
and then, like they instantly get nauseous and don't do
00:15:41
anything.
00:15:41
Yeah, it's like you know, one of the things I like to do is
00:15:47
show people — uh, so there's this:
00:15:48
You just go to Google and type in, uh, security certification
00:15:52
roadmap.
00:15:52
Paul Jerimy has built this beautiful diagram in terms of
00:15:58
kind of basically how all the certifications relate, and there's
00:16:02
over 400 — it's a huge thing — yeah, 400 and something.
00:16:06
Certifications from beginner, intermediate to expert.
00:16:09
These are certifications and any one of these certifications
00:16:13
could be, at a minimum, 80 hours, if not 800 hours, of studying,
00:16:18
depending on the level of complexity, whether it's
00:16:20
hands-on or not and where you're starting from.
00:16:22
So it's, it's a major commitment and I show people
00:16:25
it's like, hey, this is Mount Everest, like you can get there
00:16:29
and most companies will pay.
00:16:31
You know, not for all of them, but you know — I never —
00:16:34
I maybe paid out of pocket once for one of my certifications.
00:16:37
I was able to get the military and, uh, you know, SecureWorks,
00:16:41
IBM, others to pay for certifications.
00:16:43
As I go along, you have different like grants and things
00:16:45
like that and the ways around it.
00:16:47
There's tons of, tons of free information and certifications
00:16:51
that you can get out there, a lot of non-profit stuff.
00:16:53
Cisco has their Academy, um, just stuff like Cybrary —
00:16:58
good stuff out there in terms of just tons of research.
00:17:01
But like I show them this and they're like, no, I can't.
00:17:05
And typically I start people and I say, hey, just look at
00:17:08
CompTIA and start with A+, go to Net+, go to Security+, you'll
00:17:13
have enough that you could start working at like a internet
00:17:16
service provider place and you can, you know, basically get
00:17:22
work at a help desk effectively and learn the ropes
00:17:24
and then get some experience and then, once you have one or two
00:17:28
years under your belt, then you go somewhere else and at
00:17:32
SecureWorks we would take people with zero experience and train
00:17:35
them.
00:17:35
We actually almost prefer it sometimes, because we get people
00:17:40
out of school sometimes and they have a master's degree in
00:17:43
computer science and they don't know the difference between TCP
00:17:46
and UDP. Like, you got to be kidding me — how is that?
00:17:53
You just literally wasted your entire, you know,
00:18:00
postgraduate education. Yeah, you spent thousands of
00:18:04
dollars at a top-tier school and you didn't learn a
00:18:07
damn thing, so it really comes down to aptitude and you know,
00:18:10
like I said, I hired, you know, people that worked at a
00:18:13
supermarket, that worked at a restaurant, that did different
00:18:16
walks of life, and just police officers.
00:18:19
you know some of those elements are translatable and
00:18:23
transferable, so you're not necessarily starting from
00:18:26
scratch.
00:18:27
You know it really comes down to the fortitude and really
00:18:31
rolling up your sleeves and putting the work in and it's
00:18:36
that perspiration that you have in terms of sweating it out and
00:18:42
and doing it and and I, whenever , when I was starting, I really
00:18:47
like fully immersed myself, like I was learning a language,
00:18:50
literally learning binary, uh, and and hexadecimal conversions
00:18:55
and all of that and and learning how to do the packet analysis
00:18:59
component, but really treating it like I'm fully immersed in
00:19:03
learning Spanish or French or any language, and that's really
00:19:07
the best way to learn anything.
00:19:08
And I would follow newsletters and podcasts.
00:19:14
I mean, podcasts weren't super readily available 20 years ago.
00:19:18
They really took off over the past five, 10 years.
00:19:20
But whatever, you know, whatever was available — SANS had
00:19:24
you know, their NewsBites feed.
00:19:26
You know, tons of it — after 2010,
00:19:29
things started to really come out in terms of, like, open source
00:19:33
and having a lot more information readily available
00:19:37
that you could get your hands on stuff.
00:19:39
Speaker 1: Hmm, yeah.
00:19:41
It's always interesting, you know, when someone comes to me
00:19:45
and, you know, says that they want to get into security, how
00:19:48
do they do it and whatnot, I actually take the approach of
00:19:50
trying to convince them to not get into it.
00:19:53
Right, Because if you're so easily swayed by what I'm going
00:19:56
to tell you?
00:19:57
You probably shouldn't, you know, waste your time getting
00:20:01
into it.
00:20:01
You know, like I was telling a buddy of
00:20:05
mine and he, you know, owns his own franchise of some business
00:20:07
and whatnot and he really enjoys it.
00:20:09
Right, and I was telling him how a group of developers I'm
00:20:14
talking about, you know, 150 developers at my day job were
00:20:18
trying to convince me to put in an exception that would
00:20:22
essentially bypass our entire WAF.
00:20:24
Right, they knew what they were doing, but they were wording it
00:20:28
in such a weird way that I was asking a lot of questions, right
00:20:32
, and so, because I didn't understand what they were asking
00:20:35
, I didn't understand the reason .
00:20:37
I didn't understand, like, any of what they were trying to say,
00:20:40
because they were literally trying to just bully me into
00:20:43
accepting it.
00:20:43
Right, and it was.
00:20:45
It was literally 30 minutes of me asking questions, right, and
00:20:50
then it finally got down to the, down to the answer where they
00:20:54
couldn't escape it, and they were like, oh yeah, we just want
00:20:57
to get around this whole thing and I'm sitting here like these
00:21:00
are adults.
00:21:01
Yeah, these are.
00:21:02
These are grown, grown adults.
00:21:04
We're paying them good amount of money, right, and they're
00:21:07
trying to get around our security tools that are
00:21:09
protecting them from themselves to some degree even, which is
00:21:13
which is frustrating right in and of itself, and having to
00:21:17
respond in a very I I'll say stern way, right, like respond
00:21:22
in a very stern way saying no, we're not doing it, you need to
00:21:26
suck it up.
00:21:27
I don't care how much time this is going to cost you or
00:21:29
anything else like that.
00:21:30
Like you need to figure out how to handle it.
00:21:32
And if you have a problem with it, I'll just bring it up to
00:21:34
your director, right, because that's who I got the approvals
00:21:37
from.
00:21:37
That's who told me you had the time.
00:21:40
It was your director, it was your VP, it wasn't anyone else,
00:21:43
it wasn't your manager.
00:21:45
My buddy, though, who's not even interested in getting into
00:21:47
security, he heard that story and he's like, yeah, I would
00:21:51
never.
00:21:52
I would never want to be in that industry.
00:21:55
I don't blame him.
00:21:57
It takes a certain kind of person to really look at that
00:22:03
and be like, yeah, but it's worth it, you know, and that's
00:22:08
who I want in the industry, you know, because there's a lot of
00:22:11
people that get burnt out.
00:22:12
You know, you got to be able to be mindful of your mental
00:22:16
health and whatnot, right, and maintain that properly.
00:22:20
But it's a great field that really anyone can get into if
00:22:23
they want to get into it.
00:22:24
It doesn't matter where you're at right now, like you mentioned
00:22:27
, right, with those police officers who's busier than
00:22:29
police officers, right?
00:22:31
I mean, like I tried to go into law enforcement out of college
00:22:35
and I'm very glad that that did not work out.
00:22:38
You know, I think I had, you know, no more than like two
00:22:42
brain cells at the time, right, and the riots started kicking
00:22:45
off and I think it was like St Louis or something, right, and I
00:22:49
saw that and I'm like I just feel like I don't want to die
00:22:53
doing a nine to five.
00:22:54
You know, like I just I feel like I need to find something
00:22:58
else I'm passionate about, right , like it just didn't add up to
00:23:01
me.
00:23:01
So I'm like, okay, let's go some other route, right, and
00:23:04
then it just escalated from there.
00:23:06
You know, it just got worse for them from there.
00:23:08
But like, who has a more rigorous schedule outside of the
00:23:13
military?
00:23:13
You know, these guys are driving around, they're
00:23:16
patrolling all day long.
00:23:17
They're putting their lives on the line all day long and then
00:23:19
you want them to come home, have family time, decompress and
00:23:23
study.
00:23:24
Speaker 2: I mean, that's a large ask, right? Yeah, it's
00:23:27
tough, it really is. And I mean, that's why I
00:23:31
left the military ultimately.
00:23:33
You know, it wasn't very safe and I knew that my body wasn't
00:23:38
going to be able to keep up with it and, uh, still dealing with
00:23:41
a lot of that and it's, yeah, it's tough, and and part of the
00:23:45
problem with transitioning was the decompression.
00:23:47
So, for all the uh, soon-to-be veterans or are currently
00:23:51
veterans of, you know, either first responders or military,
00:23:56
regardless of where you are in the world, you just have to be
00:23:58
conscious of that decompression
00:24:00
sickness, as I call it — when you go from being 100 miles
00:24:04
an hour to not so much.
00:24:06
That was our problem.
00:24:08
Being at that operational tempo for so long I forgot what normal
00:24:12
was.
00:24:13
I mean it's like drinking two pots of coffee every day down to
00:24:20
nothing overnight.
00:24:21
It's just like you go 100 miles an hour and just like, slam on
00:24:25
the brakes. And I started off as like a
00:24:29
shift supervisor, I went from basically being in charge of all
00:24:33
this stuff millions of dollars of equipment flying around the
00:24:36
world in charge of 116 people seeing pretty much everything
00:24:40
under the sun.
00:24:41
It's like, you know, a junior-level officer in the Army to a
00:24:45
first shift supervisor
00:24:46
in a security operations center at SecureWorks, in charge
00:24:50
of like six people.
00:24:51
They're like oh, this is different.
00:24:55
Speaker 1: Yeah.
00:24:57
Speaker 2: And then it's like you start looking around, it's
00:25:00
like, what can I fix around here ?
00:25:02
And then, before you know it, I ended up getting in charge of
00:25:05
like 30 people and then moving up and then eventually moved to
00:25:08
product management and then eventually ended up going more
00:25:14
and more up the ladder.
00:25:15
So eventually it ended up leapfrogging.
00:25:18
But it was definitely a hard transition, for sure, mentally
00:25:22
more than anything else. And it was — I was going, I don't know,
00:25:24
100 miles an hour. For anybody that's seen that movie American
00:25:27
Sniper:
00:25:28
when Kyle is sitting there and he's in the doctor's
00:25:32
office and kind of is, he's just sitting there and, like his,
00:25:36
his blood pressure's through the roof and it's like, you know,
00:25:38
that's, that's how you get hypertension, you get high blood
00:25:41
pressure, and, uh, you just get used to that, uh, that tempo
00:25:47
that's just going a million miles an hour. But yeah, it
00:25:52
really comes down to uh, dedication, dedication to your
00:25:55
craft, um. And as a hiring manager, those
00:25:59
are the things that I look for.
00:26:00
And anybody that's ever worked with me before in my interviews
00:26:05
I always used to ask a variation of this question, which was in
00:26:10
one minute or less, tell me 10 things that you can do with a
00:26:13
number two pencil.
00:26:13
And typically I wouldn't hire anybody that didn't try to get
00:26:22
all 10.
00:26:22
And that's the whole point of it.
00:26:26
I don't really care what you're going to use the pencil for.
00:26:28
It really breaks down to are you going to give up or not?
00:26:32
And I've had people give up at five.
00:26:34
I'm like you can't think of, you're just going to give up.
00:26:36
You have 30 seconds left, and so if you can't get through an
00:26:42
interview question for literally 60 seconds, then I think I
00:26:48
started off doing 2 minutes to really like just different pros
00:26:51
and cons to doing 1 minute versus 2 minutes.
00:26:53
I'll sit there and I'll just be quiet the whole time and just
00:26:57
like list off what they have.
00:26:58
It's just interesting the answers that people give.
00:27:01
It's just like an interesting question to ask.
00:27:04
It really shows you kind of the how people go about a problem
00:27:09
and how they they do it and you know the people that ended up
00:27:13
getting 10 out of 10.
00:27:14
Like you know, I've seen people rattle off 10 out of 10 in 30
00:27:17
seconds.
00:27:18
And those are the type of people you want to hire, because
00:27:23
you know when the stuff hits the fan that they're going to be
00:27:25
the ones that really push forward.
00:27:27
Because really it comes down to problem solving.
00:27:29
If you have a custom routing issue in your Cisco firewall or
00:27:36
whatever that you have custom rules that you've built.
00:27:40
You're not going to find that on Google.
00:27:41
You're going to find that in the Cisco manual.
00:27:44
You built it.
00:27:45
It's your bed.
00:27:47
Why are you in it?
00:27:48
So you're going to have to figure it out.
00:27:50
You put custom code on your machine.
00:27:53
Only you know that code.
00:27:57
It's like if you bypass the WAF and you built some custom tool
00:28:01
or whatever to do X, Y, Z — given the example that you have —
00:28:05
that's on you, that's shadow IT,
00:28:07
that's an unsanctioned app. Like, don't be calling the help
00:28:10
desk, because you jailbroke your laptop, your device, so you
00:28:14
could play some video game or watch sports on your work laptop
00:28:18
or do something naughty. Uh, that's on you, and that's like a systemic
00:28:26
problem right now across the board, um — I
00:28:27
would say arguably over 40, if not 50 percent of organizations
00:28:30
have this problem right now and it's it's probably even
00:28:33
extensively higher because of, uh, all the adoption of
00:28:38
artificial intelligence.
00:28:39
So, like you'll, you'll look up and just like it's so much like
00:28:45
what are the number one applications?
00:28:46
And like, it'll be like Microsoft and it'll be, you know
00:28:49
, Adobe, it'll be whatever, and it's like, no, it's actually AI.
00:28:53
It's just no one's talking about it.
00:28:55
They're using it, basically, as a competitive
00:28:58
edge against their own, um, team. You know, they'll be
00:29:03
assigned a project and they'll do it five times faster because
00:29:06
they used, uh, an AI tool to do it, but they're not going to
00:29:10
admit that they used an AI tool. And that's kind of
00:29:14
what's happening is, people are using these unsanctioned apps,
00:29:17
these web apps — all I have to do is open a Chrome browser and do
00:29:21
that and now we're getting to really scary stuff where you
00:29:24
have, um, you know, AI avatars that are, like, really hard to
00:29:29
discern, like, as it starts to feel real.
00:29:31
Speaker 1: Yeah, I actually know someone that very recently
00:29:35
had someone you know interview and and apply for their company
00:29:40
at an open role, right, and they passed the interview, they got
00:29:44
hired and someone else basically showed up and had you know
00:29:49
something.
00:29:49
They had something going on with their screen where it
00:29:52
looked like the person that had interviewed, but the answers
00:29:55
that they were giving were just like completely off the wall.
00:29:58
You know it was.
00:30:00
It was wild, Like they thought that they they must have had you
00:30:04
know some AI, you know, like listening in on the conversation
00:30:09
and telling them what to say and stuff like that.
00:30:11
It was pretty crazy.
00:30:13
I've heard of that but I've never talked to someone that has
00:30:19
actually gone through that and experienced it firsthand.
00:30:23
Speaker 2: AI is turning into a crazy world where everything now
00:30:28
has a significantly higher attack surface. Yeah, and it's
00:30:32
gone through the roof, and part of what's driving this forward
00:30:36
is, um, obviously the economy.
00:30:38
So the last two years have not been very forgiving.
00:30:42
We're really — we're definitely — in the post-COVID economy.
00:30:47
So we had basically the massive boom that occurred in 21, 22.
00:30:52
And then mid-22, we started to kind of hit a critical point.
00:30:58
We saw sales cycles take longer , purchasing decisions were
00:31:02
greatly reduced, budgets were reduced.
00:31:03
A lot of layoffs, a lot of trimming of the fat, a lot of
00:31:07
shift away from rapid hyper-growth towards better margins and
00:31:10
better profitability, a lot of hoarding of cash.
00:31:14
The introduction of OpenAI — you know, in Q1 of '23,
00:31:19
they really started to take off. And then, like, really the
00:31:38
only way to get like funding right now is you have to have an
00:31:40
AI story, and it's been that way for over two years now.
00:31:43
So hat tip to the investment community for being that
00:31:47
forward-thinking that.
00:31:48
You know, going back to '22, you had to have an AI story in
00:31:52
order to influence them.
00:31:53
So over 80, if not 90 percent, of all tech companies and
00:31:57
cybersecurity companies have to have it — it's part of the roadmap to
00:32:00
have AI embedded into their solution.
00:32:03
So it's like a board mandate and and then concurrently, from
00:32:07
an adopter perspective, over 80% of organizations are evaluating
00:32:11
it in some way, shape or form and it's a bell curve in terms
00:32:14
of that maturity level and what's driving a large portion
00:32:18
of it is sales and marketing, as well as operations.
00:32:20
It's a lot of low-hanging administrative tasks, but some
00:32:24
companies have gone all in and everything has to be AI-driven
00:32:28
or AI-assisted, depending on what they do for a living.
00:32:31
And part of the problem and Gartner's kind of devoted a
00:32:35
whole subdivision of studies.
00:32:38
So folks that don't know who Gartner is, it's a research
00:32:39
analyst firm.
00:32:41
They do market insights and they're the leading provider, so
00:32:45
they have this space where they track basically AI and the
00:32:50
trust behind it and the overall adoption and basically the
00:32:54
challenges that we're running into is over 80% of
00:32:57
organizations are evaluating this and unfortunately, like one
00:33:03
third of all implementations are resulting in a data breach
00:33:06
because of the expanded attack surface.
00:33:09
So we're rushing towards this goldmine.
00:33:10
It's kind of like the early days of the internet — there was
00:33:13
no protection and we're implementing things like this
00:33:18
and now we have scenarios where, like Anthropic, which is the
00:33:21
OpenAI competitor — their new bot that they have, Claude, controls
00:33:27
your entire computer.
00:33:28
Does that sound safe to you?
00:33:33
I don't know.
00:33:33
I don't even understand why that's even remotely a good idea.
00:33:39
I understand the use cases of it, but just because you could
00:33:44
doesn't mean that you
00:33:46
should.
00:33:47
And it was a similar issue with Microsoft and their AI Copilot.
00:33:51
When they launched it out, they were taking screenshots of your
00:33:53
desktop.
00:33:54
Uh, talk about an invasion of privacy and a violation of GDPR
00:33:58
and all the other privacy rules and protections.
00:34:01
Part of the problem is that because of this rush — this giant
00:34:04
gold rush and drive towards AI and efficiency and
00:34:09
profitability — you know, there's no forethought given
00:34:12
to the ramifications of this.
00:34:15
So it's resulting in data breaches.
00:34:17
It's resulting in, you know, stolen intellectual property,
00:34:21
data exfiltration.
00:34:22
It's resulted in a lot of these impacts.
00:34:25
You know, over one-third of these implementations have
00:34:30
resulted in that kind of issue.
00:34:32
And then that's compounded by the, the challenge of like it's
00:34:35
being forced upon you. In terms of — so, like, I'm an Adobe user
00:34:40
and like, all of a sudden, like, the AI assistant pops up. It's
00:34:44
like I didn't enable you, I do legal stuff.
00:34:48
I don't want that.
00:34:50
No — keep it separate.
00:34:52
No, I don't want to have the default opt-in — I want to
00:34:57
have the default opt-out.
00:34:59
And then there was another issue with LinkedIn, where they
00:35:02
were using all of the data for their AI mechanism and tool and
00:35:08
their learning capabilities.
00:35:09
Speaker 1: I don't want that.
00:35:10
Speaker 2: And then even NVIDIA was caught scraping YouTube data
00:35:13
and YouTube video for their AI.
00:35:15
It's like I don't want my own videos on YouTube getting
00:35:20
scraped and embedded in your NVIDIA AI system, sorry.
00:35:24
So there's no rules and regulations anymore.
00:35:27
And Gartner came out and said that basically over 92 percent
00:35:31
of, uh, organizations have some form of unsanctioned apps that
00:35:36
are running in their environment .
00:35:37
That's a very scary statistic, and it's not slowing
00:35:42
down and it's leading over into the, you know, basically being
00:35:46
used for social engineering.
00:35:48
It's being used for phishing and , um, you know it's it goes
00:35:52
right down to the layers of business email compromise, but
00:35:55
right down to uh, I mean, I don't know if you've ever used a
00:36:00
technology device in the past couple months, but, um,
00:36:03
automation is everywhere.
00:36:04
Like every day, I get like spam calls, I get spam, I get spam
00:36:09
email.
00:36:10
LinkedIn is now flooded with this crap.
00:36:13
It's like overwhelming amounts and it's like really clever,
00:36:17
every phishing email is now a spear phishing email.
00:36:19
It's like, is this real or not?
00:36:20
And it's like, and it's really difficult, and part of what I do
00:36:25
on the side is, you know, part-time work as an expert witness.
00:36:29
So if you've ever watched the movie My Cousin Vinny, it's
00:36:33
like you get on the stand and basically you evaluate and
00:36:37
provide an expert opinion. To show you how far I've come,
00:36:40
you know, I've actually had several cases with Fortune
00:36:44
500 companies and part of the cases that are coming up now is
00:36:49
um, fraud using AI, and basically I've listened to recordings and
00:36:55
listened to case files and things like that where, um, AI was
00:37:00
used to commit the fraud.
00:37:02
It's interesting.
00:37:03
So these tools have the ability to um, unfortunately, listen to
00:37:08
, like, this podcast or a YouTube video or any kind of
00:37:12
snippet of your voice and then create an avatar for it.
00:37:15
And then what they'll do is they'll call in a bank, bank of
00:37:18
whatever, and basically they'll get your information from the
00:37:21
dark web or whatever, piecemeal it together so they have your
00:37:24
banking information or whatever enough information for them to
00:37:27
call in and socially engineer the, uh, end user and mimic your
00:37:33
voice and then, like I've actually listened to the
00:37:35
recordings and the, the defendant or plaintiff in this
00:37:40
case, if they're suing the bank or whatever, it'll actually say
00:37:43
on there like, like that, that's not my voice.
00:37:44
And then you know part of the problem is the detection of that
00:37:48
is not at the level it needs to be.
00:37:50
The technology creating it is so like 10, if not 100 times
00:37:56
more advanced than the detection mechanisms.
00:37:58
So what do you do to prevent against these things?
00:38:00
Because the technology is evolving so fast, but the
00:38:04
prevention mechanisms aren't there.
00:38:07
Speaker 1: Yeah, you said it pretty well, right, it seems
00:38:11
like everyone is racing towards this thing, but we don't know
00:38:16
the implications.
00:38:17
We don't know.
00:38:17
I mean, we kind of know the implications, right, but I feel
00:38:22
like AI is still in its infancy, right, and we're running into
00:38:26
all these issues, all these security problems, and I haven't
00:38:29
heard a really good
00:38:31
answer for them, right, like, the core question is how do I
00:38:36
take an AI model or an LLM, put it into my company's environment
00:38:41
and use it to assist my employees to do better work for
00:38:45
the environment?
00:38:45
But how do I do that in a secure way?
00:38:48
How do I ensure that, if they put you know private data into
00:38:53
it by accident or intentionally, that it's protected, right, how
00:38:58
do I ensure that no one else can query that data?
00:39:01
All these different things, right, we don't have any answers
00:39:04
for them right now, and all of the things that you and I just
00:39:08
mentioned are all like huge breaches.
00:39:11
Yeah, of these different regulations that you know,
00:39:15
companies pay — I mean, banks pay billions of
00:39:19
dollars, not to get around those regulations, but to make
00:39:24
sure that they are, you know
00:39:27
, abiding by those regulations, you know, to the fullest extent.
00:39:30
So now we're going into a situation where it's like, yeah,
00:39:34
that $2 billion that you just spent on security for your
00:39:37
organization, yeah, it's probably going to have to, you
00:39:40
know, five X, I mean that's that's insane.
00:39:44
Speaker 2: Yeah, it's absolutely insane.
00:39:45
Yeah, it's absolutely insane.
00:40:07
I mean, we're trying to — at Morphisec, this is why I love the
00:40:08
company — we're trying to at least mitigate a sliver of this
00:40:09
larger problem.
00:40:09
A large portion of it ends up landing on the endpoint.
00:40:10
So that's where we sit.
00:40:11
It's an agent-based solution, but we basically help with kind
00:40:12
of getting visibility down at the endpoint level.
00:40:13
And how we kind of go about it is we identify different
00:40:17
vulnerabilities in terms of that and the different controls.
00:40:22
We monitor and validate that security controls are working
00:40:25
properly.
00:40:25
We identify high-risk software.
00:40:27
So if you don't want this to run in your environment, you have
00:40:32
the ability to detect those mechanisms and then provide
00:40:34
mechanisms on who is doing that.
00:40:37
So if you're bypassing a WAF or you're bypassing a mechanism,
00:40:40
the endpoint's going to pick it up and then any kind of
00:40:44
misconfigurations that are there .
00:40:45
So those are the four major tenets and it's all on a
00:40:48
continuous basis.
00:40:49
It has to be continuous, can't be just like a routine scan, and
00:40:54
so we call that basically adaptive exposure management,
00:40:57
being able to basically have visibility with that.
00:40:59
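[Editor's note: as a generic illustration of what a continuous check for unsanctioned software can look like — a minimal sketch in Python, not Morphisec's implementation; the library choice, the process names, and the deny-list below are all hypothetical — something as simple as repeatedly polling the running process list against a policy captures the idea:]

    import time

    import psutil  # third-party library: pip install psutil

    # Hypothetical deny-list of software the organization has not sanctioned.
    DENY_LIST = {"shadow_llm_client.exe", "unapproved_tunnel", "jailbreak_helper"}

    def scan_once() -> list[str]:
        """Return the names of running processes that appear on the deny-list."""
        hits = []
        for proc in psutil.process_iter(["name"]):
            name = (proc.info["name"] or "").lower()
            if name in DENY_LIST:
                hits.append(name)
        return hits

    # Continuous monitoring rather than a one-off scan, echoing the point above.
    while True:
        for offender in scan_once():
            print(f"Unsanctioned software detected: {offender}")
        time.sleep(30)
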
So that's kind of like the outer ring of like what we call
00:41:04
basically a blast radius resiliency and currently, like
00:41:07
adaptive cyber resiliency, is another way to think about it.
00:41:10
And if you think of a blast radius, whether it's an
00:41:12
earthquake, so we're talking about natural disasters.
00:41:15
Basically, the more and more you can reduce that blast radius
00:41:18
so it's contained to a smaller circle, the more you can really
00:41:22
mitigate the impact that you have.
00:41:24
So it's just that endpoint, or that application or that
00:41:27
subsector that you're basically monitoring and implementing and
00:41:31
really you want to be able to prevent that infiltration from
00:41:35
happening.
00:41:36
So we have something called infiltration protection as well.
00:41:38
So we're looking at the different memory components,
00:41:42
we're looking at the privilege escalation, we're looking at
00:41:44
credential theft protection, hacking tool mechanisms.
00:41:48
So if you're using a certain AI tool that's been flagged as
00:41:52
malicious or known to create these different avatar
00:41:56
mechanisms, then being able to mitigate the different ways that
00:42:00
these attacks can get in, like prompt injection or executable
00:42:05
code execution or executable code for manipulation of memory
00:42:10
so we might not be able to stop the first kind of components.
00:42:14
But from the actual infiltration or impact
00:42:17
perspective so actually hitting that point and where it's
00:42:21
actually going after your crown jewels, it's going after the
00:42:24
intended target then we have the certain layers, and so the last
00:42:27
layer that we have is impact protection and mitigating
00:42:29
against that as well.
00:42:31
So it's kind of like a three-layer in-depth approach to
00:42:33
preventing these types of attacks, as well as things like
00:42:37
ransomware or preventing ransomware in advanced-level
00:42:40
attacks, regardless of what you have operating on your
00:42:44
environment.
00:42:45
So it's kind of like a standalone element in addition
00:42:48
to other endpoint security or application security type
00:42:52
elements that you have, and really what it breaks down to is
00:42:56
you have basically an AI war happening right now.
00:42:59
So all of the offensive tools and defensive tools are all AI
00:43:04
driven.
00:43:05
Now, well, that's a threat model versus a threat model.
00:43:08
So what happens when that cat and mouse game comes to a head
00:43:13
and you have to have that layer, that safety net, that
00:43:17
additional layer that protects against that, and that's kind of
00:43:20
like where we come in and the core value proposition that we have
00:43:23
within the vein of at least AI security.
00:43:28
Speaker 1: It's a different approach.
00:43:28
It's interesting, yeah, you know, when you put it like that
00:43:32
right, like we have defensive and offensive tools with
00:43:35
AIs that are going up against each other.
00:43:37
It's like a recipe for disaster .
00:43:41
Almost right it is.
00:43:43
Speaker 2: Well, yeah, and if you look at like I don't want to
00:43:46
pick on any leading vendors or anything like that, but
00:43:48
something pointed to the podcast, but they're all touting the
00:43:52
same narrative that you know we have the special the super-duper
00:43:56
AI that's coming in and we have all these algorithms and all
00:44:02
that and we're able to detect the things that no one else can
00:44:06
and all these other things. And it's like, okay —
00:44:08
00:44:09
What if, as a threat actor, I poison that algorithm and I'm
00:44:13
able to get in there and basically manipulate that and
00:44:15
change your signatures.
00:44:16
That's how.
00:44:17
That's why AV — traditional antivirus — is a dead and
00:44:21
obsolete solution: because threat actors figured out how to
00:44:25
change the signature types.
00:44:27
And one of my favorite stories is early on, I was doing some
00:44:31
forensic analysis and I noticed that a threat actor changed kind
00:44:36
of the mechanism of — just to simplify it — "this is spam, stop."
00:44:43
And they changed it to, like, "this isn't spam, this is ham."
00:44:47
It's something that's simple.
00:44:49
And then you modify the signature, because not everybody
00:44:52
listening here is super technical.
00:44:53
But you think of the way the rule set is written.
00:44:57
It's that simple.
00:45:00
If you change the algorithms and the main elements of that,
00:45:05
you're able to tamper it.
00:45:06
You're able to bypass it.
00:45:07
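[Editor's note: to make the mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of brittle keyword signature Brad is describing — not the actual rule set from his forensic case — and how a one-word change slips right past it:]

    import re

    # A naive, hypothetical signature: flag anything containing the marker phrase.
    SPAM_SIGNATURE = re.compile(r"this is spam", re.IGNORECASE)

    def naive_detector(message: str) -> bool:
        """Return True if the message matches the hard-coded signature."""
        return bool(SPAM_SIGNATURE.search(message))

    original = "this is spam - click here to claim your prize"
    modified = "this is ham - click here to claim your prize"  # attacker swaps one word

    print(naive_detector(original))  # True  -> caught by the signature
    print(naive_detector(modified))  # False -> same payload, signature bypassed
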
Every security tool now can be bypassed.
00:45:10
So what happens?
00:45:10
Where's that safety net?
00:45:12
And it has to be something that's not AI-driven, that has
00:45:16
that actual layer of protection that protects
00:45:19
against it, especially if you're going all in on one vendor,
00:45:22
because then you lose that heterogeneous diversification of
00:45:26
having multiple vendors.
00:45:27
You, you know, because you know , unfortunately, with some of
00:45:30
the bigger players, you have one AI, like yeah, there's subsets
00:45:34
of that, but I mean, it's really one mothership that's running
00:45:38
the whole scene by design.
00:45:40
So you know, if you poison that lake, then well, unfortunately
00:45:45
all the water's poisoned and we're just waiting for that
00:45:49
Armageddon day to happen.
00:45:51
I think the recent CrowdStrike event that happened in July is a
00:45:59
good testament. And just that was,
00:46:01
like, maybe 1% of what could have happened.
00:46:03
Take a vendor that's equally as big, make make the blast radius
00:46:08
that much bigger.
00:46:09
You know what if they'd gone after?
00:46:11
Uh, what if that wasn't?
00:46:13
What if that was a malicious attack?
00:46:15
What if they went after, like the core mothership?
00:46:18
How bad that would have been.
00:46:20
And then what happens if it was a SolarWinds-type event where
00:46:24
that went undetected for months, if not years.
00:46:26
Right, what the ramifications are?
00:46:29
Speaker 1: Yeah, we're going into uncharted territory, right
00:46:35
, but it'll be interesting, right.
00:46:37
It creates a lot of opportunity for people to separate
00:46:40
themselves and stand out from the rest of us that know a
00:46:44
little bit more AI, that have that specialty, that have that
00:46:47
knowledge that companies will be able to capitalize on because
00:46:51
they desperately need that skillset.
00:46:52
So it's an interesting time for
00:46:56
sure in IT overall.
00:46:58
Well, you know, Brad, we're unfortunately at
00:47:02
the end of our time here.
00:47:03
I'll have to have you back on.
00:47:05
It was a fascinating conversation.
00:47:07
I love how you know we went through everything.
00:47:10
I thought it was really interesting.
00:47:19
Speaker 1: But before I let you go, how about you tell my
00:47:21
audience you know where they can find you if they wanted to
00:47:23
reach out and connect, and where they can find Morphisec?
00:47:25
Speaker 2: Yeah, for me the best place is LinkedIn, so drop me a
00:47:26
connection, happy to connect and talk there.
00:47:27
And then for Morphisec, just reach out on our website — you
00:47:31
have a contact-us form on there, and we're happy to have a chat, teach you
00:47:36
more about our solution and learn more about us.
00:47:39
You can follow us on social media or on LinkedIn.
00:47:43
On Twitter, all the major sites and Twitter.
00:47:47
And now it's X.
00:47:52
Sorry, an X.
00:47:52
And then it's gonna be really hard for me to learn that.
00:47:55
But X and, um, LinkedIn — awesome.
00:47:56
So thanks for having me.
00:47:57
Yeah, I hope it's good to catch up.
00:48:00
Speaker 1: Yeah, yeah, absolutely, um, it'll.
00:48:03
It'll be interesting, I wonder, uh, we could probably do like a
00:48:06
part two about, like, emerging AIs and how to protect against
00:48:09
them and whatnot?
00:48:13
Speaker 2: Yeah, absolutely, I'd love to do a deep dive.
00:48:13
So pick a topic and I could talk about this stuff all
00:48:17
day long. I've come a long way in 20-something years now.
00:48:21
Speaker 1: Yeah, yeah, definitely, well, thanks
00:48:24
everyone.
00:48:24
I hope you enjoyed this episode .