Mick Leach is Field CISO of Abnormal Security, an AI-native email security company that uses behavioral AI to prevent business email compromise, vendor fraud, and other socially engineered attacks. At Abnormal, he is responsible for threat hunting and analysis, engages with customers, and is a featured speaker at global industry conferences and events. Previously, he led security operations organizations at Abnormal, Alliance Data, and Nationwide Insurance, and also spent more than 8 years serving in the US Army's famed Cavalry Regiments. A passionate information security practitioner, Mick holds 7 SANS/GIAC certifications, coupled with 20+ years of experience in the IT and security industries. When not digging through logs or discussing operational metrics, Mick can typically be found on a soccer field, coaching one of his 13 kids.
Abnormal Security: https://abnormalsecurity.com/unfiltered
Abnormal Security provides the leading behavioral AI-based email security platform
Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.
Affiliate Links:
NordVPN: https://go.nordvpn.net/aff_c?offer_id=15&aff_id=87753&url_id=902
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, everyone?
00:00:01
This is another Security Unfiltered Podcast episode where
00:00:07
today we actually talk with Mick Leach from Abnormal
00:00:11
Security.
00:00:11
Abnormal Security actually sponsored this podcast, and again,
00:00:17
you know, just to remind you guys, they didn't determine any
00:00:20
questions that I can ask them or anything like that.
00:00:23
You know, they just believe in what we're doing here at the
00:00:26
podcast and they wanted to support the podcast and so
00:00:29
that's how it all kind of happened, right?
00:00:31
So, you know, with that, let's go ahead and dive into the
00:00:36
episode.
00:00:36
I think you guys are going to love it. All right, see you guys.
00:00:38
How's it going,
00:00:42
Mick? It's really good to finally have you on the podcast.
00:00:45
You know, I think this one we actually put together pretty
00:00:49
quick.
00:00:49
You know, most of my guests, it takes about like six months to
00:00:55
come on.
00:00:56
Honestly, like, my backlog is insane, but we were able to
00:01:01
put this thing together pretty quick and I really appreciate
00:01:04
that.
00:01:05
Speaker 2: Certainly, yeah. Thanks for having me, Joe.
00:01:06
I really appreciate it.
00:01:07
It's great to be on the podcast.
00:01:09
Speaker 1: Yeah, absolutely. Well, Mick, you know, before we dive
00:01:13
into,
00:01:13
you know, Abnormal Security and everything,
00:01:16
why don't you tell my audience, you know, your background, right,
00:01:21
how you got in IT?
00:01:22
Maybe why you got in IT?
00:01:25
The reason why I have all my guests start there is because
00:01:31
you know, everyone's coming at this from a different background.
00:01:33
They're all coming at it from, you know, different skill sets
00:01:37
and whatnot.
00:01:38
And I feel like, if they can hear, you know, a matching
00:01:44
background, a matching skill set, they can then know, like, hey,
00:01:48
this thing is possible, right?
00:01:49
Like, I didn't know that this was possible at the time, but
00:01:52
now I think I can actually do it.
00:01:54
So what's your background with IT?
00:01:56
Speaker 2: Yeah, well, hopefully my story is a little bit inspiring in
00:01:59
that case, because I did take sort of the scenic route to
00:02:02
cybersecurity.
00:02:03
So I joined the military, was in the Army, the US Army. Go Cav
00:02:08
Scouts, so you're familiar with that.
00:02:10
Was a cav scout for eight and a half years with the US
00:02:14
Army, and as I was getting ready to get out, it became time to
00:02:19
start figuring out what next looks like.
00:02:21
And had been interested in computers for a long time,
00:02:25
really enjoyed working on them. In my unit,
00:02:30
I was early on the only guy who knew how to type.
00:02:32
I'd taken a typing class in high school.
00:02:35
I failed that class, by the way, was terrible at it, but still
00:02:38
was the only guy that knew where the keys roughly were and got
00:02:42
suckered into typing all kinds of noncommissioned officer
00:02:45
evaluation reports, these kinds of things.
00:02:47
And so suddenly I became known as the computer guy in a combat
00:02:52
arms unit.
00:02:52
So it was,
00:02:54
it was a little unusual there. So I got out and ended up having
00:02:58
an opportunity right away to work with Linux.
00:03:00
And so I was supporting custom-based Linux applications at a
00:03:04
small telecommunications company, and did that for about four
00:03:08
years before an opportunity to join Nationwide Insurance came
00:03:11
along as a system administrator there and did a variety of
00:03:17
different system administration things, really getting into
00:03:21
encryption decryption of data and motion.
00:03:24
So there was PGP, sftp, fts really encrypting and protecting
00:03:30
data in motion and that was kind of my first foray into
00:03:34
cybersecurity, so fast forward to about 2012.
00:03:38
And they were looking to start creating a security operations
00:03:42
organization, and I was given the opportunity to join that
00:03:46
group from the ground level and jumped at that chance and
00:03:52
realized this is what I was made for, you know, being able to
00:03:57
protect with ones and zeros digitally felt so in line with
00:04:02
my background in the military, and it really just felt like
00:04:05
this convergence of everything I had been doing and loving for
00:04:09
the last, you know, basically my whole life, and so I jumped at
00:04:13
that chance.
00:04:14
Of course, Nationwide at that time experienced a relatively
00:04:18
public breach.
00:04:19
What that meant for me, and it happened to be
00:04:23
my first week on call, as these things go, what that meant was
00:04:29
they opened the wallet and said, how do we fix this?
00:04:32
We hired a lot of consultants that came in and helped us build
00:04:35
an elaborate security operations center that became the
00:04:39
Nationwide Security Command Center, and I was one of the founding
00:04:42
members of that, so learned a great deal about building and
00:04:46
running security operation centers, had the opportunity to
00:04:49
move from there, after that reached a steady state, to a
00:04:52
company called Alliance Data and did much the same thing for
00:04:56
them building and optimizing a security operation center there.
00:05:00
Did that for about four years and then knew that in my next role,
00:05:05
I wanted to make an impact at a higher level.
00:05:08
Right, I want to move the needle on the industry.
00:05:11
Protecting one company is valuable work.
00:05:13
Right, that's honorable work.
00:05:15
But I wanted to be able to make that impact broader.
00:05:19
And so that's when I knew the next move would probably be with
00:05:23
a vendor.
00:05:24
And I had bought and used Abnormal Security at my last
00:05:28
company for about a year and loved it and added them to the
00:05:32
list of companies that I would love to work for.
00:05:34
And, sure enough, an opportunity came available.
00:05:38
So I've been here.
00:05:39
I was security hire number two at Abnormal Security, yeah.
00:05:44
Yeah, and so my CISO,
00:05:45
Mike Britton, is an old friend of mine, and so getting to come in
00:05:50
and build from scratch, really put your, you know, your
00:05:54
fingerprint on something from the beginning, was such a great
00:05:58
opportunity.
00:05:59
So did that for about the last two years and have just recently
00:06:02
about two weeks ago moved into a field CISO role where I get
00:06:06
opportunities to talk with folks like you.
00:06:11
Speaker 1: So you know I have a lot of questions about that
00:06:14
journey.
00:06:14
You know.
00:06:16
So, when you started out with Linux, you know what was that
00:06:21
like, because Linux is unlike anything else really, and I
00:06:26
assume, you know, you're talking about the terminal side of Linux,
00:06:29
not the fancy GUI side.
00:06:33
You know that everyone is used to.
00:06:35
What was your experience with that?
00:06:39
What was the ramp up like with that?
00:06:41
You know because I actually started my career with Linux and
00:06:46
I mean it's like drinking from the fire hose on steroids.
00:06:51
It's like wait a minute, the network stack works like this.
00:06:54
You know, and I'm just getting out of college, I'm like wait, I
00:06:58
just learned how to spell IT.
00:06:59
Like what are we?
00:07:00
What are we talking about here?
00:07:01
Speaker 2: Yeah, yeah, no, it's funny
00:07:03
you say that. I couldn't have put it better, right?
00:07:05
Drinking from a fire hose on steroids, that's absolutely what
00:07:08
it felt like.
00:07:08
I will tell you that on the tail end of my military career,
00:07:12
as I knew I was getting out, I had the opportunity to go to
00:07:16
a Solaris course.
00:07:17
So they sent me to a Solaris course and I'll never forget, I
00:07:22
felt like the biggest bonehead in the room because, you know, I
00:07:27
was half paying attention, thinking about when's lunch, and
00:07:31
suddenly it was time for an exercise, and the
00:07:36
first step was to install Solaris.
00:07:39
And so we're working on that, and I've got the CD in, and I
00:07:44
could not, for the life of me, I'm hitting the button, the
00:07:47
eject button, could not figure out how to get the CD out,
00:07:51
to change CDs for the next run, and I couldn't figure
00:07:57
it out.
00:07:57
I felt like a moron.
00:07:57
And so I was like, sorry, I'm going to have to raise my hand
00:08:02
and just ask how do you get it to eject?
00:08:04
And he's like, type eject.
00:08:05
I was like no, no, no, like really, he's like seriously.
00:08:09
So that was the first time that I remembered thinking, oh, this
00:08:13
is very different, you know, than the world that I had been
00:08:17
raised in.
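(A quick aside for anyone who has never seen it: the instructor wasn't kidding. On Solaris and Linux the optical drive is controlled by software, so ejecting really is a shell command. A minimal sketch; the device path is an assumption and varies by system.)

```bash
# eject whatever is in the default optical drive
eject

# on Linux, target a specific device, then pull the tray back in
eject /dev/cdrom
eject -t /dev/cdrom   # -t closes the tray on drives that support it
```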
00:08:18
At the same time as we started to get into the tech stack and
00:08:22
we started to see how the disk was partitioned and
00:08:26
data was stored, and you had so much more granular access to
00:08:31
where you were putting things on disk and the ability to go and
00:08:34
see things directly on the disk, I was like, man, this is
00:08:37
fantastic, you don't have any of this with Microsoft.
00:08:42
So that's how I kind of got into it. Learned just enough to
00:08:47
convince somebody to hire me, which was probably a mistake
00:08:51
initially, but they gave me a chance and I wanted to prove
00:08:55
them right, so did that.
00:08:57
That was 21st Century Communications.
00:08:59
So got in there and they were very patient and allowed me to
00:09:02
make a lot of mistakes and learn a few things the hard way and a
00:09:06
lot of things the easy way, thankfully.
00:09:10
Speaker 1: Yeah, it's, you know, it's crazy when you go from
00:09:14
something like Windows it's very user friendly, you know.
00:09:17
And then you go to Linux and you're I mean you have to like,
00:09:23
like you said, you didn't know how to eject the disk, like on
00:09:26
Windows.
00:09:27
You hit the button, you know you don't type eject or anything
00:09:30
like that, you're not clicking anything.
00:09:32
You hit the button, you assume it's going to work, but in Linux
00:09:35
none of those features and functionalities tie
00:09:40
together, right?
00:09:42
You have to actually create the tie.
00:09:44
If you want that button to eject, you need to write the
00:09:49
script that says eject
00:09:51
when this is pushed, you know, like that's what it is, and you
00:09:56
know.
00:09:56
Same thing with,
00:09:57
you know, partitioning a disk.
00:10:00
I mean, oh my God, and I tried to do encryption at one point on
00:10:04
Linux.
00:10:05
I mean my brain was melted.
00:10:07
I actually needed my VP to come over because he was the only
00:10:12
person at the company that knew how to do the encryption and I
00:10:16
mean I had to have him just come over and just type and I just
00:10:19
took notes on the side, just like my brain was melted after
00:10:22
like two hours of that, troubleshooting it, not knowing,
00:10:26
you know, what the hell is going on, not looking at the right log
00:10:30
files, like I was just having a terrible time.
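(For anyone retracing that encryption headache today, the modern Linux path is LUKS via cryptsetup, and it's far friendlier than it used to be. A minimal sketch, assuming a spare partition at /dev/sdb1 that you're willing to wipe; the device name is an assumption.)

```bash
# format the spare partition as a LUKS container (prompts for a passphrase;
# this destroys whatever is on /dev/sdb1)
cryptsetup luksFormat /dev/sdb1

# unlock it under a mapper name and put a filesystem on it
cryptsetup open /dev/sdb1 cryptdata
mkfs.ext4 /dev/mapper/cryptdata

# mount it, use it, then tear it back down
mount /dev/mapper/cryptdata /mnt
umount /mnt
cryptsetup close cryptdata
```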
00:10:33
Speaker 2: Yeah, yeah, no, I agree.
00:10:35
And so, moving on, the good news was, right,
00:10:37
learning with Linux early on, especially with like old-school
00:10:41
Solaris, and then moving to like Red Hat, is something a little
00:10:45
better supported.
00:10:46
You know, you learned a lot of things the hard way, sort of the
00:10:49
old-school way, and while there are new, far more, you know,
00:10:53
newfangled ways of doing things,
00:10:55
you know, sed and awk can get you pretty far.
00:10:57
So I kind of lean on those things from time to time.
00:11:02
But yeah, learning from the beginning was great.
00:11:05
Speaker 1: Do you ever, do you ever work with Linux at all,
00:11:10
like on the side even?
00:11:11
You know, I find that,
00:11:14
you know, I can't stay away from it.
00:11:16
Honestly, you know I still have a VM that's forever, you know
00:11:21
installed with a Linux flavor
00:11:23
that I like, that I prefer, which, embarrassingly, it's
00:11:28
Ubuntu, because I like the ability to have that GUI and
00:11:32
also be able to do things in the terminal and feel like I'm
00:11:35
getting things done.
00:11:35
But I use that for very like select things you know, within
00:11:39
my home network.
00:11:41
Speaker 2: Sure, yeah, no, absolutely, it's still,
00:11:43
it's like an old pair of comfortable shoes, right?
00:11:46
It's still the most natural, most comfortable thing.
00:11:49
So I still do, just like you, in my home lab, have a couple
00:11:54
of Linux VMs.
00:11:56
I've got a couple of Ubuntu versions.
00:11:58
Of course, you can't be a security guy without having at
00:12:01
least one instance of Kali running all the time.
00:12:05
ADHD, which I really like,
00:12:06
the Active Defense Harbinger Distribution,
00:12:12
really good there.
00:12:13
So yeah, just a few different, Security Onion.
00:12:15
You know you've got to have a few different versions running
00:12:18
in the background, but it's still the most comfortable for
00:12:21
me, especially if I'm going to get into log analysis.
00:12:25
Yeah, just nothing beats a great grep, sed, awk, you know, the ability
00:12:31
to parse tons of data.
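(That grep/sed/awk trio still covers a surprising amount of log analysis. A minimal sketch against a Debian-style /var/log/auth.log; the path and field layout are assumptions that vary by distro, and the sed variant assumes GNU sed.)

```bash
# count failed SSH logins by source IP, busiest first
grep 'Failed password' /var/log/auth.log \
  | awk '{ for (i = 1; i <= NF; i++) if ($i == "from") print $(i+1) }' \
  | sort | uniq -c | sort -rn | head

# same log, sed edition: pull out the usernames being guessed (GNU sed)
grep 'Failed password' /var/log/auth.log \
  | sed -n 's/.*for \(invalid user \)\?\([^ ]*\) from.*/\2/p' \
  | sort | uniq -c | sort -rn | head
```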
00:12:35
Speaker 1: Yeah, yeah, absolutely.
00:12:37
Once I learned that it was like having a superpower in Linux.
00:12:40
It's like wait a minute, I don't have to sift through this
00:12:43
log and search for things and all that you know.
00:12:47
So when you, when you started at Nationwide, you said the
00:12:51
first week they got breached.
00:12:53
Speaker 2: So it was the first week of my move into security.
00:12:57
So I moved into a security focused role.
00:13:00
It was the first week that I was on call and I remember
00:13:04
joining the call and thinking, wow, this is an exciting call.
00:13:08
And then it got very exciting, and I realized I'm
00:13:13
in over my head, and so we had, you know, I had to call in for
00:13:17
some help.
00:13:17
But that being at the very front end of that and then
00:13:21
actually seeing a large company work through a pretty large
00:13:25
scale breach, that made an indelible impression
00:13:32
on me because it taught me so much about security, the legal
00:13:36
side of things.
00:13:37
You know, the way we protect information, the way we share
00:13:40
information, all right at the very beginning.
00:13:43
So it was like a crash course, and, you know, you can't pay
00:13:48
for that kind of experience.
00:13:50
Speaker 1: Did they?
00:13:51
Did they already have a security team stood up?
00:13:55
Speaker 2: Yeah, yeah.
00:13:55
So a lot of those.
00:13:56
And that's what, as we brought consultants in, that's what we
00:13:59
learned is that a lot of the right things were in place.
00:14:02
You know, it wasn't a lack of skills, it wasn't a lack of
00:14:05
people, a lack of tooling.
00:14:06
What it really was was a lack of coordination.
00:14:09
So we had a lot of disparate tooling, a lot of disparate
00:14:12
capabilities spread across a very large organization, and
00:14:16
what we learned is that we needed to better unite those
00:14:19
capabilities in under one house, right under one floor, so that
00:14:24
we could better communicate with one another.
00:14:26
And that made a world of difference.
00:14:28
You know, like any large organization, there were,
00:14:32
you know, territorial things where you're like, oh, that's,
00:14:35
that's my world, you got to stay out of that.
00:14:36
You know, you got to stay out of that area, that's mine, and
00:14:39
this is our world.
00:14:40
And now we don't touch servers, we only work in,
00:14:42
you know, endpoints, and you know.
00:14:44
So we had to break down some of those walls, deal with some of
00:14:47
those sort of past sins and then figure out how to better
00:14:53
collaborate going forward.
00:14:55
Speaker 1: Hmm, yeah, you mentioned, you know, the
00:15:00
checkbook opened up, which, you know, for a security person,
00:15:03
That's like what you've been waiting for.
00:15:06
And I went into a credit bureau right after a breach at one of
00:15:12
our competitors, and I mean, it was anything you want, like,
00:15:18
name your number, it literally doesn't matter.
00:15:21
We have a blank check from the CEO saying that we can do this.
00:15:25
You know, apparently, my manager, this is a
00:15:29
couple weeks before I got there, apparently, my manager actually,
00:15:35
you know, was at the bar across the street with a bunch
00:15:38
of the team, and he was drinking a beer, and he said, you know
00:15:42
what, you know, I'm gonna, we're going to deploy this thing,
00:15:45
we're going to buy these tools, like, screw it.
00:15:48
You know, the company is just going to figure it out.
00:15:50
And I mean, like that was for the company, that was the best
00:15:56
decision, because we desperately needed that technology in place
00:16:00
before. For the team around him, it was,
00:16:05
it was miserable, because not only were we trying to ensure
00:16:08
that we weren't going to get breached like our competitor, but now
00:16:12
we're, you know, trying to deploy these tools at the
00:16:16
quickest pace these vendors had ever seen.
00:16:18
You know, I talked with the vendor that I was in charge of
00:16:22
their solution and I asked him you know what's the quickest
00:16:27
deployment you've ever seen with this solution at any other
00:16:31
customer?
00:16:32
And he said, oh well, I'd have to,
00:16:34
you know, ask around and get back to you, right?
00:16:37
They said the quickest full deployment was 18 months.
00:16:41
I said, okay, you know what did we do it in?
00:16:45
He goes, you did it in six weeks.
00:16:47
Yeah, oh, I guess that was a little quick.
00:16:52
Speaker 2: Yeah, yeah, and I actually had a similar
00:16:55
experience, right, coming here to Abnormal Security, where we were
00:16:59
able to buy and build things very quickly, which was for me a
00:17:04
whole new world, because in a massive Fortune 500, Fortune 100
00:17:08
company, you know, things move slowly and there's lots of red
00:17:12
tape, and so, you know, to deploy a new SIEM would take 18 to 24
00:17:17
months. You know, here, you know, three weeks, because everybody
00:17:22
is pulling in the same direction.
00:17:23
It was a massive difference.
00:17:25
So that's exciting, I will say.
00:17:27
Part of the problem with the blank check, though, was that,
00:17:31
because you know they did, they, the CEO, cfo, they all come
00:17:35
down and say, okay, just tell me , tell me how much to write this
00:17:39
check for us that we never have this happening.
00:17:41
And that's where you have to have our conversations say, guys
00:17:44
, it's, it's not how it works.
00:17:45
Right, we can build a lot of defenses, we can better protect
00:17:50
ourselves.
00:17:51
We can certainly lessen both the likelihood and the impact of
00:17:55
a compromise going forward, but to say it'll never happen again,
00:17:59
that's not something that we can do.
00:18:01
So what we'll do is better position ourselves to lower the
00:18:05
likelihood of it happening again, but in the event that it does,
00:18:09
and it likely will at some point, but in the event that it does,
00:18:13
we'll lower that impact as well.
00:18:15
So that's the key.
00:18:16
That's what we're trying to do.
00:18:17
It took a minute to kind of re-scope people's mentality on
00:18:23
this.
00:18:24
Speaker 1: Hmm, yeah, that's,
00:18:28
that's interesting, because I feel like, you know, executives at
00:18:33
that level, they're not used to not being able to
00:18:39
throw money at a problem and have that problem be fully
00:18:43
resolved.
00:18:43
You know, that's always kind of the question and the approach
00:18:50
that's taken.
00:18:50
But with security it's like, yeah, we could throw $20, $30
00:18:55
million into this thing, we could have the best tech stack,
00:18:58
the best you know engineers in the country working on this
00:19:03
thing, and we could still get breached by something that you
00:19:06
know we didn't even know existed.
00:19:08
You know, because that's what a zero day is.
00:19:11
You know, they could literally just use a zero
00:19:14
day on us and get around everything that we just did.
00:19:18
You know, and I feel like it's also really important, what
00:19:22
you did there is, you know, explain it that way, you know,
00:19:26
because a lot of security teams would just take that money and
00:19:31
you know, like run for the hills basically, and that's maybe the
00:19:34
worst case because what if you take all that money, you deploy
00:19:39
all these tools, do all that work, you get the increased
00:19:42
headcount and then you get breached.
00:19:44
Well, now your job is at risk because you didn't properly,
00:19:48
kind of prepare them.
00:19:49
You know for that reality.
00:19:51
And so now it looks like you're incompetent, when in all
00:19:54
actuality you're not incompetent, you know.
00:19:57
It's just how it works
00:19:57
in security.
00:20:01
Speaker 2: Yeah, yeah, absolutely. It takes a while to change the mindset, right?
00:20:03
That's the big, that's the key thing, I think.
00:20:08
Speaker 1: Yeah, that mindset is the hardest thing I feel to
00:20:12
change.
00:20:12
You know, right now, I actually, I work at a pretty large company,
00:20:16
and when you said, you know, yeah, it's going to take 18 to 24
00:20:21
months to deploy this SIEM, you know,
00:20:24
it took me like 12 months to get through a POC of a far
00:20:32
smaller solution than a SIEM.
00:20:34
It's like embarrassingly slow.
00:20:38
You know, like guys, I would have had this done in three
00:20:40
weeks, like, come on, just let me, let me do some work.
00:20:43
Speaker 2: Yeah, yeah, I know I'm with you, yeah.
00:20:45
And the other thing, in terms of, you know, avoiding future
00:20:51
compromises is that, no matter how good your tech stack is,
00:20:55
right, bad guys are always coming up with new and exciting
00:20:59
ways of circumventing them.
00:21:00
So you know, you look at some recent breaches that we've seen
00:21:03
hit the news.
00:21:05
You know they're not even targeting technical things
00:21:08
anymore.
00:21:09
They don't even use a zero day.
00:21:10
Often it's not nearly that complex or technical.
00:21:14
They just pick up the phone and call the help desk, right.
00:21:17
It's more social engineering that we're starting to see these
00:21:20
days, right, Because people are easier to hack than systems.
00:21:25
It's just that simple.
00:21:28
Speaker 1: Wasn't that the case with LastPass, the LastPass
00:21:32
breach, where they like called up support and the support guy
00:21:36
had enough access, and they just sent him a link, and he, you know,
00:21:41
he clicked on it, like he normally would, you know, and it
00:21:43
completely compromised LastPass as a whole.
00:21:46
Speaker 2: Yeah, yeah, or you look at some of the recent
00:21:48
things happening in the desert, right in Vegas, you know where
00:21:51
they just picked up,
00:21:52
picked up the phone, called the help desk, purported to be, you
00:21:56
know a high level security engineer and they reset his
00:22:02
password, they reset his MFA for him, and suddenly we've given
00:22:05
the keys to the kingdom to a bad actor.
00:22:08
Right, it wasn't enough,
00:22:10
they didn't just reset his creds, they also reset the MFA token so
00:22:14
that he could get directly in the right way.
00:22:17
Right, he didn't have to print anything.
00:22:19
There was no link,
00:22:21
there was no malicious activity of any
00:22:24
kind apart from social engineering.
00:22:27
And that's really what we're starting to see these days.
00:22:32
Speaker 1: Yeah, it's really interesting you bring that up.
00:22:34
You know, when earlier on in my career I did a lot of work with
00:22:40
the government and there was a slew of documentation and
00:22:47
background checks that I had to do, and I didn't even have a TS,
00:22:50
right? Like, I didn't even have Top Secret, I didn't even have
00:22:53
Secret, you know, I couldn't touch a keyboard, and I still had
00:22:56
to fill out like an 80-page document about,
00:22:59
you know, every place that I've been since I was born, who
00:23:05
I spoke to, like, yeah, all that stuff, you know. And one of the
00:23:13
things in there like a part of my training, I guess you know, I
00:23:18
was talking to my handler, and somehow we got into this
00:23:24
conversation.
00:23:24
It was pretty late at night so I don't remember quite exactly
00:23:29
what it was, but we were discussing how people get
00:23:32
compromised and he said, you know, from his own experience,
00:23:37
right, he, you know, he had a sick kid with cancer.
00:23:41
His credit card bills were very high and he was waiting on a
00:23:47
tax return to actually pay off the credit cards.
00:23:50
Because, you know, he's in government, he has a clearance,
00:23:52
he has to make sure that it's low.
00:23:53
But it wasn't anything that he did, you know, like he wasn't
00:23:58
buying cars, he was buying medicine for his kid, and
00:24:02
so the agency knew about it.
00:24:04
You know, they, they were very understanding of it.
00:24:07
But he said that you know, enemies will look at that and
00:24:12
say, oh, we could cut him a check for, you know, a small
00:24:16
amount of money, 20 grand, and it'll alleviate that debt, and
00:24:21
we'll do that
00:24:21
just for a name, right? He doesn't have to tell us anything
00:24:24
else.
00:24:25
For a name, they make it sound very small, very minute.
00:24:30
Like what would you ever do with a first and a last name?
00:24:34
Like I'm not even giving you the title, you know, like
00:24:36
nothing like that, and you know that's a good point, right.
00:24:41
Like I don't know if something like that took place, but like
00:24:45
these people out there, you know, that don't
00:24:50
like America or don't like your company or whatever it might be,
00:24:55
they will literally, you know, pay you tens of thousands of
00:24:58
dollars just for a name.
00:25:00
I mean, that's an absurd topic to me, right, to go
00:25:06
down, because to me that means nothing and that's such a minute
00:25:11
thing.
00:25:11
I would never, personally, if I was ever confronted with that
00:25:14
situation.
00:25:15
I mean, now, obviously you know I wouldn't make that choice,
00:25:18
because now I have that knowledge, but beforehand I
00:25:23
would never second-guess it.
00:25:25
You want to give me how much for a name?
00:25:26
Dude, I'll give you the roster.
00:25:28
You know, like that's where my mind would be.
00:25:33
Speaker 2: Sure, yeah, and the thing is, and while that
00:25:36
definitely happens, right, I mean, when I was in the military,
00:25:39
that was definitely something that we talked a lot about, you
00:25:42
know, is reporting those kinds of things.
00:25:44
We're well trained to expect and report those kinds of
00:25:49
interactions, regardless of how innocuous they seem.
00:25:53
You know?
00:25:54
You look at Tesla.
00:25:56
A few years ago, there was an external threat actor that
00:26:01
offered a Tesla engineer significant money to just simply
00:26:04
produce a single file of malware and let them, they'd take
00:26:09
it from there, and he thankfully reported it.
00:26:13
So it didn't become an issue.
00:26:15
But more, what we're seeing, at least in my experience, is bad
00:26:22
actors that are preying on the good nature of human beings.
00:26:26
You know, as security people, we don't trust anybody, right?
00:26:31
Speaker 1: I suspect everything.
00:26:33
Speaker 2: I'm paranoid, that's just.
00:26:35
You know, that's how we are built.
00:26:36
However, like Marcy, over in finance isn't built that way,
00:26:42
right, they're just people in service roles.
00:26:46
Their job is to help people.
00:26:47
If you work in finance, your job is to pay bills.
00:26:50
You know, if you work in HR, your job is to help people solve
00:26:53
problems.
00:26:54
And so when someone comes to you needing a bill paid or
00:26:58
needing a problem solved, they don't typically look very deeply
00:27:02
into those things.
00:27:03
They just don't suspect, you know, bad motives, and so they
00:27:09
respond to that call for help.
00:27:11
Right, you know?
00:27:12
I've talked to an FBI psychologist before who
00:27:18
talked about the power that comes with the request for help.
00:27:22
Right, I need your help.
00:27:24
Those four simple words right, I need your help.
00:27:26
Those are powerful words and they can elicit people to do
00:27:31
things they wouldn't otherwise do or that they might be more
00:27:35
suspicious of.
00:27:36
But because they feel like they're helping someone, they'll
00:27:39
do more, they'll go further and not be suspicious, not think
00:27:44
they're doing anything wrong and yet be the source of the
00:27:47
compromise.
00:27:51
Speaker 1: Yeah, that's a really good point that you mentioned.
00:27:54
You know, security professionals are some of the
00:27:56
most paranoid people that I know.
00:27:59
You know I feel like calling it paranoid is not doing it
00:28:05
justice, but you get what I'm saying, you know.
00:28:07
I'm sure all of our listeners, you know, understand that, which
00:28:13
you know, brings up a good point.
00:28:15
You know, currently, in my current role, right, we POCed a
00:28:19
bunch of different WAF solutions, chose one of them, and my first
00:28:24
response was, hey, I want to stand up a Kali Linux box and I
00:28:28
want to pound this thing when we deploy it, right as it's going
00:28:33
live.
00:28:33
As we're creating the rules, I want to make sure that these
00:28:36
rules that they claim are working,
00:28:38
I want to make sure that they're actually working.
00:28:41
And almost everyone in the room was like you know, you think
00:28:46
that they wouldn't, you know, already know that this is
00:28:49
working when they're deploying it.
00:28:50
Like we have the vendor that created it deploying it for us.
00:28:54
I'm like, yeah, that doesn't matter to me.
00:28:56
You know, if we're spending this amount of money, we need to
00:28:59
know definitively it's working.
00:29:01
Like not know, you know, oh, yeah, I'm sure it is.
00:29:05
You know, I configured it right.
00:29:07
Like, no, I ran into configuration issues in the POC
00:29:11
for a reason, you know, like I configured it to the best of my
00:29:15
knowledge and it wasn't working.
00:29:16
Yeah, you know.
00:29:18
So what does that say about, you know, this solution?
00:29:21
Like I have my own thoughts on it, but obviously I want to, you
00:29:25
know, trust but verify.
00:29:27
And I think the only other person in the room that agreed
00:29:29
with me was my CISO.
00:29:30
He's like, yeah, that's exactly why you're here.
00:29:33
Like we need someone thinking outside the box, because
00:29:37
everyone else in this room is just going to blindly trust this
00:29:40
vendor because they put, you know, millions of dollars into
00:29:43
this product.
00:29:44
Speaker 2: Yeah, I'll tell you, the military does a good job in
00:29:47
a few areas and one of them and I'll be honest, I didn't
00:29:49
understand the need for it then because it was not very pleasant.
00:29:52
But when you go through the gas chamber in basic training,
00:29:57
right, we have this CS gas.
00:29:58
It's a tear gas, very, like, you know, military-grade tear gas,
00:30:03
and to give you confidence in your mask, you have to go
00:30:09
through this gas chamber.
00:30:11
And so you'll go through.
00:30:13
It's very unpleasant.
00:30:14
They give you the mask.
00:30:16
They tell you the mask works.
00:30:18
I was initially comfortable trusting,
00:30:20
I just, I'll trust you.
00:30:22
I don't need to go through the gas chamber for you, you know,
00:30:26
to trust that the gas mask works.
00:30:27
But the reality is, for all of us to truly understand and
00:30:31
appreciate that it works,
00:30:32
you got to test it, and you got to test it in the worst way,
00:30:37
which is you go in there no mask, right, or you go in with the
00:30:41
mask, then you take the mask off, you breathe in, you choke, you
00:30:45
cry, right, and then you throw the mask on and realize this
00:30:49
works.
00:30:49
And so it was a lot about not only training how to use it, but
00:30:53
it was also learning to trust that it works.
00:30:55
I think that's an important lesson in terms of security
00:30:59
tooling today as well.
00:31:01
Right, we can trust the vendors so far, but I would encourage
00:31:07
you, and I tell that to my clients, right, our customers,
00:31:11
throw everything you got at it, right?
00:31:13
That's how you'll trust it and how we'll make sure that we're
00:31:17
meeting your needs.
00:31:18
So that's absolutely critical.
00:31:22
Speaker 1: Yeah, that's a huge thing, that trust but verify.
00:31:26
You know, when you bring up the gas chamber there, it makes me
00:31:34
remember, or recall, when people were saying that that's too
00:31:38
cruel.
00:31:38
Right, that's like too cruel for our soldiers to go through.
00:31:41
It's like, the very first time that they should experience CS
00:31:46
gas should not be on the battlefield.
00:31:47
They need to know one, they're not dying and two, they can get
00:31:53
through it because they did it before.
00:31:57
All of these things are absolutely critical.
00:31:59
It's the same thing with security or IT in general.
00:32:03
You have to build up that resistance.
00:32:06
You have to launch a cross-site scripting attack yourself to
00:32:13
understand what's actually going on.
00:32:14
When I understood what a cross-site scripting attack was,
00:32:18
it wasn't because I read it in a book, it was because I did it
00:32:23
and I saw, oh, wait a minute, I just made this query a little
00:32:28
bit weird and I got back three accounts when I should have got
00:32:32
back one.
00:32:32
Okay, now I kind of understand what's going on here.
00:32:36
I get the function of what's going on, which I think is an
00:32:43
interesting segue into Abnormal Security.
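(For the curious, that "slightly weird query, three accounts instead of one" effect is the classic SQL injection pattern, and it's easy to reproduce safely in a sandbox. A minimal sketch with sqlite3; the table and values are invented for illustration.)

```bash
# throwaway database with three accounts
sqlite3 /tmp/demo.db "CREATE TABLE accounts(username TEXT, secret TEXT);
INSERT INTO accounts VALUES ('alice','a1'),('bob','b2'),('carol','c3');"

# a well-formed lookup returns exactly one row
sqlite3 /tmp/demo.db "SELECT username FROM accounts WHERE username = 'alice';"

# the slightly weird input: an always-true clause smuggled into the query
INPUT="alice' OR '1'='1"
sqlite3 /tmp/demo.db "SELECT username FROM accounts WHERE username = '$INPUT';"
# all three accounts come back instead of one
```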
00:32:45
So let's start with what Abnormal is.
00:32:50
What's the problem that Abnormal Security is trying to
00:32:53
solve?
00:32:55
Speaker 2: Yeah, so Abnormal Security is an AI-native, an AI-native
00:32:57
email security solution that uses behavioral data
00:33:01
science to really baseline your environment and understand
00:33:06
deviations from norm. So, if you think,
00:33:09
if you go back a little bit in history and you think about the
00:33:12
move from antivirus to EDR, right. Instead of trying to
00:33:17
define what evil looks like and then find it based on what we
00:33:22
know it looks like, EDR flipped the tables.
00:33:26
It changed the game entirely and said what if we don't care
00:33:29
what evil looks like?
00:33:30
What if we just know your environment so well that when we
00:33:34
see a process run and spawn another process, right,
00:33:39
Microsoft Word probably shouldn't spawn another process?
00:33:43
That would be incredibly unusual, at least certainly 10
00:33:47
years ago, maybe a little less so today, but still there are
00:33:51
certain behaviors.
00:33:52
Regardless of how evil gets into an environment, it sort of
00:33:59
standardizes in what it needs to do next, and so that's how EDR
00:34:04
changed the game.
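(The parent/child process idea he's describing is easy to sketch. This is a toy bash illustration of the concept, not any vendor's detection logic; the flagged parent names are assumptions.)

```bash
# walk every process, look up the parent's command name, and flag children
# spawned by an office-style parent, which is rarely normal behavior
ps -eo pid=,ppid=,comm= | while read -r pid ppid child; do
  parent=$(ps -o comm= -p "$ppid" 2>/dev/null)
  case "$parent" in
    *word*|*excel*|*outlook*)
      echo "unusual: $parent (pid $ppid) spawned $child (pid $pid)" ;;
  esac
done
```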
00:34:04
Well, Abnormal came along, and our founders didn't come from
00:34:10
the email security space.
00:34:11
They actually come from AdTech, so advertising, and they had
00:34:16
learned a lot about machine learning and understanding
00:34:19
behavior through machine learning algorithms, large
00:34:23
language models, and so they were actually behavioral data
00:34:26
scientists and started to talk to other folks in the security
00:34:31
industry and said, what is a problem that isn't solved
00:34:34
well today? And one thing kept coming up: email.
00:34:39
You know it's 2023.
00:34:42
At the time it was 2018.
00:34:44
And we're like, look, email's been around since the dawn of
00:34:48
time in terms of the internet and networking.
00:34:51
Why are we still talking about email security?
00:34:55
And the reality is because nothing had really solved it
00:34:58
completely.
00:34:59
You know, you think about what we used to see: malicious links,
00:35:04
malicious attachments.
00:35:05
Those were the kind of the bread and butter of bad guys.
00:35:07
And the reality is today, you know, bad guys aren't even using
00:35:14
these tried and true methods.
00:35:15
We've trained our users in what worked then, but are now
00:35:20
the wrong ways to think. We trained them,
00:35:23
don't click on any links, and you'll be fine.
00:35:24
Don't open any attachments from somebody you don't recognize,
00:35:28
and you'll be fine.
00:35:29
Look for misspellings or bad grammar, and that's how you know
00:35:34
you found the bad guy.
00:35:35
Right, that's a phishing email.
00:35:36
Look, the reality is bad guys have departed from those tried
00:35:42
and true methods in favor of more advanced attacks.
00:35:47
Right, first, it started with Grammarly long before generative
00:35:50
AI was a thing.
00:35:52
Right, AI, ML, NLP, these things, they've all been around a
00:35:56
good long time, but the transition into AI that
00:36:01
generates net new content,
00:36:03
that is a relatively new concept.
00:36:05
Right, ChatGPT was released almost a year ago today, right,
00:36:10
November 30th, I think, 2022, kind of changed things.
00:36:14
But even before that, they were using Grammarly to improve, and
00:36:19
so.
00:36:20
But now, with the advent of ChatGPT and Bard and other
00:36:24
generative AI solutions, we're seeing threat actors that
00:36:29
couldn't formulate a coherent English sentence two weeks ago
00:36:34
can now craft a very good, realistic
00:36:39
phishing message, probably better than my 10th-grade
00:36:41
English teacher, Mrs. Fox, I mean, and that's saying something.
00:36:45
So that's what we're seeing is things are leveling.
00:36:49
Generative AI, AI in general, has leveled the playing field
00:36:54
for bad actors.
00:36:57
Speaker 1: Hmm, yeah, you bring up a very valid point,
00:37:02
is that email security really hadn't changed in, I mean, a
00:37:07
decade?
00:37:08
You know, like it's kind of the same exact thing, like oh, this
00:37:13
is how you train your users on email security.
00:37:15
These are the rules that you configure.
00:37:18
You know, like it's such an antiquated method it would never
00:37:23
keep up, you know, in modern day.
00:37:25
You know security, yeah.
00:37:28
Speaker 2: Yeah, and so, at my last job, you know, I
00:37:31
just had never seen a good solution to an email that simply
00:37:35
said, hey, Bill, it's Bob, give me a call when you get a minute,
00:37:37
because that's what we're seeing today.
00:37:40
Right, bad actors, it starts with a conversation.
00:37:43
Right, if they can, the social engineering attempts that we're
00:37:47
seeing today, really just seek to start a conversation and
00:37:51
carry it from there.
00:37:52
Business email compromise is the number one, you know, at
00:37:57
least in terms of financial impact, the number one security
00:38:00
threat for the last three or four years running, according to
00:38:03
the FBI's IC3 report.
00:38:05
So, you know, this is what we're seeing.
00:38:07
And I remember thinking at my last company, I
00:38:12
had all the right tools, had a wonderful tech stack, I'd spent
00:38:15
three years building with my leadership, flipped it over
00:38:19
entirely, had upper right quadrant stuff across the board,
00:38:23
it was the tech stack of my dreams.
00:38:26
And yet there were still things slipping through and so we
00:38:32
needed to think about things differently.
00:38:33
Right, you look at the way email security has traditionally
00:38:36
been: you set a secure email gateway on the perimeter of your
00:38:41
environment to protect you.
00:38:42
That's great.
00:38:44
It's largely looking for again defined evil.
00:38:48
Right, it knows malicious IPs, malicious URLs, malicious
00:38:52
attachments, but if those things aren't present, and if DMARC,
00:38:58
DKIM, SPF all check out, it's going to deliver that message
00:39:03
regardless of what it says, because it couldn't really look
00:39:05
into the body of the message.
00:39:07
And then I took a meeting with Abnormal's founder,
00:39:11
Sanjay.
00:39:12
He explained that Abnormal was fundamentally different.
00:39:15
Rather than trying to sit on the perimeter and guard things
00:39:20
and evaluate them as they pass, it would sit outside as a SaaS
00:39:25
solution.
00:39:26
It would sit outside your network entirely and make API
00:39:29
calls directly into your email tenant, evaluate every message,
00:39:34
including that east-west traffic that nothing
00:39:38
else could see.
00:39:39
Right, 70% of all email traffic is internal, and so SEGs are
00:39:44
blind to that.
00:39:44
And so, hearing all of this, I said, okay, that's great.
00:39:49
I mean, how long is this going to take to install?
00:39:52
This is going to take months.
00:39:53
You know, I was at a Fortune 500 bank.
00:39:56
I mean, I couldn't do anything in weeks, let alone months,
00:40:00
standing up the infrastructure.
00:40:01
It's going to be a nightmare.
00:40:02
And he said no, no, no, no, no, that's not how it works, it's
00:40:06
all going to just take minutes.
00:40:07
It takes three clicks and because it sits outside, you
00:40:11
don't have to change your mail flow, you don't have to make MX
00:40:15
record changes, right, all you do is give it the creds and
00:40:18
we're off and running.
00:40:19
And I said, okay, we'll see.
00:40:21
And sure enough we did.
00:40:23
We set it up as a POC and I got a report the next day and I'll
00:40:28
never forget looking at that report, thinking wait, wait,
00:40:32
wait, wait.
00:40:32
Let me just understand.
00:40:33
All this is slipping through my tech stack right now.
00:40:35
And he said, yeah, and listen, we want to call your
00:40:40
attention to one message in particular.
00:40:42
This one here.
00:40:43
This is your HR business partner corresponding in real
00:40:48
time right now with a threat actor.
00:40:50
And I went no, no, no, no, oh my gosh.
00:40:54
And sure enough it was.
00:40:56
That particular one was a direct deposit fraud case where they
00:41:01
were trying to convince our HR person that one of our internal
00:41:06
users was trying to change their direct deposit.
00:41:09
They were on vacation and, to be fair, the threat actor had
00:41:13
done their homework.
00:41:14
They used the world's greatest hacking tool, LinkedIn, found
00:41:19
someone with a, you know, big title and probably a lot of
00:41:22
money, went and cross-referenced that with Facebook, saw that
00:41:25
they were posting pictures from Cabo and realized this person's
00:41:29
on vacation.
00:41:30
They created a Gmail account
00:41:32
that was the user's first name, dot,
00:41:35
last name, at gmail.com. It was an unusual name and it looked
00:41:40
very legit.
00:41:41
And from that email sent a note to our HR business partner and
00:41:45
said, hey, I just, I'm on vacation.
00:41:47
That's why I'm emailing you from my Gmail account.
00:41:49
I don't have access to my corporate account right now.
00:41:51
I just realized we changed banks before we went on vacation.
00:41:56
But this is really important.
00:41:57
I need you know we're on vacation.
00:41:59
I need my next check to come to the right bank.
00:42:01
Can you take my direct deposit information?
00:42:04
And she was like, no, you didn't fill out the right form,
00:42:08
it's attached for your convenience. And wouldn't you
00:42:13
know it,
00:42:13
threat actor fills it out, probably better than any
00:42:17
employee ever would, and she was getting ready to make those
00:42:21
changes when we caught it.
00:42:23
So, as I say, it didn't take much more to convince me we
00:42:29
found the right one.
00:42:32
Speaker 1: So you know what I hear a lot of the times.
00:42:36
You know everyone's using Microsoft Office.
00:42:40
You know, O365, right, everyone kind of thinks that Microsoft
00:42:47
has this stuff.
00:42:48
You know kind of locked down that it's, you know, not going
00:42:51
to really get through.
00:42:52
You can get by with their default settings.
00:42:56
You know what I mean.
00:43:01
How do you break that mold?
00:43:03
You know?
00:43:03
Do you break it by showing them?
00:43:06
Because I'll tell you right now.
00:43:08
Actually, you know, we looked at Abnormal internally, right,
00:43:14
and it was that exact same mentality.
00:43:17
It was like we already have Microsoft.
00:43:18
Like what are they going to provide that Microsoft isn't?
00:43:21
And then when we talked to Microsoft and Microsoft referred us to
00:43:25
Abnormal Security, and we're like, oh, wait a minute, like
00:43:30
Microsoft provides similar things and they still referred
00:43:34
us to Abnormal, you know.
00:43:37
So, like, how do you break down those boundaries?
00:43:40
Speaker 2: Yeah, I think the key thing is to understand the
00:43:42
differences.
00:43:42
Right?
00:43:43
Microsoft, to be fair, very good, right?
00:43:45
If you've got an E5 license and you've got all of this spam
00:43:48
stuff turned on, they do a great job with defined
00:43:53
evil.
00:43:53
Right?
00:43:54
If they know that this is a known malicious IP address, a
00:43:58
known malicious sender URL, if you know they can look at how
00:44:04
recently the URL was stood up, they can do lots of things on
00:44:08
the front end.
00:44:10
However, they admit it, right, to their own,
00:44:13
you know, as they brought you guys, you know, they told you
00:44:16
guys to look at us.
00:44:17
You know they're admitting that there are still things that
00:44:21
they don't do well, and one of those is identifying malicious
00:44:25
activity without a link, an attachment, without any known
00:44:30
evil, and that's where we come in.
00:44:32
We're using those large language models, we're using
00:44:35
behavioral data science to baseline your entire environment.
00:44:38
So what I say, what I mean when I say that, is that, you know,
00:44:44
when, let's go back to that example from a moment ago, hey, Bill,
00:44:47
it's Bob, give me a call when you get a minute, right, the
00:44:51
things that we may know.
00:44:52
Having lived in your email environment for a while, we know
00:44:57
that Bill actually goes by William, right?
00:45:02
Bob knows that, and so for the first time ever, Bob calls him
00:45:08
Bill instead of William.
00:45:09
Even though they've been friends for a long time, they've
00:45:11
traded thousands of email messages, he's never called him
00:45:15
Bill.
00:45:15
Well, that's unusual, right.
00:45:18
And then it comes from a place that we just don't expect, right,
00:45:21
where the timing is off.
00:45:23
You know, there are certain things tonally, by using, by
00:45:28
evaluating the content of the message, using natural language
00:45:31
processing, large language models, what we can do is
00:45:35
understand, break, parse out the message itself, the message
00:45:39
body, and understand what's actually being said for the
00:45:42
first time, and so, because of that, we can understand tonal
00:45:46
changes.
00:45:46
This isn't how Joe normally sounds, you know, he doesn't,
00:45:50
he's not this formal in his messaging
00:45:53
normally. Some things, some things differ.
00:45:58
Speaker 1: Huh, yeah, I mean you just answered probably like my
00:46:01
next two questions, right, I was going to go back to that email
00:46:05
and you know ask how do you defend against it?
00:46:08
And to you know what's a large language model?
00:46:11
You know, because you always hear, you always hear these
00:46:15
terms with vendors and at some point, you know, as a security
00:46:19
professional, you kind of just gloss over it, right, you don't
00:46:21
even, you don't even look into it anymore.
00:46:23
But that makes a lot of sense as to why and how Abnormal
00:46:29
is able to kind of change the email security landscape,
00:46:32
because you're looking at the actual context of the email,
00:46:37
with the context being the other millions of emails that are
00:46:40
sent within the environment.
00:46:41
Speaker 2: Yeah, and not only that, but because we're plugged
00:46:43
in at that level.
00:46:44
Let's assume you're using a Microsoft 365 account for email.
00:46:48
You know we also support Gmail as well if you go that route.
00:46:52
so that's if you're using Google Workspace for mail. But let's
00:46:57
assume it's Microsoft 365.
00:46:58
Because of the way you plug in using the API, Microsoft's Graph
00:47:03
API, what you can then see also is all of AD.
00:47:08
So now I see everybody in your company.
00:47:11
I know what all of their titles are, I know what
00:47:14
the work groups are, I know who's the boss of who, and now
00:47:18
I get this rich, the richest understanding of who you are as
00:47:23
a person.
00:47:23
Now, when you couple that together with all of the past
00:47:28
emails I've ever seen you send or receive, I now have this
00:47:33
tremendously rich understanding about who you are and how you
00:47:37
communicate and who you do it with.
00:47:38
So suddenly you get a new message that purports to be from a
00:47:46
friend, right, or maybe it's a vendor that you do business with,
00:47:48
but something's amiss, right.
00:47:48
They call you by the wrong name.
00:47:50
It doesn't come from the right email address.
00:47:52
It doesn't come from the right IP address or URL.
00:47:55
Something's amiss. So,
00:48:00
with vendor email compromise, we're starting to see vendors
00:48:04
get compromised, their email accounts get compromised, threat
00:48:07
actors living in those email accounts for a while,
00:48:11
understanding who folks do business with and then targeting
00:48:15
and preying on those trusted relationships, from the right
00:48:18
message, right, from the right email account.
00:48:21
And yet we can still detect that based upon tonal changes, based
00:48:25
upon the history of how you talk, and we've
00:48:29
caught that before.
00:48:30
It's very interesting.
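(The integration he's describing rides on Microsoft Graph, which exposes both the directory and the mailboxes through one REST API. A minimal sketch of that kind of read-only call, assuming you've already obtained an OAuth token with the right scopes; the token and user id are placeholders.)

```bash
# a valid OAuth 2.0 access token goes here (placeholder)
TOKEN="<access-token-with-User.Read.All-and-Mail.Read>"

# directory side: who works here, and what are their titles?
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://graph.microsoft.com/v1.0/users?\$select=displayName,jobTitle,mail"

# mailbox side: recent messages for one user (user id is a placeholder)
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://graph.microsoft.com/v1.0/users/<user-id>/messages?\$top=10&\$select=subject,from"
```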
00:48:35
Speaker 1: Yeah, that's really, it's,
00:48:37
it's a fascinating way to look at it, you know, like, it,
00:48:41
it makes me wonder why no one else ever thought of that before.
00:48:45
But I mean, I guess that's a topic for another podcast.
00:48:48
You know, where do you see Abnormal going in this space
00:48:54
in the future?
00:48:55
Right, where do you see email security going and growing in
00:48:59
the future?
00:48:59
You know, if you would have asked me, you know, 10 years ago,
00:49:03
if this is where email security would have gone, you
00:49:07
know it wouldn't have been something that crossed my mind,
00:49:10
right?
00:49:10
I would say it's a good idea.
00:49:12
But I never would have said, you know, oh, yeah, like that's
00:49:16
where it's going, yeah, so where do you think that it's going?
00:49:19
Speaker 2: Yeah, I think the key is AI.
00:49:21
Right, AI is changing everything.
00:49:23
It's changing everything on both sides of the fence, right?
00:49:26
So in terms of threat actors, it's leveled the playing field.
00:49:30
It's also allowed them to scale attacks in a way we never
00:49:35
could have envisioned before.
00:49:36
And so I think what it takes and what you're going to see
00:49:40
you're starting to see it now, but I think it's going to
00:49:43
proliferate in a massive way soon, which is security solutions
00:49:48
beginning to use AI to combat, right, good AI to combat that
00:49:53
bad AI.
00:49:53
Because the reality is, as the threats scale up, our defenses
00:50:00
need to scale up as well,
00:50:01
because we're seeing so much more throughput in terms of
00:50:06
malicious activity. It's going to take different security
00:50:10
solutions that are leveraging machine learning, that
00:50:14
can parse through thousands and thousands, tens of thousands, of
00:50:19
signals to identify that thread of abnormal, to
00:50:26
chase that down and identify that and tell you, as an
00:50:29
operator, hey, I think we have something unusual here. And it's
00:50:35
really based on the volume of signals that we're seeing.
00:50:40
It's too much for a human to probably unite that way, and so
00:50:44
it's going to take artificial intelligence, through
00:50:47
machine learning algorithms, that can parse all that data
00:50:51
together, just like a SIEM can aggregate and correlate data
00:50:54
together from logs.
00:50:55
We're going to need it upstream, though, right, in our different
00:51:00
security solutions.
00:51:01
You're going to need it in your email security.
00:51:03
You're going to need it in your EDR.
00:51:04
You're going to need it in your firewalls, to be able to
00:51:08
parse through and understand,
00:51:10
unite all of the disparate information together to
00:51:14
understand what's going on.
00:51:17
Speaker 1: Is there?
00:51:17
Is there any thought around potentially creating, like a
00:51:21
verified user, uh, you know, logo or icon?
00:51:27
Um, you know, in an email so like, let's say, you know I'm
00:51:33
talking to a vendor, both of us are Abnormal Security customers,
00:51:37
so you would know if the person that's sending me the email is
00:51:42
a real user, right, and you could tell if I'm a real user?
00:51:46
Um, is there any thought around, you know, maybe adding a
00:51:55
logo to that email saying Abnormal Security verified user
00:51:55
or something like that, right? Because that would
00:51:58
really, I feel like, just from a, from a user perspective, you
00:52:02
know, that adds a lot of peace of mind where it's saying, like,
00:52:05
hey, I know I'm responding to the right person in this case.
00:52:09
Um, you know, anything like that?
00:52:12
Speaker 2: You know, we've talked about that before.
00:52:13
You know, uh, Google just released that capability.
00:52:16
You know, the sort of the blue check mark, if you will.
00:52:19
You know, if you go back to Twitter's, uh, mentality. Um, you
00:52:22
know, we've looked at that concept before.
00:52:25
The challenge with that is that that was the
00:52:28
concept of the Sender Policy Framework, SPF, using DMARC and
00:52:32
DKIM to verify the authenticity of the sender.
00:52:36
So those things actually already exist, and as long as
00:52:40
we're all good corporate citizens and you have your SPF
00:52:43
set to fail and/or reject or whatever, um, you shouldn't be
00:52:49
allowed to spoof.
00:52:51
Traditional spoofing has kind of fallen out because most
00:52:55
companies do the right thing. They set up their DMARC, their
00:52:58
DKIM.
00:52:59
SPF is set to reject, so if it doesn't come from the right
00:53:03
place, it shouldn't reach you at all in the first place.
00:53:06
That's sort of email security 101.
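(All three of those checks live in public DNS, so you can inspect any domain's email-auth posture from a shell. A minimal sketch; example.com and the selector name are placeholders.)

```bash
# SPF: a TXT record on the domain itself
dig +short TXT example.com | grep 'v=spf1'

# DMARC: a TXT record at the _dmarc subdomain (p=reject is the strict policy)
dig +short TXT _dmarc.example.com

# DKIM: a TXT record under a sender-chosen selector (selector is a placeholder)
dig +short TXT selector1._domainkey.example.com
```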
00:53:10
Now, if you fast forward, the real challenge then
00:53:14
becomes, how hard is it to compromise an email, a cloud
00:53:17
email account, today? Right, whether it's credential stuffing,
00:53:21
using, um, compromised creds that are found all over the place,
00:53:25
uh, because users refuse to change their passwords, like, or
00:53:31
they use weak passwords, or whether it's, uh, whether it's
00:53:34
like a credential phishing attack and I've now collected
00:53:37
your creds. Regardless,
00:53:38
let's say I have your creds. Logging into your account, it's,
00:53:42
it's already a done deal.
00:53:44
If you don't have MFA enabled, right, I can just log directly
00:53:47
into your Office 365 account.
00:53:49
Now, if you have MFA, I can still brute force it, right, I
00:53:53
can still try and smash you at 3am with push after push after
00:53:58
push until eventually you just accept one.
00:54:01
And now I'm in.
00:54:03
So compromising a cloud email account by itself today, it's
00:54:07
pretty trivial to do.
00:54:08
The danger then becomes, now that I'm in as you, into your
00:54:14
account,
00:54:15
I can send messages as you, with the blue checkmark or
00:54:20
whatever would be there.
00:54:21
So I fear that it would give a false sense of security in
00:54:26
the circumstance of an account takeover.
00:54:28
So that's what you've got to be cautious of
00:54:30
these days.
00:54:31
Speaker 1: Yeah, that makes sense.
00:54:34
I um, somehow I didn't think of that, but it definitely makes
00:54:37
sense.
00:54:38
Well, Mick, you know, I really appreciate you coming on.
00:54:42
Unfortunately, I think we're at the top of our time here,
00:54:47
um, you know.
00:54:47
So, before I let you go, how about you tell my audience, you
00:54:50
know, where they can find you if they want to reach out, where
00:54:52
they can find Abnormal Security, if they wanted to learn more
00:54:55
about Abnormal?
00:54:56
Speaker 2: Yeah, yeah, you guys can reach me directly at
00:55:00
mick@abnormalsecurity.com.
00:55:01
That'll reach me, m-i-c-k at abnormalsecurity.com.
00:55:05
You can also go to abnormalsecurity.com/demo if you
00:55:09
want to see how this works, because seeing it live changes
00:55:14
lives.
00:55:14
I'm going to tell you that right now.
00:55:15
And lastly, if you want to sign up for a free, risk-free trial,
00:55:20
if you want to, we can come in and do a report.
00:55:23
You can go to abnormalsecurity.com, and, uh, we'll be
00:55:26
able to set you up there as well.
00:55:29