Prepare to have your mind broadened and your digital defenses bolstered as we journey with cybersecurity expert Aaron Painter, whose insights from Microsoft and NameTag are nothing short of enlightening. We tackle the increasingly sophisticated realm of social engineering, where attackers prey on human psychology rather than system weaknesses. Discover the chilling ease with which these modern-day digital pickpockets can orchestrate account takeovers, and how traditional security questions are no match for their cunning. It’s a deep dive into the human element of cybersecurity, with compelling anecdotes that reveal just how vulnerable we can be when our guard is down.
This episode isn't just about the problems; it's a treasure trove of cutting-edge solutions! We explore the terrifying capabilities of deepfake technology and its impact on identity verification with a story that sounds like it's straight out of a spy thriller—a finance controller tricked into transferring $25 million. But there's hope yet, as we uncover the groundbreaking methods NameTag employs to thwart these digital doppelgängers, reshaping the landscape of multi-factor authentication resets to outsmart even the craftiest of con artists. Aaron's narrative is a testament to the fine line between innovation and security and how we must tread it carefully.
Wrapping up, our discussion casts a spotlight on the shadowy operations of cybercriminal collectives and the ongoing battle against supply chain attacks. Witness the complexity of securing against compromised hardware and the constant threat of breaches that loom over every organization. We round out with a clarion call to action for heightened cybersecurity awareness and education—a beacon for anyone looking to navigate the treacherous waters of cyber threats. Aaron's stories and strategies, available through LinkedIn and getnametag.com, serve as a vital arsenal in the fight to protect our digital footprint in an age of relentless change.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Aaron?
00:00:00
It's great to get you on the podcast.
00:00:03
You know, I think we've been trying to plan this thing for a
00:00:06
while now and finally the stars aligned and we're able to get
00:00:10
together.
00:00:12
Speaker 2: It's great to be here, and the funny thing is it's no
00:00:14
less relevant than when we started planning it.
00:00:16
So things just get more and more exciting out there.
00:00:19
Speaker 1: Right.
00:00:19
Yeah, that's the interesting thing about security.
00:00:23
You know it's evolving every single day.
00:00:26
Tomorrow there could be a zero day that comes out.
00:00:29
Knock on wood, right, that would have us all working through the
00:00:34
weekend and into the coming weeks, 24/7, on a patch, but luckily
00:00:40
that hasn't really happened, maybe in a year and a half.
00:00:44
Man, I really need to knock on some wood here.
00:00:46
I'm really playing with fate here.
00:00:52
Speaker 2: Yeah, it's interesting.
00:00:52
I've spent a lot more time with folks that are looking to get
00:00:55
into the cybersecurity profession, you know, from other
00:00:57
angles and other parts of their career, and one of the things
00:01:00
that's been sort of telling for me is how much people are
00:01:03
bringing fresh ideas and fresh ways of thinking about
00:01:05
vulnerabilities and exploits
00:01:07
that aren't always technical. It's not always, you know, a deep
00:01:10
understanding of command-line interfaces and intrusions of
00:01:13
sorts.
00:01:13
It's oftentimes just human properties, you know, and
00:01:17
thinking like a bad actor thinks, you know, particularly
00:01:18
in this field of social engineering and the like,
00:01:20
it just becomes too easy to interfere and sort of take over
00:01:27
someone's account or gain access to a network using really
00:01:29
non-technical means.
00:01:38
Speaker 1: And so, as security practitioners, things certainly
00:01:40
are evolving, in that it's now more appealing towards the
00:01:47
social side of people, the human side of people, because the
00:01:51
technical controls are all, you know, in place.
00:01:54
They're there, they're working.
00:01:56
You know, they probably even tried to poke around and, you
00:02:00
know, get past them and they couldn't Right.
00:02:02
So now it's time to appeal to the human side of this factor
00:02:07
and it's a, it's a difficult thing to kind of ingrain into
00:02:11
people.
00:02:12
You know, every once in a while something will happen with my
00:02:15
own personal accounts and whatnot, right, and I'll think
00:02:20
to myself, like you know, I need to like almost retrain myself,
00:02:25
and I live in this environment.
00:02:26
You know, like I live in this field, it's a, it's a really
00:02:30
interesting place that we're going, one that
00:02:35
the attackers are kind of extrapolating on, I feel.
00:02:39
Speaker 2: It was one of the things that was very formative for me when we started NameTag; it was at the start of the pandemic.
00:02:43
I had a bunch of friends and family members who had their
00:02:46
identity stolen.
00:02:47
So, being a good friend and a good son, I said let's jump on the
00:02:50
phone.
00:02:50
You know, everything had moved digital and you couldn't go in
00:02:52
person, so we said let's figure out what happened.
00:02:55
And we'd call these companies and they say, well, before I can
00:02:57
help you, I need to verify you with security questions and they
00:03:01
would sort of ask these bizarre things that were either
00:03:03
wildly simple and silly, or complex: you know, what street did
00:03:06
you live on in 19, whatever?
00:03:08
Or you know what's your favorite color or your pet's
00:03:10
name, and sometimes it's even just oh, what's your email
00:03:13
address?
00:03:13
I need to, quote, verify you.
00:03:15
And you're like, how is this keeping my account secure?
00:03:17
And you know, as a professional, that drives me crazy, but I
00:03:20
think it drives all of us a little bit crazy.
00:03:22
And in that context, at that time, what had happened is
00:03:29
someone else called before we did and answered those silly
00:03:30
questions and was able to take over our account.
00:03:31
And that had nothing to do with the strength of encryption or,
00:03:33
you know, MFA protections or other things.
00:03:35
It had to do with simply social engineering or answering silly
00:03:38
questions to a human, and that is very much a social science
00:03:46
more so than really even a technical one.
00:03:47
Speaker 1: Yeah, that's a really good point.
00:03:48
You know, when you bring up social science, I feel like it's
00:03:51
almost like a different wing of psychology or something.
00:03:56
You know, I took one psychology class in college, right, so I'm
00:03:59
really prepared to talk about it.
00:04:05
But you know, it feels like a separate section of that field
00:04:07
almost.
00:04:07
You know, because you're appealing to the person's, you know,
00:04:09
social aspect, right, they want to help you, they want
00:04:11
to be a help, um, overall, you know, they want to get you the
00:04:16
information that you need and whatnot, right, and so they have
00:04:19
to go through this stupid process that their company put
00:04:22
in of you know answer this, uh, you know, question of, like
00:04:26
what's the maiden name of you know your grandma, or whatever
00:04:29
it is.
00:04:29
You know, these things are all readily available online, like
00:04:34
there's actually very little that we can do about it.
00:04:38
You know, like it's, it's all relatively out there, it's very
00:04:44
easy to piece it together and maybe they even prod in
00:04:48
different ways, you know, and call you and impersonate you
00:04:52
know some other company, right, just to get maybe one security
00:04:55
question answered, you know, and then maybe they call a couple
00:04:58
weeks later, you know, and they get another one answered, and
00:05:02
they get another one answered.
00:05:10
Speaker 2: It's a fascinating area, but we're going into a
00:05:13
place where it becomes impossible to defend against, I
00:05:15
feel. Yeah, fraud and bad actors are out there doing social
00:05:16
engineering.
00:05:16
Right, they're using these techniques.
00:05:17
They're looking at data that was released in a data breach by
00:05:19
someone else or is publicly available or on a social network
00:05:22
, and they're using sort of the human fallibility in a way to
00:05:26
even earn credibility, right? Oh, you know, what's your manager's
00:05:29
name.
00:05:29
Oh, you know, I was on vacation and we just had a reorg and
00:05:32
gosh, I'm not even sure who I report to anymore, but now I
00:05:35
can't access my email.
00:05:36
I've just got to get back in, right? It's the kind of thing that's
00:05:39
sort of relatable and, ironically, if you go to work at
00:05:42
any sort of support organization, customer support or
00:05:44
IT help desk, you probably like helping people.
00:05:47
It's probably sort of a core requirement of the role.
00:05:50
Yet often part of the responsibility is to be this
00:05:54
identity detective and your job is really to interrogate someone
00:05:58
when they first call you, to sort of identify who you're
00:06:00
speaking with.
00:06:01
It doesn't work on either side.
00:06:03
It's frustrating on both sides, and then you're supposed to
00:06:06
very quickly switch over to great.
00:06:07
Now how can I help?
00:06:08
I mean the chance of someone walking away and having a good
00:06:11
experience out of that is slim and it's just not working.
00:06:15
Today's techniques are not working and these bad actors are
00:06:18
now increasingly armed with new tools like AI, deep fakes and
00:06:21
other things to further social engineering, to further trick
00:06:25
these sort of well-intended support reps.
00:06:30
Speaker 1: It's a mess.
00:06:31
Yeah, how are you seeing AI kind of change the game with this?
00:06:37
You know, like I feel like almost every episode I do now we
00:06:40
talk about AI. You know, it's not me trying to, you
00:06:46
know, bump the numbers of the episode or, you know, get into
00:06:49
some algorithms, right, like I think my audience knows like I'm
00:06:52
terrible at that part, um, and so like that's why you know, my
00:06:57
podcast probably isn't, you know, where it could have been or
00:06:58
where some of my competitors are, right.
00:07:09
But you know, um, just thinking about AI, it seems
00:07:10
to be changing like everything with it.
00:07:11
You know it's touching everything, um, and it's
00:07:16
unlocking new capabilities that used to not be readily available.
00:07:23
It used to not be, you know,
00:07:25
readily available to get on a call with someone, have an AI,
00:07:28
listen to that phone call and now, 30 seconds in, you can
00:07:32
mimic that person's voice and so now you can go and call Chase
00:07:36
and if they have some AI on their backend that verifies your
00:07:39
voice, you're already verified.
00:07:41
It's already one more step to that customer service person
00:07:46
feeling good about giving you, you know, this information, right,
00:07:50
because you're like.
00:07:51
Certainly you know he's not going to fake his voice.
00:07:54
Speaker 2: Right.
00:07:55
Those are, I think, convenience layers in a way that haven't
00:07:58
been properly implemented or thought through.
00:08:00
You know, the concept of my voice is my password is great
00:08:04
from a convenience perspective.
00:08:06
But, by the way, oftentimes, let's say, you go to the average
00:08:08
bank account opening process and you go through a KYC, a know
00:08:12
your customer flow.
00:08:13
Maybe it's in person, maybe it's virtual, that's great, and
00:08:16
they ask you to scan your ID and take a selfie of yourself, and
00:08:19
it's kind of matched up.
00:08:20
Those tools were created in an era where regulatory compliance
00:08:24
was the goal.
00:08:24
In fact, their goal was to make it easy.
00:08:26
You could essentially even say oh, you know, I don't need to
00:08:28
take a photo of my ID, I'm just going to upload a PDF that I've
00:08:31
scanned once before and saved.
00:08:32
That was a great convenience play and satisfied the need.
00:08:36
The problem today is you can essentially go to a ChatGPT
00:08:39
equivalent and say here's my photo, make me a California
00:08:42
driver's license and, by the way, I'm going to export it as a
00:08:44
PDF to conveniently upload into that tool.
00:08:48
And so there's a reason why, when you call the bank to
00:08:50
transact, they have to ask you security questions.
00:08:53
They have to ask you these bizarre things because they
00:08:55
don't trust the integrity of what that, you know, know-your-
00:08:57
customer flow was.
00:08:58
So let's say you've done that, you've opened the bank account,
00:09:01
they've checked the regulatory boxes and you call in to transact
00:09:03
and they go through a security question.
00:09:05
They say well, you know what?
00:09:06
Hey, to make this easier for you next time, why don't we
00:09:09
enroll you in some sort of voice authentication service?
00:09:11
Sounds compelling.
00:09:13
The problem is that rep still doesn't really know who you are.
00:09:16
The best effort to verify you, to send you to enroll into that
00:09:19
voice verification system, is the security questions they
00:09:21
asked.
00:09:22
And so it's this sort of bizarre loop that is only as
00:09:25
good as the element of secure enrollment.
00:09:27
It's no different, by the way.
00:09:28
It drives me crazy the concept that you know we hire, say, a
00:09:32
new employee in a company and HR will maybe ask IT to go
00:09:36
provision a new email address or user account.
00:09:38
And you know, pick your ID provider, say Active Directory, and
00:09:41
you put in a new hire name in Active Directory and you create
00:09:43
a temporary password and then you send it, or HR sends it to
00:09:47
someone's personal email address and so there, at that moment,
00:09:50
you've just given network credentials to someone to show
00:09:53
up for work that you sent to their personal email address,
00:09:55
not actually knowing really who they are.
00:09:57
Who's the recipient of that email account?
00:09:59
If that email account was secured in the first place,
00:10:05
that's enrollment, just like the hey,
00:10:06
I've asked you a few security questions,
00:10:07
Let's hop you into voice authentication for the future.
00:10:08
You don't know who the person is and it turns out, the risk is
00:10:11
just as great on the other end, because all you have to do is
00:10:15
pretend to be locked out or pretend to be a person who is
00:10:18
locked out and needing to recover your account, and you're
00:10:21
back to this sort of human level of verification and often
00:10:24
these silly security questions.
00:10:26
Those are the vulnerabilities today.
00:10:31
Speaker 1: Yeah, that is.
00:10:32
That's a difficult one to defeat.
00:10:35
You know how do you, how do you protect an organization from
00:10:41
that?
00:10:41
Right, it sounds like you almost need like a new
00:10:45
process or a procedure or something like that.
00:10:48
You know where do we go from here to kind of add in another
00:10:53
layer of security on top of the 10 that we already have?
00:10:56
Speaker 2: Yeah, this was really formative for us.
00:10:58
In building NameTag, you know, we set out to say why is it that
00:11:01
someone goes through that KYC flow when they open a bank
00:11:03
account and then it's not reusable? And it turns out, partly,
00:11:06
we said, can we use one of the KYC tools and just make a
00:11:08
product that makes it reusable so that every time you call to,
00:11:11
let's say, transact, you can re-verify the person.
00:11:13
But the input methods, the idea of being able to allow someone
00:11:17
to use their web browser to verify themselves in the first
00:11:20
place, wasn't sufficient.
00:11:21
So we invented a way to use mobile phones and by using
00:11:24
mobile devices, you can ask the end user for the same inputs.
00:11:31
In theory, you know, show me your ID and kind of take a
00:11:32
selfie.
00:11:32
But you can do it in a wildly more secure way because you're
00:11:34
using all the embedded features of a mobile phone, like their
00:11:38
secure enclave and the cryptography, so that you know
00:11:41
the camera that's being used is coming from the camera on the
00:11:44
device.
00:11:45
You can use things like the 3D depth camera.
00:11:47
You can see that they're moving their phone as a human would,
00:11:50
as they're moving, to turn their ID over or go to take a photo
00:11:54
of themselves.
00:11:54
There are all these data pieces that come up in a mobile device
00:11:58
that, used in a smart way, can be just a better experience, but
00:12:02
it can also be much more secure , and so we created this as a
00:12:06
way for people to be able to solve those otherwise tricky
00:12:09
moments of what could be a social engineering attack.
00:12:11
You know it's someone calling the help desk and instead of the
00:12:14
help desk rep having to ask you questions, they can send you a
00:12:17
verification link and then come back with some confidence in
00:12:20
knowing you really are the human behind the screen, and that's
00:12:23
sort of been a formation of our company.
00:12:33
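(A minimal sketch of the help-desk flow described above, assuming a hypothetical verification service: the agent requests a verification link, the caller completes an ID-and-selfie check on their phone, and the agent proceeds only once a result comes back. The endpoint, credential, and function names here are illustrative placeholders, not NameTag's actual API.)

```python
# Hypothetical sketch only: placeholder endpoint and fields, not a real SDK.
import time
import requests

API_BASE = "https://verify.example.com/api"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                     # placeholder credential

def request_verification(subject_email: str) -> str:
    """Ask the service to send the caller a verification link for an ID + selfie check."""
    resp = requests.post(
        f"{API_BASE}/verifications",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"subject": subject_email},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["verification_id"]

def wait_for_result(verification_id: str, poll_seconds: int = 5, max_wait: int = 300) -> str:
    """Poll until the caller finishes (or fails) the check on their phone."""
    waited = 0
    while waited < max_wait:
        resp = requests.get(
            f"{API_BASE}/verifications/{verification_id}",
            headers={"Authorization": f"Bearer {API_KEY}"},
            timeout=10,
        )
        resp.raise_for_status()
        status = resp.json()["status"]
        if status in ("verified", "failed", "expired"):
            return status
        time.sleep(poll_seconds)
        waited += poll_seconds
    return "timed_out"

# Agent-side usage: only continue the support request if the check passes.
if __name__ == "__main__":
    vid = request_verification("caller@example.com")
    if wait_for_result(vid) == "verified":
        print("Identity confirmed; proceed with the account request.")
    else:
        print("Not verified; do not make account changes.")
```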
Speaker 1: So what sort of information are you gathering?
00:12:35
You know from that verification link right.
00:12:37
Is it an ID?
00:12:37
Or turn on the camera.
00:12:38
Show me this ID, whatever it might be.
00:12:39
What does that look like?
00:12:42
Speaker 2: For the end users, as far as they're concerned, it's an ID and a selfie. It
00:12:45
averages 23 seconds in a flow and it feels super slick and
00:12:48
fast.
00:12:48
The hidden part behind the scenes is we're able to get a lot more
00:12:52
data on how that was captured and, particularly, we're able to
00:12:55
avoid things like injection attacks and injection attacks
00:12:59
are not always understood, but they're really the key way that
00:13:02
people use deep fakes and put them to work, and it means, simply, you're sort of
00:13:07
tricking the system with a different feed of where the data
00:13:09
came from.
00:13:09
One of the best ways to look at this might be you know, you
00:13:13
asked about AI and deep fakes and you know how more and more
00:13:16
people are using them.
00:13:17
I think one of the most relevant examples we've had lately has
00:13:20
been sort of this Hong Kong finance controller.
00:13:23
I don't know if you followed this story.
00:13:24
It was a multinational organization, shall we say,
00:13:30
where the quote CFO was based in London and the finance
00:13:31
controller was based in Hong Kong, and the controller
00:13:33
allegedly got an email from the CFO saying hey, I need to do a
00:13:36
bunch of wire transfers.
00:13:37
Can you help me process these?
00:13:38
The controller was rightly a bit suspicious and the CFO said
00:13:41
hey, you know, a bunch of us are on a video call now why don't
00:13:45
you join the call?
00:13:45
I'll send you a link and then you can get all the approvals
00:13:47
that you need.
00:13:48
And it turned out, everyone on the video call was a live, deep
00:13:51
fake emulator.
00:13:52
And so the controller was like well, I recognize these people,
00:13:55
I know these people, these are, you know, executives in my
00:13:57
company.
00:13:57
This must be okay.
00:13:58
And they proceeded and, you know, transferred $25 million in
00:14:01
wire transfers.
00:14:03
Wow, so it was sort of this wake-up call that, my gosh, the
00:14:06
method that we thought kind of worked, that took extra effort
00:14:09
of let's jump on a video call, really can't even be trusted.
00:14:12
But it's a classic example of a few different things, one of
00:14:15
which being an injection attack. Because in Zoom or Teams,
00:14:18
it's intentionally easy to say let me select a different input
00:14:21
camera or a different microphone, but that also means you could
00:14:31
select an emulator that's projecting a deep fake in real
00:14:32
time.
00:14:32
Those tools weren't meant to stop fraud, they weren't meant
00:14:34
for those sort of high risk moments and ironically now
00:14:35
companies like Okta and others are recommending that when a
00:14:38
user says they're locked out of multi-factor authentication,
00:14:41
they're saying do what they call, you know, video
00:14:44
verification or visual verification.
00:14:46
Hop on a video call and make sure that you're speaking with
00:14:49
the right person.
00:14:50
Yet now we've seen that deepfakes have sort of even made
00:14:53
that maybe not the best option.
00:14:55
Speaker 1: Wow.
00:14:57
Do you think Okta came up with that recommendation after they
00:15:02
had their fairly recent breach, as like a measure to say,
00:15:07
okay, you know, he kind of trusted the voice and trusted
00:15:10
the situation right through this phone call?
00:15:13
How do we prevent that?
00:15:15
Video would probably be the next logical step.
00:15:19
But you know, with deep fakes out there, it's how do you?
00:15:23
How do you take it a step further, because the same thing
00:15:26
will happen.
00:15:27
Do you think that that's what prompted it?
00:15:30
Speaker 2: Well, this is the challenge: MFA itself is
00:15:32
secure.
00:15:33
But MFA has this glaring kind of side door or backdoor and it's
00:15:37
this concept that all the user has to do is claim to be locked
00:15:40
out and things exist.
00:15:42
You know, self-service password reset exists, great.
00:15:44
You know, email your personal email address and reset a
00:15:47
password.
00:15:47
But secure self-service MFA reset hasn't traditionally
00:15:52
existed.
00:15:53
We've just brought it to market at NameTag as a way to surround
00:15:56
your existing, you know, Okta, Duo, Microsoft Entra
00:15:59
implementation, because it's a glaring hole.
00:16:01
It's this concept that when a user is locked out they
00:16:04
shouldn't even have to contact the help desk.
00:16:06
But traditionally the only way to reset your MFA is to contact
00:16:09
the help desk.
00:16:09
So we now take them through a flow that says, hey, what's your
00:16:12
work email address, for example, let's go through a NameTag
00:16:15
verification flow, verify your identity and then we integrate
00:16:18
directly with that MFA provider to let the user reset their MFA
00:16:21
credentials.
00:16:22
So it avoids a ticket to the help desk, it avoids that call
00:16:26
and avoids the risk of social engineering.
00:16:28
But, by the way, it makes it way faster and, frankly, way
00:16:31
more cost effective, because support tickets themselves are
00:16:34
pretty expensive.
00:16:35
That's sort of been our solution as a way to bring to
00:16:39
market and kind of close this kind of glaring hole that exists.
00:16:44
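(A minimal sketch of the self-service MFA reset flow outlined above: verify the locked-out user's identity first, and only then clear their enrolled factors with the MFA provider so they can re-enroll, with no help-desk ticket. Every endpoint and function name here is a hypothetical placeholder, not NameTag's or any provider's actual API.)

```python
# Hypothetical sketch only: placeholder endpoints, not a real integration.
import requests

VERIFY_API = "https://verify.example.com/api"        # placeholder identity-verification service
MFA_ADMIN_API = "https://mfa-provider.example.com"   # placeholder MFA provider admin API
VERIFY_KEY = "VERIFY_API_KEY"                        # placeholder credential
MFA_ADMIN_KEY = "MFA_ADMIN_KEY"                      # placeholder credential

def identity_verified(work_email: str) -> bool:
    """Run the ID + selfie verification and return True only if it succeeds."""
    resp = requests.post(
        f"{VERIFY_API}/verifications",
        headers={"Authorization": f"Bearer {VERIFY_KEY}"},
        json={"subject": work_email, "wait_for_completion": True},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json().get("status") == "verified"

def reset_mfa_factors(work_email: str) -> None:
    """Ask the MFA provider to clear the user's enrolled factors so they can re-enroll."""
    resp = requests.post(
        f"{MFA_ADMIN_API}/users/{work_email}/reset-factors",
        headers={"Authorization": f"Bearer {MFA_ADMIN_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()

def self_service_mfa_reset(work_email: str) -> bool:
    """No ticket and no phone call: verified users reset their own MFA, everyone else is refused."""
    if not identity_verified(work_email):
        return False
    reset_mfa_factors(work_email)
    return True
```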
Speaker 1: You know, how did you get here?
00:16:47
You know, where did you start
00:16:50
earlier on in your career that kind of led you down
00:16:54
this path?
00:16:54
You know, and I ask people that because, you know,
00:16:58
especially when you start a company, right, there's a,
00:17:00
there's a different kind of level of stress that comes along
00:17:03
with it.
00:17:04
And I feel like people that haven't started a company,
00:17:12
haven't tried to do anything new, don't really understand that,
00:17:14
because you're kind of locked in this nine-to-five bubble and
00:17:17
it's safe, it's secure.
00:17:19
Every once in a while people get laid off but you find
00:17:22
another job.
00:17:27
So where did you start?
00:17:28
That kind of led you down this path.
00:17:29
Speaker 2: I think balancing risk in your career is always sort of it's
00:17:33
for the moment of growth, and so finding that balance that's
00:17:36
right for an individual person is hard.
00:17:38
But it really matters and it might be right at different
00:17:40
times in your life but fundamentally to grow as a
00:17:44
person you kind of have to leave your comfort zone, and so the
00:17:47
degree to which you leave means more risk, maybe, but more
00:17:50
opportunity for growth, and that's something that kind of
00:17:53
everyone thinks through and struggles with and has to
00:17:54
balance at different points in their life.
00:17:57
You know I spent 14 years at Microsoft.
00:17:58
I started in product.
00:18:00
I worked in Redmond and kind of Seattle headquarters.
00:18:03
I was the first product manager for Office.
00:18:06
Back in the day there were individual apps.
00:18:08
We said how do we make it something integrated?
00:18:09
We called it the system.
00:18:11
You'd now probably think of that as 365.
00:18:13
And I love that because it let me work with so many different
00:18:16
teams and kind of reposition and bring together a product that
00:18:19
was already being used by so many people.
00:18:21
But the rest of my career was outside the US and I focused a
00:18:25
lot on helping Microsoft expand to new geographies.
00:18:27
Ultimately, that meant opening, you know, 31 new offices, kind
00:18:31
of who was the first person, who's the third person in a
00:18:33
given country?
00:18:34
I spent a couple years in Brazil, I spent five and a half
00:18:37
years in China all kind of new markets, all trying to build
00:18:42
stuff where it didn't exist before and so I had this
00:18:44
opportunity to work in a big company but to kind of do new
00:18:47
things and to do things that, frankly, were risky.
00:18:49
They're risky for me, they're risky for the company.
00:18:51
They were hard, but I grew a lot from taking on those risks
00:18:55
and that was the right time for me and my career and kind of
00:18:58
what I wanted to do in my life at that time.
00:19:00
And I love Microsoft, I love the mission, the people I got to
00:19:00
work with, I love the reach.
00:19:17
I left Microsoft eventually because I wanted to be more
00:19:19
entrepreneurial.
00:19:19
I felt like I hit an entrepreneurial kind of max,
00:19:22
particularly first and largest partner in Europe, and then it
00:19:26
was in doing that and it was particularly in a lot of my work
00:19:28
outside the US, where so many of the companies I spoke with
00:19:32
had very similar challenges and a lot of them were large
00:19:36
enterprises thinking about how to use technology in new ways,
00:19:38
and security just kept coming up over and over again, as much as
00:19:42
employee growth, employee loyalty, customer loyalty, kind
00:19:45
of the business trends that relate to, you know, maybe
00:19:48
having a security vulnerability or not, making customers feel
00:19:51
like the platform they were using was secure, and so I
00:19:54
became really passionate about this area
00:19:55
of security, and I knew I wanted to solve something around it
00:19:59
and then it was sort of that start of the pandemic when I had
00:20:02
this personal frustration, that kind of just hit the max of why
00:20:06
do I call people and they don't know who I am and yet the only
00:20:10
thing separating the safety of my account is our answers to
00:20:13
these silly questions.
00:20:13
Like there has to be a better way in this modern time to
00:20:16
verify sort of the human behind the screen.
00:20:18
And I was able to assemble a really great group of other
00:20:22
folks, a lot of them with great security backgrounds
00:20:25
and a lot of creativity, and we're able to invent something
00:20:28
new in this space that, uh, frankly, it's using identity as
00:20:31
a way to make the perimeter more secure.
00:20:37
Speaker 1: That's fascinating, that, uh, you know, you kind of started at Microsoft, right, and
00:20:42
now you're doing all these different things, you're
00:20:47
living all over the world.
00:20:49
Do you think that that type of personality is
00:20:53
pretty common at big tech?
00:20:55
I ask because you know you're thinking outside the box, you're
00:20:59
open to the unknown, you are, you know, a bit more comfortable
00:21:05
with being uncomfortable, right , and to a lot of people that's
00:21:09
pretty scary.
00:21:10
You know, like I, uh, you know, for instance, right like I love
00:21:14
Germany, right, so I love going to Germany.
00:21:17
I have been there far too many times.
00:21:20
I'm neglecting the rest of the world, um, and so like that's
00:21:24
why I'm like forcing myself to go to London this year, but I'm like,
00:21:27
well, maybe I'll stop in Germany too, you know, and but that's a
00:21:32
difficult thing.
00:21:32
Not everyone thinks like that, and where do you think that
00:21:37
mentality came from?
00:21:38
Even for you, to, you know, be comfortable in that situation,
00:21:42
because that, I mean that is a challenging thing, especially
00:21:46
when you're starting a company in other countries.
00:21:49
I mean, that's probably the most challenging thing that I
00:21:52
could think of.
00:21:54
Speaker 2: There are a lot of... I mean, sometimes I feel like, hey, we're still doing like bits
00:21:57
and bytes and so forth.
00:21:59
You know we're not saving lives , we're not medical doctors,
00:22:01
like.
00:22:01
There are other really important fields out there, of
00:22:04
course.
00:22:04
For me, it was because I took risk and then grew through it.
00:22:09
I got stronger, right?
00:22:11
It was that concept of taking risk and leaving your comfort
00:22:14
zone and then your comfort zone kind of expanding a bit.
00:22:16
And so as I did that, more and more and more, it just kind of
00:22:19
kept expanding.
00:22:21
And I think there is a role in any organization for people that
00:22:24
have different styles and different ways of working and
00:22:27
typically, I'd say, with larger companies,
00:22:29
one of the benefits is that it's not going anywhere.
00:22:31
It's very much built with redundancy in mind.
00:22:35
That also means the impact of one person is
00:22:38
intentionally a little bit less right.
00:22:40
One person can bring down a company and one person can make
00:22:43
a big difference, but it's not going to radically transform a
00:22:46
company necessarily and in general I think that's something
00:22:49
just a large organization gets.
00:22:50
That's part of its strength, that's part of its value
00:22:53
proposition.
00:22:55
But when you have large companies that are constantly
00:22:56
trying to innovate and they're trying to grow in new markets,
00:22:58
you want to mix the personalities, you want to mix
00:23:01
the styles, you want some people that are a little bit more
00:23:03
comfortable trying new things and taking risks and leveraging
00:23:06
the foundation the company has and growing into new areas.
00:23:09
That's how you sort of stay relevant and I think we need it.
00:23:13
I think tech is sort of a perfect platform because tech is
00:23:16
such an industry that's changing so much that when you
00:23:19
have these particularly larger tech companies now all of the
00:23:22
ones, the big ones we think about they continue to evolve,
00:23:24
they continue to sort of push the boundary and come up with
00:23:26
new things and stay relevant.
00:23:28
And that's because they have a varying degree of kind of styles
00:23:32
in the company and divisions and product areas and ways to
00:23:35
invest.
00:23:36
Some of it's optimizing and some of it's kind of inventing.
00:23:39
I think you need that balance.
00:23:43
Speaker 1: Yeah, you know, it's fascinating with the tech industry because it's
00:23:49
kind of like a universal industry.
00:23:51
I could take my skills and go to Europe and there will be
00:23:57
companies there that need exactly what I know, exactly my
00:24:02
experience and whatnot, and maybe it'll be a different
00:24:05
opportunity.
00:24:05
Same thing if I go to Russia.
00:24:07
Maybe there's some challenges with that, of course.
00:24:10
Same thing with China and whatnot, but it's an industry
00:24:15
that you know translates very well, I feel.
00:24:18
Is that the same way that you look at it as well?
00:24:26
Speaker 2: It's kind of like language,
00:24:27
right? When you have an underlying language, it's easier
00:24:28
to communicate and do things.
00:24:28
And certainly the language of code and certain platforms and others,
00:24:30
you can learn a new platform.
00:24:31
But language of code is a little bit more universal.
00:24:34
I still struggled a bit with Western alphabets and other
00:24:37
things not perfectly, but yeah, I think you're right in a way,
00:24:40
because there's some commonality across and a lot of that also
00:24:44
has to do with the infrastructure out there.
00:24:45
Right, there is a set of large cloud players.
00:24:48
So you think, okay, well, I'm going to develop skills in how
00:24:51
to write or publish code to certain platforms and evolve
00:24:55
them and tweak them and optimize them.
00:24:56
Great, there's sort of a handful of them out there.
00:24:58
They're almost like a language themselves.
00:24:59
So, if you know how to work in AWS, Azure, GCP, you know, maybe
00:25:04
Alibaba, like you know, pick a few in terms of, some of those
00:25:06
are more similar than others to each other.
00:25:09
You can kind of have impact and work in a lot of different
00:25:11
places, but there's still a huge opportunity for cultural
00:25:14
relevance.
00:25:14
How does it work?
00:25:15
How do systems work?
00:25:17
How do people and humans interact with machines in
00:25:18
different ways and different geos?
00:25:20
What are local regulations and things you need to work with.
00:25:23
What are the problems you're trying to solve for?
00:25:25
And, more interestingly, as tech just sort of permeates
00:25:30
different industries, there's really a lot of opportunity for
00:25:32
industry knowledge.
00:25:32
What's happening?
00:25:34
How do you apply tech to automotive or healthcare or
00:25:37
financial services?
00:25:38
It means you need people who understand tech and you need
00:25:40
people who understand those given industries.
00:25:42
And when you can bring those different sets of minds and
00:25:44
experiences together, that's where I think you really can
00:25:47
have fun in transforming whole industries.
00:25:51
Speaker 1: Yeah, it's a really good point.
00:25:52
You know, I kind of got my start more in the financial
00:25:58
industry and ever since then it's just financial institution
00:26:04
after financial institution, you know,
00:26:06
that is trying to poach me from the last one.
00:26:10
And the big thing that they're looking for is the industry
00:26:15
knowledge, the industry experience.
00:26:17
And so, you know, when they bring up different compliance
00:26:20
frameworks and whatnot, it's like, guys, that's all I
00:26:23
know.
00:26:24
You know, I don't know HIPAA.
00:26:25
I know PCI inside and out, I know NIST, you know like we can,
00:26:30
we can work with this, and for some reason, you know, they're
00:26:34
not looking to really venture outside of that.
00:26:38
They really want someone with that experience, which I've
00:26:42
always found interesting.
00:26:43
You know, I think that the experience overall is relatable,
00:26:47
right, but I guess it's that industry of knowing you know
00:26:52
kind of the ins and outs of when to pay attention to something
00:26:55
and how to pay attention to it and you know how different
00:26:59
systems are linked together and things like that.
00:27:01
I guess there is that benefit with that that you know when,
00:27:06
when you're starting a brand new company, brand new methodology
00:27:11
of securing you know this, this process of gaining access to an
00:27:17
account, getting access to you know, personal information and
00:27:20
whatnot.
00:27:20
When you're starting that, how do you manage between building
00:27:27
the product yourself, hiring the right people, and then
00:27:32
splitting your time with finding the customers to pay those
00:27:35
right people?
00:27:36
You know, to balance it all.
00:27:39
How do you find that balance?
00:27:40
You know?
00:27:41
Are you the person that is building it from scratch, or are
00:27:45
you the person you know, writing it down and trying to be
00:27:49
like okay, this is my vision, let me go find the right people?
00:27:51
What does that look like?
00:27:54
Speaker 2: The reality is, all of those things matter and you
00:27:56
have to kind of find a way to um find the right blend, so to
00:28:00
speak.
00:28:00
You know, interestingly, in the CEO role in my second
00:28:03
company, you're constantly, uh, typically the problems are things that
00:28:09
land on your desk.
00:28:10
Typically it's hey, something didn't work or this.
00:28:13
This area needs more attention, and so you sort of wake up
00:28:15
every day and you can have an agenda and things that you're
00:28:17
trying to solve for.
00:28:18
But you're kind of also helping to solve problems, and some of
00:28:22
those could be, you know people issues or we need more resources
00:28:24
on this, or, um, gosh, this customer's doing great and
00:28:28
they want to 10x what they're doing.
00:28:29
They're all things that typically the system can't
00:28:33
process on its own, which is sort of fun, exciting, a little
00:28:36
bit randomizing at times, but you are kind of chief problem
00:28:39
solver in a way, for lack of a better term.
00:28:43
For me, the focus has always been on finding great people and
00:28:45
building great culture, and when you have great people and
00:28:48
you have great culture, that creates an environment where all
00:28:52
the other things you talk about can happen.
00:28:53
You can invent new things, you can solve problems, you can
00:28:58
think about scale.
00:28:59
You can be empowered to sort of solve a problem on your own,
00:29:02
you know, sort of on behalf of the company, without needing to
00:29:05
involve, let's say, me or my role in things.
00:29:07
And so, as much as possible I always strive to say let's find
00:29:13
great people who are good at what they're doing, be clear on
00:29:15
sort of what we're trying to achieve together, be really good
00:29:16
at listening.
00:29:17
You know something I'm passionate about.
00:29:18
I wrote a book on the importance of listening, because when sort
00:29:22
of employees in an organization feel like they're heard, like
00:29:25
their opinions matter, then they're often carrying that to
00:29:27
how they engage with customers.
00:29:29
And when customers feel heard, they often feel respected and, by the
00:29:32
way, it can be a great source of ideas and feedback and features
00:29:35
and what to go build next and how to evolve it.
00:29:38
And so one of the things we try to put a big focus on is sort
00:29:41
of that culture of listening at NameTag.
00:29:43
How do we listen to each other, how do we make sure we're open
00:29:45
and respectful to different viewpoints and feedback?
00:29:47
But then we carry that to often how we're engaging with our
00:29:51
customers and frankly, we've been so shaped by that, by
00:29:55
customer feedback, by customer ideas, that it's really helped
00:29:58
us to differentiate in the market.
00:30:01
Speaker 1: Yeah, you know, that's a huge skill set that I feel like everyone in IT should
00:30:07
be learning in the beginning of their career when they're on the
00:30:10
help desk is, you know, listening to understand, not
00:30:15
listening to respond.
00:30:16
And you know it's interesting because all through school you
00:30:21
are listening to respond the entire time.
00:30:23
You know there is no understanding.
00:30:26
It's like you understand the topic and you're listening to
00:30:28
respond based on the information that you know.
00:30:31
But when you flip it and you're on help desk, it's really
00:30:38
important that you understand the problem, that you understand
00:30:42
what's going on, and I always, you know,
00:30:44
go back to this example where, you know, I was working with
00:30:49
different federal government agencies and whatnot, and
00:30:52
they're very cagey.
00:30:53
They don't really tell you a lot of things over the phone,
00:30:56
especially if they've never met you in person.
00:30:59
They're very cagey with you and they always wanted this certain
00:31:03
feature in our product.
00:31:10
But my VP and the developers and the engineer that used to
00:31:12
run the product they had a wrong understanding of why they
00:31:15
actually wanted that feature.
00:31:16
So when I went on site for the very first time, they're asking
00:31:21
about it and I was like, guys, you always ask about it.
00:31:24
Apparently, this is my first time on site, but I've heard
00:31:26
that you bring this up a lot.
00:31:28
Why do you bring it up, um?
00:31:31
And they said, well, don't you have other customers that ask
00:31:33
for this?
00:31:33
And we're like, no, we actually have no other customers
00:31:36
that ask for this feature, and so that's kind of why we just
00:31:40
push it off.
00:31:41
You know, we kind of need to know your reasoning behind it,
00:31:44
and if it provides value, you know, it's an easier sell for me.
00:31:48
And they told me about a time when, you know, a very
00:31:52
legitimate emergency happened and they had no clue where it
00:31:58
took place, and so there was a lot of chaos, because this is
00:32:02
a federal agency in the middle of the mountains in West
00:32:04
Virginia.
00:32:05
It's basically a military base, without it being
00:32:09
a military base.
00:32:10
They don't have outside resources from local fire
00:32:13
departments and they can, you know they can get that help if
00:32:16
they want it.
00:32:17
But it is all on site.
00:32:18
You know, um, and they explained the situation to me
00:32:24
and I said, oh, that's a life or death situation that we didn't
00:32:29
account for.
00:32:32
The product that I was working with at that time was an E911
00:32:37
solution, right?
00:32:38
So when someone dials 911, it gives exact location information
00:32:41
, and this was a situation that we had never worked through
00:32:45
before, and so when I took that back and I brought it up to my
00:32:49
VP, he said, oh, we need to build that in.
00:32:52
And literally two weeks later it was in and I was flying back
00:32:57
out to this client to put it in, because now he understood.
00:33:01
He said, oh, this is a gap, this is a very real gap.
00:33:04
I wish that they would have just told me this two years ago,
00:33:06
several years ago.
00:33:09
I wish that they would have just told me.
00:33:11
But they're so cagey. If you're not cleared, they won't even let
00:33:15
you be on the phone with them, like that's how this
00:33:17
place is, and so, you know, I always took that away as kind of
00:33:24
even bringing that into my security engineering.
00:33:28
You know why?
00:33:29
Why do we want this solution?
00:33:30
What's the problem that we're actually trying to solve?
00:33:34
And before I give anyone consulting advice, I want to
00:33:38
hear the problem what are you trying to solve?
00:33:41
With whatever solution, we'll go from there.
00:33:44
It's a really good skill that everyone really needs to learn.
00:33:51
Speaker 2: That's a really thoughtful story and example.
00:33:54
One of the things I take away from what you shared is the
00:33:56
importance of building trust, because it's very difficult to
00:34:00
have a healthy relationship in any part of life if there's not
00:34:03
trust and, like you described, showing up in person and being
00:34:06
there and showing your genuine curiosity to
00:34:09
understand the problem and just even following up and saying,
00:34:11
hey, look, we made progress on this and did it.
00:34:13
You probably continued to earn a lot of trust.
00:34:15
I'm sure your relationship continued really well from there.
00:34:17
And I come back to, it's very difficult to build trust if
00:34:21
someone doesn't feel respected, and one of the best ways to feel
00:34:24
respected is to listen to them and truly listen and truly be
00:34:27
curious and want to understand what they're saying.
00:34:29
And so I think you're absolutely right and in all
00:34:32
aspects of um, what you described within that flow, even
00:34:35
the fact that you went back to your manager and said, hey, I
00:34:37
have heard this, and they listened to you, is a
00:34:40
really great cultural sign and that's probably why you were
00:34:42
able to adapt and kind of give the customer what they needed
00:34:45
and probably stay ahead of the market.
00:34:47
But it's because you knew that you would be heard and you went
00:34:50
out with that same sense of curiosity to listen to your
00:34:52
customer.
00:34:53
It's a great story.
00:34:55
Speaker 1: Yeah, you know it was interesting too because you
00:34:59
know, before that first trip internally, you know our team
00:35:03
was like, you know, whatever it is, we're not going to do it,
00:35:16
it's not that big of a deal.
00:35:16
You know they had all this stuff like preplanned, and I
00:35:18
came back and they were like, oh, we're doing this immediately
00:35:19
and just book your flight now, because it's going to be done,
00:35:21
you know, by this date.
00:35:21
But it did build a significant amount of trust with me and that
00:35:26
customer because now they were more comfortable with maybe not
00:35:31
sending me log files, but they'll get on the phone and
00:35:33
they'll tell me what the error is, right, like I'll have to
00:35:36
walk them through it, of course, to a nauseating extent of you
00:35:43
know, when you say space, I don't mean type the word space,
00:35:49
I mean hit the space bar, right, like that's the kind of
00:35:52
specification I have to tell these guys that I'm on the phone
00:35:55
with.
00:35:55
And of course the agency does that purposefully because you
00:36:00
know they want someone that if your product is based on Linux,
00:36:04
well, we want someone that's never even seen a Linux terminal
00:36:07
before.
00:36:07
Good luck, it's not on him. If it fails, it's on you.
00:36:12
You know they do that purposefully, but it kind of
00:36:18
that experience kind of took my standards and expectation of
00:36:22
customer service to a totally different level, right, and even
00:36:27
now, even today, right, when I experience like really
00:36:30
poor customer service, it like pains me to such an extreme
00:36:36
extent it's like, come on, guys, you could do so much better and
00:36:38
it would provide such a you know better, more enhanced
00:36:42
experience.
00:36:42
Um, that's just me, right.
00:36:46
Do you take that and do you really run that into the culture
00:36:52
of your company as well?
00:36:54
Because I feel like that is also something that a lot of
00:36:57
security companies miss.
00:36:59
You know, they're used to selling that new thing,
00:37:03
getting the phone call with the right person, getting the right
00:37:05
email, that they kind of forget about the person on the other
00:37:09
end of the line, right, do you also ingrain that into your
00:37:13
employees?
00:37:15
Speaker 2: I think we try, we really try and embody and live
00:37:17
it and it's, for us, been very formative in how our company's
00:37:19
evolved, because we set out, we built this more secure way to
00:37:23
verify who someone was, sort of know the human behind the screen
00:37:26
, and there are a lot of places, frankly, where you could apply that.
00:37:31
There are a lot of places in society that need that right now.
00:37:33
But it was some of our early customers who said, yeah, I love
00:37:35
this, I could use it here and there, but hey, I have this
00:37:38
problem in that I have a large customer base.
00:37:41
You know, I'm actually very public with it recently.
00:37:44
HubSpot, you know, the marketing automation platform,
00:37:47
amazing company, amazing people and their CISO.
00:37:50
You know, Eric said, we want to further protect our customer
00:37:54
accounts.
00:37:54
We're rolling out MFA to protect our customer accounts.
00:37:56
It's great for the customer, but I'm seeing a corresponding
00:37:59
increase in support tickets and they're really expensive and the
00:38:03
customers that get locked down are unhappy and yet we have to
00:38:06
do the right thing and protect their account because if we let
00:38:08
the wrong person into someone's account, that's our credibility
00:38:11
on the line, that's impact to our customers.
00:38:14
And so he said, hey, can we apply this technology you created as
00:38:17
a more secure way to do these sort of MFA resets or account
00:38:20
lockouts?
00:38:21
And we saw that that's actually really clever.
00:38:23
But that was from listening.
00:38:24
That was from listening, that was from building trust and
00:38:25
having the strength of that relationship.
00:38:27
Where Eric felt comfortable bringing that to us, we felt we
00:38:30
were there listening and hungry and eager to learn.
00:38:33
And it only expanded from there because it went from giving
00:38:36
the tool to their help desks, so their help desk agents had
00:38:39
something higher caliber and higher fidelity, to know who's
00:38:44
behind the screen.
00:38:44
But also, actually, now they've integrated it into their product.
00:38:49
So if you're a HubSpot customer and you go and you say, hey,
00:38:50
I'm locked out, I need to reset my MFA, it takes you to a screen
00:38:53
and says, would you like to contact support, this might take
00:38:56
up to 48 hours or would you like to do it immediately and use
00:38:58
NameTag?
00:38:59
And it's super cool and you know people use it and they love
00:39:03
it and, um, it's totally changed the game and them
00:39:06
feeling like they're having a good customer experience with
00:39:08
HubSpot, like HubSpot cares and protects their account.
00:39:10
And, by the way, HubSpot saves a ton of money because they don't
00:39:13
have as many support tickets from all these users who are
00:39:15
locked out.
00:39:16
But that whole application of our technology which frankly was
00:39:20
a little bit ahead of its time in the sense that MGM then
00:39:23
happened and other breaches starting last year in particular
00:39:26
, that got very public targeting the help desk, this concept
00:39:30
that Eric saw that it was a vulnerability and you know Eric
00:39:32
is a proud Okta customer.
00:39:34
Eric is on stage at Oktane and talks about Okta, how great they
00:39:38
are, how he uses Okta, but he recognized this as a clear
00:39:41
vulnerability before others did, and he took smart steps and so
00:39:44
we built a solution that's uniquely targeted to do it.
00:39:47
And then, by the way, it happens to be the epidemic of
00:39:51
the moment where hundreds of companies are being targeted at
00:39:53
the moment exactly this way. People are calling the help desk
00:39:56
, they're taking over customer accounts or they're taking over
00:39:58
employee accounts to gain access , and so it was all because of
00:40:02
that insight and our ability to listen that we're able to
00:40:04
develop a product that, frankly, is what the market now really
00:40:07
needs at this moment.
00:40:13
Speaker 1: So you know, you have a product that is absolutely ahead of the curve.
00:40:15
You know, it beats all of the legacy, uh, solutions that
00:40:21
we had previously applied to this problem, and surely
00:40:24
they're not going to get past this one, right?
00:40:25
Surely they're not going to get past,
00:40:30
you know, device authentication and things like that, right?
00:40:33
Where do you see the threat landscape going in the next five
00:40:39
years with the evolution of AI, and how quickly AI is evolving,
00:40:44
how quickly it's being included into everything that we do now,
00:40:49
it seems, especially with how good deep fakes will become and
00:40:55
things like that.
00:40:56
Are you looking for?
00:40:58
I mean, I don't know if it's even possible, right?
00:41:00
I'm not trying to poke any holes or anything like that, but
00:41:02
maybe there's a deep fake to.
00:41:05
You know, show someone with their ID through the phone and
00:41:09
the camera.
00:41:10
You know, like, maybe there's something like that.
00:41:12
I don't know what's that thought process like.
00:41:16
Are you thinking about that next generation?
00:41:20
Speaker 2: We're constantly thinking about it.
00:41:21
Actually, it's really fun.
00:41:22
It's actually really fun to the degree to which we've seen very
00:41:25
bad actors repeatedly try and get through, and they're testing
00:41:31
us and we're learning from them , and so we do a lot of analysis
00:41:34
on those.
00:41:34
We learn, when we're successful, what did they try?
00:41:37
And often it spurs new ideas for new antifraud techniques
00:41:41
that keep us ahead, and so, you know, there are examples of ones
00:41:43
that we've invented now, two years ago, that we're now seeing
00:41:46
come into market, of people trying and we're thrilled.
00:41:48
We're like, wow, that worked.
00:41:53
But the team feels really successful.
00:41:54
Wow, we stopped something.
00:41:54
You know, we were a couple steps ahead, but I'm a firm
00:41:56
believer that AI alone cannot defeat AI.
00:41:59
Now we have a problem in that bad actors are using AI more
00:42:03
than good actors are to prevent it.
00:42:05
However, I feel there will always be an arms race of AI
00:42:08
versus AI if those are your only tools, and so our thought
00:42:12
process is we need to take broader tools that are proven,
00:42:15
that exist in the market, like cryptography, like biometrics,
00:42:19
like the technology behind mobile devices and AI to defeat
00:42:24
AI, and those are a much stronger arsenal to bring
00:42:27
against sort of an adversary who's trying to use a deep fake.
00:42:30
If you're using, you know, device telemetry, 3D cameras,
00:42:35
the cryptography in modern mobile devices, a whole bunch of
00:42:37
things.
00:42:37
This is not us updating our AI model to detect a deep fake as
00:42:41
fast as someone's making a deep fake.
00:42:42
That will always be an arms race.
00:42:44
Some companies will inch ahead and then they'll inch behind.
00:42:47
We believe you just need to bring more to it, and that's
00:42:50
been sort of the foundation of our approach.
00:42:54
Speaker 1: Yeah, I think that's a really good way of approaching
00:42:56
this problem.
00:42:57
I mean, that's probably the only way that you can
00:43:01
approach this problem and be effective against it.
00:43:04
You know, out of curiosity, have you seen any patterns of
00:43:12
different kinds of attacks against, you know,
00:43:15
identity fraud, right? Like maybe, I don't know, some
00:43:20
group in you know Poland or something like that?
00:43:23
Right, I'm trying to stay away from Russia and China because I
00:43:25
always say them, and now I'm blocked in those countries.
00:43:27
You know, but, like you know, in another region, do you see?
00:43:31
Oh, this is typically used.
00:43:34
This method is typically used from a hacking group in this
00:43:38
region.
00:43:38
Are you able to see that telemetry, that kind of data, or
00:43:44
is it kind of just, you know, spray and pray, almost you know,
00:43:48
everyone kind of uses the same stuff at this point?
00:43:52
Speaker 2: There's definitely a lot we learn from the patterns we
00:43:54
see in fraudulent attempts.
00:43:55
I'd say there is one group in particular that's got a lot of
00:43:58
variety.
00:43:59
They're very public about it, and they're really targeting this
00:44:02
current exploit.
00:44:03
They go by various names. Scattered Spider is probably
00:44:06
the most common, and that's the group, frankly, that impacted
00:44:09
MGM.
00:44:09
So you asked a little bit about Okta earlier and it was an
00:44:13
interesting timeline because in early August Okta came out with
00:44:16
sort of a blog post and said, hey, we're seeing some concern with
00:44:19
customers.
00:44:20
We recommend that you be thoughtful about your account
00:44:22
recovery workflows.
00:44:24
We recommend visual verification.
00:44:26
People sort of made note but it didn't quite ring too many
00:44:30
bells.
00:44:31
And then by late August, MGM was following the new SEC
00:44:35
disclosure rule which is now impacting a lot of companies so
00:44:38
you have to disclose a cyber incident and so they came out
00:44:41
proactively.
00:44:41
The deadline hadn't come yet but they said, getting ready for
00:44:43
this new disclosure law requirement, we're going to
00:44:47
disclose a breach that we had. And that was the breach that, you
00:44:50
know, we've been hinting at.
00:44:50
That was this breach by a bad actor, Scattered Spider, who
00:44:55
claimed they went and they researched an employee basically
00:44:57
on LinkedIn, called the MGM IT help desk and said I am that
00:45:00
employee.
00:45:01
I'm locked out.
00:45:02
It was a 10-minute call.
00:45:03
They got in, they took over credentials and, as you said,
00:45:08
people couldn't check into the MGM hotels.
00:45:10
The whole system sort of went offline for days and days and
00:45:13
then that's only continued.
00:45:14
Then we heard about Caesars Entertainment.
00:45:17
Next we heard about Clorox.
00:45:18
Clorox had something that was like a 24% drop in revenue
00:45:22
because of supply chain disruptions because of this
00:45:24
exact attack vector, and so by some accounts, last Q4 alone
00:45:29
this group has targeted at least 230 large organizations, and
00:45:33
we're seeing it continue at a crazy clip.
00:45:35
Right now they're particularly active in healthcare
00:45:39
and, to a degree, financial services, and sort of layers where they
00:45:43
can get more reach.
00:45:44
So, you know, storage providers and Okta themselves, because if
00:45:50
you can find a way into Okta's support infrastructure, then you
00:45:52
can target companies that are using Okta. They're getting very
00:45:56
sophisticated about going one level deeper as a way to get
00:45:59
into many other customers, and so it's crazy that group at the
00:46:02
moment is sort of running wild.
00:46:04
We don't quite know where they're from.
00:46:05
People thought they were US based.
00:46:07
People were surprised the FBI hadn't kind of cracked down on
00:46:09
them.
00:46:10
The FBI took down their website a couple months ago.
00:46:13
They put it back up.
00:46:14
They made references that now they're really going to go wild,
00:46:17
except in sort of Russia and affiliated Russian states kind
00:46:21
of implying maybe they have some affiliation there.
00:46:22
The truth is we don't really know in a public sense.
00:46:25
What we do know is that they're having significant impact and
00:46:29
they're being very successful because there aren't good
00:46:31
defensive mechanisms in place.
00:46:33
Speaker 1: Oh, yeah.
00:46:36
You know the supply chain attacks.
00:46:40
They're not anything new, but they always kind of up the ante,
00:46:45
right.
00:46:45
There's like these moments in security where it
00:46:52
kind of changes your mentality or your thought process with
00:46:56
what's possible, with what you should actually be looking at.
00:46:59
It's very easy for me to look at my kind of report and see all
00:47:03
these green scores, you know, and focus on, you know, a couple
00:47:07
subpar ones, right, and think that I'm secure.
00:47:11
But when you start talking about supply chain attacks, it's
00:47:16
like, okay, where does this end?
00:47:18
You know, like, how can we limit
00:47:23
this? Because we're a company, we need to buy other
00:47:27
products from other companies.
00:47:29
They have chips in them.
00:47:30
These chips can be compromised.
00:47:32
And then take it a step further: oh, where's the country of
00:47:36
origin, right?
00:47:37
Oh, it's China.
00:47:38
Well, there's a very real chance that there
00:47:42
could be a backdoor in that chip that's coming from China to
00:47:47
give them access to your entire company, right, and it's a
00:47:53
difficult time, I feel, to be a security practitioner because
00:47:59
you know, right now, right, we haven't had a major breach in
00:48:02
you know six months or something like that.
00:48:04
Right, we're all kind of holding our breath and saying like,
00:48:10
okay, when's that next zero-day popping up?
00:48:12
Right, one that's probably already being used in the wild right now
00:48:17
against, you know, very real infrastructure and companies and
00:48:20
things like that, but no one knows about it.
00:48:22
Where is that thing going to come from?
00:48:25
What's it going to do?
00:48:26
You know all these different things and, of course, you know
00:48:31
what comes to mind is these different tool sets, right,
00:48:35
getting released potentially from bad actors within
00:48:38
government agencies that open up a whole other can of worms with
00:48:44
creating zero days and problems and things like that.
00:48:49
So, you know, in a field that is forever evolving and ever
00:48:52
vulnerable to people, you know, I feel like your solution is a
00:49:02
step in the right direction that we really need to take to,
00:49:03
you know, ensure that another MGM doesn't happen or another
00:49:06
Okta doesn't happen.
00:49:09
Speaker 2: The sad thing is it will.
00:49:10
The sad thing is, the leading attack vector at the moment,
00:49:13
the one that leads to ransomware, that leads to data breaches,
00:49:18
is this concept of social engineering attacks at the help
00:49:20
desk, because it is unpatched, so to speak, and there is so much
00:49:24
to worry about.
00:49:25
You're right, as a security leader there are so many
00:49:27
concerns you can have.
00:49:28
This at the moment is kind of the lowest hanging fruit and
00:49:31
it's not a terribly sophisticated attack.
00:49:34
It's not something wildly advanced at the chip level that was
00:49:37
deposited here.
00:49:37
This is just basic social engineering and we've all been
00:49:41
through it.
00:49:41
Because it becomes so obvious if you do your own penetration
00:49:44
test. That's one of my key pieces of advice today to
00:49:47
organizations: do a penetration test of your help desk.
00:49:50
Call and pretend to be locked out and see how it goes.
00:49:52
Chances are it's not going to be that hard to get back in.
00:49:55
And the other interesting thing we've found in this space is
00:49:59
that security really matters, and this is a great security driver, by
00:50:03
the way.
00:50:03
It was actually a problem before deep fakes.
00:50:05
It's actually been a problem ever since you rolled out MFA.
00:50:08
These are the hidden risks and the increasingly hidden costs of MFA.
00:50:11
You can actually go to your IT department as a CISO, go to your
00:50:14
CIO and say I think we can save a bunch of money.
00:50:18
You know, up to half of our support tickets are people who
00:50:20
are locked out of their accounts.
00:50:21
Like, can we just automate that?
00:50:23
That's a huge cost-saving factor.
00:50:26
And then it turns out your employees weren't that happy.
00:50:28
For example, they didn't like having to call the help desk
00:50:30
when they got locked out because they upgraded their phone.
00:50:33
Like, wow. So you can improve the employee experience, you can
00:50:35
save money.
00:50:36
Oh, and, by the way, shut down the leading attack vector.
00:50:38
This is kind of a no-brainer.
00:50:40
And so, while there are so many things on the plate of a CISO
00:50:43
today, for example, or a security leader, I would
00:50:46
strongly advocate you look into this area as kind of one of your
00:50:49
lowest hanging fruit initiatives in the coming year.
00:50:53
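As a rough illustration of that advice, here is a minimal sketch of an MFA-reset gate for the help desk, assuming a hypothetical verify_identity callback; it is not a specific vendor's API. The reset only proceeds once the requester re-proves who they are, so details scraped from LinkedIn are never enough on their own.

    import secrets
    from typing import Callable

    def handle_mfa_reset(employee_id: str,
                         verify_identity: Callable[[str], bool]) -> str:
        # verify_identity stands in for a document-plus-selfie re-verification
        # step (hypothetical placeholder, not a real product API).
        if not verify_identity(employee_id):
            # Knowledge-based answers (manager, birthday, employee number) are
            # exactly what a social engineer can research, so they never count
            # as verification here.
            raise PermissionError("Identity not verified; escalate to security.")
        # Hand back a short-lived re-enrollment code instead of resetting
        # credentials directly over the phone.
        return secrets.token_urlsafe(16)

The same gate doubles as the cost-saving automation described above: when verification succeeds, no help-desk agent needs to be involved at all.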
Speaker 1: Yeah, that's a really good point.
00:50:54
This is absolutely a low-hanging fruit that can
00:50:57
really make a huge difference in more ways than one.
00:51:01
You know, Aaron, I really enjoyed our conversation.
00:51:05
This was a fantastic conversation.
00:51:07
Unfortunately, you know, we're at the end of our time here and
00:51:10
I try to be very cognizant of everyone's time because we're
00:51:13
all so busy.
00:51:14
But before I let you go, how about you tell my audience where
00:51:18
they can find you if they want to reach out, where they can
00:51:20
find your company and all that information that they may be
00:51:24
looking for?
00:51:25
Speaker 2: Yeah, we're super active on LinkedIn in particular,
00:51:27
so check me out, Aaron Painter. We'll have a link, I'm sure, in the show
00:51:32
notes.
00:51:32
Our website is getnametag.com.
00:51:33
We have a ton of content, and
00:51:35
we really try and cover some of these recent breaches.
00:51:37
You can follow along on these kinds of help desk hacks as
00:51:40
they're happening and what we're learning from different
00:51:42
companies, and we have really good explainers, even on things
00:51:45
like injection attacks, on things like deep fakes, trying
00:51:51
to just keep people educated so you can better respond in your
00:51:53
own organization.
00:51:55
Speaker 1: Awesome.
00:51:56
Well, thanks everyone.
00:51:58
I hope you enjoyed this episode .