From the sun-kissed landscapes of Arizona to the digital defenses of cybersecurity, Jackie's career leap is nothing short of remarkable. Her story, rich with the twists of fate that took her from the financial sector to the front lines of IT security, paints a vivid picture of how diverse backgrounds can fortify the cybersecurity industry. Tune in and get inspired by Jackie's journey, from learning the ropes of Python to tackling cyber threats with a finance-savvy perspective.
Ever wondered how cybersecurity experts think outside the box? Our chat with Jackie dives into the welcoming arms of the security community, where at places like DEF CON, unconventional thinking is the secret weapon against cyber adversaries. Discover how varied professional experiences, like Jackie's in finance, are not just useful but essential in crafting ingenious security strategies. Plus, get a glimpse into the personal growth that comes with this territory, as we explore how hobbies like podcasting and improv comedy can sharpen your professional edge.
But it's not all about the code and firewalls; we also grapple with the hefty topics of diversity and ethics in tech development. As AI and hardware evolve, the conversation turns to the crucial role of inclusivity and accountability in crafting tools that serve everyone. Before we wrap up, we give a shout-out to Cribl and tease their innovative approach to data utilization. For an episode that's as enlightening as it is entertaining, don't miss our deep dive into the interplay between cybersecurity, personal evolution, and ethical technology.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Jackie?
00:00:00
It's great to get you on the podcast finally.
00:00:03
I feel like it's been a while, but I'm really excited for our
00:00:07
conversation today.
00:00:09
Speaker 2: Yeah, super happy to be here.
00:00:10
It's beautiful out in Arizona, so it's a nice day.
00:00:14
I'm in a good mood.
00:00:17
Speaker 1: Oh, I know. A good friend of mine moved to Arizona
00:00:21
maybe a year and a half ago and now he always gives us updates
00:00:25
like, oh, it's a little bit chilly today, it's 60 degrees.
00:00:29
You know, like man, you can leave me alone, it's negative 20
00:00:34
where I'm at.
00:00:38
Speaker 2: I grew up in New Hampshire.
00:00:39
For the first 10 years I lived in New Hampshire I lived in a
00:00:42
little town called Berlin, which is about 45 minutes from Canada.
00:00:45
So I have definitely walked to school in like negative 10, negative 15
00:00:50
a few times a week before, so I empathize with you.
00:00:53
Speaker 1: Yeah, when I was in college actually, you know, I
00:01:00
worked for the police department , the campus police department,
00:01:04
and it was like negative 40.
00:01:06
And I was one of like five or 10 people that had to actually
00:01:11
go in as, like, essential personnel, which was crazy,
00:01:15
because they had a student be essential personnel, and the
00:01:20
police chief actually sent a squad car to my dorm to pick me
00:01:24
up because he said that it was too unsafe for me to walk a
00:01:27
quarter of a mile.
00:01:29
Speaker 2: Yeah, yeah.
00:01:30
I believe that the thing that finally broke me was we had an
00:01:34
ice storm in 2008 in New Hampshire that left an inch and
00:01:37
a half of ice on everything in New Hampshire and I didn't have
00:01:41
power for six or seven days.
00:01:43
I lived in Manchester.
00:01:45
I didn't live in like nowhere in New Hampshire.
00:01:46
I lived in the biggest city in New Hampshire and I was just
00:01:50
like, this place is not habitable.
00:01:51
Nobody should live here, and I literally just sold all of my
00:01:54
stuff and moved to California.
00:01:56
I'd never been there.
00:01:57
I didn't know anything about it.
00:01:58
I found a roommate on a forum for motorcycles (I ride motorcycles),
00:02:03
and on my way there I was just like, I'm never living
00:02:06
anywhere that does this again.
00:02:08
Speaker 1: Wow, that is so crazy. An inch and a half of ice.
00:02:15
Speaker 2: It was insane.
00:02:16
I've never seen anything like that before.
00:02:18
It was so crazy.
00:02:21
Speaker 1: Yeah, I would move too at that point.
00:02:24
I would move too, my gosh. But it's beautiful in Arizona today.
00:02:30
So I'm happy.
00:02:32
Yeah, it's very tempting to go to Arizona.
00:02:35
My wife and I, we like colder weather. I wouldn't say insanely
00:02:44
cold weather, but we prefer a good variation in the seasons
00:02:51
and whatnot.
00:02:51
My buddy was telling me that Flagstaff, Arizona is a good mix
00:02:57
of the seasons, and so now I'm on the mission of convincing my
00:03:02
wife.
00:03:02
Hey, we should go check out Flagstaff, Arizona and see how it
00:03:06
is.
00:03:07
Now I'm on that 10-year mission.
00:03:11
Speaker 2: I can see snow from where I'm standing, so there's a
00:03:13
mountain outside.
00:03:15
I live in Tucson.
00:03:16
There's a mountain just outside Tucson called Mount Lemmon.
00:03:18
It actually is a ski area.
00:03:19
You can ski for a couple months out of the year and I can see
00:03:24
it from here.
00:03:24
It's less than an hour from my house to the mountain, so you
00:03:29
can get to the cold there if you really want to, you just don't
00:03:32
have to live in it.
00:03:34
I hate scraping my windshield.
00:03:36
It's one of those stupid things where it's such a
00:03:40
small thing to be mad about, but it's just one of those
00:03:43
things where it's like 7:15 in the morning and you're exhausted
00:03:47
and you only had a cup of coffee and you're trying to get
00:03:49
to work and you're just like I hate this, I hate everything.
00:03:52
But then again, I've never burned my ass on a frozen
00:03:56
windshield. In Arizona, for
00:03:59
half the year I literally have to keep a towel on my car seat so
00:04:02
that I don't get second-degree burns on my butt.
00:04:05
Speaker 1: So there are trade-offs to everything.
00:04:07
Yeah, yeah, absolutely.
00:04:10
Well, Jackie, I'm really interested in hearing about how
00:04:15
you got into IT, what made you want to go into IT overall, and
00:04:20
then what made you want to make, I guess, a slight
00:04:24
switch into security and focus more on security.
00:04:29
Speaker 2: Yeah, so I had a really weird, like meandering
00:04:32
career.
00:04:32
I've dropped out of college three or four times at least.
00:04:36
Speaker 1: Oh geez.
00:04:37
Speaker 2: Yeah, so I started with psychology and dropped out
00:04:41
because I was too poor to afford tuition.
00:04:44
And I actually became a stockbroker when I was 20.
00:04:46
So I went through an interview and got hired at Fidelity.
00:04:50
I worked as a stock broker through the financial crisis,
00:04:53
moved over to SVB and managed cash for companies, and so it
00:04:59
was like I had kids and I was doing the whole like
00:05:03
quarter-life crisis thing everybody does and like doing
00:05:06
psychedelics and, like, questioning my place on the
00:05:09
planet.
00:05:09
And I got an economics and finance degree because I was
00:05:13
working in finance and I really wanted to learn how to write
00:05:16
Python.
00:05:16
So I was like, well, what's the easiest way to learn how to
00:05:19
write Python?
00:05:19
I was like, well, I already understand economics and
00:05:22
statistics, so maybe data science would be a good idea.
00:05:25
It seems to be an up and coming field right now.
00:05:28
This is like 2017.
00:05:31
It was like you know, also, it would be applying Python to
00:05:35
something I already understand.
00:05:36
So I did a data science boot camp and I pushed my final
00:05:40
project to my GitHub, and a guy found me and said, hey, I
00:05:45
am the data science person at this SIEM startup and we're
00:05:49
looking for somebody to write algorithms for like anomaly
00:05:52
detection and user behavior analytics, are you interested?
00:05:56
And I'd never really thought about working in cyber security,
00:06:00
which is weird because I was always into computers; my mom used to drop
00:06:04
me off at Radio Shack when she grocery shopped.
00:06:06
So I've been using a computer since I was like three.
00:06:10
And, I don't know, it's so strange, because I
00:06:12
actually grew up in the military on a couple of intelligence
00:06:15
bases, so I've always been kind of weirdly adjacent to security
00:06:19
, those types of things.
00:06:21
That's how I initially got into it and then I think, personally
00:06:27
, one of the things I love about security is, well, the gender
00:06:33
disparity isn't a good thing.
00:06:34
The bathrooms are always clean for me and there's never a line,
00:06:37
so like that's a good.
00:06:38
No, no, I find more than anything, more than any industry
00:06:42
I've worked in, security is very much a meritocracy, right,
00:06:46
it is a hundred percent a meritocracy, and it's a really
00:06:49
interesting industry in that people don't actually need to
00:06:52
like you, they just need to trust you.
00:06:53
And so you have all of these crazy like neurodivergent, like
00:06:57
not necessarily super socially adept people.
00:07:00
But I loved it. It just felt like home as soon as I went to
00:07:05
my first, like, DEF CON.
00:07:06
You know my first DEF CON was mind blowing.
00:07:08
It was like, oh my God, these are my people Like, and it's
00:07:12
that everybody kind of wants to solve the same problems and I've
00:07:17
always had problems in regular industries where people think
00:07:21
I'm crazy because I don't think the way that neurotypical people
00:07:24
think, whereas in security people are like, oh, your brain
00:07:27
works differently than mine.
00:07:29
Come help me with this problem because of the way your
00:07:31
brain works. So it just seems to be an industry that's
00:07:34
significantly more open to, like, everybody bringing their own
00:07:39
special talents.
00:07:39
Yeah, and so I ended up running the remote elementary
00:07:46
school in my living room during the pandemic, because I have
00:07:48
three kids.
00:07:49
And then after that I was like you know how do I combine
00:07:52
cybersecurity with my finance background?
00:07:55
And I ended up becoming an industry analyst.
00:07:57
So I covered, like, SIEMs, XDR, all those kinds of
00:08:01
analytical platforms for S&P slash 451 Research, and
00:08:06
that was how Cribl found me:
00:08:08
they pitched the company to me and I was like, I love this.
00:08:12
So I hate regex.
00:08:13
It's like the bane of my existence, because when I became a data
00:08:16
scientist nobody told me that before you can write an
00:08:19
algorithm, you have to get this, like, beast of a syslog file
00:08:23
into something that you can actually use, and there were no
00:08:26
good tools for doing that at the time.
00:08:28
So I literally spent like weeks on regex when I was a data
00:08:31
scientist.
00:08:31
So when I saw Cribl I was like this is amazing.
00:08:34
But at the time they were kind of calling it an observability
00:08:37
pipeline, which is what it's great for, but I was like, I
00:08:41
don't know, I'm sorry, but it really needs a look from security.
00:08:46
The thing is, observability people don't know exactly how
00:08:49
important their jobs are to security,
00:08:51
but security people are trying to do observability.
00:08:56
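[Editor's note: for readers curious about the log wrangling Jackie describes, here is a minimal Python sketch of using a regex to pull structured fields out of raw syslog-style lines before any analysis can happen. The pattern, field names, and sample line are purely illustrative assumptions, not anything from the episode or from Cribl's products.]

```python
import re

# Illustrative pattern for classic BSD-style syslog lines such as:
#   "Feb 28 14:02:07 host1 sshd[4721]: Failed password for root"
SYSLOG_RE = re.compile(
    r"(?P<ts>\w{3}\s+\d{1,2}\s[\d:]{8})\s"          # timestamp, e.g. "Feb 28 14:02:07"
    r"(?P<host>\S+)\s"                               # hostname
    r"(?P<proc>[\w\-/]+)(?:\[(?P<pid>\d+)\])?:\s"    # process name with optional [pid]
    r"(?P<msg>.*)"                                   # free-text message
)

def parse_line(line):
    """Return a dict of named fields, or None if the line doesn't match."""
    m = SYSLOG_RE.match(line)
    return m.groupdict() if m else None

sample = "Feb 28 14:02:07 host1 sshd[4721]: Failed password for root"
fields = parse_line(sample)
print(fields["proc"], fields["host"])  # sshd host1
```

In practice each device or application emits a slightly different shape, which is why hand-maintaining patterns like this across many sources becomes the weeks-long slog described above, and why pipeline tooling that normalizes logs before analysis is appealing.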
Speaker 1: Isn't that like right?
00:08:58
Speaker 2: Yeah. So I know it was a very
00:09:00
long-winded story, but I think it's important because I have
00:09:03
not met many people in security who came here directly, right,
00:09:07
like, we all have these like weird backgrounds and like the
00:09:09
diversity of background is usually what makes you good in
00:09:13
security, because when you're dealing with, say, like, a
00:09:16
financial services client, you need to have some background in
00:09:20
that to really understand the nature of the threats that
00:09:22
you're dealing with.
00:09:22
Speaker 1: So, yeah, yeah, it's a really good point.
00:09:26
You know I have a lot of people reaching out to me, you know,
00:09:30
constantly asking me you know how do I get into security and
00:09:34
that sort of thing, right, and they kind of want that you know,
00:09:37
12-to-18-month path into security.
00:09:40
And you know I always tell them you don't want that path
00:09:45
because security is so stressful and it requires so much context
00:09:49
outside of security that you just simply wouldn't get.
00:09:54
You know, if you went straight into security, yeah, you know,
00:10:01
like you said, right, you kind of have to know how the
00:10:05
financial industry works, the kinds of systems that they have
00:10:08
in place, the methods and all that to really, you know, kind
00:10:14
of understand like, oh okay, we're going to do security this
00:10:17
way because we have this huge compliance standard with NYDFS
00:10:22
that just came out, that you know they're going after
00:10:27
companies for it, right?
00:10:28
So, like this is a hot priority item.
00:10:31
You know we need to get through it like this rather than this
00:10:34
other, you know, industry recognized method that we've
00:10:37
done forever.
00:10:38
You know.
00:10:39
Speaker 2: Yeah.
00:10:40
Yeah, I think my job is to be able to actually understand underwriting.
00:10:42
I understand the statistics that an actuary uses.
00:10:46
You know and I can form opinions about this is the way
00:10:49
the market is going.
00:10:50
You know right now we're underwriting enterprise value.
00:10:52
I actually think we're going to go to a model where we
00:10:54
underwrite the value of the actual data that's at risk,
00:10:59
and part of that is bringing that background.
00:11:00
And I agree. Like I always tell people, the
00:11:05
non-security background that you bring is what's really important,
00:11:07
and I think we in the security profession need to do a lot better
00:11:10
job of building those bridges.
00:11:12
Because I agree with you, I hate the like.
00:11:14
Here's your 18-month blueprint into a tier one analyst role.
00:11:18
That's going to suck your soul out of you, right?
00:11:21
Because, like, that path is not going to get you into a really
00:11:25
cool, interesting security gig.
00:11:27
It's going to get you into a tier one SOC role, which is not
00:11:31
the best way to start out in security.
00:11:33
A lot of times those are kind of burn-and-churn roles.
00:11:36
They're really stressful.
00:11:38
Yeah, like, you can be like,
00:11:41
okay, well, I come from, you know, a manufacturing background,
00:11:46
right, but I really understand the physical security aspect of
00:11:50
manufacturing.
00:11:50
That's an entire, like, subset of security that is actually
00:11:55
desperately in need of modernization.
00:11:57
So go into it that way, you know, develop opinions on it.
00:12:02
I think a lot of times people are afraid to have
00:12:04
an opinion, but that's gotten me most of the best jobs I've ever
00:12:07
had, right Is having like a contrary opinion on something.
00:12:12
Speaker 1: Yeah, I feel like that's a that's a pretty common,
00:12:16
you know, attribute of security professionals is having an
00:12:21
opinion on something that isn't isn't the norm, isn't the
00:12:25
expected, you know opinion and you know I.
00:12:29
I kind of go back to like when my wife and I we were building
00:12:34
our house and you know figuring out where we wanted the rooms
00:12:37
and everything like that, right, she wanted to have a more open,
00:12:41
you know, floor plan and I'm thinking to myself, well, like
00:12:45
that makes things too easy, you know, for for a potential
00:12:48
attacker, right.
00:12:49
Like I'm thinking from a physical security perspective
00:12:52
and I'm thinking like you know, oh, I don't want a wall here
00:12:55
because I want to put a camera there so I can have a wider
00:12:57
range of view, right? Like, all of these things.
00:13:00
And you know, when we're finally in the house, she's like oh
00:13:03
well, I like you know, no, like no shades on the on the window
00:13:08
because it blocks the natural sunlight from coming in, and
00:13:11
whatnot.
00:13:11
Like one.
00:13:12
We live in Chicago, so we get natural sunlight like four
00:13:15
months of the year.
00:13:16
And two, you know, we're just opening our windows to attackers.
00:13:20
And she's like, where do we live?
00:13:21
Like we live in a place that has literally zero crime?
00:13:25
Yeah, who are we protecting from you know?
00:13:28
But like, my mind works totally different.
00:13:30
My mind is like like no, you know, worst case scenario, we
00:13:35
already expect them to be here, you know that sort of thing.
00:13:37
And she has to like dial me back.
00:13:40
Speaker 2: Yeah, it's interesting to think about, like,
00:13:43
how we think about security, because, like what you're
00:13:46
talking about, some of that is actual security
00:13:50
and some of it is security theater. And it's really
00:13:52
interesting to think about it in, like, a post-9/11 world, right?
00:13:55
Like, a lot of people under...
00:13:57
It's so crazy.
00:13:58
I think people under 25 have never known a world without
00:14:00
security theater, I mean you didn't use to have a lot of that
00:14:04
, and so I think some of that is like there are things you do
00:14:07
because they actually make you more secure, and then there are
00:14:09
things you do because they present the illusion of security
00:14:14
right. Like, TSA is security theater, because, realistically,
00:14:18
if someone wants to run straight through those
00:14:21
checkpoints with something, they can, and, you know,
00:14:25
it's interesting to think about that because we're in the
00:14:27
same position. And, like, I shouldn't say that, but I might not
00:14:32
lock my doors at night.
00:14:33
Yes, yeah.
00:14:35
Because, like, I've had
00:14:36
my car broken into.
00:14:37
So, for example, I used to have a convertible, right?
00:14:40
I'd had my convertible three weeks in South Salem when
00:14:43
somebody sliced the top and
00:14:46
broke into it, and it's like two grand for a new convertible
00:14:49
top.
00:14:50
So you know what I started doing?
00:14:51
I just didn't lock my doors.
00:14:53
I just left my car
00:14:54
unlocked all the time, but kept anything really important in the
00:14:57
trunk, I figured, you know. And the top never got
00:15:00
cut again.
00:15:01
It's interesting to think about, like, what things are actually
00:15:05
secure and what things aren't. And I was reading a
00:15:09
book the other day about, also, like, our perception of security
00:15:13
and how much more dangerous the world's become, when actually,
00:15:16
statistically, the world's become quite a bit safer, you know,
00:15:20
and the things that are more dangerous than they used
00:15:22
to be are things like our diets.
00:15:25
Speaker 1: Yeah, yeah.
00:15:27
Speaker 2: So it's like, I don't know why we don't let our kids walk to
00:15:29
school.
00:15:30
Yeah, there's this perception that, like, there's all these
00:15:34
people out there who are going to snatch your kids, but,
00:15:36
realistically, kidnappings are down by more than 50%
00:15:40
from 25 years ago.
00:15:41
So, yeah. But I mean, it's the same thing in cybersecurity.
00:15:46
Right, it's like, people spend billions of dollars
00:15:51
on all this AI and sophisticated detection, and it's literally
00:15:55
just some dude in your mailroom that clicks on the wrong link
00:15:58
that gets you.
00:15:58
Or your HVAC system.
00:16:01
You're using the default password for the system that
00:16:03
operates all of your air, and somebody gets in.
00:16:06
It's like, you can spend all the time in the world, and you're
00:16:09
spending all this money on stuff, but at the end of the day, it's
00:16:12
usually the little things that are going to screw
00:16:16
you over.
00:16:17
Speaker 1: Yeah, you know, that reminds me of, like, the Okta
00:16:22
breach that recently, apparently, happened, right, where,
00:16:25
you know, someone just dialed into support, or, you know,
00:16:29
whatever it was, the help desk, and they got access via that.
00:16:33
And you know, octa was infamous for having top notch security
00:16:41
never really dealt with a breach like this before you know, or
00:16:45
anything like that.
00:16:46
And I think that they I think they handled it fairly well
00:16:51
right, because I felt like I was getting the information, like I
00:16:56
felt like I was getting the updates as they were getting
00:16:58
them.
00:16:58
You know, like, oh, we just found out 100% was breached.
00:17:03
Yeah, Sorry, yeah, you know, we just learned of it.
00:17:06
You know, not like the whatever breach that was with
00:17:11
LastPass, right, that, like, yes, really frustrated me.
00:17:15
Yes, where it's like oh, you know, they don't have anything.
00:17:18
They got in, but they didn't get anything.
00:17:20
Oh well, I got some stuff.
00:17:23
You know, some of the stuff is unencrypted somewhere.
00:17:25
Right, they got some stuff, but you're fine.
00:17:27
Oh, it turns out they got everything and your master
00:17:30
password to your, you know, to your vault.
00:17:33
Like guys, you should have told me this six months ago.
00:17:37
Speaker 2: Yep, yeah. How you handle a data breach matters a lot.
00:17:40
Like, obviously, the kind of data that's breached is really
00:17:43
important, but also how you handle it. Like, I think about going
00:17:46
through the 23andMe breach, and I think that's kind of on
00:17:50
the polar opposite end, where I haven't really heard a whole lot
00:17:53
from them. And, like, you know, to your point,
00:17:56
like you said with the LastPass one,
00:17:57
it's like, I haven't heard anything from them.
00:18:00
The only thing I've seen in the press from them is that
00:18:02
they still think they can be profitable as a company, and I'm
00:18:05
like, how do you know if you can be profitable?
00:18:07
I'm worried about my DNA.
00:18:10
Speaker 1: Right yeah.
00:18:13
Speaker 2: You know, so, when I was an industry
00:18:16
analyst a couple of years ago, I wrote a
00:18:19
paper on zero trust, and that, like, with zero trust you're still
00:18:23
creating a single point of failure.
00:18:25
And it's these push-pull forces of
00:18:31
convenience and security.
00:18:32
Right, everybody wants to be super secure, but also we can't
00:18:36
inconvenience people, because if you inconvenience people with
00:18:39
security measures, they'll find ways to circumvent them.
00:18:41
So it's this constant battle. And, like, Okta seems like a great
00:18:46
idea, right, because it's like, oh, it's all encrypted, but again,
00:18:50
single point of failure.
00:18:51
And so you know it's.
00:18:54
I see these things, and I think of this most
00:18:59
recent kind of takedown of LockBit.
00:19:00
If you've read through any of the documents about it,
00:19:04
they're basically like, hey, the US has single points of
00:19:07
failure all over its infrastructure.
00:19:09
The AT&T outage the other day really kind of drove that home
00:19:13
for me, you know, because it wasn't just AT&T; if it's
00:19:17
an AT&T downlink satellite area, it hits all cell providers.
00:19:22
People don't realize that, like, cell providers don't each have
00:19:25
their own towers; they kind of use each other's. And yeah.
00:19:29
So it's a really interesting thing to think about, how we
00:19:33
go about managing that kind of trade-off.
00:19:37
And to me I've always said you know, security is really a
00:19:39
culture, and so I think what we need to focus a lot more on is
00:19:44
how to just build security practices into your culture at
00:19:48
your company.
00:19:48
Because I can talk smack about them now, because they're
00:19:52
gone.
00:19:52
They can't sue me. But I went from Fidelity to SVB, and
00:19:58
Fidelity is one of the most conservative financial companies
00:20:00
that exists.
00:20:01
Right, they're Boston-based, they're super conservative.
00:20:04
When you go there, like, your first day there,
00:20:05
they're like, hey, FYI, compliance is your best friend.
00:20:09
Like they are here to save your ass, they are not here to ruin
00:20:13
your day and not here to make your job harder.
00:20:16
And at SVB it was kind of like, compliance, eh.
00:20:17
They just had these two people running all of compliance, and
00:20:22
when I got there, like, lots of the stuff they were doing, I was
00:20:23
like, I had a supervisory license and my supervisor didn't, and
00:20:27
so I was like, we can't do most of this stuff.
00:20:29
But it was like a check-the-box thing there.
00:20:30
And security is the same way.
00:20:34
Security and compliance can go hand in hand, right, and it has to be
00:20:37
a culture, because if it's not just baked into everything
00:20:41
everybody at your company does and if they don't
00:20:45
unquestioningly trust security to have their back, to ask
00:20:49
stupid questions, to be able to send over suspicious emails first.
00:20:51
Because a lot of times I think people make bad
00:20:54
decisions because they're scared to ask questions.
00:20:56
They don't want to look stupid or admit that they don't know
00:20:58
whether it's safe to click on an email or not, right, and so
00:21:01
maybe what we do look at a lot more is like how do we make
00:21:04
security more accessible to non-technical people and how we
00:21:07
just bake it into the corporate culture; most of the time
00:21:11
that's what we need to focus more on.
00:21:13
And this is, I have this argument with
00:21:19
people a lot: if having security policies in place prevents you
00:21:26
from doing your job effectively, that's probably a procedure
00:21:29
issue, not a policy issue.
00:21:30
Right, like, if the policy is really prohibitive, change the
00:21:33
policy, but usually it's the way the policy is being implemented
00:21:36
that people have a problem with.
00:21:38
What I'm really focused on is how we separate policy from
00:21:42
procedure and acknowledge that, yes, some of these things might
00:21:46
add some more work, but we can really optimize the procedure by
00:21:49
which we do that, so that the policy is not prohibitive to your
00:21:52
day-to-day work.
00:21:53
If that makes sense. And, like, we're not design
00:21:56
architects, we're security people, so we don't think about these
00:21:58
things, but they have to start kind of going together, the same
00:22:01
way you would design UI products, because it's not just
00:22:05
technical people that get hacked, right?
00:22:11
Speaker 1: It's an interesting balance that, well, one, you
00:22:14
probably wouldn't get if you went directly into security,
00:22:17
right.
00:22:17
So it kind of circles back to that, yeah. But, you know,
00:22:21
that's...
00:22:22
It's a balance that actually I'm having to deal with right
00:22:26
now, right, where I'm trying to deploy and enhance security
00:22:29
controls, and all in line with security policies that my
00:22:32
architects have created, but at the same time, I need to not
00:22:38
create something so restrictive or enforce something so
00:22:43
restrictive within these applications that my devs can't
00:22:45
do their work. And so
00:22:48
I have to actually work, you know, very closely with the
00:22:51
business, with people that are much smarter than me in,
00:22:57
you know, other areas, right? You know, like, there's compliance,
00:23:01
you know, basically everyone.
00:23:04
So just to make sure that the organization is not just secure
00:23:07
but that everyone in the org can do their job as they expect to
00:23:11
do it, you know, and that they've been doing it that way,
00:23:15
and so it's a challenging balance, for sure.
00:23:19
Speaker 2: Yeah, do you find it also challenging to have to tie
00:23:22
what you're doing to broader corporate initiatives to
00:23:26
keep yourself relevant?
00:23:28
Speaker 1: Yeah, that's.
00:23:29
You know that.
00:23:30
That's like, um, I guess in my most recent role, you know,
00:23:35
that's been a more of a focus right Of of me taking more and
00:23:40
more ownership of.
00:23:41
I'm basically a manager or director without the title,
00:23:45
right, like the title is engineer.
00:23:47
But all the stuff I'm doing, like my manager even says, is
00:23:51
like, yeah, all the stuff you're doing is, you know, director
00:23:53
level role stuff, right, like I'm managing my budget, I'm, you
00:23:58
know, putting out, you know company wide notifications and
00:24:01
things like that, right, all the people I'm communicating with.
00:24:04
And it's a learning curve for sure, that is.
00:24:08
I mean I just spent like the last four, five months trying to
00:24:12
figure it out.
00:24:13
Speaker 2: Yeah, yeah, and I think that's kind of where
00:24:17
you are.
00:24:17
Everybody, I think, who rises in their career goes through
00:24:20
this process, where all of a sudden you realize that, like, you
00:24:23
have to think like the CEO, even if you're working in security,
00:24:27
because if you want to get something funded, if you want to
00:24:30
get people to pay attention to it, if you want you know, if you
00:24:33
want it to be more than just your pet project, like it has to
00:24:36
tie into these kind of broader company initiatives.
00:24:39
And so I was actually talking with one of my friends about
00:24:42
setting up like a CISO training thing at one of our corporate
00:24:46
events that we're doing, and I said you know, I think you
00:24:48
should do improv comedy, like do an improv comedy class, because
00:24:52
one of the things I find is that people in security are
00:24:55
really not that comfortable with public speaking. And so, like, you not
00:24:58
only have to be able to understand what you're doing as
00:25:01
a security leader, you have to be able to articulate it to
00:25:05
vastly different audiences.
00:25:07
Right, the way you explain what you're doing to the CFO is
00:25:09
different than the CEO, is different than the CTO.
00:25:11
And then you also have to be able to get up in front of
00:25:14
people who are going to pepper you with questions, and you'll have to
00:25:17
answer those questions.
00:25:18
And I like I don't think that's necessarily something that
00:25:21
people anticipate when they go into security that when you get
00:25:25
to a certain point in your career, it
00:25:27
almost seems like all of a sudden you have to become a
00:25:29
significantly more robust professional
00:25:32
than you were when you were just doing, like, detection and response.
00:25:36
Speaker 1: Yeah, yeah, it's a really good point.
00:25:39
You know, I always talk about, or I try to on this podcast,
00:25:44
talk about the things that kind of separate you right from from
00:25:48
other people.
00:25:48
And the reason why I do that is because those separation, those
00:25:53
I guess, separation points, you know, make you stand out
00:25:56
more.
00:25:57
And when you stand out more, hopefully it's in a good way.
00:26:00
You know you get promoted, you get the opportunities that
00:26:03
others don't get, and that, you know, your approach is different
00:26:08
from others'.
00:26:08
You know, let's do improv.
00:26:10
To me, improv would be very scary.
00:26:12
I think I'm a funny person but I'm not improv funny, you know.
00:26:17
That would be terrifying.
00:26:21
Speaker 2: But anyway, just imagine a bunch of people, like, in the
00:26:24
fetal position crying on stage.
00:26:26
Speaker 1: Right, right, but you know what, one of the things you said.
00:26:32
You know they struggle with public speaking, right, or
00:26:36
speaking to other people that they don't know, or whatnot.
00:26:38
Right?
00:26:39
And this was also an issue for me, you know, several years ago,
00:26:44
before I started this podcast, and somehow I got this idea to
00:26:47
start a podcast and I couldn't get it out of my head.
00:26:49
So here I am, right, like over 150 episodes in.
00:26:53
And you know you came on here, there was no prep.
00:26:57
It was like, hey, this person's name is Joe, he runs this
00:27:01
podcast, you know.
00:27:02
And then like the same thing for me, like this person's name
00:27:05
is Jackie, she's from Cribl, this is what they do, there's no
00:27:10
questions, you know anything like this.
00:27:12
If you go back five years ago, the thought of this conversation
00:27:16
taking place would have given me a lot of anxiety, but now you
00:27:20
know it's nothing right, like we're just having a conversation
00:27:24
.
00:27:25
Speaker 2: No, it is a muscle, and that's, like...
00:27:27
So improv was terrifying for me too, like when I first did it.
00:27:30
The number one rule of improv is "yes, and", which is basically
00:27:35
no matter what the person before you says, you have to agree
00:27:39
with it and add to it.
00:27:41
So, and it's actually a really good lesson for how you should
00:27:44
live your life, because I do some crazy stuff in my life,
00:27:47
like I love music festivals, traveling, and I've done all
00:27:50
kinds of crazy stuff on purpose and accidentally, because when
00:27:53
somebody's like, hey, do you want to do this thing, I'm like,
00:27:56
yes, and we should also do this, which would make it even more
00:27:59
epic, right? And so that's kind of how I am.
00:28:03
So at Cribl,
00:28:04
we have, my team is very small, but we're the people who make content
00:28:07
.
00:28:07
We call our own team the "eff it, we'll do it live" team, because, same thing,
00:28:12
like, almost all the live streams
00:28:13
we do, like, I'm usually finishing the slides for
00:28:16
whatever thing we're about to do as we're starting the
00:28:19
introduction on the recording.
00:28:21
But I think to your point, it's a muscle and it's a muscle you
00:28:24
have to exercise.
00:28:26
And the other thing, like the thing to figure out, is that we
00:28:28
all, we all have this, like, crippling fear of failure and
00:28:33
I guess in my career I've been really fortunate that I have
00:28:36
screwed up so badly, so publicly , a few times that I have failed
00:28:40
in the most epic ways you can imagine.
00:28:42
And it turns out that, like, none of your family stops loving
00:28:46
you, none of your friends stop hanging out with you.
00:28:49
None of them think you're, like, a worthless
00:28:53
person.
00:28:54
So you know, you fall on your face a couple of times and
00:28:57
you're like, oh, it's not that bad.
00:28:58
You know this podcast wasn't the best one I ever recorded,
00:29:00
but maybe next week's will be better.
00:29:02
You know, like everybody in your life is, like nobody cares,
00:29:06
and that's the kind of thing that I figured out is like the
00:29:10
work you do is extremely important, right, having this
00:29:13
podcast, having a resource of people who really need it, is
00:29:15
both extremely important and extremely unimportant at the same
00:29:18
time.
00:29:19
So once you kind of figure that out,
00:29:22
it makes life a lot easier.
00:29:25
Speaker 1: Right, yeah, you bring up a lot of really good
00:29:28
points there.
00:29:29
You know, with having you know, I feel like it's so important
00:29:35
to have I don't want to call it a safe space, but you need to
00:29:39
have a space where you can fail constantly in tech.
00:29:42
You know, like one of my first jobs out of college, I mean I
00:29:48
dropped a bank's database, like I didn't even know the term drop
00:29:54
right, like I accidentally deleted this customer's database
00:29:57
and they were a bank, and then I spent the next you know two
00:30:02
days, right, fixing it and restoring it from logs that I
00:30:06
didn't even know could be restored from, and you know all
00:30:09
this sort of stuff.
00:30:10
They didn't lose any data, they had no downtime, right, but I
00:30:14
still dropped their production database and you know in a lot
00:30:18
of companies on a lot of teams.
00:30:20
That's like immediate termination.
00:30:21
Like, okay, you don't know what the hell you're doing, like,
00:30:23
get out of here.
00:30:24
You know.
00:30:25
But I was also very open in the interview.
00:30:27
Like, hey, I don't know what the hell I'm doing.
00:30:28
Yeah, like, I need to be taught , you know, and that was a huge
00:30:33
learning moment.
00:30:34
But like, and that's just one of like a hundred, you know
00:30:40
situations that I was in at that company alone, right, and so I
00:30:44
developed, you know, the greatest troubleshooting doc
00:30:48
ever at that company.
00:30:50
It's literally still in use.
00:30:52
Where you know, when someone encounters a random problem,
00:30:56
they just go look at my doc because I guarantee you I've
00:30:59
encountered it, and there's a whole section on SELinux, and
00:31:03
before I encountered SELinux there, I never touched it, I
00:31:06
didn't know it existed, literally.
00:31:09
One of my customers was a federal agency and he said, hey,
00:31:11
we need to turn on SELinux, you know, on this server.
00:31:14
And I was like, okay, turn it on.
00:31:16
What's the problem?
00:31:17
He goes, no, it breaks everything.
00:31:19
I was like, whoa, that's weird, you know.
00:31:22
And then I would.
00:31:22
That was a rabbit hole for three months of knowing way more about
00:31:26
SELinux than I ever wanted to. You know, it's, it's really
00:31:32
interesting.
00:31:33
And then you know you bring up the the yes and perspective from
00:31:37
improv and I actually do that with like all of my trips.
00:31:40
You know that I'm planning a trip to London for my first time
00:31:44
in the fall and I'm going with a friend.
00:31:50
I'm bringing my wife and my one year old, and you know he comes
00:31:56
from a different background, I guess, of doing trips right,
00:31:58
where they kind of plan everything around food and you
00:32:01
know everything else kind of like falls into place.
00:32:03
I guess I am the complete opposite.
00:32:06
I am like like no, like we're going on this bus tour, we're
00:32:10
going to get off, we're going to go have drinks here, like all
00:32:14
of it.
00:32:14
You know, because when I go somewhere new it's like well,
00:32:17
let's do everything, like I'm not here to sleep, like if
00:32:21
they're open at 4am, like let's go at 4am, Like I do not care.
00:32:25
You know, yeah.
00:32:26
It's the same thing I did with my Germany trip last year was,
00:32:29
you know, every day was another adventure, like one day we were
00:32:33
in the mountains going through castles for the entire day,
00:32:36
walked an entire marathon.
00:32:37
I was dead tired at the end of it.
00:32:41
And you know, the next day was a football game.
00:32:44
Right, like, went to the football game, did a full day of
00:32:47
drinking.
00:32:47
You know, I got to see my buddy not keep up.
00:32:50
That was fantastic, you know, like the whole thing, you know
00:32:55
it's.
00:32:55
It's that ability to just want to keep going, you know, want to
00:33:01
keep exploring and pushing and seeing what else is out there.
00:33:04
I guess.
00:33:05
Speaker 2: Yeah, yeah, and I've just kind of always been like
00:33:08
that.
00:33:08
I think some of it is that I grew up super sheltered, and
00:33:12
we were poor, and I was homeless when I was 19, like, so I
00:33:18
never got to go anywhere.
00:33:19
Like we went on like two trips that I can remember as a kid.
00:33:23
So when I was finally an adult and making a lot of good money
00:33:26
because I was working in finance, it was like, I want to do
00:33:30
all the things, I want to do everything, like there's no
00:33:33
reason not to try anything because I didn't get to do
00:33:37
anything when I was a kid and I always had like I've always had
00:33:41
health issues, so I've also always had this kind of like my
00:33:44
clock might be ticking faster than other people, so I need to
00:33:47
do all the stuff you know before you know before I run out of
00:33:51
time and health to do it.
00:33:52
So that I just I feel like everybody's.
00:33:56
So one of the interesting things I've also found is that,
00:34:00
like I used to think that everybody's idea of happiness
00:34:03
was roughly the same, and I think that as I've grown as an
00:34:06
adult, I figured out that we're all wired completely differently
00:34:11
.
00:34:11
Like, everybody's brain is wired differently, and what brings people
00:34:14
happiness and joy is completely different from one
00:34:17
person to another.
00:34:17
Like, growing up in New Hampshire,
00:34:19
I have a lot of friends who live within 10 miles of where we
00:34:23
graduated from high school,
00:34:24
who are married to somebody that we went to high school with, and,
00:34:28
you know, they've literally never been...
00:34:30
I have a friend who's never been west of Tennessee because
00:34:33
they live in New
00:34:34
Hampshire, like, doesn't have a passport, but they're happy,
00:34:39
or happy-ish. And so, like, I don't know, I just think that I know
00:34:43
I'm ADHD, I know I'm autistic, so I know that my brain has like
00:34:47
a 40% higher need for stimulation and activity than most
00:34:51
people's.
00:34:51
Yeah, I'm all about, like, maximizing the value of
00:34:55
every moment I'm awake, because, like, I think that's just how
00:34:58
ADHD brains work.
00:35:00
Right, we're basically human optimization machines.
00:35:04
Speaker 1: Yeah, yeah, you got to.
00:35:05
I don't know like I'm, I don't know if I'm ADHD, but you know I
00:35:11
find that I have to at the minimum.
00:35:13
I have to have a goal, you know , at all times, right, and I
00:35:17
need to be making progress towards it and I have different
00:35:20
ways of kind of tracking that progress and whatnot.
00:35:23
Right, because when I don't have that, I start I don't know,
00:35:28
I like start going off the deep end, right, and I'm like no
00:35:30
longer focused.
00:35:31
It's very easy for me to get into that spiral, right.
00:35:35
Speaker 2: You don't know if you have ADHD right, right.
00:35:38
Speaker 1: You know, I guess I've never been tested, or
00:35:41
whatever.
00:35:43
Speaker 2: I didn't get tested until like four years ago.
00:35:45
It was crazy too, cause it's like a list of 45 things that
00:35:50
you thought were, like, character flaws about yourself, and all of
00:35:54
a sudden you find out like, oh, it's actually not that I'm
00:35:57
human garbage, it's actually just that my brain is wired
00:36:00
differently than other people's.
00:36:04
Speaker 1: Right, yeah, it's interesting, I feel like, as, as
00:36:09
time goes on, I just figure out, like, how different
00:36:15
everyone is, you know, and how different everything is, you
00:36:19
know, and how to appreciate that it's.
00:36:21
Um, it's an interesting thing that I kind of recently went
00:36:26
down, I guess, but you know, can we really?
00:36:30
Speaker 2: So it's really important, with regard to AI, to
00:36:34
understand how different people are and so to actually talk
00:36:37
about something technical here, one of the really interesting
00:36:39
things I've been thinking and researching a lot about is
00:36:42
diversity as it relates to artificial intelligence and as
00:36:46
it relates to technology in general.
00:36:47
Um, and so in modern times, people (and I'm going to go
00:36:52
off on kind of a tangent) take DEI as, like, a PR thing,
00:36:55
or like it's a moral issue, and like, yes, morally we should all
00:36:59
hire diversely and hire every candidate.
00:37:01
But it's actually also just a technology usability issue,
00:37:05
because if you don't want a world that's primarily built to
00:37:10
serve mostly male, mostly white men, then you can't have mostly
00:37:15
male, mostly white men building all of the technology and this.
00:37:19
That sounds kind of, you know, like a political stance, but
00:37:22
it's really not in that.
00:37:23
So if you take the politics out of pronouns, right, and we
00:37:28
ignore, you know, whether we want to argue over
00:37:32
whether there should be more than two pronouns.
00:37:34
Well, guess what, in the Thai language there's like 20
00:37:37
something, because in the Thai language your pronoun
00:37:40
encompasses what you were born as, what you currently identify
00:37:43
as, and who you like to date.
00:37:44
And so when we're talking about something like generative AI.
00:37:48
When we're trying to talk about canonical inferences and being
00:37:51
able to understand text, if we don't want generative AI to only
00:37:56
be a utility and helpful for English speaking people, then it
00:38:01
can't just be written by English speaking people.
00:38:04
And so another example is like English and Spanish are both
00:38:07
Romance languages, but the way you say I love something and I
00:38:11
want something are the same in Spanish.
00:38:12
So this comes into play.
00:38:15
When you're talking to a generative AI, like, I'm putting
00:38:17
prompts into gen AI.
00:38:19
If I say, let's say, kiaro tacos, how does that generate AI
00:38:26
know whether I'm giving it a piece of factual information?
00:38:29
And saying I love tacos because take Kiaro tacos means I love
00:38:34
tacos, romantically, right?
00:38:35
Or Kiaro tacos meaning I want tacos, and that actually makes a
00:38:40
big difference, too, because one of those is input and one of
00:38:43
those is requesting help, right?
00:38:45
If you say I want this, you may be requesting to get that thing
00:38:49
back.
00:38:49
So diversity another place this comes into play is hardware.
00:38:55
So hardware is predominantly built for male frames.
00:38:58
So when you think about something like the Apple Vision
00:39:01
Pro, causing a lot of women and smaller framed people massive
00:39:06
migraines, it's probably because the people who designed, built
00:39:08
and tested that we're all skewed towards a specific population.
00:39:14
So this is one of the things I think is really interesting to
00:39:17
think about, in that diversity in technology is not just
00:39:21
important because in a utopian world, that's how it would be.
00:39:25
It's important because it's going to make, it's going to
00:39:28
determine whether or not technology is only useful for a
00:39:33
small group of people, and that's important, right?
00:39:37
So, Hugging Face, shout out.
00:39:39
Hugging Face is a nonprofit.
00:39:41
It's super dedicated to democratizing machine learning
00:39:44
and AI, and, like, those are things that I'm really
00:39:47
passionate about, because we have a technology that has the
00:39:52
ability to fundamentally transform the way humans live
00:39:56
and to provide benefits that a large percentage of our
00:39:59
population has never had before.
00:40:01
But we can only get there if we build it so that it works for
00:40:05
all people, right.
00:40:08
Speaker 1: Yeah, it's a really good point, and I've had on,
00:40:11
like AI researchers before and I talked about this where, like,
00:40:18
how do you ensure, right, that the AI has enough
00:40:21
diversification of its data and how it's making its choices, and
00:40:30
if it hurts a certain group of people or just advantages a
00:40:33
certain group of people, or whatever it might be right, like
00:40:37
, how do you protect against that and how do you have,
00:40:40
potentially, I don't know, like a base set of language or a base
00:40:48
AI model, right, that this other AI model can check itself
00:40:53
against is like, oh, did I make the right decision here?
00:40:56
Like that's where the people come in, I guess.
00:40:58
But it's a really it's a fascinating area because, as
00:41:05
humanity has evolved, we've never encountered something like
00:41:08
this before.
00:41:08
It's never been.
00:41:10
It's never been a thing that anyone ever really thought about
00:41:15
.
00:41:15
It's never been a thing where we thought about, like, is
00:41:18
Google serving me the right search results, right, based on
00:41:24
I don't know where I live, or whatever, right, those things
00:41:29
have never come up before.
00:41:29
It's really interesting where we take it, because this will,
00:41:33
like you said, this will really have the capability of advancing
00:41:38
civilization as a whole.
00:41:40
This can either go really well or it could probably go really
00:41:43
bad.
00:41:44
Speaker 2: Hopefully, it goes really well.
00:41:47
Yeah, for sure, and it's something that I think... I
00:41:52
don't tend to be as much of a doomsdayer as a lot of AI people
00:41:58
are.
00:41:59
I do see the potential for things to get out of control.
00:42:02
But also, it's just like a bucket of water, right? Like, just throw
00:42:07
a bucket of water on it.
00:42:08
No, but what I do think
00:42:11
is it's something that we need to...
00:42:14
There's this phenomenon that I've always encountered in tech
00:42:17
where everybody assumes that somebody else smarter than them
00:42:20
is focused on this problem, on any problem, on any equity
00:42:24
problem that you bring up in tech.
00:42:25
Like I think most people always assume there's someone else
00:42:30
who's gonna deal with that.
00:42:32
Like, because somebody smarter has already thought of that. A
00:42:35
lot of times,
00:42:36
like, seriously, nobody's brought it up.
00:42:38
Like there are some large technologies that have been
00:42:40
released that people are like, oh, what about this?
00:42:43
And oops, like, I remember it was one of the Apple
00:42:48
Watches that was released.
00:42:49
Like, the Apple Watch, the face of it was too big for like 40%
00:42:53
of women's wrists.
00:42:55
Speaker 1: Yes.
00:42:56
Speaker 2: And actually I have comically small wrists anyway,
00:42:59
but even I have seen an issue with, like, this is not what this watch
00:43:02
was intended to look like, right?
00:43:07
But yeah, I think everybody was assuming somebody else is doing
00:43:10
this and as the world becomes more complex, that phenomenon
00:43:14
will probably increase.
00:43:15
So I think one of the big questions we have to have is how
00:43:18
do you put in place checks and balances to make sure that
00:43:22
someone actually is thinking about these things, and how do
00:43:25
you try to control it? Like, we're trying to make regulations
00:43:29
at, like, a state and even country level, and data and technology
00:43:33
is global.
00:43:34
So the other thing is like there's all these different NGOs
00:43:37
that are trying to do things.
00:43:38
So we have to come to a place where we're coordinating these
00:43:41
things a lot more closely so that everybody is kind of aware
00:43:44
of the state of technology, the ideals, you know what we're
00:43:47
working towards, cause it seems like a lot of this stuff, some
00:43:51
of these decisions about you know, do we make it more
00:43:53
equitable, or do we make more profits, or are being made
00:43:57
behind closed doors.
00:43:58
So I think there needs to be clearer expectations so that
00:44:01
when one of these paradoxes comes up that has the potential
00:44:05
to impact a large number of people,
00:44:06
Those decisions aren't just being made by a small group of
00:44:10
people, and they're being made in a public way.
00:44:14
Speaker 1: Yeah, I think you bring up a great point, right Is
00:44:16
that it's very easy for us to kind of assume or think that,
00:44:22
you know, someone has already thought of this, they're already
00:44:26
working on this, they're already doing acts, right, which
00:44:30
really isn't always the case, and it's probably happening a
00:44:35
lot less than what you would expect.
00:44:38
And the difference, right, someone may even have the same
00:44:40
exact idea, but the difference is if you act on it,
00:44:44
if you actually do something with it, and that's,
00:44:47
you know, that's the important part.
00:44:49
That's, honestly, that's what separates, you know, I would say
00:44:53
, you know the people that you hear about.
00:44:56
That's what separates them from everyone else is that they get
00:44:59
an idea and then they find a way to make it work.
00:45:01
Like, however, whatever that looks like, whatever that takes,
00:45:05
you know, they just find a way to make it work.
00:45:07
And I feel like, as technology professionals, we kind of like
00:45:13
got to get out of our own heads, you know, with that, because
00:45:16
we're so analytical by default, right, that you know we'll
00:45:20
overthink something for years before actually moving, when
00:45:24
it's like, hey, you should have done this, like 10 years ago.
00:45:27
Speaker 2: Yeah, yeah, I mean like, how many times does the
00:45:29
technology come out?
00:45:30
You're like, oh man, I had the idea for that like 10 years ago
00:45:34
and it's like, yeah, but you didn't do anything about it.
00:45:35
It's an issue, yeah. We always assume, and
00:45:40
that's, like, so many people underestimate their power,
00:45:43
and this is the thing, when I, so when I came into security.
00:45:46
Here's the thing we're all talking about how great it is to
00:45:50
get into security from other industries.
00:45:53
But we should acknowledge that when you do get into security,
00:45:56
if you're new in the industry, it's really easy to feel like an
00:45:59
imposter or feel like an outsider or feel like you're
00:46:01
faking it because you're moving somewhere.
00:46:03
But that goes back to what I was talking about with
00:46:06
insecurity.
00:46:07
I found that people don't really necessarily need to like
00:46:09
you.
00:46:09
They just need to trust you, and I've earned a lot more
00:46:14
respect from my peers by being really clear on where my skills
00:46:19
end than demonstrating those skills themselves, because I
00:46:23
come into a room full of security engineers and these
00:46:25
people can hack your router in four minutes with a Flipper
00:46:30
Zero.
00:46:30
Like.
00:46:31
I'm not that.
00:46:32
I'm a data scientist and I only worked as a data scientist
00:46:36
applying it to security for a little over a year, so but I
00:46:40
know that what I lack in actual technical ability to pop your
00:46:45
Tesla's gas door, like I made up for in my ability to
00:46:49
communicate, so, like my superpower, is communication and
00:46:52
translation.
00:46:53
I can take.
00:46:54
I can sit down with your security engineers and they can
00:46:57
dump on me all the technical stuff that they're doing and I
00:46:59
can take that and make it into a story
00:47:01
you can tell your customers.
00:47:02
I can make it into a story
00:47:04
you can tell your marketing team, your sales team, and so
00:47:08
like.
00:47:09
There's a lot of different skills required around security
00:47:13
to make a security program successful.
00:47:16
Like communication, like marketing, like training, like
00:47:20
cause.
00:47:20
There's a big difference between knowing how to do
00:47:22
something and knowing how to teach somebody else how to do it
00:47:24
.
00:47:24
So I was a facilitator for a long time, another communication
00:47:27
job, right.
00:47:27
So it's important to acknowledge that you may feel like
00:47:32
a fish out of water if you get into security or you join a new
00:47:35
industry, but you need to understand that, your ability to
00:47:39
know your limits and to say, hey, I've actually never had
00:47:44
experience with that, but it's something I'd like to learn more
00:47:46
about.
00:47:46
Where could I read about that?
00:47:48
Like, people will respect you 10 times more for doing that than
00:47:51
for immediately
00:47:52
making up an answer because you feel like you
00:47:54
should have one.
00:47:56
Speaker 1: Yeah, I've always found there to be a lot of value
00:47:59
when you're more honest, more open, more upfront about your
00:48:04
own limitations.
00:48:05
You know, because people will keep, I guess, kind of drilling
00:48:11
you or grilling you, especially in security, in the security
00:48:14
world.
00:48:14
You know, as soon as you say like oh yeah, I've done this for
00:48:18
10 years, or I'm an expert in this, I built this, I mean in
00:48:23
security, it's like okay, well, guess what?
00:48:26
I understand what that is and let's talk about it.
00:48:28
You know, like we're gonna talk about it at a level that like
00:48:31
if you didn't build it yourself, you're not gonna know.
00:48:34
You know, and I've been on both sides of that interviews, right
00:48:41
when I've said you know I'm an expert in something and they
00:48:44
just completely grill me on it.
00:48:46
And you know, thankfully, like I've got him past it because you
00:48:49
know what I put on my resume is the stuff that, like I have
00:48:52
done.
00:48:53
You know, I'm not like bluffing it.
00:48:55
I may use words that I may like rarely use, you know, because,
00:48:58
like you know, you don't wanna use the same like verb or adverb
00:49:01
or whatever it is.
00:49:02
You know, "agitate", to describe something, right, but like when I
00:49:07
say like hey, I built this thing, it's like.
00:49:08
No, I actually built it, like you know, because I really don't
00:49:12
wanna be in a situation where someone points out a point and I
00:49:15
can't answer it at length, you know.
00:49:18
Speaker 2: Yeah, oh, we've all been through the experience of
00:49:20
seeing, like an ex-co-worker's LinkedIn and seeing all the shit
00:49:23
that we did that they're taking credit for and you're like, oh
00:49:26
really, you made that happen, huh.
00:49:28
Speaker 1: Yeah.
00:49:30
Speaker 2: Yeah.
00:49:31
Speaker 1: I know.
00:49:31
I don't remember you on that project.
00:49:34
Speaker 2: Yeah, yeah, and do they call that the George
00:49:38
Santos effect
00:49:39
now? Is that right?
00:49:39
Yeah, I mean, I think that's true Like there's a lot of power
00:49:43
in saying I don't know, like there's a lot of risk in saying
00:49:47
it if you're the one who's supposed to know.
00:49:49
But in a lot of circumstances you're not the person who's
00:49:51
supposed to know and I'm, you know, like people aren't looking
00:49:55
to you to have all the answers, they're looking to you to know
00:49:58
where to go to find it.
00:49:59
Yeah, and that's kind of what your utility as a security
00:50:03
professional usually is.
00:50:04
It's like I don't know everything, but I have a process
00:50:07
that I can go through or I can quickly get to the information
00:50:11
that I need, process it and get it back in the form we can use.
00:50:17
Speaker 1: Yeah, that's a key distinction there, you know,
00:50:20
being able to say I don't know and then also following up with
00:50:23
what I can find out.
00:50:24
Yeah, In today's, you know, modern age, right, 2024, like,
00:50:30
you can absolutely find something out.
00:50:32
If you don't know it, you know, by a simple Google search you
00:50:35
don't have to go to the library anymore and hopefully they have
00:50:38
a book on it, right?
00:50:39
So, like, there's no reason why you can't, you know, say that
00:50:44
and actually follow up with it with the real information.
00:50:46
Well, you know, Jackie, we've gone this entire time and it
00:50:53
definitely doesn't feel like 50 minutes, that's for sure.
00:50:56
But you know, I'm I try to be very conscious of everyone's
00:50:59
time and you know I don't want to go over because I know that
00:51:02
we're all booked.
00:51:03
You know, meeting after meeting .
00:51:04
So you know, before I let you go, how about you tell my
00:51:07
audience, you know where they could find you, where they could
00:51:09
find Cribl, if they want to learn more and maybe they want
00:51:12
to reach out.
00:51:14
Speaker 2: Yeah, absolutely.
00:51:15
I live on LinkedIn, so LinkedIn slash Jackie's in
00:51:18
security. You can interpret that whichever way you want to.
00:51:22
Yeah, and Cribl.
00:51:26
You know, we're at Cribl.io, or you can also follow Cribl on
00:51:29
LinkedIn.
00:51:29
We have a fantastic social media manager who makes some pretty
00:51:32
high-quality memes.
00:51:34
You know, we didn't talk a ton about what Cribl does, which
00:51:38
is my preference, because I think that it's a product that
00:51:42
is much better for people to try than to hear about.
00:51:44
But if you have questions about moving data, making use of data
00:51:49
, any of those things, you're more than welcome to reach out
00:51:52
to myself or anybody else on the Cribl team.
00:51:55
Speaker 1: Awesome.
00:51:56
Well, thanks everyone.
00:51:57
I hope you enjoyed this episode .