Ever felt the thrill of jumping into the unknown, taking a leap of faith from a successful career into the exciting world of startups? Our guest Brian, a "fallen physicist" and professor turned IT Security expert, knows this feeling all too well. Join us as Brian shares his intriguing journey from academia to the forefront of cybersecurity, where he found his calling in the high-pressure, high-stakes universe of IT security.
Navigating through the sea of alerts, Brian and his team didn't just survive, they thrived, winning the battle to design user-friendly interfaces. It was a journey fraught with challenges but also filled with rewarding victories. Brian talks about his evolution into the startup space, the grueling process of crafting the perfect business plan, and the satisfaction of building usable solutions for an industry mired in complexity. And while we're on personal journeys, I also reflect on the whirlwind of emotions and experiences as I stepped into fatherhood, all against the backdrop of the life-altering events of September 11th.
We also delve deep into the pressing issues of runtime security and vulnerability management, especially for Linux systems and Kubernetes. With Brian's expertise, we dissect security into "left of boom" and "right of boom" stages, underlining the importance of efficient detection and response strategies in a world where the average detection time for a breach is three months. We also put the spotlight on the TippingPoint technology that revolutionized security processes in large companies, transforming painstaking tasks into enjoyable experiences. To wrap up, Brian shares the quirky origin story of his company, Spyderbat. So come join us on this riveting journey through the complex world of cybersecurity!
https://www.linkedin.com/in/brian-smith-07a4191/
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Brian?
00:00:00
It's really good to finally have you on.
00:00:03
I'm really excited for a conversation today.
00:00:05
I think it's going to be.
00:00:06
I think it'll be an interesting one.
00:00:08
Speaker 2: Yeah, thanks, Joe, it's good to be here.
00:00:10
Thanks for having me on.
00:00:12
Speaker 1: Yeah, definitely. So, Brian.
00:00:15
You know, why don't we start with how you got into IT?
00:00:19
Maybe what you specialized in before that. Did you start
00:00:22
somewhere else, or were you always, you know, IT-focused
00:00:25
throughout your whole life and you kind of just went into it,
00:00:28
right?
00:00:30
Speaker 2: Yeah, no, it was a bit of a weird journey.
00:00:32
I think pretty much a lot of people who get into security
00:00:36
have a weird journey, but for me, I'm a fallen physicist. Wow.
00:00:41
So I started off going to UC Berkeley and studying
00:00:47
physics, and so I have a bachelor's in physics from
00:00:50
Berkeley, and then found that most of the things I was doing,
00:00:54
or being kind of naturally drawn to, were more related to computers,
00:00:57
and so I ended up going back and getting into the grad
00:01:04
program, the CS grad program at Berkeley, and went and studied
00:01:08
there and then ended up as a professor of CS at Cornell for
00:01:13
five years and left Cornell and started a company.
00:01:20
Well, it's a bit of a different story.
00:01:22
I joined a startup down in Austin called Netpliance,
00:01:26
and out of the ashes of Netpliance grew TippingPoint.
00:01:30
So Netpliance was actually building these
00:01:34
informational appliances.
00:01:36
Think of it as an iPad they were building in 1999.
00:01:40
So it was a flat-panel display; it didn't have a
00:01:41
touchscreen in that version, but it was hooked up over a
00:01:46
dial-up line and had an
00:01:50
embedded processor.
00:01:52
We ran our embedded OS.
00:01:53
We built the operating system, built all the software that ran
00:01:56
on it.
00:01:56
It was all custom software with big buttons on it so that you
00:02:00
could, you know, access them. Right.
00:02:03
And that company went public right in the week of the height
00:02:08
of the NASDAQ, back in the 2000 bubble.
00:02:12
We went out on Monday and Thursday was the peak, and then it
00:02:20
kind of ran like a lot of the dot-coms at that point.
00:02:23
And so what happened was,
00:02:28
it was not profitable, and the only way to get it profitable
00:02:32
was to do another offering.
00:02:34
But the markets had crashed so there was no realistic way to do
00:02:38
a secondary offering.
00:02:39
So the board got together and said, well, we can run it into
00:02:41
the ground, we can sell it all off and give the money back to
00:02:44
the investors, or we can sell it all off, take what we have and
00:02:49
go do something different.
00:02:50
So they chose door number three, and they put literally
00:02:53
eight of us in a room and said: you have
00:02:58
53 engineers and $70 million, go do something.
00:03:00
And out of that grew TippingPoint.
00:03:04
So it's kind of the anti-startup: normally you have this idea
00:03:08
and no people and no money.
00:03:09
So here we had people and money and no idea.
00:03:14
So we started TippingPoint, and that was a
00:03:18
company that built intrusion prevention systems, and so
00:03:21
that's how I got into IT security, because I had a
00:03:24
background in networking and systems.
00:03:27
Other people in that
00:03:29
startup group had some backgrounds in security and some
00:03:33
background in other parts.
00:03:35
And so we originally decided to build this box that
00:03:41
was going to be sort of a unified threat appliance and it
00:03:45
was going to have a firewall and an IDS and a vulnerability
00:03:48
scanner in it.
00:03:49
The idea was to use those things to reinforce each other.
00:03:53
And so what ended up happening was we looked at it and said,
00:03:57
well, if the IDS
00:03:58
detects something,
00:03:58
why tell the firewall to block it?
00:04:00
No, just block it.
00:04:01
You're right there in the software, just drop the packet on the
00:04:04
floor.
00:04:04
And so we figured out how to build that thing and make it
00:04:07
reliable and make it run inline.
00:04:08
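The inline-blocking idea described here, detection and enforcement living in the same packet path instead of an IDS telling a separate firewall what to block, can be sketched roughly as follows. The signatures and packet contents below are made-up placeholders for illustration, not anything from TippingPoint's actual engine:

```python
# Toy inline IPS: the detection engine sits in the packet path, so a
# match is simply dropped on the floor rather than reported elsewhere.
from typing import Optional

SIGNATURES = [b"\x90\x90\x90\x90", b"/etc/passwd"]  # illustrative byte patterns

def inline_ips(packet: bytes) -> Optional[bytes]:
    """Return the packet to forward it, or None to drop it on the floor."""
    for sig in SIGNATURES:
        if sig in packet:
            return None  # detected: don't alert another box, just drop
    return packet

print(inline_ips(b"GET /index.html") is not None)    # clean traffic forwarded
print(inline_ips(b"GET /../../etc/passwd") is None)  # matching traffic dropped
```

The design point is that the forward-or-drop decision is made once, in line, which is why making the box reliable and fast enough to sit in the traffic path was the hard part.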
So that was TippingPoint, and we built a great group of
00:04:12
security pros there, and that became what we call the TippingPoint
00:04:16
mafia.
00:04:17
They're kind of everywhere.
00:04:21
It's been really fun watching all the different careers.
00:04:26
Speaker 1: So, you know, take me back to when you got that
00:04:30
degree in physics and then you made the switch to CS.
00:04:34
You know, computer science. What was that thought process?
00:04:38
Like you know, why did you make that switch?
00:04:40
Was it?
00:04:42
Was it the right opportunity?
00:04:45
Right, like kind of right opportunity came up and and you
00:04:49
took a hold of it and you went with it.
00:04:50
Was it difficult for you to pick up the material?
00:04:54
Was it easy? For me,
00:04:56
back in my bachelor's, I took physics.
00:05:02
It was like a basic, you know, physics 105 or whatever, and I
00:05:06
would ace the labs, I mean completely ace the labs.
00:05:09
But then we get to the test and I failed every single exam
00:05:13
after the curve, which is really saying something.
00:05:15
I mean, that's impressive all in its own right
00:05:19
there.
00:05:19
But it was just so, I don't know, abstract to me.
00:05:25
It was
00:05:25
a huge word problem, and I'm terrible at word problems,
00:05:30
and so, like God forbid, you throw me a paragraph with some
00:05:34
math in it and ask me to put stuff together.
00:05:36
I just can't do it, yeah.
00:05:39
Speaker 2: Well, for me it was actually the exact opposite.
00:05:41
I was terrible in lab but I did really well in the theory thing
00:05:45
and I wanted to be kind of a theoretical physicist actually
00:05:49
at the time when I went in.
00:05:50
And so going through that, there were two things going on.
00:05:54
One is my older brother was actually a grad student in the
00:06:01
architecture department and so when I showed up as a freshman,
00:06:05
he knew some people who were looking for someone that could
00:06:10
help them set up this lab that they had there, which was a
00:06:13
boundary-layer wind tunnel lab, basically a wind tunnel they
00:06:17
were trying to instrument, and they needed a
00:06:19
combination of instrumentation and computer programming.
00:06:22
And so I had done enough that I could do that with them,
00:06:27
and so I was
00:06:27
doing a lot of programming and just kind of
00:06:30
drawn to doing that. And in physics, I kind of liked a lot of
00:06:34
it.
00:06:34
I liked going to seminars, I liked a lot of the discussion,
00:06:39
but I really didn't like doing physics.
00:06:41
I found out I liked knowing physics; I didn't like doing physics,
00:06:44
and so when I got out as an undergrad, I didn't really
00:06:48
want to to go into physics, but I didn't know what I wanted to
00:06:52
do, and so I spent a couple years just you know, continuing
00:06:58
to do the programming work in the architecture department.
00:07:02
That got me into a bunch of other programming jobs, which
00:07:04
eventually led me into the CS department as a programmer there
00:07:07
, which is how I got into grad school, kind of as a side effect of
00:07:13
that.
00:07:13
But during that time, I mean, I was doing all sorts of things.
00:07:16
I was teaching hang gliding, just doing random things.
00:07:21
It was a lot of fun.
00:07:25
Speaker 1: Yeah, I mean it sounds like you're not afraid to
00:07:30
try new things and potentially not be good at it.
00:07:34
I mean there's no way that.
00:07:37
I guess there's no way that you could have known like, hey, I
00:07:41
like theoretical physics, I like the theoretical problems that
00:07:47
I'll encounter right, and you excel at that.
00:07:49
But then when you get into the practical side of it, you
00:07:52
struggle on that end.
00:07:53
You couldn't have foreseen that, but you still stuck it out,
00:07:58
which really says something, because, I mean, a degree is
00:08:01
typically four years long and classes, especially in physics,
00:08:06
are not push-over classes.
00:08:08
I switched my major to criminal justice, which is a complete
00:08:12
pushover compared to anything in science, thankfully.
00:08:18
Speaker 2: Well, yeah.
00:08:19
I began to realize that the other people in the
00:08:23
physics that I was going to be competing against were just a
00:08:26
hell of a lot smarter than I was.
00:08:27
I just got my butt kicked.
00:08:29
In general relativity, I just got beaten by the other guys.
00:08:35
I think I barely got a low B or something in that, and I'd been
00:08:40
used to getting good grades, but I think it was more just what
00:08:45
you're drawn to.
00:08:46
I just kept on coming back and finding that I liked doing the
00:08:51
programming and I just naturally gravitated towards it.
00:08:54
I wasn't forcing myself to do anything.
00:08:57
The problems were interesting, they were tractable, they were
00:09:01
fun little puzzles to solve, and physics is about problem
00:09:07
solving and lots of things are about problem solving.
00:09:09
And IT, a lot of IT at least, is about problem solving.
00:09:12
So it's kind of the same genre.
00:09:14
I found that when I was over in the CS department, there were all
00:09:20
these conversations going on that I couldn't understand the
00:09:23
words they were using.
00:09:23
They'd say, oh yeah, you just use a scatter-gather vector and
00:09:27
insert it into an implicit heap, and then a secondary two-level
00:09:34
hash table solves the problem, or something. I don't know.
00:09:37
It was just mumbo jumbo, so I kind of got back into CS,
00:09:40
largely because of just wanting to understand the words.
00:09:45
Speaker 1: Yeah, it's interesting.
00:09:47
It's like you have to really enjoy puzzles and
00:09:52
troubleshooting and get used to failing in IT overall, like even
00:09:57
literally this week.
00:09:58
I work for a large automotive manufacturer, and I had
00:10:05
something very simple to do. And I'm a fully certified AWS guy.
00:10:10
All I had to do was deploy an EC2 instance, log into it, and run five
00:10:17
commands once I logged into it.
00:10:18
And that was it.
00:10:19
Pretty simple.
00:10:20
You would think it's extraordinarily simple.
00:10:23
It sounds easy.
00:10:27
Yeah, and I spent three days deploying this EC2, and I mean,
00:10:35
it was mistake after mistake, after mistake, and I reached out
00:10:38
to an engineer that was more senior on the team because I'm
00:10:41
newer at this company, and he's like oh, you're solving the
00:10:48
problem that's in front of you.
00:10:49
There's a whole other puzzle that you have never even heard
00:10:53
of that now we have to go solve.
00:10:55
And I'm like what?
00:10:57
This is an EC2.
00:10:58
It's AWS, like this should just work.
00:11:00
He goes no, there's a lot more going on that you don't know
00:11:03
about.
00:11:04
And I'm sitting here like I'm the stupidest person at this
00:11:08
company right now.
00:11:09
I can't even do my basic functionality of my job.
00:11:12
Like what am I doing?
00:11:14
Speaker 2: Well, that's one of the things
00:11:14
that I've seen just over the
00:11:17
years, watching IT kind of grow and develop.
00:11:20
Because when I started, things were honestly
00:11:26
just kind of simple.
00:11:27
There were C programs, you compiled them, there wasn't that
00:11:32
much going on.
00:11:32
And now we've built layers upon layers upon layers of
00:11:35
abstraction and stuff and you kind of need it to run.
00:11:39
So all the stuff going on in the cloud, when it works, great,
00:11:42
it works.
00:11:43
And then when it doesn't, there are just so many things
00:11:47
that could have gone wrong, and so many of them are completely
00:11:50
opaque.
00:11:51
It's hard to even know where to look.
00:11:54
So you end up developing just a lot of experience and saying, oh,
00:11:57
I bet I know what that is, type things.
00:12:01
Speaker 1: Yeah, there was probably 20 or 25 things that I
00:12:06
had to check that I knew about, and then my other engineer gets
00:12:10
on the call and he goes oh no, there's 15 other things that we
00:12:12
need to look at right now.
00:12:15
Speaker 2: This is crazy.
00:12:16
We had this one just the other day.
00:12:20
We were trying to get this machine to launch in a certain
00:12:27
availability zone within a region in AWS, and so it was us-east-
00:12:33
1a that we had to get it to launch into, because that's
00:12:36
where the disk that we wanted to mount was, and we had the
00:12:39
launch template set up and it was just ignoring it.
00:12:42
It would come up in whatever zone it wanted to, and so we'd
00:12:45
launch it and check it, shut it down, and after
00:12:49
five or six tries it would randomly get the right one.
00:12:52
We finally found out that the thing that was wrong is we
00:12:56
hadn't set the subnet explicitly in the launch template,
00:13:01
seemingly not documented anywhere that you have to do
00:13:03
this. Such a simple thing.
00:13:04
And now it comes up in the right zone.
00:13:07
But it's random stuff like that, yeah.
00:13:10
And with so many of these, you don't even know.
00:13:12
I think we're getting more and more things where we don't even
00:13:15
know exactly what's wrong.
00:13:16
We just find other ways to skin the cat to solve it.
00:13:21
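For reference, the fix described above can be sketched with boto3-style launch-template data. The AMI, instance type, and subnet IDs below are placeholders, and the helper name is invented for illustration:

```python
# Sketch: pin an EC2 launch template to one availability zone by setting
# the subnet explicitly -- a subnet lives in exactly one AZ, so pinning
# the subnet pins the zone (e.g. a subnet that lives in us-east-1a).

def build_launch_template_data(ami_id: str, instance_type: str, subnet_id: str) -> dict:
    """LaunchTemplateData suitable for boto3's create_launch_template."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        # The gotcha from the conversation: without an explicit subnet on
        # the primary network interface, instances may land in any zone.
        "NetworkInterfaces": [{"DeviceIndex": 0, "SubnetId": subnet_id}],
    }

data = build_launch_template_data("ami-0123456789abcdef0", "t3.micro", "subnet-aaaa1111")
# With boto3 this would be registered roughly as:
#   boto3.client("ec2").create_launch_template(
#       LaunchTemplateName="pinned-az", LaunchTemplateData=data)
```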
Speaker 1: Right, you mentioned when you were at that startup
00:13:27
company and the board came together and they had three
00:13:31
options.
00:13:31
It kind of sounded like the option that
00:13:34
they went with was more like an incubator.
00:13:37
That's the feel that I got, because they gave you a bunch of
00:13:41
money, a bunch of engineers and said figure it out, create
00:13:44
something new.
00:13:45
Is that what you?
00:13:46
Is that how you felt it was at the time as well?
00:13:50
And how did you overcome that creativity part?
00:13:52
Because I feel like, even being prompted with that situation,
00:13:57
there's a creativity block right there.
00:13:59
It's like, oh no, now I actually have to be creative, I
00:14:03
have to come up with something new, or whatever it might be.
00:14:07
I feel like that could be an issue.
00:14:08
Speaker 2: Well, in that case, what we did is we split up into four groups
00:14:14
of two, and each group was responsible for coming up
00:14:20
with one business plan a week, and so we kind of cranked
00:14:25
through them and we must have gone through like 100 different
00:14:28
plans before we landed on the right one.
00:14:30
And it was kind of a combination of looking at
00:14:35
the businesses where we knew them well enough that we thought
00:14:40
we could succeed, as well as what the talents of the team
00:14:44
were and kind of what our assets were.
00:14:47
So it was kind of intersecting all those and taking an idea and
00:14:50
then just kind of running it to ground, and it probably took
00:14:54
like 10, 12 weeks to, you know, sort it out and figure out the
00:14:59
right one.
00:14:59
It was a pressure cooker, that was a real pressure situation,
00:15:05
but it worked out.
00:15:07
Speaker 1: Do you do you think that there is Potentially some
00:15:11
unique skills that you learned during that time that you're now
00:15:14
utilizing today?
00:15:17
Speaker 2: Yeah.
00:15:19
So I kind of came into that having been a, you know, CS
00:15:25
professor and you know all these other things.
00:15:28
So I didn't know much about the business side and I didn't know
00:15:31
much about a lot of the other aspects of running a startup.
00:15:34
Because that first startup that was doing the iPad type things
00:15:37
I came in as just, you know, essentially the lead architect
00:15:41
to help build out the system. But at TippingPoint, I was
00:15:47
much more one of the founders, essentially coming up
00:15:50
with the idea.
00:15:51
And then I got to do kind of every job in the company,
00:15:57
practically, it felt like.
00:15:57
I ran one of the engineering teams for a while.
00:16:00
I was product manager for a while. I was
00:16:05
essentially out in the field working with the sales
00:16:09
teams for a while. So I got a really broad
00:16:13
exposure to all of the different aspects of it, and that
00:16:16
helped a lot when we went to start the next company and
00:16:20
the next company after that.
00:16:21
Now I'm on my third company, but they've all
00:16:24
been in the cybersecurity space. And it's
00:16:30
such an interesting and deep field, because partly, just
00:16:34
like all things in computing, it's constantly changing, right?
00:16:37
The technology changes and aspects of it change, so new
00:16:40
solutions become possible that they just weren't there before.
00:16:43
But it's got this added dynamic of a kind of cat-and-mouse
00:16:48
game between what the attackers are trying to do and
00:16:51
how we try to defend against them.
00:16:56
TippingPoint got bought by 3Com,
00:16:58
and ultimately
00:17:01
got sold off into Trend Micro.
00:17:09
The second company we did was something called Click Security,
00:17:12
and that was back in 2009, essentially an early-day XDR.
00:17:15
So we called it something different, because the term XDR
00:17:19
hadn't been coined at that point.
00:17:21
But it was in that same realm, and that got sold to a company
00:17:26
called Alert Logic down in Houston, and I really got to
00:17:31
spend a lot of time watching.
00:17:33
We became the front end to their SOC.
00:17:35
It was a managed security service, and so we got to watch how
00:17:39
frontline security analysts in the SOC use this
00:17:44
stuff day in and day out.
00:17:47
That is such a treadmill job.
00:17:51
You know, just pick up an alert, spend 15 minutes,
00:17:55
you know, trying to figure out if this thing's real or not,
00:17:58
move on to the next, because they're just constantly
00:18:01
in triage and it's exhausting and it's pretty repetitive and
00:18:04
there's a really high burnout rate in that
00:18:08
particular job.
00:18:09
So it's been really interesting just watching all
00:18:12
these different aspects of the field and trying to figure out
00:18:15
how to solve it better.
00:18:17
Speaker 1: Yeah, that's extremely true
00:18:20
with the SOC. Like, the burnout rate on that team alone
00:18:26
is insane.
00:18:27
I've never seen something like it before, and
00:18:32
when I worked for a credit bureau previously, you know they
00:18:35
had their own separate SOC, and even people in the security
00:18:39
team couldn't get into the SOC, like they didn't have access
00:18:42
to get through the door, and I mean,
00:18:46
It was intense.
00:18:48
You know, those guys do not mess around, and they're
00:18:52
constantly going for their entire shift, or whatever it
00:18:56
might be, right?
00:18:56
I mean it's just, it's nonstop.
00:18:59
Like you said, it's alert after alert after alert,
00:19:03
investigating things, doing deep dives into things.
00:19:06
I mean, they told me about something that was launching in
00:19:09
my browser that I had no clue was even there.
00:19:12
I mean I'm a security engineer, right, so I should know what's
00:19:15
going on on my computer.
00:19:16
And they, like, confronted me, because they're like, there must
00:19:20
be like a trojan on your computer or something like that
00:19:22
that we haven't seen before.
00:19:23
And they looked and they like pointed out you know the binary
00:19:27
and everything that was running.
00:19:29
And I'm like, dude, you guys are on a totally different
00:19:32
planet.
00:19:33
Speaker 2: Well, I remember just even talking to a lot of the
00:19:37
security professionals in my various roles at
00:19:44
the companies, I'd get these war stories
00:19:48
about attacks, some of them really smart and sophisticated,
00:19:51
and you kind of think, hmm, I'm not sure how I would
00:19:55
defend against that.
00:19:57
Like one of them.
00:19:58
There was a guy who told me about this backdoor that
00:20:02
they had planted into a browser where, at
00:20:07
seemingly random intervals, it would go out and fetch data from
00:20:10
this website and pull it down.
00:20:12
And it was just a regular looking website.
00:20:14
They had also compromised the website, and what they did is
00:20:17
they put the command-and-control instructions in the
00:20:21
comments of the HTML.
00:20:22
It was a pseudo-random number generator that would figure out
00:20:26
when it would randomly go out to the website and they had the
00:20:29
same random number generator running on the website.
00:20:31
So about three minutes before they would modify the website
00:20:35
with the instruction, put it in the comment section, it would
00:20:38
get downloaded and then it would get deleted off the website
00:20:41
after it got fetched, so you could never see it.
00:20:44
It's kind of like, how did you find that?
00:20:48
And it was all encrypted. Geez.
00:20:53
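The synchronized-schedule trick in that story can be simulated in a few lines. The seed, the interval range, and all the names below are invented for illustration; this is a defensive sketch of the mechanism, not any real malware:

```python
# Sketch of the synchronized-schedule idea: the implant and the C2
# website seed identical pseudo-random number generators, so both sides
# independently compute the same "random" fetch times with zero
# coordination traffic between them.
import random

def fetch_schedule(shared_seed: int, start: int, count: int) -> list:
    """Return `count` fetch times (seconds after `start`), each separated
    by a pseudo-random 1-to-6-hour gap. Both sides run this same code."""
    rng = random.Random(shared_seed)
    times, t = [], start
    for _ in range(count):
        t += rng.randint(3600, 6 * 3600)
        times.append(t)
    return times

implant_side = fetch_schedule(shared_seed=1234, start=0, count=5)
server_side = fetch_schedule(shared_seed=1234, start=0, count=5)
# Identical schedules: the server can plant instructions in an HTML
# comment a few minutes before each time and delete them right after.
print(implant_side == server_side)  # True
```

Because the schedule is derived rather than transmitted, there is no beaconing pattern or lookup to intercept, which is what makes this style of C2 so hard to spot.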
Speaker 1: Like, even if an alert was triggered on that and
00:20:56
you start investigating it, by the time you even start to
00:20:59
investigate it, it's already gone.
00:21:01
Speaker 2: Yeah, yeah.
00:21:02
So this is kind of the thing that I ultimately
00:21:03
realized about this: a lot of the way we've been doing it
00:21:08
is kind of upside down and backwards.
00:21:12
Because what we do is
00:21:16
we get an alert, and then we try to develop
00:21:21
context for that alert.
00:21:22
Because an alert coming in is like listening
00:21:27
to a wiretap or something, and you hear the words assassinate
00:21:30
and bomb and president, or something like that, and you're
00:21:33
trying to figure out is this a plot, or is this a bunch of
00:21:37
academics reviewing, you know, a historical paper?
00:21:41
What's the context?
00:21:41
You can't really understand what that means until you
00:21:44
establish some context.
00:21:45
And so what the SOC guys are doing, if you step back,
00:21:52
is really taking that alert and trying to develop enough context
00:21:55
around it to say what is the attacker trying to do, or is
00:21:58
this even an attack, or is there some mitigating factor?
00:22:01
What's going on?
00:22:02
That's sort of their job.
00:22:07
So if you can automatically provide that context, and this is
00:22:15
sort of the idea of what I've been working on recently: try
00:22:19
to build up the context for everything that's going on in
00:22:21
the system.
00:22:22
Then, first off, you can use that to detect a
00:22:25
bunch of stuff.
00:22:26
But then also when you have a detection, since you have the
00:22:30
context already automatically built up, you can have a lot of
00:22:34
rules that just say, oh, in that context this alert doesn't mean
00:22:36
anything, and so you can really tune out the false positives.
00:22:39
And because you can tune out the false positives
00:22:42
that way, you can turn up the signal, because you're not
00:22:45
getting drowned in false positives.
00:22:47
And so the combination of those means you can turn up the
00:22:51
signal so high that it's almost impossible for an attacker to
00:22:55
get in and wander around.
00:22:57
You're no longer trying to detect malware or something like
00:23:00
that, you're just trying to detect almost anything weird
00:23:03
going on and then using the context to filter that out and
00:23:09
the context to group them together and do a lot of that
00:23:13
work that the SOC analyst does in triage kind of automatically.
00:23:17
So you end up with a pretty neat system, and it makes
00:23:22
the SOC analyst's job a hell of a lot more fun, and it just leads
00:23:26
to a lot more efficient, better, cheaper, faster, stronger
00:23:30
security, etc.
00:23:31
So it's been 20 years of trying to solve this problem.
00:23:40
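The context-first triage idea can be sketched as a toy rule engine. All the field names, signals, and rules here are invented for illustration; they are not Spyderbat's actual data model:

```python
# Toy sketch of context-first triage: every alert carries automatically
# gathered context, and rules of the form "in that context, this alert
# doesn't mean anything" suppress false positives before a human sees them.
from dataclasses import dataclass, field

@dataclass
class Alert:
    signal: str    # e.g. "new_binary", "outbound_connection"
    process: str   # the process that triggered the alert
    context: dict = field(default_factory=dict)  # auto-collected context

def is_benign_in_context(alert: Alert) -> bool:
    ctx = alert.context
    # A new binary spawned by the CI runner during a deploy is expected.
    if alert.signal == "new_binary" and ctx.get("parent") == "ci-runner":
        return True
    # Outbound connections from a host whose role is "proxy" are its job.
    if alert.signal == "outbound_connection" and ctx.get("host_role") == "proxy":
        return True
    return False

alerts = [
    Alert("new_binary", "app", {"parent": "ci-runner"}),
    Alert("outbound_connection", "squid", {"host_role": "proxy"}),
    Alert("new_binary", "app", {"parent": "bash"}),  # unexplained: keep it
]
survivors = [a for a in alerts if not is_benign_in_context(a)]
print(len(survivors))  # 1 -- only the unexplained alert reaches a human
```

Because the benign cases are filtered by context rather than by weakening the detections, the sensitivity of the underlying signals can stay high, which is the "turn up the signal" point made above.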
Speaker 1: Yeah, you know, whenever I demo product or
00:23:43
anything like that or POC it and if I'm getting so many alerts
00:23:49
that I don't know what to pay attention to, it's actually just
00:23:53
disqualified immediately from my POC.
00:23:56
Because if I can't figure out in, you know, a couple, maybe one
00:24:02
or two minutes, if I need to pay attention to it or not, then
00:24:06
I know for sure I'm going to be spending my entire day in a
00:24:09
tool that's generating nothing but a bunch of garbage alerts
00:24:13
that I don't need.
00:24:14
That has been the case one too many times in my career.
00:24:20
Now that I'm beyond the alert assessment part and I'm,
00:24:24
you know, more on leading engineering teams and whatnot,
00:24:27
yeah, now it's about kind of lifting that burden for my SOC
00:24:33
and for everyone else, saying like, yeah, you don't need to
00:24:35
spend your time on these alerts, you can go do other things.
00:24:38
Speaker 2: There's better ways to solve this problem.
00:24:40
I think a lot of people get one of these tools and they say, oh,
00:24:44
this is really cool.
00:24:45
And then after a while they realize this thing's like my
00:24:48
Christmas puppy.
00:24:49
You know, this thing's a lot of work
00:24:52
to take care of.
00:24:54
Or you know, babies are like that.
00:24:57
Actually, babies are a lot of fun.
00:25:01
I understand you're a relatively new father?
00:25:04
Speaker 1: Yeah, yeah, almost six months now.
00:25:08
Speaker 2: Is this your first?
00:25:09
Yep, that's very cool.
00:25:11
It's a journey.
00:25:15
Speaker 1: Yeah, it's been a lot of fun, a lot of hands-on
00:25:19
learning, I guess, and like finding out that I don't need
00:25:25
sleep that badly.
00:25:27
Speaker 2: It's a sleep deprivation experiment.
00:25:29
Someone once said babies are all joy and no fun, and
00:25:36
sometimes it feels like that for the first year and then it's
00:25:39
funny how you look back on it and it's really neat.
00:25:42
It's also a neat time because it's one of those kind of
00:25:45
transition periods.
00:25:47
Speaker 1: Yeah, yeah, she was born in March.
00:25:50
And I mean the first two months .
00:25:53
Like I don't even remember like anything that happened in those
00:25:58
first two months. Because, like, I was talking to someone a
00:26:03
couple of weeks ago at work and they said, oh, don't you
00:26:06
remember this thing that happened in April.
00:26:08
You know, like it was going to affect the criteria on this
00:26:10
other thing.
00:26:11
And I was like, dude, I don't remember April existing.
00:26:15
Like my memory doesn't kick back in until the middle of May.
00:26:18
Okay, that's where my brain's at right now.
00:26:21
And it's insane because you know previously, right, like if
00:26:26
I got four hours of sleep in a night, like I'm exhausted, I'm
00:26:30
beat up, I need to take a nap during the day, whatever it
00:26:33
might be.
00:26:33
Now, if I get four hours of sleep, I'm like, oh, I'm well
00:26:37
rested.
00:26:38
I'm actually ready for the day.
00:26:41
Speaker 2: Kids give you perspective.
00:26:42
My oldest, my first kid, was born on September 8th 2001.
00:26:51
So the first day home from the hospital was the night of
00:26:54
September 10th.
00:26:55
So I wake up in the morning on September 11th and turn on the
00:26:59
news and it was, you know, in the sleep deprivation state that
00:27:03
you're in, and it was very Alice in Wonderland.
00:27:06
That was a very strange experience.
00:27:11
Speaker 1: Yeah, I bet man that's a wild time to be born.
00:27:18
You know, I forget like how long ago that actually was,
00:27:22
because I was in fifth or sixth grade at the time.
00:27:24
You know, I think fifth grade and that was just such an absurd
00:27:30
moment.
00:27:32
Speaker 2: Yeah, it was.
00:27:32
It was 22 years ago now, almost exactly 22 years, yeah.
00:27:39
Speaker 1: But anyway it was.
00:27:41
Speaker 2: You know, we muddled through that and we
00:27:44
muddled through a lot of things, yeah, yeah, it seems like we're
00:27:48
just like getting through the next thing.
00:27:50
Speaker 1: One thing comes up.
00:27:51
You just got to get through it, you know, one day at a time,
00:27:55
but you know to.
00:27:57
To circle back, I guess, a little bit.
00:28:00
You know, your current company is called Spyderbat, yes, sir,
00:28:04
and I saw that it looks like it is focused around protection at
00:28:09
runtime, application security, right, or detection at
00:28:15
runtime, which is really an interesting place to be. And,
00:28:23
coming from someone that is not a developer, that's always an
00:28:28
area where it's kind of like, it's the expertise
00:28:30
that I'm not a fan of, right, like I need someone else to dive
00:28:33
into it.
00:28:33
But from just looking at you know, the website and whatnot,
00:28:36
right, looking at the product a bit, it looks like a pretty easy
00:28:40
platform to kind of dive into and learn it and understand what
00:28:43
it's like.
00:28:44
Was that a core principle when you were
00:28:53
designing this product, that it had to be something that you
00:28:56
know virtually anyone could pick up and figure out what's going
00:28:57
on?
00:29:01
Speaker 2: That's definitely
00:29:02
one aspect of it. You know, I came out of this
00:29:07
set of experiences, from building IPSes, basically, to building
00:29:14
things that the SOC uses. And ever since the beginning I've
00:29:18
been very much into user interfaces, building very
00:29:22
easy-to-use user interfaces.
00:29:23
In fact, that's what I was studying in grad school at
00:29:29
Berkeley: user interface design.
00:29:30
So that's always been very near and dear to my heart and it's
00:29:38
such an important aspect of it, and so many of the tools are bad,
00:29:43
so you kind of want to build something beautiful, yeah.
00:29:47
But I think that the core idea of the runtime security, if you
00:29:54
think about it, is we're building runtime security for
00:29:58
Linux systems, in particular for Kubernetes.
00:30:01
So let me back up just a second and kind of explain what that
00:30:04
means.
00:30:04
And when I think about the world of security, I mean, it's such a
00:30:06
broad thing because it goes all the way from encryption to
00:30:09
governance policies and things like that, but broadly speaking
00:30:13
you can break it up into left of boom and right of boom.
00:30:16
So left of boom are all the preventative things trying to
00:30:21
harden your world up, trying to prevent bad things
00:30:24
from happening.
00:30:25
That's the left of boom stuff.
00:30:27
The right of boom is, they got in anyway, or
00:30:33
something bad happened.
00:30:34
How fast can you detect it?
00:30:36
How fast can you respond?
00:30:37
How effectively can you get in there?
00:30:40
And as an industry we suck at it, honestly, so we're pretty
00:30:46
terrible.
00:30:47
The average time to detect, I think, when
00:30:52
you kind of go back to these breaches, it's three months from
00:30:56
the initial break-in to when they're actually detected.
00:30:59
If it's a real breach, 93 days is the dwell
00:31:02
time.
00:31:02
So it's kind of crazy.
00:31:05
And then, because they're all over your network, the
00:31:09
investigation time is really long, and then, because there's
00:31:11
such a mess, the delousing time is really long.
00:31:14
So it's like a year and a half.
00:31:17
And, yeah, exactly.
00:31:20
So the idea of right of boom is you want to
00:31:27
catch them as close to the boom as possible, because then
00:31:30
little damage is done and it's relatively easy to clean it
00:31:33
up and so on.
00:31:34
And then what you discover as things go right of boom should
00:31:39
inform the preventative side.
00:31:41
So the preventative side tries to prevent the boom from happening,
00:31:44
and in the right of boom
00:31:45
you learn from those mistakes and say, this is what I fix up.
00:31:50
But the problem with the left of boom is, our
00:31:53
systems are so complicated, the attack surface is so huge, that
00:31:58
perfect prevention is
00:32:01
like just impossible.
00:32:03
It's like having the entire continental US border and trying
00:32:12
to prevent all possible ways of anyone from ever entering the
00:32:15
country through underground tunnels or boats or
00:32:19
on some remote shoreline or whatever.
00:32:23
That's kind of not going to happen.
00:32:25
You're not going to be able to, you know, prevent everything.
00:32:29
So you want to have something where you can focus on
00:32:31
the most likely areas, you know,
00:32:35
where to best spend your prevention, but then have really
00:32:38
good detection.
00:32:38
And so the same thing happens in cybersecurity.
00:32:43
I could try to patch absolutely every vulnerability in my
00:32:47
product, for example.
00:32:48
That's been kind of the latest wave:
00:32:51
the shift left.
00:32:52
But the problem is, that only covers some of the things,
00:32:57
right?
00:32:57
What about all the unknown vulnerabilities in the product?
00:33:00
They could be in libraries that could be in your code.
00:33:02
Who's doing all the research to find the unknown
00:33:04
vulnerabilities in your code?
00:33:05
Because all these really are just bugs.
00:33:08
And do you ship code
00:33:10
that is a hundred percent bug-free?
00:33:11
Do you never find bugs at runtime?
00:33:13
It's
00:33:15
a lot like saying that what I want to do is make my
00:33:19
stuff so perfect that I won't monitor any of the systems
00:33:24
running for crashes or reboots or out-of-memory errors or
00:33:28
something like that, because I'm perfect. And it
00:33:32
doesn't work that way.
00:33:33
Instead, what you do is you build a good monitoring system,
00:33:38
good testing and monitoring systems, and you try
00:33:41
to detect what's going on, and use that.
00:33:45
If you can detect it really fast, you can recover from it,
00:33:49
mitigate the problem before any damage is done, and then you use
00:33:52
that to focus where you want to spend your
00:33:55
hardening efforts, and it's just a hell of a lot more
00:33:58
cost-effective, more interesting, and you get to spend more time
00:34:03
developing features and things that benefit people rather than
00:34:09
spending all your money and time
00:34:12
trying to protect everything.
00:34:14
Speaker 1: Yeah, that vulnerability management.
00:34:17
It's almost like
00:34:19
a legacy process or mentality, right? Where, in the
00:34:24
beginning of my career, I had a boss that if it didn't have a
00:34:29
CVE attached to it, he didn't care about it, he didn't want to
00:34:34
hear about it, he didn't want to know about it, like none of
00:34:37
it mattered, you know.
00:34:38
And I had to sit him down and say, like you know, can we both
00:34:41
agree that me gaining root without putting in a single
00:34:45
password on this server is a bad idea, right?
00:34:49
Like, that's what I wanted.
00:34:49
And he agreed to that.
00:34:51
Then I said, okay, well, there's this thing that doesn't
00:34:55
have a CVE. And I just used it right in front of him and I got
00:34:58
root on the server that I shouldn't have been able to get
00:35:00
root on.
00:35:01
And he's like, oh, how'd you do that?
00:35:02
I'm like there's no CVE, I don't know what you're talking
00:35:05
about.
00:35:06
You know, it kind of opened up his eyes, right, to like,
00:35:10
oh, there's more threats out there than what I've
00:35:14
been told before, what I've been used to.
00:35:16
And it was that mentality shift, though, that it took to
00:35:21
actually open someone's eyes and say, like, no, there's a
00:35:25
whole lot more
00:35:26
than just these CVEs.
00:35:28
Even now, you know, I encounter it, where people are
00:35:32
caring about the CVEs, you know, way too much.
00:35:35
It's like, yeah, the known stuff I'm not concerned about;
00:35:39
like, McAfee can catch that, I'm not concerned about it.
00:35:42
You know, not saying that we use McAfee or anything, which we
00:35:47
don't, thankfully, but you know, like, the lower-end endpoint
00:35:53
detection and response platforms can catch that stuff.
00:35:57
It's the stuff that I don't know about that worries me.
00:36:02
Speaker 2: Exactly, exactly.
00:36:03
And so the way it works, if you look at the attacker:
00:36:08
in the pre-attack stage, you've got to defend against
00:36:11
everything that they could possibly do. In the post-attack
00:36:14
stage,
00:36:14
they're on our turf, so they have to not make any mistakes to
00:36:18
avoid getting detected.
00:36:20
And so, they,
00:36:21
you know, they'll do fileless malware, or fileless attacks,
00:36:25
and living-off-the-land type of attacks.
00:36:26
But even that's really detectable in hindsight.
00:36:29
And that's where the context comes in, because if
00:36:34
you look at what attackers do after they get in the system,
00:36:37
it's kind of like I've been teleported into this room.
00:36:41
It's a dark room, I'm somewhere in the Pentagon, and I
00:36:45
don't know where I am.
00:36:46
There are traps everywhere. What do I do?
00:36:49
And so I might start looking around.
00:36:51
Maybe, you know, light a match or something and try to
00:36:55
cautiously look around and then be really cautious about trying
00:36:58
to move around.
00:36:58
But the very first things I'm doing are just trying to orient
00:37:01
myself.
00:37:01
So they'll try to figure out what user am I running as, what
00:37:05
privileges do I have?
00:37:06
What are the other systems around?
00:37:08
They do this and that in very subtle ways.
00:37:11
But all that work is something that an ordinary program would
00:37:17
never do, an ordinary user would never do, and so if you see a
00:37:21
bunch of that type of activity linked together, it's almost
00:37:24
certain that someone's broken in.
00:37:26
It's very, very suspicious activity.
00:37:28
The problem we have is we can't alert on
00:37:33
all that stuff and have humans do it, because it happens too
00:37:35
often.
00:37:35
I actually knew a guy.
00:37:38
There was a project at Google that you know tried to do
00:37:41
anomaly detection, detecting these kind of weird
00:37:44
things happening, and the problem is they did a really
00:37:47
good job of doing anomaly detection, but they were flooding
00:37:51
the SOC, because at Google scale, anomalies happen all the
00:37:53
time.
00:37:53
One-in-a-billion things happen like a thousand times a day.
00:37:55
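The scale arithmetic behind that remark can be sketched in a couple of lines; the daily event volume here is an assumed, illustrative figure, not a quoted one:

```python
# Back-of-the-envelope for "one in a billion is a thousand a day":
# the daily event volume below is an assumption for illustration.
events_per_day = 1_000_000_000_000  # assumed ~1e12 observed events/day at large scale
one_in = 1_000_000_000              # a one-in-a-billion anomaly
alerts_per_day = events_per_day // one_in
print(alerts_per_day)  # 1000
```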
It's just the scale kills you, and so what you need is to be
00:38:05
able to chain together those anomalies, to put them together
00:38:08
into a bundle.
00:38:10
In order to do that, you need enough context to be able to say
00:38:12
these things are all related together.
00:38:13
That's the anomaly side of it.
00:38:15
The other side of it is being able to characterize what is
00:38:20
normal, what is expected behavior for the system,
00:38:23
and then saying, if it ever drifts from that, alert on
00:38:28
it.
00:38:28
So for example, if I have a container image, it runs three
00:38:31
or four programs, it talks to three or four services,
00:38:35
east-west and north-south, and I can watch it and I
00:38:40
can catalog all that behavior, and after that I
00:38:45
can build that into a human-readable profile.
00:38:48
And after I put that in, I can say, if it ever does anything
00:38:50
different than that, then it's something bad.
00:38:54
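A minimal sketch of that drift check in Python; the profile schema and the process and connection names are invented for illustration, not Spyderbat's actual format:

```python
# Toy drift check against a declared baseline profile. All names and the
# schema here are hypothetical, purely to illustrate the idea.
BASELINE = {
    "processes": {"nginx", "sh", "logrotate"},
    "connections": {("10.0.0.5", 5432), ("10.0.0.9", 6379)},
}

def drifted(event):
    """Return True if an observed event falls outside the baseline."""
    if event["type"] == "process":
        return event["name"] not in BASELINE["processes"]
    if event["type"] == "connection":
        return (event["dst"], event["port"]) not in BASELINE["connections"]
    return True  # unknown event types are treated as drift

# An attacker's first orientation step (say, running whoami) drifts immediately:
print(drifted({"type": "process", "name": "whoami"}))  # True
print(drifted({"type": "process", "name": "nginx"}))   # False
```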
So now an attacker gets into that and they start trying to,
00:38:58
just like with that lighter, look around a little bit.
00:39:01
Well, it's all gonna be stuff that that thing has never done
00:39:04
before.
00:39:07
That's
00:39:07
drift, it could be called drift detection, but it's also
00:39:11
kind of this really targeted anomaly detection that isn't
00:39:14
generic, but is really kind of trained on
00:39:19
how your program works. And you integrate that into the
00:39:22
development life cycle, and now a developer can look at that
00:39:25
thing and say, yeah, that's how my program is, all that's
00:39:27
expected. And so you get this loop that's closed, and it's
00:39:30
freaking almost impossible to break out of that thing.
00:39:34
As soon as you start doing anything, you can react to it
00:39:37
and say, that's definitely bad, and,
00:39:39
You know, kill the pod, kill the process, kill the container,
00:39:42
whatever you have.
00:39:44
Speaker 1: That's actually really interesting, and that's
00:39:46
just right of boom.
00:39:47
Hmm, so is there a learning period where you know you deploy
00:39:53
it in the environment and then is it learning, you know, for 90
00:39:57
days, right, or whatever it might be, where it's saying this
00:40:02
is normal, this is expected behavior, then, based on
00:40:05
everything that we've seen everywhere else, this is also
00:40:09
normal and this isn't, you know, is that learning phase there?
00:40:13
Speaker 2: It's not exactly that.
00:40:14
It's similar to that, but not quite. So
00:40:17
what Spyderbat does, just real briefly, is you
00:40:20
deploy this agent on a node, a Kubernetes node or just a
00:40:26
regular VM and it starts recording everything that's
00:40:29
going on on the node, every process, every network connection,
00:40:31
every file access, and links them together.
00:40:33
So that I know that this process launched this process
00:40:37
which created a listening socket, which accepted a connection,
00:40:39
which then opened this thing, which made a connection over to
00:40:42
this other process on this other machine, which accepted the
00:40:45
connection and then downloaded this file, wrote it in as a
00:40:48
cron job.
00:40:49
Three days later that cron job executed, and so now I can
00:40:52
trace this causal chain of what caused that cron job
00:40:56
to execute.
00:40:57
So that you can build that graph and you build that
00:40:59
automatically.
00:41:00
Once I'm watching, it's the mother of all root cause
00:41:03
analysis.
00:41:04
How did this thing get started?
00:41:06
Why is this thing, you know, transferring data over
00:41:08
here?
00:41:09
And, oh, bob installed that script two days ago.
00:41:12
There's a typo in the script,
00:41:15
that's what's doing this stuff.
00:41:18
So it's the mother of all root cause analysis.
00:41:20
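The causal chain described here can be sketched as a simple parent-pointer graph; this is a toy illustration with invented event names, not Spyderbat's data model:

```python
# Toy causal graph: each recorded event points at the event that caused it.
# Walking the parent links answers "how did this thing get started?"
events = {
    "sshd":       None,          # root of this chain
    "bash":       "sshd",        # sshd launched a shell
    "curl":       "bash",        # the shell downloaded a file
    "write_cron": "curl",        # the download was written in as a cron job
    "cron_exec":  "write_cron",  # three days later, the cron job ran
}

def root_cause(event):
    """Follow causal links back to the originating event."""
    chain = [event]
    while events[event] is not None:
        event = events[event]
        chain.append(event)
    return list(reversed(chain))

print(root_cause("cron_exec"))
# ['sshd', 'bash', 'curl', 'write_cron', 'cron_exec']
```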
But then you can also look at that from a different lens and
00:41:24
say, let me take an image run and say, for this run in this
00:41:29
container of this image running, build out what it does, and
00:41:33
then another and another, and you just do that on your
00:41:35
development system.
00:41:36
So now I know, after 50 or 100 runs of the developer
00:41:41
running all the tests and running it in their integration
00:41:43
environment, I can merge those all together to create a merged
00:41:47
baseline that a human can read, and the human, the developer,
00:41:52
can look at that and say, yeah, these are the processes
00:41:55
it runs, these are the users they run as, these are the
00:41:57
executables, these are the connections each one makes, and
00:42:00
you can bound it and generalize it as needed, and then
00:42:04
say, that's my profile.
00:42:07
Now you put that back in, and then every time a deviation
00:42:09
comes in, it opens it up as a pull request in GitHub.
00:42:13
It notifies you:
00:42:16
your thing has drifted,
00:42:17
this looks like what's changed.
00:42:19
Either accept the pull request or, you know, something bad
00:42:27
has happened.
00:42:29
But the developers in the loop and they know how the program
00:42:32
works.
00:42:32
So it's not this poor guy in the SOC trying to reverse
00:42:34
engineer what a developer did.
00:42:37
The developer is able to say this is the way it works.
00:42:40
Once you have that, then that becomes something you can use in
00:42:44
the staging environment, the production environment, in the
00:42:47
same way.
00:42:47
So it takes maybe about a week or two of learning period.
00:42:52
But it's not learning an image, a particular instance of an
00:42:57
image.
00:42:57
It's learning what each program does, and
00:43:02
then after that it's very stable.
00:43:04
It takes very little effort to maintain it and it becomes
00:43:11
really hard to break out of it, just because, you know, almost as
00:43:16
soon as you get a foothold in there you're going
00:43:20
to get caught.
00:43:22
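Merging many recorded runs into one baseline can be sketched as a union over per-run behavior catalogs; again a toy model with invented fields, not the actual product schema:

```python
# Toy baseline merge: each run yields a catalog of observed behavior;
# the merged baseline is the union across all recorded runs.
runs = [
    {"processes": {"app", "sh"}, "connections": {("db", 5432)}},
    {"processes": {"app"},       "connections": {("db", 5432), ("cache", 6379)}},
    {"processes": {"app", "sh"}, "connections": {("cache", 6379)}},
]

def merge_baseline(runs):
    baseline = {"processes": set(), "connections": set()}
    for run in runs:
        baseline["processes"] |= run["processes"]
        baseline["connections"] |= run["connections"]
    return baseline

profile = merge_baseline(runs)
print(sorted(profile["processes"]))    # ['app', 'sh']
print(sorted(profile["connections"]))  # [('cache', 6379), ('db', 5432)]
```

The human-readable profile Brian describes would then be a rendering of this merged structure that a developer can review, tighten, or generalize before it is enforced.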
Speaker 1: Yeah, it's really fascinating because you know
00:43:26
you're kind of shifting left in the perspective of who's
00:43:30
responding to it, you know, and who's like kind of on the hook
00:43:33
for it first, at least.
00:43:34
That's been a real headache that I have personally
00:43:39
experienced and that I know others are going through,
00:43:43
because you know, it always gets pushed back on the security guy
00:43:47
right to figure out what's actually going on, when in
00:43:52
reality we didn't write the code right, we didn't create this
00:43:55
application or whatever's going on, like we have no clue what is
00:43:59
going on there in that 10 lines of code or 100 lines
00:44:03
of code, like who knows right, and so it always puts us into a
00:44:08
very difficult situation of where, you know, we don't know
00:44:12
what's going on, right, because we haven't been a part of it and
00:44:16
it's our responsibility to determine it and, like, figure
00:44:19
it out.
00:44:20
You know, all in 15 minutes, all in 20 minutes, you know?
00:44:25
I mean that's a really fascinating way of approaching
00:44:30
this problem.
00:44:30
Was that also a perspective or a thought that you had when you
00:44:36
were creating this?
00:44:36
Like, you know, let's go to the source, right, and they
00:44:40
would actually know what this is doing.
00:44:43
Speaker 2: It was partly that and partly it was based on kind
00:44:47
of changing the conversation.
00:44:48
So when I was at TippingPoint, we built this box that was an
00:44:56
inline network box.
00:44:57
Conceptually, you cut a network wire, you plug both ends into the
00:45:00
box, and it's a bump in the wire, but it filters out the attacks.
00:45:03
Okay, and one of our customers, a very large company, said that
00:45:13
it completely changed the way they acquired other companies.
00:45:16
So their old process was, they would buy, they were like
00:45:21
buying
00:45:21
a company a
00:45:22
week, literally.
00:45:23
It was a huge company and whenever they acquired a new
00:45:28
company the networking team would have to go in before they
00:45:31
could connect the network.
00:45:31
So there's a lot of pressure to connect the networks together
00:45:34
to start realizing the value of business.
00:45:36
But there's a security risk of just blindly connecting them up.
00:45:40
00:45:40
So they used to have to go in and do this kind of full audit
00:45:45
of every system they had and review all their security plans
00:45:48
and stuff.
00:45:48
And it was just this deep-dive exam that no one wanted.
00:45:52
It was a painful conversation.
00:45:54
It's like that IRS-audit-type feeling, you know, it's just,
00:45:59
no one wants it, and the IRS guys are not happy about it
00:46:04
either.
00:46:04
No one's happy in this.
00:46:07
And so instead, what they do is they just put the TippingPoint
00:46:10
box in there between them and connect them together, and now
00:46:13
they could say, okay, this machine is infected with this,
00:46:15
this machine is infected with this, and so on.
00:46:16
And so they go back to that same group with this very
00:46:21
actionable list:
00:46:21
these things are problems in your environment that you can
00:46:24
fix.
00:46:24
And it was a much better conversation.
00:46:26
So it just kind of flipped the script, and the same thing is
00:46:30
happening here.
00:46:31
You know, if we try to solve the vulnerability side
00:46:35
the way a lot of security groups work, they go to the
00:46:38
development organization and try to embed themselves in and get
00:46:41
them to develop a lot of process and review CVEs and do
00:46:45
scanning of all their code.
00:46:46
Tell me why this vulnerability that you know is in your code
00:46:51
can't be exploited. And they're like, I don't even know if anyone uses
00:46:53
that library, so I don't really need to worry about
00:46:58
that.
00:46:59
And so it's this kind of painful conversation.
00:47:01
Instead you go to them and say, look, I'm seeing your
00:47:04
application behave exactly this way, and they look at it and say,
00:47:08
yeah, yeah, that's expected, but that,
00:47:10
what the heck is that?
00:47:10
And as a developer, they want to know about it because now,
00:47:14
all of a sudden, it's a puzzle, it's an interesting thing, and
00:47:18
then they find out more about how the code or some
00:47:20
library that they're using actually works, which is
00:47:23
usually interesting. And if it's anything bad, they can catch it
00:47:27
immediately, and they're not reacting to it in a fire drill
00:47:30
with the security team after some breach has happened, which
00:47:33
no one likes.
00:47:34
So it's changing this security process from a bunch of things
00:47:37
no one likes to something that's kind of, frankly, more
00:47:40
fun and easier, with better conversations.
00:47:45
Speaker 1: That's interesting.
00:47:46
You know, brian, unfortunately I think that we're coming to the
00:47:51
end of our time here, but I feel like I could talk to you
00:47:53
for another hour, you know.
00:47:55
So that just obviously means that you'll have to come back on
00:48:00
some time and we'll talk more.
00:48:02
Speaker 2: Absolutely.
00:48:02
Anytime. I enjoy the conversation, these things,
00:48:07
it's always fun. Awesome.
00:48:09
Speaker 1: Well, Brian, you know, before I let you go, how about
00:48:11
you tell my audience,
00:48:12
you know, where they could find you, where they could find your
00:48:15
company, Spyderbat, and, you know, all that good information if
00:48:18
they wanted to learn more.
00:48:20
Speaker 2: Yeah, so the name Spyderbat.
00:48:23
Actually, we're based in Austin.
00:48:25
We're actually fully remote,
00:48:25
we're a virtual company, but the founders are based out of
00:48:30
Austin, and Austin has bats.
00:48:32
They have the largest colony of Mexican free-tailed bats in
00:48:38
North America.
00:48:38
About a million bats live under the Congress Avenue bridge.
00:48:41
Speaker 1: It's kind of neat.
00:48:42
Speaker 2: If you're down there, you can go out at night on the
00:48:44
shore and the bats fly out, starting about 30 minutes before
00:48:47
sunset and it's just waves of bats coming out of it.
00:48:51
It's really cool.
00:48:51
So there's a type of bat called a spider bat that eats
00:48:55
spiders, and so when we were founding the company,
00:48:57
you know, we thought that was just a neat name and
00:49:01
kind of cute.
00:49:01
This was before COVID, when bats became a little less
00:49:04
popular. And so we were going to open up the account, and the guy
00:49:10
said, you know, what type of business are you in?
00:49:11
Oh, cybersecurity.
00:49:11
So he wrote spiderbat with a Y, S-P-Y-D-E-R, and so that's
00:49:18
how the name got turned into Spyderbat with a Y. But
00:49:23
anyway, so that's the name of the company,
00:49:25
spyderbat.com, that's with a Y, so you'll be able to find it, and I'm
00:49:32
on there.
00:49:32
I'm just brian@spyderbat.com or bsmith@spyderbat.com,
00:49:37
either way, both work, and so you can find me there.
00:49:41
And, you know, I'm on LinkedIn, I'm on all the usual.
00:49:45
So you can find me, and I'm interested in
00:49:49
hearing, you know, from anyone.
00:49:50
I enjoy talking, as you can tell.
00:49:53
I enjoy talking about cybersecurity and kind of most
00:49:58
things interesting in both technology and physics.
00:50:00
So I still like physics, but it's a hobby.
00:50:07
Speaker 1: Not a job.
00:50:07
Yeah, I can't imagine that being a job.
00:50:11
But that's just me, I mean, someone that failed physics.
00:50:16
Well, yeah, I mean, that's fantastic.
00:50:20
You know, and Brian, I really appreciate you coming on and
00:50:23
I'll definitely put you know all of your links in the
00:50:26
description of the episode.
00:50:26
So if anyone wants to, you know reach out or check it out.
00:50:30
They absolutely can.
00:50:31
But thanks everyone for listening.
00:50:33
I hope you enjoyed this episode.