In this insightful interview, Joe sits down with cybersecurity expert Mr. Jeff Man to delve into his extensive background in security and his impactful tenure at the National Security Agency (NSA). They explore how Jeff embarked on his security career, the critical mission of the NSA, and the agency's compartmentalized structure. Jeff recounts his experiences working on significant projects at the NSA and underscores the importance of compartmentalization for security. The discussion also highlights the challenges of government work and the stringent entry requirements for agencies like the NSA.
The conversation spans various topics, including the complexities of handling different telecom and operating systems, the advanced technology at the NSA, the pioneering days of hacking and network security, and the formation of the first red team. Jeff shares his motivations for staying at the NSA and the circumstances that led to his departure. Additionally, he talks about his current work in PCI compliance and his active participation in the security community through conferences and podcasts. Don't miss this deep dive into the world of cybersecurity from a seasoned expert.
00:00 Introduction and Podcasting
03:47 Getting into Security
10:47 Jeff's Background and Entry into the NSA
15:58 The Mission of the NSA
22:27 Challenges of Working in the Government
29:07 Overlapping Projects and Duplication of Efforts
31:02 Technological Advancement at the Agency
36:47 The Early Days of Hacking and Network Security
51:42 Reasons for Staying at the Agency
54:20 Leaving the Agency and the Significant Incident
57:06 Current Work in PCI Compliance and Involvement in the Security Community
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Jeff?
00:00:01
It's great to get you on the podcast.
00:00:03
I'm actually really excited for our conversation.
00:00:07
Speaker 2: I'm happy to be here.
00:00:09
It took a while to make this arrangement, but I've heard
00:00:13
other guests say that, so you must have a way of doing things.
00:00:17
Speaker 1: Yeah, it's a lot of demand.
00:00:20
Let's say it's all demand, right.
00:00:25
Yeah, of course, everyone wants to get on
00:00:26
this thing right.
00:00:26
But you know, it's uh, it's interesting when you're, when
00:00:28
you're running your own podcast and you're, you know, working a
00:00:32
nine to five, right, because it's like you have to fit it in
00:00:35
where you can fit it in, and I used to kind of just open up my
00:00:39
entire week for recordings and I realized that I got burnt out
00:00:43
very quickly doing that and so now I dial it into you know one
00:00:48
or two days a week, like that's my recording period, and then
00:00:51
you know which, which I guess kind of creates a backlog right.
00:00:56
It creates a backlog of recordings and it creates a
00:00:59
backlog of people trying to get on, which alleviates, I guess, a lot
00:01:03
of the stress that I had in year one of doing this right, like
00:01:06
we're three years in and year one was just like the worst time
00:01:10
trying to find guests.
00:01:12
Last minute there were so many episodes where it was just me
00:01:15
talking.
00:01:16
You know it's like, and I'm sitting here like no one wants
00:01:18
to hear me, just talk you know, that's what those LinkedIn
00:01:22
shorts are for.
00:01:24
Yeah, I guess I hate social media.
00:01:26
Now, you know, I didn't.
00:01:28
I didn't hate it that much before podcasting, but now that
00:01:32
my my growth right is kind of determined based on my
00:01:37
engagement that I get on social media, I hate it so much more
00:01:42
and I wish, I wish I was making the money to just hire like a
00:01:46
social media manager, just like here, just figure all this out,
00:01:49
you know.
00:01:50
Speaker 2: Yeah, Uh, like many things, uh, a necessary evil
00:01:54
that uh isn't going away anytime soon.
00:01:57
You know me, I'm focused on how soon can I retire and just walk
00:02:01
away from everything, Right, Uh , so a few few years away, I am
00:02:06
having fun at a lot of levels doing what I'm doing these days,
00:02:09
but definitely there's days I would love to just, you know,
00:02:15
walk away from all of it.
00:02:16
I mean, I could be getting blackballed on social media
00:02:20
and doxed and DoSed and I wouldn't even know it because I
00:02:23
just don't get on that much.
00:02:24
But yeah, it is what it is.
00:02:29
Speaker 1: Yeah, yeah, absolutely.
00:02:30
Well, I'm sure we will get there in our conversation, but
00:02:34
you know, to start, you know why don't you talk about how you
00:02:38
got into security, right?
00:02:40
The reason why I ask everyone this question is because there
00:02:44
is a portion of my viewership that, or listeners, you know,
00:02:48
whatever platform they're on, that are trying to make that
00:02:51
jump into IT or security, right, and they probably don't know if
00:02:55
it's possible for them, right, they might be coming from
00:02:58
various backgrounds and I feel like it's always helpful for
00:03:02
everyone to hear someone else's background, right, Because I
00:03:05
remember when I was trying to get into, specifically, the
00:03:09
federal government, right, it was extremely helpful for me to
00:03:13
hear someone that came from even my same area, right, same
00:03:18
background even, and he made it, and so I was like, well, if he
00:03:21
made it, then why can't I make it?
00:03:24
Right and so getting that mentality switch, I think is key
00:03:27
for everyone.
00:03:27
So how did you get into it?
00:03:30
Speaker 2: Well, uh, I got into this in the early mid 1980s, you
00:03:37
know.
00:03:37
So, back in the 1900s, before the internet was a thing, before
00:03:42
cell phones and mobile phones were a thing, people barely had
00:03:48
computers, what were called desktop or personal computers
00:03:51
back in those days.
00:03:52
I've actually told this story in one of the talks I've given
00:03:58
at conferences, and I think there's recordings of it out on
00:04:01
YouTube somewhere.
00:04:02
The name of the talk is Hackers Are Neither Created Nor
00:04:06
Destroyed, given that nothing existed back then.
00:04:09
There were no training courses, no certifications.
00:04:12
The real question is well, the short answer to your question is
00:04:17
I worked for some time for the National Security Agency and I
00:04:22
started out as a cryptologist.
00:04:23
But the real answer to the question is how did I get to NSA
00:04:28
?
00:04:28
And I chuckle because when I was putting this talk together
00:04:33
and sort of thinking back about my origin story, I think you
00:04:39
know the NSA that exists today.
00:04:40
They probably would never hire me, just for all sorts of
00:04:45
different reasons, mostly because they're always looking
00:04:47
for very specific skills, and even when I was applying to NSA
00:04:53
back in the mid-80s, they didn't call it STEM back then, they
00:04:58
just called it critical skills.
00:04:59
They were looking primarily for engineers, mathematicians and
00:05:02
computer scientists and I won't tell the whole story, but I went
00:05:07
through the traditional route.
00:05:09
I sent in an application, the government SF-171 form, by
00:05:16
mail US Postal Service-type mail Got a response and was invited
00:05:22
to Fort Meade, to NSA headquarters actually to one of
00:05:26
their satellite offices to go through a couple days worth of
00:05:30
sort of aptitude skills testing.
00:05:32
And I forget exactly how many tests.
00:05:35
There were 10 or 12 or 14 tests.
00:05:37
Long story short is I scored well enough on all these
00:05:47
aptitude tests that they hired me without any kind of position
00:05:49
in mind, because they were hiring a lot of people back then.
00:05:50
They were doing a lot of recruiting at universities,
00:05:52
again hiring these critical skills, trying to entice people
00:05:56
out of college as they graduated college to come work for NSA.
00:06:00
And back in those days nobody knew what NSA was.
00:06:02
By the way, they didn't advertise.
00:06:05
We weren't allowed to say that we worked at NSA.
00:06:08
We could only say we worked for the Department of Defense back
00:06:11
in those days.
00:06:13
But I ended up going to work, for my first assignment was in a
00:06:16
manual crypto systems branch in what at the time was called
00:06:20
communication security, which was sort of the defensive side
00:06:24
of the house.
00:06:24
Later it became known as the Information Security Directorate, or
00:06:29
InfoSec, thus my start in information security.
00:06:33
How I knew I was in the right place was I grew up in a family
00:06:38
that loved to do puzzles and one of my favorite pastimes was
00:06:43
doing the Dell Crossword Puzzle book, which had logic problems
00:06:46
in it, and I loved doing the logic problems.
00:06:49
My first assignment in this manual crypto system branch.
00:06:53
I had a mentor who was a cryptanalyst on loan from the
00:06:56
operations side of the house.
00:06:58
One day at lunchtime he was doing some stuff with graph
00:07:03
paper and colored pencils and I kind of asked him what are you
00:07:06
doing?
00:07:06
He says I'm writing logic problems as a side job.
00:07:09
I write logic problems for Dell Crossword Puzzle Magazine, and
00:07:13
that was just one of these karma moments, I would say.
00:07:15
And I say this a lot to people that ask this
00:07:20
question you know, don't get hung up on the technology things
00:07:23
, the STEM things.
00:07:24
While that's important and while we need people with those
00:07:43
skills, there's a lot of other skills out there that people are
00:07:44
looking for.
00:07:44
You know, I liked logic problems, so I had sort of a
00:07:47
logical mindset which kind of made me good at math, but I'm
00:07:49
not a mathematician.
00:07:49
Critical thinking, abstract thinking, people that are
00:07:52
involved in music and are good at music.
00:07:54
It's really what makes you a hacker and how I decided at some
00:07:59
point oh, I am a hacker.
00:08:00
What are the attributes?
00:08:01
Not accepting answers, thinking there's a better answer or a
00:08:09
better way, liking to tinker, take things apart, find out how
00:08:13
things work, curiosity being organized.
00:08:17
During COVID we talked to some people I talked to some people
00:08:20
somewhere along the line, people that were getting laid off from
00:08:24
restaurants, that were like chefs and short order cooks have
00:08:28
to be really skilled and organized and they were getting
00:08:31
into cybersecurity and finding out they were really good at
00:08:34
things related to cybersecurity because they had such good
00:08:38
organizational skills, leadership skills, able to sort
00:08:43
things out and do things, multitasking and so on and so
00:08:45
forth.
00:08:46
So there's lots of different things that might make you good
00:08:51
in this field.
00:08:53
I'm not really a good textbook example because I just kind of
00:08:56
stumbled upon it.
00:08:57
Frankly, I got into hacking and computer hacking at the beginning,
00:09:00
sort of as everybody was starting to do it.
00:09:07
Again, no training courses, no certifications or degrees.
00:09:10
So you know, I've always kind of scoffed at certifications and
00:09:14
I've kind of scoffed at degrees , but I realize that they're
00:09:17
necessary for people.
00:09:18
But you know, my pedigree is I lived it, I grew up in it, which
00:09:24
not everybody can say.
00:09:25
So, acknowledging that, I try to do as much as I can to just
00:09:31
impart knowledge and experience that I've had over the years on
00:09:35
the next generation and try to encourage people to don't give
00:09:39
up.
00:09:39
Try new things, try different things, find your niche, find
00:09:44
out what you, what you like to do, find out what you're good
00:09:47
good at and do that, uh, that type of thing.
00:09:49
Long-winded answer, but that's sort of my background story.
00:09:54
Speaker 1: No, but I think it's,
00:09:55
I think it's really interesting, you know, because
00:09:58
you didn't we always think of I mean, even I do, to this day,
00:10:03
right, when I'm thinking of someone that goes into the NSA.
00:10:07
I mean, these guys are, you know, mathematicians and
00:10:11
cryptologists and some of the absolute smartest of the smart
00:10:15
right. Like, I've applied to the NSA previously, you
00:10:19
know, years ago, right, and even when I was applying, I was
00:10:22
thinking like why the hell are they going to choose me?
00:10:23
What I'm not?
00:10:25
I'm not a mathematician.
00:10:27
Yeah, I like math, but that doesn't make me anything special
00:10:31
in math.
00:10:31
You know what?
00:10:33
What value would they get out of me?
00:10:34
Right, right, and you, you came into it in a period where you
00:10:41
know they were more open right to skill sets and potential
00:10:45
skill sets, rather than, you know, I guess, the
00:10:50
accreditations on paper that they're looking forward to right
00:10:55
now.
00:10:55
More right, they're looking for those people that have the
00:10:58
degrees in math, right, and the degrees in, you know, cryptology
00:11:03
, the experience you know from a company like RSA, for instance
00:11:07
that has been doing crypto for 10 years.
00:11:11
That's what they're looking for.
00:11:12
Now.
00:11:12
The bar is much higher.
00:11:14
Speaker 2: Right.
00:11:14
Well, I also came into NSA at a time when we were still
00:11:21
fighting the Cold War with the Soviet Union.
00:11:24
It was during the Reagan administrations, and one of the
00:11:29
big strategies there was basically to try to bankrupt the
00:11:32
Soviet Union, which is effectively what happened.
00:11:34
So there's a lot of money being poured into the Defense
00:11:38
Department.
00:11:38
They were literally hiring 100 people a week at NSA.
00:11:43
They were casting a pretty wide net and they were growing
00:11:48
through attrition, because, you know, I don't know the exact
00:11:51
numbers, but let's say, you know , 80 out of the 100 people that
00:11:55
they were hiring were people coming straight out of college
00:11:58
or people with these critical skills degrees, and they would
00:12:01
immediately put them into what they called the 20/20 program.
00:12:04
So you could go get your graduate degree and have the
00:12:08
government pay for it and only work half-time 20 hours of work,
00:12:11
20 hours of school.
00:12:12
So people were coming in with these advanced skills.
00:12:16
By the way, they were paying them extra.
00:12:19
There was an accelerated pay scale, immediately sending them
00:12:21
to school and paying for their degree, and the vast majority of
00:12:26
these people would get their degrees, fulfill their
00:12:31
requirement to the government for having gotten all this stuff
00:12:34
which at the time it was time of service, but the clock was
00:12:39
running while they were there.
00:12:41
You know, somebody figured it out.
00:12:43
But you could basically get your graduate degree and leave
00:12:46
like two or three months later and go out into the private
00:12:48
sector and make more money, never look back, and a lot of
00:12:51
people did that.
00:12:52
So NSA was I don't know if it was their strategy or not, but
00:12:55
they were growing incrementally by bringing in a lot of people
00:13:01
and, you know, saving the ones that survived or didn't leave,
00:13:05
which you would think would be like second-class citizens, and
00:13:09
you know, the ones that weren't the cream of the crop, that
00:13:11
couldn't go out and get the job in the private sector.
00:13:13
I don't know, because that kind of puts me into that category.
00:13:16
I like to think I'm different, but you know, I graduated from
00:13:21
college with a human resource management degree, so it was a
00:13:24
business major, and I had a whopping 2.65 grade point
00:13:28
average, and somehow, but I was good at some stuff and I scored
00:13:33
well on aptitude tests and I just had happened to be at the
00:13:37
right place at the right time and had the mindset of a hacker
00:13:41
where, when I was asked to do things and, you know, could we
00:13:46
do something differently?
00:13:47
Could we do something that hasn't been done before?
00:13:50
I was naive enough to say I don't see why not, so let's do
00:13:53
it.
00:13:53
So I had some early successes at NSA doing some, as it turns out
00:13:57
, innovative things and, you know, sort of rounded out my
00:14:01
career at NSA, effectively architecting the first red team
00:14:06
at NSA, getting into ethical hacking and penetration testing.
00:14:09
Because a lot of us that worked in this one little group saw
00:14:14
the movie Sneakers, had the mission of doing evaluations of
00:14:18
fielded systems, including network systems we called them
00:14:21
distributed systems at the time and somebody in our team said
00:14:26
why don't we just start learning how to be hackers?
00:14:28
That movie came out and we all remember the classic movie War
00:14:34
Games.
00:14:34
It's like we should be doing that instead of just sort of the
00:14:38
controlled, scientific, engineering-oriented analyses
00:14:42
that we had been doing, the right place at the right time
00:14:46
and the right mindset or persona .
00:14:49
That I happened to be able to and was naive enough to think
00:14:54
that we can do things differently, which is hard to do
00:14:57
at very large government bureaucratic organizations,
00:15:00
which is one of the reasons why I left there, but that's another
00:15:03
story for another day.
00:15:06
Speaker 1: So what's the overall mission statement of the NSA?
00:15:12
And it leads into my next question.
00:15:14
So I know that's pretty basic, but what is it, at least when
00:15:18
you were there.
00:15:20
Speaker 2: Well, I don't know that we had a mission statement
00:15:24
per se, but the way I used to describe it was we were sort of
00:15:29
the nation's ear.
00:15:31
We were the ones that were listening to what people were
00:15:34
saying, and of course we were doing that globally and we were
00:15:38
listening to all sorts of different things, which were
00:15:41
primarily, in those days, radio signals of one frequency or
00:15:45
another, maybe a little bit of telephone traffic, a whole lot
00:15:53
of radio traffic and other creative ways that we could
00:15:54
intercept communications.
00:15:55
But the mission was more or less we were an information
00:15:58
gathering agency, sort of adjacent to things like the CIA,
00:16:02
which were the actual spies, you know, hiring people to
00:16:06
commit espionage and stuff like that, whereas NSA was just sort
00:16:10
of the big ear where we were listening to everything and any
00:16:12
kind of data or signal that we could collect in any number of
00:16:17
classified ways.
00:16:18
You know if it happened to be encrypted in some way.
00:16:22
Of course, you know NSA was known for being code breakers,
00:16:25
cryptanalysts, and so that was a large part of the mission.
00:16:29
But that was what we called operations, which was probably
00:16:33
about 80% of the mission at the time that I was there, and the
00:16:37
exact number is probably classified, but it's been a long
00:16:40
time.
00:16:40
It's been a long time.
00:16:42
The other side of the house, what was called what came to be
00:16:56
known as InfoSec, was responsible for producing all
00:16:57
the secure communications cryptographic systems that were
00:16:58
being used by US special forces, anybody in the military,
00:17:00
anybody who had a classified mission, state department
00:17:01
embassies and so on and so forth.
00:17:03
So we were making all the little black boxes.
00:17:05
At those times it was very much an engineering organization and
00:17:08
there was very much a mentality that, I mean, I just
00:17:13
distinctly remember a chief scientist saying, you know
00:17:17
, there's really no such thing as software.
00:17:19
Everything we do is hardware, or let's say firmware-based
00:17:39
in the modern dialect.
00:17:40
But the idea of doing something in software, which is something
00:17:40
I did early on, was very foreign. The crypto would happen and out
00:17:46
would come the code or the cipher that would be transmitted
00:17:50
and somebody had the box on the other end.
00:17:52
That would reverse the process and get back to whatever it was
00:17:56
that the message traffic was. Primarily communications traffic.
00:18:06
Back then it wasn't as much storing secrets and protecting
00:18:07
secrets that were what we call these days data at rest.
00:18:10
It very much had to do with.
00:18:12
I mean, the organization was called Communications Security
00:18:15
when I first started there, because we were intercepting the
00:18:18
secrets that were being transmitted past from one end to
00:18:22
another.
00:18:22
That's actually an important distinction, because I think
00:18:25
that's one of the huge changes that came about with the dawn of
00:18:29
the internet age and the digital age is we started
00:18:35
sharing a lot of information but wanted to have some way to put
00:18:40
the data online and make it freely available to everybody
00:18:43
but only certain people.
00:18:44
So we had to come up with all sorts of different ways of
00:18:47
trying to protect data and started thinking about different
00:18:51
levels of the classification of the data, the different
00:18:53
sensitivities.
00:18:54
Back then, data at rest was necessarily all printed on paper and locked in
00:19:07
safes and locked rooms and locked buildings with guards and
00:19:09
guns and machine gun nests and barbed wire fences and so on and
00:19:12
so forth.
00:19:13
Very much the mission back then was communication security and
00:19:18
everything involved in communications, whether it was
00:19:21
listening and decoding what everybody else was saying and
00:19:24
then reporting that to the decision makers, the military
00:19:27
leaders, the Congress, the president and so on and so forth
00:19:30
, so they could have information to make better informed
00:19:33
decisions about.
00:19:34
Are we going to go to war with somebody or whatever the
00:19:40
question was at the time.
00:19:43
Speaker 1: Yeah, that's really fascinating.
00:19:45
You know, back then was the NSA very siloed?
00:19:49
It doesn't sound like it was very siloed.
00:19:51
You know, now, right 10 years ago, at this point 10 years ago,
00:19:56
I did a little bit of work with some agencies and I mean, it is
00:20:03
so siloed, is siloed within silos, right, and it is, um,
00:20:07
it's kind of crazy how they get anything done.
00:20:11
You know, to be completely honest with you, because it's
00:20:14
like you know I was, I was working with one guy on the left
00:20:18
side of this aisle right, massive aisle right takes you 10
00:20:22
minutes to walk from one end to the other in a building.
00:20:25
And I'm working with the guy on the left and you know, he gave
00:20:29
me some, he gave me what he could tell me that he works on
00:20:34
Right, and I was like, well, what, what's he work on right
00:20:38
across the aisle Right, like they've known each other for 30
00:20:41
years, they've worked together for 30 years at different
00:20:46
agencies and things like that, Right, and I was like, no, but
00:20:51
you know, do you, do you actually know or do you actually
00:20:55
not know he goes?
00:20:56
No, I've known him for 30 years .
00:20:58
I have no clue what he works on .
00:20:59
All I know is he works on the same product family as I do, and
00:21:05
that's it.
00:21:06
And I was like what?
00:21:07
And then you know, you could go all the way down the aisle and
00:21:11
everyone will give you the same answer.
00:21:13
And you know, maybe, maybe that is a part of the veil, right,
00:21:18
when, where they're not going to tell you because you're not,
00:21:22
you're not cleared or whatever.
00:21:23
It is Right.
00:21:24
But I feel like when you go to 30, 40, 50 people, you know and
00:21:31
I mean, you know, I wasn't a podcaster at the time, right,
00:21:34
but I'm able to talk to people, I'm able to typically, you know,
00:21:37
get information out of people in different ways.
00:21:39
Right, when they give you the same answer of no, I don't know
00:21:45
what they work on.
00:21:45
I've known them for years, right.
00:21:48
Was that the case back then?
00:21:50
Or was it a little bit more open for you to you know?
00:21:54
Maybe talk to the guys right that created the black box and
00:21:58
now you're creating the software that goes into that black box
00:22:02
and because now I don't think that someone in your position
00:22:07
creating that software, I don't even think that you would know
00:22:09
that there's a black box right, they're.
00:22:11
They're just saying create a software that does this right.
00:22:18
Speaker 2: No, there were certainly silos and a lot of the
00:22:21
concept was actually very deliberate.
00:22:23
We didn't call it silos, we called it compartments,
00:22:26
compartmentalization and it had to do from an information
00:22:31
security or data security perspective.
00:22:34
I mean, we still have the concept these days need to know,
00:22:37
we talk about it in terms of escalation of privileges and
00:22:41
access to different areas in the technology and our data storage
00:22:45
and databases and things like that.
00:22:47
But the concept is simply the fewer people that know about
00:22:50
something that's very sensitive, the better, because then you
00:22:54
have fewer people that are going to have loose lips.
00:22:57
You know loose lips sink ships,
00:23:00
the old World War II security poster.
00:23:02
The fewer people that if it is discovered that some information
00:23:07
is leaked, the fewer people to investigate.
00:23:10
You know, there's just all sorts of different reasons why
00:23:15
you add compartmentalization.
00:23:17
I had the Top Secret/Sensitive Compartmented Information, TS/SCI,
00:23:24
clearance, as most people, pretty much everybody at NSA, did.
00:23:28
But even beyond that, if you're working on a different problem
00:23:37
and problems, let's say, were loosely geography-related,
00:23:42
geopolitically-related very much different compartments,
00:23:48
different security clearances that you would have to get read
00:23:53
into signed papers promising that you're not going to reveal
00:23:56
secrets about it, and so on and so forth, and very much
00:24:01
dependent on what the target was , what the object of interest
00:24:06
was, and it could be very literally right next door to one
00:24:09
another and you really wouldn't know what the other people are
00:24:12
doing.
00:24:12
And that was primarily on the operations side of the house.
00:24:15
On the defensive side of the house, the information security
00:24:18
side of the house, it wasn't quite like that in the area I worked in.
00:24:22
One of the things that I ended up working on was a system that used
00:24:26
two 18-wheeler trucks to comprise this mobile, supposedly mobile,
00:24:29
communications base station.
00:24:31
One of the trucks was primarily the power plant, the other one
00:25:10
was filled up with all sorts of equipment.
00:25:12
They were trying to modernize and make use of things like
00:25:16
laptop computers, and so a contractor had come up with a
00:25:20
design for doing a lot of what this thing was doing, but do it
00:25:27
in software in a laptop, and so what took two semi-trucks
00:25:39
to create a base station was being reduced to I'm looking at
00:25:40
the picture of it now 15 transit cases that were two feet by two
00:25:43
feet by two feet.
00:25:44
I'll just grab the picture and show it to you.
00:25:46
A picture is worth a thousand words.
00:25:49
So this is what was being designed as the replacement for
00:25:53
two trucks, much more transportable, much more meeting
00:25:58
the original goal of mobile, because the truck ones basically
00:26:01
weren't mobile.
00:26:03
The key element to that was the encrypted communications that
00:26:07
they were doing, which was with what was called a one-time pad.
00:26:11
The one-time pad key, which was
00:26:14
printed on paper pads, was used by the troops in the field,
00:26:18
the Green Berets, the A-teams. At the base station,
00:26:31
the same key was being printed on the old-fashioned paper tapes
00:26:33
that you used in the early days of computers and they came to
00:26:34
us and said is there any way we can get the key, instead of on
00:26:35
paper tape, on a floppy disk, Because then we can feed it in
00:26:38
the laptop, do the encryption, decryption?
00:26:40
I had conveniently already done that for another customer and
00:26:46
so we were able to do it for these guys and so help them in
00:26:49
the project.
00:26:49
So that's something I was working on for quite a while.
00:26:53
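For readers who haven't run into one-time pads, here is a minimal illustrative sketch in Python. It is not the actual system Jeff describes, just the general idea: each byte of the message is XORed with a byte of random key material that is used once, and reading bytes from os.urandom stands in for key material delivered on paper tape or a floppy disk.

# Editor's illustrative sketch (not the actual NSA system): a one-time pad
# combines each byte of the message with a matching byte of truly random key
# material that is used once and then destroyed.
import os

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # The key must be at least as long as the message and never reused.
    if len(key) < len(plaintext):
        raise ValueError("one-time pad key must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR operation applied with the same key.
otp_decrypt = otp_encrypt

if __name__ == "__main__":
    message = b"MOVE AT DAWN"
    key = os.urandom(len(message))  # stand-in for key material delivered on disk
    cipher = otp_encrypt(message, key)
    assert otp_decrypt(cipher, key) == message
    print(cipher.hex())

Because encryption and decryption are the same XOR, distributing and protecting the key material, whether on paper pads, paper tape, or a floppy disk, is the whole job.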
I was having lunch one day with a guy that literally worked the
00:26:56
office next door to me and we're like so what have you been
00:26:58
working on?
00:26:59
What have you been working on?
00:27:00
Like firmware in something that looked kind of like kind of
00:27:16
sort of some of today's modern phones, that kind of flip open
00:27:17
with a key pad.
00:27:18
But it was something back then that was called a KL-43.
00:27:22
You can Google that.
00:27:23
And as he was describing his client that he was doing it for
00:27:28
, I'm like, well, it sure sounds a lot like my client.
00:27:31
So we started comparing notes and, lo and behold, we were
00:27:36
working on two very different things but going after the same
00:27:41
problem, and it turned out that the customer had two different
00:27:46
offices let's say that approached two different offices
00:27:49
at NSA, so we could point fingers as to where the
00:27:52
replication was going on.
00:27:57
But literally we were right next door to each other, both
00:27:58
involved in multi-million dollar, multi-year design projects,
00:28:02
research and development projects to satisfy the same
00:28:07
base requirement from one customer.
00:28:10
So yeah, it happened, even on the InfoSec side of the house.
00:28:14
Speaker 1: Huh, that is, that's really interesting.
00:28:18
That also isn't like completely out of the realm after working
00:28:21
with the government that they would spend, you know, double
00:28:23
the money on the same exact project.
00:28:25
Basically right, just you know there's a reason why toilets
00:28:30
cost like fifty thousand dollars or whatever it is, right, like at the
00:28:32
Pentagon, like that's always the joke.
00:28:39
Speaker 2: Yeah, the.
00:28:39
The epilogue to this and I'll share this just because it
00:28:40
matters to me is after I left that office, which was before
00:28:42
the completion of this project, and you know it bubbled up to
00:28:45
management.
00:28:46
Hey, we've got two, you know, different efforts going on that
00:28:49
are basically going after the same thing.
00:28:52
Somebody from that office sent me an internal memo some months
00:28:56
later.
00:28:56
That came from the top levels of the Army, because that's who
00:29:01
the customer was saying that my system was what they were going
00:29:06
to go with and they were going to cancel the other one.
00:29:09
So I won, at least in the short term.
00:29:12
Speaker 1: Awesome.
00:29:18
Speaker 2: It mattered to me at the time, trust me.
00:29:20
Speaker 1: Yeah, absolutely, I mean, that's a, that's a huge thing, you know,
00:29:21
because I I always felt like, especially at the facility that
00:29:25
I was at, right it was broken up into four major modules and
00:29:29
each module would have a different you know, telecom
00:29:33
system would have a different OS , would have completely
00:29:36
different things and each module was competing against the other
00:29:40
.
00:29:40
So, you know, if X phone system ever went down, they would just
00:29:45
, you know, promote Y, and Y is a totally different product and
00:29:50
they're all competing against each other.
00:29:51
So it's not like it's not completely out of the realm.
00:30:17
You know, like when, I guess, when companies, maybe the best
00:30:18
way to think about it, you know, and that's what that's like.
00:30:19
It's a, it's a environment like none other.
00:30:22
And you know, one of the things that I kind of got a glimpse at
00:30:27
was the technology that they were using.
00:30:29
You know, the technology seemed to be I mean, it seemed to be,
00:30:34
from what I saw, right, and I'm not even seeing the cutting edge
00:30:39
, the top tier stuff, right, but even that stuff seemed to be
00:30:45
five to 10 years ahead of whatever was on the market,
00:30:47
right, whatever you could purchase right now as a private
00:30:50
citizen, it's probably five to 10 years ahead.
00:30:52
Does that remain true?
00:30:55
Back when you were at the agency, did you, you know, work
00:30:59
on stuff that you were saying like this, this will come out in
00:31:03
you know 2000 or whatever it might be Right, and you're
00:31:07
you're looking at advertisements for you know the newest stuff,
00:31:11
right, when you go home or whatever, and you're saying, man
00:31:14
, we were like, that's nothing at work, you know, like, is that
00:31:18
the case, or maybe not so much?
00:31:21
Speaker 2: Well, you know my opinion only, I should say, and
00:31:25
certainly I didn't have a complete view into everything,
00:31:27
but my experience was rather sort of the exact opposite.
00:31:30
You know, what you're describing is sort of
00:31:33
one of the fundamental principles of the free
00:31:36
enterprise system, the idea of competition, and the idea that
00:31:40
there's competition is going to drive innovation and the best
00:31:44
products to come out at the best price.
00:31:46
This might go back to, you know , sort of the military mindset
00:31:52
that NSA was built around, because I can remember I worked
00:31:56
with a lot of people that were enlisted men and officers. A third to
00:32:01
half of the workforce were military people at any given
00:32:05
time and I'm making that number up because the exact number is
00:32:08
probably classified, but I can remember one of them, one time
00:32:11
they were passing out a list of aphorisms.
00:32:17
You know true statements that were true for the military, and
00:32:20
one that always stood out to me was you know, remember when
00:32:24
you're in a firefight that your rifle was built by the low
00:32:28
bidder.
00:32:28
You know, but even I think you know the InfoSec side of the
00:33:07
house, being more or less an engineering organization, had a
00:33:08
multi-million-dollar contract with one of the branches of the
00:33:09
military, and that branch was coming back to NSA saying why
00:33:11
are we spending millions of dollars on you to develop
00:33:12
something when there's a free thing out there called Pretty Good Privacy, PGP?
00:33:13
Google it, you younger people.
00:33:16
The guy that wrote it was in trouble with the government
00:33:20
before they were trying to prosecute him for a long time,
00:33:23
because he basically came up with a free encryption software
00:33:26
that was as good, if not better, than some of the stuff that NSA
00:33:30
was producing and it was, but crypto at the time was
00:33:34
classified as materiel, it was like munitions, and so it
00:33:38
couldn't be exported.
00:33:39
And how do you control the export or import of something
00:33:43
that's on the Internet?
00:33:44
Phil Zimmermann, this guy that wrote PGP.
00:33:47
They tried to prosecute him for a long time, but there was a
00:33:51
day and I was in the InfoSec side of the house where a
00:33:54
mandate was put out from the deputy director saying you know,
00:33:59
our contract and our livelihood is being threatened by this
00:34:01
freeware package called PGP.
00:34:05
Everybody, stop what they're doing and try to come up with an
00:34:07
attack against it.
00:34:08
To try to, you know, sully its name and prove that it's not as secure as
00:34:10
our stuff.
00:34:14
I won't tell you what the outcome was, but I'll tell you
00:34:17
that PGP is still around today in some forms and some people
00:34:24
still use it, and of course, public key cryptography and so
00:34:27
on and so forth has won the day.
00:34:28
I mean, nobody's getting their cryptographic keys and such
00:34:34
from NSA anymore, because it's all out there in the wild.
00:34:37
So you can imagine where that went.
00:34:39
I'll leave it there, a, because I forgot what my second story
00:34:43
was going to be, and B, that's probably more poignant anyway.
00:34:48
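As a hedged aside on the public-key cryptography mentioned here, the sketch below shows the hybrid idea PGP popularized: encrypt the message with a fresh symmetric key, then wrap that key with the recipient's public key so no central key authority is needed. It uses the third-party Python "cryptography" package and is a generic illustration, not PGP itself and not anything NSA-specific.

# Editor's sketch of the public-key idea: anyone can encrypt to your public
# key, but only your private key can unwrap the message key.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Recipient generates a key pair locally; no central key authority involved.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the message with a fresh symmetric key, then wrap that key
# with the recipient's public key (the hybrid scheme PGP popularized).
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"meet me at the usual place")
wrapped_key = public_key.encrypt(session_key, oaep)

# Recipient: unwrap the session key with the private key, then read the message.
plaintext = Fernet(private_key.decrypt(wrapped_key, oaep)).decrypt(ciphertext)
assert plaintext == b"meet me at the usual place"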
Speaker 1: What was that like?
00:34:49
Starting the first red team at the agency?
00:34:52
Speaker 2: Well, A, you know, we didn't call ourselves a
00:34:58
red team.
00:34:58
That was a term that was assigned to us much later on.
00:35:02
But we were, you know, a small group of guys,
00:35:07
four of us initially, that were just kind of, you know,
00:35:11
interested in this new thing called computer hacking and
00:35:14
network hacking, internet hacking, internet security, and
00:35:19
we were charged with the mission of evaluating the security of
00:35:23
network systems anyway.
00:35:25
So somebody said, why don't we just learn how to hack?
00:35:28
And so we all just started learning how to hack and I had
00:35:32
hair back then.
00:35:33
I grew it long.
00:35:34
We were trying to live kind of the hacker lifestyle.
00:35:37
You know, while we were doing this was when the movie Hackers
00:35:40
came out. Uh, you know.
00:35:42
So you know things were happening out, out in the public
00:35:45
.
00:35:45
You know, like 2600 was around back in the early days.
00:35:49
You had the Anarchist Cookbook that was available on the
00:35:51
internet.
00:35:52
I mean, everything was new and the internet was new, and you
00:35:57
know, having information at your fingertips through a browser.
00:36:01
You know, worldwide web is what we called it at the beginning.
00:36:05
It was all new and everything that was written up into that
00:36:09
point, everything that was designed up in that way, was
00:36:11
trying to facilitate and foster fast, easy communication and
00:36:16
data sharing.
00:36:17
So the idea of protecting it in any kind of way, shape or form
00:36:22
was really kind of foreign.
00:36:23
So, needless to say, it was sometimes very easy for us to
00:36:29
break into things because most of the time what we were doing
00:36:32
was just taking advantage of what we would have called
00:36:35
features.
00:36:35
You know, it wasn't a bug in the operating system, it was the
00:36:39
way the thing was designed.
00:36:41
And, uh, you know, back in the early days, you know, there
00:36:44
there used to be talk about, you know, when you would buy
00:36:48
something new, like a, buy a new computer or buy a new server,
00:36:51
you never wanted to plug it in out of the box because it had
00:36:56
everything turned on by default, because they wanted you to be
00:37:00
able to use it and have it work.
00:37:02
So it was up to you, the consumer, to lock things down,
00:37:05
to harden things.
00:37:06
So to this day there's hardening guides, there's
00:37:09
configuration guides and requirements to follow, things
00:37:16
like that, although you know, the big players like Microsoft
00:37:17
have, you know, gone leaps and bounds into doing much more
00:37:21
security out of the box for new systems and new servers and
00:37:25
applications and operating systems and so forth.
00:37:27
But it was the wild west in the early days, so we were just
00:37:31
sponges and just learning as much as we could, and we tried
00:37:34
on our own systems and networks and like, look at that, you know
00:37:38
it works.
00:37:39
But we were learning from the same sources that everybody else
00:37:42
was.
00:37:43
You know what?
00:37:43
You know, hacker websites, there was Bugtraq.
00:37:46
You know people that are reporting vulnerabilities or
00:37:49
having problems fixing things, various news groups and RSS
00:37:54
feeds and so on and so forth. 2600,
00:37:57
I think we all subscribed to it. Anything we could get our hands
00:38:03
on.
00:38:03
What made it tricky, though, for us was because our targets were
00:38:10
classified systems.
00:38:12
The lawyers that tried to provide some ground rules for us
00:38:18
to do what we wanted to do.
00:38:20
They made the declaration that anything we do has to be
00:38:26
classified at the same level as our target.
00:38:29
So if we're targeting mostly top secret systems, literally
00:38:34
everything we did had to be classified top secret because of
00:38:38
the sort of the traditional way of classifying things and the
00:38:42
touch rule and the association, and you know, if you have a way
00:38:46
of breaking into a top secret network, it makes sense at some
00:38:50
level.
00:38:50
You want to keep that a secret too, so that needs to be
00:38:54
protected according to the same rules that any top secret
00:38:57
information.
00:38:57
So there, I mean there was a reason for it.
00:38:59
It became very impractical when we're pulling most of the
00:39:04
things that we were doing, which was more techniques than
00:39:08
exploits, back in those days Although there were some
00:39:11
exploits too, but we were getting them off the internet.
00:39:13
It was freeware, it was open source, and yet as soon as we
00:39:17
touched it it became classified, top secret.
00:39:21
I gave a talk years ago about sort of the origins of the red
00:39:24
team and I have to caveat in this talk I can't tell you what
00:39:30
we did because it's still classified.
00:39:33
I can't tell you what we did, what we were doing, but use some common
00:40:00
sense.
00:40:00
And then at some point I say, okay, I'll tell you one top
00:40:03
secret tradecraft that was a very common tool that we use,
00:40:08
and I proceed to tell them about the ping command.
00:40:10
The ping command, because we used it, was literally
00:40:15
classified top secret and we couldn't tell people that we
00:40:18
used the ping command and worse than that, to get all the
00:40:22
management approvals and authorizations to do these pen
00:40:25
tests and this ethical hacking, trying to break into something
00:40:29
we were supposed to describe our attack scenario and our
00:40:32
methodology ahead of time and have everybody pre-approve it.
00:40:35
So before we could even issue a ping command, we had to go
00:40:39
through a weeks-long process of getting authorizations, in
00:40:43
theory, to be able to ping a target, ping-scan a
00:40:49
network segment, ping a box to see what it was, do anything.
00:40:53
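To make the "ping-scan a network segment" step concrete, here is a rough Python sketch that shells out to the system ping command for each address in a small range. The flags shown are Linux-style, the example network is hypothetical, and, as the discussion above makes clear, something like this should only be pointed at networks you are authorized to test.

# Editor's sketch of a ping sweep: send one ICMP echo request to each address
# in a range and note which ones answer. Assumes a Unix-like system where
# "ping -c 1 -W 1 <addr>" is available.
import ipaddress
import subprocess

def ping_sweep(cidr: str) -> list[str]:
    alive = []
    for host in ipaddress.ip_network(cidr).hosts():
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", str(host)],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        if result.returncode == 0:  # exit code 0 means the host replied
            alive.append(str(host))
    return alive

if __name__ == "__main__":
    print(ping_sweep("192.168.1.0/29"))  # a tiny example range: 6 usable hosts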
And this is long before the days of any kind of commercial
00:40:56
vulnerability scanner, like most people know Nessus these days.
00:41:00
Um, ironically, the, the guy that founded the company that
00:41:04
produced Nessus, was one of the guys on our team.
00:41:08
Um, so there, you know there is a connection there.
00:41:11
But, uh, back in those days we had two freeware versions of
00:41:17
vulnerability scanners.
00:41:18
One was called ISS and one was called SATAN.
00:41:22
Uh, and you know, you can imagine at a government or top
00:41:26
secret organization that's very politically minded, how well
00:41:30
using SATAN went over as a tool, but it had to be
00:41:34
classified top secret.
00:41:35
If, in fact, we used it, I might have revealed a secret.
00:41:38
Speaker 1: Yeah, you know, maybe a couple years ago at this point I
00:41:44
talked to someone that was a cyber warfare officer for
00:41:51
the military right and he talked about how he would be handed
00:41:56
target packages and he would be tasked with creating essentially
00:42:00
, you know, the payload, the exploit, whatever it was the
00:42:04
entire thing, and putting it all into essentially a command and
00:42:09
never hitting enter.
00:42:10
You know he's not allowed to hit enter.
00:42:11
That goes to someone else where they hit enter on their
00:42:15
keyboard after it gets approved and everything else like that.
00:42:18
And you know the.
00:42:20
The amount of times that he said that you know things would
00:42:25
change right in between the time that he created it and the time
00:42:28
that it was approved and then it was actually executed was was
00:42:33
significant and it was very frustrating because that process
00:42:36
is so long and arduous to go through.
00:42:39
Just for basic things, basic things, you have to change
00:42:45
permissions on a file or whatever it might be Simple
00:42:49
terminal tasks.
00:42:51
It's really fascinating to me.
00:42:55
Were you ever handed a target package that you couldn't get
00:43:13
into, that you couldn't find, you know, anything against? I would
00:43:14
think back then that it would be a whole lot easier than it is
00:43:16
today, because now today, everyone's kind of hyper aware
00:43:17
of security, and so even I feel like even the just regular
00:43:18
average desktop is a little bit more difficult to get in, and
00:43:21
maybe that's my naivety, right? Is
00:43:23
it a little bit more difficult to get into today than
00:43:26
it would have been 20, 30 years ago, right?
00:43:28
Because everyone is so hyper aware, I need to deploy these
00:43:32
patches.
00:43:33
Speaker 2: Well, the short answer is no.
00:43:36
I was never handed a package because we, for the time that I
00:43:41
was involved, we were so new that we didn't have really a
00:43:44
formalized methodology for delivery of services, so we
00:43:49
weren't well known.
00:43:50
I mean, we I talked to an old manager from that time last
00:43:55
summer, caught up with him, found him on Facebook several
00:44:00
years ago, and then COVID happened.
00:44:01
We finally got together for lunch and we were talking about
00:44:05
things like the TAO and I was like you know, did the TAO come
00:44:11
spring from what we were doing?
00:44:12
He goes no, not really.
00:44:13
That was.
00:44:14
I know where it came from.
00:44:15
It was from a different area of NSA.
00:44:18
But he kind of scratched his head and thought a little bit.
00:44:21
He says no, what you guys were doing.
00:44:23
And we called ourselves the Pit, by the way, that was our
00:44:27
nickname for our office, but in the folklore the Pit was the
00:44:31
first red team at NSA.
00:44:33
But he said you know, you guys were kind of the beginning of
00:44:37
what's now US Cyber Command.
00:44:39
So you know, that's how far back we go.
00:44:44
I mean, I started doing this, as near as I can remember, back in
00:44:47
1992.
00:44:49
So over 30 years ago I was breaking into what was mostly
00:44:54
Unix systems and mainframes back in those days.
00:44:56
But in as much as we were kind of new, we didn't have a very
00:45:02
formalized methodology or a formalized way of approaching us
00:45:06
and engaging us. What we were doing then was teaching people about Unix file
00:45:15
permissions and directory settings and things called setuid
00:45:20
permissions, which basically meant any application or service that
00:45:26
was running on Unix, many of the early ones that were written,
00:45:30
had to be executed as root in order to perform.
00:45:33
So even you know.
00:45:35
So there was a bit that was set in the permission that when it
00:45:39
executed anybody could run it.
00:45:40
But when it was running it was basically running it as root and
00:45:45
if you could get it to crash and hiccup, very often it would
00:45:48
dump out into a shell but retain its status, so it would be a
00:45:52
root shell.
00:45:54
Very common back then and you know probably 80 or 90 percent
00:45:58
of the services and applications that were available on any Unix
00:46:01
system out of the box had these setuid bits set to execute as
00:46:08
root.
00:46:09
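A small sketch of the setuid issue being described, assuming a Unix-like system: the script below walks a directory tree and lists root-owned executables with the setuid bit set, the kind of basic-hygiene inventory the group was teaching people to do. The starting directory is just an example.

# Editor's sketch of the setuid concept: the setuid bit on an executable owned
# by root makes it run with root's privileges regardless of who launches it.
# This walks a directory tree and lists such files; Unix-like systems only.
import os
import stat

def find_setuid(root: str = "/usr/bin") -> list[str]:
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue  # unreadable or vanished file; skip it
            # Regular file, setuid bit set, owned by root (uid 0).
            if stat.S_ISREG(st.st_mode) and (st.st_mode & stat.S_ISUID) and st.st_uid == 0:
                hits.append(path)
    return hits

if __name__ == "__main__":
    for path in find_setuid():
        print(path)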
Um, that's the type of thing we were doing and it was much more
00:46:14
teaching people sort of, you know, basic hygiene, and we weren't only.
00:46:20
We were figuring it out as we went along, what's good and bad,
00:46:24
and you know we used to joke all the time.
00:46:26
Well, this is a feature, this isn't a bug.
00:46:28
A lot of people were doing that and that's why we started to
00:46:33
have hardening guides and hardening standards and things
00:46:37
that you need to do if you're going to install or turn on a
00:46:42
Unix server or a Unix workstation or, later on, a
00:46:45
Windows workstation or server.
00:46:55
Here's all the 10 million things that you need to
00:46:56
do to lock it down before you let it loose on your network.
00:46:58
Speaker 1: What's the?
00:46:58
What's the TAO group?
00:46:59
Speaker 2: Gosh, what is TAO?
00:47:02
Everybody's listening,
00:47:03
going to shout it out. Technical Access Organization?
00:47:07
That's not right.
00:47:08
I could Google it real quick.
00:47:10
Tailored Access... Tailored Access Operations, thank you.
00:47:13
It was a group that was exposed by WikiLeaks several years ago,
00:47:17
back when the whole Edward Snowden things and Julian
00:47:23
Assange kind of stuff was going on.
00:47:25
It was basically a super secret group at NSA that would do a
00:47:30
lot with, I think, what you were describing as being handed a
00:47:33
package and then go do stuff and very surreptitiously so it was
00:47:38
a very super secret organization that a lot of people got outed
00:47:42
as it.
00:47:43
Very few people knew it existed , especially not in the public
00:47:47
realm, and so it kind of got blown up and a lot of their
00:47:49
exploits were exposed publicly and so on and so forth.
00:47:52
So a lot of people know about TAO, kind of from the whole
00:47:56
Edward Snowden, WikiLeaks type of thing.
00:47:59
So that's why I brought up TAO.
00:48:01
I was not responsible for the predecessor.
00:48:05
I'm not an ancestor of TAO.
00:48:08
I'm apparently an ancestor of US Cyber Command, which was much
00:48:12
more mission-focused sort of upfront about defense really
00:48:18
still, as opposed to offense.
00:48:21
And again we started doing this ethical hacking, breaking into
00:48:24
networks in the early days to test the security and let's see
00:48:27
how secure you are and find the holes and the vulnerabilities by
00:48:32
attempting to break in, which, by the way, is pretty
00:48:36
much the movie plot to the movie Sneakers, which I happen to be
00:48:40
wearing a t-shirt that is a reference to the movie Sneakers,
00:48:43
but that was kind of our inspiration and, ironically, to
00:48:46
this day most red teaming companies, pen testing companies
00:48:51
, kind of follow a very similar methodology, especially if
00:48:54
they're doing physical access.
00:48:55
So if you've not seen the movie Sneakers and you're trying to
00:49:00
break into the industry, there is sort of a canon of movies
00:49:03
that you should watch and some of them are old.
00:49:06
But War Games is the classic.
00:49:09
It came out in 1983.
00:49:11
Matthew Broderick, sneakers, which came out in 1992.
00:49:15
Robert Redford and Ben Kingsley a huge cast of really famous
00:49:22
names at the time were in that movie.
00:49:23
1995, Hackers, Angelina Jolie.
00:49:26
Same year, a movie called the Net, which hardly ever gets any
00:49:32
attention, although I met somebody at a conference a
00:49:35
couple weeks ago and they said that the Net was their favorite
00:49:38
hacker movie.
00:49:39
Sandra Bullock, also 1995.
00:49:43
Other movies since then, but those are sort of the
00:49:46
classics of the original kind of hackers being pimply-faced high
00:49:54
school kids that lived in their parents' basement and never saw
00:49:57
the light of day and had the black hoodie and knew everything
00:50:01
about computers and breaking in, so anyway.
00:50:08
Speaker 1: Yeah, so you know, Jeff, I want to ask you what made you stay in the agency for
00:50:15
so long, because you said in the beginning, right, there was
00:50:18
there's a significant amount of turnover and someone you know
00:50:20
figured out how they could essentially leave, and you know,
00:50:23
under a year or so, right.
00:50:25
So I'm sure, I'm sure, you probably knew about that method
00:50:29
and whatnot at that time and I'm sure you probably even, you
00:50:34
know, thought about it, right, like, is that something I want
00:50:35
to do?
00:50:36
Maybe I want to stay here.
00:50:36
What was, you know, the things that you weighed, um to stay at
00:50:43
the agency for so long?
00:50:44
And then why did you end up, you know, leaving when you did?
00:50:47
Speaker 2: Uh, there's a whole story to why I left and
00:50:52
I'll try to make it short for you.
00:50:53
But, um, I mean, when I came to the agency because I wasn't
00:50:57
critical skill, I was not given I didn't.
00:51:00
I was on the regular pay scale, didn't have the accelerated pay
00:51:04
scale, so I got paid less than everybody else, didn't qualify
00:51:08
for a lot of the extra programs because they were just for the
00:51:11
critical skill people.
00:51:13
There's intern programs that existed for all the different
00:51:16
skills, skill groups and disciplines within the agency
00:51:23
and you had to go through the intern program to get all the
00:51:24
different qualifications, to be what was called being
00:51:25
professionalized, which was a threshold to if you ever wanted
00:51:30
to be promoted on the GS scale from a 12 to a 13.
00:51:34
You had to have professionalization, which is
00:51:37
sort of a you know, modern equivalent of certification.
00:51:40
You're like everybody in this industry has to have a CISSP
00:51:44
type of thing.
00:51:44
Um, so I mean part of my motivation for staying and I was
00:51:48
there for 10 years was I was doing cool stuff.
00:51:51
A, uh, I had kind of had and have kind of a chip on my
00:51:55
shoulder that I wasn't given all this special opportunity and
00:51:58
the people that were, they weren't really doing anything in
00:52:02
particular whereas I was doing stuff, and very, very quickly.
00:52:07
The reason I left and I tell this story more in depth with
00:52:12
the talk that I'm giving this year.
00:52:14
I've given it several times and I've got a couple more to give.
00:52:17
The next one would be am I giving this talk at DEF CON?
00:52:22
I'm not.
00:52:24
I'm keynoting B-Sides Edmonton, Alberta, in September and I'll
00:52:29
be giving this talk at GrrCON, which is Grand Rapids, end of
00:52:33
September.
00:52:34
So if you want to hear the story, the full story is there.
00:52:36
But essentially we were doing this pen testing exercise and we
00:52:40
were asked to do it for a civil agency, so a non-classified
00:52:48
network.
00:52:48
We ran into some political issues in terms of things that
00:52:53
are similar to what Snowden was bringing to light.
00:52:55
But when that was exposed and we got in trouble, and because I
00:53:01
was the team leader, I was the one that was really in trouble.
00:53:09
When there was a particular incident that happened in late
00:53:11
August or mid-August of 1996, I was gone from NSA before the end
00:53:14
of September 1996.
00:53:16
So there was a significant incident that happened, that was
00:53:20
life-changing, that changed the course of my career, and within
00:53:24
six weeks I was gone from NSA.
00:53:28
I'll have to leave it at that, because this is only an hour
00:53:32
podcast and it's a pretty lengthy story.
00:53:37
Speaker 1: Yeah, that means I definitely need to bring you back.
00:53:39
Speaker 2: Go see me at GrrCON or B-Sides Edmonton, or look for a recording.
00:53:44
Even though I've given the talk a lot this
00:53:46
year,
00:53:47
I don't know that it's been recorded much, but uh, you know,
00:53:51
as as a teaser, I I've given three sets of talks over the
00:53:55
years that tell my NSA story.
00:53:57
The first one was Tales from the Crypt Analyst, which was
00:53:59
when I was actually a cryptanalyst, and then the sequel, I
00:54:03
have stickers too to promote it More Tales from the Crypt
00:54:05
Analyst, which is where I tell more of the in-depth story of
00:54:09
the origins of the first red team and the pit.
00:54:12
And then this year it's Tales from the Crypt: the Afterlife.
00:54:17
What did I do after I left NSA abruptly at the end of 1996, up
00:54:21
until roughly 2004, where I started doing PCI?
00:54:25
It started out as a how did I get into a PCI talk?
00:54:29
Somebody that's a former NSA cryptographer.
00:54:31
So I'm considering resurrecting all three of these stickers.
00:54:36
So if you do go to GrrCON, I'm hoping to have the box set of
00:54:41
the stickers available, as it were.
00:54:43
I mean free for giveaway.
00:54:45
Speaker 1: Yeah, that'd be pretty awesome.
00:54:46
You know, jeff, I really want to go longer, but I guess that
00:54:52
just means you know that I have to bring you back on.
00:54:54
Speaker 2: You have to bring me back. The last podcast that
00:54:57
interviewed me,
00:54:58
they let me talk for three hours.
00:55:03
Speaker 1: So which podcast was that?
00:55:03
Speaker 2: Uh, it's called The Team House.
00:55:06
Speaker 1: I think I heard like the first hour of that.
00:55:08
Speaker 2: Yep, well, I think it went viral, cause, uh, I don't
00:55:12
know what constitutes viral, but it's at 115 views the last
00:55:16
time I checked.
00:55:17
Oh, wow. I don't know what
00:55:21
their audience is. Very diverse, because they talk about special
00:55:26
operations and warfare.
00:55:27
I think they've got a lot of military-minded people, not just
00:55:30
cybersecurity people.
00:55:32
But yeah, uh, I'm happy to come back anytime because, you know,
00:55:36
all we've done is the intro, yeah.
00:55:40
Speaker 1: Right.
00:55:40
Speaker 2: We've we've touched lightly on the first 10 years of
00:55:43
what's more than 40 years in the business.
00:55:45
So, uh, I'm happy to come back, uh, as time allows.
00:55:50
Speaker 1: Yeah, yeah, absolutely, I'm, I'm definitely,
00:55:53
uh, going to get you back on the schedule to come back on,
00:55:56
let's do a part two.
00:55:57
Sure, that'll be awesome.
00:55:58
Well, jeff, you know, before I let you go, why don't you tell
00:56:02
my audience?
00:56:02
You know where they can find you if they wanted to reach out,
00:56:05
and maybe even you know the link to.
00:56:08
You know whatever company you're working for, if you want
00:56:11
to put that out.
00:56:20
Speaker 2: Sure, I work for a company called Online Business
00:56:23
Systems. The website is www.obsglobal.com and I do
00:56:26
primarily consulting and advising from a security
00:56:30
perspective, primarily for PCI, the payment card industry.
00:56:34
I just finished my first version four report on
00:56:37
compliance earlier today.
00:56:39
It's going through final review and then it'll be signed and
00:56:42
delivered to the client.
00:56:43
But I've literally been doing PCI for the last 20 years, which
00:56:47
is disturbing on some levels.
00:56:49
But there's a reason why I do it.
00:56:52
It's kind of a love-hate relationship.
00:56:55
We can have a story about that.
00:56:57
I also, on the side, I'm a co-host of a podcast called
00:57:03
Paul's Security Weekly, which you can find at
00:57:05
securityweeklycom.
00:57:09
I do quite a bit of conference speaking.
00:57:11
I've been out there a bunch.
00:57:12
This year I'll be speaking at DEF CON at the Crypto and Privacy
00:57:20
Village, and there's apparently something new at DEF CON this
00:57:23
year called Creator Stage, where all the villages submitted like
00:57:27
their top two or three talks to this creator stage.
00:57:31
So I think I'm doing that and I assume that means it's a
00:57:34
separate talk.
00:57:35
I'm not sure they're also resurrecting Sky Talks, but
00:57:39
instead of it being affiliated with DEF CON this year it's
00:57:46
affiliated with B-Sides Las Vegas.
00:57:47
So I'm going to be doing a Sky Talk, sort of adjacent to
00:57:48
B-Sides Las Vegas. I think it's actually at the hotel next door,
00:57:51
I forget the name, to the Tuscany, so I'll be speaking at
00:57:54
DEF CON and B-Sides Las Vegas, what we call Hacker Summer Camp,
00:57:59
and then B-Sides Edmonton.
00:58:02
If you happen to be Canadian, A, you want to come up to Edmonton,
00:58:06
Alberta.
00:58:06
It's beautiful in September.
00:58:08
It's not minus 30 yet, but I'm keynoting B-Sides Edmonton up
00:58:14
there.
00:58:15
And then the last weekend of September so it's like the 27th,
00:58:20
28th or 29th, somewhere in that range is GrrCON and that's in
00:58:25
Grand Rapids, Michigan.
00:58:26
Highly recommend going to GrrCON if you want to go to a
00:58:29
hacker conference.
00:58:30
Some of the hacker conferences aren't with us anymore, some are
00:58:35
fast going away, but it's holding its own and it's a
00:58:41
pretty decently sized but not too big hacker conference.
00:58:46
It's not too big, where you can't have conversations with
00:58:50
people and have pretty good networking sessions without the
00:58:53
crush of humanity like some people experience at things like
00:58:56
DEF CON.
00:58:58
Yeah, so that's it for the pitches.
00:58:59
Uh, I'm on LinkedIn, you know.
00:59:01
Type my name in, spell it right.
00:59:04
Uh, Google me, again,
00:59:06
spell my name right.
00:59:07
Uh, you'll find me in different places.
00:59:09
I'm sort of on Twitter, but I'm not, you know, just like
00:59:13
everybody else.
00:59:14
Um, I'm on Facebook, but I'm not.
00:59:17
I don't do social media.
00:59:19
That's how we started the conversation.
00:59:24
Speaker 1: Yeah, absolutely. Well, you know, thanks, Jeff, for coming on.
00:59:25
Um, I'm probably gonna.
00:59:28
I'm probably gonna go to one of the conferences that you
00:59:31
mentioned, so I'll definitely, uh, be in touch and maybe we'll
00:59:34
meet up for a drink or so over there.
00:59:37
Speaker 2: Sounds good.
Speaker 1: All right, yeah, absolutely. Well,
00:59:38
Thanks everyone.
00:59:39
I hope you enjoyed this episode .
00:59:41
Thank you.