Ready to unlock the secrets of cryptography and cybersecurity from a seasoned expert? Join us as we welcome back Jeff Man for the riveting second part of his story, where he navigates a hectic schedule filled with speaking engagements at premier conferences like BSides Edmonton and GrrCON. Jeff opens up about his efforts to achieve work-life balance and self-care, sharing plans for a rejuvenating two-week road trip and the enriching experience of spending quality time with his spouse. The episode is a treasure trove of insights into personal growth and the delicate dance of integrating professional and personal lives, especially in the wake of retirement and the COVID-19 lockdown.
Travel back to 1987 and explore the pivotal role Jeff played at the NSA in enhancing military communications security. We delve into his assignment on the manual crypto systems branch, where he utilized classic cryptographic techniques, including the cipher wheel, to improve the US Special Forces' communication methods. With detailed anecdotes, Jeff recounts how he tackled the challenge of creating a practical and secure solution that could be easily memorized by field operatives, shedding light on the evolution of cryptographic practices and their profound impact on military operations.
Our journey through the world of espionage and cybersecurity continues as Jeff shares captivating stories of government espionage, data collection, and the technological advancements that often remain hidden from the public eye. From Cold War tactics to modern data interception techniques, Jeff provides a comprehensive overview of the cyclical nature of intelligence work. Rounding out the episode, Jeff reflects on his transition from the NSA to the private sector, candidly discussing the ethical challenges and evolving landscape of cybersecurity. Don't miss this fascinating exploration of history, personal growth, and the ever-changing world of cybersecurity.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Jeff?
00:00:02
It's great to have you back on the podcast.
00:00:04
You know we're doing a part two today because your story was so
00:00:07
expansive right before that.
00:00:10
Like I couldn't just leave everyone on a cliffhanger, and
00:00:14
mostly for selfish reasons, I couldn't leave myself on a
00:00:17
cliffhanger, right.
00:00:18
So how's it been going?
00:00:21
Speaker 2: Yeah, it's been busy.
00:00:23
This is the busy season.
00:00:24
I forgot when we recorded our first session.
00:00:26
I feel like it was like a month or two ago. Yeah, and uh, it
00:00:30
might have been June, it was before Vegas.
00:00:32
I know that, so you know, hacker summer camp has come and gone.
00:00:35
Summer has come and gone.
00:00:37
I was talking previously about uh, in the fall I'd be speaking
00:00:41
at BSides Edmonton, which is in Canada.
00:00:44
Edmonton, Alberta, as well as GrrCON. Got the GrrCON shirt
00:00:47
on.
00:00:47
That hasn't happened yet.
00:00:49
That's the end of September, so I feel like you're getting me
00:00:53
back on.
00:00:54
We talked a little bit last time about how it had taken a
00:00:56
long time to pull the trigger on this, but this seemed to come
00:01:01
much more quickly.
00:01:03
The second round that's what happens when you leave people
00:01:06
with a cliffhanger.
00:01:07
Speaker 1: Yeah, yeah, I guess.
00:01:08
So I've also taken a very, very aggressive approach to
00:01:17
lightening my schedule the back half of the year, because it's
00:01:18
just it's too much, you know, trying to juggle four or five
00:01:21
different things all at once, you know, and trying to
00:01:23
have a family as well, right? Like I don't want
00:01:26
to be working until 9 pm every night, but yeah, that could be a
00:01:31
third episode where we talk about, uh, work, work life,
00:01:35
family life, balance, self-care.
00:01:36
Speaker 2: I've been learning about self-care this year.
00:01:39
My wife and I have been on a journey of learning how to
00:01:43
relate to each other more closely after 34 years and empty
00:01:46
nesters.
00:01:47
And what do we do now?
00:01:50
We're staring down retirement and ultimately death.
00:01:53
But this is supposed to be the prime of our life and I've
00:01:58
talked to many people that are in the same situation and
00:02:00
they're like, yeah, I don't know what I would do with my wife.
00:02:03
if I had to hang out with her all day long.
00:02:05
But we're we're trying to be a little bit more proactive and
00:02:08
plan some things.
00:02:09
Like you know, we're taking a two week road trip the back half
00:02:13
of October, just for fun, just because I've never taken a two
00:02:17
week vacation before because of you know the workload that it's
00:02:22
like dang, I'm going to.
00:02:22
I'm going to do it this time.
00:02:23
So we're going to take a road trip.
00:02:25
Our ultimate destination is to hang out with a dear friend of
00:02:28
mine, a pioneer in the industry, especially in terms of Security
00:02:32
BSides, Mr. Jack Daniel.
00:02:33
Some people might have heard of him, although he's retired now
00:02:38
and it's very quickly.
00:02:39
People forget about relics from the past.
00:02:41
But, yeah, self-care.
00:02:43
But that's not the purpose of this call today.
00:02:46
We can save that for yet another episode.
00:02:49
Speaker 1: Yeah, that might even be good for maybe an episode
00:02:53
like kick off the new year.
00:02:54
You know where we're talking about.
00:02:57
Speaker 2: You could even do it as a roundtable.
00:02:58
I know there's several nonprofits in the industry that
00:03:02
are focused on various aspects of that, mental health hackers
00:03:07
being one of the primary ones that I run into a lot, and they
00:03:11
do lots of cool things just to try to help people.
00:03:14
As it turns out, what I'm learning is self-care, take care
00:03:17
of yourself. All work and no play
00:03:20
makes Jack a dull boy.
00:03:21
It also drives you to drinking and you know bad habits and all
00:03:26
sorts of other things and failed marriages and stuff like that.
00:03:29
Speaker 1: So anyway, we digress.
00:03:31
Yeah, you know it's interesting.
00:03:34
You brought up how you know people always say like I don't
00:03:37
know what I would do, you know, with my wife after retirement if
00:03:40
I had to spend all day with her or whatnot, and like it kind of
00:03:43
takes me back to like the very beginning of COVID right,
00:03:46
because me and my wife got married two weeks before the
00:03:49
lockdown started here in Chicago.
00:03:52
Right, so we got married, moved in together for the very first
00:03:55
time and now we're locked inside with each other Right, and
00:03:58
we're told we can't go outside, we can't go to the gym, we can't
00:04:01
go to work, all these things, and that was.
00:04:05
That was like like trial by fire for starting a marriage.
00:04:08
I mean that was insane.
00:04:10
Speaker 2: We heard stories.
00:04:11
A lot of people didn't make the cut.
00:04:13
You know they.
00:04:14
They were used to getting along when they were each working and
00:04:18
had their own careers and had their own you know, support
00:04:21
groups and social groups and whatnot, and all of a sudden
00:04:24
being stuck with one another.
00:04:25
Yeah, a lot of people didn't make it.
00:04:28
Speaker 1: Yeah, I know quite a few that didn't, and it's I
00:04:32
don't know like.
00:04:33
I feel like it was definitely difficult, but I'm glad that we,
00:04:36
you know, stuck in there or whatnot.
00:04:38
Right Like it's.
00:04:39
That was a feat in and of itself.
00:04:40
Speaker 2: In and of itself, I feel. Right, well, yeah
00:04:45
, I mean, my wife is my best friend.
00:04:49
It's been an exciting year for me because one of the things
00:04:52
we're learning is she needs to have her own things to do.
00:04:55
I obviously have my own things to do and a lot of that is being
00:04:59
out in the security community, the hacker community, like you were
00:05:06
mentioning, I go to a bunch of conferences every year. But there's
00:05:07
her stuff, my stuff and stuff that we can do
00:05:10
together.
00:05:11
One of the things that we're trying to do is for her to enter
00:05:14
my world and for me to enter her world.
00:05:16
So I've been able to take her to a couple of the hacker and
00:05:20
security conferences this year and that's been a lot of fun
00:05:23
because you know she likes talking to people, she likes
00:05:26
meeting people and I tend to hang out with people.
00:05:29
People approach me and it's just been fun watching her engage
00:05:35
with other people because you know she's got a lot of
00:05:38
experience and a lot of wisdom too, about all this stuff that
00:05:41
we all talk about and care about , and she's been hearing me talk
00:05:45
and bitch about security things for 34 years.
00:05:49
So she was always my sounding board over the years when I was
00:05:54
trying to figure out how to explain some difficult security
00:05:58
concept to a client.
00:05:59
I'd run it by her and she's great at making analogies and
00:06:03
that's something that I've learned to do not as well as she
00:06:06
does, but over the years to try to help people understand
00:06:10
concepts by putting them into a language and into a context that
00:06:13
they can understand.
00:06:14
I got that from my wife and she's great at it, so anyway,
00:06:19
yeah, she's, yeah, she's pretty awesome.
00:06:22
Speaker 1: I hope to bring her to many more conferences in the
00:06:24
future yeah, it's fascinating how, you know, two people can
00:06:28
kind of like complement each other that way.
00:06:30
Right like it's, it's really it's, it's great to have that.
00:06:34
I don't quite have that just yet with my wife.
00:06:36
She's a special education like early childhood teacher, um, and
00:06:41
so like me talking about anything with cyber security
00:06:44
just goes straight over her head.
00:06:45
She's like I don't know, you need to break it down more for
00:06:48
me, and I was sitting here like I don't know.
00:06:50
Speaker 2: I kind of told you the basics there, yeah, but
00:06:53
most of us in the hacker community, especially the white
00:06:55
guys, you know, we are just adolescents at best, if not
00:06:58
toddlers, on the inside.
00:07:00
So she's probably perfect for you. Yeah, yeah, exactly.
00:07:05
00:07:05
Speaker 1: Not making any judgments or accusations, it's
00:07:09
just trends, guys. Yeah, the only difference is she doesn't have
00:07:12
any patience for me.
00:07:13
She has way more patience for her students, but none for me.
00:07:16
You know. Right, right. Interesting. Well, Jeff, you know where we
00:07:22
kind of left off in part one. For the audience,
00:07:25
If you haven't already listened to or watched part one, please
00:07:29
go do so.
00:07:30
I'll leave the link in the show description or the show notes,
00:07:34
whatever it is.
00:07:34
But where we kind of left off was we didn't quite get to your
00:07:39
invention while you were at the agency.
00:07:41
So why don't we talk about, maybe, the problem that you were
00:07:44
faced with, that you were trying to solve for, and then
00:07:46
what the invention is and how it's used?
00:07:50
Speaker 2: Sure, yeah, I started at NSA in late 1986 and spent a
00:07:57
couple months taking courses, introductory courses, learning
00:08:03
about the basics of cryptography and kind of the things that NSA
00:08:06
did, waiting to get my clearance, my top secret
00:08:09
clearance.
00:08:10
When I was finally assigned to an office it was probably early
00:08:14
1987.
00:08:16
It was in what at the time was called InfoSec, the defensive
00:08:21
side of the house, and I was reporting to the manual crypto
00:08:26
systems branch.
00:08:27
So this was back in the days where there wasn't a lot of
00:08:29
digital encryption going on.
00:08:31
I think public key cryptography had been invented.
00:08:35
You know Diffie-Hellman algorithm I think that was done
00:08:38
early on but not a whole lot of practical application for it yet.
00:08:42
00:08:42
So most of what NSA did from a defensive perspective, InfoSec
00:08:47
being defensive, was produce manual
00:08:50
crypto systems, which is what we were doing, but also a lot
00:08:53
of machine crypto systems.
00:08:54
You know things that you see in movies army guys talking on
00:08:58
radios.
00:08:58
The radio itself is on a backpack.
00:09:02
Well, there there's other things besides just a radio that
00:09:05
are encrypting the voice signal , converting it to digital,
00:09:09
doing encryption, various different methods.
00:09:11
You know, that technology kind of dates back to World War
00:09:15
II.
00:09:15
A lot of people are familiar with the Enigma machine.
00:09:17
But in the 70s and 80s at NSA, with the advent of more digital
00:09:24
machines and just more machines in general, things like
00:09:27
electronic typewriters and copy machines you know what we used
00:09:31
to call Xerox machines, because Xerox was the company that you
00:09:35
know, made them.
00:09:35
They were the first ones or the main ones that made
00:09:40
them. Everybody assumed that paper systems would go away.
00:09:43
NSA was continuing to move towards more machine-based
00:09:47
cryptography, building little black boxes where data went in
00:09:52
in plain text form and out came cipher and all the magic and
00:09:58
secret stuff happened inside literally little black boxes.
00:10:01
So one of the things that I was doing, or I guess my primary
00:10:04
assignment working for this organization, was to do sort of
00:10:08
a security evaluation of the existing systems that were out
00:10:12
there, the manual systems, and one of the first assignments I
00:10:16
had was working with US Special Forces, the Green Berets.
00:10:20
They had as their primary form of communication something
00:10:24
called a one-time pad, and a one-time pad was literally a pad
00:10:29
of paper.
00:10:29
There were two copies, five, maybe 50 groups of five letters
00:10:34
I don't know how many exactly there were printed on a paper
00:10:49
and you would write your message one letter at a time, above or
00:10:55
below the letter that was printed there and then you would
00:10:58
use something called a Vigenère table or a Vigenère
00:11:01
square.
00:11:02
This is a copy that was the first page of every one-time pad
00:11:06
for the US Special Forces.
00:11:08
So that's the dimensions of the pad, maybe three inches by six
00:11:12
inches thereabouts.
00:11:14
This table represents what was called trigraphs, three-letter
00:11:18
combinations. Again, trying to get it up there so you can see
00:11:21
it. The communications sergeants that were part of the special
00:11:25
forces teams.
00:11:27
They were responsible for sending and receiving the
00:11:29
messages, doing the encryption and the decryption. Back in those
00:11:33
days.
00:11:33
Very often they were sending the messages still in the late
00:11:36
80s by Morse code and they were receiving the signals by radio
00:11:40
signal and they'd have a receiver.
00:11:43
They'd set it up, turn to the right frequency, listen for the
00:11:46
message, write it down letter for letter.
00:11:49
And the way it works is you write down the letter, whether
00:11:54
it's plain text or the cipher.
00:11:56
You've got the key.
00:11:59
There's a unique third letter.
00:12:00
So you write down plain text.
00:12:03
There's the key.
00:12:04
The third letter becomes the cipher.
00:12:05
The cipher gets sent.
00:12:07
The other end has the same key.
00:12:10
So he writes down two of the three letters and gets back to
00:12:14
that first letter because it's a unique three-letter combination.
00:12:17
00:12:17
So that's what they used as their primary form of
00:12:21
communication.
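The two-of-three lookup Jeff describes can be sketched in code. As an illustrative assumption (the real trigraph tables were specific printed lookups), this models the trigraph rule as Vigenère-style modular addition, which has the same property: any two letters of a trigraph uniquely determine the third.

```python
import string

ALPHABET = string.ascii_uppercase  # A-Z

def encrypt_letter(plain: str, key: str) -> str:
    # Third letter of the trigraph: plaintext + key, mod 26.
    # (Assumed rule; the actual printed tables may have differed.)
    return ALPHABET[(ALPHABET.index(plain) + ALPHABET.index(key)) % 26]

def decrypt_letter(cipher: str, key: str) -> str:
    # Knowing any two letters recovers the third: cipher - key = plaintext.
    return ALPHABET[(ALPHABET.index(cipher) - ALPHABET.index(key)) % 26]

def run_pad(message: str, pad: str, decrypt: bool = False) -> str:
    # Each pad letter keys exactly one message letter and is never reused.
    step = decrypt_letter if decrypt else encrypt_letter
    return "".join(step(m, k) for m, k in zip(message, pad))

pad = "XMCKLQZNRT"                          # random key letters printed on the pad
cipher = run_pad("ATTACKNOW", pad)          # what gets sent over the radio
plain = run_pad(cipher, pad, decrypt=True)  # receiver, holding the same pad
```

Used once and kept secret, a pad of truly random key letters makes this scheme information-theoretically unbreakable, which is part of why the one-time pad remained the primary system into the late 1980s.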
00:12:22
They also had a backup system in case they were deployed
00:12:26
somewhere and had to drop their backpack that had all the
00:12:30
one-time pads in it and still wanted to send an encrypted
00:12:33
message.
00:12:33
They had a memory crypto system and my assignment was to come
00:12:38
up with a new memory crypto system because the one that
00:12:40
they'd been using.
00:12:41
We'd done a security evaluation prior to when I was there and
00:12:46
determined that it was vulnerable and it could be
00:12:49
broken.
00:12:49
So I set out to try to come up with a new memory system for
00:12:54
them, and I had just gone through history of cryptography
00:12:58
courses and learning about all sorts of classic substitution
00:13:02
systems, transposition systems, cipher systems, the Caesar
00:13:06
cipher from back in Roman days, cipher wheels and Little Orphan
00:13:11
Annie decoder rings and all that kind of stuff, and I was
00:13:14
trying to use a lot of those different techniques to see if
00:13:18
there was one that could be used , because a memory system, by
00:13:22
definition, needs to be something that you can memorize
00:13:24
and be fairly easy to use.
00:13:27
One of my sort of going in goals was to take advantage of the
00:13:32
guys that were the radio operators.
00:13:34
They had these things more or less memorized.
00:13:36
They would have this paper every time they cracked open
00:13:39
their new one-time pad as a backup, but they would memorize
00:13:43
these things so that they could do it in their head much more
00:13:47
quickly.
00:13:48
As I was visiting them, you know, we set up sort of work
00:13:52
groups for the various communications sergeants from
00:13:54
various teams to come together, and I would present ideas of, you
00:13:59
know what if you did something like this?
00:14:01
What if you did something like this?
00:14:02
Here's a couple options what works, what doesn't work.
00:14:06
I think I mentioned on the first episode I was a business major,
00:14:08
so I was like applying basic business practices, trying to
00:14:11
get you know group buy-in and all that kind of stuff.
00:14:14
But as I was like in a little classroom or meeting room
00:14:18
talking to you know eight or 10 guys turning around writing on
00:14:21
the whiteboard and trying to do this and also demonstrate, you
00:14:26
know a couple of different variations of what I was trying
00:14:29
to hope, hope, hope them, hope to get them to buy into.
00:14:32
At some point I was struggling because I'm like I'm not going
00:14:34
to memorize these things, I'm using this awkward table and
00:14:37
it's just time consuming.
00:14:38
And at some point I was back in the office and I was thinking
00:14:42
you know, I just learned about cipher wheels.
00:14:44
00:14:47
There ought to be a way to make a cipher wheel out of these,
00:14:51
these alphabets.
00:14:51
And so I talked to the guy that was my mentor. I think I
00:14:53
mentioned him on the previous episode, the guy that
00:14:55
used to write logic problems for Dell Crossword Puzzle Magazine.
00:14:58
I talked to him about it and we figured out yeah, there is a
00:15:00
way to do it.
00:15:01
So I drew it out on paper and you know, cut it out, glued it
00:15:07
to cardboard, put it together and made just maybe a five or
00:15:13
six-inch cipher wheel.
00:15:17
This is what the prototype looks like.
00:15:18
So we know what we're talking about.
00:15:20
The one I did was on cardboard and paper.
00:15:23
It was a larger edition, but basically two wheels and there's
00:15:26
a little window inside the second wheel or inside the
00:15:30
second row that has a third alphabet that's hidden.
00:15:33
So you line up your two letters, whatever they are, trying to
00:15:37
get it focused here, and in the window is the third letter.
00:15:41
So it was magical.
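The wheel Jeff describes mechanizes that same lookup: rotating one ring relative to the other pairs up all 26 letters at once, and the hidden third alphabet in the window supplies the trigraph's third letter. A minimal sketch, again assuming the modular-addition model rather than the actual NSA alphabets:

```python
import string

ALPHABET = string.ascii_uppercase

def window_letter(outer: str, inner: str) -> str:
    # Align a letter on the outer ring with one on the inner ring;
    # the window shows the hidden third letter of that trigraph
    # (modeled here as the modular sum of the two aligned letters).
    return ALPHABET[(ALPHABET.index(outer) + ALPHABET.index(inner)) % 26]

# One turn of the inner ring fixes the pairing for the whole alphabet:
# with the ring set to key letter "D", every outer letter already sits
# next to its window letter, so there is no per-letter table lookup.
key = "D"
pairings = {p: window_letter(p, key) for p in ALPHABET}
```

That single-rotation property is what made the wheel faster than flipping through a flat trigraph table, especially for an operator working from memory under stress.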
00:15:44
I took this cardboard, paper-based wheel with me the
00:15:47
next time I visited Special Forces and they loved it so much
00:15:51
they stole it from me.
00:15:52
I mean, they literally didn't give it back to me.
00:15:54
I was like, fine, you can keep it.
00:15:55
So the next time I went and visited them, I made like maybe
00:15:58
a half dozen more and they just snatched them up and at some
00:16:01
point I was like you know, we're NSA, we're in the
00:16:09
business of supplying you with all your crypto systems and your
00:16:11
crypto materials.
00:16:11
Would you like us to make these things for you?
00:16:12
And they're like, oh, we would love that.
00:16:13
So I went and found, uh, a machine shop at NSA that would
00:16:16
build prototypes of little black boxes and I gave them some
00:16:19
specs and so they came up with this being the prototype of this
00:16:24
cipher wheel.
00:16:25
And I had two of them made and I took them with me and showed
00:16:28
it to them and like, oh, they loved it.
00:16:30
So we found a way to get them produced as cheap as possible
00:16:35
Got the unit cost down to $10.
00:16:37
And they wanted 15 of them.
00:16:40
The sad part about this story is I could not find at NSA a way
00:16:46
to only spend $150.
00:16:49
Everybody I talked to was doing multi-year, multi-million
00:16:53
dollar engineering projects, three-year R&D projects,
00:16:56
five-year R&D projects, development projects and, for
00:17:00
the life of me and I talked to dozens of people where do I go
00:17:04
to find the petty cash box?
00:17:05
All I need is $150.
00:17:08
I ended up having to call back to the army and say I can't find a
00:17:12
way to get this paid for.
00:17:13
Will you pay for it?
00:17:14
Not a problem.
00:17:15
So I got the money from the army.
00:17:16
We made 15 of these wheels.
00:17:19
So I've been carrying around the two prototypes of these wheels
00:17:23
for years, going to conferences and, you know, every once in a
00:17:27
while, you know, somebody would say they were in the army.
00:17:29
Or I'd meet somebody from special forces and I'd say, oh,
00:17:32
do you ever remember using a little cipher wheel with your
00:17:34
one-time pads?
00:17:35
And at some point somebody said, yeah, I remember that, we called
00:17:38
it the whiz wheel or the whizzy wheel.
00:17:40
So when I would meet guys that were green berets, you know,
00:17:47
after that I'd ask about the whiz wheel and you know a lot of
00:17:49
people remembered it.
00:17:49
I met a guy at DEF CON, I think back in 2017 or 2018, you know,
00:17:57
prior to COVID. Actually,
00:17:59
a friend of mine met this guy that was an ex-green beret or
00:18:02
former green beret and said, hey, do you remember the whiz wheel?
00:18:06
00:18:06
The guy said yes, and they said would you like to meet the guy
00:18:08
that invented it?
00:18:09
And the guy was like heck, yeah .
00:18:10
So you know, I met this guy and he was very excited because he
00:18:16
had been in special forces back in like the late 90s, early
00:18:20
2000s and he very much remembered using the whiz wheel.
00:18:23
He was the communication sergeant.
00:18:24
They called him the commo.
00:18:25
That was sort of their nickname for that position on the team.
00:18:34
The long story short is I, you know, got to know this guy.
00:18:38
We corresponded and somewhere along the line he said you know,
00:18:41
I think you'd qualify for membership in our alumni
00:18:44
association.
00:18:45
So I said, well, I was never in the military, you know, or in special
00:18:48
forces.
00:18:48
So he goes.
00:18:49
Oh, I know, but we have special status for civilians that made
00:18:53
significant contributions.
00:18:54
So he was able to get me a lifetime membership in the
00:18:59
Special Forces Association.
00:19:00
So I've got my membership card, made of metal.
00:19:05
It doesn't get me first on the airplane or anything like that,
00:19:08
because that has to be retired military or active duty, but
00:19:13
it's still kind of cool to whip out every once in a while and
00:19:16
gets me a free drink every once in a while.
00:19:18
Last year 2023, I got to speak at their convention.
00:19:21
They have an annual convention.
00:19:23
It happened to be in the town where this guy that I had met
00:19:26
lived, so he was hosting it, and I gave a talk to the convention
00:19:31
about the origin of the whiz wheel and I asked I sort of put
00:19:36
out an appeal I said you know, I've never actually seen a
00:19:38
production model because by the time they were made and
00:19:41
distributed I had moved on from that office.
00:19:43
And if anybody's got them and is willing to part with them, my
00:19:48
goal was to have it put on display at the National
00:19:51
Cryptologic Museum.
00:19:52
That's part of the National Security Agency at Fort Meade,
00:19:55
Maryland, and also there's a Special Forces Special
00:19:58
Operations Museum down in North Carolina, in Fayetteville, at
00:20:03
what used to be known as Fort Bragg but is now Fort Liberty, I
00:20:07
believe, and that's one of the main special forces bases that I
00:20:10
used to visit all the time.
00:20:11
So a couple days later, by the end of the conference, the guy,
00:20:17
my friend, walked up to me and handed me two actual cipher
00:20:20
wheels.
00:20:20
So this is one of two actual production model whiz wheels.
00:20:25
I mean it's made out of aluminum, it's nothing fancy.
00:20:29
But in talking to these special forces guys they were very
00:20:34
appreciative of the wheel.
00:20:36
One guy told me that, while they had the letters memorized,
00:20:41
very often when they're deployed they might be up for 24, 48, 72
00:20:46
hours and you don't have recall when you're up that long.
00:20:49
So he said, yeah, it was a lifesaver to have that wheel
00:20:52
available to use in certain situations.
00:20:54
The guy my friend that had got me into the association in the
00:20:57
first place, he's on a Facebook group and he said shortly after
00:21:02
he met me, he posted to that group.
00:21:03
Hey, I met the inventor of the whiz wheel.
00:21:05
So there's all
00:21:06
sorts of chatter about remembering the trigraphs,
00:21:11
since people were citing the ones that they still remember.
00:21:13
One person said whoever made that thing ought to have a
00:21:17
national holiday named after him.
00:21:19
So apparently it made a difference for him.
00:21:22
Long story short.
00:21:23
I mean it's already been a long story, I guess, but that's what
00:21:25
you wanted me to do.
00:21:26
I was able to get one of the prototypes and the production
00:21:30
model I donated to the National Cryptologic Museum.
00:21:33
I did that shortly after I received it back in 2023.
00:21:38
They put it on display this past April.
00:21:42
So the cipher wheel that I invented is currently on display
00:21:46
at the National Cryptologic Museum.
00:21:48
If you happen to be in Maryland or traveling near Baltimore or
00:21:51
DC, it's a pretty short haul up to Fort Meade and the museum's
00:21:56
open 10 am to 4 pm, I think, monday through Saturday.
00:21:59
Come and see the cipher wheel on display.
00:22:01
They were excited because they don't often put stuff on display
00:22:05
and the inventor is still alive .
00:22:07
They've got a lot of very rare relics there.
00:22:10
I mean they've got I don't know how many Enigma machines
00:22:13
on display.
00:22:14
One of Hitler's personal Enigma machines.
00:22:16
He had his own special set that you know.
00:22:19
One traveled with him and one was in his eagle's nest or
00:22:23
whatever they called it where he used to hang out.
00:22:25
They got lots of cool stuff.
00:22:27
I mean, you know, the Cryptologic Museum,
00:22:30
if you're into this kind of stuff, I guess you have to be a
00:22:33
certain kind of geek.
00:22:33
It's a real fun place to visit.
00:22:35
There's a lot of history there and a lot of stories and a lot
00:22:39
of mystery involving, you know, cipher and cryptography and and
00:22:44
it's really, I guess, for us aging crippies and people that
00:22:48
you know worked at NSA and you know I was there for 10 years.
00:22:51
I didn't spend my whole career there, but they're very much unsung
00:22:55
heroes, especially from World War II.
00:22:57
You know the role of cryptography and being able to
00:23:00
break codes and ciphers that were transmitted by the enemy.
00:23:04
You know, was very pivotal in changing the outcomes of most of
00:23:10
the wars that we've fought in, in fact all the way back to the
00:23:13
American Revolution.
00:23:14
I mean there was cryptography and secret writings and codes
00:23:17
and ciphers employed.
00:23:18
I mean, it's been around for thousands of years, but
00:23:21
it has definitely played a role in US history.
00:23:28
I was at NSA during the first skirmish in the desert, Desert
00:23:29
Shield and Desert Storm back in the early 90s and I hope I don't
00:23:33
get in trouble for saying this, but it was impressive for me to
00:23:36
be young and at that time I was over in operations, the real
00:23:41
side of NSA that people you know know and like to think that
00:23:46
they know about.
00:23:46
But basically it's the code breaking side, intercepting
00:23:50
the messages and trying to break them wherever we were doing it and getting them back
00:23:53
to NSA headquarters.
00:23:54
However we were doing that, that's probably all the
00:24:10
classified stuff that might still be classified, but we were
00:24:14
getting stuff back, breaking the encryption.
00:24:17
Getting back to the messages, getting those messages relayed
00:24:21
to troops in the field, commanders in the field, very
00:24:24
often before the intended recipient of the message is
00:24:28
getting the decrypt from his radio officer and his crypto guy. That
00:24:32
was impressive to me as a young kid, and that was sort of, to me,
00:24:36
what NSA was. That's NSA's mission, that's what NSA
00:24:40
was designed to do and that's what NSA is and was in the
00:24:44
business of doing.
00:24:45
So kind of seeing it in operation, seeing all the
00:24:49
planning and all the things involved with getting that kind
00:24:53
of stuff out there, that was kind of cool.
00:24:57
So, and you know, the cipher wheel, the whiz wheel, played
00:25:01
some role in that encounter.
00:25:03
Yeah, later on in 2001, after 9-11, when we started fighting
00:25:10
battles in Afghanistan, there's a group, a
00:25:14
Special Forces team, that fought the first battle in the
00:25:19
Afghanistan war: the horse soldiers.
00:25:29
Because they were deployed so early, they didn't have anything,
00:25:30
they weren't really ready for the desert and they were
00:25:31
attached to a local tribe and went in and attacked a city and
00:25:34
so they were on horseback, so they came to be known as horse
00:25:36
soldiers.
00:25:37
There's a movie about them called 12 Strong.
00:25:40
I got to meet their commo, their communications sergeant, a couple
00:25:43
years ago because he and a couple of the members of that
00:25:47
unit started a distillery.
00:25:49
Speaker 1: So there's a— yeah, I love their whiskey.
00:25:51
You've heard of it.
00:25:52
Yeah, I've got a bottle of it.
00:25:53
Speaker 2: I've got a bottle of it over there that's autographed
00:25:56
by this guy.
00:25:56
That was the commo.
00:25:57
But I was talking to him when I met him and he remembered using
00:26:01
the whiz wheel.
00:26:01
He said, you know, we didn't take it with us on that battle
00:26:04
because at that point they had the beginnings of encrypted
00:26:08
radios and stuff like that.
00:26:10
But he said it was definitely with us because, again,
00:26:13
what remained in use was the memory system, which they
00:26:18
still needed to have the trigraphs, because that's what
00:26:21
was used with that.
00:26:21
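As a rough illustration of the general idea, here is a minimal sketch of a two-ring cipher wheel in Python. This is a hypothetical toy, not the actual Whiz Wheel design or its trigraph memory system: just two 26-letter rings where rotating the inner disc sets the key.

```python
# Illustrative two-ring cipher wheel (NOT the classified Whiz Wheel design):
# an outer plaintext ring and an inner ciphertext ring rotated by an offset.
import string

RING = string.ascii_uppercase  # the 26-letter ring engraved on both discs

def set_wheel(offset):
    """Rotate the inner disc by `offset` positions; return encrypt/decrypt maps."""
    inner = RING[offset:] + RING[:offset]
    enc = dict(zip(RING, inner))
    dec = dict(zip(inner, RING))
    return enc, dec

def crypt(text, table):
    """Map each letter through the wheel; leave spaces and punctuation alone."""
    return "".join(table.get(c, c) for c in text.upper())

enc, dec = set_wheel(7)                # set the wheel to key position 7
ct = crypt("ATTACK AT DAWN", enc)
assert crypt(ct, dec) == "ATTACK AT DAWN"  # decrypting round-trips the message
```

The appeal of a physical wheel like this in the field is that the device carries no secret by itself; only the daily offset (or, in the memorized variant described above, the trigraphs) has to be protected.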
So that's the story of the Whiz Wheel and, as I said, I had two
00:26:27
prototypes.
00:26:28
I had two production units.
00:26:29
One set is at the National Cryptologic Museum.
00:26:32
These two will end up at the Special Operations Museum in
00:26:37
Fayetteville at some point.
00:26:38
They're sort of, well, COVID really screwed up the museums in
00:26:42
general.
00:26:42
So they're getting reorganized and relocated and refunded and
00:26:48
stuff like that.
00:26:49
But it'll get there eventually.
00:26:52
My goal was simply, you know, I'd been carrying them around
00:26:55
for years.
00:26:55
My family knew the story, close friends knew the story, but at
00:26:58
some point I was like, yeah, this is a piece of history,
00:27:00
somebody should take an interest in it, and they did.
00:27:03
So that's kind of it.
00:27:07
Speaker 1: Yeah, and that museum is open to the public.
00:27:10
Yep, absolutely.
00:27:21
Speaker 2: I do tell people you know Fort Meade is right at the
00:27:23
intersection of the Baltimore-Washington Parkway, BW
00:27:24
Parkway and Route 32.
00:27:25
And there's a very clearly marked exit sign for NSA and
00:27:26
there's a sign for National Cryptologic Museum.
00:27:28
So you take the exit loop around, go under 32.
00:27:32
You'll come up to an intersection.
00:27:34
Turn left, you get to the museum, turn right, you get shot
00:27:37
at.
00:27:37
Fair warning.
00:27:41
Speaker 1: But it is very clearly marked.
00:27:43
Okay, that's really fascinating.
00:27:47
You kind of bring up some of the lengths that the country
00:27:54
will go to to intercept communications that are
00:27:57
encrypted, that kind of determine and sway the power on
00:28:03
the battlefield.
00:28:04
Like I remember reading somewhere that before we even
00:28:09
went into Iraq, like something like two months beforehand, we
00:28:13
had intercepted all communications, we owned all
00:28:16
communications in the country.
00:28:17
We owned their entire water system, electrical grid,
00:28:22
everything you know like that's like a superpower almost.
00:28:26
I mean like that's almost like the finger of God coming down
00:28:29
and touching you know a country right, like because you're
00:28:32
owning the entire infrastructure of that country, right, and I
00:28:37
always.
00:28:37
I just find it fascinating, because then I went and I read
00:28:40
an article about how the agencies went and set up, you
00:28:46
know, like their own.
00:28:47
They either set up their own encryption company in Germany, I
00:28:52
think it was, or they took over a company in Germany and
00:28:59
basically put a backdoor into this encryption algorithm,
00:29:03
and Russia was buying it, and North Korea, and all of our
00:29:06
enemies, very conveniently, were buying it. But that's
00:29:09
such an extreme length to go to, so it kind of
00:29:14
weighs the importance of it properly in your head, I
00:29:18
think. Well yeah, I'm somewhat familiar with what you're
00:29:23
referring to, because it was in the last, I think, year or two
00:29:26
or three, but it could have been before COVID.
00:29:28
Speaker 2: I feel like it was in the last couple of years where
00:29:31
it came out that the CIA I think you know it set up a storefront
00:29:36
and you know it was like a legitimate business that was
00:29:38
selling stuff that had embedded little extras in it, type of
00:29:42
thing that none of us can confirm nor deny.
00:29:45
But it really speaks to I mean it speaks to more of the
00:29:48
classical history of espionage and intelligence and
00:29:53
counterintelligence.
00:29:54
You know I started at the agency during the Cold War.
00:29:57
So our undeclared enemy was the Soviet
00:30:01
Union.
00:30:02
The Soviet Union and the US, back and forth, you know, from the
00:30:07
late 40s on up until today, arguably, although it's Russia,
00:30:17
not the Soviet Union, have been engaged in sort of this
00:30:18
clandestine, you know, cat-and-mouse type of game, where lots of
00:30:20
deception, lots of things go into trying to steal data.
00:30:25
You know the Soviets were very good back in the day at
00:30:28
recruiting people to spy for them.
00:30:32
When I was at the agency in the late 80s, early 90s, there were a
00:30:38
couple of very famous espionage cases exposed: Walker and Whitworth,
00:30:46
two guys that were Navy, or one of them was Navy. Gosh, I'm
00:30:50
going to forget all the names of them.
00:30:52
There was a guy that was basically selling.
00:30:55
You know, we had all these super secret keys for all of our
00:30:58
crypto systems, and this guy was selling them, and he'd been
00:31:00
doing it for like 15 years.
00:31:02
Nobody caught it. Walker,
00:31:05
I think his name was. Walker and Whitworth.
00:31:07
Anyway, they're in the history books or you can find it on
00:31:11
Wikipedia.
00:31:12
But the one guy, the reason that he got discovered was because
00:31:16
he was going through a divorce and his wife ratted him out.
00:31:18
But you know, he had been basically selling keys and
00:31:23
getting money from the Soviets and he didn't have political
00:31:27
aspirations, he didn't have a bone to pick with the US
00:31:31
government, he just basically did it for the money.
00:31:33
And you know, there's been a long history of trying to find
00:31:38
people's weaknesses and ways you could get them to turn.
00:31:41
And there's still some application to that these days
00:31:44
in terms of social engineering and things like that.
00:31:48
You know,
00:31:49
there's variations on a theme, but it still comes down to how do
00:31:55
you get the information, and what are creative ways to get
00:31:57
the information.
00:31:58
I was having a conversation with some people a couple weeks
00:32:02
ago it was probably in Vegas where they were talking about
00:32:05
yeah, did you know that you can record sounds just by gauging
00:32:11
the you know vibrations of various things?
00:32:14
And of course I'm like, yeah, I'm not going to say anything
00:32:16
about that because someplace might have known about that for
00:32:19
a hundred years and have been doing it.
00:32:21
But you know, NSA primarily, when I was there, they were
00:32:25
picking things out of the air.
00:32:26
It was all radio waves,
00:32:27
various frequencies: high frequencies,
00:32:36
low frequencies, frequency hopping. But it was intercepting
00:32:37
communications and signals traffic.
00:32:39
They had whole organizations that were doing statistical
00:32:41
analysis of the signals that they were collecting to try to
00:32:44
determine if it was an actual signal or if it was noise, and
00:32:49
lots of math went into that.
00:32:50
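A toy version of that signal-versus-noise screening can be sketched with the classic monobit frequency test. This is purely an illustrative example, not anything NSA-specific: pure noise should contain roughly half ones, so a large z-score on the ones count suggests structure worth a closer look.

```python
# Monobit frequency test: is this bitstream plausibly random noise,
# or does it show the kind of bias a real signal often has?
import math
import random

def monobit_z(bits):
    """Z-score of the ones count against the fair-coin (pure noise) expectation."""
    n = len(bits)
    ones = sum(bits)
    return (ones - n / 2) / math.sqrt(n / 4)

random.seed(1)
noise = [random.getrandbits(1) for _ in range(10000)]   # looks like noise
signal = [1, 1, 1, 0] * 2500                            # biased, structured bits

assert abs(monobit_z(noise)) < 4    # within noise expectations
assert abs(monobit_z(signal)) > 4   # flagged as clearly non-random
```

Real collection analysis layered many such statistics on top of each other; a single test like this only catches gross bias, which is part of why the pattern-finding and visualization described next mattered too.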
It's funny because these days, with all the data that's out
00:32:53
there on the internet, the proliferation and all the
00:32:56
traffic, we're sort of back to data analysis in some ways.
00:33:01
Not big data, but trying to make sense of more data than you
00:33:08
can consume manually.
00:33:09
A lot of the techniques are still the same: looking for
00:33:11
patterns, you know, trying to compress it down into something
00:33:15
that's even visual.
00:33:16
I saw a talk, I think it was actually at GrrCON a couple years
00:33:19
ago, where the speaker was talking about mapping network
00:33:22
traffic and, rather than just trying to do a schematic of
00:33:26
where things were going, he was plotting it based on
00:33:30
something, and, like, look at the patterns. Wow, yeah, we used to
00:33:33
do that a long time ago.
00:33:34
What goes around comes around.
00:33:37
There's definitely a cyclical nature to
00:33:41
all this,
00:33:41
it seems,
00:33:42
if you've been around long enough.
00:33:45
Yeah.
00:33:46
Speaker 1: Uh, it's fascinating how, you know, the agencies, at
00:33:51
times or in some ways, right, will be so far ahead of
00:33:57
what's publicly available, right, and someone will come out with
00:34:00
it, you know, and the people in the government that know,
00:34:05
right,
00:34:05
they'll be like, we were using that 10, 15 years ago. Right?
00:34:09
I always go back to, uh, you know, the Zero Dark Thirty movie,
00:34:13
right, the very first time people saw, like, the four-tube,
00:34:18
uh, night vision goggles, right. Everyone, I mean,
00:34:23
at least everyone, you know, kind of tangentially in
00:34:26
that hobby, right, in America was like, oh, those are the coolest
00:34:30
things, right.
00:34:31
And then you talk to a Navy SEAL, or you, you
00:34:36
know, hear an interview of them.
00:34:36
Several years later they're saying, yeah, we use them
00:34:40
because those were like the most trusted things that we had.
00:34:44
We had better stuff, like we had a lot better stuff, but that
00:34:48
was just the most trusted.
00:34:49
We knew that wasn't going to break, right. Like, that's
00:34:52
just, it,
00:34:53
kind of like it blew my mind when I heard that, because it's
00:34:56
like, man, we thought that was like the coolest thing.
00:34:58
You know, we'd never seen it before, never thought about
00:35:02
something like that before.
00:35:03
And here they have it.
00:35:04
It's just a casual, you know, Tuesday night, Wednesday night,
00:35:08
you know, whatever it is. Right, right. I always took
00:35:13
that, you know, that always piqued my curiosity, right,
00:35:17
because I'm a very, uh, I'm a very curious person.
00:35:20
It's always drawn me to the federal side of it,
00:35:25
right.
00:35:26
Speaker 2: Yeah, there's, uh, I mean, you know, I do another
00:35:29
podcast with Paul. That's a pretty publicly known thing now.
00:35:31
How long NSA might have known about it and who they
00:35:53
might have attempted that against, that's probably still
00:35:55
classified.
00:35:56
But there's other things that people talk about that I'll just
00:35:59
I'll just play it safe and keep my mouth shut because I don't
00:36:05
want to tip the scales.
00:36:06
I mean, I mentioned the Enigma machine. When I started working
00:36:09
at the agency in 1986, the fact that the Enigma machine had
00:36:14
been broken by the Allies during World War II,
00:36:16
that was still a secret, and it wasn't declassified until like
00:36:21
1987 or 88.
00:36:23
And the reason it wasn't declassified is because there
00:36:25
was some entity somewhere in the world that was still using the
00:36:30
Enigma machine and we were still intercepting traffic from it
00:36:33
and reading it.
00:36:34
So you don't want to say, oh yeah, we already broke that,
00:36:36
because you're going to lose your source of information.
00:36:38
And that's one of the key elements for the classification
00:36:42
of data, at least in that classical sense, is.
00:36:44
It's not the content of the data itself necessarily back in
00:36:48
those days, it's how you're getting it.
00:36:50
That's what's top secret and compartmented, your collection
00:36:55
methods, what we call methods and sources.
00:36:57
That's very often what is the secret that needs to be kept.
00:37:02
Another example from World War II is the architect of the Pearl
00:37:06
Harbor raid, Admiral Yamamoto, the Japanese admiral.
00:37:10
We had intercepted and broken the Japanese communications
00:37:15
prior to the beginning of World War II.
00:37:18
I mean we had decrypts of the messages that said you know,
00:37:21
close the embassy, close the Japanese embassy, come on back
00:37:25
to Tokyo, because we're getting ready to go to war.
00:37:28
And there's controversy over whether that message was held
00:37:32
and not sent to Pearl Harbor and all that kind of stuff.
00:37:35
But the point is, shortly after that there was an intercept
00:37:39
where they figured out that Yamamoto was going to be on a
00:37:42
plane going from point A to point B and so they had an
00:37:45
opportunity to take him out.
00:37:46
They didn't want to immediately send a bunch of fighter planes
00:37:50
out and take him down because that they felt would have tipped
00:37:54
off that.
00:37:55
How did they know that he was on that plane?
00:37:57
So they ended up sending out a scout plane that just,
00:38:01
accidentally, bumped into this plane, and sent somebody to shoot
00:38:05
it down, just because, and that helped keep the secret that we
00:38:09
knew how to read the communications of the Japanese.
00:38:12
Yeah, that is, that's really fascinating.
00:38:16
Speaker 1: I feel like we could probably go for another hour or
00:38:19
two, right, just talking about that sort of stuff, because we
00:38:22
could.
00:38:22
Yeah, well, once my interest is piqued, it
00:38:26
just doesn't stop.
00:38:27
Right, I'm gonna go back and I'm gonna start reading things
00:38:30
on that, right, but you know, jeff, why don't we, I guess,
00:38:34
fast forward a little bit?
00:38:35
Right, because before, in part one, you know, we kind of sped
00:38:39
past that, we sped past your invention, and then we kind
00:38:45
of sped past, you know, the incident, or however much you
00:38:48
can tell me about the incident, right, that resulted, or I
00:38:53
guess ended up, you know, with you leaving the agency, right?
00:38:57
Can we talk about that a little bit?
00:38:59
Speaker 2: Yeah, and you know, for the full,
00:39:01
for a more complete version of the story, if you happen to
00:39:04
catch my talk, like if anybody's going to GrrCON, I'll be giving
00:39:08
this talk at GrrCON and
00:39:09
BSides Edmonton. I actually signed up for a conference in
00:39:13
Philadelphia called JawnCon, and I'll be giving the talk there.
00:39:16
I don't tell the whole story, it's just a piece of it.
00:39:19
But towards the latter part of my career at NSA, the internet
00:39:24
was becoming a thing.
00:39:25
I was with a group of guys that was learning how to do ethical
00:39:28
hacking, penetration testing, breaking into systems and
00:39:32
networks to see how well they were resistant to it, but then
00:39:36
to discover the vulnerabilities and the ways that things were
00:39:39
being broken into.
00:39:40
We were doing that for a couple of years, and there were
00:39:46
complexities to it.
00:39:47
There were political issues, there were bureaucracy issues.
00:39:50
This might sound foreign to people because we live in a
00:39:54
post-9-11 world, but there was this thing called the NSA
00:39:57
charter that basically said NSA doesn't do what NSA does to US
00:40:01
citizens, and while the idea of let the good guys break into
00:40:05
your network and tell you what's wrong before the bad guy does
00:40:08
seems like a great idea, it was technically violating the NSA
00:40:13
charter. So we had to, and I was the one doing it, work with the
00:40:17
lawyers to get the special permissions and to figure out a
00:40:21
methodology that could accelerate the process of
00:40:25
getting authorization to perform these pen tests.
00:40:27
That was the most painful part. At the very beginning of us
00:40:30
doing this and trying to get authorizations.
00:40:32
It would take weeks and sometimes months to get
00:40:35
permission to just break into an internal network or an internal
00:40:39
server at NSA, and one of the problems was everything that we
00:40:44
did had to be classified top secret.
00:40:46
And then it had to, because that was the classification of
00:40:49
the network and the computers and the servers and the
00:40:52
mainframes.
00:40:53
And because it was top secret, we had to go through a very
00:40:57
lengthy process and we were making that work.
00:41:00
We were making headway, I was making headway with the general
00:41:03
counsel, that's what we called the lawyers, and we
00:41:11
were getting to the point where we were sort of coming up with a
00:41:13
good way of doing it, a methodology that was repeatable,
00:41:14
not only for performing the pen test but also the process of getting
00:41:16
the authorizations and permissions and all of our docs
00:41:19
in a row.
00:41:21
But somewhere along the line, and I don't know the exact details,
00:41:25
word got out that NSA had this capability. The Internet was
00:41:30
new.
00:41:30
Everybody was plugging in.
00:41:32
The World Wide Web was new.
00:41:35
But we eventually got an approach through one of our
00:41:38
sister agencies, I believe it was DISA, the Defense Information
00:41:43
Systems Agency.
00:41:51
Look it up, somebody.
00:41:52
DISA approached us.
00:41:52
They had a contact at the Department of Justice, which was
00:41:54
an unclassified civil agency, and they wanted to hire us to do
00:41:57
a pen test, or engage us to do a pen test.
00:42:00
There was no money exchanged.
00:42:02
So we had to go through this very lengthy process and I was
00:42:04
working with the lawyers every step of the way, because
00:42:07
unclassified networks at those times were the purview of NIST,
00:42:12
the National Institute of Standards and Technology, and NSA was
00:42:15
responsible for classified networks.
00:42:17
It was also fairly common knowledge within that circle
00:42:21
that NIST didn't have a lot of capability in those days, so
00:42:24
they would very often sort of have a handshake agreement under
00:42:27
the table, gentleman's agreement to pass the work on to
00:42:30
NSA anyway.
00:42:32
So we embarked on figuring out, and I was following the lawyers'
00:42:38
direction, how do we make this work?
00:42:41
So there's a whole litany of stuff that had to be done, which
00:42:44
was a several-months-long
00:42:52
process.
00:42:53
We got to the point where we had a letter that was written
00:42:54
and signed by the director of the National Security Agency and
00:42:58
addressed to the attorney general.
00:43:00
You know, who was, you know, the oversight above the Department
00:43:05
of Justice.
00:43:05
It happened to be Janet Reno, if you remember that name.
00:43:08
It had been signed, and it had been dated for a Thursday of a
00:43:12
certain week in August, and the weekend before, somebody popped,
00:43:19
defaced, the Department of Justice website, and that
00:43:24
was the first time a government website had been publicly
00:43:29
defaced and hacked.
00:43:30
So it was in the news.
00:43:32
It was a big deal.
00:43:33
I come into the office on Monday and get a phone call from
00:43:38
my point of contact to the Department of Justice and he
00:43:41
said help, we were hacked over the weekend.
00:43:44
So I said well, let me see what I can do.
00:43:46
I hung up the phone with him, got on the phone with the
00:43:48
lawyers, explained what had happened and I said you know,
00:43:50
I'd really like to get people on the ground by tomorrow to try
00:43:54
to help them out with forensics.
00:43:55
By the way, there was no forensics capability, there were
00:43:59
no forensics guidelines, there was nothing written down in
00:44:01
those days.
00:44:02
All we had was The Cuckoo's Egg by Cliff Stoll, because he had
00:44:11
sort of invented the idea of doing forensics and trying to
00:44:12
track back where an attack might have come from and how things
00:44:13
might have been done.
00:44:14
But we figured we were more capable than most, because we had
00:44:17
been learning the inner workings of Unix systems and
00:44:20
networking.
00:44:22
So the lawyers gave me some guidelines or requirements,
00:44:28
gave me three things I needed to do.
00:44:30
He said one get the request from the DOJ in writing.
00:44:33
So that's not a big deal.
00:44:34
I called them back and they sent a memo, inter-office memo
00:44:38
or whatever.
00:44:39
So that was done.
00:44:40
And the second one, second criteria was don't go alone.
00:44:44
I said, okay, that's kind of cool.
00:44:45
You know, there was a bunch of us from our team, the team that
00:44:49
we called the pit.
00:44:49
A bunch of us got together.
00:44:52
I think there were four of us that went down initially.
00:44:55
And the third thing was don't go on your own authority.
00:44:58
Have somebody send you, have somebody in your management
00:45:00
chain send you.
00:45:01
So I did all those things.
00:45:02
So we got a team on the ground on Tuesday.
00:45:05
Now, back in those days everything was hardware-based.
00:45:08
You know a web server was running on somebody's own server
00:45:11
that was in their own machine room, data center and it was
00:45:15
hopefully outside of a firewall, if they had a firewall.
00:45:18
But you know it was owned and operated by the entity.
00:45:21
There was no concept of outsourcing or hosting at that
00:45:24
point and, of course, when they discovered the breach, the first
00:45:28
thing they did was pull the plug on the server and rebuild
00:45:30
it.
00:45:30
So whatever forensic evidence might've been there was pretty
00:45:33
much wiped out.
00:45:34
But there were other systems, there was other servers, and so
00:45:37
we spent a couple days looking around for things.
00:45:40
So Tuesday goes by, Wednesday goes by, Thursday comes, and we
00:45:44
go down there and we're there an hour or two and I get a phone
00:45:47
call.
00:45:48
It was somebody from the home office, from the pit, somebody
00:45:51
that stayed behind, and he said, Jeff, the shit's hit the fan.
00:45:56
You guys got to drop what you're doing right now and come
00:45:59
back to the office.
00:46:01
So we did. So we took an hour or so, hour and a half, to get back.
00:46:06
00:46:06
So we got back to our office and we were immediately escorted
00:46:10
into the conference room for the deputy director of InfoSec.
00:46:14
He was not in the meeting, but he was next door.
00:46:18
He knew what was going on.
00:46:19
Same lawyer that I'd been working with for months and
00:46:22
months trying to make this all work.
00:46:24
I was teaching him about hacking and pen testing and what
00:46:27
it all meant, how it all worked .
00:46:29
He was Irish, and, I don't know, Irish,
00:46:33
soulless ginger redheads,
00:46:35
you know, if they get mad sometimes they get really red.
00:46:40
And he was just enraged.
00:46:41
He was like Heat Miser in The Year Without a Santa Claus, if you know
00:46:46
that story, and he was just yelling at us, and mostly me,
00:46:49
since I was the ringleader, about how we had done something
00:46:53
to break the law.
00:46:54
Weren't we aware of the NSA charter?
00:46:56
What we did could get the director fired, if not
00:47:00
prosecuted.
00:47:01
And you know, apparently it was a very bad thing that we did,
00:47:05
and we were all kind of like, yeah, we were just there to help.
00:47:07
The customer asked.
00:47:08
And at least that was my attitude.
00:47:12
It was like you know, I went to you and asked you what had to
00:47:15
be done to get me there.
00:47:16
The manager that I had had sent me. That person did kind of
00:47:21
throw me under the bus.
00:47:22
They disavowed giving me permission.
00:47:25
They said that I had been deceptive and had not explained
00:47:28
to them exactly what the nature of the request was, which was BS.
00:47:32
But you know, whatever. You know, I'm not even saying who the
00:47:36
person is, because let bygones be bygones.
00:47:40
But what was interesting was the talk that I put together this
00:47:43
year,
00:47:44
that I'll be giving, is sort of the story of my couple years
00:47:48
after I left the NSA, leading up to where I got into PCI, and I
00:47:54
started doing PCI in 2004.
00:47:56
And when I was putting the talk together I said, oh, I got to
00:47:58
tell a little bit about this story and I've got a lot of the
00:48:01
evidence.
00:48:01
I've got copies of the letters that went back and forth.
00:48:04
I've got a copy of the letter from the director that never
00:48:07
got sent, and I saw the date on it, and I'm like, you know, it was
00:48:11
like August 21st or something like that.
00:48:14
And I'm like that's weird because I left before the end of
00:48:18
September because, you know, in government the end of the
00:48:20
fiscal year is end of September and I left before the end of the
00:48:23
fiscal year.
00:48:24
So I'm like wow, that was like five weeks.
00:48:27
And I think about all that transpired in terms of I was put
00:48:30
on double secret probation.
00:48:32
I had my clearance pulled.
00:48:34
They still let me sit at my desk, but they disabled my
00:48:38
access to the network, which was silly because we all had like a
00:48:42
half dozen ways to get on the system.
00:48:44
But I had to go talk to all sorts of people at internal
00:48:47
security and external security and lawyers and this and that
00:48:51
and the other.
00:48:51
But they also at the time, because it was coming up to the
00:48:54
end of the fiscal year and it was post-Cold War, we basically
00:48:59
didn't have an enemy in 1996 that we knew about.
00:49:04
00:49:04
So they were doing a buyout; they were
00:49:07
paying people to leave, basically, and I had finally become
00:49:10
eligible for that.
00:49:11
So I took advantage of them paying me basically a thousand
00:49:17
dollars for every year of government service and I'd been
00:49:20
at NSA for 10 years and two years with the Navy prior to
00:49:23
that.
00:49:23
So they paid me $12,000, which was basically three months of
00:49:27
pay, to go out, and I had gotten the first job offer that came
00:49:31
along, and I was leaving on a Friday and starting,
00:49:36
I think I maybe took a week off, but starting a week later with
00:49:40
like a 30% or 40% pay raise.
00:49:43
So you know, it was like a no-brainer.
00:49:45
A bunch of us had been considering going out into the
00:49:51
private sector anyway, because of the allure of making more money,
00:49:53
solving the world's problems.
00:49:55
But for me, in part at least, it was getting rid of the
00:50:00
bureaucracy and the red tape.
00:50:01
Because you know, when I first started doing pen testing in the
00:50:04
private sector, we'd get a customer saying that they wanted
00:50:06
it done and we'd negotiate a start time and a start date and
00:50:11
we'd go do it and write up a report and present it, and that
00:50:14
usually took place in about a month, if not quicker, and they
00:50:19
were very appreciative of all the findings that we had and we
00:50:22
would work with them to fix things, and so this follow-on
00:50:26
business was just a lot neater and cleaner and we didn't have
00:50:30
to wait weeks for dozens of signatures and initials from all
00:50:36
sorts of different levels of management.
00:50:37
So that's kind of how I left.
00:50:39
Two anecdotes I'll share with you, because I know we're coming
00:50:42
up on close to an hour.
00:50:43
It wasn't until DEF CON,
00:50:45
again, it was probably 2017,
00:50:51
it might have been a little bit before that, '16 or '15.
00:50:54
I didn't go to DEF CON until 2014, when I went to work for a
00:50:59
vendor in '13.
00:51:01
And the next year I got to go to DEF CON for the first time
00:51:05
because I'd been a consultant, a billable resource, for most of
00:51:09
those years.
00:51:09
That wasn't allowed to go out and play and go to conferences
00:51:12
unless I did it on my own.
00:51:14
But I was at DEF CON, and,
00:51:17
if you've ever been to DEF CON, I was
00:51:20
somewhere between Bally's and Paris, in sort of a thoroughfare,
00:51:24
when DEF CON was over in that area.
00:51:26
And who do I bump into?
00:51:28
But it's this lawyer that I had worked with very closely for
00:51:33
months and months back in 1996.
00:51:35
And I hadn't seen him.
00:51:37
It was probably 2015.
00:51:40
So it had been almost 20 years since I'd seen the guy and I'd
00:51:44
been kind of pissed off at him for most of those 20 years
00:51:47
because I felt like he threw me under the bus and he turned
00:51:50
because he was so into it and we were very, you know, had a very
00:51:53
close working relationship.
00:51:55
He was learning a lot and he, you know, to my way of thinking,
00:51:58
he turned on me.
00:51:59
The first thing he said to me when he saw me was I forgive you
00:52:04
and I'm like what are you talking about?
00:52:05
You forgive me, I'm the one that's mad at you.
00:52:07
And then he proceeded to tell me about how, since he was the
00:52:11
one that had sent me, he had gotten in so much more trouble
00:52:14
than I did.
00:52:15
And he was able to withstand it , of course, because he ended up
00:52:19
still working at NSA for probably another 10 or 15 years
00:52:22
after that and became known as a cybersecurity expert and so on
00:52:28
and so forth.
00:52:29
But he also told me that they weren't just trying to fire me,
00:52:32
they were trying to find reasons to charge me with treason and
00:52:36
they wanted to prosecute me.
00:52:37
So I'm like, oh, that's good to know,
00:52:39
many years later. My God.
00:52:40
00:52:41
So that's one anecdote.
00:52:43
The other anecdote I'll tell you is, I went out to Vegas to DEF
00:52:46
CON this past, well, we're in September now, so it was just a month ago.
00:52:51
One of the guys that I used to work with at NSA, he actually was
00:52:55
a manager that was across the hall from the pit.
00:52:57
Really great guy, sharp guy.
00:52:59
He's been involved in cyber for many years.
00:53:02
He worked at NSA probably another 20 years after I did.
00:53:11
Then he went to work for the Center for Internet Security, CIS, got
00:53:12
involved in the CIS Top 20.
00:53:15
I won't say his name to protect the guilty.
00:53:17
This is not a story about him as much.
00:53:18
But he said he was going to be speaking at DEF CON this year
00:53:21
and I had seen him post on LinkedIn a couple of weeks
00:53:24
before DEF CON where he was saying you know, I'm kind of
00:53:27
excited to go out to DEF CON and have a chance to speak.
00:53:30
And he was reminiscing and thinking, I remember the first
00:53:34
time that NSA officially went to DEF CON, and he said it was in like
00:53:37
2007 or 2008.
00:53:39
And I thought, you know, good Lord, we had people from the pit
00:53:44
going to like the first or second DEF CON back in 93 or 94.
00:53:50
One of my frustrations, one of all of our frustrations, was how
00:53:53
long it took for NSA to do things and to change things.
00:53:57
And I left in 96, and he's saying that NSA officially went
00:54:02
to DEF CON 11 years later, when we were screaming, those of us in
00:54:08
the pit were screaming at management, guys, you've got to get
00:54:12
with the times.
00:54:13
Things are moving much more quickly.
00:54:15
Internet speed, it's not three-to-five-year design and development
00:54:19
projects anymore.
00:54:20
You got to speed it the hell up and change your way.
00:54:23
And I just thought, holy crap, 11.
00:54:26
It took them 11 years after I left for them to get around to
00:54:30
going to DEF CON officially.
00:54:32
That's one of the reasons, you know, other
00:54:36
than this little incident, that was one of the main reasons why
00:54:39
I left NSA, just because they were too freaking slow and they were
00:54:43
too full of themselves.
00:54:44
They had kind of a monopoly, at least on the InfoSec side,
00:54:49
because they had no competition, but they also fell behind very
00:54:54
quickly.
00:54:54
I mean, they used to be the sole provider of
00:54:57
cryptography, crypto systems, data protections for, obviously, the
00:55:03
military and the government, and they just were in a lot of ways
00:55:07
not equipped to compete when competition became available.
00:55:11
We are a free market system.
00:55:13
So I left.
00:55:15
It was bittersweet.
00:55:16
If things hadn't blown up I might have stayed there a
00:55:19
while longer, although a lot of us were looking to go out into
00:55:23
the private sector.
00:55:24
But I did the private sector thing for a few years, did pen
00:55:27
testing for a few years, got frustrated that nobody was
00:55:31
changing anything.
00:55:32
We would break in one way, come back six months later and break
00:55:36
in the exact same way.
00:55:37
Passwords hadn't been changed, permissions and trust
00:55:41
relationships hadn't been changed, things hadn't been
00:55:44
patched or updated.
00:55:45
And at some point I was like why, you know, why isn't saying
00:55:50
we've got root on all your systems?
00:55:52
The equivalent today would be domain admin.
00:55:56
Why isn't that getting the point across?
00:56:00
You've got problems that you need to fix.
00:56:02
So I was very frustrated that, you know, that wasn't working.
00:56:06
There had to be a better way of doing things. And a lot of the
00:56:09
clients.
00:56:10
They didn't really understand security.
00:56:12
They didn't understand any of the technology.
00:56:14
They didn't understand any of the concepts of data security or
00:56:18
cybersecurity or information security.
00:56:20
And along came PCI and I fell into PCI.
00:56:25
It's a love-hate relationship, mind you, but one of the reasons
00:56:29
why I love it was, all of a sudden, it gave me an audience
00:56:33
with clients where they may not have understood things any
00:56:36
better, but they had to do stuff.
00:56:39
So all of a sudden, okay, how do we make this work?
00:56:42
They were listening in a way that they never listened before,
00:56:45
so I kind of had a captive audience.
00:56:47
But over the 20 years I've been doing it, I've
00:56:51
developed a few techniques and honed a few skills.
00:56:54
I like to think I got pretty good at trying to explain the
00:56:58
concepts of data security in a way that most people will say,
00:57:03
yeah, that makes sense, we should be doing that and then
00:57:06
help them to start doing it.
00:57:08
But it was a big stick, because just pleading with people and
00:57:12
telling people that it's in their best interest to do things
00:57:15
differently, or to make huge investments in security when
00:57:19
they're not understanding why they need it or how they need it
00:57:23
or where they need it or what they even have to protect,
00:57:28
that just wasn't enough. The Home Depot CEO, when they were breached, which was gosh
00:57:31
over 10 years ago, is rather famously quoted as saying, why do
00:57:34
I care about cybersecurity?
00:57:36
We sell hammers.
00:57:37
Well, they also deal with lots of, you know, tens of millions
00:57:42
and hundreds of millions of credit cards, and those got stolen
00:57:45
because that was lucrative.
00:57:46
Back then, the bad guys had started monetizing.
00:57:50
So the lessons learned, the things that I learned from the
00:57:53
past, I think they still largely apply today, although
00:57:56
the world's changing, and I'm happy to acknowledge if
00:58:00
some of the concepts have become outdated and obsolete,
00:58:05
giving way to other things.
00:58:06
But I haven't run into a whole lot of that. We're still
00:58:08
struggling with getting people to change factory settings and
00:58:12
defaults, and changing the default passwords, or coming
00:58:16
up with strong passwords, or eliminating passwords
00:58:20
altogether.
00:58:21
And let's move on to some of the other forms of
00:58:23
authentication, like biometrics.
00:58:25
There's all sorts of clever things, but the bad guys in the
00:58:29
hacker community can always think of a hundred ways to
00:58:31
bypass and get around it, and human nature wants us to get
00:58:37
things done.
00:58:38
We want all the data and all the things fast, so we love the
00:58:41
convenience of the internet, but at the same time, we have this
00:58:46
concept that we need to secure all this stuff.
00:58:48
And you know, frankly, I don't think we're doing a very good
00:58:51
job.
00:58:51
At the end of the day, I don't know if we can do a good job.
00:58:54
Part of me thinks, and this is the curmudgeon in me, that
00:58:57
the ship has sailed, Pandora is out of the box.
00:59:01
But if I can help one company, one organization, be a little
00:59:06
bit better and be able to stay in business, I feel like I'm
00:59:09
making a difference, and I've had the opportunity to do that
00:59:13
working within the construct of PCI, where I talk
00:59:17
to many other people that keep beating their heads against the
00:59:21
wall and are frustrated because their organizations and their
00:59:23
clients aren't doing the things that they need to be doing, but
00:59:26
they don't have this regulatory stick behind them, at least not
00:59:31
one that's as powerful as PCI, because PCI is quite simple.
00:59:35
You don't have to follow it.
00:59:36
You just don't get to engage in commerce and do business and be
00:59:42
able to take credit cards, and there's some companies that opt
00:59:45
out of that, but most companies want to opt into that and so
00:59:49
money talks.
00:59:50
It's been a huge motivator and companies by and large that are
00:59:55
involved in it have gotten more secure, whether they liked it or
00:59:58
not or whether they wanted to or not, but they know they
01:00:01
needed to.
01:00:01
So they could avoid the fines and avoid the breaches, avoid
01:00:05
the public scrutiny if their company is involved in a breach.
01:00:11
That's the state of things, and I don't know how, but I've been
01:00:14
doing it for 20 years.
01:00:15
That's what my career has been: PCI. Wow.
01:00:20
Speaker 1: So, Jeff, you know I have one last question before I
01:00:24
let you go.
01:00:24
So when you were transitioning from the agency to you know a
01:00:29
company, right?
01:00:30
What was that process like?
01:00:33
Did you know someone at that company that got you the job,
01:00:35
that kind of knew your skill set and whatnot, or you know, was
01:00:39
it literally that easy back then where you got let go on Friday
01:00:42
or you left on Friday and you started a new place on Monday?
01:00:45
I ask because a lot of the people that I talk to with, you
01:00:50
know, TS clearances and whatnot, right, they all say that like
01:00:55
they're basically completely lying to employers to get employed
01:00:59
for the first five years, five to seven years after their
01:01:02
employment with the government, because they can't tell anyone
01:01:05
that they even worked for the government.
01:01:07
And especially when it's like the only thing that they've done.
01:01:10
01:01:10
They have nothing to lean on, you know.
01:01:13
So they're making things up just so that they can get
01:01:15
employed.
01:01:16
What was that like for you?
01:01:19
Speaker 2: Well, I think in part it was a unique time in history
01:01:23
because this whole thing we now call cybersecurity was kind of a
01:01:26
new thing and there weren't a lot of people that knew anything
01:01:31
about it, knew how to do it.
01:01:32
There weren't the gazillion vendors out there selling all
01:01:36
sorts of solutions with all sorts of use cases like we see
01:01:39
at the RSAs and the Black Hats these days.
01:01:41
There was like four or five companies that sold firewalls
01:01:46
and most of them, you know, started with building a firewall
01:01:49
for the government.
01:01:50
There were a couple of freeware vulnerability scanners; one
01:01:54
flipped and became closed-source and commercial, and one
01:01:57
stayed open, and that was about it, technology-wise.
01:02:01
So what was in demand were people that kind of knew what it
01:02:05
was all about.
01:02:07
I had been talking to a couple different companies, and most of
01:02:10
us had gone on different interviews, because we were
01:02:12
always kind of looking; the grass is always greener on the
01:02:16
other side of the fence.
01:02:17
None of us had pulled the trigger.
01:02:20
Well, one of us had left, one of the original members, who had
01:02:25
left before all this had happened.
01:02:27
Um, but you know, none of us were in a huge rush to leave.
01:02:31
But you know, when I sort of went through what I went through,
01:02:36
I started calling people that I might have been speaking to
01:02:39
before, and I got in touch with a guy, well, two guys, that were running
01:02:45
a practice that were doing government work.
01:02:47
It was a government contractor, but they were just beginning to
01:02:51
want to spin up a practice that would start looking at the
01:02:54
private sector, and that was kind of new.
01:02:56
Back then there wasn't a whole lot of focus on the private
01:03:01
sector.
01:03:01
So I was hired by these guys, an office chief and his deputy, for
01:03:07
a government contractor, with the idea that I would come in as
01:03:10
sort of a co-director, third-in-command type of thing, and
01:03:17
focus more on building a practice that was focused on the
01:03:21
private sector and one that was doing vulnerability assessments
01:03:24
and pen testing.
01:03:25
As it turned out, the two guys that I had interviewed a couple
01:03:30
of times and went to work for, they both resigned within like a
01:03:34
month after I started at this company, because they went off
01:03:37
for their own better offer, to start a practice and do it more
01:03:42
their way, out from under the auspices of a government
01:03:45
contractor, which was almost as bad as working with the
01:03:48
government.
01:03:48
So my immediate job out of the agency only lasted for six
01:03:53
months, and two or three months in I'm like, okay, I've got to look
01:03:57
for a more permanent position. And I had kind of been in a rush and
01:04:02
taken the first offer, because I really wanted to get out the
01:04:04
door before the end of September and be eligible for the
01:04:08
buyout.
01:04:09
So I went on a bunch of interviews and I think I ended
01:04:13
up getting offers from like four or five different companies and
01:04:16
I didn't take the one that offered the most money out of
01:04:19
the gate.
01:04:19
I went with the company that seemed to be smaller and leaner
01:04:24
and was more serious and interested in spinning up a
01:04:28
commercial practice that would focus on the private sector,
01:04:31
and let me build a team of people doing pen testing.
01:04:35
We called it pen testing, but it was mostly vulnerability
01:04:38
assessment.
01:04:38
Back in those days people wanted to know what all the holes were.
01:04:41
They didn't want to know if you could break in.
01:04:43
They knew you could break in.
01:04:44
They wanted to know all the ways you could break in.
01:04:46
That second search was a little bit more deliberate, and
01:04:51
I took more time and tried to talk to different types of
01:04:56
companies.
01:04:56
Half of them were, well, I guess they were all pretty much still
01:04:59
government contractors, because that was pretty much all there
01:05:01
was back then, at least in terms of professional services
01:05:05
companies, but I went that route.
01:05:07
I know a lot of people these days think the way to get into
01:05:11
this business is the entrepreneurial route, and I talk
01:05:15
to many, many people that are excited about their startup
01:05:18
company and they have this vision, and a lot of them feel
01:05:20
like that's the path to success,
01:05:24
the entrepreneurial route. Well, there are other ways
01:05:27
to do it.
01:05:27
You're not going to get rich and retire and buy your own
01:05:30
island, necessarily being a consultant.
01:05:32
But if you want to make a difference and feel like you're
01:05:35
impacting people's lives and have a sense of accomplishment,
01:05:40
which I'm not saying you don't get by building some sort
01:05:43
of product company and becoming a vendor.
01:05:45
But I've always had this thing against vendors because they
01:05:48
were competing for the same dollars.
01:05:50
Many of my clients in the early days,
01:05:53
they only had so much to spend, and they felt like they needed
01:05:58
to buy something rather than buy somebody telling them what they
01:06:01
needed to buy, which makes sense at one level.
01:06:04
But you know, at the end of the day they still didn't know what
01:06:08
they were doing and they were buying the thing from the most
01:06:11
convincing sales guy.
01:06:12
So I've had, for most of my commercial private sector
01:06:17
years, sort of a disdain for vendors and for salespeople from
01:06:24
the vendors.
01:06:24
Not personally, I know.
01:06:26
I have many friends that are salespeople, but they understand
01:06:30
where I'm coming from.
01:06:30
It's a competition thing, but beyond a competition thing, it's,
01:06:35
you know, your job is to sell something.
01:06:37
My job is to help the client be more secure.
01:06:40
You might say that that's what your job is, but you have this
01:06:43
conflict of interest, because you've got a quota to meet. And even if
01:06:48
you're a reseller, we've got a hundred widgets on the shelf,
01:06:52
we've got to move those this quarter.
01:06:53
So all of a sudden, that widget is
01:06:56
the absolute solution that you need, Mr. Client.
01:07:01
In fact, you need 10 of them.
01:07:04
How many can I put you down
01:07:06
for?
01:07:08
I'm not saying everybody's like that.
01:07:09
I used to go around saying that vendors are liars, and then I
01:07:13
went to work for a vendor and I found out, no, they're not lying,
01:07:15
they just don't know.
01:07:16
They don't understand it any better than anybody else does.
01:07:19
All they know is to read from the script that sales and
01:07:22
marketing people put together for them, and I'm grossly
01:07:27
generalizing, you know, painting with a very broad brush here.
01:07:30
There's exceptions to all of this, but generally speaking,
01:07:33
people don't know what they're talking about.
01:07:34
And I say that having 40 years under my belt.
01:07:38
I don't think anybody knows what they're talking about.
01:07:40
And I don't know what I'm talking about either, frankly.
01:07:43
But I have 40 years of having the conversation and I've seen a
01:07:49
few things and I think I've learned a few things about how
01:07:52
to motivate organizations to do the right thing or help them to
01:07:56
rethink how they're doing it.
01:07:57
I've given talks over the years.
01:07:59
One was called Rethinking Security.
01:08:00
What we're doing isn't working, so maybe let's try something
01:08:05
different.
01:08:06
Let's try something old rather than new, like applying the
01:08:09
actual principles of data security, the way that we used
01:08:11
to do it in what was arguably the organization that invented
01:08:15
the discipline, which was InfoSec at NSA.
01:08:18
So I don't know if that's the answer.
01:08:21
Yeah, like I said, my
01:08:22
experience is very much a unique experience, because it
01:08:26
was a point in time and a point in history.
01:08:28
So I don't know how helpful it is other than to be diligent.
01:08:31
Take your time, I tell people, look for something that you like
01:08:34
to do.
01:08:35
Look for something that you feel like you have the aptitude
01:08:37
for, or that you think you could do well at.
01:08:39
Hopefully they're the same thing.
01:08:41
Do that, and you'll get paid well, you'll get paid enough to make a
01:08:44
living.
01:08:45
You may not be able to retire rich and buy an island, but
01:08:50
there's a lot of people that make a pretty decent living and,
01:08:52
at the end of the day, most of us need to make a living.
01:08:55
We've got mortgages to pay and mouths to put food in and
01:09:01
college tuition for our children to think about in years to come.
01:09:05
You're probably not there yet, but most of us are in that boat.
01:09:11
Speaker 1: Yeah, I have about 18 years before I have to make
01:09:14
that first payment for someone other than myself, right?
01:09:19
Well, there's no time like the present to start saving.
01:09:23
Or teach them to be a hacker, and they don't need to go to
01:09:26
college, right? They could just, uh, give themselves the
01:09:30
degree that they need, right? That's right.
01:09:32
Well, well, Jeff, you know it's been a fantastic conversation,
01:09:36
like it was
01:09:37
the last time. I'll absolutely have to have you back on to talk
01:09:39
about some mental health stuff and whatnot, what that looks
01:09:42
like for you.
01:09:42
But yeah, I mean it was a fantastic conversation.
01:09:47
I really enjoyed it.
01:09:47
I appreciate the opportunity.
01:09:50
Yeah, absolutely.
01:09:52
Well, you know, before I let you go, how about you tell my
01:09:54
audience or remind them again?
01:09:56
You know where they could find you if they wanted to reach out
01:09:58
and maybe connect.
01:09:59
Or, you know, uh, learn more about you.
01:10:03
Speaker 2: Sure, my Twitter/X handle is MrJeffMan.
01:10:06
I'm Jeff Man on LinkedIn.
01:10:08
I'm mostly on LinkedIn these days.
01:10:09
If you Google me, if you go to YouTube, you can find
01:10:12
presentations that I've done at various conferences, and if you
01:10:16
go to the end, usually there's a slide that actually has my
01:10:19
email and even my cell phone number.
01:10:21
I've actually had people call me only like once or twice, but
01:10:26
I do try to connect with people as much as possible, giving life
01:10:30
advice, mentoring as much as I can, and try to help out and
01:10:35
give back as much as I can.
01:10:36
So spell my name right.
01:10:38
It's only one N, M-A-N, and type in Jeff Man and security.
01:10:42
I'll pop up in most of the search engines out there.
01:10:49
Speaker 1: Awesome.
01:10:49
Well, thanks everyone.
01:10:51
I hope you enjoyed this episode.
01:10:52
Go check out part one in the description of this episode if
01:10:56
you're interested in hearing more.
01:10:57
Thanks a lot, Jeff.