What happens when a seasoned American cybersecurity expert navigates the intricate world of European data privacy? Richard Hollis, with over three decades in the cybersecurity industry, shares his captivating journey from Washington DC's government projects to leading Risk Crew in London. Listen as Richard emphasizes the critical role of process over products in cybersecurity and offers a wealth of insights into the ever-changing threat landscape. Along the way, he recounts the unique challenges and personal experiences of living and working in Europe, shedding light on the cultural contrasts that shape global cybersecurity practices.
Imagine the personalized service of a cigar lounge in Germany and the stringent protections of GDPR — a stark contrast to American business practices and views on data privacy. This episode unpacks the cultural differences between Europe and America with vivid anecdotes and eye-opening discussions. We explore how European values around data privacy influence business operations and consumer rights, offering a fresh perspective on what Americans might learn from these practices. Richard’s insights help bridge the gap, revealing the importance of robust data protections in today's interconnected world.
Our conversation also delves into the urgent need for enhanced data privacy and cybersecurity regulations, drawing parallels to past safety improvements in other industries. Richard shares his candid thoughts on the influence of big tech companies and the current inadequacies in data protection measures. Reflecting on personal stakes and the emotional disconnect many professionals have with data security, we highlight the broader implications for both individuals and businesses. Don’t miss this engaging episode that combines expert insights with a unique cross-cultural perspective, offering valuable lessons for listeners on both sides of the Atlantic.
Chapters
00:00 Introduction and Appreciation for the Podcast
00:52 Richard's Background in Cybersecurity
05:45 Living in Europe and Cultural Differences
12:09 Being an American in Europe
16:00 Data Privacy and GDPR
20:12 The Lack of Federal Regulation for Data Protection in the US
25:14 The Historical Context of Europe Compared to America
31:20 The Impact of America's Size on Data Privacy Laws
34:16 The Need for a Ralph Nader for Data Privacy
36:07 Monetization of Personal Data and Lack of Accountability
41:37 Differences in Mindset: Americans vs Europeans on Data Privacy
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Richard?
00:00:01
It's great to finally get you on the podcast here.
00:00:03
I'm really excited for our conversation today.
00:00:07
Speaker 2: Joe, thank you, I'm excited to be here.
00:00:08
Great, I love the podcast. Big fan, and I really appreciate the
00:00:12
opportunity to chat with you.
00:00:13
Thank you, sir.
00:00:15
Speaker 1: Yeah, absolutely.
00:00:16
It's always refreshing to hear that people actually listen to
00:00:19
the podcast and that they actually enjoy it.
00:00:23
It's not all for nothing.
00:00:24
Speaker 2: Joe, your work has not been in vain.
00:00:26
People actually do listen.
00:00:27
I certainly do.
00:00:28
No, I am a big fan.
00:00:30
I'm not overselling.
00:00:31
I think it's really pragmatic, and it's hard for
00:00:34
someone my age in cybersecurity to find something pragmatic that's
00:00:39
not trying to sell you a gadget.
00:00:42
So it seems to be very pragmatic advice I find on your
00:00:46
podcast.
00:00:46
Well done.
00:00:48
Speaker 1: Well, thank you, I really do appreciate that.
00:00:51
You know, it's been an interesting journey, you know,
00:00:55
and I feel like this year I don't know what it is.
00:00:59
I feel like every year is a different, you know, I guess
00:01:03
like growing opportunity or growing season, if that makes
00:01:06
any sense at all.
00:01:07
You know, like this year I haven't looked at my numbers at
00:01:11
all, you know, but like last year I obsessed over my numbers,
00:01:16
you know, and this year I'm just focused on like, having
00:01:19
good conversations, having people on that I want to talk to.
00:01:22
I feel like that's made the podcast a little bit better in
00:01:26
some ways.
00:01:28
Speaker 2: I think you're right, Joe.
00:01:28
I think you know, the more you tend to look after it, you know,
00:01:31
the more you have fun, the more it perpetuates itself.
00:01:34
And one answers the other.
00:01:37
Speaker 1: Yeah, yeah, absolutely. So, Richard.
00:01:40
You know.
00:01:41
Why don't we start with how you got into security, how you got
00:01:46
into IT?
00:01:47
Take me back to when that was, and what was that decision like?
00:01:51
Speaker 2: If you have video you can see I'm a really old guy,
00:01:54
and I'm going to be one of those old guys I didn't want to be.
00:01:56
When you were growing up, you always had that neighbor you'd
00:02:00
kick the ball into his lawn and be afraid to ask for it
00:02:02
back.
00:02:02
That's the guy I'm starting to be.
00:02:04
I've been 30 years in what is now called the cybersecurity
00:02:08
industry.
00:02:08
Once upon a time it was the computer security industry, an
00:02:12
extension of the information security realm, when we started
00:02:17
to process sensitive information on computers.
00:02:20
I was there.
00:02:21
I was there, I saw it all happen.
00:02:22
I was there.
00:02:23
I was at that meeting.
00:02:24
We had good biscuits, but we did a lot of things
00:02:28
I wish, in hindsight, we could have done differently. But I've
00:02:31
been in the industry for 30 years.
00:02:32
I started my career.
00:02:33
I'm an American, I'm currently working in London and I'm
00:02:37
running a company called Risk Crew, and we are a product-agnostic,
00:02:41
you know, fundamentals consultancy.
00:02:46
We're small, about 35 people, and we preach the gospel
00:02:50
that it's the process, not the product.
00:02:52
It's the configuration of the firewall, the maintenance, the
00:02:56
management, the patching.
00:02:57
It's not the firewall itself, anyway.
00:03:01
So, you know, we do things like GRC fundamentals
00:03:06
aligned to ISO or SOC 2, and out here it's NIS 2 and DORA.
00:03:10
Now, you know, through risk assessments and supply chain
00:03:14
risk management strategies, and all the way
00:03:18
through pen testing and sexy things like red team testing.
00:03:20
So I've been doing it, the business has been running, for about 20
00:03:23
years.
00:03:24
But once upon a time I started in Washington DC, worked for
00:03:32
government projects, and I got headhunted by Lucent and Philips.
00:03:37
They did a joint venture way back when on cell phones and
00:03:41
brought me over and took me to Paris and did that for a couple
00:03:45
of years until I thought I really wanted to start something
00:03:47
by myself.
00:03:48
So I started the business in London and that's it.
00:03:50
We opened up the doors doing things like application pen
00:03:53
testing, because we saw the game was at the application level.
00:03:56
We were early founders of OWASP and anything open source.
00:04:00
We put our arms around it and said if that's what you can
00:04:02
afford, you don't have to buy a product.
00:04:06
So anyway, I've had a long road.
00:04:08
But at the end of the day, joe, I'm a big picture song and dance
00:04:11
guy.
00:04:11
I'm a risk guy.
00:04:13
I could rebuild my old laptop if I had to.
00:04:16
But that's a big "if I had to".
00:04:18
"If" was pulling a lot of weight in that sentence, as they say. Yes,
00:04:21
I would know a firewall if you hit me over the head with it,
00:04:23
but you'd probably have to hit me two or three times.
00:04:25
I'm a big picture.
00:04:27
What are you trying to protect?
00:04:28
Why are you trying to protect it?
00:04:29
What's going to happen to you if you fail?
00:04:31
You know what's your appetite for risk.
00:04:32
And I'm one of these guys who's always thought that
00:04:39
cybersecurity is an oxymoron: there's no such thing as a secure
00:04:47
computer. It's identify, minimize, manage.
00:04:48
So I spent a career doing that, truly helping clients develop
00:04:49
a cybersecurity strategy and then filling out
00:04:51
that strategy with controls: people, process, technology.
00:04:56
It's been a long road.
00:04:57
I'd like to say it's been fun.
00:04:58
It hasn't been as fun as I thought it was going to be in
00:05:01
hindsight.
00:05:01
Yeah, out here in Europe we have a really good view as a
00:05:05
consultancy.
00:05:06
That's not selling a thing.
00:05:07
We have a really good view of the threat landscape.
00:05:09
I really do feel, in the position I'm in in my industry,
00:05:13
I've got my finger on the pulse, or at least we have a clear view of how
00:05:16
threat actors are getting in and strong controls that you could
00:05:21
do to negate that.
00:05:23
So, yeah, it's been fun.
00:05:25
Still is fun.
00:05:26
It changes every day I come to work, which is a good thing.
00:05:30
Speaker 1: Well, tell me about being an American and moving to
00:05:35
Europe.
00:05:35
You know, I actually love Europe.
00:05:39
Unfortunately, I've mostly spent, I think, pretty much all
00:05:45
of my time in Germany.
00:05:47
I've been to Ireland for 48 hours.
00:05:50
I feel like I did the whole island in 48 hours.
00:05:53
I went to Amsterdam, but I don't know if that really counts
00:05:57
as spending time in the Netherlands.
00:05:59
Speaker 2: No, it counts.
00:05:59
That's a check in your passport.
00:06:02
It counts. In some places it counts for more than the amount of time
00:06:06
you actually spent there, because you could have lost time
00:06:08
there.
00:06:10
Speaker 1: Yeah, for sure, you know, as someone that loves
00:06:14
Germany, right, I'm trying to convince the wife that when we
00:06:16
retire in like 30 years, right, hey, let's go over to Germany.
00:06:20
Like, let's go to Europe, let's go travel over there.
00:06:23
You know, and she's very against it, but she's also never
00:06:28
been, right?
00:06:29
So what's that?
00:06:30
What's that like?
00:06:31
And I ask also because in America it isn't very
00:06:37
often that we get Europeans, right?
00:06:41
But I remember the last time I was over in Germany, I went into
00:06:45
a random cigar shop in Munich and, you know, there were three people
00:06:52
in the lounge, right,
00:06:52
including myself.
00:06:53
One of them was an American, born in Seattle, founded a
00:06:59
company, sold it to Microsoft, moved to Germany and founded
00:07:03
another company.
00:07:04
He's getting ready to sell it again, right, so like, I'll run
00:07:08
into Americans in Europe, like all the time, it seems, but very
00:07:14
rarely do I get any run-ins with anyone from Europe in
00:07:20
America.
00:07:21
Speaker 2: I didn't have that experience. I spent the bulk of my
00:07:23
professional life in the States. I was born in the Midwest,
00:07:27
but I ended up going to school in Washington DC, which was
00:07:30
very international: Georgetown, American University, very, very
00:07:33
international, in fact.
00:07:34
I studied international affairs and, you know,
00:07:37
you can walk down the streets of Washington and hear French and
00:07:39
Portuguese and German spoken on the street.
00:07:41
But I like to travel and of course Washington is a very
00:07:45
international city because of all the embassies and the
00:07:48
government seat, and I used to travel.
00:07:51
I used to travel a lot, I loved it, but the first chance I got
00:07:54
I wanted to get out.
00:07:56
I was in the military and I lived outside of the US for
00:08:00
quite some time and I think there are benefits that are not
00:08:03
known until you actually receive them.
00:08:05
And for me, being an American in Washington DC, inside that
00:08:10
little DC bubble of politics and reading the Times in the
00:08:15
morning and the Post at night, I was one of these geeks who
00:08:17
would actually listen to MacNeil/Lehrer on the weekends.
00:08:20
I was just information-obsessed and yet felt like I knew
00:08:26
nothing politically.
00:08:27
Anyway, the point of that story is that I had the chance to go
00:08:31
overseas and I initially went to Paris.
00:08:33
I live in France now.
00:08:34
I have a business in London. My wife is French and,
00:08:39
frankly, doesn't think very highly of Americans.
00:08:43
You know what? I can't say
00:08:45
I blame her, outside of the obvious reason.
00:08:47
But in Europe I find there's a finer appreciation for a certain
00:08:51
rhythm of life, the benefits of family, friends, food.
00:08:56
I remember when I first moved to Paris as an American, I'd get
00:09:00
there and I'd say, okay, let's eat, let's go do something.
00:09:03
They were like, why do you want to do something?
00:09:04
Let's go bowling, let's see a movie, let's go do this, let's
00:09:07
go do that.
00:09:08
And for an American it was about the quantity, how many
00:09:11
things you could pack into a night, you know, and the French were
00:09:30
very: hey, we ate, and that's it.
00:09:31
You know, I had to be reprogrammed as an American to
00:09:34
live in a European culture.
00:09:35
I live in France, in a small village.
00:09:37
I gotta tell you the quality of life, like I said, the pace,
00:09:40
the rhythm of life is a lot more real, you know, like from the
00:09:45
fruits and the vegetables to how time passes.
00:09:49
And as an American, that's not it.
00:09:53
I find I was raised in the culture of,
00:09:55
you know, quantity, not quality.
00:09:57
And in Europe, I find the Europeans have a finer taste for
00:10:00
quality over quantity and I know that's a cliche, but it's
00:10:05
certainly one, for my age and experience, that is based on a
00:10:10
certain truth.
00:10:10
So I enjoy life in Europe.
00:10:14
I tell you, the hardest part for me as an
00:10:17
American was starting a business, because I started it in the UK,
00:10:20
which is supposed to be,
00:10:21
you know, the same culture divided by a common language,
00:10:24
as they say.
00:10:26
But, you know, very, very different from how Americans do
00:10:29
business. For example, the English,
00:10:31
I find, just don't like to tell you no. And, you know, as I'm
00:10:35
moving in cybersecurity, I'm like, you know, can we help you
00:10:37
out with pen testing?
00:10:38
Can we do this, can we do that?
00:10:39
Yeah, maybe. They don't want to say no. And, you know, in
00:10:49
the States it's: hey, you interested in getting some pen testing?
00:10:50
No? No, thanks. Okay, thanks. Don't want to waste your time,
00:10:51
you don't want to waste mine. But there's a certain politeness
00:10:52
in a European business environment that I find, as an
00:10:54
American, it's just like: whoa, that was new.
00:10:56
Sorry, it's been three years I've been asking you if you'd be
00:10:59
interested in supply chain risk assessments.
00:11:02
I think it would help, you know. And you have to come to your senses
00:11:05
and say, all right, you know, I'm a nuisance.
00:11:09
So anyway, there's a lot of things like that, you know.
00:11:13
After I've been over here 30 years, I'm
00:11:17
looking around thinking I still feel like an
00:11:19
outsider here.
00:11:20
Speaker 1: Yeah, it's really interesting the culture
00:11:24
differences, you know, between Europe just overall and America,
00:11:28
right, you know, when I went to that cigar lounge because I had
00:11:35
spent a decent amount of time in Germany before that, you know
00:11:39
, I think I spent probably eight weeks total in Germany before
00:11:43
that,
00:11:45
which is a good amount, right, for an American to make
00:11:52
that kind of a journey to go there, you know, several times
00:11:54
and whatnot.
00:11:54
When I went to the cigar lounge, you know, it was closer to, like,
00:11:56
closing time, right. Or, what it was was,
00:12:00
they were closing early, but it wasn't posted online.
00:12:02
So I thought that I was there, you know, two, three hours
00:12:05
before closing.
00:12:06
In all actuality, I had about 45 minutes, right? I didn't know,
00:12:11
you know.
00:12:13
So I bought the cigars and I asked very politely can I smoke
00:12:16
them in your lounge?
00:12:16
You know, because I don't know, some lounges are different,
00:12:19
right, they have different rules in America.
00:12:22
You have to spend a certain amount.
00:12:23
Maybe the lounge is only for members, all that sort of thing.
00:12:26
He goes, well, where else were you going to smoke it?
00:12:28
I'm like, well, I mean, I guess. And he goes,
00:12:33
Joe, it's 30 degrees outside.
00:12:36
I'm not going to tell you to go outside and smoke a cigar.
00:12:38
I'm like, well, that was kind of my plan and you know, the
00:12:41
hospitality was something totally different.
00:12:43
You know, if that was in America they'd be like, no, we're
00:12:46
closing.
00:12:47
Speaker 2: You know, yeah, there's that.
00:12:50
That's, I guess, what I mean in terms of the pace of life, the
00:12:53
rhythm of life.
00:12:53
We're not going to close the doors on it.
00:12:55
You know, when you just bought the cigar, uh, we understand,
00:12:58
and it's always in there.
00:12:59
That's a good example.
00:13:00
It's a really good example of the way businesses operate
00:13:03
differently.
00:13:04
There's more focus on people.
00:13:06
But I tell you, on the flip side of that, though, Joe, in terms
00:13:09
of things like how people view data, to bring this closer to home in
00:13:12
terms of cybersecurity, I find the Europeans in general have a
00:13:15
very fine appreciation of privacy in general.
00:13:19
In the States, you would have bought some cigars and within
00:13:22
minutes, you would have been in databases across the states
00:13:24
about cigar smokers.
00:13:26
Are you left-handed, are you right-handed?
00:13:29
Do you like these cigars?
00:13:30
Do you like those cigars?
00:13:31
In Germany, you walk into a place.
00:13:33
You get a cigar.
00:13:34
The fact that you like cigars is not going to be shared, as a
00:13:37
consumer, with anybody else.
00:13:39
It's just that data privacy is a big deal and, as an American, I
00:13:44
love that, because you know it's cybersecurity.
00:13:47
I love that.
00:13:47
I talk to US businesses where, you know, data is just a
00:13:51
commodity, it's cash.
00:13:52
Whereas over here,
00:13:53
protecting data is not about ones and zeros.
00:13:58
It's about: this is data about people's lives, whether it's the
00:14:01
kind of cigars they like, or
00:14:04
their geolocation, where they were last week, or what kind of
00:14:06
movies they like, or their Netflix to-do list.
00:14:17
It's a much more private as well as personal approach to both
00:14:19
business and, what I'm finding, security.
00:14:21
People see data as a human thing, and the right to privacy
00:14:23
and these things that you see in GDPR, the right to be forgotten,
00:14:26
the right to this, the right to that, it is seen as a
00:14:29
fundamental human right that the data that businesses process,
00:14:34
store and transmit on us, on their customers, isn't just
00:14:39
given out willy-nilly, like it is in the States, frankly.
00:14:42
So that's what I see and I appreciate both you know this,
00:14:46
this pace, this rhythm of life, and also this focus on the
00:14:49
fundamentals of privacy.
00:14:50
I think the Europeans got that absolutely right.
00:14:52
We could learn a lot.
00:14:53
Uh, as Americans, we could learn a lot, because we just
00:14:56
don't understand, you know, what we have until it's gone.
00:15:02
Speaker 1: Yeah, yeah, that's a really good point.
00:15:04
You know, and being in security , right, I remember when GDPR
00:15:10
was coming out and everyone in America was stressed out about
00:15:15
it.
00:15:15
It was on the news and you know , especially everyone in
00:15:18
security, because we were paranoid that, you know,
00:15:21
something would leave the boundaries and whatnot, right,
00:15:25
but I, I felt like I was the outlier, right.
00:15:28
I, I was the one that was saying man, this is, this is
00:15:31
actually really great.
00:15:32
I wish that we had something like that here, because it
00:15:36
gives the power of that data, of holding that data,
00:15:40
back to the person that created that data, right. And that is
00:15:45
something that is so foreign in America and probably 99% of
00:15:51
people in America don't even realize that.
00:15:53
You know you're on Facebook and it's free because you're the
00:15:56
product, right?
00:15:57
Cambridge Analytica set up an API with Facebook for a very,
00:16:01
very small amount, like $15 or whatever it was, and they were
00:16:07
able to harvest all of your data and guess what?
00:16:09
They used that to sell to other companies, to use direct
00:16:13
marketing against you.
00:16:14
It's a dirty business, almost.
00:16:23
Speaker 2: Well, okay, so yeah, you're making exactly the point.
00:16:25
You know, for me you're just ringing a bell that I wish more
00:16:29
Americans would hear.
00:16:30
I live and work in Europe, where there is a finer
00:16:34
appreciation that, you
00:16:36
know, personal data is just that:
00:16:37
it's personal.
00:16:38
This is my blood type, this is my DNA and as such, I have a
00:16:44
right to it.
00:16:45
I remember 20 years ago, when the EU was first coming together
00:16:48
, that they were talking about, literally:
00:16:55
data should have a copyright to it.
00:16:55
Like a songwriter gets a royalty for a song that he
00:16:57
writes, so should a data subject get a royalty every time a
00:17:01
business uses that data to make money.
00:17:04
You're using my blood type, you're using my DNA to generate
00:17:08
a revenue for your business.
00:17:09
How come I'm not getting a piece of that?
00:17:11
That logically holds up to me as a consumer, I'm thinking and,
00:17:18
of course, the impact on our lives from our data being sucked
00:17:22
up into this big vacuum cleaner of Amazon or Facebook or
00:17:27
whoever, and it's resold and repackaged, or Cambridge Analytica.
00:17:30
You don't know until it's over and you've either lost a
00:17:33
political party by that and you understand what the impact
00:17:38
really was on the society, or, as a net user.
00:17:40
And when it's over, it's over.
00:17:43
You can't get back privacy.
00:17:47
You know, it's a binary condition.
00:17:49
Once your DNA is out there, it's out there and that's it.
00:17:51
You know, and I know so many people who've been in that, you
00:17:55
know, so many people who, oh sure, I'll join up to Ancestry.com,
00:17:59
here's a DNA swab and then suddenly they can't get health
00:18:02
insurance and their kids can't get health insurance.
00:18:04
And you're like, of course, don't you understand that's the
00:18:06
way it works?
00:18:07
You know that that data is sold to people who want to buy and
00:18:11
understand who's prone to emphysema or bronchitis or
00:18:15
leukemia or you know.
00:18:17
But data privacy is just, anyway, I get the Europeans.
00:18:22
I think it is one of the pleasures of living and working
00:18:24
here that there's a finer sensitivity to, actually, you
00:18:28
know, once we lose this data, it's gone, and so it gives it more
00:18:32
of an importance.
00:18:33
I find American firms, even those American firms who are
00:18:36
here working in Europe, they just see it as ones and zeros.
00:18:39
Europeans see it as: this is data about my life, and
00:18:43
I deserve for it to be processed or transmitted, you
00:18:48
know, securely, and that doesn't mean it always is.
00:18:51
That's far from it.
00:19:00
Speaker 1: But there's that approach to it, which I
00:19:03
appreciate much more. Do you think that Europe, having seen more
00:19:06
revolutions, right, kind of plays a role in it?
00:19:11
Right, I mean, no one in America even thinks of the
00:19:16
revolution that we had.
00:19:17
Right, that was 300, 400 years ago, right, like, who cares
00:19:22
about that?
00:19:22
No one knows about it anymore.
00:19:24
If the government were to tell us, you know, one thing
00:19:27
about it, like we wouldn't be able to refute it really,
00:19:31
because you know you're in grade school when you learn it, fifth
00:19:33
grade, I don't think I remember anything about it other than it
00:19:36
happened, you know, in whatever year, right, 1776,
00:19:41
right, that's it, that's the extent.
00:19:44
And it's so foreign to us to say, that's not right, we
00:19:49
have to go change it, like we don't even know how to go change
00:19:51
it.
00:19:51
I was talking to a friend over in the UK and, uh, he
00:19:56
was saying, I'm very confused as to how America hasn't had a
00:20:00
revolution yet.
00:20:00
You guys have all the guns.
00:20:02
How do you not have a revolution?
00:20:05
Like, how do you not have the government that you want?
00:20:07
And I was like, hey, man, we don't know how to do a
00:20:11
revolution anymore.
00:20:12
We know how to,
00:20:13
you know, protest, we know how to, you know, have some
00:20:17
riots, but we don't know how to take it beyond
00:20:19
that, you know, and we have no ability to, because we
00:20:23
haven't done it in 400 years almost.
00:20:26
And we see other countries doing it but somehow we still
00:20:31
kind of have it in the back of our minds: hey, you can't do that,
00:20:34
that's not for you, right?
00:20:36
Yeah?
00:20:38
Speaker 2: It's odd, because the American personality, at least
00:20:42
to Europeans, is that we're very aggressive and we are.
00:20:46
That's about our guns, that's about our freedom, but we don't
00:20:52
put two and two together.
00:20:52
You're right, I love that we don't get the government we
00:20:55
deserve or that we want, but I don't know, I don't think it's
00:21:00
as easy as that.
00:21:00
I think there's just a leadership void.
00:21:03
Um, and you know, frankly, we're a big country full
00:21:09
of you know, we're all immigrants.
00:21:10
We've all come from somewhere else.
00:21:11
We all went there to build a better life, and
00:21:14
Europeans get that.
00:21:15
You know,
00:21:15
they see the Americans,
00:21:17
you know, because they have, you know, their aunts, their
00:21:20
uncles, whether they're, you know, from Germany, from Poland,
00:21:23
from France, from Spain.
00:21:25
They know somebody in the States because they have
00:21:28
relatives there who've immigrated there.
00:21:30
20, 30, 40, 50 years ago.
00:21:31
My grandparents came from Poland and it was the place to
00:21:35
be, and so you went there to get away from what was happening in
00:21:39
Poland at the time or what was happening everywhere.
00:21:42
We don't know how good we have it where the Europeans have a
00:21:46
long, as you say, a long bloody history of turmoil and social
00:21:51
unrest, and they know what it takes to go out on the street.
00:21:55
The French just had an election.
00:21:57
The UK, the Brits, just had an election.
00:21:59
It is just we're a two-party system in the United States.
00:22:03
We're a two-party system.
00:22:04
It's this one or it's this one.
00:22:06
It's the right hand or the left hand.
00:22:08
Which do you want?
00:22:08
And you know, in a country as big as 340 million people,
00:22:13
whatever we are now, how can it come down to two choices?
00:22:16
I don't know.
00:22:17
It's just amazing.
00:22:18
You know, and because of that we don't get leadership, back to,
00:22:26
i.e., things like data protection, privacy legislation.
00:22:27
A country like the United States does not have federal
00:22:28
regulation to mandate data protection.
00:22:29
That's crazy.
00:22:30
That's crazy to a European.
00:22:32
You know we talk about Europeans see it as a
00:22:34
fundamental human right and you know Americans don't even have
00:22:38
it registered as a law.
00:22:39
You know, unless you live in California and you've got the
00:22:41
California State Senate Bill, you don't get the protection that you
00:23:08
deserve.
00:23:08
And as a common denominator culture, the Americans, I see it,
00:23:13
living overseas: we elect common denominator leaders who speak
00:23:18
in soundbites and try to please as many people as possible and
00:23:22
year after year we get less and less done.
00:23:24
I don't know I'm being pessimistic.
00:23:26
I'm technically a baby boomer.
00:23:28
I'm on the tail end of the baby boomers.
00:23:32
I was born in the late 50s and I don't know.
00:23:35
It was a different time growing up in the 60s.
00:23:37
For a kid like me, when people were trying to levitate the
00:23:41
Pentagon and, you know, stop the war in Vietnam and suddenly
00:23:45
we're overtaking the Capitol, I don't get my tribe.
00:23:48
I don't understand what happened to us.
00:23:51
How do we go from baby boomers trying to levitate the Pentagon,
00:23:55
you know, to overtaking the House and Senate?
00:23:59
I don't know. But you know.
00:24:03
The other thing, though, joe, is we're young, you know, and
00:24:06
that's absolutely what Europeans know.
00:24:07
As you said, they've had hundreds and hundreds and
00:24:10
hundreds of years of history.
00:24:11
I lived in Paris.
00:24:13
I lived in an apartment building.
00:24:14
It was older than my country.
00:24:15
It was older than the United States.
00:24:16
The apartment had been around for 500 years and I
00:24:19
thought, oh okay, that gives me perspective and I'm just living
00:24:22
here, you know, and my country hasn't even, you know, been
00:24:29
separated from England for that long.
00:24:29
So, uh, it's young, it's early days, uh, I'm hoping that we get
00:24:33
through turmoil and start to understand what we really were,
00:24:36
because, man, we're capable of so much.
00:24:40
Speaker 1: Man, that is so
wild that your apartment building was older than America,
00:24:44
especially considering, you know, it went through
00:24:47
two world wars.
00:24:49
That is, uh, that is wild to think that, you know, and
00:24:52
I think that that's part of the appeal of Europe for me.
00:24:56
You know, is the history behind it right, like it's just, uh,
00:25:00
it's amazing, like, everywhere you turn, and the way that they
00:25:02
present the history and, um, you know, the way that they teach
00:25:06
it.
00:25:06
It's something that's unlike, you know, anything that I've
00:25:10
experienced in America, which is, I think, kind of what draws me
00:25:13
there.
00:25:14
But you know to your point about data privacy.
00:25:16
You know, that's a really good point.
00:25:19
America doesn't have, you know, one overarching body that
00:25:24
governs data privacy, right, or one overarching law, so to speak.
00:25:28
00:25:29
Right, we have a whole bunch of different, you know little
00:25:33
individual laws that are different by state.
00:25:37
That you know, and for people that are listening, potentially,
00:25:40
you know, in Europe, right, one of our states is the size of
00:25:43
your country, like that's literally what it is, and you
00:25:47
know, it's funny.
00:25:47
First time I went to Germany, I was talking to someone from
00:25:52
Russia and he was confused as to how I hadn't, like, seen,
00:25:59
thoroughly traveled and been to every single state in America. He goes,
00:26:04
I can understand Hawaii because that's so far away, but how can
00:26:08
you live on the same continent and not go to everything?
00:26:12
And someone else from the UK that actually studied American
00:26:16
history was like hey, man, the state of Illinois is like two
00:26:22
times the size of Germany, right , and he was so confused, he was
00:26:26
so blown away by that fact and I literally told him I was like
00:26:29
hey, like you could drive for eight hours and you're still in
00:26:32
Illinois, you know?
00:26:33
Or you're, like, just about to break that border, right? Like, that's how crazy, just how crazy
00:26:40
Speaker 2: The distance is, you know?
00:26:42
Speaker 1: Yeah.
00:26:43
Speaker 2: Yeah.
00:26:47
If you start at the top and you go to the bottom, it's like eight and a half hours, it's insane.
00:26:49
No, no, I remember I used to drive from Washington DC to Wisconsin, and you know, just to get across Ohio felt like a major commitment. And I just, I've never understood, like
00:27:01
the trucking industry, how you could drive like that for a
00:27:03
living.
00:27:04
Yeah, yeah, the size, the, the space is just overwhelming.
00:27:07
But on the other hand, when you talk about international travel
00:27:10
and stuff, so yes, we're living in states that are the size of
00:27:13
countries in Europe and elsewhere around the world, but
00:27:16
at the end of the day, the other thing is that Americans I see
00:27:19
this statistic and it's like seven out of 10 Americans don't
00:27:22
even have a passport.
00:27:24
So we might go to Canada, we might drive down to Tijuana or,
00:27:28
you know, go to Cancun, for you know that's our idea of
00:27:30
international travel.
00:27:31
But honestly, you know to go to another country and then look
00:27:36
back and look at the United States, you see it in a
00:27:39
completely different context.
00:27:41
Yes, you see how big it is and, yes, you see how varied and how
00:28:04
tough the problems are.
00:28:05
But it's not until you leave a place and look at it from the outside that you think, wow, there's a whole world around us.
00:28:15
Speaker 1: Yeah, that is.
00:28:16
That's so absurd that seven out of 10 Americans don't have
00:28:22
their passport.
00:28:23
Speaker 2: It's a rough statistic.
00:28:24
I heard this years ago but every year I just hear it's like
00:28:26
the same.
00:28:27
It's like you know, it's like those FBI cybersecurity
00:28:31
statistics they're different but they're the same year after
00:28:33
year after year.
00:28:34
And the fact is, and you know what, hey, okay, if you're happy
00:28:37
and you don't feel the need to travel, that's fine.
00:28:42
But I know, growing up, you know , with Polish grandparents who
00:28:47
really sacrificed everything they could to get to the United
00:28:50
States, you know, to get a job making beer in Milwaukee, you
00:28:54
know that was the end goal.
00:28:55
Why do I want to go back to Poland?
00:28:57
You know, or much less you know , france or Italy or Spain, no,
00:29:03
I wanted to be here and there was that mentality when I was
00:29:06
growing up.
00:29:07
You don't need everything you need is right here in the United
00:29:09
States.
00:29:09
And to a certain extent you know it was, it maybe still is,
00:29:13
You know, had it not been for National Geographic... My dad had a thing for National Geographic. That was my little fix, and you read those and just something gets in your blood. You've got to go, you've got to see China.
00:29:34
And once you do, you say cool, what was I whining about back in
00:29:35
the states?
00:29:36
You know, yeah, we don't know how good we have it, but we're
00:29:37
off the work yeah, that's a.
00:29:40
Speaker 1: That's a really good point, though, that you bring up
00:29:42
um, and do you think, do you think, that the sheer size of
00:29:47
america kind of plays into the data privacy law issue in
00:29:53
america?
00:29:53
The reason why I say that is because it seems like in America there are these very definitive issues that we have as a nation that the federal government leaves up to the
00:30:09
states, right, I think more because they don't want to deal
00:30:11
with it, they don't want to spend the time and resources to
00:30:14
actually, you know, solve it right, and so they leave it up
00:30:17
to the states to decide on how they're going to, you know,
00:30:20
treat something or act with something, right?
00:30:23
Do you think that the size plays a role, or is it just, you
00:30:28
know, lazy Americans not wanting to kind of go all the
00:30:31
way with something?
00:30:32
Speaker 2: I don't know.
00:30:32
Look at it this way. I think of when I was a kid.
00:30:38
I was driving around in cars with no seatbelts.
00:30:41
They weren't mandatory, you know.
00:30:43
They weren't mandatory and we were getting in car crashes and
00:30:46
flying through the windshield at 20 miles an hour.
00:30:48
We were, and people were needlessly dying because Detroit
00:30:53
did not see any financial motivation in putting them in. I don't know, how much does a seatbelt cost? A piece of canvas with some metal buckle?
00:31:01
Speaker 1: It can't be much.
00:31:02
Speaker 2: Two, three bucks if you bought them en masse.
00:31:04
Yeah, exactly, but it took guys like Ralph Nader in my lifetime
00:31:08
to just say this has got to stop right.
00:31:11
So in certain areas airline regulation, automobile safety
00:31:15
suddenly we get from no seatbelts to anti-lock brakes
00:31:18
and airbags, and look at all the features.
00:31:21
That's all mandated by federal legislation and overseen by safety councils.
00:31:28
What's the problem?
00:31:30
What's the difference between that and cybersecurity?
00:31:32
I don't get it.
00:31:33
I don't understand it.
00:31:35
It can hurt a consumer, it's proven, your data's lost and it
00:31:39
can have a financial impact on you, whether that's immediate, as in identity theft, or long-term, like I can't get
00:31:45
health insurance, whatever it is but clearly the connection
00:31:49
between the citizen in a society and the protection of that
00:31:53
citizen's data is not being made. They're not just a data set. I'm not talking about
00:31:56
name and date of birth, but I am talking about DNA and I am
00:32:00
talking about blood type and health records and things that
00:32:03
are sensitive to that end user.
00:32:05
I don't understand how we don't see that as a government
00:32:08
requirement to protect citizens.
00:32:10
We protect our citizens when we put them behind the wheels of
00:32:13
cars, but we don't protect them when we allow them to use the
00:32:16
internet and have Jeff Bezos take their geolocation and sell
00:32:24
that for their religious affiliation or their sexual
00:32:27
preference?
00:32:27
Or are they gay, are they bi?
00:32:29
Are they this, are they that, and sell that, monetize that?
00:32:32
I don't get it.
00:32:33
Uh, I don't get it.
00:32:35
So I think it's not because we're a big country or because
00:32:38
people who live in South Dakota differ from people who live in North Dakota or California or Chicago or, you know, Illinois.
00:32:46
I think it's because we have not looked at it from a macro
00:32:52
level and understood the damage this is doing to our society, to
00:32:55
our people.
00:32:56
Like safety in automobiles or safety in airplanes, we need
00:33:03
safety in our computers.
00:33:04
And it's funny because, when it comes to understanding cyber
00:33:07
threats from nation states, this administration alone has been unbelievable.
00:33:12
Making sure that when the government is processing, storing or transmitting the government's information, we keep all harms away.
00:33:22
But why aren't we doing that for our citizens?
00:33:23
No idea, no idea.
00:33:25
00:33:27
I don't understand how people don't see it. We don't have a Ralph Nader for cybersecurity that says enough is enough.
00:33:34
We're losing too much data.
00:33:37
Speaker 1: Yeah, it's a good point. You know, I only come back to, you know, like, what's been going on with Facebook, right, or Meta, where Meta was caught.
00:33:46
You know, what was it? It was like promoting, you know, sex trafficking in some way
00:33:52
and allowing you know pedophiles to connect with children and
00:33:56
things like that, right, and they kind of got a slap on the
00:33:59
wrist right.
00:33:59
They got brought before Congress and now we are three,
00:34:03
four months removed from that.
00:34:05
We're in an election year and no one is thinking about that,
00:34:09
right?
00:34:10
Anyone listening to this podcast, I almost guarantee you haven't thought about it since then, right?
00:34:16
But when the data is so abundant and you have so much of it and
00:34:31
you're making money off of that data, what is a fine going to do?
00:34:34
Right, Meta probably has lobbyists working for them that are going to go get that fine down, so it's not going to be
00:34:42
the real total amount or anything like that, right?
00:34:45
But when you're basically printing money, you know, as
00:34:49
these big data companies, um, what's a fine going to do?
00:34:54
You know, I feel like there's just too much, there's too much
00:34:57
money going around for any real impact to take place in the data
00:35:03
privacy right, because the companies that have this data,
00:35:07
that are making insane amounts of money off of this data.
00:35:11
They have the money to go and pay for the lobbyists to go and
00:35:15
say, hey, you're invested in my company too.
00:35:17
Congressperson, senate person, whatever right, you're invested
00:35:21
in my company too.
00:35:22
You have a vested stake in this .
00:35:24
How about you just leave it alone?
00:35:25
And nine times out of 10, they'll either leave it alone or
00:35:29
they'll make a change that isn't really significant to the
00:35:33
business to improve anyone else's data privacy.
00:35:36
Speaker 2: Joey, you're bringing me down. I thought we were going to be positive.
00:35:41
I thought I was going to hang up feeling, hey, uplifted.
00:35:44
No, you're right.
00:35:44
For me, I'm listening to two things.
00:35:46
I'm thinking all the money they could make.
00:35:47
They could put a little more effort into protecting that data
00:35:50
.
00:35:50
And the other key word that you use is they were caught.
00:35:53
Now, when you're caught for one thing, think of all the things you weren't caught for.
00:35:59
So, getting caught profiting off of, you know, a sex trade, that is one thing. Well, that just shows you where you got caught. For me, I absolutely do believe that with criminals, you always see the tip of the iceberg. Where you're getting caught, you're just being stupid.
00:36:16
And it's where you're not getting caught that people are
00:36:18
getting away with real crime and making real money.
00:36:21
I don't know, joe, I don't have an answer.
00:36:24
I think we need a consumer.
00:36:26
I'm not kidding when I said we need a Ralph Nader for privacy.
00:36:29
We need somebody to say enough is enough.
00:36:32
How is it that Facebook and Amazon and all these tech
00:36:38
companies have shown us that data equals cash?
00:36:41
And once we understood that our data, our personal data, meant
00:36:44
their cash, how come we didn't say stop, stop, stop.
00:36:48
You have no right to sell my DNA.
00:36:50
You have no right to sell my biometrics, my blood type, my
00:36:57
religious affiliation, my tax code, my social security number.
00:37:00
These things are mine.
00:37:02
This is my intellectual property.
00:37:04
They belong to me.
00:37:05
I don't know. I'll never forget my sister once.
00:37:12
She's working on a family tree and she showed me this family
00:37:15
tree.
00:37:15
I said where'd you get this?
00:37:16
And it says here uncle so-and-so came over from Krakow
00:37:19
in 1906.
00:37:21
And I'm like I don't even think grandma knew that.
00:37:24
How does this company located in Illinois, how do they know
00:37:29
more about my family than my family does?
00:37:31
Please tell me that.
00:37:32
You know we never even knew who our aunt so-and-so was married
00:37:36
to or what her real maiden name was, or you know.
00:37:38
And yet this is in the database that people are selling to us.
00:37:41
Here's your family.
00:37:42
Let me sell that back to you.
00:37:43
00:37:44
Here's your family tree.
00:37:45
These ancestry platforms, you're thinking, now they get into DNA.
00:37:48
I just don't understand how we didn't catch up and say, hey,
00:37:52
wait a minute, that's my family.
00:37:53
How come I got to pay for that information?
00:37:55
They give us a free this and they give us a free DNA swab and
00:37:59
we'll show you who you could be connected to.
00:38:01
And you're like, wait a minute, you know, I found out my niece
00:38:05
gave away her DNA and I'm saying , hey, that's my DNA too.
00:38:08
That wasn't your right to give up our family DNA.
00:38:11
I don't know, we just don't get it.
00:38:13
We don't, we don't. And of course, we're lazy, we're fat, we're overfed, underloved, and we just love the convenience, the convenience of
00:38:25
Bezos telling us the next book we're going to buy, or Netflix
00:38:28
telling us, hey, if you like this, watch this, this and this,
00:38:30
and not understanding, well, where do they get that data?
00:38:34
We just don't get it, we just don't get it.
00:38:35
So, anyway, I think it's a consumer thing and I think,
00:38:39
until consumers say enough is enough and then we ask for it,
00:38:43
I'm not pro-legislation or regulation, but I don't see how
00:38:46
this is going to be fixed.
00:38:48
We couldn't put seatbelts in cars without legislation,
00:38:52
without the government mandating it, and then it changed lives,
00:38:55
it saved lives.
00:38:56
Millions and millions of us have been saved from drunk driving, and the same thing could happen in our industry, in
00:39:04
our cybersecurity industry.
00:39:05
But I don't know.
00:39:05
I'm at a conference and I'm advocating regulation and I sound like some neo-fanatic.
00:39:10
The government's not the answer.
00:39:12
When the government has to step in, we all got it wrong.
00:39:13
Clearly, we need to fix something that's broken.
00:39:17
And the fact that our personal data is monetized by other
00:39:20
companies with no kickback to me , I think something's wrong.
00:39:23
And the fact that they don't pay to protect that data?
00:39:28
You know they breach it and suddenly bad guys have it, as
00:39:32
well as Meta, and you know, because Meta lost its database
00:39:35
of everybody who's left-handed in Wisconsin.
00:39:37
And they have that database of everybody who's left-handed in
00:39:40
Wisconsin.
00:39:41
They do.
00:39:41
And because if you can monetize left-handers in Wisconsin, you
00:39:44
can sell it.
00:39:45
I bet, I'm telling you, that's the whole point. Until we see that, then suddenly, you know, hey, the left-handed database is up for sale on the dark web.
00:39:53
People don't care.
00:39:54
Okay, I'm left-handed, but it doesn't mean I won't have job
00:39:58
opportunities in my future.
00:39:59
Really, we'll see when the right-handed people take control.
00:40:05
Speaker 1: How do you feel about that? Yeah, that's, uh, that's interesting.
00:40:09
I wonder how many sports organizations would actually buy
00:40:11
a database like that.
00:40:12
You know, that's left-handed people that play a certain sport.
00:40:16
You know, because predominantly the players are right,
00:40:20
right-handed, right, so if you can go with your left hand, it
00:40:22
throws everything off.
00:40:24
It's like it really confuses a right-handed player sometimes of
00:40:28
you know the timing and everything, right, I'm thinking
00:40:31
about baseball.
00:40:31
But to bring it back to risk, you know, you brought up an interesting point with Facebook, or Meta, right. They were caught with, you know, an atrocious crime, right, sex
00:40:45
trafficking, right, and they were not committing the sex
00:40:48
trafficking themselves, but they were definitely enabling it and
00:40:51
allowing it to happen on their platform.
00:40:54
right, they were profiting from it.
00:40:55
And you know someone you know at Meta, right, a lawyer or
00:41:03
someone else, right Literally said what's the risk of us
00:41:07
getting caught with this compared to this other thing?
00:41:11
And you know someone had to say, oh, I'd rather be caught with sex trafficking. I mean, that's the risk calculation that they made, that's the actual math, right there.
00:41:27
And they, literally, they probably attributed how much
00:41:30
they would be fined, the publicity that they would lose
00:41:34
or gain, all of that stuff.
00:41:36
And so it's fascinating to think about it that way, because
00:41:41
I'll give you an example.
00:41:43
Right, um, I was at a friend's wedding and me, and probably
00:41:46
like five other couples are, you know, having their first kid
00:41:50
within the next six months.
00:41:51
Right, that's a.
00:41:53
It's a huge thing.
00:41:54
And you know these, these couples are saying you know, I'm
00:41:58
not gonna go with a baby monitor or like a normal, you
00:42:01
know camera, baby monitor, because you know the camera feed
00:42:04
is being sold on the dark web and you know predators are
00:42:07
breaking into homes and stealing babies and things like that.
00:42:10
You know, these near-zero-percent occurrences, that's what they're basing their risk profile off of.
00:42:19
And you know they're asking me what I'm going to do, right,
00:42:23
because I'm in cybersecurity, and what am I going to do?
00:42:26
And you know I told them like, yeah, I'm getting a Nanit, you know, baby monitor, whatever.
00:42:35
It's going to be great.
00:42:36
And they were just completely confused: how are you going to do that?
00:42:38
Right, and I'm saying look, you are a small, small, small fish
00:42:42
in a very large pond.
00:42:43
One, what are the odds that someone is going to find that
00:42:49
specific data feed?
00:42:50
And two, I actually have network security controls in
00:42:53
place where I can limit very precisely how much data that camera sends externally, which I actually had to really lock down significantly, because Nanit is just a spy tool at this point for how much data it was actually sending out of my network.
00:43:17
I have a medium-sized enterprise network in my home.
00:43:20
I have network switches, I have firewalls, everything like that
00:43:24
right, and the amount of data that it was actually sending
00:43:29
outside of my network was crashing my network, it was
00:43:33
slowing everything down to a halt and I'm like what the hell
00:43:37
is going on?
00:43:38
I did not just pay a premium to Xfinity to get 100 down, I paid the premium for 1.5 gigs down, you know, and I look and it's Nanit, and it's taking up 90% of my network bandwidth, and I could not believe it.
00:43:52
I mean, obviously I put the restrictions in so that it's literally just allowing me to, you know, connect from my app.
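What he's describing, restricting a camera's egress so that only the app connection still works, can be sketched as home firewall policy. A minimal sketch, assuming a Linux router running nftables; the camera address, port, and rate limit here are placeholders, not Nanit's actual infrastructure, which you would have to observe from your own traffic:

```
# Hypothetical nftables fragment: limit an IoT camera's outbound traffic.
# 192.168.50.20 is an assumed static LAN address for the camera.
table inet iot_egress {
  chain forward {
    type filter hook forward priority filter; policy accept;

    # DNS, so the camera can still resolve its cloud endpoint
    ip saddr 192.168.50.20 udp dport 53 accept

    # Allow TLS to the vendor cloud (what the phone app relies on),
    # but rate-limit it so the camera can't saturate the uplink
    ip saddr 192.168.50.20 tcp dport 443 limit rate 5 mbytes/second accept

    # Everything else leaving the camera gets dropped
    ip saddr 192.168.50.20 drop
  }
}
```

In practice you would first watch what the camera actually connects to, with a packet capture or the router's connection table, and then tighten the accept rules to just those endpoints.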
00:44:02
You know, like as a parent, that's very, very concerning.
00:44:05
It's alarming, right, because I mean, it's a camera that you're
00:44:09
using to watch your kid when they sleep, right, make sure
00:44:13
that they're okay.
00:44:13
You know that's it right, but they're sending so much data.
00:44:17
I mean they have to be collecting data from every other
00:44:19
device on the network and then sending that somewhere and
00:44:22
they're selling it.
00:44:23
They have to be selling it.
00:44:24
00:44:27
Speaker 2: It's very lucrative.
00:44:27
It's everywhere, though, Joe.
00:44:29
It's in everything you see.
00:44:32
Look at the iPhone earbuds: they take your pulse rate, your heart rate, literally. Not only that, but just the general volume that you listen to, which might reveal your hearing level.
00:44:49
All that, send that back.
00:44:50
Who bought Fitbit?
00:44:53
Who was losing money hand over fist? Microsoft.
00:44:57
Speaker 1: Was it?
00:44:57
Speaker 2: Google, Google, yeah, Google, you're right.
00:44:59
Google bought Fitbit.
00:45:00
It's a losing business.
00:45:02
What did they buy?
00:45:02
They bought Fitbit because Fitbit takes your heart rate, your blood pressure, you know, it's medical data from everybody who uses it.
00:45:09
You know, the Peloton bicycles, all that, they're just user-profiling vehicles that send that data back.
00:45:18
You're right to look at the egress.
00:45:20
You know what's egressing your house from your refrigerator for
00:45:23
crying out loud, much less your nanny cam.
00:45:29
But hey, why does it keep going up?
00:45:29
Why doesn't somebody say enough?
00:45:30
Why, why? Why can you be a vendor and make a product that is essentially spyware and sell that legally?
00:45:37
And this is where I say, you know, is the answer legislation? At this point, I don't see what else is going to do it, because when there's money to be made... And here's the problem with that vendor, whoever they are: they're taking that data, pulling it, vacuuming it out of your house, and not spending any money to protect it. And then threat actors come in and take that, and that's the kind of stuff your friends are talking about.
00:46:02
Oh, pedophiles can get at that same feed.
00:46:04
Yeah, because they're selling these in databases.
00:46:07
And all of this data because it can be monetized has value to
00:46:12
somebody.
00:46:12
And it is just the craziest one I saw.
00:46:16
I was on the dark web and I saw a database of 5 million users
00:46:21
and the data was all about what hair conditioner they used.
00:46:23
And I thought to myself A why would you steal this if you were
00:46:27
a hacker?
00:46:27
And B who would you sell it to?
00:46:29
And it had been sold over 200 times in the last year.
00:46:34
You know, per record, and you think, wow, if you still don't get that data equals cash... I don't care if it's just a database of who uses what hair conditioner.
00:46:46
Now you think, who's the buyer for that? Is it other people who are selling hair conditioner, who want to figure out who uses what? Maybe it's a competitor, is it?
00:46:51
I don't know, but it's just amazing how you can monetize the
00:46:56
strangest data and make some cash off it.
00:46:59
And our threat actors.
00:47:00
I don't see any difference between a hacker and Mark
00:47:03
Zuckerberg None whatsoever, just different methodologies.
00:47:07
Mark will take it from you for landing on a site. Amazon puts a cookie that will take your password.
00:47:13
Go back and go to Experian or Equifax and pull your credit
00:47:19
report for just landing on the site, just landing on the site,
00:47:24
and they will suck up all your passwords.
00:47:26
What build are you using? Did you download Spotify? Are you using this? What applications? What operating system? All for landing on the site.
00:47:35
How does he get away with that?
00:47:36
How can anybody get away with that? Because in my mind, that's spyware, that's an invasion of privacy.
00:47:42
My grandmother wants to go to Amazon one day and she goes to
00:47:45
that big thing called the internet and she types in Amazon
00:47:48
and lands on his site, and suddenly my grandma's got her credit rating in Jeff Bezos' database, and he doesn't protect it.
00:47:57
Somebody steals my grandma's data you know, credit profile
00:48:01
and sets up and does identity theft because Jeff Bezos didn't
00:48:05
protect it, and my poor grandma just landed on the site.
00:48:09
We don't connect those dots.
00:48:11
You know, when I say we don't connect the dots, this is not just ones and zeros.
00:48:14
Think of that person who sat at that meeting and said, hey, if we get caught in this sex trafficking thing... and then they went home to their husband or their wife and their kids. They sat at that meeting at Facebook, at Meta, that day and said, let's go with this until we get caught, you know, and then we'll just buy our way out.
00:48:33
When we go to work in these tech companies, we just see this as ones and zeros. We don't understand: hey, wait a minute, this is people's lives.
00:48:41
We put people far away from sex traffickers or pedophiles or whatever. The issue is, if we don't look after this data, data for children, data for housewives, all of this data in the wrong hands is critical to our well-being.
00:48:58
Now, we have not passed a law mandating for personal data the same protections required for a government's top-secret information.
00:49:08
How come we're not doing that?
00:49:09
Eyes only. I want my DNA treated as top-secret, eyes-only information, and
00:49:15
only five people in the world should see my DNA, and I should
00:49:19
approve all five of them, and one is my doctor.
00:49:22
Outside of my doctor, nobody else has got a right to see my
00:49:24
DNA, and that's because I'm undergoing some sort of
00:49:27
prognosis for some disease that I may or may not have, but
00:49:30
that's a need to know.
00:49:31
I just don't understand how we don't apply that same government
00:49:34
secrecy classification scheme, very easily, to the personal, sensitive data sets of you and me.
00:49:43
Speaker 1: Yeah.
00:49:44
Speaker 2: Do I sound cynical, Joe? Because part of it is the cappuccinos and part of it is the lack of cappuccinos. Or do I sound too optimistic?
00:49:53
Speaker 1: Well, you know, this is why I have the podcast, right. This is a security professional's conversation about, you know, topics that touch everyone, right. Like, when we're working at these companies.
00:50:10
You know, as a security professional, you're not
00:50:12
thinking about one subset of your customer base or one subset of the data.
00:50:17
You're not.
00:50:18
You're thinking about every single customer, every single
00:50:22
location that they reside in, every policy, every law that's
00:50:27
going into effect, every law that isn't even on the books
00:50:30
that's being talked about.
00:50:31
You're thinking about those things, right?
00:50:37
And so if me and you, you know, met up at a pub somewhere in Europe, this is what our conversation would be, right?
00:50:41
So that's the whole reason why this platform exists.
00:50:47
And you know, I wanted to ask you, being an American and going
00:50:52
to Europe and then being in the risk industry the risk side of
00:50:58
cybersecurity was there a difference in mindset that you
00:51:03
had to make to more effectively run your business?
00:51:09
People are so polite that they won't tell you no, but we're
00:51:14
also talking about fundamental ways in which entire continents
00:51:20
view the risk of someone's data, right?
00:51:22
So I'm wondering if there's a shift.
00:51:25
That happened somewhere in there where you said you know,
00:51:30
yes, what is it like?
00:51:32
Ancestry.com shouldn't have my data, right?
00:51:37
That's a mental switch, though. Most people don't think like that, but you and I, of course, think like that.
00:51:42
But how did you make that switch?
00:51:56
Speaker 2: ...data.
00:51:56
You know, and as seen through things like GDPR and the
00:51:58
adoption of GDPR, and how quickly it went to market and
00:51:59
how quickly European firms were able to be compliant.
00:52:01
Literally, it wasn't a big deal.
00:52:03
It wasn't a big deal why?
00:52:04
Because it was always there.
00:52:05
It just was codified into legislation and now to do
00:52:09
business with a European firm, you had to be GDPR compliant to
00:52:13
these principles, to this, you know.
00:52:14
And so Americans, like we talked about, had a hard time
00:52:19
with GDPR, had a harder time.
00:52:20
So culturally, you know, I feel the clients are more attuned to this thing that we've been talking about.
00:52:24
This is, you know, it's not ones and zeros, it's the state
00:52:28
of people's lives.
00:52:29
European clients are much more sensitive to that fact.
00:52:32
That's not to say, if there's a dollar to be made, they won't sell a database; they do.
00:52:36
I mean, the Americans are the ultimate example of the
00:52:40
commoditization of our personal data, with little or no regard for the data subject.
00:52:45
I think no.
00:52:47
The changes I had to make essentially to doing business
00:52:51
with Europeans was, you know, Americans are very, you know, we've got to have it now. We've got to have it, and we needed it yesterday: I need it cheaper, I need it faster, I need it brighter. Okay, but Europeans are very, very slow.
00:53:06
So the sales cycle, for instance, I don't know what the
00:53:09
sales cycle is for penetration tests in the States, but
00:53:13
technically we could talk to a client for almost a year before
00:53:15
they get one and they haven't talked to anybody else and
00:53:18
they're not shopping around, they're not considering it and
00:53:20
it's an expenditure.
00:53:21
But the other thing I find is they actually calculate the return on the investment. You know, they quantify, all right, what benefits will come back
00:53:30
from the pen test.
00:53:31
You know, and that return on my investment of 20 grand, pounds or
00:53:36
euros, for a pen test. Well, you can buy a car for 20 grand, not a
00:53:39
great one, but you know it's a big purchase.
00:53:42
And they're getting a pen test report of 10 pages, 15 pages,
00:53:45
you know.
00:53:45
So there's a lot more fixation on the value and the return on
00:53:50
the investment for the spend, which I love, love it.
00:53:53
I don't think American CISOs in general quantify or use KPIs to
00:53:59
measure the return on their cybersecurity spend.
00:54:02
I don't.
00:54:02
So I just think that's very pragmatic and I think that's
00:54:05
great, and there is an emotional response.
00:54:08
But in terms of "this is data about people's lives"?
00:54:11
I get that at the management level, but not
00:54:16
necessarily at the board level.
00:54:18
For instance, I walk into a board and I say, right.
00:54:21
First of all, raise your hands if you have personal data in
00:54:23
these systems, and you don't see a hand go up. In fact, it's only
00:54:27
when you say, wait a minute, none of you, none of the board
00:54:31
members, have your personal data, your first name, last name,
00:54:41
home address, where your children go to school, in the
00:54:42
systems that have to be GDPR compliant, or whatever
00:54:43
our client's requirement is, or that I'm about to help
00:54:45
them set their risk appetite for. And the first
00:54:47
thing I notice is that attitude of, well, that's a business thing.
00:54:50
That's not about me.
00:54:51
You know, that sense of "I've got skin in the game" is not there,
00:54:59
and I'm just shocked, and they're looking at me like, what,
00:55:00
are you trying to be funny?
00:55:00
I wouldn't put my wife's name, you know, our mortgage,
00:55:05
or where my children
00:55:08
go to school in these business systems.
00:55:10
And you think, well, you don't get it.
00:55:12
And I still see this idea that it's not personal. I don't
00:55:19
know about you, Joe, but I go to security events.
00:55:23
I go to about 30 to 50 a year, seriously, several a
00:55:25
month, and I go and listen to people talk about it, talk about
00:55:26
it, talk about it.
00:55:26
But very few of us walk the walk.
00:55:28
Very few people come up, you know, and they talk about the security in
00:55:34
that system over there, and I say, well, what's on your laptop?
00:55:35
Let's start there.
00:55:37
And they go, well, I don't know what's on my laptop, it has
00:55:39
nothing to do with me.
00:55:39
We always project cybersecurity onto another system but we
00:55:43
seldom actually practice that.
00:55:46
And what I see lacking in cybersecurity professionals,
00:55:48
regardless of whether they're European or American
00:55:51
, is that personal connection of this is my data that we've been
00:55:56
talking about.
00:55:56
One more quick story: every three or four months I have lunch.
00:55:59
So I'm out here in London, and I have lunch with the MD of
00:56:03
the UK's largest seller of firewalls, and I'm not big
00:56:09
on cybersecurity products, as you can tell, but hey, you know,
00:56:11
it's part of the industry and I keep in touch and I like this
00:56:13
guy.
00:56:13
We meet up for lunch every three, four months, once a
00:56:16
quarter or whatnot, anyway, but the last time I met him,
00:56:20
literally earlier in the spring, he walks in and he's red, he's
00:56:22
mad, he's upset.
00:56:23
I said what's wrong, what's wrong?
00:56:25
And he said, somebody hacked into my laptop, and I went, the
00:56:29
irony: you make a very good living selling cybersecurity products and
00:56:32
somebody broke into your laptop. You know, I said, I'm just
00:56:35
having a great day.
00:56:36
I said, well, so what?
00:56:37
I mean,
00:56:37
it happens,
00:56:38
you know, it's risk management.
00:56:40
But now he's upset because he said, yeah, and they took
00:56:44
pictures of my wife and kids on our vacation.
00:56:47
So now he's got this visual somebody looking at pictures of
00:56:52
his wife and kids on a beach someplace, in Ibiza or wherever,
00:56:56
and now it's personal.
00:56:57
And this is a guy in our industry who sells firewalls for
00:57:00
a freaking living and makes a very comfortable living at it,
00:57:04
but now he's upset.
00:57:06
Now he's upset why?
00:57:07
Because it's personal.
00:57:09
Because you know, it's his wife , his kids, and I think that's
00:57:13
our problem.
00:57:13
That's our problem.
00:57:14
That's our problem in the industry: we don't have that connection.
00:57:17
You know, we talk a lot of stuff, but until it's
00:57:21
our data, until it's our hair conditioner...
00:57:24
I'd be upset if somebody knew what conditioner I use.
00:57:27
That's not a big secret.
00:57:29
Anyway.
00:57:31
It's that disconnect, Joe, that I find we're guilty of.
00:57:34
Speaker 1: Yeah, yeah, absolutely.
00:57:36
You know, richard, unfortunately we're at the top
00:57:42
of our time here, but I've really enjoyed our conversation
00:57:43
you know and I really want to have you back on.
00:57:46
Speaker 2: That's kind.
00:57:46
Thanks.
00:57:46
I'd love to be back.
00:57:47
Sorry to get too talky, but, like I said, I'm an old man and
00:57:51
I rant.
00:57:52
Things could be better in our industry.
00:57:54
Anyway, well done on the podcast and keep pushing that
00:57:58
boulder uphill Joe.
00:57:59
I appreciate our time.
00:58:01
I appreciate our chat.
00:58:04
Speaker 1: Let's keep fighting the good fight.
00:58:05
Yeah, absolutely, Richard.
00:58:06
You know, before I let you go, how about you tell my audience
00:58:09
where they could find you if they wanted to connect and where
00:58:11
they could find your company if they wanted to learn more?
00:58:13
Speaker 2: That's great.
00:58:14
I'm on LinkedIn, like everybody else.
00:58:15
I'm on LinkedIn.
00:58:17
Just look for Richard Hollis, h-o-l-l-i-s.
00:58:19
The company I work for is Risk Crew, it's riskcrew.com and,
00:58:25
again, just no products, all services.
00:58:28
But you know, you can certainly see the philosophy in the
00:58:33
company pages, in terms of
00:58:35
process over product and
00:58:37
skin in the game, these things that we've been talking about.
00:58:39
We do try to practice what we preach and the company's a
00:58:43
reflection of that.
00:58:43
Thank you, joe.
00:58:44
By all means, reach out to connect.
00:58:47
If you've got a problem with something I said or a good joke
00:58:51
or a new hair conditioner I could use,
00:58:53
I'd appreciate all three.
00:58:54
Speaker 1: Awesome.
00:58:55
Well, thanks everyone.
00:58:56
I hope you enjoyed this episode.