Discover the intricate dance of cybersecurity and compliance in a world where geopolitical fragmentation is the stage, and Mathieu Gorge, CEO of VigiTrust, is our guide. With the finesse of a seasoned expert, Mathieu navigates through the complexities of maintaining continuous compliance amidst the shifting sands of cyber threats. We delve into a realm where the distinction between the compromised and the unscathed is paper-thin, exploring the sobering implications for companies ensnared in the webs of intellectual property wars. This is a conversation that goes beyond the surface, revealing the stark realities of operating within contentious digital territories.
Join us as we merge the worlds of business strategy and cybersecurity, showcasing how digital skirmishes often foreshadow physical battles. As we dissect the trends reshaping the security culture within organizations, Mathieu illuminates the surprising involvement of private sectors in managing critical infrastructure. We tackle the personal responsibility that comes with the surge of connected devices and the burgeoning industry of hacking as a service, emphasizing the need for continued education to safeguard our digital footprints. Here, the importance of personal vigilance becomes as clear as day.
Looking ahead, we confront the future of AI security, unearthing the challenges in workforce planning as we anticipate the evolution of skillsets and the rise of AI governance. The conversation then pivots to an examination of the robustness of our critical infrastructure in the face of cyber onslaughts, with a spotlight on Europe's pioneering Digital Operational Resilience Act (DORA). Mathieu leaves us with a treasure trove of insights into ransomware intricacies and the criticality of proactive resilience, steering listeners toward resources that bridge the gap between cybersecurity concerns and corporate leadership.
Affiliate Links:
NordVPN: https://go.nordvpn.net/aff_c?offer_id=15&aff_id=87753&url_id=902
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Mathieu?
00:00:01
It's really good to have you back on the podcast here.
00:00:04
It's been I mean, it's been probably 18 months since you
00:00:08
were last on the podcast.
00:00:10
I'm really excited, you know, for what we, what we have in
00:00:13
store today.
00:00:15
Speaker 2: Yeah, great.
00:00:15
Thank you very much for having me back.
00:00:17
Time flies when you're having fun, I guess yeah.
00:00:21
Speaker 1: Yeah, it doesn't seem like it was that long ago,
00:00:22
but it's just like one thing after the
00:00:26
next.
00:00:27
You know in our lives where it's like traveling and just
00:00:30
constant, constant, go on different topics.
00:00:35
Speaker 2: Yeah, indeed, and 2023 has been particularly busy
00:00:40
on many, many fronts, everywhere.
00:00:44
Speaker 1: Yeah, absolutely. So, you know, before we dive into a
00:00:48
2023 recap.
00:00:50
Why don't you tell my audience you know who you are, what your
00:00:55
expertise is and all that good information?
00:00:57
Right, because you know, maybe maybe there's some new listeners
00:01:01
that haven't heard you before and I just want to make sure
00:01:03
that everyone knows.
00:01:04
You know who you are and what you provide in the field.
00:01:08
Speaker 2: Yeah, sure.
00:01:08
So my name is Mathieu Gorge.
00:01:11
I'm the founder and CEO of VigiTrust.
00:01:13
We're a software provider of GRC solutions and we've got an
00:01:20
award-winning solution called VigiOne that allows you to prepare
00:01:23
for, validate and manage continuous compliance with about
00:01:26
a hundred security frameworks worldwide, specifically anything
00:01:30
that has to do with data privacy, information governance
00:01:33
and compliance.
00:01:33
So, as you can imagine, all the usual suspects: PCI, HIPAA, GDPR,
00:01:39
NIST, ISO and so on. I've been in cybersecurity for
00:01:45
longer than I care to admit, probably about 25 years.
00:01:49
I started when cyber was not called cyber, it was called
00:01:52
network security, and then it became content security,
00:01:56
internet security, data security, privacy, and now we're in the
00:02:01
area of global compliance and global security.
00:02:05
I'm involved with a number of security think tanks, including
00:02:08
the VigiTrust Global Advisory Board, which is a not-for-profit
00:02:13
think tank with members from about 30 countries, and
00:02:19
we talk about what's happening in the industry under Chatham
00:02:21
House rules.
00:02:22
One of the things I would say straight off is my view that, if
00:02:28
you work in security and you do your job
00:02:32
correctly, nobody knows your name, but if something goes
00:02:35
wrong, you become public enemy number one very quickly within
00:02:38
the organization and it carries a stigma in the industry moving
00:02:43
forward, and I think that as a community, we need to look after
00:02:45
each other and we need to make sure that we share best
00:02:49
practices, not just by saying this is what you should do, but
00:02:52
also saying you know what?
00:02:54
This is where I make mistakes.
00:02:55
I'm going to share that with you so that you don't have to
00:02:57
make my mistakes, and hopefully you will share your mistakes
00:02:59
with me so I don't have to make them.
00:03:04
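Mathieu's description of "continuous compliance" with about a hundred frameworks boils down to one data problem: a single internal control often satisfies requirements in several frameworks at once, so evidence is gathered once and reported many times. A minimal sketch of that mapping in Python follows; all control names and clause references are illustrative assumptions, not any vendor's actual schema:

```python
# Minimal sketch of continuous-compliance mapping: one internal
# control can satisfy requirements in several frameworks at once.
# All names and clause references here are illustrative only.

CONTROL_TO_FRAMEWORKS = {
    "encrypt-data-at-rest":     {"PCI DSS 3.5", "HIPAA 164.312(a)(2)(iv)", "ISO 27001 A.8.24"},
    "quarterly-access-review":  {"PCI DSS 7.2", "SOX ITGC", "ISO 27001 A.5.18"},
    "incident-response-plan":   {"GDPR Art. 33", "NIST CSF RS.RP", "ISO 27001 A.5.24"},
}

def coverage_report(implemented_controls):
    """Return which framework requirements are covered, and which controls are still missing."""
    covered, missing = set(), []
    for control, requirements in CONTROL_TO_FRAMEWORKS.items():
        if control in implemented_controls:
            covered |= requirements
        else:
            missing.append(control)
    return covered, missing

covered, missing = coverage_report({"encrypt-data-at-rest", "incident-response-plan"})
print(sorted(covered))
print(missing)  # controls still to implement: ['quarterly-access-review']
```

The point of the sketch is the many-to-many shape of the data: compliance drift ("your ecosystem has evolved") shows up as controls dropping out of the implemented set, which immediately flags every affected framework.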
Speaker 1: Yeah, that's a really good point and I think that
00:03:06
that kind of experience is often overlooked.
00:03:10
I used to work for a company and they were bringing in a new
00:03:14
VP of security and he was recently at, I think, two or
00:03:20
three other places back to back that were breached, and they
00:03:23
were big, huge breaches, like the target breach and a couple
00:03:27
other ones like, I think, Home Depot.
00:03:30
It could have been a very unfortunate situation, right,
00:03:33
like this guy just came into the role, they get breached and
00:03:39
it's pinned on him and whatnot, right.
00:03:41
But everyone internally was like, ooh, are we sure we want
00:03:46
to hire that person?
00:03:47
And the only person that actually stood up for him was
00:03:52
the senior director that would be reporting to him and said,
00:03:53
you know like, well, why wouldn't we want that experience
00:03:57
in house?
00:03:58
We haven't been breached, he's gone through it, he knows what
00:04:02
happens, what can happen and how to handle that situation.
00:04:07
Speaker 2: Yeah, and you know, the reality is that there's only
00:04:10
two types of companies out there: the ones that have been
00:04:13
breached and the ones that don't know they've been breached.
00:04:16
And it's nearly better to understand that you have been
00:04:20
breached so you can better the systems, the processes, the
00:04:22
security awareness, the culture of security, so that you can
00:04:27
make that an ongoing journey.
00:04:28
I always say that security is a journey and not a destination.
00:04:34
So by the time you reach compliance with regulation one,
00:04:37
two, three or XYZ, your ecosystem has evolved.
00:04:39
You know, maybe people have left, maybe you've acquired a
00:04:44
business, maybe there's a new system that has been rolled out,
00:04:47
and so your risk surface changes all the time.
00:04:52
And so to say that we are secure right now, you might be secure
00:04:55
for a millisecond, but everything is dynamic, and so
00:05:00
you need to work with people that understand that, and
00:05:01
somebody that's already dealt with a breach most likely will
00:05:05
understand it better than somebody that hasn't.
00:05:08
That doesn't mean that the scales are not equal.
00:05:11
I'm just saying that having to deal with a breach from a PR
00:05:15
perspective, a technology perspective, a legal perspective
00:05:18
, you know, an internal perspective is something that,
00:05:24
unless you've lived it, it's difficult to grasp.
00:05:26
Now you can get amazing training for it and you will be
00:05:32
much better at dealing with it if you've had the training.
00:05:35
But unfortunately you won't really grasp it until it happens
00:05:40
to you, if that makes sense.
00:05:44
Speaker 1: Yeah, that's a really good point.
00:05:45
You know you bring up there's two kinds of companies companies
00:05:48
that know that they've been breached and ones that don't
00:05:50
know that they've been breached yet.
00:05:51
Right, I do have an interesting question.
00:05:57
That's probably a loaded question.
00:05:59
When companies have, you know, subsidiaries or, let's say,
00:06:04
branches in China or more adversarial countries, right,
00:06:13
do you assume that they're already breached, and that it's
00:06:17
more of an internal breach at that point?
00:06:19
Does that make sense?
00:06:21
Because with, say, China and Russia, just to throw out a couple
00:06:26
of adversarial countries out there, how they typically operate is
00:06:31
that when you operate in their country, they own all the IP
00:06:35
that you create there.
00:06:36
So, do you see it that way or not necessarily?
00:06:40
Speaker 2: So I don't necessarily think that they've
00:06:42
been breached or spied on, but what I would say is that you
00:06:48
need to understand your ecosystem and what I mean by
00:06:50
your ecosystem is anything that's behind your firewalls,
00:06:54
your hybrid workforce, your applications, your third parties
00:06:57
, fourth parties, anybody that interacts with your systems,
00:07:03
even from time to time, even sporadically and if some of
00:07:08
those subsidiaries or branches or people are based in a country
00:07:12
that is at risk, then you need to run a tabletop exercise as to
00:07:17
what it would mean if you could no longer get to that data, if
00:07:22
you could no longer get the physical assets, the hardware,
00:07:25
for instance, back into your own country, if you could no longer
00:07:30
talk to the regulator, the local regulator, because what
00:07:34
might happen, and that happened specifically with Russia and
00:07:37
Ukraine, is that from one day to the next, it suddenly became
00:07:43
super difficult to get your data and even if you have a backup
00:07:46
of your data, because of nationalization of Western
00:07:50
assets in Russia, for instance, you will never get that data
00:07:53
back and you can be sure that at that stage, that data is going
00:07:56
to be analyzed.
00:07:57
So there are currently about 10 plus really bad conflicts
00:08:06
worldwide and of course, you hear about Russia and Ukraine,
00:08:08
and you hear about Israel and Gaza, but there are a few others
00:08:11
that don't really make the headlines the same way.
00:08:14
You need to map out where you do business and the impact on
00:08:19
your business and I think that's where the boards are really
00:08:25
starting to wake up suddenly in 2023 and into 2024, that you
00:08:30
can't just assume that because you do business in a country
00:08:34
right now and it's all solid, there's the right policies and
00:08:39
the right backups and so on, that you don't have to actually
00:08:43
plan for the worst.
00:08:44
And I believe that, as humans, we are generally optimistic
00:08:51
thinkers, sometimes too much, and it's a case of understanding
00:08:57
what am I ready to lose?
00:08:58
Have I trained for that?
00:09:00
Because, as I said, if you've trained for it, even if you
00:09:04
haven't really experienced it, but if you've done a tabletop
00:09:07
exercise as to oh, from tomorrow onwards, we can no longer do
00:09:11
business in Taiwan, you know what that means and you've
00:09:14
prepared for it.
00:09:15
And I think that it leads me to the World Economic Forum's
00:09:20
Global Cybersecurity Outlook 2023 report, where, essentially,
00:09:25
they list all of the top risks that organizations need to deal
00:09:30
with, and the first risk is not the advance of AI, it's not
00:09:34
the rise in security breaches, it's geopolitical
00:09:40
fragmentation.
00:09:41
So, in other words, what's happening in countries where you
00:09:44
may or may not do business will actually impact your business.
00:09:47
Again.
00:09:48
I go back to Russia invading Ukraine.
00:09:51
You can see a huge rise in ransomware attacks coming from
00:09:56
Russia into countries that are openly supporting Ukraine.
00:10:00
It's reasonably well documented
00:10:01
that there were a number of
00:10:06
critical infrastructure attacks on the Ukrainian critical
00:10:10
infrastructure assets in the nine months leading up to the
00:10:14
physical attack and then that kind of dropped about two weeks
00:10:17
before the physical attack and now it's back up and so we are
00:10:22
monitoring, as an industry and threat intelligence then the
00:10:25
countries that are getting the most attacks right now, because
00:10:29
it could be it's not guaranteed, but it could be a sign of
00:10:33
physical attacks.
00:10:34
And I think that you know I'm not telling anyone to forget
00:10:38
about privacy and so on, absolutely not.
00:10:40
You need to continue working on that.
00:10:42
But I do think that right now is a good time to go back to
00:10:47
basics and to say what is my ecosystem?
00:10:50
What am I protecting?
00:10:51
What am I willing to lose in 2024?
00:10:54
What can I absolutely not afford to lose in 2024?
00:10:59
And that will drive your threat intelligence and your
00:11:04
protection strategy into the new year, I guess.
00:11:08
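The ecosystem-mapping exercise described above (inventory what you have, where it sits, and what you absolutely cannot afford to lose) can be sketched as a trivial script. The country risk scores, asset list, and threshold below are made-up illustrations, not real threat intelligence:

```python
# Sketch of mapping assets against geopolitical risk to pick
# tabletop-exercise candidates. All scores and assets are
# hypothetical illustrations, not real data.

COUNTRY_RISK = {"US": 1, "Ireland": 1, "Taiwan": 3, "Russia": 5}

ASSETS = [
    {"name": "customer-db-backup", "country": "Ireland", "criticality": "cannot-lose"},
    {"name": "dev-subsidiary-ip",  "country": "Russia",  "criticality": "cannot-lose"},
    {"name": "chip-supplier-data", "country": "Taiwan",  "criticality": "can-lose"},
]

def tabletop_candidates(assets, risk_threshold=3):
    """Flag assets in at-risk countries, highest country risk first."""
    flagged = []
    for asset in assets:
        # Unknown countries get a moderate default rather than zero risk.
        risk = COUNTRY_RISK.get(asset["country"], 2)
        if risk >= risk_threshold:
            flagged.append((asset["name"], risk, asset["criticality"]))
    return sorted(flagged, key=lambda item: -item[1])

for name, risk, criticality in tabletop_candidates(ASSETS):
    print(f"{name}: country risk {risk}, {criticality}")
```

Anything flagged "cannot-lose" in a high-risk country is exactly the "from tomorrow onwards, we can no longer do business in Taiwan" scenario worth rehearsing before it happens.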
Speaker 1: Yeah, it makes a lot of sense.
00:11:09
You know, I actually I took part in a tabletop exercise
00:11:15
before and those are extremely good at identifying the areas of
00:11:21
improvement.
00:11:21
It's really interesting.
00:11:23
You know, they'll come up with a different scenario and you
00:11:26
have to work through it and everyone on the call, you know,
00:11:29
has a role.
00:11:30
I've seen it from both ends, where everyone knew what they
00:11:33
were doing and then the other side was, you know, no one
00:11:36
really knew what they were doing and, you know, in this tabletop
00:11:39
exercise, the company was breached for an entire week
00:11:43
before security even knew about it, right, and it's a really
00:11:50
good tool that organizations should, and typically do, use to
00:11:55
really, you know, identify those gaps and actually it's
00:11:59
really important to shore them up once you identify them.
00:12:01
You know, now, looking back into 2023, what were some of
00:12:09
your top items I guess that happened in 2023 that you think
00:12:14
may be setting the stage for 2024?
00:12:19
Speaker 2: Well, from a technical perspective, the rise
00:12:22
of ransomware attacks, the scaling in the number of attacks
00:12:31
against CEOs and the C-suite
00:12:34
from a social engineering perspective; that was extremely
00:12:38
visible.
00:12:38
What we saw as well is a number of key executives being
00:12:43
prosecuted and, in very limited cases, being jailed for not
00:12:50
doing the right thing with regards to privacy, and that can
00:12:55
be a game changer.
00:12:56
We saw a number of new regulations coming out in
00:13:03
specific areas and we saw, obviously, the advent of NIS2
00:13:08
in Europe with regards to critical infrastructure
00:13:10
protection.
00:13:11
We saw a number of new data privacy regulations in I think
00:13:16
about six states in the US, which is good, but they're not
00:13:20
all exactly going in the same direction.
00:13:22
So I think we unfortunately are still a long way away from
00:13:25
a federal equivalent of GDPR.
00:13:28
So we obviously saw the Ukrainian-Russian conflict going
00:13:37
on and the impact of that.
00:13:41
We now have the conflict between Israel and Gaza and, ironically
00:13:47
, a lot of cybersecurity funding comes out of Israel every year,
00:13:51
not just to the US, but also to Europe and to Asia, and that
00:13:56
funding is probably going to slow down, meaning less money to
00:14:00
invest in cyber, also meaning more attacks on Israel.
00:14:04
Also meaning potentially another equivalent of shadow IT coming
00:14:09
out of Gaza and people supporting Gaza.
00:14:14
So it's a very dynamic environment.
00:14:17
But we also saw a rise in that idea of security culture and
00:14:25
that is mentioned as well in the World Economic Forum report
00:14:29
where we see more and more business people trying to engage
00:14:33
with security and compliance people to understand what they
00:14:37
can and cannot do and for them to work together.
00:14:39
And then we go back to what we were saying at the beginning,
00:14:42
that if you work in security, nobody wants to talk to you and,
00:14:46
generally speaking, it's because you're either telling
00:14:49
the business no, you can't do this because you're going to put
00:14:51
us out of compliance, or you're going to increase our risk
00:14:55
surface beyond what we can accept, or you're like, hey, you
00:15:00
go to the board and say, hey, the business wants me to do that
00:15:02
, can I get another million dollars to make it happen
00:15:05
securely?
00:15:05
So it's a difficult one.
00:15:07
But now we're seeing that trend where more and more business
00:15:11
people are talking to security and compliance and we're going
00:15:14
to see a bit more of that in 2024, into the next two to three
00:15:18
years.
00:15:21
Speaker 1: Yeah, it's a lot to unpack there.
00:15:24
One of the things that you brought up previously that I
00:15:28
have talked about is the fact that now we're seeing a lot of
00:15:34
digital attacks or cyber warfare attacks, before kinetic attacks
00:15:39
ever take place.
00:15:40
Do you see that ramping up at all?
00:15:46
Because I feel like there should almost be, like you know,
00:15:51
a watch group that is saying like, oh, we're seeing an
00:15:54
increase in, you know, specialized attacks in, you know, Europe or
00:15:59
wherever it might be, and you know, kind of like, put out the
00:16:03
watch on that.
00:16:03
Because I feel like everyone in cybersecurity is aware of that,
00:16:07
they understand that and they know the implications of that.
00:16:11
But it's much more difficult to get people outside of
00:16:16
cybersecurity to fully grasp the concept of, oh, like, they're
00:16:21
going to take down my phone network before they, you know,
00:16:26
send troops in, right?
00:16:28
Speaker 2: Well, so you know, the issue with critical
00:16:31
infrastructure assets is that as citizens, as Joe public, we
00:16:36
believe that this is the responsibility of the government
00:16:39
, and what we do not understand is that, depending on the survey
00:16:43
you look at, generally speaking, between 70 and
00:16:46
80% of critical infrastructure assets like electricity, power,
00:16:52
food, transportation and so on are actually owned and
00:16:58
operated by the commercial sector, by private companies,
00:17:04
and the part that is actually managed purely by the government
00:17:08
is, generally speaking, only the army and the police systems,
00:17:14
because even hospitals and specifically in the US, you know
00:17:19
, half of the hospital systems are actually private systems. Not
00:17:23
so much in Europe, but parts are still actually private.
00:17:28
And so what you want to do is you want to raise the awareness
00:17:33
level with Joe Public, that everything starts with them.
00:17:37
And which actually leads me on to another.
00:17:40
I suppose another issue here and we are seeing that more and
00:17:47
more over the last few years is that concept of your own
00:17:50
critical infrastructure.
00:17:51
So right now, most of you have three, four connected devices on
00:17:57
you, you know, a smartwatch and maybe a personal cell phone, a
00:18:01
business cell phone and an iPad or whatever, and that's before
00:18:03
you even get into your car, which is completely connected,
00:18:06
and then you get to your house and so on.
00:18:08
And so if I educate you, either as the industry and/or the
00:18:15
government, as to the value of that, I can say, well, if
00:18:19
you take care and if you are careful, nobody's going to
00:18:23
be able to drive by and order whatever they want by hacking
00:18:28
into your phone that is linked to your fridge, that has a
00:18:33
system that allows you to connect to, to Walmart or
00:18:37
wherever, to replenish everything, and now I can buy
00:18:40
different things and get them sent to my home instead of yours
00:18:43
, and that is the problem.
00:18:45
But it's not life-threatening.
00:18:46
But let's say I hack into your HVAC system, your air
00:18:51
conditioning system, where it depends on where you live, but
00:18:53
like, if you live in Michigan in the middle of summer and you
00:18:57
can't get cool air, or in the middle of winter and you can't get
00:19:02
heating, that will become critical.
00:19:04
And so I think that what we need to do is connect the two: if
00:19:08
people do that at home, they're more likely to pay attention at
00:19:11
work, and vice versa.
00:19:12
So it needs to be a continuous cycle of educating them on both
00:19:16
sides.
00:19:18
I do think that again it goes back to that idea of your
00:19:23
risk surface.
00:19:24
So my risk surface before, when I was walking, was just social
00:19:29
engineering my watch was not connected, I didn't have a cell
00:19:32
phone, or my cell phone was so dumb that you couldn't even hack
00:19:35
into it.
00:19:36
Now I'm like a walking attack surface and everywhere I go it
00:19:44
keeps growing.
00:19:44
So I need to, I need to train people to understand hey, do I
00:19:49
really need that connected wallet?
00:19:52
Do I really need this?
00:19:55
Do I really need that?
00:19:55
Do the benefits outweigh the risks?
00:19:59
So if I have a connected wallet and I lose it, I can connect to
00:20:03
it.
00:20:03
That's great, I can see where it is.
00:20:05
But if, for some reason, there are only default
00:20:11
settings on it, I might be able to connect to your
00:20:14
wallet, and then, once you're home, I use the wallet to
00:20:18
piggyback onto your computers, and then to piggyback
00:20:21
onto your VPNs that go from your computer to your workplace.
00:20:25
You can see where I'm going and it's not that far fetched, to
00:20:28
be honest.
00:20:29
I mean, I'm not that technical and I think I could.
00:20:32
I could do a demonstration reasonably easily.
00:20:34
Not that I would do that, by the way.
00:20:38
Speaker 1: Yeah, it's.
00:20:39
It's actually a lot easier than what people would assume, in my
00:20:43
opinion.
00:20:44
You know, like I, I'm not a hacker by any means, and I could
00:20:47
absolutely pull something off like that, especially in 2023,
00:20:53
where these exploits and packages are kind of already
00:20:55
pre-made and you just kind of find the right one and get in.
00:21:00
Speaker 2: You raised a good point.
00:21:01
You know, 2023 has seen a huge increase in hacking as a
00:21:08
service where you go to the deep web and it's not even digging
00:21:12
too deep and you can buy a kit where you create your own
00:21:17
ransomware or your own DDoS, and you literally configure
00:21:22
it the same way as you configure your iPhone.
00:21:25
And you know, for some of them, they actually have a customer
00:21:28
service line where they provide better customer service than
00:21:33
normal companies, and so I think that you know the level of
00:21:38
skills that an attacker needs to have keeps going down whilst
00:21:42
the attack surface keeps going up, and so you can
00:21:46
see that's creating a huge vacuum, and as an industry, we
00:21:50
need to work together, and I think that I applaud all of the
00:21:55
work that's been done in 2023 around teaching kids how to code
00:22:01
, teaching them cybersecurity or the sense of security from
00:22:05
primary school up to, you know, up to college, because if we
00:22:09
don't catch them now, they're gonna be our next security
00:22:14
people or our next head of IT or a head of database in like five
00:22:18
years or 10 years, and they're just gonna be walking targets
00:22:22
with my name on it you know at the back, my company name, and
00:22:25
so I don't want that to happen.
00:22:27
So not only do I have a duty, but there's
00:22:29
definitely something in it for me to do that,
00:22:32
which actually leads me on to another point.
00:22:38
I'm in the process of writing my second book around the life of
00:22:42
CSOs, but not, generally speaking, around,
00:22:46
you know, the certifications that they have. I asked them
00:22:51
all the same 15 questions in the same order, about work-life
00:22:54
balance and the threats that they see out there.
00:22:56
And one very interesting question that I asked them is do
00:23:01
you think we are creating the right succession plan for when
00:23:05
you get out of the industry?
00:23:06
Either you're gonna go do something else or, you know some
00:23:09
of you have been in IT or in cyber for 20 years.
00:23:12
Maybe you're gonna want to retire.
00:23:14
How do we extract the level of experience that you have so that
00:23:18
we can document it and pass it on to new people, and are we
00:23:23
actually creating people with the right skills?
00:23:25
Are the curricula out there too outdated for the new threats?
00:23:32
And I see a divide.
00:23:35
So I set out to do a hundred interviews.
00:23:38
I'm about three quarters into it right now, but I see a divide.
00:23:43
Some people say, no, actually we are doing the right thing.
00:23:45
Others are saying, I don't think we are, and mostly I don't think
00:23:49
that we have the ability to pass on our knowledge, which I
00:23:55
think is an interesting point Because if you think about it,
00:23:59
you know people that became network security managers in and
00:24:04
around early 2000s would have had maybe five to 10 years
00:24:09
experience in IT already.
00:24:10
So these people are all coming up to retirement in the next
00:24:14
five to 10 years.
00:24:15
So we're gonna have that cliff of skills going down.
00:24:19
I'm not saying that new people don't have skills.
00:24:21
Some of them absolutely do, and in fact they're probably faster
00:24:26
at some things than we oldies are.
00:24:27
You know, we take time to process, but we understand the
00:24:31
value of process, whereas younger generations, they want
00:24:35
everything faster because they never grew up with the idea of,
00:24:39
you know, waiting for a file to download.
00:24:41
That's unknown to them.
00:24:42
So why would you wait five days to do the right thing, to find
00:24:47
out where the breach came from?
00:24:48
You have a hunch, you go after it, and in going after it
00:24:54
quickly you actually destroy all the legal evidence that an
00:24:58
older person would have found within 10 days but would have
00:25:01
been able to use.
00:25:02
I think we have a bit of a challenge there in the next five
00:25:05
years five to 10 years.
00:25:10
Speaker 1: Yeah, everyone always talks about the talent shortage
00:25:14
or the talent gap, and not a lot of people are bringing up
00:25:18
the fact that a lot of the people that are in leadership
00:25:22
roles or have been, you know, very experienced in their job
00:25:27
for the past you know 10, 15 years they're all retiring
00:25:30
fairly soon.
00:25:31
You know, I actually got brought on at a company to
00:25:35
replace someone as their security expert that had been at
00:25:40
the company for 25 plus years.
00:25:42
They were retiring in the next, you know, six or nine months,
00:25:46
something like that, and you know that knowledge dump right
00:25:52
that we had to go through I mean , it's every day for you know
00:25:55
nine months.
00:25:56
Why did you make this choice?
00:25:57
What was this situation?
00:25:58
Who did you work with on this?
00:26:00
Who do you trust within the organization?
00:26:02
All of those sorts of things, you know.
00:26:06
And now this company that I came and worked for, they had a very
00:26:11
forward thinking view.
00:26:12
You know they were very good at thinking ahead, planning ahead,
00:26:17
and so things like that were always on their roadmap of who's
00:26:21
retiring when.
00:26:22
What skill sets do we have to pick up?
00:26:24
What skill sets do you know we need to augment and replace and
00:26:28
things like that.
00:26:28
But not every organization is thinking like that.
00:26:32
That's a huge challenge that's gonna be coming up very shortly.
00:26:36
It's almost like a different.
00:26:38
It's almost like a different problem from the talent shortage
00:26:41
that we already have.
00:26:44
Speaker 2: Yeah, and I think you know there are a lot of
00:26:48
talented people that are coming on the market.
00:26:50
That's not exactly the problem.
00:26:52
The problem is did we give them , as an industry, the right
00:26:58
pointers so that they can either learn what we need right now or
00:27:02
have a basis that's good enough that they can be molded into
00:27:08
what we need?
00:27:08
Because obviously there's no point in creating an expert in
00:27:13
forensics if we have enough people in forensics.
00:27:16
But equally, if you know forensics really well, you'll be
00:27:20
able to add value in incident response, in purple team and so
00:27:23
on, so you'll be able to reshape your knowledge.
00:27:26
But I think the worry is more about are we creating people
00:27:32
that have too narrow of a scope and that scope is valid today
00:27:36
but may not be valid tomorrow, and will they manage to retrain?
00:27:40
I'll give you an example very topical.
00:27:47
Let's go back to 2018 for just one second.
00:27:50
2018, gdpr was enacted and so, overnight, millions of people
00:27:59
were GDPR experts.
00:28:00
They just added that to their LinkedIn or their resumes or
00:28:03
whatever.
00:28:04
Today, everybody is an AI expert and, more worryingly, a
00:28:09
lot of people are AI security experts.
00:28:11
So that's great.
00:28:14
At least there's an interest.
00:28:16
But the challenge is not really just in AI security, as in
00:28:20
securing the code and securing the LLMs and so on, because
00:28:24
there's emerging technology on that.
00:28:27
It's about AI governance, and there are very few real AI
00:28:33
governance courses out there that allow you to grasp the real
00:28:39
risks and the way to embrace AI in a way that allows you to
00:28:44
govern the process and to deal with issues, and so what I
00:28:48
wouldn't want to see is tens of thousands of AI cyber experts
00:28:56
being born over the next 12 months and they're actually not
00:29:00
trained the right way and they actually add no value, but they
00:29:03
think that they're going to be able to get jobs, because
00:29:05
they're probably not going to be able to get the jobs they want
00:29:09
and they may actually not add value or not add as much value.
00:29:12
So I think it's really important for us, as the industry, to
00:29:16
work with third-level universities, to go and do guest
00:29:20
lectures.
00:29:20
I do guest lectures for various universities.
00:29:23
It allows me to keep my finger on the pulse, to understand how
00:29:27
younger people think, what they want to learn, the questions
00:29:30
that they ask and so on, as opposed to saying oh, the next
00:29:33
big thing is AI risk management.
00:29:38
Maybe it is, maybe it isn't.
00:29:39
I mean, the thing with AI is we don't exactly know as an
00:29:42
industry, and anybody that tells you they know, take it with a
00:29:47
pinch of salt, because it's such a fast-moving target that we
00:29:51
don't exactly know just yet.
00:29:55
Speaker 1: Yeah, that's a really good point that you bring up.
00:29:57
It's a lot easier to add these key terms to your LinkedIn or to
00:30:03
your resume than it is to actually create the skills and
00:30:07
get the skills that are needed to actually fulfill that AI
00:30:11
security title.
00:30:12
I feel like the only people that they're harming are
00:30:19
themselves.
00:30:19
Because they get a job.
00:30:21
Maybe they fool someone at the job because they know a little
00:30:25
bit more than what the person interviewing them does, so they
00:30:30
get the job and then they get that job and they fail at it.
00:30:33
It's just one failure after the next and they're constantly
00:30:36
trying to play catch up, especially in an advanced area
00:30:39
like AI security.
00:30:41
That really isn't even defined right now.
00:30:44
One, we don't know where AI is going.
00:30:48
Two, AI security is something that we're just starting to talk
00:30:53
about now.
00:30:54
Speaker 2: Right and I think it's great to have an interest
00:30:57
in AI.
00:30:58
It's great to understand ChatGPT, but AI is not ChatGPT.
00:31:04
It's way bigger than that.
00:31:05
I think that right now there's good expertise in the market
00:31:14
around the data that you can feed AI and the risks that you
00:31:18
take and how to mitigate those risks and how to classify the
00:31:22
data and maybe have a filter and train people and so on, but in
00:31:26
terms of the full architecture of AI, the coding that goes in
00:31:32
the AI, coding that goes into your standard code, and how to
00:31:36
keep track of that and actually manage that process, it's still
00:31:42
early days.
00:31:43
Now, that said, in the last two years there have been about 35 new
00:31:49
AI-related regulations and standards that came out.
00:31:52
There's been stuff like, for instance, the EU AI Act.
00:31:59
There's been other things coming out from the industry and
00:32:03
it reminds me of the beginning of the cybersecurity industry
00:32:06
where, believe it or not, back in 2005, there were a lot of
00:32:13
industry standards that came out.
00:32:15
Some of them were driven by vendors, some of them were driven
00:32:19
by associations and so on, and we're seeing that right now.
00:32:22
But you have to remember that if you dial back today,
00:32:27
according to the UCF, the Unified Compliance Framework,
00:32:29
there's about four and a half thousand regulations around
00:32:33
privacy, data and security, but the reality is they all dial
00:32:37
back to about 20, and then when you look at those 20, they
00:32:41
really dial back to ISO, NIST, CIS, GDPR, potentially PCI as a
00:32:49
restricted one, and a few on the software security side.
00:32:52
So it's very likely that we will have the same with regards
00:32:56
to AI.
00:32:56
So I would keep a watch on that if I was interested in working
00:33:01
in risk management for AI.
00:33:05
Speaker 1: So where do you think?
00:33:06
What are some key areas that you think are going to be really
00:33:12
booming, that people need to pay attention to in 2024?
00:33:17
Speaker 2: I definitely think we're going to see some attacks
00:33:20
on personal infrastructure.
00:33:22
So there are already vendors coming out with ways to help you
00:33:28
secure your infrastructure at home all of your stuff that's
00:33:31
connected.
00:33:32
I think we're going to continue to see ransomware.
00:33:37
I have absolutely no doubt there's going to be a few new
00:33:41
zero day attacks every year.
00:33:43
That's what happens.
00:33:45
We are seeing, as always, attacks on government, but
00:33:53
mostly financial institutions.
00:33:54
It's also interesting to see what's happening in the UK with
00:33:58
regards to PSD3, and everything that has to do with
00:34:02
authentication and strong authentication and identification.
00:34:06
So I would suspect there's going to be continued investment
00:34:09
in that.
00:34:09
I think we're also going to see ridiculous things being
00:34:15
connected.
00:34:16
I heard that example the other day of a vacuum cleaner,
00:34:23
completely connected, that actually goes and vacuums on a
00:34:27
regular basis but actually maps out your property.
00:34:29
So now you know that Mathieu has a two bedroom or three
00:34:33
bedroom apartment on one floor or two floors, can you imagine
00:34:37
where this is going?
00:34:40
I do believe that a number of attacks are going to be
00:34:43
automated, but I also do believe that a number of counterattacks
00:34:48
are going to be automated using AI.
00:34:50
That's the good side of AI.
00:34:51
That's good because, whilst a system is able to deal with the
00:34:58
noise, the actual analysts can deal with the real attacks or the
00:35:02
attacks that require more thinking.
00:35:04
We are going to see more regulation, of course.
00:35:10
We're going to see some new AI regulation.
00:35:13
We're going to see some updates to EU GDPR.
00:35:16
There's a chance that the UK is going to lose their adequacy
00:35:21
because the UK GDPR currently is recognized as being equivalent
00:35:28
to European GDPR, but the ICO, the Information Commissioner's
00:35:32
Office, has already taken steps to go a different direction than
00:35:36
the European Data Protection Board.
00:35:38
So if they lose their adequacy, that will mean that from an EU
00:35:41
perspective, transferring data to the UK will be the same as
00:35:45
transferring it to the US or Mexico or Australia, and you can
00:35:49
see the evolution of that.
00:35:51
So are we going to see a digital Pearl Harbor, like we
00:35:57
all think might happen at some stage?
00:35:58
I don't know that 2024 is the right year for that, but I do
00:36:03
believe that the geopolitical fragmentation is not going to go
00:36:06
away and we're just going to have to learn how to deal with
00:36:11
it.
00:36:11
So I wouldn't be surprised if people offering red teaming and
00:36:17
purple teaming and tabletop exercises will make a fortune
00:36:22
in 2024.
00:36:23
And it probably wouldn't be a bad thing for the industry.
00:36:30
Speaker 1: So what are some areas that our current AI policy
00:36:35
and governance is lacking in?
00:36:37
Because I feel like this field is advancing pretty rapidly and,
00:36:44
per usual, the governance of it and the policy behind it is
00:36:50
lagging behind.
00:36:51
So what are some areas where we need to pick up the pace?
00:36:56
Speaker 2: Well, there are a number of best practices and
00:36:59
checklists that are available.
00:37:00
So the IAPP, the International Association of Privacy
00:37:05
Professionals, came out this year with a very good document that
00:37:12
has, I think, about 65 keywords and key topics that you need to
00:37:16
look at in your AI initiatives from a technical and a policy
00:37:21
and a training perspective.
00:37:22
So you're going to see more of that.
00:37:25
There are, as I said, a number of vendors coming out with
00:37:29
interesting technology about how to make sure that whatever you
00:37:35
do using AI doesn't actually impact the generic code of
00:37:39
your software.
00:37:40
So I think we're going to see some more of that.
00:37:44
I think what we need is like an OWASP Top 10 and a SANS Top 25
00:37:51
for AI, and it's coming.
00:37:53
I think there are a few out there that are just industry
00:37:56
driven, but there's going to be some more.
00:37:58
I would urge people to try and grasp the idea of AI governance.
00:38:07
There are some very good AI governance forums coming out
00:38:12
right now.
00:38:12
I spend a lot of time attending those events and I'm fascinated
00:38:18
at the convergence of the two, cybersecurity and AI, trying to
00:38:27
meet somewhere in the middle.
00:38:28
It's an interesting thing to watch, because cybersecurity at
00:38:33
this stage is very binary: there's a risk or there isn't a risk, we
00:38:37
can mitigate the risk, or we can't, because we understand it
00:38:41
reasonably well but we don't really understand AI.
00:38:43
I think another thing to keep in mind is if you're familiar
00:38:47
with the Cloud Security Alliance, the CSA, they are basically
00:38:54
saying that protecting your AI systems and infrastructure will
00:39:00
follow a similar trajectory to what we've learned about the
00:39:04
Cloud.
00:39:05
Initially, everybody was saying, well, I'm not moving to the
00:39:07
Cloud, too dangerous, I don't know what's there.
00:39:09
Then, eventually, you see, you have no choice but to move some
00:39:13
critical elements of what you do to the Cloud.
00:39:15
But now there's good practice, there's ways to protect it,
00:39:20
there's continuous compliance.
00:39:21
I think that the CSA says that it's going to follow a similar
00:39:27
path, and they may well be right on that.
00:39:30
We're not going to be able to not embrace AI, but we need the
00:39:34
right structure that organizations can use.
00:39:38
If you think about it, very small organizations or mid-sized
00:39:43
organizations are well able to embrace the Cloud now because
00:39:47
there's so much expertise out there.
00:39:49
That's where we need to get to with AI, or at least with
00:39:54
mainstream AI.
00:39:55
I'm not talking about Terminator and that type of
00:39:57
stuff.
00:39:58
I'm talking about what we're trying to do right now, which is
00:40:01
to use AI to automate the mundane and other tasks, that
00:40:09
our time would be better used to do something else.
00:40:13
Speaker 1: Yeah, that makes a lot of sense.
00:40:15
You brought up the prospect of a potential cyber Pearl Harbor
00:40:24
or something like that.
00:40:25
From my perspective, when I think about that, I'm
00:36:36
thinking of an attack that is very large in scale, that
00:40:41
changes the world forever, right in a very tangible way.
00:40:46
Is that how you see it?
00:40:47
What do you think it would take for something like that to
00:40:51
happen, like a power grid going down for a month, or what does
00:40:54
that look like to you?
00:40:57
Speaker 2: Pretty dark, actually, but it could
00:41:02
happen depending on where you're based.
00:41:05
I get up in the morning and I'm happy to be alive, and I'm
00:41:12
happy that I have electricity and I have water and so on, and
00:41:15
I don't want this to change.
00:41:16
And so I believe that also, we've gone from just pure
00:41:27
critical infrastructure protection to critical
00:41:30
infrastructure resilience.
00:41:31
You look at DORA, for instance, for the banking industry in
00:41:35
Europe, the Digital Operational Resilience Act, if I got that
00:41:41
right.
00:41:41
But anyway, DORA is all about making sure your critical
00:41:45
systems are resilient.
00:41:47
So will they go down for a day?
00:41:50
No problem, it'll be a pain, but it's okay.
00:41:53
For a week, it'll be a major pain, but it'll be okay.
00:41:55
For a month?
00:41:56
That will have societal effects, that will have issues with
00:42:02
potentially, after a while, riots and social unrest and so
00:42:06
on, and so we can't really afford to do that.
00:42:08
So it's interesting, that idea.
00:42:14
You know, we all understand now that we need to protect the critical
00:42:17
infrastructure of the cities, of nations.
00:42:22
Now we need to understand that we need to protect our own
00:42:25
critical infrastructure because it's a backdoor to the rest.
00:42:28
But we need to talk about resilience, right? How do I make
00:42:33
my way of living resilient?
00:42:35
How do I make my way of doing business resilient? If my
00:42:40
e-commerce site goes down,
00:42:41
am I out of business?
00:42:43
So, am I 50% out of business, and for how long?
00:42:46
And how long can I sustain that?
00:42:48
And, by the way, that is why some organizations decide to pay
00:42:52
ransoms.
00:42:52
And you should never pay a ransom by default, because if
00:42:57
you pay it, you may not get the right information back or the
00:43:01
key back.
00:43:02
It may not work.
00:43:03
But also, you advertise yourself as somebody who's going
00:43:06
to pay, so you're going to remain a target right.
00:43:08
But the reality is some companies are like ah, do you
00:43:11
know what?
00:43:11
In the grand scheme of things, we're better off paying.
00:43:16
But with critical assets, you can't always think like that,
00:43:21
you know.
00:43:21
So I think we need to move towards resilience.
00:43:25
We've spent enough years developing good risk assessment
00:43:29
methodologies and looking at all of that.
00:43:32
Now we need to get to the next level.
00:43:33
How do I make this a continuous proactive thing and I make my
00:43:37
ecosystem resilient and my staff resilient and myself resilient?
00:43:43
Speaker 1: Hmm, yeah, you bring up a really interesting point
00:43:47
and that is something that I myself even see as being often
00:43:52
overlooked is the resilience factor of deploying this
00:43:58
revenue-generating application that is generating I don't know
00:44:03
a million dollars a day.
00:44:04
Well, what happens if that web app goes down?
00:44:07
You know, do we have HA set up?
00:44:09
Is it failing over to the same location?
00:44:13
Because if it's failing over to the same location, it's
00:44:15
probably not a good idea.
00:44:16
All of these things are often overlooked or put on the back
00:44:21
burner, as in we'll get to it, you know, eventually.
00:44:24
Well, in the meantime, when eventually is coming, you know
00:44:29
you can have an attack that takes it down completely and
00:44:32
it's like oh, that thing that we said we were going to get to
00:44:35
eventually never came because it was already at risk.
00:44:39
You know, well, you know, Mathieu, we're coming to the end
00:44:43
of our time here, unfortunately, but you know, before I let you
00:44:46
go, how about you tell my audience, you know, where they
00:44:49
could find you, where they could find VigiTrust if they want to
00:44:52
learn more.
00:44:54
Speaker 2: Yeah, sure.
00:44:54
So, first of all, thanks again for the opportunity to talk to
00:44:57
you today.
00:44:57
So you can find information about VigiTrust at VigiTrust.com,
00:45:03
V-I-G-I-T-R-U-S-T dot com. You can find information about myself at
00:45:09
MathieuGorge.com, in one word.
00:45:11
I've also published a book called The Cyber Elephant in the
00:45:15
Boardroom, published by Forbes and a bestseller on Amazon, and
00:45:20
you'll find it on Amazon and it's all about translating cyber
00:45:24
risk into business risk, primarily for non-technical
00:45:28
people.
00:45:28
And, of course, I'm very easy to find on LinkedIn and I
00:45:33
actually love networking.
00:45:34
I love meeting people from the industry.
00:45:36
There is not a day that I don't learn something new about cyber,
00:45:40
and I've been at it for 25 years, and it's a great industry
00:45:44
that way.
00:45:46
Speaker 1: Awesome.
00:45:47
Well, thanks, Mathieu.
00:45:48
I really appreciate you coming on and I hope everyone listening
00:45:50
enjoyed this episode.
00:45:51
See you everyone.