Prepare for a thrilling exploration of the cybersecurity world with our extraordinary guest, Aviv Grafi. A cybersecurity expert, founder of a successful tech startup, and a former member of Israel's elite intelligence Unit 8200, Aviv's journey will take you from his early days of hacking and building computer systems to the cutting edge of IT security. Learn from his experiences and discover how an engineering mindset, combined with a passion for problem-solving, can lead to unexpected career paths.
Aviv gives a gripping account of his transition from working in startups to founding his own venture. The highlights of our conversation revolve around hacking prevention, defensive programming, and the pioneering concept of Content Disarm and Reconstruction (CDR) technology. The technology, which creates a virtual shield against malicious software by transferring the content of a document onto a new template, is a fascinating leap in cybersecurity. Watch as he unravels how startups are navigating the complex world of cybersecurity.
As we steer into the intricacies of kernel security, you'll get an in-depth understanding of the challenges faced by tech giants like Microsoft with their longstanding Windows system. Contrast that with Apple's leading security architecture, and it's a captivating exploration of the tech world's giants. We end our journey with a peek into the future of document security - the evolution of Votiro's solution to safeguard users from harmful content and its innovative API-centric platform that integrates seamlessly with popular services. It's all about creating a barrier between the user and the data to ensure a safe digital experience. Join us on this exciting journey through the world of cybersecurity with Aviv!
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Aviv?
00:00:01
It's really good to finally have you on the podcast here.
00:00:04
I've been looking forward to our conversation.
00:00:08
Speaker 2: I think that's really great and thank you very much
00:00:10
for inviting me.
00:00:11
Speaker 1: Yeah, absolutely. So, Aviv.
00:00:13
You know I always start my guests off with how they got
00:00:16
into IT overall, because there's a portion of my audience that
00:00:24
are in that boat where they're trying to, you know, figure out
00:00:28
if this is a path that they can take, if this is something that
00:00:31
they can actually do, make a career transition, you know,
00:00:34
whatever it might be.
00:00:35
And so I think it's very helpful for everyone to hear you
00:00:39
know the background kind of.
00:00:40
You know, maybe they're coming from a similar background and
00:00:43
they, you know, can finally hear like, oh, this is possible.
00:00:47
Speaker 2: Okay, so sure.
00:00:47
So I'm based in Tel Aviv, Israel, just for those who don't
00:00:50
know, and one of the things I can tell you is that I was kind
00:00:54
of interested in how things are working, and really into IT,
00:00:56
probably from when I was really, really young.
00:00:59
So I was tearing apart some you know, computers and some tech
00:01:03
stuff.
00:01:03
I was really into assembling things and creating new things
00:01:08
and I think the most important skill for someone
00:01:11
who's really interested in getting into that, I would say,
00:01:15
area, and the mindset, is: do not be afraid of anything. Just try
00:01:21
and get your hands dirty.
00:01:22
I think that would help actually to build all of the
00:01:24
confidence and the knowledge and experience.
00:01:26
So I think when I was kind of in high school, I was starting
00:01:29
to do that.
00:01:30
I was mainly, you know,
00:01:32
curious about cybersecurity.
00:01:33
Back then it was just, you know, security, and I think one thing
00:01:39
I learned is how to hack some stuff, how to, you know, take
00:01:44
advantage of some websites and stuff like that.
00:01:46
I was kind of doing those little things as a high schooler,
00:01:52
and one thing that happened is, here in Israel you need to do army
00:01:57
service for three years, at least three years.
00:01:58
So I was recruited by the IDF, in fact, specifically for the
00:02:03
intelligence forces in Israel.
00:02:04
It's the 8200 unit, it's like the NSA, and I think this is
00:02:09
where my real security career actually started
00:02:13
to evolve.
00:02:15
Speaker 1: So that's really interesting.
00:02:17
You know you mentioned how you kind of started hacking or
00:02:20
playing around with it in high school.
00:02:23
How did you get that frame of mind to kind of look at these
00:02:29
you know websites and web pages you know in a different way and
00:02:33
try to, you know, exploit them in some way?
00:02:36
Like, how did you develop that mindset?
00:02:38
Because that's a totally different, you know, mindset than
00:02:40
a lot of people have when they're looking at the Internet.
00:02:43
Speaker 2: Yeah, I think it started from the desire to build
00:02:47
things, and actually it started from: how do I achieve this and
00:02:52
that? It started from the kind of engineering mindset.
00:02:55
But then it became: how do I do that in the easiest way, the
00:03:00
cheapest way?
00:03:01
And when you start to do that you actually figure out that you
00:03:04
need to travel through a lot of stuff and sometimes you know
00:03:08
I've heard that saying that a security guy, a pentester, is
00:03:11
actually a QA guy with a lot of motivation, and I think it
00:03:14
started from there, to understand how things are really
00:03:18
working, because you need to fix it.
00:03:21
And when you really start to think about how things are working,
00:03:23
for example, I built my own kind of billing,
00:03:28
like a VoIP voice system, when I was in high school, using
00:03:34
very old computers that I had in the garage in my parents'
00:03:37
house. So, building that, I said, okay, I want to have a way,
00:03:41
when I'm traveling with the family, to have a long distance
00:03:44
call without having all the costs back then.
00:03:45
So it started with solving problems and building those kinds of
00:03:49
hacks and when you really understand how things are
00:03:52
working, for example, in that specific case, how VoIP
00:03:58
teleconference systems are working, you say, oh, so maybe I can
00:04:02
leverage this and that and play with something that is online
00:04:06
out there, and I think that's where the curiosity came.
00:04:10
Speaker 1: So when you made it into Israel's 8200 group,
00:04:15
you know what did you specialize in.
00:04:16
Can you talk a little bit about that?
00:04:18
Speaker 2: Yeah.
00:04:18
So in terms of the process, I was recruited when I was in
00:04:21
high school and went through a series of tests for about like a
00:04:24
year while I was in high school and then I started my career as
00:04:30
doing a course with folks like me for a few months and learning
00:04:35
more about intelligence and complex systems and that kind of
00:04:40
stuff, programming, and then I started on that route of being kind
00:04:46
of a developer at the start, and then being a security
00:04:49
researcher, working on both offense and defense operations.
00:04:54
And I think one of the great things that happened at least to
00:04:58
me, in those almost five years, four or
00:05:02
something years that I was there, is that you learn, you
00:05:06
know, as like a 19- or 20-year-old, I mean, you're pretty naive.
00:05:09
So when someone tells you, go get this thing, you're so
00:05:14
naive, you say, yeah, okay, that's possible.
00:05:15
I guess that's possible.
00:05:18
And there's some magic happening there when you're getting 20-
00:05:21
something-year-olds into that room, that building, that floor,
00:05:25
where things actually happen, things that if you look, I mean,
00:05:30
now, like 20 years after that, looking back, you say, look, those
00:05:33
things are crazy and maybe today I would think that they're
00:05:36
impossible.
00:05:36
You know, when we're getting older, we're getting more
00:05:39
cynical, we're going based on our experience.
00:05:41
Man, maybe that won't work, maybe that's too complex, maybe
00:05:44
just some other constraint that would fail us. But when,
00:05:48
when you're so naive and you're so passionate, and there's a room
00:05:52
or a building full of those kinds of young guys and
00:05:55
girls, there's a magic happening there.
00:05:57
I think that's where the mindset of everything is
00:06:01
possible
00:06:01
still lives in my mind. Huh, that's interesting.
00:06:08
Speaker 1: I assume that they probably have to develop that
00:06:11
mentality in you that everything is possible, because you know
00:06:15
they give you, kind of, I mean, at least this is me assuming,
00:06:20
right, they probably give you a target, they tell you what
00:06:24
needs to be done and then you figure out how to achieve it
00:06:27
With that mentality of not giving you an easy way out,
00:06:31
right, like not saying like, oh, this can't be done, this isn't
00:06:34
possible.
00:06:35
It's like, no, you need to figure out how to actually get
00:06:38
this done, no matter what.
00:06:39
That's a great mentality to have.
00:06:41
You know, like I feel like that mentality would pay dividends.
00:06:45
Speaker 2: I agree and I think the biggest thing I learned from
00:06:47
that experience that there is a solution to almost every
00:06:51
problem and probably there are multiple solutions to almost
00:06:54
every problem.
00:06:55
And you know, put enough smart people in one room and they
00:07:00
will find a way.
00:07:01
Yeah, there will be some constraints.
00:07:02
Of course, there will be some assumptions.
00:07:03
Yes, you need some luck in the process.
00:07:06
Maybe that will take time, but there is that mentality of
00:07:10
everything is possible.
00:07:11
That is really great, and I think if I could be telling
00:07:14
myself, like 20 years ago, I would emphasize that: say, follow
00:07:19
that thing.
00:07:19
This is the most important thing.
00:07:21
Speaker 1: Oh yeah, that is a...
00:07:24
It's just a really fascinating mentality to have.
00:07:28
So when you're going through your testing in high school, you
00:07:32
know, I assume everyone around you is also going through that
00:07:36
testing as well, and maybe it's different tests the farther you
00:07:40
go down the road, right, are you able to discuss the testing
00:07:45
with other people around you?
00:07:46
Right, because I'm thinking of, you know, here in America, right,
00:07:49
if I was to go and test for the NSA, I mean, I'm not allowed
00:07:53
to tell my mom that.
00:07:54
I'm probably not allowed to tell my wife that. I could tell
00:07:57
her what region, you know, I'm going to, what state I'm going to,
00:08:01
but you know, I can't even tell her like, oh yeah, I'm going to.
00:08:04
You know, for me to go into this building, on this top
00:08:07
secret floor, you know to figure out something, right, like, I
00:08:11
can't do that.
00:08:12
But it's a totally different culture at least that's what I
00:08:16
would think you know in Israel, right, like you're not going to
00:08:19
just tell anyone everything?
00:08:21
Right, that would be highly frowned upon.
00:08:23
But it's more open, right, because everyone knows like, oh,
00:08:27
you're 17, you're testing.
00:08:29
Everyone went through it before.
00:08:31
Is it more of an open topic?
00:08:33
That's what I'm trying to get at,
00:08:35
I guess.
00:08:36
Speaker 2: So yeah, I think that when you're 17, actually
00:08:39
there's something, you know, very prestigious around that.
00:08:42
So you want to get into, like, the top units.
00:08:46
So if you're a combat soldier, you want to get into, like, the Navy
00:08:49
SEALs in Israel.
00:08:51
So you want to be, you know, qualified to do that.
00:08:54
You want to get through those tests, and in fact in high school,
00:08:57
those who really can get into those elite units, they
00:09:01
share that and say, okay, they did well because they were able
00:09:05
to get into those units.
00:09:06
Now, once you start a service, that's a completely different
00:09:09
thing.
00:09:10
I mean, you're not talking about what you're doing with
00:09:12
your family, obviously.
00:09:13
Sometimes you're not talking about where your base is, or
00:09:18
that you need to go away for the service, maybe for a
00:09:21
day, maybe for a week.
00:09:22
That's a completely different story.
00:09:24
But the fact that you actually qualified and were accepted to
00:09:28
specific units, I think that's something that people are
00:09:31
talking about.
00:09:31
It's also very encouraging for the others to follow that route.
00:09:35
I mean, if you want to make sure you get a career in the
00:09:38
security space, in high tech, you'd better go start your
00:09:41
career from one of those elite units, because that would be a
00:09:44
great jumpstart.
00:09:45
Speaker 1: Yeah, it seems like it's almost like a tech
00:09:47
incubator to some extent, right, I mean, there's so many people
00:09:51
out of Israel that come from that unit that start
00:09:53
revolutionary companies.
00:09:55
You know, one of them comes to mind.
00:09:57
I've interviewed several, but the one that comes to mind
00:10:02
is, you know, Cybereason.
00:10:03
The founders came from that unit and they created this
00:10:06
technology that I would argue, at least the last time that I
00:10:10
was using it, that it was better than the top EDR solutions on
00:10:14
the market and they were like brand new, right out of the box.
00:10:16
You know.
00:10:17
Speaker 2: So, yes, I can tell you that a lot of those unit
00:10:20
graduates, first, they have the network, so you can actually
00:10:24
found your new venture, your new company, with the folks that you
00:10:27
served with for a few years.
00:10:29
You know that you'll be doing well.
00:10:30
This is one, and of course, I would say that it's incubated in
00:10:35
terms of ideas, because a lot of the things that are actually
00:10:37
happening after that service, those are new things that, mostly,
00:10:42
you know,
00:10:42
you haven't worked on them in the unit. But at least in the
00:10:45
mindset you know how the bad guys are thinking and you know
00:10:49
how the defense should be built.
00:10:50
So that's definitely the great thing and at least for me,
00:10:54
that's actually started my journey.
00:10:57
I worked in a couple of startups after graduating from that
00:11:01
specific unit and then I just wanted to have my own venture.
00:11:06
I knew that I wanted to solve a problem, and I think a
00:11:10
lot of those graduates, a lot of them, are really good at problem
00:11:14
solving, like I mentioned.
00:11:14
Like get them into one room to solve this and this will be
00:11:18
solved.
00:11:18
I think a lot of the passion for problem solving was actually
00:11:21
built in those days.
00:11:25
Speaker 1: So talk to me about you know what you started doing
00:11:28
after the 8200 group.
00:11:31
Like you had your time at the 8200 group.
00:11:34
For obvious reasons, we can't talk about what exactly you did,
00:11:38
but what did you start diving into afterwards?
00:11:40
Speaker 2: Yeah, so right after I finished my service I was
00:11:44
working for a couple of startups.
00:11:46
One of them was deep in the security space, with a security
00:11:52
solution based on virtualization back then, a really core
00:11:55
Windows kernel product, and after that I realized that I
00:11:59
wanted to have my own venture.
00:12:00
Speaker 2: I had a couple of ideas, and that's an interesting
00:12:02
part.
00:12:03
Speaker 2: The first idea, the original idea, had nothing to do
00:12:05
with what Votiro, the company I founded, is actually doing
00:12:09
today.
00:12:09
And this is also a lesson I learned, because sometimes your
00:12:13
first idea would be one thing, but then you realize.
00:12:16
In our case it was like about five months into founding
00:12:20
the company, we found out that it's not going to work.
00:12:23
But then I realized, you know, I still want to have my own
00:12:27
thing.
00:12:28
So we started to do like freelancing and services.
00:12:31
I was doing penetration testing audits for companies and one of
00:12:36
the things I learned is that this experience actually got me
00:12:40
the idea that's actually behind Votiro today.
00:12:43
And the thing I learned is that when I was traveling around the
00:12:46
world as a 20-something-year-old guy, traveling around
00:12:49
the world, my clients paid for my trips and said, look, we want an
00:12:51
audit, we want a security audit.
00:12:53
Tell us where the weak points are.
00:12:55
And I was sharing with them a document and saying, okay, these
00:12:58
are the vulnerabilities, this is how you should fix them.
00:13:00
Now, one thing I learned is that I was able to show them or
00:13:05
demonstrate.
00:13:05
I could hack them pretty easily, and I found there was one
00:13:09
technique that worked for me 100% of the time, and it was
00:13:12
very simple.
00:13:13
Just, I went to the website, I went to the open positions
00:13:17
section and I sent a PDF, which is a weaponized resume, and I
00:13:21
said hi, I want to apply to this position and I want to work
00:13:25
with Joe and, by the way, I know Philip from the finance and
00:13:30
I'll be happy to share some references and on the other side
00:13:34
there is a guy or lady whose job is to screen hundreds
00:13:37
of resumes a week.
00:13:38
In order to do their job, they cannot really think
00:13:40
twice before they're opening that weaponized PDF.
00:13:43
And bam, that was just working 100% of the time.
00:13:47
And I would think to myself, maybe there's a real problem
00:13:52
that needs to be solved here, because years after,
00:13:57
with anti-malware, EDR, all those technologies invented, still
00:14:01
people are actually falling into that trap and opening those
00:14:04
weaponized documents.
00:14:04
And I think this was the moment where I realized that this is
00:14:08
the problem I want to solve, and this was actually what led to
00:14:10
founding what is Votiro today.
00:14:14
Speaker 1: So is that kind of?
00:14:16
Is that solution more defensive programming, where you're
00:14:21
proactively building in defenses to different kinds of attacks,
00:14:25
or do I have this mistaken, right?
00:14:28
Let's start off with, maybe, what defensive programming is
00:14:32
and then how it's applied, right?
00:14:33
Speaker 2: Yeah, so.
00:14:34
So the idea behind Votiro is a technology that's
00:14:39
called Content Disarm and Reconstruction, or CDR, and what
00:14:44
it actually means is that, if you think about all the defense
00:14:48
systems out there that need to screen those weaponized resumes
00:14:51
or invoices or any attachment, maybe a company deck or maybe a
00:14:56
podcast outline that I will share with you, is that they all
00:15:00
do the same thing.
00:15:01
They try to take that document and, you know, judge whether it's
00:15:05
malicious, suspicious or benign, and they're all relying on one
00:15:09
thing: they're relying on the history and trying to predict
00:15:11
the future. Because, if you think about anti-malware, they
00:15:15
had the signature database, which is a historical database.
00:15:17
If you think about next-gen AVs, they're built on machine
00:15:21
learning. And what is machine learning or AI?
00:15:23
It's just a model that is trained, based on samples, past
00:15:27
samples, and now I need to predict the future.
00:15:29
Same goes for EDR: they're trying to understand whether
00:15:33
there's malicious behavior based on past malicious
00:15:36
behavior.
00:15:36
So they all do the same thing.
00:15:38
They're trying to look at the past and trying to predict the
00:15:41
future, and that's where we all fail, right? Because the bad
00:15:44
guys are faster than us. If you think about the number of
00:15:48
malicious document samples that are produced, like, every minute,
00:15:51
like thousands of new samples.
00:15:53
We cannot really keep up with that.
00:15:54
I think this is where most of the defense
00:15:59
technology kind of fails, because they're all following that same
00:16:04
paradigm of looking back and trying to predict the future.
00:16:07
And what I thought about was how to really change that
00:16:10
dynamic, how to really not look at the past but really
00:16:15
protect the user.
00:16:16
And the idea was okay.
00:16:18
I was asking myself what is that recruiter, that HR guy or lady
00:16:22
really interested in when they're reading that resume?
00:16:24
They really want to have the content of the resume, they want
00:16:28
to see the text, they want to see the paragraphs, they want to
00:16:30
see the phone number, maybe a picture, maybe some links.
00:16:33
Where the vulnerability or the exploit actually resides, it
00:16:38
actually resides in the specification of the format and
00:16:42
the actual bits in that Word document.
00:16:44
There's something in the structure, the binary structure
00:16:47
of the Word document that actually leads to the
00:16:50
exploitation of that vulnerability.
00:16:52
But they're really interested in the content.
00:16:55
So what if I would take the content, lift it up and just
00:16:58
paste it onto a fresh, brand-new template of that Word document?
00:17:01
I would create a replica which looks and feels exactly the same
00:17:06
as the original, but anything that might be malicious in it
00:17:09
is just gone, because I'm not delivering it.
00:17:11
I'm just delivering a fresh, known-safe replica of
00:17:15
the original document, and that's the idea behind CDR, or
00:17:19
Content Disarm and Reconstruction: that you can
00:17:21
actually deliver known-safe content 100% of the time, and
00:17:26
you don't need to chase those bad guys anymore, because
00:17:29
you're just delivering the good content and not trying to decide
00:17:34
whether to keep a
00:17:35
very specific document.
00:17:37
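[Editor's note: the content-lifting idea Aviv describes can be sketched in a few lines. This is a toy illustration, not Votiro's implementation; the dict-based document model and field names are invented for the example, whereas a real CDR engine rebuilds the binary structure of formats like DOCX or PDF.]

```python
# Toy sketch of Content Disarm and Reconstruction (CDR).
# Instead of scanning for known-bad patterns, we copy only
# allow-listed, known-good content onto a fresh template.
# The dict-based "document" model is purely illustrative.

SAFE_FIELDS = {"text", "links", "images"}  # content the reader actually needs

def disarm_and_reconstruct(document: dict) -> dict:
    """Rebuild a document by copying only known-good fields."""
    replica = {"template_version": "1.0"}  # fresh, known-safe template
    for field in SAFE_FIELDS:
        if field in document:
            replica[field] = document[field]
    # Anything not allow-listed (macros, exploit blobs, malformed
    # structures) is simply never carried over.
    return replica

incoming = {
    "text": "Resume of Jane Doe",
    "links": ["https://example.com/portfolio"],
    "macros": "Sub AutoOpen() ...",        # would-be malicious payload
    "exploit_blob": b"\x90\x90\x90\x90",   # malformed binary structure
}

replica = disarm_and_reconstruct(incoming)
# replica keeps the text and links but contains no macros or exploit blob
```

The key property is that the replica is built additively from an allow-list, so anything unexpected is dropped by default rather than needing to be recognized as malicious.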
Speaker 1: That's interesting.
00:17:38
So you know, if I have a document and I embed some
00:17:42
malicious code into it, right, is this solution going to also
00:17:48
somehow read that malicious code and print that to this document
00:17:52
that I would then read?
00:17:54
Or, I guess, is it too far buried into the document or into
00:17:58
the program for it to actually be read on the screen, so to
00:18:01
speak?
00:18:02
Speaker 2: So if you think about, let's say, a PDF or an Excel
00:18:05
spreadsheet, maybe there's some bad code in it. But you as a
00:18:09
user, when we talk about the benign documents, you're
00:18:12
interested in the charts, in the formulas, in the values in that
00:18:15
Excel spreadsheet.
00:18:16
By copying the valid content to a fresh template of Excel
00:18:20
spreadsheet and delivering that, you don't need to deal with all
00:18:24
those questions of whether the malicious code will do something,
00:18:27
because the malicious code is not there anymore.
00:18:28
When you're delivering the known-good content, it's like
00:18:33
rethinking the entire problem.
00:18:35
It's like turning the problem on its head, because you're not
00:18:40
looking for the bad stuff anymore, you're just looking for
00:18:42
the good stuff.
00:18:44
And this is kind of the idea that your question is spot on,
00:18:48
because you're asking okay, what do we do with the malicious
00:18:51
code?
00:18:51
There's nothing that you need to do with it.
00:18:53
You just need to ask: where is the known-good content? What am I
00:18:56
really interested to read in that document? Whether that will
00:18:59
be, in the Word document, the paragraphs, the fonts, the
00:19:02
styling, maybe some links, maybe the images, maybe some embedded
00:19:06
objects that we would take care of, all that stuff that the
00:19:10
user needs to read to really be productive.
00:19:14
So that's actually what we deliver. The rest, that might
00:19:19
possibly be malicious,
00:19:20
we're just throwing away, in that sense.
00:19:24
So your question is spot on.
00:19:27
You just need to rethink how we really approach that.
00:19:32
Speaker 1: That's very interesting.
00:19:33
That's a new way to approach that problem.
00:19:37
I mean, it just seems to be ever-present, right? Like we can't
00:19:42
get away from it, but it seems like the solution really solves
00:19:45
it.
00:19:45
Is this an idea that maybe formed even when you were at the
00:19:50
8200 group, from prior experiences of how you identified ways to
00:19:56
compromise systems and whatnot?
00:19:57
Speaker 2: So I think I asked myself: if I want to really be
00:20:03
aggressive and block all those malicious attempts that are
00:20:06
delivered by documents,
00:20:07
what would we do?
00:20:08
And what I had in mind?
00:20:10
Okay, I wouldn't be delivering the original document.
00:20:13
I said, okay, but I do need to have the content.
00:20:16
So what should I do?
00:20:17
Because the financial company, any company, still needs to work.
00:20:20
So by replicating the content itself, I think that actually
00:20:24
solves that tension between security and productivity,
00:20:28
which a lot of us are suffering from.
00:20:30
Because that's exactly what the recruiter was suffering from:
00:20:33
he was told to open resumes, and at the same time the
00:20:38
security guy told him, look, think twice before you're
00:20:41
opening documents.
00:20:42
And then they say to themselves:
00:20:43
what should I be doing now?
00:20:44
Can I open documents, or should I just
00:20:50
ask my manager what to do?
00:20:51
So by having those kinds of solutions, and you know, a lot
00:20:55
of companies and the industry talk about zero trust, and this
00:21:00
is actually zero trust for data security, because you're
00:21:04
actually not trusting a thing out of those documents and then
00:21:08
you're delivering a replica which is exactly the same.
00:21:11
So I think this is actually an implementation of zero trust
00:21:15
for data, whereas most of the zero trust implementations today are
00:21:19
for network, for identity, and I think this is now the time
00:21:22
where we're going to talk about the zero trust for data.
00:21:25
So I think that in that way, the tension between security and
00:21:30
productivity is being solved and, of course, we're
00:21:32
introducing a new concept into the security stack.
00:21:36
Speaker 1: That's interesting.
00:21:37
You know you bring up zero trust for data and you know,
00:21:42
honestly, I probably haven't even thought about that myself.
00:21:46
You know doing some sort of zero trust solution or framework
00:21:51
around the data.
00:21:52
Whenever I think about securing data, it's always encryption
00:21:56
and least privilege, right?
00:21:57
Who can access it and what's the state that it's in and how
00:22:01
are we securing that data in that state?
00:22:03
But adding zero trust kind of augments it, you know.
00:22:07
How do you think that it augments it?
00:22:09
Do you think it's more or less preparing for the future, or is
00:22:14
it adding additional security protections around it?
00:22:17
What's your thoughts on it?
00:22:18
Speaker 2: So the way I think about that is that we have
00:22:21
multiple layers of zero trust or security frameworks.
00:22:25
One is for the network: who can access that, first, what you can
00:22:32
access. And of course, the second thing is for identity:
00:22:34
who can access that resource.
00:22:35
So, once you went through the two zero trust frameworks of who can
00:22:39
access that resource and how you can access that resource,
00:22:42
whether you need to go through two-factor authentication, maybe
00:22:45
that specific authentication, maybe from those IP addresses,
00:22:49
maybe from those specific ports or from that specific VPN,
00:22:54
then you can access the resource. But no one is asking about the
00:22:58
resource itself, how you can actually protect that data.
00:23:01
So if you think about, I mean, a lot of our customers are
00:23:06
protecting their web applications using Votiro's technology. The
00:23:10
way to think about that is, for example, an insurance company.
00:23:13
It opens its systems to the clients.
00:23:15
So now, as a client, you can upload documents, for example,
00:23:18
like insurance claims.
00:23:20
I want to, you know, file a claim.
00:23:22
So I need to upload a doctor report, maybe some pictures and
00:23:25
maybe some scans and maybe an ID.
00:23:27
So now I need to upload all those things.
00:23:29
Who can assure that I'm not uploading a resource or data
00:23:34
that is really malicious? And the same goes the other way.
00:23:37
If I'm accessing a resource, I want to make sure that I'm not
00:23:40
accessing like a malicious one, because someone already uploaded
00:23:44
that malicious thing and everything was working like zero
00:23:46
trust network, zero trust from identity perspective.
00:23:49
But yet I'm dealing with that layer. It can even be
00:23:54
encrypted, like you mentioned.
00:23:55
It can be encrypted at rest, it can be encrypted in motion, but
00:23:59
the data itself is weaponized. And that thing is the missing
00:24:04
piece for completing the whole chain of the zero trust model, from
00:24:09
the identity, through the network, encryption, as you mentioned,
00:24:12
to data security and the asset itself.
00:24:17
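[Editor's note: the upload scenario Aviv describes, disarming every file before it reaches storage regardless of who sent it, could be wired up roughly as below. This is a minimal sketch with invented function names; the sanitize step is a stand-in for a real CDR engine and here just strips non-printable bytes from a text payload.]

```python
# Sketch of "zero trust for data" on an upload path: the original
# bytes are never persisted; only a rebuilt, known-safe replica is.
# All names here are illustrative, not a real API.

def sanitize(payload: bytes) -> bytes:
    # Stand-in for a CDR engine: keep only printable text and whitespace.
    # A real engine would reconstruct the file format itself.
    return bytes(b for b in payload if 32 <= b < 127 or b in (9, 10, 13))

STORAGE: dict[str, bytes] = {}

def handle_upload(name: str, payload: bytes) -> None:
    # Every upload is disarmed on the way in, regardless of who sent it
    # or which network it came from: the data itself is not trusted.
    STORAGE[name] = sanitize(payload)

handle_upload("claim.txt", b"Doctor report for claim #42\x00\x90\x90")
```

The design point is that sanitization happens unconditionally at the boundary, complementing the network and identity layers rather than replacing them.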
Speaker 1: That's interesting.
00:24:18
So you know, I saw on LinkedIn you did a bit of kernel security
00:24:22
research.
00:24:23
You want to talk about that a bit?
00:24:26
Speaker 2: Yeah, sure, I think one of the things I really like,
00:24:27
as I mentioned, is how things are working, and I have a strong
00:24:32
background in that research of kernel modules.
00:24:35
Back then I was working in a company where we were actually
00:24:39
running virtualization, several
00:24:41
operating systems on the same hardware, figuring out how to
00:24:46
make this work, like what today is Parallels Desktop and
00:24:49
that kind of stuff.
00:24:50
So I was very familiar with how the kernel works in terms of
00:24:54
the subsystems and how to make sure that you actually work
00:24:58
with the hardware.
00:24:58
One of the things that's really, really nice for the bad guys is
00:25:03
to have those kernel exploitation paths and techniques.
00:25:07
I think back then it was very popular to inject
00:25:12
malicious documents with malicious fonts, because fonts,
00:25:17
for example, the Windows font subsystem, is actually
00:25:20
a kernel-based system.
00:25:22
So in fact, for that font that you read in your web browser or in
00:25:27
your Word documents, there is, you know, work being done
00:25:31
in the kernel.
00:25:31
So that was a very, very slick method for the bad guys to send a
00:25:36
document and execute kernel code, which is very, very important
00:25:41
if you want to just, you know, go past all those EDR things
00:25:45
straight to the holy grail, which is the kernel, the
00:25:48
unprotected memory portion.
00:25:50
So that was one of the things I learned as part of that kernel
00:25:55
research, and I think today a lot of great work is being done,
00:26:00
especially in some of the operating systems, like the Apple
00:26:03
one.
00:26:04
I also think that Microsoft is doing a better job, but yet
00:26:08
there is tons of playground for those who really want to do
00:26:14
some kernel research and privilege escalation. I think Apple
00:26:21
does it best, with their security architecture and how they are
00:26:26
architecting their OSes.
00:26:28
So I wouldn't say that I'm an expert on Apple's OS, but at
00:26:32
least from my experience with it, they are doing a great job,
00:26:37
at least
00:26:37
a better job than Microsoft. And I'm not knocking
00:26:40
Microsoft; I know Microsoft has had, you know,
00:26:43
a huge, huge challenge.
00:26:45
Because if you think about Microsoft, they have had
00:26:49
Windows for, you know, 30-something years, with tons of hardware.
00:26:55
Think about it: I mean, Apple has a very easy job, coupling
00:26:59
the hardware and the software, controlling everything.
00:27:01
For Microsoft, if you think about the world in
00:27:06
terms of even the hardware qualification process, even the range
00:27:09
of what they need to support from the OS level down to the
00:27:12
hardware, that's crazy.
00:27:14
Apple, as opposed to that, has one set of hardware and software
00:27:17
that they can actually test and close off, and with Microsoft it's
00:27:23
not like that.
00:27:23
You know, it's like hell.
00:27:25
Think about the number of permutations they need
00:27:27
to support; no wonder, I mean, there are tons of security exploits
00:27:33
and breaches out there.
00:27:34
Because if you think about even device drivers, think about
00:27:39
the huge number of hardware devices that need to be loaded
00:27:43
into and work with a Microsoft OS.
00:27:46
This is huge, as opposed to Apple, where it's only
00:27:49
the handful of things that they actually build and provide.
00:27:52
So I think the advantage Apple has is that they have that
00:27:57
closed, walled-garden kind of thing and they can really work within that
00:28:01
closed ecosystem.
00:28:03
I think that also makes it way,
00:28:08
way easier on them from a security perspective.
00:28:10
So I'm not saying that Microsoft isn't trying its
00:28:13
best; I'm just saying that the problem is 100 times, you know,
00:28:18
bigger than Apple's.
00:28:20
Speaker 1: Yeah, it seems like Microsoft has almost
00:28:24
an insurmountable amount of work in front of them, where they continuously
00:28:30
build their OSes on top of code that they built 30 years ago
00:28:34
and they're trying to, you know, secure it and do their best and
00:28:37
whatnot.
00:28:38
But at the same time, you know, like you said, it has to work
00:28:41
with so many other products.
00:28:42
It has to work on basically any hardware out there,
00:28:47
right? You can put Microsoft Windows on it.
00:28:49
That's a very complex issue and you know a lot of people give
00:28:53
Apple, you know, a lot of grief for, you know, locking down
00:28:57
their ecosystem in the ways that they do.
00:28:59
But you know, at the end of the day, they control it all, they
00:29:03
control the security, they have full control over that
00:29:06
environment and in return, it tends to be a bit more secure
00:29:10
because they have the proper guardrails in place.
00:29:13
It seems like they thought through it in the beginning
00:29:15
rather than, you know, okay, we have this great piece of
00:29:18
software, oh wait, we have to secure it.
00:29:20
You know, Apple also doesn't have a cloud service, so to speak,
00:29:24
right? They don't have a Microsoft Azure.
00:29:27
You know.
00:29:28
You know what I'm saying? Microsoft has a huge suite of
00:29:32
products that they also have to develop and build and secure.
00:29:36
Apple has a product suite that's, what, one
00:29:39
percent of what Microsoft actually has.
00:29:41
It's insane.
00:29:44
Speaker 2: I also think that Microsoft's productivity suite,
00:29:46
Office, is probably dominating the market
00:29:50
and I'm not seeing that changing.
00:29:52
I mean, someone asked me, don't you think that Google
00:29:56
Docs, Google Sheets and that kind of thing would, you know,
00:30:00
take over? And I don't think
00:30:03
that that would happen.
00:30:04
If you think about the effort needed to shift all those
00:30:09
enterprises, governments, my mom, I mean, Microsoft Word is what
00:30:12
she knows how to work with; teaching all
00:30:15
those companies and changing all the processes, I think that's,
00:30:18
you know, many years of change, and I think
00:30:22
Microsoft controls that.
00:30:24
That's one thing, and of course they also control Azure, which
00:30:28
is a great cloud platform, but obviously I think it's not
00:30:32
easy for Microsoft there.
00:30:33
I mean, it's up against Google and Amazon.
00:30:38
But there is that one point where I think Microsoft controls the
00:30:41
market without any doubt,
00:30:43
and that's the productivity suite. And that's, I think, why
00:30:46
documents are going to stay around for quite some time; it's very
00:30:50
difficult to replace that.
00:30:51
Even thinking about myself: when I'm exchanging, you
00:30:54
know, versions of, say, a legal agreement with lawyers and
00:30:58
partners, I'm doing that with track changes in Word.
00:31:02
I'm not doing that with any other tool; only when I've finalized
00:31:05
everything
00:31:06
do I send it off to be signed. But yet
00:31:12
all the work, all the productivity work, is being done
00:31:15
using the same thing that I was probably using five years
00:31:18
ago, ten years ago and fifteen years ago, which is
00:31:21
Microsoft.
00:31:22
So I think this is here to stay for the next
00:31:26
probably five, ten years.
00:31:27
Speaker 1: Yeah, that is extremely interesting because
00:31:31
now that you say it, right like, apple even has their own
00:31:35
productivity suite, but I don't think I could tell you what it
00:31:40
is.
00:31:40
And I own a Mac, right like, but I have Microsoft Office
00:31:44
installed on it and that's what I use.
00:31:46
I probably even disabled the other stuff, you know.
00:31:49
Speaker 2: Yeah, and not a lot of users even know what it's
00:31:51
called. Numbers, right? Yeah,
00:31:57
it's easy enough to remember, but no one uses it. And
00:32:01
exactly as I mentioned, once I set up my Mac
00:32:04
as well, probably the second or third thing I did
00:32:07
was install Microsoft Office. So I think that's
00:32:11
where Microsoft is still controlling the market, and I
00:32:14
think it would take some time. Although, from an operating system
00:32:18
and hardware perspective, I think Apple is probably doing a
00:32:21
really, really good job. I switched from Microsoft
00:32:26
Intel hardware to, you know, an Apple Mac probably
00:32:30
four or five years ago, and I'm seeing more Macs than ever,
00:32:35
definitely.
00:32:40
Speaker 1: So when you're securing the kernel of a device,
00:32:43
irrespective of the device or the OS or anything
00:32:47
like that, right, just talking about kernel security in general,
00:32:48
how do we do that? Because when we're talking about
00:32:50
securing, let's just say, endpoint security, right, you
00:32:53
have an EDR, you have, you know, your network configurations on
00:32:57
that device, you have patches and whatnot.
00:32:59
But when we're talking about the kernel, is there anything of
00:33:03
that equivalent for the kernel that you can, I guess, so to
00:33:07
speak, install and monitor in that way, or are we relying on
00:33:13
all the upper levels of security to protect it?
00:33:16
Speaker 2: Yeah, so I think there used to be products that
00:33:19
protect the user
00:33:21
at the kernel level.
00:33:21
To be honest, they were doing more harm than good, because if
00:33:25
there were bugs in those products they were just killing
00:33:29
the entire machine.
00:33:29
That's historically what happened. And another aspect of
00:33:34
that: I think it was enSilo,
00:33:36
a company that presented at Black Hat a few years ago, that
00:33:40
showed how, by leveraging exploitation of vulnerabilities
00:33:45
in the anti-malware or EDR products that actually live in
00:33:49
the kernel, they could actually take over the entire machine.
00:33:53
Because if you can actually exploit stuff in the kernel,
00:33:56
that's it, game over.
00:33:56
So what
00:33:58
I believe is that a lot of stuff will be moving from the kernel
00:34:02
to user mode, and this is where it's way easier to protect
00:34:05
using EDR and some other traditional solutions.
00:34:08
But once you get into the kernel, it's game over.
00:34:12
So I think we'll see more and more of those big
00:34:16
vendors like Microsoft and Apple actually stripping stuff out of
00:34:20
the kernel into userland and making sure it works
00:34:24
there, because otherwise it would be very, very hard to
00:34:27
protect the kernel.
00:34:28
The kernel would probably remain the land of process management, memory
00:34:31
management, device management, all the things that you should
00:34:35
never be touching from userland; and if
00:34:40
something actually gets in there, I think that's game over.
00:34:42
That's how I see it.
00:34:43
Obviously, there are some technologies that Microsoft and
00:34:46
Apple are actually utilizing to protect against kernel exploitation,
00:34:49
like what's called KASLR, kernel address space layout
00:34:54
randomization, and that kind of stuff, to make the bad guys
00:34:58
work harder if something bad is going into the kernel.
00:35:02
So at least it will be hard to exploit that.
00:35:05
But that just raises the kernel fence a bit higher.
00:35:09
It's not something that is proven to protect 100 percent.
00:35:12
So I think that probably something got into the kernel is
00:35:16
probably game over.
00:35:19
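To make the KASLR point concrete for readers: an exploit that hard-codes a kernel address only lands when its guess matches the randomized base chosen at boot. A toy simulation; the slot count and sizes are illustrative numbers, not the real slide entropy of any actual OS:

```python
import random

SLIDE_SLOTS = 256          # toy: 8 bits of slide entropy (illustrative only)
SLOT_SIZE = 0x100000       # toy alignment granularity of the kernel base

def boot_kernel(rng: random.Random) -> int:
    """Pick a randomized kernel base address, as KASLR does at every boot."""
    return rng.randrange(SLIDE_SLOTS) * SLOT_SIZE

def exploit_lands(hardcoded_base: int, actual_base: int) -> bool:
    """A hard-coded-address exploit only works when the guess is exact."""
    return hardcoded_base == actual_base

rng = random.Random(1234)
guess = 0                  # attacker assumes the old fixed load address
trials = 10_000
hits = sum(exploit_lands(guess, boot_kernel(rng)) for _ in range(trials))
print(f"hit rate across {trials} boots: {hits / trials:.2%}")
```

With 8 bits of entropy the exploit lands roughly once in 256 boots, which is exactly Aviv's point: the fence is higher, but randomization raises the cost rather than guaranteeing protection.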
Speaker 1: Yeah, I mean, it's basically a rootkit; that's
00:35:21
another word for infecting the kernel.
00:35:24
I guess that's what would make your product even more valuable
00:35:31
because, like you said, there's nothing really protecting the
00:35:34
kernel.
00:35:34
And if you're using a Microsoft device, like 90% of the
00:35:38
population is, whatever it might be, everyone opens documents,
00:35:43
everyone uses Microsoft Office.
00:35:44
Most of us have to do it for our job.
00:35:47
You know like we can't do our job without opening documents
00:35:50
and clicking on links and things like that, and so it really
00:35:54
creates an extreme importance around the security of
00:35:58
documents and how we secure them.
00:36:01
And you know, I remember before the way to go with it was more
00:36:05
of you know the path of sandboxing, right, and oh, it'll
00:36:09
execute this thing in a sandbox before you execute it on your
00:36:13
device and whatnot.
00:36:14
And I feel like there's ways to get around it.
00:36:17
Right, Because the malware is starting to, or the malware
00:36:20
developers are starting to learn and adapt to that, where it's
00:36:24
like, hey, if these signals are running and they're looking for
00:36:29
a device, you know, in these certain ways, just don't execute
00:36:32
, you know, and then it'll pass the scan, and then you execute
00:36:35
it and lo and behold, it's like, okay, here we go.
00:36:38
You know, now it's game on. But your solution really eliminates
00:36:42
all of that.
00:36:43
Speaker 2: Exactly. In fact, the sandbox is just, you know, a sophisticated
00:36:46
way to do that automated malware analysis.
00:36:48
And, in fact, exactly as I mentioned, the malware creators,
00:36:52
they know how to check whether they're currently running in a sandbox.
00:36:56
And, for example, one of the coolest techniques is just, you
00:36:59
know, if you have a Word document, let's say the
00:37:02
malware just checks whether there are entries in the recent
00:37:06
documents list in Word. In Word, you can actually
00:37:09
open those recent documents.
00:37:11
If that's empty, oh, probably that's a sandbox, because
00:37:14
what is the chance that no documents were ever opened on that
00:37:18
machine?
00:37:18
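The check he describes can be sketched as a pure function. On a real Windows host the malware would read Word's recent-files list from the registry; the registry path in the comment, the threshold, and the uptime signal here are illustrative assumptions, not a specific sample's logic:

```python
def looks_like_sandbox(recent_documents: list[str],
                       uptime_minutes: float,
                       min_recent: int = 3) -> bool:
    """Toy version of the 'recent documents' sandbox-evasion check.

    Real malware would read something like
    HKCU\\Software\\Microsoft\\Office\\...\\Word\\File MRU; a freshly
    cloned analysis VM typically has an empty MRU and a tiny uptime.
    """
    if len(recent_documents) < min_recent:
        return True          # nobody has ever opened documents here
    if uptime_minutes < 10:
        return True          # machine booted moments ago
    return False

# A lived-in user machine vs. a freshly provisioned analysis VM:
print(looks_like_sandbox(["q3.docx", "cv.docx", "notes.docx"], 5_000.0))  # False
print(looks_like_sandbox([], 2.0))                                        # True
```

The payload simply stays dormant when this returns True, which is why the sandbox's verdict of "benign" proves so little.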
So there's some cool, cool techniques to do that, as you
00:37:23
mentioned with the signals.
00:37:24
Again, the sandbox is just like anti-malware:
00:37:27
it relies on history, on how malware or malicious documents
00:37:31
should behave, and then tries to block them
00:37:35
and give a verdict.
00:37:36
And the bad thing with sandboxes is that they take a lot of time and a lot
00:37:40
of resources.
00:37:41
That's why we're not seeing them stay very popular.
00:37:43
Actually, we've seen a significant decline in the
00:37:46
sandbox market over the last seven years.
00:37:48
So, yeah, with Votiro, with a Content Disarm and
00:37:51
Reconstruction solution, obviously you can do that.
00:37:54
And when we replicate or generate those safe documents,
00:37:59
we actually do that pretty fast because, as opposed to a sandbox,
00:38:01
we don't need to run those documents, we don't need to wait
00:38:04
for them to do something like in sandbox, we just replicate
00:38:08
the document and deliver that.
00:38:09
It takes milliseconds.
00:38:11
There is no latency in that, as opposed to some other solutions
00:38:14
that need to check whether those documents are really
00:38:17
behaving like
00:38:17
benign documents and, to be honest, after like three
00:38:20
minutes, okay, they're done.
00:38:23
I mean, okay, this looks fine.
00:38:24
And on top of that, a sandbox is a network-security kind of thing,
00:38:27
while Votiro is an API-centric platform: we connect
00:38:33
via API to almost any ingress of documents out there.
00:38:36
So it can be email, like Office 365, SharePoint, OneDrive, but
00:38:40
it can also be Slack and Dropbox and any other application that
00:38:48
has an open API we can integrate with.
00:38:50
So I think it's a different era now than it was when the sandbox
00:38:57
was invented back then.
00:38:59
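The API-centric idea he describes can be sketched as one generic ingress hook reused across channels. The function names and the stand-in sanitizer below are hypothetical illustrations, not Votiro's actual API:

```python
from typing import Callable

Sanitizer = Callable[[str, bytes], bytes]
Deliver = Callable[[str, bytes], None]

def stub_cdr_sanitize(filename: str, payload: bytes) -> bytes:
    """Stand-in for an HTTP call to a CDR service (hypothetical endpoint)."""
    return b"CLEAN:" + payload           # pretend the rebuilt file comes back

def on_file_ingress(filename: str, payload: bytes,
                    sanitize: Sanitizer, deliver: Deliver) -> None:
    """One code path for every channel with an open API (mail, Slack,
    SharePoint, ...): the file is rebuilt before the user ever sees it."""
    deliver(filename, sanitize(filename, payload))

delivered: dict[str, bytes] = {}
on_file_ingress("invoice.docx", b"raw attachment bytes",
                stub_cdr_sanitize,
                lambda name, data: delivered.__setitem__(name, data))
print(delivered["invoice.docx"][:6])     # b'CLEAN:'
```

Because the hook only needs "give me the file, take back the rebuilt file," wiring in a new channel is a matter of adapting its webhook or API, not deploying a new network appliance.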
Speaker 1: So your solution can even integrate with Office and
00:39:03
Outlook and Teams and Slack, because it's just taking the
00:39:07
text and putting it into another window, right, or whatever it
00:39:12
might be, that is already secured and sanitized from
00:39:15
anything that might be poisoning it.
00:39:16
It's very interesting.
00:39:18
I mean, it sounds like, you know, that's kind of where
00:39:22
security is going overall,
00:39:25
right? Moving the user away from as much of the data, you know,
00:39:30
as much hands-on contact with that data.
00:39:33
It's adding a layer of abstraction between the user and
00:39:36
that data to kind of protect the user from themselves.
00:39:39
Speaker 2: Yeah, I think you're right.
00:39:41
It's like providing data in a safe way, or a safe copy of that
00:39:47
data, so users can open it and just do whatever they need to
00:39:51
do.
00:39:51
They don't need to really think twice before actually
00:39:55
doing their job.
00:39:55
Because I think that's the problem.
00:39:57
If you think about the phishing awareness campaigns that every
00:40:01
company today runs, right, they were teaching the employees
00:40:05
and the users, you know, you need to be cautious when
00:40:09
opening a document, or playing the spot-the-phishing game
00:40:13
every quarter.
00:40:14
No, it's not working.
00:40:16
It's not really working, because a day after you have a
00:40:19
successful phishing awareness campaign, you know, send an email
00:40:23
saying hey, there was a problem with your paycheck,
00:40:26
you know, fill in this tax form and send it to me, and I
00:40:29
would open that form, because I want to have my paycheck if
00:40:32
I'm getting that from HR.
00:40:33
So I think that that tension between productivity and
00:40:37
security cannot really be solved by the traditional technology,
00:40:42
which relies on detection, on past indicators and signals.
00:40:46
And, on the other hand, we cannot really throw the
00:40:48
responsibility onto the user, telling the poor user, you
00:40:52
spot the malicious actor, you spot the phishing.
00:40:55
It's not gonna work either, and that's why I think that
00:40:59
implementing zero trust for data, with technologies like
00:41:02
what we have at Votiro and some others, is probably
00:41:05
the more proactive approach and more modern approach to
00:41:09
solving that problem.
00:41:12
Speaker 1: Yeah, that's a really good point, a really
00:41:15
interesting way, you know, to look at this problem that
00:41:18
everyone is dealing with.
00:41:19
So you know, I try to be very conscious of my guests' time and
00:41:23
whatnot, and you know you're over in Israel right now so I
00:41:26
can only imagine what time it is over by you.
00:41:29
So, before I let you go, how about you let my audience know
00:41:32
where they can reach out to you if they wanted to, you know,
00:41:34
reach out and maybe get some questions answered, and where
00:41:38
they can find your company to learn more about it.
00:41:42
Speaker 2: Perfect, yeah. So I encourage everyone to just go
00:41:46
to Votiro's website, votiro.com, and browse
00:41:50
the tons of resources and demos, and, of course, our Votiro
00:41:54
specialists would love to show you the product and see how we can
00:41:58
help you make your users more productive and able to open any document
00:42:02
without thinking twice.
00:42:03
And, of course, feel free to reach out to me via LinkedIn.
00:42:07
I would love to hear your thoughts, your feedback on this
00:42:10
session and your ideas about the industry.
00:42:13
I think there's no better way of knowing the market than just
00:42:18
speaking peer to peer.
00:42:19
So don't be shy and reach out.
00:42:22
Speaker 1: Absolutely. Well, thanks, Aviv.
00:42:23
I really appreciate you coming on.
00:42:26
Speaker 2: Thank you very much. It was a pleasure. Have a great day.
00:42:29
Speaker 1: Absolutely.
00:42:29
Thanks everyone for listening, see you.