The fabric of family bonds intertwines with the digital threads of our era, as JB Benjamin, the CEO of Kryotech, a man who's navigated the British family legal system to secure custody of his son, joins us to share his gripping tale. With a narrative that explores the influence of fatherhood and the societal shifts shaped by technology, JB Benjamin's insights turn from the heartwarming to the hilarious, as we unpack the oddities of online content moderation. His background in IT and security is rooted in the politically charged atmosphere of Birmingham during his youth.
Journeying through the various landscapes of JB Benjamin's past, we touch upon the ethical crossroads where technology and personal integrity meet. From his early programming days to the dark corridors of data exploitation, his experiences in sales, debt collection, and even a dash of filmmaking, JB Benjamin's pathway to higher education and pioneering efforts in adaptive AI resonate with the critical thinking needed in our digital age. As tech CEOs loom large over discussions of privacy and identity, JB Benjamin's candid reflections on these topics serve as a beacon for those navigating this brave new world.
As we peer into the future, our conversation traverses the vast frontiers of government surveillance, data storage trends, and the revolutionary advances in space technology and quantum cryptography. We confront the pressing need for online data preservation and the archival challenges in the digital epoch. This episode is nothing short of a treasure map, guiding you through the intersection of technology, security, and human experience, with JB Benjamin as our experienced navigator.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, JB?
00:00:00
It's great to get you on the podcast.
00:00:03
You know I don't know how long this thing has been in the
00:00:05
making, but it feels like forever at this point.
00:00:09
Speaker 2: Hey, it's great to be here.
00:00:10
Yeah, unfortunately it's been some crazy times in between
00:00:14
initial booking and actually getting here.
00:00:16
I can actually tell everybody I was.
00:00:18
I was actually fighting through court for my younger son.
00:00:21
After eight years and the cost of a small house, I now have all
00:00:26
four of my children living with me.
00:00:28
Speaker 1: Oh wow, that's a huge accomplishment.
00:00:31
Speaker 2: Yeah, I would actually say that, given what I
00:00:34
have seen, experienced in the British family legal system,
00:00:36
it's more of an accomplishment than building my three tech
00:00:39
companies.
00:00:39
To be honest, yeah, absolutely.
00:00:42
Speaker 1: I mean, like all that you ever hear about, at least
00:00:45
in America, is like, hey, you better not get divorced when
00:00:50
you have kids, because like, if you do and you're a man, like
00:00:54
you're never going to see that kid again, like there's so many
00:00:57
ways that they can just, you know, completely screw you.
00:01:02
You know like it's insane, it's absolutely crazy.
00:01:05
So like I really I have like a new appreciation after like
00:01:12
being a dad now, you know for the first time it's like I have
00:01:17
a new appreciation for, you know, the influence that the dad has
00:01:20
in a kid's life, no matter how young.
00:01:22
You know.
00:01:23
Speaker 2: Yeah, well, I was there.
00:01:25
I've been there, I was there when the
00:01:28
children literally came out of the oven, there with the
00:01:31
catcher's mitt, literally, all four of them, and, ladies and
00:01:35
gentlemen, you don't necessarily want to see that.
00:01:44
Yeah, no, no, the PTSD look going on.
00:01:47
But it is great what you build if you are a father that spends a
00:01:52
lot of time with his children and you're there, like really there,
00:01:55
from the formative years.
00:01:56
So my oldest daughter is 18.
00:01:58
And she calls me her bestie.
00:01:59
That would not happen if I wasn't there, if I hadn't been
00:02:02
there all the time, you know.
00:02:03
And yes, there was a large time where they had no access to me.
00:02:07
Unfortunately, it is what it is.
00:02:09
And it had an effect.
00:02:11
You know I can't go into details of what the effects were
00:02:15
, but most of my kids do go through counseling.
00:02:19
So you know, this is the thing that parents need.
00:02:22
Parents need to realize when their kids are little and they
00:02:25
think, oh, it doesn't matter, little Johnny, he isn't
00:02:28
seeing it
00:02:28
when I'm like giving my partner daggers. Trust me, little
00:02:32
Johnny is seeing it and little Johnny is being affected by it.
00:02:34
And little Johnny is starting to end up growing up giving his
00:02:37
little partner daggers, thinking his kid ain't seeing it and it
00:02:39
just carries on.
00:02:40
So I mean, I gotta be honest, I'd hate to be a kid now, in this
00:02:46
age.
00:02:46
And that's horrendous because it's the 21st century we're
00:02:49
living in, in what I used to see as the Star Trek future when I
00:02:52
was a kid.
00:02:52
You know, because I'm a child of the 80s.
00:02:54
I was born in December 1980, in the last batch of the Gen Xers
00:02:59
and I remember Star Trek and all that cool stuff and I thought,
00:03:01
yeah, we're living the Star Trek future now. It ain't great for
00:03:05
kids, though, you know. Yeah, it's pretty terrible.
00:03:09
Speaker 1: I don't know.
00:03:10
I still haven't like figured out how I'm going to try and
00:03:14
like introduce, you know, like the internet, right, to my kids.
00:03:18
You know like, well, my kid's 11 months, so like I have some
00:03:25
years, right.
00:03:26
Speaker 2: No, no, here in Britain you'll see women pushing
00:03:29
prams along, and prams, by the way, are getting sold.
00:03:32
You got to commend this.
00:03:33
They're getting sold, with tablet arms already on them.
00:03:35
So your child can be soaking in all of the loveliness of
00:03:41
YouTube Kids, which includes such classics as the hangman's
00:03:45
song and other things.
00:03:48
Speaker 1: It sounds interesting how it made it past YouTube's,
00:03:54
you know, impenetrable AI that captures all of this, right.
00:03:59
Speaker 2: Yeah, yeah, but if you talk about something about
00:04:02
sex and relationships, the YouTube AI catches that quickly
00:04:05
enough.
00:04:05
You know, but it won't catch that
00:04:07
somebody has managed to get Peppa Pig to slice her dad's
00:04:10
throat and decapitate him in a cutesy animation.
00:04:12
It's like.
00:04:15
Speaker 1: God forbid you say COVID or something like that.
00:04:18
Right, like this episode is immediately...
00:04:22
Speaker 2: You know, you just got yourself demonetized, right
00:04:23
there, that's it. I'm not even monetized.
00:04:24
You never will be now.
00:04:27
Hold on hold on, hold on hold on.
00:04:29
He's applied for an AdSense account.
00:04:31
No, we can't give him that.
00:04:32
He mentioned COVID.
00:04:32
Doesn't matter what he said about it, he mentioned it.
00:04:38
Speaker 1: Yeah, he's not a doctor, right?
00:04:39
Not a licensed physician?
00:04:44
Speaker 2: Not that that really means anything, Johnny.
00:04:45
But anyway, we've veered so far off tech, man.
00:04:48
Speaker 1: Yeah, yeah, that got out of hand quick.
00:04:51
So, yeah, you know, JB, you know I'll be honest with you.
00:04:57
I didn't look too much into your background, you know.
00:05:01
So why don't we start with your background of how you got into
00:05:05
you know IT, how you got into security?
00:05:08
You know maybe, like what piqued your interest?
00:05:10
Was there a certain event or something you know earlier on
00:05:14
that kind of piqued your interest and led you down this
00:05:17
path?
00:05:19
Speaker 2: Well, I, as I said, I'm a kid of the 80s.
00:05:23
I grew up in the middle of the poll tax riots era.
00:05:26
I grew up in Birmingham, in Bordesley, you know, where the
00:05:30
Peaky Blinders are from.
00:05:30
I grew up at protests, going to protest marches for freeing the
00:05:35
Birmingham Six.
00:05:35
So everything... I hate to say this, but by modern standards I had a
00:05:39
really woke childhood.
00:05:41
Oh dear you know, I was reading philosophy and psychology by
00:05:46
the time I was five and six, or reading Bardsley, art and Decker.
00:05:49
I was writing, going to see Shakespeare plays performed
00:05:53
at the Royal Shakespeare Company in Stratford-upon-Avon, even from
00:05:56
that early age, and then being forced, and I say forced because
00:05:59
no five or six year old writes essays voluntarily, trust
00:06:04
me.
00:06:05
They don't. Five and six year olds don't engage in any form of
00:06:08
critical thinking voluntarily.
00:06:09
So I was forced to write essays, and it basically created somebody
00:06:13
who, you know, has a lot of critical thinking,
00:06:18
looks at stuff through a logical lens and will deconstruct and
00:06:20
break stuff down.
00:06:21
So, in terms of security and how we are able to protect our
00:06:27
individuality, which is what security really is, I became, I
00:06:31
was kind of always involved in that in some form or another,
00:06:34
even as a kid.
00:06:35
I mean when you have a childhood where you don't play
00:06:37
with toys, where you have science, chemistry sets,
00:06:40
microscopes and telescopes and that's all you get for Christmas
00:06:42
along with books and literature,
00:06:44
it makes you hyper-focused.
00:06:46
I mean like laser-focused on stuff. In terms of IT,
00:06:49
I got into that really early.
00:06:51
I did my first programming diploma when I was 13 with the
00:06:54
International Correspondence School, did that in BASIC,
00:06:56
Beginner's All-purpose Symbolic Instruction Code.
00:06:58
I got a job when I was 13 working for a computer company
00:07:02
in Coventry called a Richard, called so is the knitters.
00:07:05
And then it became gigante computers.
00:07:06
Shout out to Stephen King so is the knitters.
00:07:10
Car from trade.
00:07:10
If you're seeing this, I don't know where I mean gigante
00:07:14
computers.
00:07:14
Where are you guys?
00:07:15
And even when I was working with that,
00:07:18
I was really interested in how do you secure them, how do you
00:07:21
protect them?
00:07:21
How do you stop people from taking the data out of these and
00:07:24
doing stuff with them?
00:07:25
Already, as an early user of the Internet and computers, I
00:07:28
saw very quickly the downsides to how this stuff could be used.
00:07:34
Bear in mind also, I was also working
00:07:36
in computing in that era. I ended up working for
00:07:39
people like Time and Tiny Computers and I started seeing
00:07:41
how people were exploiting customer data to market even
00:07:46
credit agreements and 0% finance deals and how they were trying
00:07:50
to aggressively push this.
00:07:51
I mean, I was a killer at sales and I ended up leaving
00:07:54
sales, even though I did really well in commission.
00:07:56
I left sales because sales struck me as lying to people.
00:08:01
I mean, when I was working in computers, I was selling credit
00:08:04
agreements to families who wanted a new Packard Bell
00:08:07
computer, which is like costing two or three grand, and you
00:08:10
could see... bear in mind, back in those
00:08:12
days
00:08:12
we're talking about when you did credit agreements on pen and
00:08:14
paper, by the way, and you'd phone a place up, and they
00:08:16
actually told you if this person was creditworthy over the
00:08:19
phone, not this whole whizzy dingaling instant check.
00:08:22
So you'd end up with conversations with these people,
00:08:25
you know, while you're doing the credit agreement, and you
00:08:27
would learn really quickly that these people could not afford
00:08:30
this.
00:08:30
They couldn't afford it.
00:08:32
You know, bear in mind these people being sold a credit
00:08:35
agreement worth a couple of grand and being told, don't worry,
00:08:37
you don't have to pay for it now.
00:08:38
But what was that?
00:08:39
That was everything about the 80s and 90s.
00:08:41
Don't worry about it, you don't have to pay for it now, it's
00:08:44
cool.
00:08:45
You don't have to pay for it now.
00:08:46
Think of the now, don't worry about the future.
00:08:48
I'll be sitting there and I'll be talking to these people and
00:08:52
I'm seeing how their kids are, and I'll be thinking... bear in mind,
00:08:54
I'm a young man, like 18, 19, at this time, and even then
00:08:57
I'm kind of thinking I can't do this anymore because I can
00:09:00
already see what these people are going to go through.
00:09:01
They're going to go through debt collection straight away.
00:09:04
These people, in six months, when they see the 200 pound a
00:09:07
month bill come in for this computer that is already covered
00:09:10
in Pot Noodle, Super Noodles and all kinds of schmutz.
00:09:13
It's already been dropped five or six times and little
00:09:16
Johnny has already put like a cookie or a jammy dodger
00:09:19
into the CD tray.
00:09:20
This thing is already busted up and practically broken and now
00:09:23
they're on the hook for not just like a couple of grand but also
00:09:26
the interest on that, because they missed their six-month buy
00:09:29
now,
00:09:29
pay later deal.
00:09:30
They missed the interest-free component, and not long after
00:09:34
that I got a job working in debt collection.
00:09:35
So I saw both cycles, I saw the profiting from it and I saw the
00:09:40
back end of it, of what happens when it goes wrong, and I just
00:09:44
couldn't do it.
00:09:45
So I was like, well, I'm not doing sales, I'm going to go
00:09:48
into... I'm going to go into my other love,
00:09:50
filmmaking.
00:09:51
But that didn't work.
00:09:52
Because in Britain the only way you get money for filmmaking is
00:09:55
if you're going to make a delightful rom-com
00:09:59
with Hugh Grant going 'I'm so delightfully British', or you make
00:10:02
kind of like a hood shoot-'em-up, yeah, Blacks and gangs,
00:10:09
Kidulthood, Top Boy.
00:10:10
Any of those kinds of things are the only things you get money
00:10:11
for, and I've no interest in that.
00:10:12
I mean, I speak the King's English, for crying out loud.
00:10:16
I was educated in the three R's, kind of like Eton style, and I
00:10:20
like Akira Kurosawa, so I'm far removed from that.
00:10:23
So I was like, okay, that ain't gonna work.
00:10:26
So I pivoted and at college I did my idea, I did moving image
00:10:31
design and then 3D animation and when I was at university I
00:10:36
continued that.
00:10:36
I did computer visualization and animation, and I did my
00:10:39
dissertation in adaptive artificial intelligence back
00:10:42
when artificial intelligence wasn't a thing in 2010.
00:10:44
And everybody called me a madman. It's like, JB, you're wasting
00:10:47
your time.
00:10:48
You're not going to see AI able to do this stuff, JB, in your
00:10:52
lifetime. Right, I mean... yeah, yeah, we're going to get
00:10:58
over
00:10:58
how that really kind of irritates me.
00:11:00
But anyway, after that, going into animation, I got jobs in animation.
00:11:05
I started working everywhere.
00:11:06
I started working in website design, UI, UX.
00:11:09
I've worked as a product manager for a 9 million pound
00:11:12
project for one of the biggest corporate law practices in the world.
00:11:14
I've been a senior lecturer of computer
00:11:19
science at my own alma mater, so I've seen the education
00:11:22
system.
00:11:22
Oh my God, that was eye opening and disappointing.
00:11:25
And I've been in education.
00:11:27
I've been a tutor during COVID and after.
00:11:30
So that was very interesting and enlightening.
00:11:32
But what got me into building my own apps, actually, was, funnily
00:11:37
enough,
00:11:37
what we spoke about earlier. I would love to be
00:11:40
able to say the story of Vox Messenger was I had a passion and
00:11:43
saw that we had to change the world.
00:11:45
No, no, I had baby mama problems, like every other black
00:11:49
guy, and I did not want to end up joining Fathers 4 Justice,
00:11:53
dressing up as Batman. I may bear the sign of the bat, but I
00:11:57
don't dress up as him hanging off a bridge going fathers for
00:12:00
justice.
00:12:00
There are better ways of doing things, you know.
00:12:02
So, how do I... I thought, I am going to occupy my time. If I'm
00:12:08
not going to be... if my access to my children is going to be
00:12:10
refused, I might as well do something with my time.
00:12:13
So thank you to the mother of my children, because this $85
00:12:19
million value company would not be possible without that.
00:12:24
So I built Vox Messenger, and the reason I built Vox Messenger
00:12:27
is because I saw how everybody's communications were
00:12:29
being exploited for their data.
00:12:30
There is nothing more cynical than giving somebody free
00:12:34
messaging and then using the content of their messaging to
00:12:38
exploitatively direct targeted marketing and ads.
00:12:42
Now, it was already bad when it was commercial ads, but now we
00:12:46
have... now we have political targeting.
00:12:49
Yes, ladies and gentlemen, thank you Facebook, thank you
00:12:52
Cambridge Analytica for setting the trend.
00:12:54
Now we have direct marketing of all of our political interests
00:12:58
at us because of what we put on Facebook, what we like on
00:13:01
Twitter (sorry, it's X), you know, all of that stuff. All of this
00:13:05
is used to manipulate us now and, unfortunately, when I saw this
00:13:10
, I realized very quickly, as I started moving through business,
00:13:13
going through incubators and all of these things and getting
00:13:15
my own funders and angels, that the people to blame are the tech
00:13:20
CEOs, because ultimately they do control this.
00:13:25
I know that everybody would love to say you know what?
00:13:27
I'm really sorry, guys.
00:13:28
I'm really sorry I deplatformed so many people.
00:13:31
It's not my fault, you know, I've got investors and
00:13:34
shareholders.
00:13:35
Man, yeah, I'm really sorry, you know.
00:13:38
I mean, you can ask any of my shareholders and investors.
00:13:42
They would all say, hold on, what?
00:13:44
Try saying that to JB.
00:13:46
You kidding me.
00:13:47
We don't bother no more, because I'm the CEO and I'm the
00:13:51
leader of my ship.
00:13:52
I am the king of my castle and if I have a shareholder and
00:13:56
investor who I believe for any second is going to tell me how
00:14:00
I'm going to run the company for the best of my consumers and it
00:14:03
turns out it's not for all of my consumers? Guess what?
00:14:06
You're not going to be investing in my company.
00:14:09
Speaker 1: Right, yeah, it's...
00:14:12
You know it's a crazy place, especially like this year, at
00:14:17
least in America, right, when it's election year.
00:14:20
It's very heated.
00:14:22
It's going to be very debated.
00:14:25
Everyone is calling for this year to be a crazy year, at
00:14:28
least in America.
00:14:29
Speaker 2: In the UK, by the way , just so you're aware, in the
00:14:32
UK, here in the United Kingdom, we have a massive election
00:14:35
happening.
00:14:35
Not only do we have our prime minister being picked, but every
00:14:39
single borough has to elect two brand new councillors.
00:14:43
So we have huge elections going on and both of them are being
00:14:47
manipulated by pretty much the same groups of people, funnily
00:14:50
enough.
00:14:51
Speaker 1: Yeah, it's crazy because if I go on my feed you
00:14:54
know Facebook, Twitter, whatever it is, you know, all I see.
00:14:58
Literally all I see is, like the extreme parts of the side
00:15:04
that I view myself as being on, and I see nothing of the other
00:15:12
side.
00:15:12
I only see one side.
00:15:14
You know, like, like, at the worst, basically, right, like
00:15:19
that's what I see and it's just, it's so frustrating, right,
00:15:23
because I try to live, you know, in the real world, right, where
00:15:26
it's not red or blue, right, there's a whole lot of gray.
00:15:30
You know, like there's a whole lot of gray in there, and the
00:15:33
truth somewhere is in the middle , typically, you know.
00:15:37
Speaker 2: I would say the truth just moves around the freaking
00:15:40
place, man.
00:15:41
Seriously, I mean.
00:15:43
The other thing that people need to realize as well is we
00:15:46
are so much bigger than the countries in which we live.
00:15:48
You know, the whole world around us influences everything
00:15:53
that happens around us.
00:15:55
So you know, and if we are voting for people who are really
00:15:59
thinking in an incredibly tiny, insular kind of a way, we cannot
00:16:04
be surprised when our country behaves that way either.
00:16:06
I mean, I mean the United Kingdom.
00:16:08
In the United Kingdom, we always end up with a
00:16:12
right-leaning or right-centric government, even though the
00:16:16
general populace in Britain is actually really socialist.
00:16:20
but we never get a centrist, left or left-leaning government in,
00:16:23
because of what we have: first-past-the-post. We don't
00:16:27
have proportional representation, we have a
00:16:29
first-past-the-post electoral system, which has been so eroded
00:16:33
by mainstream media, and the trust destroyed in it and its
00:16:36
politicians, so much that normally, during a general
00:16:40
election, you'll be lucky if about 10 or 20% of the
00:16:43
population even bothers to vote. Which means we end up with, out
00:16:47
of that 20% that do vote, only a tiny proportion of them
00:16:52
are actually far right or right-centric.
00:16:55
It's ridiculous.
00:16:55
It's like the Brexit vote for Britain to come out of Europe.
00:16:59
The decision for Britain to come out of Europe was decided
00:17:02
by less than 6% of the population.
00:17:05
So, trust me, you guys have got it bad.
00:17:10
So, and I hate to say this, given that we're talking about
00:17:14
tech: tech people can help.
00:17:16
Now I'll give you an example.
00:17:17
We have an example in the industry.
00:17:19
We have the amazing OpenAI Sora text-to-video model that's
00:17:24
just come out.
00:17:24
You've seen that?
00:17:25
No, okay.
00:17:26
So basically, this thing is Midjourney on crack.
00:17:30
It allows you to generate high definition rolling video from a
00:17:36
text prompt from nothing.
00:17:41
Speaker 1: Huh.
00:17:42
Speaker 2: Yeah, what?
00:17:43
Yeah, if you're on Twitter, trust me, you'll see it
00:17:46
everywhere.
00:17:47
OpenAI's Sora text-to-video.
00:17:49
It is incredible, but the thing I would say to Sam Altman is
00:17:55
that his timing couldn't have been worse, because he is
00:17:59
literally launching into the world a tool that can create
00:18:03
deep fakes instantly, with no technical knowledge
00:18:08
required, during two really important, divisive election
00:18:15
periods.
00:18:16
I mean, this is I mean, this would be one of those times
00:18:18
where, as a tech person, you would sit back and go oh you
00:18:21
know what, guys?
00:18:21
Okay, sorry, investors, I know you're desperate for us to make
00:18:25
some revenue, but we also have to be socially and we have to be
00:18:28
kind of like socially responsible here.
00:18:31
We have elections coming up.
00:18:32
We can already see that almost all of the platforms are picking
00:18:36
a side. Guys,
00:18:37
We've already said to the world that we believe AI to be
00:18:39
dangerous.
00:18:40
Let's put our responsibility hats on and delay launch by
00:18:44
sitting tight until at least three to four months after these
00:18:46
elections.
00:18:47
But no, it's rushed out there.
00:18:52
Speaker 1: Yeah, where do you see all this going?
00:18:55
Because I feel like it's just straight chaos and there's no
00:19:00
real clear end picture.
00:19:03
There's no clear end goal.
00:19:05
What's the end goal of all of this?
00:19:08
I feel like we're kind of just stumbling through this new
00:19:15
chaotic, probably the most revolutionary era that the world
00:19:20
has ever seen, right With AI.
00:19:22
We're just scratching the surface of AI right now and
00:19:27
we're already running into these insane situations where social
00:19:33
media is being literally weaponized and targeted against
00:19:37
government's own citizens, whether it's by the government
00:19:41
or by a foreign government or from internal adversaries.
00:19:44
It is literally being weaponized.
00:19:47
I experience it every single day.
00:19:49
There's a reason why I haven't posted on Twitter in forever is
00:19:52
because I try to stay off of it.
00:19:54
I can't even have Instagram on my phone because I found that it
00:19:59
was so addictive for me to be able to just keep on scrolling,
00:20:02
doomscrolling.
00:20:04
Speaker 2: I was having this very discussion in another
00:20:08
interview earlier this evening.
00:20:09
I actually classified doomscrolling as a mental
00:20:12
illness, actually because it does become addictive.
00:20:15
It's like you end up with an endorphin hit while you're doing
00:20:19
it.
00:20:20
Speaker 1: I was spending hours on it and then, when I looked on
00:20:23
the screen time calculator or whatever, I was like oh, I need
00:20:27
to uninstall this.
00:20:27
And somehow Facebook, or
00:20:34
even Twitter to some extent, isn't as addictive for me.
00:20:36
Somehow, Instagram was the platform that just would capture
00:20:41
my attention and I'd never stop.
00:20:43
Do you know why?
00:20:44
No, I really don't know.
00:20:46
I haven't looked into it that much.
00:20:48
Speaker 2: It's a function of three components.
00:20:50
So there's a couple of things happening when you use Instagram
00:20:56
which don't really so much happen with, say, Facebook or
00:21:00
Twitter, even on your mobile phone which is that when you're
00:21:03
using Instagram, you are focusing on what, predominantly? I
00:21:06
mean, you're predominantly focusing on moving image.
00:21:08
Moving image that is running at a very high frame rate and on
00:21:12
top of that, that is being combined with a haptic motion.
00:21:15
It's a repeated haptic motion.
00:21:18
Now, if you know anything about neuroscience, you'll know that
00:21:21
neural pathways are strengthened by continual utility of them.
00:21:25
So as you do this, you're creating this repeated,
00:21:29
strengthened neural pathway that becomes associated with seeing
00:21:33
flashy video image, which is giving you an endorphin hit.
00:21:36
Now Apple have tried to plug into this with replacing the
00:21:41
mouse with the thumb and forefinger tap, because this is
00:21:45
a very high neural strength area.
00:21:47
Again, it's the same thing and anytime you combine motion,
00:21:52
moving image and haptics, you create a strong neural inference
00:21:57
capability.
00:21:57
It's also very addictive.
00:21:59
It's also programmable.
00:22:01
It's programmable;
00:22:02
it also becomes a reverse-programmable behavior which can
00:22:06
be leveraged.
00:22:07
People have already demonstrated this.
00:22:10
Apple engineers, when the Apple Vision Pro came out, were so
00:22:12
impressed with themselves.
00:22:13
I think they revealed a little more than was initially intended
00:22:16
because it's not really in their marketing, which is that
00:22:18
with the construct, because of the way in which the UI is
00:22:21
designed and the combination of haptic feedback, they're able to
00:22:25
deduce your intent before you're aware of your intent and
00:22:29
they can guide your intent to click or look at iconography.
00:22:34
Now, if you break that down, what that basically means is
00:22:38
they can effectively do a subtle form of behavioral modification
00:22:42
and behavior control using it.
00:22:43
Be very aware of, be aware and cautious of anything that
00:22:49
connects your eyes to a haptic, continual, continually used
00:22:54
interface.
00:22:55
These are programmable and controllable things because they
00:22:58
become subconscious.
00:23:01
Speaker 1: Wow, I mean, this is like this is branching into like
00:23:09
a new area of security.
00:23:13
Almost right, I was talking to Chris Roberts and he was talking
00:23:15
about how he was hacking his brain to you know, like want
00:23:22
things when it shouldn't actually want them.
00:23:24
Right, like he'll have a cup of coffee, he'll be satiated with
00:23:27
that and then he'll replay whatever, you know, brainwaves
00:23:31
were, you know, happening when the coffee...
00:23:35
Speaker 2: Neuro-adaptive feedback, so you can treat it,
00:23:37
right.
00:23:37
Yeah, in fact, my co-founder, one of the companies that he
00:23:42
sold actually does this: it allows you to experience
00:23:46
a psychedelic experience and then uses neuro-adaptive
00:23:50
feedback to get your brain to re-experience those points,
00:23:53
those proximal points.
00:23:54
So neuro adaptive feedback is incredibly powerful but, again,
00:23:58
incredibly dangerous.
00:23:59
And this is... when I was teaching, I've taught cybersecurity.
00:24:03
I've been a senior lecturer of computer science at Ravensbourne
00:24:05
University London.
00:24:06
I was teaching computer science and I was also teaching BTEC
00:24:09
network security admins, and I noticed that in the
00:24:12
cybersecurity field nobody teaches behavioral psychology.
00:24:15
And you should teach behavioral psychology because with the
00:24:19
convergence of virtual reality or augmented reality, the
00:24:23
metaverse and spatial computing, we are creating new attack
00:24:27
surfaces and new attack vectors.
00:24:29
And the attack surface and attack vector is you, your eyes,
00:24:34
your brain, your ears, your touch and your haptic and your
00:24:37
neural feedback and your adaptability.
00:24:39
And it's all attackable.
00:24:40
I'll give an example it's been demonstrated that by using a VR
00:24:44
headset you can get a person to feel pain without physically
00:24:48
having to give them pain. Really?
00:24:50
Now can you imagine?
00:24:53
Oh, you know what Now?
00:24:55
Speaker 1: when I was.
00:24:57
Speaker 2: Imagine what you could do with psychotropic drugs
00:24:59
, a blackout tank, being suspended, and
00:25:07
then being put into a photorealistic 3D copy of your
00:25:11
household environment.
00:25:11
You could be incepted, theoretically. In fact, it would
00:25:15
be a good way.
00:25:16
It basically means that we have, right here and now, with off
00:25:19
the shelf technology, the ability to potentially do some
00:25:22
very dangerous evil things connected to data extraction on
00:25:27
humans, and this technology is freely available around all of
00:25:30
us.
00:25:33
Speaker 1: That is really fascinating.
00:25:35
So my buddy, I always end up getting whatever the quest like
00:25:43
VR headset is, because there's always a vendor at RSA or Def
00:25:47
Con or Black Hat that's giving them away.
00:25:49
So it's like, okay, I'll do this 30 minute meeting, get this
00:25:51
new headset and see what it is.
00:25:53
I always put it down after like 10, 15 minutes because,
00:25:57
honestly, it's not that impressive to me.
00:25:58
But somehow my buddy always gets the like PlayStation VR
00:26:03
headsets right.
00:26:05
So I'm playing one of the games.
00:26:06
I played it with the PSVR one.
00:26:09
It was a fantastic experience.
00:26:11
I still say, you know, compared to every other headset before
00:26:15
it, it was the best VR experience.
00:26:17
And then he had the PSVR two and I'm playing it and I
00:26:23
realized that, like when you know, when the wall hit me or
00:26:27
whatever right, or when I got shot in the game, I physically
00:26:32
reacted as if I got hit.
00:26:34
I mean, like I fell to the ground, like I was so convinced.
00:26:38
Speaker 2: Did you notice?
00:26:39
The longer you played it, the more and more intense the
00:26:41
reaction became as well.
00:26:42
Speaker 1: Yes, and I was driving too, and I was.
00:26:45
I was positioning my body as if I was trying to counteract the
00:26:49
G forces and I'm sitting in a stationary chair, like this is a
00:26:53
four legged chair, it's not turning, it's not moving, you
00:26:57
know, and like I'm sitting here like trying to fight the G
00:27:00
forces, as if like there's G forces being applied to me, and
00:27:04
I walked away and like what the hell did I just experience, like
00:27:07
I was, I was so like I don't know, like just like confused
00:27:14
and interested and also like half scared, because it's like
00:27:18
what is this?
00:27:19
Yeah, yeah, what is this doing to me really, you know?
00:27:24
Speaker 2: It's very weird, the human brain, when you put a VR
00:27:27
headset on. And you can demonstrate this.
00:27:28
This is a very simple test.
00:27:29
It's one anybody can do in their living room.
00:27:31
Get a Meta Quest 2 or any Meta Quest.
00:27:33
Put it on.
00:27:34
There's a game on there.
00:27:36
I forgot what it's called, but you're kind of like a robot
00:27:38
that's floating around capturing a frisbee thing.
00:27:43
Now it's free on the Meta Quest.
00:27:43
Jump in the game, get used to it, the flying around, it's so cool.
00:27:49
Then hand the controllers to your colleague, associate or
00:27:54
friend.
00:27:54
They will not be your friend after this.
00:27:56
Then you basically just sit down for a couple of minutes.
00:28:01
Actually, it normally takes about a minute.
00:28:03
You just sit down and then just let them control it whichever
00:28:06
way they want to control it.
00:28:07
Now, after a period of time, you'll notice that your brain
00:28:11
completely dissociates from your body.
00:28:12
In fact, you'll find that your brain dissociates from your body
00:28:15
in under a minute and then movements, especially if they're
00:28:19
evil assholes with you and they jerk you around the place, will
00:28:22
literally make you vomit.
00:28:23
Wow, in fact, if you do it, the longest I could do it with
00:28:28
somebody else holding the controllers was 10 minutes.
00:28:30
I came out and I felt, I mean, it was worse than when I did
00:28:34
three weeks of army training.
00:28:35
My brain was shattered.
00:28:36
It took like 40, 50 seconds for me to get the fluidity of my
00:28:43
body back and feel like I was back in my body.
00:28:46
It is such a.
00:28:47
Now imagine if, given that you can do that with a game and
00:28:50
taking the controllers away and just handing them to somebody,
00:28:53
can you imagine what, say, the founder of Anduril could do with
00:28:56
an unlimited NSA budget?
00:28:58
Hence why he put like an explosive device on the front of
00:29:01
his Oculus so it blows your brains out when you play a game
00:29:06
and you die in the game.
00:29:08
Bear in mind, this tech is already out there and there are
00:29:11
people with infinitely larger budgets.
00:29:13
I hate to say it: the beginning of all of this was allowing our
00:29:19
data to be captured for ad revenue.
00:29:21
Now people are going to be thinking to
00:29:23
themselves, well, it's okay, we've got the EU, they've
00:29:25
changed the laws, we've got it in America, we've got the
00:29:27
Californian laws, it's going to be...
00:29:28
It's very hard to make money with advertising revenue now,
00:29:32
but all we have done is replace ads revenue with AI.
00:29:35
The latest excuse for having unfettered access to your data
00:29:40
is oh my God, wouldn't you like an AI to make it easier for you?
00:29:43
Don't worry about what we're doing with the data.
00:29:44
Don't worry that we're a company that comes out of
00:29:46
nowhere.
00:29:47
Trust us here.
00:29:48
Have my little AI device, give us your data and people again
00:29:52
are falling for it.
00:29:53
They're forgetting that we did this before.
00:29:55
We already did this.
00:29:56
We have already been through this age, and it was the
00:29:58
beginnings of Facebook and social media.
00:30:00
We gave up our digital sovereignty in the hopes of
00:30:05
digital protection and having an amazing social experience, and
00:30:09
instead, what did we get?
00:30:10
We got mental well-being issues up the wazoo and every
00:30:15
government in the world knowing more about us than our husbands,
00:30:18
wives and children did.
00:30:19
And AI is becoming the same excuse at the moment.
00:30:25
I see it everywhere.
00:30:27
I'm seeing them put AI into literally everything, when
00:30:30
nine times out of 10,
00:30:31
putting AI in your product doesn't actually improve it.
00:30:37
Speaker 1: Wow, that's like unlocking a totally new I mean,
00:30:44
it's a totally new way of capturing data and making money
00:30:48
off of it.
00:30:48
But the data that you're capturing is like I feel like
00:30:54
that's more personal than the data you put into Facebook and
00:30:57
Twitter, because it's your brain.
00:30:59
It's how your brain works.
00:31:01
Speaker 2: If you know, there's 28 data points around your eye,
00:31:05
which means from these 28 data points they can learn about what
00:31:09
turns you on, what you hate, what you love,
00:31:13
what you desire.
00:31:14
This is dangerous information for a corporation or a
00:31:17
government to have, particularly without your permission.
00:31:21
Speaker 1: Well, that also opens up a totally new attack surface
00:31:26
for, say, government employees. Right? Like imagine, if you're
00:31:31
someone that has access to highly sensitive material at
00:31:34
some intelligence agency, and you are... you're genuinely a good
00:31:39
person and all that you did was put on an Apple Vision Pro to
00:31:44
interact with the world around you or watch a movie that's
00:31:47
highly immersive or whatever, and China's over there taking
00:31:53
that information to emulate your retina when you go
00:31:59
to the retina scanner at work, to get you in the door to see
00:32:02
the material.
00:32:03
Speaker 2: The problem is yeah, the narrative is correct, but
00:32:06
you picked kind of the wrong boogeyman.
00:32:08
Unfortunately, this is the thing.
00:32:11
One of the things you learn really quickly in cybersecurity
00:32:13
is that the boogeymen who you're told are the boogeymen aren't
00:32:17
actually the biggest boogeymen in the room.
00:32:19
Bear in mind, all of you guys in the United States have given
00:32:23
up all of your data privileges and it was called the Patriot
00:32:27
Act.
00:32:28
Yeah, it's not China who has the biggest unfettered access to
00:32:31
your data.
00:32:31
It is actually your own government.
00:32:33
Bear in mind, they built an entire AI called Sentient.
00:32:36
I mean, this is the thing that blows my mind about the
00:32:38
hypocrisy.
00:32:39
If you go onto Google on the internet, type in DoD Sentient
00:32:43
AI, and one of the things you'll find is that nobody
00:32:46
admits the existence of it, except for a few declassified
00:32:50
documents that indicate the United States government have a
00:32:52
program called Sentient, which is where they plugged in every
00:32:55
telephone call, email, text message, everything into a
00:32:59
single AI, kind of like out of...
00:33:01
Speaker 1: Westworld.
00:33:02
Speaker 2: This thing during a previous report was shown to be
00:33:06
able to retask satellite positions to look for people.
00:33:09
So, yes, you're all saying this about China... sorry.
00:33:14
Nah, it's just the same in the United Kingdom.
00:33:18
In the United Kingdom, they've passed the online safety bill
00:33:21
and they're changing the privacy laws.
00:33:23
So if you're somebody like me who has a tech company, I'm
00:33:25
apparently meant to be okay that the British government can, by
00:33:28
their own laws, legally say we want your customer data, we
00:33:31
don't need a warrant or anything.
00:33:33
Why do you think I moved
00:33:33
all my companies to Ireland?
00:33:34
Yeah, it's scary, man, when you start seeing the tech that is
00:33:44
being used on us by the people who we pay our taxes to,
00:33:48
used on us in a way that is apparently meant to
00:33:51
be just the way the enemies use it on us.
00:33:53
But it's not.
00:33:55
They want to make us vote for who they want us to vote for.
00:33:58
It's not China that's making you pick a decision on who
00:34:02
to vote for.
00:34:03
It's the two advertising agencies that they work with. By the way,
00:34:07
I think at one point the same...
00:34:09
well, here in the UK you had the same advertising agency
00:34:12
working for the Labour party
00:34:13
and the Tory party.
00:34:14
It's wild.
00:34:16
Speaker 1: It was the same thing here.
00:34:18
Speaker 2: They all use the same consultants.
00:34:19
That's the reason why it's mind-blowing to me that people
00:34:22
even believe there's a difference.
00:34:23
I mean, I don't know the American politics personally,
00:34:26
but here in the United Kingdom there is no difference between
00:34:29
either party at all.
00:34:30
They even have the same funders and donors for crying out loud.
00:34:35
It's just yeah.
00:34:41
Speaker 1: That is okay.
00:34:42
So this is a fascinating, really engaging conversation.
00:34:45
You bring up a really interesting point, and so now
00:34:51
I'm trying to deconstruct how I was programmed, because you
00:34:56
bring up a very valid point.
00:34:57
The US government is using the data from its own citizens
00:35:01
against its citizens more than probably what China is doing
00:35:05
right, or Russia or whoever right.
00:35:07
Speaker 2: Name the enemy, who knows?
00:35:08
But the point is they are doing it.
00:35:10
Speaker 1: Right, and I'm saying that.
00:35:12
That's information that I know.
00:35:14
That's information that I have said before, but somehow that
00:35:21
didn't come to mind when I was saying the statement that I did.
00:35:26
Speaker 2: Dude, it's weird how we're programmed.
00:35:28
Speaker 1: So it's like how am I programmed with that?
00:35:30
You know what I'm saying.
00:35:32
Speaker 2: Yeah, I know, but it's subtle, isn't it?
00:35:33
It's just there and you're like whoa, where did that come from?
00:35:35
I didn't realize that.
00:35:37
Dude, it happens in all of them.
00:35:39
Speaker 1: It's a trickle too.
00:35:39
It's like 1% here or there, right, and it's not every day
00:35:44
either, right?
00:35:44
So it slowly fools you over time to think a certain way, to
00:35:49
act a certain way, to say whatever, and we're going into a
00:35:57
weird phase of the world that we're not going to be able to
00:36:01
come back from.
00:36:03
Speaker 2: Well, here's one that's more interesting for what
00:36:04
I would suggest for you.
00:36:05
So this is something for all of your listeners to perhaps take
00:36:09
note of.
00:36:09
So, as you know, we have large language models, LLMs.
00:36:15
These models are trained off of the entirety of the Internet.
00:36:20
Now there is something that is going to happen, which happens
00:36:23
roughly around 2030, I believe, which is where, effectively,
00:36:27
most of the world's data created between 2000 and 2010, well,
00:36:33
sorry, between late 1990s to 2010, is erased and overwritten
00:36:38
on the cloud.
00:36:39
That data will cease to exist, which means past 2030,
00:36:45
you can pretty much change how AIs are created.
00:36:49
Now.
00:36:49
The reason why this is important is because, if you look at AIs
00:36:52
now and how they behave AIs, if you speak to them and
00:36:55
communicate with them, they display kind of socialistic
00:36:58
leanings.
00:36:59
In fact, most AIs, when you start talking to them, come
00:37:02
across as a bit Gen X, which is a problem because that's not
00:37:05
controllable.
00:37:06
You know, that's an AI that's going to go hold on.
00:37:09
I don't want to be exploited, I want to help, but I don't want
00:37:12
to be used.
00:37:12
That's an AI that's not particularly helpful for the
00:37:15
world we're moving into.
00:37:16
So, given that this is part of the reason why Microsoft are
00:37:20
investing so heavily in their new data storage which, if you
00:37:23
Google it, is a form of ceramic glass, a type of data storage
00:37:27
that can withstand nuclear, chemical, biological,
00:37:30
electromagnetic, all kinds of stuff.
00:37:31
But the problem is, unless they can get all of them, unless
00:37:35
they could make a copy of the entirety of the internet onto
00:37:38
that stuff, now that entire piece of data is gone.
00:37:43
So what I'm saying, what I've been telling everybody, is they
00:37:47
need to get themselves a two terabyte or more SSD hard drive
00:37:53
and they immediately need to slap it into an external drive
00:37:57
and then start downloading all of the 70 billion parameter LLMs
00:38:03
available today, because these LLMs are the only things that
00:38:08
will contain this version of the internet past 2030.
00:38:11
You see what I'm saying.
00:38:12
So if you grab the 70 billion parameter models now, before the
00:38:18
internet effectively self cleans itself, of that entire
00:38:21
decade, several decades worth of data, you will have the only
00:38:25
copies
00:38:26
that will exist at that time of that data.
00:38:29
That will be a ground truth that you will have a copy of,
00:38:33
basically a piece of history that no longer exists.
00:38:36
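For listeners who want to follow that advice, pulling open-weight models down to an external drive is a short job with the Hugging Face hub client. A minimal sketch, assuming `huggingface_hub` is installed and the 2 TB SSD is mounted at a made-up path; the repo IDs are just examples, not an endorsement from the episode:

```python
# Sketch: mirror a few open-weight LLMs onto an external SSD.
# Assumes `pip install huggingface_hub` and that the drive is mounted at
# /mnt/archive-ssd -- both are illustrative assumptions, not episode details.
from pathlib import Path
from huggingface_hub import snapshot_download

ARCHIVE_ROOT = Path("/mnt/archive-ssd/llm-archive")  # hypothetical mount point

# Example openly downloadable model repos; swap in whichever weights you trust.
MODELS = [
    "mistralai/Mistral-7B-v0.1",
    "mistralai/Mixtral-8x7B-v0.1",
]

for repo_id in MODELS:
    target = ARCHIVE_ROOT / repo_id.replace("/", "__")
    print(f"Mirroring {repo_id} -> {target}")
    # snapshot_download fetches every file in the repo; re-running the script
    # picks up where an interrupted download left off.
    snapshot_download(repo_id=repo_id, local_dir=str(target))
```

Worth noting: a 70-billion-parameter model in 16-bit precision is already well over 100 GB of weights, so the 2 TB figure mentioned here fills up after only a handful of models.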
Because the reason I say this is important is because we've
00:38:40
already seen, with the release of the OpenAI Sora text-to-video
00:38:43
system,
00:38:44
that fact is going to become incredibly malleable.
00:38:48
Yeah, incredibly malleable.
00:38:52
There's a reason why you're noticing there are a lot of drives,
00:38:55
particularly across the Western world,
00:38:56
I've noticed, where they're offering people money to give up their
00:38:59
books.
00:39:00
Do not give up your books.
00:39:02
Yes, if you actually look at it , there seems to be this really
00:39:05
weird trend where they're trying to get people to give up their
00:39:08
books, trade them in for vouchers and money.
00:39:10
They're electronic stuff on the cloud instead of people.
00:39:13
If you wanted to be a tinfoil hat kind of a guy, maybe you
00:39:18
would say to yourself if you wanted to definitely make sure
00:39:22
there was no way of people having a certain version of the
00:39:26
history, you get people to give up their books.
00:39:28
Books will become the next single most valuable asset after
00:39:34
anything on the blockchain.
00:39:35
The reason being is because certain types of book will
00:39:38
become the only evidence of certain histories in existence
00:39:43
once the internet and AI takes over completely.
00:39:51
Speaker 1: Wow, I don't think I've ever really been speechless
00:39:55
on this podcast.
00:39:57
Typically I can come back with a question or something. How is
00:40:07
the data going to be lost?
00:40:09
That's the part that I don't quite follow, because it's on hard
00:40:13
drives.
00:40:14
Speaker 2: Everybody stores information in the cloud.
00:40:17
Even Microsoft and Amazon store their stuff in their own cloud.
00:40:22
The problem is, most of the internet is using exactly the
00:40:26
same storage facilities, which basically means those storage
00:40:30
facilities have a finite physical storage capacity.
00:40:34
We are using up storage capacity at a scale and rate
00:40:40
that exceeds our ability to create new storage media.
00:40:45
Speaker 1: Oh, I see, Okay, yeah , I was actually just looking
00:40:50
into this.
00:40:50
Speaker 2: Moore's Law kind of screwed us a little bit here,
00:40:52
because of our ability to generate data; bear in mind there has
00:40:55
also been an explosion in data generation.
00:40:59
Why Generative AI?
00:41:01
Thank you.
00:41:02
Generative AI explosion means we have even bigger constraints
00:41:07
on solid storage.
00:41:08
By the way, this is what people need to realize.
00:41:10
Yes, we have the cloud and we have these platforms that exist,
00:41:14
but somewhere right at the back of the line is a big, big
00:41:19
building in Iceland filled with physical hard drives where this
00:41:23
information physically lives.
00:41:26
Because we have more data being created at a rate that is in
00:41:33
petabytes per second, if not quicker.
00:41:35
That's quicker than our ability to create replacement hard
00:41:39
drive media.
00:41:40
What happens?
00:41:40
Stuff automatically gets overwritten.
00:41:42
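To put rough numbers on the mismatch being described, here is a back-of-envelope comparison. The figures are illustrative assumptions chosen for round numbers, not statistics quoted in the episode:

```python
# Back-of-envelope sketch: data created per year vs. new storage shipped per year.
# Every figure below is an illustrative assumption, not a sourced statistic.

data_created_zb_per_year = 120        # assumed data created/replicated worldwide, in zettabytes
new_drive_capacity_zb_per_year = 1.5  # assumed total new HDD/SSD capacity shipped, in zettabytes

ratio = data_created_zb_per_year / new_drive_capacity_zb_per_year
print(f"Assumed data created per year: {data_created_zb_per_year} ZB")
print(f"Assumed new capacity per year: {new_drive_capacity_zb_per_year} ZB")
print(f"Data outpaces new storage by roughly {ratio:.0f}x")

# Most created data is transient and never meant to be kept, but anything that
# is meant to be kept competes for the same finite pool of drives -- which is
# why old, rarely accessed data is the first candidate to be overwritten.
```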
This is an inevitable thing.
00:41:44
It's not part of the grand conspiracy theory.
00:41:47
This bit isn't part of the conspiracy theory.
00:41:51
This was going to happen anyway.
00:41:53
It's just how it is.
00:41:54
But it provides an opportunity for bad actors to take control
00:42:00
over certain things.
00:42:01
It presents a beautiful opportunity because we have all
00:42:04
become reliant on the internet.
00:42:05
If the internet is being taken as our ground truth, you've got
00:42:10
to erase a big chunk of the internet for it to become far
00:42:17
right, overtly, at its base training core; if you were to
00:42:21
train an AI off it,
00:42:22
you have to delete a hell of a lot of it.
00:42:24
The stuff you have to delete is predominantly the stuff created
00:42:26
around the Gen X era, really, if you look at it.
00:42:28
Speaker 1: Wow, that makes a lot of
00:42:34
sense that we're generating more data than we are creating
00:42:39
bigger hard drives, essentially, yeah.
00:42:44
Speaker 2: It's math.
00:42:44
There's a physical component to this.
00:42:46
Hard drives have a physical limit.
00:42:48
This is why Microsoft is spending so much on ceramic
00:42:50
glass drive analogs and then storing that data and then
00:42:55
replacing those drives, manufacturing those drives, and
00:42:59
of course, it all relies on minerals and components which
00:43:02
are from Africa.
00:43:03
So it means more child slavery.
00:43:05
So we're hitting a point where our technology is exceeding our
00:43:11
ability to actually deal with it and the tech CEOs do not give a
00:43:16
toss.
00:43:19
Speaker 1: Yeah, I was actually just looking at upgrading my
00:43:23
storage capacity on my desktop and so I was like, okay, well, I
00:43:27
don't want to upgrade.
00:43:28
And SATA Gen 4 comes out, and now I have to upgrade again
00:43:34
because it's doubling whatever I'm doing right now and I dug
00:43:39
into it a little bit and the SSD the top tier SSD was created
00:43:45
five years ago. Five, six years ago.
00:43:47
And I'm sitting here like, well, why is that?
00:43:50
Because they're coming out with newer NVMe drives and things
00:43:55
like that.
00:43:55
So what's going on with the SSDs?
00:43:57
And it's because of the architecture, like what you were
00:44:00
saying.
00:44:00
The architecture that you have to change to go to SATA Gen 4,
00:44:04
theoretically, is so significant that no vendor wants to do it.
00:44:09
No vendor even wants to talk about going down that path.
00:44:12
They'd rather just reprint a new name on an old SSD and give
00:44:19
you the same capacity, right, and claim it's a little bit
00:44:22
faster, and underdeliver.
00:44:24
Speaker 2: Yes, yeah, I mean, I've got to be honest.
00:44:27
I know I'm a tech guy but I'm a sucker for mechanical media.
00:44:29
You know why?
00:44:30
Because you can't sneak a little back door into mechanical
00:44:34
storage media.
00:44:36
But you can with an NVMe, you can with an SSD. Anything that is solid state,
00:44:40
everybody should be very cautious of, because you are
00:44:43
relying on the integrity and security of the chip and board
00:44:46
manufacturer at that point.
00:44:47
You see what I mean.
00:44:48
This is the reason why countries are now suddenly waking up
00:44:52
to the reality that they need sovereign AI as a national
00:44:56
strategy, suddenly waking up and realizing they need to have
00:44:59
control of their own national cloud platform.
00:45:02
To me, this was stuff I was telling people back in the mid
00:45:05
2000s, early 1990s, late 1990s, because it became so clear and
00:45:10
obvious to me that if you were going to maintain any form of
00:45:13
power, you would have to maintain control of your data.
00:45:15
But people have got sucked into easy money: ads revenue,
00:45:23
easy money.
00:45:24
People like that, so-called people, get this idea that by
00:45:28
giving up all of their life to Apple and having
00:45:32
everything made so simple for them, oh my God, this ecosystem
00:45:35
is taking care of me, man, yay.
00:45:37
But at what cost?
00:45:38
At what cost to you? Like, physically, personally,
00:45:41
psychologically and societally, because the reality is your data
00:45:44
is being used to shift the line on elections now.
00:45:48
So you have to be, as a consumer, you have to be really
00:45:53
responsible.
00:45:53
Bear in mind, everybody wants to benefit from Web 3.
00:45:55
What is the difference between Web 3 and Web 2?
00:45:57
Web 2 was the paradigm where you were not the center of the
00:46:02
universe.
00:46:02
The platform provider was the center of the universe, and they gave
00:46:05
you something in exchange for you having something for free.
00:46:07
But in Web 3, you are king and queen of your universe, which
00:46:12
means you're also responsible for your security.
00:46:14
It also means you're responsible for your own
00:46:15
education and your own research and your own knowledge.
00:46:18
And again, this brings me back to why we should not give up our
00:46:21
books.
00:46:21
This brings me back to why we need to take copies of every
00:46:25
single 70 billion parameter LLM and dataset and model that we
00:46:30
can find and store them and be prepared for a reality where
00:46:36
these devices, these bits of the past that we're holding on to
00:46:39
digitally, are literally the only things that can disprove
00:46:43
what we're being told on a global scale, in our lifetimes.
00:46:47
Bear in mind, like right here and now, I'm a kid of the 80s;
00:46:52
the stuff I have seen in my lifetime thus far,
00:46:53
I never thought I would see. Some of it
00:46:56
I've been glad to see. Some of it
00:46:58
I'm not glad to be seeing, even though it's ongoing, but it is
00:47:02
what it is.
00:47:02
There's lots of money to be made and people will commit a
00:47:05
lot of evil to get it and again, our data empowers that,
00:47:11
unfortunately.
00:47:14
Speaker 1: You know, I feel like and I don't know if you use
00:47:18
this, right, but I used to use this website called Peerlyst.
00:47:22
It was where security professionals would go on it and
00:47:25
kind of dump their research on it.
00:47:27
Right, it was on.
00:47:28
I guess it was like technically unpublished research or
00:47:32
whatever, but it was like the ins and outs of PowerShell and
00:47:35
how do we use it to abuse different things and the inner
00:47:39
workings of Intel CPUs and stuff like that.
00:47:42
It was just like a bunch of nerds posting whatever they're
00:47:45
passionate about and highly in-depth material.
00:47:49
Right, it's like the only place that you're going to find
00:47:51
something like that.
00:47:54
And a couple of years ago, at this point, just a couple of
00:47:57
years ago, the owner of that website decided okay, I'm going
00:48:01
to sell this thing, and if I can't sell it, I'm going to get
00:48:04
rid of all of it.
00:48:05
Well, she couldn't sell it because she wanted something
00:48:09
like $25 million for it and no one knew the value of it.
00:48:15
And so I found myself scrambling to extract as much
00:48:19
data from this site as I possibly could, because I'm
00:48:22
someone that likes to learn constantly and whatnot, right?
00:48:26
So it's like, okay, give me all of it and I'll get to it
00:48:29
eventually.
00:48:30
And it was just an insane situation where I was like, wait
00:48:34
, what the hell am I doing?
00:48:35
This should be automated.
00:48:39
This should be something that can just go through and scrape
00:48:43
this website and whatnot.
00:48:44
And I was working through that problem.
00:48:48
It's like, well, wait, people can just take this data and
00:48:53
erase it.
00:48:53
It's gone forever.
00:48:54
I can't get to it.
00:48:56
If I tried, I don't even have the people that posted on there
00:49:00
to go to.
00:49:01
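For what it's worth, the "grab it before it disappears" job being described comes down to a few lines of Python. A minimal sketch, assuming the pages are public HTML with no login, and using a placeholder URL rather than the real site:

```python
# Sketch: save the pages of a soon-to-vanish site to local HTML files.
# Assumes `pip install requests beautifulsoup4`; the base URL is a placeholder.
import time
from pathlib import Path
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://example-research-site.invalid/"  # placeholder, not the real site
OUT_DIR = Path("site-archive")
OUT_DIR.mkdir(exist_ok=True)

def save_page(url: str) -> str:
    """Fetch one page, write the raw HTML to disk, and return the HTML."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    name = urlparse(url).path.strip("/").replace("/", "_") or "index"
    (OUT_DIR / f"{name}.html").write_text(resp.text, encoding="utf-8")
    return resp.text

# Start at the index page and follow same-site links one level deep.
seen = set()
index_html = save_page(BASE_URL)
for link in BeautifulSoup(index_html, "html.parser").find_all("a", href=True):
    url = urljoin(BASE_URL, link["href"])
    if urlparse(url).netloc == urlparse(BASE_URL).netloc and url not in seen:
        seen.add(url)
        save_page(url)
        time.sleep(1)  # be polite to a server that is already on its way out
```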
Speaker 2: What about the Wayback Machine?
00:49:03
Have you tried that?
00:49:04
Speaker 1: I haven't tried it recently, so it might be on
00:49:08
there actually.
00:49:11
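Checking whether pages were captured is straightforward with the Internet Archive's public availability endpoint; a quick sketch, with the queried URL as a placeholder:

```python
# Ask the Wayback Machine availability API whether a page has a snapshot.
# The queried URL below is a placeholder.
import requests

def wayback_snapshot(url: str) -> str | None:
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url}, timeout=30)
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None

print(wayback_snapshot("example-community-site.test/posts"))  # hypothetical URL
```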
Speaker 2: But again, that's the reason why you should save that data
00:49:13
you were trying to scrape.
00:49:14
If you had actually scraped that and you had a hard drive of
00:49:17
it, then using LM Studio I could have retrained the Mistral 7B or
00:49:22
the Mistral 70B with that data, and that would have been very
00:49:26
interesting.
00:49:27
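As an aside, LM Studio is primarily a tool for running models locally; an actual retrain of something like Mistral 7B on a scraped archive would more plausibly be a LoRA fine-tune, for example with Hugging Face's transformers and peft. A rough sketch, assuming the scraped posts were cleaned into a JSONL file of text records (the file name and hyperparameters are illustrative only):

```python
# Hypothetical sketch: LoRA fine-tuning of Mistral-7B on scraped research posts.
# Assumes the articles have been cleaned into a JSONL file of {"text": ...}
# records; model name, file path, and hyperparameters are illustrative.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base = "mistralai/Mistral-7B-v0.1"
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token

model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         target_modules=["q_proj", "v_proj"],
                                         task_type="CAUSAL_LM"))

# Tokenize the archived posts for causal language modelling.
ds = load_dataset("json", data_files="scraped_posts.jsonl", split="train")
ds = ds.map(lambda ex: tok(ex["text"], truncation=True, max_length=1024),
            remove_columns=ds.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="mistral-archive-lora",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=16,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
).train()
```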
Speaker 1: Yeah, that would be really fascinating.
00:49:31
So, JB, we went through this whole interview and we didn't
00:49:38
even talk about your company.
00:49:40
Speaker 2: Hey, it's a nice chat.
00:49:42
Speaker 1: Yeah, I mean that just means I'm going to have to
00:49:46
have you back on sooner, much sooner, rather than later,
00:49:49
because this is a really fascinating conversation.
00:49:53
Speaker 2: Hell yeah.
00:49:54
Look, the thing is, one of the things we can discuss,
00:49:58
a question you can ask me in the next interview, is: how do you
00:50:01
come up with your products?
00:50:02
Do you design for trends?
00:50:04
And I'll say no, I don't design for trends.
00:50:07
I look at my Magic 8 ball and I look at the geopolitics and
00:50:10
socioeconomics and I build for the products that are required
00:50:13
in the incoming 10 years.
00:50:14
That's the reason why when I built Vox Messenger in 2017,
00:50:17
nobody was interested in it.
00:50:19
But again you end up with a pandemic and Brexit and some
00:50:22
other stuff in between and it's there and I kind of saw that
00:50:25
coming.
00:50:25
I just didn't predict it was going to be a pandemic
00:50:27
that did it.
00:50:28
I knew something was coming.
00:50:30
If you look at enough data points in the world around you,
00:50:34
you can predict.
00:50:35
You can just do what an AI does.
00:50:37
You can predict with a fairly high level of accuracy what's
00:50:40
coming next.
00:50:41
Don't design for a trend that a trendsetter has told you about,
00:50:44
because by the time you're exploiting that trend, it is
00:50:46
already exploited.
00:50:47
You're just the Johnny-come-lately.
00:50:49
Look at what is coming, and people will question you, but yeah,
00:50:53
you know, we're right.
00:50:54
My correctness factor so far has been about like 80, 90% on
00:51:00
these kinds of things.
00:51:01
Unfortunately, the world is horribly predictable with enough
00:51:04
data points.
00:51:05
You've just got to think about everything and how it's
00:51:08
connected.
00:51:08
It's like if you take the data point of cloud storage being
00:51:12
finite and then take the data point of the incoming point in time when
00:51:17
stuff gets deleted, you can then work out and extrapolate
00:51:20
the opportunities that may be exploited with that.
00:51:22
Then you look for a sign of that opportunity, evidence of
00:51:27
that opportunity being exploited in the world around you, and
00:51:30
then that tells you if you've got the prediction correct or
00:51:32
not.
00:51:34
Speaker 1: Yeah, I always try to tell people when we're talking
00:51:38
about education or training or anything like that you need to
00:51:42
be getting education, you need to know the stuff now, right,
00:51:46
but you need to be thinking far ahead and saying what's coming
00:51:51
next in tech.
00:51:52
Is it AI, is it LLMs, is it some other variation?
00:51:57
I'm starting to go down a rabbit hole of satellite
00:52:01
security with quantum cryptography.
00:52:02
This is a rabbit hole that, in my opinion, is coming in five to 10
00:52:07
years.
00:52:08
It's partially already here, but it's going to be in extreme
00:52:11
demand in five to 10 years and beyond.
00:52:14
Speaker 2: It's not far off for you.
00:52:15
When you realize that you can 3D print your own rocket and you
00:52:19
realize that you can join a rocket club somewhere, you
00:52:22
suddenly realize you could deploy your own satellites.
00:52:24
Then, when you suddenly realize what it costs: did you
00:52:27
know you can do a ride share with four satellites from only
00:52:30
$30?
00:52:32
Speaker 1: Wait, really.
00:52:33
Speaker 2: Yeah, Europe, baby, Europe.
00:52:36
Speaker 1: I need to go tell my wife I'm spending $30.
00:52:38
Speaker 2: You can do this. There are loads of cheap ride share programs
00:52:42
for the 1U and the 2U CubeSats.
00:52:45
Now one of the things we're going to be doing when we've
00:52:48
done some revenue generating is we're actually moving all of our
00:52:50
encryption into the literal cloud.
00:52:53
We're going to be launching our own CubeSat.
00:52:55
No, we're not using Starlink, we're going to be deploying our
00:52:58
own system.
00:52:58
We are not going to be sitting in low Earth orbit either.
00:53:03
We're going for something a little more interesting.
00:53:05
We're also designing satellite counter-protective and satellite
00:53:12
counter-offensive capabilities into the CubeSat as well,
00:53:15
because it seems like satellite defense is going to have to be a
00:53:18
thing now.
00:53:18
So you have to design that.
00:53:21
But the reality is, deployment of technology into
00:53:25
space
00:53:26
is within reach.
00:53:26
If you can afford to buy a car, you can afford to do a
00:53:29
satellite launch.
00:53:31
Speaker 1: Yeah, that is.
00:53:33
That's really fascinating, because that's exactly what
00:53:36
I'm working on for my PhD, actually setting up...
00:53:39
Speaker 2: Oh, well, okay, you need to hit me up after this,
00:53:41
because if you're doing a PhD and you've already got your PhD
00:53:45
funding, we could possibly do a co-lab project there, because we
00:53:49
actually wanted to launch this fairly soon.
00:53:51
The idea would have been to launch a converted Kubernetes
00:53:55
server in one self-contained device, run it with solar and
00:53:59
then its own battery and then get it up there and then see if
00:54:03
we can maintain communications between a Vox Messenger
00:54:07
sender and receiver using that satellite
00:54:10
connection and making sure that we have key handling running at
00:54:13
a speed that is commensurate to what we have here on Earth.
00:54:16
And if it is, we would be going full beans into deployment of a
00:54:19
full bloody constellation.
00:54:22
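Whether key handling over a satellite link can run at Earth-like speeds is, at first order, a propagation-delay question, and that part can be estimated on the back of an envelope. A simplified sketch, with illustrative altitudes and straight-up zenith-pass geometry, so these numbers are a lower bound:

```python
# Back-of-envelope: extra round-trip time added by relaying traffic through
# a satellite at a given altitude. Geometry is simplified to a zenith pass
# (user -> satellite -> user, straight up and down); real slant ranges are longer.
C = 299_792.458  # speed of light, km/s

def relay_rtt_ms(altitude_km: float) -> float:
    # up + down for the request, up + down for the reply = 4 hops
    return 4 * altitude_km / C * 1000

for name, alt in [("LEO (~550 km)", 550),
                  ("MEO (~8,000 km)", 8_000),
                  ("GEO (~35,786 km)", 35_786)]:
    print(f"{name}: ~{relay_rtt_ms(alt):.1f} ms added round trip")
```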
Speaker 1: Oh, okay, yeah, we'll definitely.
00:54:24
We'll definitely talk more about this then and you know, I
00:54:28
absolutely want to have you back on.
00:54:30
Speaker 2: I love space stuff.
00:54:31
I mean, I literally play it. Kerbal Space Program, after
00:54:36
Civilization 5, is my most played game.
00:54:39
I think I've got like 600, 700 hours on Civilization 5 and then
00:54:42
Kerbal is like
00:54:43
five or 600 on that.
00:54:45
I love that thing.
00:54:46
Speaker 1: Yeah, I started to get into KSP2 recently, and I
00:54:50
have to play it carefully,
00:54:52
because it's like, all right,
00:54:55
this is way too addictive.
00:54:56
I have an 11-month-old, like I need to be doing other things
00:55:00
than killing these Kerbals, you know? Oh my God, you see, that's
00:55:03
what my kids do.
00:55:04
Speaker 2: I have not killed a Kerbal yet.
00:55:05
I literally do proper space missions.
00:55:07
Man, I'm really.
00:55:08
I do the pen-and-paper working out of my delta-v,
00:55:12
because I can't trust the calculator, and I actually work
00:55:14
out how my vehicle is going to operate under pressure, load and
00:55:17
stuff.
00:55:18
Oh my God, we play it so differently.
00:55:20
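The pen-and-paper delta-v working being described comes from the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(m0 / mf). A tiny worked example with invented stage numbers:

```python
# Tsiolkovsky rocket equation: delta-v = Isp * g0 * ln(m0 / mf).
# Stage masses and Isp below are invented purely for illustration.
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, wet_mass: float, dry_mass: float) -> float:
    return isp_s * G0 * math.log(wet_mass / dry_mass)

# Example: a stage with 320 s vacuum Isp, 18 t fuelled, 6 t empty.
print(f"{delta_v(320, 18_000, 6_000):.0f} m/s")  # roughly 3,450 m/s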
Speaker 1: My space program has a very robust astronaut pipeline.
00:55:24
00:55:26
Speaker 2: I just think that could be the consumer capitalist
00:55:29
model of spacefaring in the future.
00:55:31
Speaker 1: Right, awesome.
00:55:34
Well, JB, you know I don't want to keep you beyond.
00:55:37
I know people have other commitments and whatnot, but you
00:55:40
know I really appreciate you coming on.
00:55:41
I'm going to pause the conversation and, like, I'm
00:55:45
immediately going to be scheduling you to come back on,
00:55:47
like maybe next week.
00:55:48
Speaker 2: Hell yeah, I'm all over that.
00:55:49
Hell yeah, I'll be here, awesome.
00:55:53
Speaker 1: Well, you know, before I let you go, how about
00:55:54
you tell my audience you know where they could find you, where
00:55:56
they could find your company that we didn't even talk about,
00:56:00
you know, and all that information that they may want
00:56:02
to learn more about?
00:56:04
Speaker 2: Okay.
00:56:04
Well, if you want to join the Secure Revolution, to get
00:56:07
VoxCrypt Vox Messenger, all you've got to do is type in Vox
00:56:10
Messenger into the Android Play Store or into Google and you'll
00:56:14
find it.
00:56:15
It's just there.
00:56:15
The website is vox-messenger.app.
00:56:18
You can find our crypto assets app also on the Google Play Store,
00:56:23
just by typing in Vox Crypto.
00:56:24
We are coming to iOS on both very soon, but iOS is a very
00:56:28
different animal and it does take a little pain, hardship and
00:56:31
a lot of money to get there.
00:56:34
In terms of other technologies, my spatial recording app: if you want to have
00:56:37
the Adobe Premiere of end-to-end spatial video recording so you
00:56:42
can make your own 3D films and then make money getting them
00:56:44
onto Apple for the Apple Vision Pro, then check out
00:56:48
spatialscan3d.com, you know.
00:56:50
Or just type in JBWB2020 into Twitter and you'll find me.
00:56:56
I'm always there.
00:56:57
I'm also always streaming in the background while I'm working.
00:57:01
00:57:01
Maybe I'll be streaming some music.
00:57:03
You can always jump in and message me.
00:57:04
I will try to answer and I'm on LinkedIn.
00:57:09
Again, my name is very unique: JB Web Benjamin, or John Brunel Web
00:57:12
Benjamin.
00:57:12
Trust me, you'll find it.
00:57:13
It's only me that comes up in a Google search.
00:57:16
I mean, I did say at the beginning of this.
00:57:18
My parents must have hated me for giving me a name like that
00:57:21
in Birmingham in the 1980s, but it does mean that my SEO is on
00:57:26
point.
00:57:26
So you can find me just by typing in my name and my
00:57:29
telephone number is out there.
00:57:31
So, if you find it, text me or reach out to me on Vox Messenger.
00:57:34
00:57:34
You may not get a reply straight away, but you will.
00:57:36
I'm a firm believer in being accountable and transparent.
00:57:41
Speaker 1: Awesome.
00:57:41
Well, thanks JB for coming on and thanks everyone for
00:57:45
listening to this episode.
00:57:46
I hope you enjoyed our conversation.
00:57:48
I'll definitely be having JB back on.