How does a mischievous high school curiosity evolve into a rich, multifaceted career in IT and security? This episode promises a deep dive into Michael Goldstein's fascinating journey from tinkering with school computers to becoming an influential figure during the PC revolution. Learn from Michael's transition from mainframe to PC environments and his crucial role in an early managed service provider, all while absorbing the lessons of adaptability and foresight that have marked his professional life. Michael's story is not just a tale of technological advancement but a guide for anyone looking to carve their own path in IT and security.
Ever wondered how to break into the world of IT and security? Michael offers actionable insights, emphasizing the foundational role of help desk positions and the vital troubleshooting skills necessary to thrive. The episode delves into the mental fortitude required in security roles, painting a picture of seasoned professionals who tackle complex problems with strategic independence. Through personal anecdotes and professional reflections, we underscore the importance of teamwork and versatility, drawing comparisons to the multifaceted skill sets needed for success in various IT roles.
In your quest for success in the tech industry, what entrepreneurial lessons can you glean from a seasoned professional? Michael shares his entrepreneurial journey, discussing the challenges of adapting to industry changes and the importance of staying ahead with emerging technologies like AI and cybersecurity. Gain insights into forming the right team and treating customers with genuine respect, while also exploring the transformative applications of AI in business operations. Michael's conversation offers real-world examples and thought-provoking topics, making this episode a must-listen for anyone interested in the current and future landscape of technology.
Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today
Speaker 1: How's it going, Michael?
00:00:01
It's great to finally get you on this podcast.
00:00:04
You know we've been talking about this thing for quite a
00:00:07
while now at this point, and I'm really excited for our
00:00:09
conversation.
00:00:10
Speaker 2: Definitely definitely excited to chat.
00:00:13
Speaker 1: I'm a big fan you know I never started this thing
00:00:28
for the intent of anyone listening, right, and then I'll
00:00:30
go to a conference like DEF CON or something, and uh, you
00:00:31
know people will come up to me like, oh, you're Security
00:00:33
Unfiltered.
00:00:33
Like you know, I always listen to your episodes, my
00:00:37
favorite one is this one.
00:00:38
It's always interesting because, you know,
00:00:41
I think back, I'm like man, you know, my goal
00:00:45
literally on my first episode was like man, if 10 people
00:00:47
listen to this thing, like that'll be great, you know I
00:00:51
always equate things to the pool, like I was always a big swimmer
00:00:54
growing up, and you can't just dip your toe in, you gotta just
00:00:57
get in there.
00:00:58
Speaker 2: So you dipped your toe in and now you find that you're
00:01:00
swimming laps.
Speaker 1: Yeah, yeah, that's a really good point.
00:01:04
You know, and I feel like that tends to hold
00:01:07
people back, right, it is not knowing the end from the
00:01:13
beginning and not knowing the middle.
00:01:14
You know, it kind of holds them back like you know, how am I
00:01:18
going to make this work, how's it going to look, and you know
00:01:21
all this sort of stuff.
00:01:22
I'm of the mentality where you know I'll think about it for a
00:01:27
little bit, but I have a cutoff date.
00:01:28
Okay, by this date I have to execute.
00:01:31
So I need to think about it as much as I need to up until that
00:01:36
date.
00:01:36
Right, and different projects in my life, um, or different,
00:01:40
you know, sections or endeavors, right, they get different
00:01:42
timelines and whatnot, but there's still a cutoff date and
00:01:49
I still hold myself very firmly to that date as well.
00:01:50
Speaker 2: You've got to have a plan.
00:01:51
If you're on the ground and you're kicking ass to do it,
00:01:53
it has to go to a different part of the list.
00:01:58
Speaker 1: Yeah, that's very true.
00:01:59
So, Michael, before we kind of, I guess, dive into the
00:02:03
conversation, well, we already did right, but why don't you
00:02:07
tell my audience you know what made you want to get into IT,
00:02:10
what made you want to get into security?
00:02:12
And the reason why I start everyone there is because it
00:02:16
gives everyone a good picture, right of who you are, where you
00:02:19
came from, and it also helps that portion of my audience
00:02:24
where maybe they're trying to get into IT, maybe they're
00:02:27
trying to get into security for the very first time, and I
00:02:30
always found that when I was trying to make that jump, it was
00:02:33
helpful to hear someone else that came from a similar
00:02:36
background, and they did it, and so then I could see oh, maybe
00:02:40
this thing is actually possible, maybe I should go down this
00:02:43
route.
00:02:45
Speaker 2: No, I agree with you, you've got to start off
00:02:47
somewhere.
00:02:47
So I started in IT in the pre-PC days. Now I'll date myself,
00:02:52
and you know it was funny.
00:02:55
When I was in high school we had a big computer that was in
00:03:00
there.
00:03:00
I was a bit nasty.
00:03:02
Now, we used to have to use punch cards.
00:03:04
Then four or five of us got assigned a whole half-year
00:03:08
project and we were figuring what are we going to do with
00:03:11
this and we kind of learned some basic programming and we
00:03:15
thought it would be fun to go out there and get everyone's
00:03:18
telephone number from the school office, send it to the computer
00:03:22
and have it put out on green paper.
00:03:24
Yes, no screens, no interaction.
00:03:27
And we'd have green-bar paper and we'd figure out.
00:03:30
You know how that telephone number can translate to what
00:03:34
would be a dirty word.
00:03:36
And you know, at the end of the year, you know they had the big
00:03:39
school assembly for it.
00:03:40
And I grew up in New York City and I was in one of those big
00:03:43
city schools.
00:03:43
You know, 2,000, 3,000 people in the auditorium, and
00:03:47
you know I had an envelope in my hand that had the word and the
00:03:53
telephone number and then the principal had all the
00:03:55
information of whose telephone numbers they were.
00:03:58
So we didn't know what we were going to pick.
00:04:00
So I'm up in front of the audience.
00:04:02
I open it up.
00:04:02
I won't repeat the phrase, but it wasn't the nicest phrase and
00:04:07
it turned out it was the principal's telephone number.
00:04:10
So we thought it was kind of cool.
00:04:11
It didn't go out, and we realized it was just kind
00:04:13
of fun working with computers.
00:04:15
So in college I was a programmer.
00:04:17
When I came out of school I worked for an insurance company and then
00:04:21
a large law firm.
00:04:22
And we kind of realized that computers would run the world.
00:04:26
We'd help coordinate things.
00:04:28
I was a mainframe programmer in a large television network.
00:04:33
The company had their own economic issues.
00:04:36
Then I went to work for a large law firm and I remember my
00:04:40
first day on the job.
00:04:41
We had probably a $30-40 million IBM mainframe there.
00:04:45
My boss was the typical IBMer and he was like what is a PC?
00:04:51
What is it going to be?
00:04:53
If you don't learn to work the mainframe, you're not going to
00:04:56
be here long.
00:04:57
We had eight PCs; by the end of year two we had 600 PCs, so that
00:05:03
boss wasn't there anymore.
00:05:05
I kind of grew up in the PC age of converting things out there.
00:05:09
I wound up starting a managed service provider a
00:05:13
long time ago.
00:05:14
We worked primarily in law firms.
00:05:16
The biggest worry was during the day how we would protect
00:05:21
users from themselves, which hasn't really changed.
00:05:24
But you know, all the time.
00:05:26
You know we kind of grew up in the world.
00:05:28
You know everybody was click crazy.
00:05:30
You know internet came through there and you know I was.
00:05:34
I had an MSP 25-plus years ago on Long Island and we just
00:05:41
started building networks.
00:05:42
You know when we were.
00:05:43
You know putting everything together as the internet came
00:05:46
along and, you know, pre-internet, I always viewed it as if IT
00:05:50
people were in control.
00:05:51
And as the internet came through and everybody walks
00:05:58
around with a phone these days, you know it wasn't what the IT
00:06:00
people wanted, it was what the user wanted.
00:06:01
And in reality, you know, users pay the bills.
00:06:05
Sometimes they didn't make the right decisions, so security was
00:06:09
just a natural kind of thing.
00:06:10
I worked in the legal vertical for a long time so we were
00:06:14
always security conscious.
00:06:15
You know we were worried about oh, where was that floppy disk?
00:06:18
And it went out.
00:06:19
Where was that CD?
00:06:20
Where was that DVD?
00:06:22
Now it's USBs.
00:06:23
So in reality we always were security-minded because as we
00:06:28
ran, you know, an MSP and a service provider or a VAR, whatever we
00:06:31
want to call it you know we were looking to make our lives
00:06:34
easier and protect them.
00:06:35
You know.
00:06:36
And now you know you look at it and even five years ago it
00:06:39
seems like forever on the security side.
00:06:41
So security was always on the forefront of our mind.
00:06:45
You know I always said to people that work for me, so I'm based
00:06:50
in Fort Lauderdale.
00:06:51
Now, you know we have 20-plus employees, you know, and over
00:06:54
time I still do some SQL programming.
00:06:56
You know I still sell out there, but you know everything becomes
00:07:03
security focused.
00:07:04
I've always said to people that want to get into this industry
00:07:10
it's not like you just pass an exam and now you're an IT guy.
00:07:12
You gotta love this.
00:07:12
You gotta wake up every day and love the beast that's out there.
00:07:14
You gotta like training and you always have to be the best that
00:07:17
you can be, because no one wants to pay for second best.
00:07:20
Speaker 1: Yeah, that's a really good point, and there's a lot to
00:07:24
unpack there.
00:07:25
You know, it's interesting.
00:07:29
You know I always get asked, right, how do you get into
00:07:33
security, how do you get into IT?
00:07:35
Right, and you know I'll tell them the answer and it's not
00:07:39
always the quick answer that people want.
00:07:42
Right, and you know the reason for that.
00:07:46
Right, like I always tell people, start in help desk,
00:07:50
start in another domain that is, you know, kind of the
00:07:55
foundational work of IT, right, like that help desk experience,
00:08:00
and people don't want that answer because that's a longer
00:08:03
route.
00:08:03
But when you go that route, when you put in that work and
00:08:07
you learn, you know how these systems are put together, you're
00:08:11
more prepared for that future role in security.
00:08:14
You know, and it's always difficult to see how something
00:08:20
in the present will tie together to a future goal or something
00:08:23
like that, right, um, but it's important to
00:08:30
embrace the process and go through that process.
00:08:31
You know you mentioned, you know the importance of being able to
00:08:36
learn, right, and always willing to learn, and when
00:08:43
someone tells me that they want to get into security, the very
00:08:45
first thing that I do, maybe for good or bad, is I try to
00:08:46
convince them not to get into security.
00:08:48
Right, because if I can convince you to not get into
00:08:51
security and we're not even doing anything technical, I'm
00:08:54
not angry, I'm not yelling at you, I'm not telling you.
00:08:57
You know that you're costing the business millions of dollars
00:09:01
a day by your stupid process or procedure.
00:09:03
Right, if I can convince you without doing any of that, then
00:09:07
there's no way you are going to make it in security.
00:09:11
To be completely honest, you know, because it can be such a
00:09:14
stressful environment that, you know, if a simple
00:09:19
talk is going to make you say, yeah, that's probably not for me, then
00:09:23
you're making a good decision for yourself.
00:09:25
It's not like a ding on you or anything like that.
00:09:29
I always bring up that point of always learning, yeah.
00:09:35
Speaker 2: I think that you have to understand the basics and
00:09:39
there's no way around learning the basics, like seeing
00:09:41
how it is, I mean, and learning what you do, and then that starts
00:09:46
the teamwork process out there, because nobody knows everything
00:09:50
and if you're getting paid to do a job, no one wants to hear
00:09:54
that.
00:09:54
I Googled it or I asked ChatGPT.
00:09:58
These days, I think that to understand the basics you have
00:10:03
to go out there and
follow your own plans, and I think this carries to almost any
00:10:07
industry.
00:10:07
But now in the industry, you know, I view that if you're a
00:10:11
security specialist, you're almost at the top of that
00:10:14
pyramid.
00:10:14
You know the buck stops with you.
00:10:16
So you have to have troubleshooting techniques.
00:10:18
You have to kind of understand.
00:10:20
You know we all live in a black and white world in security.
00:10:23
Right, I say that the yellow lights are almost as important,
00:10:27
if not more, than the red lights , Because in our real-life
00:10:31
environments we're looking at everything and we have a tool
00:10:34
for the tool, for the tool.
00:10:35
It takes that seasoned professional to be able
00:10:38
to go out there, take a breath, look at what's going on and try
00:10:42
to figure out.
00:10:43
No, they're not really coming in the front door.
00:10:46
Somebody left the garage window open a little bit, you know, and I
00:10:49
think that having those eyes to look around is super important.
00:10:54
Speaker 1: Yeah, yeah, that's a really good point, you know,
00:10:57
and it's a challenging balance for sure, you know.
00:11:07
And I have a friend who always kind of equates the security
00:11:10
professional like this right, he says that you could take almost
00:11:15
any security professional and put them in a networking job,
00:11:19
put them in a systems engineer job, linux admin, whatever it is
00:11:24
right, and they will be successful.
00:11:26
They will be just as good as the person you know that should
00:11:30
have been in that role, right, that has that specialty, just
00:11:33
because in security you need to know the underlying systems.
00:11:36
You know, you need to know, right, that on a Linux system,
00:11:41
logs are being written to this certain directory.
00:11:43
Well, how do I find that out from scratch?
00:11:45
Right, and how do I change that too?
00:11:48
Because some people may want to change that in a very stupid
00:11:50
way, you know, and how do I find that out, right?
00:11:53
Well, how is everything built in Linux?
00:11:56
And you know, starting from the basics, right, and you know.
00:12:01
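That "figure it out from scratch" idea can be made concrete. As a hypothetical sketch, assuming a typical Linux box with rsyslog and/or systemd (none of these paths or commands come from the episode itself), here are a few conventional ways to track down where a system actually writes its logs:

```shell
# Hypothetical sketch: conventional ways to find where a Linux system
# writes its logs. Paths assume a typical rsyslog/systemd setup.

# 1. The traditional home for log files is /var/log.
ls /var/log

# 2. rsyslog's config maps log facilities to files; grep for targets
#    (errors are silenced in case rsyslog isn't installed).
grep -rh '/var/log' /etc/rsyslog.conf /etc/rsyslog.d/ 2>/dev/null || true

# 3. On systemd machines, journald may hold the logs instead of flat files.
journalctl --no-pager -n 5 2>/dev/null || true
```

The point being made in the conversation is less about any one command and more about knowing which layer to interrogate first: filesystem convention, the syslog daemon's configuration, or the journal.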
Another key distinction I think that he also makes is, you know,
00:12:06
security professionals, we're highly paid, but we earn our
00:12:09
paycheck maybe three times a year, and if it's any more than
00:12:12
three times a year, something else is probably failing.
00:12:14
You send them in and you give them the goal of this needs to
00:12:28
be fixed, this needs to be resolved or whatever it is.
00:12:30
I don't need to know how you do it, I don't care about how you
00:12:34
do it, it just needs to be done, right?
00:12:36
And you don't ask this guy any questions.
00:12:38
He just goes in there and handles it, and that's.
00:12:40
That's actually very true.
00:12:48
You know, my current manager at my nine-to-five.
00:12:49
He's from the military and he identified that in me very early
00:12:52
on and he was telling other managers, other directors and
00:12:54
leaders saying like hey, joe's like a mercenary, right?
00:12:58
We don't ask him how he gets things done, we tell him what
00:13:01
needs to get done and he figures out how to best do it Right, so
00:13:04
just get the hell out of his way.
00:13:07
Speaker 2: What's funny is that, you know, I have one of my key
00:13:12
employees.
00:13:12
You know, we go back a long time, been around a while.
00:13:14
I remember one of my key senior-level engineers would say to our
00:13:18
staff listen, I don't want to hear that I have a problem.
00:13:21
I want to hear that we have an opportunity.
00:13:23
And in reality, you're right, someone's coming to you because
00:13:28
they have a definitive issue.
00:13:29
Let's call it.
00:13:30
Maybe we won't call it a problem yet and it's up to us to
00:13:34
gather those pieces of information.
00:13:36
I won't say we have it easy in security, right, but we're the
00:13:40
holder of the keys to the kingdom and in reality, we have
00:13:44
to figure this out and come back with a plan.
00:13:47
I think that that's the key.
00:13:49
You want to come back and say listen, we definitely have more
00:13:51
questions and stuff, but what's
00:13:53
the first little opportunity that you could possibly have on
00:13:58
that side of it?
00:13:58
So I think it's super important.
00:14:00
And, listen, we all have to surround ourselves with good
00:14:02
people.
00:14:03
One side of the business is vendors.
00:14:05
You know we pick and choose our vendors not based upon how much
00:14:10
money it puts in my pocket.
00:14:11
It's really the solutions that are out there, the partnership,
00:14:15
and in reality, you know we want those vendors, like one of our
00:14:18
big vendors is Acronis.
00:14:19
You know we utilize a lot of their products, but I also want
00:14:25
to know that they have an idea, because as good as you and I,
00:14:28
joe, could say that we are, at some point sometimes we really
00:14:31
need to go back to support and hash some of these things out.
00:14:34
So I think relationships are super important and making sure
00:14:38
that we surround ourselves with good people, because it does
00:14:41
take a team sometimes.
00:14:43
We're also, in most of the times, in these big things that
00:14:46
go on.
00:14:49
Speaker 1: Yeah, when you're a really small company, it's also
00:14:53
really critical to not only choose the right vendors and
00:14:56
partners, but it's also probably very critical to choose the
00:15:00
right customers.
00:15:01
Not everyone that will throw money at you should be a
00:15:06
customer of yours.
00:15:06
There's different personalities , there's different temperaments
00:15:11
and risk profiles, right.
00:15:12
That may or may not be a fit, and it's an interesting
00:15:16
distinction, right, because everyone, if they're paying, you
00:15:20
know it doesn't matter, you just take it on, you just do it.
00:15:23
But you know, for someone that has lived through, doing that
00:15:29
and experiencing that it's not always the best option.
00:15:32
You know how you.
00:15:33
You started from a technical background, right?
00:15:37
How did you translate those skills?
00:15:40
Did any of them translate into the business side of running a
00:15:44
business and growing it and getting customers and things
00:15:47
like that?
00:15:48
How did you?
00:15:49
How did you figure out the other side of it?
00:15:52
Speaker 2: So you know, those are the things that you know.
00:15:54
You rely on peers.
00:15:56
So I will say that I worked for a firm on Long Island for a
00:16:02
while.
00:16:03
I was a partner in the firm.
00:16:04
I was selling and doing tech.
00:16:06
I left, I formed another company, and I think someone said
00:16:11
to me it's similar to people who are boaters:
00:16:17
on boats, the best
00:16:21
day is the first day that they get the boat, and the second-best day is the day that they sell it.
00:16:22
And I think someone said to me you know, when you get
00:16:24
that first check, your company's
00:16:26
check, it is one of the greatest things that are out there.
00:16:29
But you know, you learn.
00:16:30
I had a good set of peers around me.
00:16:33
I'm very involved in the industry and then I also started
00:16:38
to evaluate, you know, the customer base.
00:16:41
So I'm at 15 years, I actually can't believe it, 15 years in this
00:16:45
company and, uh, sole owner.
00:16:48
But you know, I realized 15 years ago that my buyer was
00:16:52
changing.
00:16:52
The IT guy wasn't always the IT guy, right, we had all the
00:16:58
people that are out there and we adjusted and we always kind of
00:17:01
say that we always have to look a couple years ahead.
00:17:04
And it's important to go out there, and you can't just sell
00:17:09
one revenue stream all your life.
00:17:10
Other people have gone along on that side.
00:17:13
I hired the right people to go in place there.
00:17:16
I realized that, hey, I just can't do every one of those
00:17:19
QuickBooks entries, but in reality, what do those numbers mean?
00:17:21
So I surrounded myself with the people that were out there and
00:17:26
again, listen, there's the pros and cons.
00:17:28
I hired someone to do what I could do myself, and in reality it needed
00:17:32
to be done right.
00:17:32
And you know, same thing with when you mentioned, when you're
00:17:36
picking out the right customers.
00:17:37
Sometimes you've got to go and you've got to feel them out.
00:17:39
Now, if they're playing games up front, this might not be good,
00:17:43
it might not be the right opportunity for us.
00:17:46
And then you know, looking at tech, I'm always trying to stay
00:17:49
ahead.
00:17:49
I kind of live through the DOS days to the Windows days, to the
00:17:55
Novell days, out to Windows NT, along those lines, and it always
00:17:59
excites me to go out there and see the new tech.
00:18:03
I think we're at a new plateau across all industries with AI
00:18:08
and I always also did view it that way.
00:18:10
When I got into security, I remember I got involved early on
00:18:13
with InfraGard and meeting those security professionals, and
00:18:19
realizing someone once said to me listen, if we have it, the bad
00:18:23
guys have had it before and already used it.
00:18:25
And that got me thinking to always be learning, looking into
00:18:29
those pieces, because now we're not just running against a guy
00:18:34
that's just dropping a little virus on a USB stick.
00:18:37
We're well past that in some of these infiltrations and I have
00:18:41
to be the best that I can be.
00:18:42
And that includes, you know, reengineering what we offer, how
00:18:46
we offer it, how we run the business and what we're going to
00:18:50
stake our name on.
00:18:54
Because, you know, I'm in South Florida, and it's a small
00:18:56
community that's out there, you know, some of the verticals.
00:18:59
You know when you go out there and you're known in your area,
00:19:02
you've got to have a little positive.
00:19:04
There's always going to be a couple of negatives.
00:19:06
So I think that I always ran the business and I said you know
00:19:10
I treat
00:19:13
customers as if I'm trying to treat them the way I want to be
00:19:16
treated.
00:19:22
Speaker 1: And you know, no one's above treating people the right way.
00:19:26
Yeah, that's a great point that I think is often missed amongst
00:19:35
some leaders.
00:19:35
Maybe not you know leaders publicly, right in the industry
00:19:37
or whatnot, but leaders in a company and whatnot.
00:19:42
You know why don't we talk about the importance of
00:19:43
potentially even you know qualifying the opinion that you
00:19:45
get from other people.
00:19:46
It could be peers.
00:19:48
It could be, you know, someone that's beyond you in your career
00:19:51
or whatnot, or people that are not even in
00:19:55
the industry, right, and the reason why I want to bring that
00:19:57
up is because, you know, when I formed my LLC, right, this whole
00:20:03
podcast world kind of rolls up under it. I mean, I didn't have a
00:20:10
project, I didn't have a gig or anything like that, right, I
00:20:13
didn't even think about the podcast at the time.
00:20:16
It literally was not on the radar and I still had people
00:20:20
telling me you know, what value are you going to actually bring?
00:20:24
Right?
00:20:25
What's the value?
00:20:26
Why would someone give you money, right?
00:20:28
Why would anyone do that?
00:20:30
And you really think that you're going to provide more
00:20:32
value than you know Optiv or these other you know thousand
00:20:37
pound gorillas that are in the arena or whatnot.
00:20:41
You know, I had a lot of critics and one of the things that I
00:20:46
had to learn how to do is to qualify the opinion.
00:20:51
And you know I like the people right, I'm still friends with
00:20:54
them to this day, and they, you know, don't like to admit
00:20:59
that they were wrong to some extent right.
00:21:02
But you know you have to qualify where it's coming from,
00:21:07
at least in my experience.
00:21:08
You know, did this person do it?
00:21:10
You know, did they do this before?
00:21:12
And are they coming at this, you know, from a view of
00:21:16
experience, or are they coming at it from a view of I've never
00:21:21
done it and it wouldn't work for me, so it probably won't work
00:21:24
for you.
00:21:24
Or you know something like that and you know it's important to
00:21:30
go through that, you know, because you have to be able to
00:21:33
kind of stand your ground and say, well, if it doesn't work,
00:21:37
it doesn't work.
00:21:37
At the end of the day, it's on me if it works or not.
00:21:40
I'll take that bet on myself.
00:21:43
I'll take that risk on myself every single day.
00:21:47
Speaker 2: I think it definitely is important.
00:21:49
In my prior company we had offices in New York City,
00:21:54
Washington DC, Raleigh and Miami, and as we started doing
00:21:59
different things in different areas, I learned on early that
00:22:02
what might have worked in Washington DC definitely didn't
00:22:06
work in Raleigh.
00:22:06
It might definitely not work in Miami.
00:22:09
I learned those pieces.
00:22:11
It cycles back to this because where that opinion's coming from
00:22:17
, you know, hey, what's your market like, where are you in
00:22:20
the world, what size is your audience, or your company that
00:22:23
you're a customer, because it's important to know someone that
00:22:27
runs a large $100 million company.
00:22:29
Their view is much different than a small company that goes
00:22:32
out there.
00:22:33
So I like to get the opinions and then I definitely weed it
00:22:37
out and stop thinking about it, and a lot of times I'll go back
00:22:40
for a second.
00:22:40
You know, a second taste.
00:22:42
So I equate a lot of things to food and carbs.
00:22:45
You know, you and I might eat in the same restaurant two
00:22:49
different days in the same week and you might have had a
00:22:51
terrible experience, but I might have had a really good
00:22:54
experience.
00:22:54
So I'm usually, you know, a two-strikes kind of guy: if I
00:22:58
didn't have a good meal, I might go back a second time, like maybe it
00:23:00
was an off night. And the same with all the opinions.
00:23:03
You know, if you gave me something that said it might not
00:23:05
have worked, I might have looked around for a second or
00:23:08
third opinion from other parts of my life to make sure that
00:23:12
I have all the facts on how it will affect me.
00:23:14
You know, and I'll just, you know.
00:23:15
Maybe you call it your name, rank and serial number.
00:23:17
You know it's the same kind of thing.
00:23:19
You know where are you, what's your background, and maybe you
00:23:23
know that person wasn't an entrepreneur.
00:23:25
So can you imagine all the things that we've missed, you
00:23:28
know over our lifetimes?
00:23:30
Where you're like,
00:23:31
oh my
00:23:31
God, how did this guy actually do that?
00:23:33
Everyone said he was wrong and boom, you know, just believed in
00:23:37
it.
00:23:37
So I think if we believe in ourselves on that end of it, you
00:23:43
know, we can only learn by doing.
00:23:44
Speaker 1: Yeah, that's a great point, you know, and it's
00:23:48
interesting, you know.
00:23:49
You equate it to like going to a bad restaurant or something
00:23:52
like that, you know, or having a bad experience overall, Like I
00:23:56
found having, you know, kind of like a test group, right,
00:24:01
that I run new ideas by.
00:24:02
It's a myriad of people from different backgrounds, different
00:24:08
experiences.
00:24:09
Some are far more negative than others,
00:24:13
some are way more positive than others,
00:24:15
and I'm trying to find that, you know, that middle ground right.
00:24:18
But I need to hear all sides.
00:24:20
I need to hear, yeah, why this person thinks it's not going to
00:24:24
work.
00:24:24
And I mean he'll really go into detail, he'll give you
00:24:28
stats and everything.
00:24:30
All of that's great, right, like I, I need that part of it.
00:24:34
But then I also need the optimistic person that is saying
00:24:37
, oh, it'll work.
00:24:38
And you know, the one that I chose was the one that
00:24:55
obviously the most people liked and whatnot.
00:24:57
But there were still those opinions in it that, like,
00:25:00
this is a terrible logo, that you know, this one doesn't work,
00:25:03
you can't really see it, and this, and that you know and it's
00:25:06
important, you know, to get those opinions.
00:25:11
You mentioned this previously.
00:25:12
You know we're right at the cusp of a new era in technology,
00:25:16
right, and you know it's interesting because I view it as
00:25:20
that as well.
00:25:21
And then I talked to people that do AI for, like NVIDIA and
00:25:26
they're saying like, well, AI has been around in some
00:25:29
capacity since the 80s, right, so to them, you know, they're
00:25:33
kind of living in this thing day in, day out, right, they
00:25:37
don't really see it as a plateau or as a platform to
00:25:42
launch everything else off of.
00:25:44
But you know, for most of us, we're seeing it for, you know,
00:25:48
the very first time, really, right, we've always heard of AI,
00:25:51
but we're seeing it for the very first time and it's really
00:25:54
impressive in some ways, and maybe less impressive, you know
00:25:58
, in some other ways, right, but we're at the very beginning of
00:26:04
something that's going to change IT forever.
00:26:07
I mean, it's going to change the world forever, but if we're
00:26:09
just talking about IT overall, it's going to change IT forever.
00:26:11
How do you make the pivot, right?
00:26:16
Because I always try to have that mentality of how do I
00:26:20
recession proof my career right, and that means being on the
00:26:24
cutting edge, that means always looking for that next challenge
00:26:28
and tackling it, getting that specialty in it and, you know,
00:26:32
going forward right.
00:26:33
So how do you make that pivot into AI yourself as a
00:26:39
business leader, right, and then as a technical expert?
00:26:45
Speaker 2: You know, you're right, AI has been around a
00:26:48
while, and if we do a presentation saying that AI goes back
00:26:53
to its basics in the 50s and 60s, it sounds kind of crazy.
00:26:56
But you know, overall it's been around a long time and I
00:27:00
envision that and I've seen it just in my career we're in a
00:27:06
cyclical industry, you know.
00:27:08
I remember the
00:27:09
rationale of why we would call it the cloud when
00:27:13
the Internet first came out, only because when we had connectivity,
00:27:16
that was the little circle that was drawn there.
00:27:18
You know, I was doing cloud when I was at this law
00:27:21
firm, when we were tiny, sharing a mainframe.
00:27:24
So AI is one of those pieces.
00:27:26
And you see, you know we followed OpenAI for a while.
00:27:30
When we look at the stats, I was just at Microsoft and
00:27:33
got to the briefing, and it was really kind of cool how quickly
00:27:38
OpenAI got to that number of users.
00:27:42
We compared it to how Facebook or cellular phones started.
00:27:46
So we're always looking at it.
00:27:48
We're a high-level Microsoft partner.
00:27:50
We were following this for the longest time.
00:27:52
We've heavily been on the Microsoft cloud.
00:27:57
We've been playing with all types of early releases, with
00:28:01
CodeBallet, we're involved with Borg, we're involved in OpenAI.
00:28:05
So we saw the power of it and then no one expected to go from
00:28:10
zero to 100 million users over that time.
00:28:13
And as we saw the ChatGPT growth, we determined that, hey,
00:28:19
let's find the right platform, let's stick with it and let's
00:28:23
try to become a little extra.
00:28:25
So we're hoping that.
00:28:27
You know we're in the process of building.
00:28:29
We're 20 employees, so we're by no means a large shop, but for
00:28:34
a managed service provider
00:28:34
we're very much a decent size, and we'll have another practice
00:28:38
area in AI.
00:28:39
We're very close to launching it now.
00:28:41
We've partnered up with a couple of little vendors that
00:28:45
are giving us an idea, but most importantly, we want to go out
00:28:49
and consistently educate our customers and prospects.
00:28:53
You know this is insane.
00:28:55
You know people.
00:28:56
You know, I'm not scared of a robot coming out to
00:29:00
get us. I just try to listen to different industry
00:29:04
experts.
00:29:04
You know outside of IT and you look at some of the pieces of
00:29:09
people, how they're using what they're using.
00:29:11
So we try to explain to our customers that hey, listen, while
00:29:15
ChatGPT is very cool and Copilot is very cool, you can waste a
00:29:19
lot of time going in there.
00:29:22
I compare it to the first time YouTube came out.
00:29:25
You know, think about the first time you were on YouTube.
00:29:27
How many days did you probably waste?
00:29:28
You just played around because we had access to it, and I view
00:29:32
this generative AI the same way.
00:29:34
So, for our customers, we want to go out and talk to them, find
00:29:38
out ways that we can help them, not just sell them a Copilot
00:29:42
license, you know, and leave you out there
00:29:46
having to generate things. We talk about how it can be useful to your
00:29:49
business, and we're looking at it as an easy way to
00:29:53
low-code or no-code a lot of applications that could help
00:29:58
organize your business, not necessarily change the business.
00:30:02
There are other industries that we're looking at that we're
00:30:04
helping to develop very low-code applications.
00:30:08
I'll give you this simple scenario.
00:30:10
One of the verticals that we deal with is
00:30:12
not-for-profits.
00:30:13
Not-for-profits are always looking to go out there and
00:30:16
apply for grants.
00:30:17
Grant writing is an extensive, important skill that you must
00:30:22
acquire.
00:30:22
We're working with a vendor that we're helping to
00:30:27
develop, where you can go out there and put all of your grant
00:30:31
information into this large language model, a very simple piece.
00:30:34
You copy and paste all the stuff about it.
00:30:37
You can feed into the engine grants that you've applied for,
00:30:41
past grant applications, and instead of going out there and having to
00:30:44
start prompting with make me sound like Shakespeare, make me
00:30:48
sound like John Rockefeller, you'll answer a few questions,
00:30:52
drag and drop a PDF for the grant into this application.
00:30:55
It will start generating a lot of the grant response, which is
00:30:59
the hardest part.
00:31:00
So here's a simple piece that we can take people that probably
00:31:04
don't have as much budget as the next company and help
00:31:09
automate using AI, versus just giving them a tool that they can
00:31:13
do anything in the world.
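The grant-assistant workflow described here, past answers in, relevant material surfaced for a new application question, can be sketched as a naive retrieval step. Everything below (the function names, the keyword-overlap scoring, the sample answers) is an illustrative assumption, not the vendor's actual product; a real tool would use an embedding model for retrieval and an LLM to draft the response:

```python
# Hypothetical sketch of the retrieval step behind a grant-drafting
# assistant: surface past grant answers relevant to a new application
# question. Keyword overlap stands in for real embedding search.

def tokenize(text):
    """Lowercase word tokens with surrounding punctuation stripped."""
    return {w.strip(".,!?;:").lower() for w in text.split() if w.strip(".,!?;:")}

def rank_past_answers(question, past_answers):
    """Past answers ordered by keyword overlap; zero-overlap answers dropped."""
    q = tokenize(question)
    scored = sorted(((len(q & tokenize(a)), a) for a in past_answers),
                    key=lambda pair: pair[0], reverse=True)
    return [a for score, a in scored if score > 0]

past = [
    "Our after-school program served 400 students last year.",
    "The organization was founded in 1998 as a community food bank.",
    "Program outcomes are measured by student attendance and test scores.",
]
hits = rank_past_answers("How are program outcomes measured?", past)
# hits[0] is the outcomes answer; the retrieved text would then be fed
# to the LLM as context for drafting the new grant response.
```

Swapping the overlap score for embedding similarity, and handing the top hits to a model as context, is the usual next step.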
00:31:15
So I use those examples of how we can kind of teach people what
00:31:19
AI truly is, ways that they can go out and use it and A it's
00:31:24
like do you need it, why do you need it?
00:31:27
You have a budget for it and how can it help your company?
00:31:30
Otherwise, it's spending, you know, thousands of dollars developing
00:31:34
something that maybe will save you five minutes.
00:31:37
Speaker 1: Yeah, it's really interesting to see how useful
00:31:42
these AI tools are becoming.
00:31:44
When we spent so long not really having easy access to
00:31:50
that sort of technology and now it's so easy to have the access,
00:31:54
it's fascinating to me to see how people are going to start
00:31:57
using it more and more.
00:31:58
Like for myself.
00:31:59
Right, I'm getting my PhD in securing satellites, right?
00:32:02
I won't give you the whole title or whatever, because
00:32:04
that's like a whole paragraph, but you know, essentially it was
00:32:09
securing satellites to prepare for post-quantum encryption.
00:32:12
And it is extremely helpful for me to just go to ChatGPT and
00:32:20
say what are 10 scholarly articles on this topic?
00:32:23
Give me the top ones and then give me the opposing views,
00:32:31
rather than me trying to go without it.
00:32:34
Because in the very beginning I'm used to using Google for
00:32:39
research and pulling up different articles and whatnot,
00:32:42
right?
00:32:43
And when I was doing it in my preliminary research, I mean I'd
00:32:48
have 100 tabs open of different articles and I'm comparing them
00:32:52
and everything else, like that.
00:32:53
And then you know the realization kind of hit me
00:32:54
when I had to, like, do a literature review.
00:32:57
It's like, oh wait a minute, I need to just use this tool because this tool
00:33:02
is going to make it way easier.
00:33:03
It's going to tell me where to look and you know what the ideas
00:33:07
on it are and everything else like that, right, and it does
00:33:11
help a significant amount.
00:33:12
Like, obviously it's not writing my dissertation for me
00:33:16
or anything like that, but it's making the research more
00:33:21
optimized, right?
00:33:22
Because I'm telling it hey, find something with this
00:33:25
viewpoint on this topic and pull it up to the front and I'll
00:33:31
assess how I want, if it's useful to me or not.
00:33:35
And it's really fascinating to see how this is evolving.
00:33:41
Speaker 2: You know it's getting you to the information.
00:33:43
You know the other use that we're seeing is that we do a lot
00:33:48
of Microsoft Teams and organizations and you know early
00:33:51
days of pandemic we were offering all different
00:33:54
organizations free Teams.
00:33:55
You know, you get thrown into this.
00:33:57
We're all in this meeting.
00:33:59
We're on a Zoom call.
00:34:00
But you know now we see, with Copilot and in Teams we're
00:34:04
actually retraining our clients on how to conduct their proper
00:34:08
meeting and having it recognize my voice as opposed to my face
00:34:14
on there.
00:34:14
Or, you know, we all do this: it's five of 1 o'clock,
00:34:18
you're running late,
00:34:20
you're into that meeting.
00:34:21
You don't know what you missed.
00:34:22
You know, simple thing Show me what I missed, summarize the
00:34:26
meeting.
00:34:26
Or you know there's lots of times where I just can't make a
00:34:30
meeting and nobody wants to go through the whole transcript.
00:34:32
You know, from the transcript:
00:34:34
Show me any time where my name was mentioned.
00:34:37
Show me the to-dos.
00:34:38
So I'm thinking of productivity things.
00:34:41
That doesn't take much.
00:34:42
So we're reorganizing our meetings.
00:34:43
You know, if Mike's not at the meeting, say hey, listen,
00:34:46
remember to ask Michael Goldstein to do this and try to
00:34:50
do that stuff when that person misses that meeting for whatever
00:34:52
reason.
00:34:52
It's just a smart transcript. Or you're late:
00:34:56
Show me what's going on there.
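Those transcript queries ("where was I mentioned", "show me the to-dos") can be mimicked with plain string matching. Copilot actually does this with an LLM; the cue phrases and sample meeting below are assumptions for the demo:

```python
# Toy versions of the meeting-transcript queries described above.
# Real assistants use an LLM; this is simple substring matching.

TODO_CUES = ("remember to", "action item", "follow up", "needs to")

def mentions(transcript, name):
    """Transcript lines that mention the given name (case-insensitive)."""
    return [line for line in transcript if name.lower() in line.lower()]

def todos(transcript):
    """Lines that look like action items, found by cue-phrase matching."""
    return [line for line in transcript
            if any(cue in line.lower() for cue in TODO_CUES)]

meeting = [
    "Sara: The migration slipped a week.",
    "Raj: Remember to ask Michael Goldstein to review the firewall rules.",
    "Sara: Michael also needs to sign off on the budget.",
]
my_mentions = mentions(meeting, "Michael")   # the two lines naming Michael
action_items = todos(meeting)                # the two action-item lines
```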
00:34:58
So you're right.
00:34:59
How many things can it do for me?
00:35:00
It puts reliable, verifiable information at your
00:35:03
fingertips.
00:35:04
But you're right: where we would have had, you know, four screens
00:35:08
of research tabs,
00:35:09
now you're on one search
00:35:12
engine and you can ask for that simple piece in a simple phrase
00:35:16
to go out there.
00:35:17
So you're right.
00:35:18
And then just think of, I remember,
00:35:22
I don't remember the name of the movie, but Tom Hanks
00:35:24
was stranded in this airport.
00:35:26
I forget why it was.
00:35:28
He couldn't get through.
00:35:29
And then you saw the security cameras.
00:35:31
Think about when you're walking down the street in any city.
00:35:34
You know there's not enough eyes or hands anymore to monitor
00:35:37
that.
00:35:38
So now we're relying on something to fine-tune us.
00:35:41
So it is kind of cool that, hey,
00:35:45
there's a rattling at the fence, all the cameras go to it, they
00:35:47
alert someone.
00:35:48
So we've been utilizing this.
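The "rattling at the fence" alert reduces, at its simplest, to motion detection: compare consecutive frames and flag when enough pixels change. The thresholds below are made-up demo values; real camera systems layer object detection and AI classification on top of this primitive:

```python
# Toy frame-differencing motion alert, the primitive underneath
# smarter AI camera monitoring. Thresholds are illustrative only.
import numpy as np

def motion_alert(prev_frame, curr_frame, pixel_delta=25, changed_fraction=0.01):
    """True when the share of noticeably changed pixels exceeds the threshold."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return bool((diff > pixel_delta).mean() > changed_fraction)

still = np.zeros((64, 64), dtype=np.uint8)   # empty grayscale frame
moved = still.copy()
moved[10:20, 10:20] = 200                    # a bright object enters the frame
```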
00:35:50
But it's kind of cool that we can rely on it. And I always sit
00:35:53
there and, you know,
00:35:56
to be or not to be:
00:35:57
will we get to the point that, you know, AI could push the
00:36:00
button, or is it just pushing the human to make a decision to
00:36:05
push the button?
00:36:12
Speaker 1: So the only thing that helps us is that we can be
00:36:16
better informed, faster.
00:36:17
Yeah, what are some risks maybe , or cons to AI?
00:36:37
Right, because I think it's a very powerful tool, very useful.
00:36:37
Obviously.
00:36:38
I use it quite a bit and we're talking about using it in kind
00:36:38
of like optimized ways, right, how does it best optimize my
00:36:39
time and whatnot?
00:36:39
Um, but what are some cons potentially to that?
00:36:40
Are we going to get lazier, you know?
00:36:43
Speaker 2: I think we're already at that stage.
00:36:45
Right, you know, I'll just sit there and say I used to joke
00:36:48
that, you know, I don't know my kids' telephone numbers because
00:36:51
it's speed dial.
00:36:52
You know, now I'm not even speed dialing, I'm asking one of
00:36:55
those engines to do it.
00:36:58
So I think that's it, but I think you have to verify it
00:37:02
later. I think in the early days, let's say 18 months ago,
00:37:07
people were in a hurry.
00:37:09
They'd ask the engine something and swear that that data was
00:37:13
true.
00:37:14
So with this information that's out there,
00:37:18
you have to make sure. Again,
00:37:19
verify first.
00:37:21
So I kind of say, when you're using Copilot and it gives you an
00:37:25
answer, it gives you kind of like a footnote.
00:37:27
Most people don't follow the footnote, but that's how you know where the data is
00:37:30
coming from.
00:37:31
So I think about the misinformation that's out there.
00:37:35
Obviously, I hope we come up with some standards for, you
00:37:39
know, graphics generated by AI, so that we can see it.
00:37:44
We're seeing this already, and you know it becomes easier for bad guys to
00:37:49
utilize these tools for bad.
00:37:52
But I think the biggest threat is your laziness or
00:37:55
misinformation.
00:37:56
I don't like to flood those.
00:37:58
One of my friends showed me, you know, an application called
00:38:02
GPTZero, an app where you can go out there,
00:38:06
take a paper, take something, feed it into it, because as
00:38:10
humans, you know, we can never be as perfect as the
00:38:14
AI-generated text.
00:38:16
Right, we're all over the place.
00:38:18
Sometimes you still can't say that something is definitively
00:38:21
generated by AI, but if you get a percentage of how likely it might be
00:38:26
that it was a human doing this versus a machine doing that,
00:38:30
people can work with that.
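One published intuition behind detectors like GPTZero is "burstiness": human writing varies sentence length more than model output tends to. The toy score below is a crude stand-in for that idea, not GPTZero's actual algorithm, which combines perplexity and other statistics:

```python
# Crude "burstiness" signal: standard deviation of sentence lengths.
# Higher variation is (very loosely) more human-like. Illustrative only.
import re
from statistics import pstdev

def burstiness(text):
    """Std-dev of sentence lengths in words; 0.0 for one sentence or fewer."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    return pstdev(lengths) if len(lengths) > 1 else 0.0

uniform = "The cat sat here. The dog sat here. The bird sat here."
varied = "Stop. The quick brown fox, tired of fences, finally jumped the sleeping dog. Why?"
# varied scores higher than uniform, nudging it toward "human".
```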
00:38:32
So, you're right, people will go out there.
00:38:34
And now we have to talk to it.
00:38:36
We have to say hey (whatever our engine is, I don't want my
00:38:42
phone to go off when I say the wake word), but I think we really
00:38:46
have to get back to basics on some of these things.
00:38:49
You know, the young ones that are addicted to you know their
00:38:57
phones and asking it the questions, versus using
00:39:01
their own mind to go out there.
00:39:03
And, lastly, I think that the part that we're also missing in
00:39:07
a certain generation is really customer service communication.
00:39:11
It's not all about the text, it's not all about, you know,
00:39:14
maybe having Bard or whichever engine write me the emails.
00:39:17
We have to communicate some way outside of the electronic stage.
00:39:21
So I hope that you know people do look at the basics.
00:39:25
You know, we talked earlier about you starting off
00:39:29
at the help desk, and starting at the help desk we have to
00:39:32
be able to communicate.
00:39:33
Help desk is the perfect example of you know.
00:39:36
I'm calling you for something.
00:39:38
You have to service me on that side of it and I think that I'm
00:39:41
hoping that we get back to having more customer service and
00:39:45
not just the quick answer.
00:39:47
I do want to dial an 800 number and reach a human sometimes
00:39:52
because I have a problem and the chatbot's asking me too many
00:39:56
questions.
00:39:58
Speaker 1: And maybe the most frustrating thing that I go
00:40:04
through is when I deal with some bot, when I pick up the phone
00:40:08
and I dial a number, right, and now I have to answer 50
00:40:11
questions to get to the right place, when I know exactly where
00:40:15
I need to go already.
00:40:17
And if someone else would have just picked up the phone, it
00:40:19
would have been a two second thing, you know would have been
00:40:22
like hey, just put me through to someone and and you know that
00:40:25
handles this right, it would be two seconds.
00:40:28
But now I have to waste my time and spend 45 minutes on this
00:40:32
phone call.
00:40:33
And it is the most frustrating thing that I have to go through,
00:40:40
like when I call like my car insurance or you know, whatever
00:40:44
it is. Like, God forbid
00:40:45
I have to call my insurance provider. Like, oh my God, I
00:40:49
have to book off half my day, you know, just for
00:40:52
decompressing from the aggravation.
00:40:54
Speaker 2: And then it asks for your account number 17 times.
00:40:57
I have to validate this one thing.
00:40:59
One of the AI apps that we're looking at with a lot of our
00:41:03
bandwidth is around the 800 numbers we all have.
00:41:08
Hey, I recognize your number, I know you're a client, but how does
00:41:11
the AI answer?
00:41:13
Hey, nice to meet you. But it also tries to figure out sentiment.
00:41:17
This is Joe calling.
00:41:18
I can tell that his voice is in distress.
00:41:21
We can route him to the right person first.
00:41:23
So, trying to make it smarter. I'm not yelling at my insurance
00:41:28
company, but it can tell in some way what's going on there.
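That routing idea, detect frustration early and escalate to a human, can be caricatured with keyword scoring. Real systems use speech sentiment models; the word list and threshold here are pure assumptions for illustration:

```python
# Hypothetical sketch: score a caller's transcribed opening for
# frustration cues and route hot calls straight to a human agent.
FRUSTRATION_CUES = {"angry", "ridiculous", "again", "supervisor", "frustrated", "hour"}

def route_call(transcribed_opening, threshold=2):
    """'human' when enough frustration cues appear, otherwise 'bot'."""
    words = {w.strip(".,!?").lower() for w in transcribed_opening.split()}
    return "human" if len(words & FRUSTRATION_CUES) >= threshold else "bot"

route_call("This is ridiculous, I have been on hold for an hour again!")  # 'human'
route_call("Hi, I'd like to check my claim status.")                      # 'bot'
```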
00:41:31
My mom was recently in the hospital.
00:41:34
It was over Memorial Day weekend and they had to open a
00:41:37
claim.
00:41:45
So I go and I'm looking online.
00:41:46
I got her information and I'm getting to this point and it's
00:41:48
like oh, you have to call customer service to open this
00:41:50
claim.
00:41:50
It was a weekend, I didn't expect it.
00:41:52
So I go through this.
00:41:52
I call the 800 number, and first, you know,
00:41:59
they're trying to sell me, you know, like a heart monitor.
00:42:00
You know, I whipped my way back around
00:42:03
again, and next thing I know, you know, I'm going out there
00:42:07
and I only find out after an hour or so as I'm getting one of
00:42:11
those robot calls here.
00:42:12
What physically happened at that point is that I found that
00:42:16
they were closed.
00:42:17
So here I am.
00:42:18
I went into an hour of filling something out, getting up to a
00:42:21
spot that I couldn't go out there and do unless I spoke to
00:42:25
someone.
00:42:29
I'd gone around on the calls and, lo and behold, they were
00:42:31
closed for Memorial Day weekend.
00:42:33
Speaker 1: Yeah, I've experienced that myself.
00:42:38
That is just so extremely frustrating.
00:42:39
You know, to kind of circle back on, you know, maybe like a
00:42:41
source of truth, right, from an AI model.
00:42:45
Are you ever concerned with potentially poisoning the pot,
00:42:55
so to speak?
00:42:55
Right, and saying that something is true, like someone,
00:42:57
you know, this is just an example, right, someone at
00:43:01
OpenAI, right, saying that something is true when it's not
00:43:04
actually true to direct, you know, opinions of the public and
00:43:07
in certain ways, I mean, we've seen that with elections, with I
00:43:12
can't even think of the company right now, but you know,
00:43:14
Netflix does a fantastic docuseries on it.
00:43:19
It was very fascinating to me, kind of opened my eyes to hey,
00:43:22
that that's something that can happen, right, and not just can
00:43:26
happen but has happened.
00:43:28
Um, are you ever concerned with that?
00:43:29
Like, I get concerned with that because, I guess, from the
00:43:33
security mindset maybe I have this problem where I'm always
00:43:36
forever in the security mindset where I'm thinking, well, is
00:43:40
that coming from a source of truth, or is that being
00:43:44
influenced some way?
00:43:45
Or you know where's the flaws in it?
00:43:49
Right, and my wife always picks on me because I'm doing
00:43:53
that and she's, you know, a little bit more trusting.
00:43:56
I mean, she's a teacher, you know.
00:43:57
So she's going to be a little bit more trusting.
00:44:00
But do you ever get that, that kind of that thought as well,
00:44:04
that concern?
00:44:06
Speaker 2: Definitely.
00:44:06
You know, I think that you didn't need AI to bring that
00:44:09
along.
00:44:10
I guess you know, if we go back eight years, we're calling it
00:44:13
the algorithm, right?
00:44:15
You know, you read about this.
00:44:18
You know, it's still scary to me.
00:44:20
I was away in Colorado out on business and next thing, you
00:44:25
know, I went to Qatari and then logged into anything.
00:44:28
You know I'm getting fed this in my social media feeds and all
00:44:31
those kind of things.
00:44:31
So it does worry me.
00:44:33
It does worry me about that, but I think that we have to be
00:44:38
careful as we go.
00:44:40
We're not sure who we can trust , right?
00:44:42
So I hope that I look at it in multiple verified sources and I
00:44:48
always say that if it looks too good to be true, there must be
00:44:52
something up.
00:44:53
And I see it now.
00:44:54
I'm a huge Florida Panthers hockey fan.
00:44:59
Just went to the parade yesterday, and I'm a season
00:45:00
ticket holder.
00:45:04
In your feeds, even over the playoffs or over the
00:45:07
weekend, we had just all these little trade rumors.
00:45:09
They weren't trade rumors, they were written like it had
00:45:12
happened, and I was like this has to be.
00:45:12
You know it can't be true.
00:45:14
So I think that we always have to go find a secondary
00:45:19
verifiable source.
00:45:19
It's out there today, no matter what tools we use, and I think
00:45:23
we just have to educate people.
00:45:25
I think the younger they are, the less likely they are to look
00:45:30
at a second source.
00:45:31
I do a lot of speaking and education with local colleges
00:45:38
and local schools to go out there and just talk about this,
00:45:42
just like we're talking.
00:45:43
You can't just believe it because it came on that app feed and
00:45:47
then you're going to tell someone else, and when you post
00:45:49
it, that's the old game of telephone that you're playing.
00:45:53
You have to be sure and we're trying to convince them that
00:45:56
just because it came on what you thought was the trusted
00:45:59
source feed, if it doesn't make sense, you've got to look
00:46:03
elsewhere and you can't spread it.
00:46:06
I think that publicly we're out there educating all of our
00:46:11
clients in various markets on just the cyber threats and I
00:46:16
view this as another terrible cyber piece, because now the bad guys
00:46:21
have a little more horsepower because of the AI.
00:46:24
We look at this and it's important to go out there and
00:46:29
educate.
00:46:29
We speak at middle schools, we've spoken at retirement
00:46:33
communities, we've spoken all over the place that we have to
00:46:37
be cyber aware in almost every aspect of our life these days.
00:46:43
Speaker 1: Yeah, that is.
00:46:44
That's like, what is it, the blessing and the burden, or
00:46:49
whatever
00:46:50
the saying is, right? Because, you know, technology is
00:46:53
a fantastic thing, right?
00:46:56
I mean, it catapulted our entire community and world into
00:47:01
this different, you know, kind of dimension almost right.
00:47:04
And it, you know, helps millions of people every single
00:47:08
day.
00:47:08
You know billions of people every single day.
00:47:11
But we're at this point where it's like you can't just trust
00:47:17
what you're seeing, you know, on Google.
00:47:19
You can't just trust whatever you're seeing on your feed,
00:47:22
right and like, like you were mentioning, with the trade
00:47:26
rumors and whatnot.
00:47:26
I see that non-stop and it is the most frustrating thing
00:47:31
because it's almost like AI generated this article that's
00:47:36
intentionally clickbait, that doesn't even talk about the
00:47:39
thing that's in the title, just to get you to click on the
00:47:43
website so that they can get more ad revenue or whatever it
00:47:46
is.
00:47:46
And I've started to actually unfollow a lot of those accounts
00:47:50
because it's really frustrating .
00:47:52
You're monetizing me in a way that I am okay with if you're
00:47:58
providing me with what you're
00:48:00
supposed to be providing me with, but you're not, and so you're
00:48:04
just kind of abusing my intrigue, my interest.
00:48:10
It happens every single year.
00:48:13
Speaker 2: I'm a big NFL fan, and it's bad enough, I go out
00:48:16
there and you'll see a report that X Y Z died.
00:48:18
I'm like, oh my God. After the first time, I'm not falling for
00:48:23
any of those things.
00:48:24
And then you're ambushed by some gadget ads.
00:48:30
Speaker 1: Yeah, it's very frustrating.
00:48:34
How do we, is there a way to have an unbiased fact checker?
00:48:42
You know, because I go back right and you know, in the 2020
00:48:46
elections there, it was a huge, huge thing of everyone saying
00:48:51
you know, you have to fact check .
00:48:53
And you know you had all these different fact checking
00:48:55
resources and whatnot, and even friends of mine in security, you
00:48:57
know you had all these different fact checking
00:48:57
resources and whatnot, and even friends of mine in security, you
00:49:01
know, would go to a fact checker and and check it and
00:49:03
whatnot.
00:49:04
Right, and I felt like I was the only one sitting here
00:49:07
saying like, well, what if someone that you know created
00:49:11
that fact checker doesn't agree with that viewpoint?
00:49:13
Yeah, and they're, and they're, you know, saying that something
00:49:16
is true when it's not right.
00:49:18
Like, how do you know?
00:49:18
And no one could ever answer that question, you know.
00:49:21
So it's just, we're going into like uncharted territory, I feel.
00:49:27
Speaker 2: We are, we are and you know it's funny, I'm a big
00:49:30
reader.
00:49:30
I like, you know, fantasy books and all the spy
00:49:34
kind of stuff. I read these and you're like, is
00:49:37
that real?
00:49:38
Like do they really have this tool?
00:49:40
And you know, sometimes they do .
00:49:42
You know, after we're looking at you know the stuff that was
00:49:46
created for good out of the bad.
00:49:48
You know, I always was a really big, you know, World War II kind
00:49:53
of book guy.
00:49:54
I always followed,
00:49:55
you know, the Manhattan Project, and with Oppenheimer out
00:49:59
now, you look at that, I went back and I re-read stuff that I
00:50:05
learned before.
00:50:05
Because you look at this: they would have shared it with the world
00:50:10
until they realized what could possibly happen,
00:50:13
before they developed the atomic bomb.
00:50:17
You can apply that to all the tech that could have been
00:50:19
developed yesterday, right, so we've developed this amazing
00:50:23
thing called social media and you know the people have freedom
00:50:27
to have their device of choice at a very reasonable price out
00:50:30
there, right anywhere in the world to see this stuff.
00:50:33
But you know you can choose to use some of the good for bad, or
00:50:37
I'll say a lot of the good for bad, and you know it leaves people
00:50:41
confused and we have a newer generation that trusts what they
00:50:44
see on.
00:50:45
You know, coming across that feed and I always kind of say
00:50:49
it's our job to educate them.
00:50:50
That's what we're talking about.
00:50:51
Look for another source just to be sure, because if it's too
00:50:55
bad or too good to be true there's someone trying to take
00:50:58
advantage of us. It's going to go on for the next 10, 15 years,
00:51:02
and I have four grandchildren under five, so I
00:51:05
worry about that.
00:51:05
You know, a five-year-old now can walk into my house, you know,
00:51:09
flip on the smart TV, find his spot on there, you know, control it
00:51:13
with the phone.
00:51:13
But you know they're trusting because they grew up with that
00:51:17
and they're used to thinking, well,
00:51:19
why would I doubt this?
00:51:21
Speaker 1: Yeah, that's very true.
00:51:22
I have a one-year-old at home and I'm already thinking through
00:51:25
like, man, I'm training them to trust these
00:51:48
resources, like, natively, you know, by default.
00:51:48
Like, how am I supposed to
00:51:49
raise this kid
00:51:49
to have a healthy sense of caution?
00:51:51
You know, when they're told something: hey, maybe
00:51:51
I should go find out for myself.
00:51:52
What's the source of that information, all that sort of
00:51:53
stuff.
00:51:55
It's a very difficult problem to solve because of where we're
00:51:59
going in tech right, where like, and how quickly it's evolving.
00:52:03
I feel like we can't get ahead of it.
00:52:06
We're kind of just trying to catch up.
00:52:11
Speaker 2: We do a lot of customer and prospect-facing
00:52:15
events.
00:52:16
Listen, after a while, we can feed them cyber feeds, left and
00:52:22
right.
00:52:22
We can feed them the latest breach.
00:52:26
What we try to do is we try to have a cause and effect.
00:52:27
And, you know, in South Florida we're high on the human
00:52:34
trafficking scale.
00:52:36
So I've been on this piece for the last year, where we talk about
00:52:40
cyberbullying.
00:52:40
The after effects, the secondary effects, are that the
00:52:43
wrong people have access to that information.
00:52:46
So we're trying to tie together some of these things that
00:52:49
people can see that cause and effect, not just hey, what would
00:52:54
happen to all the power manufacturers and the RBL ships?
00:52:57
Or look at MGM or all of this, when we talk about some of those
00:53:02
big breaches that we can every day relate to. Like, we can
00:53:06
relate to: you're in a hotel, oh my God,
00:53:09
what would happen if your data got out?
00:53:11
Well, it happens, you know.
00:53:13
So I show them the realistic things and lead them up to this
00:53:16
so that maybe they can come to their own conclusions.
00:53:19
You know, I think you and I at one point might have been in the
00:53:22
room and maybe we were definitely the lone minority:
00:53:27
what the heck are these guys warning about?
00:53:30
Well, you know, the Internet's a great thing, and as people
00:53:34
turned to that, you know, we became just a larger group,
with everybody counting on the end user for security things.
00:53:38
But we have to give them those causes and effects to see what
00:53:42
else goes on out there.
00:53:43
There's a reason they want your credit card, you know.
00:53:46
There's a reason that they're breaching your data and if you
00:53:48
might not see that right away, but you might
00:53:52
see it later on and you wonder how it happened.
00:53:54
Even in our best-protected states, I will say that one of
00:54:00
my credit cards for a while was constantly breached, even when I
00:54:06
went back to the vendor, and now I'm on their A-list and
00:54:10
it takes me four phone calls to get it fixed.
00:54:13
Just somehow it happens and I like to give them personal
00:54:18
things that happen.
00:54:19
I own a Pineapple device.
00:54:20
I tell stories of the Pineapple device just to show them.
00:54:24
Hey, I don't even turn it on anymore.
00:54:27
It's so scary what data you can get.
00:54:29
So I want to give them realistic things
00:54:32
that they can actually relate to and, you know, kind of watered down
00:54:36
to protect the users and not bash the company.
00:54:39
But down there, basically, we do do more of those realistic
00:54:42
things to show them what could potentially happen and how one
00:54:47
simple thing or two simple things can better protect you,
00:54:51
so that the bad guy moves on to the next guy. Or to point out, not that
00:54:57
you're left or right, but that people think, that's a killer,
00:55:00
there's no way that could possibly be true.
00:55:03
And occasionally I have a throwaway device, a Chromebook
00:55:08
or something, and we talk about it and I'm like, hey, I saved a
00:55:11
couple of these things, watch what happens, you know, nothing
00:55:15
tied to me kind of thing, so you can show people, like, we know
00:55:19
this is going to happen, don't just take my word for it.
00:55:21
Watch this and hey, I'm going to wipe this device after it's
00:55:24
done.
00:55:24
You know, just to give them things.
00:55:31
So we're very much into realistic kinds of scenarios,
00:55:33
stories and pictures and, you know, it's real.
00:55:36

Speaker 1: Right, right. Yeah, well, you know, Michael,
00:55:40
this has been a fantastic conversation, you know,
00:55:44
and we went down so many different rabbit holes
00:55:48
and whatnot.
00:55:49
You know, I accidentally didn't even bring up
00:55:52
what your company is.
00:55:53
You know, we talked about it, but not really what it is.
00:55:57
So, you know, before I let you go, how about you tell my
00:56:00
audience where they could find you if they wanted to reach out
00:56:02
to you and where they could find your company if they wanted to
00:56:04
learn more?
00:56:05
Speaker 2: So my company is LAN Infotech and we are based
00:56:09
in South Florida.
00:56:10
We're a big Microsoft partner.
00:56:12
You can find me on LinkedIn.
00:56:14
You know, Michael A. Goldstein is there.
00:56:17
You know, on most of the major social media pieces.
00:56:21
You know, it's not all the real Michael Goldstein, but there's a
00:56:25
lot of me that's out there.
00:56:26
But you know, this is definitely great.
00:56:29
I think there's a lot of good rabbit holes to discuss and, you
00:56:31
know, it gives a lot of people a lot of things to think about.
00:56:34
But we're out there.
00:56:35
We're out there trying to help the world, you know, through
00:56:39
technology. Yeah, absolutely. Well, thanks everyone.
00:56:44
Speaker 1: I hope you enjoyed this episode.