We’re back for Q3 2024. Quick question of the day … then onto the topics of the day.
Topic 1: MIT Releases the AI Risk Repository
Thanks to the Massachusetts Institute of Technology for a major resource. It’s definitely more for you than your clients. But we see great opportunity here.
The public database contains 700+ risks as documented by published papers. It includes many categories, based on severity and other variables.
Check it out here: https://airisk.mit.edu/
You can poke around online and get some good information. Or download the entire database in Excel format. You can then sort, color code, or do whatever you want with the data.
And that includes providing education and training for your clients.
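For those who do download the spreadsheet, the sort-and-count workflow is a few lines of pandas. This is a minimal sketch, not the repository's actual schema: the column name `Risk category` and the rows are hypothetical stand-ins, so check the real headers in the downloaded file before adapting it.

```python
import pandas as pd

# Stand-in for the downloaded spreadsheet. In practice you would load it with:
#   df = pd.read_excel("ai_risk_repository.xlsx")
# The "Risk category" column name is an assumption -- verify against the file.
df = pd.DataFrame({
    "Risk category": ["Privacy", "Misinformation", "Privacy", "Malicious actors"],
    "Description": ["...", "...", "...", "..."],
})

# Count risks per category, largest first -- the kind of quick summary
# you could drop into a client education deck.
counts = df["Risk category"].value_counts()
print(counts)
```

The same pattern works for filtering by keyword (`df[df["Description"].str.contains("privacy", case=False)]`) before presenting a subset to a client.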
— — —
Topic 2: Shocker – Google Has a Monopoly on Search!
Okay, not really a shocker. But does it matter that a judge says so?
Does anyone care (other than other search engines)? Would breaking up Google change anything? What is most likely to actually happen (e.g., a Microsoft-like consent decree)? The 1984 AT&T breakup into seven Baby Bells may help us see 40 years into the future.
As usual, we discuss.
Related Link: https://www.nytimes.com/2024/08/13/technology/google-monopoly-antitrust-justice-department.html
— — —
Topic 3: NIST Releases Cryptography for Quantum Computing
See https://www.axios.com/2024/08/13/nist-post-quantum-cryptography-encryption.
Quantum computing has been ten years away . . . for more than forty years. Besides the “no harm to update,” how much attention do we really need to give to quantum computing?
We have three different perspectives on how much this “news” matters. As usual. What do you think?
00:00:00.000 –> 00:00:06.000
From
somewhere deep in the cloud and the
00:00:06.025 –> 00:00:12.200
corners of the earth, this is the Killing
It podcast, with a focus on helping you
00:00:12.225 –> 00:00:15.660
make sense and dollars of all things IT.
00:00:15.685 –> 00:00:21.140
With your hosts, Dave Sobel,
Ryan Morris, and Karl Palachuk.
00:00:21.165 –> 00:00:29.100
Welcome, everybody, to the brand new
episode 209 of the Killing It Podcast.
00:00:29.125 –> 00:00:31.375
You guys are actually in sync.
00:00:31.400 –> 00:00:33.060
I’m not going to have
to do anything to that.
00:00:33.085 –> 00:00:34.640
This is instinctual at this point.
00:00:34.665 –> 00:00:36.100
We do it so well.
00:00:36.125 –> 00:00:37.420
Smooth professionals.
00:00:37.445 –> 00:00:40.460
Probably did it in your sleep last night.
00:00:40.485 –> 00:00:43.440
Guys, I’m going to just dive right in
because I’m excited to talk
00:00:43.465 –> 00:00:46.775
to you about our topics.
I want to warm us up a little bit.
00:00:46.800 –> 00:00:47.940
We’re coming off the Olympics.
00:00:47.965 –> 00:00:53.220
If you could actually compete at an
Olympic level, what athletic
00:00:53.245 –> 00:00:54.400
event would it be?
00:00:55.240 –> 00:00:56.980
I am super non-athletic.
00:00:57.005 –> 00:01:01.800
I’ve never actually been
able to run a block.
00:01:02.160 –> 00:01:06.700
But I think if I were in the Olympics, I
would want to do archery
00:01:06.725 –> 00:01:11.050
just because it would be fun to
actually be good at something like that.
00:01:11.080 –> 00:01:16.050
I can shoot an arrow,
I can’t hit anything.
00:01:16.080 –> 00:01:20.770
They always say, If you miss the
target, it’s not the target’s fault.
00:01:20.800 –> 00:01:24.380
That defines my business.
00:01:24.405 –> 00:01:31.620
See, and if one of the options is not
becoming a professional Olympic Olympic
00:01:31.645 –> 00:01:36.460
watcher, which, by the way, I did in the
last couple of weeks, it
00:01:36.485 –> 00:01:38.480
was very compelling stuff.
00:01:38.505 –> 00:01:42.975
And by the way, super kudos to Paris for
the way that they did the locations
00:01:43.000 –> 00:01:44.240
and the venues and everything.
00:01:44.265 –> 00:01:48.360
Seriously, that was cooler
than any other Olympics.
00:01:48.385 –> 00:01:53.140
But I would still go back to where I
am from, which is the soccer stuff.
00:01:53.165 –> 00:01:58.340
I was actually, as a teen, coming up in a
development program, Olympic track
00:01:58.365 –> 00:02:01.495
for the U23 Men’s Soccer Tournament.
00:02:01.520 –> 00:02:03.260
Now, I missed it because
I wasn’t good enough.
00:02:03.285 –> 00:02:09.880
But if I could overcome that in my youth,
one can only imagine what
00:02:09.905 –> 00:02:11.780
the trajectory would have been.
00:02:11.805 –> 00:02:14.700
So the Men’s Soccer team,
they need some help.
00:02:14.725 –> 00:02:17.255
Let’s just admit it.
They need some help right now.
00:02:17.280 –> 00:02:19.120
They need to take some lessons
from the women’s soccer.
00:02:19.145 –> 00:02:21.560
You might actually be closer to being
able to compete at that level
00:02:21.585 –> 00:02:24.170
than many of the other sports.
I’m with you, Ryan.
00:02:24.200 –> 00:02:28.360
Gold Zone was the greatest
invention ever for peak viewing.
00:02:28.385 –> 00:02:32.890
I loved the ability to watch
all the sports all day long.
00:02:32.920 –> 00:02:37.055
As they moved quickly around, and here’s
this one, and here’s this one, and here’s
00:02:37.080 –> 00:02:38.860
gold over here, and here’s gold over here.
00:02:38.885 –> 00:02:41.500
I was addicted, and it was too fun.
00:02:41.525 –> 00:02:45.360
But if you were actually- Oh, by the
way, it’s in front of the Eiffel Tower.
00:02:45.385 –> 00:02:46.280
It was pretty cool.
00:02:46.305 –> 00:02:50.400
When I’m picking a sport, I’m going to go
with distance cycling because
00:02:50.425 –> 00:02:52.180
that feels the most useful.
00:02:52.205 –> 00:02:58.040
Because not only could I actually do it at
that level, I could then also commute
00:02:58.065 –> 00:02:59.860
locally in really useful ways.
00:02:59.885 –> 00:03:02.695
It feels like it would
be a really cool sport.
00:03:02.720 –> 00:03:02.920
Right.
00:03:02.945 –> 00:03:05.820
You can get back to
pedaling your electric bike.
00:03:05.845 –> 00:03:07.320
I know.
Exactly.
00:03:07.345 –> 00:03:10.360
I want to take time off the e-bike.
00:03:10.640 –> 00:03:15.000
And I’m pretty sure Carl’s skill is also
tactically applicable.
00:03:15.025 –> 00:03:20.460
So in a world, I’m thinking being
able to play soccer, that’s cute.
00:03:20.485 –> 00:03:23.240
But archery would get you some kudos.
00:03:23.265 –> 00:03:26.520
On occasion, you really need that skill.
00:03:26.920 –> 00:03:28.080
All righty.
00:03:28.105 –> 00:03:31.500
Well, let’s dive in. We
have three topics, as usual.
00:03:31.525 –> 00:03:35.775
So the first topic we’re going
to dig into today is MIT.
00:03:35.800 –> 00:03:37.340
We’re going to put a
link in the show notes.
00:03:37.365 –> 00:03:43.100
But MIT has released an AI Risk
Repository, which is
00:03:43.125 –> 00:03:50.480
literally 700 articles and published
papers on risks that you need to be
00:03:50.505 –> 00:03:53.140
aware of with artificial intelligence.
00:03:53.165 –> 00:03:59.600
The cool thing about this is if you go to
the link that we give you, it’s at MIT at
00:03:59.625 –> 00:04:06.610
the university, and they allow you to
download this as an Excel spreadsheet.
00:04:06.640 –> 00:04:13.780
So if you want to do your own analysis,
sort, filter, whatever, do counts on
00:04:13.805 –> 00:04:17.020
keywords, anything you want to do.
00:04:17.045 –> 00:04:24.160
What I love about this is it’s an academic
view of something that is what we in
00:04:24.185 –> 00:04:26.620
the nerd world would call open source.
00:04:26.645 –> 00:04:32.800
So here you have this open database,
and you can decide for yourself
00:04:32.825 –> 00:04:36.020
whether or not it’s useful.
Now, who is it useful to?
00:04:36.045 –> 00:04:41.660
Well, it is 10% of this
audience and 0% of your clients.
00:04:41.685 –> 00:04:43.160
Oh, I disagree completely.
00:04:43.185 –> 00:04:46.060
Okay, this is good when we disagree.
00:04:46.085 –> 00:04:53.440
We’re in the golden age of NIST
and academic frameworks that I feel
00:04:53.465 –> 00:04:56.420
are incredibly useful in business.
00:04:56.445 –> 00:05:02.580
Again, I’m going to fall back on my basic
concept of the true value of IT services
00:05:02.605 –> 00:05:05.860
organizations
is in their expertise in guiding
00:05:05.885 –> 00:05:07.700
customers to the correct technologies.
00:05:07.725 –> 00:05:14.400
You have more than any time ever,
academics and industry professionals who
00:05:14.425 –> 00:05:19.260
spend their entire lives
thinking about this in a
00:05:19.285 –> 00:05:24.940
conceptual sense and give you all of the
resources to be incredibly smart
00:05:24.965 –> 00:05:27.420
very quickly on a lot of topics.
00:05:27.445 –> 00:05:32.700
I look at a database like this and say,
this is exactly the useful framework
00:05:32.725 –> 00:05:36.895
that I can then take to customers
and apply when I need to consider
00:05:36.920 –> 00:05:38.060
and answer their questions.
00:05:38.085 –> 00:05:41.900
I don’t have to talk in theoretical
terms about what AI risk is.
00:05:41.925 –> 00:05:46.460
I have exact models and use cases to
compare against to understand where the
00:05:46.485 –> 00:05:51.000
smartest people around who just spend their
time thinking about this have given
00:05:51.025 –> 00:05:52.800
me all of the resources needed.
00:05:52.825 –> 00:06:01.620
You can get up to speed on practices
at an incredibly rapid rate and leverage
00:06:01.645 –> 00:06:04.140
incredible intelligence
quickly with your customers.
00:06:04.165 –> 00:06:09.080
What I want people to think is, I’m not
saying I think everyone needs to go
00:06:09.105 –> 00:06:12.860
and learn all 700 and some scenarios.
00:06:12.885 –> 00:06:17.060
It’s more the, you need to be a really
good person
00:06:17.085 –> 00:06:23.740
at matching information to your customers,
and you’re given a searchable, indexable,
00:06:23.765 –> 00:06:27.580
sortable resource that
you can use very quickly.
00:06:27.605 –> 00:06:31.580
I look at this and say, your job
is knowing all these resources.
00:06:31.605 –> 00:06:35.700
These resources are available to you, and
leveraging them makes you exceptional.
00:06:35.725 –> 00:06:38.940
To your point, Carl, is
only 10% will use it?
00:06:38.965 –> 00:06:40.880
Yeah, those are the best 10%.
00:06:42.360 –> 00:06:48.400
Well, and those are the 10% who are going
to get paid for their professional
00:06:48.425 –> 00:06:50.820
services and consulting advice.
00:06:50.845 –> 00:06:56.000
I have three comments about this
that I think it is very interesting.
00:06:56.025 –> 00:07:01.300
Number one, thank you very much, MIT, for
quantifying what has up until now been
00:07:01.325 –> 00:07:04.580
just a general sense of foreboding doom.
00:07:04.605 –> 00:07:08.420
Everybody has said, AI equals risk.
What risk?
00:07:08.445 –> 00:07:12.100
Well, one, and two, and three, and
holy cow, you can’t imagine how many.
00:07:12.125 –> 00:07:16.880
Yes, we can, because we’re scientists.
Because in a professional world, you can
00:07:16.905 –> 00:07:23.500
say there are not just a lot,
there are a number, and that’s very good.
00:07:23.525 –> 00:07:27.440
And by the way, there are
at least 700 risks with AI.
00:07:27.465 –> 00:07:31.540
We all know that intuitively, but
we need a number to work against.
00:07:31.565 –> 00:07:36.800
Number two, exactly to your point, Carl,
very few customers are going to go into
00:07:36.825 –> 00:07:40.760
that and try to solve for these problems
because it is the paralysis
00:07:40.785 –> 00:07:42.980
by analysis paradox.
00:07:43.005 –> 00:07:45.800
It’s literally, I can be responsible.
00:07:45.825 –> 00:07:49.295
Holy cow, there’s too many
things to be responsible about.
00:07:49.320 –> 00:07:51.400
I’m just going to do what I was going to
do anyway, and we’re just going to
00:07:51.425 –> 00:07:53.660
hope that everything works out okay.
00:07:53.685 –> 00:07:58.100
Customers are not going to go in
there and use this information.
00:07:58.125 –> 00:08:02.520
But point number three, I think, and if
anybody would like to know how to do this,
00:08:02.545 –> 00:08:04.500
we would be happy to do this for you.
00:08:04.525 –> 00:08:12.040
You can package a service offering by
industry, by customer size that says an
00:08:12.065 –> 00:08:16.220
AI risk assessment for your organization.
00:08:16.245 –> 00:08:21.020
I will examine the following X number
of categories and specific use cases.
00:08:21.045 –> 00:08:27.020
I will identify actual behavior within
your organization, a profile of what your
00:08:27.045 –> 00:08:32.080
business processes are, what is actually
going on in the real world, too, because I
00:08:32.105 –> 00:08:35.400
have network tools that allow me to
identify whether these tools are
00:08:35.425 –> 00:08:37.380
being used in your environment.
00:08:37.405 –> 00:08:41.000
And I can give you an action oriented
report that says, here are your
00:08:41.025 –> 00:08:44.720
vulnerabilities, here are the action item
recommendations, and here’s exactly
00:08:44.745 –> 00:08:46.580
what you should do about that stuff.
00:08:46.605 –> 00:08:51.920
Oh, and by the way, if it’s in the SMB
world, that costs 5 to $10,000 for that
00:08:51.945 –> 00:08:55.140
level of work, and it’s going
to be highly templatized.
00:08:55.165 –> 00:09:00.335
If I’m in the mid-market,
I’m talking 25 to 35, and it’s
00:09:00.360 –> 00:09:01.740
going to require some interviews.
00:09:01.765 –> 00:09:07.280
If you happen to want to get involved with
some divisions of larger enterprises in
00:09:07.305 –> 00:09:11.960
your local neighborhood, this is a great
way to get your foot in the door and get a
00:09:11.985 –> 00:09:19.720
50 to 100K services gig that is, A, very
timely, and B, once you’ve done it a few
00:09:19.745 –> 00:09:23.860
times and you’ve figured out what the
templates are, it is oh, so infinitely
00:09:23.885 –> 00:09:28.060
repeatable that your margins
shouldn’t be 50% on that thing.
00:09:28.085 –> 00:09:31.220
They should be 75% on that thing.
00:09:31.245 –> 00:09:32.060
This is great.
00:09:32.085 –> 00:09:36.380
That, to me, makes the point that 10% of
our audience
00:09:36.405 –> 00:09:41.740
is actually interested in going to
all that work or doing that deep dive.
00:09:41.765 –> 00:09:48.280
I do think this is a rich, rich area
to create a spectacular presentation on
00:09:48.305 –> 00:09:54.540
the dangers of AI that you can give in 20
minutes, not a super crazy deep dive, but
00:09:54.565 –> 00:09:58.120
just literally browse through this thing
and say, Well, what are
00:09:58.145 –> 00:09:59.695
the concerns about privacy?
00:09:59.720 –> 00:10:01.040
What are the concerns
about misinformation?
00:10:01.065 –> 00:10:03.380
What are the concerns about bad actors?
00:10:03.405 –> 00:10:09.695
And get a few things where you can point
to the references that are outlined
00:10:09.720 –> 00:10:12.300
because this is all referencing articles,
referencing articles,
00:10:12.325 –> 00:10:14.000
referencing articles.
00:10:14.560 –> 00:10:19.920
You could repeat that to the Kiwanis, the
Rotary, the BNI, the Morning
00:10:19.945 –> 00:10:21.300
Chambers of Commerce.
00:10:21.325 –> 00:10:25.160
You can give this presentation every day
for the next month, and you
00:10:25.185 –> 00:10:26.735
might get some clients out of it.
00:10:26.760 –> 00:10:27.760
Most people are not going to do it.
00:10:27.785 –> 00:10:28.500
Yeah, but we can say
that about everything.
00:10:28.525 –> 00:10:30.095
We can say that about everything.
00:10:30.120 –> 00:10:32.380
We can say that about everything.
Most people will not execute.
00:10:32.405 –> 00:10:35.935
And that’s why, in a certain degree, I
don’t mind talking about it
00:10:35.960 –> 00:10:36.540
because it’s the…
00:10:36.565 –> 00:10:38.300
Look, I know a lot of people
aren’t going to do it.
00:10:38.325 –> 00:10:40.015
You don’t want to be that person.
00:10:40.040 –> 00:10:42.180
You want to be the person
that actually executes.
00:10:42.205 –> 00:10:46.900
And that’s where the value is.
00:10:46.925 –> 00:10:50.720
Well, and to your point, Carl, if you’re
not going to package it the way I
00:10:50.745 –> 00:10:55.680
described and sell it as a professional
service, please use it as a promotional
00:10:55.705 –> 00:11:00.895
vehicle to get into a conversation that
then says, Oh, and by the way, we
00:11:00.920 –> 00:11:02.380
could monitor your network for you.
00:11:02.405 –> 00:11:04.340
You should sign a managed
services contract.
00:11:04.365 –> 00:11:11.580
That, to me, is a very topical
reality that almost everybody can use.
00:11:11.605 –> 00:11:14.560
But we’re done giving
you advice on that topic.
00:11:14.585 –> 00:11:18.100
Let’s move on to our next one, guys.
00:11:18.125 –> 00:11:21.400
Ripped from the headlines of the most
obvious news that any of us have
00:11:21.425 –> 00:11:25.420
encountered recently,
Google has a monopoly.
00:11:25.445 –> 00:11:26.940
What?
00:11:26.965 –> 00:11:28.820
Say it ain’t so.
00:11:28.845 –> 00:11:32.360
And not just a monopoly,
but a monopoly in search.
00:11:32.385 –> 00:11:38.080
So the news, if you have not been keeping
up with this, this is not immediate.
00:11:38.105 –> 00:11:41.360
This is not just a short term assessment
where somebody woke up one day
00:11:41.385 –> 00:11:43.000
and decided to point fingers.
00:11:43.025 –> 00:11:49.660
This is years of analysis and legal
exploration going into a scenario to
00:11:49.685 –> 00:11:54.300
examine the business behavior, not the
prevalence of technology,
00:11:54.325 –> 00:11:59.660
but the business practices around the
application, requiring it to be positioned
00:11:59.685 –> 00:12:06.640
as default, requiring compatibilities and
embedded capabilities with their search
00:12:06.665 –> 00:12:09.900
engine in other people’s software.
00:12:09.925 –> 00:12:11.740
It’s not the technology.
00:12:11.765 –> 00:12:15.820
It is the business behavior around the
technology that
00:12:15.845 –> 00:12:18.060
apparently has risen to a level.
00:12:18.085 –> 00:12:23.720
Now, we’ve been talking about
some things that we believe are obvious
00:12:23.745 –> 00:12:28.220
monopolies in many dimensions of our
industries for a number of years.
00:12:28.245 –> 00:12:33.940
This is the first time anybody with any
authority actually said something about it
00:12:33.965 –> 00:12:35.860
since Microsoft.
00:12:35.885 –> 00:12:37.540
That’s a million years ago.
00:12:37.565 –> 00:12:41.660
What do you guys think is
actually going to come from this?
00:12:41.685 –> 00:12:45.840
And do you believe this is going to change
anything about their behavior?
00:12:45.865 –> 00:12:48.820
I want to make predictions.
I’m totally on board for this.
00:12:48.845 –> 00:12:53.280
So I’m going to say I don’t
think Google will get broken up.
00:12:53.305 –> 00:12:58.980
I think that is one of those moves that
the government doesn’t generally like to
00:12:59.005 –> 00:13:03.420
do. I’m not sure
it’s necessarily as clean.
00:13:03.445 –> 00:13:07.560
I could see a very clear argument for you
could split main Google
00:13:07.585 –> 00:13:09.500
search with YouTube.
00:13:09.525 –> 00:13:13.020
It makes sense to me because you end
up with a world of two search engines.
00:13:13.045 –> 00:13:14.400
But I don’t actually…
00:13:14.425 –> 00:13:18.420
Because the first immediate move would
be to make themselves search the other thing.
00:13:18.445 –> 00:13:22.260
It makes a lot of sense if you
want to create a search engine bit.
00:13:22.285 –> 00:13:25.460
But I actually think they’re going
to be a little bit more cautious.
00:13:25.485 –> 00:13:30.460
I think the main thing I’m going to see
out of this is a ban on
00:13:30.485 –> 00:13:36.060
these contracts where they’re buying
off someone else to not get in there.
00:13:36.085 –> 00:13:38.260
The obvious one is Apple.
00:13:38.285 –> 00:13:43.240
Google just writes a check to Apple so
that they keep placement as
00:13:43.265 –> 00:13:45.895
a search engine, but it also
disincentivizes Apple
00:13:45.920 –> 00:13:46.980
from ever looking at that.
00:13:47.005 –> 00:13:51.720
I think they’re going to clearly say, When
you are at some measurable level of the
00:13:51.745 –> 00:13:55.980
dominant player, you cannot
buy your way to hold on that.
00:13:56.005 –> 00:13:57.340
You’ve got to compete.
00:13:57.365 –> 00:14:01.380
I think that’s going to be the obvious
remedy that we’ll see out of this.
00:14:01.405 –> 00:14:05.980
Whether or not that will inspire players
like Apple to get into search
00:14:06.005 –> 00:14:09.240
will be an interesting play.
00:14:09.520 –> 00:14:16.820
I think the one thing that I’m looking for
is a remedy that actually does create
00:14:16.845 –> 00:14:19.300
the opportunity to create new markets.
00:14:19.325 –> 00:14:24.460
And I’ll be interested in, and by the way,
just slowing Google down might do that.
00:14:24.485 –> 00:14:28.060
I think there’s an argument that their search
results have been
00:14:28.085 –> 00:14:33.140
garbage for a while and that they are
not delivering on the best product.
00:14:33.165 –> 00:14:39.680
If they’re distracted by having to handle
a bunch of legal stuff, it might open the
00:14:39.705 –> 00:14:42.135
opportunity for somebody
else to do something there.
00:14:42.160 –> 00:14:45.200
But I think the obvious bit is we’re going
to say, No, you can’t buy your
00:14:45.225 –> 00:14:47.340
way to the top of the market?
00:14:47.365 –> 00:14:48.700
A couple of things.
00:14:48.725 –> 00:14:56.900
First of all, thank goodness, the EU
has done a better job of being effective
00:14:56.925 –> 00:15:02.060
with managing the big tech giants
than the United States.
00:15:02.085 –> 00:15:06.500
I think that basically saying, No, you
can’t do that,
00:15:06.525 –> 00:15:09.760
has already set Google on the path to
figuring out how they’re
00:15:09.785 –> 00:15:10.740
going to handle this.
00:15:10.765 –> 00:15:14.280
I don’t know what their answer is, but I
know that Google knows
00:15:14.305 –> 00:15:15.960
what their answer is.
00:15:15.985 –> 00:15:17.360
I do think it’s interesting.
00:15:17.385 –> 00:15:20.420
It’s not illegal to have a monopoly.
00:15:20.445 –> 00:15:25.360
It is illegal to behave like a monopoly.
00:15:25.385 –> 00:15:27.620
It’s like, Okay, here’s a question.
00:15:27.645 –> 00:15:31.380
If you could choose
any search engine in the world, which
00:15:31.405 –> 00:15:34.660
you can, would you choose Google?
00:15:34.685 –> 00:15:39.740
Today and day after day after day, I do.
00:15:39.765 –> 00:15:43.040
Not that it is the greatest
I could possibly imagine.
00:15:43.065 –> 00:15:44.300
It’s like the government.
00:15:44.325 –> 00:15:47.580
It’s not the best
democracy that can be made.
00:15:47.605 –> 00:15:50.060
It’s the best that we will accept.
00:15:50.085 –> 00:15:54.260
I would like Google to
be better in many ways.
00:15:54.285 –> 00:16:00.220
I think the interesting part is the look
at Microsoft is probably a good example
00:16:00.245 –> 00:16:04.100
that what they really did is they looked
at little things like,
00:16:04.125 –> 00:16:08.560
Well, you can’t make a deal with IBM
so that IBM has to pay you for an
00:16:08.585 –> 00:16:11.300
operating system, whether
they install it or not.
00:16:11.325 –> 00:16:13.615
Like, holy crap.
00:16:13.640 –> 00:16:15.800
That’s a genius move
if you’re selling.
00:16:15.825 –> 00:16:19.500
It’s not a great move if you’re buying.
00:16:19.525 –> 00:16:23.380
And maybe they will figure out how to
open it up and have some real competition.
00:16:23.405 –> 00:16:27.740
Apple would be smart to never
get into the search business.
00:16:27.765 –> 00:16:32.680
So they’re always going to buy a partner.
It’s just a matter of, Okay, can somebody
00:16:32.705 –> 00:16:36.780
else offer them more money than Google or
a better deal or something that fits
00:16:36.805 –> 00:16:42.940
better with their future AI offering
or fits better with their ecosystem?
00:16:42.965 –> 00:16:48.660
What’s the searching that works best
for them, separate from everybody else?
00:16:48.685 –> 00:16:52.040
I would love to think that there’s a
future where we each have
00:16:52.065 –> 00:16:54.300
these bespoke search engines.
00:16:54.325 –> 00:16:58.300
I have what works for my business,
you have what works for your business.
00:16:58.325 –> 00:17:02.720
There are lots of specialty search
engines, but they’re really hard to
00:17:02.745 –> 00:17:04.900
use unless you’re in that specialty.
00:17:04.925 –> 00:17:09.575
Maybe we’re going to force some
innovation to other people as well.
00:17:09.600 –> 00:17:11.280
Before I let Ryan in, I’m going
to just make a quick comment.
00:17:11.305 –> 00:17:13.380
You asked a really good question
like, would you choose something else?
00:17:13.405 –> 00:17:17.040
I’m finding that for at least 50% of
my searches, I’m not choosing Google.
00:17:17.065 –> 00:17:18.960
I’m, in fact, choosing ChatGPT.
00:17:19.400 –> 00:17:23.420
The reason is I have a focus question.
00:17:23.445 –> 00:17:28.050
I generally know that it will be in the
vast, broad, settled amount of knowledge.
00:17:28.080 –> 00:17:32.135
It’s not something
that is going to be variable or timely.
00:17:32.160 –> 00:17:34.360
And by the way, the fact that it gives me
a couple of sources means I
00:17:34.385 –> 00:17:35.810
can double-check its work.
00:17:35.840 –> 00:17:39.770
It’s an established thing that
has been known for 20 years.
00:17:39.800 –> 00:17:40.720
That’s a great way of getting it.
00:17:40.745 –> 00:17:43.760
For 50% of my searches, I’m
not choosing it.
00:17:43.785 –> 00:17:48.380
Well, see, and this is where
if you go back to the fundamentals of
00:17:48.405 –> 00:17:52.090
why is monopoly behavior frowned upon?
00:17:52.120 –> 00:17:54.220
Why do we actually care about that?
00:17:54.245 –> 00:17:59.740
Well, it’s because you use market position
and money to substitute
00:17:59.765 –> 00:18:02.010
competition and innovation.
00:18:02.040 –> 00:18:08.810
If you have the very best product and you
achieve a dominant market position because
00:18:08.840 –> 00:18:12.570
it is bigger, better, faster than anything
else that’s out there,
00:18:12.600 –> 00:18:17.160
congratulations to you and congratulations
to me as the user because you’ve made my
00:18:17.185 –> 00:18:22.570
life better with a product that is
demonstrably better than anything else.
00:18:22.600 –> 00:18:26.770
When you get to that position, and to your
point, Carl, you begin to
00:18:26.800 –> 00:18:29.290
behave in a monopolistic way.
00:18:29.320 –> 00:18:33.480
What happens is you buy your market
dominance instead of earn your market
00:18:33.505 –> 00:18:38.180
dominance, meaning you take your foot off
the gas pedal of innovation and you cease
00:18:38.205 –> 00:18:43.360
to ship the best technology, which
punishes not only the buyers, but
00:18:43.385 –> 00:18:45.900
also the actual customers, the users.
00:18:45.925 –> 00:18:48.700
That’s exactly where
Dave’s example is going.
00:18:48.725 –> 00:18:58.000
The capabilities of Gen AI added to search
makes the potential for that service, for
00:18:58.025 –> 00:19:02.240
that basic technology, radically
better than anything we have been
00:19:02.265 –> 00:19:05.010
accustomed to in the last 20 years.
00:19:05.040 –> 00:19:08.250
Google has not done well with that.
00:19:08.280 –> 00:19:11.740
They have stumbled, in a lot of the
research that I’m reading and in
00:19:11.765 –> 00:19:13.250
my own personal experience.
00:19:13.280 –> 00:19:16.740
There are many times I will ask a question
in Google, and everything
00:19:16.765 –> 00:19:18.940
I get back is an ad.
00:19:18.965 –> 00:19:21.960
Literally everything I get back is an ad.
Not cool.
00:19:21.985 –> 00:19:23.860
That’s not what I’m looking for.
00:19:23.885 –> 00:19:28.320
ChatGPT has some reliability and
hallucination problems, but it
00:19:28.345 –> 00:19:30.770
does give me a more robust answer.
00:19:30.800 –> 00:19:36.090
But Google still is the A number one by a
long way in the marketplace because they
00:19:36.120 –> 00:19:41.250
forced it through monopolistic behavior
rather than earned it through innovation.
00:19:41.280 –> 00:19:47.380
If this has that simple impact on the
search market, it solves a problem.
00:19:47.405 –> 00:19:48.380
I’m all for it.
00:19:48.405 –> 00:19:53.180
If it sends a warning sign to others in
other marketplaces to
00:19:53.205 –> 00:19:57.810
stop it with the buying dominance and go
back to a world where you actually
00:19:57.840 –> 00:20:01.290
innovate and earn,
that would be a pipe dream.
00:20:01.320 –> 00:20:03.010
I don’t think we’re
going to get there yet.
00:20:03.040 –> 00:20:06.500
It’ll take some more raps on the knuckles
before anybody else
00:20:06.525 –> 00:20:08.380
actually changes behavior.
00:20:08.405 –> 00:20:10.400
But this is a good start.
00:20:10.720 –> 00:20:13.810
I would say, just for a note,
Google has made a pretty good case.
00:20:13.840 –> 00:20:19.000
They do continue to innovate, and my
number two search engine is YouTube.
00:20:20.600 –> 00:20:21.980
Maybe they will get programed.
00:20:22.005 –> 00:20:25.215
But I’m going to move us on to the last
topic, and I want to take an
00:20:25.240 –> 00:20:26.160
interesting angle on this.
00:20:26.185 –> 00:20:32.220
NIST just released their guidance for
cryptography for quantum computing.
00:20:32.245 –> 00:20:36.480
I will say, look, it’s an interesting set
of frameworks for those that
00:20:36.505 –> 00:20:38.140
are in the encryption space.
00:20:38.165 –> 00:20:40.620
Get to work, guys, because
you’ve got new data.
00:20:40.645 –> 00:20:45.320
But what I wanted to do is I actually
rejected this story for the business of
00:20:45.345 –> 00:20:49.560
tech, and I wanted to bring it to you guys
instead because I looked at this and said,
00:20:49.585 –> 00:20:52.500
okay, there’s the obvious element of
upgrade your cryptography
00:20:52.525 –> 00:20:54.290
when there’s a better version.
00:20:54.320 –> 00:20:56.640
That just always makes sense.
00:20:56.760 –> 00:21:02.280
But I feel like quantum computing as a
thing has always been, well, we’re within
00:21:02.305 –> 00:21:07.640
10 years of it, every three years, as
far as I can remember now at this point.
00:21:07.665 –> 00:21:12.600
I mean, it just feels like we’re so close
to quantum that I’ve come to the point
00:21:12.625 –> 00:21:15.290
where I think I just don’t believe them.
00:21:15.320 –> 00:21:20.050
I think I just don’t
think this is a thing.
00:21:20.080 –> 00:21:26.360
I will open a space for, sure, in the 27th
century, when the next version of Star
00:21:26.385 –> 00:21:29.050
Trek is out, we may
have quantum computing.
00:21:29.080 –> 00:21:33.700
But I don’t think in any
practical terms, this is a thing.
00:21:33.725 –> 00:21:36.600
Am I missing the boat?
I want to get a gut check here.
00:21:36.625 –> 00:21:37.980
What’s your take on quantum?
00:21:38.005 –> 00:21:43.215
So much technology,
and we’ve seen this just recently in the
00:21:43.240 –> 00:21:45.920
pandemic where we’re saying, Oh, how
come robots aren’t taking over the world?
00:21:45.945 –> 00:21:51.455
Well, because in the real world,
people are saying, Well, I’ll work for a
00:21:51.480 –> 00:21:53.260
dollar less if you don’t take my job away.
00:21:53.285 –> 00:21:57.540
And so it’s delayed
the actual use of that technology.
00:21:57.565 –> 00:22:03.360
I think things like
the fact that we have GPUs and we have
00:22:03.385 –> 00:22:09.500
these processors that can just
put more horsepower on a problem, we don’t
00:22:09.525 –> 00:22:12.140
have a need for quantum computing.
00:22:12.165 –> 00:22:16.940
It’s not that it’s not real or it’s
not there or it’s not going to happen.
00:22:16.965 –> 00:22:22.020
If it doesn’t happen, it’ll be because we
don’t need it, because we can
00:22:22.045 –> 00:22:24.620
continually increase horsepower.
00:22:24.645 –> 00:22:28.700
Even if we don’t necessarily increase the
horsepower of any one chip,
00:22:28.725 –> 00:22:34.860
we have more and more processors, and
we can now buy NVIDIA chips
00:22:34.885 –> 00:22:36.920
by the gallon, right?
00:22:36.945 –> 00:22:41.980
We’ll just buy more horsepower, and then
they get smaller and smaller and smaller.
00:22:42.005 –> 00:22:45.940
See, and I will go to
the next order of impact.
00:22:45.965 –> 00:22:50.960
Because what Carl is saying is,
if I have an alternative technology that I
00:22:50.985 –> 00:22:56.090
can use to solve the future problem, it’s
existing, it’s tested, it’s understood.
00:22:56.120 –> 00:22:57.940
Let’s just do what’s familiar.
00:22:57.965 –> 00:23:03.050
I’ll go to the next level and say that I
think the thing that will prevent rapid
00:23:03.080 –> 00:23:07.800
deployment of quantum is
the environmental impact.
00:23:07.825 –> 00:23:11.055
And I don’t mean just on trees and
waterways and cute
00:23:11.080 –> 00:23:11.900
little bunnies in nature.
00:23:11.925 –> 00:23:17.090
What I mean is the literal environment
in which these systems play.
00:23:17.120 –> 00:23:22.975
The power consumption requirements for
quantum computing are only projections, because
00:23:23.000 –> 00:23:24.800
nobody really has one of these things yet.
00:23:24.825 –> 00:23:29.520
If you’re going to use one of these that
does what it is scientifically suggested
00:23:29.545 –> 00:23:35.280
it could do, it’s going to consume power
at a rate beyond even what these data
00:23:35.305 –> 00:23:39.920
centers are doing for AI, which we know is
an order of magnitude beyond
00:23:39.945 –> 00:23:42.050
what regular data centers did.
00:23:42.080 –> 00:23:47.050
We already covered, years ago, if you guys
remember, a story about
00:23:47.080 –> 00:23:53.090
the health impacts for people living in
proximity of large format data centers.
00:23:53.120 –> 00:23:57.320
The hum that happens 24/7
leads to hearing problems.
00:23:57.345 –> 00:24:00.570
It creates sleeping problems, et cetera.
00:24:00.600 –> 00:24:06.050
An order of magnitude up to AI, and a further
order of magnitude up to the consumption of
00:24:06.080 –> 00:24:08.700
what quantum computing is going to be.
00:24:08.725 –> 00:24:11.020
It’s not that we can’t
innovate out of that.
00:24:11.045 –> 00:24:17.260
I think that the world will look around
and go, I’m not willing to pay you the
00:24:17.285 –> 00:24:22.290
extra second order costs in
order to consume this technology.
00:24:22.320 –> 00:24:27.500
I will buy a quantum computer for X,
but then in order to cool
00:24:27.525 –> 00:24:32.440
it, to power it, in order for it to live
in a place where it doesn’t poison the
00:24:32.465 –> 00:24:36.330
local citizenry, it’s going
to cost me X times four.
00:24:36.360 –> 00:24:38.960
I’m not willing to pay the X times four.
00:24:38.985 –> 00:24:40.400
So thank you very much.
00:24:40.425 –> 00:24:42.570
I’ll just stick with
what I’m doing right now.
00:24:42.600 –> 00:24:46.680
I think the capability, the
potential, the stories we’ve been told
00:24:46.705 –> 00:24:53.620
about quantum, yes, they are fascinating,
but they are not yet economically viable.
00:24:53.645 –> 00:24:55.455
And that’s an engineering problem.
00:24:55.480 –> 00:24:56.780
That’s not a scientific problem.
00:24:56.805 –> 00:24:58.200
That’s an engineering problem.
00:24:58.225 –> 00:25:03.420
You need to figure out a way to deploy
that power without that consumption.
00:25:03.445 –> 00:25:08.015
That’s a single variable that you need to
control for in designing and
00:25:08.040 –> 00:25:09.500
deploying these kinds of systems.
00:25:09.525 –> 00:25:11.460
By the way, one last comment.
00:25:11.485 –> 00:25:18.330
If quantum does happen,
all your cybersecurity is grandma stuff.
00:25:18.360 –> 00:25:19.360
It is outdated.
00:25:19.385 –> 00:25:24.400
It is the olden times, and it
does not work in a quantum world.
00:25:24.425 –> 00:25:30.420
We do not presently possess
cryptography that can withstand
00:25:30.445 –> 00:25:34.570
quantum attacks for longer
than just a couple of minutes.
00:25:34.600 –> 00:25:37.570
We’re literally not using that technology yet.
00:25:37.600 –> 00:25:41.920
So if and when quantum comes,
I hope you all are thinking about the
00:25:41.945 –> 00:25:44.720
cyber impacts because
it’s going to break all your tools.
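For context on why quantum would break today’s encryption: RSA’s security rests on the difficulty of factoring, and Shor’s algorithm factors a modulus by finding the period of a^x mod N, which is the one step a quantum computer does exponentially faster. This is an illustrative sketch, not from the episode; `shor_classical` and the toy modulus are made up for demonstration, with the period found by classical brute force:

```python
from math import gcd

def find_period(a, n):
    # Smallest r > 0 with a^r = 1 (mod n). Brute force here takes
    # exponential time; this is the step Shor's algorithm performs
    # efficiently on a quantum computer.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a=2):
    # Classical rehearsal of Shor's post-processing on a toy modulus n.
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess already shares a factor
    r = find_period(a, n)
    if r % 2:
        return None               # need an even period; retry with a new a
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    if p in (1, n) or q in (1, n):
        return None               # trivial factors; retry with a new a
    return p, q

print(shor_classical(3233, a=3))  # 3233 = 61 * 53 → prints (61, 53)
```

Everything except the period-finding loop is ordinary arithmetic, which is why post-quantum cryptography (such as the NIST standards discussed above) swaps in a different underlying hard problem rather than just lengthening keys.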
00:25:44.745 –> 00:25:47.020
Hold on.
I want to break that sentence down.
00:25:47.045 –> 00:25:49.260
First off, when is the question?
00:25:49.285 –> 00:25:52.330
And I’m coming to the
conclusion that it isn’t.
00:25:52.360 –> 00:25:57.220
And if is doing a whole lot of heavy
lifting right there, Ryan, because
00:25:57.245 –> 00:26:01.420
if when is never, if doesn’t matter.
00:26:01.445 –> 00:26:04.940
That’s your sentence
diagramming by the way.
00:26:04.965 –> 00:26:08.940
I’m just breaking it down to
the basics here and observing it.
00:26:08.965 –> 00:26:09.680
Okay, sure.
00:26:09.705 –> 00:26:13.920
I hear you on all of this stuff,
but I’m coming to the conclusion
00:26:13.945 –> 00:26:16.290
that this isn’t a thing.
00:26:16.320 –> 00:26:24.050
In fact, head cycles spent on it at
any level are just not worth my time.
00:26:24.080 –> 00:26:29.700
Now, I will give a space of,
Look, I think making things
00:26:29.725 –> 00:26:32.330
generally more secure, and I’m putting
that in big old air
00:26:32.360 –> 00:26:33.570
quotes, is a good thing.
00:26:33.600 –> 00:26:38.860
My statement is, Okay, if you’ve designed
a new set of encryption that is
00:26:38.885 –> 00:26:43.360
invulnerable to this theoretical thing
that is way more advanced than our regular
00:26:43.385 –> 00:26:47.800
stuff, well, then it should also be good
enough for our regular stuff, and I
00:26:47.825 –> 00:26:50.860
don’t see any reason to not do it.
00:26:50.885 –> 00:26:55.680
But if somebody says, Dave, I want you to
spend time and brain cycles on this thing
00:26:55.705 –> 00:26:58.940
that only benefits quantum computing, I
think my answer is,
00:26:58.965 –> 00:27:03.660
That’s a waste of my time,
and I’m not doing any work on it because
00:27:03.685 –> 00:27:06.500
you haven’t even proven
you can make the thing.
00:27:06.525 –> 00:27:10.330
Like, literally, the basics of
making it have not even been proven.
00:27:10.360 –> 00:27:15.140
So to Ryan, I would say, literally, all of
your objections, old man,
00:27:15.165 –> 00:27:19.380
can be overcome and will be
overcome by time and technology.
00:27:19.405 –> 00:27:22.180
Everything starts out being too expensive.
00:27:22.205 –> 00:27:23.520
Everything’s impossible.
00:27:23.545 –> 00:27:24.780
Everything costs too much.
00:27:24.805 –> 00:27:31.695
If you think of how many BTUs it took
to light a house 100 years ago, well, you
00:27:31.720 –> 00:27:33.120
had to get the kerosene, and you had to
get the kerosene to the
00:27:33.145 –> 00:27:35.900
house, and, like, holy smokes.
00:27:35.925 –> 00:27:39.420
Now you have LED lights that take
essentially nothing except
00:27:39.445 –> 00:27:44.260
static electricity.
So the future will take care of itself.
00:27:44.285 –> 00:27:51.200
Today, I would say, I have been
hearing about AI since before I was born.
00:27:51.225 –> 00:27:56.260
Like, literally, you watch old black and
white science fiction movies and TV shows.
00:27:56.285 –> 00:28:00.090
They’ve been talking
about AI for 60 years.
00:28:00.120 –> 00:28:03.940
And then one day, it became reality.
00:28:03.965 –> 00:28:08.360
Everybody, five years ago, you could have
made the argument, They’ve been talking
00:28:08.385 –> 00:28:10.800
about AI for 50 years, and
it’s never going to happen.
00:28:10.825 –> 00:28:17.980
I think it’s going to be the exact
same thing with quantum computing.
00:28:18.005 –> 00:28:22.000
Part of it is, with AI, we still
haven’t figured
00:28:22.025 –> 00:28:23.810
out the killer app.
00:28:23.840 –> 00:28:30.280
We haven’t figured out a thing that’s
the email equivalent of the web.
00:28:30.305 –> 00:28:33.570
What is it that we would
do with quantum computing?
00:28:33.600 –> 00:28:35.520
What’s the killer app
of quantum computing?
00:28:35.545 –> 00:28:41.090
Well, if you’re not a quantum physicist or
you’re not a mathematician,
00:28:41.120 –> 00:28:43.975
I’m not sure there’s a use for
it outside of your paycheck.
00:28:44.000 –> 00:28:45.480
Well, and that’s the thing, right?
00:28:45.505 –> 00:28:46.760
There are use cases.
00:28:46.785 –> 00:28:50.220
They’re just not economically
viable yet, right?
00:28:50.245 –> 00:28:51.740
I agree with you, Carl.
00:28:51.765 –> 00:28:57.615
The ecosystem of technology will solve
for the problems that I’ve been outlining.
00:28:57.640 –> 00:28:59.840
That sounds like opportunity, and it
sounds like business
00:28:59.865 –> 00:29:01.520
development to me, right?
00:29:01.545 –> 00:29:03.420
From an industry perspective.
00:29:03.445 –> 00:29:08.280
My thing is, I do believe that
the future takes forever
00:29:08.305 –> 00:29:10.380
until it happens all at once.
00:29:10.405 –> 00:29:16.400
Exactly as Carl is describing, it will
come; to Dave’s point, just not this year.
00:29:16.425 –> 00:29:19.760
If you’re focused on this year, please
don’t spend any time building a
00:29:19.785 –> 00:29:21.900
quantum practice, but it will come.
00:29:21.925 –> 00:29:23.760
We just need to be ready for it.
00:29:23.785 –> 00:29:27.640
Well, I want to push back slightly there,
Carl, because while you’re right on the AI
00:29:27.665 –> 00:29:32.020
stuff, we have seen
advancements along the way
00:29:32.045 –> 00:29:34.360
for a good portion of that.
00:29:34.385 –> 00:29:37.680
We could go back in time, let’s say 2015.
00:29:37.705 –> 00:29:42.320
I worked on projects that were advanced
data science stuff, that were
00:29:42.345 –> 00:29:45.090
using machine learning,
that it was a stretch to call AI.
00:29:45.120 –> 00:29:46.570
That had the glimmers of that.
00:29:46.600 –> 00:29:50.320
You could go back further than that and
you could see versions where we’re
00:29:50.345 –> 00:29:54.000
building the systems
that communicate that.
00:29:54.025 –> 00:29:58.240
My point is that I saw a trend line
of advancements over time,
00:29:58.265 –> 00:29:59.215
and you’re right.
00:29:59.240 –> 00:30:01.810
All of a sudden, we see the
breakthrough and it crashes through.
00:30:01.840 –> 00:30:06.330
What I’m pushing back on is, they don’t
even have the basic bits over on quantum.
00:30:06.360 –> 00:30:10.290
It’s all whiteboard stuff
where nothing actually works.
00:30:10.320 –> 00:30:15.720
There’s a difference between seeing
elements of it and You guys are having
00:30:15.745 –> 00:30:17.810
fun on a whiteboard, and that’s all cool.
00:30:17.840 –> 00:30:22.050
But until you can actually build
any of it, stop wasting my time.
00:30:22.080 –> 00:30:27.280
Maybe AI will be the tool that
productizes quantum.
00:30:27.305 –> 00:30:30.380
Perhaps.
That replaces itself with quantum.
00:30:30.405 –> 00:30:35.080
Well,
when the word theoretical is built into
00:30:35.105 –> 00:30:38.000
your definition, maybe you’ll
never show up in the real world.
00:30:38.025 –> 00:30:40.900
Maybe you’ll never show up.
00:30:40.925 –> 00:30:49.860
And with that happy note, we bring an
end to episode 209 of the Killing It…
00:30:49.885 –> 00:30:51.320
Podcast.
00:30:51.400 –> 00:30:54.380
Thanks for tuning in to
the Killing It podcast.
00:30:54.405 –> 00:30:59.620
Please share with your friends and tell
everyone to subscribe on iTunes,
00:30:59.645 –> 00:31:03.020
Stitcher, all the podcast places.
00:31:03.045 –> 00:31:07.160
Join us next week and help us keep
killing it in the technology business.