235. The Impact of Artificial Intelligence w/ Jacob Ward
Join us as we welcome Jacob Ward, a veteran journalist and thought leader, to explore the profound impact of artificial intelligence on our lives. In this episode, we delve into how AI interacts with human behavior, the societal implications of predictive policing and surveillance, and the future of work in an AI-driven world. We also discuss the concept of a technocracy, the Fermi Paradox, and the importance of purpose in human satisfaction. Tune in for a thought-provoking conversation that navigates the cultural and economic shifts shaping our future. Welcome back to Infinite Rabbit Hole!
Check out more of Jacob's work at https://www.jacobward.com/
For everything IRH, visit InfiniteRabbitHole.com. Join us live every Sunday on Twitch.tv/InfiniteRabbitHole at 8PM CST!
*Make sure to check out the updated MERCH SHOP by clicking the "Merch" tab on the website!!!* It's a great way to help support the show!
00:00:00,120 --> 00:00:02,480
Hey travelers.
In today's episode, we're
2
00:00:02,480 --> 00:00:06,760
thrilled to welcome Jacob Ward,
a veteran journalist and thought
3
00:00:06,760 --> 00:00:10,320
leader focused on the complex
relationship between technology,
4
00:00:10,920 --> 00:00:15,240
science, and human behavior.
Jacob brings his extensive
5
00:00:15,240 --> 00:00:18,680
experience as a correspondent
for major networks and his
6
00:00:18,680 --> 00:00:22,640
insights from his critically
acclaimed book The Loop to our
7
00:00:22,640 --> 00:00:24,480
discussion.
We'll dive into the fascinating
8
00:00:24,480 --> 00:00:28,120
world of artificial intelligence
and its profound impact on our
9
00:00:28,120 --> 00:00:31,120
lives.
Explore how AI interacts with
10
00:00:31,120 --> 00:00:34,960
our human behavior, often
tapping into our automatic
11
00:00:34,960 --> 00:00:39,040
decision making processes driven
by our quote-unquote lizard
12
00:00:39,040 --> 00:00:41,440
brain.
We'll discuss the societal
13
00:00:41,440 --> 00:00:45,000
implications of AI, from
potential job losses to ethical
14
00:00:45,000 --> 00:00:49,480
concerns, and how predictive
policing and surveillance are
15
00:00:49,480 --> 00:00:53,280
reshaping our privacy landscape.
As we look to the future,
16
00:00:53,560 --> 00:00:57,480
we question the value of human
labor in an AI-driven world and the
17
00:00:57,480 --> 00:01:00,440
potential mental health
challenges that would arise from
18
00:01:00,480 --> 00:01:03,600
increased free time and a lack
of purpose.
19
00:01:04,160 --> 00:01:08,880
We'll also touch on the concept
of a technocracy, where a few
20
00:01:08,880 --> 00:01:13,400
hold significant power due to
technological advancements, and
21
00:01:13,560 --> 00:01:16,400
consider the vastness of the
universe with the Fermi Paradox
22
00:01:16,720 --> 00:01:19,160
and the possibility of
extraterrestrial life.
23
00:01:19,920 --> 00:01:24,480
Join us as we navigate these
cultural and economic shifts,
24
00:01:25,360 --> 00:01:28,800
emphasizing the importance of
purpose and human satisfaction,
25
00:01:29,280 --> 00:01:32,640
and the need for long term
thinking to address global
26
00:01:32,640 --> 00:01:35,880
challenges.
Welcome back to Infinite Rabbit
27
00:01:35,880 --> 00:02:15,580
Hole.
Welcome back to The Infinite
28
00:02:15,580 --> 00:02:17,500
Rabbit Hole.
I'm your host, Jeremy, and
29
00:02:17,500 --> 00:02:20,580
tonight we have a special guest
like we typically do.
30
00:02:20,580 --> 00:02:25,340
This one, though, is going to
dissect a very interesting and
31
00:02:25,340 --> 00:02:30,780
current event related topic, AI
and a lot of the aspects of AI
32
00:02:30,780 --> 00:02:35,180
that you're not thinking about.
So today we are thrilled to be
33
00:02:35,180 --> 00:02:40,680
joined by Jacob Ward, a veteran
journalist, author and thought
34
00:02:40,680 --> 00:02:43,400
leader focused on the
complicated relationship between
35
00:02:43,400 --> 00:02:46,600
technology, science, and human
behavior.
36
00:02:47,280 --> 00:02:51,080
For years, Jacob has served as a
correspondent for major networks
37
00:02:51,080 --> 00:02:55,000
including NBC News, where he
reported on technology and its
38
00:02:55,000 --> 00:02:58,360
social implications for shows
like Nightly News and the Today
39
00:02:58,360 --> 00:03:01,360
Show.
His extensive career includes
40
00:03:01,360 --> 00:03:05,840
work with PBS, CNN, and Al Jazeera,
and he previously held the
41
00:03:05,840 --> 00:03:09,160
position of Editor in Chief of
Popular Science Magazine.
42
00:03:09,720 --> 00:03:14,920
He is the host and co-writer of
the landmark four-hour PBS series
43
00:03:14,920 --> 00:03:18,320
Hacking Your Mind, which
explores the science of decision
44
00:03:18,320 --> 00:03:21,640
making and bias.
Most importantly, we're here
45
00:03:21,640 --> 00:03:25,360
today to discuss his work on his
critically acclaimed book, The
46
00:03:25,360 --> 00:03:27,920
Loop:
How AI Is Creating a World
47
00:03:27,920 --> 00:03:32,280
Without Choices and How to Fight
Back. In The Loop, Jacob warns
48
00:03:32,280 --> 00:03:35,120
us about the unintended
consequences of artificial
49
00:03:35,120 --> 00:03:40,080
intelligence on our free will
and decision making, offering a
50
00:03:40,120 --> 00:03:45,120
vital guide for navigating an
increasingly automated world.
51
00:03:46,520 --> 00:03:50,800
Jacob, welcome to the.
I really appreciate you having
52
00:03:50,800 --> 00:03:54,560
me, this is very exciting.
Good, good, good. AI is
53
00:03:54,560 --> 00:03:56,880
your bread and butter right
now.
54
00:03:57,440 --> 00:04:01,720
I know you've done a lot of
work on psychological topics,
55
00:04:01,720 --> 00:04:03,840
right?
Especially influence.
56
00:04:03,920 --> 00:04:06,000
How did you get into all this?
57
00:04:06,040 --> 00:04:07,120
Let's start with the very
beginning.
58
00:04:08,360 --> 00:04:10,160
Sure.
So I really appreciate you guys
59
00:04:10,400 --> 00:04:13,160
having me.
So I've been a technology
60
00:04:13,160 --> 00:04:15,440
correspondent for a long time.
And once upon a time, technology
61
00:04:15,440 --> 00:04:17,399
correspondents were sort of
expected to be just kind of
62
00:04:17,399 --> 00:04:20,880
upbeat nerds who thought only
about like, which phone is
63
00:04:20,880 --> 00:04:25,040
better, right?
And around 2013 I started
64
00:04:25,040 --> 00:04:28,120
working first for Al Jazeera and
then I did this PBS series.
65
00:04:28,560 --> 00:04:31,840
I was thinking more and more
about technology as a kind of
66
00:04:31,840 --> 00:04:36,040
reflection of society.
I had a, an early mentor who
67
00:04:36,040 --> 00:04:38,280
used to ask this question.
A lot of the time he'd say, what
68
00:04:38,280 --> 00:04:40,760
is the technology trying to tell
us about ourselves, right?
69
00:04:40,760 --> 00:04:45,880
We build stuff specifically to
address something in ourselves,
70
00:04:46,000 --> 00:04:48,520
a vulnerability, a market
opportunity, whatever it is.
71
00:04:49,040 --> 00:04:51,840
And so as I was starting to
think about that, I had this
72
00:04:51,840 --> 00:04:53,880
opportunity to do this PBS
series Hacking Your Mind.
73
00:04:53,960 --> 00:04:57,920
And Hacking Your Mind was a four
hour kind of crash course in
74
00:04:57,920 --> 00:05:00,400
about the last 50 years of
behavioral science.
75
00:05:00,440 --> 00:05:02,520
And I got the opportunity to
travel all over the world and
76
00:05:02,520 --> 00:05:06,480
interview all these people who
study why we make choices the
77
00:05:06,480 --> 00:05:10,080
way we do.
And the basic takeaway of that series, as
78
00:05:10,080 --> 00:05:13,040
is the basic takeaway of the
last 50 years of all of this
79
00:05:13,400 --> 00:05:17,280
behavioral science,
is that a huge amount of our
80
00:05:17,280 --> 00:05:20,800
decision making,
probably 85 to 90% at any given
81
00:05:20,800 --> 00:05:25,880
time, is totally automatic and
based on very, very ancient
82
00:05:25,880 --> 00:05:29,800
instincts, instincts that we
share with primates like, you
83
00:05:29,800 --> 00:05:34,160
know, choosing who to trust,
what to eat, where to go.
84
00:05:34,560 --> 00:05:37,920
One big example of this is like
if any of us, right?
85
00:05:37,920 --> 00:05:41,200
So I, I just spent the weekend
driving my kid from volleyball
86
00:05:41,200 --> 00:05:44,880
game to volleyball game.
If I'm assigned to drive my kid
87
00:05:44,880 --> 00:05:49,080
somewhere, as often as not,
I might just automatically drive
88
00:05:49,080 --> 00:05:52,240
her by accident to school
because that is the standard
89
00:05:52,560 --> 00:05:53,920
route you take every morning,
right?
90
00:05:53,920 --> 00:05:55,760
And so I don't know how many,
how often you've been in the
91
00:05:55,760 --> 00:05:57,720
position, right?
You guys have like, you show up
92
00:05:57,720 --> 00:05:59,480
at the school and you're like,
oh, wait, this is not where I'm
93
00:05:59,480 --> 00:06:00,920
supposed to take her.
Oh, my God, why have I done
94
00:06:00,920 --> 00:06:02,800
this?
And the fact is, right, if you
95
00:06:02,800 --> 00:06:05,360
cast your mind back across that
drive like you've done an
96
00:06:05,360 --> 00:06:08,560
incredibly sophisticated
mechanical task with no
97
00:06:08,560 --> 00:06:11,480
consciousness at all, you're on
total autopilot, right?
98
00:06:11,480 --> 00:06:13,800
And that kind of autopilot is
what all these behavioral
99
00:06:13,800 --> 00:06:18,880
scientists have shown is driving
a huge amount of our choices in
100
00:06:18,920 --> 00:06:20,960
our day-to-day lives.
So the same time that I was
101
00:06:20,960 --> 00:06:24,240
learning about all of that and
the way in which those choices
102
00:06:24,240 --> 00:06:26,560
that we make are very
predictable, that the mistakes
103
00:06:26,560 --> 00:06:29,840
we make are very predictable.
I was also in my day job as a
104
00:06:29,840 --> 00:06:33,520
technology correspondent,
learning about these fly by
105
00:06:33,520 --> 00:06:36,520
night companies that were
starting to use early machine
106
00:06:36,520 --> 00:06:38,920
learning and what was at the
time called human reinforcement
107
00:06:38,920 --> 00:06:42,240
learning algorithms.
This was early AI, before
108
00:06:42,240 --> 00:06:45,840
ChatGPT became possible.
And I thought, oh, this is a
109
00:06:45,840 --> 00:06:48,400
real problem.
I'm not the kind of person who
110
00:06:48,400 --> 00:06:50,760
believes, like, I have to get my
ideas into the world at all
111
00:06:50,760 --> 00:06:52,520
costs.
But this was a moment where I
112
00:06:52,520 --> 00:06:55,560
was like, man, I don't see
anybody else talking about the
113
00:06:55,600 --> 00:06:59,280
intersection of our
vulnerabilities in terms of our
114
00:06:59,280 --> 00:07:03,440
brains and our behavior and this
for profit industry that's about
115
00:07:03,440 --> 00:07:06,720
to explode around the
commercialization of AI.
116
00:07:07,440 --> 00:07:09,920
And I thought, you guys, that I
was like 5 years early.
117
00:07:09,960 --> 00:07:12,600
I thought I was like, you know,
way out in front of this.
118
00:07:12,600 --> 00:07:14,840
And I had a few people read this
book and be like, I don't know,
119
00:07:14,840 --> 00:07:15,760
this sounds like science
fiction.
120
00:07:15,760 --> 00:07:17,560
This is pretty speculative.
You know, when we first
121
00:07:17,560 --> 00:07:21,200
published the book, the
first version
122
00:07:21,200 --> 00:07:24,400
of it came out at the
123
00:07:24,400 --> 00:07:27,480
very beginning of 2022, we
didn't even put the word AI on
124
00:07:27,480 --> 00:07:30,440
it because we thought, oh, this
is going to
125
00:07:30,440 --> 00:07:32,000
alienate people.
People will think this is a book
126
00:07:32,000 --> 00:07:33,600
for nerds.
And we want
127
00:07:33,600 --> 00:07:35,400
you know,
everyone to read this.
128
00:07:36,040 --> 00:07:39,440
And then nine months later,
ChatGPT comes out and suddenly
129
00:07:39,440 --> 00:07:43,000
my thesis comes to life.
And you know, I mean, I, I just,
130
00:07:43,320 --> 00:07:47,280
I'm seeing my book that was
supposed to predict something
131
00:07:47,280 --> 00:07:50,480
way out in advance and try to
hopefully warn people against it
132
00:07:50,480 --> 00:07:52,240
in some way.
I was encouraging people to get
133
00:07:52,240 --> 00:07:55,160
involved in policy and in
lawsuits to try and slow this
134
00:07:55,160 --> 00:07:57,560
down.
Suddenly it's just taken off.
135
00:07:57,600 --> 00:08:00,440
And I, I have to admit to you
that like the last like year and
136
00:08:00,440 --> 00:08:04,840
a half, I've kind of been in a
little bit of a depressive
137
00:08:04,840 --> 00:08:07,800
slump, just kind of watching my
thesis come to life.
138
00:08:07,800 --> 00:08:11,000
I've just been so bummed,
honestly, that I've kind of
139
00:08:11,000 --> 00:08:14,280
withdrawn from the topic.
I've just been like surfing and
140
00:08:14,280 --> 00:08:16,400
hanging out with my kids.
I got laid off from NBC News
141
00:08:16,400 --> 00:08:18,760
last year.
And so I've had some severance
142
00:08:18,760 --> 00:08:20,600
and some time to just like
chill.
143
00:08:21,400 --> 00:08:23,000
And now I'm suddenly looking
around.
144
00:08:23,000 --> 00:08:25,000
I'm going, you know what, I got
to start talking about this
145
00:08:25,000 --> 00:08:26,480
again.
I got to get re engaged in this
146
00:08:26,480 --> 00:08:29,240
topic because it is happening so
fast and it's poised to, I
147
00:08:29,240 --> 00:08:33,039
think, transform our world and
not, I would argue necessarily
148
00:08:33,039 --> 00:08:35,720
for the better unless we start
making some big moves right now.
149
00:08:35,720 --> 00:08:38,080
And so that's why I'm joining
you today.
150
00:08:38,080 --> 00:08:40,400
I'm trying to have as
many of these conversations as I
151
00:08:40,400 --> 00:08:44,000
can to say, listen, I don't
think our brains are ready for
152
00:08:44,000 --> 00:08:45,480
this.
That's fundamentally my thesis
153
00:08:45,480 --> 00:08:47,360
here.
So which phone is the best then?
154
00:08:47,920 --> 00:08:51,640
Yeah.
All that stuff.
155
00:08:52,480 --> 00:08:54,560
So, So what?
I know, right?
156
00:08:54,600 --> 00:08:56,720
Exactly.
I can't tell you how often I, I,
157
00:08:56,760 --> 00:08:59,040
this is exactly what I was like.
I'll get into a long
158
00:08:59,040 --> 00:09:02,240
conversation with somebody about
like agency and human will and
159
00:09:02,240 --> 00:09:03,920
blah, blah, blah, blah.
And then they'll be like, and by
160
00:09:03,920 --> 00:09:05,120
the way, I'm having some e-mail
trouble.
161
00:09:05,120 --> 00:09:07,880
Can you help me with my e-mail?
That's really funny.
162
00:09:08,280 --> 00:09:10,320
Well, you know, if you're
offering, you know, yeah.
163
00:09:10,880 --> 00:09:12,680
Yeah, I know, right?
So you're just like an
164
00:09:12,800 --> 00:09:15,920
overpowered IT, right?
Yeah, yeah, exactly, exactly.
165
00:09:15,920 --> 00:09:17,880
I don't know about all that.
But anyway, can you help me with
166
00:09:17,880 --> 00:09:22,400
my Ring camera?
No, we've recently dove
167
00:09:22,400 --> 00:09:25,720
quite deep into AI and the
troubles that it poses.
168
00:09:25,720 --> 00:09:26,720
Jeff, did you have a question,
man?
169
00:09:26,720 --> 00:09:28,840
So I'm sorry bud.
No, no, you're good.
170
00:09:28,840 --> 00:09:29,520
Go ahead.
OK.
171
00:09:30,360 --> 00:09:32,840
Yeah, we've, we've dove very
heavily into it.
172
00:09:32,840 --> 00:09:35,640
I can guarantee you there are
going to be topics that you may
173
00:09:35,640 --> 00:09:37,840
never have been asked before
coming out of our mouths today.
174
00:09:37,840 --> 00:09:38,360
Let's go.
175
00:09:39,400 --> 00:09:40,400
I'm really excited to hear
about it.
176
00:09:41,240 --> 00:09:45,880
But I want to
get everything else out of the
177
00:09:45,880 --> 00:09:48,760
way first before we get into,
you know, the kind of freestyle
178
00:09:48,760 --> 00:09:51,440
questions and everything.
Jake, Jeff, do you guys have any
179
00:09:51,440 --> 00:09:53,480
questions before I dive into the
ones that I have?
180
00:09:53,520 --> 00:09:54,560
Nope.
I hope you're ready, Jacob,
181
00:09:54,560 --> 00:09:56,680
because I'm the crazy one, so
we're going to be.
182
00:09:57,200 --> 00:10:00,760
Good, good, good.
I don't have any questions, I'm
183
00:10:00,760 --> 00:10:02,640
just like, I'm just interested
in the topic.
184
00:10:02,640 --> 00:10:08,960
I'm so against AI it's not even
funny and I just watched that
185
00:10:09,400 --> 00:10:12,480
artificial intelligence movie
and Bicentennial Man yesterday
186
00:10:12,760 --> 00:10:14,520
and so.
I'm pretty well versed.
187
00:10:14,720 --> 00:10:16,200
You're well poised for this
conversation.
188
00:10:16,200 --> 00:10:18,160
You are ready.
He did all of this studying.
189
00:10:18,480 --> 00:10:20,080
Look at him.
It's the first time he's ever
190
00:10:20,080 --> 00:10:23,600
done studying.
But yeah, well, I can't say I'm
191
00:10:23,600 --> 00:10:27,520
a huge fan of it, but I can see
it benefiting us in
192
00:10:27,520 --> 00:10:30,880
limited usages.
But I can definitely see the
193
00:10:30,880 --> 00:10:34,280
danger that's on the horizon.
It's actually terrifying.
194
00:10:35,160 --> 00:10:36,800
Well, here's the thing, I'll
just sort of say maybe this will
195
00:10:36,800 --> 00:10:39,480
kind of set off some some
questions and some conversation.
196
00:10:39,480 --> 00:10:42,760
Like I want to say, like, I am
not opposed to it as a
197
00:10:42,760 --> 00:10:45,320
technology.
It's almost always the
198
00:10:45,320 --> 00:10:48,720
case for me that like a
technology in and of itself is
199
00:10:48,720 --> 00:10:52,360
not a problem, you know, so, so
in the case of AI, you know,
200
00:10:52,360 --> 00:10:54,320
there are incredible uses,
right?
201
00:10:54,320 --> 00:10:57,000
And as the, as a guy who used
to, you know, run a magazine
202
00:10:57,000 --> 00:11:00,640
that was all about foundational
science, like the incredible
203
00:11:00,640 --> 00:11:05,240
work that can be done with a
pattern recognition system made
204
00:11:05,240 --> 00:11:09,040
possible by AI.
You know, the
205
00:11:09,040 --> 00:11:11,760
revolution that made all of
these LLMs and generative AI
206
00:11:11,760 --> 00:11:13,840
possible, right,
is this transformer
207
00:11:13,840 --> 00:11:17,240
model revolution that in
2017, 2018 made it possible to
208
00:11:17,240 --> 00:11:20,120
suddenly take huge amounts of
undifferentiated data that you
209
00:11:20,120 --> 00:11:23,200
could never as a human being sit
and sift through and find
210
00:11:23,200 --> 00:11:26,720
patterns in, funnel, and get all
these insights out of the bottom
211
00:11:26,720 --> 00:11:28,240
of it.
If you think of all the things
212
00:11:28,240 --> 00:11:30,720
that scientists could pour into
a funnel like that, right?
213
00:11:30,720 --> 00:11:36,600
Like every single photograph of
a, of a mole on someone's back
214
00:11:36,600 --> 00:11:38,760
that might or might not be
cancerous, right?
215
00:11:38,760 --> 00:11:42,680
It's incredible at finding
the patterns and predicting
216
00:11:42,680 --> 00:11:44,760
whose mole is going to turn
cancerous and whose is not.
217
00:11:45,120 --> 00:11:50,240
You pour all of the, you know,
uncharted, unnamed, well,
218
00:11:50,240 --> 00:11:53,080
everything's named, but all
of the stars in the sky,
219
00:11:53,120 --> 00:11:54,560
right.
And you and you look for
220
00:11:54,560 --> 00:11:56,920
patterns in how they are moving
or what's going on.
221
00:11:56,920 --> 00:11:59,480
You know, incredible discoveries
can come out of that.
222
00:12:00,000 --> 00:12:02,520
I've talked to people who say,
you know, I know a guy who
223
00:12:02,760 --> 00:12:06,360
is working on a system.
He exclusively tries to
224
00:12:06,360 --> 00:12:10,480
make AI systems for either
not-for-profit sort of
225
00:12:10,480 --> 00:12:15,400
healthcare systems or for like
state governments and, and he
226
00:12:15,400 --> 00:12:18,480
makes no money as a result.
And his thing is, you know, he
227
00:12:18,480 --> 00:12:22,040
says if you gave me everybody's
birth certificate and everyone's
228
00:12:22,040 --> 00:12:26,960
home address, I could use AI to
predict whose apartments need to
229
00:12:26,960 --> 00:12:32,000
be repainted in advance to cover
up the lead paint and avoid lead
230
00:12:32,000 --> 00:12:34,400
poisoning in, in kids.
You could wipe out lead
231
00:12:34,400 --> 00:12:37,360
poisoning in young kids.
You know, like the ability to do
232
00:12:37,360 --> 00:12:40,600
incredible stuff like that is,
is enormous and fantastic.
233
00:12:41,000 --> 00:12:43,520
But here's the problem, you
guys, no one's making money
234
00:12:43,520 --> 00:12:44,920
stripping lead paint out of
apartments.
235
00:12:45,040 --> 00:12:47,840
Nobody's making money pouring stars
into funnels, right?
236
00:12:47,840 --> 00:12:49,760
That is not how we work in this
country.
237
00:12:49,760 --> 00:12:54,080
Instead, you're in a world
already where the leadership of
238
00:12:54,080 --> 00:12:57,600
these companies is saying, and
now we're going to start, you
239
00:12:57,600 --> 00:13:01,760
know, using this to let you
write erotica or make cartoon
240
00:13:01,760 --> 00:13:04,520
porn girlfriends for yourselves
or whatever it is, right?
241
00:13:04,520 --> 00:13:08,920
It's how you make money off
it that turns the technology
242
00:13:08,920 --> 00:13:11,320
into something bad.
And in my case, I was just at a
243
00:13:11,320 --> 00:13:14,600
conference where I was watching
presentation after presentation
244
00:13:14,600 --> 00:13:16,920
after presentation by people who
work in what's called trust and
245
00:13:16,920 --> 00:13:18,520
safety.
These are the sort of content
246
00:13:18,520 --> 00:13:20,960
moderators at social media
platforms.
247
00:13:21,440 --> 00:13:23,600
And, you know, they were just
showing time and again and again
248
00:13:23,600 --> 00:13:28,280
here, look, here are the
photographs that AI can generate
249
00:13:28,280 --> 00:13:32,680
that absolutely no one will be
able to spot as being fake.
250
00:13:32,920 --> 00:13:37,120
Here's the political messaging
that turns out to be way more
251
00:13:37,120 --> 00:13:40,080
persuasive, not way more, but
substantially more persuasive
252
00:13:40,080 --> 00:13:42,920
than messaging written by
an actual human.
253
00:13:43,120 --> 00:13:45,920
And it doesn't even have to be
written by somebody who
254
00:13:45,920 --> 00:13:48,200
writes the language, who even
understands our language.
255
00:13:48,480 --> 00:13:50,840
So you just are seeing time and
again and again that this stuff
256
00:13:50,840 --> 00:13:54,320
can be used for great stuff, but
it can also be used for terrible
257
00:13:54,320 --> 00:13:55,640
stuff.
And that terrible stuff tends to
258
00:13:55,640 --> 00:13:57,880
make us more money.
And so that's my big problem.
259
00:13:58,600 --> 00:14:02,160
That's my big pushback against
it.
260
00:14:02,160 --> 00:14:05,280
It's not that AI can't be used
for amazing things, but as a
261
00:14:05,280 --> 00:14:09,880
human species, we tend to do the
worst things with what we have.
262
00:14:09,880 --> 00:14:13,960
And so, you know, my big concern
is, you know, the whole deep
263
00:14:13,960 --> 00:14:17,000
fake technology, at some point
it's going
264
00:14:17,000 --> 00:14:21,000
to get so freaking difficult to
tell what's a fabricated photo
265
00:14:21,320 --> 00:14:25,640
or a, you know, like if I wanted
to make a, a video of Jacob Ward
266
00:14:25,640 --> 00:14:29,360
drowning puppies in a river in
order to ruin his life, you
267
00:14:29,360 --> 00:14:33,240
know, I could do that in 10
seconds, put it out there.
268
00:14:33,240 --> 00:14:35,240
And then you're still going to
have people out.
269
00:14:35,280 --> 00:14:38,680
Even if it's able to be, you
know, shown that this is, you
270
00:14:38,680 --> 00:14:40,600
know, baloney.
It's not a real video.
271
00:14:40,800 --> 00:14:44,000
You're still going to have
people out there years after the
272
00:14:44,000 --> 00:14:47,680
event, like every single, you
know, smear campaign that we
273
00:14:47,680 --> 00:14:49,440
see.
They're going to believe that
274
00:14:49,440 --> 00:14:51,760
lie or at least think in the
back of their heads.
275
00:14:51,880 --> 00:14:54,200
Maybe Jacob Ward is capable of
drowning puppies.
276
00:14:54,200 --> 00:14:55,760
Totally.
You know, and let's say I am.
277
00:14:55,760 --> 00:14:58,400
If it ruins your life, you
278
00:14:58,400 --> 00:14:59,280
know, why not.
Totally.
279
00:14:59,360 --> 00:15:02,080
And let's say I am a serial
puppy drowner, right?
280
00:15:02,160 --> 00:15:05,560
And you catch me with a camera
doing it, I'm then going to be
281
00:15:05,560 --> 00:15:07,320
in a position to say, that's not
me.
282
00:15:07,760 --> 00:15:09,760
Look, you can't tell.
You can't believe your lying
283
00:15:09,760 --> 00:15:12,840
eyes because clearly we're in a
world in which you can't trust
284
00:15:12,840 --> 00:15:15,320
video evidence anymore.
This is my whole problem with
285
00:15:15,320 --> 00:15:18,600
the latest release from
OpenAI, which allows people to
286
00:15:18,600 --> 00:15:21,000
make these 10 second videos that
are all set for, you know,
287
00:15:21,000 --> 00:15:23,360
they're perfect for social media
and they're already flooding my
288
00:15:23,360 --> 00:15:26,560
social media, you know, like,
I'm like, what is the purpose of
289
00:15:26,560 --> 00:15:28,680
this?
What is it that we needed?
290
00:15:28,840 --> 00:15:30,600
What problem is this solving?
Right?
291
00:15:30,600 --> 00:15:33,960
Like, I think we are already not
doing a great job with hanging
292
00:15:33,960 --> 00:15:35,560
on to what's true and what's
not.
293
00:15:36,080 --> 00:15:39,200
And it just
feels like it's almost as
294
00:15:39,200 --> 00:15:42,240
if the objective is to
absolutely destabilize what we
295
00:15:42,240 --> 00:15:45,720
can trust and what we can't so
that there won't be any video
296
00:15:45,720 --> 00:15:47,280
evidence tomorrow.
I mean, as a journalist who
297
00:15:47,280 --> 00:15:50,320
believes that there really is a
way to prove a thing, you
298
00:15:50,320 --> 00:15:52,840
know, one of the number one ways
you prove a thing is you catch
299
00:15:52,840 --> 00:15:54,280
somebody on camera doing that
thing.
300
00:15:54,560 --> 00:15:57,240
And if that's going to go out
the window, I don't know, you
301
00:15:57,240 --> 00:15:59,160
guys, like I'm not sure
how we're going
302
00:15:59,160 --> 00:16:02,840
to agree on anything anymore.
Yeah, yeah, I mean, I obviously
303
00:16:02,840 --> 00:16:06,080
I see that that's like an issue,
but I, I don't even see that as
304
00:16:06,080 --> 00:16:08,640
like a major issue personally,
because I think that most of
305
00:16:08,640 --> 00:16:11,200
society is kind of already at
that point, even without AI,
306
00:16:11,200 --> 00:16:13,280
like nobody believes anything,
right?
307
00:16:13,280 --> 00:16:17,760
Or, you know, you have all these
different like factions of ideas
308
00:16:17,760 --> 00:16:20,320
and people can't agree on
anything no matter what, even if
309
00:16:20,320 --> 00:16:22,720
you do show them a real video of
something, right?
310
00:16:22,720 --> 00:16:24,920
Like it doesn't matter at this
point.
311
00:16:24,920 --> 00:16:30,680
So my, my bigger concern is that
the use of AI to sift through
312
00:16:30,680 --> 00:16:32,880
and like organize data on us,
right?
313
00:16:32,880 --> 00:16:35,840
So I know that there's like a
big, we all know this, right?
314
00:16:35,840 --> 00:16:38,120
Like we're all being tracked in
every way, shape and form,
315
00:16:38,120 --> 00:16:38,960
right?
Thank you,
316
00:16:38,960 --> 00:16:40,960
Patriot Act.
They're listening to and
317
00:16:40,960 --> 00:16:42,560
watching everything that's
happening.
318
00:16:42,640 --> 00:16:45,280
It's not happening.
Yeah, literally, right.
319
00:16:45,280 --> 00:16:47,600
But the problem is like they're
going to take it up to the next
320
00:16:47,600 --> 00:16:49,240
level, I think, right?
They're going to start using
321
00:16:49,240 --> 00:16:53,000
this to track like biometric
data on all of us and these
322
00:16:53,000 --> 00:16:55,800
types of things.
The control factor, whether
323
00:16:55,800 --> 00:16:59,080
it's a government or
corporations like, you know,
324
00:16:59,080 --> 00:17:01,400
compiling all that data to feed
you, whatever.
325
00:17:01,880 --> 00:17:05,240
That's my bigger concern.
The the deep fake stuff is like
326
00:17:05,240 --> 00:17:06,640
whatever, nobody believes shit
anyway.
327
00:17:06,640 --> 00:17:09,599
So, well, yeah, I'm not quite
as nihilistic
328
00:17:09,599 --> 00:17:13,480
as you are about truth.
And I like to believe, you know,
329
00:17:13,480 --> 00:17:16,119
I'm trying to do this job still
of sifting some truth out of the
330
00:17:16,119 --> 00:17:17,599
world.
But I, I absolutely agree with
331
00:17:17,599 --> 00:17:19,880
you that trust is at
an all-time low around that
332
00:17:19,880 --> 00:17:21,119
stuff.
And so it's going to get even
333
00:17:21,119 --> 00:17:24,160
worse in that case.
But I 100% agree with you that
334
00:17:24,160 --> 00:17:28,960
there is about to be a huge
surveillance problem and that
335
00:17:29,280 --> 00:17:34,240
problem, you know, so I think
that the government
336
00:17:34,240 --> 00:17:38,600
version of it, even if you put
that aside, and I'm not
337
00:17:38,680 --> 00:17:40,680
arguing that that's
somehow not something to worry
338
00:17:40,680 --> 00:17:42,320
about.
But like, I think mostly about
339
00:17:42,520 --> 00:17:44,280
corporate actors these days,
right?
340
00:17:44,280 --> 00:17:49,680
Because the pipeline of data
from the incredible amount of
341
00:17:49,680 --> 00:17:51,840
information that we have
volunteered to these companies
342
00:17:51,920 --> 00:17:56,280
in the last 10 years and how
that is about to get poured into
343
00:17:56,360 --> 00:18:00,680
these funnels that can find
patterns in huge amounts of data
344
00:18:01,480 --> 00:18:04,240
has created an incredible
surveillance opportunity for
345
00:18:04,240 --> 00:18:06,400
these companies, right.
So like once upon a time, the
346
00:18:06,400 --> 00:18:08,800
big breakthrough, Mark
Zuckerberg's big breakthrough,
347
00:18:08,800 --> 00:18:14,080
was this idea of revealed
preferences. That was the term of art
348
00:18:14,080 --> 00:18:17,880
in the social media world, which
is, you know,
349
00:18:18,280 --> 00:18:21,120
if I look at
you enough, Jeff, and I look at
350
00:18:21,120 --> 00:18:25,160
enough other people who are
similar to your behavior, I can
351
00:18:25,160 --> 00:18:29,640
pick out of your collective
behavior some revealed
352
00:18:29,640 --> 00:18:31,080
preferences, is what they call
it.
353
00:18:31,120 --> 00:18:34,200
And that just means like the
stuff where if I asked you in a
354
00:18:34,200 --> 00:18:36,080
poll, hey, are you into this
kind of thing?
355
00:18:36,320 --> 00:18:38,480
You'd be like, no, I like this
and this and this.
356
00:18:38,680 --> 00:18:42,640
But your unconscious mind, you
know, just can't help but look
357
00:18:42,640 --> 00:18:45,920
at X, Y, and Z when it's shown it
and it's, you know, it's the
358
00:18:45,920 --> 00:18:48,520
equivalent of like you drive
past a car accident.
359
00:18:48,800 --> 00:18:51,760
Nobody would ever say in a, you
know, in a survey, I like
360
00:18:51,760 --> 00:18:54,960
looking at car accidents, you
know, but it is clear from your
361
00:18:54,960 --> 00:18:57,840
behavior, the behavior that you
exhibit on the social
362
00:18:57,840 --> 00:19:00,240
media platform that you can't
help but look at that, right?
363
00:19:00,440 --> 00:19:04,160
So you pour enough of that
kind of revealed preference data
364
00:19:04,160 --> 00:19:07,880
into a social media platform and
they start to build, right,
365
00:19:07,880 --> 00:19:09,760
engagement algorithms based on
that.
366
00:19:09,760 --> 00:19:11,840
And that's how we got into the
trouble we were in to start
367
00:19:11,840 --> 00:19:13,800
with.
Now you're going to have this
368
00:19:13,800 --> 00:19:19,400
incredible ability to find
meaning in that data and
369
00:19:19,400 --> 00:19:23,200
predictive forecasting that's
going to let you say this
370
00:19:23,200 --> 00:19:26,360
person's going to be way into
this and way into this kind of
371
00:19:26,360 --> 00:19:27,600
thing.
And it's not even going to be a
372
00:19:27,600 --> 00:19:31,840
question of funneling, you know,
content that you, you know, that
373
00:19:31,840 --> 00:19:35,080
someone has made to you, they're
going to be able to just custom
374
00:19:35,080 --> 00:19:37,600
generate it, right?
So for me, that's one huge
375
00:19:37,880 --> 00:19:39,800
problem.
And then, like you say, the
376
00:19:39,800 --> 00:19:42,800
surveillance part of it, I mean,
already there's technology out
377
00:19:42,800 --> 00:19:46,200
there that, you know, can turn
Wi-Fi signals into an ability to
378
00:19:46,200 --> 00:19:48,280
tell how many people are in the
room and who's moving around.
379
00:19:48,280 --> 00:19:53,240
And you know, the incredible
ability to watch our behavior on
380
00:19:53,240 --> 00:19:58,040
an ongoing basis is just going
to be unprecedented, and the
381
00:19:58,040 --> 00:19:59,920
fact that it's all going to be
382
00:19:59,920 --> 00:20:03,960
done for money almost makes me
more worried than if it were
383
00:20:03,960 --> 00:20:04,920
done.
For some reason,
384
00:20:04,920 --> 00:20:07,840
I'm more worried about it
being done by corporations than
385
00:20:07,840 --> 00:20:12,000
I am by governments.
But that's my sure little thing.
386
00:20:12,160 --> 00:20:14,280
I'm with that.
I mean, for sure, you know, I've
387
00:20:14,280 --> 00:20:16,440
been seeing, me and my other
co-hosts.
388
00:20:16,440 --> 00:20:19,200
We talk a lot about this thing
that's been talked about a lot
389
00:20:19,200 --> 00:20:21,800
recently, which is kind of weird:
predictive policing.
390
00:20:22,040 --> 00:20:23,920
Have you heard of this?
Yes, PredPol.
391
00:20:24,080 --> 00:20:26,000
I have a whole part of my book
about that.
392
00:20:26,040 --> 00:20:29,080
You know, like to me, like that
ties into all of this data
393
00:20:29,080 --> 00:20:30,960
collection and the surveillance
state stuff.
394
00:20:30,960 --> 00:20:34,040
So like, it's literally going to
be like Minority Report in my,
395
00:20:34,160 --> 00:20:36,880
my worst case, right?
It's like they're going to just
396
00:20:36,880 --> 00:20:39,200
look at you and be like, hey,
and we, we can predict based off
397
00:20:39,200 --> 00:20:42,000
of all these things that you're
talking about in five years, you
398
00:20:42,160 --> 00:20:44,160
you're probably going to commit
some kind of crime and they'll
399
00:20:44,160 --> 00:20:47,520
be able to somehow work that
into legislation and like, it
400
00:20:47,520 --> 00:20:49,560
becomes like a whole new world,
literally.
401
00:20:49,560 --> 00:20:51,440
And here's the thing
that I get really worried about
402
00:20:51,440 --> 00:20:53,960
with that is that, you know, one
of the themes that I
403
00:20:53,960 --> 00:20:56,160
come back to a lot in the book
is this concept of
404
00:20:56,160 --> 00:20:59,000
anthropomorphism, right?
Which is
405
00:20:59,000 --> 00:21:02,240
assuming
406
00:21:02,240 --> 00:21:04,920
that a system you don't
understand is somehow really
407
00:21:04,920 --> 00:21:07,360
sophisticated.
So just because you don't know
408
00:21:07,720 --> 00:21:11,120
how it works, you assume it's
right. It's just a
409
00:21:11,120 --> 00:21:13,720
natural human tendency.
This is one of those greatest
410
00:21:13,720 --> 00:21:15,920
hits of the last 50 years of
behavioral science, learning
411
00:21:15,920 --> 00:21:19,080
that one.
In that case you have
412
00:21:19,160 --> 00:21:21,000
these.
So that's the thing about
413
00:21:21,000 --> 00:21:23,680
predictive policing, right, is
that it delivers.
414
00:21:23,680 --> 00:21:26,720
This is true also of like facial
recognition systems: they
415
00:21:26,720 --> 00:21:33,720
deliver, you know, a
conclusion for an arresting
416
00:21:33,720 --> 00:21:38,240
officer that you know, it
doesn't explain itself, doesn't
417
00:21:38,240 --> 00:21:41,440
say how it came to that
conclusion and saves that person
418
00:21:41,440 --> 00:21:45,440
a huge amount of work, and that
is an irresistible
419
00:21:45,440 --> 00:21:47,680
combination.
And so you wind up in these
420
00:21:47,680 --> 00:21:52,040
systems where, so PredPol,
Los Angeles used this for a long
421
00:21:52,040 --> 00:21:56,080
time until it was basically
shown that it had all kinds of
422
00:21:56,080 --> 00:21:59,560
biased results, because what
it was essentially
423
00:21:59,560 --> 00:22:04,160
doing was kind of pre-indicting
sections of the city based on
424
00:22:04,160 --> 00:22:07,800
past patterns.
And as a result, a kid who just
425
00:22:07,800 --> 00:22:10,480
happens to live in that
neighborhood is going to get
426
00:22:10,480 --> 00:22:12,280
grabbed by the police way more
often.
427
00:22:12,280 --> 00:22:14,760
You know, a perfectly innocent
kid is much more
428
00:22:14,760 --> 00:22:16,480
likely to get accidentally
grabbed by the police.
429
00:22:17,040 --> 00:22:19,840
And so eventually LA has now
cancelled their contract with
430
00:22:19,840 --> 00:22:22,280
these systems.
But for years they were, they
431
00:22:22,280 --> 00:22:25,520
were using it, because
everyone just kept saying
432
00:22:25,520 --> 00:22:27,880
it's a neutral system, it's a
neutral arbiter.
433
00:22:27,880 --> 00:22:30,840
It's a, you know, but the truth
is no one really at the front
434
00:22:30,840 --> 00:22:34,760
lines of operating that system
knows what it's doing, how it's
435
00:22:34,760 --> 00:22:37,000
making its choices.
And it's so convenient.
436
00:22:37,000 --> 00:22:39,280
No one wants to resist it.
So that's my other worry about
437
00:22:39,280 --> 00:22:40,280
this.
You know, I really, I think
438
00:22:40,280 --> 00:22:43,400
we're not great at saying, you
know. One stat I bumped into
439
00:22:43,400 --> 00:22:48,440
in the book:
I looked at this big study of
440
00:22:48,440 --> 00:22:52,520
the big healthcare records
company, a company called Epic,
441
00:22:52,520 --> 00:22:55,520
that deals with your
electronic medical records.
442
00:22:56,120 --> 00:23:00,080
And they created a predictive
system that will help
443
00:23:00,240 --> 00:23:05,600
cardiologists make a priority
list of their patients that day
444
00:23:05,640 --> 00:23:09,280
based on who, who's most likely
to have a cardiac event that
445
00:23:09,280 --> 00:23:11,920
day.
And the thing is, it works.
446
00:23:12,040 --> 00:23:14,160
It works pretty well.
They're getting out in front of
447
00:23:14,160 --> 00:23:16,920
cardiac events early.
These doctors are, are able to,
448
00:23:16,960 --> 00:23:18,880
to get in and make interventions
early like that.
449
00:23:18,880 --> 00:23:20,520
You know, in a lot of ways you
can't argue with it because it
450
00:23:20,520 --> 00:23:22,520
really works well.
But here's where I get worried
451
00:23:22,520 --> 00:23:24,360
about it.
I said, well, I asked the
452
00:23:24,360 --> 00:23:28,080
makers of it, well, how often
like, your doctors must be
453
00:23:28,200 --> 00:23:31,080
irritated to be told what to do
by a system like this.
454
00:23:31,080 --> 00:23:32,800
And they said, oh, no, no,
they're thrilled.
455
00:23:32,800 --> 00:23:34,880
They're thrilled about it.
Like they don't want to have to
456
00:23:34,880 --> 00:23:37,960
make that choice.
They love being told what the
457
00:23:37,960 --> 00:23:40,640
order is.
And I was like, oh wow, has
458
00:23:40,640 --> 00:23:42,320
anyone?
And, and how often do they ask
459
00:23:42,320 --> 00:23:45,320
how it makes decisions?
And they were like, one guy
460
00:23:45,320 --> 00:23:48,040
asked once and that's it.
You know what I mean?
461
00:23:48,040 --> 00:23:50,600
And so like, so there but for
the grace of God, right.
462
00:23:50,600 --> 00:23:52,200
You might have a system that
works really well.
463
00:23:52,200 --> 00:23:55,200
You might not, you don't know,
but people are only too happy to
464
00:23:55,200 --> 00:23:57,120
let the decision get handed off
to these systems.
465
00:23:57,120 --> 00:23:58,800
This is what our wiring is
about.
466
00:23:59,280 --> 00:24:03,760
We don't like to make choices.
We like to be able to offload or
467
00:24:04,000 --> 00:24:06,440
outsource our decision making.
It's part of our human
468
00:24:06,440 --> 00:24:07,880
programming.
And this is the ultimate
469
00:24:08,040 --> 00:24:11,040
outsourcing system, and yet we
have no idea how it works or
470
00:24:11,040 --> 00:24:14,280
whether it's doing a good job.
Do you believe?
471
00:24:14,440 --> 00:24:16,400
I mean, that's basically a form
of triage, isn't it?
472
00:24:17,520 --> 00:24:21,240
Yeah, right.
And like again, you'd
473
00:24:21,240 --> 00:24:25,080
want that to work well.
But I had a
474
00:24:25,080 --> 00:24:28,040
physician once explain to me
that one of the number one
475
00:24:28,040 --> 00:24:33,320
causes of malpractice lawsuits
is a failure to look at a
476
00:24:33,320 --> 00:24:36,840
patient's back.
And they'll then get bed sores
477
00:24:36,840 --> 00:24:39,120
and no one finds out,
right.
478
00:24:39,480 --> 00:24:42,960
And the going theory about that
is that the doctor is too buried
479
00:24:42,960 --> 00:24:45,960
in the chart.
The doctor's already overworked,
480
00:24:46,080 --> 00:24:47,600
right?
Has a patient load they can't
481
00:24:47,600 --> 00:24:49,400
handle.
They're buried in the chart.
482
00:24:49,400 --> 00:24:51,440
They just look at the vitals
that are on the chart.
483
00:24:51,440 --> 00:24:54,040
And then they say, you know, OK,
I think it's this, let's do
484
00:24:54,040 --> 00:24:56,040
this, right?
And they don't, they just don't
485
00:24:56,040 --> 00:24:58,880
have the time or the incentive
to roll that person over and
486
00:24:58,880 --> 00:25:00,720
check out their whole body,
right?
487
00:25:00,720 --> 00:25:02,920
Like it's that kind of thing.
And I, I would just worry like
488
00:25:02,920 --> 00:25:05,120
at every level of really
essential services in this
489
00:25:05,120 --> 00:25:08,040
country, you're going to have
people saying, oh, I don't have
490
00:25:08,040 --> 00:25:11,400
to do that anymore.
The AI tells me which suspect to
491
00:25:11,400 --> 00:25:14,440
arrest.
The AI tells me who to hire for
492
00:25:14,440 --> 00:25:16,960
this job.
The AI tells me which of these
493
00:25:16,960 --> 00:25:19,720
people I should date.
We're going to be using these
494
00:25:19,720 --> 00:25:22,520
systems in ways that I think are
going to wipe out our, it's
495
00:25:22,520 --> 00:25:24,560
going to do to our ability to
make good choices for ourselves
496
00:25:24,560 --> 00:25:27,600
what GPS has done to our
sense of direction.
497
00:25:27,600 --> 00:25:31,000
I can't find my way around my
own town anymore because of GPS.
498
00:25:31,040 --> 00:25:32,440
Same.
Is this going to be?
499
00:25:32,440 --> 00:25:34,480
So I kind of go back and forth
on this a lot.
500
00:25:35,600 --> 00:25:37,480
You know, obviously, like it
freaks me out because we're
501
00:25:37,480 --> 00:25:39,680
living in this transitional
period, right?
502
00:25:39,680 --> 00:25:41,240
We're going to witness it from
before.
503
00:25:41,240 --> 00:25:42,760
I remember going outside and
playing, right?
504
00:25:42,760 --> 00:25:45,120
And then now it's like we're
going into this new world
505
00:25:45,120 --> 00:25:47,440
situation.
But is it going to be a thing
506
00:25:47,440 --> 00:25:50,440
where in a few generations in
the future, maybe 50 to 100
507
00:25:50,440 --> 00:25:53,960
years in the future?
Like it's so well developed in
508
00:25:53,960 --> 00:25:57,880
advance that it's actually
incredibly beneficial, like net
509
00:25:57,880 --> 00:26:01,240
positive for humanity, right?
Like, we live in some kind of,
510
00:26:01,880 --> 00:26:06,480
you know, utopian paradise, you
know, an abundant civilization
511
00:26:06,840 --> 00:26:08,640
because of it.
You know, I mean, that is
512
00:26:08,640 --> 00:26:10,760
certainly, that's
the vision being sold by the
513
00:26:10,760 --> 00:26:12,880
companies making it, right.
It's the
514
00:26:12,880 --> 00:26:17,000
weirdest time to be a reporter
because they say things that in
515
00:26:17,000 --> 00:26:20,280
the old days, and I'm old enough
to remember some old days, would
516
00:26:20,280 --> 00:26:22,640
have been an absolute death
sentence from a sort of public
517
00:26:22,640 --> 00:26:25,000
relations standpoint.
So you'll have a guy like Sam
518
00:26:25,000 --> 00:26:29,680
Altman saying openly, huge
numbers of jobs are going to be
519
00:26:29,680 --> 00:26:33,560
destroyed by our creation, you
know, and there's going to be
520
00:26:33,600 --> 00:26:36,760
big scams that are going to be
really scary made possible by
521
00:26:36,760 --> 00:26:40,080
what we have built here.
You know, to my mind,
522
00:26:40,200 --> 00:26:43,520
he says it and then in the
same breath, essentially says,
523
00:26:43,520 --> 00:26:46,400
but it's going to be worth it
because down the line there's
524
00:26:46,400 --> 00:26:50,360
going to be this incredible, you
know, upshot, you know, upside
525
00:26:50,640 --> 00:26:53,600
for humanity.
And I think I would just say
526
00:26:53,600 --> 00:26:57,320
like, again, I'm down for like,
if, if it were up to me, I'd
527
00:26:57,320 --> 00:27:00,480
make it like, let's make it like
a five-year moratorium on
528
00:27:00,480 --> 00:27:02,880
commercial use and just give it
to the scientists.
529
00:27:02,960 --> 00:27:05,600
I'd like to have just scientists
use it for five years and
530
00:27:05,600 --> 00:27:07,680
then we can start trying to make
money off it, you know, and
531
00:27:07,680 --> 00:27:09,360
let's see where that takes us.
First.
532
00:27:09,680 --> 00:27:11,840
Let's see if we can wipe out
cancer and then we'll start
533
00:27:13,080 --> 00:27:16,960
wiping out, you know, real world
girlfriends, OK? Like, hold
534
00:27:16,960 --> 00:27:21,200
off, you know. And in the case
of, like, already there
535
00:27:21,200 --> 00:27:24,320
isn't even evidence yet that it
has any kind of real
536
00:27:24,320 --> 00:27:28,440
productivity gains, right?
Like there was a big MIT study
537
00:27:28,440 --> 00:27:31,840
recently that showed that 95% of
these companies that have
538
00:27:31,840 --> 00:27:35,840
adopted this stuff are
reporting no productivity gains
539
00:27:35,840 --> 00:27:38,200
at all.
Like they can't figure out any,
540
00:27:38,440 --> 00:27:40,040
any improvement that's been
made.
541
00:27:40,440 --> 00:27:44,560
So, you know, it may be that
that's the case, but you also
542
00:27:44,560 --> 00:27:46,120
have to think about the other
kinds of things that the
543
00:27:46,120 --> 00:27:47,560
leadership of these companies
predict.
544
00:27:47,760 --> 00:27:50,920
So one of the things that
Sam Altman also has said, he
545
00:27:50,920 --> 00:27:55,400
said it in a January 2024
podcast with Alexis Ohanian, he
546
00:27:55,400 --> 00:27:59,520
said that he has a bet
going with his tech CEO friends
547
00:27:59,520 --> 00:28:04,000
on their group chat as to when
we'll see the first billion
548
00:28:04,000 --> 00:28:09,080
dollar one person company.
And that's like a dream of theirs.
549
00:28:09,080 --> 00:28:11,200
He says it like it's
a positive thing.
550
00:28:11,920 --> 00:28:14,320
And so I think to myself, OK,
well, wait a minute.
551
00:28:14,320 --> 00:28:19,200
So if the future is a handful of
single person billionaires,
552
00:28:19,200 --> 00:28:21,920
single person billionaire
companies, what are the rest of
553
00:28:21,920 --> 00:28:24,680
us going to do for a living?
You know, and the
554
00:28:24,680 --> 00:28:26,680
dream they sort of offer is like,
we're going to have a huge
555
00:28:26,680 --> 00:28:29,320
amount of free time.
I don't know about you guys, but
556
00:28:29,320 --> 00:28:32,920
like, this is the United States.
We don't like free time and we
557
00:28:32,920 --> 00:28:34,280
don't like to give people free
time.
558
00:28:34,280 --> 00:28:36,480
We certainly don't like to pay
people for free time.
559
00:28:36,760 --> 00:28:40,080
You know, like to me, to my
mind, there's just no, there's
560
00:28:40,080 --> 00:28:44,080
no example in history of a, of a
time when we've made it possible
561
00:28:44,080 --> 00:28:50,320
to do less work and, and given
people, you know, a living for
562
00:28:50,320 --> 00:28:51,760
that.
Like it's just, that's not what
563
00:28:51,760 --> 00:28:52,800
we do.
Sure.
564
00:28:53,160 --> 00:28:55,920
I mean, I do also like I pay a
lot of attention to like
565
00:28:55,920 --> 00:28:58,640
macroeconomic stuff.
So, you know, it is also
566
00:28:58,640 --> 00:29:01,360
interesting to me that it seems
like, and maybe this is just my
567
00:29:01,360 --> 00:29:04,680
conspiracy minded self thinking
too much, but it does seem like
568
00:29:04,680 --> 00:29:07,440
we're going through some kind of
monetary change, right?
569
00:29:07,440 --> 00:29:10,000
Yes.
So it could be something
570
00:29:10,000 --> 00:29:12,440
where they kind of know,
they're foreseeing this
571
00:29:12,440 --> 00:29:14,400
being an issue.
So they're trying to flip
572
00:29:14,720 --> 00:29:17,800
whether it be into like crypto
rails or whatever the case might
573
00:29:17,800 --> 00:29:19,480
be.
Because I know when AI
574
00:29:19,480 --> 00:29:21,280
started dropping, everybody was
like, oh, we're going to lose
575
00:29:21,280 --> 00:29:23,000
all these jobs.
We're going to talk about
576
00:29:23,000 --> 00:29:26,320
universal basic income and
obviously like that doesn't
577
00:29:26,320 --> 00:29:30,080
sound too, too good for anybody
who's not just a lazy ass, right?
578
00:29:30,080 --> 00:29:30,960
So.
Right.
579
00:29:30,960 --> 00:29:32,360
I mean already, right?
You wouldn't want that.
580
00:29:32,560 --> 00:29:33,880
A lot of people don't want that.
That's right.
581
00:29:33,880 --> 00:29:36,080
Right.
But if there is some kind of
582
00:29:36,080 --> 00:29:40,720
like monetary shift globally and
we stop looking at the financial
583
00:29:40,720 --> 00:29:44,840
system the way that we always
have, maybe like maybe right,
584
00:29:44,840 --> 00:29:47,880
maybe there could be some way to
make all that work together.
585
00:29:47,880 --> 00:29:51,080
But.
Maybe, I mean, you know, now
586
00:29:51,080 --> 00:29:55,040
we're out past the power of
history to be in any way
587
00:29:55,040 --> 00:29:57,200
predictive.
Like we don't have any examples
588
00:29:57,200 --> 00:29:59,680
of that being possible.
589
00:29:59,680 --> 00:30:02,160
And so, you know, yeah, maybe
down the road there will be some
590
00:30:02,160 --> 00:30:05,640
kind of, you know, maybe it'll
be the end of fiat currency and
591
00:30:05,640 --> 00:30:09,160
we'll all be somehow, I don't
know, trading sugar syrup and,
592
00:30:09,160 --> 00:30:12,040
and not worrying about, you
know, against some digital
593
00:30:12,040 --> 00:30:13,560
currency.
And we'll be, you know, and
594
00:30:13,560 --> 00:30:16,280
we'll be fine.
My problem is that, you know, so
595
00:30:16,280 --> 00:30:18,800
there's a, there's a thing
called the Jevons paradox that
596
00:30:18,800 --> 00:30:21,120
really haunts me.
So back in the 19th century, guy
597
00:30:21,120 --> 00:30:23,480
named William Jevons was
598
00:30:23,480 --> 00:30:26,160
a British
economist, and he was trying to
599
00:30:26,160 --> 00:30:28,800
figure out why it was that the
British Empire was running out
600
00:30:28,800 --> 00:30:31,920
of coal, which he was
recognizing was a big problem,
601
00:30:31,920 --> 00:30:34,360
that England itself was going to
run out of coal.
602
00:30:34,760 --> 00:30:38,320
And, and his thing that he
identified was this weird
603
00:30:38,320 --> 00:30:41,920
paradox in which the technology
at the time had actually gotten
604
00:30:41,920 --> 00:30:44,880
better at burning coal in a much
more efficient way.
605
00:30:44,880 --> 00:30:46,880
There were these new steam
engines that really had
606
00:30:46,880 --> 00:30:50,440
revolutionized the efficient use
of coal, and yet they were using
607
00:30:50,440 --> 00:30:53,360
more coal than ever.
And it's become a way of
608
00:30:53,360 --> 00:30:55,880
describing that dynamic in all
these different fields.
609
00:30:55,880 --> 00:31:00,080
So it turns out, like the more
aqueducts you create to hang on
610
00:31:00,080 --> 00:31:03,640
to drinking water, the more we
use up drinking water, like in
611
00:31:03,760 --> 00:31:07,320
in example after example, the
more efficient you are in using
612
00:31:07,320 --> 00:31:09,520
a thing, the more you consume
it.
613
00:31:09,760 --> 00:31:12,120
And that's how I think it's
going to be with our free time.
614
00:31:12,240 --> 00:31:15,040
I think that the more it is
possible to give someone free
615
00:31:15,040 --> 00:31:18,280
time, the more we're going to
consume that free time and take
616
00:31:18,280 --> 00:31:20,840
it away from people basically.
I just don't believe that
617
00:31:21,280 --> 00:31:26,240
there's a world in
which you're going to, you know,
618
00:31:26,600 --> 00:31:31,720
make a sort of free-flowing
monetary system and a kind of,
619
00:31:32,160 --> 00:31:37,400
you know, super generous social
policy that somehow can work
620
00:31:37,400 --> 00:31:40,400
across borders in which we
don't, in which we're, we're
621
00:31:40,400 --> 00:31:43,480
cool with people not
having to work and AI doing all
622
00:31:43,480 --> 00:31:45,200
the jobs.
You know, Amazon has just
623
00:31:45,200 --> 00:31:46,680
announced, the New York Times
just announced that like
624
00:31:46,880 --> 00:31:49,800
Amazon's about to wipe out half
a million jobs.
625
00:31:49,800 --> 00:31:51,840
They're going to automate half a
million jobs, according to
626
00:31:51,840 --> 00:31:53,240
internal documents at Amazon,
right?
627
00:31:53,240 --> 00:31:56,200
That's, that's, you know, half a
million jobs, you know, out of
628
00:31:56,200 --> 00:31:58,560
like, I think they employ like 1.3
million people.
629
00:31:58,560 --> 00:32:02,240
Like that's a lot of jobs gone.
I don't think we're on a path
630
00:32:02,240 --> 00:32:06,920
toward hanging out and
living a happy, relaxed life.
631
00:32:06,920 --> 00:32:09,840
That doesn't, in my experience
or in the
632
00:32:09,840 --> 00:32:13,560
historical lessons, tend
to be the result of the moves of
633
00:32:13,560 --> 00:32:16,080
big companies.
You own nothing and you will be
634
00:32:16,080 --> 00:32:18,400
happy, as they say.
Yeah, that's right.
635
00:32:18,640 --> 00:32:21,840
So, well, even if we did have
a completely automated world,
636
00:32:21,840 --> 00:32:25,680
we'd all end up being like those
people in WALL-E, just in their,
637
00:32:25,720 --> 00:32:28,360
well, super fat, staring at
screens.
638
00:32:28,360 --> 00:32:30,280
Jake, here's
where we're at, right?
639
00:32:30,280 --> 00:32:32,480
This is how I think about it.
So everyone keeps saying, oh,
640
00:32:32,480 --> 00:32:33,880
you mean the Terminator.
You're worried about the
641
00:32:33,880 --> 00:32:35,120
Terminator.
And I'm like, no, man, I'm not
642
00:32:35,120 --> 00:32:37,400
worried about the Terminator.
I'm worried about Idiocracy.
643
00:32:37,520 --> 00:32:38,120
Yeah.
Oh, yeah.
644
00:32:38,160 --> 00:32:40,200
That movie.
I'm worried about WALL-E.
645
00:32:40,240 --> 00:32:43,320
Exactly.
It is. Now, WALL-E would
646
00:32:43,320 --> 00:32:46,000
be OK.
Like, I mean, WALL-E is a little
647
00:32:46,000 --> 00:32:49,280
bit like what Jeff is, is, you
know, suggesting could be
648
00:32:49,280 --> 00:32:51,520
possible, right?
Like a world in which we're just
649
00:32:51,520 --> 00:32:54,360
like our Slurpees are brought to
us and we watch TV all day.
650
00:32:54,360 --> 00:32:56,280
Like we're OK, you know?
Am I right?
651
00:32:56,800 --> 00:33:00,520
That's not the worst outcome,
but I'm worried about, you know,
652
00:33:00,960 --> 00:33:04,280
I'm, I'm worried about, you
know, something between, like,
653
00:33:04,280 --> 00:33:07,520
what is it — Elysium and
Idiocracy?
654
00:33:07,640 --> 00:33:14,360
You know, like a very, you know,
a very powerful gap, a big gap
655
00:33:14,360 --> 00:33:18,600
between rich and poor, you know,
an extremely sharp pyramid at
656
00:33:18,600 --> 00:33:21,040
which only a handful of people
get to live at the top and the
657
00:33:21,040 --> 00:33:22,160
rest of us are all at the
bottom.
658
00:33:22,160 --> 00:33:26,000
And no one has the critical
faculties to see, you know, to
659
00:33:26,000 --> 00:33:27,520
figure out a solution to the
problem.
660
00:33:28,320 --> 00:33:30,240
Yeah.
And then also the plants
661
00:33:30,280 --> 00:33:32,280
obviously craving Brawndo
because.
662
00:33:32,720 --> 00:33:34,080
Exactly.
Exactly.
663
00:33:34,280 --> 00:33:35,320
That's right.
That's right.
664
00:33:35,320 --> 00:33:39,000
Exactly.
So how worried are you about the
665
00:33:39,200 --> 00:33:41,920
concept of a technocracy?
Well, I am worried about that.
666
00:33:41,920 --> 00:33:45,040
I mean, you know, I, I — so I
have a podcast called The Rip
667
00:33:45,040 --> 00:33:47,800
Current, and I was trying to
figure out what I was
668
00:33:47,800 --> 00:33:50,200
going to call it when I
launched The Rip Current.
669
00:33:50,200 --> 00:33:53,120
And, and at one point I was
thinking about calling it NPC
670
00:33:53,680 --> 00:33:56,680
because it's a term that you
hear a lot in Silicon Valley at
671
00:33:56,680 --> 00:33:59,760
the top level from people.
So NPC, right — anybody,
672
00:33:59,760 --> 00:34:02,160
everybody who listens to you
guys will, I'm sure,
673
00:34:02,160 --> 00:34:03,800
know this term.
But non-player character, right?
674
00:34:03,800 --> 00:34:06,680
It's the background characters
and so forth in, in video games.
675
00:34:07,120 --> 00:34:09,000
It's like the extras in movies,
right.
676
00:34:09,480 --> 00:34:12,760
And a friend of mine makes the
joke that the people
677
00:34:12,760 --> 00:34:16,719
running the top companies in
Silicon Valley are like your
678
00:34:16,719 --> 00:34:20,199
friend who watched Akira too
many times and didn't quite get
679
00:34:20,199 --> 00:34:22,480
it, you know, didn't quite
understand that the lesson of that
680
00:34:22,480 --> 00:34:24,600
movie is not that superpowers
are cool.
681
00:34:24,760 --> 00:34:27,400
It's that you shouldn't have
superpowers and that it's bad
682
00:34:27,400 --> 00:34:30,000
for everybody else.
You know, and that's sort of how
683
00:34:30,000 --> 00:34:33,120
I feel about the leadership of a
lot of these companies.
684
00:34:33,120 --> 00:34:36,080
Like, there is this idea, you
know, that they can joke
685
00:34:36,080 --> 00:34:40,320
about the idea of a billion
dollar single person company.
686
00:34:40,719 --> 00:34:43,239
That's the definition of a
technocracy, right?
687
00:34:43,239 --> 00:34:45,440
That for me is an enormous red
flag.
688
00:34:45,960 --> 00:34:48,719
And so I do worry about that.
I mean, we had a whole campaign
689
00:34:48,719 --> 00:34:50,719
here in Northern California
where I am based, there's a
690
00:34:50,719 --> 00:34:54,000
whole group of people who are
trying to build their own Tech
691
00:34:54,000 --> 00:34:55,760
City.
They were trying to basically
692
00:34:55,760 --> 00:34:58,880
incorporate a whole part of, I
can't remember which, I think it was
693
00:34:58,880 --> 00:35:01,160
Lake County, but this rural part
of California.
694
00:35:01,160 --> 00:35:03,480
They're going to basically just
like take it over if they could.
695
00:35:03,920 --> 00:35:07,920
They lost, but now they've got a
whole new plan to try and
696
00:35:10,000 --> 00:35:17,280
co-opt an existing town.
You know, there's a real idea in
697
00:35:17,280 --> 00:35:22,040
tech circles that, like, the
very smartest are the ones who
698
00:35:22,040 --> 00:35:26,200
count, and everybody else is
just kind of a user and, you
699
00:35:26,200 --> 00:35:28,320
know, a consumer.
And that that bothers the hell
700
00:35:28,320 --> 00:35:30,320
out of me.
What do you mean by they were
701
00:35:30,320 --> 00:35:33,560
trying to take it over like like
a 15 minute city or something
702
00:35:33,560 --> 00:35:35,280
like that?
They were trying to basically
703
00:35:35,280 --> 00:35:39,000
the concept —
I'll do a little subtle Googling
704
00:35:39,000 --> 00:35:41,040
while I can here, but there was
a whole idea that they were
705
00:35:41,040 --> 00:35:45,320
going to create, like, a sort of
tech-utopian city.
706
00:35:45,320 --> 00:35:48,440
There was also the rumor mill
had it for a while that, you
707
00:35:48,440 --> 00:35:50,480
know, at the beginning of the
Trump administration, there was
708
00:35:50,480 --> 00:35:54,680
some conversations going on
around, you know, trying to
709
00:35:54,680 --> 00:35:59,040
create some kind of regulation
less city in which tech
710
00:35:59,040 --> 00:36:02,600
companies could experiment
without any regulations at all.
711
00:36:02,920 --> 00:36:06,520
And in the big beautiful bill
that passed at one point, there
712
00:36:06,520 --> 00:36:09,760
was a provision that was going
to make it such that there would
713
00:36:09,760 --> 00:36:13,680
be no regulations allowed on AI
at all for 10 years.
714
00:36:13,680 --> 00:36:16,240
And states — the individual
states — would not be allowed to
715
00:36:16,240 --> 00:36:18,640
pass any regulations on AI for
10 years, right?
716
00:36:18,640 --> 00:36:23,160
So there's just this clear
feeling on the
717
00:36:23,160 --> 00:36:25,200
part of the leadership of
these companies that says,
718
00:36:25,760 --> 00:36:28,440
just leave us alone, you know,
let us figure it out.
719
00:36:28,440 --> 00:36:31,760
I once interviewed the the
former CEO of Google, Eric
720
00:36:31,760 --> 00:36:35,040
Schmidt, and he basically said
that.
721
00:36:35,040 --> 00:36:37,200
He basically said, you know,
policymakers can't be trusted
722
00:36:37,200 --> 00:36:39,280
to think about this stuff.
They're not smart enough.
723
00:36:39,480 --> 00:36:42,720
Leave it to us.
We will figure it out and then
724
00:36:42,720 --> 00:36:44,840
we'll regulate it later.
We'll figure out what the rules
725
00:36:44,840 --> 00:36:47,360
should be because only we are
sort of smart enough to do this.
726
00:36:47,360 --> 00:36:49,320
This is how this is how I think
they think about things.
727
00:36:50,000 --> 00:36:52,640
Yeah, I mean, I, I kind of see
that, right, going back to what
728
00:36:52,640 --> 00:36:56,000
I was saying earlier about like
maybe at some point in the, you
729
00:36:56,000 --> 00:36:59,280
know, relatively distant future,
it becomes like a utopian
730
00:36:59,280 --> 00:37:01,440
situation.
But, you know, going back to
731
00:37:01,440 --> 00:37:04,520
what you were saying, you know,
if this AI can figure out a way
732
00:37:04,520 --> 00:37:08,480
to cure cancer or slow down
aging or whatever the case might
733
00:37:08,480 --> 00:37:11,800
be, it's not going to be cheap
enough for us to, to, to use
734
00:37:11,800 --> 00:37:13,760
right.
Like those, those advances are
735
00:37:13,760 --> 00:37:16,600
going to be something that only
these billionaires, at least I
736
00:37:16,600 --> 00:37:18,440
think are going to be able to
utilize that.
737
00:37:18,440 --> 00:37:20,560
So maybe that's why they have
this mentality because they,
738
00:37:20,560 --> 00:37:22,080
they kind of in the back of
their mind, they're like, hey
739
00:37:22,080 --> 00:37:24,360
man, we're going to live the
next 1000 years.
740
00:37:24,560 --> 00:37:26,880
The schmucks aren't, you know,
So if we could just.
741
00:37:27,680 --> 00:37:30,680
The longevity stuff is a
whole cultural facet of Silicon
742
00:37:30,680 --> 00:37:32,800
Valley.
There's huge amounts of cultural
743
00:37:32,800 --> 00:37:36,280
interest here at that high level
in sort of living forever.
744
00:37:36,280 --> 00:37:38,240
There's a huge amount of
longevity talk.
745
00:37:38,240 --> 00:37:41,160
You know, you've got people who
are, you know, taking endless
746
00:37:41,160 --> 00:37:43,880
supplements and icing their skin
and doing all kinds of stuff
747
00:37:43,880 --> 00:37:45,040
because they want to live
forever.
748
00:37:46,120 --> 00:37:48,440
Yeah, all that stuff is really
is really there.
749
00:37:48,840 --> 00:37:53,440
And at the same time, I mean, I
just think there's a,
750
00:37:53,440 --> 00:37:57,040
a feeling of intolerance.
There's a guy who
751
00:37:57,120 --> 00:38:01,040
who's a big, who's very popular
in tech circles here named
752
00:38:01,040 --> 00:38:05,680
Curtis Yarvin.
And he has a very successful
753
00:38:05,800 --> 00:38:08,520
newsletter and is a very
successful sort of public
754
00:38:08,520 --> 00:38:11,280
intellectual in Silicon Valley.
And Curtis Yarvin's whole
755
00:38:11,280 --> 00:38:13,600
argument is we need a monarchy.
We should have a techno
756
00:38:13,600 --> 00:38:15,560
monarchy.
We shouldn't even be doing
757
00:38:15,560 --> 00:38:17,560
democracy anymore because it
slows stuff down.
758
00:38:17,840 --> 00:38:21,160
And, and you know, lots of
people in the tech community who
759
00:38:21,160 --> 00:38:23,680
believe, who are into that, you
know, who are into that idea of
760
00:38:23,960 --> 00:38:27,600
people like Peter Thiel, people
like Marc Andreessen, who's a
761
00:38:27,600 --> 00:38:30,280
big VC out here.
You know, these, these, these,
762
00:38:30,280 --> 00:38:33,880
there's a real cabal of people
who really believe the smartest
763
00:38:33,880 --> 00:38:36,720
people's ideas count.
And everybody else is just kind
764
00:38:36,720 --> 00:38:39,680
of an NPC.
And I think that that is a
765
00:38:39,680 --> 00:38:44,000
real problem, which is why I want
the utopian idea to happen.
766
00:38:44,440 --> 00:38:45,360
But this is the other thing,
right?
767
00:38:45,360 --> 00:38:48,440
Like, let's say they make an
AI-based cancer system.
768
00:38:49,040 --> 00:38:53,480
I mean, you know the what they
call the capital expenditures on
769
00:38:53,520 --> 00:38:55,720
AI, the investment they actually
have to make in the data
770
00:38:55,720 --> 00:38:58,040
centers.
The amount of money being spent
771
00:38:58,040 --> 00:39:04,040
on that accounted for 1/3 of the
total growth in GDP between last
772
00:39:04,040 --> 00:39:06,400
year and this year.
A third of the difference
773
00:39:06,400 --> 00:39:09,000
between how much we spent last
year as a nation and how much we
774
00:39:09,000 --> 00:39:12,800
spent this year as a nation is
just in the tubes and computers
775
00:39:12,800 --> 00:39:16,000
used to power this stuff.
They're investing tens of
776
00:39:16,000 --> 00:39:19,560
billions of dollars and that
money has to get made back.
777
00:39:19,840 --> 00:39:22,200
So let's say they do figure out
how to cure cancer.
778
00:39:22,240 --> 00:39:23,560
They're not going to give it
away for free.
779
00:39:23,560 --> 00:39:25,080
They're going to need to make
that money back.
780
00:39:25,160 --> 00:39:26,600
That's what's happening right
now.
781
00:39:26,600 --> 00:39:28,720
And so to my mind, the
commercialization of AI is going
782
00:39:28,720 --> 00:39:31,000
to create this incredible
pressure to make money.
783
00:39:31,520 --> 00:39:35,120
And and that's not going to lead
them to utopian stuff, it seems
784
00:39:35,120 --> 00:39:36,800
to me.
You know you guys ever seen that
785
00:39:36,800 --> 00:39:40,280
show Altered Carbon?
Yes, you know what's so funny?
786
00:39:40,440 --> 00:39:43,800
That — those are the books
I relax with, weirdly enough. I
787
00:39:43,800 --> 00:39:45,000
love —
I love the writing.
788
00:39:45,240 --> 00:39:46,840
I can't read, but the show was
great.
789
00:39:46,840 --> 00:39:51,120
And this is what it makes me
feel like — it makes me feel
790
00:39:51,120 --> 00:39:55,720
like that utopian, AI-driven
world is going to be for those
791
00:39:55,720 --> 00:39:57,480
people who live above the
clouds, right?
792
00:39:57,480 --> 00:39:59,920
It's like, like you were saying
earlier, it's going to be a huge
793
00:40:00,320 --> 00:40:02,640
gap between like the rest of us
and them.
794
00:40:03,000 --> 00:40:04,800
And you know, they might live
1000 years.
795
00:40:04,800 --> 00:40:06,560
They might be able to upload
their consciousness to the
796
00:40:06,560 --> 00:40:09,640
stacks and clone themselves to
continue like some weird shit
797
00:40:09,640 --> 00:40:11,280
like that.
But that, that's what I always
798
00:40:11,280 --> 00:40:13,080
think about when I really start
thinking about this stuff.
799
00:40:13,240 --> 00:40:16,200
Yeah, totally.
I think there's a, there's a —
800
00:40:16,880 --> 00:40:22,080
you know, I just think, like, the
801
00:40:22,080 --> 00:40:25,480
problem with our society as
we've got it built right now,
802
00:40:25,480 --> 00:40:28,080
right?
The whole neoliberal project,
803
00:40:28,400 --> 00:40:32,080
the, the, the capitalist world
that we live in is somebody's
804
00:40:32,080 --> 00:40:34,560
got to lose, right?
It's not a world in which
805
00:40:34,560 --> 00:40:37,080
everybody gets to win.
Somebody wins and somebody
806
00:40:37,080 --> 00:40:39,400
loses.
And the problem has always been,
807
00:40:39,400 --> 00:40:43,240
can you keep those things in at
least a relative balance so that
808
00:40:43,240 --> 00:40:47,440
being at the bottom end of the
society isn't like living in a
809
00:40:47,440 --> 00:40:51,080
medieval way.
It is a, you know, an OK life,
810
00:40:51,080 --> 00:40:52,120
right?
And we've had that in
811
00:40:52,120 --> 00:40:54,840
American history.
But this is not the path to
812
00:40:54,840 --> 00:40:56,040
that.
It doesn't seem to me.
813
00:40:56,040 --> 00:40:58,920
So.
At a certain point, we can also
814
00:40:58,920 --> 00:41:00,120
talk about happier topics.
You guys.
815
00:41:00,120 --> 00:41:01,480
I know I am.
Like I'm such a bummer.
816
00:41:01,480 --> 00:41:04,600
Oh, no, no, no.
I ruin every party I go to.
817
00:41:04,600 --> 00:41:06,760
Our listeners are sick.
They love the depressing stuff.
818
00:41:06,760 --> 00:41:08,400
Yeah, you seem like a lot of fun
for sure.
819
00:41:09,560 --> 00:41:12,360
No, our, our listeners are, are
pretty gross, like our, our
820
00:41:12,360 --> 00:41:18,360
toughest, our highest-grossing
shows or topics are
821
00:41:18,360 --> 00:41:22,120
ones where people are horribly
killed in terrible accidents and
822
00:41:22,120 --> 00:41:24,760
things like that.
They're missing forever and
823
00:41:24,760 --> 00:41:26,200
stuff like that.
Maybe you're my tribe,
824
00:41:26,200 --> 00:41:27,360
maybe I've been.
We love it.
825
00:41:29,640 --> 00:41:32,840
So yeah, we've, we've talked a
lot just kind of freestyling
826
00:41:32,840 --> 00:41:34,360
here.
But I want to, I want to shine
827
00:41:34,840 --> 00:41:37,560
some light on the book.
I'm a, I'm a big book guy.
828
00:41:37,560 --> 00:41:41,600
So I always end up steering the
conversation, especially with an
829
00:41:41,600 --> 00:41:44,160
author into the topic of their
book.
830
00:41:44,760 --> 00:41:47,040
So I want to talk about the
Loop, right?
831
00:41:47,040 --> 00:41:49,680
So, which you published,
like you said, in January of
832
00:41:49,680 --> 00:41:53,040
2022, you predicted the AI
revolution we're seeing now.
833
00:41:53,320 --> 00:41:56,040
And we talked about that pretty
heavily already.
834
00:41:56,040 --> 00:41:58,320
Not necessarily a prediction,
but AI in general.
835
00:41:58,320 --> 00:42:02,120
How about the subtitle: How AI
Is Creating a World Without
836
00:42:02,120 --> 00:42:04,720
Choices.
Now, could you break that down
837
00:42:04,720 --> 00:42:07,520
for the audience?
And how exactly does a
system designed to give us more
00:42:07,520 --> 00:42:11,560
system designed to give us more
options or a better experience
839
00:42:11,600 --> 00:42:15,040
ultimately limit our choices?
Well, I love this question.
840
00:42:15,040 --> 00:42:18,120
So as I said, I spent a huge
amount of time looking at both
841
00:42:18,120 --> 00:42:20,680
the behavioral science world and
then the technology world.
842
00:42:20,680 --> 00:42:23,040
So the behavioral science world
accounts for about the first
843
00:42:23,040 --> 00:42:25,120
third of the book where I'm
trying to get people kind of a
844
00:42:25,120 --> 00:42:30,200
crash course in some of the big
lessons of the last 50 years and
845
00:42:30,320 --> 00:42:32,120
more of behavioral
science.
846
00:42:32,120 --> 00:42:35,720
And one of the big ones is about us
as a species.
847
00:42:36,200 --> 00:42:39,320
So a lot of this comes from a
guy named Daniel Kahneman who
848
00:42:39,320 --> 00:42:41,360
wrote a book called Thinking
Fast and Slow.
849
00:42:41,360 --> 00:42:42,760
That was a very famous book for
a time.
850
00:42:42,760 --> 00:42:44,360
He won the Nobel Prize in
economics.
851
00:42:44,360 --> 00:42:47,960
He was a psychologist who
beginning in the 1970s with a
852
00:42:47,960 --> 00:42:51,240
partner named Amos Tversky,
created a whole bunch of
853
00:42:51,240 --> 00:42:55,720
experiments that showed that,
basically, people hate to
854
00:42:55,720 --> 00:43:00,280
make choices and will take any
excuse to outsource that
855
00:43:00,280 --> 00:43:04,520
decision making if they can.
And he came up with an idea of
856
00:43:05,000 --> 00:43:07,120
this thinking fast and slow
idea.
857
00:43:07,360 --> 00:43:10,240
So there's a fast thinking brain
and a slow thinking brain, a
858
00:43:10,240 --> 00:43:13,840
system one and a system 2.
That system 1 fast thinking
859
00:43:13,840 --> 00:43:16,240
brain is this one that I, as I
mentioned earlier, we have in
860
00:43:16,240 --> 00:43:19,120
common with primates.
It goes way, way, way, way back.
861
00:43:19,480 --> 00:43:24,240
It is an incredibly powerful and
well tested decision making
862
00:43:24,240 --> 00:43:26,000
system.
And it is the system that allows
863
00:43:26,000 --> 00:43:29,200
you to drive automatically to
your kids' school without
864
00:43:29,200 --> 00:43:32,080
thinking about it.
Yeah, your lizard brain.
865
00:43:32,080 --> 00:43:33,200
Exactly.
That's how we say it.
866
00:43:33,200 --> 00:43:34,640
Exactly.
That's your, that's your lizard
867
00:43:34,640 --> 00:43:37,280
brain.
And, you know, and, and, and
868
00:43:37,280 --> 00:43:39,200
we're embarrassed typically of
our, of our lizard brain.
869
00:43:39,200 --> 00:43:41,200
But the truth is our lizard
brain got us where we are
870
00:43:41,200 --> 00:43:45,520
because when you're
using your lizard brain,
871
00:43:45,520 --> 00:43:46,880
you don't have to think things
through.
872
00:43:46,880 --> 00:43:49,080
You don't have to, you know, if
a snake comes into the room and
873
00:43:49,080 --> 00:43:52,240
we're all sitting together, you
don't go, oh, what kind of snake
874
00:43:52,240 --> 00:43:54,640
is that?
You know, you, you freak out,
875
00:43:54,640 --> 00:43:57,480
stand up, everybody else in the
room freaks out and stands up
876
00:43:57,480 --> 00:43:59,840
and you're out.
The room and the tribe is saved,
877
00:43:59,840 --> 00:44:01,960
right?
It's the automatic ways we make
878
00:44:01,960 --> 00:44:03,800
choices, not just individually,
but as a group.
879
00:44:03,800 --> 00:44:07,920
We transmit. I show you a
mask of horror
880
00:44:07,920 --> 00:44:09,920
because the room's on fire.
You don't need to see that the
881
00:44:09,920 --> 00:44:12,120
room's on fire.
You just run.
882
00:44:12,120 --> 00:44:14,160
You know, Cedric the
Entertainer has a great line
883
00:44:14,160 --> 00:44:16,880
about, well, he's like, when I
see black people run, you don't
884
00:44:16,880 --> 00:44:19,280
ask why, you just start running.
He's like, you can't help but
885
00:44:19,280 --> 00:44:21,880
run, you know, and, and, and
that is exactly it.
886
00:44:21,880 --> 00:44:23,920
Like that's what, what, that's
how our system works.
887
00:44:24,320 --> 00:44:27,680
So that's all the lizard brain
stuff, our slow thinking brain,
888
00:44:27,680 --> 00:44:30,600
our — what's called
our System 2, what
889
00:44:30,960 --> 00:44:32,360
Daniel Kahneman called our
System 2.
890
00:44:32,720 --> 00:44:38,080
That's the cautious, creative,
rational, kind of better, more
891
00:44:38,080 --> 00:44:43,320
human decision making system.
And the idea is that like
892
00:44:43,320 --> 00:44:46,920
100,000 years ago, that's the
system that caused, you know,
893
00:44:46,920 --> 00:44:49,040
back when everyone was living on
the continent of what is now
894
00:44:49,040 --> 00:44:51,760
Africa — somebody, you know,
stood up and said,
895
00:44:51,760 --> 00:44:54,400
what's over there?
And that's that part of the
896
00:44:54,400 --> 00:44:57,360
brain that goes, oh, what else
is beyond this campfire?
897
00:44:57,400 --> 00:44:59,840
You know, I wonder what would
happen if we didn't just think
898
00:44:59,840 --> 00:45:01,920
about our survival, like what
happens after we die?
899
00:45:01,920 --> 00:45:03,480
And why did I have that weird
dream?
900
00:45:03,480 --> 00:45:08,080
You know, those thoughts are a
totally new decision making
901
00:45:08,080 --> 00:45:11,200
system.
And that system is super glitchy
902
00:45:11,200 --> 00:45:14,720
and you know, because it's
untested, it's like a brand new
903
00:45:14,720 --> 00:45:17,000
piece of software in in
evolutionary terms.
904
00:45:17,600 --> 00:45:22,160
So the estimate is that 90%
of your choices are made by the
905
00:45:22,160 --> 00:45:24,640
lizard brain.
And this little group of choices
906
00:45:24,640 --> 00:45:26,640
is made by your slow thinking
brain.
907
00:45:27,360 --> 00:45:32,360
And my thesis in the book is if
you're a company that's created
908
00:45:32,360 --> 00:45:36,320
a pattern recognition system
that can tell you what people
909
00:45:36,320 --> 00:45:40,000
are going to do in the future,
can forecast their behavior
910
00:45:40,000 --> 00:45:42,400
and even maybe shape their
behavior because you want to try
911
00:45:42,400 --> 00:45:45,720
and make money off of them.
Who are you going to try and
912
00:45:45,720 --> 00:45:47,640
sell to?
Do you want to sell to the
913
00:45:47,640 --> 00:45:51,360
cautious, creative, rational
part of the brain that thinks
914
00:45:51,360 --> 00:45:54,320
things through right and makes
good choices?
915
00:45:54,840 --> 00:45:57,480
Or do you want to sell to the
one that can't help but look at
916
00:45:57,480 --> 00:46:01,000
the cleavage picture, right?
That can't help but drive to his
917
00:46:01,000 --> 00:46:06,040
kids' school by accident, right?
The automatic brain that, you
918
00:46:06,040 --> 00:46:09,280
know, was like where my
alcoholism came from.
919
00:46:09,280 --> 00:46:12,200
You know, that they would much
rather sell booze to that guy
920
00:46:12,720 --> 00:46:15,720
than they would to the guy that
I am today who said, you know
921
00:46:15,720 --> 00:46:17,080
what?
I got to actually stop drinking
922
00:46:17,080 --> 00:46:18,200
and I'm I'm going to quit
drinking.
923
00:46:18,880 --> 00:46:23,160
So to my mind, a system like
this absolutely could create
924
00:46:23,360 --> 00:46:26,960
huge amounts of choice.
We would love that, but I think
925
00:46:26,960 --> 00:46:29,960
when you overlay the need to
make money off these systems
926
00:46:29,960 --> 00:46:32,600
onto it, they're not going to.
Why would they?
927
00:46:32,600 --> 00:46:35,080
Why would they try and make us
more like these people?
928
00:46:35,080 --> 00:46:36,960
They're not going to want to make
us, you know, more like our
929
00:46:36,960 --> 00:46:39,200
rational selves.
Why wouldn't they want to make
930
00:46:39,200 --> 00:46:42,360
us more and more instinctive?
Because the instincts are so
931
00:46:42,360 --> 00:46:44,760
easy to predict.
Do you think that they've been
932
00:46:44,760 --> 00:46:47,080
working towards that already
with just the way that the
933
00:46:47,080 --> 00:46:49,960
algorithms work, you know,
TikTok brain and all that and
934
00:46:50,440 --> 00:46:53,840
shortening of attention spans?
Totally and I want to say like I
935
00:46:53,840 --> 00:46:57,160
described this like I'm, you
know, I'm not judgy about this
936
00:46:57,400 --> 00:47:01,520
because I am the worst person I
know around this stuff like I am
937
00:47:01,520 --> 00:47:03,760
super manipulatable around this
stuff.
938
00:47:03,760 --> 00:47:06,760
You know, I'm the one who
scrolls TikTok and eventually
939
00:47:06,760 --> 00:47:10,040
the woman comes on with a TikTok
branded video that says you
940
00:47:10,040 --> 00:47:12,680
should go to bed now.
You've had enough,
941
00:47:12,760 --> 00:47:15,200
you know, because like when the
bartender says you've had
942
00:47:15,200 --> 00:47:17,720
enough, you know, like when the
crack dealer says you've had
943
00:47:17,720 --> 00:47:20,480
enough, you know that you were
addicted to crack and he knows
944
00:47:20,480 --> 00:47:23,480
you're coming back tomorrow.
You know, like I, so I, I very
945
00:47:23,480 --> 00:47:25,800
much write this stuff and think
about this stuff not from a
946
00:47:25,800 --> 00:47:27,840
perspective of like, hey,
everybody, you got to be
947
00:47:27,840 --> 00:47:29,640
smarter.
I'm, I write it from the place
948
00:47:29,640 --> 00:47:34,040
of like, I am not able to resist
any of this because it's so
949
00:47:34,040 --> 00:47:38,120
powerful, you know, and I just
think like we have to what I
950
00:47:38,120 --> 00:47:40,400
want everyone to do and I and
what I like is that young people
951
00:47:40,400 --> 00:47:42,080
are better at this.
Like you, you know, Jeff, you
952
00:47:42,080 --> 00:47:44,720
said TikTok brain, you know,
like the fact that we're naming
953
00:47:44,720 --> 00:47:47,200
that in ourselves, right, Jake,
you said lizard brain like that.
954
00:47:47,200 --> 00:47:50,440
We're naming that in ourselves.
I just love that we're entering
955
00:47:50,440 --> 00:47:52,200
the world.
This is where I was where my
956
00:47:52,200 --> 00:47:55,240
hope for the future comes from
is like young people talk very
957
00:47:55,240 --> 00:47:57,040
openly about like, oh, brain
rot.
958
00:47:57,120 --> 00:47:59,160
Oh, I got into brain rot.
You know, like, like they're,
959
00:47:59,160 --> 00:48:02,720
they're conscious of how their
brains fall for this stuff.
960
00:48:03,040 --> 00:48:06,560
And I think the more that we can
articulate that, the more we're
961
00:48:06,560 --> 00:48:08,840
going to get to a place where
not only are we going to be able
962
00:48:08,840 --> 00:48:12,480
to say, you know, I got to make
better choices for myself, which
963
00:48:12,480 --> 00:48:14,480
is part of the solution, but I
don't think the main part.
964
00:48:14,880 --> 00:48:16,880
I think then you're going to get
to a world where you're going to
965
00:48:16,880 --> 00:48:20,080
start being able to sue on the
basis of you're going to be able
966
00:48:20,080 --> 00:48:24,840
to sue companies for taking
advantage of your instincts in a
967
00:48:24,840 --> 00:48:27,280
way that you would never choose
consciously.
968
00:48:27,560 --> 00:48:31,080
That, to my mind, is where
the path out of this is probably
969
00:48:31,080 --> 00:48:34,640
going to come from.
I got some crazy Allegory of the
970
00:48:34,640 --> 00:48:37,800
Cave vibes from the
beginning of that
971
00:48:37,800 --> 00:48:41,280
explanation.
Plato's, what was it?
972
00:48:41,280 --> 00:48:43,880
The Republic.
OK, tell me about that.
973
00:48:43,880 --> 00:48:45,400
You —
you're better read than I
974
00:48:45,400 --> 00:48:46,640
am.
Tell me — I don't know my
975
00:48:46,640 --> 00:48:50,120
Republic.
It's just a small portion
976
00:48:50,720 --> 00:48:53,720
of his work called The Republic.
Prisoners are locked inside a
977
00:48:53,720 --> 00:48:57,520
cave, and they've been in there
for so long that they're only
978
00:48:57,520 --> 00:49:01,160
allowed to face one wall and see
the flames from the fire that's
979
00:49:01,160 --> 00:49:04,280
flickering on the wall.
They don't know how the fire
980
00:49:04,280 --> 00:49:08,400
gets fed or anything, they just
know that it does. One prisoner
981
00:49:09,200 --> 00:49:13,200
frees himself, somehow breaks
out of The Cave, learns that
982
00:49:13,200 --> 00:49:17,720
there's an entire world outside
of The Cave, comes in, tells
983
00:49:17,720 --> 00:49:20,360
them about it and nobody
believes them because their
984
00:49:20,360 --> 00:49:23,840
whole life, basically since
birth, they were prisoners
985
00:49:23,840 --> 00:49:27,080
inside of this cave.
And all they know is the
986
00:49:27,080 --> 00:49:30,040
shadows cast on the walls caused
by the fire.
987
00:49:30,080 --> 00:49:32,880
So like when you were talking
about like Jay made the the
988
00:49:32,880 --> 00:49:35,600
reference to lizard brain, then
you had talked about there's a
989
00:49:35,720 --> 00:49:37,600
small part of your brain where
you stand up and you say, well,
990
00:49:37,600 --> 00:49:39,840
what's over there?
I know a lot of people, some
991
00:49:39,840 --> 00:49:42,840
people, right?
I'm not going to use figures,
992
00:49:43,080 --> 00:49:46,240
but there's a lot of people —
I'm trying to walk a thin
993
00:49:46,240 --> 00:49:48,360
line here.
A lot of people would be like,
994
00:49:48,800 --> 00:49:51,240
don't be an idiot.
Don't look over there, but
995
00:49:51,240 --> 00:49:52,440
because there's nothing there,
right?
996
00:49:52,440 --> 00:49:54,800
Our whole world is right here in
this cave, and
997
00:49:55,000 --> 00:49:57,520
everything we know is being cast
in shadows on the wall.
998
00:49:59,000 --> 00:50:01,800
But then you have those people
that are willing to to look
999
00:50:01,800 --> 00:50:05,040
outside and say, no, there's a
whole other world out here.
1000
00:50:05,520 --> 00:50:07,800
But then eventually the people
will make it outside and
1001
00:50:07,800 --> 00:50:11,040
then everyone's so depressed —
and no, that actually, that's
1002
00:50:11,040 --> 00:50:13,600
a great segue into something
else that I wanted to talk to
1003
00:50:13,600 --> 00:50:15,640
you about.
Sorry, I'm kind of just jumping
1004
00:50:15,640 --> 00:50:18,400
topics here really, really
quick, but it was a perfect
1005
00:50:18,400 --> 00:50:21,520
segue.
So you had made a point earlier
1006
00:50:21,520 --> 00:50:24,760
about how we're going to have a
ton of free time or that that's
1007
00:50:24,960 --> 00:50:26,840
kind of the the route.
That's what they promised
1008
00:50:26,840 --> 00:50:28,280
anyway.
That's what they say that we
1009
00:50:28,280 --> 00:50:31,520
will have.
Do you have any fears of?
1010
00:50:31,560 --> 00:50:34,320
I don't even know if there's a
term for it, but the the
1011
00:50:34,320 --> 00:50:37,360
depression that people get when
they don't have something to put
1012
00:50:37,360 --> 00:50:39,400
their efforts towards, you see
it a lot with.
1013
00:50:39,840 --> 00:50:43,000
Oh yeah, recent retirees.
Or — oh yeah, like Jake
1014
00:50:43,000 --> 00:50:45,200
and I, we, we were in the Navy
together.
1015
00:50:45,480 --> 00:50:47,640
And I mean, I went through it,
right.
1016
00:50:47,640 --> 00:50:50,560
I got, I got medically — I got
forcibly medically retired out of
1017
00:50:50,560 --> 00:50:53,680
the Navy, and I went through a
massive depressive cycle.
1018
00:50:54,560 --> 00:50:57,920
Oh dude, I had a total ego death
and now I just spend every
1019
00:50:57,920 --> 00:50:59,880
second that I have doing
something.
1020
00:51:00,680 --> 00:51:02,920
Yeah, and, you know, I'm
constantly busy.
1021
00:51:03,320 --> 00:51:06,480
My wife and I, we, I wouldn't
say we argue about it, but we,
1022
00:51:06,520 --> 00:51:08,960
we talk about it all the time.
You know, she's like you, you
1023
00:51:09,120 --> 00:51:11,120
always have to be doing
something and it's like, you
1024
00:51:11,120 --> 00:51:14,320
have no idea.
Like if my mind's not being
1025
00:51:14,320 --> 00:51:18,000
exercised by a task, whether
it's at work or the podcast or
1026
00:51:18,080 --> 00:51:21,440
hanging out with my kids or
something, I either go crazy or I
1027
00:51:21,440 --> 00:51:24,360
fall asleep, right?
These guys give me crap all the
1028
00:51:24,360 --> 00:51:26,480
time because I don't watch
movies.
1029
00:51:26,480 --> 00:51:30,040
I fall asleep to them.
And I didn't understand 90% of
1030
00:51:30,040 --> 00:51:31,880
the references that were made
today already.
1031
00:51:32,320 --> 00:51:33,960
But so that's that's why I like
books.
1032
00:51:33,960 --> 00:51:36,240
I get engagement from books
because I can physically do
1033
00:51:36,240 --> 00:51:37,000
something.
I love that.
1034
00:51:37,480 --> 00:51:40,280
Right, right, right.
But are you worried?
1035
00:51:40,680 --> 00:51:43,520
Oh, man, yes, I, I mean, I, so I
love this.
1036
00:51:43,520 --> 00:51:46,800
I think you're, you're touching
on something so important, so
1037
00:51:47,120 --> 00:51:49,840
purpose.
You know, there's a whole world
1038
00:51:50,200 --> 00:51:56,440
of research going on right now
around purpose and how important
1039
00:51:56,440 --> 00:52:00,920
that is to human satisfaction
because a bunch of these
1040
00:52:00,920 --> 00:52:05,240
researchers are recognizing that
in the future, if, you know,
1041
00:52:05,240 --> 00:52:09,800
people are are going to suddenly
not be working, how will they
1042
00:52:09,800 --> 00:52:12,160
function, right?
I mean, this is a whole thing.
1043
00:52:12,160 --> 00:52:15,240
It's not just about the money it
brings in.
1044
00:52:15,240 --> 00:52:17,920
It's the identity and the value
you feel.
1045
00:52:18,120 --> 00:52:24,160
You know, people really like
purpose and I really worry about
1046
00:52:24,200 --> 00:52:28,040
a world in which, you know, we
have established the
1047
00:52:28,040 --> 00:52:33,360
idea that the value of a
person has to do with their
1048
00:52:33,360 --> 00:52:36,920
production, their productivity.
How much money do they generate?
1049
00:52:36,920 --> 00:52:40,320
How much value do they generate,
you know, is measured in
1050
00:52:40,320 --> 00:52:43,360
monetary terms right now.
And if we're about to enter a
1051
00:52:43,360 --> 00:52:46,960
world in which the AI takes care
of all of that, well, then what
1052
00:52:46,960 --> 00:52:49,640
is the value of that person?
There was a whole, I was with a,
1053
00:52:50,040 --> 00:52:51,560
there was a think tank that I
went to once.
1054
00:52:51,560 --> 00:52:54,840
All they were trying to
think about was how do we come
1055
00:52:54,840 --> 00:52:59,680
up with a new way of
describing the value of people
1056
00:53:00,000 --> 00:53:05,040
beyond the money they generate?
And to my mind, you know, so I
1057
00:53:05,040 --> 00:53:07,200
think that it's one of the big
problems with what you're
1058
00:53:07,200 --> 00:53:08,280
describing.
And it's one of the big
1059
00:53:08,280 --> 00:53:10,720
problems, I think with people
who say, oh, we're not going to
1060
00:53:10,720 --> 00:53:12,840
have to work as hard in the
future or we're not going to
1061
00:53:12,840 --> 00:53:14,240
have to work at all in the
future.
1062
00:53:14,480 --> 00:53:19,240
That feels like a real like high
school freshman's idea of what,
1063
00:53:19,960 --> 00:53:22,640
you know, a sort of perfect
world is supposed to be.
1064
00:53:22,640 --> 00:53:25,360
I think we're going to need work
and purpose.
1065
00:53:25,360 --> 00:53:29,960
I think that's a huge amount —
a huge part of what has
1066
00:53:29,960 --> 00:53:32,080
been satisfying about
humanity.
1067
00:53:32,080 --> 00:53:33,760
I mean, you know, we were
talking about this primitive
1068
00:53:33,760 --> 00:53:36,440
brain, you know, lizard brain
and our higher functions, the
1069
00:53:36,440 --> 00:53:38,160
higher functions.
One of the examples I always
1070
00:53:38,160 --> 00:53:41,000
give around higher functions and
why we're it's so I'm so proud
1071
00:53:41,000 --> 00:53:45,040
to be a modern human is all the
stuff we've built, you know, the
1072
00:53:45,040 --> 00:53:48,320
highways and the cathedrals and
the bridges, like it's
1073
00:53:48,320 --> 00:53:51,920
incredible and, and the pride
people take in having done that
1074
00:53:51,920 --> 00:53:53,240
stuff.
So I just think we're going to
1075
00:53:53,240 --> 00:53:56,480
need, we're absolutely going to
need that in the future.
1076
00:53:56,480 --> 00:53:58,880
And if we don't have it — because
we think somehow people are
1077
00:53:58,880 --> 00:54:02,200
going to be happier if they
don't have to do any work —
1078
00:54:02,640 --> 00:54:05,320
that, I think, misunderstands how our
economics work, as you know,
1079
00:54:05,320 --> 00:54:09,320
Jeff, and misunderstands how
people find happiness in life.
1080
00:54:09,760 --> 00:54:12,120
I'm surprised that Jeff hasn't
brought it up.
1081
00:54:13,360 --> 00:54:19,360
I mean, it seems to me that it's
the promises of a more relaxed
1082
00:54:19,360 --> 00:54:23,600
society, but people are so
forgetful that — what was it,
1083
00:54:23,600 --> 00:54:27,640
four or five years ago, everyone
was locked inside their homes.
1084
00:54:27,920 --> 00:54:30,880
They couldn't go to work.
They couldn't do anything,
1085
00:54:31,080 --> 00:54:34,720
anything of enjoyment. And you
do get a tremendous amount
1086
00:54:34,720 --> 00:54:37,280
of satisfaction from work in
general, you know, whether
1087
00:54:37,280 --> 00:54:40,320
that's working with your hands
or actually doing a job, right,
1088
00:54:40,320 --> 00:54:43,880
whatever it might be.
But, you know, those sorts of
1089
00:54:43,880 --> 00:54:47,840
things decreased and our mental
health issues spiked through the
1090
00:54:47,840 --> 00:54:50,800
roof, right?
And so maybe, you know, the
1091
00:54:51,000 --> 00:54:55,840
whole grandiose idea of like,
oh, you know, less, less time at
1092
00:54:55,840 --> 00:54:58,960
work and more time for yourself
and all this sort of stuff, but
1093
00:54:59,000 --> 00:55:03,280
with the knowledge that it's
going to lead to insane mental
1094
00:55:03,280 --> 00:55:05,760
health crises.
And why — why not?
1095
00:55:05,760 --> 00:55:08,000
Right.
Look, you know, who would want
1096
00:55:08,000 --> 00:55:10,120
that?
Well, you know, I don't know.
1097
00:55:10,120 --> 00:55:11,280
But why?
You know?
1098
00:55:13,040 --> 00:55:17,040
This is why, This is why it
bothers me so much how casually
1099
00:55:17,400 --> 00:55:19,680
you have business leaders right
now saying, oh, we're just going
1100
00:55:19,680 --> 00:55:23,080
to not need people, you know,
like — the damage that
1101
00:55:23,080 --> 00:55:25,640
that will do to society if they
really lay off people the way
1102
00:55:25,640 --> 00:55:28,240
they seem to want to.
Whereas at the same time, like I
1103
00:55:28,240 --> 00:55:30,680
was mentioning earlier, this guy
who says he could do away with
1104
00:55:30,680 --> 00:55:33,160
lead poisoning and kids if you
gave him an AI system.
1105
00:55:33,360 --> 00:55:36,600
He also has a program, I think
it's for the state of New
1106
00:55:36,600 --> 00:55:37,600
Hampshire.
I can't remember.
1107
00:55:37,600 --> 00:55:42,120
But basically he's got
a system that takes new
1108
00:55:42,120 --> 00:55:45,240
arrivals in the state of New
Hampshire, people who have just
1109
00:55:45,240 --> 00:55:48,800
arrived there, whether they are
immigrants or moved there from
1110
00:55:48,800 --> 00:55:51,800
another state.
And once they register, get a
1111
00:55:51,800 --> 00:55:57,000
driver's license or whatever,
this system pushes to them: Hey,
1112
00:55:57,320 --> 00:55:58,520
would you like to be a bus
driver?
1113
00:55:58,880 --> 00:56:00,760
Would you like to be, you know,
a sanitation worker?
1114
00:56:00,760 --> 00:56:03,280
What about this, you know,
because they're all these jobs
1115
00:56:03,280 --> 00:56:05,320
that the state of New Hampshire
needs filled that they can't
1116
00:56:05,320 --> 00:56:08,040
fill — snowplow drivers and all
this stuff, right?
1117
00:56:08,320 --> 00:56:13,600
To my mind, like stop talking
about getting rid of paralegals
1118
00:56:13,600 --> 00:56:15,920
and entry level bookkeeping and
all that stuff.
1119
00:56:15,920 --> 00:56:17,480
Don't do that.
Don't wipe that out.
1120
00:56:17,480 --> 00:56:20,280
We're going to need those jobs.
Instead, use AI to find people
1121
00:56:20,280 --> 00:56:23,240
work, you know, use AI to pair
people with the services they
1122
00:56:23,240 --> 00:56:25,240
need.
That's what we're going to need
1123
00:56:25,320 --> 00:56:29,160
much more than — you know, I
feel like the idea that
1124
00:56:29,160 --> 00:56:32,480
we need to be more efficient
feels out of whack.
1125
00:56:32,520 --> 00:56:35,040
That doesn't — I think we've
gotten to a place where the top people in
1126
00:56:35,040 --> 00:56:39,400
America are making enough money.
They, I think they should take a
1127
00:56:39,400 --> 00:56:42,400
little break on the money making
and let's let's get some people
1128
00:56:42,400 --> 00:56:43,840
some help.
Let's use the assistance for
1129
00:56:43,840 --> 00:56:46,320
that instead.
There's definitely a concern for
1130
00:56:46,320 --> 00:56:48,600
a lot of these jobs.
I feel like, you know, going
1131
00:56:48,600 --> 00:56:52,120
away with automation and AI and
stuff, but I still, you know,
1132
00:56:52,120 --> 00:56:54,560
I'm, I'm like a, I'm a blue
collar worker, right?
1133
00:56:54,560 --> 00:56:58,600
Like, I go out and I do tasks
that, I don't see —
1134
00:56:58,600 --> 00:57:00,000
I think about this all the time,
right?
1135
00:57:00,000 --> 00:57:04,760
I don't see how an Elon robot
can do half of the jobs that
1136
00:57:04,760 --> 00:57:07,360
I've done in my life because
there's certain things that you
1137
00:57:07,360 --> 00:57:11,400
just couldn't program into, you
know, to do some of these tasks.
1138
00:57:11,680 --> 00:57:14,640
Now, when you're talking about,
you know, a lot of jobs, sure,
1139
00:57:14,640 --> 00:57:17,120
right.
Artists, even musicians, like
1140
00:57:17,120 --> 00:57:20,080
that's going away with some of
the new AI writers, you know,
1141
00:57:21,960 --> 00:57:25,480
customer service, like a lot of
these things, retail, delivery
1142
00:57:25,480 --> 00:57:28,040
services, taxis, like, yeah, all
that could go away.
1143
00:57:28,040 --> 00:57:30,240
Waste management, like that can
all be automated.
1144
00:57:30,240 --> 00:57:33,200
But when you start talking
about, you know, like service
1145
00:57:33,480 --> 00:57:36,240
industry, like blue collar
service industry jobs, I don't
1146
00:57:36,240 --> 00:57:39,560
see how like, a robot's going to
be able to do 90% of those jobs.
1147
00:57:39,560 --> 00:57:43,080
So it could be a situation where
a lot of these people do lose
1148
00:57:43,080 --> 00:57:47,640
work, but there's so much that's
going to be needed in order to
1149
00:57:47,640 --> 00:57:51,480
like maintain the infrastructure
of society that people just
1150
00:57:51,480 --> 00:57:53,080
don't want to do, right?
Because it's not air
1151
00:57:53,080 --> 00:57:55,840
conditioned, it's not
comfortable, it's hard work.
1152
00:57:55,840 --> 00:57:58,440
So sure, you might have like
this this problem, but again,
1153
00:57:58,440 --> 00:58:00,560
kind of looking out into the
future a little bit, it may
1154
00:58:00,560 --> 00:58:03,280
become a thing where it's like
we don't actually have more free
1155
00:58:03,280 --> 00:58:04,880
time.
We just have more people doing
1156
00:58:04,880 --> 00:58:07,600
things to upkeep the
infrastructure that robots can't
1157
00:58:07,600 --> 00:58:09,480
do.
Yeah, I hope that that is true,
1158
00:58:09,480 --> 00:58:13,920
that there is, that there can
somehow be a move toward, let's
1159
00:58:13,920 --> 00:58:17,520
say, you know, enough people in
the trades
1160
00:58:17,520 --> 00:58:19,000
that it could sort of balance
out.
1161
00:58:19,000 --> 00:58:21,760
But one, one problem I have
heard a great deal about —
1162
00:58:22,040 --> 00:58:23,800
I was just talking to somebody the
1163
00:58:23,800 --> 00:58:26,920
other day who's doing a bunch of
research right now around the
1164
00:58:26,920 --> 00:58:32,880
thesis that AI could conceivably
do to women who never go to
1165
00:58:32,880 --> 00:58:38,720
college what foreign offshoring
did to men who never went to
1166
00:58:38,720 --> 00:58:41,280
college back in the '80s and
early 90s.
1167
00:58:41,720 --> 00:58:45,920
Because for a huge number of
women who don't have a college
1168
00:58:45,920 --> 00:58:50,440
education, there have been a
whole set of knowledge jobs,
1169
00:58:50,440 --> 00:58:55,920
bookkeeping, clerical admin,
right, that can lead to a really
1170
00:58:55,920 --> 00:59:00,440
good, reliable paycheck for a
long time and can lead to, you
1171
00:59:00,440 --> 00:59:03,440
know, a real career that then
winds up even leading to a
1172
00:59:03,440 --> 00:59:06,960
really stable retirement.
And these are the jobs that
1173
00:59:06,960 --> 00:59:09,360
these companies are talking
about wiping out completely.
1174
00:59:09,880 --> 00:59:12,600
And so that's something I really
worry about.
1175
00:59:12,600 --> 00:59:14,920
I think you're right that like
there could be a push toward the
1176
00:59:14,920 --> 00:59:16,120
trade.
You know, I was just talking to
1177
00:59:16,120 --> 00:59:19,080
a guy the other day who was
really lamenting that his
1178
00:59:19,080 --> 00:59:22,160
daughter doesn't want to become
a lawyer or an engineer and she
1179
00:59:22,160 --> 00:59:23,760
wants to be an artist.
And I was saying to him, like, I
1180
00:59:23,760 --> 00:59:25,360
don't know, man, have you seen
what's going on with lawyers and
1181
00:59:25,360 --> 00:59:27,120
engineers?
Like, they are not — they're
1182
00:59:27,120 --> 00:59:29,400
getting fired left and right
because the entry level versions
1183
00:59:29,400 --> 00:59:33,320
of those jobs are going away.
But, but a truly like
1184
00:59:33,320 --> 00:59:36,960
disciplined, creative person
that could actually be a, you
1185
00:59:36,960 --> 00:59:39,240
know, a valuable skill in a
whole new way.
1186
00:59:39,360 --> 00:59:42,360
But I'm always, like — I'm always
collecting examples, because,
1187
00:59:42,400 --> 00:59:44,720
you know, I worry about my own
ability to make money in the
1188
00:59:44,720 --> 00:59:46,040
future.
Like, I really don't know, you
1189
00:59:46,040 --> 00:59:47,880
know, and I, and I'm too old to
go become a journeyman
1190
00:59:47,880 --> 00:59:50,320
apprentice electrician.
So I can't, you know, do that.
1191
00:59:50,320 --> 00:59:52,480
And there's a lot of people in
my position who only have like
1192
00:59:52,680 --> 00:59:55,800
15 years left in their working
lives who aren't going to be
1193
00:59:55,800 --> 00:59:57,800
able to, to make a transition
like that.
1194
00:59:57,800 --> 00:59:59,400
So there's a real, you know,
real trouble there.
1195
00:59:59,480 --> 01:00:02,000
But I'm always keeping my eyes
open for like, like, what is a
1196
01:00:02,000 --> 01:00:06,440
gig that is AI-proof?
I met a guy the other day at the
1197
01:00:06,440 --> 01:00:09,080
airport and, and we're sitting
waiting for a flight together.
1198
01:00:09,080 --> 01:00:09,960
And I was like, what are you
doing?
1199
01:00:09,960 --> 01:00:12,760
He was like, I own a chain of
barber shops. And I was like,
1200
01:00:12,920 --> 01:00:15,800
yes.
That's fantastic.
1201
01:00:16,040 --> 01:00:18,520
Tell me about that.
You know, like, like so I, I
1202
01:00:18,520 --> 01:00:22,560
agree with you, Jeff, I, I think
there will be a lot of jobs that
1203
01:00:22,640 --> 01:00:25,280
won't get replaced, at least not
in the short term because it
1204
01:00:25,280 --> 01:00:27,840
just doesn't make enough money.
Like it wouldn't be, it wouldn't
1205
01:00:27,840 --> 01:00:29,400
be a cost savings to automate
that work.
1206
01:00:29,600 --> 01:00:33,560
But you know, if you look at
the, at the original filings of
1207
01:00:33,560 --> 01:00:38,040
Uber — the company's
original stock filings all say
1208
01:00:38,040 --> 01:00:40,640
from the very beginning, we're
going to automate this work as
1209
01:00:40,640 --> 01:00:44,120
quick as we possibly can, right?
The driving Uber is going to be
1210
01:00:44,120 --> 01:00:47,280
absolutely a robot's job in a
few years, because that's the,
1211
01:00:47,320 --> 01:00:49,440
because they can do it more
cheaply that way.
1212
01:00:49,440 --> 01:00:51,600
And so as long as they can do it
more cheaply by automating it,
1213
01:00:51,600 --> 01:00:54,240
they will.
And I, and I think it and, and,
1214
01:00:54,400 --> 01:00:56,640
and this is why I think — one of
the ways, you know, where the
1215
01:00:56,640 --> 01:00:59,000
subtitle of my book is How It's
Creating a World Without Choices
1216
01:00:59,000 --> 01:01:01,560
and How to Fight Back.
We're seeing around the world,
1217
01:01:01,560 --> 01:01:04,600
people are fighting back.
So AI sorry, in India, they have
1218
01:01:04,600 --> 01:01:08,080
outlawed self driving cars
because a huge number of people
1219
01:01:08,080 --> 01:01:09,560
in India make their living as
drivers.
1220
01:01:09,880 --> 01:01:13,000
They have said it's illegal to
have robots do that
1221
01:01:13,240 --> 01:01:15,880
because they know that it'll
just put so many people out of
1222
01:01:15,880 --> 01:01:17,760
work.
And, you know, we may have to
1223
01:01:17,760 --> 01:01:19,520
make that kind of choice here in
the United States, I think.
1224
01:01:20,000 --> 01:01:22,920
Yeah, for sure — get into the
trades, people.
1225
01:01:23,920 --> 01:01:26,520
You ain't going to catch no Elon
robot doing underwater welding.
1226
01:01:26,520 --> 01:01:29,080
Not for a long time, you know.
That's funny.
1227
01:01:29,080 --> 01:01:30,880
I bet.
I bet if you had Elon on, he'd
1228
01:01:30,880 --> 01:01:33,480
be like, oh, I totally want to
make a robot that does that.
1229
01:01:34,440 --> 01:01:36,320
It's funny too, because I'm
sitting here thinking about it
1230
01:01:36,320 --> 01:01:40,880
from my job because I do
avionics work on airplanes.
1231
01:01:41,400 --> 01:01:43,480
Yeah.
Certainly there wouldn't be a
1232
01:01:43,480 --> 01:01:46,000
robot that would, you know, take
my job.
1233
01:01:46,000 --> 01:01:51,320
However, they could use AI to
figure out, you know, why are
1234
01:01:52,200 --> 01:01:56,000
half the lights not working on
this, you know, in this cabin of
1235
01:01:56,000 --> 01:01:59,520
this aircraft and completely
eliminate the need to pay
1236
01:01:59,520 --> 01:02:02,200
someone who specializes in
avionics, right?
1237
01:02:02,200 --> 01:02:04,040
Who has experience.
That's exactly right.
1238
01:02:04,040 --> 01:02:05,480
So I mean, this has, this is,
that's right.
1239
01:02:05,600 --> 01:02:08,320
That's right.
We've seen — yeah, they always
1240
01:02:08,320 --> 01:02:10,680
threw a ludicrous number at me
because they're like, yeah,
1241
01:02:10,680 --> 01:02:13,360
well, you got, you know, 10
years of experience working
1242
01:02:13,800 --> 01:02:16,000
on avionics and that's a
specialty role.
1243
01:02:16,000 --> 01:02:19,040
But if you could get a, you
know, a computer program to just
1244
01:02:19,040 --> 01:02:23,000
find what's probably the
solution to fix this, then you
1245
01:02:23,000 --> 01:02:25,440
could have any monkey with a
wrench, you know, figure it out
1246
01:02:25,440 --> 01:02:27,640
and do it right, So.
And like, out of the little
1247
01:02:27,640 --> 01:02:31,680
drawer comes the exact piece
that you need.
1248
01:02:31,680 --> 01:02:33,720
And just put it here, turn it
three times.
1249
01:02:33,720 --> 01:02:36,360
OK, take lunch.
Yeah, that's right.
1250
01:02:36,480 --> 01:02:37,800
That's right.
That's right.
1251
01:02:38,360 --> 01:02:41,800
And we'll see.
Maybe so again, like, yeah,
1252
01:02:41,800 --> 01:02:44,960
right.
The value of people has got to
1253
01:02:44,960 --> 01:02:48,080
start getting calculated in a,
you know, form other than just
1254
01:02:48,080 --> 01:02:49,400
how much money they can bring
in.
1255
01:02:49,440 --> 01:02:53,520
We're going to need to start
defending purpose and human
1256
01:02:53,520 --> 01:02:57,360
satisfaction, I think in some in
some new way.
1257
01:02:57,480 --> 01:02:59,440
And we just don't have a lot of
history of that, you know?
1258
01:02:59,440 --> 01:03:03,400
Yeah.
Well, ready for a wild question?
1259
01:03:04,040 --> 01:03:08,360
Yeah, let's go.
Do you, I don't know if you've,
1260
01:03:08,440 --> 01:03:12,400
you've dove into this at all,
but do you think that there's any
1261
01:03:13,280 --> 01:03:21,320
connection between AI and, and
the recent massive increase of
1262
01:03:21,440 --> 01:03:27,520
UFO stuff that's going on right
now, whether it be
1263
01:03:28,920 --> 01:03:31,720
through mainstream media,
through independent outlets like
1264
01:03:31,720 --> 01:03:35,840
us or anything and everything.
That's really interesting.
1265
01:03:35,840 --> 01:03:38,320
I don't know if I believe so.
I'll just say I don't know
1266
01:03:38,320 --> 01:03:42,240
anything about it, fundamentally —
like, I hadn't, I didn't
1267
01:03:42,240 --> 01:03:43,440
know about the uptick in
reports.
1268
01:03:43,440 --> 01:03:50,000
That's interesting.
I, I think that there is a new,
1269
01:03:50,000 --> 01:03:53,720
we're in a new information
ecosystem where every single
1270
01:03:53,720 --> 01:03:59,920
person is kind of a self
appointed watchdog for weird
1271
01:03:59,920 --> 01:04:03,720
stuff.
And that is, I think in a lot of
1272
01:04:03,720 --> 01:04:05,520
cases good.
There are a lot of good things
1273
01:04:05,520 --> 01:04:11,080
that have come out of that. And
there are also places in
1274
01:04:11,080 --> 01:04:17,000
which that creates just an
incredible, you know, ecosystem
1275
01:04:17,000 --> 01:04:20,320
for dangerous
conspiracy theorists that,
1276
01:04:20,360 --> 01:04:23,040
you know, get us into trouble as
a society.
1277
01:04:23,080 --> 01:04:27,880
I think.
So I wonder if like if I were
1278
01:04:27,880 --> 01:04:32,200
to, if I had to guess, I would
guess that it is, you know, it
1279
01:04:32,200 --> 01:04:39,360
might be a function of just how
easy it is to, to report
1280
01:04:39,480 --> 01:04:44,960
evidence of a thing and have
that thing analyzed and
1281
01:04:45,400 --> 01:04:47,800
amplified by lots and lots and
lots of people.
1282
01:04:48,360 --> 01:04:52,000
It's, you know, one of the
things that we've seen a lot of
1283
01:04:52,080 --> 01:04:54,800
in the last, you know, like
what, what we've seen time and
1284
01:04:54,800 --> 01:04:57,600
again is that when you can
measure a thing more
1285
01:04:57,600 --> 01:05:01,200
effectively, the rates of that
thing go up because we can
1286
01:05:01,200 --> 01:05:03,480
measure it more.
And I just wonder if it's just
1287
01:05:03,480 --> 01:05:06,000
because there are so many people
with cameras filming the night
1288
01:05:06,000 --> 01:05:09,360
sky that you're either seeing,
maybe they're really seeing
1289
01:05:09,360 --> 01:05:12,320
something or at the very least
the incidence of people who
1290
01:05:12,320 --> 01:05:15,800
think they have seen something
has gotten a lot higher.
1291
01:05:15,800 --> 01:05:16,800
Yeah.
I don't know.
1292
01:05:17,200 --> 01:05:19,520
I'm speculating here, but that's
that would be my guess.
1293
01:05:19,520 --> 01:05:21,280
We
can get into some real brain rot
1294
01:05:21,280 --> 01:05:23,880
topics here, we can tell you
now.
1295
01:05:24,000 --> 01:05:28,080
I asked because one of the
primary theories going around
1296
01:05:28,080 --> 01:05:34,960
right now is that UFOs are AI
incarnate, like a physical form
1297
01:05:34,960 --> 01:05:39,840
of AI creating either itself to
move around inside the physical
1298
01:05:39,840 --> 01:05:44,280
space, which also delves into,
you know, time travel and and
1299
01:05:44,280 --> 01:05:49,240
other real fringe topics, right?
I mean, we, we dive into all of
1300
01:05:49,240 --> 01:05:50,480
it here.
We really do.
1301
01:05:52,320 --> 01:05:55,720
So I was just wondering if you
had heard anything like that or
1302
01:05:55,920 --> 01:05:58,480
had a point.
I haven't. The only, the only
1303
01:05:58,480 --> 01:06:02,200
perspective I'll offer on
UFOs and extraterrestrial life
1304
01:06:02,240 --> 01:06:05,400
that has always stuck with me is
a thing that years and years and
1305
01:06:05,400 --> 01:06:09,800
years ago a team of academics who
study space were explaining to
1306
01:06:09,800 --> 01:06:14,640
me about the sheer size of space
and also how old the universe
1307
01:06:14,640 --> 01:06:16,280
is.
And as a result.
1308
01:06:16,280 --> 01:06:19,320
So there's this, there's this
thing, the Fermi paradox, which
1309
01:06:19,320 --> 01:06:21,840
you probably know about, which
is, right, Enrico Fermi,
1310
01:06:21,840 --> 01:06:24,400
right, one of the fathers of the
atomic bomb.
1311
01:06:24,400 --> 01:06:27,160
He just would idly chat with his
lunch group and one of the
1312
01:06:27,160 --> 01:06:29,160
things he would always ask is
where is everybody?
1313
01:06:29,200 --> 01:06:31,880
Where is alien life elsewhere in
1314
01:06:31,880 --> 01:06:34,120
the world, in the universe,
because clearly there's so much
1315
01:06:34,120 --> 01:06:36,440
potential for it to be there.
Why isn't it out there?
1316
01:06:36,880 --> 01:06:40,640
Well, and the best answer anyone
has come up with that makes
1317
01:06:40,640 --> 01:06:45,680
mathematical sense to answer the
Fermi paradox is it's not that
1318
01:06:45,680 --> 01:06:49,000
there isn't extraterrestrial
life out there, it's that the
1319
01:06:49,000 --> 01:06:53,600
universe is so old and the
amount of time in the universe
1320
01:06:53,600 --> 01:07:00,560
is so vast that the chance that
out of all of these stars, 2
1321
01:07:00,560 --> 01:07:05,280
civilizations would exist at the
same time is very mathematically
1322
01:07:05,280 --> 01:07:07,400
small.
So the idea that in all this
1323
01:07:07,400 --> 01:07:12,600
darkness, the
single match flame that is our
1324
01:07:12,600 --> 01:07:16,280
civilization, right?
That lights and is
1325
01:07:16,280 --> 01:07:22,760
extinguished instantly in the
time scale of the universe, the
1326
01:07:22,760 --> 01:07:27,000
chance that two of those would
be lit together at the same time
1327
01:07:27,000 --> 01:07:31,360
such that they overlap and
actually see one another is
1328
01:07:31,360 --> 01:07:35,560
incredibly tiny.
So for me, that's been my,
1329
01:07:35,680 --> 01:07:38,360
that's something I've hung on to
in my career for a long time.
1330
01:07:38,360 --> 01:07:40,800
When people talk about
extraterrestrial life, I think
1331
01:07:40,800 --> 01:07:44,120
to myself, it makes absolute
sense to me that it's out there.
1332
01:07:45,000 --> 01:07:48,000
It just may have already
happened or hasn't happened yet
1333
01:07:48,000 --> 01:07:49,400
such that we would ever
encounter it.
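For listeners who want to put rough numbers on that "two match flames" argument, here is a minimal back-of-envelope sketch in Python. The assumed civilization lifetime and the use of the universe's whole age as the window are illustrative assumptions for this sketch, not figures from the conversation.

# Rough sketch of the "two match flames" overlap argument.
# All numbers below are illustrative assumptions, not figures from the episode.
import random

UNIVERSE_AGE_YEARS = 13.8e9     # approximate age of the universe
CIV_LIFETIME_YEARS = 10_000     # assumed lifespan of a detectable civilization
TRIALS = 1_000_000

def civilizations_overlap() -> bool:
    """Draw two random 'ignition' times and check whether the short intervals overlap."""
    a = random.uniform(0.0, UNIVERSE_AGE_YEARS)
    b = random.uniform(0.0, UNIVERSE_AGE_YEARS)
    return abs(a - b) < CIV_LIFETIME_YEARS

hits = sum(civilizations_overlap() for _ in range(TRIALS))
print(f"Estimated overlap probability: {hits / TRIALS:.1e}")
# Analytically this is roughly 2 * CIV_LIFETIME_YEARS / UNIVERSE_AGE_YEARS,
# about 1.4e-6, so the simulated estimate will be noisy at this trial count;
# the point is only that the chance is vanishingly small.

Plugging in a longer assumed lifetime or a narrower window changes the number, but it stays tiny unless civilizations persist for millions of years.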
1334
01:07:49,600 --> 01:07:53,320
Yeah, you've actually touched on
multiple answers to the
1335
01:07:53,320 --> 01:07:56,960
Fermi paradox there.
I I always like to plug books.
1336
01:07:57,080 --> 01:08:01,360
Again, if anybody's interested
in a really good read for
1337
01:08:01,560 --> 01:08:05,360
answers to the Fermi paradox,
there is a book by Stephen Webb.
1338
01:08:05,360 --> 01:08:08,120
It's called If the Universe Is
Teeming with Aliens, Where Is
1339
01:08:08,120 --> 01:08:10,760
Everybody?
75 Solutions to the Fermi
1340
01:08:10,760 --> 01:08:12,160
Paradox.
Oh, cool.
1341
01:08:13,240 --> 01:08:16,760
And it's, it's probably one of
my most read books.
1342
01:08:16,760 --> 01:08:19,479
I, I constantly go to it and use
it for reference.
1343
01:08:20,080 --> 01:08:23,640
Cool.
And then, of course, just to tie
1344
01:08:23,640 --> 01:08:30,720
it all in, the 76th solution was
actually a recent thing, which
1345
01:08:30,720 --> 01:08:33,279
is the, the, the one that's
brought up in the Three-Body
1346
01:08:33,279 --> 01:08:37,920
Problem series by author Cixin
Liu, where he talks about
1347
01:08:37,920 --> 01:08:43,040
the, the dark forest theory.
That's that's another one that's
1348
01:08:43,040 --> 01:08:45,520
not covered in Stephen Webb's
book, but that's also very
1349
01:08:45,520 --> 01:08:48,319
interesting, which you, you
didn't necessarily touch on that
1350
01:08:48,319 --> 01:08:49,760
one.
You, you more or less got there.
1351
01:08:49,760 --> 01:08:53,279
There's a, there's an answer
called the island, right?
1352
01:08:53,560 --> 01:08:57,080
Where we're just alone on an
island and all we have is
1353
01:08:57,479 --> 01:09:00,080
basically, if you look at the
earth, all we have is the
1354
01:09:00,080 --> 01:09:03,240
materials here on the earth.
We don't have the ways of
1355
01:09:03,640 --> 01:09:07,240
manufacturing what we need to be
able to bend space-time and
1356
01:09:07,240 --> 01:09:11,439
travel vast distances with it
all within one lifespan, right?
1357
01:09:11,439 --> 01:09:14,160
So where we are is where we're
going to be at.
1358
01:09:14,520 --> 01:09:18,040
Distances are too, too far.
Time is too limited.
1359
01:09:18,040 --> 01:09:21,000
Unless you can go through and
somehow manipulate one of
1360
01:09:21,000 --> 01:09:23,479
Einstein's theories of
relativity via, you know,
1361
01:09:23,600 --> 01:09:27,240
bending of space or black hole
manipulation, you're not going
1362
01:09:27,240 --> 01:09:29,920
to really do much.
And even that is theoretical at
1363
01:09:29,920 --> 01:09:32,080
best.
Yeah, yeah, yeah, yeah, right,
1364
01:09:32,359 --> 01:09:33,840
right.
I'll just leave you with this,
1365
01:09:33,840 --> 01:09:36,880
an idea that that I started the
book with, which is the idea of
1366
01:09:36,880 --> 01:09:40,920
the generation ship, which is
this concept that gets kicked
1367
01:09:40,920 --> 01:09:43,000
around at NASA.
And there's a couple of science
1368
01:09:43,000 --> 01:09:45,399
fiction books that have
taken this idea on.
1369
01:09:45,399 --> 01:09:49,560
And it's the whole idea that so,
so the nearest habitable planet
1370
01:09:49,560 --> 01:09:52,359
to us, the one that they think
we actually could walk around on
1371
01:09:52,359 --> 01:09:55,400
and, and possibly breathe the
atmosphere is called Proxima
1372
01:09:55,400 --> 01:09:59,320
Centauri b.
And it's only like 4.3 light
1373
01:09:59,320 --> 01:10:01,520
years away.
It's right down the block in
1374
01:10:01,520 --> 01:10:05,120
terms of, you know,
being nearby.
1375
01:10:05,720 --> 01:10:10,080
The trouble is that 4.3 light
years away at the current speeds
1376
01:10:10,080 --> 01:10:13,840
we can travel in space ends up
being something like it's more
1377
01:10:13,840 --> 01:10:16,560
than 100,000 years.
It's like 200,000 years.
1378
01:10:16,720 --> 01:10:20,360
So the concept is that you'd
have to have people on that ship
1379
01:10:21,560 --> 01:10:26,240
live and die and have babies and
continue to create a culture
1380
01:10:26,240 --> 01:10:28,760
that just lives right on that
ship for that whole period of
1381
01:10:28,760 --> 01:10:30,040
time.
It's like 2000 human
1382
01:10:30,040 --> 01:10:32,840
generations.
And so I've used this as like
1383
01:10:32,840 --> 01:10:35,800
it, my example of like where
we're at on this planet is like
1384
01:10:36,160 --> 01:10:38,720
that amount of time is
essentially the entire time that
1385
01:10:38,720 --> 01:10:41,560
we've been that species that
walked off of the continent of
1386
01:10:41,560 --> 01:10:44,880
Africa and, and, you know,
became the modern selves with
1387
01:10:44,920 --> 01:10:48,760
our better brains.
You know, the possibility of
1388
01:10:48,760 --> 01:10:53,520
actually living all that time on
a, a single ship is crazy.
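For anyone who wants to sanity-check the trip math, here is a rough calculation. The probe speed and generation length are assumptions for illustration (roughly Voyager-class), so treat the output as order-of-magnitude only rather than the episode's exact figures.

# Back-of-envelope generation-ship math for Proxima Centauri b.
# Probe speed and generation length are assumptions, not figures from the episode.
LIGHT_YEAR_KM = 9.461e12      # kilometers in one light year
SECONDS_PER_YEAR = 3.156e7

DISTANCE_LY = 4.3             # distance cited in the conversation
PROBE_SPEED_KM_S = 17.0       # roughly Voyager 1's cruise speed (assumption)
GENERATION_YEARS = 25.0       # assumed length of a human generation

trip_years = (DISTANCE_LY * LIGHT_YEAR_KM / PROBE_SPEED_KM_S) / SECONDS_PER_YEAR
generations = trip_years / GENERATION_YEARS
print(f"Trip time: ~{trip_years:,.0f} years")       # on the order of 75,000 years
print(f"Generations aboard: ~{generations:,.0f}")   # a few thousand generations

A somewhat slower craft pushes the trip past the 100,000-year mark mentioned in the conversation; either way, the "thousands of generations" conclusion holds.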
1389
01:10:53,520 --> 01:10:56,080
And instead it makes me think we
are on that ship.
1390
01:10:56,240 --> 01:10:58,400
Like this is our ship.
This is the one we get.
1391
01:10:58,400 --> 01:11:01,040
We're on it already.
And we got to figure out how to
1392
01:11:01,040 --> 01:11:03,680
do a better job of thinking in
advance, thinking in
1393
01:11:03,680 --> 01:11:05,680
generations.
You know, you guys were asking
1394
01:11:05,680 --> 01:11:07,800
about like what's going to
happen in a few generations,
1395
01:11:07,800 --> 01:11:09,560
right?
Like, like, we need to think
1396
01:11:09,560 --> 01:11:10,920
that way.
We're not thinking that way
1397
01:11:10,920 --> 01:11:12,280
right now.
We're thinking in financial
1398
01:11:12,280 --> 01:11:14,880
quarters and we need to be
thinking a lot longer than that.
1399
01:11:14,880 --> 01:11:17,960
It seems to me.
Proxima Centauri is, is very
1400
01:11:17,960 --> 01:11:20,880
interesting just for food for
thought for people who are
1401
01:11:20,880 --> 01:11:24,520
listening.
Alpha Centauri was once
1402
01:11:24,520 --> 01:11:27,160
considered a dual star system.
And then they end up
1403
01:11:27,160 --> 01:11:30,400
finding a third traveling star,
Proxima, that actually orbits the two that
1404
01:11:30,400 --> 01:11:34,000
are orbiting themselves.
And then, yeah, that's I think
1405
01:11:34,000 --> 01:11:36,000
that's fascinating.
Yeah, yeah.
1406
01:11:36,000 --> 01:11:39,240
Someone in the chat said that
and then when the ship gets
1407
01:11:39,240 --> 01:11:41,120
there it's just a bunch of
inbreds.
1408
01:11:41,560 --> 01:11:42,920
Totally.
I saw that.
1409
01:11:43,040 --> 01:11:47,080
So I have a, I have a like half
written screenplay in which
1410
01:11:47,080 --> 01:11:50,680
that's exactly the concept.
Did you guys ever watch The
1411
01:11:50,680 --> 01:11:52,120
Name of the
Rose?
1412
01:11:52,120 --> 01:11:53,760
It was an early Sean Connery
movie.
1413
01:11:53,760 --> 01:11:56,200
It's a great movie.
1414
01:11:56,200 --> 01:11:58,320
It's a murder mystery set in a
medieval monastery.
1415
01:11:58,320 --> 01:12:01,960
And everybody is super inbred.
So they're all just like, you
1416
01:12:01,960 --> 01:12:05,320
know, and you would just imagine
at the very end of that trip, as
1417
01:12:05,320 --> 01:12:07,320
they're approaching the planet,
they're all just going to be
1418
01:12:07,320 --> 01:12:10,400
like so messed up, you know,
it's so gross.
1419
01:12:10,400 --> 01:12:12,680
They're going to look like.
Sloth from The Goonies.
1420
01:12:12,760 --> 01:12:15,840
And they're all like, naked
because all the, all the, you
1421
01:12:15,840 --> 01:12:19,000
know, clothing is melted away.
You know, like, that's right.
1422
01:12:19,160 --> 01:12:19,920
That's right.
Exactly.
1423
01:12:19,920 --> 01:12:22,600
It'd be a messy party.
Well, was there anything that
1424
01:12:22,600 --> 01:12:27,080
you came on to try and
specifically talk about today so
1425
01:12:27,200 --> 01:12:30,960
we're not wasting all of the
time on on fantastical stuff
1426
01:12:30,960 --> 01:12:33,360
about UFOs and spacecraft?
No, we've done it to you guys.
1427
01:12:33,480 --> 01:12:36,080
I really, I love hearing your
perspectives on, you know, the
1428
01:12:36,080 --> 01:12:39,320
macroeconomic trends here and
on, you know, whether these,
1429
01:12:39,480 --> 01:12:42,880
these systems really can replace
jobs as we have them right now.
1430
01:12:42,880 --> 01:12:44,360
No, I think we've done a really
good job.
1431
01:12:44,360 --> 01:12:46,640
I would say, you know, if you
don't mind me promoting my own
1432
01:12:46,640 --> 01:12:47,080
stuff.
Yeah.
1433
01:12:47,080 --> 01:12:49,720
The ripcurrent.com is where
people can sign up for my
1434
01:12:49,720 --> 01:12:51,120
podcast if they'd like to
listen.
1435
01:12:51,120 --> 01:12:54,960
I do a weekly interview with
somebody who, you know, The Rip
1436
01:12:54,960 --> 01:12:57,640
Current is named for, like the
invisible forces that I think
1437
01:12:57,640 --> 01:12:58,840
are kind of working on us right
now.
1438
01:12:58,840 --> 01:13:03,720
And that stuff is, you know,
1439
01:13:03,720 --> 01:13:08,480
that's, you know, all these
weird political trends moving us
1440
01:13:08,480 --> 01:13:10,280
here and there.
It's economic forces.
1441
01:13:10,280 --> 01:13:13,040
So each, each week is a
different sort of expert in
1442
01:13:13,360 --> 01:13:16,160
something along those lines.
So it's, I think it's very
1443
01:13:16,160 --> 01:13:17,800
similar to what you guys are
thinking about here.
1444
01:13:17,800 --> 01:13:19,440
And I really appreciate the
opportunity to be here.
1445
01:13:19,720 --> 01:13:22,240
No, it's awesome.
This was this was a blast,
1446
01:13:22,360 --> 01:13:24,960
really.
Yeah, good episode for sure.
1447
01:13:24,960 --> 01:13:28,040
Appreciate you coming on.
Thanks guys, I really appreciate
1448
01:13:28,040 --> 01:13:29,440
your time.
Hope to do it again.
1449
01:13:29,880 --> 01:13:33,280
Absolutely anytime I'm going to
go ahead and sign this off if
1450
01:13:33,280 --> 01:13:35,560
you could hang out for, you
know, two or three more minutes
1451
01:13:35,560 --> 01:13:37,160
afterwards.
That way I can just make sure
1452
01:13:37,160 --> 01:13:40,520
all of your audio and video are
are sent over.
1453
01:13:41,240 --> 01:13:43,920
Usually it's it's really quick,
but every once in a while that
1454
01:13:43,920 --> 01:13:45,880
just takes a few extra minutes.
I've been there.
1455
01:13:45,880 --> 01:13:49,000
I'll absolutely stick around.
All right, Jacob, is there
1456
01:13:49,000 --> 01:13:53,320
anything else that you would
like to plug or anything other
1457
01:13:53,320 --> 01:13:56,280
than The Rip Current?
Again, the ripcurrent.com come
1458
01:13:56,280 --> 01:13:59,400
check us out if you can.
For some reason, my, my biggest
1459
01:13:59,400 --> 01:14:01,720
following is on TikTok.
I have a huge TikTok following,
1460
01:14:01,720 --> 01:14:04,200
which is for an old guy, it
makes no sense to me at all, but
1461
01:14:04,200 --> 01:14:07,120
I really enjoy that community.
And so come check me out there.
1462
01:14:07,120 --> 01:14:10,160
That's where I'm most active.
I'm trying to grow my YouTube, but
1463
01:14:10,160 --> 01:14:12,760
I, I don't even, I'm not doing a
smart job of that.
1464
01:14:12,760 --> 01:14:15,080
So I got to, I got to shift
toward that if I can.
1465
01:14:15,080 --> 01:14:17,080
But yeah, ripcurrent.com, that's
where I mostly am.
1466
01:14:17,080 --> 01:14:21,240
We do have one question from the
chat that popped up. They
1467
01:14:21,280 --> 01:14:22,920
asked what got him into
conspiracies.
1468
01:14:22,920 --> 01:14:25,200
How long ago?
You know, so when you're the,
1469
01:14:25,600 --> 01:14:28,400
you're the editor in chief of
Popular Science magazine, you
1470
01:14:28,400 --> 01:14:31,880
really can't, you really can't
avoid them.
1471
01:14:32,240 --> 01:14:34,160
You know, you'll have people,
it's very interesting.
1472
01:14:34,160 --> 01:14:36,720
You'll have people full of, you
know, Popular Science.
1473
01:14:36,720 --> 01:14:39,200
The readership is, you know, it
tends to be a lot of military
1474
01:14:39,200 --> 01:14:41,080
people.
We got a lot of Navy, you know,
1475
01:14:41,560 --> 01:14:43,880
I think a lot of people, you
know, you have people serving on
1476
01:14:43,880 --> 01:14:46,440
a submarine, right, who are who
are at sea for a long time.
1477
01:14:46,440 --> 01:14:48,800
And that's kind of one of their
only forms of entertainment is
1478
01:14:48,800 --> 01:14:50,720
magazines.
So they'd have us with them.
1479
01:14:50,720 --> 01:14:52,160
And so just a really interesting
group.
1480
01:14:52,160 --> 01:14:54,720
You know, a lot of people would
come to us with a lot of various
1481
01:14:54,720 --> 01:14:57,320
interesting questions.
And so I think that got me, that
1482
01:14:57,400 --> 01:14:59,440
got me going on it.
And then you combine that with
1483
01:14:59,440 --> 01:15:03,280
all this reporting I've done
around kind of, you know, why
1484
01:15:03,280 --> 01:15:06,520
certain ideas really stick in
the brain, and that brings you
1485
01:15:06,520 --> 01:15:08,400
into contact with a lot of folks
who think about conspiracy
1486
01:15:08,400 --> 01:15:10,760
theories as well.
Cool. You put a smart guy in
1487
01:15:10,760 --> 01:15:13,040
front of this information.
It's just natural and you just
1488
01:15:13,040 --> 01:15:15,160
become a conspiracy theorist and
so.
1489
01:15:16,800 --> 01:15:17,840
That's right.
Well, good.
1490
01:15:17,840 --> 01:15:20,880
Well, I'll make sure to tag
you in all the TikTok clips that
1491
01:15:20,880 --> 01:15:22,960
we get from this episode and.
Appreciate it.
1492
01:15:23,120 --> 01:15:24,360
We'll spread, we'll spread the
word.
1493
01:15:24,360 --> 01:15:25,920
I can't wait for your book to
come in.
1494
01:15:26,280 --> 01:15:28,920
I'm sure I'll have a ton of
questions afterwards.
1495
01:15:28,920 --> 01:15:30,960
So if you ever want to come back
on the show, you're more than
1496
01:15:30,960 --> 01:15:32,440
welcome, man.
Sounds great.
1497
01:15:32,440 --> 01:15:33,480
Yeah, Just call me up.
I'm around.
1498
01:15:33,640 --> 01:15:36,680
All right, guys.
Jake, you got anything for Jake?
1499
01:15:36,760 --> 01:15:39,600
No, but this, that this whole
thing has been tripping me out a
1500
01:15:39,600 --> 01:15:40,840
little bit.
Jake.
1501
01:15:41,320 --> 01:15:43,240
Jake.
Between this and your lazy guys,
1502
01:15:43,240 --> 01:15:44,840
you must be having a trippy
evening.
1503
01:15:45,000 --> 01:15:48,640
Yeah, I'm ready for bed too.
It's my bedtime.
1504
01:15:48,640 --> 01:15:49,760
I got to go work on planes
tomorrow.
1505
01:15:50,120 --> 01:15:52,760
Good, good.
But no, it was a it was a cool
1506
01:15:52,760 --> 01:15:55,440
episode.
Definitely not what I expected.
1507
01:15:55,520 --> 01:15:57,360
Yeah, I would love to have you
back on.
1508
01:15:57,360 --> 01:16:01,320
That was interesting to hear
your insights.
1509
01:16:01,760 --> 01:16:05,880
Sick, thanks guys, and...
Another Californian is cool too.
1510
01:16:05,880 --> 01:16:07,440
I'm from Fresno.
Oh, see.
1511
01:16:07,840 --> 01:16:08,840
Nice.
Awesome, awesome.
1512
01:16:09,480 --> 01:16:12,480
And then next time, we'll make
sure we have some even
1513
01:16:12,480 --> 01:16:14,920
wilder questions, something to
make you real good, right?
1514
01:16:15,120 --> 01:16:16,240
Yeah, you weren't easy on me
tonight.
1515
01:16:16,600 --> 01:16:18,920
Yeah.
All right, well, that has been
1516
01:16:18,920 --> 01:16:20,840
another episode of the Infinite
Rabbit Hole podcast.
1517
01:16:21,240 --> 01:16:24,600
Until next time, travelers,
we'll see you in the next fork
1518
01:16:24,600 --> 01:16:26,440
in the path of the Infinite
Rabbit Hole.
1519
01:16:26,640 --> 01:16:27,480
Bye, buddy.
Goodnight.
1520
01:16:31,560 --> 01:16:34,080
Hey everybody, thanks for
checking out the Infinite Rabbit
1521
01:16:34,080 --> 01:16:36,160
Hole podcast.
If you're looking for more of
1522
01:16:36,160 --> 01:16:39,320
our stuff, head on over to
infiniterabbithole.com where you
1523
01:16:39,320 --> 01:16:42,040
can find links to all the
podcast players that we are
1524
01:16:42,040 --> 01:16:45,560
available on and even our video
platforms such as TikTok and
1525
01:16:45,560 --> 01:16:47,440
YouTube.
While you're there, make sure to
1526
01:16:47,440 --> 01:16:50,440
check out all the links for our
socials and hit that follow so
1527
01:16:50,440 --> 01:16:53,520
you know when all the new stuff
from our podcast comes out.
1528
01:16:53,760 --> 01:16:57,400
And until next time, travelers,
we'll see you right here at the
1529
01:16:57,400 --> 01:16:59,880
next fork in the path of the
Infinite Rabbit Hole.