Jan. 26, 2026
#138: Minnesota, 1st & 2nd Amendments / Create Your Own Algorithm / Corporate Airbnbs
Solo episode talking about
- Political pendulum swinging
- Processing current events in Minnesota
- My new podcast with William Mann
- Creating our own algorithms in our lives
- The corporatization of Airbnb
If you enjoy, please like, subscribe, and give us five stars!
For more information visit Cometogether-podcast.com
1
00:00:01,080 --> 00:00:04,000
Welcome to another episode of
Come Together.
2
00:00:15,320 --> 00:00:18,120
We are live.
This is another solo episode.
3
00:00:18,120 --> 00:00:22,760
I am going to do my best to keep
this a quality episode.
4
00:00:23,200 --> 00:00:26,960
We'll see how it goes.
So I'm just adjusting my screen.
5
00:00:27,240 --> 00:00:29,560
I'm not comfortable in front of
the camera yet.
6
00:00:29,760 --> 00:00:32,320
I have so many thoughts.
It's been a little while since
7
00:00:32,320 --> 00:00:34,400
I've done a solo episode and
it's really been a while since
8
00:00:34,400 --> 00:00:36,960
I've done a coherent solo
episode, one where I feel
9
00:00:36,960 --> 00:00:39,840
like I actually have something
to talk about or some thoughts
10
00:00:39,840 --> 00:00:43,040
to get out there.
So here we go.
11
00:00:43,040 --> 00:00:46,640
It's been a lot of fun.
One, I literally have just been
12
00:00:46,640 --> 00:00:50,000
thinking. I was just sitting
by myself at my
13
00:00:50,000 --> 00:00:53,280
desk, catching up.
I took a week off last week with
14
00:00:53,280 --> 00:00:56,720
my family to go to Orlando,
still did a couple of recordings
15
00:00:56,720 --> 00:01:00,480
and released a podcast episode
with William Mann, who is an
16
00:01:00,600 --> 00:01:05,280
awesome historian of Hollywood
and college professor and New
17
00:01:05,280 --> 00:01:07,600
York Times bestselling writer.
He's the man.
18
00:01:07,960 --> 00:01:11,200
We did a really, really cool
episode talking about Marlon
19
00:01:11,200 --> 00:01:14,880
Brando, old-school Hollywood.
His new book that's actually
20
00:01:14,880 --> 00:01:18,480
available. Is it today? Is today
the 27th? Tomorrow, tomorrow the
21
00:01:18,480 --> 00:01:21,160
27th.
So if you're listening today,
22
00:01:21,440 --> 00:01:23,120
you can check out the book
tomorrow.
23
00:01:23,840 --> 00:01:28,360
It is called, and because I don't
want to butcher it...
24
00:01:30,680 --> 00:01:31,960
Let's see.
I'm sorry.
25
00:01:32,400 --> 00:01:34,880
Here we go.
His new book is called Black
26
00:01:34,880 --> 00:01:38,000
Dahlia: Murder, Monsters, and
Madness in Midcentury
27
00:01:38,000 --> 00:01:44,720
Hollywood, and in it
William Mann ventures into
28
00:01:44,960 --> 00:01:49,880
the world of unsolved true
crime, Hollywood murders, not
29
00:01:49,880 --> 00:01:54,440
necessarily just to jump
on the
30
00:01:54,440 --> 00:02:02,560
bandwagon of true crime, but he
really observes it in the
31
00:02:02,560 --> 00:02:04,480
context of its impact on
society.
32
00:02:04,480 --> 00:02:07,960
So for example, there are
some really interesting
33
00:02:07,960 --> 00:02:10,800
stats that you could look up
relative to, for example, the
34
00:02:11,080 --> 00:02:16,000
infamous Charles Manson
murders: how many
35
00:02:16,000 --> 00:02:19,840
people bought guns after that in
Hollywood or how many people
36
00:02:19,840 --> 00:02:25,040
bought enhanced door locks or
security in Hollywood after the
37
00:02:25,040 --> 00:02:26,680
Manson murders.
And it's really, really
38
00:02:26,680 --> 00:02:31,160
interesting because these
events do have societal
39
00:02:31,160 --> 00:02:34,840
ripples and impacts.
And I mean, obviously we see it
40
00:02:34,840 --> 00:02:38,640
right now with a lot
that's going on in the world
41
00:02:38,640 --> 00:02:43,080
today.
And the events are the events
42
00:02:43,280 --> 00:02:46,640
and we might not ever know the
real story behind them.
43
00:02:46,640 --> 00:02:50,880
You know, if you have 10
44
00:02:50,880 --> 00:02:53,160
witnesses, you have 10 different
accounts of what's going on.
45
00:02:53,160 --> 00:02:57,600
And you also have, unfortunately,
interpretation, which
46
00:02:57,600 --> 00:03:03,440
is when people
witness something and, because,
47
00:03:03,520 --> 00:03:06,720
you know, we're only
experiencing, we're getting a
48
00:03:06,720 --> 00:03:09,560
particular type of input,
we're viewing something
49
00:03:09,560 --> 00:03:13,240
and our brain is making sense of
50
00:03:13,240 --> 00:03:15,000
it in the context of what we
already know.
51
00:03:15,280 --> 00:03:19,480
So when we witness events
like what's obviously the most
52
00:03:19,480 --> 00:03:24,240
hot-topic thing these days, what's going
on in Minnesota, when we
53
00:03:24,240 --> 00:03:29,040
witness these events, three
people, five people, 1,000 people
54
00:03:29,040 --> 00:03:32,600
could look at the pictures and
the videos
55
00:03:32,600 --> 00:03:36,680
that are making their way
online and through corporate or
56
00:03:36,680 --> 00:03:41,120
formal media.
And we all interpret it in the
57
00:03:41,120 --> 00:03:43,520
context of the
picture that's already in our
58
00:03:43,520 --> 00:03:45,640
mind of what we presume is the
case.
59
00:03:46,040 --> 00:03:52,120
And then there are the ripples
that actually flow out into
60
00:03:52,120 --> 00:03:56,280
society about what happens as a
consequence of it.
61
00:03:56,320 --> 00:03:59,120
And you know, we obviously know,
for example, that some of the
62
00:03:59,120 --> 00:04:02,360
most consequential, you know,
you have small scale events.
63
00:04:02,360 --> 00:04:05,640
Like I said, the Manson murders were
horrifying, but relatively small
64
00:04:05,640 --> 00:04:07,400
in scale compared to something like
9/11.
65
00:04:07,680 --> 00:04:11,560
And the scale obviously impacts
the scale of the ripple effect
66
00:04:11,560 --> 00:04:14,880
and the reaction.
So I think that William Mann did
67
00:04:14,920 --> 00:04:17,519
a really, really amazing job
expressing his points.
68
00:04:18,000 --> 00:04:19,320
It's worth checking out his
episode.
69
00:04:19,320 --> 00:04:21,000
It's worth checking out his
writing, his books.
70
00:04:21,200 --> 00:04:23,400
He's written on a bunch of
different really cool Hollywood
71
00:04:23,560 --> 00:04:26,280
characters and families and even
political families like the
72
00:04:26,280 --> 00:04:29,600
Roosevelts.
And I think it was the
73
00:04:29,600 --> 00:04:31,600
Roosevelts I'm going to get, I'm
going to feel bad if I got that
74
00:04:31,600 --> 00:04:35,080
wrong.
Yeah, the Roosevelts, I was
75
00:04:35,120 --> 00:04:38,880
right, and it's worth checking
it out.
76
00:04:40,840 --> 00:04:51,560
I was trying to think about what
to talk about and I didn't want
77
00:04:51,560 --> 00:04:54,560
to
dwell too much on the Minnesota
78
00:04:54,560 --> 00:04:56,080
situation.
I kind of have to talk.
79
00:04:56,080 --> 00:05:00,000
I feel like I have to talk about
it because A, it's relevant and
80
00:05:00,120 --> 00:05:06,480
B, it does have, again, real-life
implications and
81
00:05:06,480 --> 00:05:08,640
consequences.
And it's really, really
82
00:05:08,640 --> 00:05:12,480
interesting just to sit on the
side and observe the voices and
83
00:05:12,480 --> 00:05:17,160
the reactions that are going on.
What's funny is I'm
84
00:05:17,160 --> 00:05:21,160
not really tuned into what the
most prominent voices are saying
85
00:05:21,160 --> 00:05:23,200
because I don't really listen to
the most prominent voices.
86
00:05:25,240 --> 00:05:27,000
I just they're not, I'm not in
that.
87
00:05:27,000 --> 00:05:28,920
They're not in my algorithm.
I'm not exposed to their
88
00:05:28,920 --> 00:05:35,680
information on my feed.
But this is kind of funny.
89
00:05:36,000 --> 00:05:39,320
I mean, nothing about the
conflict in
90
00:05:39,320 --> 00:05:43,800
particular is funny.
But what's funny is how just a
91
00:05:43,800 --> 00:05:46,920
couple of years ago the right
was complaining about freedom of
92
00:05:46,920 --> 00:05:54,960
speech and right to bear arms.
And now the left has been making
93
00:05:54,960 --> 00:05:59,240
that very, very same argument.
And the right is, or at least
94
00:05:59,480 --> 00:06:02,480
not all the right, but at least
the government representation of
95
00:06:02,480 --> 00:06:06,160
the right has been pretty
dramatically pushing back,
96
00:06:06,480 --> 00:06:08,480
almost in the same sense that
you're really seeing the
97
00:06:08,480 --> 00:06:11,640
pendulum swinging.
It's so funny because people
98
00:06:11,680 --> 00:06:14,800
would say a couple of years ago,
if someone was Republican,
99
00:06:14,800 --> 00:06:17,240
or rather if someone
was to the right,
100
00:06:17,240 --> 00:06:19,520
they'd say, yeah, in the 60s or
70s, I really probably would
101
00:06:19,520 --> 00:06:21,760
have had much more in common
with the Democrats
102
00:06:21,760 --> 00:06:26,320
of that era than with
103
00:06:26,320 --> 00:06:28,560
modern
Democrats.
104
00:06:28,880 --> 00:06:33,400
And, you know, it
just shows how
105
00:06:33,400 --> 00:06:36,680
extreme everyone's become
because we're pro free speech
106
00:06:36,680 --> 00:06:39,120
and we do want limited war.
We don't want a lot of war.
107
00:06:39,320 --> 00:06:43,320
And we do want to control
enormous organizations, whether
108
00:06:43,320 --> 00:06:47,120
it's corporations or government.
And you're like, wow, that
109
00:06:47,120 --> 00:06:48,360
sounds, you know, really
similar.
110
00:06:48,360 --> 00:06:52,960
But then you have,
literally just in the span of
111
00:06:53,280 --> 00:06:56,320
a year and a half, you
have people
112
00:06:56,320 --> 00:06:58,760
complaining about the fact that
the first thing
113
00:06:58,760 --> 00:07:01,040
Biden did when he got into
office was go to the social
114
00:07:01,040 --> 00:07:04,400
media companies and
start to silence people that
115
00:07:04,400 --> 00:07:08,840
would, for example, push back on
COVID, on the COVID vaccine or
116
00:07:09,160 --> 00:07:12,520
on January 6th or really on any
kind of political issue that
117
00:07:12,520 --> 00:07:15,400
became a matter of politics and
policy.
118
00:07:17,480 --> 00:07:19,440
And it took a year.
That's all it took.
119
00:07:19,680 --> 00:07:22,160
Took a year and a half for
everyone that was just screaming
120
00:07:22,160 --> 00:07:25,320
on the right that we need
free speech and the right to
121
00:07:25,320 --> 00:07:27,520
bear arms.
And then you have Kash Patel
122
00:07:27,520 --> 00:07:29,800
walking around now.
I saw a video of him saying,
123
00:07:29,800 --> 00:07:33,400
well, you can't walk around with
a gun and come to a protest.
124
00:07:34,640 --> 00:07:39,920
Imagine three years ago, imagine
2020, imagine if someone that
125
00:07:39,920 --> 00:07:44,440
was on the right went to a Black
Lives Matter riot or not riot, I
126
00:07:44,440 --> 00:07:45,600
apologize.
Let's say protest.
127
00:07:45,600 --> 00:07:48,080
Let's say it is now a protest.
It is peaceful.
128
00:07:48,680 --> 00:07:51,720
I've actually seen a bunch.
When I was
129
00:07:51,720 --> 00:07:58,280
living in Seattle in 2020 and
listen, it was fascinating to
130
00:07:58,280 --> 00:08:05,520
experience all the protests and
CHAZ and, you know, all that. I
131
00:08:05,520 --> 00:08:08,640
never
personally saw any violence.
132
00:08:08,640 --> 00:08:10,360
I never heard any.
I never experienced any
133
00:08:10,360 --> 00:08:12,800
violence.
I never heard anyone escalating
134
00:08:12,800 --> 00:08:15,720
anything beyond.
So you know, social justice
135
00:08:15,800 --> 00:08:17,680
chants, which everyone's
entitled to do.
136
00:08:17,880 --> 00:08:21,640
But just imagine for a minute
that you have a story of a Black
137
00:08:21,640 --> 00:08:27,680
Lives Matter protest and there's
someone in the crowd that is
138
00:08:28,560 --> 00:08:31,080
counter-protesting, you know,
anti-protesting.
139
00:08:31,360 --> 00:08:34,559
And he's standing on the other
side and he has a weapon in his
140
00:08:34,559 --> 00:08:36,480
pocket.
He doesn't take it out, he
141
00:08:36,480 --> 00:08:38,840
doesn't use it.
He just has a weapon and he's
142
00:08:38,840 --> 00:08:40,799
legally allowed.
He has a carry permit or
143
00:08:40,799 --> 00:08:43,919
whatever it is.
And, and for whatever reason,
144
00:08:44,039 --> 00:08:48,320
someone, you know, he
gets detained or shot or hurt or
145
00:08:48,320 --> 00:08:53,640
killed because he had a weapon,
the right would flip out.
146
00:08:53,640 --> 00:08:56,240
We have, you know,
we need rights.
147
00:08:56,240 --> 00:08:58,520
It's the Second Amendment we
have so that you're
148
00:08:58,800 --> 00:09:00,720
able to overthrow a
government that's going to
149
00:09:00,720 --> 00:09:04,200
become a dictatorship or
communist or fascist or, you
150
00:09:04,200 --> 00:09:06,800
know, fill in the
hot-topic word that anyone is
151
00:09:06,800 --> 00:09:08,800
throwing out.
But it really just took a year
152
00:09:08,800 --> 00:09:10,600
and a half for all the tables to
just flip.
153
00:09:11,000 --> 00:09:14,240
And it's incredible how quickly
it went.
154
00:09:14,240 --> 00:09:21,040
And it's incredible how quickly
people, when you point out the
155
00:09:21,040 --> 00:09:25,160
inconsistency to them, as I've
tried to do with some people
156
00:09:25,160 --> 00:09:29,120
that I know where the answer
becomes, yeah, but.
157
00:09:29,480 --> 00:09:32,200
And that, yeah, but is the exact
same answer and the exact same
158
00:09:32,200 --> 00:09:36,840
perspective that the left had
for the woke years and for
159
00:09:36,840 --> 00:09:42,920
2020 to 2024.
And it's not great.
160
00:09:42,920 --> 00:09:47,160
It's not a great look and it's
not a great look because I think
161
00:09:47,160 --> 00:09:52,840
that people today are more done
with partisan politics than
162
00:09:52,840 --> 00:09:57,200
ever.
And I think that people today
163
00:09:57,560 --> 00:10:03,160
are really seeing
everyone's
164
00:10:03,160 --> 00:10:06,080
inconsistencies.
I actually saw today
165
00:10:06,120 --> 00:10:09,240
on X, and I had a whole
response planned out that I didn't
166
00:10:09,240 --> 00:10:12,600
write yet because I think I
might just send it personally to
167
00:10:13,320 --> 00:10:16,480
try to reach out personally.
I saw that Ro Khanna, who's a
168
00:10:16,600 --> 00:10:18,840
California congressman who's been
working with, I think,
169
00:10:18,840 --> 00:10:22,880
Thomas
Massie, and I heard him
170
00:10:22,880 --> 00:10:25,440
on the Shawn Ryan Show.
I heard him on Theo Von.
171
00:10:26,280 --> 00:10:31,200
And I have to say I was overall
extremely, extremely impressed
172
00:10:31,200 --> 00:10:33,280
with a couple of things about
him.
173
00:10:34,000 --> 00:10:36,440
Obviously he's very eloquent.
Obviously he's a politician.
174
00:10:36,440 --> 00:10:40,960
He's supposed to be eloquent.
One thing that he did say, and
175
00:10:40,960 --> 00:10:44,760
it's so upsetting because it's
hard to believe, especially
176
00:10:44,760 --> 00:10:48,960
after all the excitement that I
had a year and a half ago with,
177
00:10:48,960 --> 00:10:53,280
with the
huge MAGA wave with Trump and
178
00:10:53,280 --> 00:10:57,120
Elon and RFK and Tulsi and you
know, and all the people that
179
00:10:57,120 --> 00:10:59,120
everyone was really excited about.
People haven't heard from them
180
00:10:59,120 --> 00:11:01,440
in a long time.
You know, Elon's fallout
181
00:11:01,440 --> 00:11:04,720
with Trump, which
182
00:11:05,360 --> 00:11:07,480
looks like it's kind of being
rectified a little bit.
183
00:11:07,840 --> 00:11:10,720
RFK, I heard the best line,
I think it was Tim Dillon, it was so
184
00:11:10,720 --> 00:11:12,280
funny.
He's the least controversial
185
00:11:12,280 --> 00:11:15,560
member of the cabinet at this
point, which, you know, go
186
00:11:15,560 --> 00:11:17,880
figure.
You know, everyone was pushing
187
00:11:17,880 --> 00:11:21,200
back the hardest against him,
and he's just kind of quietly
188
00:11:21,200 --> 00:11:23,440
minding his own business and
occasionally confessing to
189
00:11:23,640 --> 00:11:25,720
extremely bizarre things on
social media.
190
00:11:26,200 --> 00:11:31,280
And yeah.
And then Tulsi, I mean, no
191
00:11:31,280 --> 00:11:33,920
one really heard from her.
And the truth was that she
192
00:11:33,920 --> 00:11:36,680
was supposed to be the
representation of the anti
193
00:11:36,680 --> 00:11:40,520
imperial America, the anti-empire
perspective.
194
00:11:40,560 --> 00:11:43,080
And it looks like she kind of
got shunned aside a little bit.
195
00:11:43,400 --> 00:11:48,360
So I think that a lot of
people, myself
196
00:11:48,360 --> 00:11:49,920
included, were really, really
hopeful.
197
00:11:49,920 --> 00:11:53,320
And now we're not.
And so Ro Khanna
198
00:11:53,320 --> 00:11:55,400
said something really
interesting, which kind of
199
00:11:55,400 --> 00:11:59,120
piqued my interest, even though
I'm skeptical, just because,
200
00:11:59,720 --> 00:12:03,920
you know, I know that this
is potentially a fallacy, but I
201
00:12:03,920 --> 00:12:06,560
think
that enough history has proven
202
00:12:06,560 --> 00:12:12,640
that it could be an issue:
the guy's from California, which,
203
00:12:12,800 --> 00:12:15,640
you know, makes me
204
00:12:15,640 --> 00:12:21,640
skeptical. But I know that he
has a much more grounded story:
205
00:12:21,640 --> 00:12:24,160
that he failed, I think, two or
three times to get elected.
206
00:12:24,160 --> 00:12:26,760
And I think on his
third or fourth try he
207
00:12:26,760 --> 00:12:33,640
was successful, but
I don't know, it seems just a
208
00:12:33,640 --> 00:12:36,520
little too.
He's really smart.
209
00:12:36,560 --> 00:12:38,160
He's saying really, really smart
things.
210
00:12:39,000 --> 00:12:42,640
He's talking about, for example,
Epstein, which is a uniparty
211
00:12:42,640 --> 00:12:47,040
issue.
And he's talking about ICE and
212
00:12:47,040 --> 00:12:49,960
immigration, which people on the
right and the left are really
213
00:12:49,960 --> 00:12:53,600
not very happy about.
And again, you know, the
214
00:12:53,600 --> 00:12:55,600
question isn't whether or not
there should be immigration
215
00:12:55,600 --> 00:12:56,960
enforcement.
The question isn't whether or
216
00:12:56,960 --> 00:12:59,680
not ICE should be active.
The question just is to what
217
00:12:59,680 --> 00:13:06,760
degree, to
what degree, is the behaviour
218
00:13:06,760 --> 00:13:10,440
justified, acceptable?
And it's really, really hard.
219
00:13:10,440 --> 00:13:12,600
And I've heard, I've heard both
cases that, you know, for
220
00:13:12,600 --> 00:13:16,840
example, cops have to wear
identification badges and ICE
221
00:13:16,840 --> 00:13:23,400
agents don't.
So, I mean, if a cop's
222
00:13:23,400 --> 00:13:27,320
misbehaving, there's a
way to identify them: you could get their
223
00:13:27,320 --> 00:13:30,000
badge number, you could get
their name and look them up.
224
00:13:30,320 --> 00:13:35,360
If an ICE agent who is wearing a
mask and has no recognizable
225
00:13:35,360 --> 00:13:37,600
markers for you to be able to
report any kind of abusive
226
00:13:37,600 --> 00:13:39,880
behavior.
If the ICE agent doesn't have
227
00:13:39,880 --> 00:13:42,600
any of that, it becomes a lot
more difficult to kind of
228
00:13:42,600 --> 00:13:47,400
control and hold them
accountable for potentially bad
229
00:13:47,400 --> 00:13:51,080
behaviour or even behaviour that
is justified.
230
00:13:51,080 --> 00:13:55,600
But you might want to take
a look at the context
231
00:13:55,600 --> 00:13:57,560
of the person and the
story and the character and the
232
00:13:57,560 --> 00:14:00,120
history.
And it's
233
00:14:00,120 --> 00:14:05,240
really, really hard to do.
So anyway, Ro Khanna was
234
00:14:05,280 --> 00:14:08,520
talking about how the elite just
seemed to be getting away with
235
00:14:08,520 --> 00:14:10,760
everything.
And that's the problem
236
00:14:10,760 --> 00:14:12,320
with this
administration.
237
00:14:12,600 --> 00:14:14,880
And I wanted to comment.
And again, I'm going
238
00:14:14,880 --> 00:14:17,040
to reach out.
And obviously, I mean,
239
00:14:17,840 --> 00:14:21,040
overwhelming odds are that it's
not going to get anywhere.
240
00:14:21,040 --> 00:14:23,440
I'll attach a couple of the
episodes that feature the
241
00:14:23,440 --> 00:14:27,000
relatively prominent and
recognizable names of people
242
00:14:27,000 --> 00:14:30,120
that I've spoken with to try to
add some credentials to who I
243
00:14:30,120 --> 00:14:33,280
am.
But I want to reach out and say,
244
00:14:33,280 --> 00:14:37,680
hey, if this is really a uniparty
issue like Epstein, where
245
00:14:37,680 --> 00:14:40,600
almost everyone on
the Senate floor and everyone in
246
00:14:40,600 --> 00:14:44,200
Congress
was happy to pass along
247
00:14:44,520 --> 00:14:48,080
the bill to force
the files and information out.
248
00:14:50,000 --> 00:14:53,240
I think that the
elite behavior is a uniparty
249
00:14:53,240 --> 00:14:55,040
issue.
And by phrasing it,
250
00:14:55,040 --> 00:14:57,800
unfortunately as the problem
with this administration is that
251
00:14:57,800 --> 00:14:59,800
we take advantage from an
executive level.
252
00:15:00,160 --> 00:15:02,760
Well, guess what, pal?
Unfortunately, you kind of lost
253
00:15:02,760 --> 00:15:06,120
playing that card if
you didn't fight back against
254
00:15:06,240 --> 00:15:11,600
a lot of what went on in 2020 to
2024 or again, you know,
255
00:15:11,600 --> 00:15:13,280
2016 to 2020.
I think we've just,
256
00:15:13,280 --> 00:15:16,240
you know, since 9/11, I
mean, since 9/11, we've really
257
00:15:16,240 --> 00:15:22,040
become a lot more authoritarian,
a lot more
258
00:15:22,040 --> 00:15:24,760
government control over the
people, a lot less freedom.
259
00:15:25,280 --> 00:15:29,640
And that actually is going to
lead me to my next point, which
260
00:15:30,040 --> 00:15:33,240
I was so excited about.
This actually got me talking.
261
00:15:33,240 --> 00:15:36,000
This got me excited to record.
And I'm 15 minutes in.
262
00:15:36,000 --> 00:15:37,240
I didn't get started yet, I'm
sorry.
263
00:15:37,680 --> 00:15:40,560
But one of the
points that I really was
264
00:15:40,560 --> 00:15:47,200
thinking about was, OK, so
there's this awesome, awesome,
265
00:15:47,200 --> 00:15:49,360
awesome scientist named Eric
Weinstein.
266
00:15:49,800 --> 00:15:52,920
And,
I've talked about him before.
267
00:15:53,080 --> 00:15:56,520
I'm a huge, huge fan of his.
Every time I hear him or every
268
00:15:56,520 --> 00:15:59,200
time I see that he's done some
kind of podcast or conversation,
269
00:15:59,600 --> 00:16:04,200
I try and listen and
participate to the best that I
270
00:16:04,200 --> 00:16:06,120
can.
He's really, really smart and I
271
00:16:06,120 --> 00:16:08,600
can't keep up with everything he
says all the time.
272
00:16:09,080 --> 00:16:13,840
But one thing that he did say
that was really interesting to
273
00:16:13,840 --> 00:16:17,920
me was that human beings also
essentially kind of work like
274
00:16:17,920 --> 00:16:19,920
LLMs, like large language
models.
275
00:16:19,920 --> 00:16:24,600
And I'm shifting gears right now
from politics more to technology
276
00:16:24,600 --> 00:16:29,280
and more to the way that we get
our information and more to the
277
00:16:30,800 --> 00:16:34,400
ever present issue of the
algorithm.
278
00:16:34,800 --> 00:16:39,400
And I put these kind of thoughts
together as I was trying to
279
00:16:39,400 --> 00:16:42,840
think about what to talk about.
Human beings work very, very
280
00:16:42,840 --> 00:16:46,400
much like LLMs.
And I'll explain, for example,
281
00:16:47,200 --> 00:16:49,240
in a way that hopefully makes
sense.
282
00:16:51,640 --> 00:16:55,360
Our brains are lazy.
Our brains want to be
283
00:16:55,760 --> 00:17:03,640
efficient and effective, and our
brains also want to use the
284
00:17:03,640 --> 00:17:06,920
least amount of effort or energy
in order to be able to
285
00:17:06,920 --> 00:17:11,800
accomplish a mission or a task.
We just, we've evolved to try to
286
00:17:11,800 --> 00:17:14,720
be efficient.
Sometimes we cut too many
287
00:17:14,720 --> 00:17:17,240
shortcuts.
So for example, and this is a
288
00:17:17,240 --> 00:17:19,560
very, very common experience
that I have in my life.
289
00:17:19,560 --> 00:17:21,680
I work with my father and we're
very close.
290
00:17:23,520 --> 00:17:27,319
We speak on the phone 10, 15, 20
times a day for a couple minutes
291
00:17:27,319 --> 00:17:28,960
at a time just dealing with
whatever work issue we're
292
00:17:28,960 --> 00:17:33,200
dealing with or talking about
life or sales or the business or
293
00:17:33,200 --> 00:17:37,760
family or whatever.
And a lot of times we'll, if we
294
00:17:37,760 --> 00:17:41,280
have a disagreement, which we
often do, and I've been trying
295
00:17:41,280 --> 00:17:44,080
to work on this and sometimes I
can't help it and sometimes I'm
296
00:17:44,080 --> 00:17:46,880
better about it.
But if we have an issue or a
297
00:17:46,880 --> 00:17:49,080
disagreement, and I think I know
what he's going to say: he's going to
298
00:17:49,080 --> 00:17:51,760
start by making a point and I
cut him off midpoint.
299
00:17:51,760 --> 00:17:54,320
I'm like, yeah, but and then he
goes, you didn't even hear what
300
00:17:54,320 --> 00:17:58,040
I finished saying.
And my brain, my LLM brain,
301
00:17:58,400 --> 00:18:02,720
finished his thought for him in
a similar way. And it's funny
302
00:18:02,720 --> 00:18:05,320
because it's just a matter of
degrees of how much better it's
303
00:18:05,320 --> 00:18:08,400
getting right.
If you if you remember a couple
304
00:18:08,400 --> 00:18:11,720
of years ago when Apple was
first rolling out or iMessage
305
00:18:11,720 --> 00:18:16,040
was first rolling it out, I
guess dipping a toe in the
306
00:18:16,040 --> 00:18:18,520
water of LLMs, large language
models.
307
00:18:18,840 --> 00:18:22,920
One
of the ways that LLMs work is by
308
00:18:22,920 --> 00:18:26,680
predicting what
the best potential choice of the
309
00:18:26,680 --> 00:18:29,280
next word or feature is going to
be.
310
00:18:29,280 --> 00:18:34,400
So for example, if I was sending
a text with the word sounds,
311
00:18:34,400 --> 00:18:37,040
you know, the recommended word
choice, the potential word to
312
00:18:37,040 --> 00:18:41,680
follow the word sounds
313
00:18:41,800 --> 00:18:45,080
would most likely be good, like
sounds good with a thumbs up, or
314
00:18:45,080 --> 00:18:48,440
sounds awful, or sounds
interesting, right?
315
00:18:48,440 --> 00:18:52,000
And so the LLM
would give you 3 predictions of
316
00:18:52,000 --> 00:18:55,840
words that you're most likely to
use, that follow the
317
00:18:55,840 --> 00:18:58,360
word sounds.
When I cut off my dad and I'm
318
00:18:58,360 --> 00:19:01,560
having the argument with him, I
am essentially guessing what
319
00:19:01,560 --> 00:19:04,240
he's going to say based on my
experience of having dealt with
320
00:19:04,240 --> 00:19:06,800
him before.
I'm not letting him finish his
321
00:19:06,800 --> 00:19:09,440
thought out.
I'm not letting him express
322
00:19:09,440 --> 00:19:12,040
his point.
I am predicting what he's going
323
00:19:12,040 --> 00:19:15,400
to say and am choosing to
respond in real time before I
324
00:19:15,400 --> 00:19:19,480
even have all the context.
I'm choosing to respond in real
325
00:19:19,480 --> 00:19:22,480
time without having all the
information.
326
00:19:22,480 --> 00:19:25,600
And it's a stupid fucking way to
communicate with people.
327
00:19:26,560 --> 00:19:30,520
And I do apologise because I do
it a lot and it makes me a
328
00:19:30,520 --> 00:19:32,320
really, really bad listener,
unfortunately.
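[Editor's note] The next-word prediction described above can be sketched as a toy bigram model in Python. This is a deliberate simplification: real LLMs use neural networks over tokens, and the mini-corpus here is invented purely for illustration.

```python
from collections import Counter, defaultdict

# Invented mini-corpus standing in for a user's past messages.
corpus = ("sounds good to me . sounds great . sounds awful honestly . "
          "sounds good thanks . sounds interesting . sounds good").split()

# Count which word follows each word: a bigram model.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def suggest(word, k=3):
    """Return the k most frequent next words, like the three keyboard suggestions."""
    return [w for w, _ in following[word].most_common(k)]

print(suggest("sounds"))  # "good" ranks first because it follows "sounds" most often
```

The point of the sketch is the same one made above: the model (or the brain) replies with whatever has most often followed the prompt before, not with what the other person is actually about to say.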
329
00:19:33,160 --> 00:19:37,560
You know, if we
do think like LLMs, and
330
00:19:37,560 --> 00:19:41,600
also, you know,
there's only so many words and
331
00:19:41,600 --> 00:19:43,480
conversations and predictive
ways that you can have a
332
00:19:43,480 --> 00:19:45,720
conversation, right?
So you start off with someone,
333
00:19:45,720 --> 00:19:49,440
you say, hey, how you doing?
There's only a limited number of
334
00:19:49,440 --> 00:19:51,760
potential responses.
And then we kind of get thrown
335
00:19:51,760 --> 00:19:52,880
off if there's anything else,
right?
336
00:19:52,880 --> 00:19:54,000
So the answer is usually pretty
good, or
337
00:19:54,000 --> 00:19:57,600
you all right, or another day in
paradise or whatever it is.
338
00:19:57,880 --> 00:20:00,880
And then sometimes someone
throws something off. Like, I'll
339
00:20:01,080 --> 00:20:03,160
usually tell people, oh, I'm
just peachy man.
340
00:20:04,600 --> 00:20:10,480
Sounds gay, but.
Sorry. If, you know, in
341
00:20:10,480 --> 00:20:15,240
the old-school way, I asked
someone what's going on.
342
00:20:15,240 --> 00:20:19,480
There's only a certain amount of
expected potential responses and
343
00:20:19,480 --> 00:20:23,080
any other response throws you
344
00:20:23,080 --> 00:20:26,200
for a loop a little bit.
So you notice when someone says,
345
00:20:26,200 --> 00:20:27,840
hey, how you doing?
Pretty bad actually.
346
00:20:28,320 --> 00:20:30,040
You're like, whoa.
And that person will actually
347
00:20:30,040 --> 00:20:34,000
leave a mark on you because that
was a new addition of a
348
00:20:34,000 --> 00:20:38,760
potential response into your LLM
dictionary or LLM toolbox.
349
00:20:39,000 --> 00:20:40,920
And you kind of keep an eye out
for it.
350
00:20:40,920 --> 00:20:44,720
And if it's an
anomaly, then it's an anomaly
351
00:20:44,720 --> 00:20:46,880
and you end up being surprised
or sometimes you notice it more.
352
00:20:46,880 --> 00:20:49,600
And that's kind of what happens.
For example, if you, and I've
353
00:20:49,600 --> 00:20:51,800
had this experience myself, I
forgot what it's called, but
354
00:20:51,800 --> 00:20:54,480
when you buy, for example, a car
or you lease a car.
355
00:20:54,480 --> 00:20:58,880
So when I got my Kia K5, I had
never heard of the model in my entire
356
00:20:58,880 --> 00:21:02,680
life.
And the second I got the car, I
357
00:21:02,680 --> 00:21:04,200
started seeing them everywhere
on the road.
358
00:21:04,520 --> 00:21:08,360
And I'm like, that is wild.
Like my
359
00:21:08,360 --> 00:21:10,360
brain was not looking for that
detail.
360
00:21:10,680 --> 00:21:13,520
And now that it exists in my
library, I can keep an eye out
361
00:21:13,520 --> 00:21:15,000
for it.
Really, really interesting.
362
00:21:15,400 --> 00:21:22,120
So that's on a very, very surface
level. I'm sure
363
00:21:22,120 --> 00:21:23,640
there's a ton more
depth.
364
00:21:23,640 --> 00:21:25,360
I'm sure there's a ton that I'm
missing.
365
00:21:25,360 --> 00:21:27,160
I'm not claiming to have
everything covered over there,
366
00:21:27,520 --> 00:21:31,200
but I think that there's a
general gist that we tend to,
367
00:21:31,280 --> 00:21:37,600
people tend to work that way.
Our
368
00:21:37,600 --> 00:21:47,640
algorithms are shaping us more
and more every day because they're
369
00:21:47,640 --> 00:21:52,640
constantly reinforcing a
perspective without giving you a
370
00:21:52,640 --> 00:21:56,360
chance to explore other avenues,
overwrite your associations or
371
00:21:56,360 --> 00:21:58,360
be exposed to new perspectives
and new thoughts.
372
00:21:58,880 --> 00:22:03,520
And so what we determine and
deem to be truth or reality is
373
00:22:03,520 --> 00:22:07,480
really just a whole bunch of
reinforced information.
374
00:22:09,280 --> 00:22:12,400
And I've heard people complain
about that quite a bit and I've
375
00:22:12,400 --> 00:22:15,040
heard people talk about how
that's a massive problem and it
376
00:22:15,040 --> 00:22:17,480
is.
But I really haven't heard of
377
00:22:17,480 --> 00:22:21,400
many people offering very many
solutions other than don't do
378
00:22:21,440 --> 00:22:26,760
social media, which I'm an
advocate for.
379
00:22:26,800 --> 00:22:30,720
And just because I advocate for
not doing social media doesn't
380
00:22:30,720 --> 00:22:32,680
mean I'm not addicted to it.
You know, it doesn't mean that I
381
00:22:32,680 --> 00:22:35,720
don't spend a ton of
time on it. It doesn't mean that my
382
00:22:35,720 --> 00:22:37,960
opinions and my perspectives are
not shaped by it.
383
00:22:38,440 --> 00:22:42,160
And in fact, every single time I
try to find myself, I say, oh,
384
00:22:42,160 --> 00:22:44,600
I'm just, you know,
being blasted with
385
00:22:44,600 --> 00:22:50,520
content that is meant to,
you know, put me into a
386
00:22:50,520 --> 00:22:54,000
particular frame of mind or
enforce a particular
387
00:22:54,240 --> 00:22:58,600
perspective.
I look for other content to
388
00:22:58,600 --> 00:23:01,600
balance that out. And, you
know,
389
00:23:01,600 --> 00:23:04,520
then you have to catch yourself.
OK, well, at a certain point,
390
00:23:04,760 --> 00:23:06,760
everything becomes
anti-authoritarian.
391
00:23:06,760 --> 00:23:08,240
At a certain point, everything
is a problem.
392
00:23:08,240 --> 00:23:09,720
What are the good things?
What are the potential good
393
00:23:09,720 --> 00:23:11,400
things?
How am I looking at this in a
394
00:23:11,400 --> 00:23:15,280
way that is both kind of picking
it apart and also potentially
395
00:23:15,280 --> 00:23:19,080
positive?
But it's really hard to do.
396
00:23:19,400 --> 00:23:22,680
And so what I was thinking of
actually, and what I hope to
397
00:23:22,680 --> 00:23:27,560
encourage now is the value of
creating our own algorithms.
398
00:23:27,720 --> 00:23:31,520
And I think that that means that
there's a lot of really cool
399
00:23:31,520 --> 00:23:33,240
ways to create our own
algorithms.
400
00:23:34,440 --> 00:23:43,480
Choosing content, particularly
long form historical, scientific
401
00:23:44,160 --> 00:23:52,440
or narrative based content, I
think is in long form content
402
00:23:52,520 --> 00:23:56,080
like a book or like a long
podcast or an audio book or a
403
00:23:56,080 --> 00:23:59,440
movie or a TV series or whatever
it is.
404
00:24:00,240 --> 00:24:05,240
I think that the long form
content allows us to create
405
00:24:05,240 --> 00:24:14,320
our own algorithms and instead
of us being shaped by what is
406
00:24:14,320 --> 00:24:19,600
being fed to us or presented to
us, we can get up and do
407
00:24:19,600 --> 00:24:21,960
the equivalent of going to the
supermarket and picking out a
408
00:24:21,960 --> 00:24:25,200
healthy meal for ourselves.
So if you're sitting at home and
409
00:24:25,200 --> 00:24:28,880
you get up and you look and you
say, hey, I'm hungry, can you
410
00:24:28,880 --> 00:24:31,080
get me a, you know, and you ask
your sibling, you ask your
411
00:24:31,080 --> 00:24:34,480
parent, you ask your spouse, you
ask your friend, Hey, can you
412
00:24:34,480 --> 00:24:37,800
grab me a snack?
You're pretty much subject to
413
00:24:37,800 --> 00:24:40,640
what they bring you.
If you get up and do it
414
00:24:40,640 --> 00:24:43,520
yourself.
Then you can pick what is
415
00:24:43,520 --> 00:24:45,360
actually going to be healthy.
You could pick something that
416
00:24:45,360 --> 00:24:47,720
you're in the mood for,
it could be something that
417
00:24:47,720 --> 00:24:49,000
you're potentially interested
in.
418
00:24:49,000 --> 00:24:50,680
You want to see how it goes with
other flavors.
419
00:24:51,040 --> 00:24:53,160
And I think the same thing
applies to knowledge and our
420
00:24:53,160 --> 00:24:55,240
understanding of the world and
information that we choose to
421
00:24:55,240 --> 00:24:57,880
consume.
I think that the more that we
422
00:24:57,880 --> 00:25:01,320
use social media, the more the
information is being imposed on
423
00:25:01,320 --> 00:25:04,800
us, and the more that we
choose the information we
424
00:25:04,800 --> 00:25:08,080
consume, the more we're
getting up and choosing what
425
00:25:08,080 --> 00:25:11,560
kind of snacks to have, whether
they're healthy or not, whether
426
00:25:11,560 --> 00:25:12,880
they're good for you, what kind
of mood you're in.
427
00:25:12,880 --> 00:25:15,480
Again, do you need something
that's a little bit salty,
428
00:25:15,480 --> 00:25:18,880
sweet, high calorie, low
calorie, doesn't matter.
429
00:25:19,360 --> 00:25:20,640
There's a context for
everything.
430
00:25:20,640 --> 00:25:21,840
There's a time and place for
everything.
431
00:25:21,840 --> 00:25:27,320
And so what is best for us to do
is to decide that for ourselves.
432
00:25:27,320 --> 00:25:31,720
I think that we think that
because, you know, obviously
433
00:25:31,720 --> 00:25:36,200
our algorithm is responsive to
434
00:25:36,200 --> 00:25:38,840
what catches our eye, but it
doesn't necessarily mean that
435
00:25:38,840 --> 00:25:44,320
it's what we want.
If I walk into a pizza shop and
436
00:25:44,320 --> 00:25:48,920
I that sounds incredible,
doesn't mean that someone should
437
00:25:48,920 --> 00:25:51,200
shove a pizza in my face.
It doesn't mean that
438
00:25:51,200 --> 00:25:52,440
someone should shove a pizza
down my throat.
439
00:25:52,920 --> 00:25:55,360
It just means, whoa, that's
interesting.
440
00:25:55,680 --> 00:25:57,240
I like that.
I don't know if I want it right
441
00:25:57,240 --> 00:26:04,680
now, but I like it, you know,
and I think that, you know, I
442
00:26:04,680 --> 00:26:06,840
myself have been going on this
really, really interesting
443
00:26:06,840 --> 00:26:10,720
journey where I've kind of been
learning about, I haven't been
444
00:26:10,720 --> 00:26:13,600
following broad news, but I've
been taking particular news
445
00:26:13,920 --> 00:26:16,160
stories that I've been hearing
and I've been diving into
446
00:26:16,160 --> 00:26:17,800
understanding what they're
about.
447
00:26:18,040 --> 00:26:21,360
Because it turns out that
there's a Hebrew expression.
448
00:26:21,600 --> 00:26:23,840
I love the expression ein chadash
tachat hashemesh.
449
00:26:24,120 --> 00:26:26,440
There's nothing new
under the sun.
450
00:26:26,440 --> 00:26:28,920
Everything that we think we've
seen, we've seen already.
451
00:26:28,920 --> 00:26:30,760
Everything that we think
is new has already
452
00:26:30,760 --> 00:26:32,440
happened already.
And everything that we thought,
453
00:26:33,520 --> 00:26:35,360
you know, if we haven't seen
something, it doesn't mean that it
454
00:26:35,360 --> 00:26:36,840
hasn't existed in a particular
way.
455
00:26:37,280 --> 00:26:42,800
And that's, I mean, that's my
personal push and plea for
456
00:26:42,800 --> 00:26:46,600
learning about history and
learning about humanity, but I
457
00:26:46,600 --> 00:26:47,960
think that that's important for
everything.
458
00:26:47,960 --> 00:26:52,160
So for example, let's say again,
just to stick with an example
459
00:26:52,160 --> 00:26:56,160
that we started talking about
earlier with the ICE stories and
460
00:26:56,800 --> 00:27:00,840
you know, you can find out
there's so many worlds to learn
461
00:27:00,840 --> 00:27:03,160
about what the context of what's
going on over here is.
462
00:27:03,400 --> 00:27:06,400
What's the relationship between
Minnesota and Somalia?
463
00:27:06,640 --> 00:27:10,360
What's the relationship between
ICE and the citizens?
464
00:27:10,920 --> 00:27:16,720
Was the right always
embracing ICE, and what was ICE
465
00:27:16,720 --> 00:27:19,480
created for?
Was it created to catch
466
00:27:19,840 --> 00:27:22,480
Venezuelan cartel members that
are running around the country?
467
00:27:22,480 --> 00:27:25,760
Was it designed to enforce all
immigration law?
468
00:27:26,080 --> 00:27:28,560
What's the point of that, right?
So, I mean,
469
00:27:28,560 --> 00:27:34,440
those two topics,
Somalia and Minnesota, ICE and
470
00:27:35,320 --> 00:27:39,440
its purpose, are
enormous deep dives that require
471
00:27:39,640 --> 00:27:42,200
a lot of learning.
And you can pick what you want
472
00:27:42,200 --> 00:27:44,160
to learn about it without having
it fed to you.
473
00:27:44,480 --> 00:27:46,480
You can find the books that look
interesting to you.
474
00:27:46,480 --> 00:27:49,040
You can find the articles that
speak to you or the podcast with
475
00:27:49,040 --> 00:27:53,680
the guest or host
that will go into an
476
00:27:53,960 --> 00:27:56,880
understanding of what the
context is.
477
00:27:57,040 --> 00:28:01,200
So that while these events are
unfolding, regardless of what
478
00:28:01,200 --> 00:28:05,200
your opinion is, at least you're
not as short-
479
00:28:05,200 --> 00:28:09,160
sighted, short-memoried as so
many people are.
480
00:28:09,360 --> 00:28:12,320
Where a year and a half ago
we were talking about defending
481
00:28:12,600 --> 00:28:16,320
gun rights and the right to
protest and for free
482
00:28:16,320 --> 00:28:19,040
speech, and now we're shutting
down the exact same thing.
483
00:28:19,280 --> 00:28:28,120
And the more that people,
hopefully, continue to take
484
00:28:28,120 --> 00:28:30,720
their information and the
learning into their own hands
485
00:28:30,720 --> 00:28:33,680
instead of depending on it being
fed to them.
486
00:28:33,680 --> 00:28:36,840
Like me sitting on the couch
waiting for my friend or my kid
487
00:28:36,840 --> 00:28:41,320
or my wife or someone to bring
me a snack, versus getting up and
488
00:28:41,320 --> 00:28:43,520
doing it myself.
There's enormous, enormous
489
00:28:43,520 --> 00:28:46,040
value.
And every now and then, yeah,
490
00:28:46,040 --> 00:28:47,600
it's good to have someone bring
you a cheap snack.
491
00:28:47,600 --> 00:28:49,240
If you're feeling lazy, if you
feel like you don't want to get
492
00:28:49,240 --> 00:28:50,760
up and you don't really care
what you're in the mood for,
493
00:28:50,760 --> 00:28:53,320
you're just hungry, then go
ahead and go check out social
494
00:28:53,320 --> 00:28:55,600
media for a little bit.
But I think it's much, much,
495
00:28:55,600 --> 00:29:00,040
much more productive and much
more valuable to really do
496
00:29:00,040 --> 00:29:02,040
the work yourself and create our
own algorithms.
497
00:29:02,040 --> 00:29:05,520
And then everything
snowballs, and
498
00:29:05,520 --> 00:29:11,520
it's incredible to see how
entangled all of society and all
499
00:29:11,520 --> 00:29:17,000
of culture and all of economy
and all of music and, and all of
500
00:29:17,000 --> 00:29:20,560
film are just so deeply and
tightly correlated.
501
00:29:20,560 --> 00:29:26,200
And I mean, so if
there's a particular aspect of,
502
00:29:26,840 --> 00:29:32,600
let's say, the news that is
interesting to you, do a deep
503
00:29:32,600 --> 00:29:34,880
dive, because along that deep
dive, you're going to find other
504
00:29:34,880 --> 00:29:36,760
rabbit holes and you're going to
learn more and you're going to
505
00:29:36,760 --> 00:29:40,080
understand more.
And you're going to see really
506
00:29:40,280 --> 00:29:45,160
how much things are really
entangled and how events that
507
00:29:45,160 --> 00:29:48,520
feel like they are isolated
508
00:29:48,920 --> 00:29:52,240
really are all just circular.
And it's really, really
509
00:29:52,240 --> 00:29:54,560
interesting.
It's really, really cool and I
510
00:29:54,560 --> 00:29:57,400
really, really encourage people
to learn about this
511
00:29:57,400 --> 00:30:01,480
and to dive in. And this is the
value of democratized education
512
00:30:01,480 --> 00:30:05,040
and this is the value of
democratized information is that
513
00:30:05,280 --> 00:30:09,080
this allows us to fight
propaganda.
514
00:30:10,040 --> 00:30:13,120
But propaganda is always going
to be put out there for better
515
00:30:13,120 --> 00:30:14,680
or for worse.
Sometimes it's good propaganda,
516
00:30:14,680 --> 00:30:19,960
sometimes it's bad propaganda.
But in order for us to maintain
517
00:30:19,960 --> 00:30:23,160
our own perspective, and this is
how we still know that we live
518
00:30:23,160 --> 00:30:24,600
in a free country.
And, you know, even though
519
00:30:24,600 --> 00:30:28,640
people will say, yeah, it's not
free, there are definitely
520
00:30:28,640 --> 00:30:30,560
aspects of society that really,
really aren't.
521
00:30:30,560 --> 00:30:34,400
But you know what, the fact that
there are podcasts with hundreds
522
00:30:34,400 --> 00:30:37,120
of thousands or millions of
listeners that are saying things
523
00:30:37,120 --> 00:30:40,040
that are very, very clearly
against the establishment or the
524
00:30:40,040 --> 00:30:42,120
elite class or whatever it is.
And the fact that they have the
525
00:30:42,120 --> 00:30:44,000
right to do it.
And the fact that
526
00:30:44,000 --> 00:30:45,320
millions of people have access
to it.
527
00:30:45,320 --> 00:30:47,960
And the fact that people are
listening to it and whether
528
00:30:47,960 --> 00:30:50,640
they're doing anything about it
or not is another conversation.
529
00:30:51,080 --> 00:30:54,240
But the fact that it's out there
means that it's inherently free
530
00:30:54,600 --> 00:30:58,440
and we do not want to destroy
that.
531
00:30:58,440 --> 00:31:01,200
We do not want to trample that.
It's a really, really potentially
532
00:31:01,200 --> 00:31:05,120
scary road to go down.
The other topic that
533
00:31:05,120 --> 00:31:07,400
just came to
mind because I came back from
534
00:31:08,200 --> 00:31:14,480
again, another trip to Orlando
and it was really, really a lot
535
00:31:14,480 --> 00:31:22,000
of fun.
I have, excuse me.
536
00:31:22,000 --> 00:31:26,560
What we usually do
nowadays is we get an Airbnb in
537
00:31:26,560 --> 00:31:28,640
Orlando.
And I don't know if you've ever
538
00:31:28,640 --> 00:31:31,840
been to Orlando, but they have
this enormous.
539
00:31:32,360 --> 00:31:36,160
I mean, it's obviously one of
the top hubs for vacations,
540
00:31:37,000 --> 00:31:41,960
conferences, tourism in the
world, or rather in
541
00:31:41,960 --> 00:31:45,920
America, other than potentially
Las Vegas, a couple other
542
00:31:45,920 --> 00:31:47,440
places, but Orlando's really up
there.
543
00:31:47,760 --> 00:31:51,120
And what was really cool. So
Airbnb started off,
544
00:31:51,120 --> 00:31:54,040
even as
recently as like 5 or 6 years
545
00:31:54,040 --> 00:32:01,000
ago, as something that
was essentially a kind of form
546
00:32:01,320 --> 00:32:05,240
of second income for
everyday people that own their
547
00:32:05,240 --> 00:32:07,040
houses.
And that if you lived in a house
548
00:32:07,040 --> 00:32:10,360
that was a little too big, or
you had a back house, or you had
549
00:32:10,360 --> 00:32:12,800
a garage that you can convert
into a private in-law suite, then
550
00:32:12,800 --> 00:32:15,680
you'd rent it out.
Sometimes if you had a vacation
551
00:32:15,680 --> 00:32:17,680
home, you made an investment,
then you would, you know, then
552
00:32:17,680 --> 00:32:19,920
you'd buy a house and you'd rent
it out when you didn't want to
553
00:32:19,920 --> 00:32:21,000
go on vacation.
And then while you're on
554
00:32:21,000 --> 00:32:23,320
vacation, you would you'd be
able to stay there.
555
00:32:24,280 --> 00:32:28,320
And Airbnb was awesome, is
awesome.
556
00:32:28,720 --> 00:32:32,080
I love Airbnb.
And then corporations started
557
00:32:32,080 --> 00:32:34,200
realizing that this is a
phenomenal business model.
558
00:32:34,520 --> 00:32:40,000
And so what corporations,
rather, started to
559
00:32:40,000 --> 00:32:44,560
do was to buy and develop huge
560
00:32:44,560 --> 00:32:46,880
communities.
And this is happening all over
561
00:32:46,880 --> 00:32:50,960
Orlando.
And instead of people renting
562
00:32:50,960 --> 00:32:53,240
out their summer house or
instead of people renting out a
563
00:32:53,240 --> 00:32:56,720
house, you know, the second half
of their house or whatever it
564
00:32:56,720 --> 00:32:59,520
is, the garage back house.
And I've stayed in some really,
565
00:32:59,520 --> 00:33:02,960
really wonderful Airbnbs in
Austin, TX, for example. I
566
00:33:02,960 --> 00:33:05,480
stayed in someone's back house
and it was really, really,
567
00:33:05,640 --> 00:33:08,800
really an awesome experience.
It was cheap, it was spacious.
568
00:33:08,800 --> 00:33:12,320
There's a separate walking path
for us to get into and the host
569
00:33:12,320 --> 00:33:14,840
even though we'd
interact with them, had
570
00:33:14,840 --> 00:33:16,480
thought of everything and were
really, really great.
571
00:33:17,040 --> 00:33:19,720
And then there's another time
where I went to with a couple of
572
00:33:19,720 --> 00:33:25,560
buddies of mine, Sam and Heim, to,
oh, gosh, Newport, RI. I love
573
00:33:25,560 --> 00:33:26,520
Newport.
I've been there a couple of
574
00:33:26,520 --> 00:33:30,800
times and it's such a cool
place, with so much
575
00:33:30,800 --> 00:33:33,160
beautiful nature.
There was the bridge that you
576
00:33:33,160 --> 00:33:35,880
drive over to go to Newport and
I don't even remember what was
577
00:33:35,880 --> 00:33:37,880
going on in my life in
particular at that point.
578
00:33:37,880 --> 00:33:39,920
But I remember driving over that
bridge and thinking about how
579
00:33:39,920 --> 00:33:44,240
beautiful it was and seeing all
the sailboats and the literally
580
00:33:44,240 --> 00:33:46,240
just like sapphire water below
me.
581
00:33:46,560 --> 00:33:51,800
And I just, I, I got emotional.
I was tearing or crying or
582
00:33:52,280 --> 00:33:55,480
whatever. But really,
Newport's a really, really
583
00:33:55,480 --> 00:33:57,720
beautiful.
That part of Rhode Island in the
584
00:33:57,720 --> 00:33:59,520
country is really, really
remarkable.
585
00:33:59,880 --> 00:34:06,600
But we stayed at an Airbnb there
and it was in someone's back
586
00:34:06,600 --> 00:34:08,920
house, right?
So you can go through
587
00:34:08,920 --> 00:34:11,360
the front or you can go in on
the back and you climb up the
588
00:34:11,800 --> 00:34:15,280
back stairs up to the deck
on the second story of the
589
00:34:15,280 --> 00:34:18,120
house.
And there you had a big, it was
590
00:34:18,120 --> 00:34:21,040
a big loft area that they
converted into a bedroom with a
591
00:34:21,040 --> 00:34:24,000
couple of beds and a shower in the
bathroom, and it's closed off from
592
00:34:24,000 --> 00:34:25,960
the rest of the house.
But the owners that were living
593
00:34:25,960 --> 00:34:28,520
there were very, very warm and
very sweet and very nice.
594
00:34:28,840 --> 00:34:31,760
And they had a little garden.
And during the morning, they
595
00:34:31,760 --> 00:34:33,560
just knocked on the door and
said, hey, we have some, you
596
00:34:33,560 --> 00:34:36,120
know, some tomatoes and
cucumbers.
597
00:34:36,120 --> 00:34:37,560
And if there's anything you
need, just let me know.
598
00:34:37,560 --> 00:34:42,520
We'll help take care of you.
And then fast forward
599
00:34:42,520 --> 00:34:47,360
five years from now, you go to
Orlando and you go to Airbnb and
600
00:34:48,800 --> 00:34:51,719
you go to a house that's owned
by a corporation that makes you
601
00:34:51,719 --> 00:34:55,560
sign this form, this liability,
not liability form, but like
602
00:34:55,600 --> 00:34:59,080
rules form of procedures
that you have to follow.
603
00:34:59,480 --> 00:35:04,040
And the bathrooms have these
little like mini shampoos from
604
00:35:04,040 --> 00:35:05,960
the hotels and stuff like that.
And they give you the little
605
00:35:05,960 --> 00:35:10,200
hotel towels and thin sheets.
And, and I'm not complaining
606
00:35:10,280 --> 00:35:12,160
because the houses are
remarkable.
607
00:35:12,360 --> 00:35:17,280
The only thing I'm complaining
about is the fact that this is a
608
00:35:17,280 --> 00:35:22,240
common thread that's been going
on as America's evolved and
609
00:35:22,240 --> 00:35:26,360
as the country has become a
lot more geared
610
00:35:26,800 --> 00:35:31,160
towards corporate takeover.
Where you have an
611
00:35:31,160 --> 00:35:35,240
idea and you have a concept or a
business that starts off really,
612
00:35:35,240 --> 00:35:39,360
really fun, individualized,
personalized and for the
613
00:35:39,360 --> 00:35:42,960
everyday person.
And once it scales into a
614
00:35:42,960 --> 00:35:48,000
business model, and I
don't even blame the people for
615
00:35:48,000 --> 00:35:50,880
trying to make money, then
corporations, hedge funds,
616
00:35:51,080 --> 00:35:54,920
investment banks, they see the
opportunity for revenue and
617
00:35:54,920 --> 00:35:57,280
they'll take it.
And I
618
00:35:57,280 --> 00:35:59,560
genuinely do not blame them for
it.
619
00:35:59,920 --> 00:36:03,280
I just wish that it kept a
little bit more of the homey
620
00:36:03,280 --> 00:36:06,640
feel and a little bit more
authentic and customer based.
621
00:36:06,680 --> 00:36:10,920
But it doesn't surprise me
because I think that it feels to
622
00:36:10,920 --> 00:36:15,560
me that a lot is corporatized,
including people's personalities
623
00:36:15,560 --> 00:36:16,720
these days.
It's so funny.
624
00:36:16,720 --> 00:36:20,400
I've seen that
625
00:36:20,400 --> 00:36:23,840
LinkedIn gets a lot of
hate, even though I think it's a
626
00:36:23,840 --> 00:36:27,640
really nice platform.
But one of the reasons why is
627
00:36:27,640 --> 00:36:32,000
because it kind of corporatized
all humanity and it turned
628
00:36:32,000 --> 00:36:39,440
everyone into these very polite
robots that write the same way.
629
00:36:39,440 --> 00:36:44,520
I was down on my luck and, you
know, several paragraphs below
630
00:36:44,520 --> 00:36:45,800
it where you have to click see
more.
631
00:36:46,080 --> 00:36:48,240
And then I didn't give up.
632
00:36:49,040 --> 00:36:53,040
And then my luck changed.
Keep hustling, you know, And
633
00:36:53,160 --> 00:36:55,920
that's every LinkedIn post, just
in another variation one way or
634
00:36:55,920 --> 00:36:58,320
the other.
I got fired from the job.
635
00:36:59,120 --> 00:37:00,920
But I know that tomorrow's going
to be a brighter day.
636
00:37:01,320 --> 00:37:05,440
And yeah, I mean, that's cool,
but it also kind of feels
637
00:37:05,600 --> 00:37:10,480
robotic and not real. And film,
I've complained, I've
638
00:37:10,480 --> 00:37:11,760
complained.
I've talked to people, I've
639
00:37:11,760 --> 00:37:14,560
observed and talked to a bunch
of guests recently, especially
640
00:37:14,560 --> 00:37:19,680
in the entertainment world about
the film industry.
641
00:37:20,760 --> 00:37:24,200
And this is again, I'm not, this
isn't a, this isn't a blame
642
00:37:24,200 --> 00:37:26,720
game.
I'm not trying to blame people
643
00:37:26,720 --> 00:37:29,280
or say this is all a problem or
I understand where this is
644
00:37:29,280 --> 00:37:33,040
coming from.
Film studios, streaming studios
645
00:37:33,040 --> 00:37:35,480
and services, they have to make
money.
646
00:37:35,480 --> 00:37:36,880
I get that they have to make
money.
647
00:37:36,880 --> 00:37:39,120
The question just is,
648
00:37:39,120 --> 00:37:46,400
Where are you going to trim the
fat from in order to produce
649
00:37:46,480 --> 00:37:49,120
profits?
And if that fat trimming is
650
00:37:49,120 --> 00:37:53,000
customer based, then you're
going to have a push back from
651
00:37:53,000 --> 00:37:55,440
the customer base.
652
00:37:55,520 --> 00:37:58,440
And that's why actually.
So this is what I think is
653
00:37:58,440 --> 00:37:59,720
interesting.
And what I've heard
654
00:37:59,920 --> 00:38:03,480
anecdotally from several Amazon
employees that I know personally
655
00:38:03,480 --> 00:38:05,960
have spoken to throughout the
years is that Amazon is known to
656
00:38:05,960 --> 00:38:08,280
be a very difficult place to
work, even though they obviously
657
00:38:08,280 --> 00:38:10,600
care tremendously about the
customer.
658
00:38:10,920 --> 00:38:12,960
And so where they'll
trim their
659
00:38:12,960 --> 00:38:19,920
quote-unquote fat to make
profits might be on their
660
00:38:20,240 --> 00:38:22,600
employee end as opposed to the
customer's end.
661
00:38:23,280 --> 00:38:26,160
And so for all the customers
that are happy that Amazon has
662
00:38:26,160 --> 00:38:28,800
no questions asked returns, that
might just mean that their
663
00:38:28,800 --> 00:38:31,040
employees are getting paid less
664
00:38:31,040 --> 00:38:33,560
because they have to figure out
where to make
665
00:38:33,560 --> 00:38:35,240
that difference.
And by the way, I don't even
666
00:38:35,240 --> 00:38:37,680
know if this is the case.
I'm literally just speculating,
667
00:38:37,680 --> 00:38:39,880
making this up.
But the point
668
00:38:39,880 --> 00:38:42,680
is not in the example that I'm
giving, but in the
669
00:38:42,960 --> 00:38:45,760
general gist of the idea that
I'm portraying.
670
00:38:46,200 --> 00:38:51,960
And that is that I don't fault
you for having to be responsible
671
00:38:52,120 --> 00:38:59,200
financially and to make money.
I fault two things. #1, having it
672
00:38:59,200 --> 00:39:02,520
coming out of the
customer experience, and then #2
673
00:39:02,520 --> 00:39:05,280
expecting to be bailed out on a
massive level.
674
00:39:05,280 --> 00:39:09,800
Like with the government. When big
airlines or big banks or, again,
675
00:39:10,000 --> 00:39:13,040
big entertainment companies
start falling apart and the
676
00:39:13,040 --> 00:39:15,360
government saves them if they're
falling apart, there's a reason
677
00:39:15,360 --> 00:39:19,040
for that.
That's capitalism and to save
678
00:39:19,040 --> 00:39:24,440
them is such an issue because it
ignores the rules of what
679
00:39:24,440 --> 00:39:27,920
people want.
And I mean, that's
680
00:39:27,920 --> 00:39:30,000
something that's also really,
really interesting as far as
681
00:39:30,000 --> 00:39:32,080
democracy goes.
If you're going to be a
682
00:39:32,080 --> 00:39:35,920
capitalist, most, you know,
mostly capitalistic democracy.
683
00:39:36,960 --> 00:39:39,640
Companies shouldn't get bailed
out by the government if they go
684
00:39:39,640 --> 00:39:41,760
out of business or if they fail
because they're failing for a
685
00:39:41,760 --> 00:39:43,000
reason.
And that's because they're not
686
00:39:43,000 --> 00:39:47,080
doing a good job.
So by keeping them afloat,
687
00:39:47,080 --> 00:39:49,480
you're incentivizing them
to continue mistreating
688
00:39:49,480 --> 00:39:52,480
customers.
And that to me is much more of
689
00:39:52,480 --> 00:39:57,560
an issue than anything else.
So these, you know, and I'm in
690
00:39:57,560 --> 00:40:01,040
these Airbnb communities and
I've spent, I mean, I've spent a
691
00:40:01,040 --> 00:40:03,560
lot of time with them and I go
to them quite a bit because at
692
00:40:03,560 --> 00:40:04,920
the end of the day, they're
convenient.
693
00:40:05,160 --> 00:40:07,880
They're huge houses for really
cheap and they're really good
694
00:40:07,880 --> 00:40:10,840
for my family and for
my kids to run around, and
695
00:40:10,840 --> 00:40:12,320
it really does feel like a good
getaway.
696
00:40:12,320 --> 00:40:14,880
And so I am happy to go and I am
happy to pay up for it.
697
00:40:16,720 --> 00:40:19,960
But when you walk down the
block, or you're going
698
00:40:19,960 --> 00:40:22,160
for a drive?
And you see for miles and miles
699
00:40:22,160 --> 00:40:25,320
on end the exact same houses
that are all sitting empty, that
700
00:40:25,320 --> 00:40:29,680
are all built the same way, the
same style, cut and paste, copy,
701
00:40:29,680 --> 00:40:32,880
paste, copy, paste, copy, paste.
You feel like you're missing a
702
00:40:32,880 --> 00:40:34,040
little something.
You feel like you're missing a
703
00:40:34,040 --> 00:40:36,720
little bit of soul.
Till next time, guys. We got a
704
00:40:36,720 --> 00:40:37,880
bunch of episodes coming up this
week.
705
00:40:38,320 --> 00:40:38,440
Bye.
10 different accounts, 10
44
00:02:50,880 --> 00:02:53,160
witnesses, you have 10 different
accounts of what's going on.
45
00:02:53,160 --> 00:02:57,600
And you also have unfortunately,
interpretation, which is, which
46
00:02:57,600 --> 00:03:03,440
is when people, when people
witness something and because,
47
00:03:03,520 --> 00:03:06,720
you know, we're only
experiencing, I would say we,
48
00:03:06,720 --> 00:03:09,560
we're, we're getting a
particular type of input we're
49
00:03:09,560 --> 00:03:13,240
getting, we're viewing something
and our brain is making sense of
50
00:03:13,240 --> 00:03:15,000
it in the context of what we
already know.
51
00:03:15,280 --> 00:03:19,480
So when we've witnessed events
like things that obviously most
52
00:03:19,480 --> 00:03:24,240
hot topic these days are going
on in Minnesota, when we
53
00:03:24,240 --> 00:03:29,040
witnessed these events, three
people, five people, 1000 people
54
00:03:29,040 --> 00:03:32,600
could look at the pictures and
the videos that are going on
55
00:03:32,600 --> 00:03:36,680
that are making the way through
online and through corporate or
56
00:03:36,680 --> 00:03:41,120
formal media.
And we all interpret it in the
57
00:03:41,120 --> 00:03:43,520
context of the way of the
picture that's already in our
58
00:03:43,520 --> 00:03:45,640
mind of what we presume is the
case.
59
00:03:46,040 --> 00:03:52,120
And then there are the ripples
that actually flow out of into
60
00:03:52,120 --> 00:03:56,280
society about what happens as a
consequence of it.
61
00:03:56,320 --> 00:03:59,120
And you know, we obviously know,
for example, that some of the
62
00:03:59,120 --> 00:04:02,360
most consequential, you know,
you have small scale events.
63
00:04:02,360 --> 00:04:05,640
Like I said, Charlie Manson was
horrifying, but relatively small
64
00:04:05,640 --> 00:04:07,400
scale relative to something like
911.
65
00:04:07,680 --> 00:04:11,560
And the scale obviously impacts
the scale of the ripple effect
66
00:04:11,560 --> 00:04:14,880
in the reaction.
So I think that William Mann did
67
00:04:14,920 --> 00:04:17,519
a really, really amazing job
expressing his points.
68
00:04:18,000 --> 00:04:19,320
It's worth checking out his
episode.
69
00:04:19,320 --> 00:04:21,000
It's worth checking out his
writing, his books.
70
00:04:21,200 --> 00:04:23,400
He's written on a bunch of
different really cool Hollywood
71
00:04:23,560 --> 00:04:26,280
characters and families and even
political families like the
72
00:04:26,280 --> 00:04:29,600
Roosevelts.
And I think it was the
73
00:04:29,600 --> 00:04:31,600
Roosevelts I'm going to get, I'm
going to feel bad if I got that
74
00:04:31,600 --> 00:04:35,080
wrong.
Yeah, the Roosevelt, I was
75
00:04:35,120 --> 00:04:38,880
right, and it's worth checking
it out.
76
00:04:40,840 --> 00:04:51,560
I was trying to think about what
to talk about and I didn't want
77
00:04:51,560 --> 00:04:54,560
to kind of, I didn't want to
dwell too much on the Minnesota
78
00:04:54,560 --> 00:04:56,080
situation.
I kind of have to talk.
79
00:04:56,080 --> 00:05:00,000
I feel like I have to talk about
it because A, it's relevant and
80
00:05:00,120 --> 00:05:06,480
B, it does have real life,
again, implications and
81
00:05:06,480 --> 00:05:08,640
consequences.
And it's really, really
82
00:05:08,640 --> 00:05:12,480
interesting just to sit on the
side and observe the voices and
83
00:05:12,480 --> 00:05:17,160
the reactions that's going on.
And I'm not, What's funny is I'm
84
00:05:17,160 --> 00:05:21,160
not really tuned into what the
most prominent voices are saying
85
00:05:21,160 --> 00:05:23,200
because I don't really listen to
the most prominent voices.
86
00:05:25,240 --> 00:05:27,000
I just they're not, I'm not in
that.
87
00:05:27,000 --> 00:05:28,920
They're not in my algorithm.
I'm not exposed to their
88
00:05:28,920 --> 00:05:35,680
information on my feed.
But this is kind of funny.
89
00:05:36,000 --> 00:05:39,320
I mean, not nothing about the
nothing about the conflict in
90
00:05:39,320 --> 00:05:43,800
particular is funny.
But what's funny is how just a
91
00:05:43,800 --> 00:05:46,920
couple of years ago the right
was complaining about freedom of
92
00:05:46,920 --> 00:05:54,960
speech and right to bear arms.
And now the left has been making
93
00:05:54,960 --> 00:05:59,240
that very, very same argument.
And the right is, or at least
94
00:05:59,480 --> 00:06:02,480
not all the right, but at least
the government representation of
95
00:06:02,480 --> 00:06:06,160
the right has been pretty
dramatically pushing back,
96
00:06:06,480 --> 00:06:08,480
almost in the same sense that
you're really seeing the
97
00:06:08,480 --> 00:06:11,640
pendulum swinging.
It's so funny because people
98
00:06:11,680 --> 00:06:14,800
would say a couple of years ago,
yeah, well, in the 60s, you
99
00:06:14,800 --> 00:06:17,240
know, I really had more in
common with the liberal if
100
00:06:17,240 --> 00:06:19,520
someone was Republican, rather
if someone was to the right and
101
00:06:19,520 --> 00:06:21,760
they say, yeah, in the 60s or
70s, I really probably would
102
00:06:21,760 --> 00:06:26,320
have had much more in common
with the modern Democrats, with
103
00:06:26,320 --> 00:06:28,560
Democrats then then modern
Democrats.
104
00:06:28,880 --> 00:06:33,400
And, you know, it's really,
really, it just shows how
105
00:06:33,400 --> 00:06:36,680
extreme everyone's become
because we're pro free speech
106
00:06:36,680 --> 00:06:39,120
and we do want limited war.
We don't want a lot of war.
107
00:06:39,320 --> 00:06:43,320
And we do want to control
enormous organizations, whether
108
00:06:43,320 --> 00:06:47,120
it's corporations or government.
And you're like, wow, that
109
00:06:47,120 --> 00:06:48,360
sounds, you know, really
similar.
110
00:06:48,360 --> 00:06:52,960
But but then you have like
literally just in a span of like
111
00:06:53,280 --> 00:06:56,320
a year and a half, like you
have, you have people
112
00:06:56,320 --> 00:06:58,760
complaining about the fact that
one of Biden's the first thing
113
00:06:58,760 --> 00:07:01,040
Biden did when he got into
office was go to the social
114
00:07:01,040 --> 00:07:04,400
media companies and start to and
start to silence people that
115
00:07:04,400 --> 00:07:08,840
would, for example, push back on
COVID, on the COVID vaccine or
116
00:07:09,160 --> 00:07:12,520
on January 6th or really on any
kind of political issue that
117
00:07:12,520 --> 00:07:15,400
became a matter of politics and
policy.
118
00:07:17,480 --> 00:07:19,440
And it takes a year.
That's all it took.
119
00:07:19,680 --> 00:07:22,160
Took a year and a half for
everyone that was just screaming
120
00:07:22,160 --> 00:07:25,320
on the right to that we need
free speech and the right to
121
00:07:25,320 --> 00:07:27,520
bear arms.
And then you have Cash Patel
122
00:07:27,520 --> 00:07:29,800
walking around now.
I saw a video of him saying,
123
00:07:29,800 --> 00:07:33,400
well, you can't walk around with
a gun and come to a protest.
124
00:07:34,640 --> 00:07:39,920
Imagine three years ago, imagine
2020, imagine if someone that
125
00:07:39,920 --> 00:07:44,440
was on the right went to a Black
Lives Matter riot or not riot, I
126
00:07:44,440 --> 00:07:45,600
apologize.
Let's say protest.
127
00:07:45,600 --> 00:07:48,080
Let's say it is now a protest.
It is peaceful.
128
00:07:48,680 --> 00:07:51,720
I've actually seen a bunch of
actually when I was, I was
129
00:07:51,720 --> 00:07:58,280
living in Seattle in 2020 and
listen, it was fascinating to
130
00:07:58,280 --> 00:08:05,520
experience all the protests and
Chaz and you know, all the I
131
00:08:05,520 --> 00:08:08,640
didn't really see, I never
personally saw any violence.
132
00:08:08,640 --> 00:08:10,360
I never heard any.
I never experienced any
133
00:08:10,360 --> 00:08:12,800
violence.
I never heard anyone escalating
134
00:08:12,800 --> 00:08:15,720
anything beyond.
So you know, social justice
135
00:08:15,800 --> 00:08:17,680
chance, which everyone's
entitled to do.
136
00:08:17,880 --> 00:08:21,640
But just imagine for a minute
that you have a story of a Black
137
00:08:21,640 --> 00:08:27,680
Lives Matter protest and there's
someone in the crowd that is
138
00:08:28,560 --> 00:08:31,080
protest protesting, you know,
anti protesting.
139
00:08:31,360 --> 00:08:34,559
And he's standing on the other
side and he has a weapon in his
140
00:08:34,559 --> 00:08:36,480
pocket.
He doesn't take it out, he
141
00:08:36,480 --> 00:08:38,840
doesn't use it.
He just has a weapon and he's
142
00:08:38,840 --> 00:08:40,799
legally loud.
He is a carrying permit or
143
00:08:40,799 --> 00:08:43,919
whatever it is.
And, and for whatever reason,
144
00:08:44,039 --> 00:08:48,320
someone, you know, he gets, he
gets detained or shot or hurt or
145
00:08:48,320 --> 00:08:53,640
killed because he had a weapon,
the right would flip out.
146
00:08:53,640 --> 00:08:56,240
We have, you know, we, we have,
we need rights.
147
00:08:56,240 --> 00:08:58,520
We had it's a Second Amendment
we have to overthrow.
148
00:08:58,800 --> 00:09:00,720
If you're able to overthrow a
government that's going to
149
00:09:00,720 --> 00:09:04,200
become a dictatorship or, or
communist or fascist or, you
150
00:09:04,200 --> 00:09:06,800
know, filling the filling the
hot topic word that anyone is
151
00:09:06,800 --> 00:09:08,800
throwing out.
But it really just took a year
152
00:09:08,800 --> 00:09:10,600
and a half for all the tables to
just flip.
153
00:09:11,000 --> 00:09:14,240
And it's incredible how quickly
it went.
154
00:09:14,240 --> 00:09:21,040
And it's incredible how quickly
people, when you point out the
155
00:09:21,040 --> 00:09:25,160
inconsistency to them, as I've
tried to do with some people
156
00:09:25,160 --> 00:09:29,120
that I know where the answer
becomes, yeah, but.
157
00:09:29,480 --> 00:09:32,200
And that, yeah, but is the exact
same answer and the exact same
158
00:09:32,200 --> 00:09:36,840
perspective that the left had
for the woke years and for
159
00:09:36,840 --> 00:09:42,920
20/20/2024.
And it we, it's not great.
160
00:09:42,920 --> 00:09:47,160
It's not a great look and it's
not a great look because I think
161
00:09:47,160 --> 00:09:52,840
that people today are more done
with partisan politics than
162
00:09:52,840 --> 00:09:57,200
ever.
And I think that people today
163
00:09:57,560 --> 00:10:03,160
are really seeing that they're
seeing everyone's
164
00:10:03,160 --> 00:10:06,080
inconsistencies.
I, I, I saw actually today on,
165
00:10:06,120 --> 00:10:09,240
on on X and I had a whole
response ran out that I didn't
166
00:10:09,240 --> 00:10:12,600
write yet because I think I
might just send it personally to
167
00:10:13,320 --> 00:10:16,480
try to reach out personally.
I saw that Ro Kahana, who's a
168
00:10:16,600 --> 00:10:18,840
California governor that's been
working with, I think he's a
169
00:10:18,840 --> 00:10:22,880
governor working with Thomas
Massie and he's, and I heard him
170
00:10:22,880 --> 00:10:25,440
on Sean Ryan show.
I heard him on Theo Von.
171
00:10:26,280 --> 00:10:31,200
And I have to say I was overall
extremely, extremely impressed
172
00:10:31,200 --> 00:10:33,280
with a couple of things about
him.
173
00:10:34,000 --> 00:10:36,440
Obviously he's very eloquent.
Obviously he's a politician.
174
00:10:36,440 --> 00:10:40,960
He's supposed to be eloquent.
One thing that he did say, and
175
00:10:40,960 --> 00:10:44,760
it's so upsetting because it's
hard to believe, especially
176
00:10:44,760 --> 00:10:48,960
after all the excitement that I
had a year and a half ago with,
177
00:10:48,960 --> 00:10:53,280
with all the, with the, with the
huge MAGA wave with Trump and
178
00:10:53,280 --> 00:10:57,120
Elon and RFK and Tulsi and you
know, and all the people that
179
00:10:57,120 --> 00:10:59,120
everyone is really excited about
people haven't heard from them
180
00:10:59,120 --> 00:11:01,440
in a long time.
You know, Elon, Elon's fallout
181
00:11:01,440 --> 00:11:04,720
with Musk, which I'm sorry,
Elon's fallout with Trump, which
182
00:11:05,360 --> 00:11:07,480
looks like it's kind of being
rectified a little bit.
183
00:11:07,840 --> 00:11:10,720
RFKI heard the best call.
I think it was Tim Dylan was so
184
00:11:10,720 --> 00:11:12,280
funny.
He's the least controversial
185
00:11:12,280 --> 00:11:15,560
member of the cabinet at this
point, which, you know, go
186
00:11:15,560 --> 00:11:17,880
figure.
You know, everyone was pushing
187
00:11:17,880 --> 00:11:21,200
back the hardest against him and
and he just kind of quietly
188
00:11:21,200 --> 00:11:23,440
minding his own business and
occasionally confessing to
189
00:11:23,640 --> 00:11:25,720
extremely bizarre things on
social media.
190
00:11:26,200 --> 00:11:31,280
And yeah.
And then and Tulsi, I mean, no
191
00:11:31,280 --> 00:11:33,920
one really heard from her.
And the truth was that that she
192
00:11:33,920 --> 00:11:36,680
was supposed to be the
representation of the anti
193
00:11:36,680 --> 00:11:40,520
imperial America, the the empire
perspective.
194
00:11:40,560 --> 00:11:43,080
And it looks like she kind of
got shunned aside a little bit.
195
00:11:43,400 --> 00:11:48,360
But so I think that a lot of
people include my myself
196
00:11:48,360 --> 00:11:49,920
included, we're really, really
hopeful.
197
00:11:49,920 --> 00:11:53,320
And now we're not.
And then and, and so Rocahana
198
00:11:53,320 --> 00:11:55,400
said something really
interesting, which kind of
199
00:11:55,400 --> 00:11:59,120
piqued my interest, even though
I'm skeptical just because and
200
00:11:59,720 --> 00:12:03,920
you know, I, I, I know that this
is potentially A fallacy, but I
201
00:12:03,920 --> 00:12:06,560
think that it's also, I, I think
that enough history has proven
202
00:12:06,560 --> 00:12:12,640
that it could be an issue is a,
the guys from California, which,
203
00:12:12,800 --> 00:12:15,640
you know, which is the governor
of California, which makes me
204
00:12:15,640 --> 00:12:21,640
think that he and I know that he
has a much more grounded story
205
00:12:21,640 --> 00:12:24,160
that he failed, I think, two or
three times to get elected.
206
00:12:24,160 --> 00:12:26,760
And I think, I think on his
third or fourth try he got, he
207
00:12:26,760 --> 00:12:33,640
got, he, he was successful, but
I don't know, seems just a
208
00:12:33,640 --> 00:12:36,520
little too.
He's really smart.
209
00:12:36,560 --> 00:12:38,160
He's saying really, really smart
things.
210
00:12:39,000 --> 00:12:42,640
He's talking about, for example,
Epstein, which is a uni party
211
00:12:42,640 --> 00:12:47,040
issue.
And he's talking about ICE and
212
00:12:47,040 --> 00:12:49,960
immigration, which people on the
right and the left are really
213
00:12:49,960 --> 00:12:53,600
not very happy about.
And again, you know, the
214
00:12:53,600 --> 00:12:55,600
question isn't whether or not
there should be immigration
215
00:12:55,600 --> 00:12:56,960
enforce.
The question isn't whether or
216
00:12:56,960 --> 00:12:59,680
not ICE should be active.
The question just is to what
217
00:12:59,680 --> 00:13:06,760
degree the behaviour is what to
what degree is the behaviour
218
00:13:06,760 --> 00:13:10,440
justified, acceptable?
And it's really, really hard.
219
00:13:10,440 --> 00:13:12,600
And I've heard, I've heard both
cases that, you know, for
220
00:13:12,600 --> 00:13:16,840
example, cops have to wear badge
identification badges and ICE
221
00:13:16,840 --> 00:13:23,400
agents don't.
So if I mean really there's no
222
00:13:23,400 --> 00:13:27,320
way to identify if a cop's
misbehaving, you could get their
223
00:13:27,320 --> 00:13:30,000
badge number, you could get
their name and look them up.
224
00:13:30,320 --> 00:13:35,360
If an ICE agent who is wearing a
mask and has no recognizable
225
00:13:35,360 --> 00:13:37,600
signs for you to be able to
report any kind of abuse of
226
00:13:37,600 --> 00:13:39,880
behavior.
If the ICE agent doesn't have
227
00:13:39,880 --> 00:13:42,600
any of that, it becomes a lot
more difficult to kind of
228
00:13:42,600 --> 00:13:47,400
control and hold them
accountable for potentially bad
229
00:13:47,400 --> 00:13:51,080
behaviour or even behaviour that
is justified.
230
00:13:51,080 --> 00:13:55,600
But that you might want to take
some some look into the context
231
00:13:55,600 --> 00:13:57,560
of the, of the person and the
story and the character in the
232
00:13:57,560 --> 00:14:00,120
history.
And it's really, really, it's
233
00:14:00,120 --> 00:14:05,240
really, really hard to do.
So anyway, so Rocahana was
234
00:14:05,280 --> 00:14:08,520
talking about how the elite just
seemed to be getting away with
235
00:14:08,520 --> 00:14:10,760
everything.
And this, that's the problem
236
00:14:10,760 --> 00:14:12,320
with this, with this
administration.
237
00:14:12,600 --> 00:14:14,880
And I wanted to comment.
And I want, and again, I'm going
238
00:14:14,880 --> 00:14:17,040
to, I'm going to reach out.
And obviously, I mean,
239
00:14:17,840 --> 00:14:21,040
overwhelming odds are that it's
not going to get anywhere.
240
00:14:21,040 --> 00:14:23,440
I'll attach a couple of the
episodes that feature the
241
00:14:23,440 --> 00:14:27,000
relatively prominent and
recognizable names of people
242
00:14:27,000 --> 00:14:30,120
that I've spoken with to try to
add some credentials to who I
243
00:14:30,120 --> 00:14:33,280
am.
But I want to reach out and say,
244
00:14:33,280 --> 00:14:37,680
hey, if this is really uni party
issue like Epstein is where
245
00:14:37,680 --> 00:14:40,600
everyone on almost everyone on
the Senate floor and everyone in
246
00:14:40,600 --> 00:14:44,200
Congress was going to be able to
was happy to pass along this
247
00:14:44,520 --> 00:14:48,080
this bill to force that force
the files and information out.
248
00:14:50,000 --> 00:14:53,240
I think that I think that the
elite behavior is a uni party
249
00:14:53,240 --> 00:14:55,040
issue.
And by you phrasing it
250
00:14:55,040 --> 00:14:57,800
unfortunately as the problem
with this administration is that
251
00:14:57,800 --> 00:14:59,800
we take advantage from an
executive level.
252
00:15:00,160 --> 00:15:02,760
Well, guess what, pal?
Unfortunately, you kind of lost
253
00:15:02,760 --> 00:15:06,120
playing that card if you didn't
if you didn't fight back against
254
00:15:06,240 --> 00:15:11,600
a lot of what went on in 2020 to
2024 or again, you know,
255
00:15:11,600 --> 00:15:13,280
20/16/2020.
I think we just came.
256
00:15:13,280 --> 00:15:16,240
We, you know, since 9:11, I
mean, since 9:11, we've really
257
00:15:16,240 --> 00:15:22,040
become very, very authoritarian,
a more authoritarian, a lot more
258
00:15:22,040 --> 00:15:24,760
government control over the
people, a lot less freedom.
259
00:15:25,280 --> 00:15:29,640
And that actually is going to
lead me to my next point, which
260
00:15:30,040 --> 00:15:33,240
I was so excited.
This actually got me talking.
261
00:15:33,240 --> 00:15:36,000
This got me excited to record.
And I'm 15 minutes in.
262
00:15:36,000 --> 00:15:37,240
I didn't get started yet, I'm
sorry.
263
00:15:37,680 --> 00:15:40,560
But the point that one of the
points that I really was
264
00:15:40,560 --> 00:15:47,200
thinking about was, OK, so
there's awesome, awesome,
265
00:15:47,200 --> 00:15:49,360
awesome scientist named Eric
Weinstein.
266
00:15:49,800 --> 00:15:52,920
And he one of the things, and
I've talked about him before.
267
00:15:53,080 --> 00:15:56,520
I'm a huge, huge fan of his.
Every time I hear him or every
268
00:15:56,520 --> 00:15:59,200
time I see that he's done some
kind of podcast or conversation,
269
00:15:59,600 --> 00:16:04,200
I try, I try and listen and
participate to the best that I
270
00:16:04,200 --> 00:16:06,120
can.
He's really, really smart and I
271
00:16:06,120 --> 00:16:08,600
can't keep up with everything he
says all the time.
272
00:16:09,080 --> 00:16:13,840
But one thing that he did say
that was really interesting to
273
00:16:13,840 --> 00:16:17,920
me was that human beings also
essentially kind of work like
274
00:16:17,920 --> 00:16:19,920
LLMS, like large language
models.
275
00:16:19,920 --> 00:16:24,600
And I'm shifting gears right now
from politics more to technology
276
00:16:24,600 --> 00:16:29,280
and more to the way that we get
our information and more to the
277
00:16:30,800 --> 00:16:34,400
ever present issue of the
algorithm.
278
00:16:34,800 --> 00:16:39,400
And I put these kind of thoughts
together as I was trying to
279
00:16:39,400 --> 00:16:42,840
think about what to talk about.
Human beings work very, very
280
00:16:42,840 --> 00:16:46,400
much like LLMS.
And I'll explain, for example,
281
00:16:47,200 --> 00:16:49,240
in a way that hopefully makes
sense.
282
00:16:51,640 --> 00:16:55,360
We, our brains are lazy.
Our brains want to be efficient,
283
00:16:55,760 --> 00:17:03,640
efficient and effective, and our
brains also want to use the
284
00:17:03,640 --> 00:17:06,920
least amount of effort or energy
in order to be able to
285
00:17:06,920 --> 00:17:11,800
accomplish a mission or a task.
We just, we've evolved to try to
286
00:17:11,800 --> 00:17:14,720
be efficient.
Sometimes we cut too many
287
00:17:14,720 --> 00:17:17,240
shortcuts.
So for example, and this is a
288
00:17:17,240 --> 00:17:19,560
very, very common experience
that I have, it's my life.
289
00:17:19,560 --> 00:17:21,680
I work with my father and we're
very close.
290
00:17:23,520 --> 00:17:27,319
We speak on the phone 10/15/20
times a day for a couple minutes
291
00:17:27,319 --> 00:17:28,960
at a time just dealing with
whatever work issue we're
292
00:17:28,960 --> 00:17:33,200
dealing with or talking about
life or sales or the business or
293
00:17:33,200 --> 00:17:37,760
family or whatever.
And a lot of times we'll, if we
294
00:17:37,760 --> 00:17:41,280
have a disagreement, which we
often do, and I've been trying
295
00:17:41,280 --> 00:17:44,080
to work on this and sometimes I
can't help it and sometimes I'm
296
00:17:44,080 --> 00:17:46,880
better about it.
But if we have an, an issue or a
297
00:17:46,880 --> 00:17:49,080
disagreement and I think I know
he's going to say he's going to
298
00:17:49,080 --> 00:17:51,760
start by making a point and I
cut him off midpoint.
299
00:17:51,760 --> 00:17:54,320
I'm like, yeah, but and then he
goes, you didn't even hear what
300
00:17:54,320 --> 00:17:58,040
I finished saying.
And my brain, my LLM Green
301
00:17:58,400 --> 00:18:02,720
finished his thought for him in
a similar way to and it's funny
302
00:18:02,720 --> 00:18:05,320
because it's just a matter of
degrees of how much better it's
303
00:18:05,320 --> 00:18:08,400
getting right.
If you if you remember a couple
304
00:18:08,400 --> 00:18:11,720
of years ago when Apple was
first rolling out or iMessage
305
00:18:11,720 --> 00:18:16,040
was first rolling out there, I
guess dipping the toe in the
306
00:18:16,040 --> 00:18:18,520
water of LLMS large language
models.
307
00:18:18,840 --> 00:18:22,920
And one of the things that one
of the ways that LLMS work is by
308
00:18:22,920 --> 00:18:26,680
predicting what the next, what
the best potential choice of the
309
00:18:26,680 --> 00:18:29,280
next word or feature is going to
be.
310
00:18:29,280 --> 00:18:34,400
So for example, if I was sending
a text sounds the answer that
311
00:18:34,400 --> 00:18:37,040
you know, the recommended word
choice, the potential words to
312
00:18:37,040 --> 00:18:41,680
follow the to the net potential
word to follow the word sounds
313
00:18:41,800 --> 00:18:45,080
would most likely be good like
sounds good with a thumbs up or
314
00:18:45,080 --> 00:18:48,440
be sounds awful or sounds
interesting, right?
315
00:18:48,440 --> 00:18:52,000
And so the LLM would predict
would give you 3 predictions of
316
00:18:52,000 --> 00:18:55,840
words that you're most likely to
use with that that follow the
317
00:18:55,840 --> 00:18:58,360
word sounds.
When I cut off my dad and I'm
318
00:18:58,360 --> 00:19:01,560
having the argument with him, I
am essentially guessing what
319
00:19:01,560 --> 00:19:04,240
he's going to say based on my
experience of having dealt with
320
00:19:04,240 --> 00:19:06,800
him before.
I'm not letting him finish his
321
00:19:06,800 --> 00:19:09,440
thought out.
I'm not letting him expresses
322
00:19:09,440 --> 00:19:12,040
his point.
I am predicting what he's going
323
00:19:12,040 --> 00:19:15,400
to say and am choosing to
respond in real time before I
324
00:19:15,400 --> 00:19:19,480
even have all the context.
I'm choosing to respond in real
325
00:19:19,480 --> 00:19:22,480
time without having all the
information.
326
00:19:22,480 --> 00:19:25,600
And it's a stupid fucking way to
communicate with people.
327
00:19:26,560 --> 00:19:30,520
And I do apologise because I do
it a lot and it makes me a
328
00:19:30,520 --> 00:19:32,320
really, really bad listener,
unfortunately.
329
00:19:33,160 --> 00:19:37,560
If we are Ella, you know, if we
do think like LLMS and you know,
330
00:19:37,560 --> 00:19:41,600
I think and also, you know,
there's only so many words and
331
00:19:41,600 --> 00:19:43,480
conversations and predictive
ways that you can have a
332
00:19:43,480 --> 00:19:45,720
conversation, right?
So you start off with someone,
333
00:19:45,720 --> 00:19:49,440
you say, hey, how you doing?
There's only a limited number of
334
00:19:49,440 --> 00:19:51,760
potential responses.
And then we kind of get thrown
335
00:19:51,760 --> 00:19:52,880
off if there's anything else,
right?
336
00:19:52,880 --> 00:19:54,000
So the answer is usually pretty
good.
337
00:19:54,000 --> 00:19:57,600
You all right, or another day in
paradise or whatever it is.
338
00:19:57,880 --> 00:20:00,880
And then sometimes if someone
throws something like I'll
339
00:20:01,080 --> 00:20:03,160
usually tell people, oh, I'm
just peachy man.
340
00:20:04,600 --> 00:20:10,480
Sounds gay, but.
Sorry if I, you know, in the in
341
00:20:10,480 --> 00:20:15,240
the old school way, I asked
someone what's going on.
342
00:20:15,240 --> 00:20:19,480
There's only a certain amount of
expected potential responses and
343
00:20:19,480 --> 00:20:23,080
any other potential response,
any other response throws you
344
00:20:23,080 --> 00:20:26,200
through a loop a little bit.
So you notice when someone says,
345
00:20:26,200 --> 00:20:27,840
hey, how you doing?
Pretty bad actually.
346
00:20:28,320 --> 00:20:30,040
You're like, whoa.
And that person will actually
347
00:20:30,040 --> 00:20:34,000
leave a mark on you because that
was a new implementation of a
348
00:20:34,000 --> 00:20:38,760
potential response into your LLM
diary or LLM tool book.
349
00:20:39,000 --> 00:20:40,920
And you kind of keep an eye out
for it.
350
00:20:40,920 --> 00:20:44,720
And if you go, if, if it's an
anomaly, then it's an anomaly
351
00:20:44,720 --> 00:20:46,880
and you end up being surprised
or sometimes you notice it more.
352
00:20:46,880 --> 00:20:49,600
And that's kind of what happens.
For example, if you, and I've
353
00:20:49,600 --> 00:20:51,800
had this experience myself, I
forgot what it's called, but
354
00:20:51,800 --> 00:20:54,480
when you buy, for example, a car
or you lease a car.
355
00:20:54,480 --> 00:20:58,880
So when I got my Kia K5, I never
heard of the model my entire
356
00:20:58,880 --> 00:21:02,680
life.
And the second I got the car, I
357
00:21:02,680 --> 00:21:04,200
started seeing them everywhere
on the road.
358
00:21:04,520 --> 00:21:08,360
And I'm like, that is wild.
Like I had like my, the, my
359
00:21:08,360 --> 00:21:10,360
brain was not looking for that
detail.
360
00:21:10,680 --> 00:21:13,520
And now that it exists in my
library, I can keep an eye out
361
00:21:13,520 --> 00:21:15,000
for it.
Really, really interesting.
362
00:21:15,400 --> 00:21:22,120
So that on a very, very surface
level in very, very, I'm sure
363
00:21:22,120 --> 00:21:23,640
there's a ton more of more
depth.
364
00:21:23,640 --> 00:21:25,360
I'm sure there's a ton that I'm
missing.
365
00:21:25,360 --> 00:21:27,160
I'm not claiming to have
everything covered over there,
366
00:21:27,520 --> 00:21:31,200
but I think that there's a
general gist that we tend to,
367
00:21:31,280 --> 00:21:37,600
people tend to work that way.
Our algorithms, we, our
368
00:21:37,600 --> 00:21:47,640
algorithms are shaping us more
and more everyday because it's
369
00:21:47,640 --> 00:21:52,640
constantly reinforcing a
perspective without giving you a
370
00:21:52,640 --> 00:21:56,360
chance to explore other avenues,
overwrite your associations or
371
00:21:56,360 --> 00:21:58,360
be exposed to new perspectives
and new thoughts.
372
00:21:58,880 --> 00:22:03,520
And So what we determine and
deem to be truth or reality is
373
00:22:03,520 --> 00:22:07,480
really just a whole bunch of
reinforced information.
374
00:22:09,280 --> 00:22:12,400
And I've heard people complain
about that quite a bit and I've
375
00:22:12,400 --> 00:22:15,040
heard people talk about how
that's a massive problem and it
376
00:22:15,040 --> 00:22:17,480
is.
But I really haven't heard of
377
00:22:17,480 --> 00:22:21,400
many people offering very many
solutions other than don't do
378
00:22:21,440 --> 00:22:26,760
social media, which I'm an
advocate for.
379
00:22:26,800 --> 00:22:30,720
And just because I advocate for
not doing social media doesn't
380
00:22:30,720 --> 00:22:32,680
mean I'm not addicted to it.
You know, it doesn't mean that I
381
00:22:32,680 --> 00:22:35,720
don't spend a ton of a ton more
time on it doesn't mean that my
382
00:22:35,720 --> 00:22:37,960
opinions and my perspectives are
not shaped by it.
383
00:22:38,440 --> 00:22:42,160
And in fact, every single time I
try to find myself, I say, oh,
384
00:22:42,160 --> 00:22:44,600
I'm just, you know, I'm just
being blasted with, with, with,
385
00:22:44,600 --> 00:22:50,520
with content that is meant to,
you know, put me into a
386
00:22:50,520 --> 00:22:54,000
particular frame of mind or
enforce a particular
387
00:22:54,240 --> 00:22:58,600
perspective.
I look for other content to
388
00:22:58,600 --> 00:23:01,600
balance that out, and
389
00:23:01,600 --> 00:23:04,520
then you have to catch yourself.
OK, well, at a certain point,
390
00:23:04,760 --> 00:23:06,760
everything becomes
anti-authoritarian.
391
00:23:06,760 --> 00:23:08,240
At a certain point, everything
is a problem.
392
00:23:08,240 --> 00:23:09,720
What are the good things?
What are the potential good
393
00:23:09,720 --> 00:23:11,400
things?
How am I looking at this in a
394
00:23:11,400 --> 00:23:15,280
way that is both kind of picking
it apart and also potentially
395
00:23:15,280 --> 00:23:19,080
positive?
But it's really hard to do.
396
00:23:19,400 --> 00:23:22,680
And so what I was thinking of
actually, and what I hope to
397
00:23:22,680 --> 00:23:27,560
encourage now is the value of
creating our own algorithms.
398
00:23:27,720 --> 00:23:31,520
And I think there are
a lot of really cool
399
00:23:31,520 --> 00:23:33,240
ways to create our own
algorithms.
400
00:23:34,440 --> 00:23:43,480
Choosing content, particularly
long form historical, scientific
401
00:23:44,160 --> 00:23:52,440
or narrative-based content,
long form content
402
00:23:52,520 --> 00:23:56,080
like a book or like a long
podcast or an audio book or a
403
00:23:56,080 --> 00:23:59,440
movie or a TV series or whatever
it is.
404
00:24:00,240 --> 00:24:05,240
I think that the long form
content allows us to create
405
00:24:05,240 --> 00:24:14,320
our own algorithms and instead
of us being shaped by what is
406
00:24:14,320 --> 00:24:19,600
being fed to us or presented to
us, we can get up and do
407
00:24:19,600 --> 00:24:21,960
the equivalent of going to the
supermarket and picking out a
408
00:24:21,960 --> 00:24:25,200
healthy meal for ourselves.
So if you're sitting at home and
409
00:24:25,200 --> 00:24:28,880
you get up and you say, hey,
I'm hungry, and you ask your
410
00:24:28,880 --> 00:24:31,080
sibling, you ask your
411
00:24:31,080 --> 00:24:34,480
parent, you ask your spouse, you
ask your friend, Hey, can you
412
00:24:34,480 --> 00:24:37,800
grab me a snack?
You're pretty much subject to
413
00:24:37,800 --> 00:24:40,640
what they bring you.
But if you get up and do it
414
00:24:40,640 --> 00:24:43,520
yourself, then
you can pick what is
415
00:24:43,520 --> 00:24:45,360
actually going to be healthy.
You could pick something that
416
00:24:45,360 --> 00:24:47,720
you're in the mood for,
it could be something that
417
00:24:47,720 --> 00:24:49,000
you're potentially interested
in.
418
00:24:49,000 --> 00:24:50,680
You want to see how it goes with
other flavors.
419
00:24:51,040 --> 00:24:53,160
And I think the same thing
applies to knowledge and our
420
00:24:53,160 --> 00:24:55,240
understanding of the world and
information that we choose to
421
00:24:55,240 --> 00:24:57,880
consume.
I think that the more that we
422
00:24:57,880 --> 00:25:01,320
use social media, the more the
information is being imposed on
423
00:25:01,320 --> 00:25:04,800
to us and the more that we
choose the information to
424
00:25:04,800 --> 00:25:08,080
consume, the more that we're
getting up and choosing what
425
00:25:08,080 --> 00:25:11,560
kind of snacks to have, whether
they're healthy or not, whether
426
00:25:11,560 --> 00:25:12,880
they're good for you, what kind
of mood you're in.
427
00:25:12,880 --> 00:25:15,480
Again, do you need something
that's a little bit salty,
428
00:25:15,480 --> 00:25:18,880
sweet, high calorie, low
calorie, it doesn't matter.
429
00:25:19,360 --> 00:25:20,640
There's a context for
everything.
430
00:25:20,640 --> 00:25:21,840
There's a time and place for
everything.
431
00:25:21,840 --> 00:25:27,320
And so what is best for us to do
is to decide that for ourselves.
432
00:25:27,320 --> 00:25:31,720
Now, obviously, our
433
00:25:31,720 --> 00:25:36,200
algorithm is responsive to
434
00:25:36,200 --> 00:25:38,840
what catches our eye, but it
doesn't necessarily mean that
435
00:25:38,840 --> 00:25:44,320
it's what we want.
If I walk into a pizza shop and
436
00:25:44,320 --> 00:25:48,920
I think, that sounds incredible, it
doesn't mean that someone should
437
00:25:48,920 --> 00:25:51,200
shove a pizza in my face.
It doesn't mean that
438
00:25:51,200 --> 00:25:52,440
someone should shove a pizza
down my throat.
439
00:25:52,920 --> 00:25:55,360
It just means, whoa, that's
interesting.
440
00:25:55,680 --> 00:25:57,240
I like that.
I don't know if I want it right
441
00:25:57,240 --> 00:26:04,680
now, but I like it, you know,
and I think that, you know, I
442
00:26:04,680 --> 00:26:06,840
myself have been going on this
really, really interesting
443
00:26:06,840 --> 00:26:10,720
journey where I've kind of been
learning. I haven't been
444
00:26:10,720 --> 00:26:13,600
following broad news, but I've
been taking particular news
445
00:26:13,920 --> 00:26:16,160
stories that I've been hearing
and I've been diving into
446
00:26:16,160 --> 00:26:17,800
understanding what they're
about.
447
00:26:18,040 --> 00:26:21,360
Because it turns out that
there's a Hebrew expression.
448
00:26:21,600 --> 00:26:23,840
I love the expression ein chadash
tachat hashemesh.
449
00:26:24,120 --> 00:26:26,440
There's nothing new
under the sun.
450
00:26:26,440 --> 00:26:28,920
Everything that we think is new
451
00:26:28,920 --> 00:26:30,760
has, in one form or another,
452
00:26:30,760 --> 00:26:32,440
already happened.
453
00:26:33,520 --> 00:26:35,360
And just because we haven't seen
something doesn't mean that it
454
00:26:35,360 --> 00:26:36,840
hasn't existed in a particular
way.
455
00:26:37,280 --> 00:26:42,800
And that's, I mean, that's my
personal push and plea for
456
00:26:42,800 --> 00:26:46,600
learning about history and
learning about humanity, but I
457
00:26:46,600 --> 00:26:47,960
think that that's important for
everything.
458
00:26:47,960 --> 00:26:52,160
So for example, let's say again,
just to stick with an example
459
00:26:52,160 --> 00:26:56,160
that we started talking about
earlier with the ICE stories, and
460
00:26:56,800 --> 00:27:00,840
you know, there are so many
461
00:27:00,840 --> 00:27:03,160
worlds to learn about in the
context of what's going on here.
462
00:27:03,400 --> 00:27:06,400
What's the relationship between
Minnesota and Somalia?
463
00:27:06,640 --> 00:27:10,360
What's the relationship between
ICE and the citizens?
464
00:27:10,920 --> 00:27:16,720
Was the right always
embracing ICE, and what was ICE
465
00:27:16,720 --> 00:27:19,480
created for?
Was it created to catch
466
00:27:19,840 --> 00:27:22,480
Venezuelan cartel members that
are running around the country?
467
00:27:22,480 --> 00:27:25,760
Was it designed to enforce all
immigration law?
468
00:27:26,080 --> 00:27:28,560
What's the point of that, right?
So those two topics,
469
00:27:28,560 --> 00:27:34,440
Somalia and Minnesota, and ICE
470
00:27:35,320 --> 00:27:39,440
and its purpose, are
enormous deep dives that require
471
00:27:39,640 --> 00:27:42,200
a lot of learning.
And you can pick what you want
472
00:27:42,200 --> 00:27:44,160
to learn about it without having
it fed to you.
473
00:27:44,480 --> 00:27:46,480
You can find the books that look
interesting to you.
474
00:27:46,480 --> 00:27:49,040
You can find the articles that
speak to you, or the podcast
475
00:27:49,040 --> 00:27:53,680
with the guest or host
that will go into an
476
00:27:53,960 --> 00:27:56,880
understanding of what the
context is.
477
00:27:57,040 --> 00:28:01,200
So that while these events are
unfolding, regardless of what
478
00:28:01,200 --> 00:28:05,200
your opinion is, at least you're
not as short-sighted and
479
00:28:05,200 --> 00:28:09,160
short-memoried as so
many people are.
480
00:28:09,360 --> 00:28:12,320
Where a year and a half ago
we were talking about defending
481
00:28:12,600 --> 00:28:16,320
gun rights and the rights to
protest and free
482
00:28:16,320 --> 00:28:19,040
speech, and now we're shutting
down the exact same thing.
483
00:28:19,280 --> 00:28:28,120
And the more that people,
hopefully, continue to take
484
00:28:28,120 --> 00:28:30,720
their information and the
learning into their own hands
485
00:28:30,720 --> 00:28:33,680
instead of depending on it being
fed to them.
486
00:28:33,680 --> 00:28:36,840
Like me sitting on the couch
waiting for my friend or my kid
487
00:28:36,840 --> 00:28:41,320
or my wife or someone to bring
me a snack, versus getting up and
488
00:28:41,320 --> 00:28:43,520
doing it myself.
There's enormous, enormous
489
00:28:43,520 --> 00:28:46,040
value.
And every now and then, yeah,
490
00:28:46,040 --> 00:28:47,600
it's good to have someone bring
you a cheap snack.
491
00:28:47,600 --> 00:28:49,240
If you're feeling lazy, if you
feel like you don't want to get
492
00:28:49,240 --> 00:28:50,760
up and you don't really care
what you're in the mood for,
493
00:28:50,760 --> 00:28:53,320
you're just hungry, then go
ahead and go check out social
494
00:28:53,320 --> 00:28:55,600
media for a little bit.
But I think it's much, much,
495
00:28:55,600 --> 00:29:00,040
much more productive and much
more valuable to really do
496
00:29:00,040 --> 00:29:02,040
the work yourself and create our
own algorithms.
497
00:29:02,040 --> 00:29:05,520
And then everything
snowballs, and
498
00:29:05,520 --> 00:29:11,520
it's incredible to see how
entangled all of society and all
499
00:29:11,520 --> 00:29:17,000
of culture and all of economy
and all of music and, and all of
500
00:29:17,000 --> 00:29:20,560
film are just so deeply and
tightly correlated.
501
00:29:20,560 --> 00:29:26,200
And so if
there's a particular aspect of,
502
00:29:26,840 --> 00:29:32,600
let's say, the news that is
interesting to you, do a deep
503
00:29:32,600 --> 00:29:34,880
dive, because along that deep
dive, you're going to find other
504
00:29:34,880 --> 00:29:36,760
rabbit holes and you're going to
learn more and you're going to
505
00:29:36,760 --> 00:29:40,080
understand more.
And you're going to see really
506
00:29:40,280 --> 00:29:45,160
how much things are really
entangled and how events that
507
00:29:45,160 --> 00:29:48,520
feel like they are isolated
508
00:29:48,920 --> 00:29:52,240
really are all just circular.
And it's really, really
509
00:29:52,240 --> 00:29:54,560
interesting.
It's really, really cool and I
510
00:29:54,560 --> 00:29:57,400
really, really encourage people
to learn about this
511
00:29:57,400 --> 00:30:01,480
and to dive in. And this is the
value of democratized education
512
00:30:01,480 --> 00:30:05,040
and this is the value of
democratized information is that
513
00:30:05,280 --> 00:30:09,080
this allows us to fight
propaganda.
514
00:30:10,040 --> 00:30:13,120
But propaganda is always going
to be put out there for better
515
00:30:13,120 --> 00:30:14,680
or for worse.
Sometimes it's good propaganda,
516
00:30:14,680 --> 00:30:19,960
sometimes it's bad propaganda.
But in order for us to maintain
517
00:30:19,960 --> 00:30:23,160
our own perspective, and this is
how we still know that we live
518
00:30:23,160 --> 00:30:24,600
in a free country.
And, you know, even though
519
00:30:24,600 --> 00:30:28,640
people will say, yeah, it's not
free, there are definitely
520
00:30:28,640 --> 00:30:30,560
aspects of society that really,
really aren't.
521
00:30:30,560 --> 00:30:34,400
But you know what, the fact that
there are podcasts with hundreds
522
00:30:34,400 --> 00:30:37,120
of thousands or millions of
listeners that are saying things
523
00:30:37,120 --> 00:30:40,040
that are very, very clearly
against the establishment or the
524
00:30:40,040 --> 00:30:42,120
elite class or whatever it is.
And the fact that they have the
525
00:30:42,120 --> 00:30:44,000
right to do it.
And the fact that that that
526
00:30:44,000 --> 00:30:45,320
millions of people have access
to it.
527
00:30:45,320 --> 00:30:47,960
And the fact that people are
listening to it and whether
528
00:30:47,960 --> 00:30:50,640
they're doing anything about it
or not is another conversation.
529
00:30:51,080 --> 00:30:54,240
But the fact that it's out there
means that it's inherently free
530
00:30:54,600 --> 00:30:58,440
and we do not want to destroy
that.
531
00:30:58,440 --> 00:31:01,200
We do not want to trample that.
It's a really, really potentially
532
00:31:01,200 --> 00:31:05,120
scary road to go down.
The other topic that
533
00:31:05,120 --> 00:31:07,400
just came to
mind, because I came back from
534
00:31:08,200 --> 00:31:14,480
again, another trip to Orlando
and it was really, really a lot
535
00:31:14,480 --> 00:31:22,000
of fun.
Excuse me. What
536
00:31:22,000 --> 00:31:26,560
we usually do
nowadays is we get an Airbnb in
537
00:31:26,560 --> 00:31:28,640
Orlando.
And I don't know if you've ever
538
00:31:28,640 --> 00:31:31,840
been to Orlando, but they have
this enormous.
539
00:31:32,360 --> 00:31:36,160
I mean, it's obviously one of
the top hubs for vacations,
540
00:31:37,000 --> 00:31:41,960
conferences, tourism in the
world, or rather in
541
00:31:41,960 --> 00:31:45,920
America, other than potentially
Las Vegas, a couple other
542
00:31:45,920 --> 00:31:47,440
places, but Orlando's really up
there.
543
00:31:47,760 --> 00:31:51,120
And what was really cool, so
Airbnb, even as
544
00:31:51,120 --> 00:31:54,040
recently as like 5 or 6 years
545
00:31:54,040 --> 00:32:01,000
ago, was something that
was essentially a form
546
00:32:01,320 --> 00:32:05,240
of second income for
everyday people that own their
547
00:32:05,240 --> 00:32:07,040
houses.
And that if you lived in a house
548
00:32:07,040 --> 00:32:10,360
that was a little too big, or
you had a back house, or you had
549
00:32:10,360 --> 00:32:12,800
a garage that you can convert
into a private in-law suite, then
550
00:32:12,800 --> 00:32:15,680
you'd rent it out.
Sometimes if you had a vacation
551
00:32:15,680 --> 00:32:17,680
home, or you made an investment
552
00:32:17,680 --> 00:32:19,920
and bought a house, you'd rent
it out when you didn't want to
553
00:32:19,920 --> 00:32:21,000
go on vacation.
And then while you're on
554
00:32:21,000 --> 00:32:23,320
vacation, you'd be
able to stay there.
555
00:32:24,280 --> 00:32:28,320
And Airbnb was awesome, is
awesome.
556
00:32:28,720 --> 00:32:32,080
I love Airbnb.
And then corporations started
557
00:32:32,080 --> 00:32:34,200
realizing that this is a
phenomenal business model.
558
00:32:34,520 --> 00:32:40,000
And so what Airbnb in
particular started to do, what
559
00:32:40,000 --> 00:32:44,560
corporations, rather, started to
do, was to buy and develop huge
560
00:32:44,560 --> 00:32:46,880
communities.
And this is happening all over
561
00:32:46,880 --> 00:32:50,960
Orlando.
And instead of people renting
562
00:32:50,960 --> 00:32:53,240
out their summer house or
instead of people renting out a
563
00:32:53,240 --> 00:32:56,720
house, you know, the second half
of their house or whatever it
564
00:32:56,720 --> 00:32:59,520
is, the garage or the back house.
And I've stayed in some really,
565
00:32:59,520 --> 00:33:02,960
really wonderful Airbnbs in
Austin, TX. For example, I
566
00:33:02,960 --> 00:33:05,480
stayed in someone's back house
and it was really, really,
567
00:33:05,640 --> 00:33:08,800
really an awesome experience.
It was cheap, it was spacious.
568
00:33:08,800 --> 00:33:12,320
There's a separate walking path
for us to get in, and the hosts,
569
00:33:12,320 --> 00:33:14,840
even though we'd
interact with them, had
570
00:33:14,840 --> 00:33:16,480
thought of everything and were
really, really great.
571
00:33:17,040 --> 00:33:19,720
And then there's another time
where I went with a couple of
572
00:33:19,720 --> 00:33:25,560
buddies of mine, Sam and Heim, to,
oh, gosh, Newport, RI. I love
573
00:33:25,560 --> 00:33:26,520
Newport.
I've been there a couple of
574
00:33:26,520 --> 00:33:30,800
times and it's such a cool
place, it has so much
575
00:33:30,800 --> 00:33:33,160
beautiful nature.
There was the bridge that you
576
00:33:33,160 --> 00:33:35,880
drive over to go to Newport and
I don't even remember what was
577
00:33:35,880 --> 00:33:37,880
going on in my life in
particular at that point.
578
00:33:37,880 --> 00:33:39,920
But I remember driving over that
bridge and thinking about how
579
00:33:39,920 --> 00:33:44,240
beautiful it was and seeing all
the sailboats and literally
580
00:33:44,240 --> 00:33:46,240
just like sapphire water below
me.
581
00:33:46,560 --> 00:33:51,800
And I just got emotional.
I was tearing up or crying or
582
00:33:52,280 --> 00:33:55,480
whatever, but really,
Newport's a really, really
583
00:33:55,480 --> 00:33:57,720
beautiful place.
That part of Rhode Island in the
584
00:33:57,720 --> 00:33:59,520
country is really, really
remarkable.
585
00:33:59,880 --> 00:34:06,600
But we stayed at an Airbnb there
and it was in someone's back
586
00:34:06,600 --> 00:34:08,920
house, right?
So you can go through
587
00:34:08,920 --> 00:34:11,360
the front or you can go in on
the back and you climb up the
588
00:34:11,800 --> 00:34:15,280
back stairs up to the deck
on the second story of the
589
00:34:15,280 --> 00:34:18,120
house.
And there you had a big, it was
590
00:34:18,120 --> 00:34:21,040
a big loft area that they
converted into a bedroom with a
591
00:34:21,040 --> 00:34:24,000
couple of beds and a shower and
bathroom, and it's closed off from
592
00:34:24,000 --> 00:34:25,960
the rest of the house.
But the owners that were living
593
00:34:25,960 --> 00:34:28,520
there were very, very warm and
very sweet and very nice.
594
00:34:28,840 --> 00:34:31,760
And they had a little garden.
And during the morning, they
595
00:34:31,760 --> 00:34:33,560
just knocked on the door and
said, hey, we have some, you
596
00:34:33,560 --> 00:34:36,120
know, some tomatoes and
cucumbers.
597
00:34:36,120 --> 00:34:37,560
And if there's anything you
need, just let me know.
598
00:34:37,560 --> 00:34:42,520
We'll help take care of you.
And then fast forward
599
00:34:42,520 --> 00:34:47,360
five years: you go to
Orlando and you book an Airbnb, and
600
00:34:48,800 --> 00:34:51,719
you go to a house that's owned
by a corporation that makes you
601
00:34:51,719 --> 00:34:55,560
sign this form, this liability,
not liability form, but like
602
00:34:55,600 --> 00:34:59,080
rules form of procedures
that you have to follow.
603
00:34:59,480 --> 00:35:04,040
And the bathrooms have these
little like mini shampoos from
604
00:35:04,040 --> 00:35:05,960
the hotels and stuff like that.
And they give you the little
605
00:35:05,960 --> 00:35:10,200
hotel towels and thin sheets.
And I'm not complaining
606
00:35:10,280 --> 00:35:12,160
because the houses are
remarkable.
607
00:35:12,360 --> 00:35:17,280
The only thing I'm complaining
about is the fact that this is a
608
00:35:17,280 --> 00:35:22,240
common thread that's been going
on as America's evolved and
609
00:35:22,240 --> 00:35:26,360
as the country has become a
lot more geared
610
00:35:26,800 --> 00:35:31,160
towards corporate takeover.
Where you have an
611
00:35:31,160 --> 00:35:35,240
idea and you have a concept or a
business that starts off really,
612
00:35:35,240 --> 00:35:39,360
really fun, individualized,
personalized and for the
613
00:35:39,360 --> 00:35:42,960
everyday person.
And once it scales into a
614
00:35:42,960 --> 00:35:48,000
business model, and I
don't even blame the people for
615
00:35:48,000 --> 00:35:50,880
trying to make money, then the
corporations, hedge funds,
616
00:35:51,080 --> 00:35:54,920
investment banks, they see the
opportunity for revenue and
617
00:35:54,920 --> 00:35:57,280
they'll take it.
And I
618
00:35:57,280 --> 00:35:59,560
genuinely do not blame them for
it.
619
00:35:59,920 --> 00:36:03,280
I just wish that it kept a
little bit more of the homey
620
00:36:03,280 --> 00:36:06,640
feel and stayed a little bit more
authentic and customer based.
621
00:36:06,680 --> 00:36:10,920
But it doesn't surprise me
because I think that it feels to
622
00:36:10,920 --> 00:36:15,560
me that a lot is corporatized,
including people's personalities
623
00:36:15,560 --> 00:36:16,720
these days.
It's so funny.
624
00:36:16,720 --> 00:36:20,400
I've seen that
625
00:36:20,400 --> 00:36:23,840
LinkedIn gets a lot of
hate, even though I think it's a
626
00:36:23,840 --> 00:36:27,640
really nice platform.
But one of the reasons why is
627
00:36:27,640 --> 00:36:32,000
because it kind of corporatized
all humanity and it turned
628
00:36:32,000 --> 00:36:39,440
everyone into these very polite
robots that write the same way.
629
00:36:39,440 --> 00:36:44,520
I was down on my luck, and, you
know, several paragraphs below,
630
00:36:44,520 --> 00:36:45,800
where you have to click see
more.
631
00:36:46,080 --> 00:36:48,240
And then I didn't give up.
632
00:36:49,040 --> 00:36:53,040
And then my luck changed.
Keep hustling, you know. And
633
00:36:53,160 --> 00:36:55,920
that's every LinkedIn post, just
in another variation one way or
634
00:36:55,920 --> 00:36:58,320
the other.
I got fired from the job.
635
00:36:59,120 --> 00:37:00,920
But I know that tomorrow's going
to be a brighter day.
636
00:37:01,320 --> 00:37:05,440
And yeah, I mean, that's cool,
but it also kind of feels
637
00:37:05,600 --> 00:37:10,480
robotic and not real. And film,
I've complained about.
638
00:37:10,480 --> 00:37:11,760
I've talked to people, I've
639
00:37:11,760 --> 00:37:14,560
observed and talked to a bunch
of guests recently, especially
640
00:37:14,560 --> 00:37:19,680
in the entertainment world about how
the film industry.
641
00:37:20,760 --> 00:37:24,200
And again, this
isn't a blame
642
00:37:24,200 --> 00:37:26,720
game.
I'm not trying to blame people
643
00:37:26,720 --> 00:37:29,280
or say this is all a problem.
I understand where this is
644
00:37:29,280 --> 00:37:33,040
coming from.
Film studios, streaming studios
645
00:37:33,040 --> 00:37:35,480
and services, they have to make
money.
646
00:37:35,480 --> 00:37:36,880
I get that they have to make
money.
647
00:37:36,880 --> 00:37:39,120
The question just is:
648
00:37:39,120 --> 00:37:46,400
Where are you going to trim the
fat from in order to produce
649
00:37:46,480 --> 00:37:49,120
profits?
And if that fat trimming is
650
00:37:49,120 --> 00:37:53,000
customer based, then you're
going to have a push back from
651
00:37:53,000 --> 00:37:55,440
the customer base.
652
00:37:55,520 --> 00:37:58,440
And that's actually why.
So here's what I think is
653
00:37:58,440 --> 00:37:59,720
interesting.
And what I've heard
654
00:37:59,920 --> 00:38:03,480
anecdotally from several Amazon
employees that I know personally
655
00:38:03,480 --> 00:38:05,960
and have spoken to throughout the
years is that Amazon is known to
656
00:38:05,960 --> 00:38:08,280
be a very difficult place to
work, even though they obviously
657
00:38:08,280 --> 00:38:10,600
care tremendously about the
customer.
658
00:38:10,920 --> 00:38:12,960
And so where they'll
trim their
659
00:38:12,960 --> 00:38:19,920
quote-unquote fat to make
profits might be on their
660
00:38:20,240 --> 00:38:22,600
employee end as opposed to the
customer's end.
661
00:38:23,280 --> 00:38:26,160
And so for all the customers
that are happy that Amazon has
662
00:38:26,160 --> 00:38:28,800
no-questions-asked returns, that
might just mean that their
663
00:38:28,800 --> 00:38:31,040
employees are getting paid less
664
00:38:31,040 --> 00:38:33,560
because they have to figure out
where to make
665
00:38:33,560 --> 00:38:35,240
that difference.
And by the way, I don't even
666
00:38:35,240 --> 00:38:37,680
know if this is the case.
I'm literally just speculating,
667
00:38:37,680 --> 00:38:39,880
making this up.
But the point
668
00:38:39,880 --> 00:38:42,680
is not in the example that I'm
giving, but in the
669
00:38:42,960 --> 00:38:45,760
general gist of the idea that
I'm portraying.
670
00:38:46,200 --> 00:38:51,960
And that is that I don't fault
you for having to be responsible
671
00:38:52,120 --> 00:38:59,200
financially and to make money.
I fault two things: #1, having it
672
00:38:59,200 --> 00:39:02,520
come out of the
customer experience, and then #2
673
00:39:02,520 --> 00:39:05,280
expecting to be bailed out on a
massive level.
674
00:39:05,280 --> 00:39:09,800
Like with the government, when big
airlines or big banks or
675
00:39:10,000 --> 00:39:13,040
big entertainment companies
start falling apart and the
676
00:39:13,040 --> 00:39:15,360
government saves them if they're
falling apart, there's a reason
677
00:39:15,360 --> 00:39:19,040
for that.
That's capitalism and to save
678
00:39:19,040 --> 00:39:24,440
them is such an issue because it
ignores the rules of what
679
00:39:24,440 --> 00:39:27,920
people, of what people want.
And I mean, that is that's
680
00:39:27,920 --> 00:39:30,000
something that's also really,
really interesting as far as
681
00:39:30,000 --> 00:39:32,080
democracy goes.
If you're going to be a
682
00:39:32,080 --> 00:39:35,920
capitalist, you know, a
mostly capitalistic democracy:
683
00:39:36,960 --> 00:39:39,640
Companies shouldn't get bailed
out by the government if they go
684
00:39:39,640 --> 00:39:41,760
out of business or if they fail
because they're failing for a
685
00:39:41,760 --> 00:39:43,000
reason.
And that's because they're not
686
00:39:43,000 --> 00:39:47,080
doing a good job.
So by keeping them afloat,
687
00:39:47,080 --> 00:39:49,480
you're incentivizing them
to continue mistreating
688
00:39:49,480 --> 00:39:52,480
customers.
And that to me is much more of
689
00:39:52,480 --> 00:39:57,560
an issue than anything else.
So, you know, I've been in
690
00:39:57,560 --> 00:40:01,040
these Airbnb communities and
I've spent a
691
00:40:01,040 --> 00:40:03,560
lot of time with them and I go
to them quite a bit because at
692
00:40:03,560 --> 00:40:04,920
the end of the day, they're
convenient.
693
00:40:05,160 --> 00:40:07,880
They're huge houses for really
cheap and they're really good
694
00:40:07,880 --> 00:40:10,840
for my family, for
my kids to run around, and
695
00:40:10,840 --> 00:40:12,320
it really does feel like a good
getaway.
696
00:40:12,320 --> 00:40:14,880
And so I am happy to go and I am
happy to pay up for it.
697
00:40:16,720 --> 00:40:19,960
But when you walk down the
block, or you're going
698
00:40:19,960 --> 00:40:22,160
for a drive,
and you see for miles and miles
699
00:40:22,160 --> 00:40:25,320
on end the exact same houses
that are all sitting empty, that
700
00:40:25,320 --> 00:40:29,680
are all built the same way, the
same style, cut and paste, copy,
701
00:40:29,680 --> 00:40:32,880
paste, copy, paste, copy, paste.
You feel like you're missing a
702
00:40:32,880 --> 00:40:34,040
little something.
You feel like you're missing a
703
00:40:34,040 --> 00:40:36,720
little bit of soul.
Till next time, guys. We got a
704
00:40:36,720 --> 00:40:37,880
bunch of episodes coming up this
week.
705
00:40:38,320 --> 00:40:38,440
Bye.