#276 - Bob Bonniol

Bob Bonniol is an Emmy Award winning Director, Production Designer, and Producer. He is known for his implementation of extensive media and interactive features in his productions and installations. His practice spans the worlds of theater, broadcast, opera, installations, and architecture.
In 2023, he widely deployed new interactive technology and AI to elevate live concert production, with tour designs for Billy Joel, Blake Shelton, Megan Thee Stallion, and the One Night Houston Festival.
In the course of his career he has worked from Broadway to Beijing, helping clients like Disney, NBCU, ABC, Live Nation, AEG, Marvel Studios, and Lucasfilm gather and dazzle audiences of millions.
We had the pleasure of Bob being on the show in February of 2023; you can check out episode #183 on our website, www.geezersofgear.com.
Please welcome Bob back to the Podcast.
This episode is brought to you by Elation and Main Light.
00:00:00,240 --> 00:00:05,280
What can you say, it's Bob Bonniol,
really cool talk about AI.
2
00:00:05,720 --> 00:00:10,320
Finally someone who really is
into it even more than I am and
3
00:00:10,320 --> 00:00:14,200
a super smart guy.
Obviously Bob has written his
4
00:00:14,200 --> 00:00:18,320
own code.
He's a very smart computer guy.
5
00:00:18,320 --> 00:00:21,800
So fun talk.
We got into it pretty heavy here
6
00:00:21,800 --> 00:00:26,400
on this episode, which is
episode #276.
7
00:00:26,560 --> 00:00:31,000
So I really hope you enjoy it.
It was fun and I look forward to
8
00:00:31,000 --> 00:00:35,120
doing it again with Bob because
we only got so far
9
00:00:35,120 --> 00:00:36,720
into it.
So enjoy.
10
00:00:38,400 --> 00:00:41,600
Thank you for joining me today
for Geezers of Gear episode number
11
00:00:41,600 --> 00:00:45,200
276.
Today's podcast is brought to
12
00:00:45,200 --> 00:00:49,040
you by Main Light, a national
dry hire rental provider
13
00:00:49,200 --> 00:00:52,040
specializing in supplying the
latest lighting and stage
14
00:00:52,040 --> 00:00:54,960
equipment technology.
Their extensive rental inventory
15
00:00:54,960 --> 00:00:59,520
includes moving lights, truss
solutions, control consoles, LEDs
16
00:00:59,520 --> 00:01:04,760
and an array of weather
resistant IP65 rated fixtures
17
00:01:05,400 --> 00:01:08,760
perfect for outdoor venues.
Whether for theater productions,
18
00:01:08,760 --> 00:01:12,440
outdoor stadium events or TV
film broadcast, Main Light is
19
00:01:12,440 --> 00:01:14,560
the partner you trust with the
gear you want.
20
00:01:14,560 --> 00:01:18,200
Committed to quality and
reliability, Main Light also
21
00:01:18,200 --> 00:01:21,320
provides a robust selection of
used sales options to their
22
00:01:21,320 --> 00:01:24,520
clients, with each item
meticulously maintained and
23
00:01:24,520 --> 00:01:27,400
owned by Main Light,
coming with a 60 day depot
24
00:01:27,400 --> 00:01:32,240
warranty.
Strategically located across the
25
00:01:32,440 --> 00:01:36,280
US, their four key locations
include Teterboro, New Jersey,
26
00:01:36,280 --> 00:01:42,000
Wilmington, DE, Nashville, TN,
and Las Vegas, NV. Discover how
27
00:01:42,000 --> 00:01:46,920
Main Light can equip your next
event by visiting mainlight.com
28
00:01:47,360 --> 00:01:51,400
today.
And today's podcast is also
29
00:01:51,400 --> 00:01:53,720
brought to you by Elation
Professional.
30
00:01:53,960 --> 00:01:57,240
Please take a moment and watch
this wonderful video.
31
00:02:11,080 --> 00:02:14,880
Inspire your audience's
imagination with Elation's Pulse
32
00:02:14,880 --> 00:02:17,800
Strobe series.
Designed to bolster creativity
33
00:02:17,840 --> 00:02:21,920
and deliver breathtaking kinetic
intensity, the Pulse series
34
00:02:21,920 --> 00:02:25,560
empowers designers to create
dynamic lighting experiences
35
00:02:25,600 --> 00:02:30,040
that captivate any audience.
The Pulse series synthesizes
36
00:02:30,040 --> 00:02:33,640
mesmerizing lighting experiences
with power and precision,
37
00:02:34,000 --> 00:02:37,080
allowing designers to create
rhythmic, heart pounding
38
00:02:37,080 --> 00:02:40,720
lighting effects with effortless
control. Elation's
39
00:02:40,720 --> 00:03:29,640
Pulse Series.
You've probably already seen it
40
00:03:29,640 --> 00:03:32,120
and therefore I'm going to drive
you crazy by telling you again
41
00:03:32,120 --> 00:03:36,960
that I went to see this week
David Gilmour from Pink Floyd
42
00:03:36,960 --> 00:03:41,120
fame, of course, and designed by
Mark Brickman, of course, who
43
00:03:41,120 --> 00:03:42,720
we've had on the podcast
recently.
44
00:03:43,000 --> 00:03:47,200
I believe it was episode 271 if
I remember correctly.
45
00:03:47,680 --> 00:03:55,240
And so Bob or I'm sorry, Mark
invited me to see the show.
46
00:03:55,240 --> 00:03:58,200
And sorry, I was reading about
today's guest, Bob Bonniol.
47
00:03:58,520 --> 00:04:02,880
Mark invited me to see the show
along with John Wiseman from PRG
48
00:04:02,880 --> 00:04:06,520
of course, and hung out with
them, spent some great time with
49
00:04:06,520 --> 00:04:09,280
them.
Also Jerry Harris from, you
50
00:04:09,280 --> 00:04:14,360
know, the founder of PRG, of
course, he was
51
00:04:14,360 --> 00:04:16,320
there as well.
And I got to spend some time
52
00:04:16,320 --> 00:04:18,839
with him, some quality time with
Jerry, which is always fun.
53
00:04:19,079 --> 00:04:21,040
Sorry, I'm just trying to get
this light out of my eyes.
54
00:04:21,040 --> 00:04:27,200
There's an LED friggin video
light thing in my eyes and it's
55
00:04:27,200 --> 00:04:31,320
driving me crazy.
So anyways, got to spend some
56
00:04:31,320 --> 00:04:33,960
time with Jerry, got to see the
show, love the show.
57
00:04:34,200 --> 00:04:36,400
I'm not going to talk too much
about it because I've posted
58
00:04:36,400 --> 00:04:39,760
about it already a whole bunch.
And if you want to see some
59
00:04:39,760 --> 00:04:42,960
pictures or videos, you can go
to I think either my Instagram
60
00:04:42,960 --> 00:04:46,080
or the Geezers of Gear Instagram.
I can't remember which it is,
61
00:04:46,400 --> 00:04:51,440
but there's some reels and you
know, it's just a fabulous show.
62
00:04:51,440 --> 00:04:55,320
It's it's so well designed, it's
so unique and different.
63
00:04:55,320 --> 00:04:59,160
And I just really enjoyed myself
and enjoyed spending time with
64
00:04:59,160 --> 00:05:02,360
some some great friends as well.
And it's always fun to pop into
65
00:05:02,360 --> 00:05:07,320
New York for 24 hours as well.
So I would say if you get a
66
00:05:07,320 --> 00:05:09,800
chance to see it, go see it.
But unless you're in New York,
67
00:05:09,800 --> 00:05:12,200
you're you're not going to see
it 'cause the tour is very
68
00:05:12,200 --> 00:05:13,840
short.
It was only four cities.
69
00:05:13,840 --> 00:05:15,640
It's in the fourth city now, New
York.
70
00:05:16,040 --> 00:05:19,200
And I believe there are two more
shows, if I remember correctly,
71
00:05:19,200 --> 00:05:21,960
maybe tonight and tomorrow night
or tonight and Sunday, something
72
00:05:21,960 --> 00:05:25,000
like that.
Maybe even just one show this
73
00:05:25,000 --> 00:05:28,040
weekend.
But yeah, what is today?
74
00:05:28,040 --> 00:05:31,320
Today is the 8th.
So I think there's a show
75
00:05:31,320 --> 00:05:33,520
tonight and a show Sunday and
that's it.
76
00:05:33,840 --> 00:05:37,600
And then it's over.
And so I hate to say it and I
77
00:05:37,600 --> 00:05:40,840
don't want to be the one to say
it, but this could be the last
78
00:05:40,840 --> 00:05:44,400
chance you get to see David
Gilmour or Pink Floyd or David
79
00:05:44,400 --> 00:05:46,720
Gilmour doing Pink Floyd or any
of that.
80
00:05:47,160 --> 00:05:49,320
And also, it's really worth
seeing this design.
81
00:05:49,320 --> 00:05:52,400
So if you have, if you're in New
York, if you have any ability to
82
00:05:52,400 --> 00:05:56,360
get tickets. Actually,
by the time you hear this, it's
83
00:05:56,360 --> 00:05:58,520
over.
So I apologize, forget all that
84
00:05:58,520 --> 00:06:01,200
stuff.
So anyways, it was great.
85
00:06:01,360 --> 00:06:06,440
Watch it on YouTube.
I guess. So what else do I want
86
00:06:06,440 --> 00:06:08,000
to talk about?
I'm not going to do a big intro
87
00:06:08,000 --> 00:06:11,120
today because we are actually
going to talk about AI today.
88
00:06:11,120 --> 00:06:15,520
And I've been looking forward to
this one because there are a few
89
00:06:15,520 --> 00:06:19,000
people in the industry I think
who are as geeked out as I am on
90
00:06:19,160 --> 00:06:22,120
AI and today's guest is one of
them.
91
00:06:22,600 --> 00:06:24,480
Bob Bonniol,
of course, is an Emmy
92
00:06:24,480 --> 00:06:27,040
award-winning director,
production designer and
93
00:06:27,040 --> 00:06:29,480
producer.
He's known for his
94
00:06:29,480 --> 00:06:33,160
implementation of extensive
media and interactive features
95
00:06:33,400 --> 00:06:35,240
in his productions and
installations.
96
00:06:35,560 --> 00:06:40,320
His practice spans the world of
theatre, broadcast, opera,
97
00:06:40,320 --> 00:06:46,960
installations and architecture.
In 2023, he widely deployed new
98
00:06:46,960 --> 00:06:50,880
interactive technology and AI to
elevate live concert production
99
00:06:51,200 --> 00:06:54,920
with designs for Billy Joel,
Blake Shelton, Megan Thee
100
00:06:54,920 --> 00:06:58,800
Stallion, and the One Night
Houston Festival.
101
00:06:59,600 --> 00:07:03,560
In the course of his career,
he's worked from Broadway to
102
00:07:03,560 --> 00:07:09,000
Beijing, helping clients from
Disney, NBCU, ABC, Live Nation,
103
00:07:09,000 --> 00:07:12,400
AEG, Marvel Studios, and
Lucasfilm.
104
00:07:12,400 --> 00:07:17,880
Gather, gathers and dazzles.
Dazzles audiences of millions.
105
00:07:18,200 --> 00:07:20,240
Speaking today is challenging
for me.
106
00:07:20,680 --> 00:07:24,440
We had the pleasure of having
Bob on the podcast in February
107
00:07:24,440 --> 00:07:27,240
of 2023.
You can check that episode out.
108
00:07:27,240 --> 00:07:34,680
It's #183 and that is available
on Spotify, on YouTube, on lots
109
00:07:34,680 --> 00:07:38,680
of places, or on our website,
geezersofgear.com.
110
00:07:39,400 --> 00:07:44,360
So please join me and welcome
Bob Bonniol again here to
111
00:07:44,360 --> 00:07:52,920
episode #276.
Bob Bonniol, how are you?
112
00:07:52,920 --> 00:07:54,080
What's
happening?
113
00:07:54,320 --> 00:07:57,720
I am lovely, man. I'm good.
It's a beautiful fall day here
114
00:07:57,720 --> 00:07:59,080
in Seattle.
Yeah.
115
00:07:59,360 --> 00:08:03,480
And I've been living in Los
Angeles for the last two plus
116
00:08:03,480 --> 00:08:06,320
months.
So I just got back home after my
117
00:08:06,480 --> 00:08:10,280
residency in Los Angeles and.
What was that?
118
00:08:11,000 --> 00:08:14,080
Can you talk about it?
Yeah, there, there.
119
00:08:14,080 --> 00:08:17,160
You know, there's a variety of
projects that I have going on in
120
00:08:17,160 --> 00:08:21,440
LA, but primarily I was there
because my daughter was doing
121
00:08:21,440 --> 00:08:26,200
two months of training at ACT
Lighting on becoming an MA3
122
00:08:27,880 --> 00:08:30,520
support.
Oh, very cool.
123
00:08:30,720 --> 00:08:35,159
So she's a programmer.
She's a budding programmer,
124
00:08:35,159 --> 00:08:36,760
let's put it that way.
Yeah.
125
00:08:37,120 --> 00:08:42,960
And had had become quite
proficient on the ETC consoles
126
00:08:42,960 --> 00:08:44,920
through the course of her
education.
127
00:08:45,080 --> 00:08:46,560
Yeah.
Just not surprising.
128
00:08:46,560 --> 00:08:49,320
ETC kind of has that
marketplace.
129
00:08:49,320 --> 00:08:52,840
They do, yeah.
Sewn up, but she was quite eager
130
00:08:53,760 --> 00:08:56,600
to learn about MA.
She's been on site with me a few
131
00:08:56,600 --> 00:09:00,520
times and she's always been
particularly blown away with,
132
00:09:00,640 --> 00:09:06,640
with Felix Peralta and his
proficiency with MA.
133
00:09:07,000 --> 00:09:10,880
Yeah.
So I, you know, I reached out to
134
00:09:10,880 --> 00:09:14,080
Ben Saltzman at ACT Lighting and
said, hey, do you guys have any
135
00:09:14,080 --> 00:09:18,200
opportunities for training or
for that?
136
00:09:18,200 --> 00:09:19,960
And he's like, it's funny you
mention it.
137
00:09:20,160 --> 00:09:22,880
We are actively looking for
people to put through this
138
00:09:22,880 --> 00:09:26,320
program, you know, to come, you
know, sort of serve an
139
00:09:26,320 --> 00:09:30,240
internship at ACT.
And of course, my daughter
140
00:09:30,240 --> 00:09:32,640
Izzy, she got to learn about
zactrack.
141
00:09:32,640 --> 00:09:35,840
She got to go through all of the
training on MA3, but then go
142
00:09:35,840 --> 00:09:40,120
yet deeper and learn how to
debug the new software and learn
143
00:09:40,160 --> 00:09:42,080
oh wow.
You know, so she learned it at a
144
00:09:42,080 --> 00:09:44,920
level that's not just a
programming level, but at a deep
145
00:09:44,920 --> 00:09:48,400
functionality level.
So we've just, I was primarily
146
00:09:48,400 --> 00:09:55,080
there to make sure that my, my
daughter was successful.
147
00:09:56,000 --> 00:09:59,200
You know, let's put it this
way, any of our listeners
148
00:09:59,960 --> 00:10:05,560
who are parents know that, you
know, sometimes your kids have
149
00:10:06,240 --> 00:10:11,800
a variably successful
relationship with like living in
150
00:10:11,800 --> 00:10:15,240
the world, you know, yes.
How old is she?
151
00:10:16,040 --> 00:10:19,040
She's 20.
Oh, OK, yeah, my son's 20, so
152
00:10:19,040 --> 00:10:22,000
trust me.
I get it, you know, just keeping
153
00:10:22,000 --> 00:10:25,600
him from killing himself by
doing something dumb.
154
00:10:25,880 --> 00:10:28,600
Is is, you know, most of my goal
in life.
155
00:10:28,880 --> 00:10:32,680
I worried less about that
with Izzy than, you
156
00:10:32,680 --> 00:10:37,200
know, just her, you know,
call me conservative about
157
00:10:37,200 --> 00:10:40,760
this, but setting my daughter
free in Los Angeles, where she's
158
00:10:40,760 --> 00:10:42,840
never been before.
Yeah.
159
00:10:42,840 --> 00:10:46,560
Maybe a little harrowing, yeah,
but but I also wanted her to be
160
00:10:46,560 --> 00:10:48,600
free to really focus on what she
was up to.
161
00:10:48,600 --> 00:10:53,200
And of course, I have infinite
reasons to be in Los Angeles.
162
00:10:53,320 --> 00:10:55,200
Yeah.
You know, both projects going
163
00:10:55,200 --> 00:10:57,360
on, but also developing
new stuff too.
164
00:10:57,360 --> 00:11:01,160
I did something similar,
because again, I guess one
165
00:11:01,160 --> 00:11:05,880
of the differences between a
young man and a young woman is
166
00:11:05,880 --> 00:11:09,760
you're a little less nervous
about sending away a young man.
167
00:11:09,760 --> 00:11:15,480
And my son went to London by
himself for six months and, and
168
00:11:15,720 --> 00:11:19,480
I didn't worry about him at all.
I just dropped him off
169
00:11:19,480 --> 00:11:22,640
with his bags and stuff at an
Airbnb and said, see you later,
170
00:11:22,640 --> 00:11:24,960
you know, and that was that,
right?
171
00:11:25,280 --> 00:11:27,600
And then went over to visit when
he was racing or whatever.
172
00:11:27,600 --> 00:11:31,360
So yeah, I mean, I get it, I get
it.
173
00:11:31,360 --> 00:11:33,680
But it's really cool that you
did that with her and, and
174
00:11:33,680 --> 00:11:35,480
helped to make sure she was
successful.
175
00:11:35,480 --> 00:11:39,520
And obviously, it's nice when
you have connections in the
176
00:11:39,520 --> 00:11:43,600
industry that you can sort of
help some of those things along
177
00:11:43,600 --> 00:11:47,600
or at least, yeah, so good
for you.
178
00:11:48,040 --> 00:11:50,880
Good fathering.
Just doing my best.
179
00:11:50,880 --> 00:11:52,720
You know, back when I was a
production designer for
180
00:11:52,720 --> 00:11:57,120
Nickelback, I remember talking
to to Chad's mom, the lead
181
00:11:57,120 --> 00:11:59,240
singer Chad.
I was talking to his mom and we
182
00:11:59,240 --> 00:12:03,960
were talking about being parents
and, you know, I have a
183
00:12:03,960 --> 00:12:07,960
girl, obviously she had a boy
and she said, well, I'll
184
00:12:07,960 --> 00:12:10,400
tell you Bob, you know, when
you're raising a boy, you only
185
00:12:10,400 --> 00:12:13,360
have to worry about one dude.
When you're raising a girl, you
186
00:12:13,360 --> 00:12:15,200
have to worry about every dude
in town.
187
00:12:15,200 --> 00:12:19,520
Yeah, I heard
a slightly more colorful version
188
00:12:19,520 --> 00:12:22,400
of that same story.
There is that. I can tell
189
00:12:22,480 --> 00:12:24,720
you that that is not a quote.
It was...
190
00:12:24,960 --> 00:12:27,280
Yeah, it was.
A slightly different verb going
191
00:12:27,760 --> 00:12:29,680
on.
But I think that gets the point
192
00:12:29,680 --> 00:12:31,040
across.
Exactly.
193
00:12:31,080 --> 00:12:32,200
Exactly.
No.
194
00:12:32,200 --> 00:12:35,200
And I get it, trust me.
But I mean, you know, again,
195
00:12:35,200 --> 00:12:40,960
with boys, I think really like,
you know, again, my son likes to
196
00:12:40,960 --> 00:12:43,400
go fast in everything that he
does, right?
197
00:12:43,400 --> 00:12:45,120
So like.
Course he does.
198
00:12:45,120 --> 00:12:47,440
If he's pushing a shopping cart
through the store.
199
00:12:47,440 --> 00:12:50,800
He's riding it like a skateboard
or a scooter or something as
200
00:12:50,800 --> 00:12:53,480
fast as he possibly can and
probably racing his buddy or
201
00:12:53,480 --> 00:12:55,960
something.
And you know, just bad things
202
00:12:55,960 --> 00:12:58,760
tend to happen.
You know, it's like funny story,
203
00:12:58,760 --> 00:13:02,920
when I was designing the
house that I used to live in pre
204
00:13:02,920 --> 00:13:08,000
divorce, when I was designing
the pool on the back of it, I
205
00:13:08,000 --> 00:13:11,280
had a guy with CAD and you know,
he's designing it all: here's the
206
00:13:11,280 --> 00:13:14,080
waterfall and this is going to
be the hot tub over here.
207
00:13:14,080 --> 00:13:16,240
And here's where we're going to,
you know, whatever.
208
00:13:16,720 --> 00:13:19,760
And I said that distance right
there between the house and the
209
00:13:19,760 --> 00:13:21,680
edge of the pool.
How far is that?
210
00:13:21,680 --> 00:13:24,600
And he said it's 8 feet.
That's pretty standard.
211
00:13:24,960 --> 00:13:29,000
And I said, we've got the room.
Can you double that, like 12
212
00:13:29,000 --> 00:13:32,560
to 16 feet?
And he goes, I mean, we can, but
213
00:13:32,560 --> 00:13:35,000
won't it look a little strange
that far from the house?
214
00:13:35,000 --> 00:13:37,040
And then you got to build a
bigger screen if you build a
215
00:13:37,040 --> 00:13:39,960
screen over it.
And I said, yeah, but that
216
00:13:39,960 --> 00:13:42,120
upstairs window, that's his
bedroom.
217
00:13:42,840 --> 00:13:45,960
And I guarantee you we're...
Measuring the trajectory of the...
218
00:13:46,040 --> 00:13:48,760
Yeah, yeah.
If it's too far, he's not even
219
00:13:48,760 --> 00:13:51,600
going to try it, right?
But 8 feet, he's going to be
220
00:13:51,600 --> 00:13:54,160
jumping, you know, he's going to
jump from there to the pool.
221
00:13:54,160 --> 00:13:57,480
It's going to happen.
So I'd rather not inspire that
222
00:13:57,480 --> 00:13:59,800
to happen right the minute I
leave.
223
00:14:00,600 --> 00:14:03,840
But anyway, I appreciate you
coming back on again.
224
00:14:03,840 --> 00:14:06,320
It's been a year and a half and
lots of things have happened in
225
00:14:06,320 --> 00:14:08,400
the last year and a half.
God forbid.
226
00:14:08,560 --> 00:14:11,720
I mean, our industry went
batshit crazy and now it's crazy in
227
00:14:11,720 --> 00:14:15,000
a different way.
And you know, I was talking to
228
00:14:15,800 --> 00:14:19,720
an account exec from one of the
big companies the other day and
229
00:14:20,120 --> 00:14:23,920
he said, you know, I went from
the crazy busiest year in my
230
00:14:23,920 --> 00:14:27,880
career and, and biggest, most
successful year in my career to
231
00:14:28,440 --> 00:14:31,160
like this weird year where it
was incredibly busy, but they
232
00:14:31,160 --> 00:14:34,040
were all smaller deals.
And, and so I was still just as
233
00:14:34,040 --> 00:14:35,600
busy, but I didn't make any
money.
234
00:14:36,200 --> 00:14:39,000
And you know, it's just like
weird, it's weird.
235
00:14:39,000 --> 00:14:42,640
Our industry's weird, but you're
staying busy.
236
00:14:43,240 --> 00:14:45,520
It's an ups and downs
thing.
237
00:14:45,520 --> 00:14:49,760
I'm lucky in that I've got
professional ADD, you know, and
238
00:14:49,760 --> 00:14:52,680
plenty of our colleagues are
very focused folks like they're,
239
00:14:53,240 --> 00:14:55,720
you know, they either
focus on their concert
240
00:14:55,720 --> 00:14:59,680
career or they focus on doing
more brand event type stuff from
241
00:14:59,680 --> 00:15:05,560
maybe a focus in architecture.
But because I can't, like, stay
242
00:15:05,560 --> 00:15:10,920
focused on much for very long.
The net benefit of that is that
243
00:15:11,080 --> 00:15:14,880
you know, at Mode we've
built a business that's about
244
00:15:15,160 --> 00:15:17,400
playing in all three sandboxes.
Yeah.
245
00:15:17,640 --> 00:15:22,440
So as businesses seem to have
their cyclical ups and downs,
246
00:15:22,440 --> 00:15:24,600
usually there's an offset, so
there's overlaps.
247
00:15:24,880 --> 00:15:28,400
Having a good concert year, we
can focus on the corporate work.
248
00:15:29,200 --> 00:15:32,360
If the corporates are pulling
back on spending and there's
249
00:15:32,360 --> 00:15:36,440
less work there, we can look at
architecture and music and you
250
00:15:36,440 --> 00:15:40,480
know, between the three
verticals, we we seem to find
251
00:15:40,480 --> 00:15:43,240
enough to buy the groceries.
Was that on purpose, or was that
252
00:15:43,360 --> 00:15:47,080
just entirely by accident?
You know, if you look back at
253
00:15:47,080 --> 00:15:50,440
the history of the company and
the like across my career and
254
00:15:50,440 --> 00:15:54,760
Colleen, my my wife's career,
you might infer that we were
255
00:15:54,760 --> 00:15:57,080
very smart about how we did
that.
256
00:15:57,080 --> 00:15:59,440
But in fact, you know, I'm
257
00:15:59,440 --> 00:16:02,840
here to say it was a little bit
of a happy accident, you know?
258
00:16:03,280 --> 00:16:07,680
And to that end, I tell people
just starting out, say yes to
259
00:16:07,680 --> 00:16:10,120
everything.
Yeah, try everything.
260
00:16:10,120 --> 00:16:12,400
Go work in fashion.
Yeah.
261
00:16:12,400 --> 00:16:14,440
Do fashion shows.
Go work in business.
262
00:16:14,440 --> 00:16:18,440
Go do corporate events, go do
theater, right?
263
00:16:18,440 --> 00:16:21,040
Go go see what architecture
stuff is like.
264
00:16:21,040 --> 00:16:25,000
Because I think, you know, so
many of the best
265
00:16:25,000 --> 00:16:29,360
practitioners at what we do have
had that diversity of
266
00:16:29,360 --> 00:16:30,920
experience.
And I think in the beginning of
267
00:16:30,920 --> 00:16:36,240
your career you are well served
to not worry as much about
268
00:16:36,680 --> 00:16:41,680
what pays more or maybe what you
think you're interested in and
269
00:16:41,680 --> 00:16:43,800
just to get that diversity of
experience.
270
00:16:43,800 --> 00:16:47,560
Now, I think as a career
proceeds, what you really ought
271
00:16:47,560 --> 00:16:50,480
to be doing is saying no a lot
more, because
272
00:16:50,480 --> 00:16:52,960
then of course you
determine what your focus should
273
00:16:52,960 --> 00:16:57,640
be, and, you know, where
you are best able to contribute
274
00:16:57,640 --> 00:16:59,120
to things.
Well, and so many people don't
275
00:16:59,120 --> 00:17:02,320
understand that it's generally
harder to say no, especially in
276
00:17:02,320 --> 00:17:04,680
business.
Like what business people have
277
00:17:04,680 --> 00:17:07,839
the hardest time learning
is what to say no to,
278
00:17:07,839 --> 00:17:11,280
not what to say yes to.
But it's funny, you know, when
279
00:17:11,280 --> 00:17:14,319
you talk about the multiple
different disciplines and
280
00:17:14,319 --> 00:17:17,960
that kind of thing.
So I went and saw David Gilmour
281
00:17:17,960 --> 00:17:19,880
in New York this week.
And isn't, isn't...
282
00:17:20,720 --> 00:17:23,440
that show... I get to see it at
the Hollywood Bowl.
283
00:17:23,720 --> 00:17:24,720
Yeah.
Oh, cool.
284
00:17:24,720 --> 00:17:28,240
So, but I was going to say that,
you know, Mark had taken
285
00:17:28,240 --> 00:17:31,640
basically a couple of years, I
think 8 or 10 years off of
286
00:17:31,640 --> 00:17:35,880
lighting and became a very
good artist.
287
00:17:35,880 --> 00:17:38,000
Like he's he's become an artist.
Yeah.
288
00:17:38,000 --> 00:17:46,000
And so seeing that show, you
could see very much art blending
289
00:17:46,000 --> 00:17:52,520
with a, you know, classic rock
show and so well, like, it was
290
00:17:52,520 --> 00:17:56,560
such a beautiful show.
It was just so unique and and so
291
00:17:56,560 --> 00:17:59,680
cool.
And I mean, it had some aspects
292
00:17:59,680 --> 00:18:02,520
of some old Pink Floyd shows
with the big lasers and stuff.
293
00:18:02,920 --> 00:18:06,440
So much fog, more fog than I
think I could ever remember
294
00:18:06,440 --> 00:18:11,000
anyone using but just an
incredible show.
295
00:18:11,000 --> 00:18:13,360
I loved it.
This morning again I went and
296
00:18:13,360 --> 00:18:16,760
looked back at the pictures from
the show 2 days ago and and I
297
00:18:16,760 --> 00:18:19,360
was just like wow you know what
a beautiful show.
298
00:18:19,680 --> 00:18:25,040
I think that Mark is in a really
fortunate position there.
299
00:18:25,040 --> 00:18:28,280
I mean, it's evident
and it's true.
300
00:18:28,640 --> 00:18:34,680
I think that David Gilmour
regards Mark as a
301
00:18:34,680 --> 00:18:38,560
trusted and and equal
collaborator, yeah.
302
00:18:38,680 --> 00:18:41,320
In artistic freedom.
So few examples of that.
303
00:18:41,320 --> 00:18:45,000
I mean, OK, you know, Steve
Cohen with Billy Joel, Willie
304
00:18:45,000 --> 00:18:47,440
Williams with U2.
You know, there there's some
305
00:18:47,720 --> 00:18:53,960
some very rare examples where
the artist has not just
306
00:18:53,960 --> 00:18:58,680
given permission, but has fully
embraced what is
307
00:18:58,680 --> 00:19:04,000
usually a long collaboration
with a person in
308
00:19:04,000 --> 00:19:06,880
particular.
So you know, Mark's got that
309
00:19:06,880 --> 00:19:10,880
with David and and Mark having
had his art sabbatical, as you
310
00:19:10,880 --> 00:19:13,880
say, you know, he was he came to
the store.
311
00:19:13,880 --> 00:19:18,360
And what I love about Mark is
he's so performative, right,
312
00:19:18,520 --> 00:19:22,760
Like, in as much as
he is...
313
00:19:23,360 --> 00:19:27,160
You know, he's like jamming with
the band, right?
314
00:19:27,880 --> 00:19:29,920
There's not a lick of time code
going on there.
315
00:19:30,360 --> 00:19:32,880
Well, and I don't know if you
heard his podcast, but he talked
316
00:19:32,880 --> 00:19:35,040
about that.
He said if I'm going to do this
317
00:19:35,040 --> 00:19:38,560
and if this is going to be my
last David Gilmour show because
318
00:19:38,560 --> 00:19:42,240
I don't know, because you know,
David's up there.
319
00:19:42,240 --> 00:19:44,680
I think he's turning 80 this
year or something.
320
00:19:45,200 --> 00:19:48,720
And he said, so I don't know if
this is the last one.
321
00:19:48,720 --> 00:19:51,960
If it is, I want to, I want to
control the lights.
322
00:19:51,960 --> 00:19:57,560
I want to actually operate the
lights and I want to make, you
323
00:19:57,560 --> 00:19:59,360
know, create art with the
lights.
324
00:19:59,640 --> 00:20:01,480
Yeah.
And, and he did.
325
00:20:01,560 --> 00:20:05,200
I mean, he really did.
It's, you
326
00:20:05,200 --> 00:20:08,160
know, it's not like it's
outrageous technology or it's
327
00:20:08,160 --> 00:20:11,280
just an outrageous lighting
package or anything.
328
00:20:11,480 --> 00:20:14,160
It's just really cool use of
light.
329
00:20:14,320 --> 00:20:17,480
It is, is.
And so I like it.
330
00:20:17,600 --> 00:20:19,520
I think it's really cool.
But you know, it's a.
331
00:20:19,560 --> 00:20:21,800
It's just interesting that you
talk about
332
00:20:22,200 --> 00:20:25,080
the different disciplines and
saying yes, because I think when
333
00:20:25,080 --> 00:20:27,840
you like, if you grow up a
rock'n'roll designer and you
334
00:20:27,840 --> 00:20:31,200
stay a rock'n'roll designer your
entire life, you're kind of
335
00:20:31,200 --> 00:20:34,360
boxed in a little bit, right?
But when you start pulling in
336
00:20:34,360 --> 00:20:37,520
theater, or you start pulling in
art, or you start pulling in
337
00:20:37,520 --> 00:20:41,560
different things, that's when it gets
interesting, I think.
338
00:20:42,360 --> 00:20:45,600
Well, and I think that was
a differentiator for
339
00:20:45,600 --> 00:20:46,840
me in the beginning of my
career.
340
00:20:47,200 --> 00:20:50,120
Yeah, I grew up in Rhode Island.
Yeah, I was coming up in the
341
00:20:50,120 --> 00:20:56,000
scene in Providence and I was
mainly, you know, I was kind
342
00:20:56,000 --> 00:20:58,760
of, I was very wishy washy about
what I was studying either in
343
00:20:58,760 --> 00:21:01,200
high school or college.
I was kind of half of my time
344
00:21:01,200 --> 00:21:03,560
with Tech.
Like, I was working to
345
00:21:03,560 --> 00:21:07,360
learn computer programming,
which has since informed so much
346
00:21:07,360 --> 00:21:09,840
of everything about what I've
done, which I think we're going
347
00:21:09,840 --> 00:21:15,800
to get to.
But then simultaneously, I got a
348
00:21:15,800 --> 00:21:20,320
job at Trinity Repertory Company
in Providence, which is very
349
00:21:20,320 --> 00:21:24,320
well regarded regional theater
as a spotlight operator.
350
00:21:24,320 --> 00:21:27,320
And then that's where I
learned to become an
351
00:21:27,320 --> 00:21:29,640
electrician.
That's where I met my wife.
352
00:21:29,920 --> 00:21:31,400
Who
it took me 12 years
353
00:21:31,560 --> 00:21:34,520
to get to go on a date with me.
But you know, when I met Colleen
354
00:21:34,520 --> 00:21:37,920
at Trinity, she had just come
off the road with the original
355
00:21:37,920 --> 00:21:40,880
Joshua Tree tour.
She had been on the lighting
356
00:21:40,880 --> 00:21:43,880
crew for U2.
She had been out with
357
00:21:44,080 --> 00:21:48,200
Springsteen and had worked with
Jeff Ravitz for a while, and I
358
00:21:48,200 --> 00:21:51,240
thought she was amazing, you
know, but I would say, but that
359
00:21:51,240 --> 00:21:53,680
was a theater and we were doing
incredible theater.
360
00:21:53,680 --> 00:21:55,080
We were.
And it was a very theatrical.
361
00:21:55,280 --> 00:21:58,680
I also met Anne Militello there
who came in to design shows.
362
00:21:58,680 --> 00:22:02,080
I met a whole bunch of other
great LDs and I got steeped in
363
00:22:02,080 --> 00:22:04,960
this theatrical tradition of
what lighting looked like.
364
00:22:05,240 --> 00:22:13,240
Now at the same time I'm also
working, I found a job at a club
365
00:22:13,240 --> 00:22:16,560
in Providence called The Living
Room, which was one of the
366
00:22:16,560 --> 00:22:20,640
big concert clubs in Providence
where basically for the
367
00:22:20,640 --> 00:22:23,720
compensation of as much beer as
I could drink and maybe if I
368
00:22:23,720 --> 00:22:28,040
were lucky I'd get a 20, you
know, 20 bucks a
369
00:22:28,040 --> 00:22:32,960
night, 40 bucks a night.
I was running, you know, an old
370
00:22:32,960 --> 00:22:37,560
Leprecon LP-1000 two-scene preset
board and 24 PAR cans.
371
00:22:38,240 --> 00:22:41,520
And I was doing bands and I was
learning how to busk, you know,
372
00:22:41,920 --> 00:22:45,520
I was learning how
to be performative with
373
00:22:45,520 --> 00:22:48,280
lighting, with music.
So I was doing music from the
374
00:22:48,280 --> 00:22:49,840
beginning.
I was also doing theater from
375
00:22:49,840 --> 00:22:51,800
the beginning.
And then I also stumbled
376
00:22:51,800 --> 00:22:55,520
into the local corporate market.
All of this as an electrician,
377
00:22:56,120 --> 00:22:58,760
you know, but also then very
quickly I was bringing the tech
378
00:22:58,760 --> 00:23:02,840
stuff that, you know, like I
was, I'd worked out all
379
00:23:02,840 --> 00:23:06,160
kinds of, you know, I was
fascinated by graphics and
380
00:23:06,160 --> 00:23:09,120
computers.
So one of the first things that I
381
00:23:09,120 --> 00:23:12,360
did at the Living Room, for
instance, is I went out and I
382
00:23:12,400 --> 00:23:16,680
like, I found an old ruined GE
light valve, which is an
383
00:23:16,960 --> 00:23:21,800
improbably gigantic, incredibly
stupid projector to work with,
384
00:23:21,800 --> 00:23:26,400
but it, it had been abandoned
and unusable by an AV company in
385
00:23:26,400 --> 00:23:28,520
Massachusetts.
I went and got this thing and I
386
00:23:28,520 --> 00:23:33,360
would bring it to the Living
Room and I
387
00:23:33,360 --> 00:23:38,920
was literally using like a Video
Toaster on an Amiga computer to
388
00:23:38,920 --> 00:23:44,760
like... I'm not going to
insult Joshua White by saying I
389
00:23:44,760 --> 00:23:47,920
was doing liquid light shows.
Yeah, but I was doing like, I
390
00:23:47,920 --> 00:23:51,840
was doing like psychedelic
projected texture on these bands
391
00:23:52,720 --> 00:23:56,760
at, you know, by like fiddling
around with the computer at the
392
00:23:56,760 --> 00:24:01,120
same time I was doing the
lighting, so you could see all
393
00:24:01,120 --> 00:24:04,760
the pieces that now I've become
known for.
394
00:24:04,920 --> 00:24:07,360
Yeah, at that point.
But I'm, I can't tell you that
395
00:24:07,360 --> 00:24:09,080
there was a conscious decision.
Yeah.
396
00:24:09,080 --> 00:24:11,080
Oh, I know what I want to be and
what I want to do.
397
00:24:11,080 --> 00:24:14,480
I want to combine all these
things. I was just like
398
00:24:14,480 --> 00:24:19,800
a 21 year old kid, like saying,
you know, man, my beer money
399
00:24:19,800 --> 00:24:23,680
jobs sure are cool.
Yeah, well, I mean, it's, you
400
00:24:23,680 --> 00:24:27,040
know, those lucky accidents and
and things that happen like
401
00:24:27,040 --> 00:24:29,720
that, you just go, hey, this
actually turned out really well.
402
00:24:29,720 --> 00:24:32,320
You know, like a lot of those
things like Vari-Lite, you know,
403
00:24:32,320 --> 00:24:36,760
I remember the the story that I
heard on my podcast where, you
404
00:24:36,760 --> 00:24:39,920
know, the the light wasn't
supposed to be on while it
405
00:24:39,920 --> 00:24:41,280
moved.
That was just...
406
00:24:41,400 --> 00:24:43,400
And then, to
reposition the fixture,
407
00:24:43,760 --> 00:24:46,040
The shutter was supposed to be
closed in the middle, and
408
00:24:46,040 --> 00:24:47,520
there was a glitch where it
stayed
409
00:24:47,520 --> 00:24:49,960
open, moved, and everyone lost
their mind, right?
410
00:24:50,160 --> 00:24:53,280
Yeah, and went well, do that
again, you know, and suddenly,
411
00:24:53,320 --> 00:24:56,920
you know, the look that became
so famous happened.
412
00:24:56,920 --> 00:24:59,520
So I like that kind of stuff
when that
413
00:24:59,520 --> 00:25:00,960
happens.
I adore that anecdote.
414
00:25:00,960 --> 00:25:04,960
Yeah, sometimes mistakes are
you know, the best part of
415
00:25:04,960 --> 00:25:07,960
the show.
But you know, the cool thing,
416
00:25:07,960 --> 00:25:10,640
one of the cool things, not to
go on and on about this, this
417
00:25:10,640 --> 00:25:13,560
Brickman show, but one of the
cool things about it is I've
418
00:25:13,560 --> 00:25:17,160
seen the same song multiple
times on video and stuff from
419
00:25:17,160 --> 00:25:19,960
this show, and every time it
looks completely different
420
00:25:19,960 --> 00:25:23,120
because it is because he's
performing it, you know?
421
00:25:24,120 --> 00:25:27,280
Among the things he's doing,
he's got, he's got that Corel
422
00:25:27,280 --> 00:25:30,760
Painter thing, which is feeding
the VPU on the MA, which is
423
00:25:31,200 --> 00:25:33,120
letting him literally be
rhythmic.
424
00:25:33,480 --> 00:25:40,160
Yeah, and motion
based and emotion based in the
425
00:25:40,160 --> 00:25:42,720
way he's running the lighting
and pieces of the show.
426
00:25:43,240 --> 00:25:45,680
And I just think
it's just phenomenal.
427
00:25:45,720 --> 00:25:46,920
You know, I've heard Busking
On
428
00:25:46,920 --> 00:25:50,040
Acid talking about that show and
how different it is from
429
00:25:50,040 --> 00:25:52,200
anything else.
And that's Mark.
430
00:25:52,240 --> 00:25:56,640
Mark thinks without constraints.
He doesn't worry if something
431
00:25:56,640 --> 00:25:59,560
doesn't exist or if a workflow
hasn't been determined.
432
00:25:59,880 --> 00:26:02,800
He's going to
get it figured out
433
00:26:02,800 --> 00:26:05,520
and he's going
to put it to work and he's going
434
00:26:05,520 --> 00:26:09,640
to put his fingerprints on
it.
435
00:26:09,680 --> 00:26:12,440
Yeah.
You know, you see that show and
436
00:26:12,440 --> 00:26:15,040
you can't.
Well, the very first time I
437
00:26:15,040 --> 00:26:20,160
ever met Mark, it's a story I've
never shared, but I was with
438
00:26:20,160 --> 00:26:25,520
Martin and we were creating our
largest, most expensive booth
439
00:26:25,520 --> 00:26:29,040
and trade show at LDI ever,
which was in Miami Beach.
440
00:26:29,520 --> 00:26:33,800
And we hired Mark Brickman to
design the show on our booth.
441
00:26:34,320 --> 00:26:40,600
And so Brickman, like, it took
him forever to come up with the
442
00:26:40,600 --> 00:26:42,840
design.
And I'm not mad at him for this.
443
00:26:42,840 --> 00:26:46,600
This is just how he operates.
We put walls around him and you
444
00:26:46,600 --> 00:26:51,200
can't do that with Mark.
And so he came to us with a
445
00:26:51,200 --> 00:26:54,680
design that had like a half
million dollars in video on it.
446
00:26:55,040 --> 00:26:58,440
And our whole budget, including
the party was like a half
447
00:26:58,440 --> 00:27:02,400
million dollars, right?
And so we were like, Mark, this
448
00:27:02,400 --> 00:27:05,120
ain't going to work.
And he went, wow, I can't do it
449
00:27:05,120 --> 00:27:06,800
then.
You people are too small
450
00:27:06,800 --> 00:27:12,520
thinking. And so, you know,
in a panic, I
451
00:27:12,520 --> 00:27:16,400
called Peter Morse and said,
Peter, you were really our first
452
00:27:16,400 --> 00:27:18,240
choice.
Can you come do this with us?
453
00:27:18,240 --> 00:27:22,280
You know, and Peter did an
incredible job and I don't know
454
00:27:22,280 --> 00:27:26,240
if you were there in 1994 or
whatever with the
455
00:27:26,240 --> 00:27:27,360
bicycles.
I was there.
456
00:27:27,520 --> 00:27:30,120
It was one of my, it wasn't the
first.
457
00:27:30,160 --> 00:27:32,160
It was, I think it was my second
LDI and everything.
458
00:27:32,160 --> 00:27:33,600
OK.
Yeah, yeah.
459
00:27:33,600 --> 00:27:38,640
I mean, it was like
legendary, the show that
460
00:27:38,640 --> 00:27:42,280
Peter did and how hard it was
for us to pull it off with PALs
461
00:27:42,280 --> 00:27:46,200
that were like overheating and
just things failing left and
462
00:27:46,200 --> 00:27:49,640
right because it was a hot box
of a room there, you know, with
463
00:27:49,640 --> 00:27:53,040
no ventilation and stuff.
But it worked out
464
00:27:53,040 --> 00:27:55,360
really, really cool.
But that was my first experience
465
00:27:55,360 --> 00:27:57,720
with Mark.
And what I learned was Mark
466
00:27:57,720 --> 00:28:01,560
needs to be with bands the size
of Pink Floyd.
467
00:28:01,560 --> 00:28:04,720
Like you can't.
But then the next thing I saw
468
00:28:04,720 --> 00:28:08,520
him do was Matchbox Twenty with
those incredible motors
469
00:28:08,520 --> 00:28:10,960
and the rig moving so
quickly and stuff.
470
00:28:10,960 --> 00:28:14,200
It was just incredible.
I was like, wow, you know, and
471
00:28:14,200 --> 00:28:15,840
he was the first guy that I saw
do that.
472
00:28:15,840 --> 00:28:19,360
So it was just so cool.
But anyways, moving on.
473
00:28:19,360 --> 00:28:22,200
So one of the reasons I wanted
you to do this, and I'm glad you
474
00:28:22,200 --> 00:28:27,000
finally agreed to it, is because
when I think of who the two
475
00:28:27,400 --> 00:28:31,920
probably geekiest AI talking
guys are in the industry, it's
476
00:28:31,920 --> 00:28:34,200
me and you.
And I'm not much of a geeky guy
477
00:28:34,200 --> 00:28:38,000
'cause I'm not that smart.
But I do love AI, and I do love
478
00:28:38,000 --> 00:28:40,920
talking about it.
And I've certainly tried a lot
479
00:28:40,920 --> 00:28:44,640
of AI stuff and have
experimented an awful lot.
480
00:28:44,640 --> 00:28:46,720
And I actually want to share
something with you real quick
481
00:28:46,720 --> 00:28:49,760
before I forget.
Hang on one sec.
482
00:28:49,760 --> 00:28:52,200
Let me figure out how to do
this.
483
00:28:52,200 --> 00:28:54,440
This is only going to take me
one second.
484
00:28:55,080 --> 00:28:59,840
Here we go... to share audio,
share a tab instead.
485
00:28:59,840 --> 00:29:03,600
Well, how do I do that?
It's not going to let me share
486
00:29:03,600 --> 00:29:05,920
audio.
You suck.
487
00:29:06,600 --> 00:29:10,120
Hang on a second, let me figure
this out.
488
00:29:11,920 --> 00:29:17,120
Oh, Oh yeah, I don't think it's
going to let me share audio.
489
00:29:17,120 --> 00:29:23,880
This ain't fun.
Well, I'm gonna do it without
490
00:29:23,880 --> 00:29:25,760
sharing audio.
That's what I'm gonna do.
491
00:29:26,040 --> 00:29:29,480
All right, well, I'll just, I'll
use my mind's ear, as it were.
492
00:29:29,840 --> 00:29:37,880
Well, it's your basic.
I've seen some wild stuff.
493
00:29:38,400 --> 00:29:39,920
It's no good without the audio.
It's.
494
00:29:40,400 --> 00:29:42,360
Like, you know, a magician, I can
kind of have
495
00:29:42,440 --> 00:29:46,440
the audio. A new AI tech too.
He's impressing that in 2023
496
00:29:46,680 --> 00:29:50,120
using all the crazy
interactive tech and AI to make
497
00:29:50,120 --> 00:29:52,840
live concerts
Absolutely mind blowing.
498
00:29:53,720 --> 00:29:57,440
We're talking Billy Joel, Blake
Shelton, Megan Thee Stallion.
499
00:29:57,680 --> 00:30:01,360
Bob's been behind the scenes
making their shows pop and get
500
00:30:01,360 --> 00:30:04,640
this, he even rocked the One
Night Houston Festival.
501
00:30:05,280 --> 00:30:08,240
It's insane how he takes all
these high tech gadgets and
502
00:30:08,240 --> 00:30:11,720
turns concerts into these
immersive experiences.
503
00:30:19,240 --> 00:30:20,920
Dude kind of looks like... Check
out what Bob
504
00:30:20,920 --> 00:30:23,920
's been up to and learn a ton
about AI as well on Geezers of
505
00:30:23,920 --> 00:30:30,240
Gear episode number 276.
So that was created on
506
00:30:30,240 --> 00:30:34,400
an AI video app called InVideo in
about 5 minutes.
507
00:30:34,400 --> 00:30:38,240
And all I did was I took your
bio and copied and pasted it
508
00:30:38,240 --> 00:30:41,160
into the the script for the
video.
509
00:30:41,360 --> 00:30:44,800
Yeah, and so do you use
InVideo at all?
510
00:30:45,480 --> 00:30:47,400
No, no, I don't use InVideo at
all.
511
00:30:47,400 --> 00:30:51,360
I use a little bit of a
different tool set for the video
512
00:30:51,720 --> 00:30:54,600
that I do, but.
Yeah, well, InVideo, if you're
513
00:30:54,600 --> 00:30:56,960
looking to do a quick little
marketing video or something
514
00:30:56,960 --> 00:30:59,800
like that, it's just really
super easy to use.
515
00:31:00,200 --> 00:31:05,880
And again, you can just paste a
document into it and it'll look
516
00:31:05,880 --> 00:31:07,920
through the document and create
a video based on it.
517
00:31:08,440 --> 00:31:12,000
But what it used to do is it
used to just use stock video and
518
00:31:12,000 --> 00:31:15,240
stock images and stuff.
So it was OK, you could do a
519
00:31:15,400 --> 00:31:19,520
kind of a cool video, but it was
just like guys shaking hands and
520
00:31:19,520 --> 00:31:22,720
like just boring stuff.
Now you can say OK, like that
521
00:31:22,720 --> 00:31:27,680
one I said I wanted animated.
And, you know, it's about this
522
00:31:27,680 --> 00:31:30,000
Geezers of Gear episode number
276.
523
00:31:30,360 --> 00:31:34,240
And then I put your bio in there
and it came up with that in
524
00:31:34,240 --> 00:31:37,000
like, you know, 3 or 4 minutes
probably.
525
00:31:37,280 --> 00:31:42,720
So you can do generative
video now and you can do it
526
00:31:42,720 --> 00:31:45,840
animated, you can do it
lifelike, you can do it in all
527
00:31:45,840 --> 00:31:48,000
kinds of different ways.
It can be fun, it can be
528
00:31:48,000 --> 00:31:54,040
serious, it can be, you know, so
it's actually a tool that I use
529
00:31:54,040 --> 00:31:59,000
quite often just for creating
little market blurbs and things
530
00:31:59,040 --> 00:32:01,640
like that.
But yeah.
531
00:32:02,520 --> 00:32:05,600
I like it, it was good.
Yeah, yeah.
532
00:32:05,640 --> 00:32:08,360
I mean, it was just something
funny that I thought I'd, I
533
00:32:08,360 --> 00:32:11,640
thought I'd show you just using
one of the apps.
534
00:32:11,880 --> 00:32:16,080
So, you know, I want to talk
about some of the things I use
535
00:32:16,080 --> 00:32:18,640
on a daily basis, which will be
totally different than some of
536
00:32:18,640 --> 00:32:21,920
the things you use, of course,
because you're using them to
537
00:32:21,920 --> 00:32:25,000
generate content, probably, and to
generate...
538
00:32:26,680 --> 00:32:29,560
Yeah.
So give me some examples
539
00:32:29,560 --> 00:32:34,240
of ways that you've managed
to incorporate AI into your
540
00:32:34,240 --> 00:32:36,400
workflow.
Well, there's a few different
541
00:32:36,400 --> 00:32:37,920
ways and there's a few
different.
542
00:32:38,000 --> 00:32:43,000
Of course, when we say
AI, that's a broad
543
00:32:43,000 --> 00:32:43,960
topic, right?
Yeah.
544
00:32:43,960 --> 00:32:49,600
I mean, you know, for the
record, I attended
545
00:32:49,600 --> 00:32:55,440
a computer camp when I was a
teenager at MIT.
546
00:32:55,800 --> 00:33:00,440
Yeah.
And I remember that when we went
547
00:33:00,440 --> 00:33:03,680
there, you know, it was, it was
very cool.
548
00:33:03,680 --> 00:33:06,160
But they said, OK, what we're
going to do is by the end of the
549
00:33:06,160 --> 00:33:10,200
camp, you're all going to have
written a computer program.
550
00:33:11,040 --> 00:33:14,440
All right?
And, you know, out of
551
00:33:14,440 --> 00:33:17,520
the 100 kids there, 99 were
like, I want to write a computer
552
00:33:17,520 --> 00:33:18,920
game.
Oh, OK.
553
00:33:19,120 --> 00:33:22,520
Which is predictable and cool
right? Now, for the time, this is
554
00:33:22,520 --> 00:33:27,480
like 1983 maybe?
Something like that.
555
00:33:27,840 --> 00:33:30,840
I was in high
school at the time. I
556
00:33:30,840 --> 00:33:37,560
did this, but what I wanted
to do was I wanted to write a
557
00:33:37,560 --> 00:33:41,800
program that would teach the
computer to talk to me, you
558
00:33:41,800 --> 00:33:44,440
know, and not necessarily
verbalize, but I wanted to be
559
00:33:44,440 --> 00:33:47,680
able to type, hi, how are you?
And have the computer go, Oh,
560
00:33:47,680 --> 00:33:49,800
I'm I'm having a good day or
whatever.
561
00:33:49,800 --> 00:33:52,240
Like, I wanted to be able to
have a free form conversation
562
00:33:52,240 --> 00:33:56,880
with a computer, which the
instructors were both, you know,
563
00:33:56,880 --> 00:33:58,080
they were like, oh, wow, that's
cool.
564
00:33:58,080 --> 00:34:01,560
And they were also like, you're
really screwing up our lesson
565
00:34:01,560 --> 00:34:03,520
plan here because you're
supposed to want to make a
566
00:34:03,520 --> 00:34:05,720
computer game, right?
Yeah, but.
567
00:34:06,040 --> 00:34:10,360
You know, but but what I did was
at that time in the course at
568
00:34:10,360 --> 00:34:14,239
camp, which was a month long
thing that I did and you know, I
569
00:34:14,239 --> 00:34:17,719
think I was a freshman in high
school or something was I wrote,
570
00:34:17,960 --> 00:34:21,600
you know, a piece of code that
was effectively a chatbot.
571
00:34:22,120 --> 00:34:25,320
OK.
Yeah, it was a very limited chat
572
00:34:25,320 --> 00:34:29,520
bot because at the time all you
could do was maybe guess at what
573
00:34:29,520 --> 00:34:33,000
people would say to this thing.
And then you could come up with
574
00:34:33,520 --> 00:34:35,760
a range of things it could say
in response.
575
00:34:35,760 --> 00:34:38,280
So you could only have a very
rudimentary discussion with the
576
00:34:38,280 --> 00:34:40,120
thing.
And it only had so many options
577
00:34:40,120 --> 00:34:45,000
as to how it could reply.
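To make that concrete, here is a minimal Python sketch of that kind of guess-the-input, canned-response chatbot; the phrases and replies are invented for illustration and are not the actual camp program.

# A minimal sketch of the canned-response chatbot described above:
# guess at likely inputs ahead of time and map each one to a small,
# fixed set of replies. Illustrative only.
import random

RULES = {
    "how are you": ["I'm having a good day.", "Doing fine, thanks."],
    "hi": ["Hello!", "Hi there."],
    "bye": ["Goodbye!"],
}

def reply(user_text: str) -> str:
    text = user_text.lower()
    # Scan the known phrases; answer with a canned response on a match.
    for phrase, answers in RULES.items():
        if phrase in text:
            return random.choice(answers)
    # No rule matched: the early-1980s fallback.
    return "I don't understand that yet."

print(reply("Hi, how are you?"))  # e.g. "I'm having a good day."

The limitation described here falls straight out of the structure: the program can only ever say what was typed into RULES.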
But so now we look at modern day
578
00:34:45,320 --> 00:34:51,760
large language models.
So ChatGPT, Anthropic's Claude,
579
00:34:53,320 --> 00:34:59,440
Google's Gemini.
These are, you
580
00:34:59,480 --> 00:35:03,080
know, the offspring of that chat
bot thinking that goes all the
581
00:35:03,080 --> 00:35:06,160
way back to that.
So I've been noodling with this
582
00:35:06,160 --> 00:35:08,200
early on.
And, and even if we go back to
583
00:35:08,200 --> 00:35:13,960
2010, when Mode put our
interactive screens in Microsoft
584
00:35:13,960 --> 00:35:18,720
headquarters in Studios A
through D, we were using machine
585
00:35:18,720 --> 00:35:22,920
learning to allow the screens to
be different every day.
586
00:35:22,920 --> 00:35:27,360
And to how they responded.
These were screens that you
587
00:35:27,360 --> 00:35:30,920
could gesturally play with if
you were in the right spots in
588
00:35:30,920 --> 00:35:34,000
the lobby, and
you could play with them
589
00:35:34,120 --> 00:35:38,280
through your voice.
And we came up with, you know,
590
00:35:38,280 --> 00:35:42,240
machine learning things that
allowed the code to put together
591
00:35:42,240 --> 00:35:44,880
different graphic looks on the
spot.
592
00:35:44,880 --> 00:35:47,080
And this is 2010.
Wow.
593
00:35:47,320 --> 00:35:49,600
Yeah, that was very
early. And it's been a
594
00:35:49,600 --> 00:35:52,080
hallmark of a lot of the work
we've done, especially on the
595
00:35:52,080 --> 00:35:56,600
permanent installation side of
creating what we dream to be
596
00:35:56,600 --> 00:36:00,920
semi autonomous art pieces, you
know, art pieces that
597
00:36:00,920 --> 00:36:06,040
would think for themselves or
do things because of
598
00:36:06,040 --> 00:36:07,200
environmental.
That's interesting.
599
00:36:07,280 --> 00:36:08,440
Yeah.
What the
600
00:36:08,440 --> 00:36:11,400
weather was, what the traffic's
like around the building.
601
00:36:11,880 --> 00:36:14,160
You know how many people are
moving through a lobby and then
602
00:36:14,160 --> 00:36:17,720
suddenly this LED sculpture was
behaving differently, showing
603
00:36:17,720 --> 00:36:19,400
different colors, doing
different patterns.
604
00:36:19,400 --> 00:36:24,200
We repeated that in an
installation at a Salesforce
605
00:36:24,200 --> 00:36:27,320
building up here.
We repeated it in a bunch of
606
00:36:27,320 --> 00:36:30,760
different installations.
And then in 2018, when we
607
00:36:30,760 --> 00:36:34,800
designed GM World, which is a
visitor center in the lobby of
608
00:36:34,800 --> 00:36:38,640
General Motors Global
headquarters, where we had 17
609
00:36:38,640 --> 00:36:42,760
moving LED screens, we came up
with a thing called the
610
00:36:42,760 --> 00:36:47,720
Interactive Content Engine,
which would basically take an
611
00:36:47,720 --> 00:36:52,080
input from the GM people, but
also environmental input and
612
00:36:52,120 --> 00:36:55,920
computer vision input.
It would put together new pieces
613
00:36:55,920 --> 00:36:58,240
of content every day for those
screens.
614
00:36:58,480 --> 00:37:01,680
And that came from the chief
marketing officer for GM, Tim
615
00:37:01,680 --> 00:37:06,200
Mahoney saying, Bobby, how can
we make sure that people's
616
00:37:06,200 --> 00:37:09,560
experience of this area is
different this Tuesday at 9:00
617
00:37:09,560 --> 00:37:13,880
AM from next Tuesday at 9 AM? I
don't have a budget to pay an
618
00:37:13,880 --> 00:37:17,720
agency to be constantly making
new content.
619
00:37:17,800 --> 00:37:19,280
Yeah.
And I was like, well, I think we
620
00:37:19,280 --> 00:37:21,240
have an answer.
And that was because we had
621
00:37:21,240 --> 00:37:24,680
been doing all of this work.
So we built the interactive
622
00:37:24,680 --> 00:37:29,560
content engine, which
does this, it comes up with
623
00:37:29,560 --> 00:37:35,360
whole... basically, GM is making new
content every day, like terabytes
624
00:37:35,360 --> 00:37:37,640
of it.
So we were having all of that go
625
00:37:37,640 --> 00:37:43,080
into a bucket and we had one
machine learning piece of code
626
00:37:43,360 --> 00:37:46,680
that would look at every piece
of content that that GM
627
00:37:46,680 --> 00:37:50,600
contributed.
So an ad for, you know, an ad
628
00:37:50,600 --> 00:37:57,400
for Chevy Silverado, training
video, marketing videos,
629
00:37:57,440 --> 00:37:59,280
etcetera.
It would go through all of these
630
00:37:59,280 --> 00:38:02,800
videos and it would look at the
videos and it would tag those
631
00:38:02,800 --> 00:38:06,200
videos.
So we put meta tags on it to say
632
00:38:06,200 --> 00:38:09,120
this piece of content features a
Chevy Silverado.
633
00:38:09,640 --> 00:38:13,640
And I can see that it has this
trim package and I can see that
634
00:38:13,640 --> 00:38:17,400
the the truck is on the beach
and there appears to be a family
635
00:38:17,400 --> 00:38:20,920
around it of a mother and a dad
and some kids.
636
00:38:21,240 --> 00:38:24,680
And so it would then add all
this metadata to those clips.
637
00:38:25,000 --> 00:38:29,080
So all of the media going into
the bucket now
638
00:38:29,080 --> 00:38:33,440
had a way to identify, you know,
through metadata, what it was.
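As a rough sketch of that tagging pass, with invented clip names and tag fields standing in for what the vision system actually produced:

# Hypothetical sketch of the metadata-tagging step described above.
# detect_features() stands in for the machine-learning/vision pass;
# the tag names and values here are invented examples.
from dataclasses import dataclass, field

@dataclass
class Clip:
    filename: str
    tags: dict = field(default_factory=dict)

def detect_features(clip: Clip) -> dict:
    # Placeholder for the vision model that watched each video.
    return {
        "vehicle": "Chevy Silverado",
        "trim": "LTZ",  # invented trim value
        "setting": "beach",
        "people": ["mother", "dad", "kids"],
    }

def tag_clip(clip: Clip) -> Clip:
    # Attach the detected metadata so the clip is searchable later.
    clip.tags.update(detect_features(clip))
    return clip

bucket = [tag_clip(Clip("silverado_beach_ad.mp4"))]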
639
00:38:33,640 --> 00:38:36,160
Then we built a different
machine learning program and
640
00:38:36,160 --> 00:38:39,440
this was the interactive content
engine, which on a given day,
641
00:38:39,440 --> 00:38:45,280
let's say it's February, it's
truck month, the Chinese
642
00:38:45,280 --> 00:38:48,920
manufacturing representatives
from the factory that GM has in
643
00:38:48,920 --> 00:38:53,720
Shanghai are visiting and
there's there's more traffic
644
00:38:53,720 --> 00:38:55,640
around the Renaissance Center
than usual.
645
00:38:55,960 --> 00:38:58,760
The interactive content engine
would know all of these things.
646
00:38:58,760 --> 00:39:01,280
It would know the stuff about
who was visiting because GM
647
00:39:01,280 --> 00:39:05,120
would tell it, but it would know
what the weather was.
648
00:39:05,120 --> 00:39:06,440
It would know what the traffic
was.
649
00:39:06,440 --> 00:39:09,480
It would know things like it's
truck month and then it would go
650
00:39:09,480 --> 00:39:13,080
into the media buckets and say,
oh, I've got a clip of a Chevy
651
00:39:13,080 --> 00:39:14,920
Silverado.
Oh, it's on the beach.
652
00:39:15,160 --> 00:39:17,480
That's good because it's cold
outside.
653
00:39:17,480 --> 00:39:21,240
So we want to put nice warm
images on the screen.
654
00:39:21,520 --> 00:39:25,920
And I can also use it because
the Silverado's drivetrains
655
00:39:25,920 --> 00:39:29,760
are manufactured in, you know,
and I'm making this up, by the
656
00:39:29,760 --> 00:39:32,160
way.
this is, you know,
657
00:39:32,240 --> 00:39:35,520
not actual content, but they're
building engines in China.
658
00:39:35,800 --> 00:39:38,280
They probably aren't.
But yeah, you know, it it would
659
00:39:38,280 --> 00:39:42,120
know, oh, these pieces of that
truck are made in the Chinese
660
00:39:42,120 --> 00:39:44,840
plant.
So the Chinese contingents going
661
00:39:44,840 --> 00:39:47,320
to be happy to see that.
And then it would put together a
662
00:39:47,320 --> 00:39:50,400
whole content sequence based
around that.
663
00:39:50,400 --> 00:39:54,920
And it does that, and as
the more it does it, the more it
664
00:39:54,920 --> 00:39:57,480
can build its model out and it
gets better.
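A minimal Python sketch of the two-stage pipeline Bob describes here. The tag names, file names, and context values below are hypothetical stand-ins; the real engine would run trained vision models over each clip and pull live weather, traffic, and calendar feeds.

    # Stage 1 tags incoming clips with metadata; stage 2 scores the
    # tagged library against today's context and builds a sequence.
    from dataclasses import dataclass, field

    @dataclass
    class Clip:
        path: str
        tags: set = field(default_factory=set)   # e.g. {"silverado", "beach", "family"}

    def tag_clip(path: str) -> Clip:
        # Stand-in for the computer-vision tagger: a real system would run
        # object detection / scene classification over sampled frames.
        return Clip(path, {"silverado", "beach", "family"})

    def score(clip: Clip, context: set) -> int:
        # More overlap between a clip's tags and today's context = higher score.
        return len(clip.tags & context)

    def build_sequence(library: list, context: set, n: int = 5) -> list:
        # Pick the n clips that best match today's situation.
        return sorted(library, key=lambda c: score(c, context), reverse=True)[:n]

    # Today's context might combine the marketing calendar ("truck month"),
    # a weather feed ("cold outside"), and events GM reports.
    today = {"silverado", "truck", "beach", "shanghai_delegation"}
    library = [tag_clip(p) for p in ["ad_silverado.mp4", "training_video.mp4"]]
    print([c.path for c in build_sequence(library, today)])

The point of the split is that tagging happens once per clip, while the sequencing step can re-run every day against fresh context.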
665
00:39:57,480 --> 00:40:01,040
So it was learning.
So we were doing that in 2018,
666
00:40:01,040 --> 00:40:04,520
which is, which is, you know,
relatively speaking, like
667
00:40:04,520 --> 00:40:07,480
centuries before we were all
talking about what we're talking
668
00:40:07,480 --> 00:40:10,120
about now.
So we're not, this isn't
669
00:40:10,120 --> 00:40:12,040
something that we've come lately
to.
670
00:40:12,040 --> 00:40:13,960
It's kind of been something
we've been doing from the
671
00:40:13,960 --> 00:40:16,560
beginning.
Now, would the new
672
00:40:16,560 --> 00:40:19,280
tools that have come out
with the chat, you know, on the
673
00:40:19,280 --> 00:40:22,640
large language model side, just
to give you an example like we
674
00:40:22,640 --> 00:40:25,120
use:
ChatGPT affords you the
675
00:40:25,120 --> 00:40:28,000
opportunity to create custom
GPTs.
676
00:40:28,000 --> 00:40:31,920
They call them that; other people
call these agents, right, which
677
00:40:31,920 --> 00:40:35,480
is basically you can train the
version of the model to become
678
00:40:35,480 --> 00:40:39,200
specifically knowledgeable about
something you would like it to
679
00:40:39,200 --> 00:40:42,200
be specifically knowledgeable
about by feeding it a lot of
680
00:40:42,200 --> 00:40:47,520
information, which it then adds
to what OpenAI has already
681
00:40:47,520 --> 00:40:50,240
trained it on and gives it
better context.
682
00:40:50,240 --> 00:40:55,720
So we, for instance, at Mode,
we've been working to train a
683
00:40:55,800 --> 00:41:01,800
custom GPT into being an
experiential strategist, right?
684
00:41:01,800 --> 00:41:05,440
So it helps us with our
corporate world like we can.
685
00:41:05,800 --> 00:41:09,520
You know, we feed it all the
news about brand activations and
686
00:41:09,520 --> 00:41:14,200
what the big agencies are up to
and what installations were
687
00:41:14,200 --> 00:41:19,440
at South by Southwest this year and
what you know what, what popular
688
00:41:19,760 --> 00:41:24,280
installations out there and the
news around them and data around
689
00:41:24,280 --> 00:41:26,160
it and stuff.
We've been working to build this
690
00:41:26,160 --> 00:41:31,880
custom GPT, which has now become
quite an expert at, you know, so
691
00:41:31,920 --> 00:41:36,200
now I can come up to it and say,
hey, I'm going to do, I'm going
692
00:41:36,200 --> 00:41:39,040
to do a big engagement for a
tech company.
693
00:41:39,360 --> 00:41:43,120
What would be a great
interactive, immersive,
694
00:41:43,120 --> 00:41:49,360
temporary engagement to show
clients or to show guests the
695
00:41:49,360 --> 00:41:57,920
power of using, you know, using
a particular software suite to
696
00:41:57,920 --> 00:41:59,960
get this done in their lives?
Yeah.
697
00:42:00,160 --> 00:42:03,720
And this custom GPT will then
think about it, and it'll go
698
00:42:03,720 --> 00:42:05,160
into all the training we've
given it.
699
00:42:05,360 --> 00:42:09,200
And it'll say, here's some great
ideas you should set up, you
700
00:42:09,200 --> 00:42:13,040
know, the high-tech teepee on the
ground and you should like, you
701
00:42:13,040 --> 00:42:16,600
know, put these, put these
installations in it and these
702
00:42:16,600 --> 00:42:19,800
engagement points in it.
And here's the strategic reasons
703
00:42:19,800 --> 00:42:23,600
why that is good.
Here's stuff I know from metrics
704
00:42:23,600 --> 00:42:26,800
on other activations,
corporations have done that have
705
00:42:26,800 --> 00:42:30,080
been successful or not and why
this might be a good idea to do
706
00:42:30,080 --> 00:42:32,360
it.
So in one way we're using it.
707
00:42:33,280 --> 00:42:35,840
We're using artificial
intelligence in our work as a
708
00:42:35,840 --> 00:42:42,680
creative agency to help us with
coming up with new ideas and
709
00:42:42,680 --> 00:42:46,840
strategy stuff, in particular in
that corporate activation
710
00:42:46,840 --> 00:42:48,560
market.
So that's one way we're using AI
711
00:42:49,000 --> 00:42:51,280
one second.
Go back up one second.
712
00:42:51,280 --> 00:42:54,360
So like, I know what a custom
GPT is.
713
00:42:54,360 --> 00:42:57,720
I know how a custom GPT comes
about.
714
00:42:57,720 --> 00:43:04,960
But can you explain like just a
very simple custom GPT that
715
00:43:04,960 --> 00:43:08,480
somebody might build to do
something and and what that
716
00:43:08,480 --> 00:43:11,320
would entail?
Definitely, definitely.
717
00:43:11,320 --> 00:43:16,440
So you know, let's say that you
run, let's say that you run a
718
00:43:17,320 --> 00:43:20,440
major lighting distributing
company or distributorship,
719
00:43:20,440 --> 00:43:23,440
let's say, let's say you're
Ben Saltzman at ACT
720
00:43:23,440 --> 00:43:25,360
Lighting.
Or let's say you're, you know,
721
00:43:25,440 --> 00:43:28,160
you're Colin Waters at TMB or
something like that.
722
00:43:28,240 --> 00:43:31,480
Yeah.
And you've got a lot of people
723
00:43:31,480 --> 00:43:34,720
on your staff, right?
You're managing a lot of
724
00:43:34,720 --> 00:43:36,800
employees.
But you're from show business,
725
00:43:36,800 --> 00:43:38,600
right?
You're not necessarily the best
726
00:43:38,600 --> 00:43:39,880
HR guy, right?
Yeah.
727
00:43:40,320 --> 00:43:44,160
And but you've got employees who
want to know, is it appropriate
728
00:43:44,160 --> 00:43:49,960
for me to tell my coworker at
work, wow, those jeans make your
729
00:43:49,960 --> 00:43:52,240
butt look good today or, you
know, whatever.
730
00:43:52,240 --> 00:43:53,760
Right.
Like or whatever.
731
00:43:53,760 --> 00:43:54,200
Right.
Yeah.
732
00:43:54,680 --> 00:43:57,160
You could phrase.
I could totally see Ben saying
733
00:43:57,160 --> 00:43:59,280
that.
Yeah, but I could not see that.
734
00:43:59,560 --> 00:44:01,800
Exactly.
But I could see other people
735
00:44:01,800 --> 00:44:03,240
maybe.
Of course, of course.
736
00:44:03,240 --> 00:44:06,880
I think I might have said that
in my career 100 times or so.
737
00:44:07,320 --> 00:44:08,480
You've been around a while.
I'm going.
738
00:44:08,680 --> 00:44:11,480
To be controversial about it,
let's say that you are running a
739
00:44:11,480 --> 00:44:14,840
company and you want, you want
your employees to be able to
740
00:44:14,840 --> 00:44:18,360
find out like can I take a
personal day or what are the
741
00:44:18,520 --> 00:44:22,760
rules about if I'm late to work,
right?
742
00:44:22,800 --> 00:44:24,400
Or.
And you don't want them to have
743
00:44:24,400 --> 00:44:27,840
to go find an employee manual
that you gave them when they got
744
00:44:27,840 --> 00:44:30,840
hired that they probably stuck
in a drawer and they don't know.
745
00:44:31,000 --> 00:44:35,920
So for instance, you could take
your employee manual and you
746
00:44:35,920 --> 00:44:41,520
could take your HR guidelines
and you could train a custom GPT
747
00:44:41,520 --> 00:44:45,520
on that and make that into a
chat bot that exists on your
748
00:44:45,520 --> 00:44:48,000
company's intranet.
So it's a great application
749
00:44:48,000 --> 00:44:53,000
to know, hey, what is my
company's policy on PTO and they
750
00:44:53,000 --> 00:44:56,640
can engage with the chat bot and
say how many, how many days off
751
00:44:56,640 --> 00:45:01,000
am I allowed in a given year
and what for and that and now
752
00:45:01,000 --> 00:45:06,760
that custom GPT is like your
ultimate unbiased HR
753
00:45:06,760 --> 00:45:08,240
representative.
Right.
754
00:45:08,360 --> 00:45:11,840
Yep.
Who your employees have 24 hour
755
00:45:11,840 --> 00:45:15,480
immediate access to.
Yep, that's a good example of.
756
00:45:15,800 --> 00:45:19,720
It is, yeah.
That's a great rather low tech
757
00:45:20,440 --> 00:45:23,840
example of why and how you would
create a custom GPT.
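For the curious, here is roughly what that looks like in code, as a minimal sketch using the OpenAI Python client. Custom GPTs are normally assembled in ChatGPT's no-code builder by uploading documents; this approximates the same effect by placing the manual text in the system prompt. The file name and model choice are placeholders.

    from openai import OpenAI

    client = OpenAI()  # expects OPENAI_API_KEY in the environment
    manual = open("employee_manual.txt").read()  # hypothetical file

    def ask_hr_bot(question: str) -> str:
        # Ground the model in the company's own documents, then ask.
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": "You are an HR assistant. Answer only from this "
                            "employee manual:\n" + manual},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    print(ask_hr_bot("How many PTO days am I allowed in a given year?"))

Wrap that function in a simple web page on the company intranet and you have the 24-hour HR chat bot described above.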
758
00:45:24,120 --> 00:45:26,720
So thanks.
Thanks for going through that
759
00:45:26,720 --> 00:45:28,600
one.
So where were you going next
760
00:45:28,600 --> 00:45:29,960
that you were on to something
when I.
761
00:45:29,960 --> 00:45:31,040
Stopped.
So there's another.
762
00:45:31,160 --> 00:45:35,560
So that's one kind of AI which
are these kind
763
00:45:35,560 --> 00:45:39,720
of very intelligent chat bots.
The other kind of AI I think
764
00:45:39,720 --> 00:45:42,200
that a lot of people are seeing
and talking about is the
765
00:45:42,200 --> 00:45:45,800
generative art AI.
So, things that are creating the
766
00:45:45,800 --> 00:45:52,040
pictures, like DALL-E or
Midjourney or programs like that,
767
00:45:52,480 --> 00:46:00,280
you know, and now also the ones
that create video like Runway's
768
00:46:00,280 --> 00:46:05,200
Gen-3, the one that
you mentioned that you just
769
00:46:05,200 --> 00:46:07,560
used.
There's a spectrum of these
770
00:46:07,560 --> 00:46:11,720
these apps as well.
And what these are is these are
771
00:46:12,680 --> 00:46:17,920
diffusion-based transformer apps.
772
00:46:17,920 --> 00:46:21,280
And I don't want to, I don't
want to make people's eyes roll
773
00:46:21,280 --> 00:46:22,800
into the back of their heads,
right.
774
00:46:22,840 --> 00:46:27,000
But basically what these apps do
is they've been trained on, that
775
00:46:27,000 --> 00:46:33,320
is to say they have been exposed
to a vast volume of
776
00:46:33,320 --> 00:46:37,560
visual information, pictures,
videos, etcetera.
777
00:46:38,120 --> 00:46:41,360
And then what they do, and I,
and I want to be careful because
778
00:46:41,360 --> 00:46:43,560
I want to make sure that people
really understand this.
779
00:46:44,120 --> 00:46:47,920
They don't absorb all that and
it sits in the database.
780
00:46:47,920 --> 00:46:51,080
And then, and I think this is a
a misapprehension people have
781
00:46:51,400 --> 00:46:54,400
that like they then go into mid
journey and they say, give me a
782
00:46:54,400 --> 00:47:00,600
picture, you know, of, you know,
a lady walking down the street
783
00:47:01,200 --> 00:47:05,560
in the start, you know, you
know, as if it were a painting
784
00:47:05,560 --> 00:47:08,120
made by Monet.
Yeah.
785
00:47:08,800 --> 00:47:13,560
Those programs then don't go
look at that moment at Monet's
786
00:47:13,560 --> 00:47:17,400
art.
And they don't then like sample
787
00:47:17,400 --> 00:47:21,440
that and put paste together a
collage of existing stuff.
788
00:47:22,480 --> 00:47:25,000
Here's what happens.
And it and it's, and it's much
789
00:47:25,000 --> 00:47:28,720
in the same way like people who
go to art school, the programs
790
00:47:28,720 --> 00:47:31,920
are trained on the materials.
So the materials end up
791
00:47:31,920 --> 00:47:36,840
resulting in a database that
contains no pictorial imagery
792
00:47:36,840 --> 00:47:39,960
whatsoever.
It is literally a
793
00:47:40,040 --> 00:47:45,240
four-dimensional... hang on there.
It is a four-dimensional
794
00:47:47,520 --> 00:47:52,800
mathematical space that
basically has a bunch of data
795
00:47:53,320 --> 00:47:59,160
about composition, line weight,
color, aesthetics, styles.
796
00:47:59,160 --> 00:48:01,480
There's no pictures in it
whatsoever, whatsoever.
797
00:48:01,640 --> 00:48:05,560
If you were to look at it, if
you could look at it, because
798
00:48:05,560 --> 00:48:09,200
looking at a latent space
database has problems of its
799
00:48:09,200 --> 00:48:12,960
own, it would look like a
massive table of numbers.
800
00:48:13,040 --> 00:48:16,280
Yeah.
So basically when you enter a
801
00:48:16,280 --> 00:48:20,480
prompt into a program like
Midjourney, which is a very popular
802
00:48:22,920 --> 00:48:27,600
image generator, it looks at
what you've written, right?
803
00:48:27,600 --> 00:48:30,520
Yeah.
I want a picture of a blue
804
00:48:30,520 --> 00:48:35,200
painted picture of a crazy
purple haired, you know,
805
00:48:35,200 --> 00:48:42,440
entertainment designer, you
know, sitting in a room full of
806
00:48:42,440 --> 00:48:47,120
clouds, right?
It turns that verbiage into
807
00:48:47,120 --> 00:48:51,600
tokens, right?
It translates what you've
808
00:48:51,600 --> 00:48:55,560
written into tokens.
Now each of those tokens, it
809
00:48:55,840 --> 00:49:02,760
relates to a specific thing, a
color, a descriptive term, male,
810
00:49:02,760 --> 00:49:04,760
female, you know, etcetera,
etcetera.
811
00:49:04,760 --> 00:49:09,440
And then the program takes
those tokens which basically
812
00:49:09,440 --> 00:49:15,560
give it numbers, data points, it
goes off into its latent space
813
00:49:15,560 --> 00:49:18,800
database which has been
generated from the training and
814
00:49:18,800 --> 00:49:25,600
it matches the tokens to areas
of the numerical matrix.
815
00:49:25,880 --> 00:49:28,600
Yeah.
And so it understands that
816
00:49:28,600 --> 00:49:31,920
bananas are yellow and it
understands that clouds are
817
00:49:31,920 --> 00:49:34,840
puffy and white.
And it understands that, you
818
00:49:34,840 --> 00:49:38,360
know, hair looks like this and
etcetera, etcetera.
819
00:49:38,360 --> 00:49:44,720
And it goes into that space, and
then the app begins with noise,
820
00:49:45,240 --> 00:49:47,840
right?
It begins with literally what
821
00:49:47,840 --> 00:49:50,480
you might call static if you
were looking at it.
822
00:49:51,400 --> 00:49:55,680
And then it sharpens it and
sharpens it and sharpens it
823
00:49:55,880 --> 00:49:59,680
until an image begins to appear
based off the tokens and based
824
00:49:59,680 --> 00:50:03,680
off the data it's gotten from
its latent space database.
825
00:50:04,000 --> 00:50:07,480
Now, the best ones, the best
versions of these programs are
826
00:50:07,480 --> 00:50:10,880
actually two AIs.
One AI is doing this work.
827
00:50:11,480 --> 00:50:15,320
The other AI is sitting there
serving as a real time critic
828
00:50:15,640 --> 00:50:19,560
and going that's not a banana.
Yeah, I don't know about that.
829
00:50:19,880 --> 00:50:22,760
Right.
And basically, one
830
00:50:22,760 --> 00:50:25,880
AI is helping the other AI to
tune its results.
831
00:50:26,160 --> 00:50:29,960
And that fuzzy static picture
gets clearer and clearer and
832
00:50:29,960 --> 00:50:34,840
clearer and clearer until it
matches the verbal request, the
833
00:50:34,840 --> 00:50:37,160
typed request you'd give it.
Yeah.
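As a concrete reference point, the flow just described (prompt to tokens, tokens matched against the latent space, then noise sharpened step by step) is what runs inside an open pipeline like Stable Diffusion, which you can drive in a few lines with Hugging Face's diffusers library. One caveat on the two-AI picture: a generator paired with a critic network is, strictly speaking, how GANs work; diffusion pipelines like this one steer the denoiser with a guidance scale rather than a second network.

    import torch
    from diffusers import StableDiffusionPipeline

    # A public checkpoint; the text encoder turns the prompt into token
    # embeddings, then the denoiser starts from pure random noise and
    # sharpens it, pass by pass, toward an image matching those embeddings.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    image = pipe(
        "a woman walking down the street, impressionist painting, "
        "pastel palette, soft and painterly",
        num_inference_steps=30,   # how many sharpen-the-static passes to run
        guidance_scale=7.5,       # how strongly to steer toward the prompt
    ).images[0]
    image.save("impressionist_street.png")

At no point does the pipeline fetch or paste an existing picture; everything comes out of the learned numerical space.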
834
00:50:37,240 --> 00:50:41,160
So it is not sampling.
It is not copying.
835
00:50:41,680 --> 00:50:46,480
It is not directly going out,
scraping the Internet and
836
00:50:47,040 --> 00:50:51,480
putting a picture together.
It has learned about.
837
00:50:51,600 --> 00:50:55,480
It's learned about, you know, a
Monet painting.
838
00:50:55,800 --> 00:50:58,920
Monet paintings are soft,
they're impressionistic.
839
00:50:58,920 --> 00:51:02,600
They're painterly.
They usually have a more pastel
840
00:51:02,600 --> 00:51:05,640
palette, right?
And so now the computer program
841
00:51:05,640 --> 00:51:10,160
knows what pastel means.
It knows what warm means.
842
00:51:10,760 --> 00:51:13,560
It knows what soft and painterly
means.
843
00:51:13,720 --> 00:51:17,240
And instead of sharp and
graphic, right.
844
00:51:17,920 --> 00:51:21,240
And so when you prompted to give
you something like that, it
845
00:51:21,240 --> 00:51:24,760
doesn't go find a Monet painting
and bend it and twist it into
846
00:51:24,760 --> 00:51:26,240
something you want.
Yeah.
847
00:51:26,480 --> 00:51:29,040
It makes, that's such a great
point.
848
00:51:29,440 --> 00:51:31,880
a completely custom picture.
Yeah.
849
00:51:32,240 --> 00:51:37,000
That that references the
aesthetic, the composition, the
850
00:51:37,000 --> 00:51:39,480
style.
It's just like me going to art
851
00:51:39,480 --> 00:51:42,880
school, which I did. I, you
know, I attended RISD for a
852
00:51:42,880 --> 00:51:44,520
while, Rhode Island School of
Design.
853
00:51:45,440 --> 00:51:49,040
When I went to RISD, I showed up
and the first thing they did was
854
00:51:49,040 --> 00:51:54,160
teach us art history and we
learned about the great masters
855
00:51:54,160 --> 00:51:57,400
and we looked at their work and
we went into the museum and they
856
00:51:57,400 --> 00:52:00,800
would make us try to recreate
their work, right?
857
00:52:01,240 --> 00:52:02,720
They would.
And at the same time, they're
858
00:52:02,720 --> 00:52:06,160
teaching us techniques.
You know, they would teach us
859
00:52:06,160 --> 00:52:09,400
the techniques of these artists
so that we could do things like
860
00:52:09,400 --> 00:52:13,000
what they did, having looked at
what they've done before and
861
00:52:13,000 --> 00:52:18,400
been inspired by that, and then
go out and, and then they taught
862
00:52:18,400 --> 00:52:21,080
us how to synthesize all these
things together to come up with
863
00:52:21,080 --> 00:52:24,560
our own individual style.
You know, the way these apps
864
00:52:24,560 --> 00:52:27,360
work is a lot like that.
You know, they get exposed to
865
00:52:27,360 --> 00:52:30,120
things.
They get taught why those things
866
00:52:30,120 --> 00:52:33,720
happened and how they work, and
then they get taught on how to
867
00:52:33,720 --> 00:52:36,680
create things like it.
Well, you know, it's such an
868
00:52:36,680 --> 00:52:41,040
amazing point because this is
one of the greatest fears.
869
00:52:41,040 --> 00:52:44,800
You know, it is just the the
fact that it's going to, you
870
00:52:44,800 --> 00:52:48,480
know, copy music, it's going to
take people's likeness and do
871
00:52:48,480 --> 00:52:50,040
things to it.
It's going to do this.
872
00:52:50,040 --> 00:52:54,480
It's going to all of this copy
replica stuff is such a big
873
00:52:54,480 --> 00:52:57,240
concern, but it's no different
than, you know, with music.
874
00:52:57,240 --> 00:52:59,280
There's only so many notes in
music.
875
00:52:59,320 --> 00:53:02,160
And so yes, there's a good
chance the song's going to have
876
00:53:02,160 --> 00:53:06,440
an A in it and it might have a G
or a B minor or something, right.
877
00:53:06,960 --> 00:53:10,160
And but that doesn't mean you're
copying another song that's got
878
00:53:10,160 --> 00:53:14,240
a B minor and an A in it.
You know, it's and no different
879
00:53:14,240 --> 00:53:17,520
than than lighting design.
You know, you're going to use a
880
00:53:17,520 --> 00:53:19,720
similar color of red.
You're going to use a similar
881
00:53:19,720 --> 00:53:23,240
color of blue, perhaps doesn't
mean you copied, you know, John
882
00:53:23,240 --> 00:53:26,680
Featherstone or, or Cosmo Wilson
or Mark Rickman.
883
00:53:26,680 --> 00:53:29,440
You know it.
It just means that you're doing
884
00:53:29,440 --> 00:53:31,160
a design.
And when you do that design,
885
00:53:31,160 --> 00:53:33,520
you're taking all those
experiences, all the other
886
00:53:33,520 --> 00:53:35,360
things you've seen and learned
over the years.
887
00:53:35,640 --> 00:53:38,680
And you know, you're, you're
fitting them together in a way
888
00:53:38,680 --> 00:53:40,840
that matches this particular
performance, right.
889
00:53:41,240 --> 00:53:43,600
And so I think AI does a lot of
that as well.
890
00:53:43,600 --> 00:53:47,440
And, and you know, the way you
just explained it is incredible.
891
00:53:47,440 --> 00:53:50,200
Like, I don't know that I've
ever, I've seen 1000 YouTube
892
00:53:50,200 --> 00:53:52,800
videos and I don't think I've
seen anybody explain it quite as
893
00:53:52,800 --> 00:53:56,440
well as you did just now,
especially the two AIs.
894
00:53:56,720 --> 00:53:59,480
You know, when you're saying 1
is basically creating it and one
895
00:53:59,480 --> 00:54:03,640
is critiquing it.
So how do things like 3 arms or
896
00:54:03,640 --> 00:54:06,680
three eyeballs happen, then?
897
00:54:06,680 --> 00:54:10,840
What it is, is that, you know, AI doesn't
understand context, right?
898
00:54:10,840 --> 00:54:15,400
Like it doesn't... no, like it has
899
00:54:15,440 --> 00:54:18,040
a general sense of what a hand
looks like.
900
00:54:18,720 --> 00:54:21,640
You know, it's a big blob with
some sticks coming off of it.
901
00:54:21,640 --> 00:54:25,440
Kind of like it knows from a
compositional standpoint what a
902
00:54:25,440 --> 00:54:28,360
hand looks like.
It doesn't understand because,
903
00:54:28,360 --> 00:54:32,200
you know, the AIs aren't trained with
904
00:54:32,200 --> 00:54:35,120
context.
And this is why there's always
905
00:54:35,120 --> 00:54:38,560
going to be a place for artists
in this, because artists and
906
00:54:38,560 --> 00:54:43,000
creatives, I mean, part of what
makes up me anyway creative is
907
00:54:43,040 --> 00:54:46,200
that I have the, you know, a ton
of books behind me, right?
908
00:54:46,200 --> 00:54:47,960
If you could see the office
around me.
909
00:54:47,960 --> 00:54:49,960
Yeah.
All books.
910
00:54:50,160 --> 00:54:50,880
Yeah.
Right.
911
00:54:50,880 --> 00:54:54,600
And a lot of what I bring to my
creativity is context.
912
00:54:54,640 --> 00:54:57,320
Yeah.
And the AIS don't understand
913
00:54:57,320 --> 00:54:58,840
context.
So yes, it's true.
914
00:54:58,840 --> 00:55:02,480
Like the classic thing of like,
you know, you can tell an AI
915
00:55:02,480 --> 00:55:04,680
image because everybody's got 7
fingers.
916
00:55:04,760 --> 00:55:07,600
Yeah, they got the wrong number
of fingers and toes.
917
00:55:08,760 --> 00:55:12,200
The AIs don't... they understand
that they're making, they're
918
00:55:12,200 --> 00:55:17,240
going to, they're going to
sharpen a diffuse thing into a
919
00:55:17,240 --> 00:55:21,400
picture of something that
it's been told is the shape
920
00:55:21,400 --> 00:55:24,040
of a human.
But the AIs, you know,
921
00:55:24,040 --> 00:55:29,280
these creative AIs haven't been
trained deeply in human
922
00:55:29,280 --> 00:55:32,200
anatomy.
Yeah, you know, they don't know.
923
00:55:32,200 --> 00:55:36,160
So there's, there's like so much
that they don't know about yet,
924
00:55:37,200 --> 00:55:41,480
you know, and I say yet quite on
purpose, because, you know, with
925
00:55:41,520 --> 00:55:44,960
every time they refine these
models and train these models,
926
00:55:44,960 --> 00:55:47,840
for instance, the whole
phenomenon of too many fingers
927
00:55:47,840 --> 00:55:50,920
on hands has been
reduced and reduced and reduced
928
00:55:50,920 --> 00:55:54,440
because it
became something that people
929
00:55:54,440 --> 00:55:56,640
noticed.
And so the people training the
930
00:55:56,640 --> 00:56:03,360
models were like, oh, I can fix
that now, you know, and and
931
00:56:03,360 --> 00:56:07,320
that's the thing, you can just
like I talked about training the
932
00:56:07,320 --> 00:56:10,640
large language model to become
an expert in experiential
933
00:56:10,640 --> 00:56:14,920
strategy, you can train a
graphics model to get much
934
00:56:14,920 --> 00:56:17,440
better at a specific thing you
want it to be much better at.
935
00:56:17,440 --> 00:56:22,040
For instance, when we did Blake
Shelton's Back to
936
00:56:22,040 --> 00:56:27,040
the Honky Tonk tour, we wanted
to build a big virtual piece of
937
00:56:27,040 --> 00:56:30,600
scenery in the video that was
kind of supposed to be like
938
00:56:30,600 --> 00:56:34,760
Blake's barn on his property
that was a honky tonk, and that
939
00:56:34,840 --> 00:56:38,080
one of the things in there would
be all of these honky tonk
940
00:56:38,240 --> 00:56:44,320
posters of Blake, right?
Like Blake plays, you know, the
941
00:56:44,360 --> 00:56:47,520
Riverfront Bar and Grill
and Blake plays this and Blake
942
00:56:47,520 --> 00:56:51,920
plays that.
Well, you know, an AI doesn't
943
00:56:51,920 --> 00:56:53,680
really know what Blake looks
like.
944
00:56:53,680 --> 00:56:58,560
Like it probably has seen Blake
a few times in the materials
945
00:56:58,560 --> 00:57:00,840
it's been given.
But you're going to get hit or
946
00:57:00,840 --> 00:57:02,920
miss results out of it for sure,
right?
947
00:57:04,040 --> 00:57:09,840
So one thing we did was we used
a technique called LoRA,
948
00:57:10,280 --> 00:57:15,360
which is a way to give a
generative image AI the ability
949
00:57:15,360 --> 00:57:20,640
to gain extra knowledge about a
specific way something looks or
950
00:57:20,640 --> 00:57:24,400
feels.
We trained a version of Stable
951
00:57:24,400 --> 00:57:29,000
Diffusion, which is a different
generative AI art thing, to be
952
00:57:29,000 --> 00:57:32,320
an expert at making pictures of
Blake.
953
00:57:33,360 --> 00:57:34,480
Oh, interesting.
We added
954
00:57:34,680 --> 00:57:38,000
all of these pictures of Blake
from his, from his younger days
955
00:57:38,000 --> 00:57:41,320
to his older days in all kinds
of different, like we, we had
956
00:57:41,320 --> 00:57:43,800
access to all the photographs
that had ever been, you know,
957
00:57:43,840 --> 00:57:48,040
that Blake's people had.
And we trained this AI to be
958
00:57:48,240 --> 00:57:50,480
really good at making images of
Blake.
959
00:57:50,920 --> 00:57:54,040
It was already pretty good at
making images of stuff that
960
00:57:54,040 --> 00:57:55,880
would look like honky tonk
posters.
961
00:57:56,120 --> 00:58:02,160
So now we could prompt it: make a
series of Honky tonk concert
962
00:58:02,160 --> 00:58:06,120
show advertisement posters
featuring Blake Shelton,
963
00:58:06,360 --> 00:58:09,000
and it made hundreds of them for
us to choose.
964
00:58:09,000 --> 00:58:10,800
from. Oh, that's awesome. So
that
965
00:58:10,800 --> 00:58:13,800
was the way that we trained
Stable Diffusion to get better
966
00:58:13,800 --> 00:58:17,240
at that, you know.
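To make the LoRA idea concrete: instead of retraining the whole model, you freeze its weights and learn a small low-rank update alongside each layer. In practice you would use ready-made LoRA training scripts (the diffusers/PEFT ecosystem has them); this toy PyTorch wrapper only illustrates the core trick and is not the actual pipeline used on the tour.

    import torch.nn as nn

    class LoRALinear(nn.Module):
        # Wraps a frozen linear layer with a trainable low-rank update:
        # y = W x + (alpha / r) * B(A(x)), where only A and B are trained.
        def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False          # the original weights stay put
            self.A = nn.Linear(base.in_features, r, bias=False)
            self.B = nn.Linear(r, base.out_features, bias=False)
            nn.init.normal_(self.A.weight, std=0.01)
            nn.init.zeros_(self.B.weight)        # the update starts as a no-op
            self.scale = alpha / r

        def forward(self, x):
            return self.base(x) + self.scale * self.B(self.A(x))

    frozen = nn.Linear(768, 768)      # stand-in for one layer of the base model
    adapted = LoRALinear(frozen)      # only A and B receive gradients in training

Because only the tiny A and B matrices are trained, a few hundred photos of one subject are enough to specialize the model without disturbing everything else it knows.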
But don't a lot of these models
967
00:58:17,240 --> 00:58:21,680
don't a lot of these, like things
like DALL-E and different
968
00:58:22,520 --> 00:58:26,920
image generation models or video
generation models, aren't they
969
00:58:27,120 --> 00:58:32,400
really afraid of like the sort
of anti-plagiarism stuff and, you
970
00:58:32,400 --> 00:58:34,360
know, so aren't they avoiding
that?
971
00:58:34,360 --> 00:58:38,120
Like, aren't they purposely not
allowing you to say, give me a
972
00:58:38,120 --> 00:58:40,840
picture of Taylor Swift with a
mustache or whatever?
973
00:58:40,840 --> 00:58:44,400
100%, and I think that's
probably from a...
974
00:58:44,800 --> 00:58:47,560
It's judicious of them to do
that, of course.
975
00:58:47,560 --> 00:58:49,920
Yeah.
You know, being accessible to a
976
00:58:49,920 --> 00:58:54,520
larger public, there are risks
of misinformation, Yeah, you
977
00:58:54,520 --> 00:58:56,040
know, and.
And the.
978
00:58:56,560 --> 00:58:59,400
You know, the kind of the bad
things that can happen if
979
00:58:59,480 --> 00:59:00,880
you fake.
Yeah.
980
00:59:02,280 --> 00:59:06,800
Which is why we trained Stable
Diffusion, which is an open source
981
00:59:06,800 --> 00:59:10,240
model.
That is not, it's not, you know,
982
00:59:10,240 --> 00:59:13,720
I have that model here in the
studio on our own computers.
983
00:59:13,960 --> 00:59:15,920
It's not accessible to the
public.
984
00:59:15,920 --> 00:59:18,520
OK.
We train this model on Blake.
985
00:59:18,520 --> 00:59:21,200
It's not out in the world.
You can't just, you
986
00:59:21,200 --> 00:59:23,360
can't go to Stable.
You can't download Stable
987
00:59:23,360 --> 00:59:27,040
Diffusion now and it's, you
know, and we've made it an
988
00:59:27,040 --> 00:59:29,760
expert on Blake.
We've made our own version in
989
00:59:29,760 --> 00:59:33,680
our server closet.
very good at making Blake
990
00:59:33,680 --> 00:59:36,080
pictures.
That's cool. Never sent out into
991
00:59:36,080 --> 00:59:38,480
the universe.
So you could do Blake, you could
992
00:59:38,480 --> 00:59:43,560
do Blake Shelton songs, you
know, deep fakes from your
993
00:59:43,560 --> 00:59:44,200
computers.
Yeah.
994
00:59:44,640 --> 00:59:47,840
We can, and intentionally, like
we've been granted the
995
00:59:47,840 --> 00:59:50,160
permission by by Blake to do
that.
996
00:59:50,240 --> 00:59:52,560
Yeah.
And now we've created something
997
00:59:52,560 --> 00:59:56,320
that is like with anything and
is for us it's a highly, you
998
00:59:56,320 --> 00:59:58,920
know, that's a piece of
proprietary technology we.
999
00:59:58,960 --> 01:00:02,400
Have.
And our agreements with
1000
01:00:02,400 --> 01:00:06,120
Blake Shelton and with Blake
Shelton's management is that's
1001
01:00:06,120 --> 01:00:07,840
not publicly accessible.
Yeah.
1002
01:00:07,840 --> 01:00:09,160
Of course.
Yeah, of course.
1003
01:00:09,280 --> 01:00:14,000
And in fact, they own,
they own that...
1004
01:00:14,160 --> 01:00:15,680
All the image rights and
everything.
1005
01:00:15,680 --> 01:00:19,080
else, that version of it.
So yeah, if you ask, for
1006
01:00:19,080 --> 01:00:22,040
instance, Midjourney to give
you a picture of someone famous,
1007
01:00:22,040 --> 01:00:25,240
yeah, they've trained it to...
Yeah, trust me, I've tried.
1008
01:00:25,280 --> 01:00:27,240
Same is true of DALL-E, same is
true of many.
1009
01:00:28,320 --> 01:00:32,040
They've set up guardrails to
make sure that can't happen.
1010
01:00:32,040 --> 01:00:37,920
Yeah, but you have to know
how to code, you have to know
1011
01:00:37,920 --> 01:00:42,000
how to.
It's not easy to grab one of
1012
01:00:42,000 --> 01:00:48,080
these open source AIs like Flux,
yeah, or Stable Diffusion and
1013
01:00:48,080 --> 01:00:51,520
just use it. Like, I have computer
programming
1014
01:00:51,640 --> 01:00:53,600
Yeah, yeah, yeah, yeah.
experience.
1015
01:00:53,600 --> 01:00:56,280
So for me, you know, and even
though I had to puzzle out a
1016
01:00:56,280 --> 01:00:59,800
bunch of stuff in, yeah, an
afternoon, I mean Python. And you
1017
01:00:59,800 --> 01:01:02,880
have to, you have to, it's not.
Yeah, Midjourney
1018
01:01:02,880 --> 01:01:06,360
and ChatGPT and all of
these are end user types of
1019
01:01:06,360 --> 01:01:10,560
programs that a knucklehead like
me can go on to and, you know,
1020
01:01:11,040 --> 01:01:15,600
figure out how to do some, some
prompt engineering and get some
1021
01:01:15,600 --> 01:01:17,720
results out of it, get some
pretty good results out of it,
1022
01:01:17,720 --> 01:01:21,560
which I have.
But no, I mean, I've tried all
1023
01:01:21,560 --> 01:01:23,920
of those things that that you're
talking about, like getting
1024
01:01:23,920 --> 01:01:27,960
images of, of specific people,
specific places, specific
1025
01:01:27,960 --> 01:01:30,840
whatever, and it doesn't do so
well at most of those things.
1026
01:01:31,160 --> 01:01:35,040
So music blows my mind, though
the music ones blow my mind.
1027
01:01:35,320 --> 01:01:39,360
Like I've, I've written some
really good songs on AI and just
1028
01:01:39,360 --> 01:01:41,720
gone, whoa, this is nuts, you
know?
1029
01:01:41,720 --> 01:01:42,840
Yeah.
And then now there are
1030
01:01:42,840 --> 01:01:45,360
guardrails showing up on that
because of course.
1031
01:01:45,600 --> 01:01:48,320
Of course, you know.
The record labels and the
1032
01:01:48,320 --> 01:01:52,680
people that control the rights
to music are
1033
01:01:52,680 --> 01:01:55,480
legally well equipped as we
know.
1034
01:01:56,320 --> 01:01:58,840
And they've, they've you know,
and so now you can't.
1035
01:01:58,840 --> 01:02:01,480
It used to be that there were
some of these, there were
1036
01:02:01,480 --> 01:02:04,480
some of these music AIs you
could go to and you could say, I
1037
01:02:04,480 --> 01:02:10,280
want a song of, like,
Kendrick Lamar singing, you
1038
01:02:10,280 --> 01:02:12,800
know, a song
by The Beatles, right?
1039
01:02:12,800 --> 01:02:14,920
And it would do it and it would
blow your mind.
1040
01:02:14,920 --> 01:02:17,320
It would be like, I never
thought of that, you know,
1041
01:02:17,320 --> 01:02:21,040
And it, and I think it's really
cool, but the industry clamped
1042
01:02:21,040 --> 01:02:23,920
down on it immediately.
And the reason is because they
1043
01:02:23,920 --> 01:02:25,360
don't want the public doing
this.
1044
01:02:25,360 --> 01:02:28,120
And I think there is still a
bunch of great tools out there
1045
01:02:28,120 --> 01:02:31,400
that if you are musically
inclined or a musician, you can
1046
01:02:31,400 --> 01:02:33,560
still use these AI tools to
create music.
1047
01:02:33,560 --> 01:02:39,320
And it's super cool, but you
can't do that that thing.
1048
01:02:40,040 --> 01:02:44,120
But I'll tell you who can is the
labels themselves.
1049
01:02:44,440 --> 01:02:47,320
The reason, you know, the reason
is that the the the record
1050
01:02:47,320 --> 01:02:52,240
labels and the same goes for the
major any of the major media
1051
01:02:52,240 --> 01:02:57,840
companies on the planet.
You know they're going to be and
1052
01:02:57,840 --> 01:02:59,880
some of them already very much
are.
1053
01:03:00,120 --> 01:03:06,320
They are training their own
versions of these AIs so
1054
01:03:07,320 --> 01:03:10,320
they can have the ability.
So they can sign an artist to
1055
01:03:10,320 --> 01:03:14,360
four albums and do 12.
Yeah, yeah.
1056
01:03:14,360 --> 01:03:18,440
I mean, they own the
likeness of the artist or
1057
01:03:18,440 --> 01:03:22,400
whatever.
And, and yeah, I mean, that's
1058
01:03:22,400 --> 01:03:24,240
scary.
That's definitely some scary
1059
01:03:24,240 --> 01:03:27,200
stuff right there.
It is scary, but, but I think
1060
01:03:27,200 --> 01:03:29,280
there were a lot of things that
a lot of people were scared
1061
01:03:29,320 --> 01:03:32,120
of, there are a lot.
The big overwhelming,
1062
01:03:32,120 --> 01:03:34,880
overarching scary thing that a
lot of people thought about was
1063
01:03:34,880 --> 01:03:37,240
like, oh, are we headed into
Terminator times?
1064
01:03:37,280 --> 01:03:40,560
Is there going to be an
artificial general intelligence
1065
01:03:40,840 --> 01:03:44,320
that basically decides that
humans are optional
1066
01:03:44,320 --> 01:03:46,000
and probably should be
eliminated?
1067
01:03:46,400 --> 01:03:51,520
We're not close to that.
I'm here to tell you, really
1068
01:03:52,240 --> 01:03:57,760
smart sounding chat bots are
nowhere near being an artificial
1069
01:03:57,760 --> 01:04:02,040
general intelligence, right?
It's a different, it's a whole
1070
01:04:02,040 --> 01:04:04,000
different, I know from
programming.
1071
01:04:04,000 --> 01:04:07,640
But did you hear Ray Kurzweil?
I think what where was he?
1072
01:04:07,640 --> 01:04:10,960
Was it on Joe Rogan?
Or maybe it was, I can't
1073
01:04:10,960 --> 01:04:16,840
remember which podcast it was
on, but he says 2029.
1074
01:04:18,160 --> 01:04:21,000
I don't think it's quite that
close.
1075
01:04:21,000 --> 01:04:25,760
I think, look, the kind of code
that results in an AGI depends
1076
01:04:25,760 --> 01:04:31,400
on a very different structure.
Yeah, you know, and the chat bot
1077
01:04:31,400 --> 01:04:35,240
is basically, what
it's really good at is guessing
1078
01:04:35,240 --> 01:04:39,240
the next word that you want.
Yeah, right.
1079
01:04:39,240 --> 01:04:41,040
That's.
When you get right down to the
1080
01:04:41,040 --> 01:04:42,480
root of what it's doing, that's
what it's doing.
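You can see that guess-the-next-word machinery directly by pulling the raw next-token probabilities out of a small public model like GPT-2 with the transformers library; the prompt here is just an example.

    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    ids = tok("The lighting rig hangs from the", return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits[0, -1]    # scores for every possible next token
    probs = torch.softmax(logits, dim=-1)
    top = torch.topk(probs, 5)
    for p, i in zip(top.values, top.indices):
        print(f"{tok.decode(i)!r}: {p:.3f}")  # the model's top guesses, with probabilities

Everything a chat bot does is built out of repeating that one step, which is a very different thing from a general intelligence.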
1081
01:04:42,880 --> 01:04:45,680
Yeah, right.
You have to build a very
1082
01:04:45,680 --> 01:04:51,000
different code base to be just
good at everything, right?
1083
01:04:51,280 --> 01:04:54,880
It's a real neural network,
right?
1084
01:04:54,960 --> 01:04:59,040
The large language models
are not, they are neural
1085
01:04:59,040 --> 01:05:01,040
networks, but they're a
different kind of neural
1086
01:05:01,040 --> 01:05:03,720
network.
I think we're, I think we are
1087
01:05:03,720 --> 01:05:06,720
going to see AGI.
Yeah, I think we, you and I
1088
01:05:06,720 --> 01:05:09,760
might see it in our lifetime.
Like maybe.
1089
01:05:09,760 --> 01:05:11,640
Somewhere, I think we
definitely.
1090
01:05:11,720 --> 01:05:15,200
Would probably.
I don't think it's that far off.
1091
01:05:15,200 --> 01:05:17,560
I mean, when you look at how
fast things have happened over
1092
01:05:17,560 --> 01:05:19,960
the last, what is it 18 months
or something?
1093
01:05:20,120 --> 01:05:23,800
We've gone from pretty much no
one knowing not no one but the
1094
01:05:23,800 --> 01:05:28,480
normal human walking the street
knowing nothing about AI to now
1095
01:05:28,480 --> 01:05:31,680
they've got it on their iPhone
or or Google phone or whatever,
1096
01:05:31,680 --> 01:05:35,800
right?
And I think things are going to
1097
01:05:35,800 --> 01:05:38,440
continue moving very fast.
I think the low hanging fruit
1098
01:05:38,440 --> 01:05:40,760
with some of the stuff we've
been seeing, the translator
1099
01:05:40,760 --> 01:05:47,200
models and the the web coding
models and you know, personal
1100
01:05:47,200 --> 01:05:50,480
assistant types of models and
things like that, writing is
1101
01:05:50,480 --> 01:05:52,800
obviously gotten pretty good
already.
1102
01:05:52,920 --> 01:05:55,200
Oh, yeah, yeah.
And I think that, you know, I
1103
01:05:55,200 --> 01:05:58,720
think, you know, I don't think
that we need to talk about
1104
01:05:58,720 --> 01:06:01,400
politics or anything, but I can
tell you that the incoming
1105
01:06:01,400 --> 01:06:06,480
administration has a very
sympathetic viewpoint in terms
1106
01:06:06,480 --> 01:06:14,040
of deregulating AI and allowing
the tech companies to go as fast
1107
01:06:14,040 --> 01:06:16,200
as they want in the directions
they want to go.
1108
01:06:16,440 --> 01:06:17,800
Yeah.
I mean that that.
1109
01:06:17,800 --> 01:06:20,920
May be, which is fine.
I don't have a Yeah, you know, I
1110
01:06:20,920 --> 01:06:22,680
don't have a horse in that race.
I.
1111
01:06:22,680 --> 01:06:27,080
think. The funny thing is, Bob,
Elon is very much the opposite
1112
01:06:27,080 --> 01:06:29,760
of that.
So I would think Elon in, in
1113
01:06:29,760 --> 01:06:36,080
other people's ears is going to
be... because Elon is very, I
1114
01:06:36,080 --> 01:06:40,680
wouldn't say afraid, but nervous
of where things are going and
1115
01:06:40,680 --> 01:06:44,240
how quickly they're going.
Marcel, Elon says that.
1116
01:06:44,320 --> 01:06:47,120
Yeah, that's true.
But then he spins up Grok.
1117
01:06:47,280 --> 01:06:50,880
Yeah, which is an open source
large language model.
1118
01:06:50,880 --> 01:06:53,040
Yeah, and freely distributed it
to the world.
1119
01:06:53,040 --> 01:06:56,200
The Chinese now have Grok.
Yeah.
1120
01:06:56,480 --> 01:06:59,360
And what they're going to do
with it may or may not align
1121
01:06:59,360 --> 01:07:00,920
with whatever sympathies Elon
has.
1122
01:07:00,920 --> 01:07:06,600
So one is saying one thing, but
at the same time, he's building
1123
01:07:07,040 --> 01:07:11,440
the world's biggest training
data centers that exist, and
1124
01:07:11,600 --> 01:07:13,920
he's freely distributing the
result.
1125
01:07:14,120 --> 01:07:16,680
Well, he also says that by the
end of the year he's going to
1126
01:07:16,680 --> 01:07:23,440
have GPT-4o power.
December I think is when he
1127
01:07:23,760 --> 01:07:27,080
announces Grok 3.
I believe it is and.
1128
01:07:27,360 --> 01:07:29,880
Like he has, he's definitely
made huge strides.
1129
01:07:30,320 --> 01:07:33,040
What I can tell you is,
training a big large language
1130
01:07:33,040 --> 01:07:37,720
model like GPT-4 is hugely
expensive in terms of
1131
01:07:37,880 --> 01:07:44,040
time and resources.
But also, it's not like... Elon
1132
01:07:44,040 --> 01:07:49,960
believes in this, this
work philosophy of
1133
01:07:49,960 --> 01:07:53,280
intensity, and,
you know, look, there are
1134
01:07:53,280 --> 01:07:56,560
things in his companies that
I admire a great deal.
1135
01:07:56,560 --> 01:07:59,000
Like, I am a huge fan of
SpaceX.
1136
01:07:59,000 --> 01:08:01,200
Yeah.
And I think that because SpaceX
1137
01:08:01,200 --> 01:08:04,440
is vertically integrated, and I
think because of Elon's
1138
01:08:04,440 --> 01:08:08,520
philosophy of how to tackle
engineering, it has afforded
1139
01:08:08,520 --> 01:08:11,240
them this huge advantage.
Yeah, but it's one thing.
1140
01:08:11,240 --> 01:08:15,680
But I also know this about Elon.
He says things like are gonna be
1141
01:08:15,880 --> 01:08:18,200
something by something.
Oh yeah.
1142
01:08:18,680 --> 01:08:20,200
And then that
never happens.
1143
01:08:20,399 --> 01:08:21,840
Yeah, that never happened, you
know.
1144
01:08:21,920 --> 01:08:23,880
I.
have... well, it's like, I
1145
01:08:23,880 --> 01:08:27,960
have a friend who has a deposit
on the, whatever the
1146
01:08:27,960 --> 01:08:30,200
sports car is called.
Yeah, exactly.
1147
01:08:30,200 --> 01:08:33,520
The Roadster I think it's called
or whatever, but I have a friend
1148
01:08:33,520 --> 01:08:37,240
who's had a deposit on that for
like 3 years and is like, are we
1149
01:08:37,240 --> 01:08:40,279
ever going to see this thing?
You know, and look, we'll see
1150
01:08:40,279 --> 01:08:43,680
about what he has by December.
In terms of Grok, Yeah, Yeah.
1151
01:08:43,920 --> 01:08:46,640
But I want to circle back to
that fear factor.
1152
01:08:46,640 --> 01:08:48,880
I mean, artificial intelligence
is one thing.
1153
01:08:48,880 --> 01:08:51,520
I don't think that that's what
we need to be worrying about
1154
01:08:51,520 --> 01:08:54,760
immediately right now.
But but one another thing that
1155
01:08:54,760 --> 01:08:59,040
people worry about is that that
AI is going to put people out of
1156
01:08:59,040 --> 01:09:02,600
work.
Now look, AI is going to change
1157
01:09:02,600 --> 01:09:06,600
the nature of the workforce.
And there already has been huge
1158
01:09:06,600 --> 01:09:10,880
impacts in journalism, in
particular if you're a
1159
01:09:10,920 --> 01:09:15,760
professional writer, like there
used to be a decent living to be
1160
01:09:15,760 --> 01:09:20,640
made, for instance, in
aggregating news about a special
1161
01:09:20,640 --> 01:09:22,359
topic.
Like, let's say you were a tech
1162
01:09:22,359 --> 01:09:23,720
journalist.
Yeah.
1163
01:09:23,720 --> 01:09:27,760
And you write reviews about tech
stuff like, you know, like
1164
01:09:27,760 --> 01:09:31,240
phones or like gaming consoles
and stuff like that.
1165
01:09:31,240 --> 01:09:32,880
Forget it.
That job's gone.
1166
01:09:32,960 --> 01:09:36,040
That job is gone.
Yeah, there are AIs that are
1167
01:09:36,040 --> 01:09:38,920
just, they can
look at everything on the web
1168
01:09:38,920 --> 01:09:41,800
instantly.
They can summarize
1169
01:09:41,800 --> 01:09:46,120
it and repackage it.
So you know, so you know.
1170
01:09:47,240 --> 01:09:51,120
Have you used NotebookLM yet?
NotebookLM? Yeah, I've tried it.
1171
01:09:51,120 --> 01:09:53,640
Yeah, I've worked with NotebookLM,
with Google.
1172
01:09:53,920 --> 01:09:56,800
It's pretty freakish and cool.
I love it.
1173
01:09:57,040 --> 01:09:58,640
I love it.
Here's.
1174
01:09:58,640 --> 01:10:04,080
Another, you know, and the job
of the paralegal is, for the
1175
01:10:04,080 --> 01:10:08,200
most part, or a big part of that
job was go find existing
1176
01:10:08,200 --> 01:10:12,480
information, find the important
parts of it that relate to this
1177
01:10:12,480 --> 01:10:16,920
particular case, summarize it,
organize it, put it together,
1178
01:10:16,920 --> 01:10:19,600
give it to the lawyer.
That job's gone.
1179
01:10:19,760 --> 01:10:22,040
Yeah, it's gone.
Well, you know, some low end
1180
01:10:22,040 --> 01:10:26,760
lawyers, some lawyers who do
like contract law or, you know,
1181
01:10:26,760 --> 01:10:29,880
very basic, like, I mean, if you
got to do a lease for an
1182
01:10:29,880 --> 01:10:32,280
apartment that you're trying to
rent out, are you going to go to
1183
01:10:32,280 --> 01:10:34,800
a lawyer?
You're going to go to ChatGPT?
1184
01:10:34,800 --> 01:10:39,280
I mean, it's so easy to get
those documents done now.
1185
01:10:39,280 --> 01:10:41,960
100%. And then this goes, like,
for instance, we do a lot of
1186
01:10:41,960 --> 01:10:44,280
work with big corporations,
right?
1187
01:10:44,280 --> 01:10:47,400
Like work with GM.
To work with GM, you have to
1188
01:10:47,400 --> 01:10:50,280
jump through a lot of hoops.
You have to have, you have to
1189
01:10:50,280 --> 01:10:53,440
have compliance documentation.
You have to have all kinds of
1190
01:10:53,440 --> 01:10:57,240
documentation for yourself as a
company, your emergency response
1191
01:10:57,240 --> 01:10:59,120
documentation.
Because for instance, GM wants
1192
01:10:59,120 --> 01:11:03,000
to know that if there is a
hurricane or, where I am,
1193
01:11:03,000 --> 01:11:07,800
an earthquake, what are
our emergency
1194
01:11:07,800 --> 01:11:10,560
response procedures to make sure
that we're still going to be
1195
01:11:10,560 --> 01:11:12,760
able to do business the next
day.
1196
01:11:13,360 --> 01:11:15,280
Interesting.
This is our GM account, right?
1197
01:11:15,280 --> 01:11:18,760
So for instance, so there's a
lot of documentation you have to
1198
01:11:18,760 --> 01:11:21,440
write as a company if you're
doing business with people like
1199
01:11:21,440 --> 01:11:22,240
that.
Yeah.
1200
01:11:22,720 --> 01:11:25,960
That kind of stuff, like,
we've been able to use
1201
01:11:25,960 --> 01:11:31,400
ChatGPT to do emergency response
plans, compliance plans, you
1202
01:11:31,400 --> 01:11:35,080
know, computer security plans,
you know, Yeah.
1203
01:11:35,080 --> 01:11:39,560
Me too. That simplifies so
much of my day.
1204
01:11:39,880 --> 01:11:43,360
But there's also been a big,
there's been a big within the
1205
01:11:43,360 --> 01:11:46,560
arts
community, there's been this
1206
01:11:46,640 --> 01:11:50,120
backlash of like this is going
to put artists out of work.
1207
01:11:50,720 --> 01:11:53,440
I don't agree.
It's another tool.
1208
01:11:53,680 --> 01:11:59,640
It's another thing we can use.
The effects industry did not put
1209
01:12:00,400 --> 01:12:03,680
regular movie making people out
of work.
1210
01:12:03,880 --> 01:12:06,120
It just became part of the
workflow.
1211
01:12:07,320 --> 01:12:10,440
You know I want.
To give the flip side of that,
1212
01:12:10,440 --> 01:12:13,000
because like, you know, what
about like Tyler Perry who
1213
01:12:13,000 --> 01:12:16,880
cancelled, what was it, an
800,000 square foot studio or
1214
01:12:16,880 --> 01:12:21,880
something that was being built?
And you know, because he just
1215
01:12:21,880 --> 01:12:24,880
said we're not going to need
studios anymore.
1216
01:12:24,880 --> 01:12:25,920
We're just not going to need
them.
1217
01:12:26,000 --> 01:12:30,720
Tyler Perry, personal opinion:
I think that was calculated much
1218
01:12:30,720 --> 01:12:36,200
more off of... that was an excuse.
All of the big networks and film
1219
01:12:36,200 --> 01:12:40,760
studios and most probably
streamers stopped buying
1220
01:12:40,760 --> 01:12:42,240
content.
Yeah.
1221
01:12:42,840 --> 01:12:45,080
Right, because they're writing
content down because they're
1222
01:12:45,080 --> 01:12:49,480
trying to drive
share price up by reducing debt.
1223
01:12:49,680 --> 01:12:53,520
Here's the fact of the matter.
There's a huge, huge,
1224
01:12:56,160 --> 01:13:00,400
there's a huge fall off in
production of television and
1225
01:13:00,400 --> 01:13:02,960
film right now.
And none of it's got anything to
1226
01:13:02,960 --> 01:13:05,800
do with AI.
Yeah, it's got to do with media
1227
01:13:05,800 --> 01:13:10,280
companies trying to figure out
how to monetize their content in
1228
01:13:10,280 --> 01:13:13,760
the age of streaming.
I think consumption
1229
01:13:13,800 --> 01:13:16,040
has changed.
He wasn't about to build a billion
1230
01:13:16,040 --> 01:13:21,640
dollar film studio to sit empty.
Not because AI had replaced it,
1231
01:13:21,640 --> 01:13:24,040
but because there's no market
for that shit.
1232
01:13:24,040 --> 01:13:25,760
Well, do.
You know, do you know, would you
1233
01:13:25,760 --> 01:13:30,680
know Glenn Rogeman from AED in
Nether... Belgium?
1234
01:13:31,560 --> 01:13:32,960
No.
He'd kill me for saying
1235
01:13:32,960 --> 01:13:39,200
Netherlands. Because he has, he
has lighting, sound, all kinds
1236
01:13:39,200 --> 01:13:41,640
of companies, but he's got a
studio as well.
1237
01:13:42,120 --> 01:13:45,320
And I talked to him about this
and I said, does that make you
1238
01:13:45,320 --> 01:13:47,720
nervous at all?
And he said, not really, you
1239
01:13:47,720 --> 01:13:50,800
know, we're going to stay busy.
But the problem is here's what
1240
01:13:50,800 --> 01:13:53,440
he says.
Like, you know, I know I paid a
1241
01:13:53,440 --> 01:13:56,000
lot of attention to Sora.
I haven't followed some of the
1242
01:13:56,000 --> 01:13:58,200
newer ones that have come out,
but Sora was doing some
1243
01:13:58,200 --> 01:14:00,520
incredible stuff, like really
incredible stuff.
1244
01:14:01,000 --> 01:14:04,240
And but he said, here's what's
going to happen.
1245
01:14:04,240 --> 01:14:07,520
Movies are going to get to here.
They're going to get really
1246
01:14:07,520 --> 01:14:10,560
good.
And but he said then what's
1247
01:14:10,560 --> 01:14:14,040
going to happen is all of this
content is going to be right
1248
01:14:14,040 --> 01:14:15,880
here.
It's all going to be really
1249
01:14:15,880 --> 01:14:18,240
good.
And so really good is going to
1250
01:14:18,240 --> 01:14:24,040
become vanilla, and everything
will become sort of ubiquitous
1251
01:14:24,040 --> 01:14:27,640
and boring.
And he just thinks that, you
1252
01:14:27,640 --> 01:14:32,360
know, the nuances and the good
and bad combined together is
1253
01:14:32,360 --> 01:14:37,960
what makes, you know, film
discovery or movie discovery,
1254
01:14:37,960 --> 01:14:41,480
show discovery so good.
It's not like everything is an
1255
01:14:41,480 --> 01:14:43,320
A.
Well, what about some A pluses
1256
01:14:43,320 --> 01:14:46,000
and some B minuses and whatever?
You know, you need some of that
1257
01:14:46,000 --> 01:14:47,800
stuff, right?
But well.
1258
01:14:48,680 --> 01:14:52,280
I'm of various
opinions about that.
1259
01:14:52,360 --> 01:14:57,720
I think that
misunderstands and maybe skips
1260
01:14:57,720 --> 01:15:01,080
over the idea that the AI should
get better, right?
1261
01:15:01,400 --> 01:15:04,360
Like they're going to feed
context into these AIs.
1262
01:15:04,360 --> 01:15:07,680
The AIs, like the
generative AIs, are going to be
1263
01:15:07,680 --> 01:15:12,240
taught that everything being
perfect leads to homogenization,
1264
01:15:12,280 --> 01:15:14,280
right?
So they will introduce little
1265
01:15:14,520 --> 01:15:15,440
human.
Point.
1266
01:15:16,640 --> 01:15:19,720
You know, just to appeal to us.
So they also haven't yet even
1267
01:15:19,720 --> 01:15:24,640
begun to train these AIs to
produce stuff that is targeted
1268
01:15:24,640 --> 01:15:27,880
to demographics, psychographics,
and audience segmentation.
1269
01:15:27,880 --> 01:15:28,720
Right.
Good point.
1270
01:15:28,720 --> 01:15:30,080
Yeah.
And they're going to get really
1271
01:15:30,080 --> 01:15:33,280
good at it.
And the culture has become
1272
01:15:33,280 --> 01:15:36,400
short attention span theater.
You know, I know that my
1273
01:15:36,400 --> 01:15:40,720
daughter, she gets, she does
most of her media consumption
1274
01:15:40,720 --> 01:15:44,240
not in
a movie theater.
1275
01:15:44,360 --> 01:15:45,760
Yeah, my son too.
And.
1276
01:15:45,760 --> 01:15:50,800
Only barely in any long form,
anything she sees on TV, Yeah,
1277
01:15:50,800 --> 01:15:55,040
she mainly looks at TikTok,
Instagram stories and stuff like
1278
01:15:55,040 --> 01:15:56,000
that.
Yeah.
1279
01:15:56,000 --> 01:15:59,720
And you know, or maybe like half
hour episodic stuff, right?
1280
01:15:59,720 --> 01:16:03,080
Yeah, Yeah.
So I mean, look, I agree.
1281
01:16:04,880 --> 01:16:09,120
I just want to say the AIs, yes,
what your friend said, I think
1282
01:16:09,120 --> 01:16:12,000
that there's some value in it,
but but I think it skips by the
1283
01:16:12,000 --> 01:16:16,840
idea that they're going to teach
the AIs to get better at not
1284
01:16:16,840 --> 01:16:19,440
being that.
That's better, yeah, that's true, all the time.
1285
01:16:19,440 --> 01:16:22,640
We didn't discuss that, yeah.
And then the other part, this is
1286
01:16:22,640 --> 01:16:25,560
kind of a three prong thing.
The other part of it is I once
1287
01:16:25,560 --> 01:16:28,320
had an executive producer told
me, and this is a guy who's made
1288
01:16:28,320 --> 01:16:31,240
billions of dollars off
audiences.
1289
01:16:31,240 --> 01:16:39,520
He said, Bobby, never
underestimate the, you
1290
01:16:39,520 --> 01:16:42,720
know, the appetites of the
general public, which is to say
1291
01:16:43,080 --> 01:16:48,600
they don't really care.
You know, most of mainstream
1292
01:16:48,600 --> 01:16:52,200
culture, they don't give a shit
about quality.
1293
01:16:52,480 --> 01:16:57,840
You know, they look at video in
bad MPEG-4 on their phone all
1294
01:16:57,840 --> 01:17:02,480
day in little bits and pieces
and they aren't necessarily
1295
01:17:02,480 --> 01:17:06,200
captivated or captured in a
conscious way by art.
1296
01:17:06,240 --> 01:17:08,520
So.
True, I heard, I heard Justine
1297
01:17:08,520 --> 01:17:12,920
Bateman who is the president of
the Screen Actors Guild, she
1298
01:17:12,920 --> 01:17:14,600
said.
I think it was in a Variety
1299
01:17:14,600 --> 01:17:18,480
article this week that there was
going to be a mass cultural
1300
01:17:18,480 --> 01:17:23,720
backlash against AI produced
content 'cause people would get
1301
01:17:23,720 --> 01:17:29,240
tired of it not being great or,
or being just vanilla.
1302
01:17:29,360 --> 01:17:31,360
Yeah, They're not.
No, they're not.
1303
01:17:31,640 --> 01:17:34,840
No, I think a lot of the
audience is perfectly happy with
1304
01:17:34,840 --> 01:17:36,680
vanilla.
And I am not.
1305
01:17:37,120 --> 01:17:41,720
I am an artist.
I want people to be inspired and
1306
01:17:41,720 --> 01:17:44,760
to feel deeper feelings and to
go deeper with things.
1307
01:17:44,840 --> 01:17:49,120
Yeah, but I think a lot of the
mainstream culture, they just
1308
01:17:49,120 --> 01:17:52,440
want to hit a dopamine when they
want to hit a dopamine.
1309
01:17:53,040 --> 01:17:55,240
Yeah.
And they don't
1310
01:17:55,240 --> 01:17:59,480
much, they don't get attached.
They don't think to the
1311
01:17:59,480 --> 01:18:03,800
extent of like, wait a minute,
this all looks a little bit just
1312
01:18:03,800 --> 01:18:07,800
too homogeneously AI.
I don't want to watch anything
1313
01:18:07,800 --> 01:18:09,560
made by AI.
Yeah, I saw one of those things
1314
01:18:09,560 --> 01:18:13,600
today.
I saw, I saw a video that had a
1315
01:18:14,240 --> 01:18:16,320
NotebookLM.
You know how you can do the
1316
01:18:16,320 --> 01:18:20,280
podcast thing with NotebookLM?
And so it had the NotebookLM.
1317
01:18:21,080 --> 01:18:24,120
Actually, I was going to
show it to you, but it's boring.
1318
01:18:24,400 --> 01:18:28,960
But it's it's a a video about
the Jeff Bezos.
1319
01:18:29,240 --> 01:18:31,600
What's it called, π0?
Have you seen this thing yet?
1320
01:18:31,600 --> 01:18:37,080
Jeff Bezos backed it.
So π0 is basically giving
1321
01:18:37,080 --> 01:18:41,000
robots brains that you can tell
a robot what you want it to do.
1322
01:18:41,000 --> 01:18:44,760
And the video shows like folding
clothes and a lot of house chores,
1323
01:18:44,760 --> 01:18:47,000
right?
Household chores, breaking and
1324
01:18:47,080 --> 01:18:50,280
folding clothes and cleaning up
and doing things like that.
1325
01:18:50,560 --> 01:18:55,920
But I mean, Bezos, I think,
headed up a round of like half a
1326
01:18:55,920 --> 01:18:59,920
billion dollars in funding or
something for this, this π0
1327
01:18:59,920 --> 01:19:03,800
thing.
But the, the video is a video of
1328
01:19:03,800 --> 01:19:07,040
this robot doing its thing.
And the audio in the background
1329
01:19:07,040 --> 01:19:12,960
is a NotebookLM podcast.
And there's a few commenters who
1330
01:19:12,960 --> 01:19:17,000
have like 1 follower who are
like, how dare you insult me
1331
01:19:17,000 --> 01:19:20,960
with an AI generated podcast.
Really.
1332
01:19:21,480 --> 01:19:23,160
You know, are you serious right
now?
1333
01:19:23,400 --> 01:19:26,440
I think a lot of people would be
perfectly happy to listen to the
1334
01:19:26,480 --> 01:19:28,960
NotebookLM podcast.
Their voices are great.
1335
01:19:29,160 --> 01:19:32,240
I actually find that they they
have a very interesting dynamic,
1336
01:19:32,240 --> 01:19:38,000
those. I love it, you know.
Here's what, you know, a lot of
1337
01:19:38,000 --> 01:19:40,800
audiences are perfectly happy
with good enough.
1338
01:19:41,080 --> 01:19:42,320
Yeah.
Right.
1339
01:19:42,600 --> 01:19:48,000
And I, and I am constantly
seeking to help escalate
1340
01:19:48,000 --> 01:19:52,240
audiences to a point where they
can see something extraordinary
1341
01:19:52,240 --> 01:19:54,960
and really be moved by it.
That's where I am as an artist.
1342
01:19:55,320 --> 01:19:59,040
But, but as a
professional designer and
1343
01:19:59,040 --> 01:20:02,520
creative and strategist who
advises
1344
01:20:03,080 --> 01:20:07,360
on these topics, and who, I
spend a lot of time thinking
1345
01:20:07,360 --> 01:20:10,280
about audience segmentation and
demographics.
1346
01:20:10,320 --> 01:20:13,720
Yeah, what do audiences want,
right.
1347
01:20:14,120 --> 01:20:17,920
And I could tell you that a lot
of audiences are perfectly happy
1348
01:20:17,920 --> 01:20:21,600
with good enough.
Now there are, there is an
1349
01:20:21,600 --> 01:20:27,240
audience segment, there is a
demographic that appreciates,
1350
01:20:27,240 --> 01:20:32,800
handcrafted; they appreciate it and
they want it, and in this era of AI
1351
01:20:32,800 --> 01:20:38,680
they are going to seek out real
human created art just kind of
1352
01:20:38,680 --> 01:20:40,680
like at the independent film
movement.
1353
01:20:40,680 --> 01:20:41,720
Right.
Yeah, yeah, like.
1354
01:20:41,800 --> 01:20:43,560
People.
People have an appreciation for
1355
01:20:43,560 --> 01:20:48,040
smaller stories told more
intimately in an intimate way.
1356
01:20:48,240 --> 01:20:52,920
There's always going to be a
place for real human handcrafted
1357
01:20:52,920 --> 01:20:55,600
art.
And by the way, that's probably
1358
01:20:55,600 --> 01:20:58,960
in the marketplace going to
escalate to a fairly expensive
1359
01:20:58,960 --> 01:21:00,560
place where people are going to
be.
1360
01:21:00,640 --> 01:21:04,600
Those people who want that are
going to be willing to pay more.
1361
01:21:04,920 --> 01:21:07,840
To connect, yeah.
You know, it's absolutely not going to go
1362
01:21:07,840 --> 01:21:09,840
out of existence.
And here's the other thing I
1363
01:21:09,840 --> 01:21:13,680
have to say, Mode Studios, since
we started widely deploying AI,
1364
01:21:14,120 --> 01:21:18,880
we haven't laid people off.
We've hired more people.
1365
01:21:18,880 --> 01:21:23,080
We have more designers working
here now that we're two
1366
01:21:23,080 --> 01:21:26,880
years into deploying AI in all
kinds of places than we did
1367
01:21:26,880 --> 01:21:30,320
before.
So it seems to me that like the
1368
01:21:30,360 --> 01:21:34,760
evidence I have in hand isn't
that it's eliminating jobs, it's
1369
01:21:34,760 --> 01:21:37,760
changing jobs.
You know, with the with the
1370
01:21:37,800 --> 01:21:42,360
advent of things like Follow-Me
and, you know, RoboSpots,
1371
01:21:43,080 --> 01:21:48,680
we no longer have rigs full of
stagehands in spot baskets, you
1372
01:21:48,680 --> 01:21:53,440
know.
That job has changed because of
1373
01:21:53,440 --> 01:21:55,840
technology, but it hasn't gone
away.
1374
01:21:56,000 --> 01:22:00,360
Moving lights didn't get rid of
follow spots entirely, right?
1375
01:22:00,760 --> 01:22:03,800
Yeah.
You know, things, technology
1376
01:22:03,800 --> 01:22:07,840
changes the way things are made.
It changes the way that that
1377
01:22:07,840 --> 01:22:12,440
vocations happen and it changes
what kind of careers are
1378
01:22:12,440 --> 01:22:16,440
possible, you know, But the fact
of the matter is that right now
1379
01:22:16,440 --> 01:22:20,440
in the marketplace, AI, you
know, we are searching for
1380
01:22:20,440 --> 01:22:23,560
people with AI talents.
We also want those people to be
1381
01:22:23,560 --> 01:22:29,040
talented in using software like
Unreal Engine or 3D software or
1382
01:22:29,040 --> 01:22:32,440
other complementary tool sets
that allow us to put all these
1383
01:22:32,440 --> 01:22:36,520
pieces together and make great
content. But going back to like
1384
01:22:36,520 --> 01:22:42,520
this whole fear thing and you
know, like the common thing in
1385
01:22:42,520 --> 01:22:45,040
our industry is just like I
don't want anything to do with
1386
01:22:45,040 --> 01:22:47,000
this AI.
It's coming for my job and it's
1387
01:22:47,000 --> 01:22:51,520
going to take away my job and
you know. My go-to
1388
01:22:51,520 --> 01:22:55,160
statement on that used to be you
know, no, it's really not.
1389
01:22:55,160 --> 01:22:58,840
It's going to provide you tools
to make your job better and to
1390
01:22:58,840 --> 01:23:03,200
make you more functional and
more efficient.
1391
01:23:03,200 --> 01:23:07,240
And and so that at the sort of
lowest level, that is what it's
1392
01:23:07,240 --> 01:23:09,920
going to do.
It's, it's today, you know, like
1393
01:23:10,520 --> 01:23:14,640
I had Kadena on the podcast and
we talked about AI at one point
1394
01:23:14,640 --> 01:23:18,440
and, you know, he grabbed the
tool that I had shared with him
1395
01:23:18,440 --> 01:23:22,640
and went out and started using
it to create JavaScript for him
1396
01:23:23,080 --> 01:23:25,040
on some of the things that he
was doing, and it's
1397
01:23:25,040 --> 01:23:27,160
very good at creating code.
Correct.
1398
01:23:27,160 --> 01:23:31,840
Yeah, so, so you know that I
think at the very lowest level,
1399
01:23:31,840 --> 01:23:33,200
that's the stuff that it's going
to do.
1400
01:23:33,200 --> 01:23:35,320
It's going to help you.
But when you look at these
1401
01:23:35,320 --> 01:23:41,000
robots, you know, both Elon's
robot, this, this π0 one, there's
1402
01:23:41,000 --> 01:23:44,200
a Chinese one that I read about
that's incredible and very close
1403
01:23:44,200 --> 01:23:47,120
to launching right now.
And you know, these things are
1404
01:23:47,120 --> 01:23:51,600
twenty, thirty thousand bucks and they
can be housekeepers, they can
1405
01:23:51,600 --> 01:23:56,120
be, you know, they can work in
your shop doing tedious things.
1406
01:23:56,520 --> 01:24:00,320
I'm sure at some point, you
know, they'll be able to work on
1407
01:24:00,320 --> 01:24:03,760
lighting consoles or or sound
consoles or whatever it is.
1408
01:24:04,160 --> 01:24:07,960
Cuz I used to say, you know, the
people who are always gonna be
1409
01:24:07,960 --> 01:24:10,760
safe no matter what are the
plumbers and the, you know, the
1410
01:24:10,760 --> 01:24:14,600
people who do physical jobs.
Yeah, I don't think that so much
1411
01:24:14,640 --> 01:24:16,400
anymore.
Like I think that's.
I agree with
1412
01:24:16,680 --> 01:24:17,920
you.
I think that, I think there will
1413
01:24:17,920 --> 01:24:20,280
be a massive deployment of
automation.
1414
01:24:20,560 --> 01:24:21,880
Yeah.
Of robotics, yeah.
1415
01:24:21,880 --> 01:24:23,360
I think it is gonna change
things.
1416
01:24:23,800 --> 01:24:27,400
I think before we see robots
behind lighting desks, the first
1417
01:24:27,400 --> 01:24:32,440
thing we're going to see is
lighting desks deploying more AI
1418
01:24:32,440 --> 01:24:36,480
functionality.
How you approach a console to
1419
01:24:36,480 --> 01:24:40,840
create effects, for instance,
Yeah, or even like,
1420
01:24:41,120 --> 01:24:44,680
you know, I know because I'm.
I've been talking with people at
1421
01:24:44,680 --> 01:24:48,440
various of the companies that
that, that are making lighting
1422
01:24:48,440 --> 01:24:49,400
consoles.
Yeah, me too.
1423
01:24:49,480 --> 01:24:53,560
They're already working on being
able to have a prompting model
1424
01:24:53,560 --> 01:24:54,760
for lighting.
Yeah.
1425
01:24:54,760 --> 01:24:58,560
Like, so rather than saying, oh,
I want Group 5 at 50 and Group,
1426
01:24:58,960 --> 01:25:03,440
you know, 8 at 75 and Group
1000, and you're working towards
1427
01:25:03,440 --> 01:25:08,160
making, you know, a blue
mysterious nighttime cue,
1428
01:25:08,360 --> 01:25:10,440
you're going to be able to
prompt the console.
1429
01:25:10,440 --> 01:25:13,080
You're going to be able to say,
you know, create a blue
1430
01:25:13,080 --> 01:25:16,200
mysterious nighttime cue. Exactly.
Focus mainly on stage, right?
1431
01:25:16,360 --> 01:25:18,840
Speaking in English, you're going to be
able to do that.
1432
01:25:18,840 --> 01:25:19,760
Right.
Yeah, yeah.
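None of this is shipping on a desk yet, so any concrete example is speculative. But the shape of the idea is easy to sketch: a model translates the plain-English cue description into the console's own command grammar. A minimal sketch, assuming OpenAI's Python SDK; the JSON command set, the model name, and the whole console interface here are invented for illustration, not any manufacturer's actual API:

```python
# Hypothetical sketch: translate a plain-English cue description into
# console-style commands. The JSON command grammar below is invented for
# illustration; a real desk (grandMA3, Eos, etc.) would expose its own
# syntax or remote protocol.
import json
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY

client = OpenAI()

SYSTEM = (
    "You translate lighting cue descriptions into JSON. Respond only with "
    '{"commands": [{"group": <int>, "intensity": <0-100>, "color": "<name>"}]}'
)

def cue_from_prompt(description: str) -> list[dict]:
    """Ask the model for a command list realizing one cue description."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; swap for whatever is current
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": description},
        ],
        response_format={"type": "json_object"},
    )
    return json.loads(resp.choices[0].message.content)["commands"]

# "Create a blue mysterious nighttime cue, focused mainly on stage."
for cmd in cue_from_prompt("blue mysterious nighttime look, focus mainly on stage"):
    print(cmd)  # a real integration would send these to the desk, e.g. via OSC
```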
1433
01:25:19,840 --> 01:25:21,520
Look, there's going to be an
impact.
1434
01:25:22,600 --> 01:25:26,160
I, you know, and I, I think, you
know, we're going to see it.
1435
01:25:26,400 --> 01:25:28,760
We're going to see AI deployed
in a whole bunch of
1436
01:25:28,760 --> 01:25:31,120
technological ways.
Yeah.
1437
01:25:31,760 --> 01:25:33,920
This is inevitable.
And I mean, I think that I
1438
01:25:33,920 --> 01:25:35,440
understand the people pushing
back.
1439
01:25:35,440 --> 01:25:38,160
I can understand the fear.
Change is hard.
1440
01:25:38,360 --> 01:25:42,520
Humans are not wired to be
constantly adapting.
1441
01:25:42,520 --> 01:25:48,360
We crave finding cycles, finding
rhythms of things we can do.
1442
01:25:48,360 --> 01:25:51,480
You know, we're, we're going to
plant the seeds, take care of
1443
01:25:51,480 --> 01:25:55,200
the crops, harvest the seeds,
make the food, put the, you
1444
01:25:55,200 --> 01:25:57,720
know, like we, we have certain
things that are genetically
1445
01:25:57,720 --> 01:26:01,800
coded in us and behaviorally
coded in us, and holy crap, we
1446
01:26:01,800 --> 01:26:05,160
are in a hockey
stick curve big time, yeah.
1447
01:26:05,160 --> 01:26:08,640
Of technological change.
That's really uncomfortable,
1448
01:26:08,640 --> 01:26:11,920
right?
But at the same time, like I'm
1449
01:26:11,920 --> 01:26:16,240
so excited about it because of
the control that it gives the
1450
01:26:16,240 --> 01:26:19,080
humans.
So like for example, you know, I
1451
01:26:19,080 --> 01:26:21,760
don't remember what it was, but
last week there was something
1452
01:26:21,760 --> 01:26:24,480
bothering me health wise.
You know, I just turned 60, so
1453
01:26:24,480 --> 01:26:26,040
there's always something
bothering me.
1454
01:26:26,480 --> 01:26:30,840
And you know, I would have
normally gone to the doctor and
1455
01:26:30,840 --> 01:26:32,840
said, here's what's happening.
Bah, Bah, Bah.
1456
01:26:32,840 --> 01:26:34,640
What should I do?
Should I take a different
1457
01:26:34,640 --> 01:26:36,120
vitamin?
Should I do this?
1458
01:26:36,120 --> 01:26:38,320
Like what, what is it that's
causing this problem?
1459
01:26:38,720 --> 01:26:43,960
And instead I use ChatGPT.
And within 10 or 15 minutes of
1460
01:26:43,960 --> 01:26:47,160
prompting and, and going back
and forth a little bit, I had
1461
01:26:47,160 --> 01:26:49,280
the answers and they were really
good answers.
1462
01:26:49,280 --> 01:26:52,800
And it worked, you know?
And so some people will say
1463
01:26:52,800 --> 01:26:54,680
you're a maniac, that's
dangerous.
1464
01:26:54,680 --> 01:26:58,360
You know, how dare you trust
your health to... What the hell do
1465
01:26:58,360 --> 01:27:00,960
you think doctors have been
doing for the last 20 years?
1466
01:27:01,160 --> 01:27:03,000
The same thing. They
go into Google.
1467
01:27:03,240 --> 01:27:06,920
They either know things or they
or they go to their bookshelf.
1468
01:27:06,920 --> 01:27:08,520
Yeah.
Like, and they like.
1469
01:27:08,520 --> 01:27:11,280
Oh.
Judging by your symptoms, yeah.
1470
01:27:11,560 --> 01:27:15,960
I I I'm pretty sure that you
have a violent case of herpes,
1471
01:27:15,960 --> 01:27:18,600
Marcel.
No, but I mean, all we're
1472
01:27:18,600 --> 01:27:22,240
dealing with, if we do it
ourselves with ChatGPT, it's just
1473
01:27:22,240 --> 01:27:25,280
a larger language model than
what the doctor's been using,
1474
01:27:25,280 --> 01:27:26,240
you know well.
Exactly.
1475
01:27:26,240 --> 01:27:28,760
Like you just mentioned that
they're really good at code.
1476
01:27:29,040 --> 01:27:31,840
What,
what AIs are really good at is, is
1477
01:27:31,840 --> 01:27:33,880
anything
that's a question of deriving a
1478
01:27:33,880 --> 01:27:36,240
result from aggregated
information.
1479
01:27:36,240 --> 01:27:38,280
Correct.
Super good at yeah.
1480
01:27:38,280 --> 01:27:39,000
Right.
Correct.
1481
01:27:39,280 --> 01:27:41,480
So the nature of jobs is going
to change.
1482
01:27:41,480 --> 01:27:45,120
You know, is the lighting
programming job in danger?
1483
01:27:45,280 --> 01:27:50,280
It's in danger of changing.
Ultimately, we might move the
1484
01:27:50,400 --> 01:27:53,720
ultimately, and I'm talking, you
know, probably maybe a decade
1485
01:27:53,720 --> 01:27:57,960
out, we might be at a place
where, you know, a production
1486
01:27:57,960 --> 01:28:02,800
designer is now sitting next to
a director and prompting the
1487
01:28:02,800 --> 01:28:07,360
media servers and prompting the
light just verbally talking to
1488
01:28:07,360 --> 01:28:08,720
his interface.
Yeah.
1489
01:28:09,000 --> 01:28:11,640
And there aren't as many
programmers involved.
1490
01:28:11,640 --> 01:28:14,120
Yeah, I think.
That that might happen, yeah.
1491
01:28:15,800 --> 01:28:18,800
It's hard to know.
It's hard to know where it's
1492
01:28:18,840 --> 01:28:23,160
going.
I can tell you that right now it
1493
01:28:23,160 --> 01:28:26,440
hasn't changed a whole lot.
It is going to change things
1494
01:28:26,520 --> 01:28:28,680
moving forward.
It's even going to change them
1495
01:28:28,680 --> 01:28:31,360
dramatically.
We talked about at the beginning
1496
01:28:31,360 --> 01:28:34,040
of the episode about how my
daughter just went and got deep
1497
01:28:34,040 --> 01:28:36,080
training in MA3.
Yeah.
1498
01:28:36,160 --> 01:28:39,760
You know, judging by the
crawling pace of lighting
1499
01:28:39,840 --> 01:28:42,160
console innovation, correct.
Yeah, correct.
1500
01:28:43,000 --> 01:28:47,400
We might be. Also, you know,
our industry is small enough
1501
01:28:47,400 --> 01:28:49,840
where it's relationship based
and all of those things.
1502
01:28:49,840 --> 01:28:52,400
I don't think anybody's going to
want to be the first console
1503
01:28:52,400 --> 01:28:56,240
that programs itself,
you know, because that's a scary
1504
01:28:56,240 --> 01:28:59,280
thought because you're going to
be boycotted and cancelled and
1505
01:28:59,280 --> 01:29:01,680
all of those things when you're
the first one coming out.
1506
01:29:02,200 --> 01:29:07,080
There is that. You have to get by
that resistance to really bring it
1507
01:29:07,080 --> 01:29:10,120
into. There will be resistance.
Over time there are going to be
1508
01:29:10,120 --> 01:29:13,680
people who are super interested
in that and who are even if
1509
01:29:13,680 --> 01:29:17,520
they're not people working at
the, you know, ETC or MA
1510
01:29:17,520 --> 01:29:19,800
Lighting or stuff.
There are people who are
1511
01:29:19,960 --> 01:29:23,480
interested in it. I've seen a
product, I get assaulted with it
1512
01:29:23,480 --> 01:29:26,360
all the time on my social feeds.
There are these guys that have made a
1513
01:29:26,360 --> 01:29:28,840
little box.
Yeah, that was the one that got
1514
01:29:28,840 --> 01:29:31,920
blasted at LDI last year, Yeah.
And I think they did get
1515
01:29:31,920 --> 01:29:33,160
blasted.
But you know what?
1516
01:29:33,800 --> 01:29:36,000
That company is going
gangbusters.
1517
01:29:36,000 --> 01:29:37,560
Yeah.
They are selling lots of those
1518
01:29:37,560 --> 01:29:39,920
little boxes to little clubs.
Yeah.
1519
01:29:40,080 --> 01:29:43,240
Does it make sense, you know, to
DJs and to people playing in
1520
01:29:43,240 --> 01:29:47,880
certain things who just want
some basic, you know, generative
1521
01:29:47,880 --> 01:29:51,960
lighting to happen on tempo and,
yeah, you know, moves of the
1522
01:29:51,960 --> 01:29:54,480
thing?
There is a marketplace for it.
1523
01:29:54,680 --> 01:29:57,320
So one other thing.
Stopping, you know, trying to
1524
01:29:57,320 --> 01:30:00,920
stop technical innovation from
changing what we know is like
1525
01:30:00,920 --> 01:30:04,800
is like going to the
beach with a mop and trying to
1526
01:30:04,800 --> 01:30:06,440
keep the waves.
It's just it's.
1527
01:30:06,600 --> 01:30:07,920
You're right.
No, you're right.
1528
01:30:07,920 --> 01:30:11,440
There's no stopping this thing.
Like you either get on the train
1529
01:30:11,440 --> 01:30:14,240
or it runs over you.
And we decided to get on the
1530
01:30:14,240 --> 01:30:17,640
train and, and I have had, I
have had discussions with people
1531
01:30:18,080 --> 01:30:22,680
who literally have come at me
and said you're vile, you are,
1532
01:30:22,880 --> 01:30:25,440
you are architecting the end of
art.
1533
01:30:25,600 --> 01:30:28,840
Well, while that may be true, I
and I happen to be with you on
1534
01:30:28,840 --> 01:30:31,840
the AI thing.
Like circumstances beyond my
1535
01:30:31,840 --> 01:30:34,400
control and time and the advance
of technology.
1536
01:30:34,600 --> 01:30:37,120
I've always been an artist
interested in deploying
1537
01:30:37,120 --> 01:30:39,240
technology since the very
beginning.
1538
01:30:39,240 --> 01:30:43,920
Yeah, since the very beginning,
I and I can't ignore this.
1539
01:30:43,920 --> 01:30:47,520
I want to, I am curious.
I want to understand this stuff.
1540
01:30:47,520 --> 01:30:49,600
I want to understand how it
works in the process.
1541
01:30:49,600 --> 01:30:50,800
Well, I want to understand how
I.
1542
01:30:50,800 --> 01:30:53,960
Remember how bad, how hard
people fought against time code,
1543
01:30:53,960 --> 01:30:57,920
how hard people fought against,
you know, programming cues and
1544
01:30:57,920 --> 01:30:59,920
then running them in sequence
and things like that.
1545
01:30:59,920 --> 01:31:04,680
Like, you know, advancements
always tend to have people
1546
01:31:04,680 --> 01:31:06,040
fighting on the other side,
right?
1547
01:31:06,040 --> 01:31:08,760
You run into resistance
always.
1548
01:31:09,120 --> 01:31:11,280
But one of the things I want to
spend a few minutes talking
1549
01:31:11,280 --> 01:31:14,840
about because I know I don't
have you all day is this this
1550
01:31:14,840 --> 01:31:19,840
term prompt engineering.
And so you know, I know that
1551
01:31:19,920 --> 01:31:24,400
from my own experience when you
take the time to learn how to
1552
01:31:24,400 --> 01:31:27,720
and you can go out and Google
and there's 1000 YouTube videos
1553
01:31:27,720 --> 01:31:31,520
and probably 100,000 TikTok
videos and stuff on how to do
1554
01:31:31,520 --> 01:31:33,720
the best prompt engineering and
all that stuff.
1555
01:31:34,120 --> 01:31:37,360
But one of the things I've
discovered recently, and I'm
1556
01:31:37,360 --> 01:31:41,240
sure you're very aware of it, is
Anthropic.
1557
01:31:41,840 --> 01:31:46,640
Claude writes prompts.
So you can basically tell Claude
1558
01:31:46,640 --> 01:31:52,160
to write a prompt for ChatGPT to
do blah blah blah whatever it
1559
01:31:52,160 --> 01:31:55,600
is, and you speak to it in
English, it writes the prompt in
1560
01:31:55,600 --> 01:32:00,920
prompt world and then you copy
and paste it into ChatGPT and it
1561
01:32:00,920 --> 01:32:04,880
works amazingly well.
Yeah, and in fact, you should
1562
01:32:04,880 --> 01:32:09,440
know that if you're using DALL-E
3, which is OpenAI's image
1563
01:32:09,440 --> 01:32:15,200
generator, when you prompt DALL-E
3, it sends your prompt to
1564
01:32:15,200 --> 01:32:18,600
ChatGPT for ChatGPT to rewrite
it.
1565
01:32:19,120 --> 01:32:20,520
Really.
Better prompt.
1566
01:32:20,520 --> 01:32:22,720
Oh, that's interesting.
And this is all behind the
1567
01:32:22,720 --> 01:32:23,480
curtain.
Yeah.
1568
01:32:23,480 --> 01:32:26,240
You don't see it.
And then ChatGPT feeds the
1569
01:32:26,240 --> 01:32:28,600
better prompt.
Back to Dolly three.
1570
01:32:28,760 --> 01:32:31,680
Yeah, You know, Yeah.
And the, you know, and I know
1571
01:32:31,680 --> 01:32:36,400
some people who, who deploy this
in a way I use, you know, the
1572
01:32:36,400 --> 01:32:38,680
tools I use, like I'm a big
Midjourney user.
1573
01:32:38,680 --> 01:32:42,280
My workflow is I tend to make
images in Midjourney and then
1574
01:32:42,280 --> 01:32:47,080
if I need to then move to video,
I'll take anchor images that,
1575
01:32:47,200 --> 01:32:49,880
that, that are like the look and
the feel, the aesthetic and the
1576
01:32:49,880 --> 01:32:51,400
composition.
I want to make them in
1577
01:32:51,400 --> 01:32:54,440
Midjourney.
And then I go into Runway's Gen-
1578
01:32:54,440 --> 01:33:00,040
3, which is a fantastic video
creation tool.
1579
01:33:00,280 --> 01:33:03,120
Yeah.
And I will use an image prompt.
1580
01:33:03,120 --> 01:33:05,520
I will use the picture I made in
Midjourney.
1581
01:33:05,520 --> 01:33:08,680
I will put it into Runway.
I will add a little bit of
1582
01:33:08,680 --> 01:33:10,800
detail.
And then Runway makes a piece of
1583
01:33:10,800 --> 01:33:11,760
video.
Yeah.
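That image-plus-text pattern is also scriptable. The sketch below is deliberately vendor-neutral: the base URL, payload fields, and job-polling shape are invented placeholders, not Runway's actual API (check the vendor's docs for the real endpoints). What it shows is the workflow itself: an anchor frame in, a short motion note in, a clip out.

```python
# Hypothetical sketch of an image-to-video job. The URL, fields, and
# polling shape are placeholders, NOT any vendor's real API.
import time
import requests  # pip install requests

API = "https://api.example-video-vendor.com/v1"  # placeholder base URL
KEY = "..."                                      # your API key

def image_to_video(anchor_image_url: str, motion_note: str) -> str:
    """Submit an anchor frame plus a short text note; wait for the clip."""
    headers = {"Authorization": f"Bearer {KEY}"}
    job = requests.post(
        f"{API}/image_to_video",
        headers=headers,
        json={"image": anchor_image_url, "text": motion_note},
        timeout=30,
    ).json()
    while True:  # a real client would also handle failures and rate limits
        status = requests.get(
            f"{API}/jobs/{job['id']}", headers=headers, timeout=30
        ).json()
        if status["state"] == "done":
            return status["video_url"]
        time.sleep(5)

# Anchor image made elsewhere (e.g. Midjourney), plus a little added detail:
print(image_to_video(
    "https://example.com/anchor.png",
    "slow push in, haze drifting through the beams",
))
```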
1584
01:33:11,960 --> 01:33:15,280
The thing about prompt
engineering and, like,
1585
01:33:15,280 --> 01:33:20,720
using AI to help you prompt
other AI is, A, the models move
1586
01:33:20,720 --> 01:33:22,360
really fast, right?
Yeah.
1587
01:33:22,360 --> 01:33:26,960
And how you prompted, like for
instance, in Midjourney version
1588
01:33:26,960 --> 01:33:33,120
4, you benefited from making
really long prompts, right?
1589
01:33:33,120 --> 01:33:35,960
Like, like the more description
you gave it, the better.
1590
01:33:36,480 --> 01:33:40,480
When Midjourney went to version
5, because of the way the model
1591
01:33:40,480 --> 01:33:43,240
worked, it preferred you to be
very direct.
1592
01:33:43,520 --> 01:33:46,520
Like it wanted you to deliver
like a couple of sentences and
1593
01:33:46,520 --> 01:33:50,640
then it filled in the gaps.
You know, version 6 works in a
1594
01:33:50,640 --> 01:33:55,000
very different way, which is
kind of structured more in a way
1595
01:33:55,520 --> 01:34:00,000
of like version six wants you to
work in a very specific way,
1596
01:34:00,320 --> 01:34:04,440
like it wants you to start
with style, tell it about
1597
01:34:04,440 --> 01:34:09,760
subject, give it detail about
settings, then information about
1598
01:34:09,760 --> 01:34:13,880
composition, then information
maybe about exterior forces like
1599
01:34:13,880 --> 01:34:15,320
lighting.
Yeah.
1600
01:34:15,520 --> 01:34:22,320
And then add additional info
like commands about aspect ratio
1601
01:34:22,320 --> 01:34:25,440
and stuff like that.
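That ordering is easy to encode so you stop retyping it. A small helper reflecting the structure just described (style, subject, setting, composition, lighting, then trailing parameters); `--ar` is Midjourney's real aspect-ratio flag, but the field order is only the convention described here, not an official template:

```python
# Assemble a Midjourney v6-style prompt in the order described above:
# style, subject, setting, composition, lighting, then parameters.
def mj_v6_prompt(style: str, subject: str, setting: str,
                 composition: str = "", lighting: str = "",
                 aspect_ratio: str = "16:9") -> str:
    parts = [style, subject, setting, composition, lighting]
    body = ", ".join(p.strip() for p in parts if p.strip())
    return f"{body} --ar {aspect_ratio}"  # --ar is Midjourney's aspect flag

print(mj_v6_prompt(
    style="cinematic photograph",
    subject="a lone figure on a concert stage",
    setting="vast arena, haze in the air",
    composition="wide shot, figure small in frame",
    lighting="mysterious blue backlight",
))
```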
OK, now it has a format it wants
1602
01:34:25,440 --> 01:34:30,640
you to follow, and if you feed
a ChatGPT prompt into version 6
1603
01:34:30,640 --> 01:34:33,000
in Midjourney, your results
might be hit or
1604
01:34:33,000 --> 01:34:37,840
miss. You're going to get crap.
ChatGPT has not caught up with
1605
01:34:38,680 --> 01:34:43,440
the format that Midjourney
prefers. Right, so when we're
1606
01:34:43,440 --> 01:34:47,720
talking about prompt
engineering, so like ChatGPT, if
1607
01:34:47,720 --> 01:34:51,720
you go to it and you just say,
write me a blog article about,
1608
01:34:51,800 --> 01:34:56,680
you know, the live events
lighting industry, you're going
1609
01:34:56,680 --> 01:34:58,680
to get shit, right?
You're going to just get.
1610
01:34:58,680 --> 01:35:03,440
You're going to need to do, and
I do use ChatGPT to help me
1611
01:35:03,440 --> 01:35:06,600
write things, Yeah.
But usually I start with
1612
01:35:06,600 --> 01:35:11,560
something like, you know, if I
and then I'll feed it to ChatGPT
1613
01:35:11,560 --> 01:35:13,280
to help me.
That's me too.
1614
01:35:13,520 --> 01:35:15,280
Polish it and.
Or vice versa.
1615
01:35:15,280 --> 01:35:18,680
Or I'll do it the opposite way.
Out of ChatGPT and then I go
1616
01:35:18,680 --> 01:35:20,880
back and that's.
What I do and then yeah, yeah,
1617
01:35:20,880 --> 01:35:21,400
that's what.
I do.
1618
01:35:21,400 --> 01:35:24,600
My work with the large language
models is very back and forth
1619
01:35:24,600 --> 01:35:27,320
iterative.
Yeah, like if you just say give
1620
01:35:27,320 --> 01:35:31,320
me a brilliant article, you
know, about the current state of
1621
01:35:31,320 --> 01:35:34,080
the lighting industry, you're
going to get garbage.
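The back-and-forth itself can be made explicit. A minimal sketch, assuming OpenAI's Python SDK and an assumed model name: seed the conversation with your own rough draft, then alternate critique and revision instead of asking for a finished article in one shot.

```python
# Sketch of iterative drafting: seed with YOUR draft, then alternate
# critique and revision rather than asking for one-shot brilliance.
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY

client = OpenAI()
MODEL = "gpt-4o-mini"  # assumed model name

def ask(messages: list[dict]) -> str:
    resp = client.chat.completions.create(model=MODEL, messages=messages)
    return resp.choices[0].message.content

draft = "My rough notes on the current state of the lighting industry..."
history = [{"role": "user",
            "content": f"Here is my draft:\n{draft}\n\n"
                       "List the three weakest passages and say why."}]

for _ in range(3):  # a few rounds of critique -> revise
    critique = ask(history)
    history += [{"role": "assistant", "content": critique},
                {"role": "user", "content": "Revise the full draft to fix "
                                            "those points, keeping my voice."}]
    draft = ask(history)
    history += [{"role": "assistant", "content": draft},
                {"role": "user", "content": "Now list the three weakest "
                                            "passages in this revision."}]

print(draft)
```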
1622
01:35:34,320 --> 01:35:37,680
Well, you know, it's like I had
people tell me that Grok in the
1623
01:35:37,680 --> 01:35:43,560
fun mode on Grok, it wrote a lot
better things than ChatGPT where
1624
01:35:43,560 --> 01:35:46,600
you could actually copy and
paste right out of Grok and and
1625
01:35:46,600 --> 01:35:48,920
use that article.
I don't think so.
1626
01:35:48,920 --> 01:35:54,320
Like Grok, Grok tries to use AI
humor and it's just so stupid.
1627
01:35:54,360 --> 01:35:58,080
You know, it's just like it,
it's nonsensical, right?
1628
01:35:58,080 --> 01:35:59,960
It just doesn't work, doesn't
work.
1629
01:36:00,080 --> 01:36:04,080
It makes me throw up, not laugh,
but but yeah.
1630
01:36:04,080 --> 01:36:10,720
So I mean, you know, the thing
about AI, I guess in closing is
1631
01:36:10,720 --> 01:36:14,680
that I say to friends all
the time, just start using it
1632
01:36:14,680 --> 01:36:17,800
like find if you want, like I
tell people and I'm telling
1633
01:36:17,800 --> 01:36:19,600
people again here.
And I'm sure you would do the
1634
01:36:19,600 --> 01:36:23,480
same if you come up to me at LDI
or you send me an e-mail or a
1635
01:36:23,480 --> 01:36:26,640
text message and say, what are
you using for spreadsheets?
1636
01:36:26,640 --> 01:36:29,240
What are you using for writing
articles?
1637
01:36:29,240 --> 01:36:31,680
What are you using for images?
What are you using for this or
1638
01:36:31,680 --> 01:36:33,720
that?
I'm happy to share that
1639
01:36:33,720 --> 01:36:36,240
information with you, you know,
Yeah.
1640
01:36:36,240 --> 01:36:39,600
I mean, it's it's like, like
people ask me, you know, it's
1641
01:36:39,600 --> 01:36:41,640
just you and Sarah doing the
podcast.
1642
01:36:41,640 --> 01:36:45,240
How does Sarah do so many posts?
Like she's always posting
1643
01:36:45,440 --> 01:36:48,920
YouTube Shorts and Instagram
reels and does she not sleep?
1644
01:36:49,400 --> 01:36:51,520
And the bottom line is she
sleeps really well.
1645
01:36:51,520 --> 01:36:54,000
She's probably sleeping right
now while we're recording this.
1646
01:36:54,400 --> 01:36:58,760
But we use AI apps to do
some of those things.
1647
01:36:58,760 --> 01:37:01,640
So like there's, there's one
called Opus Clip that we use
1648
01:37:01,640 --> 01:37:06,360
pretty religiously, which takes
a YouTube video of our podcast
1649
01:37:06,720 --> 01:37:09,080
and it grabs like 20 or 30 clips
out of it.
1650
01:37:09,080 --> 01:37:11,560
And then you look at them and
you find the best ones.
1651
01:37:11,560 --> 01:37:15,040
But it's looking for things that
might become viral, things that
1652
01:37:15,040 --> 01:37:18,440
make a lot of sense, things
that, you know, it thinks people
1653
01:37:18,440 --> 01:37:22,080
are going to relate to and, and
it works pretty well.
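How the product actually picks its clips isn't public, so the sketch below is a toy illustration of the general idea only, not Opus Clip's method: score transcript chunks for hook potential with a language model, then keep the top few for a human to review. It assumes OpenAI's Python SDK and an assumed model name.

```python
# Toy illustration of clip selection -- NOT the product's actual method.
# Score transcript chunks for "hook" potential, keep the highest scorers.
import json
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY

client = OpenAI()

def score_segment(text: str) -> int:
    """Rate 0-10 how well a transcript chunk might work as a short clip."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{
            "role": "user",
            "content": "Rate 0-10 how likely this podcast excerpt is to work "
                       'as a short social clip. Reply as JSON {"score": n}.\n\n'
                       + text,
        }],
        response_format={"type": "json_object"},
    )
    return json.loads(resp.choices[0].message.content)["score"]

segments = ["...chunk 1...", "...chunk 2...", "...chunk 3..."]  # from the SRT
ranked = sorted(segments, key=score_segment, reverse=True)
print(ranked[:2])  # most clip-worthy candidates, for a human to pick from
```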
1654
01:37:22,080 --> 01:37:25,880
And, you know, so there's a lot
of really logical, really good
1655
01:37:25,880 --> 01:37:29,440
things that I use on a daily
basis that I think people would
1656
01:37:29,440 --> 01:37:33,360
benefit from and at least get
comfortable with it so that when
1657
01:37:33,360 --> 01:37:37,640
it moves a little deeper into
your world, you know, you're not
1658
01:37:37,640 --> 01:37:40,320
afraid of it.
That's my advice to people too.
1659
01:37:40,320 --> 01:37:43,640
It's, it's like, look, if you're
a professional technician, if
1660
01:37:43,640 --> 01:37:46,480
you're a, if you're a designer,
if you're a creative director or
1661
01:37:46,480 --> 01:37:50,880
creative producer, you can make
the global decision, do I want
1662
01:37:50,880 --> 01:37:54,480
to use this stuff or not?
And I honor that, whatever your
1663
01:37:54,480 --> 01:37:57,360
choices are.
But the fact is that this
1664
01:37:57,360 --> 01:37:59,920
technology is coming.
It's not coming.
1665
01:37:59,920 --> 01:38:02,480
It's here.
There's only going to be more
1666
01:38:02,480 --> 01:38:04,600
of it, it's only going to be
more pervasive.
1667
01:38:04,920 --> 01:38:08,360
Yeah, you can, if you're
interested in participating in
1668
01:38:08,360 --> 01:38:10,920
this industry and being a
professional in this industry,
1669
01:38:11,320 --> 01:38:15,600
choose to learn about it and use
it or not.
1670
01:38:16,040 --> 01:38:18,120
Yeah.
And, and again, I honor whatever
1671
01:38:18,120 --> 01:38:21,160
decision you want to make.
I've always had the
1672
01:38:21,160 --> 01:38:24,640
predisposition to want to figure
out how to use new technology
1673
01:38:24,640 --> 01:38:27,240
in what I do.
I was one of the first people
1674
01:38:27,240 --> 01:38:30,680
that figured out media servers
and using media servers.
1675
01:38:31,200 --> 01:38:34,480
You know, I was, I, I, you know,
I've been along for the ride in
1676
01:38:34,480 --> 01:38:38,240
terms of like, you know, moving
lights coming into the picture
1677
01:38:38,240 --> 01:38:41,680
and then, and then, you know,
and then the advent of LED and
1678
01:38:41,760 --> 01:38:44,880
now this diversity of fixture
types that do all these kind of
1679
01:38:44,880 --> 01:38:48,000
different things.
I've always been an artist that
1680
01:38:48,000 --> 01:38:50,720
wanted to incorporate technology
into my design.
1681
01:38:50,720 --> 01:38:53,040
So I'm curious about AI.
Yeah.
1682
01:38:53,120 --> 01:38:55,920
And since the beginning and,
and, and as you heard in the
1683
01:38:55,920 --> 01:38:59,960
podcast, you know, we've been
deploying it really since like
1684
01:39:00,400 --> 01:39:01,080
2000.
That's.
1685
01:39:01,080 --> 01:39:02,680
Crazy.
Yeah, I didn't know that.
1686
01:39:02,680 --> 01:39:07,400
That's such a cool story.
So I, you know, I'm not, I'm not
1687
01:39:07,400 --> 01:39:08,920
slamming the brakes.
Yeah.
1688
01:39:09,040 --> 01:39:11,600
And keeping up with it.
I spend a lot of time educating
1689
01:39:11,600 --> 01:39:14,440
myself on it.
We've certainly been able to do
1690
01:39:14,440 --> 01:39:17,160
things we've never been able to
do as a creative agency.
1691
01:39:17,160 --> 01:39:20,040
We've been able to swing way
above our weight because we've
1692
01:39:20,040 --> 01:39:23,560
been using using AI to help us
put together creative pitches,
1693
01:39:24,040 --> 01:39:27,480
to help us put strategy stuff
together to give our work
1694
01:39:27,520 --> 01:39:31,160
underpinnings that are stronger.
We've been able to compete with
1695
01:39:31,400 --> 01:39:35,400
agencies with way bigger
resources than ours because we
1696
01:39:35,400 --> 01:39:36,720
have.
That makes so much sense.
1697
01:39:36,720 --> 01:39:38,320
Listen, to us as a small
business.
1698
01:39:38,320 --> 01:39:40,880
It's your superpower.
It's your superpower.
1699
01:39:41,120 --> 01:39:43,280
I'm interested in it.
I'm curious about it.
1700
01:39:43,280 --> 01:39:46,920
My feeling on it is if I'm
talking to people that are new
1701
01:39:46,920 --> 01:39:49,640
or just coming into the
business, learn about it.
1702
01:39:49,800 --> 01:39:52,160
You can choose.
Whether or not you want to use
1703
01:39:52,160 --> 01:39:53,840
it.
But learn about it.
1704
01:39:53,840 --> 01:39:57,120
Learn how it is used.
Learn how you can deploy it.
1705
01:39:57,200 --> 01:39:59,520
Learn how you can deploy it to
sell yourself.
1706
01:39:59,520 --> 01:40:01,960
Sell your work.
Improve your work.
1707
01:40:02,040 --> 01:40:06,160
Yeah, be more nimble.
Save time, right?
1708
01:40:06,160 --> 01:40:08,760
Save money.
Well, there's a part of your job
1709
01:40:08,760 --> 01:40:13,280
that you hate doing and there's
a really good chance that
1710
01:40:13,320 --> 01:40:18,040
there's AI tools already in
place that will enable you to do
1711
01:40:18,040 --> 01:40:23,040
less of that, whatever that is.
So, you know, that's, that's the
1712
01:40:23,040 --> 01:40:27,240
bottom line for me is, is AI,
like when I saw the, the clothes
1713
01:40:27,240 --> 01:40:32,640
folding thing from, from Bezos
yesterday, I was like, oh,
1714
01:40:32,640 --> 01:40:35,320
really?
It can fold your clothes for
1715
01:40:35,320 --> 01:40:36,880
you.
I want one of those, you know,
1716
01:40:37,360 --> 01:40:40,680
that's perfect.
So are you doing anything at LDI
1717
01:40:40,680 --> 01:40:44,840
as far as like are you doing any
any speeches or talks or
1718
01:40:44,840 --> 01:40:48,040
anything on on AI stuff?
I will be I will be talking on
1719
01:40:48,040 --> 01:40:49,320
the 10th.
OK.
1720
01:40:49,640 --> 01:40:51,680
Along with some of my friends
from TAIT.
1721
01:40:51,880 --> 01:40:54,760
OK about placemaking.
We're gonna have a panel that's
1722
01:40:54,760 --> 01:41:00,040
part of the DSE, part of
LDI, the typical signage people.
1723
01:41:00,040 --> 01:41:02,800
Yeah, we're gonna be talking
about interactivity and
1724
01:41:02,800 --> 01:41:07,360
technology used in creating
immersive experiences for
1725
01:41:07,360 --> 01:41:09,760
permanent installation, so I'll
be talking about that.
1726
01:41:09,760 --> 01:41:13,840
I'm also apparently one of the
judges for Battle of the Busk.
1727
01:41:14,240 --> 01:41:16,720
Fun.
So I'll be, I will be there as
1728
01:41:16,720 --> 01:41:18,920
well.
And then I'll be walking the
1729
01:41:18,920 --> 01:41:21,120
floor and.
So there's no AI?
1730
01:41:21,200 --> 01:41:23,640
There's no AI talks or anything
that surprises me.
1731
01:41:23,680 --> 01:41:26,960
I thought they'd do something.
I think there is, there is
1732
01:41:26,960 --> 01:41:30,960
the XLIVE part of
LDI. Yeah, yeah.
1733
01:41:31,200 --> 01:41:36,280
Where they, I think they are
having, and Jake Pinholster, OK
1734
01:41:36,560 --> 01:41:38,440
is leading that this time
around.
1735
01:41:38,440 --> 01:41:43,960
And I think that Jake does have
several panels about AI and how
1736
01:41:44,320 --> 01:41:45,680
it's being used.
Cool.
1737
01:41:45,800 --> 01:41:47,400
I'm not talking about it this
year.
1738
01:41:47,400 --> 01:41:51,760
I've been, I
was too busy this year to commit
1739
01:41:51,840 --> 01:41:56,640
to, yeah, to doing a lot of
stuff at LDI, but I will
1740
01:41:56,640 --> 01:41:59,080
certainly be there.
I will be talking on the 10th
1741
01:41:59,080 --> 01:42:02,840
and I'll be around and about if
anybody wants to talk to me
1742
01:42:02,920 --> 01:42:03,840
one-on-one.
So.
1743
01:42:03,840 --> 01:42:06,720
Yeah, buy him a beer at the
Circle Bar and he'll tell you
1744
01:42:06,720 --> 01:42:08,160
whatever you want to know about
AI.
1745
01:42:08,760 --> 01:42:11,160
I don't know, you know, having
been, having been clean and
1746
01:42:11,160 --> 01:42:12,920
sober.
For about... Don't buy him a beer,
1747
01:42:12,920 --> 01:42:13,680
buy him a.
Coffee.
1748
01:42:14,200 --> 01:42:17,600
I don't know that I'll share the
beer, but we have fizzy water
1749
01:42:17,600 --> 01:42:20,280
with a slice of... There you go.
There you go.
1750
01:42:20,440 --> 01:42:23,680
Well, that's probably about
$27.00 at the Circle Bar, so.
1751
01:42:24,280 --> 01:42:26,480
That's the problem.
That's a that's a nice fee.
1752
01:42:27,280 --> 01:42:29,240
All right, Bob, thank you so
much for doing this.
1753
01:42:29,240 --> 01:42:32,840
I appreciate all your time.
You're you're certainly very,
1754
01:42:32,840 --> 01:42:35,480
very up on this.
And I'd actually love to buy you
1755
01:42:35,480 --> 01:42:40,200
that soda water with a, with a
lime and, and have a deeper talk
1756
01:42:40,200 --> 01:42:43,440
because I'm not a technical guy,
but I'm really an AI guy.
1757
01:42:43,440 --> 01:42:46,080
I love the tools that it's
providing me.
1758
01:42:46,080 --> 01:42:51,040
So I play with them every day.
Thanks.