20 Minute Leaders

Ep39: Mikey Fischer | PhD student at Stanford University

August 21, 2020 | Michael Matias | Season 2, Episode 39

Mikey Fischer is a PhD student at Stanford University in Computer Science studying Artificial Intelligence and Natural Language Processing, where he is developing a system that translates natural language into code. He was the head TA for “Regulating AI”, a class taught in the law school by a California Supreme Court Justice. Mikey is currently writing a book, “Regulating AI”, exploring the intersection of AI and Law.

1
00:00:21,240 --> 00:00:22,040
Mikey Fischer

2
00:00:22,120 --> 00:00:23,840
thank you for joining me
on 20 Minute Leaders

3
00:00:24,040 --> 00:00:25,440
this is great

4
00:00:25,640 --> 00:00:26,800
thank you so much, Michael

5
00:00:27,000 --> 00:00:28,640
huge fan of yours as well

6
00:00:28,840 --> 00:00:31,200
so, thanks so much for inviting me on
thank you

7
00:00:31,400 --> 00:00:34,240
so you're talking from campus from
Stanford right now, right?

8
00:00:34,440 --> 00:00:37,680
I am, I am holed up here for a little bit
it's been great kind of working

9
00:00:37,680 --> 00:00:38,840
it's kind of like cloistering in

10
00:00:39,040 --> 00:00:41,760
while I finish the last bits of my thesis
right

11
00:00:41,960 --> 00:00:44,280
so PhD at Stanford, you're focusing

12
00:00:44,480 --> 00:00:45,960
on law and artificial intelligence

13
00:00:46,160 --> 00:00:49,360
you're writing a book about regulating
artificial intelligence

14
00:00:49,560 --> 00:00:51,240
tell me about your path

15
00:00:51,440 --> 00:00:52,440

16
00:00:52,640 --> 00:00:54,840
what is this intersection of law and AI?

17
00:00:55,120 --> 00:00:56,540
where does this come from?

18
00:00:56,540 --> 00:00:57,920
Yeah
so it's a little bit

19
00:00:58,120 --> 00:01:00,360
so I originally started,
I love computer science

20
00:01:00,560 --> 00:01:02,280
obviously I've been doing it
for a really long time

21
00:01:02,480 --> 00:01:04,400
I did undergrad here at Stanford

22
00:01:04,600 --> 00:01:07,120
always been very involved in research

23
00:01:07,320 --> 00:01:10,880
and, you know, learning about the newest
things

24
00:01:11,080 --> 00:01:14,440
and then sort of AI came along and I got
involved with that

25
00:01:14,640 --> 00:01:16,800
that was sort of right as I was starting
my PhD

26
00:01:17,000 --> 00:01:19,360
and so I like realized quickly
that this was

27
00:01:19,560 --> 00:01:22,160
going to be sort of a non-incremental

28
00:01:22,360 --> 00:01:24,040
thing that was coming

29
00:01:24,240 --> 00:01:26,640
and so I sort of switched gears a little

30
00:01:26,840 --> 00:01:30,640
bit and started going into AI

31
00:01:30,840 --> 00:01:33,360
my interest has always been around sort

32
00:01:33,560 --> 00:01:35,720
of democratizing technology

33
00:01:35,920 --> 00:01:38,280
so thinking about like technology

34
00:01:38,280 --> 00:01:39,360
is this great enabler

35
00:01:39,560 --> 00:01:41,880
and how do we make it

36
00:01:42,080 --> 00:01:44,400
more accessible, and how do

37
00:01:44,600 --> 00:01:46,440
we use technology to also
make it more accessible?

38
00:01:46,440 --> 00:01:48,600
so the thing that struck me as being a

39
00:01:48,600 --> 00:01:49,640
huge need is right now

40
00:01:49,840 --> 00:01:52,440
programming languages are extremely
difficult to use

41
00:01:52,640 --> 00:01:55,040
you know, C++ was

42
00:01:55,240 --> 00:01:57,840
interesting, Python, JavaScript, you
know, it's constantly

43
00:01:58,040 --> 00:01:59,600
getting easier and easier to program
right

44
00:02:00,720 --> 00:02:03,240
still it's still very tricky to
actually understand

45
00:02:03,440 --> 00:02:06,560
to break down a problem, to learn the
syntax

46
00:02:06,760 --> 00:02:08,240
to learn how to debug something

47
00:02:08,440 --> 00:02:10,960
all these things require, you know, a
year of expertise

48
00:02:11,160 --> 00:02:13,840
before you can even build sort of a

49
00:02:14,040 --> 00:02:16,680
program of some size

50
00:02:16,880 --> 00:02:20,680
and this seemed ridiculous to me because

51
00:02:20,680 --> 00:02:21,720
I envisioned a world

52
00:02:21,920 --> 00:02:24,280
I mean, I dream of a world in which, you
know, you can

53
00:02:24,480 --> 00:02:27,760
just talk to your computer and then it
could develop the program for you

54
00:02:27,960 --> 00:02:30,320
why do I have to know how to tell it,
what to

55
00:02:30,520 --> 00:02:33,680
do in a programming language? why can't
it come a little bit closer to me?

56
00:02:33,880 --> 00:02:36,320
and I can say, you know, if Bitcoin

57
00:02:36,520 --> 00:02:38,960
goes below

58
00:02:39,160 --> 00:02:41,680
seven thousand dollars, order two more

59
00:02:41,880 --> 00:02:44,240
and like, I could build a program that
can do that, but I probably

60
00:02:44,440 --> 00:02:46,760
won't do it because it's too tricky to
do so

61
00:02:46,960 --> 00:02:50,600
but it would be nice if I could just be
telling my computer all the time

62
00:02:50,800 --> 00:02:54,320
these different programs that I would
like and it could then build them for me

63
00:02:54,320 --> 00:02:55,480
so I looked into

64
00:02:55,680 --> 00:02:58,080
so my research is sort of on

65
00:02:58,280 --> 00:03:00,720
what is the next generation of
programming languages

66
00:03:00,920 --> 00:03:03,360
going to look like? and my vision is

67
00:03:03,560 --> 00:03:05,920
that it'll encompass a lot more natural
language

68
00:03:06,120 --> 00:03:08,480
so you'll describe to the computer in
some

69
00:03:08,680 --> 00:03:11,600
not totally bizarre terms, but in
something it can understand

70
00:03:11,800 --> 00:03:14,320
and it can then use a translation,
sort of

71
00:03:14,520 --> 00:03:16,920
in a very similar way in how English is

72
00:03:17,120 --> 00:03:18,880
translated into Spanish by Google

73
00:03:19,080 --> 00:03:21,400
how can you translate English into

74
00:03:21,600 --> 00:03:24,480
a programming language like a very
simple JavaScript

75
00:03:24,680 --> 00:03:27,000
or a very simple Python that's sort of
based around these

76
00:03:27,200 --> 00:03:28,470
sort of trigger-action commands
yeah

77
00:03:28,460 --> 00:03:30,960
that
people typically want to do
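
A minimal sketch, purely to illustrate the kind of trigger-action program an English rule like "if Bitcoin goes below seven thousand dollars, order two more" could compile into. This is not Mikey's actual system; get_bitcoin_price and place_order are hypothetical placeholders for a price feed and an ordering API.

```python
import time

def get_bitcoin_price() -> float:
    """Hypothetical placeholder for a Bitcoin price feed."""
    raise NotImplementedError

def place_order(quantity: int) -> None:
    """Hypothetical placeholder for an exchange or brokerage call."""
    raise NotImplementedError

# The rule reduced to a trigger (the condition) and an action (the order).
def run_rule(threshold: float = 7_000.0, quantity: int = 2) -> None:
    while True:
        if get_bitcoin_price() < threshold:  # trigger
            place_order(quantity)            # action
            return
        time.sleep(60)  # poll roughly once a minute
```

The hard research problem is the front end: getting from the English sentence to a condition and an action like these, in the same way Google Translate gets from English to Spanish.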

78
00:03:31,160 --> 00:03:32,240
so, you're literally doing a thesis,

79
00:03:32,440 --> 00:03:34,800
you're doing a PhD at Stanford

80
00:03:35,000 --> 00:03:37,680
about what is the next generation of
human

81
00:03:37,880 --> 00:03:39,360
computer interaction going to look like

82
00:03:39,560 --> 00:03:41,900
but from a development perspective,

83
00:03:41,900 --> 00:03:43,560
so not necessarily the consumer

84
00:03:43,760 --> 00:03:45,480
you know, Alexa, buy me more toilet
paper

85
00:03:45,680 --> 00:03:48,120
it's more I want to think

86
00:03:48,320 --> 00:03:50,640
of a new Alexa and I want to talk my
computer

87
00:03:50,640 --> 00:03:52,320
through working it out
exactly

88
00:03:52,520 --> 00:03:55,760
or demonstrate, another one we're
working on is demonstration

89
00:03:55,960 --> 00:03:59,280
so I just want to maybe show my computer
what to do,

90
00:03:59,280 --> 00:04:00,840
like click this button,
click this button,

91
00:04:01,040 --> 00:04:03,320
but only click this next button

92
00:04:03,520 --> 00:04:05,880
if the price of hand sanitizer is

93
00:04:06,080 --> 00:04:09,240
below ten dollars and if it's not,
then do this
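
One way to picture programming by demonstration, as a sketch of the idea only: each recorded click becomes a step, and a step can carry a guard condition plus a fallback ("and if it's not, then do this"). The click and get_price helpers below are hypothetical stand-ins for real browser-automation calls.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical stand-ins for real browser-automation calls.
def click(selector: str) -> None:
    print(f"click {selector}")

def get_price(selector: str) -> float:
    return 9.99  # placeholder price; a real system would read it off the page

@dataclass
class Step:
    """One recorded UI action, optionally guarded by a condition."""
    action: Callable[[], None]
    condition: Optional[Callable[[], bool]] = None
    otherwise: Optional[Callable[[], None]] = None  # "and if it's not, then do this"

def run_demonstration(steps: List[Step]) -> None:
    for step in steps:
        if step.condition is None or step.condition():
            step.action()
        elif step.otherwise is not None:
            step.otherwise()

# Recorded demo: click two buttons, but only click the second one
# if the hand sanitizer price is below ten dollars.
demo = [
    Step(action=lambda: click("#add-to-cart")),
    Step(action=lambda: click("#checkout"),
         condition=lambda: get_price("#hand-sanitizer") < 10.0,
         otherwise=lambda: click("#save-for-later")),
]

run_demonstration(demo)
```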

94
00:04:09,440 --> 00:04:11,880
so there are sort of these next generation
programming

95
00:04:12,080 --> 00:04:14,960
systems that allow either natural
language

96
00:04:15,160 --> 00:04:19,040
or multimodal interfaces or

97
00:04:19,240 --> 00:04:21,600
or program by demonstration to allow
you to

98
00:04:21,800 --> 00:04:25,080
build these sorts of programs that will
run on

99
00:04:25,280 --> 00:04:29,280
if I say Alexa, it'll pop up and there it
goes

100
00:04:29,640 --> 00:04:32,040
it'll allow these next generation systems to
run

101
00:04:32,240 --> 00:04:34,560
on these virtual assistant type
interesting

102
00:04:34,760 --> 00:04:37,200
and so, Mikey, you're writing a book
called Regulating AI

103
00:04:37,400 --> 00:04:39,040
well, what is that about?
yeah

104
00:04:39,240 --> 00:04:41,360
so I've always been interested in AI

105
00:04:41,560 --> 00:04:44,200
and as I was doing more and more of it

106
00:04:44,400 --> 00:04:47,640
the social implications became more and
more pronounced

107
00:04:47,840 --> 00:04:50,640
and, you know, especially being at
Stanford, Stanford is all about

108
00:04:50,840 --> 00:04:53,920
the intersection between technology and

109
00:04:54,120 --> 00:04:57,480
society for a lot of things
it's not

110
00:04:57,680 --> 00:04:59,760
I mean, there's other schools that are
very technology focused

111
00:04:59,960 --> 00:05:01,800
there's other schools that are very human
focused

112
00:05:02,000 --> 00:05:05,120
but I think Stanford
brings those two things

113
00:05:05,120 --> 00:05:08,360
together and so, I mean, that's always

114
00:05:08,360 --> 00:05:12,360
been top of mind to me
and I took a class a couple of years ago

115
00:05:12,560 --> 00:05:16,080
on regulating AI
I did well in it

116
00:05:16,280 --> 00:05:18,800
and then the next time it was taught,

117
00:05:19,000 --> 00:05:21,240
the professor asked me to be the
TA for it

118
00:05:21,440 --> 00:05:24,200
and of course, I was extremely excited
because this professor is

119
00:05:24,400 --> 00:05:27,000
a California Supreme Court justice, so
he knows the law backwards

120
00:05:27,200 --> 00:05:31,260
and forwards and he's been thinking
about law and AI

121
00:05:31,260 --> 00:05:32,080
and this class got me thinking

122
00:05:32,280 --> 00:05:33,760
about it a lot and then being a TA

123
00:05:33,960 --> 00:05:36,840
for it also got me thinking about,

124
00:05:37,040 --> 00:05:39,520
you know, we obviously want AI, we need

125
00:05:39,720 --> 00:05:43,160
AI out there, but we need more

126
00:05:43,360 --> 00:05:45,680
clear understanding for both sides, for

127
00:05:46,240 --> 00:05:48,200
from society's perspective, from the

128
00:05:48,400 --> 00:05:50,840
people who are building it, nobody wants
to release a self-driving car

129
00:05:51,040 --> 00:05:53,440
if they're not sure how it's going to

130
00:05:53,640 --> 00:05:55,400
be penalized when something bad happens

131
00:05:55,600 --> 00:05:56,940
they want some clear
sure

132
00:05:56,940 --> 00:05:58,040
it's helpful for

133
00:05:58,240 --> 00:06:02,000
both sides to have some clear regulatory
framework so that there is

134
00:06:02,200 --> 00:06:04,880
the law always finds a
framework

135
00:06:05,080 --> 00:06:06,800
I mean, it's just not clear before

136
00:06:07,000 --> 00:06:09,520
it's like you want to make it clear
before it happens, as opposed

137
00:06:09,720 --> 00:06:10,540
to after it happens
sure

138
00:06:10,540 --> 00:06:12,040
when you know someone
will sue someone else and

139
00:06:12,240 --> 00:06:14,680
some judge somewhere will review the
case and

140
00:06:14,880 --> 00:06:16,440
come up with a ruling on it

141
00:06:16,640 --> 00:06:19,360
it's just very inconsistent

142
00:06:19,560 --> 00:06:21,360
or haphazard

143
00:06:21,560 --> 00:06:22,920
so it's better to think, maybe think

144
00:06:22,920 --> 00:06:24,480
about it a little, but not think about it

145
00:06:24,480 --> 00:06:26,800
so much that you make so many
rules around it that you stifle

146
00:06:27,000 --> 00:06:28,640
innovation because that's bad as well

147
00:06:28,840 --> 00:06:30,680
but just so that there's some framework
around it

148
00:06:30,880 --> 00:06:32,720
so I've been thinking about these ideas a
lot

149
00:06:32,920 --> 00:06:36,000
I'd been TAing the class

150
00:06:36,200 --> 00:06:38,520
and so I thought it would be great to
get these

151
00:06:38,720 --> 00:06:41,200
ideas out there more
and I thought the best

152
00:06:41,400 --> 00:06:43,760
I mean, even when I was doing the class,
I tried to disseminate

153
00:06:43,960 --> 00:06:46,360
the knowledge as much as possible by
opening it up

154
00:06:46,560 --> 00:06:49,560
to auditors, people who could just sign up

155
00:06:49,760 --> 00:06:51,900
and we had an email list where
you could stay up to date

156
00:06:51,900 --> 00:06:52,720
with the class announcements

157
00:06:52,920 --> 00:06:55,400
and review materials for people trying to
MOOC the course,

158
00:06:55,400 --> 00:06:57,640
just sort of ad hoc

159
00:06:57,640 --> 00:06:58,680
and so that's good

160
00:06:58,880 --> 00:07:00,760
and so then I thought it would be great
to write a book

161
00:07:00,960 --> 00:07:02,240
and the law is so interesting

162
00:07:02,440 --> 00:07:05,120

163
00:07:05,320 --> 00:07:07,800
and myself are writing this book

164
00:07:08,000 --> 00:07:10,360
and what we're doing is we're
thinking about it in terms of like

165
00:07:10,560 --> 00:07:13,640
a primer for people who want to learn
about law

166
00:07:14,000 --> 00:07:17,360
and also a primer for people who
want to learn about AI

167
00:07:17,560 --> 00:07:21,280
so then we're sort of like taking
different concepts

168
00:07:21,480 --> 00:07:22,560
and then tying them together

169
00:07:22,760 --> 00:07:25,080
so we're looking at like self-driving cars

170
00:07:25,280 --> 00:07:28,560
and then we're tying that together with
product liability and tort law

171
00:07:28,760 --> 00:07:31,000
so we'll give like a primer in what is
a self-driving car

172
00:07:31,200 --> 00:07:32,460
how does computer vision work,

173
00:07:32,460 --> 00:07:33,640

174
00:07:33,840 --> 00:07:36,540
and then tying that together with product
liability and tort law

175
00:07:36,540 --> 00:07:38,960
and these interesting areas of law and we're
looking at like generative AI

176
00:07:38,960 --> 00:07:39,960
so how does AI

177
00:07:40,160 --> 00:07:43,120
produce like ideas and

178
00:07:43,320 --> 00:07:45,920
trademark, trademark things that can be
trademarked or words

179
00:07:46,120 --> 00:07:47,800
that can be...or things like
that

180
00:07:48,000 --> 00:07:50,520
and then tying that together with patent
law and

181
00:07:50,720 --> 00:07:52,120
copyrights and property law

182
00:07:52,320 --> 00:07:54,420
like sort of those things
and giving a primer to both

183
00:07:54,420 --> 00:07:55,720
and then merging them together

184
00:07:55,920 --> 00:07:59,200
so, Mikey, let me I want to I want to pick
your brain on this

185
00:07:59,400 --> 00:08:03,180
I recently saw this wide-ranging study about the
trolley problem

186
00:08:03,180 --> 00:08:04,680
where they presented
in many

187
00:08:04,880 --> 00:08:07,760
countries this question of, various
scenarios of the trolley problem

188
00:08:07,960 --> 00:08:10,560
I'm sure you're aware of the study where
millions of respondents

189
00:08:10,760 --> 00:08:14,280
said whether they would have an
autonomous vehicle,

190
00:08:14,880 --> 00:08:16,480
go one way or the other,

191
00:08:16,680 --> 00:08:19,120
let's say a grandmother or a

192
00:08:19,320 --> 00:08:20,480
baby, right?
yes

193
00:08:20,680 --> 00:08:23,120
and, um, and what was very
surprising

194
00:08:23,320 --> 00:08:25,760
was that the different countries had

195
00:08:25,960 --> 00:08:28,240
different responses to this based on
their cultures

196
00:08:28,440 --> 00:08:30,760
and I always thought to myself,
what are the implications

197
00:08:30,960 --> 00:08:33,480
of this in terms of the law and how are
different countries

198
00:08:33,680 --> 00:08:37,240
going to regulate sort of AI
in their own sphere?

199
00:08:37,240 --> 00:08:38,320
and what happens
when one country

200
00:08:38,520 --> 00:08:41,040
develops technology that another
country uses?

201
00:08:41,240 --> 00:08:44,240
do you have some idea behind
these things?

202
00:08:44,440 --> 00:08:46,960
yeah, I mean, a lot of countries I mean like

203
00:08:47,160 --> 00:08:49,560
I think Singapore, all

204
00:08:49,760 --> 00:08:50,920
these things are cultural too

205
00:08:51,120 --> 00:08:52,720
so
Yeah, definitely

206
00:08:52,720 --> 00:08:53,480
like Singapore

207
00:08:53,680 --> 00:08:55,320
is just a smaller place that has
fewer roads

208
00:08:55,520 --> 00:08:57,880
there's not any rural roads in

209
00:08:58,080 --> 00:09:00,200
which, you know, everything is
well maintained

210
00:09:00,400 --> 00:09:03,640
the dividers are all well painted

211
00:09:03,640 --> 00:09:04,720
things probably aren't

212
00:09:04,920 --> 00:09:07,280
it's all

213
00:09:07,480 --> 00:09:08,880
very urban

214
00:09:08,880 --> 00:09:10,040
yeah

215
00:09:10,240 --> 00:09:12,560
whereas the United States is just,
we have super urban

216
00:09:12,760 --> 00:09:14,560
areas like New York, San Francisco, LA

217
00:09:15,680 --> 00:09:18,240
we have suburban areas where it works

218
00:09:18,440 --> 00:09:20,960
and urban is really, really hard because
there's so much overlap

219
00:09:21,160 --> 00:09:22,440
of everything and people

220
00:09:22,640 --> 00:09:25,160
and suburban is fairly easy and
rural is just like

221
00:09:25,360 --> 00:09:28,440
yes, actually super helpful because you
can just drive somewhere

222
00:09:28,640 --> 00:09:31,040
and so all these things

223
00:09:31,240 --> 00:09:33,640
I don't know that there is one sort of
overarching theme, because

224
00:09:33,840 --> 00:09:36,280
this law is tied

225
00:09:36,480 --> 00:09:39,140
a lot with cultural norms
sure

226
00:09:39,140 --> 00:09:42,040
and what it's
going to be used for

227
00:09:42,240 --> 00:09:44,560
but there's certainly some places

228
00:09:44,760 --> 00:09:47,280
that are further ahead and even some states
are further ahead, like

229
00:09:47,480 --> 00:09:51,480
Arizona and Texas are doing a lot

230
00:09:52,120 --> 00:09:55,120
and I think Nevada, California is too

231
00:09:55,320 --> 00:09:57,640
and they're doing it in actually a
pretty smart way,

232
00:09:57,840 --> 00:10:00,520
they're doing staged rollouts in certain
areas

233
00:10:00,720 --> 00:10:02,680
with safety drivers, which is
beneficial

234
00:10:02,880 --> 00:10:05,400
I mean, it doesn't always work
like what we saw in the

235
00:10:05,600 --> 00:10:08,480
case where there was a safety driver,
but they weren't paying attention

236
00:10:08,680 --> 00:10:09,960
and someone ended up dying

237
00:10:10,160 --> 00:10:12,560
but it's still

238
00:10:12,760 --> 00:10:16,760
I mean, if I had to say what they should
do, they need to

239
00:10:17,120 --> 00:10:20,400
have, like, I think the safety driver
is a wonderful idea

240
00:10:20,600 --> 00:10:22,920
and they just need to set up some
sort of metric

241
00:10:23,120 --> 00:10:25,760
around when their

242
00:10:25,960 --> 00:10:28,800
errors per mile

243
00:10:29,000 --> 00:10:31,640
are less than what you'd expect for a
human driver
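
As a concrete illustration of the metric being described here (the numbers below are invented, not real safety statistics): compare the autonomous fleet's error rate per mile with a human baseline, and treat dropping below that baseline as the signal to expand.

```python
def errors_per_mile(error_count: int, miles_driven: float) -> float:
    """Rate of safety-relevant errors per mile driven."""
    return error_count / miles_driven

def ready_to_expand(av_errors: int, av_miles: float,
                    human_errors_per_mile: float) -> bool:
    """True once the autonomous fleet's error rate falls below the human baseline."""
    return errors_per_mile(av_errors, av_miles) < human_errors_per_mile

# Invented numbers, purely to show the comparison:
# 12 errors over a million autonomous miles vs. a human baseline of
# 2 errors per 100,000 miles.
print(ready_to_expand(av_errors=12, av_miles=1_000_000,
                      human_errors_per_mile=2 / 100_000))  # True
```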

244
00:10:31,840 --> 00:10:33,080
and once, now

245
00:10:33,280 --> 00:10:34,440
the discussion is always around

246
00:10:34,440 --> 00:10:35,560
oh, the Google AI

247
00:10:35,760 --> 00:10:37,800
car made, you know, X number of errors

248
00:10:38,000 --> 00:10:40,400
but then if you look at it, it's like much,
much less than what a human

249
00:10:40,440 --> 00:10:40,920
of course

250
00:10:41,160 --> 00:10:41,320
would make

251
00:10:41,520 --> 00:10:43,560
and it's just in absolute terms, yes, it
is bad

252
00:10:43,760 --> 00:10:46,320
but so many people are distracted while
driving

253
00:10:46,520 --> 00:10:49,080
and, you know, one of the mistakes,

254
00:10:49,280 --> 00:10:51,840
most car accidents happen
as a result

255
00:10:52,040 --> 00:10:54,440
of user error and

256
00:10:54,640 --> 00:10:58,360
we're still coming to terms with like,
is it OK?

257
00:10:58,560 --> 00:11:01,480
when a human hurts somebody,

258
00:11:01,680 --> 00:11:04,320
we have a way of dealing with it
but when a

259
00:11:04,520 --> 00:11:05,640
computer does

260
00:11:05,840 --> 00:11:08,320
people are still very on edge about that
and rightfully

261
00:11:08,520 --> 00:11:10,960
so because people are worried about sort
of a

262
00:11:10,960 --> 00:11:12,120
dystopian future

263
00:11:12,320 --> 00:11:14,760
but,
yeah, definitely

264
00:11:14,960 --> 00:11:17,440
so, Mikey, you were also a pretty
big advocate

265
00:11:17,640 --> 00:11:18,800
for Andrew Yang

266
00:11:19,000 --> 00:11:22,520
and you got to both meet him and
advocate for him on campus

267
00:11:22,720 --> 00:11:25,160
tell me a little bit about how this comes
into play with

268
00:11:25,360 --> 00:11:29,080
your love for technology and innovation
moving forward

269
00:11:29,080 --> 00:11:29,640
yeah

270
00:11:29,840 --> 00:11:32,320
well, the thing that excited me about
Andrew Yang

271
00:11:32,520 --> 00:11:35,160
was, he was the first

272
00:11:35,360 --> 00:11:37,960
politician that was really pushing

273
00:11:38,160 --> 00:11:40,640
the idea of technology

274
00:11:40,640 --> 00:11:41,800
being an enabler in government

275
00:11:42,000 --> 00:11:43,120
right now, I don't think the US

276
00:11:43,320 --> 00:11:44,400
is doing nearly enough

277
00:11:44,600 --> 00:11:47,520
we're so behind in so many areas

278
00:11:47,720 --> 00:11:51,280
we should be doing things with
electronic voting, with cryptocurrencies,

279
00:11:51,480 --> 00:11:55,320
with, you know, digital assets, laws
that are all digital

280
00:11:55,320 --> 00:11:56,160
and we're not doing any of this

281
00:11:56,360 --> 00:11:58,240
and it's a disaster because other
countries are doing it

282
00:11:58,440 --> 00:11:59,680
we're being left behind

283
00:11:59,880 --> 00:12:02,320
and so Andrew Yang was the first one that
said, look

284
00:12:02,520 --> 00:12:05,200
I'm a politician and I'm in technology

285
00:12:05,400 --> 00:12:07,800
and to me, that was that was enough that
said, OK

286
00:12:08,000 --> 00:12:11,600
this guy is really interesting and I
want to learn more about him

287
00:12:11,800 --> 00:12:14,640
I had dinner with him in San Francisco
many months ago

288
00:12:14,840 --> 00:12:17,440
and after that I was like, oh, I need to
be doing more to help him, at least

289
00:12:17,640 --> 00:12:19,320
even if he's not going to get elected

290
00:12:19,520 --> 00:12:22,760
I need to do more to just start getting
these ideas out there

291
00:12:22,760 --> 00:12:23,480


292
00:12:23,480 --> 00:12:26,340
and so his main thing was this idea

293
00:12:26,340 --> 00:12:29,480
of universal basic income, which I think
is a great idea

294
00:12:29,680 --> 00:12:33,120
I mean, basically, it gives everyone X
number of dollars a month

295
00:12:33,320 --> 00:12:34,400
I think there could be some issues

296
00:12:34,600 --> 00:12:35,600
he was giving it to everybody

297
00:12:35,800 --> 00:12:38,920
but when I was talking
with voters and everything,

298
00:12:39,120 --> 00:12:41,520
people that were rich, the idea that
someone who

299
00:12:41,720 --> 00:12:44,040
is already making a million dollars a
year

300
00:12:44,240 --> 00:12:46,000
would be getting UBI

301
00:12:46,200 --> 00:12:48,240
a thousand dollars a month, really upset
a lot of people

302
00:12:48,440 --> 00:12:50,920
so I think there's some areas where it
needs to adapt

303
00:12:51,120 --> 00:12:53,440
to maybe make it slightly more targeted,

304
00:12:53,640 --> 00:12:56,480
so that only people making under a
certain threshold get it, something maybe

305
00:12:56,680 --> 00:12:59,160
more similar to what we're seeing with
the Paycheck Protection Program, where

306
00:12:59,160 --> 00:13:01,960
it's tiered based on how much money
you're making
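
One way to picture the tiering being suggested, as a sketch only: the full $1,000 a month below some income level, phasing out to zero above it. The thresholds here are invented placeholders, not anything proposed in the conversation.

```python
def monthly_ubi(annual_income: float,
                full_amount: float = 1_000.0,
                phase_out_start: float = 75_000.0,
                phase_out_end: float = 150_000.0) -> float:
    """Full payment below phase_out_start, tapering linearly to zero by phase_out_end."""
    if annual_income <= phase_out_start:
        return full_amount
    if annual_income >= phase_out_end:
        return 0.0
    remaining = (phase_out_end - annual_income) / (phase_out_end - phase_out_start)
    return full_amount * remaining

# Invented incomes, just to show the shape of the schedule:
print(monthly_ubi(40_000))     # 1000.0 -> full benefit
print(monthly_ubi(112_500))    # 500.0  -> halfway through the phase-out
print(monthly_ubi(1_000_000))  # 0.0    -> nothing for someone making a million a year
```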

307
00:13:02,160 --> 00:13:04,480
but the idea that automation

308
00:13:04,680 --> 00:13:07,160
is taking away jobs and that you need
a

309
00:13:07,360 --> 00:13:09,800
safety net to keep people

310
00:13:10,000 --> 00:13:12,440
from becoming impoverished is a

311
00:13:12,640 --> 00:13:15,600
wonderful idea, which I really stand behind,
which I really support

312
00:13:15,800 --> 00:13:18,120
and so he was pushing that and I started
the

313
00:13:18,320 --> 00:13:19,760
Stanford Yang Gang on campus
yeah

314
00:13:19,960 --> 00:13:21,360

315
00:13:21,560 --> 00:13:24,160

316
00:13:24,360 --> 00:13:27,110
we had talks about universal basic
income sure

317
00:13:27,620 --> 00:13:30,400
we got up
I was the representative for the remote

318
00:13:30,440 --> 00:13:31,480
Iowa caucus here

319
00:13:31,680 --> 00:13:34,360
so we tried to get people to caucus

320
00:13:34,560 --> 00:13:36,480
for Andrew Yang at Stanford

321
00:13:36,680 --> 00:13:39,840
so it's been a great experience and it's
also been a wonderful experience to

322
00:13:40,040 --> 00:13:43,080
make Stanford campus a little more
political

323
00:13:43,280 --> 00:13:46,440
I think a lot of people here, for the
most part, are apathetic

324
00:13:46,640 --> 00:13:49,040
and so getting some of them engaged, especially
given that

325
00:13:49,240 --> 00:13:51,800
we had a lot of people who are computer
science people and

326
00:13:52,000 --> 00:13:54,360
also a broad spectrum, but we don't
typically

327
00:13:54,560 --> 00:13:57,360
expect to see people who are in computer
science doing it

328
00:13:57,720 --> 00:13:58,080
definitely

329
00:13:58,280 --> 00:14:00,480
so, you know, right before we started
this show, we had a really

330
00:14:00,480 --> 00:14:03,440
interesting conversation
about education and technology

331
00:14:03,640 --> 00:14:04,720
in general and

332
00:14:04,920 --> 00:14:07,240
and I think that we shared some of

333
00:14:07,440 --> 00:14:09,800
our confusion about where
this is going

334
00:14:10,000 --> 00:14:12,360
and what would happen, you know, if
future generations

335
00:14:12,560 --> 00:14:15,000
would use methods

336
00:14:15,200 --> 00:14:17,840
like we have in these weeks with Corona
times

337
00:14:17,840 --> 00:14:18,800
yeah

338
00:14:19,000 --> 00:14:20,040
and tell me a little bit

339
00:14:20,240 --> 00:14:23,160
how do you see this education through
technology?

340
00:14:23,360 --> 00:14:25,800
what are its shortfalls? what

341
00:14:26,000 --> 00:14:27,680
are the upsides to it?

342
00:14:27,880 --> 00:14:30,680
yeah, well, I mean, anytime you go to
class

343
00:14:30,880 --> 00:14:34,880
classrooms haven't changed for two
thousand years, forever

344
00:14:35,320 --> 00:14:38,600
I don't think anyone would argue that
having a professor

345
00:14:38,800 --> 00:14:41,200
give a PowerPoint presentation at the
front of the room is the

346
00:14:41,400 --> 00:14:43,760
optimal way for students

347
00:14:43,960 --> 00:14:47,520
to learn like lots of people need
more interactivity

348
00:14:47,720 --> 00:14:51,560
you need to study, the content needs to be way
more engaging than PowerPoint

349
00:14:51,760 --> 00:14:54,200
it's like going to see a high school

350
00:14:54,400 --> 00:14:59,160
play vs a
Hollywood production

351
00:14:59,360 --> 00:15:01,680
it's just like when you add production
value to things

352
00:15:01,680 --> 00:15:02,760
it makes it more interesting

353
00:15:02,960 --> 00:15:05,680
and look at things online, like
3Blue1Brown

354
00:15:05,880 --> 00:15:08,360
and the content quality is just so much
higher

355
00:15:08,560 --> 00:15:10,760
than what you'd normally see in a
classroom

356
00:15:10,960 --> 00:15:12,840
I'm taking a MasterClass, if you know that

357
00:15:13,040 --> 00:15:14,520
yeah, exactly

358
00:15:14,520 --> 00:15:15,530
exactly like the

359
00:15:15,720 --> 00:15:17,880
the quality of the content is

360
00:15:18,080 --> 00:15:20,160
and it's engaging like they thought about
it

361
00:15:20,360 --> 00:15:22,360
there's a story line around it
it's interesting

362
00:15:22,360 --> 00:15:22,600
yeah

363
00:15:22,800 --> 00:15:25,120
and so at Stanford I think like

364
00:15:25,320 --> 00:15:27,520
there's something there for the online
there's gonna be two camps

365
00:15:27,720 --> 00:15:30,560
there's gonna be those who
keep going the way it's going

366
00:15:30,760 --> 00:15:33,320
but then there's the other camp that
embraces this change

367
00:15:33,520 --> 00:15:36,040
and says, hey, now, how do we really
think about

368
00:15:36,240 --> 00:15:38,680
what's going to change? like how do we
make the content even

369
00:15:38,880 --> 00:15:41,240
more engaging and more interesting?
and then make people

370
00:15:41,440 --> 00:15:42,680
want to learn as opposed to

371
00:15:42,880 --> 00:15:45,040
but I think there's also as we were talking
about earlier, I think there's

372
00:15:45,040 --> 00:15:48,240
something missing
no one is feeling it

373
00:15:48,440 --> 00:15:50,000
no one is feeling excited about Zoom

374
00:15:50,200 --> 00:15:52,520
well, Mikey, I think what you mentioned
before, which really touched

375
00:15:52,720 --> 00:15:55,160
me, was, you know, classrooms are meant
for us

376
00:15:55,360 --> 00:15:58,880
to go and meet people and interact with
our friends, interact with the Professor

377
00:15:59,080 --> 00:16:01,720
they're not meant to sit down in the
wooden chair and just stare

378
00:16:01,920 --> 00:16:03,640
at a screen

379
00:16:03,840 --> 00:16:05,840
in my computer science classes right
now

380
00:16:06,040 --> 00:16:08,360
I feel like I'm learning just as much as
I was in the classroom

381
00:16:08,560 --> 00:16:09,800
because it's lecture based

382
00:16:10,000 --> 00:16:12,360
but then there are the classes which are
about the interaction and about

383
00:16:12,560 --> 00:16:15,520
going to the professor after class and
getting their input

384
00:16:15,720 --> 00:16:18,360
and getting her to, and pinging

385
00:16:18,560 --> 00:16:20,440
questions back and forth with her, which

386
00:16:20,440 --> 00:16:22,560
which is not really the same
yeah

387
00:16:22,560 --> 00:16:24,760
in Zoom, right?

388
00:16:24,760 --> 00:16:25,600

389
00:16:25,600 --> 00:16:27,880
yeah
so more interactive, more group projects

390
00:16:28,080 --> 00:16:30,600
more experiential learning

391
00:16:30,800 --> 00:16:35,440
as opposed to just taking notes online
right

392
00:16:35,640 --> 00:16:37,080
just more interactive
yeah

393
00:16:37,280 --> 00:16:39,680
so I'm
looking at, you know, bringing augmented

394
00:16:39,680 --> 00:16:43,000
reality into this
and obviously I really

395
00:16:43,200 --> 00:16:46,160
and I told you, I really hope that
future generations don't

396
00:16:46,360 --> 00:16:49,040
that they're not going to experience
exactly what we're experiencing

397
00:16:49,240 --> 00:16:50,640
now because it's just not enough
yeah

398
00:16:50,920 --> 00:16:51,440
as far as I see it

399
00:16:51,640 --> 00:16:54,920
and it doesn't make sense to me that we
have so much amazing technology

400
00:16:55,120 --> 00:16:58,000
that we could be leveraging for
augmented interaction

401
00:16:58,200 --> 00:17:00,520
virtually, that we're putting to

402
00:17:00,720 --> 00:17:03,480
other uses that

403
00:17:03,680 --> 00:17:07,680
I'm a little bit surprised that this is
the best we can come up with

404
00:17:08,000 --> 00:17:10,840
although, granted, nobody really expected
that the whole world

405
00:17:11,040 --> 00:17:14,440
will transition to online education
within a matter of a week

406
00:17:14,640 --> 00:17:17,080
yeah, I mean, I'm so optimistic for the
future

407
00:17:17,280 --> 00:17:19,320
there's going to be such incredible things built out
of this time

408
00:17:19,520 --> 00:17:22,040
this is just the
start

409
00:17:22,240 --> 00:17:24,720
of a huge opportunity in which there's

410
00:17:24,920 --> 00:17:27,280
going to be like incredible classes

411
00:17:27,480 --> 00:17:30,880
online and incredible changes
in health care

412
00:17:30,880 --> 00:17:31,880
incredible

413
00:17:32,080 --> 00:17:34,800
sometimes you need that deadline to
really

414
00:17:35,000 --> 00:17:37,640
it's like a deadline for a
paper like

415
00:17:37,840 --> 00:17:40,760
not until, like, it's due on April 1st

416
00:17:40,960 --> 00:17:43,480
do things actually come together

417
00:17:43,680 --> 00:17:45,280
yeah you know, digital health care

418
00:17:45,480 --> 00:17:47,800
all of a sudden, almost everybody has to
be able

419
00:17:48,000 --> 00:17:50,160
to get their prescription medicine
online

420
00:17:50,160 --> 00:17:51,200
yeah, exactly

421
00:17:51,400 --> 00:17:53,560
that's not going to go back to normal
afterwards

422
00:17:53,560 --> 00:17:55,200
yeah
incredible

423
00:17:55,400 --> 00:17:57,760
I'm super curious about what are the
lasting changes that are going

424
00:17:57,960 --> 00:17:59,640
to come out of this and what parts are
going to go back to normal

425
00:17:59,840 --> 00:18:02,520
give me one guess for one lasting change

426
00:18:02,720 --> 00:18:05,160
that will stay afterwards.

427
00:18:05,360 --> 00:18:07,720
I would say Zoom call like work from
home could

428
00:18:07,920 --> 00:18:11,320
be much more of the norm than it normally
would have been before

429
00:18:11,520 --> 00:18:12,860
I think this is going to be unless

430
00:18:12,860 --> 00:18:13,840
unless someone has a strong

431
00:18:14,040 --> 00:18:16,440
need to, like, unless you need a

432
00:18:16,640 --> 00:18:19,120
haircut or something similar, like it's

433
00:18:19,320 --> 00:18:21,840
going, most people are going to do
work from home and

434
00:18:22,040 --> 00:18:24,520
to build on that, there's going to be a lot
more tools developed to make

435
00:18:24,520 --> 00:18:27,280
yes
this more interactive around it

436
00:18:27,280 --> 00:18:27,680
definitely

437
00:18:27,880 --> 00:18:30,280
you know, I think right now, when you
apply for jobs, a lot

438
00:18:30,480 --> 00:18:32,880
of the times you look at the locations
and you think, do I want to work

439
00:18:33,080 --> 00:18:35,400
in San Francisco or Austin or Denver

440
00:18:35,600 --> 00:18:37,960
and it might not really matter soon
because you're gonna

441
00:18:38,160 --> 00:18:40,480
you're gonna have so many more
opportunities, especially in the

442
00:18:40,480 --> 00:18:43,960
tech world, to work at any company you
want, because they will have

443
00:18:44,160 --> 00:18:46,920
they will at least enable you to work
from home

444
00:18:47,120 --> 00:18:49,280
yeah,
because of this experience

445
00:18:49,480 --> 00:18:51,240
yeah

446
00:18:51,920 --> 00:18:53,400
Mikey
before we finish

447
00:18:53,600 --> 00:18:55,920
I want to put you on the spot one last
time, and I want you to tell

448
00:18:56,120 --> 00:18:59,320
me three words that you think would best
describe you

449
00:18:59,520 --> 00:19:00,560
mm hmm

450
00:19:00,560 --> 00:19:01,680
good question

451
00:19:01,880 --> 00:19:04,720
let me think

452
00:19:04,920 --> 00:19:08,400
I would say

453
00:19:08,400 --> 00:19:09,600
Fearless

454
00:19:09,800 --> 00:19:12,280
because I think once you

455
00:19:12,480 --> 00:19:15,000
are afraid of something,

456
00:19:15,200 --> 00:19:17,480
you don't go for it like you would
otherwise

457
00:19:17,680 --> 00:19:20,080
sure

458
00:19:20,280 --> 00:19:21,600
Judicious

459
00:19:21,800 --> 00:19:25,720
so thinking things all the way
through

460
00:19:25,920 --> 00:19:28,240
and, you know, thinking, weighing
different

461
00:19:28,440 --> 00:19:30,960
sides of an argument

462
00:19:31,160 --> 00:19:33,140
and

463
00:19:33,140 --> 00:19:36,840
so far sounds exactly like a PhD
candidate at Stanford

464
00:19:37,040 --> 00:19:41,040
and I would say another one would be Now

465
00:19:41,680 --> 00:19:44,880
so there's this quote I like it's like

466
00:19:45,080 --> 00:19:46,680
and along with Now would be Pithy

467
00:19:46,880 --> 00:19:47,960
these are sort of similar things

468
00:19:48,160 --> 00:19:49,680
Pithy or Now

469
00:19:49,880 --> 00:19:51,000
there's this quote I like

470
00:19:51,200 --> 00:19:53,040
"If not now, when?

471
00:19:53,240 --> 00:19:54,880
and I feel like that's an important
thing

472
00:19:55,080 --> 00:19:57,520
it's like if we're going to do it, like
there's so many things

473
00:19:57,720 --> 00:19:59,080
that can always take your time

474
00:19:59,280 --> 00:20:00,800
but like, what are the most important
things

475
00:20:01,000 --> 00:20:02,840
so like Now would be

476
00:20:03,040 --> 00:20:05,240
of course there's times to wait for
certain things with exceptions

477
00:20:05,440 --> 00:20:08,240
but Now, Now is like an important
thing

478
00:20:08,440 --> 00:20:12,280
so if you're not going to do it Now, when?
I love it Mikey

479
00:20:12,280 --> 00:20:13,200
thank you
thank you so much

480
00:20:13,200 --> 00:20:14,320
namaste
thank you

481
00:20:15,040 --> 00:20:15,290
namaste
take care