The Language Neuroscience Podcast
A podcast about the scientific study of language and the brain. Neuroscientist Stephen Wilson talks with leading and up-and-coming researchers about their work and ideas. This podcast is geared to an audience of scientists who are working in the field of language neuroscience, from students to postdocs to faculty.
‘A mountain of small things’ with Masud Husain
In this episode, I talk with Masud Husain, Professor of Neurology and Cognitive Neuroscience at the University of Oxford, about his recent editorial ‘A mountain of small things’.
Husain M. A mountain of small things. Brain 2024; 147: 739. [doi]
1
00:00:00,000 --> 00:00:09,640 Welcome to episode 31 of the Language Neuroscience Podcast.
2
00:00:09,640 --> 00:00:13,840
I'm Stephen Wilson and I'm a neuroscientist at the University of Queensland in Brisbane,
3
00:00:13,840 --> 00:00:14,840 Australia.
4
00:00:14,840 --> 00:00:19,040 I have a very special guest today, Professor Masud Husain.
5
00:00:19,040 --> 00:00:23,640
He's Professor of Neurology and Cognitive Neuroscience at the University of Oxford.
6
00:00:23,640 --> 00:00:26,960
Masud is different from a lot of the guests that I've had on the podcast because he's
7
00:00:26,960 --> 00:00:28,840 not really a language and brain guy.
8
00:00:28,840 --> 00:00:34,080
He is a neuroscientist and a neurologist, but he works on topics like attention, neglect,
9
00:00:34,080 --> 00:00:38,320
the link between attention and short-term memory, motivation and apathy.
10
00:00:38,320 --> 00:00:40,480 Not the kind of topics that we normally talk about.
11
00:00:40,480 --> 00:00:43,920 And indeed, I didn't invite him to talk about his own research today.
12
00:00:43,920 --> 00:00:48,200
I invited him to talk about an editorial that he wrote for Brain, at the Journal where he
13
00:00:48,200 --> 00:00:50,680 is the editor-in-chief.
14
00:00:50,680 --> 00:00:54,920
As you guys probably know, Brain is a very important journal for our field. Some of the most important
15
00:00:54,920 --> 00:00:59,120
language and brain papers of all time have been published in that journal, such as Ludwig
16
00:00:59,120 --> 00:01:05,480
Lichtheim's 1885 masterpiece on the "house" model and Norman Geschwind's beautiful two-part
17
00:01:05,480 --> 00:01:08,920 epic from 1965, and many others.
18
00:01:08,920 --> 00:01:14,320
So Masud is editor-in-chief of that journal, which is a really prestigious position.
19
00:01:14,320 --> 00:01:15,920 Earlier this year, I read his editorial.
20
00:01:15,920 --> 00:01:17,600 It's called "A Mountain of Small Things."
21
00:01:17,600 --> 00:01:22,960 It's published in volume 147, issue 3, March 2024.
22
00:01:22,960 --> 00:01:28,440
And Masud argues that we have a big problem facing science, bigger even than the reproducibility
23
00:01:28,440 --> 00:01:29,520 crisis.
24
00:01:29,520 --> 00:01:33,720
So I invited him to come on the podcast and read his editorial for us, which he's going to
25
00:01:33,720 --> 00:01:35,880 do, and then we're going to have a chat about it.
26
00:01:35,880 --> 00:01:39,200 I'm recording this on the weekend before SNL in Brisbane.
27
00:01:39,200 --> 00:01:42,480 I'm hoping to see lots of the listeners of the podcast there at SNL.
28
00:01:42,480 --> 00:01:45,200 Please come and say hi if you see me.
29
00:01:45,200 --> 00:01:47,840
And I'd love to hear what you guys think about this different kind of episode, which is
30
00:01:47,840 --> 00:01:54,000
not so much a content episode, but a sort of practice of science episode.
31
00:01:54,000 --> 00:01:55,520 Very curious what you guys think.
32
00:01:55,520 --> 00:01:56,680 So let me know.
33
00:01:56,680 --> 00:02:02,000
And if you're not at the conference, feel free to send me an email, smwilsonau@gmail.com.
34
00:02:02,000 --> 00:02:03,480 And I hope you enjoy the episode.
35
00:02:03,480 --> 00:02:05,080 All right, let's get to it.
36
00:02:05,080 --> 00:02:07,080 Hi Masud, how are you?
37
00:02:07,080 --> 00:02:08,280 I'm well, thank you.
38
00:02:08,280 --> 00:02:11,560 Yeah, it's a pleasure to meet you and thanks for taking the time.
39
00:02:11,560 --> 00:02:13,120 Yeah, great for inviting me.
40
00:02:13,120 --> 00:02:14,120 Thank you.
41
00:02:14,120 --> 00:02:17,760
Before we get into it, can you tell me a little bit about yourself and just
42
00:02:17,760 --> 00:02:20,800
let our listeners know like, who you are and what you're interested in?
43
00:02:20,800 --> 00:02:26,400
Yeah, I'm a neurologist, still clinically active, and a neuroscientist.
44
00:02:26,400 --> 00:02:30,280 And my interest has really been in sort of cognitive neurology.
45
00:02:30,280 --> 00:02:37,600
I started off in attention and prominently in people who have inattention or the neglect
46
00:02:37,600 --> 00:02:38,600 syndrome.
47
00:02:38,600 --> 00:02:42,760 That was the area of study for about 15 years or so.
48
00:02:42,760 --> 00:02:47,200 That led to work on the link between attention and short-term memory.
49
00:02:47,200 --> 00:02:51,400
And we did quite a bit of work in trying to understand the architecture of short-term memory in
50
00:02:51,400 --> 00:02:57,200
healthy people, but also started using new techniques to measure memory in different
51
00:02:57,200 --> 00:02:58,440 patient groups.
52
00:02:58,440 --> 00:03:07,280
More recently, we've stumbled into motivation and loss of motivation,
53
00:03:07,280 --> 00:03:13,960
a syndrome called apathy, and whether there might be a neurobiological basis for that
54
00:03:13,960 --> 00:03:16,360 across different patient groups.
55
00:03:16,360 --> 00:03:17,840 But also in healthy people.
56
00:03:17,840 --> 00:03:19,640 Right, okay.
57
00:03:19,640 --> 00:03:20,640 That's fascinating.
58
00:03:20,640 --> 00:03:25,080
And do you link your clinical practice and your research or do you kind of have separate
59
00:03:25,080 --> 00:03:26,560 strands there?
60
00:03:26,560 --> 00:03:27,720 There's a strong link.
61
00:03:27,720 --> 00:03:33,000
We see people in the cognitive disorders clinic who have all sorts of disorders, including
62
00:03:33,000 --> 00:03:36,160 language disorders.
63
00:03:36,160 --> 00:03:40,600
But a lot of the patients who come to our clinics also are interested in taking part in
64
00:03:40,600 --> 00:03:41,600 the research we do.
65
00:03:41,600 --> 00:03:43,640 So yeah, there's a very strong link.
66
00:03:43,640 --> 00:03:45,360 Yeah, wonderful.
67
00:03:45,360 --> 00:03:52,300
So, about your editorial, when I read it a few months ago when it came out, I was very struck
68
by it.
69
00:03:53,300 --> 00:03:54,300 It really resonated.
70
00:03:54,300 --> 00:03:58,920
And I sent it to a lot of colleagues instantly, and they all kind of felt the same way.
71
00:03:58,920 --> 00:04:02,380
So I was wondering if people haven't read it, then they're not really going to get much
72
00:04:02,380 --> 00:04:03,540 out of our discussion.
73
00:04:03,540 --> 00:04:05,720 So would you be able to read it for us?
74
00:04:05,720 --> 00:04:09,160 I think it's about six paragraphs long.
75
00:04:09,160 --> 00:04:10,480 It's a very nice piece of writing.
76
Would you be okay with that?
77
00:04:12,000 --> 00:04:14,360 Of course, yeah, I'd be happy to do that.
78
00:04:14,360 --> 00:04:19,440
Perhaps I could just set this in a bit of context, because I'm the Editor of Brain.
79
00:04:19,440 --> 00:04:22,480 I've been doing that for the last three years.
80
00:04:22,480 --> 00:04:29,680
And of course, most editorials that we write, I write, are also about
academic work in the
81
00:04:29,680 --> 00:04:33,840 journal or related work in other journals.
82
00:04:33,840 --> 00:04:40,080
But I also think it's a responsibility of intellectuals like us and your podcast listeners
83
00:04:40,080 --> 00:04:49,760
to flag up areas that we think are actually changing in such a way that they're having a
84
00:04:49,760 --> 00:04:56,040
devastating impact on our fundamental work, whether you do language or anything else, whether
85
00:04:56,040 --> 00:04:58,840 you do cognitive neuroscience or not.
86
00:04:58,840 --> 00:05:03,280 This is actually an all pervasive problem.
87
00:05:03,280 --> 00:05:09,360
So I've thought it's my responsibility as an editor to also write about these things.
88
00:05:09,360 --> 00:05:16,880
And of course, in order to get interest, you have to write about these in a way that is not
89
00:05:16,880 --> 00:05:20,280 the normal way of academic writing.
90
But I just want to assure your listeners that I'm also a real academic.
91
00:05:24,040 --> 00:05:30,880
Yes, I can assure our listeners that Masud is an extremely eminent academic.
92
00:05:30,880 --> 00:05:35,680
You don't get to be the editor-in-chief of Brain if you just wander in off the street.
93
00:05:35,680 --> 00:05:40,880
But yes, this is a very unusual piece of writing. It's not like a typical Brain editorial.
94
00:05:40,880 --> 00:05:41,880 But I loved it.
95
00:05:41,880 --> 00:05:47,440 And I think our listeners will love it too.
96
00:05:47,440 --> 00:05:50,160 Okay, so this is the editorial which came out this year.
97
00:05:50,160 --> 00:05:53,960 It's called "A Mountain of Small Things".
98
00:05:53,960 --> 00:05:56,040 I live under its shadow.
99
00:05:56,040 --> 00:05:58,880 I suspect most of you do too.
100
00:05:58,880 --> 00:06:02,600 It is the great mountain of small things.
101
00:06:02,600 --> 00:06:09,680
Every year it grows a little taller, a little more imposing, a little more daunting.
102
00:06:09,680 --> 00:06:17,080
The higher it gets, the bigger the shadow it casts, a malignant darkness that pervades
103
00:06:17,080 --> 00:06:23,120 our lives, one that becomes ever more difficult to breach.
104
00:06:23,120 --> 00:06:30,680
So much so that it has become the norm for many of us to live entirely in the gloom.
105
00:06:30,680 --> 00:06:35,000 We no longer ask why it has come to this, even
106
00:06:35,000 --> 00:06:40,040 though we barely glimpse the light that warmed us in the past.
107
00:06:40,040 --> 00:06:42,840 Here we stand trembling.
108
00:06:42,840 --> 00:06:45,640 Our energy for innovation sapped, our
109
00:06:45,640 --> 00:06:50,000 motivation to focus on research, drained.
110
00:06:50,000 --> 00:06:53,760 Tell me, have you had a great thought lately?
111
00:06:53,760 --> 00:06:55,520 I rest my case.
112
00:06:55,520 --> 00:07:02,080
The mountain of small things makes transformative research, far less likely to happen.
113
00:07:02,080 --> 00:07:06,760 Its shadow smothers our aspirations.
114
00:07:06,760 --> 00:07:10,720 How has it grown so formidably, this mountain?
115
00:07:10,720 --> 00:07:17,280
I have, from time to time, trained my old set of binoculars to inspect its substance.
116
00:07:17,280 --> 00:07:24,080
The most curious thing is that where one might expect rocky outcrops or cascading waterfalls,
117
00:07:24,080 --> 00:07:29,400 there is instead paper or its digital counterpart.
118
00:07:29,400 --> 00:07:35,000 Yes, the mountain is paperwork, sheaves and sheaves of it.
119
00:07:35,000 --> 00:07:41,040
Here there are forms to fill, reports to write, statements of compliance with policy to
120
00:07:41,040 --> 00:07:44,520
sign, training to perform, appraisal
121
00:07:44,520 --> 00:07:50,560
documents to upload, research protocols and ethics applications to complete,
122
00:07:50,560 --> 00:07:53,680 and much, much more.
123
00:07:53,680 --> 00:07:59,760
Once you've appreciated this, you understand why we can no longer cross the shadow's edge:
124
00:07:59,760 --> 00:08:06,000
why the mountain of small things simply gets bigger and bigger over the years.
125
00:08:06,000 --> 00:08:12,160
The pressures on our employers and funders from legislation, insurers
and lawyers have meant
126
00:08:12,160 --> 00:08:18,960
there is an irresistible urge to issue, on an annual basis, yet more demands upon the people
127
00:08:18,960 --> 00:08:21,080 working in the fields.
128
00:08:21,080 --> 00:08:26,360
If like me, you have ploughed a conscientious furrow, respecting compliantly for years
129
00:08:26,360 --> 00:08:32,480
the edicts that are issued, you will be rewarded by receiving a fresh set of requests.
130
00:08:32,480 --> 00:08:38,360
There will be several more forms and reports, training modules and policies that you have
131
00:08:38,360 --> 00:08:43,960
to comply with this year, generated by an increasing number of staff who are employed
132
00:08:43,960 --> 00:08:46,840 to do just this.
133
00:08:46,840 --> 00:08:52,920
This is doubly so for those of us who are clinicians as well as scientists.
134
00:08:52,920 --> 00:08:57,600
The measures in the edicts will protect us, we're told. They will help secure our institutions
135
00:08:57,600 --> 00:08:58,600 from threats.
136
00:08:58,600 --> 00:09:02,720 They are necessary to mitigate the risks to ourselves.
137
00:09:02,720 --> 00:09:09,640
But with each year, the mountain of small things gets bigger, making it ever more likely
138
00:09:09,640 --> 00:09:14,120 that we continue to live in its growing shadow.
139
00:09:14,120 --> 00:09:16,320
Cold blooded and inert, we
140
00:09:16,320 --> 00:09:23,400
are left barely able to devote any time to the things we are actually employed to do: research,
141
00:09:23,400 --> 00:09:26,400 teaching and clinical work.
142
00:09:26,400 --> 00:09:32,080 In the darkness we have no hope of growing anything useful.
143
00:09:32,080 --> 00:09:40,240
The result is stunted shoots, disfigured in the hopeless, tenebrific atmosphere.
144
00:09:40,240 --> 00:09:44,840 The pernicious impact of the mountain is hard to estimate.
145
00:09:44,840 --> 00:09:49,200
Ask most researchers though, and they will tell you that they have not, in recent years,
146
00:09:49,200 --> 00:09:53,680 spent any time in the Sunlit Uplands.
147
00:09:53,680 --> 00:09:56,600 Whereas in the past, they enjoyed their jobs, this
148
00:09:56,600 --> 00:09:59,360 feeling has all but disappeared.
149
00:09:59,360 --> 00:10:04,560
The sense of belonging to an institution where there is a community of academics and clinicians
150
00:10:04,560 --> 00:10:10,240
that one can be proud to be part of and learn from - has simply vanished.
151
00:10:10,240 --> 00:10:16,360 We just plough the fields in the frigid darkness.
152
00:10:16,360 --> 00:10:18,440 Does it have to be like this?
153
00:10:18,440 --> 00:10:25,080
From time to time, there is a spark, an ephemeral attempt to remedy the current trajectory, a candle lit,
154
00:10:25,080 --> 00:10:30,040 throwing its meagre beams across the vast landscape.
155
00:10:30,040 --> 00:10:32,760 But it does not last.
156
00:10:32,760 --> 00:10:38,360 The problem is that we do not have the will to resist the mountain.
157
00:10:38,360 --> 00:10:44,200
Instead, we may even unwittingly contribute to it, extinguishing hope for the generations
158
00:10:44,200 --> 00:10:46,120 to come.
159
00:10:46,120 --> 00:10:50,840 Some younger researchers may never have seen the light.
160
00:10:50,840 --> 00:10:56,400
They might hear about times in the past when there was a sense of purpose, camaraderie
161
00:10:56,400 --> 00:10:59,080 and self-fulfillment, but
162
00:10:59,080 --> 00:11:05,240
the mountain's growth is relentless, leading to fewer and fewer people wanting to pursue
163
00:11:05,240 --> 00:11:08,080 a career in its shadow.
164
00:11:08,080 --> 00:11:14,040
There is a serious crisis in attracting and retaining people in biomedical and clinical
165
00:11:14,040 --> 00:11:15,720 science.
166
00:11:15,720 --> 00:11:23,080
If we recognize this but do nothing, we will also have contributed to the growing darkness.
167
00:11:23,080 --> 00:11:27,160 So that's it.
168
00:11:27,160 --> 00:11:29,200 It's a lovely piece.
169
00:11:29,200 --> 00:11:30,680 It really struck me.
170
00:11:30,680 --> 00:11:37,920
And it's kind of shocking that you're basically saying that paperwork
is one of the biggest
171
00:11:37,920 --> 00:11:39,800 problems facing science right now.
172
00:11:39,800 --> 00:11:44,640
It's not an obvious conclusion, but I think once you think about it, it's
173
00:11:44,640 --> 00:11:47,520 the right conclusion to me at least.
174
00:11:47,520 --> 00:11:53,920
I think about other things that are huge, like p-hacking, for instance, and the reproducibility
175
00:11:53,920 --> 00:11:54,920 crisis.
176
00:11:54,920 --> 00:11:58,640 And I talked to Dorothy Bishop.
177
00:11:58,640 --> 00:12:00,200 We talked all about those problems.
178
00:12:00,200 --> 00:12:01,640 And those are definitely problems.
179
00:12:01,640 --> 00:12:07,160
Do you feel like this is an even bigger problem or a problem of the same order of magnitude?
180
00:12:07,160 --> 00:12:09,960 I think it is a bigger problem.
181
00:12:09,960 --> 00:12:19,640
And having been a researcher for over 30 years, I think what people of my vintage realize is
182
00:12:19,640 --> 00:12:23,160
that it doesn't have to be this way.
183
00:12:23,160 --> 00:12:25,200 It never was this way.
184
00:12:25,200 --> 00:12:26,480
And we were doing research
185
00:12:26,480 --> 00:12:32,800
that didn't have calamitous consequences for either our institutions, the participants
186
00:12:32,800 --> 00:12:36,080 we've studied, or anyone else.
187
00:12:36,080 --> 00:12:44,560
And what has happened is, I don't think anybody in particular is at fault.
188
00:12:44,560 --> 00:12:53,600 It's a growing need to secure institutions from threats.
189
00:12:53,600 --> 00:13:01,080
And those are potentially legal threats, but it really boils down to making decisions
190
00:13:01,080 --> 00:13:03,560 under risk and uncertainty.
191
00:13:03,560 --> 00:13:09,520
As psychologists, some of your listeners would be very used to the idea that it's not really
192
00:13:09,520 --> 00:13:13,200 possible to bring those risks down to zero in the real world.
193
00:13:13,200 --> 00:13:17,320 But that's what I think people are attempting to do.
194
00:13:17,320 --> 00:13:24,160
Even though the evidence that the kind of instruments they're using would reduce the risk
195
00:13:24,160 --> 00:13:25,160 is minimal.
196
00:13:25,160 --> 00:13:27,320 It's absolutely minimal.
197
00:13:27,320 --> 00:13:29,400 So I think it's a bigger problem.
198
00:13:29,400 --> 00:13:36,080
It's not just clinical research, it's in all forms of research, and it's stopping us
199
getting on with the things, as I said, that we're paid to do.
200
00:13:39,480 --> 00:13:44,720
Yeah, sometimes I feel like they've protected against every single imaginable risk, except
201
00:13:44,720 --> 00:13:50,040
for the risk that in doing so, they would completely stifle our ability to do meaningful
202
00:13:50,040 --> 00:13:51,040 work.
203
00:13:51,040 --> 00:13:54,680
That's the one risk that nobody seems to want to protect against at all.
204
00:13:54,680 --> 00:14:01,000 But there's no form asking, is this going to destroy our ability to work?
205
00:14:01,000 --> 00:14:08,440
So I think it's our responsibility to articulate that problem, because we have been extremely
206
00:14:08,440 --> 00:14:12,840 compliant because most of us are very conscientious.
207
00:14:12,840 --> 00:14:13,840 That's the way we're built.
208
00:14:13,840 --> 00:14:15,920 We do the things we're asked to do.
209
00:14:15,920 --> 00:14:21,840
We don't say no, we don't push back, and we've never questioned this in a sort of concerted
210
00:14:21,840 --> 00:14:23,480 way.
211
00:14:23,480 --> 00:14:25,480 This is the time to do that.
212
00:14:25,480 --> 00:14:29,600
And I think you might say, and a lot of people said to me, "What can I do?
213
00:14:29,600 --> 00:14:31,680 What as an individual can I do?"
214
00:14:31,680 --> 00:14:36,040 But little nudges can have huge effects.
215
00:14:36,040 --> 00:14:44,640
So a little editorial like this has been read over 10,000 times by people, and some of my
216
00:14:44,640 --> 00:14:50,520
colleagues are saying they're using it to motivate themselves and
their groups to push
217
00:14:50,520 --> 00:14:52,920 back against things that aren't necessary.
218
00:14:52,920 --> 00:14:56,240 I'm now doing that on a regular basis, and nothing happens.
219
00:14:56,240 --> 00:15:00,520 It's not like my institution says, "Well, you're fired."
220
00:15:00,520 --> 00:15:05,240
I will just say, "I'm not doing that because there is no necessity to do that."
221
00:15:05,240 --> 00:15:07,800 And usually there isn't a necessity to do that.
222
00:15:07,800 --> 00:15:10,240 There isn't even a legal reason to do that.
223
00:15:10,240 --> 00:15:14,680
If you push back, you will see that it doesn't mean that something terrible is going to happen
224
00:15:14,680 --> 00:15:16,240 to you.
225
00:15:16,240 --> 00:15:24,760
And it's our responsibility to do this because the generations that follow are not going
226
00:15:24,760 --> 00:15:27,960 to be thanking us for not resisting this.
227
00:15:27,960 --> 00:15:29,200 Yeah.
228
00:15:29,200 --> 00:15:35,560
This reminds me of when I was a grad student, I had this side job to make money, transcribing
229
00:15:35,560 --> 00:15:39,760
interviews with biomedical scientists; it was like a project that was being done in
230
00:15:39,760 --> 00:15:40,760 the library.
231
00:15:40,760 --> 00:15:42,520 It was like oral histories of people who had won some prize.
232
00:15:42,520 --> 00:15:43,960 I don't remember what prize it was.
233
00:15:43,960 --> 00:15:48,680
There was this guy, and I don't remember his name, from the
University of Utah, and
234
00:15:48,680 --> 00:15:51,880 he was some kind of wet lab scientist.
235
00:15:51,880 --> 00:15:58,440
And I remember he said, "Whenever I get a request for something, the first thing I do is
236
00:15:58,440 --> 00:16:01,880 I never respond to anything on the first request.
237
00:16:01,880 --> 00:16:04,320 If it's important, it will come back again."
238
00:16:04,320 --> 00:16:09,520
And he said, "You can make maybe 60% of requests will go away if you simply don't respond
239
00:16:09,520 --> 00:16:10,520
to them."
240
Is that the kind of strategy that you would be advocating?
241
00:16:14,400 --> 00:16:21,080
Sometimes I do that, but I think I'm also, I will verbally respond to it by saying, "I
242
00:16:21,080 --> 00:16:24,400 don't see the need for this, and I don't know why we have to do this.
243
00:16:24,400 --> 00:16:26,520
I'm not doing it."
244
00:16:26,520 --> 00:16:32,240
And I will sometimes do that in a reply to all. I don't like replying to all in group emails.
245
00:16:32,240 --> 00:16:37,280
I don't usually do that, but in these cases, I think it's important because in a way,
246
00:16:37,280 --> 00:16:41,440
it's setting a bar there for other people in your department to say, "Look, he's not
247
00:16:41,440 --> 00:16:42,440 doing this."
248
00:16:42,440 --> 00:16:43,440 Yeah.
249
00:16:43,440 --> 00:16:44,440 Why should I do that?
250
00:16:44,440 --> 00:16:48,200 And I think it's really important for us to resist that.
251
00:16:48,200 --> 00:16:54,440 I'm not saying that administrators are malicious people.
252
00:16:54,440 --> 00:16:59,640
I'm just saying, this is the way they feel they need to protect the institution, or it's
253
come down from on high, from the central university.
254
00:17:03,600 --> 00:17:06,760 And they may not have thought that this is, this has a cost.
255
00:17:06,760 --> 00:17:11,920
All of these things have a cost, and the cost is academic time, which is actually costly
256
00:17:11,920 --> 00:17:13,600 for the university.
257
00:17:13,600 --> 00:17:14,600 Yeah.
258
00:17:14,600 --> 00:17:18,760 And how do your colleagues respond when you tell them no?
259
00:17:18,760 --> 00:17:20,600 I don't get anything back.
260
00:17:20,600 --> 00:17:26,000
Or I'll get an email from some of my colleagues saying, "Thank you for saying that."
261
00:17:26,000 --> 00:17:34,160
I haven't had to reverse a decision I've made, because usually there is no reason
262
00:17:34,160 --> 00:17:36,520
for that request coming through.
263
00:17:36,520 --> 00:17:40,400
And it's increasing, you know, the number of requests, I'm sure it is in your institution.
264
00:17:40,400 --> 00:17:46,640
The number of requests you get for simple things is just out of control.
265
00:17:46,640 --> 00:17:53,320
In the UK, I think over a decade, the number of academics doubled, but the number
266
00:17:53,320 --> 00:18:00,520
of administrators in UK universities quadrupled. You know, those people have to do something,
267
00:18:00,520 --> 00:18:02,400 they're employed to do something.
268
And they're also employed to have initiatives.
269
00:18:04,560 --> 00:18:06,480 And those initiatives include, "Oh, why don't we do this?"
270
00:18:06,480 --> 00:18:09,040 I'm not blaming them.
271
00:18:09,040 --> 00:18:13,320
That's what they're employed to do, but we don't actually need that many of them.
272
00:18:13,320 --> 00:18:15,640 No, we got by fine in the past, right?
273
00:18:15,640 --> 00:18:16,640
We did.
274
00:18:16,640 --> 00:18:17,640 And yeah, that's the thing.
275
00:18:17,640 --> 00:18:22,360
I guess that's the thing you and I are old enough that we've seen this change in the
276
00:18:22,360 --> 00:18:24,080 course of our careers.
277
00:18:24,080 --> 00:18:30,040
Can you talk about, like, can you give any examples of things that you did research-wise
278
00:18:30,040 --> 00:18:35,000
when you were a young academic just starting out that would never fly nowadays?
279
00:18:35,000 --> 00:18:40,200
Well, I think that's particularly the case with patient-related research, which is a special
280
00:18:40,200 --> 00:18:41,520 case.
281
00:18:41,520 --> 00:18:48,680
And it used to be possible to get an ethics application approved for observational or
282
00:18:48,680 --> 00:18:55,920
behavioral studies and perhaps imaging in patients within about two or three weeks.
283
00:18:55,920 --> 00:18:59,640 I got a major award last year.
284
00:18:59,640 --> 00:19:02,600 It started in December last year.
285
00:19:02,600 --> 00:19:05,960 We have only just got ethics approval.
286
00:19:05,960 --> 00:19:12,200
And then we still can't start this work because, if you do clinically related
287
00:19:12,200 --> 00:19:18,320
research, there is a division between the hospitals, the NHS in the UK, and the university, and
288
00:19:18,320 --> 00:19:22,360 everyone's really haggling for a bit of the pie.
289
00:19:22,360 --> 00:19:28,800
Despite the fact that this is really simple observational research, patients are very happy
290
00:19:28,800 --> 00:19:30,040 to get involved.
291
00:19:30,040 --> 00:19:34,040 There is no intervention, there's no risk here.
292
00:19:34,040 --> 00:19:39,880
Everything is about trying to get money out of this in terms of, you know, and that slows
293
00:19:39,880 --> 00:19:41,400 the process down.
294
00:19:41,400 --> 00:19:45,680 Is any hospital facility being used for this?
295
00:19:45,680 --> 00:19:49,000 Is there a room that isn't a university room that's being used?
296
00:19:49,000 --> 00:19:50,000 All right.
297
00:19:50,000 --> 00:19:51,000 All that kind of stuff.
298
00:19:51,000 --> 00:19:56,120
And you think, well, if you worked out how much it's costing to have administrators do this
299
00:19:56,120 --> 00:19:57,120 work?
300
00:19:57,120 --> 00:20:01,200 You would realize that this is actually not worthwhile.
301
00:20:01,200 --> 00:20:05,040 The money we're bringing in, in terms of research is not so big.
302
We're not corporations that you have to worry about.
303
00:20:08,200 --> 00:20:10,280 These are pennies we're talking about.
304
00:20:10,280 --> 00:20:12,280 But that's just giving you an example.
305
00:20:12,280 --> 00:20:15,680
What could be done in three weeks still has not been done in ten months.
306
00:20:15,680 --> 00:20:16,680 Yeah.
307
00:20:16,680 --> 00:20:17,680 And it's going to be even longer.
308
00:20:17,680 --> 00:20:21,000 It's definitely the same here.
309
Like you wouldn't get anything off the ground very quickly.
310
00:20:23,760 --> 00:20:26,280 When I was thinking about this, yeah, go on.
311
00:20:26,280 --> 00:20:30,000
I was just saying, and of course, I was just talking here about observational studies.
312
00:20:30,000 --> 00:20:36,840
Imagine that I had a new intervention for a disease which means that your lifespan is
313
00:20:36,840 --> 00:20:38,200 really reduced.
314
00:20:38,200 --> 00:20:45,800
I can't get that intervention, drug or whatever it is, to you to even try, because of these processes.
315
00:20:45,800 --> 00:20:53,280
And I think if patients really understood that these administrative processes are blocking
316
00:20:53,280 --> 00:20:59,800
them being able to trial these new interventions, they would actually probably voice the same
317
00:20:59,800 --> 00:21:01,760 concerns that we are beginning to voice.
318
00:21:01,760 --> 00:21:06,000 So the other way of doing this is getting the public involved.
319
00:21:06,000 --> 00:21:07,000
Yeah.
320
00:21:07,000 --> 00:21:11,320
Were you surprised at how fast they managed to get the COVID vaccine out?
321
00:21:11,320 --> 00:21:14,320 Or do you think they could have got it out even quicker without,
322
00:21:14,320 --> 00:21:16,840 I mean, they sort of had a working vaccine in weeks.
323
00:21:16,840 --> 00:21:17,840 Right.
324
00:21:17,840 --> 00:21:22,920
So, you know, I work at Oxford and that's where it came from in the UK.
325
00:21:22,920 --> 00:21:32,360
And essentially, the regulatory bodies fast tracked those applications.
326
00:21:32,360 --> 00:21:34,240 They had an incentive to do that.
327
00:21:34,240 --> 00:21:38,120 It worked out perfectly well, but it showed us that it's possible.
328
00:21:38,120 --> 00:21:39,120 It did.
329
00:21:39,120 --> 00:21:46,000
Now, the risks of a vaccine are much higher than the kind of studies that most of us are doing.
330
00:21:46,000 --> 00:21:47,000
Right.
331
00:21:47,000 --> 00:21:50,120 We're not doing any interventional studies.
332
00:21:50,120 --> 00:21:54,240
Yet it's taken me 10 months and I still haven't got an approval to do this.
333
00:21:54,240 --> 00:21:59,840
Whereas a COVID vaccine, which came with potentially far higher risks as well as potential benefits,
334
00:21:59,840 --> 00:22:03,920 could be fast tracked within a few weeks.
335
00:22:03,920 --> 00:22:05,880 So it shows us that it's possible.
336
00:22:05,880 --> 00:22:11,360
And it makes us think why on Earth couldn't we do this on a regular basis?
337
00:22:11,360 --> 00:22:12,600 Yeah.
338
00:22:12,600 --> 00:22:18,400
And I think back to when I was a grad student, and we used to just use the scanner, for free
339
00:22:18,400 --> 00:22:21,720 after hours because no one else was using it.
340
00:22:21,720 --> 00:22:23,240 It was just sitting there.
341
00:22:23,240 --> 00:22:27,480
And you could just go in and you know, we consented the research participants, you know, we had an IRB.
342
We could scan the participants.
343
00:22:32,960 --> 00:22:36,760
But like, you know, a grad student who hadn't really had any training was like allowed to
344
00:22:36,760 --> 00:22:41,160 just like enter the building after hours, scan a person.
345
00:22:41,160 --> 00:22:46,360
And when I think about what that allowed me to do in terms of my learning, like, I was
346
00:22:46,360 --> 00:22:47,880 just able to play around with stuff, right?
347
00:22:47,880 --> 00:22:51,320
I just tried out a million different things and, like, I'd just think of some
348
00:22:51,320 --> 00:22:53,400 paradigm over the weekend and
349
00:22:53,400 --> 00:22:57,600
I just, you know, coded it up and two days later I just ran it on somebody.
350
00:22:57,600 --> 00:23:00,440 And like most of them didn't work and some of them did.
351
00:23:00,440 --> 00:23:02,680 And those are the ones that grew into like lines of research later.
352
00:23:02,680 --> 00:23:05,320 I just think like, what about young people nowadays?
353
00:23:05,320 --> 00:23:10,960
They're not getting that opportunity to just like explore because everything is so hard to
354
00:23:10,960 --> 00:23:11,960 get started.
355
00:23:11,960 --> 00:23:18,160
You're absolutely right. After I did my PhD, I went to do a postdoc at MIT.
356
00:23:18,160 --> 00:23:23,520
And I could not believe the culture there because like everyone else, we chat over a coffee
357
00:23:23,520 --> 00:23:27,480 about a potential thought experiment.
358
00:23:27,480 --> 00:23:31,640
When I was in Oxford doing my PhD, none of that actually translated into anything, those
359
00:23:31,640 --> 00:23:32,640 thought experiments.
360
00:23:32,640 --> 00:23:39,960
At MIT, full of postdocs from different parts of the world who were there for a short time,
361
00:23:39,960 --> 00:23:44,440
there were incentives to get on with things. People would just go, well, let's do it tonight.
362
00:23:44,440 --> 00:23:46,440 And I would say, what do you mean tonight?
363
00:23:46,440 --> 00:23:50,680 Well, let's code it up now and let's do it tonight.
364
00:23:50,680 --> 00:23:54,400 And we would be doing it that evening, just like you said.
365
00:23:54,400 --> 00:23:58,200 And most of those things didn't work out, but occasionally they did.
366
00:23:58,200 --> 00:24:01,880 But it also made you feel like you were doing science.
367
00:24:01,880 --> 00:24:02,880 You were at the cutting edge.
368
00:24:02,880 --> 00:24:03,880 You were trying something.
369
00:24:03,880 --> 00:24:04,880 It didn't work out.
370
00:24:04,880 --> 00:24:05,880 Okay, why doesn't it work out?
371
00:24:05,880 --> 00:24:07,600 Let's play with this.
372
00:24:07,600 --> 00:24:09,920 Those days are gone.
373
00:24:09,920 --> 00:24:19,920
Some of my other colleagues who really enjoyed the time to think about a problem, say that
374
00:24:19,920 --> 00:24:25,320
that has become extremely difficult for them to just sit there thinking for some time.
375
00:24:25,320 --> 00:24:29,440
And I guess if you're an administrator walking past an office and seeing Stephen sitting
376
00:24:29,440 --> 00:24:32,920
there thinking, what on earth are you doing?
377
00:24:32,920 --> 00:24:34,240 You're not doing something useful.
378
00:24:34,240 --> 00:24:39,280
But of course, that is something useful. That's the whole purpose of our being in what we
379
00:24:39,280 --> 00:24:40,280 do, right?
380
00:24:40,280 --> 00:24:41,400 Yeah, right.
381
00:24:41,400 --> 00:24:44,440 So we need to get that back.
382
00:24:44,440 --> 00:24:46,960 And there is no reason we can't get that back.
383
00:24:46,960 --> 00:24:54,840
But we do have to express the problem, articulate it well, and put it on the table for people
384
00:24:54,840 --> 00:24:57,200 to understand that there is an issue.
385
00:24:57,200 --> 00:25:02,360 And pushing back is just the beginning of trying to change this.
386
00:25:02,360 --> 00:25:08,320
I also understand that trying to do this within an institution is all fine, but what we really
387
00:25:08,320 --> 00:25:14,840
need is a much bigger kind of framework to do this because
institutions need to feel like
388
00:25:14,840 --> 00:25:18,760 they're doing something that other places are doing too.
389
00:25:18,760 --> 00:25:24,920
There is no reason why that can't happen if we can get the push on this for this to happen.
390
00:25:24,920 --> 00:25:30,480
We just need to hold them by the hand and allow them to take a little bit more risk than
391
00:25:30,480 --> 00:25:34,760 they're doing because it didn't cause any problems in the past.
392
00:25:34,760 --> 00:25:37,880
And of course, they will sometimes say, well, look, there's an example of this.
393
00:25:37,880 --> 00:25:40,480 This went wrong.
394
00:25:40,480 --> 00:25:42,440 But that is what we live with.
395
00:25:42,440 --> 00:25:43,680 That's how it happens.
396
00:25:43,680 --> 00:25:45,680 There are always going to be problems.
397
00:25:45,680 --> 00:25:46,960 You can't get rid of them all.
398
00:25:46,960 --> 00:25:50,960 No, nothing is going to be perfect.
399
00:25:50,960 --> 00:25:55,960
Yeah, but like, you know, so you've described the individual action we can take,
400
00:25:55,960 --> 00:26:03,800
in the sense of pushing back against some unreasonable requests here and there, and I'm just thinking
401
00:26:03,800 --> 00:26:08,960
of a couple of days ago, I received a request to do something truly meaningless and pointless, to
402
00:26:08,960 --> 00:26:13,640
attend a two-hour meeting, at which I would give a three-minute, single-slide presentation
403
00:26:13,640 --> 00:26:18,000
on something that I actually don't have any knowledge or familiarity with anyway.
404
00:26:18,000 --> 00:26:23,400
And I thought, oh, I hope I've already got something on
405
00:26:23,400 --> 00:26:24,400 at that time.
406
00:26:24,400 --> 00:26:26,960 And so I looked at my calendar and was like, oh, I don't.
407
00:26:26,960 --> 00:26:30,600 And I thought, but the person that sent the email doesn't know that.
408
00:26:30,600 --> 00:26:32,640 And so I wrote back and I was like, oh, I'd love to, but
409
00:26:32,640 --> 00:26:35,120 unfortunately, I've got a pre-existing commitment.
410
00:26:35,120 --> 00:26:38,240 And so, you know, we can make these little small pushbacks.
411
00:26:38,240 --> 00:26:43,840
But like, how do you see it like ramping up and really changing, like the whole enterprise?
412
00:26:43,840 --> 00:26:46,680 Like, do you think there's ways of scaling up?
413
00:26:46,680 --> 00:26:52,360 Well, I mean, you know, I would say you ducked out there.
414
00:26:52,360 --> 00:26:58,040
You could have said, I don't want to do this because, you know,
this is not something that
415
00:26:58,040 --> 00:26:59,440 I have any expertise in.
416
00:26:59,440 --> 00:27:02,600 And I actually need those two hours to do something else.
417
00:27:02,600 --> 00:27:07,000
So I think we need to say these things rather than say, oh, sorry, I can't make
418
00:27:07,000 --> 00:27:09,520 it because I'm doing something else.
419
00:27:09,520 --> 00:27:11,600 So I'm going to point the finger back at you.
420
00:27:11,600 --> 00:27:12,600 OK, alright. I'll take that.
421
00:27:12,600 --> 00:27:13,600
It was,
422
00:27:13,600 --> 00:27:14,600
It was,
423
00:27:14,600 --> 00:27:20,200 It was a little bit, I was a little bit shady and yeah, not very...
424
00:27:20,200 --> 00:27:25,320
But seriously, I think what we really need to do is to get to departmental heads because
425
00:27:25,320 --> 00:27:32,760
they're the people who meet on the wider table of the university and start having a conversation
426
00:27:32,760 --> 00:27:33,760 with them.
427
00:27:33,760 --> 00:27:38,720
Now, of course, many departmental heads will not necessarily be sympathetic or even if they're
428
00:27:38,720 --> 00:27:41,880
sympathetic, they'll just shrug their shoulders and say, well, this is the way the
429
00:27:41,880 --> 00:27:43,520 world is.
430
00:27:43,520 --> 00:27:45,920 This is not the way the world needs to be.
431
00:27:45,920 --> 00:27:49,160 And we're the people who are doing the work, right?
432
00:27:49,160 --> 00:27:53,600
We're the people who are doing the work for the universities and we don't have to do it
433
00:27:53,600 --> 00:27:54,600 this way.
434
00:27:54,600 --> 00:27:57,640 So we, I'm not militant.
435
00:27:57,640 --> 00:28:02,080 I've never been anything but straightforward.
436
00:28:02,080 --> 00:28:11,200
But I think it is our responsibility as academics, intellectuals, people who are supposed to be
437
00:28:11,200 --> 00:28:16,280
thinking about the future and what we can do with it to actually take action.
438
00:28:16,280 --> 00:28:21,960
All action can make a big difference if it's cumulative or it's across the board.
439
00:28:21,960 --> 00:28:29,000
If 50% of your department agreed to this and said so to the departmental head, they need
440
00:28:29,000 --> 00:28:30,000 to think.
441
00:28:30,000 --> 00:28:31,000 Mm-hmm.
442
00:28:31,000 --> 00:28:32,000 Yeah.
443
00:28:32,000 --> 00:28:40,080
And what do you think a young person entering the field can do to make
444
00:28:40,080 --> 00:28:43,200 their way through this new world that we find ourselves in?
445
00:28:43,200 --> 00:28:45,920 I mean, they don't have the ear of the department head.
446
00:28:45,920 --> 00:28:47,680 Can they just protect themselves?
447
00:28:47,680 --> 00:28:51,480 Is that all they can hope to do as they get started?
448
00:28:51,480 --> 00:28:52,640 I think it's really difficult.
449
00:28:52,640 --> 00:28:58,320
It's difficult for us to navigate this sort of Byzantine complex of things you have to
450
00:28:58,320 --> 00:28:59,320 get through.
451
00:28:59,320 --> 00:29:02,720 It's not straightforward at all to me.
452
00:29:02,720 --> 00:29:06,400 So I have every sympathy for younger people starting this.
453
00:29:06,400 --> 00:29:12,120
If I only had my big grant and didn't have a preexisting grant, I would be tearing my
454
00:29:12,120 --> 00:29:15,880 hair out, because it's been 10 months and I haven't started.
455
00:29:15,880 --> 00:29:21,520
I did have a preexisting grant so I could just extend the old ethics and keep that going
456
00:29:21,520 --> 00:29:25,480 while we wasted time trying to get the new one.
457
00:29:25,480 --> 00:29:32,640
And so I think it's really difficult for younger people and we shouldn't forget that.
458
00:29:32,640 --> 00:29:37,680
Even when they get a grant, it's obviously just the beginning of the problem that they're
459
00:29:37,680 --> 00:29:39,520 confronting.
460
00:29:39,520 --> 00:29:46,080
So if I was in their position, I think I would also kind of document the problems they're
461
00:29:46,080 --> 00:29:51,880
having because without any documentation of how long it's taken you to get through this
462
00:29:51,880 --> 00:29:56,560 or why you've had a problem with this, it's meaningless.
463
00:29:56,560 --> 00:30:05,080
And departmental heads can't do anything unless they have real cases and scenarios, because otherwise
464
00:30:05,080 --> 00:30:07,240 they can't really take this forward anywhere.
465
00:30:07,240 --> 00:30:08,640 It's just a little bit vague.
466
00:30:08,640 --> 00:30:15,000
So probably the best thing a young person can do is to document exactly what the hurdles
467
00:30:15,000 --> 00:30:21,120
have been and why it's been so frustrating and how long it's taken them.
468
00:30:21,120 --> 00:30:22,120 Yeah.
469
00:30:22,120 --> 00:30:25,800 And if they're hearing that from all sides, then that might lead to...
470
00:30:25,800 --> 00:30:26,800 Exactly.
471
00:30:26,800 --> 00:30:27,800
Imagine you're the head of department,
472
00:30:27,800 --> 00:30:32,360
You will after a while get frustrated with email after email telling you this is what's
473
00:30:32,360 --> 00:30:33,360 happening.
474
00:30:33,360 --> 00:30:35,680 You're going to have to do something.
475
00:30:35,680 --> 00:30:41,720
So what's happened in my own department is that we've had these conversations with heads
476
00:30:41,720 --> 00:30:50,240
of department but also with administrative heads so that they understand.
477
00:30:50,240 --> 00:30:56,080
They may not understand that there is a problem because this is just what they're supposed
478
00:30:56,080 --> 00:30:57,080 to do.
479
00:30:57,080 --> 00:30:59,320 This is what central universities told them to do.
480
00:30:59,320 --> 00:31:03,920
So I think it's really important to get them on your side and make them think, "Okay,
481
00:31:03,920 --> 00:31:07,200 I didn't realize that this is an issue."
482
00:31:07,200 --> 00:31:10,960 Because they would just say, "But this would take an hour to do.
483
00:31:10,960 --> 00:31:11,960 What's the problem?"
484
00:31:11,960 --> 00:31:16,640 And what they don't know is you're being asked to do 10 hours of this.
485
00:31:16,640 --> 00:31:17,640 That's so true.
486
00:31:17,640 --> 00:31:18,640 That's so true.
487
00:31:18,640 --> 00:31:19,640 Yeah.
488
00:31:19,640 --> 00:31:24,280
They all seem so trivial in isolation that when you complain, you feel like you're being
489
00:31:24,280 --> 00:31:25,280 a whiner for complaining.
490
00:31:25,280 --> 00:31:26,280
Well, but
491
00:31:26,280 --> 00:31:31,720
what they don't understand is that the thing you asked me took me 10 minutes, but a thousand
492
00:31:31,720 --> 00:31:33,800 people asked me for 10 minutes.
493
00:31:33,800 --> 00:31:39,160
And the other thing that they don't understand, I think, is that if you think of my 40-hour
494
00:31:39,160 --> 00:31:42,880
work week, which is really, of course, like a 50 or 60-hour work week or I don't know how
495
00:31:42,880 --> 00:31:48,800
you are, but most of us are probably putting in more than 40, you could say, "Okay, well,
496
00:31:48,800 --> 00:31:53,240 it's only like this chore is only 1% of your work week.
497
00:31:53,240 --> 00:31:54,240
Why are you complaining?"
498
00:31:54,240 --> 00:32:01,840
But of my disposable time, 36 hours of my 40-hour work week is already fully taken up
499
00:32:01,840 --> 00:32:06,440
with teaching and ongoing responsibilities and things that are just scheduled and going
500
00:32:06,440 --> 00:32:07,440 to happen.
501
00:32:07,440 --> 00:32:14,320
It's like if you dig into the remaining time, it's actually only a
very small part of disposable
502
00:32:14,320 --> 00:32:15,320 time.
503
00:32:15,320 --> 00:32:19,320
It's like when you dig into it, you're taking much more than you think as a percentage,
504
00:32:19,320 --> 00:32:20,800 you see what I'm saying?
505
00:32:20,800 --> 00:32:28,440
Well, that is exactly the inspiration for a mountain of small things because that's exactly
506
00:32:28,440 --> 00:32:29,440 what I was thinking.
507
00:32:29,440 --> 00:32:34,280
I wanted to convey the idea that these are all trivial little things, but actually if you
508
00:32:34,280 --> 00:32:35,280 put them together, it's
509
00:32:35,280 --> 00:32:38,520 a huge obstacle.
510
00:32:38,520 --> 00:32:44,000
That was the actual inspiration behind thinking about the title for this and that sort of concept
511
00:32:44,000 --> 00:32:45,440 for the editorial.
512
00:32:45,440 --> 00:32:57,280
Yeah, and you gave such poignant examples: training modules, forms and reports.
513
00:32:57,280 --> 00:33:02,400
What I found after some of this conversation with our administrators is they've said, "Well,
514
00:33:02,400 --> 00:33:09,760
actually X, Y and Z are not absolute requirements."
515
00:33:09,760 --> 00:33:16,520
But A is; everybody needs to do A. But when push comes to
516
00:33:16,520 --> 00:33:21,120 shove, X, Y and Z don't really need to be done.
517
00:33:21,120 --> 00:33:22,120 That's been helpful.
518
00:33:22,120 --> 00:33:24,120 They've said, "Okay."
519
00:33:24,120 --> 00:33:28,120
So these are little small gains, but I think that's what we're going to have to do.
520
00:33:28,120 --> 00:33:32,160 We're going to have to dismantle the mountain one piece at a time.
521
00:33:32,160 --> 00:33:36,680
It's not going to be that we sweep this thing away in one go, but the small gains
522
00:33:36,680 --> 00:33:37,680 are worth it.
523
00:33:37,680 --> 00:33:43,200
I think what I'd like to hope your listeners will take away is that each of
524
00:33:43,200 --> 00:33:45,040 these small things is a little win.
525
00:33:45,040 --> 00:33:50,080
Yeah, do you feel optimistic or do you just kind of feel like, gosh,
I'm going to like,
526
00:33:50,080 --> 00:33:55,200
keep swimming against the tide, but I'm really going to get swept away at the end of the day.
527
00:33:55,200 --> 00:33:57,840 I'm not going to stop swimming, but I'm going to get swept away.
528
00:33:57,840 --> 00:34:04,640
Yeah, a very senior neuroscientist has described me as being like King
Canute in front
529
00:34:04,640 --> 00:34:13,360 of the sea, trying to whoosh the sea away before the deluge comes.
530
00:34:13,360 --> 00:34:19,480
I'm not necessarily super optimistic, but I have been really amazed by the reaction
531
00:34:19,480 --> 00:34:20,480 to these editorials.
532
00:34:20,480 --> 00:34:23,600 There are a couple of others.
533
00:34:23,600 --> 00:34:31,800
And how people have actually emailed me or texted me or gone on X to say, how good they
534
00:34:31,800 --> 00:34:39,140
feel that somebody actually is articulating this in a way that is coherent and perhaps can
535
00:34:39,140 --> 00:34:44,760 be used as material in the argument.
536
00:34:44,760 --> 00:34:47,880 So I'm not necessarily naive about this.
537
00:34:47,880 --> 00:34:53,160
I'm not sure that we're going to be able to change everything here, but I mean, hey, if
538
00:34:53,160 --> 00:34:57,120 we don't actually try, then there is no hope.
539
00:34:57,120 --> 00:35:01,720
I have to compliment you on the way you just used the word X as if it was like, like it just
540
00:35:01,720 --> 00:35:02,720 rolled off your tongue.
541
00:35:02,720 --> 00:35:07,060
I think it's the first time I've ever heard somebody say X and not say, you know, the site
542
00:35:07,060 --> 00:35:11,160
formerly known as Twitter, so called X, you know, you just like, throw it right in there,
543
00:35:11,160 --> 00:35:13,960 right into your sentence without skipping a beat.
544
00:35:13,960 --> 00:35:16,720 That's very good.
545
00:35:16,720 --> 00:35:21,720
But you know, I think maybe, as I mentioned earlier, you know, the sort of reproducibility
546
00:35:21,720 --> 00:35:24,280
crisis, right?
547
00:35:24,280 --> 00:35:32,440
That started with conversation and exposure and bringing it into the sort of consciousness
548
00:35:32,440 --> 00:35:34,440 of everybody, right?
549
00:35:34,440 --> 00:35:38,120
20 years ago, nobody was thinking about those issues and then people started talking about
550
00:35:38,120 --> 00:35:44,160
them and then, you know, momentum built and I mean, still an issue, still a huge issue.
551
00:35:44,160 --> 00:35:46,640 But like, you feel that things are changing, right?
552
00:35:46,640 --> 00:35:50,400
Clearly a shift has happened in the way we're doing science as a result of
553
00:35:50,400 --> 00:35:54,000 that, and it didn't happen overnight; it took 10 or 20 years.
554
00:35:54,000 --> 00:36:00,560
So I think what you're doing, kick starting this conversation, I can hope, I mean, like,
555
00:36:00,560 --> 00:36:05,840
I'm a sort of naturally pessimistic person, but if there is hope, then
I think that it's
556
00:36:05,840 --> 00:36:11,000
going to come from just starting to talk about it and saying, this is a problem
557
00:36:11,000 --> 00:36:15,240
that is bigger than the reproducibility crisis that is crushing science.
558
00:36:15,240 --> 00:36:21,000
Like this paperwork obsession, like getting that to be kind of like a mainstream opinion,
559
00:36:21,000 --> 00:36:23,800
because you know, as soon as you say it, other people are like, hey, you're right.
560
00:36:23,800 --> 00:36:25,960 Like, I don't enjoy science anymore.
561
Exactly, and I've not had any messages saying I'm wrong.
562
00:36:30,800 --> 00:36:31,800 Yeah.
563
00:36:31,800 --> 00:36:32,800 So many really interesting things.
564
00:36:32,800 --> 00:36:38,560
Actually, I really like writing ethics proposals and waiting 10 months
565
00:36:38,560 --> 00:36:39,840 for them.
566
00:36:39,840 --> 00:36:42,240 And you know, those online training modules are very helpful.
567
00:36:42,240 --> 00:36:47,080
Like, it's really important to know like what the symbol is for hazardous gases when
568
00:36:47,080 --> 00:36:49,320
I work in an office building.
569
00:36:49,320 --> 00:36:50,320 Yeah.
570
00:36:50,320 --> 00:36:51,320 Yeah.
571
00:36:51,320 --> 00:36:56,880
But I also think it's important to think about younger people because they also need to realize
572
00:36:56,880 --> 00:37:02,200
that this is a problem because otherwise, they're not going to understand that it wasn't
573
00:37:02,200 --> 00:37:03,200 always like this.
574
00:37:03,200 --> 00:37:04,200 Yeah.
575
00:37:04,200 --> 00:37:05,720 It really wasn't like this, right?
576
00:37:05,720 --> 00:37:12,160
And within our lifetimes, it has changed into this, this monster, which is sort of throttling
577
00:37:12,160 --> 00:37:14,080 our energy to do other stuff.
578
00:37:14,080 --> 00:37:18,400 And it really is having a detrimental effect on innovation.
579
00:37:18,400 --> 00:37:20,320 You don't have the time to think.
580
00:37:20,320 --> 00:37:24,160 You don't have the time to play.
581
00:37:24,160 --> 00:37:26,960 And that's such an important part of doing research, right?
582
00:37:26,960 --> 00:37:28,360 Trying something out.
583
00:37:28,360 --> 00:37:29,360 Absolutely.
584
00:37:29,360 --> 00:37:31,560 It's really crucial.
585
00:37:31,560 --> 00:37:32,560 Yeah.
586
00:37:32,560 --> 00:37:33,560 Cool.
587
00:37:33,560 --> 00:37:43,480
Thank you so much for, you know, taking me up on my unexpected request to be on a podcast.
588
00:37:43,480 --> 00:37:47,720
I'm really glad, you know, to have met you and to, you know, kind of get your message,
589
00:37:47,720 --> 00:37:52,320
like help spread your message a little bit because I really enjoyed it.
590
00:37:52,320 --> 00:37:53,640 Thank you very much, Stephen. Thanks.
591
00:37:53,640 --> 00:37:54,640
Yeah.
592
00:37:54,640 --> 00:37:59,360
So good luck finding some time today to do some real work, do some thinking, do
593
00:37:59,360 --> 00:38:00,360 some science.
594
00:38:00,360 --> 00:38:01,840 I'm going to try.
595
00:38:01,840 --> 00:38:02,840 Yeah. Definitely.
596
00:38:02,840 --> 00:38:03,840 Okay.
597
00:38:03,840 --> 00:38:04,840 All right.
598
00:38:04,840 --> 00:38:05,840 Thank you so much.
599
00:38:05,840 --> 00:38:06,840 Hey, yeah.
600
00:38:06,840 --> 00:38:07,840 Thanks, Stephen. Bye.
601
00:38:07,840 --> 00:38:08,840 Okay.
602
00:38:08,840 --> 00:38:09,840 Okay.
603
00:38:09,840 --> 00:38:10,840 Well, that's it for episode 31.
604
00:38:10,840 --> 00:38:14,440
Thanks a lot, Masud, for coming on the podcast and reading aloud your editorial and talking
605
00:38:14,440 --> 00:38:15,440 about it with me.
606
00:38:15,440 --> 00:38:16,440 Really enjoyed it.
607
00:38:16,440 --> 00:38:17,840 And I hope you guys did too.
608
00:38:17,840 --> 00:38:18,840 All right.
609
00:38:18,840 --> 00:38:19,840 Bye for now.
610
00:38:19,840 --> 00:38:19,840 See you next time.
614
00:38:23,840 --> 00:38:28,000
[Music]