Speaker 1: Hello, Habit Mechanics. It's Dr John Finn here. I hope you're well. Just recording a walking pod. Over the last seven days or so, there has been a lot of discussion about just how devastating AI is potentially going to be to human jobs. Some commentators completely buy into this; on the other end, some people say this is all hype, it's not real. And a really interesting resource that I wanted to point people to is the BBC Reith Lectures. The BBC run a Christmas Reith Lecture series, and in, I think it was December 2021, so before the release of and the discussions around things like ChatGPT, these new generative AI systems, the lecturer, if that's the right word for the Reith Lectures, was Professor Stuart Russell. And Professor Stuart Russell is, let me get this right, the founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley. I listened to that lecture series; I think there are at least four one-hour lectures.
He starts by covering AI at a very high level, and then he goes into these different verticals, if you like, for how AI is going to impact humans in different sectors, and there's nothing hyped about what he's saying. He's coming at it almost from a cold, clinical, academic perspective. He's actually English, but he's a professor at Berkeley, California, and he has a very measured way of talking about this stuff. And what surprised me is that you don't really hear about him, Stuart Russell. You hear more about the godfathers of AI, like Geoffrey Hinton, I think, who's another English guy, and there are a couple of others, isn't there, one French and maybe one Canadian? You don't really hear much about Stuart Russell. But if you're thinking, is this stuff real that I'm hearing about? Listen to that lecture series, because he's almost predicted the future, without the hype, just in real terms, and it's really interesting.
One of the things he talks about, because it's pretty dangerous, is the use of things like drones. Just this week we saw this almost SAS-style operation from the Ukrainian armed forces against the Russian armed forces using these drone technologies, something that probably wouldn't even have been thought possible 12 or 24 months ago. And all of this is being driven by AI. Ultimately, what Stuart Russell is saying, which completely dovetails with what people might find the most extreme impact that AI is going to have, is that slowly but surely, AI is going to get better and better at being able to do every single role, every single task that humans currently do. And right now we're seeing roles like HR affected. I think IBM, not Microsoft, just laid off 8,000 people from their HR departments.
We're also seeing, I think in the UK, I think this is the figure for the last 12 months, that graduate recruitment is down 67%, because companies don't need low-skilled workers anymore, and most of what graduates do is low-skilled compared to the people that have been doing their jobs for 20 years. They can just get AI to do that. Finance is being affected; accountancy is being affected. So we're seeing right now the emergence of these technologies, and they are disrupting the more cognitive skills. But what Stuart Russell also points out is that they're going to move into the physical skills as well. So lots of people are moving into the trades, but those sectors will also be disrupted by AI. And there's a really interesting Amazon TV programme called Clarkson's Farm. I suppose Jeremy Clarkson is a bit like Marmite: some people like him, some don't.
But what I find really interesting, having done quite a lot of work for John Deere, who are, I think, the biggest manufacturer of agricultural equipment in the world, is that they see themselves as a technology company. So when you go to John Deere HQ in the UK, they have robots that are cutting the grass, for example, and they have algorithms that are predicting things like: what's the best way to get the best return on this crop? What's the best way to promote healthy soil conditions? And what's really interesting for me in the Jeremy Clarkson documentary is how technology is being used, not just software but hardware, to actually do jobs that farmers used to do, whether that's putting in a post, just something being hammered down, wrapping a hay bale, or the combine harvesters, how clever they are. I think that's a great demonstration of what the future is going to start to look like. Again, the machines are not doing it by themselves; they're co-working with humans. But one thing that Stuart Russell says is that the last defendable set of skills for humans is going to be what I would call human AI performance psychology skills.
It's going to be the skills of humans helping other humans, because that is going to remain the most difficult thing for AI to be able to do. And in my new book, Train Your Brain for the AI Revolution, I point towards this in, I think, chapter 27. I talk about three roles for the future that I think will emerge. There will be, I think, the innovators, whose job is just to solve complex problems. That's what they're working on every day: how do we create a vaccine for this? How do we create buildings that absorb CO2 instead of pumping it out? How do we build better education systems for our young people that are going to allow them to thrive in the AI world? So we're going to have these innovators, and then we're going to have the automators, people that build the systems. Already we're seeing that medium-skilled developers have been pretty much wiped out by AI. And let's be real: AI is a baby. It's just a baby.
It's just the beginning, and it's already wiping out very skilled jobs. But we're still going to need humans in the loop on this stuff. So we're going to have the automators, who are going to build the software and the hardware, and build the machines that actually do that. But then the third role is going to be human AI performance psychology coaches, because the innovators and the automators are only going to be as good as their ability to manage their brain states and consistently get into their high-charge brain states, which is the type of highly complex mental thinking that AI is least well able to do. So human AI performance psychology coaches are going to be working with the innovators and the automators to help them do that. So I'm not trying to paint some dystopian picture of the future. I think this is real and it's emerging, and all the evidence that we're seeing is pointing in the same direction. But if you think it's hype, I would really recommend just going to the BBC, or you can Google this or use ChatGPT or something.
Look for the Reith Lecture series by Professor Stuart Russell, I think it's from December 2021, and I think you'll find it very instructive. No hype; it's just this world-leading expert in this technology, and this is what he was predicting would be unfolding in the years ahead. And now we're four and a half years on since those lectures were recorded, and it's surprising how much of what he said is actually coming to fruition. So I hope that's insightful. I see my role here as wanting to help people, wanting to educate people, but we need to be realistic. We can't bury our heads in the sand on this. We've got to educate ourselves. We've got to learn about the tech, and we've got to learn how to embed it into our workflows and our work cultures. So, yeah, I hope that was interesting.
Remember, we now have our AI Era Leadership Accelerator, where we are turning up every day and supporting people to become AI era leaders who are future-proofed, in-demand experts, and that includes becoming an AI era coach as well. So if that's of interest to you, you can access it via the link I'll put beneath this podcast. We're running four-day challenges as part of that programme. It's a monthly challenge programme, where there's a new challenge every month with four-day mini challenges in the middle of that. We just did our first one last week, and the feedback's been excellent, so I'm really excited about these and the benefits they're delivering to people. So, yeah, if that's of interest, you can check out the link beneath the podcast or just contact us via our website, and we can give you more details. It'd be great to see you in there. Okay, enjoy the rest of the day, and remember: you're only ever one brain state habit away.