Speaker 1: Hello, Habit Mechanics. It's Dr John Finn here. I hope you're well. Just recording a walking pod. Over the last seven days or so, there has been a lot of discussion about just how devastating AI is potentially going to be to human jobs. Some commentators completely buy into this; at the other end, some people say this is all hype, it's not real. A really interesting resource I wanted to point people to is the BBC Reith Lectures. The BBC runs the Reith Lectures each year around Christmas, and in, I think it was December 2021, so before the release of and the discussions around things like ChatGPT and these new generative AI systems, the lecturer, if that's the right word, for the Reith Lectures was Professor Stuart Russell. Professor Stuart Russell is, let me get this right, the founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley. I listened to that lecture series; I think there are at least four one-hour lectures.
He starts by covering AI at a very high level and then goes into different verticals, if you like, for how AI is going to impact humans in different sectors, and there's nothing hyped about what he's saying. He's coming at it from an almost cold, clinical, academic perspective. He's English, but he's a professor at Berkeley, California, and he has a very measured way of talking about this stuff. What surprised me is that you don't really hear about him, Stuart Russell. You hear more about the godfathers of AI, like Geoffrey Hinton, I think, who's another English guy, and a couple of others; I think one is French and one maybe Canadian. You don't really hear much about Stuart Russell. But if you're thinking, is this stuff real that I'm hearing about, listen to that lecture series, because he's almost predicted the future, without the hype, just in real terms, and it's really interesting.
One of the things he talks about, because it's pretty dangerous, is the use of things like drones. Just this week we saw that almost SAS-style operation by the Ukrainian armed forces against the Russian armed forces using these drone technologies, something that probably wouldn't even have been thought possible 12 or 24 months ago. And all of this is being driven by AI. Ultimately, what Stuart Russell is saying, which completely dovetails with what people might find the most extreme impact that AI is going to have, is that slowly but surely, AI is going to get better and better at doing every single role, every single task that humans currently do. And that isn't just in the future; right now we're seeing roles like HR affected. I think IBM, not Microsoft, just laid off 8,000 people from their HR departments.
In the UK, and I think this is the figure for the last 12 months, graduate recruitment is down 67%, because companies don't need low-skilled workers any more, and most of what graduates do is low-skilled compared to the people who have been doing those jobs for 20 years; they can just get AI to do it. Finance is being affected; accountancy is being affected. So right now we're seeing the emergence of these technologies, and they are disrupting the more cognitive skills. But what Stuart Russell also points out is that they're going to move into physical skills as well. Lots of people are moving into the trades, but those sectors will also be disrupted by AI. There's a really interesting Amazon TV programme called Clarkson's Farm. I suppose Jeremy Clarkson is a bit like Marmite: people either like him or they don't.
But what I find really interesting, having done quite a lot of work for John Deere, who are, I think, the biggest manufacturer of agricultural equipment in the world, is that they see themselves as a technology company. When you go to John Deere HQ in the UK, they have robots that are cutting the grass, for example, but they also have algorithms predicting things like: what's the best way to get the best return on this crop? What's the best way to promote healthy soil conditions? And what's really interesting for me in the Jeremy Clarkson documentary is how technology, not just software but hardware, is being used to do jobs that farmers used to do, whether that's hammering in a post, wrapping a hay bale, or how clever the combine harvesters are. I think that's a great demonstration of what the future is going to start to look like. Again, the machines are not doing it by themselves; they're co-working with humans. But one thing that Stuart Russell says is that the last defendable set of skills for humans is going to be what I would call human AI performance psychology skills.
It's going to be the skills of humans helping other humans, because that is going to remain the most difficult thing for AI to do. In my new book, Train Your Brain for the AI Revolution, I point towards this in chapter 27, I think, where I talk about three roles for the future that I believe will emerge. First, there will be the innovators, whose job is just to solve complex problems. That's what they're working on every day: how do we create a vaccine for this? How do we create buildings that absorb CO2 instead of pumping it out? How do we build better education systems for our young people that will allow them to thrive in the AI world? So we're going to have these innovators, and then we're going to have the automators, the people who build the systems. Already we're seeing that medium-skilled developers have been pretty much wiped out by AI. And let's be real: AI is a baby. It's just a baby.
It's just the beginning, and it's already wiping out very skilled jobs, but we're still going to need humans in the loop on this stuff. So we're going to have the automators, who will build the software and the hardware, the machines that actually do this. Then the third role is going to be human AI performance psychology coaches, because the innovators and the automators are only going to be as good as their ability to manage their brain states and consistently get into their high-charge brain states, which support the type of highly complex mental thinking that AI is least able to do. Human AI performance psychology coaches are going to be working with the innovators and the automators to help them do that. So I'm not trying to paint some dystopian picture of the future. I think this is real, it's emerging, and all the evidence we're seeing is pointing in the same direction. But if you think it's hype, I would really recommend going to the BBC, or you can Google it or use ChatGPT or something.
Look for the Reith Lectures by Professor Stuart Russell, from December 2021 I think, and I think you'll find them very instructive. No hype; this is just what a world-leading expert in this technology was predicting would unfold in the years ahead. We're now four and a half years on from when those lectures were recorded, and it's surprising how much of what he said is actually coming to fruition. So I hope that's insightful. I see my role here as wanting to help people and educate people, but we need to be realistic. We can't bury our heads in the sand on this. We've got to educate ourselves. We've got to learn about the tech and learn how to embed it into our workflows and our work cultures. So, yeah, I hope that was interesting.
Remember, we now have our AI Era Leadership Accelerator, where we're turning up every day and supporting people to become AI era leaders who are future-proofed, in-demand experts, and that includes becoming an AI era coach as well. If that's of interest to you, you can access it via a link I'll put beneath this podcast. We're running four-day challenges as part of that programme. It's a monthly challenge programme, with a new challenge every month and four-day mini challenges in the middle. We just did our first one last week and the feedback's been excellent, so I'm really excited about these and the benefits they're delivering to people. If that's of interest, check out the link beneath the podcast or just contact us via our website and we can give you more details. It'd be great to see you in there. Okay, enjoy the rest of the day, and remember: you're only ever one brain state habit away.