The Lucent Perspective

Future-Proofing Your Team: Navigating AI and the Metaverse, with Theo Priestly

Rebecca Hastings Season 1 Episode 17

The convergence of technologies like AI, Blockchain, VR, and Metaverse will reshape society and the future of work. How are you preparing your business and your workforce to adapt and leverage AI to your advantage?

Episode Outline and Highlights

  • [01:28] Exploring AI's Impact on Society and the Future of Work.
  • [04:32] Equipping the Workforce for AI Adaptation.
  • [09:11] Theo's Vision of the Future for Businesses and Technology.
  • [15:36] Addressing Bias Risks and Ethical AI Implementation.
  • [19:41] The Regulatory Landscape's Influence on AI in Tech.
  • [25:01] Nurturing Critical Skills in Your Team.
  • [29:58] Debunking Misconceptions and Discussing the Future of the Metaverse.
  • [34:42] Theo's Unique Perspective as an Anti-Futurist.
  • [37:03] Insights on AI-Induced Doom-Mongering.
  • [39:30] Theo's Approach to Staying Informed in His Field.

About Theo Priestly

Theo has over two decades of experience working across various industries and has held executive roles in software companies, focusing on product management and marketing. He has also bootstrapped and raised funding for personal ventures, mentored startups within top accelerators, written a book on the future, and done everything else in between.

Theo has appeared on many podcasts, written over 300 articles across publications including Forbes, WIRED, HuffPost, and VentureBeat, and given keynote talks across the world. He has been featured on multiple thought leader and influencer lists on topics relating to virtual reality, artificial intelligence, and emerging technologies, and he continues to write articles that get people talking.



Rebecca Hastings, Founder and Director at The Lucent Group

Connect on LinkedIn: https://www.linkedin.com/in/hastingsrebecca/

Would you like to be a guest? Book a time to speak with Rebecca: https://calendly.com/rebeccahastings/discovery-call

The Lucent Group Ltd website - https://www.thelucentgroup.co.uk/

The Lucent Perspective website - https://thelucentperspective.com/

Rebecca has extensive talent and executive search experience supporting digital and technology businesses through complex changes and fast-paced scale-up periods. She works with businesses advising on C-level, technical, sales and commercial appointments, workforce planning, strategic talent management, recruitment processes and associated technology and employer brand development.

Rebecca Hastings:

Welcome to The Lucent Perspective. I'm your host, Rebecca Hastings. I've spent over a decade working with executives in the tech sector, helping successful companies build their leadership teams and scale. During my career, I've had the privilege of learning from many exceptional leaders. In these conversations, you'll get perspectives from peers, be inspired, and learn what it takes to become one of the best. This is your chance to listen to experts talking about the challenges, the solutions, and the vital insights they've gained in their careers to date.

Theo Priestley:

It's not up to the policymakers. It's not up to the big technology giants like Facebook, Google or Microsoft to dictate to us what the future should look like using technology. We're still individuals; collectively, we are actually a force, so we still have a stake in the future.

Rebecca Hastings:

I'm joined by Theo Priestley to discuss AI, the metaverse and the future of work. Theo, thanks so much for joining us. I'm really excited to hear your insights into these areas. There are just so many things we could discuss, so let's get straight into it. You've been recognized as a top influencer across blockchain, fintech, virtual reality, artificial intelligence and the Internet of Things. Which of these do you believe will be the most significant in reshaping our everyday lives in the coming years, and why?

Theo Priestley:

I think it's fair to say that in the immediate future AI will help reshape a lot of society and a lot of work, the business of work and the future of work as well, primarily because of the levels of automation and task handling these systems can take on, but also because of scientific discovery, medical breakthroughs and so on, given the amount of data they can handle and their ability to spot patterns that we often miss. Drug discovery, for example, is an area where AI has been used heavily. So I think it will impact a lot of different areas in different ways, and that will also open up a lot of opportunities for people. But obviously we have to be mindful about just how much control we relinquish to AI, and I think that's the one thing people are wary about. I also see a lot of convergence between technologies that we treat as separate silos today, like AI, blockchain, VR and the metaverse; there's a convergence happening there as well.

Rebecca Hastings:

And they're all going to impact each other in ways I'm sure I can't quite imagine yet, really accelerating the development of some of those technologies and totally changing the experience and the applications of some of them. Just the amount that you can do around data analysis with AI, applied properly, is really going to help those things evolve.

Theo Priestley:

Yeah, it's interesting the way we look at workplaces, for example. We sit at our desks and look at applications that have been designed specifically for task-based work: someone fills in a form, it gets routed to someone else, they fill in another form, bits of data get passed around. But with AI sitting on top, we actually have to reimagine what enterprise software even looks like. The desktop of today will look completely different to the desktop of tomorrow, and it's all around how we restructure data, and the analysis of that data, for these AI systems to take advantage of in faster and better ways.

Rebecca Hastings:

How we get people working with AI effectively is something I'm quite interested in. You can see it yourself when you're writing prompts in ChatGPT or Bard or something: garbage in, garbage out still applies. So you have to start thinking more effectively about how you use it, then train your workforce, and really get people to accept that this is what they're going to be doing now. Their job, in some cases, maybe doesn't exist any more, and they have to do something different in order to contribute value to the organization, and potentially to society.
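
As a quick illustration of the "garbage in, garbage out" point above, here is a minimal sketch (an illustration only, not something used on the show) comparing a vague prompt with a specific one against the same model. It assumes the openai Python package, an OPENAI_API_KEY environment variable, and an illustrative model name.

```python
# Minimal sketch: the same model answers a vague prompt and a specific prompt,
# so you can compare how much the quality of the input shapes the output.
# Assumes the openai package is installed and OPENAI_API_KEY is set; the model
# name below is illustrative only.
from openai import OpenAI

client = OpenAI()

vague_prompt = "Tell me about retraining."
specific_prompt = (
    "Draft a five-step plan to retrain a 40-person finance operations team to "
    "review AI-generated reconciliations. Name the skills gap to assess first "
    "and give one measurable outcome per step."
)

for prompt in (vague_prompt, specific_prompt):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {prompt[:40]}...\n{response.choices[0].message.content[:300]}\n")
```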

Theo Priestley:

Yeah, this is a bit like when the desktop computer started coming into financial services. People were used to working with terminals, and then desktops came in and people were looking at the mouse and going, what the hell do I do with this? People were dragging it up the wall because they didn't realize you could pick it up, put it back in place and drag it again. So it's almost a retraining of the whole workforce to think about how processes work in a completely different way, and some people are going to get left behind because they just won't be able to grasp that you can write an instruction for something to be done rather than having to do it yourself. How we retrain that workforce is something I think a lot of businesses are going to have to think very long and hard about. The knee-jerk reaction is to bring in consultants and talk about headcount reduction and cost savings, and then they say, well, we're going to retrain the displaced people and give them better things to do. But those retraining exercises cost a lot of money, which will eat into the cost savings they were looking for. So there's a really strange balance we'll need to strike here.

Rebecca Hastings:

Totally. And if you're the first company that goes out and retrains part of your workforce, those people are going to be much more employable and much more valuable to your competitors. So as much as you're thinking about the incentive of training and interesting new jobs, you're also going to have to make sure that these people feel a bit of love, probably, or you will lose them, and you'll lose that competitive advantage you've tried to build.

Theo Priestley:

Yeah, if you look at the Big Four, for example, and I think even Wipro has announced they're spending a large amount of money developing new competencies and capabilities internally. I think it was Accenture, or one of them, that announced a billion-dollar investment in their people to train them up to use large language models and GPT tools as part of their consultancy offering. And like you say, once you train someone, how are you going to protect that investment and stop it walking out the door? Do you essentially lock these people in within the organization? It's going to be very difficult. But then, in ten years' time, everybody's going to be trained exactly the same, so that competitive advantage isn't going to exist for very long. The world seems to move a lot quicker now than it did 20 years ago.

Rebecca Hastings:

I think technologies probably only have any kind of dominance for five years max before something else comes along. And who knows, that might reduce further and further.

Theo Priestley:

Yeah, if you look at the technology hype cycle from Gartner, for example, you see that sort of sine-wave-type curve. To be honest, they don't put a timeline against the bottom axis for a reason, because it seems to be shrinking ever quicker. So that adoption curve is getting faster and faster, and the rates of return slower.

Rebecca Hastings:

I'm sure there's currently some AI working on a formula to assess how long any technology has dominance.

Theo Priestley:

Well, we've got quantum computing coming around the corner as well, so that'll supersede AI at some point.

Rebecca Hastings:

And I think when you get AI and quantum together, that's just going to be immense change, if we can get that working.

Theo Priestley:

Yeah, I've already seen papers looking at quantum-based algorithms. And I think it'll quickly be the case that the most advanced organizations will have classical AI, then what we have now with GPT and large language models, and then quantum on top, so they'll be overlaying all three.

Rebecca Hastings:

The speed at which things will be able to happen will just be dramatically faster. So I know you're published in places like Forbes, WIRED and HuffPost, you've given loads of keynote speeches, and you wrote a book, The Future Starts Now, which discusses the impact technology will have on people, society and business. With your perspective as a futurist, how do you think all of this is going to impact businesses over the next decade? Or do you think the biggest impact is going to be on society? That would be equally interesting to know.

Theo Priestley:

Well, they both go hand in hand. From a business perspective, it's not just about the future of work and what people are doing; you actually have to look at the organizational makeup and the target operating model from top to bottom, even the boardroom. There are examples where people are "appointing", in inverted commas, an AI to do a lot of the strategic analysis that the board would normally do, and then taking those decisions. That's something I think even the C-suite should feel threatened by over the next 10 to 15 years. And as a consequence of business being reshaped to that extent, you then have to look at the societal impact. If the world of work changes and we potentially have structural unemployment, then what do we do with these people? Where do they go? Do they still remain in white-collar, knowledge-based work, or do they move to blue-collar work? And where does the money come from, economically? If people aren't working, how do we support them? You've got universal basic income, which people talk about, but where does that money come from? No country has an unlimited money printer. Does it come from taxing the businesses that are employing AI to maximize profits? Do we find other ways of taxation? How do we level the playing field so everybody has a chance to live, work, survive and essentially thrive as well? So society will change over the next 10 to 15 years, as I see it. And on the technology side, take AI as it is today: in the six months or so since OpenAI launched ChatGPT in November, you've seen an exponential explosion in capability. Add another five years on top of that, and the complexity of tasks these systems will be able to handle, and the work they will be able to process just through simple commands, is going to go exponential, and we're just not prepared for it.

Rebecca Hastings:

Totally. And it's amazing: whenever you see the stats about AI adoption, especially in places like the US where this is tracked more closely, it's still a really small percentage of the workforce that's actually working with AI, and it's so incredibly powerful. Why do you think this is not something every single person is using and every business is deploying yet? Do you think it's just fear of change? Is it to do with not knowing where to start with their target operating model? What are your thoughts?

Theo Priestley:

Yeah, I think it's still very experimental right now; people are still trying to work out how it's going to impact their jobs and where it should be applied. The target operating model you mentioned is a perfect example: a target operating model is made up of thousands of business processes, and it's a case of looking at your process architecture and seeing which of those jobs and tasks are the ones we should automate, versus the ones we shouldn't, the ones we should still have some kind of control over. That control can mean empathy towards the customer, or it could mean data protection and privacy. And the thing is, to train all these AI models, large language models and algorithms, you need to put in your company's data, and you need the customer's consent, whether it's a corporate customer, if you're in enterprise software, or a retail customer, like at a bank. You still have to get consent from these people to be able to train these tools in order to make your business and your service more effective. So it's not just a case of, yeah, we'll just walk in and see what happens. We really have to understand the ethical, legal and data protection considerations. What's the impact on our business, on the employees, on the operating model, on the processes, on the end customer? And what are the consequences if it goes wrong?

Rebecca Hastings:

Yeah. And so many businesses, and the technologies they work with, are incredibly complex, so it's not as simple as, oh well, I'll just put it into that process there and see what happens. That might mean other things don't work, and what looks like a quick win can sometimes be really costly to implement. It's interesting to see which companies are developing the talent to deal with this internally versus those who are, I suppose, in a bit more denial. I've spoken with people in financial services, and there's mixed uptake there, and mixed investment in building out their AI-related talent. But I don't think there's been a huge amount of thought in any of the businesses I'm speaking with about the people side of AI, because it's not just about organizational design and workforce planning; a lot of it is about the culture within a team, and, to go back to what you said, how do you make sure you retain the people you're actually investing in? There's a danger people are seen as disposable.

Theo Priestley:

Yeah, I think there's also a danger of overlooking the talent you've already got in the organization. A few years back there was this whole rush around data scientists, and yet many organizations had data analysts sitting there who could do the job, watching data scientists being hired from Silicon Valley on rock-star salaries and thinking, hang on a minute, these people are actually doing less than I am, I'm paid a fifth of that, and I know the business, whereas they're having to train the data scientist to understand the business, not just the data. So I think there are massive gaps in people's understanding of how to implement this.

Rebecca Hastings:

One of the things that's come up while we're talking about work: I recently read a study from the University of North Carolina which suggested there's a serious potential gender bias in the rise of AI. Specifically, they highlighted that the risk of job loss for women was much higher compared to men due to the nature of their office roles. There's a lot more talk around whether AI is discriminatory and how we deal with the biases in there. What are your thoughts on mitigating those risks and trying to ensure we have a fair transition into a more AI-driven world?

Theo Priestley:

It's funny that people say AI is biased, when actually the data is biased in the first place; it's only trained to understand what it's given. The root of the problem is that our own biases have crept into every part of society and every organization, and into the way we treat women, minorities and so on. The classic example is Amazon's hiring AI, which was basically rejecting women's CVs because it was trained on mostly male CVs, and male CVs use specific terminology and phrasing, so that's what it accepted as a good example of a CV. And of course, if you didn't match that, you were binned automatically. That's the problem: we've inherently designed systems to recognize data patterns from specific subsets and not be inclusive at all. I think this is where the human side still has an important part to play, in being able to design ethical systems and understand the quality of the data we're using to train them. But of course, humans are biased as well, so we have to ensure that minorities, women, people of every gender and race are included in these discussions. Otherwise we're just going to go round in circles.
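
To make the bias mechanism concrete, here is a minimal, hypothetical sketch (an illustration, not the Amazon system described above) of how a model trained on skewed historical hiring decisions reproduces the skew through a proxy feature, even when the group attribute itself is excluded. All feature names, group labels and numbers are assumptions.

```python
# Toy "CV screening" example: historical hires favoured group A, and a proxy
# feature (a keyword score reflecting phrasing more common in group A's CVs)
# leaks that bias back in even though the group column is never used.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)                # 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, n)              # true ability, identical across groups

# Historical hiring labels were influenced by group membership, not just skill.
hired = (skill + 1.5 * (group == 0) + rng.normal(0.0, 0.5, n)) > 1.0

# Proxy feature correlated with group membership.
keyword_score = skill + 1.0 * (group == 0) + rng.normal(0.0, 0.3, n)
X = np.column_stack([skill, keyword_score])

model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

for g, name in ((0, "group A"), (1, "group B")):
    print(f"predicted hire rate for {name}: {pred[group == g].mean():.2f}")
# Expected outcome: a much higher selection rate for group A, even though
# 'skill' was drawn from the same distribution for both groups.
```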

Rebecca Hastings:

Very much so. And I noticed the Linux Foundation, I think, have got an AI and ethics course out already. So it's a hot topic, and definitely something I think it's valuable for people to start educating themselves on. It's really easy to think, I am an ethical person, but when you look into ethics and decision-making there's a lot more science and frameworks behind it, and it's definitely something I'd recommend you look into if you're interested. There's also a lot of conversation out there just now about the threats posed by new technologies to businesses and people, and certainly these threats are way more advanced than before. It's much easier to finesse your phishing emails if you've got access to ChatGPT, for example, but these threats are also becoming much more diverse and they're evolving really quickly, and there are a lot of challenges there. Classically, there are things like the fake images of an attack on the Pentagon, but you can even start to play with history, I suppose, and really change how people believe and think about things. It looks like, and certainly a lot of people feel like, the regulatory and compliance systems are just not keeping up with the pace of change. I'm sure you've been thinking about this, because if you don't know what changes are coming, it's really hard to bake that into your product when you're developing it. So could you share your thoughts on the regulatory landscape and how you think this is going to impact tech businesses, or technology within businesses? What are the key things we should be looking out for?

Theo Priestley:

Historically, the regulators and the people who create the legislation have always been far behind, because the technology is released and then it runs, and the legislators are always walking at a slow pace trying to understand it. With AI, certainly from what I've seen in the last six months, I think we've had the first example, from a global perspective, of regulators sitting up and going, oh, this actually has a material impact not only on us but on people in general, and on the harm it can do just by the average person picking this tool up and doing something like you say: producing fake imagery, producing fake documentation, or altering what someone is saying and changing the words they use to suit a narrative they want to promote. We are seeing the regulators sit up here. And I think people who have internal regulatory and compliance functions should start looking at updating their own policies and procedures in line with how the regulators are having these conversations, and at how it impacts not only the business but the industry in general, because you want to be seen to be leading your industry now rather than taking a step back and waiting. These technologies will happen, and they will happen very quickly; you can't sit and wait for someone else to make a decision these days. I remember the mortgage application regulations a few years back, around 2010 to 2012, something like that, when a lot of businesses were trying to get ahead of the FSA by doing internal work to understand, if this is the new regulation, how are we going to apply it, and essentially second-guess what the eventual consultation papers would become. We need to do the same again with AI, with quantum computing, with cryptography and so on: actually take that step forward and put those policies in place before anyone else has to think about it.

Rebecca Hastings:

I totally agree. And I think we have an added challenge to think about in the broader context. One of the things that's interesting in the UK is that we have our two houses, the Commons and the Lords, so there are double checks and it's a lengthy process to get legislation through. Certainly when I was studying law at university, the view was very much, well, it's good that we put a lot of thought into these things, it's great that we take time to think, and that has value. But I think there's going to have to be a change in the dynamic in some areas, where we start moving faster in how we create and implement legislation, or how we build flexibility into it without there being any ambiguity about where the lines between criminality and legality lie, and how we protect people and their data. And at the same time, where's the balance between that and stifling innovation? Because if we're not innovating, then other countries with less regulation will outpace us.

Theo Priestley:

Yeah, regulation should always be seen as a set of guidelines, and guidelines have soft barriers, boundaries and safeguards. You shouldn't really wait for them to become verbatim or perfected. If someone issues a guideline, you go away, you test that guideline, and then you go back and adopt or adapt depending on what your business is, rather than waiting, like you say, for the legislation to go through several iterations and then be passed as law. You should always see guidelines as a first draft, or a very loose framework for you to work around, and use them almost like a knowledge base for yourself and your own business, to safeguard your business, safeguard the customers, safeguard the employees. That's the way I look at legislation these days, and especially with the pace of change in technology, you cannot wait for the final law to be passed anymore.

Rebecca Hastings:

Yeah, so making sure that safety and safeguarding is at the forefront of your decision-making and what you do, but also at the forefront of how legislation is prioritized, potentially, if we're going to get into that situation. So, shifting on to education. We've talked a lot about how the skills you need are clearly changing, but there are three things I see as constant just now: execution, critical thinking and self-awareness. Let's focus on critical thinking. What do you think are the best ways to identify and evaluate critical thinking skills in people and potential employees? Because that's what gives you the edge really, isn't it?

Theo Priestley:

Yeah, if we're automating tasks, then we have to look at the things that make us human: critical thinking, empathy, strategic thinking, imagination and creativity are the things that will actually make you stand out as a candidate, but also as an employee and as a person in general. And looking at education, for example, we have to start looking at how we bring the social sciences and the arts and humanities back into the curriculum. Rather than treating them as electives, they should be baked into every single course as standard, as the default, rather than something you opt in to take, because these are the things that are core to who we are as a species and a civilization. I do think education is in trouble, especially just now. I think there was a report basically saying that Harvard are looking at creating a professorial-type AI for every single student, in which case it's like, well, what is education then, and why do I need to pay a university fee if you're just going to give me a chatbot to tailor my tutoring for me? So education, I think, is in real trouble here.

Rebecca Hastings:

I definitely think the education sector has got to do a lot to keep pace. But I wonder if, as a society, this is going to make us look more closely at early years education; there's a lot that is baked into people by the age of, I don't know, eight, maybe, something like that. We take early years education for granted because it's free in the UK and you have to go whether you like it or not, but in certain countries that's actually a luxury. So how far are they going to be left behind, if critical thinking skills are really important and maybe you've got parents who haven't had formal education with children who don't have access to reliable and consistent education? I think we need to think about early years education, but beyond where we are as well; it's something that could create problems for us even in our own small society if we don't look at it globally. On critical thinking, I speak to people and one of the things they've experienced is that it's not just that problem-solving skills are diminishing; in some cases I've heard that people's ability to spot problems is also starting to lessen. How can we encourage and help develop these skills within our teams?

Theo Priestley:

I think it's a by-product of the over-reliance on technology, and what we need to do is actually start stripping away a lot of the tools that we take for granted and force people to start thinking. You've also got systems thinking and all the different types of frameworks that have existed before, and it's probably well worth designing courses and putting people and teams through that kind of training to give them those skills again. It's not enough to just say, oh well, it's on the decline, or people should look out for it themselves. As part of continuing professional development, go through a systems thinking and design course and actually start looking at problems in a different way, and coming up with solutions in a different way. Otherwise, what people are going to do now is use ChatGPT and large language models to ask, what's the solution for this? It will come out with a solution, but you won't actually have gone through the learning process of figuring it out, and that just makes a dumb workforce.

Rebecca Hastings:

So there's a lot that's going to change in the future. We've talked a great deal about AI; I'm interested to know a bit more about the metaverse. You were CEO of a company called Metanomic, a metaverse analytics platform, which was sold in, what was it, early 2023? Tell us, how do you see the metaverse evolving in the next decade? Because it seems to be something that's quite divisive: people have quite controversial opinions, and some people don't really know that much about it.

Theo Priestley:

Yeah, I think it's important to say that the metaverse is not a destination. It's not a virtual world or a video game; it's not Roblox and it's not Fortnite. We don't have to sit and log into our computer, create an avatar and run around collecting coins. That is the common misconception a lot of people have. We should have learned our lesson 20 years ago when Second Life was developed, because Second Life came out, businesses spent a lot of money in it, and then pulled out when they realized there was no return on investment, pretty much because they didn't understand what they were doing. The metaverse is actually layered realities. We have physical reality, which we inhabit; we have the internet, which is Web 2.0; and we have virtual reality, augmented reality, mixed reality and so on. We've got spatial computing, which is Apple's new paradigm. The metaverse is all of it encapsulated. As a customer, as a business user, as a business, you should be able to think about how your business extends into each of these layers, so that a customer can use any of them at any point in time to engage with your business and have one kind of experience, or multiple experiences, at the same time as other customers or clients are having that experience using their own layer. The way I look at it, especially for marketers and brands: Nike is a really good example. They have a physical store with an augmented reality aspect to it, so you can bring your phone and interact with products, and things will pop up and teach you about them. But you can continue that journey at any point in time, go to the website to buy the pair of shoes, or go to Nikeland in Roblox and have fun, and it's still all tied to the same experience, which is the brand value and the shoes, teaching you about them and being able to purchase something. That's how I see the metaverse: an extension of the brand into many different types of realities that a customer can experience wherever they feel like, and they can complete their transaction in whichever one, so they could start in one and finish elsewhere. Whereas right now a lot of companies think, I should be building a virtual store in Roblox and that's my metaverse, and that's not the case at all. So over the next decade I think we'll see people waking up to the fact that you have to build in multiple places. It's a long-term commitment, which means significant marketing and technical infrastructure to consider. But the reality is that as the technology adapts to be able to create these experiences, more and more customers will start to demand to experience them in those realities as well.

Rebecca Hastings:

That's really interesting. And who knows, there might be a time where it's preferable to interact in the metaverse; maybe people will want to join a podcast in the metaverse and sit alongside the discussion.

Theo Priestley:

Well, if you look at filmmaking, for example, I can actually see volumetric filmmaking becoming a thing. James Cameron has been pushing 3D for a while now, and the thing is, with 3D and spatial computing and cameras, there's nothing to say I couldn't experience that wearing a headset and actually be within the movie, watching it from any vantage point and experiencing it in a completely different way. And who knows, business could be the same.

Rebecca Hastings:

It will be interesting to see whether or not that takes off. I can imagine it'd be really popular with some people, and certainly way more immersive. People tend to watch a movie once; with that kind of experience, they'd probably watch it many times from different vantage points and things like that, so you could maybe monetize that somehow, so you have pay-per-view but people view more often. It will be interesting to see if that happens. Now, I want to ask you something. When I was looking at your website, I saw you describe yourself as the world's first anti-futurist. What exactly is an anti-futurist?

Theo Priestley:

So I tend to look at a lot of futurism, and a lot of the practice is very much cheerleading. It's, yay, we're going to have brain chips and we're going to live to 150, and blah, blah, blah. I tend to look at it from a more pragmatic and realistic point of view of what futurism should be, which is: what are the possible timelines and futures, which are probable, which are preferred and which are not preferred. It's always the role of a futurist to actually chart these out and give people the choice of which future they want to build. But a lot of futurism, especially from people who call themselves futurists, tends to go for the really happy path and say this is the one we should go for. Of course, that involves a lot of technology, and it's also driven by a lot of the narratives that these technology companies are creating. So it goes back to critical thinking: why do we want this particular future to happen? What are the consequences we're not thinking about? I see anti-futurism as a way of actually examining the consequences of the happy path, because it can't all be happy. Living to 150? Well, what are we going to do with the extra 75 years? Where's the money going to come from? In an economic sense, am I expected to work another 25 years, or whatever? That kind of thing.

Rebecca Hastings:

How many hip replacements and knee replacements?

Theo Priestley:

Well, exactly, yeah. There's longevity, and then there's quality of life.

Rebecca Hastings:

Yeah, that's very true. Who knows, AI may come up with solutions to some of those things, and we may want to live longer. It's interesting that you look at it as presenting people with choices, because what I experience, and maybe I've fallen into a rabbit hole, is just so much doom-mongering: we're going to have robot armies, AI will come up with new chemical weapons that will wipe us out. Hopefully, with ethics applied, these things won't happen. But what do you make of all that doom-mongering that seems to be happening around technology innovation just now?

Theo Priestley:

Well, it's like filmmaking. If you don't have an antagonist or a protagonist, if you don't have a conflict, then it becomes very boring. So the doom-mongering, I think, is a bit like a movie, where you have the narrative that makes people sit up and take notice: we're all going to die, AI is going to design a virus, and so on. The only problem is that the louder it gets and the more it proliferates, the more people switch off, because it's just, well, this is too fantastical, it's never going to happen, I'm getting bored of hearing about this, tell me something new. And again, this is where the choice angle comes into it: here's the choice, everybody has a stake in the future. That was part of the reason why we wrote the book, to wake people up and make them appreciate that they still have a choice in deciding the future they want to see. It's not up to the policymakers, it's not up to the big technology giants like Facebook, Google or Microsoft to dictate to us what the future, or what policymaking, should look like using technology. We still have a voice, we're still individuals, and collectively we are actually a force, so we still have a stake in the future. It's about giving people that appreciation of choice and the different paths and decisions that can be taken, again, back to critical thinking, decision-making, systems thinking and so on, baking that into people's everyday process and questioning the decisions being made for us.

Rebecca Hastings:

It's interesting that you talk about people having choices and realizing that, at a time when some people are feeling quite helpless. All of this technology, these platforms, actually give you even more empowerment if you leverage them properly, so you can really take advantage of that.

Theo Priestley:

Yeah, everybody has agency; I think it's about understanding just how to activate it.

Rebecca Hastings:

So before you go, I just want to ask you one last thing. How do you manage to stay up to date with all of these emerging technologies and trends? What are your tips for someone who wants to keep their finger on the pulse? Because you really are quite prolific in this space.

Theo Priestley:

I'm a voracious reader. I look at what conversations are happening on social media; obviously I engage in different groups across Reddit and things like that. I think you just have to go to where the conversations are. I read an awful lot from decent websites; I don't go to The Sun or the Daily Mail or whatever, because you're never going to get good, informed opinion there, certainly not on current technology. So it's really just a question of reading what's happening. If you see something that's interesting, go and find out a little bit more about it, find out which companies are building it, understand what their thought process is and why they're doing it, because it's always good to understand the motivations behind people building products and going in a particular direction with technology and trends. Then add a little bit of critical thinking and extrapolate what it could mean for you as a person, what it could mean for society, and what it could mean for the business you're building as well, and look at both pros and cons. I know it sounds like a lot, but it doesn't take that much time; an hour a day is enough to stay in touch. Add up how much time you waste on social networks, and basically just take that time to learn instead.

Rebecca Hastings:

Great tip. Thanks so much for sharing all your insights, Theo. It's been really interesting and hugely valuable. We'll make sure that links to your websites, your book and some of the other things we discussed are in the description below, so if anyone wants to get in touch with you, they can reach out.

Theo Priestley:

Yeah, perfect. It's been a pleasure. Thank you.

Rebecca Hastings:

Thanks for listening to The Lucent Perspective. I'm Rebecca Hastings, founder and director at The Lucent Group, a tech sector executive search and talent consultancy. If you enjoyed this episode, please subscribe, share it with others, post about it on social media or leave a rating and review. If you're a company looking to hire top technology leaders, or you'd like to discuss your next move, please reach out to me on LinkedIn or send me an email at rebecca@thelucentgroup.co.uk. Thanks again for listening today.