The Examined Life
The Examined Life podcast explores the questions we should be asking ourselves with a range of leading thinkers. Each episode features a different interview, and appeals to those interested in wisdom, personal development, and what it might mean to live a good life. Topics vary from discussing the role of dopamine mining and status anxiety, to exploring the science of awe and attention.
Tom Chatfield - What myths are we telling ourselves about technology?
Technology is taking on a mythic mantle as we look to our creations to supply us with a sense of belonging and purpose, but this is a category error because tech cannot honestly deliver on these promises. In this podcast Tom Chatfield explores some of the issues bound up with the ways we are thinking about technology.
• Technology is not a bolt-on or optional extra, but has been integral to human existence since before our species evolved
• The delusion of neutrality allows us to abdicate responsibility for design choices and embedded values in our tools
• Technology has affordances that push us toward certain behaviors – email "wants" more emails, cars "want" highways
• The delusion of determinism suggests technology drives history along a predetermined path, diminishing human agency
• We've confused progress with salvation, imbuing tech with religious qualities like transcendence and apocalyptic narratives
• Understanding ourselves as "dependent rational animals" helps us appreciate our fundamental interdependence
• Each new generation must be taught a way into modernity, allowing them to question, change, and remix our culture
• Being a "good ancestor" means considering how our technological choices will impact future generations
"Even if you're the richest person in the world, let alone the poorest, you don't have perhaps as much leverage as you might wish to. Nevertheless, that's what you've got, and it does no good whatsoever to say, therefore I have no power, no control, no insight, nothing to give. You do what you can within the limits of what you can know and bring into being."
Once you start digging into the way that technology is talked about, people talking about being blessed or cursed by the algorithm, people talking about uploading and immortality and transcendence, you realize very quickly that technology is taking on this mythic mantle. I think this is a category error, because we are looking to these creations to supply us with that sense of belonging and purpose, that sense of greater belonging, that sense of our relationship with something larger than ourselves. But it cannot honestly supply those. It's an illusion, it's a mirage.
Kenneth Primrose:Hello and welcome to another episode of the Examined Life with me, Kenny Primrose. Today I am in conversation with Dr Tom Chatfield. Tom is a writer and a philosopher of technology. He's written books like Critical Thinking and How to Think, his novel This Is Gomorrah and, most recently, Wise Animals. In today's discussion, we talk about what we get wrong about technology, about the stories we tell ourselves about technology. We talk about myth, salvation and what it means to live well and deliberately with technology in our lives, in our communities and societies.
Kenneth Primrose:I found it a really helpful conversation, one that drew attention to things that often fall below the surface of our conscious awareness. I hope you will enjoy it too. This is going to be the final episode in the current series, though I have a few podcast conversations to drop in the coming months, and then I will hopefully be producing another series come autumn. If you have enjoyed the series, then do please leave a review, or share it with someone you think might enjoy it, or on social media. That's very much appreciated. I hope you enjoy the conversation. Now then: Tom, thank you so much for joining me on the Examined Life podcast. It's a joy to be speaking to you.
Tom Chatfield:Thank you very much. Brilliant to be here.
Kenneth Primrose:So, as you know, you've been briefed that the point of this podcast is that we explore a question that an influential thinker such as yourself believes we would do well to be asking ourselves. What is that question for you? And then we can begin to unpack it.
Tom Chatfield:I guess the key question for me is: are some of the stories that we most commonly tell about technology wrong or dangerous or misleading? And linked to this is the question of what stories we should tell about technology. I use the word story because I think a lot of people, when they talk about the singularity, AI, technology and governance, all these important things, don't necessarily think in terms of stories and narratives. People think they're dealing in objective facts. My contention is that not only is the idea of storytelling incredibly important, but that it's the story that makes the difference; that buried within so much of the discourse there are a whole bunch of narrative assumptions, and it's really important that we examine these and are cognizant of these, and maybe try to be wise in regard to them, if we want to have hopeful, humane, ethically literate discussions about the place of technology in our world.
Kenneth Primrose:Wonderful. It's an incredibly rich place to begin. What is it we get wrong about technology, or what story are we telling ourselves that's problematic, and what should we be telling ourselves instead? Well, in your book Wise Animals you begin with the story, as you see it, of technology right from hunter-gatherer days, or indeed before Homo sapiens evolved. So can you sketch out the story of technology as you understand it?
Tom Chatfield:So the place I start is trying to deal with, I guess, the foundational myth that technology is a kind of optional bolt-on or extra: it's gadgets, it's stuff we pick up and put down, it's just tools that we use. The important thing for me is to recognize that, in fact, technology is older than the very existence of our species and that there is not, and never has been, such a thing as human existence without technologies. If you go back three million years, you see very early hominins, way before Homo sapiens existed, beginning to use sharpened stone tools, beginning to create, moving towards cultures not just of tool-making but of shelter-making, of container-making, of hunting, and so on. And the point is that by the time you get to things like fire, by the time you get to things like sophisticated stone tools which are carefully crafted, you've moved beyond anything that any other creature does. You've moved beyond the opportune use of environmental offerings towards a shared, iterated, evolving and improving culture based on knowledge, based on imagination, based on sociability, and this transforms the course of our evolution. Fire, for example, creates artificial heat. It allows our species to get more calories from food. It allows us to shed hair and stay warm in the absence of shelter or bodily heat. It allows us to drive off prey. It allows us to make other tools by hafting, by creating glues and things like that. It's bound up in very complicated ways that we can only partially explore with language and communication and sociability, because it goes beyond the individual.
Tom Chatfield:Then I think this helps us get away from the idea, which is very pernicious, that tools are neutral, that we just pick them up and put them down, that it's how you use them that matters. And it takes us, I think, towards the more challenging and interesting idea that in fact the human-made world, language and tools and things like that, is very far from neutral. It has a whole load of values and assumptions and affordances bound up with it, and what we need to do collectively is this very difficult and interesting thing of having an informed negotiation with the tendencies of our tools. If I want to kill you, a gun is much better than a pen. If I want to write a poem for perpetuity, or write down how to make a gun, I'm probably going to need a pen rather than just to try and shoot it into a wall. This becomes incredibly important once we get into the modern era and the digital age, when we have increasingly autonomous tools, these vastly complex systems surrounding us, and this idea of an informed negotiation seems incredibly important to me, because technologies like artificial intelligence, big data, email and so on have all kinds of values and assumptions bound up with them. And that brings us to the delusions the book is aimed against.
Tom Chatfield:I call them delusions because I think of them as a form of false consciousness, a kind of oversimplified or wishful narrative. One is the idea of determinism, that technology drives history along a preordained course, like a kind of railroad pointed at the future. This seems to me very dangerous because it seems to diminish the idea of human agency. The idea that we should just get out of the way, let innovators innovate, and what will be will be completely overlooks the chaotic, complicated, reflective, belated business of looking at what has happened and then trying to modify and debate and mitigate and design and control.
Tom Chatfield:And the second thing is this idea of neutrality, that it's just how we use these tools, that they themselves are neutral, and this strikes me as a very dangerous abnegation of responsibility. It lets us all off the hook, lets designers and makers and maintainers off the hook because they no longer have to answer for what they do or don't build into their tools, or what they fail to consider. So you can't really hold people responsible or expect them to behave differently. So this is where I start with this big picture.
Kenneth Primrose:Fascinating. So you've got these two kinds of delusions, the things that are problematic: one of neutrality, that our technologies are neutral when they're actually value-laden, and the other of determinism. I wonder if we can take the second one you mentioned, neutrality, first. Can you give an example of a technology that we use that we assume has no values embedded within it, but that clearly does, to your mind?
Tom Chatfield:Yeah, absolutely. It's a funny way of thinking, because this can sound very strange and then, once you get into it, it can become quite obvious or irresistible. So one trick which the author Kevin Kelly has used particularly well is to ask what a technology wants, what it wants you to do. Now, it doesn't want you to do things in the same way that I want my children to work hard or my numbers to come up in the lottery, but it wants you to do things in that it facilitates and pushes you towards certain behaviors. When I'm in a car, the car wants me to drive at speed, using fuel, along tarmac roads through the landscape. That's what it affords, that's what it offers to me. It does not want me to walk very slowly, looking around the place, and go cross-country across fields. A car wants, or needs, there to be a place to park it. It wants large amounts of urban acreage to be devoted to places to put cars. It probably wants me to be able to go to big, centralized shopping places. It wants to be on a dedicated private infrastructure, and so on and so on. If we imagine an American city like Los Angeles, which is today built, designed and maintained primarily around the affordances of motor vehicles, the offerings of motor vehicles, it's a massive sprawling grid of highways and parking lots, with the more desirable housing areas designed around the affordances of people who can afford to walk to the shops and have a stroll. If we don't think about these things, it's easy to say that the car is inevitable, that we have no choice, that every city must be designed around these things. Whereas, of course, if we put it another way and say, well, what do we want out of urban space, we can get into this idea of a negotiation with it. We may wish to actively design around the affordances of the human body, of what human beings can do with their legs, the idea of a walkable city, or a bicycle or a moped or a transit system; and of course Los Angeles used to have a wonderful transit system, which was dug up to make way for cars. And perhaps more insidiously, we can pick a technology like email, and it may seem very weird to say that email wants anything. It's just email. But of course it has its affordances, what it offers.
Tom Chatfield:Email is an instant, free way to send messages, so what email wants you to do is send emails. It's cost-free, except in the time it takes me to reply to it with another email. I can copy in 50 or 100 or 2,000 people, and when I'm away from my inbox I can set up an auto-responder that will email people to tell them that when I'm back I want to send an email. Compare a physical letter, which has very high friction, which in a sense wants or needs me to wait several days before replying, and to use rather laborious and perhaps even expensive supplies of paper and ink and concentration.
Tom Chatfield:It's not, then, that email is bad and letters are good, or anything like that. But it is that, if you think about people's jobs and working lives, for many people much of their professional life looks like being a professional inbox-emptier, whereby you have an inbox, which is literally a box, and day and night, seven days a week, because of course we have always-on devices, messages come in and alerts come in and communications come in, and then you go through them and you get rid of them, either by ignoring them or by replying to them. And so everybody is filling up everybody else's inboxes and, from a certain point of view, much of our professional lives is predicated upon the idea of instant, ubiquitous, cost-free but attention-hungry communication. Once we've seen this, it's not that we can click our fingers and escape the system surrounding us. Of course we can't do that. We do this much more difficult thing of saying, well, okay, to what degree is this working for me, rather than me accidentally working for it in ways that don't work for me? Do I want to have protocols around email? Do I want to shift communications to other forms, or communicate in different ways? Do I want to occasionally talk to people face-to-face, or phone them up, or not send them emails, and so on? And so, once we see technologies having wants and desires and pressures, I think we begin to see the human-made world as it is: a landscape of offerings and withholdings.
Tom Chatfield:And again, with data a very simple example we're translating our world into something machine-readable. We're gathering petabytes of data about geography, about people, about behavior. Not much of this data is about weight or smell. Not much of this data captures the smell of new mown grass or the whiff of the tang of pollution or the sensation of crunch under my feet as I walk down the street. And this isn't bad or good, but it would be a very grave mistake to confuse the map with the territory, the data with the reality, because part of my experience of a place is my bodily experience of it.
Tom Chatfield:The data reveals some things but conceals some things.
Tom Chatfield:When I drive through a city guided by SatNav fed off data from reviews, my experience of that urban space is gliding around in a large, powered, air-conditioned, insulated, expensive box, looking at data run by very large transnational for-profit companies about highly reviewed places that may or may not have paid to optimize themselves for algorithms.
Tom Chatfield:And when I glide to a highly rated coffee store, buy, using an app, a highly rated cup of coffee, log on to the Wi-Fi, answer my email, get back in my air-conditioned box and drive 25 miles to my air-conditioned other box, this again is a way of living that would, of course, have been utterly bizarre and incomprehensible to human beings 200 years ago and that in 200 years may be equally bizarre to our descendants. It's not inevitable, it's not neutral. It does good things and bad things. It has environmental and social consequences. And if we don't talk about these things, if we just treat it as normal, full stop, unchallengeable, full stop, then really we fail to participate in the very difficult, very important ongoing series of conversations between people about what we want, what we need, what we are, what we deserve.
Kenneth Primrose:That's really helpful, those are really helpful examples. Thank you, Tom. It's interesting you say it's not bad or good, because, to say something about my own disposition, I think of it as inherently a bit negative to be disembodied. I am more human when I realize that I'm an embodied creature, and the abstract, or mistaking the map for the territory as you say, seems like a dehumanizing thing. Therefore I'd be tempted to say it is potentially a bad thing, and it has something to answer for in terms of social connection, mental health and so on.
Tom Chatfield:Absolutely, and I think what I'm trying to say is that it's not simply a bad thing or a good thing. There's not a crude dichotomy whereby a technology or a tool is either a bad thing, and we should try and get rid of it, or a good thing, and we should just accept it. I think you've hit the nail on the head, really, in that we can explore the ways in which it is life-enhancing, thickening our sense of self, in which it is good according to certain criteria of goodness, and in which it is bad or negative by other criteria. But I think it's really important to show your working, so to speak. One dangerously easy thing, I think, is to treat yourself as disembodied, to thin and potentially diminish your physical relationship with the world and with others, to perhaps underestimate the degree to which you are an embodied creature of flesh and blood, a mortal creature, a sociable creature, a loving creature, an ethical creature. A machine is a very poor analogy for a life or a mind. However, if we go too far down this path, it's very easy to ignore the fact that most of my ancestors lived very short and unpleasant lives, and that for them, to be able to get into a clean, air-conditioned box, fly across the ground, go and have delicious, clean, safe food and water, and then go back to a beautiful, safe home full of books and music and screens that can summon their loved ones, this would be beyond the dreams of paradise. Of course it isn't paradise. One of the reasons it isn't paradise is the fact that some people get to do this and some people absolutely do not, and are diminished and demeaned in their dignity, their autonomy and so on. So what I mean is we shouldn't be lazy in our dichotomies and judgments. It behoves us to show our working and to try and make the ethical case and tell the story explicitly. The very idea that none of this is neutral implies that none of it is ethically neutral either, that there is ethical content to design decisions, to the weight of history pressing down upon us.
Tom Chatfield:And much of our knowledge of this is belated. Paul Virilio once wrote, to paraphrase him anyway, that to invent the train is to invent the train wreck, and to invent the ship is to invent the shipwreck, and this is very perceptive. Our knowledge is often belated and imperfect, but, crucially, that doesn't mean it's useless. It doesn't mean we then throw our hands in the air and say, oh dear, well, there we are, nothing we can do, if only we could have seen that coming. No, then we have this very difficult, let's say, duty, this moral burden.
Tom Chatfield:Despite our limitations, our imperfections, our belatedness, our conflicts, we still have to try to do what we can with our knowledge of the world and each other. And this is a point I think some of your other interviewees have made beautifully. I'm thinking of Oliver Burkeman, for example, who, among many other things, has written beautifully about the fact that when it comes to life and ethics and so on (I'm paraphrasing him), step one is to appreciate that you don't have much time, you don't have much knowledge, and even if you're the richest person in the world, let alone the poorest, you don't have perhaps as much leverage as you might wish to. Nevertheless, that's what you've got, and it does no good whatsoever to say, therefore I have no power, no control, no insight, nothing to give. You do what you can within the limits of what you can know and bring into being, and you do so, of course, always alongside others, even when you don't think so.
Kenneth Primrose:That's helpful. It definitely brings to mind Marshall McLuhan's questions of what a technology extends and, if it's overextended, what it amputates. A really helpful point, I think, when stepping back and thinking about how technology is forming us. You mentioned Kevin Kelly's What Technology Wants. Does technology, or at least do certain technologies, want us to think we have no agency? I can see why this is a common delusion, if you will. Certainly during COVID you needed a smartphone to use QR codes; there are lots of ways in which it's really difficult to function without the technology that insists on itself, as it were.
Tom Chatfield:Yeah, and there's a fine line between what a technology wants and what the people who make and maintain and design the technology want and, of course, what they want you to think. And I guess there's an almost trivial but very important point that, to paraphrase William Gibson, power and knowledge are very unevenly distributed. Agency is very unevenly distributed. You're absolutely right that for an individual citizen today, even a very privileged one, a lot of the technologies surrounding them are opaque, ubiquitous, powerful, manipulative. Importantly, they are not natural or inevitable. It doesn't mean that that person can change them, it doesn't mean they're easy to change, but it does mean that to some degree there are makers and maintainers and designers behind them, and we can tell these stories. So, for example, as a very simple point, when people are researching things like social media and artificial intelligence and are telling some of the hidden stories of those people who, for example, moderate content, those people whose job it is to look day after day at appalling and terrible and desperate and disturbing things in order to tidy up the results of searches, of artificial intelligence inferences and so on, this is a human story about workers somewhere in the world sitting down and doing a job. And if that story is surfaced, it may be that the conditions those people work in will eventually in some ways be changed, or that we will understand that what we are using is a material thing, not an abstract, magical, floaty thing. And of course, it's enormously in the interests of those running very profitable corporations to efface this, to use words like cloud, to abstract things, to create, often, illusions of choice. Now, it doesn't mean that I am simply anti-tech and think it's all bad. Of course, a lot of the incentives and things going on are not overtly malicious; they're to do with people simply trying to generate profit or generate retention.
Tom Chatfield:Jaron Lanier, a great author and critic of technology, but I think a loving critic, has made the point that algorithmic optimization can get people stuck in a psychological rut not as a product of malice, but almost as a product of just the optimization of the system. If you tell a system, I really want to try and get this person staying here for longer and being more engaged and clicking more and doing more and giving us more data, and this is how we make our money, then that system, if it's smart enough in an algorithmic sense, will try and trigger people in various ways emotionally. It will try and find the stuff that is most engaging to them, that is most either confirmatory or arousing and so on, and potentially, at least as a product simply of fundamental design principles, it will get them stuck in a rut, it will get them narrowed down. And this is where the idea of memes and memetics is very useful, the idea that our minds, if you like, can be hijacked by certain patterns, by certain self-replicating and self-reinforcing modes of behavior. I'm cautious about this, though. The reason I'm cautious is that, as I said, it all depends on the story you're telling.
Tom Chatfield:If you focus on this narrative too much, you can very quickly end up as a kind of epiphenomenalist, someone who thinks that human freedom and thought are really just after effects like steam whistling out of an engine, and that actually there's a purely behaviorist or mechanical account of us and I think, again, in a weird way that mistakes the map for the territory.
Tom Chatfield:Once you've kind of drawn these diagrams of people clicking and engaging, and clicking and engaging, you end up thinking that we are just algorithms plugged into algorithms, when in fact, if you look outside of this, again and again you see that these allegedly perfect manipulators, these allegedly perfect predictors, are really bad at long-term prediction. Algorithms, advertising, social media networks are really bad at predicting and altering people's long-term behavior. And, very boringly but very importantly, what is good at shaping people's behavior is kicking down their doors and threatening them with guns. Technologists sitting in safe, warm rooms often forget that the world of physical threat and love and hope and fear and violence and need, in this deep biological sense, is not erased or magicked away by the magical algorithms, and that the world's most powerful and expert manipulators understand that power is a bodily thing.
Kenneth Primrose:Yeah, I've heard you push back a bit against the narrative that we are ensnared in this kind of dopamine-mining system where we lose our agency. You'd like to say that's actually overselling it, that we can step back. Well, maybe we can, but we lose our agency more profoundly when we are threatened and abused.
Tom Chatfield:And if you want to understand human agency and motivation, I think you can't ignore that kind of bodily, geopolitical arena. There are so many technological discussions that seem to forget we're creatures that live in a world of politics and so on. I was reading a post by Andrew Brown recently, the author and thinker, who talks about, I suppose, the delusion that AI and technology is the thing that makes the difference. He makes a very brutal reference to certain political tactics when he says that a lot of technologists debate cryptography as though the crucial thing determining whether you can or can't get into someone's email is how strong the cryptography is and whether you can work out their password. But of course, in totalitarian regimes around the world, under Assad, the crucial thing that determined whether you could get into someone's accounts was whether the secret police were standing in front of you threatening to execute your family unless you told them your password.
Tom Chatfield:And I'm not saying this for shock value. I have to kind of factor this stuff in. This is the hope worth wanting, the freedom worth wanting, because then we can tap into our knowledge of history, which tells us many, many terrible things. It tells us that people have done absolutely appalling things to one another, but also people have, against all hope and reason, resisted and people have borne witness and people have endured. And that's the real stuff, that's the real context. If we lose sight of that, we're simply not describing the human animal.
Kenneth Primrose:So we have these delusions of determinism, that we don't have agency, and of neutrality, that the tools and technology we use are neutral. Really helpful to parse those out. You also write about technology kind of fitting the mold of religion in a sense. John Gray, of course, writes about the illusions of progress being a lot like Christianity in terms of its salvific potential. I wonder if you could say a bit more about that as one of the delusions. You know, when you compared Neanderthal man imagining screens and cars and safe, clean water and food and so on, it does sound like salvation in a world full of dangers. So this has kind of arisen as a belief that seems to have sedimented into at least a lot of Western culture. Is that how you see it?
Tom Chatfield:That's right. Yes, I think the problem is confusing progress with salvation. John Gray specifically talks about process theologies, which are really, I think, a 19th-century American idea about temporal salvation: salvation not just as something belonging to the realm of the eternal and the transcendent, but salvation as something that is being built on earth, the idea that humanity is in the process of becoming something exalted or transcendent. This sounds a lot like the idea of the singularity, the technological singularity, the term taken from Heinlein's science fiction: sufficiently advanced superintelligences or powerful AIs will design more powerful AIs, which will design more powerful AIs, and so on. It's like crossing over the event horizon of a black hole, a singularity: a point of historical no return beyond which, in a sense, technology's relentless and accelerating self-improvement dissolves history as a human-inhabited process and either saves us or damns us, or perhaps a bit of both. And for people who believe in this, one of the crucial questions, which is profoundly theological, is then: are the things we're doing right now, are the AIs that we're building, is the way that we're approaching this, aligned with the emergence of a benign superintelligence that, on some level, will guard our best interests and take us towards a version of the rapture or heaven? Or are they aligned with a non-beneficent superintelligence that may, and some people would say for perfectly good reasons in its own terms, view us as more like a virus or an infection, and seek to enslave us or purge us? The world of the Matrix, again highly influenced by theological and philosophical ideas, is one in which machine overlords snare us in a kind of delusory hell.
Tom Chatfield:For me, the singularity is a myth. It may have a lot of truth in it, but its impact, its structuring, its narrative is fundamentally mythic, in that it gives vent to a kind of atavistic fear that we are giving birth to our own destruction, that we are, in our technology, birthing either salvation or destruction, but that either way there is an apotheosis or rapture coming, that we're in end times. And the thing about apocalypse and judgment, of course, is that they trump all earthly ethics, as we know from Calvinists and others, or indeed from other kinds of extreme faiths, and so on. If you believe that a day of judgment is coming and all that matters is the state of your soul or the predetermined allocation of your lot, then you can do anything on earth you like; or rather, the only criterion for your deeds on earth is their alignment with this ultimate state of transcendent affairs. So these ideas recur again and again. And once you start digging into the way that technology is talked about, people talking about being blessed or cursed by the algorithm, people talking about uploading and immortality and transcendence, you realize very quickly that technology is taking on this mythic mantle. I think this is a category error, because we are looking to these creations to supply us with that sense of belonging and purpose, that sense of greater belonging, that sense of our relationship with something larger than ourselves. But it cannot honestly supply those. It's an illusion, it's a mirage.
Tom Chatfield:And the very important contrast for me is between, as I say, progress as a limited thing, progress as a contingent, contestable achievement. It's the difference between progress being an inevitable march towards heaven or hell, which is a fundamentally mythical, religious idea, and progress being the very difficult business of saying: okay, I think it will be a good thing to reduce the number of children dying of disease, and I'm going to take these steps to try and make that happen; it's going to cost this, it's going to need this, it's going to do this, and I'm going to try and do it, and I'm prepared to defend it because I believe it's better than the world in which lots of children die of disease. Or, perhaps more challengingly: I believe that all human beings, no matter what their gender or orientation or race, should have equal rights and should be defended in certain ways. Or: I believe that sentient life, including animals, ought to have a certain dignity and autonomy, and I'm going to fight for that, and so on. And this kind of contingent, contestable, secular view of progress is powerful and useful precisely because it is concrete and debatable, and it doesn't have to bear the mantle of these vast hopes and longings and yearnings.
Tom Chatfield:And so this seems to me to be a central confusion and, paradoxically, even if the heart of the singularity narrative is in a sense true, if it's expressing a deeply uncomfortable truth, that the exponential impacts of current technological improvements are unprecedented, transformative and may take us to a new kind of relationship with ourselves, with our planet, even if all this is true, I think viewing it through the kind of mythic lens of the singularity is precisely the wrong way to think about it, because it effaces present complexities, wrongs, injustices, suffering.
Tom Chatfield:I think it behoves us to see that which is under our nose, to be, if you like, phenomenologists of the present, to be deeply attentive to present injustice. And I think a lot of the very well-meaning and brilliant ethical altruists and others have, for me at least, made the category error of assuming that relentlessly focusing on the very long term is the very best way to serve the long term. And weirdly, I think that's bad consequentialism in the philosophical sense, because I think it's overconfident. It potentially not all of them, of course puts too many eggs in certain baskets and makes you a hostage to other people's fear what you're doing, why you're doing it and the criteria you're using to arrive at your judgments that's really interesting, and so do you think.
Kenneth Primrose:The thing that's held sacred there is the idea of progress.
Tom Chatfield:Is that the sacred belief? I think what people are holding sacred is the idea of self-transcendence. Not just progress, but progress as a ladder leading to the dissolving of problems and wants and needs.
Kenneth Primrose:You've obviously got tech optimists who will largely in Silicon Valley, but all over the place who will buy into this, and you have people who are deeply disenchanted and kind of scared of that narrative. Do you think there's any relationship between that kind of disenchantment with this narrative and an increase in a turn towards the spiritual? I see a kind of rise in religiosity or churchgoing.
Tom Chatfield:that's different from materialist, dawkins-y days, you know many other things, speaks to our deep, deep need to feel part of something not only larger than ourselves, but something that is that touches the eternal in some way, to express our relationship with the universe, with the spiritual, with something that is either non-material or, if you're a thoroughgoing materialist, that is emergent from the material in a kind of profound and sort of non-fungible way. By fungible I mean interchange, you know the idea in a kind of profound and sort of non-fungible way. By fungible I mean interchange, you know the idea of a kind of quantifiable money and benefit and so on. One of the big themes of the book for me is love. Love in the sense, perhaps, that the Greeks used to think about it. Of course they had great taxonomies of love, but the reason they had taxonomies of love was to express the multiplicity and sort of mystery and strangeness and centrality of our ability to care passionately and deeply about many things in a way that is not rational if you view humans as primarily self-interested. It doesn't make much sense to view our species as selfish in the narrow sense, because of course we're all going to die but our future is literally our children and our relationship with our children, with these staggeringly kind of vulnerable, dependent little creatures is one of enormous sacrifice. And so I think if you interrogate most sources of meaning in people's lives in any larger sense, stuff that they believe matters in a sense more than just is nice, you very quickly get towards the idea of its continuity beyond themselves, of it being something that endures beyond them.
Tom Chatfield:So I do think that today we absolutely see people hungering for this vaster identification, this participation, and what are the spaces that are amenable to it?
Tom Chatfield:Of course there are churches and religious practices, but I think more generally, around the edges of technology and physics alike, there are gestures towards the eternal, the infinite. Some of the theories in physics that interest me most at the moment are in a way quite animist, in that they talk about the emergent patterns of star development and of black holes as being, to some degree, selected for in an evolutionary sense, and generative in a way akin to life; these systems seem to resist entropy and generate order and self-sustain in ways that are hard to explain without reaching for analogies to biology. This is highly speculative, and in a sense it's far beyond my pay grade mathematically, but even if it isn't true, I think it does speak to this kind of creaturely longing to be part of a living, caring cosmos, one that is, on a deep level, not just data, one that one way or another has emergent from it other things with their own resilience, within another frame of reference.
Kenneth Primrose:It's fascinating seeing the language in some recent physics sounding like ancient Near Eastern philosophy or theology.
Kenneth Primrose:If we think back to the singularity idea and the idea of progress and salvation, it's an act of imagination and one of faith. It's like the fall in Genesis, a kind of prophetic story for the moment, in that we're trying to live beyond our limitations in the pursuit of controlling an ultimately, I suppose, uncontrollable future, and I think the implication is that we're damning ourselves in the process. Are there cautionary tales or myths that you think are for this moment? I think of Frankenstein's monster, or the Sorcerer's Apprentice, these mythological tales that tell us to respect our limits, and technology pushes back against that.
Tom Chatfield:I think myth is a very good place to look for insights that are flexible and deep enough to equip us to be imaginative, to be thoughtful, to examine ourselves according to multiple lights. One myth I like is that of Pygmalion. Ovid retells the story; so does George Bernard Shaw. Pygmalion is a sculptor blessed with divine talent, and he creates a supremely beautiful statue of a woman, Galatea, and then he falls in love with his own supremely beautiful creation. The gods have mercy. They breathe life into the marble flesh. He kisses it. There's a wonderful passage in Ted Hughes's translation of Ovid which gets across the uneasiness of this kind of unrequested maker's kiss being bestowed upon his creation. And in many versions of the story they have children and raise a family.
Tom Chatfield:But it's an interesting myth for me that resonates, for example, with Plato's Republic, where Plato warns against poetry. He's pretty authoritarian. He's very preoccupied with human weakness, and specifically our weakness for artificial seduction. So poets are not popular in Plato's Republic, because the untruths they weave are too powerful, too seductive and too beautiful. They're not to be trusted. Similarly, in the allegory of the cave we see a world in which, as Plato puts it, most of humanity is inside a cave, watching shadows play on the wall in a realm of illusion. Very few people are psychologically, emotionally or ethically equipped to step outside the cave, into the light, and face the truth. And there are deep questions here about our relationship with artifice, seduction and illusion. Plato was, as we know from the Phaedrus and elsewhere, worried about writing; of course he wrote this down, which is how we know he was worried about writing. He worried that writing creates in a written culture something vast and complicated and beautiful and enduring and empowering, but that by doing so it creates texts that can be abused, that cannot speak for themselves, so that those reading them may be hearers of many things but understand nothing, that they may be like bad actors stumbling their way through a script rather than participants in a living dialogue or discourse. And again we have Hephaestus, the divine smith, creating Talos, the iron man, this appalling, inhuman killing machine that strides around Crete destroying ships and taking lives, an implacable manifestation of automated force. So these are all warnings about human weakness.
Tom Chatfield:A lot of myths are like this. The Ring of Gyges: make someone invisible, and even the best human being alive will behave immorally, because with that power comes temptation. And now, for goodness' sake, we have a firm called Palantir, which is literally named after a kind of scrying eye wielded by the baddies in a fantasy fiction, partly, I think, because it serves their purposes to seem scary and magic rather than just like a bunch of geeks doing regression analyses. So these myths warn us about hubris, and they warn us about how vulnerable we are to seductions.
Tom Chatfield:Just 15 years ago I wrote a book about video games and I gave a talk about games.
Tom Chatfield:I loved video games, but I was aware that their power was very double-edged, and one way I put it, and I apologize for quoting myself, was that we evolved over hundreds of thousands of years to find certain things fascinating and enjoyable and appealing and tasty and delightful, and now, technologically, we get to engineer interactions and worlds and foods and things that are almost irresistibly delicious and stimulating and hyper-stimulating to us: cheesecake and chocolates for the mouth and, you know, memes and TikTok for the mind.
Tom Chatfield:Parts of our culture are wrestling with deprivation and hunger and, at the same time, parts of it are wrestling with glut, with junk: junk food, junk information, junk time, junk data, junk relationships, parasocial relationships, unreal celebrities, simulacra. So, as you can tell, I don't have some neat and tidy answer for this, but these stories and these myths and these structural warnings about how vulnerable we are to our own gifts as persuaders and manipulators and designers seem to me very much of our time. And, along with Luciano Floridi, I would suggest that Plato, who lived through one information revolution, the beginnings of a move from orality to written culture, has a lot of enduring warnings and hopes for the now. Sorry.
Kenneth Primrose:What a range you drew on there, amazing. I wonder if I could just pull together some of the threads and land on one final question. So we've talked about neutrality and determinism. These are myths, as is the idea that we can look to technology for transcendence, to the singularity and so on, and they're pernicious myths: they do us harm. And we do well to be at once aware of our weakness as humans, but also optimistic about our potential as a cooperative species, defined in many ways by love, by our relationships, and by our need to be embodied and live in the present rather than in the abstract. Have you yourself been on a journey in how you relate to technology, and has having children changed that at all?
Tom Chatfield:Absolutely. I may change my mind about half the things I've said in the next year, and I feel fairly comfortable with that. I'm very worried by people who seem to have enormously high degrees of certainty in areas of incredible volatility. I don't quite know how they do it, how someone stands there and says, yep, this is what AI means, this is what will happen, this is what's bad and this is what's good.
Tom Chatfield:Having children, my son is 11 and my daughter is nine, and they feature a lot in my book, which I hope they'll forgive me for as they get older, sort of changed or intensified everything. Most obviously, having children means that the most important parts of your life and self are walking around in the world outside of you. You care more about others than yourself in certain ways; it's fairly normal to be, I think, genuinely or a bit reluctantly willing to lay down your life for your children, and of course it makes you feel old. But at the same time, you see a new human being coming into the world, and the thing that impresses itself upon me is that, even though technological civilization, in all its glory and mystery and horror, alienates and divides us profoundly from even our recent ancestors, let alone the ancient ancestors who struggled before, nevertheless, in the newness of each child there is this astonishing, wonderful, basic challenge: that they must be taught a way into modernity. That in order to hand on and perpetuate our culture, we literally and obviously have to teach, or rely upon the learning of, a new generation, and that in teaching them they can question and change and remix. This is obvious in a way, but to feel the force of it on a daily basis is really to recognize a kind of evolutionary and ethical truth: that part of our survival and thriving is bound up with the fact that the old die, and that anything and everything we pass on must be passed on through acts of love and teaching and learning. The technology itself needs to be maintained; it needs to be built and maintained and powered, and if we don't do this, it just crumbles. It doesn't exist in the way that nature exists. It's a part of the natural world, but it's not a self-maintaining part. Technology is incredibly needy, and so those we give the world to, those who inherit the world, those who take the world whether we want them to or not, get to question it.
Tom Chatfield:The author Roman Krznaric has written several books using children and generations as a lens for talking about ethics and technology and culture. In his book The Good Ancestor, he uses the framing of a question: what does it mean to be a good ancestor? And, of course, thinking about timescales in this way can help us face up to the obvious and urgent, and yet often overlooked, fact that we are temporary custodians of the earth, and often not very good ones, that we do our bit and then we're gone. And so what does it mean to imagine, or, as Roman puts it, to have in the room, some people advocating for the future? When we're doing something, a project, a bit of governance, a building, what's it like to have someone in the room who's told: okay, your job in this room is to advocate as best you can for those who will be living in the world in 2050 or 2060 or 2070? You're going to try and put their view across. You're going to say, well, hang on, this is all very well, but you're going to get benefit and fun from this; we're going to get an expensive, outmoded, environmentally costly shell. Or what we're going to get is maybe something beautiful, maybe something enduring, maybe something that will lift us up.
Tom Chatfield:The cost-benefit analysis starts to look very different when you do this. A library starts to look an awful lot better than a superstore. Of course, superstores are great in certain ways: you want to give people good, clean food affordably, and there are many very wonderful things about them. I don't think it does any good to deplore the present, but the frame of reference, the questions you ask, is the fundamental determinant of the answers you get. So I love what children have done to my relationship with time, except that, ironically, of course, I have far, far less time to spend on myself than I might like. But that's okay. And I guess the question I often ask, which I find very clarifying when I'm lucky enough to be visiting people who work in tech or other people who have some influence, is to sort of bring children into the room, probably metaphorically rather than literally. Because people are often saying, this will make you more productive, this will make you more money, this will make you get places faster. Like, great.
Kenneth Primrose:Well, I hope you enjoyed listening to that conversation with Dr Tom Chatfield. I plan on dropping a few more episodes in the coming months and producing a new series in autumn, though for now that is the end of the current run. In the meantime, then, let me thank all of my guests for their wisdom and their time, and those who made this series possible through all the different types of support. I really appreciated it, and thank you for listening. Do please share with others. Find me on Substack, where I'll be continuing to write about examining life and exploring questions through the Substack series Positively Maladjusted. I do hope to be writing more in the coming months, and for now, I wish you well and, in the words of Krista Tippett, that you'll be finding something life-giving and redemptive about asking better questions.
Tom Chatfield:Okay, that's all fine, but look, what do you want for your children? Because even the most ambitious capitalist realizes on some level that their three-year-old or four-year-old can't just be given $100 and told to go away, that they're going to want a cuddle, a smile, a pet, a joke, a game. You simply cannot, when it comes to the raising of children or care of the vulnerable, reduce things purely to a kind of financial level. Of course you can kind of buy services, but then what does it mean for me to pay to outsource the love and care of my children or my parents?
Tom Chatfield:I come back a lot to Alasdair MacIntyre's book Dependent Rational Animals, and he uses this wonderful phrase, dependent rational animals, to describe humans, with our defining feature being our interdependence, because we come into this world utterly vulnerable and dependent upon one another. We age, we die. Our independence as adults is really notional, or an illusion, because actually the language I speak, the house I live in, the things I eat, the safety I enjoy, all of this is predicated upon the work of many other humans. We are dependent upon one another, and dependent upon one another's compassion and empathy and patience more than we are upon the purely transactional.
Kenneth Primrose:That's a really helpful place to pull this together: the need for humility. In a world which is driven by hubris so much of the time, it seems like a very necessary thing to keep in the forefront of the mind. Tom, thank you so much for writing your book, which I find fascinating, and for exploring this question of what we get wrong about tech with me today. It's been just super interesting, and practical too; the best philosophy is practical, I think, and yeah, you've done a fantastic job of making us think.
Tom Chatfield:Very kind of you to say so.
Kenneth Primrose:Thank you, it's been a pleasure. I hope you enjoyed listening to that conversation with Dr Tom Chatfield. As I said, that concludes the current series, though I do hope to drop a couple more conversations into the podcast world in the coming months, and I hope to produce another series in the autumn. All that remains, then, is for me to thank those who made this possible through all the different kinds of support, moral and financial, and my brother, who made the music you hear and gave me the mic that I'm currently speaking into. All very much appreciated. And you, listeners: thank you for listening. If I could ask you, if you've enjoyed it, to share it with others and leave a review; it really helps other people find it. In the meantime, I wish you all very well and hope that you will, in the words of Krista Tippett, find something redemptive and life-giving about asking better questions.