Implausipod

E0031 GPT Squared

Season 1 Episode 31

Are the new GPTs - Generative Pre-trained Transformers - powering the current wave of AI tools actually the emergence of a new GPT - General Purpose Technology - that we will soon find embedded into every aspect of our lives? Earlier examples of GPTs include tech like steam power, electricity, radio, and computing, all tech that is foundational to our modern way of life. Will our AI tools soon join this pantheon as another long wave of technological progress begins?

Bibliography
Bresnahan, T. F., & Trajtenberg, M. (1995). General purpose technologies “Engines of growth”? Journal of Econometrics, 65(1), 83–108.

Kurz, M. (2023). The Market Power of Technology: Understanding the Second Gilded Age. Columbia University Press.

Nye, D. E. (1990). Electrifying America: Social meanings of a new technology, 1880-1940. MIT Press.

Rosenberg, N. (1982). Inside the black box: Technology and economics. Cambridge University Press.

Winner, L. (1993). Upon Opening the Black Box and Finding it Empty: Social Constructivism and the Philosophy of Technology. Science, Technology, & Human Values, 18(3), 362–378.

Let's start with the question. Do you remember a time before electricity? Unless this show is vastly more popular with time travelers and certain vampires than I thought, the answer is probably not. But now, in 2024, it's literally everywhere. It's sublimated into the background. It's become part of the infrastructure, and we no longer really think about it.

We flip a switch and the lights go on, and we can find a plug almost anywhere to recharge our devices and take that electricity with us on the go. But how long did it take to get to that point? The answer is longer than you'd think. Practical electricity generation was invented in 1832, but it took half a century for it to become commercially viable, and then from there another 70 years to effectively transform our lives with everything from lights and appliances to communication devices like radios and television.

And even now we're still feeling the effects of that transformation as we move to electric-powered vehicles for personal use. So across all those decades, it took a long time for electricity to go from concept to application to becoming a general purpose technology, or GPT. And in 2024, we're just starting to feel the impacts of another GPT: the generative pre-trained transformers that are powering the current wave of AI tools.

So the question we're really trying to find out is: are these current GPTs a new GPT, or what we might call GPT squared, in this week's episode of the Implausipod.

Welcome to The Implausipod, a podcast about the intersection of art, technology, and popular culture. I'm your host, Dr. Implausible. And in this episode, we'll be exploring exactly what a GPT is, a general purpose technology, that is, and how they have had a massive impact on society. By looking at the definition and some commonalities amongst them, we'll be able to evaluate whether the current GPT, the generative pre-trained transformers, are going to have the same impact, or whether they qualify as a GPT at all.

As always, I'm using a couple of references for this, and I'll put the bibliography in the show notes so you can track back the people we're citing here. For us, the two main sources are going to be Engines of Growth by Bresnahan and Trajtenberg from 1995, and Mordecai Kurz's The Market Power of Technology: Understanding the Second Gilded Age, a book he published in 2023.

Kurz is a professor of economics at Stanford University, and his first book was published back in 1970, so he's literally been doing this longer than I've been alive. And in addition to those, I'm sure we'll fold in a few more references as required. Now, for the first half of this episode, whenever I mention GPT, we're going to be explicitly talking about general purpose technology, so I'll call out the AI tools when mentioned.

And we'll get to the discussion of those in the second half, after we talk about the cyclical nature of technological development. But for the moment, we should get right down to business and find out exactly what a GPT is. A general purpose technology is basically that: a technology that can apply broadly to virtually all sectors of the economy.

And by doing so, it can change the way that society functions. They do this by being pervasive, in that they can be used in a wide variety of functions, and they also do this by sublimating into the background. As David E. Nye notes in his history of the electrification of America, once they're part of the infrastructure, we can stop thinking about them and use them in almost any function.

Now, it took a little while for electricity to get to that point, but that's part of their nature: a general purpose technology will evolve and advance and spread throughout the economy. For previous instances of GPTs, that led to productivity gains in a wide number of areas, but even if we're not specifically looking at productivity growth, we can still see how they have beneficial impacts.

Now, in the original study on GPTs, Bresnahan and Trajtenberg's Engines of Growth, they looked at three particular case studies: steam power, the electric motor, and the integrated circuit. And by studying these GPTs, they were able to come up with some basic characteristics. The first and most obvious is that they're general purpose: the function they provide is generic, and because of that generic nature, it can be applied in a lot of different contexts.

If we think of all the ways that the continuous rotary motion that was provided by steam power and then electric engines has been adapted and serves throughout our economy, it's massive, it's fascinating. And once the production of the integrated circuit really started taking off in the 60s and 70s, it became a product that could be embedded in almost anything, and very nearly has.

This has obviously scaled over time as integrated circuits have followed Moore's Law, providing exponential growth in the amount of circuitry that can fit in the same space, and complementary technologies like batteries have also improved and shrunk and been able to service chips that have gotten more and more power efficient over time, leading to even more widespread adoption.

And this brief description hints at the second and third characteristics. The second is that they have technological dynamism: continuous work is done to innovate and improve the basic technology, making it more efficient over time. This is why you often see the cost to use a GPT drop over time.

And that's why it shows up in more and more parts of our society. The third characteristic of GPTs that Bresnahan and Trajtenberg talk about is innovational complementarities: technical advances in the GPT make it more profitable for its users to innovate, and vice versa. And we can see hints of that in how improving battery technologies went hand in hand with the development of integrated circuits.

One of the things that B&T note, especially in their examples of the steam engine and electric motor, is that the function they provide isn't necessarily obvious with respect to some jobs. The continuous rotary motion that is now used in a lot of things, everything from sewing to polishing to cutting, wasn't necessarily seen as something that could be adapted to those tasks.

So the people that were doing them were surprised when there was a technological replacement for the things that they were doing. Let's put a pin in that idea and we'll come back to it in about 10 or 15 minutes. Sometimes the way that the GPT is applied is inefficient initially, but as price and performance ratios improve, as the technology and the complementary technologies around it improve, then it becomes more feasible.

Sometimes those payoffs come quickly, but often it takes a long time for the technology to get distributed throughout the economy. In the case of electric motors, they note that it took about three decades to go from 5 percent of the installed horsepower in the U.S. to over 80 percent by 1930. And those productivity gains came because everything was getting electrified at the same time.

The infrastructure was there, and these all go hand in hand; they're complementary. B&T quote at length from Rosenberg from 1982. Quote: the social payoff to electricity would have to include not only lower energy and capital costs, but also the benefits flowing from the newfound freedom to redesign factories with a far more flexible power source.

The steam engine required clumsy belting and shafting techniques for the transmission of power within the plant. These methods imposed serious constraints upon the organization and flow of work, which had to be grouped according to their power requirements close to the energy source. With the advent of fractionalized power, made possible by electricity and the electric motor, it now became possible to provide power in very small, less costly units.

This flexibility made possible a wholesale reorganization of work arrangements and in this way, made a wide and pervasive contribution to productivity growth throughout manufacturing. Machines and tools could now be put anywhere efficiency dictated, not where belts and shafts could most easily reach them.

Now, I want to state that I'm not a member of the cult of efficiency by any means, and that Rosenberg's claim here isn't some contradiction of the Foucauldian argument that the architecture of society is shaped by the architecture of our factories and our other buildings, or of the idea that we have some Deleuzian form of control society because the very hierarchy of the way power is distributed within our factories lends itself to certain forms of social organization.

Far from it; I think these are saying exactly the same thing from different perspectives. The subtext of all these articles is that to get past those hierarchical forms, you need to find different ways to distribute the power, and by doing so, you can have very liberating effects on society as a whole.

Ultimately, this is what a GPT is and what it provides. As Kurz notes, GPTs reflect fundamental changes in the state of human knowledge that occur maybe once in a generation or once in a century. They are technologies that enable a paradigm shift, and as Kurz notes, quote, we need to distinguish between small changes within a given technological paradigm and revolutionary technologies that change everything.

End quote. A GPT serves as a founding technology or platform for further technological innovation. And because of that, it's really important to note something about the work that goes into the development of a GPT. As both B&T and Kurz note, quote, it is vital to keep in mind the distinction between innovations within the paradigm of a GPT and innovation of a new technological paradigm, or a new GPT. Some GPTs, like electricity or IT, change everything and ultimately transform the entire economy. Others, like the discovery of DNA and genetic sequencing, completely change only a segment of the economy, as we've seen with CRISPR and genetic engineering.

And this idea of a paradigm shift is perhaps one of the most central features of the introduction of a new GPT, especially if you're a large incumbent firm well established within the current dominant technological paradigm. For, you see, a paradigm shift threatens to upset the natural order of things, where the large incumbent firms exercise their market power and use small firms operating within that paradigm effectively as research labs, acquiring them if they happen to develop a patent or an innovation that would prove useful or would threaten their own dominance within the marketplace.

These patterns have been well observed historically within the development of electricity, with the rollout of radio and television, and with the early computing industry, and they can even be seen within 21st century industries, where a dominant player like Facebook will acquire an Instagram or a WhatsApp that may threaten their dominance. And if they're unable to acquire those competitors outright, they may exert their market power through lobbying or other efforts in order to challenge them, as we're seeing currently in the United States with the proposed TikTok ban of March 2024. This is all standard operating procedure.

It's the way these things seem to work. But when a new technology comes around, when the paradigm shifts, that's when things get interesting. As Kurz notes, it's a period where, quote, the most intense technological competition arises when a new GPT is invented. This leads to the eruption of economy-wide technological competition in which winners begin the long journey to consolidate market power.

During that period, we'll either see new players rise to the level of the incumbents, pushing out the old dominant players that can't adapt. Or we'll see those dominant players do everything they can to try and keep their hand in the game. Which is what we're starting to see already within the field of AI, which is one of the reasons we suggest it might be a new GPT, a General Purpose Technology.

But as we've hinted at, these things go in cycles, so let's look at what some of the earlier ones were.

The idea that the economy behaves in a cyclical manner was first introduced almost 100 years ago, in the 1920s, by Nikolai Kondratiev, and these cycles have subsequently been named in his honor. Kondratiev hypothesized that the cycles were due to the underlying technological basis of society, the technological paradigms that we've been discussing in the first half of this episode, with rising boom and bust cycles that take place over a period of roughly 50 to 60 years.

Now, the Kondratiev waves, or what are sometimes called long waves or carrier waves, are only one of the various economic waves or cycles that have been observed. Others, including those proposed by Kuznets, Juglar, or Kitchin, look at things like infrastructure or investment or even inventory for various products, and development time frames can have a major impact on all of this as well.

When you map these all out on a timeline, the various economic waves can all seem to interact, much like overlapping sine waves in a synthesizer, where the sum of the smaller waves occasionally comes together in a much larger peak, or like ocean waves that come together out of nowhere and suddenly form a rogue wave big enough to sink a ship.

When Kondratiev was originally observing that long 60-year period, he said that there were three phases to the cycle: a period of expansion, stagnation, and recession, and nowadays we've added collapse to that as well. When Kondratiev was originally writing in the 1920s, he identified a number of periods where this had already taken place, starting with the Industrial Revolution, followed by the Age of Steam and the expansion of the railways, and then the subsequent rise of electric power that took place, as noted, between the 1890s and 1930s in North America.

Since then, we've seen the cycle continue in two other long waves: the rise of the internal combustion engine and the associated technologies it facilitated, like the automobile and air flight, and then the rise of the microchip and the transformation that computing and communication technologies had across the modern world.

Now, the idea of an economic long wave has had an enduring appeal. People have taken the theory and cast it back earlier in time, and a lot of predictions have come about trying to guess what the sixth long wave will be. Again, that's five that we've had so far, if we start at the Industrial Revolution.

Some of the possible contenders as a driver for the sixth Kondratiev wave include renewable energy and green technologies, as proposed by Moody and Nogrady, or biotechnology, as proposed by Leo Nefiodow back in 1996. And while those are strong contenders, they haven't necessarily turned into the drivers of economic change that we might have expected.

They may still yet, but in some ways they lack the general purpose nature of the technologies that we've seen as drivers of previous long waves. In 2024, it looks like another contender has emerged: a GPT built out of GPTs, the generative pre-trained transformers that power our AI tools. So based on the three characteristics of GPTs that we mentioned earlier, we'll take a closer look and see if those AI tools might qualify.

As we said earlier, with Bresnahan and Trajtenberg's definition of a GPT, the three characteristics were general purposeness, technological dynamism, and innovational complementarities. Within their paper, they use the case study of semiconductor technology, which was the dominant GPT at the time they were writing in the 1990s.

At the time they were writing, the pervasiveness of computing had already been assumed, but initially that assumption wasn't the case. Hence early prognostications like IBM's Thomas Watson famously saying, quote, I think there is a world market for maybe five computers, a prediction that turned out to be drastically wrong.

By the 1970s, the integrated circuit was well developed, and its use in the computing mainframes of large banks was already well underway. What allowed electronic circuits to become a general purpose technology was that they could work inside virtually any system. Those systems could be rationalized and broken down into their component activities, and each of those activities could be replaced with an integrated circuit or transistor at certain stages.

And if you can break the steps down to something that can be replicated by binary logic, like ones and zeros with gates opening or switches turning on and off, then you can apply it anywhere within a production process. It meant that a wide range of technological processes were, at their root, pretty simple operations.

But as B&T note here, even though substituting binary logic for a mechanical part was often very inefficient, because you might have to increase the number of steps in order to accomplish something with binary logic, as the price dropped on the circuits and more and more processes could be included within one circuit, it became much easier to actually implement electronic circuits within a system.

And as the costs came down and the processes were improved, they became more widely implemented within a lot more sectors of the economy, to the point that now they're basically everywhere. So, do our current GPTs, the current crop of AI tools, exhibit these same characteristics?

Is there a general purposeness to them? Well, a qualified yes. I think when it comes to the current AI tools, we need to recognize a few things. The first is that they're part of a much longer process: a lot of the tools that we're seeing right now were, two years ago, called machine learning tools, and they've just been rebranded as AI tools with the popularity of ChatGPT and some of the AI art tools like Stable Diffusion and Midjourney.

So both the history of the technology and its implications go back much further, and its actual uses are much broader than we're currently seeing. Thinking about the range of industries where I've seen AI tools adopted, they far exceed just the large language models popularized by ChatGPT, or the art tools that we're seeing online increasingly.

We're seeing machine learning algorithms deployed in everything from photography to astronomy, to health, to production, to robotics, to website design, to audio engineering, and a whole host of industries. And this partially explains why we're seeing so many companies involved, which feeds directly into the second characteristic of GPTs: the dynamism, the continuous innovation being brought forth by the companies currently developing those AI tools.

Now, is every one of them going to be a hit? No. There are a lot of places where AI absolutely should not be involved. But some of these companies are going to be creating tools that are well suited to the application of AI.

And just as the early days of electricity and radio and television all saw a lot of different ways that people tried to apply the new technology to their particular field or product or problem, we're seeing a lot of that with AI right now, as any company that has a machine learning model is either rebranding it or adapting it to the use of AI. I think a lot of people are recognizing that AI tools could be the general purpose technology that's applicable to whatever their given field is.

There's definitely a speculative resource rush component that's driving some of this growth, and a lot of people are getting into the market. But as Mordecai Kurz points out, there's a difference between working within the new paradigm created by a GPT, which a lot of these companies are doing, and working directly on the GPT itself. Those working directly on the AI tools, like OpenAI, are the ones looking to become the new incumbents, which goes a long way in explaining why Microsoft has reached out and partnered with OpenAI in the development of their tools.

Incumbents that are lagging behind in the development of the tools may soon find themselves locked out, so a company that was dominant within the previous paradigm, like Apple, that currently doesn't have much in the way of AI development, could be in a precarious position as things change and the cycle of technology continues.

Now, the last characteristic of a GPT was the complementarity that allows for other innovations to take place, and I think at this point it's still too soon to tell. We can speculate about how AI may interface with other technologies, but for now, the most interesting ones look to be things like robotics and drones.

Seeing how a tool like OpenAI's can integrate with the robots from Boston Dynamics, or the recent announcement of the Fusion AI model that can provide robotic workers for Amazon's warehouses, both hint at where some of this may be going. It may seem like the science fiction of 30 or 40 or 50 years ago, but as it was written back then, the future is already here, it's just not widely distributed yet.

Ultimately, the labeling of a technological era or a GPT or a Kondratiev wave is something that's done historically, looking back from a vantage point where it's confirmed that, yes, this is what took place and this was the dominant paradigm. But from our vantage point right now, there are definitely signs, and it looks like the GPTs may be the GPT we need to deal with as the wave rises and falls.

Once again, thanks for joining us on this episode of the Implausipod. I've been your host, Dr. Implausible, responsible for the research, writing, editing, mixing, and mastering. You can reach me at drimplausible at implausipod.com, and check out our episode archive at implausipod.com as well. I have a few quick announcements.

Depending on scheduling, I should have another tech based episode about the architecture of our internet coming out in the next few weeks. And then around the middle of April, we'll invite some guests to discuss the first episode of the Fallout series airing on Amazon Prime. Or streaming on Amazon Prime, I guess.

Nothing's really broadcast anymore. Following that, to tie in with another Jonathan Nolan series and its linkages to AI, we're going to take a look at Westworld season one. And if you've been following our Appendix W series on the sci-fi prehistory of Warhammer 40,000, we're going to spin the next episode off into its own podcast.

Starting on April 15th, we're currently looking at Joe Haldeman's 1974 novel, The Forever War. So if you'd like to read ahead and send us any questions you might have about the text, you can send them to drimplausible at implausipod.com. We'll keep the same address, but the website for Appendix W should now be available.

Check it out at appendixw.com, and we'll start moving those episodes over there. You can also find the transcript-only version of those episodes up on YouTube. Just look for Appendix W in your search bar. We've made the first few available, and as I finish off the transcription, I'll move more and more over.

And just a reminder that both the Implausipod and the Appendix W podcast are licensed under a Creative Commons ShareAlike 4.0 license, and we look forward to having you join us with the upcoming episodes soon. Take care, have fun.
