Trial By Wire

Episode 2: How Did We Get Here? A Brief History of Computers

January 21, 2024 Denton Wood

We have to know where we've been to better understand where we are. This episode, we look at how computers have changed over time since the very first computer.

Keep up with the show! https://linktr.ee/trialbywireshow

Questions? Comments? Email trialbywireshow@gmail.com

Transcript

Welcome back to Trial by Wire! My name is Denton, and I'm excited to continue our conversation. Last time, we explored the different kinds of computers and touched very briefly on the impacts they have on your life. Today, we're going to take a quick look backward and figure out how exactly we ended up with all of these computers.

The first "computer" is hard to put a finger on because it depends on what you mean by a computer. You might be familiar with the abacus, an ancient device with beads on wooden rods which was used for counting. It had different rows to represent different digits of a number - if you had counted your way through one row (5, 6, 7, 8, 9), you moved a bead on the second row (10) and started back at 0 on the first row. The abacus isn't a computer by our definition from last episode. A computer has to be programmable - in other words, it has to be able to be told to do things. But it does show us what computers do best, which is math!

The earliest computers were designed to automatically solve math problems. Charles Babbage of England came up with the idea of the "Difference Engine" in the early 1820s to create mathematical tables for navigation at sea using a machine instead of computing them by hand, which was prone to error. Encryption, or hiding a message by scrambling it somehow, is basically a complex math problem. Alan Turing, also from England, helped crack German encryption during World War II using a machine called the "Bombe." He's also famous for the "Turing Test", a popular way of asking whether computers can "think". A group of Americans led by John W. Mauchly and J. Presper Eckert, Jr. developed ENIAC, the first general-purpose electronic computer, which was unveiled in 1946 and calculated artillery firing tables for the US Army. While these computers were useful, they were massive, complex, and usable by only a small number of trained people.

Over the next few decades and beyond, computers would trend towards being usable by larger and larger numbers of people. Programming languages, for example, made it easier for people to make software for computers. Instead of writing an application in the raw form the computer understood, programmers could now write something that they, the humans, could read more easily and then let the computer translate it into something it could execute. IBM helped make computers viable for business in the mid-1960s by shipping machines like the System/360 with operating systems, which made them more stable. Graphical user interfaces, or GUIs, also emerged from research in the 1960s - now you could use a mouse and click on things instead of having to type commands to do anything. The 1970s brought personal computers - if you knew what you were doing, you could have a computer in your house, which was revolutionary for the time. Applications for those personal computers would soon follow.
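As a concrete illustration of that shift, here's a single line of high-level Python next to a paraphrase of the machine-level steps it spares you from writing - the comments are a sketch of the idea, not any real instruction set:

```python
price, quantity, shipping = 5, 3, 4

# What the programmer writes (human-readable):
total = price * quantity + shipping
print(total)  # 19

# Roughly what the machine has to be told without a high-level
# language (paraphrased steps, not real machine code):
#   load "price" from memory into a register
#   multiply the register by "quantity"
#   add "shipping" to the register
#   store the register back into "total"
```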

From here on out, computers start to look more like what you know today. Apple released the first Macintosh computer in 1984. The Microsoft Windows operating system came out in 1985. The World Wide Web, invented by Sir Tim Berners-Lee, became publicly available in 1991 (the Internet it runs on had been growing for decades before that). The iPhone graced existence in 2007. Later would come tablets, smart watches, and so on.

Here's the thing - under the hood, these computers are still just doing math, because that's what computers do. But over time, both the way computers are intended to be used and the way that we actually use them have changed as we build more and more stuff on top of that math. So let's break down a little more of that.

There are lots of ways to think about how our relationship to computers has changed, because computers have changed in so many ways since Babbage's machine. Each one gives us a little info about the beneficial and harmful effects of computers and a little insight into how they drive our world.

Let's start with the way we actually use computers. Like I mentioned, computers used to be much harder to work with. The massive machines of the 1950s and 60s required special training and expertise to do anything, and they had lots of switches and dials and flashing lights. As time went on, we got keyboards and so-called "command-line interfaces", where users would type commands into a window to have the computer do things. Interestingly, command-line interfaces are one of the fastest ways of interacting with a computer. I use them all the time at work, and I would be much slower if I had to wiggle a mouse every time I needed to do something.
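To give you a feel for what "typing commands" looks like, here's a toy command-line interface in Python - the commands are made up for illustration, and real shells are far richer, but the loop is the same idea:

```python
# A toy command-line interface: read a command, act on it, repeat.
# Real command lines work the same way at heart: short text in,
# text out, no mouse required.

def run(command):
    parts = command.split()
    if not parts:
        return "type 'greet <name>' or 'quit'"
    if parts[0] == "greet":
        return f"hello, {parts[1]}!" if len(parts) > 1 else "hello!"
    if parts[0] == "quit":
        raise SystemExit
    return f"unknown command: {parts[0]}"

while True:
    print(run(input("> ")))  # "> greet denton" prints "hello, denton!"
```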

However, for most people, WIMP interfaces, or windows, icons, menus, and pointers, are a lot easier to use. This is what you're probably used to - using a mouse to run applications on your computer by clicking the icon on your desktop or opening the menu to find it. WIMP interfaces generally give you much more visual feedback than command-line interfaces, and they have more metaphors for real-life objects. Your "desktop" with file icons on it is supposed to remind you of a physical desk with files stacked on top. Your file folder is supposed to remind you of, well, a folder of files. Your settings icon is a gear because you think about messing with the nuts and bolts of something. Some of those metaphors have faded as people have become more accustomed to computer user interfaces, but they have their roots in real-life items.

As computers have changed, though, so have the interfaces. Your phone isn't a WIMP interface; you're tapping and swiping on it with no mouse in sight. If you've ever used a voice assistant, you interact with it entirely by speaking - which can be more natural, but can also be more frustrating if it doesn't understand what you're saying. VR and AR headsets have no mouse or keyboard; you use the hand controllers to indicate movement to the computer. Even your microwave prefers buttons. Imagine using a mouse to work it! What's important to consider here isn't whether an interface is better or worse than the others; it's what it encourages and discourages. Take command-line interfaces. They're not very natural for humans to use. However, they encourage speed, because the way you interact with the computer is much closer to how the computer itself works. WIMP interfaces discourage that speed, but they encourage familiarity with the computer for business users, which is who they were designed for. Voice assistant interfaces encourage natural interaction with the computer, but discourage it at the same time by being clunky to use. We'll go a little deeper on some of that in the future.

Let's also think about what we do with computers. At first, computers were designed to solve very specific, complex math problems. Then, they helped with business operations. Then, in the home, they began to take on other uses. The Internet and search engines helped us learn new things from all around the world. Social networking sites connected us together, even across long distances. Music sites let us listen to our favorite tunes. E-commerce let us shop from the comfort of our homes or even build businesses completely online. Movie and video streaming sites let us watch whatever we wanted, whenever we wanted. Video conferencing let us work or take classes from home. And now, AI is giving us quick answers to pressing questions and helping us create new things even more quickly than we could on our own.

However, everything that I've said so far has been from a positive perspective. In the same way that computers affect us positively, they also affect us negatively. The Internet and search engines began feeding us all kinds of information, even blatant lies. Social networking sites created division as we learned to interact with each other through a computer screen. Music sites created copyright headaches as music companies went to war against digital pirates. E-commerce created competition that shut the doors of brick-and-mortar stores, costing people their jobs. Movie and video streaming sites give us even more ways to distract ourselves from engaging in life. Video conferencing makes peer interaction harder as we try to figure out how to communicate with other people remotely through a camera and a microphone. And now, AI can simulate human writing, speech, and personas in ways we've never encountered before.

Advances in technology are rarely completely good or bad. They will always have benefits and downsides. The challenge of engaging with computers is figuring out how to handle the bad with the good, and even to disengage when the bad becomes too much. But to understand the consequences of the technology, you have to understand the technology itself. And that's one of the biggest challenges facing us as the speed of innovation keeps picking up.

Your homework this time, should you choose to accept it, is to think about some piece of technology in your life. Maybe it's your phone, maybe it's your Facebook account, maybe it's your alarm clock. Think about the positive and negative ways that using that piece of technology affects you. Try to arrive at some final assessment of that technology in your life. You may have already arrived at this conclusion, but I'm curious if your answer changes if you think about all of the ways you use it. And I'll see you next time.

Hey, thanks for listening! If you want to keep up with the show, you can subscribe to our biweekly uploads on your favorite podcast feed or on YouTube at https://www.youtube.com/@TrialByWireShow. You can also find us on X (formerly known as Twitter) or on Instagram at @trialbywireshow or on Facebook at https://facebook.com/trialbywirepodcast. If you have comments or questions, I'd love to hear them. Send me an email at trialbywireshow@gmail.com. See you soon!