【My Study Note】History of Computing


History of Computing

Meaning of Computer

To put it simply, a computer is a device that stores and processes data by performing calculations. Before we had actual computing devices, the term computer referred to a person who performed calculations by hand.

Abacus

Do you know what an abacus is? It looks like a wooden toy that a child would play with, but it’s actually one of the earliest known computers. It was invented around 500 BC to count large numbers. While we now have calculators like the old reliable TI-89 or the ones built into our computers, abacuses are actually still used today. Over the centuries, humans built more advanced counting tools, but they still required a human to manually perform the calculations.

Mechanical Calculator
Source: Wikipedia

The first major step forward was the invention of the mechanical calculator in the 17th century by Blaise Pascal. This device used a series of gears and levers to perform calculations for the user automatically. While it was limited to addition, subtraction, multiplication, and division of fairly small numbers, it paved the way for more complex machines.

Punch Cards
Source: National Museums Scotland

The fundamental operations of the mechanical calculator were later applied to the textile industry. Before we had streamlined manufacturing, looms were used to weave yarn into fabric. If you wanted to design patterns on your fabric, that took an incredible amount of manual work. 

In the 1800s, a man by the name of Joseph Jacquard invented a programmable loom. These looms took a sequence of cards with holes punched in them. When the loom encountered a hole, it would hook the thread underneath it; if it didn’t encounter a hole, the hook wouldn’t thread anything. Row by row, this wove a design pattern into the fabric.
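As a rough mental model, here is a tiny Python toy of the idea (my own illustration, not a description of any real loom’s mechanics): each card is a row in which a hole selects which threads get hooked.

```python
# A toy model of Jacquard-style punch cards: 'O' marks a hole.
# A hole means the hook lifts the design thread ('#');
# no hole leaves the background thread ('-') showing.
cards = [
    "O..O..O.",
    ".O..O..O",
    "..O..O..",
    ".O..O..O",
]

for card in cards:
    # Prints one woven row per card.
    print("".join("#" if spot == "O" else "-" for spot in card))
```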

These cards were known as punch cards. And while Mr. Jacquard reinvented the textile industry, he probably didn’t realize that his invention would shape the world of computing, and the world itself, today.

Difference Engine
Source: Wikipedia

Let’s fast forward a few decades and meet a man by the name of Charles Babbage. Babbage was a gifted engineer who developed a series of machines now known as the greatest breakthrough on our way to the modern computer. He built what was called a difference engine. It was a very sophisticated version of earlier mechanical calculators: it could perform fairly complicated mathematical operations, but not much else.

Analytical Engine
Source: Britannica

Babbage’s follow-up to the difference engine was a machine he called the Analytical Engine. Inspired by Jacquard’s use of punch cards, he designed his machine to perform calculations automatically instead of having them entered manually by hand. Babbage used punch cards in his Analytical Engine to let people predefine a series of calculations they wanted to perform. As impressive as this achievement was, the Analytical Engine was still just a very advanced mechanical calculator.

Algorithm

It took the powerful insights of a mathematician named Ada Lovelace to realize the true potential of the Analytical Engine. She was the first person to recognize that the machine could be used for more than pure calculation, and she developed the first algorithm for the engine: the very first example of computer programming. An algorithm is just a series of steps that solves a specific problem. Because of Lovelace’s discovery that algorithms could be programmed into the Analytical Engine, it became the very first general-purpose computing machine in history.
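Lovelace’s algorithm computed a sequence called the Bernoulli numbers. Purely as an illustration of what “a series of steps” means, here is a minimal modern Python sketch of the same computation (my own code, nothing like her actual notation for the Analytical Engine):

```python
from fractions import Fraction
from math import comb  # requires Python 3.8+

def bernoulli(n):
    """First n + 1 Bernoulli numbers, using the classic recurrence
    sum_{k=0}^{m} C(m + 1, k) * B_k = 0, starting from B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```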

The Path to Modern Computers
Computing developed steadily after the invention of the Analytical Engine, but it didn’t make a huge leap forward until World War II.

Why?

Back then, research into computing was super expensive. Electronic components were large, and you needed a lot of them to compute anything of value. This also meant that computers took up a ton of space, and many efforts were underfunded and unable to make headway.


After The War
But when the war broke out, governments started pouring money and resources into computing research. They wanted to help develop technologies that would give them advantages over other countries. Lots of efforts were spun up and advancements were made in fields like cryptography.

Cryptography: the study of secure communications techniques that allow only the sender and intended recipient of a message to view its contents.
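To make that definition concrete, here is a tiny Python sketch of the idea, using a simple Caesar cipher (my own toy example, far weaker than anything used in WWII-era cryptography): only someone who knows the key can recover the message.

```python
# Toy Caesar cipher: shift each letter of the alphabet by a secret key.
def shift(text, key):
    return "".join(
        chr((ord(c) - ord("A") + key) % 26 + ord("A")) if c.isalpha() else c
        for c in text.upper()
    )

secret = shift("ATTACK AT DAWN", 3)
print(secret)             # DWWDFN DW GDZQ
print(shift(secret, -3))  # the intended recipient, knowing key=3, decrypts
```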

After the war, companies like IBM, Hewlett-Packard, and others were advancing their technologies into the academic, business, and government realms. Many advancements in technology and computing were made in the 20th century, thanks to the direct interest of governments, scientists, and companies left over from World War II.

New Methods to Store Data
These organizations invented new methods to store data in computers, which fueled the growth of computational power. Consider this: until the 1950s, punch cards were a popular way to store data.

Operators would have decks of ordered punch cards that were used for data processing. If they dropped a deck by accident and the cards got out of order, it was almost impossible to get them sorted again. There were obviously some limitations to punch cards, but thanks to new technological innovations like magnetic tape and its counterparts, people began to store more data on more reliable media.

Magnetic tape stores data by magnetizing it onto a reel of tape. Back in the 1970s and 80s, people used to listen to music on vinyl records or cassette tapes. The cassette in particular is an everyday example of how magnetic tape can store information that a machine can later read back.

Vacuum Tubes
It’s no joke that early computers took up a lot of space.

They had huge machines to read data and racks of vacuum tubes that helped move that data. Vacuum tubes controlled the electric voltage, and all sorts of electronic equipment, like televisions and radios, used them.

But these specific vacuum tubes were bulky and broke all the time.
» More about Vacuum Tubes

ENIAC
Source: WIRED
The ENIAC was one of the earliest forms of general-purpose computers. It was a wall-to-wall convolution of massive electronic components and wires; it had more than 17,000 vacuum tubes and took up about 1,800 square feet of floor space.

Transistor
Eventually, the industry started using transistors to control electric voltage. The transistor is now a fundamental component of all electronic devices. Transistors perform almost the same function as vacuum tubes, but they are far more compact and more efficient. A small computer chip today can easily contain billions of transistors.
» More about Transistor

Compiler
Source: Rosie Riveters

The very first compiler was invented by Admiral Grace Hopper. Compilers made it possible to translate a human-readable programming language into machine code. This advancement was a huge milestone in computing that led to where we are today.
» More about Compiler (Japanese Article)
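To get a feel for what “translating” means here, consider this toy compiler sketch (my own Python illustration with a made-up instruction set; it is nothing like Hopper’s actual A-0 compiler). It turns an arithmetic expression into instructions for an imaginary stack machine:

```python
import ast

# Map Python's AST operator nodes to made-up stack-machine opcodes.
OPS = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

def compile_expr(node):
    """Recursively flatten an expression tree into stack instructions."""
    if isinstance(node, ast.Expression):
        return compile_expr(node.body)
    if isinstance(node, ast.Constant):
        return [f"PUSH {node.value}"]
    if isinstance(node, ast.Name):
        return [f"LOAD {node.id}"]
    if isinstance(node, ast.BinOp):
        return (compile_expr(node.left)
                + compile_expr(node.right)
                + [OPS[type(node.op)]])
    raise ValueError(f"unsupported syntax: {ast.dump(node)}")

print(compile_expr(ast.parse("price + tax * 2", mode="eval")))
# ['LOAD price', 'LOAD tax', 'PUSH 2', 'MUL', 'ADD']
```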

Eventually, the industry gave way to the first hard disk drives and microprocessors. Then programming languages started to become the predominant way for engineers to develop computer software. Computers were getting smaller and smaller thanks to advancements in electronic components. Instead of filling up entire rooms like the ENIAC, they were getting small enough to fit on tabletops.

Xerox Alto
Source: Ars Technica

The Xerox Alto was the first computer that resembled the computers we’re familiar with now. It was also the first computer to implement a graphical user interface that used icons, a mouse, and windows. You might remember that the sheer size and cost of historical computers made it almost impossible for an average family to own one; instead, they were usually found in military and university research facilities. When companies like Xerox started building machines at a relatively affordable price and in a smaller form factor, the consumer age of computing began.

Apple
Source: National Museum of American History

Then, in the 1970s, a young engineer named Steve Wozniak invented the Apple I, a single-board computer meant for hobbyists. With his friend Steve Jobs, he created a company called Apple Computer.

Their follow-up to the Apple I, the Apple II, was ready for the average consumer to use. The Apple II was a phenomenal success, selling for nearly two decades and giving a new generation of people access to personal computers. For the first time, computers became affordable for the middle class and helped bring computing technology into both the home and the office.

OS
In the 1980s, IBM introduced its personal computer. It was released with a primitive version of an operating system called MS-DOS, or Microsoft Disk Operating System.

Side Note: Modern operating systems don’t just have text anymore. They have beautiful icons, words, and images like what we see on our smartphones. It’s incredible how far we’ve come from the first operating system to the operating systems we use today.

Back to IBM’s PC: it was widely adopted and made more accessible to consumers, thanks to a partnership with Microsoft.

Microsoft

Microsoft, founded by Bill Gates, eventually created Microsoft Windows. For decades, it was the preferred operating system in the workplace and dominated the computing industry, because it could run on almost any compatible hardware.

Video Games
Not only were personal computers entering the household for the first time, but a new type of computing was also emerging: video games.

During the 1970s and 80s, coin-operated entertainment machines called arcade games became more and more popular. In 1972, a company called Atari developed one of the first coin-operated arcade games, Pong. Pong was such a sensation that people stood in line at bars and recreation centers for hours at a time to play it. Entertainment computers like Pong launched the video game era.

Eventually, Atari went on to launch the Video Computer System, which helped bring personal video game consoles into the home. Video games have contributed to the evolution of computers in a very real way; tell that to the next person who dismisses them as a toy.

Video games showed people that computers didn’t always have to be all work and no play; they were a great source of entertainment too. This was an important milestone for the computing industry, since at that time computers were primarily used in the workplace or at research institutions.

Open Source
With huge players in the market like Apple Macintosh and Microsoft Windows taking over the operating system space, a programmer by the name of Richard Stallman started developing a free, Unix-like operating system.

Unix: an operating system developed by Ken Thompson and Dennis Ritchie. But it wasn’t cheap, and it wasn’t available to everyone.

Stallman created an OS that he called GNU. It was meant to be free to use, with similar functionality to Unix. Unlike Windows or Macintosh, GNU wasn’t owned by a single company; its code was open source, which meant that anyone could modify and share it.

GNU didn’t evolve into a full operating system, but it laid the foundation for the formation of one of the largest open-source operating systems, Linux, which was created by Linus Torvalds.


Mobile Phone

By the early 90s, computers started getting even smaller. Then a real game changer made its way onto the scene: PDAs, or personal digital assistants, which allowed computing to go mobile. These mobile devices packed a portable media player, word processor, e-mail client, Internet browser, and more into one handy handheld device.

In the late 1990s, Nokia introduced a PDA with mobile phone functionality. This ignited an industry of pocketable computers, or, as we know them today, smartphones.