Who Invented the First Computer?

By: William Harris & Chris Pollette
Charles Babbage
Charles Babbage created the concept of a programmable computer. National Library of Wales/Wikimedia Commons

Key Takeaways

  • The concept of a computer dates back to Charles Babbage's mechanical difference and analytical engines, but the first electronic computer was the brainchild of Dr. John Vincent Atanasoff and his graduate student Clifford Berry, resulting in the Atanasoff-Berry Computer (ABC) by 1942.
  • World War II accelerated computer development, leading to machines like ENIAC for artillery calculations and Colossus for code-breaking; by 1951, the first commercial computer, UNIVAC, was built for the U.S. Census Bureau.
  • The evolution of personal computers began with prototypes like Hewlett-Packard's HP 9100A scientific calculator, Apple's Apple I and Apple II, and culminated in IBM's 5150 Personal Computer in 1981, which became a staple in businesses globally.

We could argue that the first computer was the abacus or its descendant, the slide rule, invented by William Oughtred in 1622. But many people consider English mathematician Charles Babbage's analytical engine to be the first computer resembling today's modern machines.


Before Babbage came along, a "computer" was a person, someone who literally sat around all day, adding and subtracting numbers and entering the results into tables. The tables then appeared in books, so other people could use them to complete tasks, such as launching artillery shells accurately or calculating taxes.

In fact, Babbage wrote that he was daydreaming over logarithmic tables during his time at Cambridge, sometime around 1812-1813, when he first imagined that a machine could do the job of a human computer. In July 1822, Babbage wrote a letter to the Royal Society proposing the idea that machines could do calculations based on a "method of differences." The Royal Society was intrigued and agreed to fund development on the idea. The first machine design that came out of these efforts was Babbage's first difference engine.

It was, in fact, a mammoth number-crunching project that inspired Babbage in the first place. In 1792 the French government had appointed Gaspard de Prony to supervise the creation of the Cadastre, a set of logarithmic and trigonometric tables. The French wanted to standardize measurements in the country and planned to use the tables to aid in those efforts to convert to the metric system. De Prony was in turn inspired by Adam Smith's famous work "Wealth of Nations." Smith wrote about how the division of labor improved efficiency when manufacturing pins. De Prony wanted to apply the division of labor to his mathematical project.

Unfortunately, once the 18 volumes of tables – with one more describing mathematical procedures – were complete, they were never published.

In 1819, Babbage visited the City of Light and viewed the unpublished manuscript with page after page of tables. If only, he wondered, there was a way to produce such tables faster, with less manpower and fewer mistakes. He thought of the many marvels generated by the Industrial Revolution. If creative and hardworking inventors could develop the cotton gin and the steam locomotive, then why not a machine to make calculations?

Babbage returned to England and decided to build just such a machine. His first vision was something he dubbed the difference engine, which worked on the principle of finite differences, or making complex mathematical calculations by repeated addition without using multiplication or division. He secured 1,500 pounds from the English government in 1823 and hired engineer Joseph Clement to begin construction on the difference engine.
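The method of finite differences is worth a brief illustration. Any polynomial of degree n has a constant nth difference, so once the first few values are known, every later value can be produced by addition alone. The sketch below (a modern Python rendering, not Babbage's mechanism) tabulates Babbage's own demonstration polynomial, f(x) = x² + x + 41, using nothing but repeated addition after the table is seeded:

```python
# The difference engine's trick: a degree-n polynomial can be tabulated
# with addition alone, because its nth difference is constant.

def tabulate(f, degree, count):
    """Tabulate f at x = 0, 1, 2, ... using repeated addition only."""
    # Seed the difference table from the first degree+1 values of f.
    values = [f(x) for x in range(degree + 1)]
    diffs = [values[:]]
    for _ in range(degree):
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    # State: f(x) and its successive differences at the current x.
    state = [row[0] for row in diffs]  # [f(0), delta f(0), delta^2 f(0), ...]
    out = []
    for _ in range(count):
        out.append(state[0])
        # Advance each column by adding the column below it -- pure addition,
        # which is all the engine's gearwheels had to do.
        for i in range(degree):
            state[i] += state[i + 1]
    return out

print(tabulate(lambda x: x * x + x + 41, degree=2, count=5))
# -> [41, 43, 47, 53, 61]
```

Only the initial differences require any multiplication (done once, by hand); after that, a crank turn per addition is enough, which is exactly what made the idea mechanizable.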

Clement was a well-respected engineer and suggested improvements to Babbage, who allowed Clement to implement some of his ideas. Unfortunately, in 1833 the two had a falling out over the terms of their arrangement. Clement quit, ending his work on the difference engine.

But, as you might have guessed, the story doesn't end there.


Charles Babbage and the Analytical Engine

Calculating machine
This analytical engine, conceived by Charles Babbage in 1834, was designed to calculate any mathematical formula and to have even higher powers of analysis than his original difference engine. This portion of the mill was under construction at the time of his death. SSPL/Getty Images

By the time Clement packed up his tools, Babbage had already started thinking of an even grander idea — the analytical engine, a new kind of mechanical computer that could make even more complex calculations, including multiplication and division. The British government, however, cut his funding, which was, after all, intended to produce the difference engine. The analytical engine is what so many people think of as the first computer.

The basic parts of the analytical engine resemble the components of any computer sold on the market today. It featured two hallmarks of any modern machine: a central processing unit or CPU and memory. Babbage, of course, didn't use those terms. He called the CPU the "mill." Memory was known as the "store." He also had a device — the "reader" — to input instructions, as well as a way to record, on paper, results generated by the machine. Babbage called this output device a printer, the precursor of inkjet and laser printers so common today.


Babbage's new invention existed almost entirely on paper. He kept voluminous notes and sketches about his computers — nearly 5,000 pages' worth — and although he never built a single production model of the analytical engine, he had a clear vision about how the machine would look and work. Borrowing the same technology used by the Jacquard loom, a weaving machine developed in 1804-05 that made it possible to create a variety of cloth patterns automatically, data would be entered on punched cards. Up to 1,000 50-digit numbers could be held in the computer's store. Punched cards would also carry the instructions, which the machine could execute out of sequential order. A single attendant would oversee the whole operation, but steam would power it, turning cranks, moving cams and rods and spinning gearwheels.

Unfortunately, the technology of the day couldn't deliver on Babbage's ambitious designs. It wasn't until 1991 that his ideas were finally translated into a functioning machine, when the Science Museum in London built his Difference Engine No. 2 to his exact specifications. It stands 11 feet long and 7 feet tall (more than 3 meters long and 2 meters tall), contains 8,000 moving parts and weighs 5 tons (4.5 metric tons). A copy of the machine was built and shipped to the Computer History Museum in Mountain View, California, where it remained on display until December 2010. Neither device would fit on a desktop, but both are unmistakable precursors to the modern PC.


Who Invented the First Modern Computer?

Apple I
1976's Apple I was the first computer sold with a fully assembled circuit board, though buyers still needed to add a case, power supply, keyboard and display. The Apple II, released in 1977, built some of those features in. SSPL/Getty Images

There are many differences between Babbage's difference and analytical engines and the machine sitting on your desktop now. Those machines are mechanical and yours is electronic. So, who invented the first electronic computer? As with most inventions, the digital computer was the work of many different people.

Like Babbage before him, Iowa State College (now Iowa State University) mathematics and physics professor Dr. John Vincent Atanasoff required a lot of computing power for his work. Even though he had one of the best calculators of the day, it still took a long time to do calculations. Also like Babbage, Atanasoff wanted to see if he could do better. In 1937 he went for a drive to clear his mind, and when he stopped for a drink, he decided what kind of device he would build. His machine would use electricity. And rather than the base-10 standard, his computer would use the binary system our modern computers use.
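Binary simply names the same quantities with two digits instead of ten, which maps naturally onto on/off electrical states. As a quick illustration (a modern sketch, not the ABC's actual circuitry), a decimal number can be converted to binary by repeated division by 2, collecting the remainders:

```python
# Base 2 vs. base 10: same numbers, different digits. Each binary digit
# corresponds to an on/off state, which is why it suits electronic machines.

def to_binary(n):
    """Convert a non-negative decimal integer to a binary string."""
    bits = []
    while n:
        bits.append(str(n % 2))  # remainder is the next bit, low to high
        n //= 2
    return "".join(reversed(bits)) or "0"

print(to_binary(13))  # -> "1101", i.e. 8 + 4 + 0 + 1
```

A decimal calculator needs mechanisms that distinguish ten states per digit; a binary machine needs only two, a charged or discharged capacitor in the ABC's case.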


Iowa State provided funding for the machine, and Atanasoff hired an exceptionally talented graduate student, Clifford Berry, to help him realize his vision. They showed the prototype to Iowa State officials, who then granted Atanasoff and Berry funding to build the real thing. By 1942 the Atanasoff-Berry Computer (or ABC) was ready.

World War II spurred the creation of many new computers to solve specific problems. One was ENIAC, designed to compute artillery range tables. Another was Colossus, used to break German codes at Bletchley Park in the U.K. In 1949, the world's first practical stored-program computer, the EDSAC, entered service. Unlike earlier computers, which were built to perform one specific task, the EDSAC could be reprogrammed to do many. In the early 1950s, engineers at the Massachusetts Institute of Technology (MIT) completed Whirlwind I, designed to train pilots. Project Whirlwind introduced magnetic core memory to the world.

The first commercial computer was 1951's UNIVAC (Universal Automatic Computer) built for the U.S. Census Bureau by the makers of ENIAC. It was huge, weighed 16,000 pounds (7,258 kilograms) and had 5,000 vacuum tubes. It rose to fame when it correctly predicted Dwight D. Eisenhower's landslide presidential victory when only a small percentage of votes had been counted. UNIVAC could do 1,000 calculations in a second, an amazing feat at the time.

In 1956, IBM's 305 RAMAC (Random Access Method of Accounting and Control) was the first computer with a hard drive. Piece by piece, the modern electronic computer was starting to come together.

In 1968, Douglas Engelbart demonstrated a prototype of the modern computer, which included a mouse and graphical user interface (windows, icons and a menu). This showed that the computer could be of benefit to more than academics and technical experts and reach the lay public.

Bill Hewlett and Dave Packard, two friends who met on a camping trip, began working in a garage in Palo Alto, California. Their first product was an oscillator to test audio equipment. Hewlett-Packard's HP 9100A scientific calculator, released in 1968, used the phrase "personal computer" in its advertising. The HP-85, released in 1980, was the company's first true personal computer.

Steve Wozniak and Steve Jobs both had experience at Hewlett-Packard. While still in school, Jobs landed an internship by cold-calling Bill Hewlett. Wozniak not only worked for HP but also offered the design for the Apple I personal computer to the company five times, and was turned down each time.

Eventually, the two Steves left HP to start their own company in a garage, just as Hewlett and Packard had. The Apple I launched in 1976, followed by the Apple II in 1977. The Apple I was the first "fully assembled" personal computer, though buyers still needed a case, power supply, keyboard, and display to go along with the fully assembled circuit board. The Apple II included a case with a keyboard, plus more RAM and color graphics.

IBM's 5150 Personal Computer, released in 1981, put computers on the desktops of businesses around the world. It came with a system unit, a keyboard and color graphics capability, and it ran the MS-DOS operating system from Microsoft. Throughout the 1980s, computers got less expensive and gained more features until they became indispensable to almost every home and business.


Who Invented the Computer FAQ

When was the first computer invented?
The first computer that resembled the modern machines we see today was conceived by Charles Babbage between 1833 and 1871. He worked on his analytical engine, a mechanical, general-purpose computer, for nearly 40 years, though it was never completed in his lifetime.
Is the abacus the first computer?
No, but it is the first known calculating device. The abacus was in use as early as 1100 B.C.E., and it is still used in some parts of Asia. An abacus consists of a rectangular frame holding thin parallel rods strung with beads for counting.
Who invented the laptop?
Adam Osborne built the first laptop in 1981 and named it "Osborne 1." Back then, it was priced at nearly $2,000 and came with a small built-in computer screen.
What is the first modern computer in the world?
The ENIAC computing system, developed by J. Presper Eckert and John Mauchly at the University of Pennsylvania, was completed in 1945. As the first general-purpose electronic computer, it was nearly 1,000 times faster than the electromechanical machines that preceded it. It weighed about 30 tons and used some 18,000 vacuum tubes.

Lots More Information

Related Articles

  • "Analytical Engine." Encyclopædia Britannica. Encyclopædia Britannica Online. Encyclopædia Britannica, April 4, 2022. (Dec. 14, 2022) http://www.britannica.com/EBchecked/topic/22628/Analytical-Engine
  • The Babbage Engine. Computer History Museum, online exhibit. (Dec. 14, 2022) http://www.computerhistory.org/babbage/
  • Campbell-Kelly, Martin. "Origin of Computing." Scientific American. Sept. 1, 2009. (Dec. 14, 2022) https://www.scientificamerican.com/article/origin-of-computing/
  • George, Aleta. "Booting Up a Computer Pioneer's 200-Year-Old Design." Smithsonian magazine. April 1, 2009. (Dec. 14, 2022) https://www.smithsonianmag.com/science-nature/booting-up-a-computer-pioneers-200-year-old-design-122424896/
  • Kim, Eugene Eric and Betty Alexandra Toole. "Ada and the First Computer." Scientific American. May 1, 1999. (Dec. 14, 2022) https://www.scientificamerican.com/article/ada-and-the-first-computer/
  • Park, Edwards. "The Object at Hand." Smithsonian magazine. October 1995. (Dec. 14, 2022) https://www.smithsonianmag.com/articles/the-object-at-hand-1-102561973
  • Stoll, Cliff. "When Slide Rules Ruled." Scientific American. May 1, 2006. (Dec. 14, 2022) https://www.scientificamerican.com/article/when-slide-rules-ruled/