The world has seen amazing technological feats in the last few years, but we wouldn't be where we are today without some serious brainpower in the past. Charles Babbage, for example, designed a programmable mechanical computer of gears and levers in the 19th century. He died without building a full version of it, but his designs inspired future computer engineers. He’s a favorite figure among the steampunk crowd.
Father of computer science, creator of the Turing test for judging whether a machine can exhibit human-like intelligence and breaker of codes for the Allies during World War II -- Alan Turing was a legitimate technological heavyweight.
Ada Lovelace, the Enchantress of Numbers, was the daughter of the famed poet Lord Byron. Working with Babbage on his Analytical Engine, she saw that numbers could represent everything from music to images. She wrote algorithms and programs before there were any computers to run them!
A U.S. Navy officer and computer scientist, Grace Murray Hopper was one of the earliest computer programmers. Her work led to the development of programming languages like COBOL. According to one legend, Hopper popularized the terms “debugging” and “computer bug” after she removed an actual moth from the inner workings of a computer.
Gordon Moore, the co-founder of Intel, observed that thanks to improvements in manufacturing technology and economic efficiencies, the number of transistors on a microchip doubles roughly every two years. We call it Moore’s law, and it’s why your computer is obsolete before you even get it out of the box.
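That doubling rule is simple compound growth, and a quick sketch shows how fast it compounds. The only historical figure assumed below is the roughly 2,300 transistors of Intel's 1971 4004 chip; everything else is just the math of the law itself.

```python
# A back-of-the-envelope sketch of Moore's law: counts double every
# two years, so projecting forward is just repeated doubling.

def moores_law(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming a fixed doubling period."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# 1971 to 2021 is 50 years, i.e. 25 doublings of the ~2,300-transistor 4004:
projected = moores_law(2300, 1971, 2021)
print(f"Projected 2021 transistor count: {projected:,.0f}")  # roughly 77 billion
```

Twenty-five doublings turn a few thousand transistors into tens of billions, which is the right order of magnitude for a modern high-end chip.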
You are on this site because of Tim Berners-Lee. He invented the World Wide Web and launched the first Web page in 1990. Unlike every Web page that followed for the next 10 years, it didn’t have an “Under Construction” image on it.
While working on ARPANET, a predecessor of the Internet, Vinton Cerf and Robert Kahn designed the Transmission Control Protocol (TCP) and the Internet Protocol (IP). These protocols are the rules computers follow to send information across the Internet.
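Those rules are concrete enough to watch in action. Here's a minimal sketch using Python's standard socket API: one side listens, the other connects, and the bytes arrive intact and in order -- the guarantee TCP layers on top of IP. The message text is just an example, and port 0 asks the operating system for any free port so the sketch is self-contained.

```python
import socket
import threading

def run_server(server_sock, results):
    # Accept one connection and record whatever the client sends.
    conn, _addr = server_sock.accept()
    with conn:
        results.append(conn.recv(1024))

# Listen on the loopback interface; port 0 means "any free port."
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=run_server, args=(server, received))
t.start()

# The client side: open a TCP connection and send a few bytes.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"Hello, ARPANET descendants!")

t.join()
server.close()
print(received[0].decode())  # prints the message the client sent
```

Every Web page, e-mail and video stream you use rides on some variation of this same connect-send-receive exchange.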
Back in 1963, Douglas Engelbart designed the first computer mouse at SRI International. In 1968, Engelbart headed a demonstration -- now remembered as the “Mother of All Demos” -- that showed off not only the mouse but also a graphical user interface (GUI). We have him to thank for the way we use computers today.
The British author Arthur C. Clarke was one of the fathers of science fiction, known for works like 2001: A Space Odyssey. He also had a huge impact on technology. In 1945, he described an idea for a satellite-based communication system that relied on satellites in geostationary orbit. Today, such satellites serve as a worldwide communications network.
William Gibson is famous for his speculative fiction and for coining the term “cyberspace” way back in 1982. His fiction predicted technologies ranging from the World Wide Web to virtual reality. He failed to predict that cat videos would play such an important role, though.
Working independently, Jack Kilby and Robert Noyce each discovered how to build complete circuits directly on semiconductor material. The integrated circuit became the basis of the electronics and computer industries. Without them, our smartphones would require a wheelbarrow to carry them.
Fujio Masuoka invented flash memory -- a way to store data on semiconductor chips rather than on a platter-based hard drive. Masuoka worked for Toshiba and should have become a tech superstar overnight, but the company failed to capitalize on his invention and Intel took up the slack. For years, Toshiba refused to admit the embarrassing fact that it had ignored a billion-dollar idea.
In the 1950s, George Devol filed patents and laid the groundwork for a technology that would change the manufacturing industry: the industrial robot. Called the Unimate, the robot used hydraulics to lift heavy loads and complete repetitive actions. The first robot in the workforce punched the clock at General Motors.
As a co-inventor of Ethernet, Bob Metcalfe pioneered advances in local area networks. But he also proposed Metcalfe’s law: The value of a telecommunications network is proportional to the square of the number of connected users. Simply put, as you add users to a network, its value grows much faster than its size.
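The square relationship is easy to see with a few numbers. In the sketch below the proportionality constant is arbitrarily set to 1, so only the ratios between values mean anything:

```python
# Metcalfe's law: network value grows with the square of the user count,
# so doubling the users quadruples the value.

def metcalfe_value(users):
    """Relative network value, with the proportionality constant set to 1."""
    return users ** 2

for n in (10, 20, 40):
    print(f"{n:>3} users -> relative value {metcalfe_value(n)}")
# Going from 10 to 20 users multiplies the value by 4, not 2.
```

The intuition behind the square: with n users there are roughly n² possible pairwise connections, and each potential connection adds value.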
If you’re connected to the Internet over a Wi-Fi network, you have Vic Hayes to thank for it. As different companies began to develop wireless networking technologies, it became clear that a standard approach was needed to avoid a tangle of incompatible proprietary systems. As chair of the IEEE 802.11 working group, Hayes established those standards and secured cooperation from governments to set aside parts of the wireless spectrum for Wi-Fi networks.
Martin Cooper, a vice president at Motorola, led a team in the 1970s that had a crazy goal -- to create a portable cellular telephone. By 1973, he and his team had created the first cell phone, which weighed in at 2.5 pounds (1.1 kilograms). His first call was to a corporate rival, Joel S. Engel of Bell Labs, making it the first crank call made on a cell phone!