Computing History — Brief Timeline and Facts
This is not intended as a complete history of the evolution of the computer, but rather as a highlighting of specific, important events and individuals.
- Approximately 3000 B.C.
- Abacus, the first calculating device, is invented. There is some question as to its origin: China or the Indus River Valley area.
The 1600s
A time of great philosophical and scientific progress, including important milestones in mathematics that would eventually lead toward the invention of the computer.
- Scottish mathematician John Napier invented logarithms.
- French mathematician Blaise Pascal created a machine that could add and subtract (in other words, a basic calculator).
- German mathematician Gottfried Wilhelm von Leibniz developed a calculator (the Stepped Reckoner) designed to multiply and divide as well as add and subtract.
The Industrial Revolution
The 1700s and early 1800s were a time of great political and social unrest (examples: the American and French Revolutions). As a result, mathematics and science took a back seat to other endeavors until the political and social climate settled down and the Industrial Revolution began.
- Charles Babbage designed the Difference Engine, a machine that evaluated polynomials of the form ax² + bx + c to an accuracy of six places using the method of finite differences.
- Babbage followed the Difference Engine with his design for the Analytical Engine, a general-purpose, stored-program machine intended to perform any type of arithmetic calculation. The Analytical Engine was sound in theory, but Babbage was never able to build one because the technology of the day could not manufacture its gears and wheels to the required precision. (Much later, in 1991, the Science Museum in London built a complete Difference Engine from Babbage's plans and showed that it worked exactly as he had designed it.) His collaborator, Ada Lovelace (daughter of the English poet Lord Byron), is often regarded as the first computer programmer for her descriptions of the sequences of operations the Analytical Engine would perform to carry out calculations.
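As an aside, the trick the Difference Engine mechanized can be sketched in a few lines of modern code: once the initial differences of a polynomial are known, every further value follows by addition alone, with no multiplication. The function below is an illustrative reconstruction of the idea, not anything from Babbage's own notes.

```python
def tabulate_quadratic(a, b, c, n):
    """Tabulate f(x) = a*x^2 + b*x + c for x = 0..n-1 using only addition,
    the method of finite differences that the Difference Engine mechanized."""
    f = c                        # f(0)
    d1 = a + b                   # first difference: f(1) - f(0)
    d2 = 2 * a                   # second difference, constant for a quadratic
    values = []
    for _ in range(n):
        values.append(f)
        f += d1                  # next value by addition only
        d1 += d2                 # update the first difference
    return values

print(tabulate_quadratic(2, 3, 1, 5))   # [1, 6, 15, 28, 45]
```

Each crank of the engine performed exactly these two additions, which is why purely mechanical gears and wheels were (in principle) enough.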
- Early 1800s
- Joseph Jacquard invented the Jacquard loom, which wove intricately patterned cloth based on instructions contained on punched cards, making it the first stored-instruction machine actually built.
- George Boole publishes his formulation of Boolean algebra, the two-valued logic that forms the logical basis for computer hardware.
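The connection between Boole's algebra and hardware is direct: his basic operations correspond to the AND, OR, and NOT gates from which all digital circuits are built. A minimal sketch (the gate-building example is illustrative):

```python
# Boole's three basic operations on the values 0 and 1, which correspond
# directly to the AND, OR, and NOT gates of computer hardware.
AND = lambda x, y: x & y
OR  = lambda x, y: x | y
NOT = lambda x: 1 - x

# Any other logic function can be composed from these; for example, XOR:
XOR = lambda x, y: OR(AND(x, NOT(y)), AND(NOT(x), y))

for x in (0, 1):
    for y in (0, 1):
        print(x, y, XOR(x, y))   # truth table: 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```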
- Herman Hollerith develops machines for sorting punched cards based on patterns formed by the holes and for counting or tabulating data from these cards.
- The Tabulating Machine Company formed by Herman Hollerith to manufacture and sell his machines.
- The Tabulating Machine Company, after mergers, was renamed International Business Machines Corporation (IBM) in 1924.
The 1930s and World War II
Provided the real push for the invention of the modern-day computer
- Alan Turing develops his concept of the Turing machine, a hypothetical construct merging the idea of a machine, the action of the mind, and logical instructions. It formed a theoretical basis for the modern-day computer.
- German engineer Konrad Zuse completes the Z3 in 1941, the first working programmable, fully automatic computer (built from electromechanical relays).
- Development of the MARK I computer in a joint IBM/Harvard University effort led by Howard Aiken. The MARK I contained some 760,000 parts, 500 miles of wiring, and 3,000 electromechanical relays used as switches (because it relied on relays rather than electronics, the MARK I is not considered an electronic computer). It occupied an entire room on the Harvard campus!
- Navy lieutenant Grace Murray Hopper logs the first computer "bug" on September 9 at 15:45 hours: a small moth that had been trapped in one of the electromechanical relays of the MARK II (the successor to the MARK I, naturally).
- Completion of ENIAC, the first fully electronic computer, developed by researchers J. Presper Eckert and John Mauchly. ENIAC used vacuum tubes to replace the unreliable electromechanical relays; it contained 18,000 vacuum tubes and some 80,000 resistors and capacitors, weighed 30 tons, and took up about 1,800 square feet of space. ENIAC did not store its programs, and thus had to be rewired for each job it was to perform. It is now on exhibit at the Smithsonian in Washington.
- Completion of UNIVAC I, the first commercially available computer (all previous computers had been Department of Defense research projects). UNIVAC I was, ironically, sold to the Census Bureau.
- Grace Murray Hopper produces the first programming language translator, which converted assembly language into the machine language (0s and 1s) the computer actually executes. This allowed programmers to use mnemonics (abbreviations) in place of the raw bit patterns that tell the computer what to do.
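The idea behind such a translator can be shown with a toy assembler. The mnemonics, opcodes, and instruction format below are invented for illustration and do not belong to any real machine:

```python
# Toy assembler: translate mnemonic instructions into machine-code bit patterns.
# The instruction set (4-bit opcode + 4-bit operand) is invented for illustration.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

def assemble(program):
    """Translate lines like 'ADD 3' into strings of 0s and 1s."""
    machine_code = []
    for line in program:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append(OPCODES[mnemonic] + format(operand, "04b"))
    return machine_code

print(assemble(["LOAD 2", "ADD 3", "STORE 4", "HALT"]))
# ['00010010', '00100011', '00110100', '11110000']
```

Before translators like this, programmers had to write the bit patterns on the right by hand; the mnemonics on the left are what made programming humanly manageable.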
- Introduction of FORTRAN I (FORmula TRANslator), the first high-level programming language. FORTRAN was developed by scientists for science and engineering applications, and its strength has always been "number crunching."
The Second Generation
- Replacement of vacuum tubes by transistors (solid-state devices), which increased the reliability and speed of computers while decreasing their size.
- Introduction of COBOL (COmmon Business Oriented Language), first business oriented high level language. Introduction of LISP (LISt Processing language) for artificial intelligence applications.
- Development of SABRE, the first airline passenger reservation system, in a joint IBM/American Airlines effort.
- BASIC (Beginner's All-purpose Symbolic Instruction Code) introduced as an easy-to-learn programming language. PL/I (Programming Language/I) introduced as a synthesis of the best of FORTRAN and COBOL (and is a huge and extremely complex language as a result).
The Third Generation & Beyond
- Introduction of the integrated circuit (IC) — transistors on a chip to replace an entire circuit board of separate transistors wired together to perform some function. Again, increased reliability and speed of computers while decreasing size.
- Development of the UNIX operating system at Bell Laboratories
- The Department of Defense creates ARPANET, the precursor to the Internet, by connecting together four computers.
- Introduction of the first microprocessor, the Intel 4004, containing an entire CPU on a chip. Led to supercomputers and PCs (both ends of the computing spectrum!).
- Introduction of Pascal, the first strongly typed programming language.
- Decision by Judge Earl Richard Larson, in the lawsuit Honeywell v. Sperry Rand, that the patent issued for the ENIAC computer was invalid, and that the ABC (Atanasoff-Berry Computer) invented by John Atanasoff was the first electronic digital computer.
- The C programming language was developed at Bell Labs by Dennis Ritchie, building on earlier work with Ken Thompson on a spare computer. C is considered a middle-level language: it has most features of high-level languages but also gives the fine-grained control over the machine that assembly language provides.
The Personal Computer Era
- The Altair 8800 kit computer, widely regarded as the first personal computer.
- Smalltalk, often regarded as the first purely object-oriented language, invented at Xerox PARC.
- The first IBM PC introduced
- dBASE, the first database program for personal computers, introduced.
- Pixar is founded
- Tim Berners-Lee develops HyperText Markup Language (HTML), providing the foundation for the development of the World Wide Web.
- Microsoft releases Windows 3.0, the first stable and commercially successful version of Windows.
- Introduction of the Linux operating system, a UNIX-derived program initially developed by Linus Torvalds
- Marc Andreessen and Eric Bina at NCSA release Mosaic, the first widely popular graphical web browser.
- Intel releases the first Pentium microprocessor CPU
- Introduction of the Java programming language