The history of computer science began long before the modern discipline of computer science emerged, usually appearing in forms like mathematics or physics. Developments in previous centuries foreshadowed the discipline that we now know as computer science.[1] This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of worldwide trade and culture.[2]
Early history
The earliest known tool for use in computation was the abacus, developed in the period between 2700–2300 BCE in Sumer (modern Iraq).[3] The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[4]:11 It was originally used by drawing lines in sand and placing pebbles on them. Abaci of a more modern design are still used as calculation tools today, such as the Chinese abacus.[5]
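The column-per-order-of-magnitude principle of such a positional system can be illustrated with a short sketch; this is a modern illustration of base-60 place value, not a reconstruction of the abacus itself:

```python
# Decompose an integer into base-60 (sexagesimal) digits, most significant
# first, mirroring how each abacus column held one order of magnitude.
def to_sexagesimal(n: int) -> list[int]:
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % 60)  # value in the current column
        n //= 60               # move to the next, higher-order column
    return digits[::-1]

print(to_sexagesimal(3661))  # [1, 1, 1], i.e. 1*60**2 + 1*60 + 1
```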
In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3959 rules, known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations, and recursion.[6]
Code-breaking algorithms have existed for centuries. In the 9th century, the Arab mathematician Al-Kindi described a cryptographic algorithm for deciphering encrypted code, in A Manuscript on Deciphering Cryptographic Messages. He gave the first description of cryptanalysis by frequency analysis, the earliest code-breaking algorithm.[7] Al-Kindi is thus regarded as the first codebreaker in history.[8] His breakthrough work was influenced by Al-Khalil (717–786), who wrote the Book of Cryptographic Messages, which contains the first use of permutations and combinations to list all possible Arabic words with and without vowels.[9]
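The core of frequency analysis, counting how often each symbol occurs in a ciphertext and matching the counts against the known letter frequencies of the language, is easy to sketch; the ciphertext below is an illustrative Caesar-shifted sample, not a historical one:

```python
from collections import Counter

# Frequency analysis in the spirit of Al-Kindi: tabulate symbol frequencies
# in a ciphertext; the most common symbols likely stand for the language's
# most common letters (e.g. 'e' and 't' in English).
def frequency_table(ciphertext: str) -> list[tuple[str, float]]:
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    counts = Counter(letters)
    return [(ch, n / len(letters)) for ch, n in counts.most_common()]

sample = "wkh txlfn eurzq ira mxpsv ryhu wkh odcb grj"  # Caesar shift of an English pangram
print(frequency_table(sample)[:3])  # most frequent ciphertext letters first
```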
Mechanical analog computer devices appeared in the medieval Islamic world and were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[10] and the torquetum by Jabir ibn Aflah.[11] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[12][13] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers (9th century),[14] and Al-Jazari's programmable humanoid automata.[15] Technological artifacts of similar complexity appeared in 14th century Europe, with mechanical astronomical clocks.[16]
In 1206, Arab engineer Al-Jazari invented the castle clock, the first programmable analog computer.[17] It was a complex device, about 11 feet high, with multiple functions alongside timekeeping. It included a display of the zodiac and the solar and lunar orbits, and a pointer in the shape of a crescent moon which travelled across the top of a gateway, moved by a hidden cart, causing automatic doors to open every hour, each revealing a mannequin.[18] The length of day and night could be re-programmed every day to account for their changing lengths throughout the year, and the clock also featured five robotic musicians who automatically played music when moved by levers operated by a hidden camshaft attached to a water wheel.[17] Other components of the castle clock included a main reservoir with a float, a float chamber and flow regulator, a plate and valve trough, two pulleys, a crescent disc displaying the zodiac, and two falcon automata dropping balls into vases.[19]
Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by the Greek mathematician Hero of Alexandria.[20] Then in 1672, Gottfried Wilhelm Leibniz began work on the Stepped Reckoner, which he completed in 1694.[21] Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.
Before the 1920s, computers (sometimes computors) were human clerks who performed computations, usually working under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments, and most of them were women.[22][23][24][25] Some performed astronomical calculations for calendars, others ballistic tables for the military.[26]
Binary logic
In 1702, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, a complete system that allowed computational processes to be mathematically modeled.[27]
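Boole's two-valued system maps directly onto the logical operations of any modern programming language; a minimal sketch:

```python
# Boolean algebra in miniature: two values and three basic operations
# (conjunction, disjunction, negation) suffice to express any truth function.
for a in (False, True):
    for b in (False, True):
        print(a, b, a and b, a or b, not a)

# De Morgan's law, verified by exhausting all four cases:
assert all((not (a and b)) == ((not a) or (not b))
           for a in (False, True) for b in (False, True))
```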
By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, including weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.[27]
Emergence of a discipline
Charles Babbage and Ada Lovelace
Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into reality, Babbage designed a calculator to compute numbers up to 8 decimal places. Continuing with the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and there would be a form of sequential control, meaning that one operation would be carried out before another in such a way that the machine would produce an answer and not fail. This machine was to be known as the “Analytical Engine”, the first true representation of the modern computer.[28]
Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius, a result of the mathematically heavy tutoring regimen her mother assigned to her as a young girl. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his “Analytical Engine”, the first mechanical computer.[29] During her work with Babbage, Ada Lovelace became the designer of the first computer algorithm, which could compute Bernoulli numbers.[30] Moreover, Lovelace’s work with Babbage led her to predict that future computers would not only perform mathematical calculations but also manipulate symbols, mathematical or not.[31] While she was never able to see the results of her work, as the “Analytical Engine” was not built in her lifetime, her efforts, beginning in the 1840s, did not go unnoticed.[32]
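Lovelace's Note G described a sequence of engine operations for computing Bernoulli numbers; a modern sketch of the same computation, using the standard recurrence rather than her exact sequence of operations, might look like this:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers via the standard recurrence:
#   B_0 = 1,  B_m = -1/(m+1) * sum_{j<m} C(m+1, j) * B_j
def bernoulli(n: int) -> list[Fraction]:
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

# B_0..B_6 = 1, -1/2, 1/6, 0, -1/30, 0, 1/42
print(bernoulli(6))
```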
Alan Turing and the Turing machine
The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.
In 1936, Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing. This became the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.
In 1936, Alan Turing also published his seminal work on Turing machines, an abstract digital computing machine now simply referred to as the universal Turing machine. This machine established the principle of the modern computer and was the birthplace of the stored-program concept that almost all modern-day computers use.[33] These hypothetical machines were designed to determine formally, mathematically, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete a task, the task is considered Turing computable; a system that can simulate any Turing machine is, in turn, called Turing complete.[34]
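A Turing machine can be simulated in a few lines of code. The machine below is an illustrative one that inverts a tape of bits, not Turing's original construction; it assumes the head never moves left of the starting cell:

```python
# Minimal Turing machine simulator: a tape, a head, a state, and a transition
# table mapping (state, symbol) -> (symbol to write, head move, next state).
def run(tape: list[str], transitions: dict) -> list[str]:
    head, state = 0, "start"
    while state != "halt":
        if head == len(tape):
            tape.append("_")  # extend the tape with a blank on demand
        symbol = tape[head]
        tape[head], move, state = transitions[(state, symbol)]
        head += 1 if move == "R" else -1
    return tape

# An illustrative machine that flips every bit, then halts on the blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(list("1011"), flip))  # ['0', '1', '0', '0', '_']
```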
The Los Alamos physicist Stanley Frankel has described John von Neumann's view of the fundamental importance of Turing's 1936 paper, in a letter:[33]
I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936… Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing...
Akira Nakashima and switching circuit theory
Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with NEC engineer Akira Nakashima's switching circuit theory in the 1930s. From 1934 to 1936, Nakashima published a series of papers showing that the two-valued Boolean algebra, which he discovered independently (he was unaware of George Boole's work until 1938), can describe the operation of switching circuits.[35][36][37][38] This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers. Switching circuit theory provided the mathematical foundations and tools for digital system design in almost all areas of modern technology.[38]
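The observation can be made concrete with a small sketch: a closed switch is a true value, an open switch false, switches in series behave as AND, and switches in parallel as OR:

```python
# Switching circuits as Boolean algebra: True = closed switch, False = open.
def series(*switches):    # current flows only if every switch is closed: AND
    return all(switches)

def parallel(*switches):  # current flows if any branch is closed: OR
    return any(switches)

# Switch a in series with the parallel pair (b, c): a AND (b OR c).
a, b, c = True, False, True
print(series(a, parallel(b, c)))  # True: current flows through a and c
```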
Nakashima's work was later cited and elaborated on in Claude Elwood Shannon's seminal 1937 master's thesis "A Symbolic Analysis of Relay and Switching Circuits".[37] While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. His thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
Early computer hardware
The world's first electronic digital computer, the Atanasoff–Berry computer, was built on the Iowa State campus from 1939 through 1942 by John V. Atanasoff, a professor of physics and mathematics, and Clifford Berry, an engineering graduate student.
In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3. In 1998, it was shown to be Turing-complete in principle.[39][40] Zuse also developed the S2 computing machine, considered the first process control computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[41]
In 1948, the Manchester Baby was completed; it was the world's first electronic digital computer that ran programs stored in its memory, like almost all modern computers.[33] The influence on Max Newman of Turing's seminal 1936 paper on Turing machines, together with Turing's logico-mathematical contributions to the project, was crucial to the successful development of the Baby.[33]
In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.[33][42] Turing's design for ACE had much in common with today's RISC architectures, and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer, which was enormous by the standards of his day.[33] Had Turing's ACE been built as planned and in full, it would have been in a different league from the other early computers.[33]
The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II.[43] The coining of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident', along with the insect and the notation "First actual case of bug being found" (see software bug for details).[43]
John von Neumann and the von Neumann architecture
In 1946, a model for computer architecture was introduced that became known as the von Neumann architecture. Since 1950, the von Neumann model has provided uniformity in subsequent computer designs. The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In the von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched, or to the ALU if data is being fetched.[44]
With von Neumann architecture, main memory along with the accumulator (the register that holds the result of logical operations)[45] are the two memories that are addressed. Operations can be carried out as simple arithmetic (performed by the ALU: addition, subtraction, multiplication, and division), as conditional branches (more commonly seen now as "if" statements or "while" loops; the branches serve as "go to" statements), and as logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. Von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR" (instruction register), the "IBR" (instruction buffer register), the "MQ" (multiplier quotient register), the "MAR" (memory address register), and the "MDR" (memory data register).[44] The architecture also uses a program counter ("PC") to keep track of where in the program the machine is.[44]
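The fetch-decode-execute cycle this design implies can be illustrated with a toy machine in which instructions and data share one memory; the three-instruction set below is invented purely for illustration and is far simpler than the register set just described:

```python
# A toy von Neumann machine: program and data live in the same memory, and a
# program counter (PC) tracks the next instruction to fetch.
def run(memory: list) -> int:
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, arg = memory[pc]        # fetch the instruction the PC points at
        pc += 1
        if op == "LOAD":            # decode and execute
            acc = memory[arg]       # data is fetched from the same memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "HALT":
            return acc

# Addresses 0-2 hold instructions; addresses 3-4 hold data.
memory = [("LOAD", 3), ("ADD", 4), ("HALT", 0), 20, 22]
print(run(memory))  # 42
```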
Transistors and the computer revolution
The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948.[46][47] In 1953, the University of Manchester built the first transistorized computer, called the Transistor Computer.[48] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[49]
The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.[50] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[49] With its high scalability,[51] and much lower power consumption and higher density than bipolar junction transistors,[52] the MOSFET made it possible to build high-density integrated circuits.[53][54] The MOSFET later led to the microcomputer revolution,[55] and became the driving force behind the computer revolution.[56][57] The MOSFET is the most widely used transistor in computers,[58][59] and is the fundamental building block of digital electronics.[60]
See also
- Computer Museum
- History of computing
- History of computing hardware
- History of software
- List of computer term etymologies, the origins of computer science words
- List of prominent pioneers in computer science
- Timeline of algorithms
- History of personal computers
- Women in computing
- Timeline of women in computing
References
- ↑ Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Chapman Hall.
- ↑ "History of Computer Science". uwaterloo.ca.
- ↑ Boyer, Carl B.; Merzbach, Uta C. (1991). A History of Mathematics (2nd ed.). John Wiley & Sons, Inc. pp. 252–253. ISBN 978-0-471-54397-8.
- ↑ Ifrah, Georges (2001). The Universal History of Computing: From the Abacus to the Quantum Computer. John Wiley & Sons. ISBN 978-0-471-39671-0.
- ↑ Bellos, Alex (2012-10-25). "Abacus adds up to number joy in Japan". The Guardian (London). https://www.theguardian.com/science/alexs-adventures-in-numberland/2012/oct/25/abacus-number-joy-japan. Retrieved 2013-06-25.
- ↑ Sinha, A. C. (1978). "On the status of recursive rules in transformational grammar". Lingua. 44 (2–3): 169–218. doi:10.1016/0024-3841(78)90076-1.
- ↑ Dooley, John F. (2013). A Brief History of Cryptology and Cryptographic Algorithms. Springer Science & Business Media. pp. 12–3. ISBN 9783319016283.
- ↑ Sahinaslan, Ender; Sahinaslan, Onder (2 April 2019). "Cryptographic methods and development stages used throughout history". AIP Conference Proceedings. 2086 (1): 030033. Bibcode:2019AIPC.2086c0033S. doi:10.1063/1.5095118. ISSN 0094-243X.
Al-Kindi is considered the first code breaker
- ↑ Broemeling, Lyle D. (1 November 2011). "An Account of Early Statistical Inference in Arab Cryptology". The American Statistician. 65 (4): 255–257. doi:10.1198/tas.2011.10191. S2CID 123537702.
- ↑ "Islam, Knowledge, and Science". Islamic Web. Retrieved 2017-11-05.
- ↑ Lorch, R. P. (1976), "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus, 20 (1): 11–34, Bibcode:1976Cent...20...11L, doi:10.1111/j.1600-0498.1976.tb00214.x
- ↑ Simon Singh, The Code Book, pp. 14-20
- ↑ "Al-Kindi, Cryptography, Codebreaking and Ciphers". Retrieved 2007-01-12.
- ↑ Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory, 36 (5): 589–603, doi:10.1016/S0094-114X(01)00005-2..
- ↑ Ancient Discoveries, Episode 11: Ancient Robots, History Channel, archived from the original on March 1, 2014, retrieved 2008-09-06
- ↑ In search of lost time, Jo Marchant, Nature 444, #7119 (November 30, 2006), pp. 534–538, doi:10.1038/444534a.
- ↑ 17.0 17.1 Ancient Discoveries, Episode 11: Ancient Robots, History Channel, retrieved 2008-09-06
- ↑ Howard R. Turner (1997), Science in Medieval Islam: An Illustrated Introduction, p. 184. University of Texas Press, ISBN 0292781490.
- ↑ Salim Al-Hassani (13 March 2008). "How it Works: Mechanism of the Castle Clock". FSTC. Retrieved 2008-09-06.
- ↑ "History of Computing Science: The First Mechanical Calculator". eingang.org.
- ↑ Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their history and development (PDF). Massachusetts Institute of Technology and Tomash Publishers., p.38-42, translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
- ↑ Light, Jennifer S. (1999-07-01). "When Computers Were Women". Technology and Culture. 40 (3): 455–483. ISSN 1097-3729.
- ↑ Kiesler, Sara; Sproull, Lee; Eccles, Jacquelynne S. (1985-12-01). "Pool Halls, Chips, and War Games: Women in the Culture of Computing". Psychology of Women Quarterly. 9 (4): 451–462. doi:10.1111/j.1471-6402.1985.tb00895.x. ISSN 1471-6402.
- ↑ Fritz, W. B. (1996). "The women of ENIAC - IEEE Xplore Document". IEEE Annals of the History of Computing. 18 (3): 13–28. doi:10.1109/85.511940.
- ↑ Gürer, Denise (2002-06-01). "Pioneering Women in Computer Science". SIGCSE Bull. 34 (2): 175–180. doi:10.1145/543812.543853. ISSN 0097-8418.
- ↑ Grier 2013, p. 138.
- ↑ 27.0 27.1 Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press.
- ↑ "Charles Babbage". Encyclopædia Britannica In. Retrieved 2013-02-20.
{{cite web}}
: Unknown parameter|encyclopedia=
ignored (help) - ↑ Evans 2018, p. 16.
- ↑ Evans 2018, p. 21.
- ↑ Evans 2018, p. 20.
- ↑ Isaacson, Betsy (2012-12-10). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post. http://www.huffingtonpost.com/2012/12/10/google-doodle-ada-lovelace_n_2270668.html. Retrieved 2013-02-20.
- ↑ 33.0 33.1 33.2 33.3 33.4 33.5 33.6 "Turing's Automatic Computing Engine". The Modern History of Computing. Metaphysics Research Lab, Stanford University. 2017.
- ↑ Barker-Plummer, David (1995-09-14). "Turing Machines". The Stanford Encyclopedia of Philosophy. Retrieved 2013-02-20.
- ↑ History of Research on Switching Theory in Japan, IEEJ Transactions on Fundamentals and Materials, Vol. 124 (2004) No. 8, pp. 720-726, Institute of Electrical Engineers of Japan
- ↑ Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics, IPSJ Computer Museum, Information Processing Society of Japan
- ↑ 37.0 37.1 Radomir S. Stanković (University of Niš), Jaakko T. Astola (Tampere University of Technology), Mark G. Karpovsky (Boston University), Some Historical Remarks on Switching Theory, 2007, CiteSeerX 10.1.1.66.1248
- ↑ 38.0 38.1 Radomir S. Stanković, Jaakko Astola (2008), Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory, TICSP Series #40, Tampere International Center for Signal Processing, Tampere University of Technology
- ↑ Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing. 20 (3): 51–54. doi:10.1109/85.707574.
- ↑ Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer". Archived from the original on 2014-07-14.
- ↑ Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010
- ↑ "BBC News – How Alan Turing's Pilot ACE changed computing". BBC News. May 15, 2010. http://news.bbc.co.uk/2/hi/technology/8683369.stm.
- ↑ 43.0 43.1 "The First "Computer Bug"" (PDF). CHIPS. United States Navy. 30 (1): 18. January–March 2012.
- ↑ 44.0 44.1 44.2 Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 1–13. ISBN 978-0-521-65168-4.
- ↑ "Accumlator" Def. 3. Oxford Dictionaries.
- ↑ Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University Press. ISBN 9781139643771.
- ↑ Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 Volumes. John Wiley & Sons. p. 14. ISBN 9783527340538.
- ↑ Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
- ↑ 49.0 49.1 Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
- ↑ "1960: Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine: A Timeline of Semiconductors in Computers. Computer History Museum. Retrieved August 31, 2019.
- ↑ Motoyoshi, M. (2009). "Through-Silicon Via (TSV)" (PDF). Proceedings of the IEEE. 97 (1): 43–48. doi:10.1109/JPROC.2008.2007462. ISSN 0018-9219.
- ↑ "Transistors Keep Moore's Law Alive". EETimes. 12 December 2018. https://www.eetimes.com/author.asp?section_id=36&doc_id=1334068. Retrieved 18 July 2019.
- ↑ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
- ↑ Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
- ↑ Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610.
The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution.
- ↑ Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETs. Cambridge University Press. p. vii. ISBN 9781107434493.
- ↑ "Remarks by Director Iancu at the 2019 International Intellectual Property Conference". United States Patent and Trademark Office. June 10, 2019. Retrieved 20 July 2019.
- ↑ "Dawon Kahng". National Inventors Hall of Fame. Retrieved 27 June 2019.
- ↑ "Martin Atalla in Inventors Hall of Fame, 2009". Retrieved 21 June 2013.
- ↑ "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
Sources
- Evans, Claire L. (2018). Broad Band: The Untold Story of the Women Who Made the Internet. New York: Portfolio/Penguin. ISBN 9780735211759.
- Grier, David Alan (2013). When Computers Were Human. Princeton: Princeton University Press. ISBN 9781400849369 – via Project MUSE.
Further reading
- Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Taylor and Francis / CRC Press. ISBN 978-1-4822-1769-8.
- Kak, Subhash: Computing Science in Ancient India; Munshiram Manoharlal Publishers Pvt. Ltd (2001)
- The Development of Computer Science: A Sociocultural Perspective, Matti Tedre's Ph.D. thesis, University of Joensuu (2006)
- Ceruzzi, Paul E. (1998). A History of Modern Computing. The MIT Press. ISBN 978-0-262-03255-1.
External links
- Computer History Museum
- Computers: From the Past to the Present
- The First "Computer Bug" at the Naval History and Heritage Command Photo Archives.
- Bitsavers, an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 1950s, 1960s, 1970s, and 1980s
- Oral history interviews