The Digital Revolution is the shift from mechanical and analog electronic technology to digital electronics, which began between the late 1950s and the late 1970s with the adoption and proliferation of digital computers and digital record-keeping, and which continues to the present day.[1] Implicitly, the term also refers to the sweeping changes brought about by digital computing and communication technology during (and after) the latter half of the 20th century. Analogous to the Agricultural Revolution and Industrial Revolution, the Digital Revolution marked the beginning of the Information Age.
Central to this revolution is the mass production and widespread use of digital logic circuits and their derived technologies, including the computer, the digital cellular phone, and the Internet.
Theoretical background
The notion of the digital revolution is part of the Schumpeterian theory of socio-economic evolution,[2] which consists of an incessant process of creative destruction that modernizes the modus operandi of society as a whole, including its economic, social, cultural, and political organization.[3][4]
The motor of this incessant force of creative destruction is technological change.[5][6] While the key carrier technology of the first Industrial Revolution (1770–1850) was based on water-powered mechanization, the second Kondratiev wave (1850–1900) was enabled by steam-powered technology, the third (1900–1940) was characterized by the electrification of social and productive organization, the fourth by motorization and the automated mobilization of society (1940–1970), and the most recent one by the digitization of social systems.[2] Each one of those so-called long waves has been characterized by a sustained period of social modernization, most notably by sustained periods of increasing economic productivity. According to Carlota Perez: "this quantum jump in productivity can be seen as a technological revolution, which is made possible by the appearance in the general cost structure of a particular input that we could call the 'key factor', fulfilling the following conditions: (1) clearly perceived low-and descending-relative cost; (2) unlimited supply for all practical purposes; (3) potential all-pervasiveness; (4) a capacity to reduce the costs of capital, labour and products as well as to change them qualitatively".[6] Digital Information and Communication Technologies fulfill those requirements and therefore represent a general purpose technology that can transform an entire economy, leading to a modern, and more developed form of socio-economic and political organization often referred to as the post-industrial society, the fifth Kondratiev, Information society, digital age, and network society, among others.[7]
The Agricultural Revolution led to agricultural cities in the ancient world in the Middle East, Mesoamerica, China, the Indus Valley, Southern Europe and South America. Then the Industrial Revolution led to industrial cities in the 19th century such as Manchester, Newcastle upon Tyne and New York City. In the 20th century, the rise of the service economy caused people to leave the industrial cities and move out into the suburbs.
The Industrial Revolution and Digital Revolution have been taking place concurrently in newly industrialized countries like China and India, as people leave the rural areas for industrial and high tech cities like Beijing, Shanghai, and Mumbai.
History
Digital communication became economical for widespread adoption after the inventions of the MOS transistor, MOS integrated circuit, microprocessor, and personal computer.
The digital revolution converted technology that had been analog into a digital format. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeater hardware could regenerate the digital signal and pass it on with no loss of information. Of equal importance to the revolution was the ability to easily move digital information between media, and to access or distribute it remotely.
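Because a digital symbol is one of a small set of discrete levels, a repeater can decide what was sent and re-emit it cleanly, whereas an analog amplifier boosts the noise along with the signal. A minimal Python sketch of the contrast (the noise model and decision threshold are illustrative assumptions, not a model of real hardware):

```python
import random

def transmit(signal, noise=0.2):
    """One hop over a noisy channel: every sample picks up random noise."""
    return [s + random.uniform(-noise, noise) for s in signal]

def regenerate(signal, threshold=0.5):
    """A digital repeater does not amplify the noise: it decides each bit
    anew and re-emits a clean level."""
    return [1.0 if s > threshold else 0.0 for s in signal]

bits = [1.0, 0.0, 1.0, 1.0, 0.0]

digital = bits
for _ in range(10):                  # ten repeater hops
    digital = regenerate(transmit(digital))
assert digital == bits               # the bit pattern survives intact

analog = bits
for _ in range(10):                  # an analog repeater can only re-amplify,
    analog = transmit(analog)        # so the noise accumulates hop by hop
```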
The turning point of the revolution was the change from analog to digitally recorded music.[8] During the 1980s, the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.[9]
Foundations

Akira Nakashima invented switching circuit theory in 1934, laying the theoretical foundations for digital electronics.
The underlying technology for computers dates back to the latter half of the 19th century, including Babbage's Analytical Engine and the telegraph.
In the 1930s, NEC engineer Akira Nakashima laid the foundations for digital system design with his switching circuit theory.[10][11][12][13] Nakashima's work on switching circuit theory was further developed by Claude Shannon,[12] a Bell Labs mathematician credited with laying the foundations of digitalization in his 1948 article, A Mathematical Theory of Communication.[14] The switching circuit theory of Nakashima and Shannon provided the mathematical foundations and tools for digital system design in modern technology.[13]
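The core idea of switching circuit theory is that networks of two-state switches realize Boolean functions, so logic, and from it arithmetic, can be built from relays or transistors. A small sketch of that idea, using a textbook half-adder construction rather than anything from the cited sources:

```python
# Gates as Boolean functions: switches in series realize AND, switches
# in parallel realize OR, a normally-closed contact realizes NOT.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two one-bit numbers: sum = a XOR b, carry = a AND b."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry={c} sum={s}")
```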
In 1947, the first transistor was invented,[15] paving the way to more advanced digital computers. In the 1950s and 1960s, militaries, governments and other large organizations adopted computer systems.
1950s

Mohamed M. Atalla invented the MOS transistor in 1959 and proposed the MOS integrated circuit chip in 1960. These inventions are fundamental to the MOS revolution, the Digital Revolution and the Digital Age.

Dawon Kahng co-invented the MOS transistor with Mohamed M. Atalla in 1959.
In 1959, Fairchild Semiconductor engineer Robert Noyce invented the silicon integrated circuit (IC) chip. The basis for Noyce's silicon IC was the planar process, developed in early 1959 by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed in 1957.[16] This new technique, the integrated circuit, allowed for quick, low-cost fabrication of complex circuits by having a set of electronic circuits on one small plate ("chip") of semiconductor material, typically silicon.
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[17][18][19] The MOSFET's advantages include high scalability,[20] affordability,[21] low power consumption, and high transistor density.[22] Its rapid on–off electronic switching speed also makes it ideal for generating pulse trains,[23] the basis for electronic digital signals,[24][25] in contrast to BJTs which more slowly generate analog signals resembling sine waves.[23] These factors make the MOSFET an important switching device for digital circuits.[26]
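As a rough illustration of that contrast, the MOSFET can be treated as an ideal voltage-controlled switch; the supply and threshold voltages below are illustrative assumptions:

```python
import math

V_DD, V_TH = 5.0, 2.5    # illustrative supply and threshold voltages

def mosfet_inverter(v_gate):
    """Idealized MOSFET switch: the channel conducts (pulling the output
    low) only when the gate voltage exceeds the threshold."""
    return 0.0 if v_gate > V_TH else V_DD

# A slowly swinging analog input...
inputs = [2.5 + 2.5 * math.sin(2 * math.pi * t / 10) for t in range(20)]

# ...comes out as a clean two-level pulse train, not a scaled sine wave.
pulse_train = [mosfet_inverter(v) for v in inputs]
print(pulse_train)       # only 0.0 and 5.0 ever appear
```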
1960s

Masatoshi Shima (left) and Stanley Mazor (right) developed the Intel 4004, the first single-chip microprocessor, from 1969 to 1971.
Mohamed Atalla realized that the main advantage of a MOS transistor was its ease of fabrication, particularly suiting it for use in the recently invented integrated circuits. He first proposed the MOS integrated circuit (MOS IC) chip in 1960.[27] MOS IC technology led to the development of large-scale integration (LSI), very large-scale integration (VLSI),[28] microprocessors, memory chips, and digital telecommunication circuits.[29] The MOSFET revolutionized the electronics industry,[30][31] and is the most common semiconductor device.[32][33] MOSFETs are the fundamental building blocks of digital electronics during the Digital Revolution of the late 20th to early 21st centuries,[34][35][36] paving the way for the Digital Age of the early 21st century.[34]
The advent of the metal–oxide–semiconductor field-effect transistor (MOSFET) enabled the practical use of metal–oxide–semiconductor (MOS) transistors as memory cell storage elements.[37] MOS memory is the basis for digital semiconductor memory chips.[38]
The microprocessor has origins in the development of the MOSFET.[39] Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on several MOS LSI chips.[39] Designers in the late 1960s were striving to integrate the central processing unit (CPU) functions of a computer onto a handful of MOS LSI chips, called microprocessor unit (MPU) chipsets.
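Moore's law describes exponential growth in transistor counts, doubling roughly every two years. A quick sketch of the curve (the base year and base count are illustrative, not historical figures):

```python
def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Moore's-law growth: the count doubles every `doubling_years` years.
    The base year and count here are illustrative, not historical data."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1971, 1980, 1990, 2000):
    print(year, f"{transistors(year):,.0f}")
# Doubling every two years turns tens of transistors in the mid-1960s
# into tens of thousands by 1980 -- the LSI scale described above.
```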
The single-chip microprocessor was introduced with the Intel 4004. It began with the "Busicom Project"[40] as Masatoshi Shima's three-chip CPU design in 1968,[41][40] before Sharp's Tadashi Sasaki conceived of a single-chip CPU design, which he discussed with Busicom and Intel in 1968.[42] The Intel 4004 was then developed as a single-chip microprocessor from 1969 to 1971, led by Intel's Marcian Hoff and Federico Faggin and Busicom's Masatoshi Shima.[40] Federico Faggin's silicon gate MOS technology was used to fabricate the microprocessor on a single MOS chip.[43] The microprocessor led to the development of microcomputers, and thus the microcomputer revolution.
The public was first introduced to the concepts that would lead to the Internet when a message was sent over the ARPANET in 1969. Packet switched networks such as ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet, were developed in the late 1960s and early 1970s using a variety of protocols.
The Whole Earth movement of the 1960s advocated the use of new technology.[44]
1970s

Nasir Ahmed invented the discrete cosine transform (DCT) in 1972. DCT coding is the basis for most modern digital media.
In the early days of integrated circuits, each chip was limited to only a few transistors, and the low degree of integration meant the design process was relatively simple. Manufacturing yields were also quite low by today's standards. The wide adoption of the MOS integrated circuit by the early 1970s led to the first large-scale integration (LSI) chips with more than 10,000 transistors on a single chip.[45] Following the wide adoption of CMOS, a type of MOSFET logic, by the 1980s, millions and then billions of MOSFETs could be placed on one chip as the technology progressed,[46] and good designs required thorough planning, giving rise to new design methods. As of 2013, billions of MOSFETs are manufactured every day.[47]
ARPANET led to the development of protocols for internetworking, in which multiple separate networks could be joined together into a network of networks. The Transmission Control Protocol (TCP) was published by Yogen Dalal, Vint Cerf and Carl Sunshine in 1974.[48] It contains the first attested use of the term internet, as a shorthand for internetworking. Between 1976 and 1977, Yogen Dalal proposed separating TCP's routing and transmission control functions into two discrete layers,[49][50] which led to the splitting of TCP into the TCP and IP protocols, and the development of TCP/IP.[50] This was the beginning of the Internet.
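The rationale for the split was separation of concerns: TCP retains end-to-end transmission control (ports, sequencing, retransmission), while IP does nothing but address and route packets. A minimal sketch of that layered encapsulation; the field names are illustrative, not the real header layouts:

```python
def tcp_segment(payload, src_port, dst_port, seq):
    """Transport layer: transmission control (ports, sequencing)."""
    return {"src_port": src_port, "dst_port": dst_port,
            "seq": seq, "payload": payload}

def ip_packet(segment, src_addr, dst_addr):
    """Network layer: IP knows nothing about ports or sequence numbers;
    it only addresses and routes whatever it is handed."""
    return {"src": src_addr, "dst": dst_addr, "data": segment}

packet = ip_packet(tcp_segment(b"hello", src_port=1024, dst_port=80, seq=0),
                   src_addr="10.0.0.1", dst_addr="93.184.216.34")
print(packet)
```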
Discrete cosine transform (DCT) coding, a data compression technique first proposed by Nasir Ahmed in 1972,[51] enabled practical digital media transmission.[52][53][54] It "played a major role in allowing digital files to be transmitted across computer networks."[55] It is the basis for most modern digital media, including image compression formats such as JPEG (1992), video coding formats such as H.26x (1988 onwards) and MPEG (1993 onwards),[56] audio coding standards such as Dolby Digital (1991)[57][58] and MP3 (1994),[56] and digital TV standards such as video-on-demand (VOD)[52] and high-definition television (HDTV).[59]
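The DCT is useful for compression because of energy compaction: for smooth signals such as image rows or audio frames, most of the signal energy lands in the first few coefficients, and the rest can be coarsely quantized or dropped. A naive, unnormalized sketch of the DCT-II (the sample values are illustrative):

```python
import math

def dct_ii(x):
    """Naive O(N^2) DCT-II (unnormalized), the transform Ahmed proposed."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]

# A smooth 8-sample signal, like one row of an image block in JPEG...
signal = [52, 55, 61, 66, 70, 61, 64, 73]
coeffs = dct_ii(signal)
print([round(c, 1) for c in coeffs])
# ...concentrates almost all of its energy in the first few coefficients,
# which is why the later ones can be coarsely quantized or discarded.
```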
The 1970s saw the introduction of the home computer,[60] time-sharing computers,[61] the video game console and the first coin-op video games,[62][63] and the golden age of arcade video games began with Space Invaders. As digital technology proliferated, and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, the data entry clerk's job was to convert analog data (customer records, invoices, etc.) into digital data.
1980s

Asad Ali Abidi invented RF CMOS technology in the late 1980s. RF CMOS is fundamental to the wireless revolution.
In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry. Automated teller machines (ATMs), industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts)[64] between 1982 and 1994.
In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%).[65] By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one.[66] By the late 1980s, many businesses were dependent on computers and digital technology.
Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the first 2G network opened in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.
Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.[67]
The first true digital camera was created in 1988, and digital cameras were first marketed in December 1989 in Japan and in 1990 in the United States.[68] By the mid-2000s, they would eclipse traditional film in popularity.
Digital ink was also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in 1989's The Little Mermaid and for all of its animated films between 1990's The Rescuers Down Under and 2004's Home on the Range.
RF CMOS chips are RF circuit chips that use mixed-signal (digital and analog) MOS integrated circuit technology and are fabricated using the CMOS process. RF CMOS technology was invented by Pakistani engineer Asad Ali Abidi at UCLA in the late 1980s.[69] There was a rapid growth of the wireless telecommunications industry towards the end of the 20th century, primarily due to the introduction of digital signal processing in wireless communications, driven by the development of low-cost, very large-scale integration (VLSI) RF CMOS technology.[70] RF CMOS integrated circuits enabled sophisticated, low-cost and portable end-user terminals, and gave rise to small, low-cost, low-power and portable units for a wide range of wireless communication systems. This enabled "anytime, anywhere" communication and helped bring about the wireless revolution, leading to the rapid growth of the wireless industry.[71] RF CMOS is used in the radio transceivers of all modern wireless networking devices and mobile phones,[69] and is widely used to transmit and receive wireless signals in a variety of applications, such as satellite technology (e.g. GPS), Bluetooth, Wi-Fi, near-field communication (NFC), mobile networks (e.g. 3G and 4G), terrestrial broadcast, and automotive radar applications, among other uses.[72]
Tim Berners-Lee invented the World Wide Web in 1989.
1990s
The first public digital HDTV broadcast was of the 1990 World Cup that June; it was shown in 10 theaters in Spain and Italy. However, HDTV did not become a broadcast standard outside Japan until the mid-2000s.
The World Wide Web, which had been available only to governments and universities, became publicly accessible in 1991.[73] In 1993 Marc Andreessen and Eric Bina introduced Mosaic, the first web browser capable of displaying inline images[74] and the basis for later browsers such as Netscape Navigator and Internet Explorer. The Internet expanded quickly, and by 1996 it was part of mass culture and many businesses listed websites in their ads. By 1999 almost every country had a connection, and nearly half of Americans and people in several other countries used the Internet on a regular basis. However, throughout the 1990s, "getting online" entailed complicated configuration, and dial-up was the only connection type affordable by individual users; the present-day mass Internet culture was not yet possible.
In 1989, about 15% of all households in the United States owned a personal computer; by 2000, this was up to 51%.[75] Among households with children, nearly 30% owned a computer in 1989, and 65% owned one in 2000.
The wireless revolution, the introduction and proliferation of wireless networks, began in the 1990s and was enabled by the wide adoption of MOSFET-based RF power amplifiers (power MOSFET and LDMOS) and RF circuits (RF CMOS).[76][77][78] Wireless networks allowed for public digital transmission without the need for cables, leading to digital television (digital TV), GPS, satellite radio, wireless Internet and mobile phones through the 1990s–2000s.
2000s

From left to right: Chad Hurley, Steve Chen, and Jawed Karim. The trio founded YouTube (2005), which revolutionized online video and video streaming.
Cell phones became as ubiquitous as computers by the early 2000s, with movie theaters beginning to show ads telling people to silence their phones. They also became much more advanced than phones of the 1990s, most of which only took calls or at most allowed for the playing of simple games.
Text messaging existed in the 1990s but was not widely used until the early 2000s, when it became a cultural phenomenon.
Online video and video streaming were revolutionized by YouTube, an online video platform founded by Chad Hurley, Jawed Karim and Steve Chen in 2005. It enabled the streaming of MPEG-4 AVC (H.264) user-generated content from anywhere on the World Wide Web.[79]
The digital revolution became truly global in this time as well: after revolutionizing society in the developed world in the 1990s, it spread to the masses in the developing world in the 2000s.
In late 2005, the population of the Internet reached 1 billion,[80] and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcasting format in many countries by the end of the decade.
2010s
By 2012, over 2 billion people used the Internet, twice the number using it in 2007. Cloud computing had entered the mainstream by the early 2010s. By 2015, tablet computers and smartphones were expected to exceed personal computers in Internet usage.[81] By 2016, half of the world's population was connected.[82]
Rise in the use of digital technology, 1980–2015

Analog-to-digital transition, 1986–2014
In the late 1980s, less than 1% of the world's technologically stored information was in digital format; the share was 94% in 2007 and more than 99% by 2014.[83] The year 2002 is estimated to be the year when humankind was able to store more information in digital than in analog format (the "beginning of the digital age").[84]
It is estimated that the world's capacity to store information has increased from 2.6 (optimally compressed) exabytes in 1986, to some 5,000 exabytes in 2014 (5 zettabytes).[83][85]
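Those two endpoints imply a steady exponential; a back-of-the-envelope calculation of the growth rate they imply:

```python
# Implied compound annual growth rate of world storage capacity,
# from the figures above: 2.6 exabytes (1986) to 5,000 exabytes (2014).
start, end, years = 2.6, 5000.0, 2014 - 1986
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%} per year")   # about 31% per year, i.e. capacity
                                # doubling roughly every 2.6 years
```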
1990
- Cell phone subscribers: 12.5 million (0.25% of world population in 1990)[86]
- Internet users: 2.8 million (0.05% of world population in 1990)[87]
2002
- Cell phone subscribers: 1.2 billion (19% of world population in 2002)[87]
- Internet users: 631 million (11% of world population in 2002)[87]
2010
- Cell phone subscribers: 4 billion (68% of world population in 2010)[88]
- Internet users: 1.8 billion (26.6% of world population in 2010)[89]
2016
- Internet users: 3.6 billion (49.5% of world population in 2016)[90]

A university computer lab containing many desktop PCs
Converted technologies
Conversion of the analog technologies below to digital (the decade indicated is the period in which digital became the dominant form):
- Analog computer to digital computer (1950s)
- Telex to fax (1980s)
- Phonograph cylinder, gramophone record and compact cassette to compact disc (1980s)
- VHS to DVD (2000s)
- Analog photography (photographic plate and photographic film) to digital photography (2000s)
- Analog cinematography (film stock) to digital cinematography (2010s)
- Analog television to digital television (2010s (expected))
- Analog radio to digital radio (2020s (expected))
- Analog mobile phone (1G) to digital mobile phone (2G) (1990s)
- Analog watch and clock to digital watch and clock (not yet predictable)
- Analog thermometer to digital thermometer (2010s)
- Offset printing to digital printing (2020s (expected))
Decline or disappearance of the following analog technologies:
- Mail (parcel to continue, others to be discontinued) (2020s (expected))
- Telegram (2010s)
- Typewriter (2010s)
- Fax (2010s (expected))
- Landline phone (2020s (expected)) (only offices will continue using landlines)
- Payphone (2020s (expected))
Disappearance of other technologies also attributed to the digital revolution (the analog–digital classification does not apply to these):
- CRT (2010s)
- Plasma display (2010s)
- CCFL backlit LCDs (2010s)
Improvements in digital technologies:
- Desktop computer to laptop to tablet computer
- DVD to Blu-ray Disc to 4K Blu-ray Disc
- 2G to 3G to 4G
- Digital watch to smartwatch
- Analog weighing scale to digital weighing scale
Technological basis
Underlying the digital revolution was the development of the digital electronic computer, the personal computer, and particularly the microprocessor with its steadily increasing performance (as described by Moore's law), which enabled computer technology to be embedded into a huge range of objects, from cameras to personal music players. Equally important was the development of transmission technologies, including computer networking, the Internet and digital broadcasting. 3G phones, whose social penetration grew exponentially in the 2000s, also played a very large role in the digital revolution, as they simultaneously provided ubiquitous entertainment, communications, and online connectivity.
Socio-economic impact
Positive aspects include greater interconnectedness, easier communication, and the exposure of information that in the past could more easily have been suppressed by totalitarian regimes. Michio Kaku wrote in his book Physics of the Future that the failure of the Soviet coup of 1991 was due largely to the existence of technology such as the fax machine and computers that exposed classified information.
The Revolutions of 2011 were enabled by social networking and smartphone technology; however, in hindsight these revolutions largely failed to reach their goals, as hardline Islamist governments and, in Syria, a civil war formed in the absence of the dictatorships that were toppled.
The economic impact of the digital revolution has been large. Without the World Wide Web (WWW), for example, globalization and outsourcing would not be nearly as feasible as they are today. The digital revolution radically changed the way individuals and companies interact. Small regional companies were suddenly given access to much larger markets. Concepts such as on-demand services and manufacturing, together with rapidly dropping technology costs, made possible innovations in all aspects of industry and everyday life.
After initial concerns of an IT productivity paradox, evidence is mounting that digital technologies have significantly increased the productivity and performance of businesses.[91]
Negative effects include information overload, Internet predators, forms of social isolation, and media saturation. In a poll of prominent members of the national news media, 65 percent said the Internet is hurting journalism more than it is helping,[92] by allowing anyone, however amateur and unskilled, to become a journalist, muddying information and fueling a rise in conspiracy theories in a way that did not exist in the past.
In some cases, company employees' pervasive use of portable digital devices and work-related computers for personal use (email, instant messaging, computer games) was often found, or perceived, to reduce those companies' productivity. Personal computing and other non-work-related digital activities in the workplace thus helped lead to stronger forms of privacy invasion, such as keystroke recording and information-filtering applications (spyware and content-control software).
Information sharing and privacy
Privacy in general became a concern during the digital revolution. The ability to store and utilize such large amounts of diverse information opened possibilities for the tracking of individual activities and interests. Libertarians and privacy rights advocates feared the possibility of an Orwellian future where centralized power structures control the populace via automatic surveillance and monitoring of personal information in programs such as the Pentagon's Information Awareness Office.[93] Consumer and labor advocates opposed the ability to market directly to individuals, discriminate in hiring and lending decisions, invasively monitor employee behavior and communications, and generally profit from involuntarily shared personal information.
The Internet, especially the WWW in the 1990s, opened whole new avenues for communication and information sharing. The ability to easily and rapidly share information on a global scale brought with it a whole new level of freedom of speech. Individuals and organizations were suddenly given the ability to publish on any topic, to a global audience, at a negligible cost, particularly in comparison to any previous communication technology.
Large cooperative projects could be undertaken (e.g. open-source software projects, SETI@home). Communities of like-minded individuals were formed (e.g. MySpace, Tribe.net). Small regional companies were suddenly given access to a larger marketplace.
In other cases, special interest groups as well as social and religious institutions found much of the content objectionable, even dangerous. Many parents and religious organizations, especially in the United States, became alarmed by pornography being more readily available to minors. In other circumstances the proliferation of information on such topics as child pornography, building bombs, committing acts of terrorism, and other violent activities was alarming to many different groups of people. Such concerns contributed to arguments for censorship and regulation on the WWW.
Copyright and trademark issues
Copyright and trademark issues also found new life in the digital revolution. The widespread ability of consumers to produce and distribute exact reproductions of protected works dramatically changed the intellectual property landscape, especially in the music, film, and television industries.
The digital revolution, especially regarding privacy, copyright, censorship and information sharing, remains a controversial topic. As the digital revolution progresses it remains unclear to what extent society has been impacted and will be altered in the future.
Concerns
While there have been huge benefits to society from the digital revolution, especially in terms of the accessibility of information, there are a number of concerns. Expanded powers of communication and information sharing, increased capabilities for existing technologies, and the advent of new technology brought with them many potential opportunities for exploitation. The digital revolution helped usher in a new age of mass surveillance, generating a range of new civil and human rights issues. The reliability of data became an issue, as information could easily be replicated but not easily verified. The digital revolution made it possible to store and track facts, articles, statistics, and minutiae on a scale hitherto unfeasible.
From the perspective of the historian, a large part of human history is known through physical objects from the past that have been found or preserved, particularly written documents. Digital records are easy to create but also easy to delete and modify. Changes in storage formats can make recovery of data difficult or near impossible, as can the storage of information on obsolete media for which reproduction equipment is unavailable; even identifying what such data is, and whether it is of interest, can be near impossible if it is no longer easily readable or if there is a large number of such files to sift through. Information passed off as authentic research or study must be scrutinized and verified.
These problems are further compounded by the use of digital rights management and other copy prevention technologies which, being designed to only allow the data to be read on specific machines, may well make future data recovery impossible. Interestingly, the Voyager Golden Record, which is intended to be read by an intelligent extraterrestrial (perhaps a suitable parallel to a human from the distant future), is recorded in analog rather than digital format specifically for easy interpretation and analysis.
See also
- Revolution
- Neolithic Revolution
- Agricultural Revolution
- Scientific Revolution
- Industrial Revolution
- Second Industrial Revolution
- Information revolution
- Nanotechnology
- Dot-com company
- Digital native
- Digital omnivore
- Digital addict
- Digital Phobic
- Electronic document
- Industry 4.0
- Information Age
- Microcomputer revolution
- Moore's law
- Paperless office
- Post Cold War era
- Semiconductors
- Technological revolution
- Telework
- Timeline of electrical and electronic engineering
References
- ↑ The Digital Revolution. UCSD.
- ↑ 2.0 2.1 C. Freeman and F. Louçã, As Time Goes By: From the Industrial Revolutions to the Information Revolution, Oxford University Press, USA, 2002.
- ↑ J. Schumpeter, Business Cycles: A Theoretical, Historical, And Statistical Analysis of the Capitalist Process, New York; McGraw-Hill, 1939.
- ↑ Online Course "Digital Technology & Social Change" (University of California), Hilbert M., DT&SC 4-2: Long Waves of Social Evolution https://www.youtube.com/watch?v=W7I6XoclH_4
- ↑ C. Perez, "Technological Revolutions, Paradigm Shifts and Socio-Institutional Change" in E. Reinert, Globalization, Economic Development and Inequality: An alternative Perspective, Cheltenham: Edward Elgar, 2004, pp. 217–242. http://www.carlotaperez.org/papers/basic-technologicalrevolutionsparadigm.htm
- ↑ 6.0 6.1 C. Perez, "Structural change and assimilation of new technologies in the economic and social systems" Futures, vol. 15, 1983, pp. 357–375. http://carlotaperez.org/papers/scass_v04.pdf
- ↑ Online Course "Digital Technology & Social Change" (University of California), Hilbert M., DT&SC 4-3: How Societies evolve https://www.youtube.com/watch?v=PS1GmzR0rYk
- ↑ http://www.powerhousemuseum.com/the80sareback/2011/02/the-rise-of-the-compact-disc/ Retrieved 10/03/2016
- ↑ "The Digital Revolution Ahead for the Audio Industry," Business Week. New York, March 16, 1981, p. 40D.
- ↑ History of Research on Switching Theory in Japan, IEEJ Transactions on Fundamentals and Materials, Vol. 124 (2004) No. 8, pp. 720-726, Institute of Electrical Engineers of Japan
- ↑ Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics, IPSJ Computer Museum, Information Processing Society of Japan
- ↑ 12.0 12.1 Radomir S. Stanković (University of Niš), Jaakko T. Astola (Tampere University of Technology), Mark G. Karpovsky (Boston University), Some Historical Remarks on Switching Theory, 2007, DOI 10.1.1.66.1248
- ↑ 13.0 13.1 Radomir S. Stanković, Jaakko Astola (2008), Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory, TICSP Series #40, Tampere International Center for Signal Processing, Tampere University of Technology
- ↑ Shannon, Claude E.; Weaver, Warren (1963). The mathematical theory of communication (4. print. ed.). Urbana: University of Illinois Press. p. 144. ISBN 0252725484.
- ↑ Phil Ament (17 April 2015). "Transistor History - Invention of the Transistor". Retrieved 17 April 2015.
- ↑ Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. p. 46. ISBN 9780801886393.
- ↑ "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum.
- ↑ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
- ↑ "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
- ↑ Motoyoshi, M. (2009). "Through-Silicon Via (TSV)" (PDF). Proceedings of the IEEE. 97 (1): 43–48. doi:10.1109/JPROC.2008.2007462. ISSN 0018-9219.
- ↑ "Tortoise of Transistors Wins the Race - CHM Revolution". Computer History Museum. Retrieved 22 July 2019.
- ↑ "Transistors Keep Moore's Law Alive". EETimes. 12 December 2018. https://www.eetimes.com/author.asp?section_id=36&doc_id=1334068. Retrieved 18 July 2019.
- ↑ 23.0 23.1 "Applying MOSFETs to Today's Power-Switching Designs". Electronic Design. 23 May 2016. Retrieved 10 August 2019.
- ↑ B. SOMANATHAN NAIR (2002). Digital electronics and logic design. PHI Learning Pvt. Ltd. p. 289. ISBN 9788120319561.
Digital signals are fixed-width pulses, which occupy only one of two levels of amplitude.
- ↑ Joseph Migga Kizza (2005). Computer Network Security. Springer Science & Business Media. ISBN 9780387204734.
- ↑ 2000 Solved Problems in Digital Electronics. Tata McGraw-Hill Education. 2005. p. 151. ISBN 978-0-07-058831-8.
- ↑ Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
- ↑ 2000 Solved Problems in Digital Electronics. Tata McGraw-Hill Education. 2005. p. 151. ISBN 978-0-07-058831-8.
- ↑ Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN 9781107052406.
- ↑ Chan, Yi-Jen (1992). Studies of InAIAs/InGaAs and GaInP/GaAs heterostructure FET's for high speed applications. University of Michigan. p. 1.
The Si MOSFET has revolutionized the electronics industry and as a result impacts our daily lives in almost every conceivable way.
- ↑ Grant, Duncan Andrew; Gowar, John (1989). Power MOSFETS: theory and applications. Wiley. p. 1. ISBN 9780471828679.
The metal-oxide-semiconductor field-effect transistor (MOSFET) is the most commonly used active device in the very large-scale integration of digital integrated circuits (VLSI). During the 1970s these components revolutionized electronic signal processing, control systems and computers.
- ↑ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
- ↑ Golio, Mike; Golio, Janet (2018). RF and Microwave Passive and Active Technologies. CRC Press. p. 18–2. ISBN 9781420006728.
- ↑ 34.0 34.1 "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
- ↑ Raymer, Michael G. (2009). The Silicon Web: Physics for the Internet Age. CRC Press. p. 365. ISBN 9781439803127.
- ↑ Wong, Kit Po (2009). Electrical Engineering - Volume II. EOLSS Publications. p. 7. ISBN 9781905839780.
- ↑ "Transistors – an overview". ScienceDirect. Retrieved 8 August 2019.
- ↑ "Transistors – an overview". ScienceDirect. Retrieved 8 August 2019.
- ↑ 39.0 39.1 Shirriff, Ken (30 August 2016). "The Surprising Story of the First Microprocessors". IEEE Spectrum. Institute of Electrical and Electronics Engineers. 53 (9): 48–54. doi:10.1109/MSPEC.2016.7551353. S2CID 32003640. Retrieved 13 October 2019.
- ↑ 40.0 40.1 40.2 Federico Faggin, The Making of the First Microprocessor, IEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
- ↑ Nigel Tout. "The Busicom 141-PF calculator and the Intel 4004 microprocessor". Retrieved November 15, 2009.
- ↑ Aspray, William (1994-05-25). "Oral-History: Tadashi Sasaki". Interview #211 for the Center for the History of Electrical Engineering. The Institute of Electrical and Electronics Engineers, Inc. Retrieved 2013-01-02.
- ↑ Faggin, F.; Klein, T.; Vadasz, L. (23 October 1968). Insulated Gate Field Effect Transistor Integrated Circuits with Silicon Gates (JPEG image). International Electronic Devices Meeting. IEEE Electron Devices Group. Retrieved 2009-12-23.
- ↑ "History of Whole Earth Catalog". Retrieved 17 April 2015.
- ↑ Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
- ↑ Peter Clarke (14 October 2005). "Intel enters billion-transistor processor era". EE Times.
- ↑ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
- ↑ Vint Cerf, Yogen Dalal, Carl Sunshine, (December 1974), RFC 675, Specification of Internet Transmission Control Protocol
- ↑ Panzaris, Georgios (2008). Machines and romances: the technical and narrative construction of networked computing as a general-purpose platform, 1960-1995. Stanford University. p. 128.
Despite the misgivings of Xerox Corporation (which intended to make PUP the basis of a proprietary commercial networking product), researchers at Xerox PARC, including ARPANET pioneers Robert Metcalfe and Yogen Dalal, shared the basic contours of their research with colleagues at TCP and Internet working group meetings in 1976 and 1977, suggesting the possible benefits of separating TCP's routing and transmission control functions into two discrete layers.
- ↑ 50.0 50.1 Pelkey, James L. (2007). "Yogen Dalal". Entrepreneurial Capitalism and Innovation: A History of Computer Communications, 1968-1988. Retrieved 5 September 2019.
- ↑ Ahmed, Nasir (January 1991). "How I Came Up With the Discrete Cosine Transform". Digital Signal Processing. 1 (1): 4–5. doi:10.1016/1051-2004(91)90086-Z.
- ↑ 52.0 52.1 Lea, William (1994). Video on demand: Research Paper 94/68. 9 May 1994: House of Commons Library. Archived from the original on 20 September 2019. Retrieved 20 September 2019.
- ↑ Frolov, Artem; Primechaev, S. (2006). "Compressed Domain Image Retrievals Based On DCT-Processing". Semantic Scholar. Retrieved 18 October 2019.
- ↑ Lee, Ruby Bei-Loh; Beck, John P.; Lamb, Joel; Severson, Kenneth E. (April 1995). "Real-time software MPEG video decoder on multimedia-enhanced PA 7100LC processors" (PDF). Hewlett-Packard Journal. 46 (2). ISSN 0018-1153.
- ↑ Jones, Willie D. (19 August 2024). "Nasir Ahmed: An Unsung Hero of Digital Media". IEEE Spectrum. Retrieved 2024-08-25.
- ↑ 56.0 56.1 Stanković, Radomir S.; Astola, Jaakko T. (2012). "Reminiscences of the Early Work in DCT: Interview with K.R. Rao" (PDF). Reprints from the Early Days of Information Sciences. 60. Retrieved 13 October 2019.
- ↑ Luo, Fa-Long (2008). Mobile Multimedia Broadcasting Standards: Technology and Practice. Springer Science & Business Media. p. 590. ISBN 9780387782638.
- ↑ Britanak, V. (2011). "On Properties, Relations, and Simplified Implementation of Filter Banks in the Dolby Digital (Plus) AC-3 Audio Coding Standards". IEEE Transactions on Audio, Speech, and Language Processing. 19 (5): 1231–1241. doi:10.1109/TASL.2010.2087755.
- ↑ Shishikui, Yoshiaki; Nakanishi, Hiroshi; Imaizumi, Hiroyuki (October 26–28, 1993). "An HDTV Coding Scheme using Adaptive-Dimension DCT". Signal Processing of HDTV: Proceedings of the International Workshop on HDTV '93, Ottawa, Canada. Elsevier: 611–618. doi:10.1016/B978-0-444-81844-7.50072-3. ISBN 9781483298511.
- ↑ "Personal Computer Milestones". Retrieved 17 April 2015.
- ↑ http://home.ccil.org/~remlaps/www.bobbemer.com/TIMESHAR.HTM
- ↑ "Atari - Arcade/Coin-op". Retrieved 17 April 2015.
- ↑ Vincze Miklós. "Forgotten arcade games let you shoot space men and catch live lobsters". io9. Retrieved 17 April 2015.
- ↑ "How many Commodore 64 computers were really sold?". pagetable.com. Retrieved 17 April 2015.
- ↑ http://www.census.gov/hhes/computer/files/1984/p23-155.pdf
- ↑ http://www.census.gov/hhes/computer/files/1989/p23-171.pdf
- ↑ "COMPUTE! magazine issue 93 Feb 1988".
If the wheels behind the CD-ROM industry have their way, this product will help open the door to a brave, new multimedia world for microcomputers, where the computer is intimately linked with the other household electronics, and every gadget in the house reads tons of video, audio, and text data from CD-ROM disks.
- ↑ "1988". Retrieved 17 April 2015.
- ↑ 69.0 69.1 O'Neill, A. (2008). "Asad Abidi Recognized for Work in RF-CMOS". IEEE Solid-State Circuits Society Newsletter. 13 (1): 57–58. doi:10.1109/N-SSC.2008.4785694. ISSN 1098-4232.
- ↑ Srivastava, Viranjay M.; Singh, Ghanshyam (2013). MOSFET Technologies for Double-Pole Four-Throw Radio-Frequency Switch. Springer Science & Business Media. p. 1. ISBN 9783319011653.
- ↑ Daneshrad, Babal; Eltawil, Ahmed M. (2002). "Integrated Circuit Technologies for Wireless Communications". Wireless Multimedia Network Technologies. The International Series in Engineering and Computer Science. Springer US. 524: 227–244. doi:10.1007/0-306-47330-5_13. ISBN 0-7923-8633-7.
- ↑ Veendrick, Harry J. M. (2017). Nanometer CMOS ICs: From Basics to ASICs. Springer. p. 243. ISBN 9783319475974.
- ↑ Martin Bryant (6 August 2011). "20 years ago today, the World Wide Web was born - TNW Insider". The Next Web. Retrieved 17 April 2015.
- ↑ "The World Wide Web". Retrieved 17 April 2015.
- ↑ http://www.census.gov/prod/2005pubs/p23-208.pdf
- ↑ Golio, Mike; Golio, Janet (2018). RF and Microwave Passive and Active Technologies. CRC Press. pp. ix, I-1, 18–2. ISBN 9781420006728.
- ↑ Rappaport, T. S. (November 1991). "The wireless revolution". IEEE Communications Magazine. 29 (11): 52–71. doi:10.1109/35.109666.
- ↑ "The wireless revolution". The Economist. January 21, 1999. https://www.economist.com/leaders/1999/01/21/the-wireless-revolution. Retrieved 12 September 2019.
- ↑ Matthew, Crick (2016). Power, Surveillance, and Culture in YouTube™'s Digital Sphere. IGI Global. pp. 36–7. ISBN 9781466698567.
- ↑ "One Billion People Online!". Retrieved 17 April 2015.
- ↑ Tablets, Phones to Surpass PCs for Internet Use in Four Years. PCWorld.
- ↑ "World Internet Users Statistics and 2014 World Population Stats". Retrieved 17 April 2015.
- ↑ 83.0 83.1 "The World’s Technological Capacity to Store, Communicate, and Compute Information", especially Supporting online material, Martin Hilbert and Priscila López (2011), Science (journal), 332(6025), 60-65; free access to the article through here: http://www.martinhilbert.net/worldinfocapacity-html/
- ↑ "video animation on The World’s Technological Capacity to Store, Communicate, and Compute Information from 1986 to 2010
- ↑ Information in the Biosphere: Biological and Digital Worlds, doi:10.1016/j.tree.2015.12.013, Gillings, M. R., Hilbert, M., & Kemp, D. J. (2016), Trends in Ecology & Evolution, 31(3), 180–189
- ↑ "Worldmapper: The world as you've never seen it before - Cellular Subscribers 1990". Retrieved 17 April 2015.
- ↑ 87.0 87.1 87.2 "Worldmapper: The world as you've never seen it before - Communication Maps". Retrieved 17 April 2015.
- ↑ http://www.computeruser.com/articles/cell-phone-dangers-protecting-our-homes-from-cell-phone-radiation.html
- ↑ "World Internet Users Statistics and 2014 World Population Stats". Retrieved 17 April 2015.
- ↑ "World Internet Users Statistics and 2014 World Population Stats". Retrieved 17 April 2015.
- ↑ "Computing Productivity: Firm-Level Evidence". SSRN 290325.
- ↑ Cyra Master (10 April 2009). "Media Insiders Say Internet Hurts Journalism". The Atlantic. Retrieved 17 April 2015.
- ↑ John Markoff (November 22, 2002). "Pentagon Plans a Computer System That Would Peek at Personal Data of Americans". The New York Times.