Computer Wiki


A computer is an electronic device that performs high-speed mathematical and logical operations, processing the data it receives as input into useful information.

Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the MOS revolution, microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since then, with MOS transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th to early 21st centuries.

Devices[]

All the physical parts of a computer are called hardware.

Input devices are used to give the computer our instructions and data.

Keyboard

The keyboard is the main device used to type text and give instructions to the computer.

Mouse

The mouse is a pointing device: by pointing and clicking we can give instructions to the computer, and it can also be used to draw on the screen.

Microphone

The microphone is the input device through which we give the computer sound data.

Joystick and Gamepad

These two devices are used for playing games. They are different devices: with a joystick we can move an object quickly, and it usually has only one button, while a gamepad has multiple buttons, like Xbox and PS3 controllers.



Output devices are the devices from which we get the results of the data we gave the computer.

Monitor

The monitor is an output device that displays output on a screen. The output cannot be kept permanently, so it is called a soft copy.

Printers are output devices that put output on paper. The output can be kept permanently, so it is called a hard copy.

Impact Printer

Impact printers are lower-quality printers: they make noise, print slowly, and work by striking a ribbon against the paper.

Non-Impact Printer

Non-impact printers are higher-quality printers: they make little noise, print quickly, and do not strike a ribbon.


Speakers

Speakers are output devices that play sound, such as the sound data we gave through a microphone.


Plotter

A plotter is similar to a printer, but it can draw large, complex designs that an ordinary printer cannot, e.g. a house design.


Storage devices are the devices that store data.

Hard Disk

A hard disk can typically store 200 GB to 3 TB. It is not portable because it is fixed inside the computer.

Compact Disc (CD)

The CD is an optical storage device with a storage capacity of up to about 700 megabytes. It is also a portable device.

DVD

The DVD is an optical storage device with a storage capacity of 4.7 GB to 17.08 GB. It is also a portable device.

Blu-ray Disc

The Blu-ray disc is an optical storage device with a storage capacity of 25 GB to 200 GB. It is also a portable device.

Flash Drive

A flash drive is a USB device that can store 2 GB to 256 GB. It is also a portable device.


Processing devices process the data given by the user.

CPU
The CPU is a processing device with two parts; it is also known as the brain of the computer.

The Control Unit (CU) decodes the user's instructions.

The Arithmetic Logic Unit (ALU) performs arithmetic and logical operations.





Software is an important part of the computer system; without it, the computer would be a useless machine.

System software is the software that runs the computer itself and lets other programs work.

An operating system is system software that governs both the hardware and the other software.

Windows

Windows is a widely available operating system found on many laptops and desktop computers.



Application software is made to solve users' particular problems.

MS Word

Microsoft Word is word-processing software used to create, modify, and format text documents.

MS Excel

Microsoft Excel is a spreadsheet package that can be used to create mark sheets, salary statements, budgets, etc.

MS PowerPoint

Microsoft PowerPoint is used to create multimedia presentations on the computer.



Types[]

Computers can be classified in a number of different ways, including:

By architecture[]

  • Analog computer
  • Digital computer
  • Hybrid computer
  • Harvard architecture
  • Von Neumann architecture
  • Reduced instruction set computer

By size and form-factor[]

History[]

Early computing devices[]

The Ishango bone, a bone tool dating back to prehistoric Africa.

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[1][2][3] The use of counting rods is one example. The Lebombo bone from the mountains between Swaziland and South Africa may be the oldest known mathematical artifact.[4] It dates from 35,000 BCE and consists of 29 distinct notches that were deliberately cut into a baboon's fibula.[5][6]

The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.

The abacus was initially used for arithmetic tasks. It was used in Babylonia (modern Iraq) as early as circa 2700–2400 BC. Since then, many other forms of reckoning boards or tables have been invented.

Several computing devices were constructed in ancient times to perform astronomical calculations. These include the south-pointing chariot (c. 1000–900 BC) from ancient China, Antikythera mechanism from the Hellenistic world (c. 150–100 BC),[7] and an early astrolabe from Byzantine Egypt (c. 400 CE).[8]

Early analog computers[]

Planispheric astrolabe, 9th century, North African (Khalili Collection).

Several analog computers were constructed in medieval times to perform astronomical and navigational calculations. In the medieval Islamic world, the astrolabe evolved into the planispheric astrolabe, an early analog computer capable of calculating solutions to various problems.[9][8] The 10th-century astronomer ʿAbd al-Raḥmān al-Ṣūfī wrote a massive text of 386 chapters on the astrolabe, which reportedly described more than 1,000 applications for the astrolabe's various functions.[10]

Abū Rayhān al-Bīrūnī (c. 1000) invented the first mechanical geared lunisolar calendar astrolabe,[11] an early fixed-wired knowledge processing machine[12] with a gear train and gear-wheels.[13] Other early mechanical devices used to perform one type of calculation or another include the planisphere and other mechanical computing devices invented by Abu Rayhan al-Biruni (c. 1000);[14] the equatorium and universal latitude-independent astrolabe by Abu Ishaq Ibrahim al-Zarqali (c. 1015);[15] the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. 1090) during the Song dynasty.

An astrolabe incorporating a mechanical calendar computer[16][17] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[18] The Plate of Zones, a mechanical planetary computer which could graphically solve a number of planetary problems, was invented by Al-Kashi in the 15th century. It could predict the true positions in longitude of the sun and moon,[19] and the planets in terms of elliptical orbits;[20] the latitudes of the Sun, Moon, and planets; and the ecliptic of the Sun. The instrument also incorporated an alidade and ruler.[21] The Plate of Conjunctions, a computing instrument used to determine the time of day at which planetary conjunctions will occur[22] and for performing linear interpolation, was invented by Al-Kashi in the 15th century.[23]

Programmable analog computer[]

Al-Jazari's castle clock (1206) was the first programmable analog computer.

In 1206, Arab engineer Al-Jazari invented the castle clock, a hydropowered mechanical astronomical clock and the first programmable analog computer.[24] It was a complex device, about 11 feet high, with multiple functions alongside timekeeping. It included a display of the zodiac and the solar and lunar orbits, and a pointer in the shape of the crescent moon which travelled across the top of a gateway, moved by a hidden cart and causing automatic doors to open, each revealing a mannequin, every hour.[25] It was possible to re-program the length of day and night every day in order to account for their changing lengths throughout the year, and it also featured five robotic musicians who automatically played music when moved by levers operated by a hidden camshaft attached to a water wheel.[24] Other components of the castle clock included a main reservoir with a float, a float chamber and flow regulator, plate and valve trough, two pulleys, a crescent disc displaying the zodiac, and two falcon automata dropping balls into vases.[26]

19th century[]

Ada Lovelace, whose notes appended to Luigi Menabrea's paper included the first algorithm designed for processing by Charles Babbage's Analytical Engine.

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer",[27] he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[28][29]

The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.

Ada Lovelace translated and added notes to the "Sketch of the Analytical Engine" by Luigi Federico Menabrea. This appears to be the first published description of programming, so Ada Lovelace is widely regarded as the first computer programmer.[30]

Theoretical foundations[]

The mathematical basis of digital computing is Boolean algebra. In the 1930s, Japanese NEC engineer Akira Nakashima introduced switching circuit theory. In a series of papers published from 1934 to 1936, he formulated a two-valued Boolean algebra, which he discovered independently, as a way to analyze and design circuits by algebraic means.[31][32][33][34] Nakashima's work laid the foundations for digital circuit design.[34]

The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper,[35] On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper.[36] Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
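To make the idea concrete, here is a minimal sketch of a Turing machine in Python. The rule table (`invert`, a machine that flips every bit and halts at the first blank) is an invented example for illustration, not one of Turing's own machines.

```python
# A minimal Turing machine: rules map (state, symbol) -> (write, move, next state).
def run_turing_machine(rules, tape, state="start", halt="halt"):
    tape = list(tape)
    pos = 0
    while state != halt:
        write, move, state = rules[(state, tape[pos])]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if pos == len(tape):          # extend the tape with blanks as needed
            tape.append("_")
    return "".join(tape).rstrip("_")

# An example program: invert every bit, halt at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "10110_"))  # -> 01001
```

Changing the rule table changes what the machine computes, without changing the machine itself: that separation of fixed mechanism from stored program is the point of Turing's construction.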

Electro-mechanical computers[]

Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer.[37]

In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer.[38][39] The Z3 was built with 2000 relays, implementing a 22 bit word length that operated at a clock frequency of about 5–10 Hz.[40] Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[41] The Z3 was Turing complete.[42][43]

Stored-program computers[]

Early stored-program computers were not transistorized. The first stored-program transistor computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory.[44][45][46] It began development in 1954,[47] and was completed in 1956.[45]

The use of microprogramming in electronic transistor computers was introduced in 1961 with the KT-Pilot, developed by Kyoto University and Toshiba in Japan.[48][49]

Transistor computers[]

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948.[50][51] From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[52]

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves.[53] Their first transistorised computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955,[54] built by the electronics division of the Atomic Energy Research Establishment at Harwell.[54][55]

Early transistor computers were not stored-program computers. The first transistorized stored-program computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory.[56][57][58] It began development in 1954,[59] and was completed in 1956.[57]

The MOS transistor (MOSFET), invented by Mohamed M. Atalla and Dawon Kahng in 1959, was the basis for the microcomputer revolution.

The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.[60] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[52] With its high scalability,[61] and much lower power consumption and higher density than bipolar junction transistors,[62] the MOSFET made it possible to build high-density integrated circuits.[63][64] In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers.[65] The MOSFET led to the microcomputer revolution,[66] and became the driving force behind the computer revolution.[67][68] The MOSFET is the most widely used transistor in computers,[69][70] and is the fundamental building block of digital electronics.[71]

Integrated circuit[]

The next great advance in computing power came with the advent of the integrated circuit (IC). The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[72] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[73] However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip.[74] Kilby's IC had external wire connections, which made it difficult to mass-produce.[75]

Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[76] Noyce's invention was the first true monolithic IC chip.[77][75] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on the silicon surface passivation and thermal oxidation processes developed by Mohamed Atalla at Bell Labs in the late 1950s.[78][79][80]

Modern monolithic ICs are predominantly MOS integrated circuits (MOS IC), built from MOS transistors.[81] After the first MOS transistor was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[82] Atalla first proposed the concept of the MOS integrated circuit in 1960, followed by Kahng in 1961, both noting that the MOS transistor's ease of fabrication made it useful for integrated circuits.[52][83] The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962.[84] General Microelectronics later introduced the first commercial MOS IC in 1964,[85] developed by Robert Norman.[84] Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968.[86] The MOS transistor has since become the most critical device component in modern ICs.[87]

Microprocessor[]

The development of the MOS integrated circuit led to the invention of the microprocessor,[88][89] and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is widely accepted that the first single-chip microprocessor was the Intel 4004,[90] designed and realized by Federico Faggin with his silicon-gate MOS IC technology,[88] along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[91][92] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip.[64]

The Intel 4004[93] originated in Japan with the "Busicom Project"[94] as Masatoshi Shima's three-chip CPU design in 1968,[95][94] before Sharp's Tadashi Sasaki conceived of a single-chip microprocessor, which he discussed with Busicom and Intel that year.[96] Sasaki and Shima at Busicom, a calculator manufacturer, had the initial insight that the CPU could be a single MOS LSI chip, supplied by Intel.[97] The 4004 was then designed and realized as a single-chip microprocessor from 1969 to 1970 by Busicom's Masatoshi Shima and Intel's Ted Hoff and Federico Faggin, and was implemented on a single PMOS LSI chip.

Microcomputer revolution[]

The microprocessor and semiconductor memory led to the development of the microcomputer, a small, low-cost computer that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. The first microcomputer was Japan's Sord SMP80/08, developed in April 1972.[98] It was soon followed by several other unique hobbyist systems based on the Intel 4004 and its successor, the Intel 8008. The first commercially available microcomputer kits were based on the Intel 8080: the Sord SMP80/x series, released in May 1974,[98] and the Altair 8800, which was announced in the January 1975 cover article of Popular Electronics.

Mobile computers[]

The first portable computers were heavy and ran from mains power. The 50 lb IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in.

Yukio Yokozawa, an employee for Suwa Seikosha, a branch of Japanese company Seiko (now Seiko Epson), invented the first laptop/notebook computer in July 1980, receiving a patent for the invention.[99] Seiko's notebook computer, known as the HC-20 in Japan, was announced in 1981.[100] In North America, Epson introduced it as the Epson HX-20 in 1981, at the COMDEX computer show in Las Vegas, where it drew significant attention for its portability.[101] It had a mass-market release in July 1982, as the HC-20 in Japan[100] and as the Epson HX-20 in North America.[102] It was the first notebook-sized handheld computer,[103][104][105] the size of an A4 notebook and weighing 1.6 kg (3.5 lb).[100]

The first laptops removed the plug-in requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s.[106] The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s.

These smartphones and tablets run on a variety of operating systems and have recently become the dominant computing devices on the market.[107] They are powered by systems on a chip (SoCs), which are complete computers on a microchip the size of a coin.[108]

Hardware[]

The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphic cards, sound cards, memory (RAM), motherboard, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware.

History of computing hardware[]

Other hardware topics

  • Peripheral devices (input/output)
      • Input: mouse, keyboard, joystick, image scanner, webcam, graphics tablet, microphone
      • Output: monitor, printer, loudspeaker
      • Both: floppy disk drive, hard disk drive, optical disc drive, teleprinter
  • Computer buses
      • Short range: RS-232, SCSI, PCI, USB
      • Long range (computer networking): Ethernet, ATM, FDDI

A general purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits.
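The idea that circuits arranged into logic gates can control one another can be illustrated in a few lines. The sketch below models gates as Python functions and builds the standard ones from NAND, the classic universal gate; the half adder at the end is a textbook circuit chosen for illustration, not something specific to this article.

```python
# Every common logic gate can be built from NAND, the classic universal gate.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# A half adder: the smallest circuit that adds two bits.
def half_adder(a, b):
    return xor(a, b), and_(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # -> (0, 1), i.e. 1 + 1 = binary 10
```

Chaining half adders (with carry handling) gives a full adder, and chaining full adders gives the multi-bit addition performed inside an ALU.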

Input devices[]

When unprocessed data is sent to the computer with the help of input devices, the data is processed and sent to output devices. The input devices may be hand-operated or automated. The act of processing is mainly regulated by the CPU. Some examples of input devices are:

  • Computer keyboard
  • Digital camera
  • Digital video
  • Graphics tablet
  • Image scanner
  • Joystick
  • Microphone
  • Mouse
  • Overlay keyboard
  • Real-time clock
  • Trackball
  • Touchscreen

Output devices[]

The means through which the computer gives output are known as output devices. Some examples of output devices are:

  • Computer monitor
  • Printer
  • PC speaker
  • Projector
  • Sound card
  • Video card

Central processing unit (CPU)[]

The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor.
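The interplay of control unit, ALU, and registers can be sketched as a fetch–decode–execute loop. The three-instruction set (`LOAD`, `ADD`, `HALT`) and the two registers below are invented for illustration and do not correspond to any real processor.

```python
# A toy CPU: the control unit fetches and decodes each instruction,
# the ALU performs the arithmetic, and registers A and B hold values.
def run(program):
    registers = {"A": 0, "B": 0}
    pc = 0                              # program counter
    while pc < len(program):
        op, *args = program[pc]         # fetch and decode
        if op == "LOAD":                # LOAD reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":               # ADD dst, src  (an ALU operation)
            registers[args[0]] += registers[args[1]]
        elif op == "HALT":
            break
        pc += 1                         # advance to the next instruction
    return registers

program = [("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B"), ("HALT",)]
print(run(program))  # -> {'A': 5, 'B': 3}
```

A real CPU does the same loop in hardware, with instructions encoded as binary words fetched from memory rather than Python tuples.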

Memory[]

A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number. The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers.
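The cell-and-address model described above maps directly onto a Python list. The sketch below carries out the two example instructions from the text; the value 877 placed in cell 2468 is an arbitrary choice to make the addition visible.

```python
# Memory modeled as a list of numbered cells, following the examples above.
memory = [0] * 4096

memory[1357] = 123                          # "put the number 123 into cell 1357"
memory[2468] = 877                          # arbitrary second operand
memory[1595] = memory[1357] + memory[2468]  # "add cell 1357 to cell 2468,
                                            #  put the answer into cell 1595"
print(memory[1595])  # -> 1000
```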

In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory.
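The two interpretations of a byte mentioned above (0 to 255 unsigned, or −128 to +127 in two's complement) can be demonstrated with a pair of helper functions. The function names here are ours, not standard library calls.

```python
# Two's complement: the same 8 bits can be read as 0..255 or as -128..127.
def to_signed(byte):
    """Interpret an 8-bit value (0-255) as a signed two's-complement number."""
    return byte - 256 if byte >= 128 else byte

def to_unsigned(n):
    """Store a signed number (-128..127) back into 8 bits."""
    return n & 0xFF   # keep only the low 8 bits

print(to_signed(0b11111111))  # -> -1   (all ones is -1 in two's complement)
print(to_unsigned(-1))        # -> 255  (the same bit pattern, read unsigned)
```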

The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed.

Computer main memory comes in two principal varieties:

RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, therefore the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM however, so its use is restricted to applications where high speed is unnecessary.[109]

In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part.
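Cache behaviour (frequently used data answered quickly, least recently used data evicted) can be sketched with a small dictionary sitting in front of a "slow" backing store. The `CachedMemory` class and its parameters are illustrative only, not a model of any real hardware cache.

```python
from collections import OrderedDict

class CachedMemory:
    """A toy LRU cache in front of a slower main memory."""
    def __init__(self, memory, cache_size=4):
        self.memory = memory
        self.cache = OrderedDict()
        self.cache_size = cache_size
        self.hits = self.misses = 0

    def read(self, address):
        if address in self.cache:
            self.hits += 1
            self.cache.move_to_end(address)   # mark as recently used
            return self.cache[address]
        self.misses += 1
        value = self.memory[address]          # the slow path: main memory
        self.cache[address] = value
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)    # evict the least recently used
        return value

mem = CachedMemory(list(range(100)))
for addr in [5, 6, 5, 6, 5]:                  # repeated accesses hit the cache
    mem.read(addr)
print(mem.hits, mem.misses)  # -> 3 2
```

As in real hardware, the programmer never asks for the cache explicitly: repeated reads of the same addresses are simply served faster.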

Software[]

Software refers to parts of the computer which do not have a material form, such as programs, data, protocols, etc. Software is that part of a computer system that consists of encoded information or computer instructions, in contrast to the physical hardware from which the system is built. Computer software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither can be realistically used on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware".

Networking and Internet[]

Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre.[110] In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET.[111] The technologies that made the Arpanet possible spread and evolved.

In time, the network spread beyond academic and military institutions and became known as the Internet. The emergence of networking involved a redefinition of the nature and boundaries of the computer. Computer operating systems and applications were modified to include the ability to define and access the resources of other computers on the network, such as peripheral devices, stored information, and the like, as extensions of the resources of an individual computer. Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies like Ethernet and ADSL saw computer networking become almost ubiquitous. In fact, the number of computers that are networked is growing phenomenally. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. "Wireless" networking, often utilizing mobile phone networks, has meant networking is becoming increasingly ubiquitous even in mobile computing environments.

Professions and organizations[]

As the use of computers has spread throughout society, the number of careers involving them has grown steadily.

Computer-related professions:

  • Hardware-related: Electrical engineering, Electronic engineering, Computer engineering, Telecommunications engineering, Optical engineering, Nanoengineering
  • Software-related: Computer science, Computer engineering, Desktop publishing, Human–computer interaction, Information technology, Information systems, Computational science, Software engineering, Video game industry, Web design

See also[]

  • Glossary of computers
  • Computability theory
  • Computer insecurity
  • Computer security
  • Glossary of computer hardware terms
  • History of computer science
  • List of computer term etymologies
  • List of fictional computers
  • List of pioneers in computer science
  • Pulse computation
  • TOP500 (list of most powerful computers)
  • Unconventional computing

References[]

  1. According to Schmandt-Besserat 1981, these clay containers contained tokens, the total of which were the count of objects being transferred. The containers thus served as something of a bill of lading or an accounts book. In order to avoid breaking open the containers, first, clay impressions of the tokens were placed on the outside of the containers, for the count; the shapes of the impressions were abstracted into stylized marks; finally, the abstract marks were systematically used as numerals; these numerals were finally formalized as numbers. Eventually (Schmandt-Besserat estimates it took 4000 years Archived 2012-01-30 at the Wayback Machine) the marks on the outside of the containers were all that were needed to convey the count, and the clay containers evolved into clay tablets with marks for the count.
  2. Robson, Eleanor (2008), Mathematics in Ancient Iraq, ISBN 978-0-691-09182-2. p.5: calculi were in use in Iraq for primitive accounting systems as early as 3200–3000 BCE, with commodity-specific counting representation systems. Balanced accounting was in use by 3000–2350 BCE, and a sexagesimal number system was in use 2350–2000 BCE.
  3. Robson has recommended at least one supplement to Schmandt-Besserat, e.g., a review Englund, R. (1993) "The origins of script" Science 260, 1670-1671
  4. Helaine Selin (12 March 2008). Encyclopaedia of the History of Science, Technology, and Medicine in Non-Western Cultures. Springer Science & Business Media. p. 1356. Bibcode:2008ehst.book.....S. ISBN 978-1-4020-4559-2.
  5. Pegg, Ed Jr. "Lebombo Bone". MathWorld.
  6. Darling, David (2004). The Universal Book of Mathematics From Abracadabra to Zeno's Paradoxes. John Wiley & Sons. ISBN 978-0-471-27047-8.
  7. Lazos 1994
  8. Savage-Smith, Emilie (1992). "Celestial Mapping" (PDF). In Harley, J. B.; Woodward, David (eds.). The History of Cartography, Volume 2, Book 1: Cartography in the Traditional Islamic and South Asian Societies. The History of Cartography. Vol. 2. Chicago, Illinois: University of Chicago Press. ISBN 0226316351.
  9. "Astrolabe". Encyclopedia Britannica. 2024-08-31. Retrieved 2024-09-11.
  10. Bean, Adam L. (2009). "Astrolabes". In Birx, H. James (ed.). Encyclopedia of Time: Science, Philosophy, Theology, & Culture. Vol. 1. SAGE. pp. 59–60. ISBN 978-1-4129-4164-8.
  11. Price, Derek de S. (1984). "A History of Calculating Machines". IEEE Micro. 4 (1): 22–52. doi:10.1109/MM.1984.291305.
  12. Ören, Tuncer (2001). "Advances in Computer and Information Sciences: From Abacus to Holonic Agents" (PDF). Turk J Elec Engin. 9 (1): 63–70.
  13. Donald Routledge Hill (1985). "Al-Biruni's mechanical calendar", Annals of Science 42, pp. 139–163.
  14. G. Wiet, V. Elisseeff, P. Wolff, J. Naudu (1975). History of Mankind, Vol 3: The Great medieval Civilisations, p. 649. George Allen & Unwin Ltd, UNESCO.
  15. "Abu Ishaq Ibrahim Ibn Yahya Al-Zarqali | Muslim Heritage". muslimheritage.com. Retrieved 2018-05-09.
  16. Fuat Sezgin "Catalogue of the Exhibition of the Institute for the History of Arabic-Islamic Science (at the Johann Wolfgang Goethe University", Frankfurt, Germany) Frankfurt Book Fair 2004, pp. 35 & 38.
  17. Charette, François (2006). "Archaeology: High tech from Ancient Greece". Nature. 444 (7119): 551–552. Bibcode:2006Natur.444..551C. doi:10.1038/444551a. PMID 17136077.
  18. Bedini, Silvio A.; Maddison, Francis R. (1966). "Mechanical Universe: The Astrarium of Giovanni de' Dondi". Transactions of the American Philosophical Society. 56 (5): 1–69. doi:10.2307/1006002. JSTOR 1006002.
  19. E. S. Kennedy (1950), "A Fifteenth-Century Planetary Computer: al-Kashi's Tabaq al-Manateq I. Motion of the Sun and Moon in Longitude", Isis 41 (2), p. 180-183.
  20. E. S. Kennedy (1952), "A Fifteenth-Century Planetary Computer: al-Kashi's Tabaq al-Maneteq II: Longitudes, Distances, and Equations of the Planets", Isis 43 (1), pp. 42-50.
  21. E. S. Kennedy (1951), "An Islamic Computer for Planetary Latitudes", Journal of the American Oriental Society 71 (1), pp. 13-21.
  22. E. S. Kennedy (1947), "Al-Kashi's Plate of Conjunctions", Isis 38 (1-2), p. 56-59 [56].
  23. E. S. Kennedy (1950), "A Fifteenth-Century Planetary Computer: al-Kashi's Tabaq al-Manateq I. Motion of the Sun and Moon in Longitude", Isis 41 (2), pp. 180-183.
  24. Ancient Discoveries, Episode 11: Ancient Robots, History Channel, retrieved 2008-09-06
  25. Howard R. Turner (1997), Science in Medieval Islam: An Illustrated Introduction, p. 184. University of Texas Press, ISBN 0292781490.
  26. Salim Al-Hassani (13 March 2008). "How it Works: Mechanism of the Castle Clock". FSTC. Retrieved 2008-09-06.
  27. Halacy, Daniel Stephen (1970). Charles Babbage, Father of the Computer. Crowell-Collier Press. ISBN 978-0-02-741370-0.
  28. "Babbage". Online stuff. Science Museum. 19 January 2007. Retrieved 1 August 2012.
  29. "Let's build Babbage's ultimate mechanical computer". opinion. New Scientist. 23 December 2010. Retrieved 1 August 2012.
  30. Menabrea & Lovelace 1843.
  31. History of Research on Switching Theory in Japan, IEEJ Transactions on Fundamentals and Materials, Vol. 124 (2004) No. 8, pp. 720-726, Institute of Electrical Engineers of Japan
  32. Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics, IPSJ Computer Museum, Information Processing Society of Japan
  33. Radomir S. Stanković (University of Niš), Jaakko T. Astola (Tampere University of Technology), Mark G. Karpovsky (Boston University), Some Historical Remarks on Switching Theory, 2007, DOI 10.1.1.66.1248
  34. Radomir S. Stanković, Jaakko Astola (2008), Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory, TICSP Series #40, Tampere International Center for Signal Processing, Tampere University of Technology
  35. Turing, A. M. (1937). "On Computable Numbers, with an Application to the Entscheidungsproblem". Proceedings of the London Mathematical Society. 2. 42 (1): 230–265. doi:10.1112/plms/s2-42.1.230.
  36. "von Neumann ... firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing—insofar as not anticipated by Babbage, Lovelace and others." Letter by Stanley Frankel to Brian Randell, 1972, quoted in Jack Copeland (2004) The Essential Turing, p22.
  37. Zuse, Horst. "Part 4: Konrad Zuse's Z1 and Z3 Computers". The Life and Work of Konrad Zuse. EPE Online. Archived from the original on 1 June 2008. Retrieved 17 June 2008.
  38. Zuse, Konrad (2010) [1984], The Computer – My Life Translated by McKenna, Patricia and Ross, J. Andrew from: Der Computer, mein Lebenswerk (1984), Berlin/Heidelberg: Springer-Verlag, ISBN 978-3-642-08151-4
  39. Salz Trautman, Peggy (20 April 1994). "A Computer Pioneer Rediscovered, 50 Years On". The New York Times. https://www.nytimes.com/1994/04/20/news/20iht-zuse.html. 
  40. Zuse, Konrad (1993). Der Computer. Mein Lebenswerk (in German) (3rd ed.). Berlin: Springer-Verlag. p. 55. ISBN 978-3-540-56292-4.
  41. "Crash! The Story of IT: Zuse". Archived from the original on 18 September 2016. Retrieved 1 June 2016.
  42. Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing. 20 (3): 51–54. doi:10.1109/85.707574.
  43. Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer" (PDF).
  44. Early Computers, Information Processing Society of Japan
  45. 【Electrotechnical Laboratory】 ETL Mark III Transistor-Based Computer, Information Processing Society of Japan
  46. Early Computers: Brief History, Information Processing Society of Japan
  47. Martin Fransman (1993), The Market and Beyond: Cooperation and Competition in Information Technology, page 19, Cambridge University Press
  48. Early Computers, Information Processing Society of Japan
  49. 【Kyoto University, Toshiba】 KT-Pilot, Information Processing Society of Japan
  50. Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University Press. ISBN 9781139643771.
  51. Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 Volumes. John Wiley & Sons. p. 14. ISBN 9783527340538.
  52. Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
  53. Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
  54. Cooke-Yarborough, E. H. (June 1998), "Some early transistor applications in the UK", Engineering Science & Education Journal, 7 (3): 100–106, doi:10.1049/esej:19980301, ISSN 0963-7346, retrieved 7 June 2009 (subscription required)
  55. Cooke-Yarborough, E.H. (1957). Introduction to Transistor Circuits. Edinburgh: Oliver and Boyd. p. 139.
  56. Early Computers, Information Processing Society of Japan
  57. 【Electrotechnical Laboratory】 ETL Mark III Transistor-Based Computer, Information Processing Society of Japan
  58. Early Computers: Brief History, Information Processing Society of Japan
  59. Martin Fransman (1993), The Market and Beyond: Cooperation and Competition in Information Technology, page 19, Cambridge University Press
  60. "1960: Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine: A Timeline of Semiconductors in Computers. Computer History Museum. Retrieved August 31, 2019.
  61. Motoyoshi, M. (2009). "Through-Silicon Via (TSV)" (PDF). Proceedings of the IEEE. 97 (1): 43–48. doi:10.1109/JPROC.2008.2007462. ISSN 0018-9219.
  62. "Transistors Keep Moore's Law Alive". EETimes. 12 December 2018. https://www.eetimes.com/author.asp?section_id=36&doc_id=1334068. Retrieved 18 July 2019. 
  63. "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
  64. Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
  65. "Transistors - an overview". ScienceDirect. Retrieved 8 August 2019.
  66. Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610. The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution.
  67. Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETs. Cambridge University Press. p. vii. ISBN 9781107434493.
  68. "Remarks by Director Iancu at the 2019 International Intellectual Property Conference". United States Patent and Trademark Office. June 10, 2019. Retrieved 20 July 2019.
  69. "Dawon Kahng". National Inventors Hall of Fame. Retrieved 27 June 2019.
  70. "Martin Atalla in Inventors Hall of Fame, 2009". Retrieved 21 June 2013.
  71. "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
  72. Kilby, Jack (2000), Nobel lecture (PDF), Stockholm: Nobel Foundation, retrieved 15 May 2008
  73. The Chip that Jack Built (c. 2008), Texas Instruments. Retrieved 29 May 2008.
  74. Saxena, Arjun N. (2009). Invention of Integrated Circuits: Untold Important Facts. World Scientific. p. 140. ISBN 9789812814456.
  75. "Integrated circuits". NASA. Retrieved 13 August 2019.
  76. Robert Noyce's Unitary circuit, US patent 2981877, "Semiconductor device-and-lead structure", issued 1961-04-25, assigned to Fairchild Semiconductor Corporation 
  77. "1959: Practical Monolithic Integrated Circuit Concept Patented". Computer History Museum. Retrieved 13 August 2019.
  78. Lojek, Bo (2007). History of Semiconductor Engineering. Springer Science & Business Media. p. 120. ISBN 9783540342588.
  79. Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. p. 46. ISBN 9780801886393.
  80. Huff, Howard R.; Tsuya, H.; Gösele, U. (1998). Silicon Materials Science and Technology: Proceedings of the Eighth International Symposium on Silicon Materials Science and Technology. Electrochemical Society. pp. 181–182.
  81. Kuo, Yue (1 January 2013). "Thin Film Transistor Technology—Past, Present, and Future" (PDF). The Electrochemical Society Interface. 22 (1): 55–61. doi:10.1149/2.F06131if. ISSN 1064-8208.
  82. "1960: Metal Oxide Semiconductor (MOS) Transistor Demonstrated". Computer History Museum.
  83. Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. pp. 22–25. ISBN 9780801886393.
  84. "Tortoise of Transistors Wins the Race - CHM Revolution". Computer History Museum. Retrieved 22 July 2019.
  85. "1964 – First Commercial MOS IC Introduced". Computer History Museum.
  86. "1968: Silicon Gate Technology Developed for ICs". Computer History Museum. Retrieved 22 July 2019.
  87. Kuo, Yue (1 January 2013). "Thin Film Transistor Technology—Past, Present, and Future" (PDF). The Electrochemical Society Interface. 22 (1): 55–61. doi:10.1149/2.F06131if. ISSN 1064-8208.
  88. "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum. Retrieved 22 July 2019.
  89. Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN 9781107052406.
  90. Intel's First Microprocessor—the Intel 4004, Intel Corp., November 1971, archived from the original on 13 May 2008, retrieved 17 May 2008
  91. The Intel 4004 (1971) die was 12 mm2, composed of 2300 transistors; by comparison, the Pentium Pro was 306 mm2, composed of 5.5 million transistors, according to Patterson, David; Hennessy, John (1998), Computer Organization and Design, San Francisco: Morgan Kaufmann, pp. 27–39, ISBN 978-1-55860-428-5
  92. Federico Faggin, The Making of the First Microprocessor, IEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
  93. Intel_4004 1971
  94. Federico Faggin, The Making of the First Microprocessor, IEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
  95. Nigel Tout. "The Busicom 141-PF calculator and the Intel 4004 microprocessor". Retrieved November 15, 2009.
  96. Aspray, William (1994-05-25). "Oral-History: Tadashi Sasaki". Interview #211 for the Center for the History of Electrical Engineering. The Institute of Electrical and Electronics Engineers, Inc. Retrieved 2013-01-02.
  97. Intel 1971.
  98. 【Sord】 SMP80/x series, Information Processing Society of Japan
  99. FR2487094A1 patent: Notebook computer system small
  100. 【Shinshu Seiki / Suwa Seikosha】 HC-20, Information Processing Society of Japan
  101. Epson HX-20, Old Computers
  102. Michael R. Peres, The Focal Encyclopedia of Photography, page 306, Taylor & Francis
  103. "Epson SX-20 Promotional Brochure" (PDF). Epson America, Inc. 1987. Retrieved 2 November 2008.
  104. 【Shinshu Seiki / Suwa Seikosha】 HC-20, Information Processing Society of Japan
  105. Michael R. Peres, The Focal Encyclopedia of Photography, page 306, Taylor & Francis
  106. Chartier, David (23 December 2008). "Global notebook shipments finally overtake desktops". Ars Technica. https://arstechnica.com/uncategorized/2008/12/global-notebook-shipments-finally-overtake-desktops/. 
  107. IDC (25 July 2013). "Growth Accelerates in the Worldwide Mobile Phone and Smartphone Markets in the Second Quarter, According to IDC". Archived from the original on 26 June 2014.
  108. "7 dazzling smartphone improvements with Qualcomm's Snapdragon 835 chip". 3 January 2017.
  109. Flash memory also may only be rewritten a limited number of times before wearing out, making it less useful for heavy random access usage. (Verma & Mielke 1988)
  110. Agatha C. Hughes (2000). Systems, Experts, and Computers. MIT Press. p. 161. ISBN 978-0-262-08285-3. The experience of SAGE helped make possible the first truly large-scale commercial real-time network: the SABRE computerized airline reservations system ...
  111. Leiner, Barry M.; Cerf, Vinton G.; Clark, David D.; Kahn, Robert E.; Kleinrock, Leonard; Lynch, Daniel C.; Postel, Jon; Roberts, Larry G.; Wolf, Stephen (1999). "A Brief History of the Internet". Internet Society. arXiv:cs/9901011. Bibcode:1999cs........1011L. Retrieved 20 September 2008.

Notes[]