
Wikipedysta:Wpedzich/Historia sprzętu komputerowego

From Wikipedia

Original: here

Computing hardware is a platform for information processing (block diagram).


The history of computing hardware covers the history of computer hardware,[1] its architecture, and its impact on software. Originally, calculations were performed by humans, for whom "computer" was a job title. See the history of computing article for methods intended for pen and paper, with or without the aid of tables. For a detailed timeline of events, see the computing timeline article.

The Von Neumann architecture unifies current computing hardware implementations.[2][3] The major elements of computing hardware are input,[4] output,[5] control,[6] CPU,[7] and memory,[8] and each has undergone successive refinement over the history of computing hardware. Beginning with purely mechanical mechanisms, hardware then turned to physical analogs of the quantities being computed: analog computers have used lengths, pressures, voltages, currents, and even water and air to represent the results of calculations.[9] Eventually the voltages and currents were standardized, and digital computers were developed over a period of evolution dating back centuries. Digital computing elements have ranged from mechanical gears, to electromechanical relays, to vacuum tubes, to transistors, and to integrated circuits, the latest of which implement the Von Neumann architecture.

Since digital computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers. The degree of improvement in computing hardware has triggered world-wide use of the technology. Even as performance has improved, the price has declined,[10] until computers have become commodities, accessible to ever-increasing sectors[11] of the world's population. Computing hardware thus became a platform for uses other than computation, such as automation, communication, control, entertainment, and education. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements.


Earliest calculators

Main article: Calculator.
Suanpan (the number represented on this abacus is 6,302,715,408)

Devices have been used to aid computation for thousands of years; Georges Ifrah notes that humans learned to count on their hands.[12] The earliest counting device was probably a form of tally stick. Later record-keeping aids include Phoenician clay shapes which represented counts of items, probably livestock or grains, in containers.[13] The abacus was used early on for arithmetic tasks; what we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.[14][15]

A number of analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150-100 BC), and are generally regarded as the first mechanical computers.[16] Other early versions of mechanical devices used to perform some type of calculations include the Planisphere; some of the inventions of Abū Rayhān al-Bīrūnī (c. AD 1000); the Equatorium of Abū Ishāq Ibrāhīm al-Zarqālī (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers, and the Astronomical Clock Tower of Su Song during the Song Dynasty.

John Napier (1550–1617) noted that multiplication and division of numbers could be performed by addition and subtraction, respectively, of logarithms of those numbers. While producing the first logarithmic tables Napier needed to perform many multiplications, and it was at this point that he designed Napier's bones, an abacus-like device used for multiplication and division.[17] Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s to allow multiplication and division operations to be carried out significantly faster than was previously possible.[18] Slide rules were used by generations of engineers and other mathematically inclined professional workers, until the invention of the pocket calculator. The engineers in the Apollo program to send a man to the moon made many of their calculations on slide rules, which were accurate to three or four significant figures.
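Napier's observation can be sketched in a few lines of modern code: multiplying two positive numbers reduces to adding their logarithms and exponentiating the sum, which is the same arithmetic a slide rule performs with physical lengths. The function names below are illustrative, not from the source.

```python
import math

def log_multiply(a, b):
    # Multiply two positive numbers by adding their logarithms,
    # the principle behind Napier's tables and the slide rule.
    return math.exp(math.log(a) + math.log(b))

def log_divide(a, b):
    # Divide by subtracting logarithms.
    return math.exp(math.log(a) - math.log(b))

# A slide rule read such results off a logarithmic scale to about
# three or four significant figures; floating point is far more precise.
print(round(log_multiply(6.0, 7.0), 6))  # 42.0
print(round(log_divide(42.0, 6.0), 6))   # 7.0
```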

A mechanical calculator from 1914. Note the lever used to rotate the gears.

In 1623, Wilhelm Schickard built the first digital mechanical calculator and thus became the father of the computing era.[19] Since his machine used techniques such as cogs and gears first developed for clocks, it was also called a 'calculating clock'. It was put to practical use by his friend Johannes Kepler, who revolutionized astronomy. An original calculator by Pascal (1640) is preserved in the Zwinger Museum. Machines by Blaise Pascal (the Pascaline, 1642) and Gottfried Wilhelm von Leibniz (1671) followed. Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply, and divide; it was mainly based on Leibniz's work. Mechanical calculators, like the base-ten addiator, the comptometer, the Monroe, the Curta and the Addo-X, remained in use until the 1970s. Leibniz also described the binary numeral system,[20] a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles Babbage's machines of the 1800s and even ENIAC of 1945) were based on the decimal system;[21] ENIAC's ring counters emulated the operation of the digit wheels of a mechanical adding machine.

1801: punched card technology

Main article: Analytical engine.
See also: Logic Piano.

As early as 1725 Basile Bouchon used a perforated paper loop in a loom to establish the pattern to be reproduced on cloth, and in 1726 his co-worker Jean-Baptiste Falcon improved on his design by using perforated paper cards attached to one another for efficiency in adapting and changing the program. The Bouchon-Falcon loom was semi-automatic and required manual feed of the program. In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability.

Punched card system of a music machine. Also referred to as Book music, a one-stop European medium for organs

In 1833, Charles Babbage moved on from developing his difference engine to a more complete design, the analytical engine, which would draw directly on Jacquard's punched cards for its programming.[22] In 1835, Babbage described his analytical engine. It was the plan of a general-purpose programmable computer, employing punch cards for input and a steam engine for power. One crucial invention was to use gears for the function served by the beads of an abacus. In a real sense, computers all contain automatic abacuses (technically called the arithmetic logic unit or floating-point unit). His initial idea was to use punch cards to control a machine that could calculate and print logarithmic tables with huge precision (a specific-purpose machine). Babbage's idea soon developed into a general-purpose programmable computer, his analytical engine. While his design was sound and the plans were probably correct, or at least debuggable, the project was slowed by various problems. Babbage was a difficult man to work with and argued with anyone who didn't respect his ideas. All the parts for his machine had to be made by hand, and small errors in each item could sum to large discrepancies in a machine with thousands of parts, so the parts had to be made to much finer tolerances than was usual at the time. The project dissolved in disputes with the artisan who built the parts, and ended with the depletion of government funding. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Federico Luigi, Conte Menabrea.[23]

A reconstruction of the Difference Engine II, an earlier, more limited design, has been operational since 1991 at the London Science Museum. With a few trivial changes, it works as Babbage designed it and shows that Babbage was right in theory. The museum used computer-operated machine tools to construct the necessary parts, following tolerances which a machinist of the period would have been able to achieve. Some feel that the technology of the time was unable to produce parts of sufficient precision, though this appears to be false. Babbage's failure to complete the engine can be chiefly attributed not only to difficulties of politics and financing, but also to his desire to develop an increasingly sophisticated computer.[24] Following in the footsteps of Babbage, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work published in 1909.

IBM 407 tabulating machine, (1961). Note the patch panel, which is visible on the right side of the machine. A row of toggle switches is above the patch panel.
Punch card with the extended alphabet.

In 1890, the United States Census Bureau used punched cards, sorting machines, and tabulating machines designed by Herman Hollerith to handle the flood of data from the decennial census mandated by the Constitution.[25] Hollerith's company eventually became the core of IBM. IBM developed punch card technology into a powerful tool for business data-processing and produced an extensive line of specialized unit record equipment. By 1950, the IBM card had become ubiquitous in industry and government. The warning printed on most cards intended for circulation as documents (checks, for example), "Do not fold, spindle or mutilate," became a motto for the post-World War II era.[26]

Leslie Comrie's articles on punched card methods and W. J. Eckert's 1940 publication, Punched Card Methods in Scientific Computation, described techniques sufficiently advanced to solve differential equations[27] or perform multiplication and division using floating point representations, all on punched cards and unit record machines. The Thomas J. Watson Astronomical Computing Bureau at Columbia University performed astronomical calculations representing the state of the art in computing.

In many computer installations, punched cards were used until (and after) the end of the 1970s. For example, science and engineering students at many universities around the world would submit their programming assignments to the local computer center in the form of a stack of cards, one card per program line, and then had to wait for the program to be queued for processing, compiled, and executed. In due course a printout of any results, marked with the submitter's identification, would be placed in an output tray outside the computer center. In many cases these results would comprise solely a printout of error messages regarding program syntax etc., necessitating another edit-compile-run cycle.[28] Also see Computer programming in the punch card era.

Punched cards are still used and manufactured to this day, and their distinctive dimensions[29] (and 80-column capacity) can still be recognized in forms, records, and programs around the world.

1930s–1960s: desktop calculators

Main article: Post–Turing machine.


The Curta calculator can also do multiplication and division

By the 1900s, earlier mechanical calculators, cash registers, accounting machines, and so on had been redesigned to use electric motors, with gear position as the representation for the state of a variable. In the 1920s, Lewis Fry Richardson's interest in weather prediction led him to study numerical analysis; to this day, the most powerful computers on Earth are needed to adequately model the Navier-Stokes equations, which govern the weather.[30] Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. During the Manhattan Project, future Nobel laureate Richard Feynman was the supervisor of a roomful of human computers, many of them women mathematicians, who understood the differential equations being solved for the war effort. Even the renowned Stanisław Ulam was pressed into service to translate the mathematics into computable approximations for the hydrogen bomb after the war.[31]

In 1948, the Curta was introduced. This was a small, portable, mechanical calculator about the size of a pepper grinder. Over time, during the 1950s and 1960s, a variety of different brands of mechanical calculator appeared on the market. The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design and 13-digit capacity on a CRT display, and introduced reverse Polish notation (RPN) to the calculator market at a price of $2200. The model EC-132 added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms.
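Reverse Polish notation, which the EC-130 brought to the calculator market, dispenses with parentheses by placing each operator after its operands and evaluating with a stack. A minimal sketch of the idea (illustrative, not the EC-130's actual mechanism):

```python
def eval_rpn(tokens):
    # Evaluate a reverse Polish notation expression with a stack:
    # operands are pushed; an operator pops two values and pushes the result.
    ops = {
        "+": lambda x, y: x + y,
        "-": lambda x, y: x - y,
        "*": lambda x, y: x * y,
        "/": lambda x, y: x / y,
    }
    stack = []
    for tok in tokens:
        if tok in ops:
            y = stack.pop()  # second operand comes off first
            x = stack.pop()
            stack.append(ops[tok](x, y))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2, entered without any parentheses:
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```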

Advanced analog computers

Main article: Analogy.
Cambridge differential analyzer, 1938

Before World War II, mechanical and electrical analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties — the position and motion of wheels or the voltage and current of electronic components — and the mathematics of other physical phenomena, e.g. ballistic trajectories, inertia, resonance, energy transfer, momentum, etc.[32]

Modeling physical phenomena with electrical voltages and currents[33][34][35] as the analog quantities, yields great advantage over using mechanical models:

1) Electrical components are smaller and cheaper; they are more easily constructed and exercised.
2) Though otherwise similar, electrical phenomena can be made to occur in conveniently short time frames.

Fundamentally, these analog systems work by creating electrical analogs of other systems, allowing users to predict the behavior of the systems of interest by observing the electrical analogs. The most useful of the analogies was the way the small-scale behavior could be represented with integral and differential equations, and could thus be used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an electrical example is the Mallock machine built in 1941. A planimeter is a device which computes integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and the controlling element. Unlike modern digital computers, analog computers are not very flexible, and need to be reconfigured (i.e., reprogrammed) manually to switch from one problem to another. Analog computers had an advantage over early digital computers in that they could solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited. But as digital computers became faster and gained larger memories (e.g., RAM or internal storage), they almost entirely displaced analog computers. Computer programming, or coding, arose as another human profession.
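The integral-equation analogy can be mimicked digitally: where a differential analyzer accumulated a quantity continuously with a wheel-and-disc integrator, a program accumulates it in discrete steps. A minimal forward-Euler sketch (the function and its parameters are illustrative):

```python
def integrate(rate_fn, y0, steps, dt):
    # Digital stand-in for an analog integrator: accumulate
    # dy = f(t, y) * dt over many small steps (forward Euler).
    y, t = y0, 0.0
    for _ in range(steps):
        y += rate_fn(t, y) * dt
        t += dt
    return y

# Solve dy/dt = -y with y(0) = 1; at t = 1 the exact answer is
# 1/e ≈ 0.3679, which the discrete accumulation approaches.
print(round(integrate(lambda t, y: -y, 1.0, 1000, 1e-3), 3))  # 0.368
```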

A Smith Chart is a well-known nomogram.

Since computers were rare in this era, the solutions were often hard-coded into paper forms such as graphs and nomograms,[36] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system. Some of the most widely deployed analog computers included devices for aiming weapons, such as the Norden bombsight[37] and fire-control systems,[38] such as Arthur Pollen's Argo system for naval vessels. Some stayed in use for decades after WWII; the Mark I Fire Control Computer was deployed by the United States Navy on a variety of ships from destroyers to battleships. Other analog computers included the Heathkit EC-1 and the hydraulic MONIAC Computer, which modeled econometric flows.[39]

The art of analog computing reached its zenith with the differential analyzer,[40] invented in 1876 by James Thomson and built by H. W. Nieman and Vannevar Bush at MIT starting in 1927. Fewer than a dozen of these devices were ever built; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications. In a digital device, precision is the limitation,[41] whereas in an analog device it is accuracy that is limited.[42] As electronics progressed during the twentieth century, the problems of operating at low voltages while maintaining high signal-to-noise ratios[43] were steadily addressed, for a digital circuit is a specialized form of analog circuit, intended to operate at standardized settings (continuing in the same vein, logic gates can be realized as forms of digital circuits).

Early digital computers

See also: computer science.
Punched tape programs would be much longer than the short fragment shown.

The era of modern computing began with a flurry of development before and during World War II, as electronic circuit elements[44] replaced mechanical equivalents and digital calculations replaced analog calculations. Machines such as the Atanasoff–Berry Computer, the Z3, the Colossus, and the ENIAC were built by hand using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium.

In this era, a number of different machines were produced with steadily advancing capabilities. At the beginning of this period, nothing remotely resembling a modern computer existed, except in the long-lost plans of Charles Babbage and the mathematical musings of Alan Turing and others. At the end of the era, devices like the EDSAC had been built, and are universally agreed to be digital computers. Defining a single point in the series as the "first computer" misses many subtleties.

Alan Turing's 1936 paper[45] proved enormously influential in computing and computer science in two ways. Its main purpose was to prove that there were problems (namely the halting problem) that could not be solved by any sequential process. In doing so, Turing provided a definition of a universal computer which executes a program stored on tape. This construct came to be called a Turing machine; it replaces Kurt Gödel's more cumbersome universal language based on arithmetics. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. This limited type of Turing completeness is sometimes viewed as a threshold capability separating general-purpose computers from their special-purpose predecessors.
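Turing's construct is simple enough to simulate directly: a finite table of (state, symbol) rules driving a head over an unbounded tape. A toy sketch (the machine and its encoding are illustrative, not Turing's original formulation):

```python
def run_tm(program, tape_str, state="start", max_steps=1000):
    # Minimal one-tape Turing machine. `program` maps
    # (state, symbol) -> (write, move, next_state); '_' is the blank.
    tape = dict(enumerate(tape_str))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = tape.get(head, "_")
        write, move, state = program[(state, sym)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# A machine that inverts a binary string, then halts at the first blank:
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(invert, "10110"))  # 01001
```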

Design of the von Neumann architecture (1947)

For a computing machine to be a practical general-purpose computer, there must be some convenient read-write mechanism, punched tape, for example. For full versatility, the Von Neumann architecture uses the same memory both to store programs and data; virtually all contemporary computers use this architecture (or some variant). While it is theoretically possible to implement a full computer entirely mechanically (as Babbage's design showed), electronics made possible the speed and later the miniaturization that characterize modern computers.

There were three parallel streams of computer development in the World War II era; the first was largely ignored and the second deliberately kept secret. The first was the German work of Konrad Zuse. The second was the secret development of the Colossus computer in the UK. Neither of these had much influence on the various computing projects in the United States. The third stream, Eckert and Mauchly's ENIAC and EDVAC, was widely publicized.[46][47]

Konrad Zuse's Z-series: the first program-controlled computers

Main articles: Konrad Zuse, Z1, Z2, Z3, Z4.
A reproduction of Zuse's Z1 computer.

Working in isolation in Germany, Konrad Zuse started construction in 1936 of his first Z-series calculators featuring memory and (initially limited) programmability. Zuse's purely mechanical, but already binary Z1, finished in 1938, never worked reliably due to problems with the precision of parts.

Zuse's subsequent machine, the Z3, was finished in 1941. It was based on telephone relays and did work satisfactorily. The Z3 thus became the first functional program-controlled, all-purpose, digital computer. In many ways it was quite similar to modern machines, pioneering numerous advances, such as floating point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. This is sometimes viewed as the main reason why Zuse succeeded where Babbage failed.

Programs were fed into the Z3 on punched film. Conditional jumps were missing, but it has since been proved theoretically that the Z3 was still a universal computer (ignoring its physical storage size limitations). In two 1936 patent applications, Konrad Zuse also anticipated that machine instructions could be stored in the same storage used for data – the key insight of what became known as the Von Neumann architecture, first implemented in the later British EDSAC design (1949). Zuse also claimed to have designed the first high-level programming language, Plankalkül, in 1945 (published in 1948), although it was implemented for the first time in 2000, by a team around Raúl Rojas at the Free University of Berlin – five years after Zuse died.

Zuse suffered setbacks during World War II when some of his machines were destroyed in the course of Allied bombing campaigns. Apparently his work remained largely unknown to engineers in the UK and US until much later, although at least IBM was aware of it as it financed his post-war startup company in 1946 in return for an option on Zuse's patents.

Colossus

Main article: Colossus computer.
Colossus was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park, just outside Milton Keynes, achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was attacked with the help of electro-mechanical machines called bombes. The bombe, designed by Alan Turing and Gordon Welchman after the Polish cryptographic bomba of Marian Rejewski (1938), ruled out possible Enigma settings by performing chains of logical deductions implemented electrically. Most possibilities led to a contradiction, and the few remaining could be tested by hand.

The Germans also developed a series of teleprinter encryption systems, quite different from Enigma. The Lorenz SZ 40/42 machine was used for high-level Army communications, termed "Tunny" by the British. The first intercepts of Lorenz messages began in 1941. As part of an attack on Tunny, Professor Max Newman and his colleagues helped specify the Colossus. The Mk I Colossus was built between March and December 1943 by Tommy Flowers and his colleagues at the Post Office Research Station at Dollis Hill in London and then shipped to Bletchley Park.

Colossus was the first totally electronic computing device. It used a large number of valves (vacuum tubes), had paper-tape input, and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill personally issued an order for their destruction into pieces no larger than a man's hand. Due to this secrecy the Colossi were not included in many histories of computing. A reconstructed copy of one of the Colossus machines is now on display at Bletchley Park.

American developments

In 1937, Claude Shannon produced his master's thesis[48] at MIT, which implemented Boolean algebra using relays and switches for the first time in history. Entitled A Symbolic Analysis of Relay and Switching Circuits, Shannon's thesis essentially founded practical digital circuit design. George Stibitz completed a relay-based computer he dubbed the "Model K" at Bell Labs in November 1937. Bell Labs authorized a full research program in late 1938 with Stibitz at the helm. Their Complex Number Calculator,[49] completed January 8, 1940, was able to calculate with complex numbers. In a demonstration at the American Mathematical Society conference at Dartmouth College on September 11, 1940, Stibitz was able to send the Complex Number Calculator remote commands over telephone lines by teletype. It was the first computing machine ever used remotely, in this case over a phone line. Among the participants who witnessed the demonstration were John von Neumann, John Mauchly, and Norbert Wiener, who wrote about it in his memoirs.

Atanasoff–Berry Computer replica on the first floor of the Durham Center, Iowa State University

In 1939, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the Atanasoff–Berry Computer (ABC),[50] a special-purpose digital electronic calculator for solving systems of linear equations. (The original goal was to solve 29 simultaneous equations in 29 unknowns, but the punch card mechanism proved error-prone, and the completed machine was able to solve only a few equations at a time.) The design used over 300 vacuum tubes for high speed and employed capacitors fixed in a mechanically rotating drum for memory. Though the ABC machine was not programmable, it was the first to use electronic circuits. ENIAC co-inventor John Mauchly examined the ABC in June 1941, and its influence on the design of the later ENIAC machine is a matter of contention among computer historians. The ABC was largely forgotten until it became the focus of the lawsuit Honeywell v. Sperry Rand, the ruling of which invalidated the ENIAC patent (and several others) as, among many reasons, having been anticipated by Atanasoff's work.

In 1939, development began at IBM's Endicott laboratories on the Harvard Mark I. Known officially as the Automatic Sequence Controlled Calculator,[51] the Mark I was a general purpose electro-mechanical computer built with IBM financing and with assistance from IBM personnel, under the direction of Harvard mathematician Howard Aiken. Its design was influenced by Babbage's Analytical Engine, using decimal arithmetic and storage wheels and rotary switches in addition to electromagnetic relays. It was programmable via punched paper tape, and contained several calculation units working in parallel. Later versions contained several paper tape readers and the machine could switch between readers based on a condition. Nevertheless, the machine was not quite Turing-complete. The Mark I was moved to Harvard University and began operation in May 1944.

ENIAC

Main article: ENIAC.
ENIAC performed ballistics trajectory calculations with 160 kW of power.

The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic general-purpose computer.[52] Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, it was 1,000 times faster than the Harvard Mark I. ENIAC's development and construction lasted from 1943 to full operation at the end of 1945.

When its design was proposed, many researchers believed that the thousands of delicate valves (i.e. vacuum tubes) would burn out often enough that the ENIAC would be so frequently down for repairs as to be useless. It was, however, capable of up to thousands of operations per second for hours at a time between valve failures, and it demonstrated to potential customers that electronics could be useful for large-scale computing. That public confidence proved crucial in the years that followed.

ENIAC was unambiguously a Turing-complete device. A "program" on the ENIAC, however, was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that evolved from it. To program it meant to rewire it.[53] (Improvements completed in 1948 made it possible to execute stored programs set in function table memory, which made programming less a "one-off" effort, and more systematic.) It was possible to run operations in parallel, as it could be wired to operate multiple accumulators simultaneously. Thus the sequential operation which is the hallmark of a von Neumann machine occurred after ENIAC.

First-generation von Neumann machines and other works

Main article: algorithm.

Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a stored-program computer, EDVAC. John von Neumann was credited with a widely circulated report describing the EDVAC design, in which both programs and working data were stored in a single, unified store. This basic design, now known as the von Neumann architecture, would serve as the foundation for the continued development of ENIAC's successors.[54]

Magnetic tape: the 10.5-inch reel of 9-track tape has been in continuous use for 50 years. It will be the primary data storage mechanism when CERN's Large Hadron Collider comes online in 2008.
Magnetic core memory remained in use until past the mid-1970s, when semiconductor memories became more economically feasible. Each core is one bit.

In this generation, temporary or working storage was provided by acoustic delay lines, which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data. A series of acoustic pulses is sent along a tube; as each pulse reaches the end of the tube, the circuitry detects whether it represents a 1 or a 0 and causes the oscillator to re-send it. Other machines used Williams tubes, which exploit the ability of a television picture tube to store and retrieve data. By 1954, magnetic core memory[55] was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s.
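The recirculating storage described above can be sketched as a loop of pulse slots in which a bit is accessible only as it emerges from the line. The following toy model is illustrative only (the class and method names are hypothetical, not any historical machine's design):

```python
from collections import deque

class DelayLineMemory:
    """Toy model of an acoustic delay line: bits circulate in a fixed
    loop and can only be read or rewritten as each pulse emerges."""

    def __init__(self, n_slots):
        self.line = deque([0] * n_slots)   # pulses currently in transit

    def tick(self, write_bit=None):
        """One pulse period: a bit emerges from the end of the line and
        is either re-sent unchanged or replaced by a new bit."""
        emerging = self.line.popleft()
        self.line.append(emerging if write_bit is None else write_bit)
        return emerging

# Fill a 3-slot line with the pattern 1, 0, 1 ...
mem = DelayLineMemory(3)
for bit in (1, 0, 1):
    mem.tick(write_bit=bit)

# ... and the same pattern re-emerges, in order, on every circulation.
pattern = [mem.tick() for _ in range(3)]
print(pattern)  # [1, 0, 1]
```

Access is inherently sequential: a bit is available only when its pulse reaches the end of the tube, which is why access time on real delay-line machines depended on the data's position in the line.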

The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine, developed by Frederic C. Williams and Tom Kilburn and built at the University of Manchester in 1948;[56] it was followed in 1949 by the Manchester Mark I computer, which functioned as a complete system using the Williams tube and magnetic drum for memory, and also introduced index registers.[57] The other contender for the title "first digital stored-program computer" was EDSAC, designed and constructed at the University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, with increased reliability. Some view the Manchester Mark I / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture.

The first universal programmable computer in the Soviet Union was created by a team of scientists under the direction of Sergei Alekseyevich Lebedev at the Kiev Institute of Electrotechnology in the Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.[58]

In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. By 1951 the LEO I computer was operational and ran the world's first regular routine office computer job.

Manchester University's machine became the prototype for the Ferranti Mark I. The first Ferranti Mark I machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.

In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each. UNIVAC was the first 'mass produced' computer; all predecessors had been 'one-off' units. It used 5,200 vacuum tubes and consumed 125 kW of power. For memory it used a mercury delay line capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). Unlike IBM machines, it was not equipped with a punched card reader but with 1930s-style metal magnetic tape input, making it incompatible with some existing commercial data stores. High-speed punched paper tape and modern-style magnetic tapes were used for input/output by other computers of the era.

In November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored program computer.

In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general purpose programming language, Fortran, was also being developed at IBM for the 704 during 1955 and 1956 and released in early 1957. (Konrad Zuse's 1945 design of the high-level language Plankalkül was not implemented at that time.)

IBM 650 front panel wiring.

IBM introduced a smaller, more affordable computer in 1954 that proved very popular. The IBM 650 weighed over 900 kg; the attached power supply weighed around 1,350 kg, and both were housed in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000, or could be leased for $3,500 a month. Its drum memory originally held only 2,000 ten-digit words, and required arcane programming for efficient computing. Memory limitations such as this were to dominate programming for decades afterward, until the evolution of hardware capabilities and a programming model more sympathetic to software development.

In 1955, Maurice Wilkes invented microprogramming,[59] which was later widely used in the CPUs and floating-point units of mainframe and other computers, such as the IBM 360 series. Microprogramming allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode).[60][61]
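Wilkes's idea can be illustrated with a small interpreter in which each machine instruction is defined by a table of micro-operations; extending the instruction set means adding rows to the table rather than changing the hardware. The opcodes and micro-operation names below are hypothetical, a sketch of the concept rather than any real machine's microcode:

```python
# Micro-operations: the primitive steps the (simulated) hardware performs.
def fetch_operand(inst, state):
    state["mdr"] = state["memory"][inst["addr"]]   # memory data register

def alu_add(inst, state):
    state["acc"] += state["mdr"]                   # accumulator

def load_acc(inst, state):
    state["acc"] = state["mdr"]

MICRO_OPS = {"fetch_operand": fetch_operand,
             "alu_add": alu_add,
             "load_acc": load_acc}

# The microcode table: each machine instruction is a built-in program
# of micro-operations -- this table is the "firmware".
MICROCODE = {
    "LOAD": ["fetch_operand", "load_acc"],
    "ADD":  ["fetch_operand", "alu_add"],
}

def run(program, memory):
    """Interpret machine instructions by stepping their microprograms."""
    state = {"acc": 0, "mdr": 0, "memory": memory}
    for inst in program:
        for step in MICROCODE[inst["op"]]:
            MICRO_OPS[step](inst, state)
    return state["acc"]

# LOAD mem[0] (=2), then ADD mem[1] (=3): the accumulator ends at 5.
result = run([{"op": "LOAD", "addr": 0}, {"op": "ADD", "addr": 1}], [2, 3])
print(result)  # 5
```

This is also the mechanism footnote 61 alludes to: by supplying a different microcode table, one machine can emulate the instruction set of another, as IBM did when moving customers from the 700/7000 series to the System/360.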

In 1956, IBM sold its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control). It used 50 24-inch (610 mm) metal disks, with 100 tracks per side. It could store 5 megabytes of data and cost $10,000 per megabyte. (As of 2008, magnetic storage, in the form of hard disks, cost less than one-fiftieth of a cent per megabyte.)

Second generation: transistors

Main articles: Computer architecture, Von Neumann architecture.
Die of a KSY34 high-frequency NPN bipolar junction transistor; base and emitter connected via bonded wires.

In the second half of the 1950s bipolar junction transistors (BJTs)[62] replaced vacuum tubes. Their use gave rise to the "second generation" of computers. Initially, it was believed that very few computers would ever be produced or used,[63] due in part to their size, cost, and the skill required to operate them or interpret their results. Transistors[64] greatly reduced computers' size, initial cost, and operating cost. The bipolar junction transistor[65] was invented in 1947.[66] With no electrical current flowing through a bipolar transistor's base-emitter path, the transistor's collector-emitter path blocks electrical current (turns fully off). With sufficient current flowing through the base-emitter path, the collector-emitter path passes current (turns fully on). Current flow and current blockage represent binary 1 and 0, or true and false (GE Transistor Manual, 7th ed., pages 139 through 204).[67] Compared to vacuum tubes, transistors have many advantages: they are less expensive to manufacture and are ten times faster, switching from 1 to 0 in millionths or billionths of a second. Transistor volume is measured in cubic millimeters, compared to vacuum tubes' cubic centimeters. Transistors' lower operating temperature increased their reliability compared to vacuum tubes. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.
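The switch behaviour described above is what makes binary logic possible. A minimal sketch (idealised, ignoring voltages and timing) treats the transistor as a base-controlled switch and builds gates from it; the function names are illustrative, not drawn from the GE manual:

```python
def bjt_conducts(base_current):
    """Idealised BJT switch: the collector-emitter path passes current
    only when current flows through the base-emitter path."""
    return bool(base_current)

def nand(a, b):
    # Two switches in series pull the output low only when both conduct,
    # so the output is the negation of (a AND b).
    return not (bjt_conducts(a) and bjt_conducts(b))

# NAND is functionally complete: the other gates follow from it alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

truth_table = [int(nand(a, b)) for a in (0, 1) for b in (0, 1)]
print(truth_table)  # [1, 1, 1, 0]
```

Because every Boolean function can be composed from NAND alone, "tens of thousands of binary logic circuits" reduces, conceptually, to tens of thousands of such switches wired together.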

Typically, second-generation computers[68][69] were composed of large numbers of printed circuit boards, such as the IBM Standard Modular System,[70] each carrying one to four logic gates or flip-flops. One second-generation computer, the IBM 1401, captured about one third of the world market; IBM installed more than one hundred thousand 1401s between 1960 and 1964. This period also saw the only Italian attempt, the ELEA by Olivetti, produced in 110 units.

The IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. This unit is being restored at the Computer History Museum 50 years later.

Transistorized electronics improved not only the CPU (central processing unit) but also the peripheral devices. Second-generation disk data storage units were able to store tens of millions of letters and digits. Multiple peripherals could be connected to the CPU, increasing the total storage capacity to hundreds of millions of characters. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack could be exchanged for another in a few seconds. Even though the removable disks' capacity was smaller than that of fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions.

During the second generation, remote terminal units (often in the form of teletype machines) saw greatly increased use.[71] Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center.

Post-1960: third generation and beyond

Main article: history of computing hardware (1960s–present).
See also: integrated circuit, minicomputer, microprocessor, technology, software, design.
The integrated circuit from an Intel 8742, an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.

The explosion in the use of computers began with 'third generation' computers, which relied on Jack St. Clair Kilby's[72] and Robert Noyce's[73] independent invention of the integrated circuit (or microchip); this later led to the invention of the microprocessor[74] by Ted Hoff and Federico Faggin at Intel.[75] After semiconductor memories became commodities, computer software became less labor-intensive; programming languages became less arcane and more understandable to code in.[76] When CMOS field-effect-transistor-based logic gates supplanted bipolar transistors, computer power consumption decreased dramatically (a CMOS FET draws current only during the transition between logic states,[77] unlike the higher steady current draw of a BJT). This has allowed computing to become a commodity which is now ubiquitous, available in many forms: on the Internet; on satellites, aircraft, automobiles, and ships; in televisions, cellphones, and household appliances; in work-oriented tools and equipment; and in robots, toys, and games.

During the 1960s there was considerable overlap between second- and third-generation technologies.[78] IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued to manufacture second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automata were later also implemented in minicomputers and microprocessors, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business, and universities.[79] The microprocessor led to the development of the microcomputer: small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, co-founder of Apple Computer, is credited with developing the first mass-market home computers; however, his first computer, the Apple I, came out some time after the KIM-1 and Altair 8800, and the first Apple computer with graphics and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments. In the twenty-first century, multi-core CPUs became commercially available. Content-addressable memory (CAM) has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific.
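The stack-machine execution model of machines like the B5000, which resurfaced in later microprocessors and language virtual machines, can be sketched as a postfix evaluator. This is a simplification under stated assumptions: the opcode names are illustrative, not the B5000's actual instruction set.

```python
def eval_stack_machine(program):
    """Evaluate a postfix (reverse Polish) program on a pushdown stack,
    the execution model shared by stack machines and many language VMs."""
    stack = []
    for token in program:
        if token == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif token == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:                       # literal operand: push it
            stack.append(token)
    return stack.pop()

# (2 + 3) * 4 becomes the postfix sequence below -- no explicit
# registers or operand addresses are needed, which is part of what
# made stack machines simpler to program and to compile for.
print(eval_stack_machine([2, 3, "ADD", 4, "MUL"]))  # 20
```

The absence of named registers is the point: expressions compile directly to push/operate sequences, which is the influence on programming-language design the paragraph above refers to.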

There has been a recent (late twentieth and early twenty-first century) convergence of computer hardware and computer software techniques, called reconfigurable computing.[80][81] The concept is to take the software developed for the various virtual machines written in extant programming languages and implement it in hardware, using VHDL, Verilog, or other hardware description languages. These soft microprocessors can even execute programs in the C programming language.

An indication of the rapidity of development in this field can be inferred from the history of the seminal article.[82] By the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.[83]

Footnotes

  1. The term computer, meaning a piece of hardware, shifted to its current meaning in the years after the Manhattan Project, when people were still computers, as a job title.
  2. Backus, John. Can Programming be Liberated from the von Neumann Style?. Communications of the ACM 21, no. 8. 1977 ACM Turing Award Lecture.
  3. The major advance in this architecture over its prototype, ENIAC, is that a computer program can be stored in memory. Zuse's architecture did the same. However, von Neumann deferred his floating-point implementation until after ENIAC, while Zuse's Z3 implemented floating point.
  4. Historically, the medium for input has been punch card, keyboard, paper tape, or mouse click.
  5. Historically, the medium for output has been punch card, print, paper tape, or light bulb.
  6. Historically, control has been implemented by manual intervention (the earliest versions), switch setting (nineteenth century), patch panels (twentieth century), and then stored program, perhaps in microcode.
  7. The selection of the CPU dates back to the ENIAC decision (1945–1946) to implement arithmetic as part of the fundamental architecture, and to defer the implementation of floating-point arithmetic to a later date, as noted in David A. Patterson: Computer Organization and Design. John L. Hennessy. 1998, pp. 312–334. ISBN 1-55860-428-6.
  8. Memory is US usage; Storage is UK usage; the terms are not completely equivalent: Memory has the connotation of rapid access; Storage has the connotation of large capacity. (The use of the term Storage dates back to Ada Lovelace in the nineteenth century.) Magnetic core memory was much faster than disk, which was cheaper, with higher capacity. There is a hierarchy of storage. To this day, semiconductor memory is more expensive than disk or tape.
  9. Babbage's Analytical engine was intended to be steam powered.
  10. David A. Patterson: Computer Organization and Design. John L. Hennessy. 1998, p. 3. ISBN 1-55860-428-6. 
  11. Although the price of hardware has fallen dramatically in the past century, a computer system needs software and skilled use, to this day, and the total cost is still subject to economic forces, such as competition. For example, the open source movement has prompted some formerly proprietary software vendors to open up some of their software, so as to remain competitive in the market.
  12. Ifrah shows, for example, a picture of Boethius (who lived 480–524 or 525) reckoning on his fingers in Georges Ifrah: The Universal History of Numbers: From prehistory to the invention of the computer. 2000, p. 48. ISBN 0-471-39340-1. Translated from the French by David Bellos, E.F. Harding, Sophie Wood and Ian Monk. Ifrah's thesis is supported by idiomatic phrases from languages across the entire world.
  13. Schmandt-Besserat, Denise. Decipherment of the earliest tablets. Science 211: 283–285.
  14. Karl Menninger: Number Words and Number Symbols: A Cultural History of Numbers. 1992. German-to-English translation, M.I.T., 1969.
  15. The money ranged in form from metal coins, to the sea shells of Southeast Asia and Oceania
  16. Christos Lazos: The Antikythera Computer (Ο ΥΠΟΛΟΓΙΣΤΗΣ ΤΩΝ ΑΝΤΙΚΥΘΗΡΩΝ),. ΑΙΟΛΟΣ PUBLICATIONS GR, 1994. 
  17. A Spanish implementation of Napier's bones (1617), is documented in Montaner i Simon: Hispano-American Encyclopedic Dictionary. 1887. 
  18. Kells, Kern and Bland: The Log-Log Duplex Decitrig Slide Rule No. 4081: A Manual. Keuffel & Esser, 1943, p. 92. 
  19. Schmidhuber, Jürgen: Wilhelm Schickard (1592–1635), Father of the computer age. [accessed 2008-05-15].
  20. Gottfried Leibniz: Explication de l'Arithmétique Binaire. 1703. 
  21. Binary-coded decimal (BCD) is a numeric representation, or character encoding, which is still extant.
  22. Jones, Douglas W: Punched Cards: A brief illustrated technical history. The University of Iowa. [accessed 2008-05-15].
  23. Menabrea, Luigi Federico and Lovelace, Ada. Sketch of the Analytical Engine Invented by Charles Babbage. Scientific Memoirs, no. 3. 1843. With notes upon the Memoir by the Translator.
  24. Today, many in the computer field term this sort of obsession creeping featuritis.
  25. Szablon:Cite paper
  26. Lubar, Steve: "Do not fold, spindle or mutilate": A cultural history of the punched card. May 1991. [accessed 2006-10-31].
  27. Wallace Eckert: Punched Card Methods in Scientific Computation. 1940. 
  28. Fisk, Dale: Punch cards. Columbia University ACIS, 2005. [accessed 2008-05-19].
  29. Hollerith selected the size of the punch card to fit in the metal containers which held the American dollar bills of the day. The dollar is now smaller than it was then.
  30. Richardson, L F: The Collected Papers of Lewis Fry Richardson, Volume 1: Meteorology and numerical analysis. Ashford, Oliver M; Charnock H; Drazin, P G; Hunt, J C R; Smoker, P; Sutherland, Ian. Cambridge: 1993. ISBN 978-0521382977. 
  31. Stanisław Ulam: Adventures of a Mathematician. New York: 1983. (Autobiography.)
  32. "The same equations have the same solutions." — R. P. Feynman
  33. See, for example, Paul Horowitz: The Art of Electronics. Winfield Hill. 1989, ss. 1-44. ISBN 0-521-37095-7. 
  34. Electrical circuits are composed of elements with resistance, capacitance, and inductance; as of April 2008, researchers had just found a fourth basic circuit element, called a memristor.
  35. Chua, Leon O. Memristor—The Missing Circuit Element. IEEE Transactions on Circuit Theory. CT-18, 5, 507-519.
  36. Steinhaus, H.: Mathematical Snapshots. 3rd ed. New York: Dover, 1999, pp. 92–95, p. 301. 
  37. Norden M9 Bombsight. National Museum of the USAF. [accessed 2008-05-17].
  38. Singer in World War II, 1939-1945 - the M5 Director. Singer Manufacturing Co., 1946. [accessed 2008-05-17].
  39. Phillips, A.W.H.: The MONIAC. Reserve Bank Museum. [accessed 2006-05-17].
  40. (In French) Coriolis, Gaspard-Gustave. Note sur un moyen de tracer des courbes données par des équations différentielles. Journal de Mathématiques Pures et Appliquées, series I, 1: 5–9.
  41. The number of digits in the accumulator is a fundamental limitation to a computation. If a result exceeds the number of digits, this condition is called overflow.
  42. The noise level, compared to the signal level, is a fundamental limitation. See, for example, Roger E. Ziemer: Signals and Systems: Continuous and Discrete. William H. Tranter and D. Ronald Fannin. 1993. ISBN 0-02-431641-5. 
  43. Wilbur B. Davenport, Jr: An Introduction to the Theory of Random Signals and Noise. William L. Root. 1958. Library of Congress Catalog Card No. 57-10020.
  44. In this era, the circuit elements were relays, capacitors, inductors, and vacuum tubes.
  45. Turing, Alan. On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society. Serie 2, 42, 230-265. Errata appeared in Series 2, 43 (1937), pp 544 - 546. Other online versions: http://plms.oxfordjournals.org/cgi/reprint/s2-42/1/230 http://www.thocp.net/biographies/papers/turing_oncomputablenumbers_1936.pdf
  46. Moye, William T.: ENIAC: The Army-Sponsored Revolution. January 1996. [accessed 2008-05-17].
  47. Bergin, Thomas J. (ed.): Fifty Years of Army Computing: from ENIAC to MSRC. Army Research Laboratory and the U.S. Army Ordnance Center and School, 13–14 November 1996. [accessed 2008-05-17].
  48. Szablon:Cite paper
  49. U.S. Patent 2,668,661, "Complex Computer", filed April 1941, issued February 1954 (102 pages).
  50. January 15, 1941 notice in the Des Moines Register.
  51. Da Cruz, Frank: The IBM Automatic Sequence Controlled Calculator (ASCC). In: Columbia University Computing History: A Chronology of Computing at Columbia University [online]. Columbia University ACIS, 28 February 2008. [accessed 2008-05-17].
  52. Nancy Stern: From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers. Digital Press, 1981. ISBN 0-932376-14-2. 
  53. Six women did most of the programming of ENIAC.
  54. The title page, as submitted by Goldstine, reads: "First Draft of a Report on the EDVAC by John von Neumann, Contract No. W-670-ORD-4926, Between the United States Army Ordnance Department and the University of Pennsylvania Moore School of Electrical Engineering"
  55. An Wang, patent filed October 1949.
  56. Enticknap, Nicholas. Computing's Golden Jubilee. Resurrection, no. 20. The Computer Conservation Society. ISSN 0958-7403.
  57. R.B.E. Napper et al.: Manchester Mark I. Computer History Museum, The University of Manchester, 1998–1999. [accessed 2008-04-19].
  58. CSIRAC: Australia's first computer. 2005-06-03. [accessed 2007-12-21].
  59. Maurice Wilkes, Memoirs of a Computer Pioneer. The MIT Press. 1985. ISBN 0-262-23122-0
  60. Paul Horowitz: The Art of Electronics. Winfield Hill. 1989. ISBN 0-521-37095-7. 
  61. David A. Patterson: Computer Organization and Design. John L. Hennessy. 1998, p. 424. ISBN 1-55860-428-6. Horowitz and Hill note that when IBM was preparing its transition from the 700/7000 series to the S/360, it emulated the software of the older systems in microcode, so as to be able to run older programs on the new IBM 360.
  62. D. V. Morgan: Physics and Technology of Heterojunction Devices. Robin H. Williams. Institution of Electrical Engineers. 
  63. Bowden, Lord. The Language of Computers. American Scientist. 58, 43-53.
  64. A transistor is an electronic device of germanium, silicon or other semiconductor crystal to which dopants have been added in small quantities, in selected sections of the device.
  65. In 1947, Bardeen and Brattain prototyped the point-contact transistor, in form very much like a cat's whisker diode, which had inherent reliability problems. It was superseded by the BJT.
  66. Americans John Bardeen, Walter Brattain and William Shockley shared the 1956 Nobel Prize in Physics for their invention of the transistor.
  67. J. F. Cleary: GE Transistor Manual, 7ed. 1964. 
  68. Second generation computers include the CDC 1604, DEC PDP-1, IBM 7030 Stretch, IBM 7090, IBM 1401, IBM 1620, Sperry Rand Athena, Univac LARC and Western Electric 1ESS Switch.
  69. The 1ESS switch could not be marketed as a computer in order for AT&T to comply with its anti-monopoly consent decree, which also affected the Unix operating system.
  70. IBM: IBM Standard Modular System SMS Cards. IBM, 1960. [accessed 2008-03-06]. 
  71. Alan Newell used remote terminals to communicate cross-country with the RAND computers, as noted in Herbert Simon: Models of My Life. 1991. Sloan Foundation Series. 
  72. Szablon:Cite conference
  73. Robert Noyce's unitary circuit patent.
  74. Intel: Intel's First Microprocessor—the Intel® 4004. Intel Corp., November 1971. [accessed 2008-05-17].
  75. The Intel 4004 (1971) die was 12 mm², composed of 2,300 transistors; by comparison, the Pentium Pro die was 306 mm², composed of 5.5 million transistors, according to David A. Patterson: Computer Organization and Design. John L. Hennessy. Morgan Kaufmann, 1998, pp. 27–39. ISBN 1-55860-428-6. 
  76. For example, programming on a drum memory required that the programmer be aware of the real-time position of the read head, as the drum was spinning.
  77. Carver Mead: Introduction to VLSI Systems. Lynn Conway. Addison-Wesley, 1980. ISBN 0-201-04358-0. 
  78. In the defense field, considerable work was done in the computerized implementation of equations such as Kalman, R.E.. A new approach to linear filtering and prediction problems. Journal of Basic Engineering. 82, 1, 35-45. 1960.
  79. Richard H., Jr. Eckhouse: Minicomputer Systems: organization, programming, and applications (PDP-11). L. Robert Morris. 1979. ISBN 0-13-583914-9. 
  80. These new techniques use application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and complex programmable logic devices (CPLDs).
  81. David Pellerin: Practical FPGA Programming in C. Scott Thibault. Prentice Hall Modern Semiconductor Design Series, PH Signal Integrity Library, 22 April 2005. ISBN 0-13-154318-0. 
  82. Burks, Arthur W., Goldstine, Herman and von Neumann, John: Preliminary Discussion of the Logical Design of an Electronic Computing Instrument. Institute for Advanced Study, 1947. [accessed 2008-05-18]. Reprinted in Datamation, September–October 1962. Note that "preliminary discussion/design" was the term for what was later called system analysis/design, and later still, system architecture.
  83. Online access to the IEEE Annals of the History of Computing: IEEE Annals of the History of Computing. IEEE, series dates from 1979. [accessed 2008-05-19]. DBLP summarizes the Annals of the History of Computing year by year, back to 1996, so far.

References

Further reading

  • See List of books on the history of computing
