The first computing devices and machines. Calculating machines


Generations:

I. Computers built on vacuum tubes. Performance was about 20,000 operations per second, and each machine had its own programming language (examples: BESM, Strela).

II. In 1960, transistors, invented in 1948, came into use in computers. They were more reliable and durable and allowed larger RAM; one transistor could replace about 40 vacuum tubes and switched faster. Magnetic tapes were used as storage media (examples: Minsk-2, Ural-14).

III. In 1964, the first integrated circuits (ICs) appeared and came into wide use. An IC is a crystal with an area of about 10 mm²; one IC could replace 1,000 transistors, so a single crystal could do the work of the 30-ton ENIAC. It became possible to process several programs in parallel.

Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply and divide. It was based mainly on the work of Leibniz. Mechanical calculators that counted in decimal were used until the 1970s. Leibniz also described the binary number system, the central ingredient of all modern computers. However, until the 1940s many designs (including Charles Babbage's machines and even ENIAC in 1945) were based on the harder-to-implement decimal system.

Punched card systems

In 1801, Joseph Marie Jacquard developed a loom in which the woven pattern was determined by punched cards. The series of cards could be replaced, and changing the pattern required no changes to the mechanics of the loom. This was an important milestone in the history of programming. In 1838, Charles Babbage moved from developing the Difference Engine to designing the more complex Analytical Engine, whose programming principles traced directly back to Jacquard's punched cards. In 1890, the US Census Bureau used punched cards and sorting mechanisms developed by Herman Hollerith to process the flood of decennial census data mandated by the Constitution. Hollerith's company eventually became the core of IBM. That corporation developed punched card technology into a powerful tool for business data processing and produced an extensive line of specialized data recording equipment. By 1950, IBM technology had become ubiquitous in industry and government, and many computing installations continued to use punched cards until (and after) the late 1970s.

1835-1900s: First programmable machines

In 1835, Charles Babbage described his Analytical Engine. It was a general-purpose computer design that used punched cards as input data and program storage, and a steam engine as the power source. One of the key ideas was the use of gears to perform mathematical functions. Following in Babbage's footsteps, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a paper published in 1909.

1930s - 1960s: desktop calculators

The Felix adding machine was the most common in the USSR; it was produced from 1929 to 1978.

In 1948, the Curta appeared, a small mechanical calculator that could be held in one hand. In the 1950s and 1960s, several brands of similar devices appeared on the Western market. The first fully electronic desktop calculator was the British ANITA Mk. VII, which used a "Nixie" tube display and 177 miniature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It was entirely transistorized, had 13-digit resolution on a 5-inch cathode ray tube, and sold for $2,200 on the calculator market. Square root and reciprocal functions were added in the EC-132 model. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could calculate logarithms.

The emergence of analog computers in the pre-war years

Differential Analyzer, Cambridge, 1938

Before World War II, mechanical and electrical analog computers were considered the most advanced machines and were widely believed to be the future of computing. Analog computers exploited the fact that the mathematics of small-scale phenomena (wheel positions, or electrical voltage and current) is similar to the mathematics of other physical phenomena, such as ballistic trajectories, inertia, resonance, energy transfer, moment of inertia, and so on. They modeled these and other physical phenomena with values of electrical voltage and current.
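The analogy can be made concrete with a small numerical sketch (illustrative only; all constants are assumed for the example). A differential analyzer solved problems such as ballistics by chaining integrators, each accumulating one quantity from the rate supplied by another; the code below reproduces that chain of integrations digitally.

```python
# A minimal sketch: two chained "integrators" accumulate velocity and
# position for a projectile with drag, the kind of problem analog
# machines modeled with voltages and wheel positions.

DT = 0.001          # integration step, seconds (assumed)
G = 9.81            # gravity, m/s^2
K = 0.05            # drag coefficient per unit mass (assumed)

def trajectory(vx, vy, x=0.0, y=0.0):
    """Integrate dv/dt = acceleration, then dx/dt = v, until landing."""
    while y >= 0.0:
        ax, ay = -K * vx, -G - K * vy        # "computed" continuously
        vx, vy = vx + ax * DT, vy + ay * DT  # first integrator: velocity
        x, y = x + vx * DT, y + vy * DT      # second integrator: position
    return x

print(f"range: {trajectory(100.0, 100.0):.1f} m")
```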

The first electromechanical digital computers

Konrad Zuse's Z series

In 1936, while working in isolation in Nazi Germany, Konrad Zuse began work on his first Z-series computer, which had memory and (still limited) programmability. The Z1 model, completed in 1938, was built mainly on a mechanical basis but used binary logic; it never worked reliably because its components could not be manufactured with sufficient precision. Zuse's next machine, the Z3, was completed in 1941. It was built on telephone relays and worked quite satisfactorily, which makes the Z3 the first working program-controlled computer. In many ways the Z3 was similar to modern machines, pioneering a number of innovations such as floating-point arithmetic. Replacing the hard-to-implement decimal system with a binary one made Zuse's machines simpler and therefore more reliable; this is thought to be one of the reasons Zuse succeeded where Babbage failed. Programs for the Z3 were stored on perforated film. There were no conditional branches, but in the 1990s the Z3 was proven to be, in theory, a general-purpose computer (if physical memory size limitations are ignored). In two 1936 patent applications, Konrad Zuse mentioned that machine instructions could be stored in the same memory as data, anticipating what later became known as the von Neumann architecture, first implemented in 1949 by the British EDSAC.

British "Colossus"

The British Colossus was used to break German codes during World War II. Colossus was the first fully electronic computing device. It used a large number of vacuum tubes, and data was input from punched tape. Colossus could be configured to perform various Boolean logic operations, but it was not a Turing-complete machine. In addition to the Colossus Mk I, nine more Mk II machines were built. The existence of the machine was kept secret until the 1970s; Winston Churchill personally signed an order to destroy it into pieces no larger than a human hand. Because of this secrecy, Colossus is not mentioned in many works on the history of computers.

First generation of von Neumann architecture computers

Memory on ferrite cores; each core stores one bit.

The first working machine with von Neumann architecture was the Manchester "Baby" (Small-Scale Experimental Machine), created at the University of Manchester in 1948. It was followed in 1949 by the Manchester Mark I, already a complete system, with Williams tubes and a magnetic drum for memory, as well as index registers. Another contender for the title of "first digital stored-program computer" was EDSAC, designed and built at the University of Cambridge. Launched less than a year after the "Baby", it could already be used to solve real problems. EDSAC was in fact based on the architecture of EDVAC, the successor of ENIAC. Unlike ENIAC, which used parallel processing, EDVAC had a single processing unit. This solution was simpler and more reliable, so such a design was the first to be implemented after each successive wave of miniaturization. Many believe that the Manchester Mark I, EDSAC and EDVAC became the "Eves" from which almost all modern computers derive their architecture.

The first universal programmable computer in continental Europe was created by a team of scientists led by Sergei Alekseevich Lebedev at the Institute of Electrical Engineering in Kyiv, Ukrainian SSR. The MESM (Small Electronic Calculating Machine) went into operation in 1950. It contained about 6,000 vacuum tubes and consumed 15 kW. The machine could perform about 3,000 operations per second. Another machine of the time was the Australian CSIRAC, which ran its first test program in 1949.

In October 1947, the directors of Lyons & Company, a British firm that owned a chain of shops and restaurants, decided to take an active part in developing a commercial computer. The LEO I computer went live in 1951 and was the first computer in the world to be used regularly for routine office work.

The Manchester University machine became the prototype for the Ferranti Mark I. The first such machine was delivered to the university in February 1951, and at least nine others were sold between 1951 and 1957.

In June 1951, UNIVAC I was installed at the US Census Bureau. The machine was developed by Remington Rand, which eventually sold 46 of them at more than $1 million each. UNIVAC was the first mass-produced computer; all its predecessors had been built in single copies. The computer consisted of 5,200 vacuum tubes and consumed 125 kW of power. It used mercury delay lines storing 1,000 words of memory, each of 11 decimal digits plus sign (72-bit words). Unlike IBM machines equipped with punched card input, UNIVAC used 1930s-style metallized magnetic tape input, providing compatibility with some existing commercial data-storage systems. Other computers of the time used high-speed punched tape input and I/O on more modern magnetic tapes.

The first Soviet serial computer was the Strela, produced from 1953 at the Moscow Factory of Computing and Analytical Machines. The Strela belongs to the class of large universal computers (mainframes) with a three-address instruction system. The computer had a speed of 2,000-3,000 operations per second. Two magnetic tape drives with a capacity of 200,000 words served as external memory; RAM capacity was 2,048 cells of 43 bits each. The computer consisted of 6,200 vacuum tubes and 60,000 semiconductor diodes and consumed 150 kW of power.

In 1955, Maurice Wilkes invented microprogramming, a principle later used widely in the processors of a great variety of computers. Microprogramming makes it possible to define or extend the basic instruction set by means of built-in programs (called microcode or firmware).
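A minimal sketch of the idea follows, assuming a toy machine (none of the names correspond to any real instruction set): the visible instructions are defined as sequences of primitive micro-operations, so a new instruction such as SUB can be added purely as microcode, without new hardware.

```python
# Toy registers and the primitive micro-operations the "hardware" implements.
regs = {"A": 0, "B": 0}

def load(r, v): regs[r] = v
def add(r, s):  regs[r] += regs[s]
def neg(r):     regs[r] = -regs[r]

# Microprogram store: each instruction expands into micro-operations.
# SUB is *defined* in microcode out of the existing micro-ops.
microcode = {
    "LOAD": lambda r, v: [("load", r, v)],
    "ADD":  lambda r, s: [("add", r, s)],
    "SUB":  lambda r, s: [("neg", s), ("add", r, s), ("neg", s)],
}

def execute(program):
    ops = {"load": load, "add": add, "neg": neg}
    for name, *args in program:
        for micro in microcode[name](*args):   # expand, then run micro-ops
            ops[micro[0]](*micro[1:])

execute([("LOAD", "A", 10), ("LOAD", "B", 3), ("SUB", "A", "B")])
print(regs["A"])  # 7
```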

In 1956, IBM first sold a device for storing information on magnetic disks: RAMAC (Random Access Method of Accounting and Control). It used 50 metal disks 24 inches in diameter, with 100 tracks on each side. The device stored up to 5 MB of data and cost $10,000 per MB. (In 2006, similar storage devices, hard drives, cost about $0.001 per MB.)

1950s - early 1960s: second generation

The next major step in the history of computer technology was the invention of the transistor in 1947. Transistors replaced fragile, power-hungry vacuum tubes. Transistorized computers are usually called the "second generation", which dominated the 1950s and early 1960s. Thanks to transistors and printed circuit boards, a significant reduction in size and energy consumption was achieved, along with increased reliability. For example, the transistor-based IBM 1620, which replaced the tube-based IBM 650, was the size of an office desk. However, second-generation computers were still quite expensive and therefore were used only by universities, governments, and large corporations.

Second-generation computers typically consisted of a large number of printed circuit boards, each carrying one to four logic gates or flip-flops. In particular, the IBM Standard Modular System defined a standard for such boards and their connectors. In 1959, IBM released the transistor-based IBM 7090 mainframe and the IBM 1401 mid-range machine. The latter used punched card input and became the most popular general-purpose computer of its time: in 1960-1964, more than 10,000 units of this machine were produced. It used a 4,000-character memory (later increased to 16,000 characters). Many aspects of this design were driven by the desire to replace punched card machines, which were in wide use from the 1920s until the early 1970s. In 1960, IBM released the transistorized IBM 1620, initially a punched-tape-only machine but soon upgraded to punched cards. The model became popular as a scientific computer, with about 2,000 units produced. The machine used magnetic core memory with a capacity of up to 60,000 decimal digits.

Also in 1960, DEC released its first model, the PDP-1, intended for use by technical personnel in laboratories and for research.

In 1961, Burroughs Corporation released the B5000, the first dual-processor computer with virtual memory. Its other unique features were a stack-based architecture, descriptor-based addressing, and the absence of programming directly in assembly language.

The first Soviet serial semiconductor computers were the Vesna ("Spring") and Sneg ("Snow"), produced from 1964 to 1972. The peak performance of the Sneg computer was 300,000 operations per second. The machines were built on transistors with a clock frequency of 5 MHz. A total of 39 computers were produced.

The BESM-6, created in 1966, is considered the best domestic computer of the second generation. The BESM-6 architecture was the first to make wide use of overlapped instruction execution (up to 14 single-address machine instructions could be at different stages of execution). Interrupt mechanisms, memory protection and other innovative solutions made it possible to use the BESM-6 in multiprogram and time-sharing modes. The computer had 128 KB of RAM on ferrite cores and external memory on magnetic drums and tape. The BESM-6 operated at a clock frequency of 10 MHz with record performance for the time: about 1 million operations per second. A total of 355 computers were produced.

1960s onwards: third and subsequent generations

The rapid growth in the use of computers began with the so-called "third generation". It started with the invention of the integrated circuit, made independently by Nobel Prize winner Jack Kilby and by Robert Noyce, which later led to the invention of the microprocessor by Ted Hoff (Intel). During the 1960s there was some overlap between second- and third-generation technologies; as late as 1975, Sperry Univac was still producing second-generation machines such as the UNIVAC 494.

The advent of microprocessors led to the development of microcomputers, small, inexpensive computers that could be owned by small companies or individuals. Microcomputers, members of the fourth generation, first appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, one of the founders of Apple Computer, became known as the developer of the first mass-produced home computer, and later the first personal computer. Computers based on microcomputer architecture, with capabilities added from their larger cousins, now dominate most market segments.

1970-1990 - fourth generation of computers

It is generally believed that the period from 1970 to 1990 belongs to the fourth generation of computers. There is, however, another view: many believe that the achievements of this period are not great enough to count it as a full generation. Supporters of this view assign the decade to a "third-and-a-half" generation, and only from 1985, in their opinion, should we count the years of the fourth generation proper, which is still alive today.

One way or another, it is clear that since the mid-1970s there have been fewer and fewer fundamental innovations in computer science. Progress proceeds mainly by developing what has already been invented, above all by increasing the power and miniaturization of the element base and of the computers themselves.

And, most importantly, since the beginning of the 1980s, thanks to the advent of personal computers, computing technology has become truly widespread and accessible to the public. A paradoxical situation has arisen: even though personal and minicomputers still lag behind large machines in all respects, the lion's share of the innovations of the last decade (graphical user interfaces, new peripherals, global networks) owe their appearance and development to precisely this "frivolous" technology. Large computers and supercomputers are by no means extinct and continue to develop, but they no longer dominate the computer arena as they once did.

The element base of fourth-generation computers is large-scale integrated (LSI) circuits. The machines were intended to increase labor productivity dramatically in science, production, management, healthcare, services and everyday life. A high degree of integration increases the packaging density of electronic equipment and improves its reliability, which raises computer performance and lowers cost. All this significantly affects the logical structure (architecture) of the computer and its software. The connection between the structure of the machine and its software becomes closer, especially through the operating system (or monitor), the set of programs that organize the continuous operation of the machine without human intervention. This generation includes the ES computers ES-1015, -1025, -1035, -1045, -1055, -1065 ("Row 2"), -1036, -1046, -1066, the SM-1420, -1600, -1700, all personal computers ("Elektronika MS 0501", "Elektronika-85", "Iskra-226", ES-1840, -1841, -1842, etc.), as well as other types and modifications. The fourth generation also includes the Elbrus multiprocessor computing complex. The Elbrus-1KB had a speed of up to 5.5 million floating-point operations per second and a RAM capacity of up to 64 MB. The Elbrus-2 has a performance of up to 120 million operations per second, a RAM capacity of up to 144 MB (16 M words of 72 bits), and a maximum I/O channel throughput of 120 MB/s.

Example: IBM System/370 Model 168

Manufactured in 1972, this model was one of the most widely used. RAM capacity: 8.2 MB. Performance: 7.7 million operations per second.


1990 to the present day: fifth generation of computers

The transition to fifth-generation computers implied a transition to new architectures aimed at creating artificial intelligence.

It was believed that the fifth-generation computer architecture would contain two main blocks. One of them is the computer itself; the other is an "intelligent interface" through which communication with the user takes place. The task of the interface is to understand text written in natural language, or speech, and to translate the problem statement thus given into a working program.

Basic requirements for fifth-generation computers:

1) creation of a developed human-machine interface (speech and image recognition);

2) development of logic programming for creating knowledge bases and artificial intelligence systems;

3) creation of new technologies for the production of computer equipment;

4) creation of new computer architectures and computing systems.

The new technical capabilities of computing equipment were expected to expand the range of problems solved and make it possible to move on to creating artificial intelligence. One of the components needed for artificial intelligence is knowledge bases (databases) in various areas of science and technology. Creating and using databases requires high-speed computing systems and a large amount of memory. General-purpose computers can perform high-speed calculations but are poorly suited to high-speed comparison and sorting of large volumes of records, usually stored on magnetic disks. To create programs that fill, update and query databases, special object-oriented and logic programming languages were created that provide greater capabilities than conventional procedural languages. The structure of these languages requires a transition from the traditional von Neumann architecture to architectures that take into account the requirements of building artificial intelligence.

Example: IBM eServer z990

Manufactured in 2003. Physical parameters: weight 2,000 kg; power consumption 21 kW; footprint 2.5 sq. m; height 1.94 m; RAM capacity 256 GB; performance 9 billion instructions per second.

The beginning

A calculator and a computer are far from the only devices with which you can carry out calculations. Humanity began to think quite early on how to make the processes of division, multiplication, subtraction and addition easier for itself. One of the first such devices can be considered balance scales, which appeared in the fifth millennium BC. However, let's not dive so far into the depths of history.

Andy Grove, Robert Noyce and Gordon Moore. (wikipedia.org)

The abacus, known to us in the form of the counting frame, appeared around 500 BC. Ancient Greece, India, China and the Inca state can all compete for the right to be considered its homeland. Archaeologists suspect that computing mechanisms existed even in ancient cities, although their existence has not yet been proven. The Antikythera mechanism, which we already mentioned in the previous article, may well be considered such a computational mechanism.

With the advent of the Middle Ages, the skills needed to create such devices were lost; those dark times were generally a period of sharp decline in science. But in the 17th century humanity again began to think about computing machines, and they were not slow to appear.

The first computers

Creating a device that could perform calculations was the dream of the German astronomer and mathematician Wilhelm Schickard. He had many different projects, most of which failed, but Schickard was not discouraged by failure and eventually achieved success. In 1623 the mathematician designed the "counting clock", an exceedingly complex and cumbersome mechanism that could nevertheless perform simple calculations.

"Schickard's counting clock." Drawing. (wikipedia.org)

The "counting clock" was of considerable size and weight, and it was difficult to use in practice. Schickard's friend, the famous astronomer Johannes Kepler, jokingly remarked that it was much easier to do the calculations in one's head than to use the clock. Nevertheless, Kepler became the first user of Schickard's clock, and it is known that he carried out many of his calculations with its help.

Johannes Kepler. (wikipedia.org)

The device got its name because it was based on the same mechanism that worked in wall clocks, and Schickard himself can be considered the "father" of the calculator. Twenty years later the family of calculating machines was extended by an invention of the French mathematician, physicist and philosopher Blaise Pascal, who presented the Pascaline in 1643.

Pascal's summing machine. (wikipedia.org)

Pascal was then 20 years old, and he made the device for his father, a tax collector who had to deal with very complex calculations. The adding machine was driven by gears. To enter the required number into it, you had to turn the wheels a certain number of times.

Thirty years later, in 1673, the German mathematician Gottfried Leibniz created his own design. His device was the first in history to be called a calculator. The principle of operation was the same as in Pascal's machine.

Gottfried Leibniz. (wikipedia.org)

There is one very interesting story connected with Leibniz's calculator. At the beginning of the 18th century, the machine was seen by Peter I, who was visiting Europe as part of the Grand Embassy. The future emperor was very interested in the device and even bought it. Legend has it that Peter later sent the calculator to the Kangxi Emperor of China as a gift.

From calculator to computer

The work of Pascal and Leibniz was carried further. In the 18th century many scientists attempted to improve calculating machines, the main goal being a commercially successful device. Success ultimately came to the Frenchman Charles Xavier Thomas de Colmar.

Charles Xavier Thomas de Colmar. (wikipedia.org)

In 1820 he launched mass production of calculating instruments. Strictly speaking, Colmar was more a skilled industrialist than an inventor: his "Thomas machine" did not differ much from Leibniz's calculator, and Colmar was even accused of stealing another man's invention and trying to make a fortune from another man's labor.

In Russia, serial production of calculators began in 1890. The calculator acquired its modern form in the twentieth century. In the 1960s and 1970s the industry experienced a real boom, and the devices improved every year. In 1965, for example, a calculator appeared that could compute logarithms, and in 1970 a calculator that fit in the hand was released for the first time. But by then the computer age had already begun, although humanity had not yet had time to feel it.

Computers

Many consider the French weaver Joseph Marie Jacquard to be the person who laid the foundations for the development of computer technology; it is hard to say whether this is a joke or not. In any case, it was Jacquard who invented the punched card. People then did not yet know what a memory card was, but Jacquard's invention may well claim that title. The weaver devised it to control a loom: a punched card encoded the pattern for the fabric, so that once the card was loaded, the pattern was applied without human intervention, automatically.

Punch card. (wikipedia.org)

Naturally, Jacquard's punched card was not an electronic device; the appearance of such things was still very far off, for Jacquard lived at the turn of the 18th and 19th centuries. Punched cards, however, later became widely used in other areas, going far beyond the famous loom.

In 1835, Charles Babbage described an Analytical Engine that would be based on punched cards. The key principle of operation of such a device was programming; thus the English mathematician foresaw the appearance of the computer. Alas, Babbage himself was never able to build the machine he invented. The world's first analog computer was born in 1927, created by Massachusetts Institute of Technology professor Vannevar Bush.

History of the development of computer technology




1. Stages of development of computer technology

Until the 17th century, the activity of society as a whole, and of each person individually, was aimed at mastering matter: learning the properties of matter and making first primitive and then ever more complex tools, up to mechanisms and machines that make it possible to produce consumer values.

Then, in the process of the formation of industrial society, the problem of mastering energy came to the fore: first thermal, then electrical, and finally atomic. Mastery of energy made it possible to master the mass production of consumer values and, as a result, to raise people's living standards and change the nature of their work.

At the same time, humanity has a characteristic need to express and remember information about the world around us - this is how writing, printing, painting, photography, radio, and television appeared. In the history of the development of civilization, several information revolutions can be distinguished - the transformation of social relations due to fundamental changes in the field of information processing, information technologies. The consequence of such transformations was the acquisition of a new quality by human society.

At the end of the 20th century humanity entered a new stage of development: the building of an information society. Information became the most important factor of economic growth, and the level of development of information activity, together with the degree of its involvement in and influence on the global information infrastructure, became the most important condition for a country's competitiveness in the world economy. The understanding that this society was inevitable came much earlier: back in the 1940s the Australian economist Colin Clark spoke of the approach of a society of information and services, a society of new technological and economic opportunities, and at the end of the 1950s the American economist Fritz Machlup put forward the idea of an information economy and of information becoming the most important product. At the end of the 1960s Daniel Bell noted the transformation of industrial society into an information society. As for the countries that were formerly part of the USSR, the processes of informatization there developed at a slow pace.

Computer science is changing the entire system of social production and interaction of cultures. With the advent of the information society, a new stage begins not only in the scientific and technical, but also in the social revolution. The entire information communications system is changing. The destruction of old information connections between economic sectors, areas of scientific activity, regions, and countries intensified the economic crisis of the end of the century in countries that paid insufficient attention to the development of informatization. The most important task of society is to restore communication channels in the new economic and technological conditions to ensure clear interaction of all areas of economic, scientific and social development of both individual countries and on a global scale.

Computers in modern society have taken over a significant part of the work related to information. By historical standards, computer information processing technologies are still very young and are at the very beginning of their development. Computer technologies today are transforming or replacing old information processing technologies.


2. “Time - events - people”

Let us consider the history of the development of computing devices and methods "in persons" and objects (Table 1).

Table 1. Main events in the history of the development of computational methods, instruments, automata and machines

John Napier

The Scotsman John Napier published "A Description of the Admirable Table of Logarithms" in 1614. He discovered that the sum of the logarithms of two numbers a and b equals the logarithm of their product, so the operation of multiplication reduces to simple addition. He also developed a tool for multiplying numbers, "Napier's bones": a set of segmented rods that could be arranged so that, by adding the numbers in horizontally adjacent segments, one obtained the result of their multiplication. Napier's bones were soon superseded by other computing devices, mostly mechanical. Napier's tables, whose calculation required enormous time, were later "built into" a convenient device that speeds up calculation, the slide rule (R. Bissaker, late 1620s).
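Napier's principle can be shown in one line of modern arithmetic, as an illustration: since log(ab) = log(a) + log(b), a multiplication costs one addition plus table lookups, which is exactly what logarithm tables and, later, the slide rule mechanized.

```python
import math

# Multiply 37 by 52 "the Napier way": add the logarithms, then look up
# the antilogarithm (here math.exp plays the role of the table).
a, b = 37.0, 52.0
product = math.exp(math.log(a) + math.log(b))
print(product)   # ~1924.0
print(a * b)     # 1924.0, for comparison
```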

Wilhelm Schickard

It was long believed that the first mechanical calculating machine was invented by the great French mathematician and physicist B. Pascal in 1642. However, in 1957 F. Hammer (Germany, director of the Kepler science center) discovered evidence that a mechanical calculating machine had been created roughly two decades before Pascal's invention, by Wilhelm Schickard. Schickard called it the "counting clock". The machine was intended to perform four arithmetic operations and consisted of an adding device, a multiplying device, and a mechanism for recording intermediate results. The adding device was built from gears and was the simplest form of adding machine. The mechanical counting scheme it embodied is considered classical; yet this simple and effective scheme had to be reinvented, since information about Schickard's machine never became public knowledge.

Blaise Pascal

In 1642, when Pascal was 19 years old, the first working model of an adding machine was made. A few years later Blaise Pascal completed his mechanical adding machine (the "Pascaline"), which could add numbers in the decimal system. In this machine the digits of a six-digit number were set by turning disks (wheels) with digital divisions, and the result of an operation could be read in six windows, one for each digit. The units disk was linked to the tens disk, the tens disk to the hundreds disk, and so on. Other operations were performed by a rather inconvenient procedure of repeated additions, which was the Pascaline's main drawback. Over about a decade he built more than 50 versions of the machine. Pascal's principle of linked wheels was the basis on which most computing devices were built for the next three centuries.
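A toy model of the linked-wheel principle (a sketch, not a description of the actual mechanism): each wheel holds one decimal digit, and a wheel passing 9 advances its left neighbour by one, which is the carry at the heart of the Pascaline.

```python
def turn(wheels, position, clicks):
    """Advance the wheel at `position` (0 = units) by `clicks` steps."""
    for _ in range(clicks):
        wheels[position] += 1
        pos = position
        while wheels[pos] == 10:     # wheel passes 9: carry to the left
            wheels[pos] = 0
            pos += 1                 # (overflow past the last wheel is ignored here)
            wheels[pos] += 1
    return wheels

wheels = [0] * 6                     # a six-digit Pascaline, all wheels at 0
turn(wheels, 0, 7)                   # add 7 units
turn(wheels, 1, 9)                   # add 9 tens
turn(wheels, 1, 4)                   # add 4 more tens: carry into hundreds
print(list(reversed(wheels)))        # [0, 0, 0, 1, 3, 7] -> reads 137
```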

Gottfried Wilhelm Leibniz

In 1672, while in Paris, Leibniz met the Dutch mathematician and astronomer Christiaan Huygens. Seeing how many calculations an astronomer had to do, Leibniz decided to invent a mechanical device for calculation, and in 1673 he completed a mechanical calculator. Developing Pascal's ideas, Leibniz used a shift operation for digit-by-digit multiplication. Addition was carried out essentially as on the Pascaline, but Leibniz added a moving element (a prototype of the movable carriage of later desktop calculators) and a handle that turned a stepped wheel or, in later versions of the machine, cylinders located inside the device.
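The role of the shift can be illustrated with a short sketch (decimal shift-and-add multiplication; this is illustrative and not a model of the actual mechanism): each digit of the multiplier triggers repeated additions of the multiplicand, shifted one place per digit, which is what the movable carriage mechanized.

```python
def long_multiply(a, b):
    """Multiply using only addition and decimal shifts."""
    total = 0
    shift = 0
    while b:
        digit = b % 10
        for _ in range(digit):         # repeated addition, as on the machine
            total += a * 10 ** shift   # the shift plays the carriage's role
        b //= 10
        shift += 1                     # move the "carriage" one place left
    return total

print(long_multiply(127, 46))  # 5842
```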

Joseph-Marie Jacquard

The development of computing devices is connected with the appearance of punched cards and their use. Punched cards first appeared in weaving production: in 1804 the engineer Joseph-Marie Jacquard built a fully automated loom (the Jacquard loom) capable of reproducing complex patterns. The operation of the loom was programmed with a deck of punched cards, each of which controlled one stroke of the shuttle; switching to a new pattern meant replacing the deck of punched cards.
Charles Babbage (1791-1871)

He discovered errors in Napier's tables of logarithms, which were widely used in calculations by astronomers, mathematicians and navigators, and in 1821 he began to develop his own computing machine, which would help perform more accurate calculations. In 1822 a difference engine (a trial model) was built, capable of calculating and printing large mathematical tables. It was a very complex, large device intended for the automatic calculation of logarithms. The model was based on the principle known in mathematics as the method of finite differences: in calculating polynomials only the operation of addition is used, with no multiplication or division, which are far harder to automate. Later Babbage conceived the idea of a more powerful analytical engine. It was not only to solve mathematical problems of a particular type, but to perform various computational operations according to instructions given by the operator. By design this is nothing less than the first universal programmable computer. The analytical engine was to have components such as a "mill" (an arithmetic unit, in modern terminology) and a "store" (memory). Instructions (commands) were to be entered into the analytical engine with punched cards (adopting Jacquard's idea of program control by punched cards). The Swedish publisher, inventor and translator Per Georg Scheutz, using Babbage's advice, built a modified version of this machine; in 1855 Scheutz's machine was awarded a gold medal at the World Exhibition in Paris. Later, one of the principles underlying the analytical engine, the use of punched cards, was embodied in the statistical tabulator built by the American Herman Hollerith (to speed up the processing of the results of the 1890 US census).
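The method of finite differences is easy to demonstrate: for a polynomial of degree n the n-th differences are constant, so after the first few values every further table entry is produced by additions alone. A minimal sketch (using x² + x + 41, a polynomial Babbage reportedly used in demonstrations):

```python
def difference_table(poly, start, count):
    """Tabulate poly(x) for x = start, start+1, ... using only additions."""
    degree = len(poly) - 1
    p = lambda x: sum(c * x**i for i, c in enumerate(poly))
    # Initial column of differences, computed once from the first values.
    values = [p(start + i) for i in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    # Now "crank the engine": every further value comes from additions only.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Coefficients of 41 + x + x^2, lowest power first.
print(difference_table([41, 1, 1], 0, 6))  # [41, 43, 47, 53, 61, 71]
```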

Augusta Ada Byron

(Countess Lovelace)

Countess Augusta Ada Lovelace, daughter of the poet Byron, worked with Charles Babbage on programs for his calculating machines. Her work in this area was published in 1843. At that time, however, it was considered indecent for a woman to publish her writings under her full name, so Lovelace put only her initials on the title page. The materials of Babbage and the comments of Lovelace outline such concepts as "subroutine" and "subroutine library", "instruction modification" and "index register", which came into use only in the 1950s. The term "library" itself was introduced by Babbage, and the terms "working cell" and "cycle" were proposed by A. Lovelace. "It can be said with good reason that the Analytical Engine weaves algebraic patterns in the same way as the Jacquard loom reproduces flowers and leaves," wrote Countess Lovelace. She was in fact the first programmer (the Ada programming language is named after her).

George Boole

J. Boole is rightfully considered the father of mathematical logic; the branch of mathematical logic called Boolean algebra is named after him. In 1847 he wrote the article "The Mathematical Analysis of Logic", and in 1854 he developed his ideas in a work entitled "An Investigation of the Laws of Thought". These works brought revolutionary changes to logic as a science. Boole invented a kind of algebra, a system of notation and rules applicable to all kinds of objects, from numbers and letters to sentences. Using this system he could encode statements (propositions) in his language and then manipulate them the way ordinary numbers are manipulated in mathematics. The three basic operations of the system are AND, OR and NOT.
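A small illustration of the calculus Boole proposed: statements are encoded as 0 or 1, the three basic operations act on them, and logical laws can then be verified by pure calculation, here one of De Morgan's laws over the full truth table.

```python
# Boolean operations on truth values encoded as 0 and 1.
AND = lambda p, q: p & q
OR  = lambda p, q: p | q
NOT = lambda p: 1 - p

# Verify NOT(p AND q) == (NOT p) OR (NOT q) for every combination.
for p in (0, 1):
    for q in (0, 1):
        assert NOT(AND(p, q)) == OR(NOT(p), NOT(q))
print("De Morgan's law holds for all truth values")
```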

Pafnutiy Lvovich Chebyshev

He developed the theory of machines and mechanisms and wrote a number of works on the synthesis of hinge mechanisms. Among the numerous mechanisms he invented are several models of adding machines, the first designed no later than 1876. Chebyshev's adding machine was one of the most original calculating machines of its time. In his designs Chebyshev proposed the principle of continuous transmission of tens and automatic movement of the carriage from digit to digit during multiplication. Both inventions came into widespread use in the 1930s, with the adoption of electric drives and the spread of semi-automatic and automatic keyboard calculators. With these and other inventions it became possible to increase the speed of mechanical counting devices considerably.
Alexey Nikolaevich Krylov (1863-1945)

Russian shipbuilder, mechanic, mathematician, academician of the USSR Academy of Sciences. In 1904 he proposed the design of a machine for integrating ordinary differential equations. In 1912 such a machine was built; it was the first continuous integrating machine, capable of solving differential equations up to the fourth order.

Willgodt Theophil Odhner

A native of Sweden, Willgodt Theophil Odhner came to St. Petersburg in 1869. For some time he worked at the Russian Diesel plant on the Vyborg side, where the first sample of his adding machine was manufactured in 1874. The first serial adding machines, built on Leibniz's stepped drums, were large, primarily because each digit required a separate drum. Instead of stepped drums, Odhner used more advanced and compact gears with a variable number of teeth, the Odhner wheels. In 1890 Odhner received a patent for the production of adding machines, and in the same year 500 of them were sold, a very large number for those times. Adding machines in Russia were called "Odhner arithmometer", "Original-Odhner", "Odhner system arithmometer", and so on. In Russia, up to 1917, approximately 23 thousand Odhner adding machines were produced. After the revolution their production was set up at the Sushchevsky mechanical plant named after F. E. Dzerzhinsky in Moscow; from 1931 they were called "Felix" adding machines. Later, models of Odhner arithmometers with key input and electric drive were created in this country.
Herman Hollerith (1860-1929)

After graduating from Columbia University he went to work at the census bureau in Washington. At that time the United States was beginning the extremely labor-intensive manual processing (it took seven and a half years) of the data collected in the 1880 census. By 1890 Hollerith had completed the development of a tabulation system based on punched cards. Each card had 12 rows, in each of which 20 holes could be punched; these corresponded to data such as age, sex, place of birth, number of children, family status and other information in the census questionnaire. The contents of the completed forms were transferred to the cards by punching. The punched cards were loaded into special devices connected to a tabulating machine, where they were pressed against rows of thin needles, one needle for each of the 240 punch positions on the card. Where a needle passed through a hole, it closed a contact in the corresponding electrical circuit of the machine. The full statistical analysis of the results took two and a half years, three times faster than the previous census. Hollerith subsequently founded the Computing-Tabulating-Recording Company (CTR). The company's young traveling salesman, Tom Watson, was the first to see the potential profitability of selling punched-card calculating machines to American businessmen. He later took over the company and in 1924 renamed it the International Business Machines (IBM) Corporation.
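A sketch of the tabulation idea (illustrative; the meaning assigned to the punch position is invented for the example): a card is a 12 x 20 grid of punch positions, and counting a stack of cards at one position is exactly what the machine's needles and electric counters did.

```python
ROWS, COLS = 12, 20          # grid of punch positions, as described above

def blank_card():
    return [[False] * COLS for _ in range(ROWS)]

def tabulate(cards, row, col):
    """Count the cards punched at one position (one needle, one counter)."""
    return sum(1 for card in cards if card[row][col])

# Three census cards; suppose position (0, 0) is taken to mean "male".
cards = [blank_card() for _ in range(3)]
cards[0][0][0] = True
cards[2][0][0] = True
print(tabulate(cards, 0, 0))  # 2
```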

Vannevar Bush

In 1930 he built a mechanical computing device, the differential analyzer. It was a machine that could solve complex differential equations, but it had many serious shortcomings, above all its gigantic size. Bush's mechanical analyzer was a complex system of shafts, gears and wires connected into a series of large units that filled an entire room. To set the machine a task, the operator had to select many gear ratios manually, which usually took two or three days. Later, V. Bush proposed a prototype of modern hypertext, the MEMEX (MEMory EXtension) project: an automated office in which a person would store his books, records and any information he receives in such a way that he could use it at any time with maximum speed and convenience. In essence it was to be a complex device equipped with a keyboard and translucent screens onto which texts and images stored on microfilm would be projected. MEMEX would establish logical and associative connections between any two blocks of information. Ideally this amounts to a huge library, a universal information base.

John Vincent Atanasoff

Professor of physics, author of the first design for a digital computer based on the binary rather than the decimal number system. The simplicity of the binary system, combined with the simplicity of physically representing two symbols (0, 1) instead of ten (0, 1, ..., 9) in the electrical circuits of a computer, outweighed the inconvenience of converting between binary and decimal. In addition, using the binary system promised to reduce the size and the cost of the machine. In 1939 Atanasoff built a model of the device and began seeking financial support to continue the work. Atanasoff's machine was almost ready in December 1941, but it was disassembled, and with the outbreak of World War II all work on the project ceased. Only in 1973 was Atanasoff's priority as the author of the first design of such a computer architecture confirmed by a decision of a US federal court.
Howard Aiken

In 1937 G. Aiken proposed a design for a large calculating machine and looked for people willing to finance the idea. The sponsor was Thomas Watson, president of IBM: the corporation's contribution to the project came to about 500 thousand US dollars. Design of the new machine, the Mark-1, based on electromechanical relays, began in 1939 in the laboratories of IBM's New York branch and continued until 1944. The finished computer contained about 750 thousand parts and weighed 35 tons. The machine operated on decimal numbers of up to 23 digits and multiplied two numbers of maximum length in about 4 seconds. Since the creation of the Mark-1 took rather a long time, the palm went not to it but to Konrad Zuse's relay binary computer Z3, built in 1941. It is worth noting that the Z3 was considerably smaller than Aiken's machine and also cheaper to manufacture.

Konrad Zuse

In 1934, while a student at a technical university in Berlin, and without the slightest knowledge of Charles Babbage's work, K. Zuse began to develop a universal computer much like Babbage's Analytical Engine. In 1938 he completed construction of a machine occupying about 4 square meters, called the Z1 (in German his surname is spelled Zuse). It was a fully electromechanical programmable digital machine with a keyboard for entering the conditions of a problem; the results of calculations were displayed on a panel of many small lights. A restored version is kept in the Museum für Verkehr und Technik in Berlin, and it is the Z1 that in Germany is called the world's first computer. Zuse later began encoding instructions for the machine by punching holes in used 35 mm photographic film; the machine that worked with this perforated tape was called the Z2. In 1941 Zuse built the Z3, a program-controlled machine based on the binary number system, superior in many of its characteristics to machines built independently and in parallel in other countries. In 1942 Zuse and the electrical engineer Helmut Schreyer proposed creating a computer of a fundamentally new type, based on vacuum tubes. Such a machine would work about a thousand times faster than any machine then available in Germany. Speaking of the potential applications of a fast computer, Zuse and Schreyer noted that it could be used to decrypt enciphered messages (such work was already under way in various countries).

Alan Turing

An English mathematician who gave a mathematical definition of the algorithm through a construction now called a Turing machine. During World War II the Germans used the Enigma machine to encrypt messages; without the key and the switching scheme (the Germans changed them three times a day), a message could not be deciphered. To uncover the secret, British intelligence assembled a group of brilliant and somewhat eccentric scientists, among them the mathematician Alan Turing. At the end of 1943 the group built a powerful machine (using about 2,000 electronic vacuum tubes instead of electromechanical relays) called Colossus. Intercepted messages were encoded, put on punched tape and fed into the machine's memory. The tape was read by a photoelectric reader at 5,000 characters per second, and the machine had five such readers. In searching for a match (decryption), the machine compared the encrypted message with already known Enigma codes. The group's work remains classified to this day. Turing's role in it can be judged from a remark by another member of the group, the mathematician I. J. Good: "I do not want to say that we won the war thanks to Turing, but I take the liberty of saying that without him we might have lost it." Colossus was a vacuum-tube machine (a major step forward in the development of computing technology) and a specialized one (deciphering secret codes).
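A minimal Turing machine fits in a few lines; the following is a sketch of the construction Turing used to define the notion of algorithm: a tape, a head, a state, and a transition table. The toy rule table below simply inverts a string of bits.

```python
def run(tape, rules, state="start"):
    """Run a Turing machine: rules map (state, symbol) -> (write, move, state)."""
    tape = dict(enumerate(tape))         # sparse tape; blank cell = " "
    head = 0
    while state != "halt":
        symbol = tape.get(head, " ")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move                     # +1 = right, -1 = left
    return "".join(tape[i] for i in sorted(tape)).strip()

rules = {
    ("start", "0"): ("1", +1, "start"),  # flip 0 -> 1, move right
    ("start", "1"): ("0", +1, "start"),  # flip 1 -> 0, move right
    ("start", " "): (" ", 0, "halt"),    # end of input: stop
}
print(run("10110", rules))  # 01001
```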

John Mauchly

Presper Eckert

(born 1919)

The first computer is considered to be ENIAC (Electronic Numerical Integrator and Computer). Its authors, the American scientists J. Mauchly and J. Presper Eckert, worked on it from 1943 to 1945. It was intended for calculating the flight trajectories of projectiles and was the most complex engineering structure of the mid-20th century: more than 30 m long, 85 cubic meters in volume, and weighing 30 tons. ENIAC used 18,000 vacuum tubes and 1,500 relays and consumed about 150 kW. Next came the idea of a machine whose program is stored in the machine's memory, which would change the principles of organizing calculations and pave the way for modern programming languages: EDVAC (Electronic Discrete Variable Automatic Computer). This machine was completed in 1950. Its more capacious internal memory held both data and program; programs were recorded electronically in special devices, delay lines. Most importantly, in EDVAC the data was encoded not in the decimal system but in binary, which reduced the number of vacuum tubes required. After founding their own company, J. Mauchly and P. Eckert set out to create a universal computer for wide commercial use, UNIVAC (Universal Automatic Computer). About a year before the first UNIVAC entered operation at the US Census Bureau, the partners found themselves in dire financial straits and were forced to sell their company to Remington Rand. Even so, UNIVAC was not the first commercial computer: that was the British LEO (Lyons Electronic Office), used in England to calculate wages for employees of tea shops (the Lyons company). In 1973 a US federal court invalidated their patent on the electronic digital computer, ruling that their ideas had been borrowed from J. Atanasoff.
John von Neumann (1903-1957)

Working in the group of J. Mauchly and P. Eckert, von Neumann prepared the "First Draft of a Report on the EDVAC", in which he summarized the plans for work on the machine. This was the first work on digital electronic computers that became known to certain circles of the scientific community (for reasons of secrecy, work in this field was not published). From that moment on, the computer was recognized as an object of scientific interest. In his report, von Neumann identified and described in detail five key components of what is now called the "von Neumann architecture" of the modern computer.

In our country, independently of von Neumann, more detailed and complete principles for constructing electronic digital computers were formulated by Sergei Alekseevich Lebedev.

Sergey Alekseevich Lebedev

In 1946, S. A. Lebedev became director of the Institute of Electrical Engineering and organized his own laboratory of modeling and regulation within it. In 1948 he oriented the laboratory toward the creation of MESM (Small Electronic Calculating Machine). MESM was initially conceived as a model (the first letter of the abbreviation) of the Large Electronic Calculating Machine (BESM), but in the course of its creation the expediency of turning it into a small working computer became obvious. Because of the secrecy of work in the field of computer technology, there were no publications about it in the open press.

The principles of computer construction, developed by S. A. Lebedev independently of J. von Neumann, are as follows (a toy model illustrating them in code follows the list):

1) the computer must include arithmetic, memory, information input/output, and control devices;

2) the calculation program is encoded and stored in memory like numbers;

3) to encode numbers and commands, the binary number system should be used;

4) calculations should be carried out automatically based on the program stored in memory and operations on commands;

5) in addition to arithmetic operations, logical ones are also introduced - comparisons, conditional and unconditional transitions, conjunction, disjunction, negation;

6) memory is built according to a hierarchical principle;

7) numerical methods for solving problems are used for calculations.
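A toy stored-program machine illustrating these principles (a sketch only, in no way a model of MESM itself): instructions and data are encoded as numbers in one memory, execution proceeds automatically from the stored program, and a conditional jump stands in for the logical operations of point 5.

```python
# Instruction format: (opcode, operand); memory holds both code and data.
LOAD, ADD, STORE, JUMP_IF_ZERO, HALT = range(5)

def run(memory, acc=0, pc=0):
    while True:
        op, arg = memory[pc]            # fetch the instruction from memory
        pc += 1
        if op == LOAD:    acc = memory[arg]
        elif op == ADD:   acc += memory[arg]
        elif op == STORE: memory[arg] = acc
        elif op == JUMP_IF_ZERO and acc == 0:
            pc = arg                    # conditional transfer of control
        elif op == HALT:  return memory

# Program: memory[8] = memory[8] + memory[9]; cells 8 and 9 hold data.
memory = {0: (LOAD, 8), 1: (ADD, 9), 2: (STORE, 8), 3: (HALT, 0),
          8: 30, 9: 12}
print(run(memory)[8])  # 42
```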

On December 25, 1951, MESM was put into operation. It was the first high-speed electronic digital machine in the USSR.

In 1948, the Institute of Precision Mechanics and Computer Technology (ITM and VT) of the USSR Academy of Sciences was created, to which the government entrusted the development of new computer technology and S. A. Lebedev was invited to head Laboratory No. 1 (1951). When BESM was ready (1953), it was in no way inferior to the latest American models.

From 1953 until the end of his life, S. A. Lebedev was the director of ITM and VT of the USSR Academy of Sciences, was elected a full member of the USSR Academy of Sciences and led the work on the creation of several generations of computers.

In the early 1950s, the first computer of the series of large electronic calculating machines (BESM), the BESM-1, was created. Original scientific and design solutions were used in creating the BESM-1; thanks to them it was then the most productive machine in Europe (8-10 thousand operations per second) and one of the best in the world. Under the leadership of S. A. Lebedev two more tube computers were created and put into production, the BESM-2 and the M-20. In the 1960s, semiconductor versions of the M-20 were created (the M-220 and M-222), as well as the BESM-3M and BESM-4.

When designing BESM-6, the method of preliminary simulation modeling was used for the first time (commissioning was carried out in 1967).

S. A. Lebedev was one of the first to understand the enormous importance of the collaboration of mathematicians and engineers in the creation of computer systems. On the initiative of S. A. Lebedev, all BESM-6 circuits were written using Boolean algebra formulas. This has opened up wide opportunities for automation of design and preparation of installation and production documentation

IBM

It is impossible to pass over a key stage in the development of computing tools and methods associated with the activities of IBM. The first computers of classical structure and composition, the System/360 family (later known simply as the IBM/360), were released in 1964 and, with subsequent modifications (IBM/370, IBM/375), were supplied until the mid-1980s, when, under pressure from microcomputers (PCs), they gradually began to leave the scene. Computers of this series served as the basis for the development, in the USSR and the CMEA member countries, of the so-called Unified System of computers (ES EVM), which for several decades formed the basis of domestic computerization.
ES-1045

The machines included the following components:

Central processor (32-bit) with two-address command system;

Main (RAM) memory (from 128 KB to 2 MB);

Magnetic disk drives with removable disk packs (for example, IBM 2311 - 7.25 MB, IBM 2314 - 29 MB, IBM 3330 - 100 MB); similar (sometimes compatible) devices are known for the other series mentioned above;

Reel-type magnetic tape drives, with tape 0.5 inch wide and 2,400 feet (720 m) long or less (usually 360 or 180 m), and recording density from 256 bytes per inch (typical) up to 2-8 times higher. The working capacity of a drive was determined by the reel size and the recording density, and reached 160 MB per reel;

Printing devices: drum-type line printers with a fixed character set (usually 64 or 128 characters), including uppercase Latin and Cyrillic (or uppercase and lowercase Latin) and a standard set of service characters; output was printed on paper 42 or 21 cm wide at up to 20 lines per second;

Terminal devices (video terminals, and initially electric typewriters) for interactive work with the user (IBM 3270, DEC VT-100, etc.), connected to the system to control the computing process (operator consoles, 1-2 per computer) and for interactive debugging of programs and data processing (user terminals, 4 to 64 per computer).

The standard sets of computer devices of the 1960s-1980s listed above, and their characteristics, are given here as historical information; the reader can evaluate them independently by comparing them with modern, well-known equipment.

IBM proposed the first functionally complete OS, OS/360, as a shell for the IBM/360. The development and implementation of the OS made it possible to separate the functions of operators, administrators, programmers and users, and to increase significantly (tens or hundreds of times) the productivity of computers and the degree of hardware loading. Versions of OS/360/370/375 - MFT (multiprogramming with a fixed number of tasks), MVT (with a variable number of tasks), SVS (virtual storage system) and SVM (virtual machine system) - succeeded one another and largely shaped modern ideas about the role of the OS.

Bill Gates and Paul Allen

In 1974, Intel developed the first universal 8-bit microprocessor, the 8080, with 4500 transistors. Edward Roberts, a young US Air Force officer and electronics engineer, built the Altair microcomputer based on the 8080 processor, which was a huge commercial success, sold by mail and widely used for home use. In 1975, young programmer Paul Allen and Harvard University student Bill Gates implemented the BASIC language for Altair. They subsequently founded Microsoft.
Steven Paul Jobs and Steven Wozniak

In 1976, students Steve Wozniak and Steve Jobs, having set up a workshop in their garage, built the Apple-1 computer, laying the foundation of the Apple corporation. In 1983, Apple Computer built the Lisa personal computer, the first office computer controlled by a mouse.

In 2001, Steve Wozniak founded Wheels of Zeus to create wireless GPS technology.

2001 - Steve Jobs introduced the first iPod.

2006 - Apple introduced the first laptop based on Intel processors.

2008 - Apple introduced the world's thinnest laptop, called MacBook Air.

3. Classes of computers

Computers can be classified by application and method of use (as well as by size and processing power), by the physical representation of the processed information, and by generation.

Physical representation of processed information

By this criterion, analog (continuous-action), digital (discrete-action), and hybrid machines are distinguished (hybrid machines use different methods of physically representing data at different stages of processing).

AVMs, analog or continuous-action computers, work with information presented in continuous (analog) form, i.e., as a continuous series of values of some physical quantity (most often electrical voltage).

TsVMs, digital or discrete-action computers, work with information presented in discrete, or more precisely digital, form. Because the digital form of representing information is universal, the digital computer is the more universal means of data processing.

GVMs, hybrid or combined-action computers, work with information presented in both digital and analog form, combining the advantages of AVMs and TsVMs. GVMs are appropriate for solving problems of controlling complex high-speed technical complexes.
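The distinction is easy to see in miniature. The following sketch is an illustration invented for this text (it models no particular machine): a sine-wave voltage stands in for the continuous "analog" quantity, and sampling plus rounding shows what the same signal looks like to a digital machine.

import math

def voltage(t: float) -> float:
    # A continuous physical quantity: a 1 Hz sine wave, in volts.
    return math.sin(2 * math.pi * t)

# A digital machine replaces the continuous curve with discrete samples,
# each quantized to finite precision (here, 3 decimal places).
samples = [round(voltage(n / 8), 3) for n in range(8)]
print(samples)   # a discrete series of numbers instead of a continuous curve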

Computer generations

The idea of dividing machines into generations arose because, over its short history, computer technology has undergone a great evolution, both in its element base (tubes, transistors, microcircuits, etc.) and in its structure, acquiring new capabilities and expanding its areas and modes of use (Table 2).


Table 2

Stages of development of computer information technologies

Parameter: 1950s / 1960s / 1970s / 1980s / present day
Purpose of computer use: scientific and technical calculations / technical and economic calculations / management, provision of information / communications, information services
Computer operating mode: single-program / batch processing / time sharing / personal work / network processing
Data integration: low / average / high / very high
User location: machine room / separate room / terminal hall / desktop / anywhere (mobile)
User type: software engineers / professional programmers / programmers / users with general computer skills / minimally trained users
Dialogue type: work at the machine console / exchange of punched media and printouts / interactive (via keyboard and screen) / interactive with rigid menus / question-and-answer via active screen

The first generation usually includes machines created at the turn of the 1950s and based on vacuum tubes. These computers were huge, unwieldy, and extremely expensive machines that only large corporations and governments could buy. The tubes consumed enormous amounts of electricity and generated a great deal of heat (Fig. 1).

The instruction set was limited, the circuits of the arithmetic-logic unit and the control unit were quite simple, and software was practically nonexistent. RAM capacity and performance figures were low. Punched tapes, punched cards, magnetic tapes, and printing devices were used for input and output. Performance was about 10-20 thousand operations per second.

Programs for these machines were written in the language of the specific machine. The mathematician who wrote the program sat at the machine's control panel, entered and debugged the program, and ran the calculation. The debugging process was quite lengthy.

Despite the limited capabilities, these machines made it possible to perform complex calculations necessary for weather forecasting, solving nuclear energy problems, etc.

Experience with first-generation machines showed a huge gap between the time spent developing programs and the time spent calculating. These problems began to be overcome through intensive development of programming automation tools and the creation of systems of service programs that simplified work on the machine and increased the efficiency of its use. This, in turn, required significant changes in computer structure, aimed at bringing it closer to the requirements that emerged from operating experience.

In October 1945, the first computer, ENIAC (Electronic Numerical Integrator And Calculator), was created in the USA.

Domestic machines of the first generation: MESM (small electronic calculating machine), BESM, Strela, Ural, M-20.

The second generation of computer equipment comprises machines designed in 1955-65. They are characterized by the use of both vacuum tubes and discrete transistor logic elements (Fig. 2). Their RAM was built on magnetic cores. At this time, the range of input-output equipment began to expand, and high-performance devices for working with magnetic tapes (NML), magnetic drums (NMB), and the first magnetic disks appeared (Table 2).

These machines were characterized by speeds of up to hundreds of thousands of operations per second and memory capacities of up to several tens of thousands of words.

High-level languages appeared, whose facilities allow the entire required sequence of computational actions to be described in a clear, easily understandable form.

A program written in an algorithmic language is incomprehensible to a computer, which understands only the language of its own commands. Therefore, special programs, called translators, translate a program from a high-level language into machine language.
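To make the idea concrete, here is a toy sketch invented for this text (the three-instruction "machine language" is hypothetical, not any historical instruction set): it translates a high-level arithmetic expression into commands for a simple stack machine and then executes them.

import ast

def translate(expression: str) -> list:
    # Compile an expression such as "(2 + 3) * 4" into stack-machine code.
    ops = {ast.Add: "ADD", ast.Mult: "MUL"}
    code = []
    def emit(node):
        if isinstance(node, ast.Constant):
            code.append(("PUSH", node.value))
        elif isinstance(node, ast.BinOp):
            emit(node.left)
            emit(node.right)
            code.append((ops[type(node.op)],))
    emit(ast.parse(expression, mode="eval").body)
    return code

def run(code: list):
    # The "machine": executes PUSH/ADD/MUL against a stack.
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        elif instr[0] == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif instr[0] == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[0]

program = translate("(2 + 3) * 4")
print(program)       # [('PUSH', 2), ('PUSH', 3), ('ADD',), ('PUSH', 4), ('MUL',)]
print(run(program))  # 20

A real translator differs only in scale: it maps the constructs of a language such as FORTRAN onto the actual instruction set of the target machine.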

A wide range of library programs for solving various problems appeared, as well as monitor systems controlling the translation and execution of programs, out of which modern operating systems later grew.

The operating system is the most important part of computer software, designed to automate the planning and organization of program processing, input-output and data management, resource allocation, program preparation and debugging, and other auxiliary operations.

Second-generation machines were characterized by software incompatibility, which made it difficult to build large information systems. Therefore, in the mid-60s there was a transition to the creation of computers that were software-compatible and built on a microelectronic technological base.

The highest achievement of domestic computer technology was the development in 1966, by the team of S. A. Lebedev, of the semiconductor BESM-6 computer with a performance of 1 million operations per second.

Third-generation machines are families of machines with a single architecture, i.e., software compatible. They use integrated circuits, also called microcircuits, as their elemental base.

Third-generation machines appeared in the 60s. Since the process of creating computer equipment was continuous, and many people in different countries were working on a variety of problems, it is difficult and pointless to try to establish exactly when a "generation" began and ended. Perhaps the most important criterion for distinguishing second- and third-generation machines is one based on the concept of architecture.

Third-generation machines have advanced operating systems. They provide multiprogramming, i.e., parallel execution of several programs. Many tasks of managing memory, devices, and resources began to be taken over by the operating system or by the machine itself.
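A minimal sketch of what multiprogramming means, invented for this text (Python generators stand in for resident programs): the "operating system" keeps several programs in memory and interleaves their execution instead of running each one to completion.

def program(name: str, steps: int):
    # A stand-in "program" that runs in small steps.
    for i in range(steps):
        yield f"{name}: step {i}"

ready = [program("A", 3), program("B", 2)]   # the ready queue
while ready:
    job = ready.pop(0)           # take the next resident program
    try:
        print(next(job))         # let it run for one time slice
        ready.append(job)        # then return it to the queue
    except StopIteration:
        pass                     # the program has finished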

Examples of third-generation machines are the IBM-360 and IBM-370 families, the PDP-11 and VAX, the ES EVM (Unified System of computers), the SM EVM (family of small computers), and others.

The performance of machines within the family varies from several tens of thousands to millions of operations per second. The capacity of RAM reaches several hundred thousand words.

The fourth generation comprises the bulk of modern computer technology, developed after the 70s.

The most conceptually important criterion by which these computers can be distinguished from third-generation machines is that fourth-generation machines were designed to efficiently use modern high-level languages ​​and simplify the programming process for the end user.

In hardware terms, they are characterized by the widespread use of large-scale integrated circuits as the element base and by high-speed random-access memory with a capacity of tens of megabytes (Fig. 3, b).

Structurally, machines of this generation are multiprocessor and multi-machine complexes using shared memory and a common field of external devices. Performance reaches several tens of millions of operations per second; RAM capacity is on the order of 1-512 MB.

They are characterized by:

Use of personal computers (PC);

Telecommunications data processing;

Computer networks;

Widespread use of database management systems;

Elements of intelligent behavior of data processing systems and devices.

Fourth-generation computers include the Elektronika MS 0511 PC from the KUVT UKNTs educational computer set, as well as the modern IBM-compatible computers on which we work.

In accordance with the elemental base and level of software development, four real generations of computers are distinguished, a brief description of which is given in Table 3.

Table 3

Computer generations

Comparison parameter: first / second / third / fourth
Time period: 1946-1959 / 1960-1969 / 1970-1979 / since 1980
Element base (control unit, ALU): electronic (vacuum) tubes / semiconductors (transistors) / integrated circuits / large-scale integrated circuits (LSI)
Main computer type: large / large / small (mini) / micro
Main input devices: control panel, punched-card and punched-tape input / alphanumeric display and keyboard added / alphanumeric display, keyboard / color graphic display, scanner, keyboard
Main output devices: alphanumeric printing device (ATsPU), punched-tape output / the same / plotter, printer / plotter, printer
External memory: magnetic tapes, drums, punched tapes, punched cards / magnetic disk added / punched tapes, magnetic disk / magnetic and optical disks
Key software solutions: universal programming languages, translators / batch operating systems, optimizing translators / interactive operating systems, structured programming languages / friendly software, network operating systems
Computer operating mode: single-program / batch / time sharing / personal work and network processing
Purpose of computer use: scientific and technical calculations / technical and economic calculations / management and economic calculations / telecommunications, information services

Table 4

Main characteristics of domestic second-generation computers

Parameter: Hrazdan-2 / BESM-4 / M-220 / Ural-11 / Minsk-22 / Ural-16
Number of addresses: 2 / 3 / 3 / 1 / 2 / 1
Data representation form: floating point / floating point / floating point / floating point, symbolic / floating point, symbolic / floating and fixed point, symbolic
Machine word length (bits): 36 / 45 / 45 / 24 / 37 / 48
Speed (op/s): 5 thousand / 20 thousand / 20 thousand / 14-15 thousand / 5 thousand / 100 thousand
RAM type and capacity (words): ferrite core, 2048 / ferrite core, 8192 / ferrite core, 4096-16,384 / ferrite core, 4096-16,384 / ferrite core / ferrite core, 8192-65,536
External storage (VZU) type and capacity: NML, 120 thousand / NML, 16 million / NML, 8 million / NML, up to 5 million / NML, 12 million / NMB, 130 thousand

In fifth-generation computers, there should presumably be a qualitative transition from data processing to knowledge processing.

The architecture of fifth-generation computers will contain two main blocks. One of them is a traditional computer, but stripped of direct communication with the user; that communication is handled by a block called an intelligent interface. The problem of decentralizing computation by means of computer networks will also be solved.

Briefly, the basic concept of a fifth-generation computer can be formulated as follows:

1. Computers on ultra-complex microprocessors with a parallel-vector structure, simultaneously executing dozens of sequential program instructions.

2. Computers with many hundreds of processors working in parallel, allowing the construction of data- and knowledge-processing systems and of efficient networked computer systems.


Until the 17th century, the activity of society as a whole, and of each person individually, was aimed at mastering matter: learning its properties and producing first primitive, then ever more complex tools, up to the mechanisms and machines that make it possible to produce consumer values.

Then, in the process of the formation of industrial society, the problem of mastering energy came to the fore - first thermal, then electrical, and finally atomic.

At the end of the 20th century. humanity has entered a new stage of development - the stage of building an information society.

At the end of the 60s, D. Bell noted the transformation of industrial society into an information society.

The most important task of society is to restore communication channels in new economic and technological conditions to ensure clear interaction between all areas of economic, scientific and social development, both in individual countries and on a global scale.

A modern computer is a universal, multifunctional, electronic automatic device for working with information.

In 1642, when Pascal was 19 years old, the first working model of an adding machine was made.

In 1673, Leibniz invented a mechanical device for calculations (mechanical calculator).

In 1804, engineer Joseph-Marie Jacquard built a fully automated loom (the Jacquard loom) capable of reproducing complex patterns. Its operation was programmed using a deck of punched cards, each of which controlled one stroke of the shuttle.

In 1822, C. Babbage built a difference engine (a test model) capable of calculating and printing large mathematical tables. Subsequently he conceived the idea of a more powerful analytical engine, which was intended not only to solve mathematical problems of a particular type but to perform a variety of computational operations in accordance with instructions given by the operator.

Countess Augusta Ada Lovelace worked with Charles Babbage to create programs for his calculating machines. Her work in this area was published in 1843.

J. Boole is rightfully considered the father of mathematical logic. A branch of mathematical logic, Boolean algebra, is named after him. J. Boole invented a kind of algebra - a system of notation and rules applied to all kinds of objects, from numbers and letters to sentences (1854).
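Boole's algebra can be demonstrated in a few lines. The sketch below (ordinary Python booleans; the two laws chosen are merely illustrative) verifies two Boolean identities over every possible truth assignment:

from itertools import product

for a, b, c in product([False, True], repeat=3):
    # De Morgan's law: not (a and b) == (not a) or (not b)
    assert (not (a and b)) == ((not a) or (not b))
    # Distributivity: a and (b or c) == (a and b) or (a and c)
    assert (a and (b or c)) == ((a and b) or (a and c))

print("Both identities hold for all assignments.")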

Chebyshev built models of adding machines, the first of which was designed no later than 1876; his adding machine was among the most original calculating machines of its time. In his designs, Chebyshev proposed the principle of continuous carrying of tens and automatic movement of the carriage from digit to digit during multiplication.

In 1904, Alexey Nikolaevich Krylov proposed the design of a machine for integrating ordinary differential equations. In 1912, such a machine was built.

And others.

An electronic computer is a set of technical means designed for the automatic processing of information in the course of solving computational and informational problems.

Computers can be classified according to a number of characteristics, in particular:

Physical representation of the processed information;

Generations (stages of creation and element base).


The ENIAC computer created by Mauchly and Eckert worked a thousand times faster than the Mark-1. But it turned out that most of the time this computer stood idle, since setting up the calculation method (the program) required connecting its wires in the required way, which took hours or even days, while the calculation itself might then take only a few minutes or even seconds.

To simplify and speed up the setting up of programs, Mauchly and Eckert began to design a new computer that could store its program in its own memory. In 1945, the famous mathematician John von Neumann was brought into the work, and he prepared a report on this computer. The report was sent to many scientists and became widely known, because in it von Neumann clearly and simply formulated the general principles of the functioning of computers, i.e., of universal computing devices. To this day, the vast majority of computers are built according to the principles John von Neumann set out in his 1945 report. The first computer embodying von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.
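The heart of those principles, the stored program, fits in a few lines. Below is a minimal sketch with an instruction set invented for this text (not any historical machine): program and data sit in one memory, and the processor repeats a fetch-execute cycle.

# One memory holds both instructions (cells 0-4) and data (cells 6-8).
memory = [
    ("LOAD", 6),    # 0: acc <- memory[6]
    ("ADD", 7),     # 1: acc <- acc + memory[7]
    ("STORE", 8),   # 2: memory[8] <- acc
    ("PRINT", 8),   # 3: print memory[8]
    ("HALT", 0),    # 4: stop
    0,              # 5: (unused)
    2, 3, 0,        # 6-8: data cells
]

acc, pc = 0, 0                   # accumulator and program counter
while True:
    op, addr = memory[pc]        # fetch the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "PRINT":
        print(memory[addr])      # prints 5
    elif op == "HALT":
        break

Because the program is itself data in memory, it can be loaded, replaced, or even modified, which is exactly what freed machines from days of manual rewiring.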

The development of the first electronic serial machine, UNIVAC (Universal Automatic Computer), was begun around 1947 by Eckert and Mauchly, who founded the Eckert-Mauchly company in December of that year. The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5000 vacuum tubes. Its internal storage, with a capacity of 1000 twelve-digit decimal numbers, was implemented on 100 mercury delay lines.

Soon after the UNIVAC-1 machine was put into operation, its developers came up with the idea of ​​automatic programming. It boiled down to ensuring that the machine itself could prepare the sequence of commands needed to solve a given problem.

A strong limiting factor for computer designers in the early 1950s was the lack of high-speed memory. According to one of the pioneers of computing, J. P. Eckert, "the architecture of a machine is determined by its memory." Researchers concentrated their efforts on the storage properties of ferrite rings (cores) strung on wire matrices.

In 1951, J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic-core memory, which consisted of two blocks of 32 x 32 x 17 cores providing storage for 2048 words of 16-bit binary numbers, each with one parity bit.
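These figures fit together, as a quick check shows (assuming the standard core-memory arrangement of one core per bit):

words_per_block = 32 * 32        # one word per (x, y) position in a block
blocks = 2
bits_per_word = 17               # 16 data bits plus 1 parity bit

print(blocks * words_per_block)  # 2048 words, as stated
print(bits_per_word)             # 17 core planes, one per bit of the word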

Soon IBM became involved in the development of electronic computers. In 1952 it released its first industrial electronic computer, the IBM 701, a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 germanium diodes. An improved version, the IBM 704, was distinguished by its higher speed; it used index registers and represented data in floating-point form.

IBM 704
After the IBM 704, the IBM 709 was released, which in architectural terms was close to second- and third-generation machines. In this machine indirect addressing was used for the first time, and input-output channels appeared.

In 1956, IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory, disk storage devices, whose importance was fully appreciated in the following decades of computing. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a pack of 50 magnetically coated metal disks rotating at 1200 rpm. Each disk surface held 100 tracks for recording data, with 10,000 characters each.

Following the first production computer, UNIVAC-1, Remington-Rand released the UNIVAC-1103 in 1952, which worked 50 times faster. Later, software interrupts were used for the first time in the UNIVAC-1103.

Remington-Rand employees used an algebraic form of writing algorithms known as "Short Code" (the first interpreter, created in 1949 by John Mauchly). Also worth noting is US Navy officer and programming-team leader Grace Hopper, then a captain (and later a rear admiral), who developed the first compiler program; the term "compiler" itself was first introduced by G. Hopper in 1951. This compiling program translated into machine language an entire program written in an algebraic form convenient for processing. G. Hopper is also the author of the term "bug" as applied to computers. Once, an insect, a bug, flew into the laboratory through an open window and, settling on the contacts, shorted them, causing a serious malfunction of the machine. The burnt bug was pasted into the log book where malfunctions were recorded. This is how the first computer bug was documented.

IBM took the first steps in programming automation, creating the "Fast Coding System" for the IBM 701 in 1953. In the USSR, A. A. Lyapunov proposed one of the first programming languages. In 1957, a group led by J. Backus completed work on what would become the first popular high-level programming language, FORTRAN. The language, first implemented on the IBM 704 computer, helped expand the range of computer applications.

Alexey Andreevich Lyapunov
In Great Britain in July 1951, at a conference at the University of Manchester, M. Wilkes presented the report "The Best Way to Design an Automatic Calculating Machine," which became a pioneering work on the fundamentals of microprogramming. The method he proposed for designing control devices found wide application.

M. Wilkes realized his idea of microprogramming in 1957 when creating the EDSAC-2 machine. In 1951, together with D. Wheeler and S. Gill, he wrote the first programming textbook, "The Preparation of Programs for an Electronic Digital Computer."

In 1956, Ferranti released the Pegasus computer, which for the first time implemented the concept of general-purpose registers (GPRs). With GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

The advent of personal computers

Microprocessors were first used in various specialized devices, such as calculators. But in 1974, several companies announced the creation of a personal computer based on the Intel-8008 microprocessor, that is, a device performing the same functions as a large computer but designed for a single user. At the beginning of 1975, the first commercially distributed personal computer, the Altair-8800, based on the Intel-8080 microprocessor, appeared. The computer sold for about $500. And although its capabilities were very limited (RAM of only 256 bytes, no keyboard or screen), its appearance was greeted with great enthusiasm: several thousand units were sold in the first months. Buyers equipped this computer with additional devices: a monitor for displaying information, a keyboard, memory expansion units, and so on. Soon these devices began to be produced by other companies. At the end of 1975, Paul Allen and Bill Gates (the future founders of Microsoft) created a Basic language interpreter for the Altair, which allowed users to communicate with the computer easily and to write programs for it. This also contributed to the rising popularity of personal computers.

The success of the Altair-8800 forced many companies to start producing personal computers as well. Personal computers began to be sold fully equipped, with keyboard and monitor, and demand for them reached tens and then hundreds of thousands of units per year. Several magazines devoted to personal computers appeared. Numerous useful programs of practical significance contributed greatly to the growth in sales. Commercially distributed programs also appeared, such as the text-editing program WordStar and the spreadsheet processor VisiCalc (1978 and 1979, respectively). These and many other programs made the purchase of personal computers very profitable for business: with their help it became possible to perform accounting calculations, prepare documents, and so on. Using large computers for these purposes was too expensive.

In the late 1970s, the spread of personal computers even led to a slight decline in demand for large computers and minicomputers. This became a matter of serious concern for IBM, the leading manufacturer of large computers, and in 1979 IBM decided to try its hand at the personal computer market. However, the company's management underestimated the future importance of this market and viewed the personal computer as merely a minor experiment, something like one of dozens of projects under way at the company to create new equipment. So as not to spend too much money on the experiment, management gave the unit responsible for the project a freedom unprecedented in the company. In particular, it was allowed not to design the personal computer from scratch but to use blocks made by other companies. And this unit took full advantage of the chance it was given.

The then-latest 16-bit microprocessor, the Intel-8088, was chosen as the computer's main microprocessor. Its use made it possible to increase the computer's potential capabilities significantly, since the new microprocessor allowed working with 1 megabyte of memory, while all computers available at the time were limited to 64 kilobytes.
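Both memory limits follow directly from address width, and the 8088 reached its 20-bit addresses by combining two 16-bit values, a segment and an offset. A small sketch (the particular segment and offset values are made up for illustration):

print(2 ** 16)    # 65,536 bytes = 64 KB, the reach of a 16-bit address
print(2 ** 20)    # 1,048,576 bytes = 1 MB, the reach of a 20-bit address

# The 8088 formed a 20-bit address as segment * 16 + offset.
segment, offset = 0xB800, 0x0010     # illustrative values
print(hex((segment << 4) + offset))  # 0xb8010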

In August 1981, a new computer called the IBM PC was officially introduced to the public, and soon after it gained great popularity among users. A couple of years later, the IBM PC took a leading position in the market, displacing 8-bit computer models.

IBM PC
The secret of the IBM PC's popularity is that IBM did not make its computer a single one-piece device and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and did not keep the specifications of those parts, or the way they were connected, secret; the design principles of the IBM PC were available to everyone. This approach, called the open-architecture principle, made the IBM PC a stunning success, although it deprived IBM of the exclusive benefit of that success. Here is how the openness of the IBM PC architecture influenced the development of personal computers.

The promise and popularity of the IBM PC made the production of components and add-on devices for it very attractive. Competition among manufacturers led to cheaper components and devices. Very soon, many companies were no longer content with the role of component suppliers and began assembling their own IBM PC-compatible computers. Since these companies did not have to bear IBM's enormous costs for research and for maintaining the structure of a huge corporation, they could sell their computers much more cheaply (sometimes 2-3 times) than comparable IBM machines.

Computers compatible with the IBM PC were initially contemptuously called “clones,” but this nickname did not catch on, as many manufacturers of IBM PC-compatible computers began to implement technical advances faster than IBM itself. Users were able to independently upgrade their computers and equip them with additional devices from hundreds of different manufacturers.

Personal computers of the future

The basis of computers of the future will not be silicon transistors, where information is transmitted by electrons, but optical systems. The information carrier will be photons, since they are lighter and faster than electrons. As a result, the computer will become cheaper and more compact. But the most important thing is that optoelectronic computing is much faster than what is used today, so the computer will be much more powerful.

The PC will be small in size and have the power of today's supercomputers. It will become a repository of information covering all aspects of our daily lives, and it will not be tied to the electrical grid. It will be protected from thieves by a biometric scanner that recognizes its owner by fingerprint.

The main way to communicate with the computer will be voice. The desktop computer will turn into a “candy bar”, or rather, into a giant computer screen - an interactive photonic display. There is no need for a keyboard, since all actions can be performed with the touch of a finger. But for those who prefer a keyboard, a virtual keyboard can be created on the screen at any time and removed when it is no longer needed.

The computer will become the operating system of the house; the house will respond to the owner's needs and know his preferences (make coffee at 7 o'clock, play his favorite music, record the desired TV show, adjust temperature and humidity, and so on).

Screen size will not play any role in the computers of the future; a screen can be as large as your desktop or quite small. Larger computer screens will be based on photonically excited liquid crystals with much lower power consumption than today's LCD monitors. Colors will be vibrant and images accurate (plasma displays are possible). In fact, today's concept of "resolution" will largely atrophy.

The first device designed to make counting easier was the abacus. With the help of its beads it was possible to perform addition and subtraction and simple multiplication.

1642 - French mathematician Blaise Pascal designed the first mechanical adding machine, the Pascalina, which could mechanically perform the addition of numbers.

1673 - Gottfried Wilhelm Leibniz designed an adding machine that could mechanically perform the four arithmetic operations.

First half of the 19th century - English mathematician Charles Babbage tried to build a universal computing device, that is, a computer. Babbage called it the Analytical Engine. He determined that a computer must contain memory and be controlled by a program. According to Babbage, a computer is a mechanical device for which programs are set using punched cards - cards made of thick paper with information printed using holes (at that time they were already widely used in looms).

1941 - German engineer Konrad Zuse built a small computer based on several electromechanical relays.

1943 - in the USA, at one of the IBM enterprises, Howard Aiken created a computer called the Mark-1. It allowed calculations to be carried out hundreds of times faster than by hand (with an adding machine) and was used for military calculations. It combined electrical signals and mechanical drives, measured about 15 x 2.5 m, and contained 750,000 parts. The machine could multiply two 32-digit numbers in 4 seconds.

1943 - in the USA, a group of specialists led by John Mauchly and Presper Eckert began to build the ENIAC computer, based on vacuum tubes.

1945 - mathematician John von Neumann was brought in to work on ENIAC and prepared a report on this computer. In his report, von Neumann formulated the general principles of the functioning of computers, i.e., universal computing devices. To this day, the vast majority of computers are made in accordance with the principles laid down by John von Neumann.

1947 - Eckert and Mauchly began development of the first electronic serial machine, UNIVAC (Universal Automatic Computer). The first model (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage of 1000 twelve-digit decimal numbers was implemented on 100 mercury delay lines.

1949 - the English researcher Maurice Wilkes built the first computer embodying von Neumann's principles.

1951 - J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic-core memory: two blocks of 32 x 32 x 17 cores providing storage for 2048 words of 16-bit binary numbers with one parity bit.

1952 - IBM released its first industrial electronic computer, the IBM 701, a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 diodes. An improved version, the IBM 704, was distinguished by its higher speed; it used index registers and represented data in floating-point form.

After the IBM 704, the IBM 709 was released, which in architectural terms was close to second- and third-generation machines. In this machine indirect addressing was used for the first time, and input-output channels appeared.

1952 - Remington-Rand released the UNIVAC-1103 computer, which was the first to use software interrupts. Remington-Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly).

1956 - IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory, disk storage devices, whose importance was fully appreciated in the following decades of computing. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a pack of 50 magnetically coated metal disks rotating at 1200 rpm. Each disk surface held 100 tracks for recording data, with 10,000 characters each.

1956 - Ferranti released the Pegasus computer, in which the concept of general-purpose registers (GPRs) was first implemented. With GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

1957 - a group led by J. Backus completed work on the first high-level programming language to become popular, FORTRAN. The language, first implemented on the IBM 704 computer, helped expand the range of computer applications.

1960s - the 2nd generation of computers: logic elements were implemented on the basis of semiconductor transistor devices, and algorithmic programming languages such as Algol, Pascal, and others were developed.

1970s - the 3rd generation of computers: integrated circuits containing thousands of transistors on a single semiconductor wafer. Operating systems and structured programming languages began to be created.

1974 - several companies announced the creation of a personal computer based on the Intel-8008 microprocessor - a device that performs the same functions as a large computer, but is designed for one user.

1975 - the first commercially distributed personal computer Altair-8800 based on the Intel-8080 microprocessor appeared. This computer had only 256 bytes of RAM, and there was no keyboard or screen.

Late 1975 - Paul Allen and Bill Gates (the future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to communicate with the computer easily and to write programs for it.

August 1981 - IBM introduced the IBM PC personal computer. The main microprocessor of the computer was a 16-bit Intel-8088 microprocessor, which allowed working with 1 megabyte of memory.

1980s - the 4th generation of computers, built on large-scale integrated circuits: microprocessors implemented as a single chip, and the mass production of personal computers.

1990s — the 5th generation of computers, built on ultra-large-scale integrated circuits; processors contain millions of transistors. Global computer networks for mass use emerged.

2000s — 6th generation of computers. Integration of computers and household appliances, embedded computers, development of network computing.
