Brief history of the development of computing tools. Generations of computing

Early devices and counting devices

Humanity learned to use the simplest counting devices thousands of years ago. The most common need was to determine the number of items used in barter trade. One of the simplest solutions was to use the weight equivalent of the item being exchanged, which did not require an exact count of its components. For these purposes the simplest balance scales were used, which thus became one of the first devices for the quantitative determination of mass.

The principle of equivalence was also widely used in another simple counting device familiar to many, the abacus. The number of items counted corresponded to the number of beads of the instrument that were moved.

A relatively complex counting device could be the rosary used in the practice of many religions. The believer, as if on an abacus, counted the prayers said on the beads of the rosary, and on completing a full circle moved special counter beads on a separate strand, indicating the number of circles counted.

With the invention of gear wheels, much more complex devices for performing calculations appeared. The Antikythera Mechanism, discovered at the beginning of the 20th century at the site of the wreck of an ancient ship that sank around 65 BC (according to other sources, around 87 BC), could even simulate the movement of the planets. Presumably it was used for calendar calculations for religious purposes, predicting solar and lunar eclipses, determining the time of sowing and harvesting, and so on. The calculations were performed by a train of more than 30 bronze gear wheels and several dials; to calculate the lunar phases a differential gear was used, whose invention researchers long dated no earlier than the 16th century. With the passing of antiquity, however, the skills needed to create such devices were forgotten; it took about one and a half thousand years before people again learned to build mechanisms of similar complexity.

"Counting Clocks" by Wilhelm Schickard

This was followed by machines by Blaise Pascal (Pascalina, 1642) and Gottfried Wilhelm Leibniz.

ANITA Mark VIII, 1961

In the Soviet Union at that time, the most famous and widespread calculator was the Felix mechanical adding machine, produced from 1929 to 1978 at factories in Kursk (Schetmash plant), Penza and Moscow.

The emergence of analog computers in the pre-war years

Main article: History of analog computing machines

Differential Analyzer, Cambridge, 1938

The first electromechanical digital computers

Z-series by Konrad Zuse

Reproduction of the Zuse Z1 computer in the Museum of Technology, Berlin

Zuse and his company built other computers, each of whose names began with the letter Z. The most famous machines were the Z11, sold to the optical industry and to universities, and the Z22, the first computer with magnetic memory.

British Colossus

In October 1947, the directors of Lyons & Company, a British company that owned a chain of shops and restaurants, decided to become actively involved in commercial computer development. The LEO I computer went live in 1951 and was the first computer in the world to be used regularly for routine office work.

The Manchester University machine became the prototype for the Ferranti Mark I. The first such machine was delivered to the university in February 1951, and at least nine others were sold between 1951 and 1957.

The second-generation IBM 1401 computer, released in the early 1960s, captured about a third of the global computer market, with more than 10,000 of these machines sold.

The use of semiconductors improved not only the central processor but also the peripheral devices. The second generation of data storage devices made it possible to store tens of millions of characters and digits. Storage split into fixed devices, connected to the processor by a high-speed data link, and removable devices. Replacing a disk pack in a removable drive took only a few seconds. Although the capacity of removable media was usually lower, their interchangeability made it possible to store an almost unlimited amount of data. Magnetic tape was commonly used for archiving data because it provided more storage at a lower cost.

In many second-generation machines, the functions of communicating with peripheral devices were delegated to specialized coprocessors. For example, while a peripheral processor read or punched cards, the main processor performed calculations or program branches. One data bus carried data between memory and the processor during the instruction fetch and execution cycle, while other data buses typically served the peripheral devices. On the PDP-1, a memory access cycle took 5 microseconds; most instructions required 10 microseconds: 5 to fetch the instruction and another 5 to fetch the operand.
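A rough figure, added here as an illustration rather than a claim from the original source: with a 5-microsecond memory cycle and two memory accesses per instruction, such a machine executes on the order of 1 / (2 x 5 microseconds) = 100,000 instructions per second.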

The best domestic computer of the 2nd generation is considered to be BESM-6, created in 1966.

1960s onwards: third and subsequent generations

The rapid growth in the use of computers began with the so-called third generation of computers. This began with the invention of the integrated circuit, made independently by Nobel Prize winner Jack Kilby and Robert Noyce. This later led to the invention of the microprocessor by Ted Hoff (Intel).

The advent of microprocessors led to the development of microcomputers: small, inexpensive computers that could be owned by small companies or individuals. Microcomputers, members of the fourth generation, first appeared in the 1970s and became ubiquitous in the 1980s and beyond. Steve Wozniak, one of the founders of Apple Computer, became known as the developer of the first mass-produced home computer, and later of the first personal computer. Computers based on the microcomputer architecture, with capabilities added from their larger cousins, now dominate most market segments.

In the USSR and Russia

1940s

In 1948, under the supervision of Doctor of Physical and Mathematical Sciences S. A. Lebedev, work began in Kyiv on the creation of a MESM (small electronic calculating machine). In October 1951 it came into operation.

At the end of 1948, I. S. Brook and B. I. Rameev of the Krzhizhanovsky Energy Institute received an inventor's certificate for a computer with a common bus, and in 1950-1951 they built it. This machine was the first in the world to use semiconductor (cuprous oxide) diodes instead of vacuum tubes. From 1948 onwards Brook worked on electronic digital computers and on control by means of computer technology.

At the end of the 1950s, the principles of parallel computation were developed (by A. I. Kitov and others), on the basis of which one of the fastest computers of that time, the M-100, was built for military purposes.

In July 1961, the USSR launched the first semiconductor universal control machine, the Dnepr (before that there had only been specialized semiconductor machines). Even before serial production began, it was used in experiments on controlling complex technological processes.

The first device designed to make counting easier was the abacus. With the help of its beads it was possible to perform addition and subtraction and simple multiplication.

1642 - French mathematician Blaise Pascal designed the first mechanical adding machine, the Pascalina, which could mechanically perform the addition of numbers.

1673 - Gottfried Wilhelm Leibniz designed an adding machine that could mechanically perform the four arithmetic operations.

First half of the 19th century - the English mathematician Charles Babbage tried to build a universal computing device, that is, a computer. Babbage called it the Analytical Engine. He determined that a computer must contain memory and be controlled by a program. According to Babbage, a computer is a mechanical device whose programs are set using punched cards, cards of thick paper with information encoded as holes (at that time punched cards were already widely used in looms).

1941 - German engineer Konrad Zuse built a small computer based on several electromechanical relays.

1943 - in the USA, at one of the IBM plants, Howard Aiken created a computer called the Mark-1. It allowed calculations to be carried out hundreds of times faster than by hand (with an adding machine) and was used for military calculations. It used a combination of electrical signals and mechanical drives. The Mark-1 measured about 15 x 2.5 m and contained 750,000 parts. The machine could multiply two 32-digit numbers in 4 seconds.

1943 - in the USA, a group of specialists led by John Mauchly and Presper Eckert began to build the ENIAC computer based on vacuum tubes.

1945 - mathematician John von Neumann was brought in to work on ENIAC and prepared a report on this computer. In his report, von Neumann formulated the general principles of the functioning of computers, i.e., universal computing devices. To this day, the vast majority of computers are made in accordance with the principles laid down by John von Neumann.

1947 - Eckert and Mauchly began development of the first electronic serial machine, UNIVAC (Universal Automatic Computer). The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage of 1,000 twelve-digit decimal numbers was implemented on 100 mercury delay lines.

1949 - the English researcher Maurice Wilkes built the first computer that embodied von Neumann's principles.

1951 - J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic core memory. It consisted of two cubes of 32 x 32 x 17 cores, which provided storage of 2,048 words of 16-bit binary numbers with one parity bit.

1952 - IBM released its first industrial electronic computer, the IBM 701, a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 diodes. An improved version, the IBM 704, was distinguished by its high speed; it used index registers and represented data in floating-point form.

After the IBM 704, the IBM 709 was released, which in architectural terms was close to second- and third-generation machines. It was the first machine to use indirect addressing and the first to have input-output channels.

1952 - Remington Rand released the UNIVAC 1103 computer, the first to use software interrupts. Remington Rand employees used an algebraic form of writing algorithms called Short Code (the first interpreter, created in 1949 by John Mauchly).

1956 - IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory, disk storage devices, whose importance was fully appreciated in the following decades of computer development. The first disk storage devices appeared in the IBM 305 RAMAC. Its disk pack consisted of 50 metal disks with a magnetic coating that rotated at 1,200 rpm. The surface of each disk held 100 tracks for recording data, each containing 10,000 characters.

1956 - Ferranti released the Pegasus computer, in which the concept of general-purpose registers (GPRs) was first implemented. With the advent of GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

1957 - a group led by J. Backus completed work on the first high-level programming language, called FORTRAN. The language, implemented for the first time on the IBM 704 computer, helped to expand the range of computer applications.

1960s - 2nd generation of computers: computer logic elements were implemented on the basis of semiconductor transistor devices, and algorithmic programming languages such as Algol, Pascal and others were developed.

1970s - 3rd generation of computers: integrated circuits containing thousands of transistors on one semiconductor wafer. Operating systems and structured programming languages began to be created.

1974 - several companies announced the creation of a personal computer based on the Intel-8008 microprocessor - a device that performs the same functions as a large computer, but is designed for one user.

1975 - the first commercially distributed personal computer Altair-8800 based on the Intel-8080 microprocessor appeared. This computer had only 256 bytes of RAM, and there was no keyboard or screen.

Late 1975 - Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to simply communicate with the computer and easily write programs for it.

August 1981 - IBM introduced the IBM PC personal computer. The main microprocessor of the computer was a 16-bit Intel-8088 microprocessor, which allowed working with 1 megabyte of memory.

1980s - 4th generation of computers built on large integrated circuits. Microprocessors are implemented in the form of a single chip, mass production of personal computers.

1990s — 5th generation of computers, ultra-large integrated circuits. Processors contain millions of transistors. The emergence of global computer networks for mass use.

2000s — 6th generation of computers. Integration of computers and household appliances, embedded computers, development of network computing.

Municipal educational institution secondary school No. 3 of Karasuk district

Subject: History of the development of computer technology.

Compiled by:

Student MOUSOSH No. 3

Kochetov Egor Pavlovich

Supervisor and consultant:

Serdyukov Valentin Ivanovich,

computer science teacher MOUSOSH No. 3

Karasuk 2008

Relevance

Introduction

First steps in the development of counting devices

17th century calculating devices

18th century calculating devices

19th century counting devices

Development of computing technology at the beginning of the 20th century

The emergence and development of computer technology in the 40s of the 20th century

Development of computer technology in the 50s of the 20th century

Development of computer technology in the 60s of the 20th century

Development of computer technology in the 70s of the 20th century

Development of computer technology in the 80s of the 20th century

Development of computer technology in the 90s of the 20th century

The role of computer technology in human life

My research

Conclusion

Bibliography

Relevance

Mathematics and computer science are used in all areas of the modern information society. Modern production, the computerization of society, and the introduction of modern information technologies require mathematical and informational literacy and competence. However, school courses in computer science and ICT today often take a one-sided approach that does not allow students to raise their level of knowledge properly, because it lacks the mathematical logic needed for complete mastery of the material. In addition, the failure to stimulate students' creative potential has a negative impact on motivation to learn and, as a result, on the final level of skills, knowledge and abilities. And how can you study a subject without knowing its history? This material can be used in history, mathematics and computer science lessons.

Nowadays it is difficult to imagine that you can do without computers. But not so long ago, until the early 70s, computers were available to a very limited circle of specialists, and their use, as a rule, remained shrouded in secrecy and little known to the general public. However, in 1971, an event occurred that radically changed the situation and, with fantastic speed, turned the computer into an everyday working tool for tens of millions of people.

Introduction

People learned to count using their own fingers. When this was not enough, the simplest counting devices appeared. A special place among them was occupied by the abacus, which became widespread in the ancient world. Then, after years of human development, the first electronic computers appeared. They not only accelerated computing work, but also gave people an impetus to create new technologies. The word "computer" means "calculator", i.e. a computing device. The need to automate data processing, including calculations, arose long ago. Nowadays it is difficult to imagine doing without computers. But not so long ago, until the early 70s, computers were available to a very limited circle of specialists, and their use, as a rule, remained shrouded in secrecy and little known to the general public. However, in 1971 an event occurred that radically changed the situation and with fantastic speed turned the computer into an everyday working tool for tens of millions of people. In that undoubtedly significant year, the then almost unknown company Intel, from a small American town with the beautiful name of Santa Clara (California), released the first microprocessor. It is to this device that we owe the emergence of a new class of computing systems, personal computers, which are now used by essentially everyone, from primary school students and accountants to scientists and engineers. At the end of the 20th century it is impossible to imagine life without a personal computer. The computer has firmly entered our lives, becoming man's main assistant. Today there are many computers in the world, from different companies and of different complexity, purpose and generation. In this essay we will look at the history of the development of computer technology, give a brief overview of the possibilities of modern computing systems, and consider further trends in the development of personal computers.

First steps in the development of counting devices

The history of counting devices goes back many centuries. The oldest calculating instrument that nature itself placed at man’s disposal was his own hand. To make counting easier, people began to use the fingers of first one hand, then both, and in some tribes, their toes. In the 16th century, finger counting techniques were described in textbooks.

The next step in the development of counting was the use of pebbles or other objects, and, for memorizing numbers, notches on animal bones and knots on ropes. The so-called "Vestonice bone" with notches, discovered in excavations, allows historians to assume that even then, 30 thousand years BC, our ancestors were familiar with the rudiments of counting.


The early development of written counting was hampered by the complexity of performing arithmetic operations, in particular multiplication, with the number notations then in use. In addition, few people knew how to write, and there was little material to write on: parchment began to be produced around the 2nd century BC, papyrus was too expensive, and clay tablets were inconvenient to use.

These circumstances explain the appearance of a special calculating device, the abacus. By the 5th century BC, the abacus had become widespread in Egypt, Greece and Rome. It was a board with grooves in which, according to the positional principle, objects such as pebbles or bones were placed.


An abacus-like instrument was known among all nations. The ancient Greek abacus (a board, or the "Salamis board", named after the island of Salamis in the Aegean Sea) was a plank sprinkled with sea sand. Grooves were made in the sand, and numbers were marked on them with pebbles. One groove corresponded to units, the next to tens, and so on. If more than 10 pebbles accumulated in a groove during counting, they were removed and one pebble was added in the next groove.

The Romans improved the abacus, moving from wooden boards, sand and pebbles to marble boards with carved grooves and marble balls. Later, around 500 AD, the abacus was improved into a device consisting of beads strung on rods. The Chinese abacus, the suan-pan, consisted of a wooden frame divided into upper and lower sections. The rods correspond to the columns of digits, and the beads to numbers. For the Chinese, counting was based not on ten but on five.


It is divided into two parts: in the lower part there are five beads on each rod, in the upper part there are two. Thus, in order to set the number 6 on such an abacus, one first moved the bead corresponding to five, and then added one bead in the units position.


The Japanese called the same counting device the soroban.


In Rus', for a long time, counting was done with small bones placed in piles. Around the 15th century the "plank abacus" became widespread; it was almost no different from the ordinary abacus and consisted of a frame with horizontal cords on which drilled plum or cherry pits were strung.


Around the 6th century AD, in India, very advanced ways of writing numbers and rules for performing arithmetic operations, now called the decimal number system, took shape. When writing a number that lacked some digit (for example, 101 or 1204), the Indians said the word "empty" instead of the name of the digit. When writing, a dot was placed in place of the "empty" digit, and later a small circle was drawn. Such a circle was called "sunya", which in Hindi meant "empty place". Arab mathematicians translated this word into their own language: they said "sifr". The modern word "zero" was born relatively recently, later than "digit"; it comes from the Latin word "nihil", "nothing". Around 850 AD the Arab mathematician Muhammad ibn Musa al-Khwarizmi (from Khorezm on the Amu Darya River) wrote a book on the general rules for solving arithmetic problems using equations. It was called "Kitab al-Jabr", and it gave its name to the science of algebra. Another book by al-Khwarizmi, in which he described Indian arithmetic in detail, played a very important role. Three hundred years later (in 1120) this book was translated into Latin, and it became the first textbook of "Indian" (that is, our modern) arithmetic throughout Europe.


We owe the appearance of the term “algorithm” to Muhammad ben Musa al-Khorezm.

At the end of the 15th century, Leonardo da Vinci (1452-1519) created a sketch of a 13-digit adding device with ten-tooth rings. But da Vinci's manuscripts were discovered only in 1967, so the history of mechanical calculating devices is usually counted from Pascal's adding machine. Based on da Vinci's drawings, an American computer manufacturer has in our day built a working model of the device for advertising purposes.

17th century calculating devices


In 1614, the Scottish mathematician John Napier (1550-1617) invented logarithm tables. Their principle is that to each number there corresponds a special number, its logarithm: the exponent to which a fixed number (the base of the logarithm) must be raised to obtain the given number. Any positive number can be expressed this way. Logarithms make division and multiplication very simple: to multiply two numbers, it is enough to add their logarithms. Thanks to this property, the complex operation of multiplication is reduced to the simple operation of addition. To simplify the work, tables of logarithms were compiled, which were later built into a device that could significantly speed up the calculation process: the slide rule.
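A short worked example, added here as an illustration and not part of the original text: since log(a x b) = log a + log b, multiplying 8 by 16 with base-2 logarithms amounts to adding log2 8 = 3 and log2 16 = 4 and then looking up the antilogarithm, 2^(3+4) = 2^7 = 128.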


In 1617 Napier proposed another (non-logarithmic) method of multiplying numbers. The instrument, called Napier's bones (or rods), consisted of thin plates, or blocks. Each side of a block carries numbers forming an arithmetic progression.


Block manipulation allows you to extract square and cube roots, as well as multiply and divide large numbers.


Wilhelm Schickard

In 1623, Wilhelm Schickard, an orientalist and mathematician and a professor at the University of Tübingen, described in letters to his friend Johannes Kepler the design of a "counting clock", a calculating machine with a device for setting numbers and with rollers with a slider and a window for reading the result. This machine could only add and subtract (some sources say it could also multiply and divide). It was the first mechanical calculating machine. In our time a model of it has been built from his description.

Blaise Pascal


In 1642, the French mathematician Blaise Pascal (1623-1662) designed a calculating device to ease the work of his father, a tax inspector. The device made it possible to add decimal numbers. Externally it looked like a box with numerous gears.


The basis of the adding machine was the counting gear. It had ten projections, each with a digit written on it. To carry tens, the gear had one elongated tooth, which engaged and turned an intermediate gear that transmitted the rotation to the tens gear. An additional gear was needed so that both counting gears, ones and tens, rotated in the same direction. The counting gear was connected to a lever by a ratchet mechanism (which transmits forward motion but not reverse motion). Deflecting the lever through a particular angle made it possible to enter single-digit numbers into the counter and sum them. In Pascal's machine, a ratchet drive was attached to all the counting gears, which made it possible to add multi-digit numbers.

In 1642 the Englishman Robert Bissaker, and in 1657, independently, S. Partridge, developed the rectilinear slide rule, whose design has largely survived to this day.


In 1673, the German philosopher, mathematician and physicist Gottfried Wilhelm Leibniz (1646-1716) created the "stepped reckoner", a calculating machine that made it possible to add, subtract, multiply, divide, and extract square roots.

It was a more advanced device that used a moving part (a prototype of the carriage) and a handle with which the operator turned the wheel. Leibniz's product suffered the sad fate of its predecessors: if anyone used it, it was only Leibniz's family and his friends, since the time of mass demand for such mechanisms had not yet come.

The machine was the prototype of the arithmometer, which was in use from 1820 until the 1960s.

18th century calculating devices


In 1700, Charles Perrault published "A Collection of a Large Number of Machines of Claude Perrault's Own Invention", which, among the inventions of Claude Perrault (Charles Perrault's brother), includes an adding machine in which gear racks are used instead of gear wheels. The machine was called the "rhabdological abacus". It got this name because the ancients used the word abacus for a small board on which numbers are written, and rhabdology for the science of performing arithmetic operations using small sticks with numbers.


In 1703, Gottfried Wilhelm Leibniz wrote the treatise "Explication de l'Arithmétique Binaire", on the use of the binary number system in calculating machines. His first works on binary arithmetic date back to 1679.

The German mathematician, physicist and astronomer Christian Ludwig Gersten, a member of the Royal Society of London, invented an arithmetic machine in 1723 and built it two years later. The Gersten machine is remarkable in that it was the first to use a device for calculating the quotient and the number of successive addition operations required when multiplying numbers; it also made it possible to check that the second addend had been entered (set) correctly, which reduced the likelihood of subjective error caused by the fatigue of the human operator.

In 1727, Jacob Leupold created a calculating machine that used the Leibniz machine principle.

In the report of a commission of the Paris Academy of Sciences, published in 1751 in the "Journal of Scholars", there are remarkable lines: "The results of Mr. Pereira's method that we have seen are quite sufficient to confirm once again the opinion ... that this method of teaching the deaf-mute is extremely practical, and that the person who used it with such success is worthy of praise and encouragement... Speaking of the progress which Mr. Pereira's pupil made in a very short time in the knowledge of numbers, we must add that Mr. Pereira used an Arithmetic Machine which he himself invented." This arithmetic machine is described in the same journal, but unfortunately without drawings. The calculating machine used some ideas borrowed from Pascal and Perrault, but overall it was a completely original design. It differed from known machines in that its counting wheels were located not on parallel axes but on a single axis passing through the whole machine. This innovation, which made the design more compact, was later widely used by other inventors, Felt and Odhner.

In the second half of the 18th century (no later than 1770), a summing machine was created in the town of Nesvizh. The inscription on it states that it was "invented and made by the Jew Evna Jacobson, watchmaker and mechanic in the town of Nesvizh in Lithuania", "Minsk Voivodeship". This machine is currently in the collection of scientific instruments of the M. V. Lomonosov Museum (St. Petersburg). An interesting feature of the Jacobson machine was a special device that made it possible to count automatically the number of subtractions performed, in other words, to determine the quotient. The presence of this device, the ingeniously solved problem of entering numbers and the ability to record intermediate results allow us to consider the "watchmaker from Nesvizh" an outstanding designer of calculating machinery.


In 1774, the rural pastor Philipp Matthäus Hahn developed the first working calculating machine. He managed to build and, most incredibly, to sell a small number of these machines.

In 1775, in England, Earl Stanhope created a calculating device that implemented no new mechanical systems but was more reliable in operation.


19th century calculating devices

In 1804, the French inventor Joseph-Marie Jacquard (1752-1834) came up with a way to control the thread automatically when working on a loom. The method consisted of using special cards with holes punched in the right places (depending on the pattern to be woven into the fabric). Thus he designed a loom whose operation could be programmed using special cards. The operation of the loom was programmed with a whole deck of punched cards, each of which controlled one pass of the shuttle. When moving on to a new pattern, the operator simply replaced one deck of punched cards with another. The creation of a loom controlled by cards with holes punched in them, connected to each other in the form of a tape, is one of the key discoveries that determined the further development of computing technology.

Charles Xavier Thomas

Charles Xavier Thomas (1785-1870) in 1820 created the first mechanical calculator that could not only add and multiply, but also subtract and divide. The rapid development of mechanical calculators led to the addition of a number of useful functions by 1890: storing intermediate results and using them in subsequent operations, printing the result, etc. The creation of inexpensive, reliable machines made it possible to use these machines for commercial purposes and scientific calculations.

Charles Babbage

In 1822 the English mathematician Charles Babbage (1792-1871) put forward the idea of creating a program-controlled calculating machine with an arithmetic unit, a control unit, and input and printing devices.

The first machine Babbage designed, the Difference Engine, was to be powered by a steam engine. It calculated tables of logarithms by the method of constant differences and recorded the results on a metal plate. The working model he created in 1822 was a six-digit calculator capable of performing calculations and printing numerical tables.

Ada Lovelace

Lady Ada Lovelace (Ada Byron, Countess of Lovelace, 1815-1852) worked at the same time as the English scientist. She developed the first programs for the machine, laid down many ideas, and introduced a number of concepts and terms that have survived to this day.

Babbage's Analytical Engine was later built by enthusiasts from the London Science Museum. It consists of four thousand iron, bronze and steel parts and weighs three tons. True, it is very difficult to use: with each calculation you have to turn the machine's handle several hundred (or even thousands of) times.

The numbers are written (typed) on disks arranged vertically and set to positions from 0 to 9. The machine is controlled by a sequence of punched cards containing instructions (the program).

First telegraph

The first electric telegraph was created in 1837 by the English inventors William Cooke (1806-1879) and Charles Wheatstone (1802-1875). An electric current was sent through wires to the receiver. The signals deflected needles on the receiver, which pointed to different letters and thus conveyed messages.

The American artist Samuel Morse (1791-1872) invented a new telegraph code that replaced the Cooke and Wheatstone code. He developed a code of dots and dashes for each letter. Morse staged a demonstration of his code by laying a 6 km telegraph wire from Baltimore to Washington and transmitting news of the presidential election over it.

Later (in 1858), Charles Wheatstone created a system in which an operator, using Morse code, typed messages onto a long paper tape that was fed into a telegraph machine. At the other end of the line a recorder typed the received message onto another paper tape. The productivity of telegraph operators increased tenfold: messages were now sent at a speed of one hundred words per minute.

In 1846, the Kummer calculator appeared, which was mass-produced for more than 100 years, until the seventies of the twentieth century. Calculators have now become an integral attribute of modern life. But when there were no calculators, the Kummer counting device was in use; at the whim of later designers it turned into the "Addiator", "Products", "Arithmetic Ruler" or "Progress". This remarkable device, created in the mid-19th century, could, according to its inventor, be made the size of a playing card and therefore could easily fit in a pocket. The device of Kummer, a St. Petersburg music teacher, stood out among earlier inventions for its portability, which became its most important advantage. Kummer's invention looked like a rectangular board with shaped slats. Addition and subtraction were carried out by simply moving the slats. It is interesting that Kummer's calculator, presented in 1846 to the St. Petersburg Academy of Sciences, was intended for monetary calculations.

In Russia, in addition to the Slonimsky device and modifications of the Kummer numerator, the so-called counting bars, invented in 1881 by the scientist Ioffe, were quite popular.

George Boole

In 1847, the English mathematician George Boole (1815-1864) published the work "The Mathematical Analysis of Logic". This is how a new branch of mathematics appeared, called Boolean algebra. Every quantity in it can take only one of two values: true or false, 1 or 0. This algebra proved very useful to the creators of modern computers, since the computer understands only two symbols, 0 and 1. Boole is considered the founder of modern mathematical logic.
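A small sketch, added here as an illustration and not taken from the original text, of the basic Boolean operations that both Boole's algebra and computer logic circuits rely on, written in Python:

from itertools import product

def AND(a, b):   # conjunction: 1 only if both inputs are 1
    return a & b

def OR(a, b):    # disjunction: 1 if at least one input is 1
    return a | b

def NOT(a):      # negation: turns 0 into 1 and 1 into 0
    return 1 - a

# Print the truth table for the three operations over the values 0 and 1.
for a, b in product((0, 1), repeat=2):
    print(a, b, AND(a, b), OR(a, b), NOT(a))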

1855 The brothers Georg and Edvard Scheutz from Stockholm built the first mechanical computer based on the work of Charles Babbage.

In 1867, Bunyakovsky invented self-calculators, which were based on the principle of connected digital wheels (Pascal's gear).

In 1878, the English scientist Joseph Swan (1828-1914) invented the electric light bulb. It was a glass flask with a carbon filament inside. To prevent the filament from burning out, Swan removed the air from the flask.

The following year, American inventor Thomas Edison (1847-1931) also invented the light bulb. In 1880, Edison began producing safety light bulbs, selling them for $2.50. Subsequently, Edison and Swan created a joint company, Edison and Swan United Electric Light Company.

In 1883, while experimenting with the lamp, Edison inserted a platinum electrode into the vacuum bulb, applied a voltage and, to his surprise, discovered that a current flowed between the electrode and the carbon filament. Since at that moment Edison's main goal was to extend the life of the incandescent lamp, this result was of little interest to him, but the enterprising American nevertheless obtained a patent. The phenomenon we know as thermionic emission was then called the "Edison effect" and was forgotten for a time.

Vilgodt Teofilovich Odhner

In 1880, Vilgodt Teofilovich Odhner, a Swede by nationality who lived in St. Petersburg, designed an adding machine. It must be admitted that there had been adding machines before Odhner, the systems of K. Thomas, but they were unreliable, large and inconvenient to operate.

He began working on the adding machine in 1874, and in 1890 he started mass production. The "Felix" modification of this machine was produced until the 1950s. The main feature of Odhner's brainchild is the use of gear wheels with a variable number of teeth (this wheel bears Odhner's name) instead of Leibniz's stepped drums. Such a wheel is structurally simpler than the drum and smaller in size.

Herman Hollerith

In 1884, the American engineer Herman Hollerith (1860-1929) took out a patent for a "census machine" (a statistical tabulator). The invention included a punched card and a sorting machine. Hollerith's punched card turned out to be so successful that it has survived to this day without the slightest change.

The idea of ​​putting data on punched cards and then reading and processing them automatically belonged to John Billings, and its technical solution belonged to Herman Hollerith.

The tabulator accepted cards the size of a dollar bill. There were 240 positions on the cards (12 rows of 20 positions). When reading information from punched cards, 240 needles pierced these cards. Where the needle entered the hole, it closed an electrical contact, as a result of which the value in the corresponding counter increased by one.
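A tiny sketch, added here as an illustration (the card layout below is an assumption, not a description from the original text), of how such a tabulator counts: each hole position has its own counter, and every punched hole read from a card increments the corresponding counter by one.

from collections import Counter

def tabulate(cards):
    # One counter per hole position (0..239: 12 rows x 20 columns).
    counters = Counter()
    for card in cards:               # "reading" one punched card
        for position in card:        # a needle passing through a hole
            counters[position] += 1  # closes a contact, advancing that counter
    return counters

# Example: three cards; position 5 is punched on two of them.
print(tabulate([[5, 17], [5], [200]]))   # Counter({5: 2, 17: 1, 200: 1})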

Development of computer technology at the beginning of the 20th century

1904 The famous Russian mathematician, shipbuilder, academician A.N. Krylov proposed the design of a machine for integrating ordinary differential equations, which was built in 1912.

The English physicist John Ambrose Fleming (1849-1945), studying the "Edison effect", created the diode. Diodes were used to convert radio waves into electrical signals that could be transmitted over long distances.

Two years later, through the efforts of the American inventor Lee de Forest, the triode appeared.

1907 The American engineer J. Powers designed an automatic card punch.

The St. Petersburg scientist Boris Rosing applied for a patent for a cathode-ray tube as a data receiver.

1918 The Russian scientist M. A. Bonch-Bruevich and the English scientists W. Eccles and F. Jordan (1919) independently created an electronic device that the British called a trigger (flip-flop), which played a major role in the development of computer technology.

In 1930, Vannevar Bush (1890-1974) designed a differential analyzer. In fact, this was the first successful attempt to create a computer capable of performing cumbersome scientific calculations. Bush's role in the history of computer technology is very large, but his name appears most often in connection with the prophetic article "As We May Think" (1945), in which he described the concept of hypertext.

Konrad Zuse created the Z1 computer, which had a keyboard for entering problem conditions. Upon completion of the calculations, the result was displayed on a panel with many small lights. The total area occupied by the machine was 4 sq.m.

Konrad Zuse patented a method for automatic calculations.

For the next model Z2, K. Zuse came up with a very ingenious and cheap input device: Zuse began encoding instructions for the machine by punching holes in used 35 mm photographic film.

In 1938 the American mathematician and engineer Claude Shannon, and in 1941 the Russian scientist V. I. Shestakov, independently showed the possibility of using the apparatus of mathematical logic for the synthesis and analysis of relay switching circuits.

In 1938, the telephone company Bell Laboratories created the first binary adder (an electrical circuit that performed binary addition), one of the main components of any computer. The author of the idea was George Stibitz, who experimented with Boolean algebra and various parts: old relays, batteries, light bulbs and wires. By 1940, a machine had been born that could perform the four arithmetic operations on complex numbers.

The emergence and development of computer technology in the 40s of the 20th century

In 1941, IBM engineer B. Phelps began work on creating decimal electronic counters for tabulators, and in 1942 he created an experimental model of an electronic multiplying device. In 1941, Konrad Zuse built the world's first operational program-controlled relay binary computer, the Z3.

Simultaneously with the construction of ENIAC, also in secrecy, a computer was created in Great Britain. Secrecy was necessary because a device was being designed to decipher the codes used by the German armed forces during the Second World War. The mathematical decryption method was developed by a group of mathematicians, including Alan Turing. During 1943, the Colossus machine was built in London using 1,500 vacuum tubes. The developers of the machine are M. Newman and T. F. Flowers.

Although both ENIAC and Colossus ran on vacuum tubes, they essentially copied electromechanical machines: new content (electronics) was squeezed into an old form (the structure of pre-electronic machines).

In 1937, Harvard mathematician Howard Aiken proposed a project to create a large calculating machine. The work was sponsored by IBM President Thomas Watson, who invested $500 thousand in it. Design of the Mark-1 began in 1939; the computer was built by the New York company IBM. The computer contained about 750 thousand parts, 3304 relays and more than 800 km of wires.

In 1944, the finished machine was officially transferred to Harvard University.

In 1944, American engineer John Presper Eckert first put forward the concept of a program stored in computer memory.

Aiken, who had the intellectual resources of Harvard and a capable Mark-1 machine, received several orders from the military. So the next model, the Mark-2, was ordered by the US Navy Weapons Directorate. Design began in 1945, and construction ended in 1947. The Mark-2 was the first multitasking machine—multiple buses made it possible to simultaneously transmit multiple numbers from one part of the computer to another.

In 1948, Sergei Aleksandrovich Lebedev (1902-1974) and B. I. Rameev proposed the first project of a domestic digital electronic computer. Under the leadership of Academicians S. A. Lebedev and V. M. Glushkov, domestic computers were developed: first the MESM, the small electronic calculating machine (1951, Kyiv), then the BESM, the high-speed electronic calculating machine (1952, Moscow). In parallel with them the Strela, Ural, Minsk, Hrazdan and Nairi were created.

In 1949 an English stored-program machine, the EDSAC (Electronic Delay Storage Automatic Computer), designed by Maurice Wilkes of the University of Cambridge, was put into operation. The EDSAC computer contained 3,000 vacuum tubes and was six times more productive than its predecessors. Maurice Wilkes introduced a system of mnemonics for machine instructions, called assembly language.

In 1949 John Mauchly created the first programming language interpreter called "Short Order Code".

Development of computer technology in the 50s of the 20th century

In 1951, work was completed on the creation of UNIVAC (Universal Automatic Computer). The first example of the machine, the UNIVAC-1, was built for the US Census Bureau. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. The internal storage device, with a capacity of 1,000 twelve-digit decimal numbers, was made on 100 mercury delay lines.

This computer is interesting because it was aimed at relatively mass production without changing the architecture and special attention was paid to the peripheral part (input-output facilities).

Jay Forrester patented magnetic core memory. Such memory was first used on the Whirlwind-1 machine. It consisted of two cubes of 32 x 32 x 17 cores, which provided storage of 2,048 words of 16-bit binary numbers with one parity bit.
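A worked check of those numbers, added here as an illustration rather than a statement from the original source: each word of 16 data bits plus 1 parity bit occupies 17 cores, one in each 32 x 32 bit plane, so the two cubes hold 2 x 32 x 32 = 2,048 words, and each cube contains 32 x 32 x 17 = 17,408 cores.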

This machine was the first to use a universal, non-specialized bus (the relationships between the various computer devices became flexible), and two devices were used as input-output systems: a Williams cathode-ray tube and a typewriter with punched paper tape (a Flexowriter).

"Tradis", released in 1955. - the first transistor computer from Bell Telephone Laboratories - contained 800 transistors, each of which was enclosed in a separate housing.

In 1957, disk memory (magnetized aluminum disks with a diameter of 61 cm) appeared for the first time in the IBM 350 RAMAC model.

H. Simon, A. Newell and J. Shaw created GPS, the General Problem Solver.

In 1958 Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invent the integrated circuit.

1955-1959 Russian scientists A.A. Lyapunov, S.S. Kamynin, E.Z. Lyubimsky, A.P. Ershov, L.N. Korolev, V.M. Kurochkin, M.R. Shura-Bura and others created “programming programs” - prototypes of translators. V.V. Martynyuk created a symbolic coding system - a means of accelerating the development and debugging of programs.

1955-1959 The foundation was laid for programming theory (A.A. Lyapunov, Yu.I. Yanov, A.A. Markov, L.A. Kaluzhin) and numerical methods (V.M. Glushkov, A.A. Samarsky, A.N. Tikhonov ). Schemes of the mechanism of thinking and genetic processes, algorithms for diagnosing medical diseases are modeled (A.A. Lyapunov, B.V. Gnedenko, N.M. Amosov, A.G. Ivakhnenko, V.A. Kovalevsky, etc.).

1959 Under the leadership of S.A. Lebedev, the BESM-2 machine was created, with a performance of 10 thousand operations per second. Its use is associated with calculations for the launches of space rockets and of the world's first artificial Earth satellites.

1959 The M-20 machine was created, with S.A. Lebedev as chief designer. For its time it was one of the fastest in the world (20 thousand operations per second). This machine was used to solve most of the theoretical and applied problems connected with the development of the most advanced fields of science and technology of that time. On the basis of the M-20, the unique multiprocessor M-40 was created, the fastest computer in the world at that time (40 thousand operations per second). The M-20 was succeeded by the semiconductor BESM-4 and M-220 (200 thousand operations per second).

Development of computer technology in the 60s of the 20th century

In 1960, in a short time, the CODASYL (Conference on Data Systems Languages) group, led by J. Wegstein and with the support of IBM, developed a standardized business programming language, COBOL (Common Business Oriented Language). This language is focused on solving economic problems, or more precisely, on data processing.

In the same year, J. Schwartz and others at the System Development Corporation developed the JOVIAL programming language. The name comes from Jules' Own Version of the International Algorithmic Language. It is a procedural language, a version of Algol-58, used mainly for military applications by the US Air Force.

IBM has developed a powerful computing system called Stretch (IBM 7030).

1961 IBM Deutschland implemented the connection of a computer to a telephone line using a modem.

Also, the American professor John McCarthy developed the LISP (LISt Processing) language.

J. Gordon, head of the development of simulation systems at IBM, created the GPSS (General Purpose Simulation System) language.

Employees of the University of Manchester under the leadership of T. Kilburn created the Atlas computer, which for the first time implemented the concept of virtual memory. The first minicomputer (the PDP-1) also appeared well before 1971, the year the first microprocessor (the Intel 4004) was created.

In 1962, R. Griswold developed the programming language SNOBOL, focused on string processing.

Steve Russell developed Spacewar!, one of the first computer games.

E.V. Evreinov and Yu. Kosarev proposed a model of a team of computers and substantiated the possibility of building supercomputers on the principles of parallel execution of operations, variable logical structure and structural homogeneity.

IBM released the first external memory devices with removable disks.

Kenneth E. Iverson (IBM) published a book called "A Programming Language" (APL). The language initially served as a notation for writing algorithms. The first implementation, APL/360, was made in 1966 by Adin Falkoff (Harvard, IBM). There are versions of interpreters for the PC. Because its programs are so difficult to read, it is sometimes called "Chinese BASIC". In fact it is a procedural, very compact, ultra-high-level language. It requires a special keyboard. Its further development is APL2.

1963 The American standard code for information interchange, ASCII (American Standard Code for Information Interchange), was approved.

General Electric created the first commercial DBMS (database management system).

1964 O.-J. Dahl and K. Nygaard created the SIMULA-1 modeling language.

In 1967 under the leadership of S.A. Lebedev and V.M. Melnikov, a high-speed computing machine BESM-6 was created at ITM and VT.

It was followed by "Elbrus" - a new type of computer with a productivity of 10 million operations/s.

Development of computer technology in the 70s of the 20th century

In 1970, Charles Moore, an employee of the National Radio Astronomy Observatory, created the Forth programming language.

Dennis Ritchie and Ken Thompson released the first version of Unix.

E. F. Codd published the first paper on the relational data model.

In 1971 Intel (USA) created the first microprocessor (MP) - a programmable logical device made using VLSI technology.

The 4004 processor was 4-bit and could perform 60 thousand operations per second.

1974 Intel developed the first universal eight-bit microprocessor, the 8080, with 4,500 transistors. Edward Roberts from MITS built the first personal computer, the Altair, on Intel's new 8080 chip. The Altair turned out to be the first mass-produced PC, essentially marking the beginning of an entire industry. The kit included a processor, a 256-byte memory module, a system bus and a few other small parts.

Young programmer Paul Allen and Harvard University student Bill Gates implemented the BASIC language for Altair. They subsequently founded Microsoft, which is today the largest software manufacturer.

Development of computer technology in the 80s of the 20th century

1981 Compaq released the first Laptop.

Niklaus Wirth developed the MODULA-2 programming language.

The first portable computer was created - Osborne-1, weighing about 12 kg. Despite a fairly successful start, the company went bankrupt two years later.

1981 IBM released its first personal computer, the IBM PC, based on the 8088 microprocessor.

1982 Intel released the 80286 microprocessor.

The American computer manufacturing company IBM, which previously occupied a leading position in the production of large computers, began producing professional personal computers IBM PC with the MS DOS operating system.

Sun began producing the first workstations.

Lotus Development Corp. released the Lotus 1-2-3 spreadsheet.

The English company Inmos, based on the ideas of Oxford University professor Tony Hoare about "communicating sequential processes" and on David May's concept of an experimental programming language, created the occam language.

1985 Intel released a 32-bit microprocessor 80386, consisting of 250 thousand transistors.

Seymour Cray created the CRAY-2 supercomputer with a capacity of 1 billion operations per second.

Microsoft released the first version of the Windows graphical operating environment.

The emergence of a new programming language, C++.

Development of computer technology in the 90s of the 20th century

1990 Microsoft released Windows 3.0.

Tim Berners-Lee developed the HTML language (Hypertext Markup Language; the main format of Web documents) and the prototype of the World Wide Web.

Cray released the Cray Y-MP C90 supercomputer with 16 processors and a speed of 16 Gflops.

1991 Microsoft released Windows 3.1.

The JPEG graphics format was developed.

Philip Zimmermann invented PGP, a public-key message encryption system.

1992 The first free operating system with great capabilities appeared: Linux. The Finnish student Linus Torvalds (the author of the system) decided to experiment with the instructions of the Intel 386 processor and posted what he got on the Internet. Hundreds of programmers from around the world began to add to and rework the program. It evolved into a fully functional working operating system. History is silent about who decided to call it Linux, but how the name came about is quite clear: "Linu" or "Lin" from the name of its creator, and "x" or "ux" from UNIX, because the new OS was very similar to it, only it now ran on computers with the x86 architecture.

DEC introduced the first 64-bit RISC Alpha processor.

1993 Intel released a 64-bit Pentium microprocessor, which consisted of 3.1 million transistors and could perform 112 million operations per second.

The MPEG video compression format has appeared.

1994 Apple Computer began releasing the Power Macintosh (Power Mac) series, based on the PowerPC processor.

1995 DEC announced the release of five new models of Celebris XL personal computers.

NEC announced the completion of development of the world's first memory chip with a capacity of 1 Gbit.

The Windows 95 operating system appeared.

SUN introduced the Java programming language.

The RealAudio format has appeared - an alternative to MPEG.

1996 Microsoft released Internet Explorer 3.0, a fairly serious competitor to Netscape Navigator.

1997 Apple released the Macintosh OS 8 operating system.

Conclusion

The personal computer quickly entered our lives. Just a few years ago it was rare to see a personal computer: they existed, but they were very expensive, and not every company could afford to have a computer in its office. Now every third home has a computer, and it has become deeply embedded in human life.

Modern computers represent one of the most significant achievements of human thought, the influence of which on the development of scientific and technological progress can hardly be overestimated. The scope of computer applications is enormous and is constantly expanding.

My research

Number of computers owned by students at school in 2007.

Number of students

Have computers

Percentage of total quantity

Number of computers owned by students at school in 2008.

Number of students

Have computers

Percentage of total quantity

Increase in the number of computers among students:

The rise of computers in school

Conclusion

Unfortunately, it is impossible to cover the entire history of computers within the framework of an abstract. We could talk for a long time about how, in the small town of Palo Alto (California), at the Xerox PARC research center, the cream of the programmers of that time gathered to develop revolutionary concepts that radically changed the image of the machines and paved the way for the computers of the end of the 20th century. How the talented schoolboy Bill Gates and his friend Paul Allen met Ed Roberts and created the amazing BASIC language for the Altair computer, which made it possible to develop application programs for it. How the appearance of the personal computer gradually changed: a monitor and keyboard appeared, then a floppy disk drive with so-called floppy disks, and then a hard drive. A printer and a mouse became integral accessories. One could talk about the invisible war in the computer markets for the right to set standards between the huge IBM corporation and the young Apple, which dared to compete with it, forcing the whole world to decide which is better, Macintosh or PC. And about many other interesting things that happened quite recently but have already become history.

For many, a world without a computer is a distant history, about as distant as the discovery of America or the October Revolution. But every time you turn on the computer, it is impossible to stop being amazed at the human genius that created this miracle.

Modern personal IBM PC-compatible computers are the most widely used type of computer. Their power is constantly growing, and their field of application is expanding. These computers can be networked together, allowing tens or hundreds of users to easily exchange information and simultaneously access shared databases. Electronic mail allows computer users to send text and fax messages to other cities and countries over the regular telephone network and to retrieve information from large data banks. The global electronic communication system Internet provides, at extremely low cost, the opportunity to quickly receive information from all corners of the globe, provides voice and fax communication, and facilitates the creation of intracorporate information networks for companies with branches in different cities and countries. However, the information-processing capabilities of IBM PC-compatible personal computers are still limited, and their use is not justified in all situations.

The view of the history of computer technology taken in this abstract has at least two limitations: first, everything related to automatic computing before the creation of the ENIAC computer is treated as prehistory; second, the development of computer technology is described only in terms of hardware and microprocessor circuitry.


Generations:

I. Computers on vacuum tubes; performance about 20,000 operations per second; each machine had its own programming language ("BESM", "Strela").

II. In 1960, transistors, invented in 1948, began to be used in computers; they were more reliable and durable and allowed larger RAM. One transistor could replace about 40 vacuum tubes and worked at a higher speed. Magnetic tapes were used as storage media ("Minsk-2", "Ural-14").

III. In 1964, the first integrated circuits (ICs) appeared and came into wide use. An IC is a chip with an area of about 10 mm². One IC could replace 1,000 transistors - a single chip doing the work of the 30-ton ENIAC. It became possible to process several programs in parallel.

IV. Large-scale integrated circuits (LSIs), each roughly equivalent in power to 1,000 ICs, were used for the first time. This reduced the cost of producing computers. In 1980 it became possible to place the central processor of a small computer on a chip a quarter of an inch across ("ILLIAC", "Elbrus").

V. Sound synthesis, the ability to conduct a dialogue and to carry out commands given by voice or touch.

Early devices and counting devices

Computer technology is a critical component of the process of computation and data processing. The first devices for calculation were counting sticks. As they developed, these devices became more complex: Phoenician clay figurines, for example, were also intended to visually represent the number of items being counted. Such devices were used by the traders and accountants of the time. Gradually, ever more complex devices grew out of the simplest counting aids: the abacus (counting frame), the slide rule, the mechanical adding machine, the electronic computer. The principle of equivalence was widely used in the simplest counting device, the abacus or counting frame, where the number of items counted corresponded to the number of beads moved. A relatively complex counting device could be the rosary, used in the practice of many religions: the believer, as if on an abacus, counted the number of prayers said on the beads of the rosary.

"Counting Clocks" by Wilhelm Schickard

In 1623, Wilhelm Schickard invented the "Counting Clock" - the first mechanical calculator that could perform four arithmetic operations. This was followed by machines by Blaise Pascal (Pascalina, 1642) and Gottfried Wilhelm Leibniz.

Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply and divide. It was mainly based on the work of Leibniz. Mechanical calculators that count decimal numbers were used until the 1970s. Leibniz also described the binary number system, the central ingredient of all modern computers. However, until the 1940s, many subsequent developments (including Charles Babbage's machines and even the 1945 ENIAC) were based on a more difficult-to-implement decimal system.
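To make the contrast concrete, here is a minimal sketch (Python, added purely for illustration) of how a decimal number is rewritten in binary: only two symbols are needed, and they map naturally onto two-state elements such as relays or vacuum tubes.

# Converting a decimal number to binary by repeated division by 2;
# each remainder (0 or 1) corresponds to the state of one two-state element.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # remainder: 0 or 1
        n //= 2                   # integer division by the base
    return "".join(reversed(bits))

for value in (5, 42, 1945):
    print(value, "->", to_binary(value))
# 5 -> 101, 42 -> 101010, 1945 -> 11110011001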

Punched-card systems

In 1801, Joseph Marie Jacquard developed a loom in which the woven pattern was determined by punched cards. The series of cards could be replaced, and changing the pattern did not require changes in the mechanics of the machine. This was an important milestone in the history of programming. In 1838, Charles Babbage moved from developing the Difference Engine to designing a more complex Analytical Engine, whose programming principles traced directly back to Jacquard's punched cards. In 1890, the US Census Bureau used punched cards and sorting mechanisms developed by Herman Hollerith to process the flood of decennial census data mandated by the Constitution. Hollerith's company eventually became the core of IBM. This corporation developed punched-card technology into a powerful tool for business data processing and produced an extensive line of specialized data recording equipment. By 1950, IBM technology had become ubiquitous in industry and government. Many computer solutions used punched cards before (and after) the late 1970s.

1835-1900s: First programmable machines

In 1835, Charles Babbage described his Analytical Engine. It was a general purpose computer design, using punched cards as input data and program storage, and a steam engine as the power source. One of the key ideas was the use of gears to perform mathematical functions. Following in Babbage's footsteps, although unaware of his earlier work, was Percy Ludgate, an accountant from Dublin [Ireland]. He independently designed a programmable mechanical computer, which he described in a paper published in 1909.

1930s - 1960s: desktop calculators

The Felix adding machine was the most common in the USSR; it was produced from 1929 to 1978.

In 1948, the Curta appeared - a small mechanical calculator that could be held in one hand. In the 1950s and 1960s, several brands of similar devices appeared on the Western market. The first fully electronic desktop calculator was the British ANITA Mk. VII, which used a "Nixie"-tube display and 177 miniature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It was entirely transistorized, had 13-digit resolution on a 5-inch cathode ray tube, and was priced at $2,200 for the calculator market. Square root and inverse functions were added to the EC-132 model. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie-tube display and could calculate logarithms.

The emergence of analog computers in the pre-war years

Differential Analyzer, Cambridge, 1938

Before World War II, mechanical and electrical analogue computers were considered the most advanced machines and were widely believed to be the future of computing. Analog computers took advantage of the fact that the mathematics of small-scale phenomena - wheel positions or electrical voltage and current - is similar to the mathematics of other physical phenomena, such as ballistic trajectories, inertia, resonance, energy transfer, moment of inertia, etc. They modeled these and other physical phenomena with values of electrical voltage and current.

The first electromechanical digital computers

Konrad Zuse's Z-series

In 1936, while working in isolation in Nazi Germany, Konrad Zuse began work on his first Z-series computer, which had memory and (still limited) programmability. Created mainly on a mechanical basis but built on binary logic, the Z1 model, completed in 1938, never worked reliably enough due to insufficient precision in the manufacture of its component parts. Zuse's next machine, the Z3, was completed in 1941. It was built on telephone relays and worked quite satisfactorily. Thus, the Z3 became the first working, program-controlled computer. In many ways, the Z3 was similar to modern machines, pioneering a number of innovations such as floating-point arithmetic. Replacing the difficult-to-implement decimal system with a binary one made Zuse's machines simpler and, therefore, more reliable; this is thought to be one of the reasons that Zuse succeeded where Babbage failed. Programs for the Z3 were stored on perforated film. There were no conditional branches, but in the 1990s the Z3 was theoretically proven to be a general-purpose computer (if physical memory size limitations are ignored). In two 1936 patents, Konrad Zuse mentioned that machine instructions could be stored in the same memory as data - thus anticipating what later became known as the von Neumann architecture and was first implemented only in 1949 by the British EDSAC.
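Since the Z3 computed with two-state relays, a rough sketch of binary addition may help (Python, illustrative only - not Zuse's actual circuitry): two relay-like bits combine into a sum and a carry, and chaining such cells yields a multi-bit adder.

def half_adder(a: int, b: int):
    # a and b are single bits (0 or 1), like the open/closed state of a relay
    s = a ^ b        # sum bit: exclusive OR
    carry = a & b    # carry bit: AND
    return s, carry

def full_adder(a: int, b: int, carry_in: int):
    # chaining two half adders gives a full adder, the building block
    # of a multi-bit binary adder
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

print(full_adder(1, 1, 1))   # (1, 1): 1 + 1 + 1 = 11 in binary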

British "Colossus"

The British Colossus was used to break German codes during World War II. Colossus was the first fully electronic computing device. It used a large number of vacuum tubes, and information was entered from punched tape. Colossus could be configured to perform various Boolean logic operations, but it was not a Turing-complete machine. In addition to the Colossus Mk I, nine more Mk II models were built. The existence of the machine was kept secret until the 1970s; Winston Churchill personally signed an order for it to be destroyed, broken into pieces no larger than a human hand. Because of this secrecy, Colossus is not mentioned in many works on the history of computers.

First generation of von Neumann architecture computers

Memory on ferrite cores; each core stores one bit.

The first working machine with von Neumann architecture was the Manchester "Baby" (Small-Scale Experimental Machine), created at the University of Manchester in 1948; it was followed in 1949 by the Manchester Mark I, which was already a complete system, with Williams tubes and a magnetic drum as memory, as well as index registers. Another contender for the title of "first digital stored-program computer" was EDSAC, designed and constructed at the University of Cambridge. Launched less than a year after "Baby", it could already be used to solve real problems. In fact, EDSAC was created on the basis of the architecture of the EDVAC computer, the successor of ENIAC. Unlike ENIAC, which used parallel processing, EDVAC had a single processing unit. This solution was simpler and more reliable, and it was this design that was implemented first in each successive wave of miniaturization. Many believe that the Manchester Mark I, EDSAC and EDVAC became the "Eves" from which almost all modern computers derive their architecture.

The first universal programmable computer in continental Europe was created by a team of scientists led by Sergei Alekseevich Lebedev from the Kyiv Institute of Electrical Engineering of the USSR, Ukraine. The MESM (Small Electronic Computing Machine) computer went into operation in 1950. It contained about 6,000 vacuum tubes and consumed 15 kW. The machine could perform about 3,000 operations per second. Another machine of the time was the Australian CSIRAC, which carried out its first test program in 1949.

In October 1947, the directors of Lyons & Company, a British company that owned a chain of shops and restaurants, decided to become actively involved in the development of commercial computer development. The LEO I computer went live in 1951 and was the first computer in the world to be regularly used for routine office work.

The Manchester University machine became the prototype for the Ferranti Mark I. The first such machine was delivered to the university in February 1951, and at least nine others were sold between 1951 and 1957.

In June 1951, UNIVAC 1 was installed by the US Census Bureau. The machine was developed by Remington Rand, which eventually sold 46 of the machines for more than $1 million each. UNIVAC was the first mass-produced computer; all its predecessors were produced in a single copy. The computer consisted of 5200 vacuum tubes and consumed 125 kW of energy. Mercury delay lines were used, storing 1000 words of memory, each with 11 decimal digits plus sign (72-bit words). Unlike IBM machines equipped with punched card input, the UNIVAC used 1930s-style metallized magnetic tape input, providing compatibility with some existing commercial storage systems. Other computers of the time used high-speed punched tape input and I/O using more modern magnetic tapes.

The first Soviet serial computer was the Strela, produced from 1953 at the Moscow Factory of Computing and Analytical Machines. The Strela belongs to the class of large universal computers (mainframes) with a three-address instruction system. The computer had a speed of 2,000-3,000 operations per second. Two magnetic tape drives with a capacity of 200,000 words were used as external memory; the RAM capacity was 2,048 cells of 43 bits each. The computer consisted of 6,200 vacuum tubes and 60,000 semiconductor diodes and consumed 150 kW of power.

In 1955, Maurice Wilkes invented microprogramming, a principle that was later widely used in the microprocessors of a wide variety of computers. Microprogramming allows a basic instruction set to be defined or extended by means of built-in programs (called microprograms, or firmware).
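A simplified sketch of the idea (Python, illustrative only; the instruction names and micro-operations below are invented, not Wilkes's actual design): each machine instruction is defined as a stored sequence of elementary micro-operations, so the instruction set can be changed by editing microprograms rather than hardware.

# Micro-operations: the elementary actions the hardware can actually perform.
def micro_op(state, op, a, b):
    if op == "fetch_b":          # tmp <- register b
        state["tmp"] = state[b]
    elif op == "negate_tmp":     # tmp <- -tmp
        state["tmp"] = -state["tmp"]
    elif op == "add_tmp_to_a":   # a <- a + tmp
        state[a] += state["tmp"]

# Each machine instruction is "wired" as a stored sequence of micro-operations;
# extending the instruction set means adding another such sequence.
MICROCODE = {
    "ADD": ["fetch_b", "add_tmp_to_a"],
    "SUB": ["fetch_b", "negate_tmp", "add_tmp_to_a"],
}

def execute(state, opcode, a, b):
    for op in MICROCODE[opcode]:
        micro_op(state, op, a, b)

registers = {"R1": 10, "R2": 3, "tmp": 0}
execute(registers, "SUB", "R1", "R2")
print(registers["R1"])   # 7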

In 1956, IBM sold its first device for storing information on magnetic disks - RAMAC (Random Access Method of Accounting and Control). It used 50 metal disks with a diameter of 24 inches, with 100 tracks on each side. The device stored up to 5 MB of data and cost $10,000 per MB. (In 2006, similar storage devices - hard drives - cost about $0.001 per MB.)
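A back-of-the-envelope check of those figures (Python, using only the numbers quoted above):

capacity_mb = 5
price_per_mb_1956 = 10_000      # dollars per MB, as quoted for RAMAC
price_per_mb_2006 = 0.001       # dollars per MB for hard drives, as quoted

cost_1956 = capacity_mb * price_per_mb_1956
cost_2006 = capacity_mb * price_per_mb_2006
print(cost_1956)                 # 50000 dollars for the whole 5 MB device
print(cost_1956 / cost_2006)     # the price per megabyte fell by a factor of 10 million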

1950s - early 1960s: second generation

The next major step in the history of computer technology was the invention of the transistor in 1947. Transistors replaced fragile and energy-hungry vacuum tubes. Transistorized computers are usually referred to as the "second generation", which dominated the 1950s and early 1960s. Thanks to transistors and printed circuit boards, a significant reduction in size and energy consumption was achieved, as well as increased reliability. For example, the transistor-based IBM 1620, which replaced the tube-based IBM 650, was the size of an office desk. However, second-generation computers were still quite expensive and therefore were used only by universities, governments, and large corporations.

Second-generation computers typically consisted of a large number of printed circuit boards, each containing one to four logic gates or flip-flops. In particular, the IBM Standard Modular System defined the standard for such boards and their connectors. In 1959, IBM released the transistor-based IBM 7090 mainframe and the IBM 1401 mid-range machine. The latter used punched-card input and became the most popular general-purpose computer of the time: in the period 1960-1964 more than 100,000 units of this machine were produced. It used a 4,000-character memory (later increased to 16,000 characters). Many aspects of this design were driven by the desire to replace punched-card machines, which were widely used from the 1920s until the early 1970s. In 1960, IBM released the transistorized IBM 1620, initially a punched-tape-only machine, but soon upgraded to punched cards. The model became popular as a scientific computer, with about 2,000 units produced. The machine used magnetic core memory with a capacity of up to 60,000 decimal digits.

Also in 1960, DEC released its first model, the PDP-1, intended for use by technical personnel in laboratories and for research.

In 1961, Burroughs Corporation released the B5000, the first dual-processor computer with virtual memory. Other unique features were its stack-based architecture, descriptor-based addressing, and the absence of programming directly in assembly language.
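The stack-based idea can be sketched roughly as follows (Python, illustrative only - not the B5000 instruction set): operands are pushed onto a stack and each operation consumes the topmost values, so the program needs no explicit register names.

def run(program):
    stack = []
    for op in program:
        if isinstance(op, (int, float)):
            stack.append(op)              # push a literal operand
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4 in postfix form, as a stack machine would execute it
print(run([2, 3, "ADD", 4, "MUL"]))   # 20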

The first Soviet serial semiconductor computers were “Spring” and “Snow”, produced from 1964 to 1972. The peak performance of the Snow computer was 300,000 operations per second. The machines were made on the basis of transistors with a clock frequency of 5 MHz. A total of 39 computers were produced.

The BESM-6, created in 1966, is considered the best domestic computer of the second generation. The BESM-6 architecture was the first to make extensive use of overlapped instruction execution (up to 14 single-address machine instructions could be at different stages of execution at the same time). Interrupt mechanisms, memory protection and other innovative solutions made it possible to use the BESM-6 in multiprogramming and time-sharing modes. The computer had 128 KB of RAM on ferrite cores and external memory on magnetic drums and tape. The BESM-6 operated at a clock frequency of 10 MHz with a record performance for that time - about 1 million operations per second. A total of 355 computers were produced.
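The overlapping of instruction execution can be illustrated with a toy schedule (Python, illustrative; the three-stage division below is an assumption made for the sketch, not the actual BESM-6 pipeline):

# Each instruction passes through the stages fetch -> decode -> execute.
# With overlapping, while one instruction executes, the next is already
# being decoded and a third is being fetched.
STAGES = ["fetch", "decode", "execute"]

def pipeline_schedule(n_instructions):
    # print, for every clock cycle, which instruction occupies which stage
    total_cycles = n_instructions + len(STAGES) - 1
    for cycle in range(total_cycles):
        busy = []
        for i in range(n_instructions):
            stage = cycle - i
            if 0 <= stage < len(STAGES):
                busy.append("instr%d:%s" % (i, STAGES[stage]))
        print("cycle %d: %s" % (cycle, ", ".join(busy)))

pipeline_schedule(4)   # 4 instructions finish in 6 cycles instead of 12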

1960s onwards: third and subsequent generations

The rapid growth in the use of computers began with the so-called "third generation" of computers. It started with the integrated circuit, invented independently by Nobel Prize winner Jack Kilby and by Robert Noyce. This later led to the invention of the microprocessor by Ted Hoff (Intel). During the 1960s there was some overlap between second- and third-generation technologies: as late as 1975, Sperry Univac was still producing second-generation machines such as the UNIVAC 494.

The advent of microprocessors led to the development of microcomputers, small, inexpensive computers that could be owned by small companies or individuals. Microcomputers, members of the fourth generation, first appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, one of the founders of Apple Computer, became known as the developer of the first mass-produced home computer, and later the first personal computer. Computers based on microcomputer architecture, with capabilities added from their larger cousins, now dominate most market segments.

1970-1990 - fourth generation of computers

It is generally believed that the period from 1970 to 1990 belongs to the fourth generation of computers. There is, however, another opinion: many believe that the achievements of this period are not significant enough for it to count as a full generation. Supporters of this view assign the period to a "generation three and a half", and only from 1985, in their opinion, should the years of the fourth generation proper be counted - a generation that is still with us today.

One way or another, it is obvious that since the mid-1970s there have been fewer and fewer fundamental innovations in computer science. Progress has proceeded mainly along the path of developing and refining what has already been invented, above all through increasing the power and miniaturization of the element base and of the computers themselves.

And, of course, the most important thing is that since the beginning of the 80s, thanks to the advent of personal computers, computing technology has become truly widespread and accessible to the public. A paradoxical situation arises: despite the fact that personal and minicomputers still lag behind large machines in all respects, the lion's share of innovations of the last decade - graphical user interfaces, new peripheral devices, global networks - owe their appearance and development to precisely this “frivolous” technology. Large computers and supercomputers, of course, are by no means extinct and continue to develop. But now they no longer dominate the computer arena as they once did.

The element base of these computers is large-scale integrated circuits (LSI). The machines were intended to dramatically increase labor productivity in science, production, management, healthcare, services and everyday life. A high degree of integration increases the packing density of electronic equipment and improves its reliability, which leads to higher computer performance and lower cost. All this has a significant impact on the logical structure (architecture) of the computer and on its software. The connection between the structure of the machine and its software becomes closer, especially the operating system (or monitor) - the set of programs that organize the continuous operation of the machine without human intervention. This generation includes the ES computers ES-1015, -1025, -1035, -1045, -1055, -1065 ("Row 2"), -1036, -1046, -1066, the SM-1420, -1600, -1700, all personal computers ("Elektronika MS 0501", "Elektronika-85", "Iskra-226", ES-1840, -1841, -1842, etc.), as well as other types and modifications. The fourth generation also includes the Elbrus multiprocessor computing complex. The Elbrus-1KB had a speed of up to 5.5 million floating-point operations per second and a RAM capacity of up to 64 MB. The Elbrus-2 has a performance of up to 120 million operations per second, a RAM capacity of up to 144 MB or 16 Mwords (72-bit words), and a maximum I/O channel throughput of 120 MB/s.

Example: IBM 370-168

Manufactured in 1972. This model was one of the most common. RAM capacity - 8.2 MB. Performance - 7.7 million operations per second.


1990-...to the present day - 5th generation of computers

The transition to fifth-generation computers implied a transition to new architectures aimed at creating artificial intelligence.

It was believed that the fifth generation computer architecture would contain two main blocks. One of them is the computer itself, in which communication with the user is carried out by a unit called the “intelligent interface”. The task of the interface is to understand text written in natural language or speech, and translate the problem statement thus stated into a working program.

Basic requirements for 5th generation computers:
- creation of a developed human-machine interface (speech recognition, image recognition);
- development of logic programming for creating knowledge bases and artificial intelligence systems;
- creation of new technologies in the production of computer equipment;
- creation of new computer architectures and computing systems.

The new technical capabilities of computer technology were expected to expand the range of tasks that could be solved and to make it possible to move on to the task of creating artificial intelligence. One of the components necessary for creating artificial intelligence is knowledge bases (databases) in various areas of science and technology. Creating and using databases requires high-speed computing systems and a large amount of memory. General-purpose computers are capable of performing high-speed calculations, but are not well suited to high-speed comparison and sorting operations on large volumes of records, usually stored on magnetic disks. To create programs that fill, update and query databases, special object-oriented and logic programming languages were created, which provide greater capabilities for this than conventional procedural languages. The structure of these languages calls for a transition from the traditional von Neumann architecture to architectures that take into account the requirements of the tasks of creating artificial intelligence.
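A very rough illustration of the knowledge-base idea in the spirit of logic programming (Python; real fifth-generation projects relied on languages such as Prolog, and the facts below are invented for the example): facts and a rule are stored as data, and an answer is derived by matching rather than by an explicit procedure.

# Facts stored as data: (relation, subject, object)
facts = {
    ("parent", "maria", "ivan"),
    ("parent", "ivan", "olga"),
}

# Rule: X is a grandparent of Z if X is a parent of Y and Y is a parent of Z.
def grandparents(facts):
    derived = set()
    for rel1, x, y in facts:
        for rel2, y2, z in facts:
            if rel1 == rel2 == "parent" and y == y2:
                derived.add(("grandparent", x, z))
    return derived

print(grandparents(facts))   # {('grandparent', 'maria', 'olga')}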

Example: IBM eServer z990

Manufactured in 2003. Physical parameters: weight 2000 kg, power consumption 21 kW, area 2.5 sq. m., height 1.94 m., RAM capacity 256 GB, performance - 9 billion instructions/sec.

The computer created by Mauchly and Eckert - ENIAC - worked a thousand times faster than the Mark 1. But it turned out that most of the time this computer was idle, because to set up the method of calculation (the program) it was necessary to connect the wires in the required way, which took hours or even days, while the calculation itself might then take only a few minutes or even seconds.

To simplify and speed up the process of setting programs, Mauchly and Eckert began to design a new computer that could store the program in its memory. In 1945, the famous mathematician John von Neumann was brought in to work and prepared a report on this computer. The report was sent to many scientists and became widely known because in it von Neumann clearly and simply formulated the general principles of the functioning of computers, that is, universal computing devices. And to this day, the vast majority of computers are made in accordance with the principles that John von Neumann outlined in his report in 1945. The first computer to embody von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.
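The stored-program principle itself can be shown with a toy machine (Python, illustrative only; the opcodes are invented and do not correspond to any historical instruction set): program and data live in the same memory, and a fetch-decode-execute loop works through them.

# Memory holds both the program and the data, as ordinary numbers.
# Instruction format: [opcode, address]; opcodes: 1 = LOAD, 2 = ADD, 3 = STORE, 0 = HALT.
memory = [
    1, 10,      # LOAD  acc <- memory[10]
    2, 11,      # ADD   acc <- acc + memory[11]
    3, 12,      # STORE memory[12] <- acc
    0, 0,       # HALT
    0, 0,       # (unused)
    7,          # memory[10]: data
    35,         # memory[11]: data
    0,          # memory[12]: result
]

pc, acc = 0, 0                                  # program counter and accumulator
while True:
    opcode, addr = memory[pc], memory[pc + 1]   # fetch
    pc += 2
    if opcode == 0:                             # HALT
        break
    elif opcode == 1:                           # LOAD
        acc = memory[addr]
    elif opcode == 2:                           # ADD
        acc += memory[addr]
    elif opcode == 3:                           # STORE
        memory[addr] = acc

print(memory[12])   # 42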

The development of the first electronic serial machine, UNIVAC (Universal Automatic Computer), was begun around 1947 by Eckert and Mauchly, who founded the Eckert-Mauchly company in December of the same year. The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. The internal storage device, with a capacity of 1,000 twelve-digit decimal numbers, was implemented on 100 mercury delay lines.

Soon after the UNIVAC-1 machine was put into operation, its developers came up with the idea of automatic programming. It boiled down to ensuring that the machine itself could prepare the sequence of commands needed to solve a given problem.

A strong limiting factor in the work of computer designers in the early 1950s was the lack of high-speed memory. According to one of the pioneers of computing, J. P. Eckert, "the architecture of a machine is determined by memory." Researchers focused their efforts on the memory properties of ferrite rings strung on wire matrices.

In 1951, J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic core memory. It consisted of 2 cubes 32 x 32 x 17 with cores that provided storage of 2048 words for 16-bit binary numbers with one parity bit.
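Those figures are internally consistent, as a two-line check shows (Python, simple arithmetic on the numbers quoted above):

cores = 2 * (32 * 32 * 17)          # two core "cubes" of 32 x 32 x 17
bits = 2048 * (16 + 1)              # 2048 words of 16 bits plus one parity bit each
print(cores, bits, cores == bits)   # 34816 34816 True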

Soon, IBM became involved in the development of electronic computers. In 1952, it released its first industrial electronic computer, the IBM 701, which was a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 germanium diodes. An improved version of the IBM 704 machine was distinguished by its high speed, it used index registers and represented data in floating point form.

IBM 704
After the IBM 704, the IBM 709 was released, which in architectural terms was close to the machines of the second and third generations. In this machine, indirect addressing and I/O channels appeared for the first time.

In 1956, IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory - disk storage devices, whose importance was fully appreciated in the subsequent decades of the development of computer technology. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a stack of 50 magnetically coated metal disks that rotated at a speed of 1,200 rpm. The surface of each disk contained 100 tracks for recording data, each holding 10,000 characters.

Following the first production computer UNIVAC-1, Remington-Rand in 1952 released the UNIVAC-1103 computer, which worked 50 times faster. Later, software interrupts were used for the first time in the UNIVAC-1103 computer.

Remington-Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly). In addition, it is necessary to mention Grace Hopper, a US Navy officer and leader of a programming team, then a captain (later a rear admiral), who developed the first compiler program. Incidentally, the term "compiler" was first introduced by G. Hopper in 1951. This compiling program translated into machine language an entire program written in an algebraic form convenient for humans. G. Hopper is also the author of the term "bug" as applied to computers. Once, a moth (in English, a "bug") flew into the laboratory through an open window and, landing on the contacts, short-circuited them, causing a serious malfunction of the machine. The burnt insect was glued into the logbook where various malfunctions were recorded. This is how the first computer "bug" was documented.
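A toy example of what such a compiling program does (Python, illustrative only - not Short Code and not Hopper's compiler): an expression written in a convenient algebraic form is translated, before execution, into a flat list of machine-like single-address commands.

import re

# "Compile" a chain of additions and subtractions, e.g. "a + b - c",
# into a sequence of commands for an accumulator machine.
def compile_expression(expr):
    tokens = re.findall(r"[A-Za-z]\w*|[+-]", expr)
    program = [("LOAD", tokens[0])]
    for op, var in zip(tokens[1::2], tokens[2::2]):
        program.append(("ADD" if op == "+" else "SUB", var))
    return program

for cmd in compile_expression("a + b - c"):
    print(cmd)
# ('LOAD', 'a'), ('ADD', 'b'), ('SUB', 'c')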

IBM took its first steps in the field of programming automation by creating the Speedcoding system for the IBM 701 machine in 1953. In the USSR, A. A. Lyapunov proposed one of the first programming languages. In 1957, a group led by J. Backus completed work on what later became the first popular high-level programming language, FORTRAN. The language, implemented for the first time on the IBM 704 computer, contributed to expanding the scope of computer applications.

Alexey Andreevich Lyapunov
In Great Britain in July 1951, at a conference at the University of Manchester, M. Wilkes presented the paper "The Best Way to Design an Automatic Calculating Machine", which became a pioneering work on the fundamentals of microprogramming. The design method for control devices that he proposed found wide application.

M. Wilkes realized his idea of microprogramming in 1957 when creating the EDSAC-2 machine. In 1951, M. Wilkes, together with D. Wheeler and S. Gill, wrote the first programming textbook, "The Preparation of Programs for an Electronic Digital Computer".

In 1956, Ferranti released the Pegasus computer, which for the first time implemented the concept of general-purpose registers (GPRs). With the advent of GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

The advent of personal computers

Microprocessors were first used in a variety of specialized devices, such as calculators. But in 1974, several companies announced the creation of a personal computer based on the Intel-8008 microprocessor, that is, a device that performs the same functions as a large computer, but is designed for one user. At the beginning of 1975, the first commercially distributed personal computer, Altair-8800, based on the Intel-8080 microprocessor, appeared. This computer sold for about $500. And although its capabilities were very limited (RAM was only 256 bytes, there was no keyboard and screen), its appearance was greeted with great enthusiasm: several thousand sets of the machine were sold in the first months. Buyers supplied this computer with additional devices: a monitor for displaying information, a keyboard, memory expansion units, etc. Soon these devices began to be produced by other companies. At the end of 1975, Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to easily communicate with the computer and easily write programs for it. This also contributed to the rise in popularity of personal computers.

The success of Altair-8800 forced many companies to also start producing personal computers. Personal computers began to be sold fully equipped, with a keyboard and monitor; the demand for them amounted to tens and then hundreds of thousands of units per year. Several magazines dedicated to personal computers appeared. The growth in sales was greatly facilitated by numerous useful programs of practical importance. Commercially distributed programs also appeared, for example the text editing program WordStar and the spreadsheet processor VisiCalc (1978 and 1979, respectively). These and many other programs made the purchase of personal computers very profitable for business: with their help, it became possible to perform accounting calculations, draw up documents, etc. Using large computers for these purposes was too expensive.

In the late 1970s, the spread of personal computers even led to a slight decline in demand for large computers and minicomputers. This became a matter of serious concern for IBM, the leading manufacturer of large computers, and in 1979 IBM decided to try its hand at the personal computer market. However, the company's management underestimated the future importance of this market and viewed the creation of a personal computer as just a minor experiment - something like one of dozens of projects carried out at the company to create new equipment. In order not to spend too much money on this experiment, the company's management gave the unit responsible for the project a degree of freedom unprecedented in the company. In particular, it was allowed not to design the personal computer from scratch, but to use components made by other companies. And this unit took full advantage of the chance it was given.

The then latest 16-bit microprocessor Intel-8088 was chosen as the main microprocessor of the computer. Its use made it possible to significantly increase the potential capabilities of the computer, since the new microprocessor allowed working with 1 megabyte of memory, and all computers available at that time were limited to 64 kilobytes.
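The memory limits mentioned here follow directly from the address width (Python, simple arithmetic; the 20-bit address bus of the 8088 and the 16-bit addressing of earlier machines are background facts assumed for the illustration, not taken from the text above):

# Addressable memory = 2 ** (number of address bits)
print(2 ** 16)   # 65536 bytes  = 64 KB (typical limit of earlier 8-bit machines)
print(2 ** 20)   # 1048576 bytes = 1 MB (Intel 8088 with its 20-bit address bus)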

In August 1981, a new computer called the IBM PC was officially introduced to the public, and soon after it gained great popularity among users. A couple of years later, the IBM PC took a leading position in the market, displacing 8-bit computer models.

IBM PC
The secret of the popularity of the IBM PC is that IBM did not make its computer a single, one-piece device and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and did not keep the specifications of those parts, or the way they were connected, a secret. On the contrary, the design principles of the IBM PC were available to everyone. This approach, called the open architecture principle, made the IBM PC a stunning success, although it prevented IBM from keeping the benefits of that success to itself. Here is how the openness of the IBM PC architecture influenced the development of personal computers.

The promise and popularity of the IBM PC made the production of various components and add-on devices for it very attractive. Competition between manufacturers led to cheaper components and devices. Very soon, many companies were no longer content with the role of manufacturers of components for the IBM PC and began to assemble their own computers compatible with it. Since these companies did not need to bear IBM's huge costs for research and for maintaining the structure of a huge company, they were able to sell their computers much cheaper (sometimes two to three times cheaper) than comparable IBM computers.

Computers compatible with the IBM PC were initially contemptuously called “clones,” but this nickname did not catch on, as many manufacturers of IBM PC-compatible computers began to implement technical advances faster than IBM itself. Users were able to independently upgrade their computers and equip them with additional devices from hundreds of different manufacturers.

Personal computers of the future

The basis of computers of the future will not be silicon transistors, where information is transmitted by electrons, but optical systems. The information carrier will be photons, since they are lighter and faster than electrons. As a result, the computer will become cheaper and more compact. But the most important thing is that optoelectronic computing is much faster than what is used today, so the computer will be much more powerful.

The PC will be small in size and have the power of modern supercomputers. The PC will become a repository of information covering all aspects of our daily lives, it will not be tied to electrical networks. This PC will be protected from thieves thanks to a biometric scanner that will recognize its owner by fingerprint.

The main way to communicate with the computer will be voice. The desktop computer will turn into a “candy bar”, or rather, into a giant computer screen - an interactive photonic display. There is no need for a keyboard, since all actions can be performed with the touch of a finger. But for those who prefer a keyboard, a virtual keyboard can be created on the screen at any time and removed when it is no longer needed.

The computer will become the operating system of the house, and the house will begin to respond to the owner’s needs, will know his preferences (make coffee at 7 o’clock, play his favorite music, record the desired TV show, adjust temperature and humidity, etc.)

Screen size will not play any role in the computers of the future. It can be as big as your desktop, or small. Larger versions of computer screens will be based on photonically excited liquid crystals, which will have much lower power consumption than today's LCD monitors. Colors will be vibrant and images accurate (plasma displays are also possible). In fact, today's concept of "resolution" will largely lose its meaning.