History of computing technology: computing devices from antiquity to the present day

As soon as people discovered the concept of "quantity", they began to select tools that would optimize and simplify counting. Today, ultra-powerful computers, based on the principles of mathematical calculation, process, store, and transmit information - the most important resource and engine of human progress. It is not difficult to get an idea of how the development of computer technology took place by briefly considering the main stages of this process.

The main stages of the development of computer technology

The most common classification identifies the main stages of the development of computer technology on a chronological basis:

  • Manual stage. It began at the dawn of the human era and continued until the middle of the 17th century. During this period, the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus, counting frames, and later the slide rule) that made digit-by-digit calculation possible.
  • Mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of development of science during this period made it possible to create mechanical devices that perform basic arithmetic operations and automatically remember the highest digits.
  • The electromechanical stage is the shortest of all the stages that make up the history of the development of computer technology: it lasted only about 60 years, from the invention of the first tabulator in 1887 until 1946, when the first electronic computer (ENIAC) appeared. New machines, whose operation was based on electric drives and electric relays, made it possible to perform calculations with much greater speed and accuracy, but the counting process still had to be controlled by a person.
  • The electronic stage began in the second half of the last century and continues today. This is the story of six generations of electronic computers - from the very first giant units based on vacuum tubes to the ultra-powerful modern supercomputers with huge numbers of processors working in parallel, capable of executing many instructions simultaneously.

The division of these stages by chronology is rather arbitrary: while some types of computers were still in use, the prerequisites for the emergence of the next were already being created.

The very first counting devices

The earliest counting tool known to the history of computer technology is the ten fingers of the human hands. Counting results were initially recorded using fingers, notches on wood and stone, special tally sticks, and knots on ropes.

With the advent of writing, various ways of writing numbers appeared and developed, and positional number systems were invented (decimal in India, sexagesimal in Babylon).

Around the 4th century BC, the ancient Greeks began to count using an abacus. Initially, it was a flat clay tablet with stripes drawn on it with a sharp object. Counting was carried out by placing small stones or other small objects on these stripes in a certain order.

In China, in the 4th century AD, a seven-bead abacus appeared - the suanpan. Wires or ropes, nine or more, were stretched across a rectangular wooden frame. Another wire (rope), stretched perpendicular to the others, divided the suanpan into two unequal parts. In the larger compartment, called "earth", five beads were strung on each wire; in the smaller compartment, called "sky", there were two. Each of the wires corresponded to a decimal place.

The traditional soroban abacus became popular in Japan in the 16th century, having arrived there from China. At about the same time, counting frames (schoty) appeared in Russia.

In the 17th century, building on the logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmund Gunter invented the slide rule. This device was constantly improved and has survived to this day. It allows one to multiply and divide numbers, raise them to powers, and determine logarithms and trigonometric functions.

The slide rule became a device that completed the development of computer technology at the manual (pre-mechanical) stage.

The first mechanical calculating devices

In 1623, the German scientist Wilhelm Schickard created the first mechanical "calculator", which he called a counting clock. The mechanism of this device resembled an ordinary clock, consisting of gears and sprockets. However, this invention became known only in the middle of the last century.

A quantum leap in the field of computing technology was the invention of the Pascalina adding machine in 1642. Its creator, French mathematician Blaise Pascal, began work on this device when he was not even 20 years old. "Pascalina" was a mechanical device in the form of a box with a large number of interconnected gears. The numbers that needed to be added were entered into the machine by turning special wheels.

In 1673, the Saxon mathematician and philosopher Gottfried von Leibniz invented a machine that performed the four basic arithmetic operations and could extract square roots. Its mechanism was based on stepped drums (Leibniz wheels); Leibniz also developed the binary number system, which much later became the foundation of digital computers.

In 1820, the Frenchman Charles (Karl) Xavier Thomas de Colmar, taking Leibniz's ideas as a basis, produced an adding machine that could also multiply and divide. Two years later, the Englishman Charles Babbage began constructing a machine that would be capable of performing calculations with an accuracy of up to 20 decimal places. This project remained unfinished, but in the 1830s its author developed another - the Analytical Engine, for performing accurate scientific and technical calculations. The machine was to be controlled by a program, and punched cards with different arrangements of holes were to be used to input and output information. Babbage's project anticipated electronic computing technology and the problems that could be solved with its help.

It is noteworthy that the fame of the world's first programmer belongs to a woman - Lady Ada Lovelace (née Byron). It was she who created the first programs for Babbage's machine. One of the programming languages, Ada, was subsequently named after her.

Development of the first computer analogues

In 1887, the history of the development of computer technology entered a new stage. The American engineer Herman Hollerith managed to design the first electromechanical calculating machine - the tabulator. Its mechanism used relays, as well as counters and a special sorting box. The device read and sorted statistical records made on punched cards. Subsequently, the company founded by Hollerith became the backbone of the world-famous computer giant IBM.

In 1930, the American Vannevar Bush created the differential analyzer. Powered by electricity, this analog machine was capable of quickly finding solutions to complex mathematical problems.

Six years later, the English scientist Alan Turing developed the concept of an abstract machine that became the theoretical basis for today's computers. It had all the main properties of modern computing technology: it could perform, step by step, operations programmed in its internal memory.

A year after this, George Stibitz, a scientist from the United States, invented his country's first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra - the mathematical logic created in the mid-19th century by George Boole, using the logical operators AND, OR, and NOT. Later, the binary adder would become an integral part of the digital computer.
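
The principle is easy to illustrate. Below is a minimal sketch (in Python, and of course not Stibitz's actual relay circuitry) of how one binary digit of addition can be built from the operators AND, OR, and NOT alone:

    # A one-bit binary adder built only from AND, OR and NOT,
    # in the spirit of Stibitz's relay experiments (illustrative sketch only).

    def xor(a, b):
        # XOR expressed through AND, OR and NOT: (a OR b) AND NOT (a AND b)
        return (a or b) and not (a and b)

    def full_adder(a, b, carry_in):
        # Adds two bits plus an incoming carry; returns (sum bit, outgoing carry).
        s = xor(xor(a, b), carry_in)
        carry_out = (a and b) or (carry_in and xor(a, b))
        return s, carry_out

    # 1 + 1 with no incoming carry: sum 0, carry 1 (binary 10, i.e. decimal 2)
    print(full_adder(True, True, False))  # (False, True)

Chaining such one-bit adders digit by digit gives an adder for numbers of any width, which is exactly why the binary adder became a basic building block of the digital computer.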

In 1938, Claude Shannon, then at the Massachusetts Institute of Technology, outlined the principles of the logical design of a computer that uses electrical circuits to solve Boolean algebra problems.

The beginning of the computer era

The governments of the countries involved in World War II were aware of the strategic role of computing in the conduct of military operations. This was the impetus for the development and parallel emergence of the first generation of computers in these countries.

A pioneer in the field of computer engineering was Konrad Zuse, a German engineer. In 1941, he created the first computer controlled by a program. The machine, called the Z3, was built on telephone relays, and programs for it were encoded on perforated tape. This device was able to work in the binary system, as well as operate with floating point numbers.

The next model of Zuse's machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

In 1942, the American researchers John Atanasoff and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

In 1943, in a British government laboratory, in an atmosphere of secrecy, a computer called "Colossus" was built. Instead of electromechanical relays, it used about 2,000 vacuum tubes for storing and processing information. It was intended to crack and decrypt secret messages encrypted by the German Lorenz cipher machine, used by the Wehrmacht high command. The existence of this device was kept in the strictest confidence for a long time. After the end of the war, the order for its destruction was signed personally by Winston Churchill.

Architecture development

In 1945, the Hungarian-American mathematician John (János Lajos) von Neumann described the prototype of the architecture of modern computers. He proposed writing a program in the form of code directly into the machine's memory, implying joint storage of programs and data in the computer's memory.

Von Neumann formulated these ideas while consulting on ENIAC, the first universal electronic computer, then being completed in the United States. This giant weighed about 30 tons and occupied 170 square meters of floor space. The machine used 18,000 vacuum tubes and could perform 300 multiplications or 5,000 additions in one second.

Continental Europe's first universal programmable computer was created in 1950 in the Soviet Union (in Ukraine). A group of Kyiv scientists led by Sergei Alekseevich Lebedev designed the small electronic calculating machine MESM. Its speed was 50 operations per second, and it contained about 6,000 vacuum tubes.

In 1952, Soviet computer technology was augmented by the BESM, a large electronic calculating machine, also developed under Lebedev's leadership. This computer, which performed up to 10,000 operations per second, was at that time the fastest in Europe. Information was entered into the machine's memory using punched paper tape, and results were output by photographic printing.

During the same period, a series of large computers under the general name "Strela" was produced in the USSR (the author of the design was Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal computer "Ural" began in Penza under the leadership of Bashir Rameev. Later models were hardware- and software-compatible with each other, and a wide selection of peripheral devices allowed machines of various configurations to be assembled.

Transistors. Release of the first serial computers

However, vacuum tubes failed very quickly, making the machines very difficult to work with. The transistor, invented in 1947, solved this problem. Using the electrical properties of semiconductors, it performed the same tasks as vacuum tubes, but occupied much less space and consumed far less energy. Along with the advent of ferrite cores for organizing computer memory, the use of transistors made it possible to significantly reduce the size of machines and make them even more reliable and faster.

In 1954, the American company Texas Instruments began mass-producing transistors, and two years later the first second-generation computer built on transistors, the TX-0, appeared in Massachusetts.

In the middle of the last century, a significant part of government organizations and large companies used computers for scientific, financial, engineering calculations, and working with large amounts of data. Gradually, computers acquired features familiar to us today. During this period, plotters, printers, and storage media on magnetic disks and tape appeared.

The active use of computer technology led to an expansion of its areas of application and required the creation of new software technologies. High-level programming languages appeared that made it possible to transfer programs from one machine to another and simplified the process of writing code (Fortran, Cobol, and others). Special translator programs appeared to convert code from these languages into commands the machine could process directly.

The emergence of integrated circuits

In 1958-1960, thanks to the American engineers Robert Noyce and Jack Kilby, the world learned of the existence of integrated circuits. Miniature transistors and other components, sometimes hundreds or thousands of them, were mounted on a silicon or germanium crystal. The chips, just over a centimeter in size, were much faster than discrete transistor circuits and consumed much less power. The history of the development of computer technology links their appearance with the emergence of the third generation of computers.

In 1964, IBM released the first computers of the System/360 family, which were based on integrated circuits. The mass production of computers can be dated from this time. In total, more than 20,000 units of this computer family were produced.

In 1972, the ES (Unified Series) computers were developed in the USSR. These were standardized complexes for computer centers with a common command system. The American IBM System/360 was taken as the basis.

Earlier, in 1965, DEC had released the PDP-8 minicomputer, the first commercially successful project in this area. The relatively low cost of minicomputers made it possible for small organizations to use them.

During the same period, software was constantly improved. Operating systems were developed to support the maximum number of external devices, and new programs appeared. In 1964, BASIC was developed, a language designed specifically for teaching novice programmers. In 1970, Pascal appeared, which turned out to be very convenient for solving many applied problems.

Personal computers

After 1970, production of the fourth generation of computers began. The development of computer technology at this time is characterized by the introduction of large-scale integrated circuits into computer production. Such machines could now perform hundreds of millions of computational operations per second, and their RAM capacity grew to 500 million bits. A significant reduction in the cost of microcomputers meant that the opportunity to buy one gradually became available to the average person.

Apple was one of the first manufacturers of personal computers. Its creators, Steve Jobs and Steve Wozniak, designed their first PC model in 1976, giving it the name Apple I. It cost $666.66. A year later, the company's next model, the Apple II, was presented.

For the first time, the computer became similar to a household appliance: in addition to its compact size, it had an elegant design and a user-friendly interface. The proliferation of personal computers at the end of the 1970s caused demand for mainframe computers to fall markedly. This seriously worried their leading manufacturer, IBM, which decided to enter the personal computer market itself.

In 1981, the company's first microcomputer with an open architecture appeared, based on the 16-bit 8088 microprocessor manufactured by Intel. The computer was equipped with a monochrome display, two drives for 5.25-inch floppy disks, and 64 kilobytes of RAM. At IBM's request, Microsoft developed the operating system (MS-DOS) for this machine. Numerous IBM PC clones appeared on the market, which stimulated the growth of industrial production of personal computers.

In 1984, Apple developed and released a new computer - the Macintosh. Its operating system was extremely user-friendly: it presented commands in the form of graphic images and allowed them to be entered using a mouse. This made the computer even more accessible, since now no special skills were required from the user.

Some sources date the fifth generation of computing technology to 1992-2013. Briefly, its main concept is formulated as follows: these are computers created on the basis of highly complex microprocessors with a parallel-vector structure, which makes it possible to execute dozens of program instructions simultaneously. Machines with several hundred processors working in parallel make it possible to process data even more accurately and quickly, as well as to create efficient networks.

The development of modern computer technology already allows us to talk about sixth generation computers. These are electronic and optoelectronic computers running on tens of thousands of microprocessors, characterized by massive parallelism and modeling the architecture of neural biological systems, which allows them to successfully recognize complex images.

Having traced all the stages of the development of computer technology in sequence, we can note an interesting fact: inventions that proved themselves well at each stage have survived to this day and continue to be used successfully.

Classes of computers

There are various options for classifying computers.

By purpose, computers are divided into:

  • universal computers - those capable of solving a wide variety of mathematical, economic, engineering, scientific, and other problems;
  • problem-oriented computers - those that solve problems of a narrower range, associated, as a rule, with the management of particular processes (data recording, the accumulation and processing of small amounts of information, performing calculations according to simple algorithms). They have more limited software and hardware resources than the first group;
  • specialized computers - those that solve strictly defined tasks. They have a highly specialized structure and, with relatively simple hardware and control, are quite reliable and productive in their field. These include, for example, controllers and adapters that manage various devices, as well as programmable microprocessors.

By size and computing power, modern electronic computing equipment is divided into:

  • ultra-large (supercomputers);
  • large computers;
  • small computers;
  • ultra-small (microcomputers).

Thus, we have seen that the devices invented by man - first to keep track of resources and valuables, and then to carry out complex calculations and computational operations quickly and accurately - have constantly developed and improved.

Municipal educational institution secondary school No. 3 of Karasuk district

Subject: History of the development of computer technology.

Compiled by:

Student MOUSOSH No. 3

Kochetov Egor Pavlovich

Supervisor and consultant:

Serdyukov Valentin Ivanovich,

computer science teacher MOUSOSH No. 3

Karasuk 2008

Relevance

Introduction

First steps in the development of counting devices

17th century calculating devices

18th century calculating devices

19th century calculating devices

Development of computing technology at the beginning of the 20th century

The emergence and development of computer technology in the 40s of the 20th century

Development of computer technology in the 50s of the 20th century

Development of computer technology in the 60s of the 20th century

Development of computer technology in the 70s of the 20th century

Development of computer technology in the 80s of the 20th century

Development of computer technology in the 90s of the 20th century

The role of computer technology in human life

My research

Conclusion

Bibliography

Relevance

Mathematics and computer science are used in all areas of the modern information society. Modern production, the computerization of society, and the introduction of modern information technologies require mathematical and informational literacy and competence. However, school courses in computer science and ICT today often offer a one-sided approach that does not allow students to raise their level of knowledge properly, for lack of the mathematical logic needed for complete mastery of the material. In addition, the failure to stimulate students' creative potential has a negative impact on motivation to learn and, as a result, on the final level of skills, knowledge, and abilities. And how can one study a subject without knowing its history? This material can be used in history, mathematics, and computer science lessons.


Introduction

People learned to count using their own fingers. When this was no longer enough, the simplest counting devices appeared. A special place among them was occupied by the abacus, which became widespread in the ancient world. Then, over years of human development, the first electronic computing machines appeared. They not only accelerated computing work, but also gave people an impetus to create new technologies. The word "computer" means "calculator", i.e., a computing device. The need to automate data processing, including calculations, arose a long time ago.

Nowadays it is difficult to imagine doing without computers. But not so long ago, until the early 70s, computers were available to a very limited circle of specialists, and their use, as a rule, remained shrouded in secrecy and little known to the general public. However, in 1971, an event occurred that radically changed the situation and, with fantastic speed, turned the computer into an everyday working tool for tens of millions of people. In that undoubtedly significant year, the then almost unknown company Intel, from a small American town with the beautiful name of Santa Clara (California), released the first microprocessor. It is to this chip that we owe the emergence of a new class of computing systems - personal computers, which are now used by essentially everyone, from primary school students and accountants to scientists and engineers.

At the end of the 20th century it is impossible to imagine life without a personal computer. The computer has firmly entered our lives, becoming man's main assistant. Today the world has many computers from different companies, of different complexity, purposes, and generations. In this essay we will look at the history of the development of computer technology, give a brief overview of the capabilities of modern computing systems, and consider further trends in the development of personal computers.

First steps in the development of counting devices

The history of counting devices goes back many centuries. The oldest calculating instrument that nature itself placed at man’s disposal was his own hand. To make counting easier, people began to use the fingers of first one hand, then both, and in some tribes, their toes. In the 16th century, finger counting techniques were described in textbooks.

The next step in the development of counting was the use of pebbles or other objects, and, for memorizing numbers, notches on animal bones and knots on ropes. The so-called "Vestonice bone" with notches, discovered in excavations, allows historians to assume that our ancestors were familiar with the rudiments of counting as early as 30 thousand years BC.


The early development of written counting was hampered by the complexity of arithmetic operations, especially multiplication, in the number notations that existed at that time. In addition, few people knew how to write, and there were no cheap writing materials: parchment began to be produced only around the 2nd century BC, papyrus was too expensive, and clay tablets were inconvenient to use.

These circumstances explain the appearance of a special calculating device - the abacus. By the 5th century BC, the abacus had become widespread in Egypt, Greece, and Rome. It was a board with grooves in which, according to the positional principle, small objects - pebbles or bones - were placed.


An abacus-like instrument was known among all nations. The ancient Greek abacus (the "Salamis board", named after the island of Salamis in the Aegean Sea) was a plank sprinkled with sea sand. Grooves were drawn in the sand, and numbers were marked on them with pebbles. One groove corresponded to units, the next to tens, and so on. If more than 10 pebbles collected in any groove during counting, they were removed and one pebble was added to the next groove.

The Romans improved the abacus, moving from wooden boards, sand, and pebbles to marble boards with chiseled grooves and marble balls. Later, around 500 AD, the abacus evolved into the counting frame: a device consisting of a set of beads strung on rods. The Chinese version, the suanpan, consisted of a wooden frame divided into upper and lower sections. The rods correspond to decimal places, and the beads to digits. For the Chinese, counting was based not on tens, but on fives.


It is divided into two parts: in the lower part there are five beads on each rod, in the upper part, two. Thus, to set the number 6 on such an abacus, one first moved a bead corresponding to five, and then added one bead in the units place.


The Japanese called the same counting device the soroban.


In Rus', for a long time, people counted using bones laid out in piles. Around the 15th century, the "plank abacus" became widespread; it differed little from the familiar Russian counting frame and consisted of a frame with fixed horizontal ropes on which drilled plum or cherry pits were strung.


Around the 6th century AD, in India, highly advanced ways of writing numbers and rules for performing arithmetic operations, now called the decimal number system, took shape. When writing a number in which some digit was missing (for example, 101 or 1204), the Indians said the word "empty" instead of the name of the number. When writing, a dot was placed in place of the "empty" digit, and later a small circle was drawn. Such a circle was called "sunya", which in Hindi meant "empty place". Arab mathematicians translated this word into their own language as "sifr". Our modern word "zero" was born relatively recently - later than "digit" - and ultimately derives, through Latin, from the same Arabic "sifr".

Around 850 AD, the Arab mathematician Muhammad ibn Musa al-Khwarizmi (from Khwarizm, on the Amu Darya River) wrote a book on the general rules for solving arithmetic problems using equations. It was called "Kitab al-Jabr", and it gave its name to the science of algebra. Another book by al-Khwarizmi, in which he described Indian arithmetic in detail, played a very important role: three hundred years later (in 1120) it was translated into Latin, and it became the first textbook of "Indian" (that is, our modern) arithmetic in Europe.
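
The advantage of positional notation with a zero is easy to see in a short worked example: the position of a digit determines its weight, and zero holds the empty places in the record:

    1204 = 1·1000 + 2·100 + 0·10 + 4·1

Without a zero to mark the empty tens place, the record 1204 could not be distinguished from 124.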


We also owe the term "algorithm" to the name of Muhammad ibn Musa al-Khwarizmi.

At the end of the 15th century, Leonardo da Vinci (1452-1519) sketched a 13-digit adding device with ten-toothed wheels. But da Vinci's manuscripts were discovered only in 1967, so the history of mechanical calculating devices is usually counted from Pascal's adding machine. Based on da Vinci's drawings, an American computer manufacturer has since built a working replica of the device for advertising purposes.

17th century calculating devices


In 1614, the Scottish mathematician John Napier (1550-1617) invented logarithms and published the first tables of them. Their principle is that to each number there corresponds a special number, its logarithm: the exponent to which a fixed base must be raised to obtain the given number. Any positive number can be expressed this way. Logarithms make division and multiplication very simple: to multiply two numbers, it is enough to add their logarithms. Thanks to this property, the complex operation of multiplication reduces to the simple operation of addition. To simplify the work, tables of logarithms were compiled; later, in effect, they were built into a device that could significantly speed up calculation - the slide rule.
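
A small worked example shows the trick. Since log(a·b) = log a + log b, to multiply 8 by 16 using base-2 logarithms it is enough to compute:

    log2 8 = 3,   log2 16 = 4,   3 + 4 = 7,   2^7 = 128 = 8 · 16

Multiplication is thus replaced by one addition and a few table lookups; the slide rule performs this addition mechanically, by laying two logarithmic scales side by side.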


In 1617, Napier proposed another (non-logarithmic) method of multiplying numbers. The instrument, called Napier's bones (or rods), consisted of thin plates or blocks. Each side of a block carries numbers forming an arithmetic progression.


Manipulating the blocks makes it possible to multiply and divide large numbers, and also to extract square and cube roots.


Wilhelm Schickard

In 1623, Wilhelm Schickard, an orientalist and mathematician, professor at the University of Tübingen, described in letters to his friend Johannes Kepler the design of a "counting clock": a calculating machine with a device for setting numbers and rollers with a slider and a window for reading the result. This machine could only add and subtract (some sources say that it could also multiply and divide). It was the first mechanical calculating machine. In our time, a model of it has been built from his description.

Blaise Pascal


In 1642, the French mathematician Blaise Pascal (1623-1662) designed a calculating device to make the work of his father, a tax inspector, easier. This device made it possible to add decimal numbers. Externally, it looked like a box with numerous gears.


The basis of the adding machine was the counter-recorder, or counting gear. It had ten protrusions, each of which had numbers written on it. To transmit tens, there was one elongated tooth on the gear, which engaged and turned the intermediate gear, which transmitted rotation to the tens gear. An additional gear was needed to ensure that both counting gears - ones and tens - rotated in the same direction. The counting gear was connected to the lever using a ratchet mechanism (transmitting forward movement and not transmitting reverse movement). Deflection of the lever to one angle or another made it possible to enter single-digit numbers into the counter and sum them up. In Pascal's machine, a ratchet drive was attached to all the counting gears, which made it possible to add multi-digit numbers.
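
The carry mechanism described above is easy to model in a few lines of code. A minimal sketch (an illustrative model of the digit wheels, not a description of Pascal's actual gearing):

    # Each wheel holds a digit 0-9; when a wheel passes 9 during addition,
    # the elongated tooth advances the next wheel by one (the carry).

    def pascaline_add(wheels, position, amount):
        # Add a single-digit `amount` to the wheel at `position` (0 = units).
        wheels = list(wheels)
        wheels[position] += amount
        while position < len(wheels) and wheels[position] > 9:
            wheels[position] -= 10          # the wheel completes a full turn...
            position += 1
            if position < len(wheels):
                wheels[position] += 1       # ...carrying one into the next digit
        return wheels

    # Represent 97 as wheels [7, 9, 0] (units, tens, hundreds) and add 5:
    print(pascaline_add([7, 9, 0], 0, 5))   # [2, 0, 1], which reads as 102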

In 1642, the Englishman Robert Bissaker, and in 1657, independently, S. Partridge, developed a rectangular slide rule, the design of which has largely survived to this day.


In 1673, the German philosopher, mathematician, and physicist Gottfried Wilhelm Leibniz (1646-1716) created the "stepped reckoner": a calculating machine that allowed one to add, subtract, multiply, divide, and extract square roots. It was built around the stepped drum that came to bear his name; the binary number system, which Leibniz studied separately, would become central to computing much later.

It was a more advanced device that used a moving part (a prototype of the carriage) and a handle with which the operator rotated a wheel. Leibniz's product suffered the sad fate of its predecessors: if anyone used it, it was only Leibniz's family and friends of his family, since the time of mass demand for such mechanisms had not yet come.

The machine was the prototype of the arithmometer, in use from 1820 until the 1960s.

18th century calculating devices


In 1700, Charles Perrault published "A Collection of a Large Number of Machines of Claude Perrault's Own Invention", which, among the inventions of Claude Perrault (Charles Perrault's brother), includes an adding machine in which gear racks are used instead of gears. The machine was called the "Rhabdological Abacus": the ancients used "abacus" for a small board on which numbers are written, and "rhabdology" for the science of performing arithmetic operations using small sticks with numbers.


In 1703, Gottfried Wilhelm Leibniz wrote the treatise "Explication de l'Arithmétique Binaire", on the use of the binary number system in calculating machines. His first works on binary arithmetic date back to 1679.
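
A short worked example of the system Leibniz described: in binary, each position is a power of two, so 1101 in binary equals 1·8 + 1·4 + 0·2 + 1·1 = 13, and all of addition rests on three rules: 0+0=0, 0+1=1, 1+1=10 (write 0, carry 1). Thus 101 + 11 in binary gives 1000, that is, 5 + 3 = 8.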

The German mathematician, physicist, and astronomer Christian Ludwig Gersten, a member of the Royal Society of London, invented an arithmetic machine in 1723 and built it two years later. The Gersten machine is remarkable in that it was the first to use a device for counting the quotient and the number of successive addition operations required when multiplying numbers; it also made it possible to check that the second addend had been entered correctly, which reduced the likelihood of subjective error caused by operator fatigue.

In 1727, Jacob Leupold created a calculating machine that used the Leibniz machine principle.

In the report of the commission of the Paris Academy of Sciences, published in 1751 in the Journal of Scientists, there are remarkable lines: "The results of Mr. Pereira's method that we have seen are quite sufficient to once again confirm the opinion ... that this method of teaching the deaf-mute is extremely practical, and that the person who used it with such success is worthy of praise and encouragement... In speaking of the progress which Mr. Pereira's pupil made in a very short time in the knowledge of numbers, we must add that Mr. Pereira used the Arithmetic Machine, which he himself invented." This arithmetic machine is described in the Journal of Scientists, but, unfortunately, the journal contains no drawings. The machine used some ideas borrowed from Pascal and Perrault, but overall it was a completely original design. It differed from known machines in that its counting wheels were located not on parallel axes, but on a single axis passing through the entire machine. This innovation, which made the design more compact, was later widely used by other inventors, such as Felt and Odhner.

In the second half of the 18th century (no later than 1770), a summing machine was created in the city of Nesvizh. The inscription on the machine states that it was "invented and manufactured by the Jew Evna Jacobson, a watchmaker and mechanic in the city of Nesvizh in Lithuania", "Minsk Voivodeship". The machine is currently in the collection of scientific instruments of the M.V. Lomonosov Museum (St. Petersburg). An interesting feature of the Jacobson machine was a special device that made it possible to automatically count the number of subtractions performed, in other words, to determine the quotient. The presence of this device, an ingeniously solved problem of entering numbers, and the ability to record intermediate results all allow us to consider the "watchmaker from Nesvizh" an outstanding designer of calculating equipment.


In 1774, the rural pastor Philipp Matthäus Hahn developed a workable calculating machine. He managed to build and, most incredibly, sell a small number of these machines.

In 1775, in England, Earl Stanhope created a calculating device that implemented no new mechanical principles, but was more reliable in operation.


19th century calculating devices

In 1804, the French inventor Joseph-Marie Jacquard (1752-1834) came up with a way to control the threads automatically when working on a weaving loom. The method consisted of using special cards with holes punched in the right places (depending on the pattern to be woven into the fabric). Thus, he designed a loom whose operation could be programmed using special cards. The operation of the loom was programmed with a whole deck of punched cards, each of which controlled one stroke of the shuttle. When moving on to a new pattern, the operator simply replaced one deck of punched cards with another. The creation of a loom controlled by cards with punched holes, connected to each other in the form of a tape, is one of the key discoveries that determined the further development of computer technology.

Charles Xavier Thomas

In 1820, Charles Xavier Thomas (1785-1870) created the first mechanical calculator that could not only add and subtract, but also multiply and divide. The rapid development of mechanical calculators led to the addition of a number of useful functions by 1890: storing intermediate results and using them in subsequent operations, printing the result, and so on. The creation of inexpensive, reliable machines made it possible to use them for commercial purposes and scientific calculations.

Charles Babbage

In 1822, the English mathematician Charles Babbage (1792-1871) put forward the idea of creating a program-controlled calculating machine with an arithmetic unit, a control unit, and input and printing devices.

The first machine Babbage designed, the Difference Engine, was to be powered by a steam engine. It calculated tables of logarithms by the method of finite differences and recorded the results on a metal plate. The working model he created in 1822 was a six-digit calculator capable of performing calculations and printing numerical tables.

Ada Lovelace

Lady Ada Lovelace (Ada Byron, Countess of Lovelace, 1815-1852) worked alongside the English scientist. She developed the first programs for the machine, laid down many ideas, and introduced a number of concepts and terms that have survived to this day.

Babbage's Difference Engine was eventually built by enthusiasts at the London Science Museum. It consists of four thousand iron, bronze, and steel parts and weighs three tons. True, it is very difficult to use: with each calculation, the machine's handle has to be turned several hundred (or even thousands of) times.

The numbers are set on disks arranged vertically, each with positions 0 to 9. In the Analytical Engine, the machine was to be driven by a sequence of punched cards containing instructions - the program.

First telegraph

The first electric telegraph was created in 1837 by the English inventors William Cooke (1806-1879) and Charles Wheatstone (1802-1875). An electric current was sent through wires to a receiver. The signals activated needles on the receiver, which pointed to different letters and thus conveyed messages.

The American artist Samuel Morse (1791-1872) invented a new telegraph code that replaced the Cooke and Wheatstone code: he assigned each letter its own combination of dots and dashes. Morse staged a demonstration of his code by laying a telegraph wire between Washington and Baltimore and transmitting news of the presidential election over it.
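
The essence of the code is a simple letter-to-pattern table; any text then becomes a sequence of on/off pulses on the wire. A minimal sketch (only a handful of letters shown; the real alphabet covers all letters and digits):

    # Each letter maps to its Morse pattern of dots and dashes.
    MORSE = {"S": "...", "O": "---", "E": ".", "T": "-", "A": ".-", "N": "-."}

    def to_morse(text):
        # Separate letters with spaces so the receiver can tell them apart.
        return " ".join(MORSE[ch] for ch in text.upper())

    print(to_morse("sos"))  # ... --- ...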

Later, in 1858, Charles Wheatstone created a system in which an operator typed messages in Morse code onto a long paper tape that was fed into a telegraph machine. At the other end of the line, a recorder typed the received message onto another paper tape. The productivity of telegraph operators increased tenfold: messages were now sent at a speed of one hundred words per minute.

In 1846, the Kummer calculator appeared, and it was mass-produced for more than 100 years, until the 1970s. Calculators are now an everyday attribute of modern life; but before there were calculators, the Kummer device was in use, later reissued, at the whim of its makers, as the "Addiator", "Produx", "Arithmetic Ruler", or "Progress". This wonderful device, created in the mid-19th century, could, according to its manufacturer, be made the size of a playing card, and therefore could easily fit in a pocket. The device of Kummer, a St. Petersburg music teacher, stood out among earlier inventions for its portability, which became its most important advantage. Kummer's invention looked like a rectangular board with figured slats. Addition and subtraction were carried out by simply moving the slats. Interestingly, Kummer's calculator, presented in 1846 to the St. Petersburg Academy of Sciences, was designed for monetary calculations.

In Russia, in addition to the Slonimsky device and modifications of the Kummer numerator, the so-called counting bars, invented in 1881 by the scientist Ioffe, were quite popular.

George Boole

In 1847, the English mathematician George Boole (1815-1864) published the work "The Mathematical Analysis of Logic". Thus a new branch of mathematics appeared; it was called Boolean algebra. Every quantity in it can take only one of two values: true or false, 1 or 0. This algebra proved very useful to the creators of modern computers, since a computer understands only two symbols: 0 and 1. Boole is considered the founder of modern mathematical logic.

1855 The brothers Georg and Edvard Scheutz of Stockholm built the first mechanical calculating machine based on the work of Charles Babbage.

In 1867, Bunyakovsky invented self-calculators, which were based on the principle of connected digital wheels (Pascal's gear).

In 1878, the English scientist Joseph Swan (1828-1914) invented the electric light bulb. It was a glass flask with a carbon filament inside. To prevent the thread from burning out, Swan removed the air from the flask.

The following year, American inventor Thomas Edison (1847-1931) also invented the light bulb. In 1880, Edison began producing safety light bulbs, selling them for $2.50. Subsequently, Edison and Swan created a joint company, Edison and Swan United Electric Light Company.

In 1883, while experimenting with a lamp, Edison inserted a platinum electrode into a vacuum cylinder, applied voltage and, to his surprise, discovered that current flowed between the electrode and the carbon filament. Since at that moment Edison’s main goal was to extend the life of the incandescent lamp, this result interested him little, but the enterprising American still received a patent. The phenomenon known to us as thermionic emission was then called the “Edison effect” and was forgotten for some time.

Vilgodt Teofilovich Odhner

In 1880, Vilgodt Teofilovich Odhner, a Swede by nationality living in St. Petersburg, designed an arithmometer. It must be admitted that there were adding machines before Odhner's, such as the K. Thomas systems, but they were unreliable, large, and inconvenient to operate.

He began working on the arithmometer in 1874, and in 1890 mass production began. The "Felix" modification was produced until the 1950s. The main feature of Odhner's brainchild is the use of gear wheels with a variable number of teeth (this wheel bears Odhner's name) instead of Leibniz's stepped drums. It is structurally simpler than the drum and smaller in size.

Herman Hollerith

In 1884, the American engineer Herman Hollerith (1860-1929) took out a patent "for a census machine" (a statistical tabulator). The invention included a punched card and a sorting machine. Hollerith's punched card proved so successful that it has existed to this day practically unchanged.

The idea of ​​putting data on punched cards and then reading and processing them automatically belonged to John Billings, and its technical solution belonged to Herman Hollerith.

The tabulator accepted cards the size of a dollar bill. There were 240 positions on the cards (12 rows of 20 positions). When reading information from punched cards, 240 needles pierced these cards. Where the needle entered the hole, it closed an electrical contact, as a result of which the value in the corresponding counter increased by one.
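
The counting principle lends itself to a compact model: wherever a needle finds a hole, a contact closes and the counter for that position advances by one. A minimal sketch, with each card simplified to the set of its punched positions:

    # Tally punched-card data the way the tabulator did: one counter per
    # hole position, incremented whenever a card is punched there.

    def tabulate(cards):
        counters = {}                      # one counter per (row, column)
        for card in cards:                 # card = set of punched (row, col) holes
            for position in card:
                counters[position] = counters.get(position, 0) + 1
        return counters

    # Two census cards, each punched at the positions encoding its record:
    cards = [{(0, 3), (5, 12)}, {(0, 3), (7, 19)}]
    print(tabulate(cards))                 # position (0, 3) is counted twice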

Development of computer technology

at the beginning of the 20th century

1904 The famous Russian mathematician, shipbuilder, academician A.N. Krylov proposed the design of a machine for integrating ordinary differential equations, which was built in 1912.

English physicist John Ambrose Fleming (1849-1945), studying the "Edison effect", creates a diode. Diodes are used to convert radio waves into electrical signals that can be transmitted over long distances.

Two years later, through the efforts of the American inventor Lee de Forest, the triode appeared.

1907 The American engineer James Powers designed an automatic card punch.

St. Petersburg scientist Boris Rosing applies for a patent for a cathode ray tube as a data receiver.

1918 The Russian scientist M.A. Bonch-Bruevich and, independently, the English scientists W. Eccles and F. Jordan (1919) created an electronic device the British called the trigger (flip-flop), which played a large role in the development of computer technology.

In 1930, Vannevar Bush (1890-1974) designed a differential analyzer. In fact, this was the first successful attempt to create a machine capable of performing cumbersome scientific calculations. Bush's role in the history of computer technology is very large, but his name appears most often in connection with the prophetic article "As We May Think" (1945), in which he described the concept of hypertext.

Konrad Zuse created the Z1 computer, which had a keyboard for entering problem conditions. Upon completion of the calculations, the result was displayed on a panel with many small lights. The total area occupied by the machine was 4 sq.m.

Konrad Zuse patented a method for automatic calculations.

For the next model Z2, K. Zuse came up with a very ingenious and cheap input device: Zuse began encoding instructions for the machine by punching holes in used 35 mm photographic film.

In 1938, the American mathematician and engineer Claude Shannon, and in 1941 the Russian scientist V.I. Shestakov, independently showed the possibility of using the apparatus of mathematical logic for the synthesis and analysis of relay contact switching circuits.

In 1938, Bell Laboratories created the first binary adder (an electrical circuit performing binary addition), one of the main components of any computer. The author of the idea was George Stibitz, who experimented with Boolean algebra and various spare parts: old relays, batteries, light bulbs, and wiring. By 1940, a machine had been born that could perform the four arithmetic operations on complex numbers.

The emergence and development of computer technology

in the 40s of the 20th century.

In 1941, IBM engineer B. Phelps began work on creating decimal electronic counters for tabulators, and in 1942 he created an experimental model of an electronic multiplying device. In 1941, Konrad Zuse built the world's first operational program-controlled relay binary computer, the Z3.

Simultaneously with the construction of ENIAC, also in secrecy, a computer was created in Great Britain. Secrecy was necessary because a device was being designed to decipher the codes used by the German armed forces during the Second World War. The mathematical decryption method was developed by a group of mathematicians, including Alan Turing. During 1943, the Colossus machine was built in London using 1,500 vacuum tubes. The developers of the machine are M. Newman and T. F. Flowers.

Although both ENIAC and Colossus ran on vacuum tubes, they essentially copied electromechanical machines: new content (electronics) was squeezed into an old form (the structure of pre-electronic machines).

In 1937, Harvard mathematician Howard Aiken proposed a project to create a large calculating machine. The work was sponsored by IBM President Thomas Watson, who invested $500 thousand in it. Design of the Mark-1 began in 1939; the computer was built by the New York company IBM. The computer contained about 750 thousand parts, 3304 relays and more than 800 km of wires.

In 1944, the finished machine was officially transferred to Harvard University.

In 1944, American engineer John Presper Eckert first put forward the concept of a program stored in computer memory.

Aiken, who had the intellectual resources of Harvard and a capable Mark-1 machine, received several orders from the military. So the next model, the Mark-2, was ordered by the US Navy Weapons Directorate. Design began in 1945, and construction ended in 1947. The Mark-2 was the first multitasking machine—multiple buses made it possible to simultaneously transmit multiple numbers from one part of the computer to another.

In 1948, Sergei Aleksandrovich Lebedev (1902-1974) and B.I. Rameev proposed the first project for a domestic digital electronic computer. Under the leadership of Academicians S.A. Lebedev and V.M. Glushkov, domestic computers were developed: first the MESM, the small electronic calculating machine (1951, Kyiv), then the BESM, the high-speed electronic calculating machine (1952, Moscow). In parallel with them, the Strela, Ural, Minsk, Hrazdan, and Nairi machines were created.

In 1949, the English stored-program machine EDSAC (Electronic Delay Storage Automatic Computer), designed by Maurice Wilkes of the University of Cambridge, was put into operation. The EDSAC computer contained 3,000 vacuum tubes and was six times more productive than its predecessors. Wilkes also introduced a system of mnemonics for machine instructions, called assembly language.

In 1949 John Mauchly created the first programming language interpreter called "Short Order Code".

Development of computer technology

in the 50s of the 20th century.

In 1951, work was completed on the creation of UNIVAC (Universal Automatic Computer). The first example of the machine, UNIVAC-1, was built for the US Census Bureau. The UNIVAC-1 synchronous, sequential computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. The internal storage, with a capacity of 1,000 twelve-digit decimal numbers, was implemented on 100 mercury delay lines.

This computer is interesting in that it was aimed at relatively mass production without changing the architecture and special attention was paid to the peripheral part (input-output facilities).

Jay Forrester patented magnetic core memory. Such memory was used for the first time on the Whirlwind-1 machine: it consisted of two cubes of 32x32x17 cores, which provided storage of 2,048 16-bit binary words with one parity bit.

This machine was the first to use a universal non-specialized bus (the relationships between various computer devices become flexible) and two devices were used as input-output systems: a Williams cathode ray tube and a typewriter with punched paper tape (flexowriter).

"Tradis", released in 1955. - the first transistor computer from Bell Telephone Laboratories - contained 800 transistors, each of which was enclosed in a separate housing.

In 1957, disk memory appeared for the first time, in the IBM 350 RAMAC (magnetized aluminum disks 61 cm in diameter).

H. Simon, A. Newell, and J. Shaw created GPS, the General Problem Solver.

In 1958 Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invent the integrated circuit.

1955-1959 Russian scientists A.A. Lyapunov, S.S. Kamynin, E.Z. Lyubimsky, A.P. Ershov, L.N. Korolev, V.M. Kurochkin, M.R. Shura-Bura and others created “programming programs” - prototypes of translators. V.V. Martynyuk created a symbolic coding system - a means of accelerating the development and debugging of programs.

1955-1959 The foundations of programming theory (A.A. Lyapunov, Yu.I. Yanov, A.A. Markov, L.A. Kaluzhin) and of numerical methods (V.M. Glushkov, A.A. Samarsky, A.N. Tikhonov) were laid. Schemes of the mechanisms of thinking and of genetic processes, as well as algorithms for diagnosing medical conditions, were modeled (A.A. Lyapunov, B.V. Gnedenko, N.M. Amosov, A.G. Ivakhnenko, V.A. Kovalevsky, and others).

1959 Under the leadership of S.A. Lebedev created the BESM-2 machine with a productivity of 10 thousand operations/s. Its use is associated with calculations of launches of space rockets and the world's first artificial Earth satellites.

1959 The M-20 machine was created, chief designer S.A. Lebedev. For its time, one of the fastest in the world (20 thousand operations/s). This machine was used to solve most theoretical and applied problems related to the development of the most advanced fields of science and technology of that time. Based on the M-20, the unique multiprocessor M-40 was created - the fastest computer of that time in the world (40 thousand operations/sec.). The M-20 was replaced by the semiconductor BESM-4 and M-220 (200 thousand operations/s).

Development of computer technology

in the 60s of the 20th century.

In 1960, in a short time, the CODASYL (Conference on Data Systems Languages) group, led by J. Wegstein and supported by IBM, developed a standardized business programming language, COBOL (Common Business Oriented Language). The language is oriented toward solving economic problems, or more precisely, toward data processing.

In the same year, J. Schwartz and colleagues at System Development Corporation developed the JOVIAL programming language. The name comes from Jules' Own Version of the International Algorithmic Language. It is a procedural language, a version of Algol-58, used mainly for military applications by the US Air Force.

IBM has developed a powerful computing system called Stretch (IBM 7030).

1961 IBM Deutschland implemented the connection of a computer to a telephone line using a modem.

Also that year, the American professor John McCarthy developed the LISP language (LISt Processing language).

J. Gordon, head of the development of simulation systems at IBM, created the GPSS (General Purpose Simulation System) language.

Employees of the University of Manchester, under the leadership of T. Kilburn, created the Atlas computer, which for the first time implemented the concept of virtual memory. In the same period the first minicomputer, the PDP-1, appeared, a decade before the creation of the first microprocessor, the Intel 4004, in 1971.

In 1962, R. Griswold developed the programming language SNOBOL, focused on string processing.

Steve Russell developed Spacewar!, generally considered the first computer game.

E.V. Evreinov and Yu. Kosarev proposed a model of a team of computers and substantiated the possibility of building supercomputers on the principles of parallel execution of operations, variable logical structure and structural homogeneity.

IBM released the first external memory devices with removable disks.

Kenneth E. Iverson (IBM) published a book called "A Programming Language" (APL). Initially, the language served as a notation for writing algorithms. The first implementation, APL/360, was made in 1966 by Adin Falkoff (Harvard, IBM). There are versions of interpreters for the PC. Because programs written in APL are so difficult to read, it is sometimes called "Chinese BASIC". It is in fact a procedural, very compact, ultra-high-level language. It requires a special keyboard. Its further development is APL2.

1963 The American standard code for information interchange was approved: ASCII (American Standard Code for Information Interchange).

General Electric created the first commercial DBMS (database management system).

1964 O.-J. Dahl and K. Nygaard created the modeling language SIMULA-1.

In 1967 under the leadership of S.A. Lebedev and V.M. Melnikov, a high-speed computing machine BESM-6 was created at ITM and VT.

It was followed by "Elbrus" - a new type of computer with a productivity of 10 million operations/s.

Development of computer technology

in the 70s of the 20th century.

In 1970, Charles Moore, an employee of the National Radio Astronomy Observatory, created the FORTH programming language.

Dennis Ritchie and Ken Thompson released the first version of Unix.

E.F. Codd published the first paper on the relational data model.

In 1971 Intel (USA) created the first microprocessor (MP) - a programmable logical device made using VLSI technology.

The 4004 processor was 4-bit and could perform 60 thousand operations per second.

1974 Intel developed the first universal eight-bit microprocessor, the 8080, with 4500 transistors. Edward Roberts from MITS built the first personal computer, Altair, on a new chip from Intel, the 8080. Altair turned out to be the first mass-produced PC, essentially marking the beginning of an entire industry. The kit included a processor, a 256-byte memory module, a system bus and some other little things.

Young programmer Paul Allen and Harvard University student Bill Gates implemented the BASIC language for Altair. They subsequently founded Microsoft, which is today the largest software manufacturer.

Development of computer technology in the 80s of the 20th century

1981 The first portable computers appeared (Compaq, often credited here, released its first portable PC only in 1983).

Niklaus Wirth developed the MODULA-2 programming language.

The first portable computer, the Osborne-1, weighing about 12 kg, was created. Despite a fairly successful start, the company went bankrupt two years later.

1981 IBM released the first personal computer, the IBM PC, based on the 8088 microprocessor.

1982 Intel released the 80286 microprocessor.

The American computer manufacturer IBM, which had previously held the leading position in large computers, began producing professional IBM PC personal computers with the MS-DOS operating system.

Sun began producing the first workstations.

Lotus Development Corp. released the Lotus 1-2-3 spreadsheet.

The English company Inmos, drawing on Oxford University professor Tony Hoare's ideas about "communicating sequential processes" and on David May's experimental programming language concept, created the OCCAM language.

1985 Intel released the 32-bit 80386 microprocessor, consisting of 275 thousand transistors.

Seymour Cray created the CRAY-2 supercomputer with a capacity of 1 billion operations per second.

Microsoft released the first version of the Windows graphical operating environment.

The C++ programming language emerged.

Development of computer technology in the 90s of the 20th century

1990 Microsoft released Windows 3.0.

Tim Berners-Lee developed the HTML language (Hypertext Markup Language; the main format of Web documents) and the prototype of the World Wide Web.

Cray released the Cray Y-MP C90 supercomputer with 16 processors and a speed of 16 Gflops.

1991 Microsoft released Windows 3.1.

The JPEG graphics format was developed.

Philip Zimmermann invented PGP, a public-key message encryption system.

1992 The first free operating system with extensive capabilities appeared: Linux. Finnish student Linus Torvalds (the author of the system) decided to experiment with the instructions of the Intel 386 processor and posted the result on the Internet. Hundreds of programmers from around the world began to extend and rework the program, and it evolved into a fully functional operating system. History is silent about who decided to call it Linux, but how the name arose is quite clear: "Linu" or "Lin" from the name of its creator, and "x" or "ux" from UNIX, since the new OS was very similar to it, only it now ran on computers with the x86 architecture.

DEC introduced the first 64-bit RISC Alpha processor.

1993 Intel released the Pentium microprocessor (a 32-bit processor with a 64-bit data bus), which consisted of 3.1 million transistors and could perform 112 million operations per second.

The MPEG video compression format appeared.

1994 Apple Computer began shipping the Power Macintosh series, based on the PowerPC processor.

1995 DEC announced the release of five new models of Celebris XL personal computers.

NEC announced the completion of development of the world's first memory chip with a capacity of 1 Gbit.

The Windows 95 operating system appeared.

SUN introduced the Java programming language.

The RealAudio format has appeared - an alternative to MPEG.

1996 Microsoft released Internet Explorer 3.0, a fairly serious competitor to Netscape Navigator.

1997 Apple released the Macintosh OS 8 operating system.

Conclusion

The personal computer has quickly entered our lives. Just a few years ago it was rare to see one: personal computers existed, but they were very expensive, and not every company could afford one for its office. Now every third home has a computer, and it has become deeply embedded in human life.

Modern computers represent one of the most significant achievements of human thought, the influence of which on the development of scientific and technological progress can hardly be overestimated. The scope of computer applications is enormous and is constantly expanding.

My research

Number of computers owned by students at school in 2007.

Number of students | Have computers | Percentage of total

Number of computers owned by students at school in 2008.

Number of students | Have computers | Percentage of total

Increase in the number of computers among students:

The rise of computers in school

Conclusion

Unfortunately, it is impossible to cover the entire history of computers within the framework of an abstract. One could talk at length about how, in the small town of Palo Alto (California), at the Xerox PARC research center, the cream of the programming world of that time gathered to develop revolutionary concepts that radically changed the look of computers and paved the way for the machines of the end of the 20th century. Or about how the talented schoolboy Bill Gates and his friend Paul Allen met Ed Roberts and created the remarkable BASIC language for the Altair computer, which made it possible to develop application programs for it. Or about how the appearance of the personal computer gradually changed: a monitor and keyboard appeared, then a drive for so-called floppy disks, and then a hard drive; a printer and a mouse became integral accessories. One could also talk about the invisible war in the computer market for the right to set standards between the huge IBM corporation and the young Apple that dared to compete with it, forcing the whole world to decide which is better: Macintosh or PC? And about many other interesting things that happened quite recently but have already become history.

For many, a world without a computer is a distant history, about as distant as the discovery of America or the October Revolution. But every time you turn on the computer, it is impossible to stop being amazed at the human genius that created this miracle.

Modern IBM PC-compatible personal computers are the most widely used type of computer; their power is constantly growing and their scope is expanding. These computers can be networked together, allowing tens or hundreds of users to easily exchange information and simultaneously access databases. Electronic mail allows computer users to send text and fax messages to other cities and countries over the regular telephone network and to retrieve information from large data banks. The global electronic communication system, the Internet, provides an extremely inexpensive way to quickly receive information from all corners of the globe, offers voice and fax communication, and facilitates the creation of intra-corporate information networks for companies with branches in different cities and countries. However, the information-processing capabilities of IBM PC-compatible personal computers are still limited, and their use is not justified in all situations.

The view of the history of computer technology taken in this abstract has at least two peculiarities: first, all activity related to automatic computation before the creation of the ENIAC computer is treated as prehistory; second, the development of computer technology is described only in terms of hardware and microprocessor circuitry.


The first device designed to make counting easier was the abacus. With its beads it was possible to perform addition and subtraction and simple multiplication.

1642 - French mathematician Blaise Pascal designed the first mechanical adding machine, the Pascalina, which could mechanically perform the addition of numbers.

1673 - Gottfried Wilhelm Leibniz designed an adding machine that could mechanically perform the four arithmetic operations.

First half of the 19th century - English mathematician Charles Babbage tried to build a universal computing device, that is, a computer. Babbage called it the Analytical Engine. He determined that a computer must contain memory and be controlled by a program. According to Babbage, a computer is a mechanical device for which programs are set using punched cards - cards made of thick paper with information printed using holes (at that time they were already widely used in looms).

1941 - German engineer Konrad Zuse built a small computer based on several electromechanical relays.

1943 - in the USA, at one of IBM's plants, Howard Aiken created a computer called "Mark-1". It allowed calculations to be carried out hundreds of times faster than by hand (with an adding machine) and was used for military calculations. It combined electrical signals and mechanical drives, measured about 15 × 2.5 m and contained 750,000 parts. The machine could multiply two 32-bit numbers in 4 seconds.

1943 - in the USA, a group of specialists led by John Mauchly and Presper Eckert began to build the ENIAC computer based on vacuum tubes.

1945 - mathematician John von Neumann was brought in to work on ENIAC and prepared a report on this computer. In his report, von Neumann formulated the general principles of the functioning of computers, i.e., universal computing devices. To this day, the vast majority of computers are made in accordance with the principles laid down by John von Neumann.

1947 - Eckert and Mauchly began development of the first electronic serial machine UNIVAC (Universal Automatic Computer). The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential computer UNIVAC-1 was created on the basis of the ENIAC and EDVAC computers. It operated with a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. The internal storage capacity of 1000 12-bit decimal numbers was implemented on 100 mercury delay lines.

1949 - English researcher Maurice Wilkes built the first computer to embody von Neumann's principles.

1951 - J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic-core memory. It consisted of two banks of 32 × 32 × 17 cores, providing storage of 2,048 words of 16-bit binary numbers with one parity bit (2 × 32 × 32 = 2,048 words of 17 bits each).

1952 - IBM released its first industrial electronic computer, the IBM 701, a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 diodes. An improved version, the IBM 704, was distinguished by high speed; it used index registers and represented data in floating-point form.

After the IBM 704, the IBM 709 was released, which in architectural terms was close to second- and third-generation machines. In this machine, indirect addressing was used and input/output channels appeared for the first time.

1952 - Remington Rand released the UNIVAC 1103 computer, the first to use software interrupts. Remington Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly).

1956 - IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory, disk storage devices, whose importance was fully appreciated over the following decades of computer development. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a stack of 50 magnetically coated metal disks rotating at 12,000 rpm. Each disk surface held 100 data tracks of 10,000 characters each.

1956 - Ferranti released the Pegasus computer, in which the concept of general-purpose registers (GPRs) was first implemented. With the advent of GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

1957 - a group led by J. Backus completed work on the first high-level programming language, FORTRAN. The language, first implemented on the IBM 704 computer, helped expand the range of computer applications.

1960s - 2nd generation of computers, computer logic elements are implemented on the basis of semiconductor transistor devices, algorithmic programming languages ​​such as Algol, Pascal and others are being developed.

1970s - 3rd generation of computers, integrated circuits containing thousands of transistors on one semiconductor wafer. OS and structured programming languages ​​began to be created.

1974 - several companies announced the creation of a personal computer based on the Intel-8008 microprocessor - a device that performs the same functions as a large computer, but is designed for one user.

1975 - the first commercially distributed personal computer Altair-8800 based on the Intel-8080 microprocessor appeared. This computer had only 256 bytes of RAM, and there was no keyboard or screen.

Late 1975 - Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to simply communicate with the computer and easily write programs for it.

August 1981 - IBM introduced the IBM PC personal computer. The main microprocessor of the computer was a 16-bit Intel-8088 microprocessor, which allowed working with 1 megabyte of memory.

1980s - 4th generation of computers built on large integrated circuits. Microprocessors are implemented in the form of a single chip, mass production of personal computers.

1990s — 5th generation of computers, ultra-large integrated circuits. Processors contain millions of transistors. The emergence of global computer networks for mass use.

2000s — 6th generation of computers. Integration of computers and household appliances, embedded computers, development of network computing.

PC BASICS

People have always felt the need to count. To do this, they used their fingers or pebbles, which they put in piles or laid out in rows. The number of objects was recorded with lines drawn in the ground, notches on sticks, and knots tied in a rope.

With the increase in the number of objects to be counted and the development of the sciences and crafts, the need arose to carry out simple calculations. The most ancient counting instrument known in various countries is the abacus (in Ancient Rome its counting pebbles were called calculi). The abacus makes it possible to perform simple calculations on large numbers, and it proved such a successful tool that it has survived from antiquity almost to the present day.

No one can name the exact time and place of the appearance of the abacus. Historians agree that it is several thousand years old and that its homeland may have been Ancient China, Ancient Egypt or Ancient Greece.

1.1. A BRIEF HISTORY OF THE DEVELOPMENT OF COMPUTER TECHNOLOGY

With the development of exact sciences, an urgent need arose to carry out a large number of precise calculations. In 1642, French mathematician Blaise Pascal constructed the first mechanical adding machine, known as Pascal's adding machine (Figure 1.1). This machine was a combination of interlocking wheels and drives. The wheels were marked with numbers from 0 to 9. When the first wheel (units) made a full revolution, the second wheel (tens) was automatically activated; when it reached the number 9, the third wheel began to rotate, etc. Pascal's machine could only add and subtract.
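The carry mechanism described above is, in essence, the ripple carry still taught today. Below is a minimal sketch in Python of how chained decimal wheels propagate a carry; the function name and the four-wheel width are illustrative, not a model of the actual machine.

```python
# Pascaline-style addition: each "wheel" holds one decimal digit
# (index 0 = units), and a full revolution past 9 advances the next
# wheel by one -- a ripple carry. The four-wheel width is illustrative.

def pascaline_add(wheels, addend):
    for position in range(len(wheels)):
        wheels[position] += addend % 10   # turn this wheel by the digit
        addend //= 10
        if wheels[position] > 9:          # full revolution: carry over
            wheels[position] -= 10
            if position + 1 < len(wheels):
                wheels[position + 1] += 1
    return wheels

print(pascaline_add([9, 9, 0, 0], 1))  # 99 + 1 -> [0, 0, 1, 0], i.e. 100
```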

In 1694, the German mathematician Gottfried Wilhelm von Leibniz designed a more advanced calculating machine (Fig. 1.2). He was convinced that his invention would find wide application not only in science, but also in everyday life. Unlike Pascal's machine, Leibniz used cylinders rather than wheels and drives. The cylinders were marked with numbers. Each cylinder had nine rows of projections or teeth. In this case, the first row contained 1 protrusion, the second - 2, and so on until the ninth row, which contained 9 protrusions. The cylinders were movable and were brought into a certain position by the operator. The design of Leibniz's machine was more advanced: it was capable of performing not only addition and subtraction, but also multiplication, division and even square root extraction.

Interestingly, the descendants of this design survived until the 1970s in the form of mechanical calculators (the Felix adding machine) and were widely used for various calculations (Fig. 1.3). However, already at the end of the 19th century, with the invention of the electromagnetic relay, the first electromechanical counting devices appeared. In 1887, Herman Hollerith (USA) invented an electromechanical tabulator into which numbers were entered using punched cards. The idea of using punched cards was inspired by the punching of railway tickets. The 80-column punched card he developed did not undergo significant changes and was used as an information carrier in the first three generations of computers. Hollerith tabulators were used during the first population census in Russia in 1897, and the inventor made a special visit to St. Petersburg for the occasion. Since that time, electromechanical tabulators and similar devices have been widely used in accounting.

At the beginning of the 19th century. Charles Babbage formulated the basic principles that should underlie the design of a fundamentally new type of computer.

In such a machine, in his opinion, there had to be a "warehouse" for storing digital information and a special device for performing operations on the numbers taken from the "warehouse," which Babbage called a "mill." Another device was to control the sequence of operations and the transfer of numbers between the "warehouse" and the "mill"; finally, the machine had to have devices for inputting initial data and outputting the results. The machine was never built - only models of it existed (Fig. 1.4) - but the principles underlying it were later implemented in digital computers.

Babbage's scientific ideas captivated the daughter of the famous English poet Lord Byron, Countess Ada Augusta Lovelace. She laid down the first fundamental ideas about the interaction of various blocks of a computer and the sequence of solving problems on it. Therefore, Ada Lovelace is rightfully considered the world's first programmer. Many of the concepts introduced by Ada Lovelace in the descriptions of the world's first programs are widely used by modern programmers.

Fig. 1.1. Pascal's adding machine

Fig. 1.2. Leibniz's calculating machine

Fig. 1.3. Felix adding machine

Fig. 1.4. Babbage's machine

The beginning of a new era in the development of computer technology, based on electromechanical relays, came in 1934, when the American company IBM (International Business Machines) began producing alphanumeric tabulators capable of performing multiplication. In the mid-1930s a prototype of the first local computer network was created on the basis of tabulators: in Pittsburgh (USA), a department store installed a system of 250 terminals connected by telephone lines to 20 tabulators and 15 typewriters for settling accounts with customers. In 1934-1936 the German engineer Konrad Zuse conceived the idea of a universal computer with program control and storage of information in a memory device. His Z-3 machine was the first program-controlled computer, the prototype of modern computers (Fig. 1.5).

Fig. 1.5. Zuse's computer

It was a relay machine using the binary number system, with memory for 64 floating-point numbers. The arithmetic unit used parallel arithmetic. Each instruction included an operation part and an address part. Data was entered from a decimal keyboard; digital output was provided, as well as automatic conversion of decimal numbers to binary and back. Addition speed was three operations per second.
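The decimal-to-binary conversion that the Z-3's input and output stages automated can be illustrated with two short routines. This is a sketch of the principle only; the Z-3 itself worked with binary floating-point numbers internally.

```python
# Decimal <-> binary conversion of the kind the Z-3 automated (sketch).

def to_binary(n):
    """Repeated division by 2; the remainders are the bits."""
    bits = []
    while n:
        bits.append(n % 2)
        n //= 2
    return bits[::-1] or [0]

def to_decimal(bits):
    """Horner's scheme: value = ((b0*2 + b1)*2 + b2)*2 + ..."""
    value = 0
    for bit in bits:
        value = value * 2 + bit
    return value

assert to_binary(13) == [1, 1, 0, 1]      # 13 = 8 + 4 + 1
assert to_decimal([1, 1, 0, 1]) == 13
```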

In the early 1940s, IBM's laboratories, together with scientists from Harvard University, began developing one of the most powerful electromechanical computers. It was called MARK-1, contained 760 thousand components and weighed 5 tons (Fig. 1.6).

Fig. 1.6. The MARK-1 calculating machine

The last major project in relay computing technology should be considered the RVM-1, built in 1957 in the USSR, which for a number of tasks was quite competitive with the computers of its time. However, with the advent of the vacuum tube, the days of electromechanical devices were numbered: electronic components had a great superiority in speed and reliability, which sealed the fate of electromechanical computers. The era of electronic computers had arrived.

The transition to the next stage in the development of computer technology and programming would have been impossible without fundamental scientific research in the transmission and processing of information. The development of information theory is associated primarily with the name of Claude Shannon. Norbert Wiener is rightfully considered the father of cybernetics, and John von Neumann the creator of the theory of automata.

The concept of cybernetics was born from the synthesis of many scientific directions: first, as a general approach to describing and analyzing the actions of living organisms and computers or other automata; second, from the analogies between the behavior of communities of living organisms and human society and the possibility of describing them with a general theory of control; and finally, from the synthesis of information transmission theory and statistical physics, which led to the important discovery linking the amount of information in a system to its negative entropy. The term "cybernetics" itself comes from the Greek word for "helmsman"; it was first used in its modern sense by N. Wiener in 1947. The book in which Wiener formulated the basic principles of cybernetics is called "Cybernetics, or Control and Communication in the Animal and the Machine."

Claude Shannon, the American engineer and mathematician who is called the father of modern information theory, proved that the operation of switches and relays in electrical circuits can be represented by the algebra invented in the mid-19th century by the English mathematician George Boole. Since then, Boolean algebra has been the basis for analyzing the logical structure of systems of any level of complexity.
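Shannon's observation can be restated directly in code: a relay contact is a Boolean variable, contacts in series conduct like AND, contacts in parallel like OR, and a normally closed contact acts like NOT. The sketch below is illustrative; the two-way light switch is a standard textbook example, not taken from Shannon's paper.

```python
# Relay circuits as Boolean algebra (illustrative sketch).

def series(a, b):          # current flows only if both contacts close: AND
    return a and b

def parallel(a, b):        # current flows if either branch closes: OR
    return a or b

def normally_closed(a):    # contact that opens when energized: NOT
    return not a

# A two-way staircase switch: the lamp is lit exactly when the two
# switches disagree (XOR), built purely from series/parallel contacts.
def staircase_lamp(a, b):
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

for a in (False, True):
    for b in (False, True):
        print(a, b, staircase_lamp(a, b))
```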

Shannon proved that any noisy communication channel is characterized by a limiting speed of information transmission, called the Shannon limit. At transmission speeds above this limit, errors in the transmitted information are inevitable. However, using appropriate information encoding methods, it is possible to obtain an arbitrarily small error probability for any noisy channel. His research formed the basis for the development of information transmission systems over communication lines.
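The limit referred to here is given by the Shannon-Hartley theorem, C = B log2(1 + S/N), where B is the channel bandwidth and S/N the signal-to-noise power ratio. A quick computation with illustrative telephone-grade values (not figures from Shannon's work) shows why analog modems topped out at a few tens of kilobits per second:

```python
# Shannon-Hartley capacity: C = B * log2(1 + S/N).
# The 3 kHz bandwidth and 30 dB SNR below are illustrative,
# telephone-grade values.
import math

def channel_capacity(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)        # convert decibels to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

print(f"{channel_capacity(3000, 30):,.0f} bit/s")  # about 29,902 bit/s
```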

In 1946, the brilliant American mathematician of Hungarian origin John von Neumann formulated the basic concept of storing a computer's instructions in its own internal memory, which gave a huge impetus to the development of electronic computing technology.

During World War II, he served as a consultant at the Los Alamos Atomic Center, where he worked on calculations for the explosive detonation of a nuclear bomb and participated in the development of the hydrogen bomb.

Von Neumann's works address the logical organization of computers, the functioning of computer memory, self-reproducing systems, and more. He took part in the creation of the first electronic computer, ENIAC, and the computer architecture he proposed became the basis for all subsequent models; it is still called the "von Neumann" architecture.

I generation of computers. In 1946, work was completed in the USA to create ENIAC, the first computer using electronic components (Fig. 1.7).

Fig. 1.7. ENIAC, the first computer

The new machine had impressive parameters: it used 18 thousand vacuum tubes, occupied a room with an area of 300 m², weighed 30 tons, and consumed 150 kW of power. The machine operated at a clock frequency of 100 kHz and performed an addition in 0.2 ms and a multiplication in 2.8 ms, three orders of magnitude faster than relay machines. The shortcomings of the new machine were quickly revealed. In structure, the ENIAC resembled mechanical computers: it used the decimal system; the program was set manually on 40 patch panels; reconfiguring the switching fields took weeks. Trial operation showed that the machine's reliability was very low: troubleshooting could take several days. Punched tapes and punched cards, magnetic tapes and printing devices were used for data input and output. First-generation computers implemented the concept of a stored program and were used for weather forecasting, solving energy problems, military problems and other important tasks.

II generation of computers. One of the most important advances, which led to a revolution in computer design and ultimately to the creation of personal computers, was the invention of the transistor in 1948. The transistor, a solid-state electronic switching element (gate), takes up far less space and consumes far less power while doing the same job as a tube. Computing systems built on transistors were much more compact, more economical and far more efficient than tube-based ones. The transition to transistors marked the beginning of miniaturization, which made possible the emergence of modern personal computers (as well as other radio devices: radios, tape recorders, televisions, etc.). For second-generation machines the task of automating programming arose, because the gap between the time needed to develop programs and the calculation time itself was growing. The second stage in the development of computer technology, in the late 1950s and early 1960s, was characterized by the creation of mature programming languages (Algol, Fortran, Cobol) and by mastering the automation of job-flow management by the computer itself, i.e., the development of operating systems.

In 1959, IBM released a commercial transistor machine, the IBM 1401. It was delivered in more than 10 thousand copies. In the same year, IBM created its first large computer (mainframe), the IBM 7090 model, entirely based on transistors, with a speed of 229 thousand operations per second, and in 1961 it developed the IBM 7030 model for the US nuclear laboratory at Los Alamos.

A striking representative of domestic second-generation computers was the BESM-6 (Large Electronic Calculating Machine), developed by S.A. Lebedev and his colleagues (Fig. 1.8). Computers of this generation are characterized by the use of high-level programming languages, which were developed further in the next generation. Transistor machines of the second generation occupied only five years in the history of computers.

Fig. 1.8. BESM-6

III generation of computers. In 1959, engineers at Texas Instruments developed a way to place several transistors and other components on a single base (or substrate) and connect them without wires. Thus the integrated circuit (IC, or chip) was born. The first integrated circuit contained only six transistors. Computers now began to be designed on the basis of small-scale integrated circuits. Operating systems appeared that took on the management of memory, input/output devices and other resources.

In April 1964, IBM announced System 360, the first family of general-purpose software-compatible computers and peripherals. Hybrid microcircuits were chosen as the elemental base of the System 360 family, thanks to which the new models began to be considered third-generation machines (Fig. 1.9).

Fig. 1.9. An IBM third-generation computer

With the System 360 family, IBM for the last time allowed itself the luxury of releasing computers that were incompatible with previous ones. The cost-effectiveness, versatility and small size of computers of this generation quickly expanded the scope of their application - control, data transfer, automation of scientific experiments, etc. As part of this generation, the first microprocessor was developed in 1971 as an unexpected result of Intel's work on the creation of microcalculators. (We note, by the way, that microcalculators in our time get along well with their “blood brothers” - personal computers.)

IV generation of computers. This stage in the development of computer technology is associated with the development of large and ultra-large integrated circuits. IV generation computers began to use high-speed memory systems on integrated circuits with a capacity of several megabytes.

The four-bit Intel 4004 microprocessor was developed in 1971. The following year an eight-bit processor was released, and in 1973 Intel released the 8080 processor, which was 10 times faster than the 8008 and could address 64 KB of memory. This was one of the most serious steps towards the creation of modern personal computers. IBM released its first personal computer in 1975: the Model 5100 had 16 KB of memory, a built-in BASIC interpreter and a built-in cassette tape drive used as a storage device. The IBM PC debuted in 1981, and from that day a new standard took its place in the computer industry. A large number of different programs were written for this family. A later modification was called "extended": the IBM PC-XT (Fig. 1.10).

Fig. 1.10. IBM PC-XT personal computer

Manufacturers abandoned the tape recorder as a storage device, added a second floppy drive, and used a 20 MB hard drive as the main device for storing data and programs. The model was based on the Intel 8088 microprocessor. Thanks to steady progress in the development and production of microprocessors, Intel, IBM's permanent partner, mastered production of a new processor series, the Intel 80286, and a corresponding new IBM PC model appeared: the IBM PC-AT. The next stage was the development of the Intel 80386 and Intel 80486 microprocessors, which can still be found today, followed by the Pentium processors, the most popular processors of our day.

V generation of computers. In the 1990s, attention began to be paid not so much to improving the technical characteristics of computers as to their "intelligence," open architecture and networking capabilities. Attention focused on the development of knowledge bases, user-friendly interfaces, graphical means of presenting information and macro-programming tools. There is no precise definition of this stage in the development of computer technology, since the element base on which the classification rests has remained the same; clearly, all computers currently produced can be assigned to the V generation.

1.2. CLASSIFICATION OF COMPUTERS

Computers can be classified according to a number of criteria, in particular by principle of operation, purpose, methods of organizing the computing process, size and computing power, functionality, etc.

Based on their operating principle, computers can be divided into two broad categories: analog and digital.

Analog computers (AVMs) are continuous-operation computers (Fig. 1.11).

Fig. 1.11. An analog computer

They work with information presented in analog form, i.e. in the form of a continuous series of values ​​of any physical quantity. There are devices in which computing operations are performed using hydraulic and pneumatic elements. However, the most widespread are electronic AVMs, in which electrical voltages and currents serve as machine variables.

The work of AVM is based on the generality of laws that describe processes of various natures. For example, the oscillations of a pendulum obey the same laws as changes in the electric field strength in an oscillatory circuit. And instead of studying a real pendulum, you can study its behavior on a model implemented on an analog computer. Moreover, this model can also be used to study some biological and chemical processes that obey the same laws.
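The analogy is exact: for small angles, a pendulum and an LC oscillating circuit obey the same equation x'' = -ω²x; only the meaning of ω² differs (g/l for the pendulum, 1/LC for the circuit). A small numerical sketch with illustrative parameter values shows that one integration routine models both systems:

```python
# One equation, two systems: x'' = -omega2 * x describes both a small-angle
# pendulum (omega2 = g/l) and an LC circuit (omega2 = 1/(L*C)).
# Semi-implicit Euler integration; all parameter values are illustrative.

def oscillate(omega2, x0, steps=1000, dt=0.001):
    x, v = x0, 0.0
    for _ in range(steps):
        v -= omega2 * x * dt
        x += v * dt
    return x

pendulum = oscillate(9.81 / 1.0, x0=0.1)             # 1 m pendulum, g = 9.81
lc_circuit = oscillate(1.0 / (1.0 * 0.102), x0=0.1)  # L = 1 H, C = 0.102 F
print(pendulum, lc_circuit)  # nearly identical trajectories
```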

The main elements of such machines are amplifiers, resistors, capacitors and inductors, between which connections can be made that reflect the conditions of a particular problem. Problems are programmed by patching connections on a plugboard. AVMs are most effective for mathematical problems involving differential equations that do not require complex logic. The results are displayed as electrical voltages varying as a function of time on an oscilloscope screen or are recorded by measuring instruments.

In the 1940s-50s, electronic analog computers offered serious competition to the newly emerging digital computers. Their main advantages were high speed (comparable to the propagation of an electrical signal through the circuit) and the clarity with which simulation results were presented.

Among the disadvantages are the low accuracy of calculations, limited range of problems to be solved, and manual setting of task parameters. Currently, AVMs are used only in very limited areas - for educational and demonstration purposes, and scientific research. They are not used in everyday life.

Digital computers (electronic computers) are based on the discrete logic of "yes-no," "zero-one." All operations are performed by the computer in accordance with a pre-compiled program. The speed of calculation is determined by the system's clock frequency.

Based on the stages of creation and the element base, digital computers are conventionally divided into five generations:

I generation (1950s) - computers based on vacuum tubes;

II generation (1960s) - computers based on semiconductor elements (transistors);

III generation (1970s) - computers based on semiconductor integrated circuits with low and medium degrees of integration (tens and hundreds of transistors in one package);

IV generation (1980s) - computers based on large and very large scale integrated circuits - microprocessors (millions of transistors in one chip);

V generation (1990s - present) - supercomputers with thousands of parallel microprocessors, making it possible to build efficient systems for processing huge arrays of information; personal computers based on highly complex microprocessors with user-friendly interfaces, which explains their spread into almost all areas of human activity. Network technologies make it possible to unite computer users into a single information society.

In terms of computing power, the following taxonomy of computers emerged in the 1970s-80s.

Supercomputers are computers with maximum capabilities in terms of speed and volume of calculations. They are used to solve problems of national and global scale: national security, research in biology and medicine, modeling the behavior of large systems, weather forecasting, etc. (Fig. 1.12).

Fig. 1.12. CRAY-2 supercomputer

Mainframe computers (mainframes) are used in large research centers and universities for research, and in corporate systems: banks, insurance and trading institutions, transport, news agencies and publishing houses. Mainframes are combined into large computer networks and serve hundreds or thousands of terminals, the machines at which users and clients directly work.

Minicomputers are specialized computers used for work requiring relatively large computing power: graphics, engineering calculations, video processing, layout of printed publications, etc.

Microcomputers are the most numerous and diverse class of computers, whose core is the personal computer, currently used in almost all branches of human activity. Millions of people use them in their professional work, for interaction via the Internet, and for entertainment and recreation.

In recent years, a taxonomy has emerged that reflects the diversity and characteristics of a large class of computers on which direct users work. These computers differ in computing power, system and application software, set of peripheral devices, user interface and, as a result, size and price. However, they are all built on common principles and a single element base, have a high degree of compatibility, common interfaces and protocols for exchanging data between themselves and networks. The basis of this class of machines are personal computers, which in the above taxonomy correspond to the class of microcomputers.

This taxonomy, like any other, is quite conventional: since it is impossible to draw a clear boundary between the different classes of computers, models appear that are difficult to assign to a specific class. Nevertheless, it broadly reflects the variety of computing devices that exist today.

Servers (from the English "serve") are powerful multi-user computers that ensure the functioning of computer networks (Fig. 1.13).

Fig. 1.13. S/390 server

They serve to process requests from all workstations connected to the network. The server provides access to shared network resources - computing power, databases, program libraries, printers, faxes - and distributes these resources among users. In any institution, personal computers are combined into a local network - this allows for data exchange between end-user computers and rational use of system and hardware resources.

The fact is that preparing a document on a computer (be it an invoice for a product or a scientific report) takes much more time than printing it. It is much more profitable to have one powerful network printer for several computers, and the server will handle the distribution of the print queue. If computers are connected to a local network, it is convenient to have a single database on the server - a price list of all store products, a work plan for a scientific institution, etc. In addition, the server provides a common Internet connection for all workstations, differentiates access to information for different categories of users, sets priorities for access to shared network resources, keeps statistics on Internet use, monitors the work of end users, etc.

Personal computer (PC) is the most common class of computers, capable of solving problems at various levels, from preparing financial statements to engineering calculations. It is designed mainly for individual use (hence the name of the class). A personal computer has special facilities that allow it to be included in local and global networks. The main content of this book is devoted to describing the hardware and software of this particular class of computers.

Notebook (from the English "notebook" - "notepad") - this established term quite incorrectly reflects the features of this class of personal computers (Fig. 1.14).

Fig. 1.14. Notebook computer

Its dimensions and weight are closer to the format of a large book, while its functionality and technical characteristics fully match those of an ordinary desktop PC. These devices are simply more compact and lighter and, most importantly, consume significantly less electricity, which allows them to run on batteries. The software of this class of PC, from the operating system to application programs, is no different from that of desktop computers. In the recent past this class of PC was called the laptop ("knee-top"); that name reflected their characteristics much more accurately, but for some reason it never caught on.

So, the main feature of notebook-class personal computers is mobility. Their small dimensions and weight and monoblock design make it easy to place one anywhere in the workspace or carry it from place to place in a special case or attaché case, and battery power allows it to be used even on the road (in a car or on a plane).

All notebook models can be divided into three classes: universal, business and compact (subnotebooks). Universal notebooks are a full-fledged replacement for a desktop PC, so they are relatively large and heavy, but in return they offer a large screen and a comfortable, desktop-like keyboard. They have the usual built-in storage devices: a CD-ROM (R, RW, DVD) drive, a hard drive and a floppy drive. This design practically rules out use as a "travel" PC: the battery charge is only enough for 2-3 hours of operation.

Business notebooks are designed for use in the office, at home or on the road. They have significantly smaller dimensions and weight and a minimal set of built-in devices, but advanced means for connecting additional ones. PCs of this class serve more as an addition to an office or home desktop than as a replacement for it.

Compact notebooks (subnotebooks) embody the most advanced achievements of computer technology. They have the highest degree of integration of devices (components such as audio, video and local network support are built into the motherboard). Notebooks of this class are usually equipped with wireless interfaces for input devices (an additional keyboard or mouse), have a built-in radio modem for connecting to the Internet, and use compact smart cards as storage devices. Moreover, such devices weigh no more than 1 kg and are about 1 inch (2.5 cm) thick. The battery charge lasts several hours; however, such computers cost two to three times more than conventional PCs.

Pocket personal computer (Pocket PC) consists of the same parts as a desktop computer: processor, memory, sound and video system, screen, and expansion slots with which memory can be increased or other devices added. Battery power provides up to two months of operation. All these components are very compact and tightly integrated, so the device weighs 100-200 g and fits in the palm of your hand, a shirt's breast pocket or a handbag (Fig. 1.15).

Fig. 1.15. Pocket personal computer

It is not for nothing that these devices are also called handhelds (palmtops).

However, the functionality of a pocket PC differs greatly from that of a desktop or notebook. First of all, it has a relatively small screen and, as a rule, no keyboard or mouse, so user interaction is organized differently: the pressure-sensitive screen itself is used, operated with a special stick called a stylus. To type text, a so-called virtual keyboard is used: its keys are displayed directly on the screen, and the text is entered with the stylus. Another important difference is the absence of a hard drive, so the volume of stored information is relatively small. The main storage for programs and data is built-in memory of up to 64 MB, and the role of disks is played by flash memory cards. These cards store programs and data that do not need to be kept in quick-access memory (photo albums, music in MP3 format, e-books, etc.). Because of these features, pocket PCs are often used in conjunction with a desktop PC, for which special interface cables exist.

A notebook and a pocket PC are designed for completely different tasks; they are built on different principles and complement rather than replace each other.

People work with a notebook in the same way as with a desktop computer, while a pocket PC is turned on and off several times a day: loading programs and shutting down happen almost instantly.

In terms of technical characteristics, modern pocket PCs are quite comparable to desktop computers produced just a few years ago. This is quite sufficient for high-quality handling of text, for example when working with email or a text editor. Modern pocket PCs are also equipped with a built-in microphone, speakers and headphone jacks. Communication with a desktop PC and other peripheral devices is carried out via a USB port, an infrared port (IrDA) or Bluetooth (a modern wireless interface).

In addition to a special operating system, PDAs are usually equipped with built-in applications, which include a text editor, spreadsheet editor, scheduler, Internet browser, a set of diagnostic programs, etc. Recently, computers of the Pocket PC class have begun to be equipped with built-in means of communication with the Internet (a regular cell phone can also be used as an external modem).

Thanks to their capabilities, pocket personal computers can be considered not just as a simplified PC with reduced capabilities, but as a completely equal member of the computer community, which has its own undeniable advantages even in comparison with the most advanced models of desktop computers.

Electronic secretaries (PDA, Personal Digital Assistant) have the format of a pocket computer (weighing no more than 0.5 kg) but are used for different purposes (Fig. 1.16).

Fig. 1.16. Electronic secretary

They are focused on the use of electronic directories that store names, addresses and telephone numbers, information about daily routines and appointments, to-do lists, expense records, etc. An electronic secretary may have built-in text and graphic editors, spreadsheets and other office applications.

Most PDAs have modems and can exchange information with other PCs; when connected to a network, they can receive and send email and faxes. Some PDAs are equipped with radio modems and infrared ports for remote wireless information exchange with other computers. Electronic secretaries have a small liquid-crystal display, usually located in the hinged lid of the device. Information is entered manually from a miniature keyboard or via a touch screen, as on a pocket PC. A PDA can be called a computer only with major reservations: these devices are sometimes classed as ultra-portable computers, sometimes as "smart" calculators, while others consider them organizers with advanced capabilities.

Electronic notebooks (organizers, from the English "organize") belong to the "lightest" category of portable computers (their weight does not exceed 200 g). Organizers have a capacious memory in which you can record the necessary information and edit it with the built-in text editor; the memory can hold business letters, texts of agreements and contracts, daily routines and schedules of business meetings. The organizer has a built-in timer that reminds you of important events. Access to information can be password-protected. Organizers are often equipped with a built-in translator containing several dictionaries.

Information is displayed on a small monochrome liquid crystal display. Due to low power consumption, battery power provides information storage for up to five years without recharging.

A smartphone is a compact device combining the functions of a cell phone, an electronic notebook and a digital camera with mobile Internet access (Fig. 1.17).

Fig. 1.17. Smartphone

The smartphone has a microprocessor, RAM and read-only memory; Internet access is provided via the cellular network. The quality of its photographs is not high, but it is sufficient for use on the Internet and for sending by e-mail. Video recording time is about 15 s. It has a built-in slot for smart cards. The battery charge is enough for 100 hours of operation, and the weight is 150 g. It is a very convenient and useful device, but its cost is comparable to the price of a good desktop computer.