The Electronic Computer: A History

The German astronomer Johannes Kepler often faced problems in his research whose solution demanded enormous amounts of labor and time. Fortunately, he had a colleague who figured out how to help: Wilhelm Schickard, a professor of mathematics in Tübingen, invented the first documented gear-driven calculating machine. Alas, Kepler never got to use the novelty: the model burned in a fire. Only at the end of the 1950s was a copy of Schickard's machine built from the surviving drawings, proving that the design worked.

Filial help

To help his father, a tax collector, with his tedious calculations, Blaise Pascal developed the Pascaline, a calculating machine capable of adding and subtracting eight-digit numbers and carrying decimal places automatically. By the middle of the 17th century, some 50 of these machines had been built, one of which became the property of Queen Christina of Sweden.

Helping humanity

The founder and first president of the Prussian Academy of Sciences in Berlin, Gottfried Wilhelm von Leibniz, not only co-invented differential and integral calculus but also presented to the scientific world, in 1673, a calculating machine whose mechanism of cylindrical rollers and a movable carriage was far more sophisticated than those of Schickard and Pascal. Leibniz also devised binary notation, on which the work of future computers is based.
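Binary notation represents any number using only the digits 0 and 1, obtained by repeated division by 2. A minimal illustrative sketch (Python is used here purely for demonstration; the function name is ours, not historical):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to binary by repeated division by 2."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(1673))  # the year Leibniz presented his machine -> 11010001001
```

Every position in the result stands for a power of two, which is exactly the representation later computers adopted.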

Start of serial production

On the basis of the Leibniz arithmometer, Charles Xavier Thomas de Colmar constructed in 1818 a calculating machine that could also extract square roots, raise numbers to powers, and compute the values of trigonometric functions. Colmar's arithmometer was reliable and accurate to the twentieth decimal place, and in 1821 the inventor put it into mass production. In 1833 the British mathematician Charles Babbage designed the first programmable calculating machine, making him the spiritual father of the digital computer. More than 100 years would pass, however, before Konrad Zuse created the first modern computer.

  • 1853: Georg Scheitz invented the first calculating machine with a printing device in Stockholm.
  • 1873: Mechanical engineer Salling in Würzburg builds a calculating machine with a keyboard.
  • 1890: Herman Hollerith received a patent for a computer using punched cards.
  • 1967: Englishman Norman Kitz created the first desktop electronic calculator, the Anita MK VIII.

The history of the development of computing

The development of computing technology can be broken down into the following periods:

  • Manual (VI century BC - XVII century AD)
  • Mechanical (XVII century - mid XX century)
  • Electronic (mid XX century - present)

Although Prometheus, in the tragedy of Aeschylus, asserts: "Think what I did for mortals: I invented number for them and taught them to join letters," the concept of number arose long before the appearance of writing. People learned to count over many centuries, passing on and enriching their experience from generation to generation.

Counting, or more broadly, calculation, can be carried out in various forms: there is oral, written, and instrumental counting. Instrumental aids to counting have had different capabilities at different times and have been called by different names.

Manual stage (VI century BC - XVII century AD)

The emergence of counting in antiquity - "This was the beginning of the beginnings ..."

The estimated age of humanity is 3-4 million years: that long ago, man stood upright and picked up a tool of his own making. The ability to count, however (that is, the ability to break the concepts of "more" and "less" into a specific number of units), formed in humans much later, some 40-50 thousand years ago, in the late Paleolithic. This stage corresponds to the appearance of modern man (Cro-Magnon). Thus one of the main characteristics (if not the main one) distinguishing the Cro-Magnon from earlier stages of man is the ability to count.

It is not hard to guess that a person's first counting device was his fingers.

The fingers turned out to be an excellent computing machine. With their help one could count to 5, and with two hands to 10. In countries where people walked barefoot, it was easy to count on fingers and toes to 20. At the time, this was practically enough for most people's needs.

The fingers became so tightly bound to counting that in ancient Greek the concept of "counting" was expressed by a word derived from "five." In Russian, the word "five" (pyat') resembles "metacarpus" (pyast'), a part of the hand (the word "metacarpus" is rarely used now, but its derivative "wrist" (zapyast'e) is still common). The hand, or metacarpus, is a synonym for and in fact the basis of the numeral "five" among many peoples. For example, the Malay "LIMA" means both "hand" and "five."

There are, however, peoples for whom the units of counting were not the fingers themselves but their joints.

Having learned to count on the fingers to ten, people took the next step forward and began to count in tens. While some Papuan tribes could count only to six, others reached several dozen in counting; for this, however, many counters had to be invited at once.

In many languages the words "two" and "ten" are consonant. Perhaps this is because the word "ten" once meant "two hands." Even now there are tribes that say "two hands" instead of "ten" and "hands and feet" instead of "twenty." And in England the first ten numbers used to be called by a common name, "fingers," which suggests that the English once counted on their fingers.
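The counting systems described above (one hand, two hands, hands and feet) are simply positional counting in bases 5, 10, and 20. A small illustrative sketch in Python (the function name is ours, not from the text):

```python
def to_base(n: int, base: int) -> list[int]:
    """Express a non-negative integer as digits in the given base,
    most significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return digits[::-1]

# 23 objects counted on one hand, on two hands, or on hands and feet:
print(to_base(23, 5))   # [4, 3]: four fives and three
print(to_base(23, 10))  # [2, 3]: two tens and three
print(to_base(23, 20))  # [1, 3]: one score and three
```

The same quantity looks different in each base, but the counting principle, grouping into fives, tens, or twenties, is identical.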

Finger counting has survived in places to this day. The historian of mathematics L. Karpinsky, in his History of Arithmetic, reports that on the world's largest grain exchange, in Chicago, offers, requests, and prices were announced by brokers on their fingers, without a single word.

Then came counting by shifting pebbles, counting with a rosary... This was a significant breakthrough in human counting ability: the beginning of the abstraction of number.

Appendix 4

Test on the topic:

"History of the development of computing technology"

Choose the correct answer

1. An electronic computer is:

a) a complex of hardware and software tools for processing information;

b) complex technical means for automatic processing of information;

c) a model that establishes the composition, order and principles of interaction of its components.

2. A personal computer is:

a) a computer for an individual buyer;

b) a computer that provides a dialogue with the user;

c) a desktop or portable computer that meets the requirements of general availability and versatility.

3. Inventor of a mechanical device for adding numbers:

a) P. Norton;

b) B. Pascal;

c) G. Leibniz;

d) D. Napier.

4. The scientist who combined the idea of a mechanical machine with the idea of programmed control:

a) C. Babbage (mid-19th century);

b) J. Atanasoff (1930s);

c) C. Berry (XX century);

d) B. Pascal (mid-17th century)

5. The first programmer in the world is:

a) G. Leibniz;
b) C. Babbage;

c) J. von Neumann;

d) A. Lovelace.

6. The country where the first computer was created that implements the principles of program control:

b) England;

c) Germany.

7. Founder of domestic computer technology:

8. The city in which the first domestic computer was created:

b) Moscow;

c) Saint Petersburg;

d) Yekaterinburg.

9. Means of communication of the user with a computer of the second generation:

a) punched cards;

b) magnetic tokens;

c) magnetic tapes;

d) magnetic tokens.

10. The first tool for counting

a) sticks;

b) pebbles;

c) human hand;

d) seashells.

11. Calculation system in Russian accounts:

a) binary;

b) fivefold;

c) octal;

d) decimal.

12. Scope of the first generation computers:

a) design;

b) engineering and scientific calculations;

c) banking;

d) architecture and construction.

13. The generation of computers during which high-level programming languages began to appear:

a) the first;

b) the second;

c) third;

d) fourth.

14. The computer generation whose element base was the transistor:

a) the first;

b) the second;

c) third;

d) fourth.

15. Programming language in machines of the first generation:

a) machine code;

b) Assembler;

c) BASIC;

d) Fortran.

Select all correct answers:

16. Computer elements of the third generation:

a) integrated circuits;

b) microprocessors;

c) CRT display;

d) magnetic disks;

e) "mouse" manipulator.

17. Elements of Babbage's Analytical Engine:

a) input block;

b) microprocessor;

c) output block;

d) store;

e) mill;

f) block for printing the result;

g) arithmetic device;

h) memory;

18. Computer elements of the fourth generation:

a) integrated circuits;

b) microprocessors;

c) color display;

d) transistors;

e) manipulator "joystick";

f) plotters.

19. The very first devices for counting:


a, d, e, f, i

Scale for evaluating the result of work

Points

Assessment

Satisfactory

The invention of the electronic computer is one of the greatest technical achievements of the second half of the twentieth century, and it marked the beginning of the scientific and technological revolution. Mankind had been moving toward this grandiose event since ancient times. In antiquity the simplest means of calculation were the fingers and the phalanges of the fingers and toes. The earliest technical devices were wooden sticks with notches cut into them, and belts and cords with knots tied in them. The development of the simplest forms of trade encouraged the invention of various counting aids, one of the oldest being the abacus. This invention originated in China as a board covered with fine dust on which marks could be made and easily erased: whereas a notched stick could be used only once, the board could be used many times. One variety of the abacus was a board with grooves into which pebbles were placed as needed.

Progress does not stand still, and discoveries in one area of human activity often lead to major discoveries in others. Research in astronomy, for instance, contributed to the emergence of new and more complex computing devices. With John Napier's invention of logarithms (1614), the slide rule appeared in 1620, making it possible to multiply and divide numbers quickly. The astronomer Wilhelm Schickard (1623) and the famous French scientist Blaise Pascal (1642) were among the first inventors of mechanical calculating machines. Pascal's machine could add and subtract multi-digit numbers without the slightest error. In 1694 came the famous 12-digit adding machine of the German mathematician Leibniz, capable of multiplying and dividing multi-digit numbers.
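The slide rule works because logarithms turn multiplication into addition: log(ab) = log a + log b, so aligning two logarithmic scales adds lengths and thereby multiplies numbers. A quick numerical check of the identity (illustrative Python):

```python
import math

a, b = 6.0, 7.0
# Adding logarithms and then exponentiating recovers the product,
# which is exactly what sliding two logarithmic scales does mechanically.
product = math.exp(math.log(a) + math.log(b))
print(round(product, 6))  # 42.0
```

On a physical slide rule the "addition" is done by sliding one scale along the other, and the answer is read off directly, with no arithmetic at all.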

From 1820 to 1856 the English mathematician, mechanical engineer, and inventor Charles Babbage worked on a universal analytical computing machine capable of performing the required operations on supplied data and solving arithmetic problems of varying complexity. Working on a project far ahead of its time, the scientist never reached his goal, though other computing devices Babbage created were long used by the English tax service. His Difference Engine had already placed him in the front rank of the creators of computing machinery, and the basic ideas behind the design and operation of his machine (the input and output mechanism, the arithmetic unit and memory, conditional transfer of control depending on the result obtained) were so carefully worked out that the first computers, which appeared 100 years later, in many ways resembled Babbage's Analytical Engine. He is considered the inventor of the mechanical computer.

The end of the 19th century was marked by the emergence of electromechanical computing. Between 1875 and 1880 the American Herman Hollerith invented the tabulating machine, designed to process information stored on punched cards. Hollerith later founded a company to manufacture tabulators, and from it, at the beginning of the twentieth century, the world-famous IBM emerged. Hollerith's tabulator was the first machine to use electromechanical elements, and the further invention and improvement of computing technology is directly tied to the ever wider use of electricity.

The German inventor Konrad Zuse is considered the creator of the automatic computing machine. In 1938 he built the Z1, based on telephone relays (its recording device was still mechanical); a year later the improved Z2 appeared, and two years after that Zuse presented the world's first programmable computer using the binary system. Similar relay computers were created in the USA by H. Aiken: in 1944 the Mark-1 machine was delivered to Harvard University. Such machines were used for calculations in the creation of the atomic bomb and in computing missile trajectories.

The first electronic computer was created by Professor J. Atanasoff and his assistant C. Berry during the Second World War, though that machine was not yet universal. In 1946 the first universal electronic computer, ENIAC, appeared in the USA, designed under the direction of J. Eckert and J. Mauchly. From that moment the era of computers began. In 1949 the Englishman M. Wilkes created the EDSAC, a machine that stored its program in memory, and in 1951 the UNIVAC went into serial production in America. The first computer in the USSR, the MESM, was created in Ukraine in 1951; in 1952 the BESM was built under the leadership of Academician S. Lebedev. The creation of the computer stands among the greatest inventions of the 20th century.

The computer Eckert and Mauchly created ran a thousand times faster than the Mark-1. But it turned out that the machine stood idle most of the time: setting up a calculation method (a program) required connecting the wires in the right way, which took hours or even days, while the calculation itself might then take only minutes or even seconds.

To simplify and speed up the process of setting up programs, Mauchly and Eckert began to design a new computer that could store a program in its memory. In 1945 the famous mathematician John von Neumann joined the work and prepared a report on this computer. The report was sent to many scientists and became widely known, because in it von Neumann clearly and simply formulated the general principles of the functioning of computers as universal computing devices. To this day the vast majority of computers are built according to the principles John von Neumann set out in his 1945 report. The first computer embodying von Neumann's principles was built in 1949 by the English researcher Maurice Wilkes.

Development of the first serial electronic machine, UNIVAC (Universal Automatic Computer), was begun around 1947 by Eckert and Mauchly, who founded the ECKERT-MAUCHLY company in December of that year. The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 was created on the basis of the ENIAC and EDVAC computers. It ran at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage, with a capacity of 1,000 12-digit decimal numbers, was implemented on 100 mercury delay lines.

Soon after the UNIVAC-1 went into service, its developers put forward the idea of automatic programming: the machine itself would prepare the sequence of commands needed to solve a given problem.

The lack of high-speed memory was a severe constraint on computer designers in the early 1950s. As J. Eckert, one of the pioneers of computing, put it, "the architecture of a machine is determined by its memory." Researchers therefore focused their efforts on the storage properties of ferrite rings strung on wire matrices.

In 1951 J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 was the first machine to use magnetic-core memory: it consisted of two banks of 32 x 32 x 17 cores, providing storage for 2,048 words of 16-bit binary numbers with one parity bit.
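The quoted capacity follows directly from the geometry: each 32 x 32 plane holds one bit of every word, 17 planes give a 17-bit word (16 data bits plus parity), and two such banks double the word count. A back-of-the-envelope check (illustrative Python):

```python
words_per_plane = 32 * 32   # one core per word on each bit plane
planes = 17                 # 16 data bits + 1 parity bit per word
banks = 2
total_words = banks * words_per_plane
total_cores = total_words * planes
print(total_words)  # 2048 words, as quoted for Whirlwind-1
print(total_cores)  # 34816 ferrite cores in all
```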

Soon IBM joined the development of electronic computers. In 1952 it released its first industrial electronic computer, the IBM 701, a parallel synchronous machine containing 4,000 vacuum tubes and 12,000 germanium diodes. The enhanced IBM 704 was notable for its high speed, index registers, and floating-point data representation.

IBM 704
After the IBM 704 came the IBM 709, which in architectural terms approached the machines of the second and third generations. This machine pioneered indirect addressing and introduced I/O channels.

In 1956 IBM developed floating magnetic heads riding on an air cushion. Their invention made possible a new type of memory, the disk storage device, whose significance was fully appreciated only in the following decades of computing. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a stack of 50 magnetically coated metal disks rotating at 12,000 rpm; each disk surface carried 100 data tracks of 10,000 characters each.

Following the first serial computer UNIVAC-1, Remington-Rand in 1952 released the UNIVAC-1103, which ran 50 times faster. Later, software interrupts were used for the first time in the UNIVAC-1103.

Remington-Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly). Also worth noting is Grace Hopper, a US Navy officer and head of a group of programmers, then a captain (she later rose to the rank of rear admiral), who developed the first compiler program. Incidentally, the term "compiler" was first introduced by Hopper in 1951; her compiling program translated into machine language an entire program written in a convenient algebraic form. Hopper is also the source of the term "bug" as applied to computers. One day a beetle (in English, a bug) flew through an open window into the laboratory, settled on the contacts, and shorted them, causing a serious malfunction in the machine. The burnt beetle was taped into the logbook where malfunctions were recorded, and thus the first computer bug was documented.

IBM took the first steps in automating programming, creating in 1953 the Speedcoding system for the IBM 701. In the USSR, A. A. Lyapunov proposed one of the first programming languages. In 1957 a group led by J. Backus completed work on what would become the first popular high-level programming language, FORTRAN. The language, first implemented on the IBM 704, expanded the range of computer applications.

Alexey Andreevich Lyapunov
In Great Britain in July 1951, at a conference at the University of Manchester, M. Wilkes presented the paper "The Best Way to Design an Automatic Calculating Machine," which became a pioneering work on the foundations of microprogramming. The method he proposed for designing control units has found wide application.

Wilkes realized his idea of microprogramming in 1957 with the EDSAC-2 machine. Together with D. Wheeler and S. Gill, he had in 1951 written the first programming textbook, "The Preparation of Programs for an Electronic Digital Computer."

In 1956 Ferranti released the Pegasus computer, which first embodied the concept of general-purpose registers (GPRs). With the advent of GPRs the distinction between index registers and accumulators disappeared, and the programmer had at his disposal not one but several accumulator registers.

The emergence of personal computers

Initially, microprocessors were used in various specialized devices such as calculators. But in 1974 several firms announced personal computers based on the Intel 8008 microprocessor: devices performing the same functions as a large computer but designed for a single user. At the beginning of 1975 the first commercially distributed personal computer, the Altair-8800, based on the Intel 8080 microprocessor, appeared, selling for about $500. Although its capabilities were very limited (the RAM was only 256 bytes, and there was no keyboard or screen), its appearance was greeted with great enthusiasm: several thousand kits were sold in the first months. Buyers equipped the computer with additional devices: a monitor, a keyboard, memory expansion units, and so on, and soon other companies began producing these devices. At the end of 1975 Paul Allen and Bill Gates, the future founders of Microsoft, created a Basic interpreter for the Altair, which let users communicate with the computer simply and write programs for it easily, further boosting the popularity of personal computers.

The success of the Altair-8800 prompted many companies to start producing personal computers. These were now sold complete with keyboard and monitor; demand reached tens and then hundreds of thousands of units per year, and several magazines devoted to personal computers appeared. Sales growth was greatly helped by numerous useful programs of practical value. Commercially distributed programs also appeared, such as the word-processing program WordStar and the spreadsheet VisiCalc (1978 and 1979, respectively). These and many other programs made the purchase of a personal computer very profitable for business: it could now handle accounting calculations, document preparation, and similar tasks for which large computers were too expensive.

In the late 1970s the proliferation of personal computers even led to some decline in demand for large computers and minicomputers. This worried IBM, the leading manufacturer of large computers, and in 1979 it decided to try its hand at the personal computer market. The company's management, however, underestimated the future importance of this market and viewed the project as a minor experiment, something like one of dozens of efforts under way in the firm to create new equipment. So as not to spend too much money on the experiment, management gave the unit responsible for the project a freedom unprecedented in the company: in particular, it was allowed not to design the personal computer from scratch but to use components made by other firms. And the unit made full use of that chance.

The then-new 16-bit Intel 8088 was chosen as the computer's main microprocessor. Its use significantly increased the machine's potential, since the new microprocessor could work with 1 megabyte of memory, while all computers then available were limited to 64 kilobytes.
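The jump from 64 kilobytes to 1 megabyte comes straight from the width of the memory address: n address bits can distinguish 2**n bytes. A quick illustration in Python:

```python
# 16-bit addresses (typical of earlier machines) vs the 8088's 20-bit addresses
for bits in (16, 20):
    addressable = 2 ** bits          # bytes reachable with this address width
    print(bits, "bits ->", addressable // 1024, "KB")
```

Sixteen bits give 64 KB, while the 8088's 20-bit addressing reaches 1,024 KB, i.e. 1 MB.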

In August 1981 the new computer, called the IBM PC, was officially introduced to the public, and soon afterward it gained great popularity among users. Within a couple of years the IBM PC took the lead in the market, displacing 8-bit machines.

IBM PC
The secret of the IBM PC's popularity is that IBM did not make its computer a single sealed device and did not protect its design with patents. Instead, it assembled the computer from independently manufactured parts and kept neither the specifications of those parts nor the way they were connected secret: the design principles of the IBM PC were available to everyone. This approach, called the principle of open architecture, ensured the IBM PC's tremendous success, though it deprived IBM of the chance to enjoy the fruits of that success alone. This is how the openness of the IBM PC architecture influenced the development of personal computers.

The prospects and popularity of the IBM PC made the manufacture of components and accessories for it very attractive, and competition among manufacturers drove down the cost of components and devices. Very soon, many firms were no longer content with the role of component suppliers and began assembling computers compatible with the IBM PC. Since these firms did not bear IBM's enormous costs of research and of maintaining a huge corporate structure, they could sell their computers much cheaper (sometimes 2-3 times) than comparable IBM machines.

IBM PC-compatible computers were at first contemptuously called "clones," but the nickname did not stick, since many manufacturers of compatibles began to implement technical advances faster than IBM itself. Users gained the ability to upgrade their computers independently and to equip them with additional devices from hundreds of different manufacturers.

Personal computers of the future

The basis of the computers of the future will not be silicon transistors, in which information is carried by electrons, but optical systems. The information carrier will be photons, which are lighter and faster than electrons. As a result, the computer will become cheaper and more compact; most importantly, optoelectronic computing is much faster than today's, so the computer will be far more powerful.

The PC will be small yet have the power of today's supercomputers. It will become a repository of information covering all aspects of our daily life, and it will not be tied to the electrical grid. It will be protected from thieves by a biometric scanner that recognizes its owner's fingerprint.

Voice will be the main way of communicating with the computer. The desktop computer will turn into an all-in-one unit, or rather into a giant computer screen: an interactive photonic display. A keyboard will not be needed, since all actions can be performed with a touch of the finger; but for those who prefer a keyboard, a virtual one can be created on the screen at any time and removed when it is no longer needed.

The computer will become the operating system of the home, and the house will begin to respond to its owner's needs, knowing his preferences (making coffee at 7 o'clock, playing his favorite music, recording the desired TV program, adjusting temperature and humidity, and so on).

Screen size will no longer matter in the computers of the future: a screen can be as large as a desktop or quite small. Large computer screens will be based on photonically excited liquid crystals, which will consume far less power than today's LCD monitors; colors will be vivid and images accurate (plasma displays are also possible). In fact, today's concept of "resolution" will largely lose its meaning.
