History of the creation and development of home appliances and computing technology


“My phone rang...” I’m sure none of us today can imagine life without communications. We forget our phone at home and rush back for it; we can’t find it in a bag or briefcase and get upset. Who brought into our lives this unique technology that connects people across distances?


Is it possible to communicate without a phone?

Of course you can! People lived before there were any newfangled telephone models, yet they passed information to one another far beyond the borders of their home settlements. The need for communication forced people to come up with different ways to call out to comrades located several kilometers away and tell them the news. How did they do it?


By that time, the first attempts had already been made to create a telegraph capable of transmitting signals over long distances using electricity. The foundations of electrical engineering were laid by the scientists Galvani and Volta, and the Russians Schilling and Jacobi made their contribution by inventing transmission codes and an apparatus that converted signals into text.

A little later, in 1837, thanks to the American inventor Morse, the electric telegraph appeared, along with a special code system of dots and dashes widely known to everyone as “Morse code.”
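Morse’s system simply maps each letter to a sequence of dots and dashes, and a message is those sequences strung together. A minimal sketch of the idea in code (the table below is deliberately partial, covering only the letters needed for the demonstration):

```python
# Partial International Morse Code table -- just enough for the demo.
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "H": "....", "L": ".-..", "P": ".--.",
}

def encode(text):
    """Encode a word as Morse code, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper())

print(encode("SOS"))    # ... --- ...
print(encode("HELP"))   # .... . .-.. .--.
```

The famous distress signal SOS was chosen precisely because its Morse form is so easy to recognize: three dots, three dashes, three dots.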

But even this was not enough for the scientists of those centuries. They dreamed that it would be possible not only to receive dry text over wires, but also to speak over them!

This is interesting! Archaeologists in Peru discovered two gourds connected by a rope and concluded that this structure is a thousand-year-old ancestor of the telephone. Indeed, it is very similar to the two matchboxes connected by a thread that we used to “call” each other with in childhood.

Who invented it first?

The history of the telephone is associated with Alexander Bell from America. But he was not the only one actively pursuing the idea of transmitting the human voice over a distance. Let’s take a brief look through the pages of history and see how far the invention traveled in the first stages of its birth.

Italian Antonio Meucci

In 1860, Antonio Meucci, a native of Italy, showed Americans a device that could transmit sound over a wire. However, he filed a patent application only in 1871, and to all his questions about the fate of the documents, the company that took them answered that they had been lost.

German Philipp Reis

In 1861, the German physicist Philipp Reis introduced to the public an electrical apparatus capable of transmitting sound. It was he, by the way, who coined the name “telephone” that we are accustomed to hearing today, translated from Greek as “sound from afar.”

His transmitter was a hollow box with two openings: a sound hole in the front and a membrane-covered one on top. But the quality of sound transmission in Reis’s telephone was so low that it was impossible to make out anything, so his invention was not accepted by those around him.

Americans Gray and Bell

Only 15 years later, two American designers, Gray and Bell, working completely independently of each other, discovered how a metal membrane with the help of a magnet, like the eardrum of our ear, can convert sound into an electrical signal and transmit it.

Why did Bell get all the laurels of fame? It's simple! On February 14, 1876, he submitted his application to patent the invention he discovered - the “talking telegraph” - a couple of hours earlier than Gray did.

I can imagine how upset Gray was.

Bell presented the telephone at a technical exhibition in Philadelphia.

The new device had no bell; the subscriber was called with an attached whistle, and a single handset both received and transmitted speech. The first telephones had to generate electricity themselves, so the telephone line only worked over a distance of up to 500 meters.

This is interesting! In 2002, the American Congress made a decision that turned the telephone world upside down: it recognized the Italian Meucci as the true inventor of the telephone.

Evolution of the phone

Since the first telephone was presented to the public, inventors and designers have put a lot of effort into making a modern means of communication out of a primitive device.

Thus, engineers replaced the whistle for calling a subscriber with an electric bell. In 1876, a switchboard was invented that could connect not just two, but several telephones to each other.

A year later, the inventor Edison contributed to the development of the telephone: his induction coil increased the distance of sound transmission, and his carbon microphone, which improved the quality of communication, remained in use until the end of the 20th century. At the same time, in 1877, the first telephone exchange appeared in America; telephone operators there connected callers to the desired number using plugs.

Thanks to the contribution of the Russian inventor Golubitsky, centrally powered stations were able to serve tens of thousands of subscribers. What is noteworthy is that the first telephone conversation in Russia took place three years after the advent of the telephone, and in 1898 the first intercity line was built between Moscow and St. Petersburg.

This is interesting! The first telephones were not very convenient. It was difficult to hear through them, so special tubes of different sizes and shapes were devised, into which you practically had to stick your nose so that the subscriber could make out what was being said. At first they were made separately: one to speak into, the other to listen from. Then they began to be joined by a handle, like a modern telephone receiver. Telephone sets were made of ivory, mahogany and cast metal, and the bell cups were chromed to a shine. But one thing remained unchanged: the body, the receiver, and the lever on which it was hung after the conversation.

By leaps and bounds towards modernity

The inventive world did not stop there. Having received a telephone at home, people wanted to use a modern means of communication on the street, in transport, and communicate on the way to work or home.

Such communication, untethered from a building, was initially available only to the special services, but walkie-talkies — “walk and talk” radios — became a tempting idea for ordinary users. Knowing the secrets of the device, craftsmen tried to connect such radio links to the telephone line. Thus, in the 1980s, radiotelephones appeared that operated at a distance of up to 300 meters.

But the main achievement of recent decades has undoubtedly been cellular communication, in which the signal is handed from one station to another.

The modern cell phone appeared in 1973 at Motorola. Their firstborn worked for no more than 20 minutes without recharging, was the size of a brick, and weighed a full 794 grams!

Such are our modern “mobile phones”: small and compact, capable of taking photographs, sending mail and messages, playing music and even thinking for their owner! They have become real helpers for children and their parents — you can always call and find out how they are doing!

This is interesting! Singapore resident En Yang can write SMS the fastest — he needs a little more than 40 seconds to type a 160-character message!


Now you know everything about the appearance of the telephone. Make a report and tell your friends, they will be interested! And I say goodbye to you, but don’t forget to look into new projects and stay in touch!

Good luck in your studies!

Evgenia Klimkovich.

Remember the fairy tale about the silver saucer and the juicy apple? The idea of transmitting moving images over long distances captivated people even in ancient times, but only at the end of the 19th century did humanity manage to realize the idea and invent the progenitor of the modern television.


Who was its creator? It is quite difficult to answer this question, since many scientists around the world took part in the development and evolution of television.

Who was the inventor of the first mechanical television?

The history of the first television receivers began with the German technician Paul Nipkow, who in 1884 invented a special device — the Nipkow disk — capable of scanning images line by line. In 1895, the German physicist Karl Braun created the first kinescope, better known as the “Braun tube.”

The scientist considered his creation unsuccessful and put it aside for 11 long years, but in 1906 his student Max Dieckmann received a patent for the tube and used his teacher’s discovery to transmit the picture. A year later, he showed the world a television receiver with a small screen 3 by 3 centimeters and a scanning frequency of 10 frames per second.

In the mid-1920s, the British engineer John Logie Baird made an invaluable contribution to the development of modern television. Using the Nipkow disk, he invented a mechanical television receiver that worked without sound but gave a fairly clear picture, obtained by decomposing the image into elements.


At the same time, the scientist created the Baird Corporation, which for a long time was the only global manufacturer of television equipment.

Who invented the electronic TV?

The first electronic television receiver was based on the developments of the Russian physicist Boris Rosing. In 1907, he inserted a cathode ray tube into the receiving apparatus and obtained a static television picture of geometric figures. His work was continued by another Russian engineer, Vladimir Zworykin. After the revolution he left for America, and in 1923 he patented a unique invention: a television operating entirely on electronic principles.

Subsequently, Zworykin managed to come up with the so-called iconoscope, thanks to which electronic televisions entered mass production. In 1927, regular television broadcasting began in the United States, and in subsequent years, Great Britain, Germany and other European countries began to connect to television. Initially, the image had an optical-mechanical scan, but by the mid-1930s they began broadcasting in the VHF band using the electronic principle.

Residents of the Soviet Union were able to watch television in 1939. The first television receiver was produced by the Comintern plant in Leningrad, which in 1932 released a device that worked on a Nipkow disk. The device was a set-top box with a 3-by-4-centimeter screen that had to be connected to a radio receiver.


What’s interesting is that anyone could later build such a television with their own hands by following the instructions in the magazine Radiofront. The device required switching the radio to another frequency and made it possible to watch programs broadcast by European countries.

Who invented color television?

Attempts to transmit color images were made back in the era of mechanical television receivers. One of the first to present his developments in this area was the Soviet engineer Hovhannes Adamyan, who in 1908 patented a two-color device for transmitting signals.

The recognized inventor of color television was John Logie Baird, the author of the mechanical receiver. In 1928, he assembled a device that could sequentially transmit three images through blue, green and red filters.

A real breakthrough in the development of color television occurred only after the Second World War. When the United States lost the opportunity to make money on defense orders, it switched to civilian production and began to use decimeter waves to transmit images.


In 1940, American scientists presented the Triniscope system, in which the images of three picture tubes were combined with different colors of phosphor glow. In the Soviet Union, developments of a similar nature appeared in 1951, and a year later, Soviet television viewers were able to see the first test broadcast in color.

The beginning

A calculator and a computer are far from the only devices with which you can carry out calculations. Humanity began to think quite early on how to make the processes of division, multiplication, subtraction and addition easier for itself. One of the first such devices can be considered balance scales, which appeared in the fifth millennium BC. However, let's not dive so far into the depths of history.

Andy Grove, Robert Noyce and Gordon Moore. (wikipedia.org)

The abacus, the ancestor of our familiar counting frame, was born around 500 BC. Ancient Greece, India, China and the Inca state can all compete for the right to be considered its homeland. Archaeologists suspect that computing mechanisms existed even in ancient cities, although their existence has not yet been proven. However, the Antikythera mechanism, which we already mentioned in the previous article, may well be considered a computational mechanism.

With the advent of the Middle Ages, the skills to create such devices were lost. Those dark times were generally a period of sharp decline in science. But in the 17th century, humanity again began to think about computing machines. And they were not slow to appear.

The first computers

Creating a device that could perform calculations was the dream of the German astronomer and mathematician Wilhelm Schickard. He had many different projects, but most of them failed. Schickard was not embarrassed by failures, and he eventually achieved success. In 1623, the mathematician designed the “Counting Clock” - an incredibly complex and cumbersome mechanism, which, however, could perform simple calculations.

"Schickard's counting clock." Drawing. (wikipedia.org)

The “counting clock” was of considerable size and weight, which made it difficult to use in practice. Schickard’s friend, the famous astronomer Johannes Kepler, jokingly noted that it was much easier to do the calculations in one’s head than to use the clock. Nevertheless, it was Kepler who became the first user of Schickard’s clock; it is known that he carried out many of his calculations with its help.

Johannes Kepler. (wikipedia.org)

The device got its name because it was based on the same mechanism that drove wall clocks, and Schickard himself can be considered the “father” of the calculator. Twenty years passed, and the family of computing machines was expanded by the invention of the French mathematician, physicist and philosopher Blaise Pascal. The scientist presented the “Pascaline” in 1643.

Pascal's adding machine. (wikipedia.org)

Pascal was then 20 years old; he made the device for his father, a tax collector who had to deal with very complex calculations. The adding machine was driven by gears: to enter a number, you had to turn the wheels a certain number of times.

Thirty years later, in 1673, the German mathematician Gottfried Leibniz created his project. His device was the first in history to be called a calculator. The principle of operation was the same as that of Pascal's machine.

Gottfried Leibniz. (wikipedia.org)

There is one very interesting story connected with Leibniz’s calculator. At the beginning of the 18th century, the machine was seen by Peter I, who was visiting Europe as part of the Grand Embassy. The future emperor was very interested in the device and even bought it. Legend has it that Peter later sent the calculator to the Kangxi Emperor of China as a gift.

From calculator to computer

The work of Pascal and Leibniz was developed further. In the 18th century, many scientists attempted to improve computing machines; the main goal was to create a commercially successful device. Success ultimately came to the Frenchman Charles Xavier Thomas de Colmar.

Charles Xavier Thomas de Colmar. (wikipedia.org)

In 1820, he launched mass production of calculating instruments. Strictly speaking, Colmar was more a skilled industrialist than an inventor: his “Thomas machine” was not much different from Leibniz’s calculator, and Colmar was even accused of stealing someone else’s invention and trying to make a fortune from someone else’s labor.

In Russia, serial production of calculating machines began in 1890. The calculator acquired its current form in the twentieth century, and in the 1960s–1970s the industry experienced a real boom: the devices were improved every year. In 1965, for example, a calculator appeared that could calculate logarithms, and in 1970 the first calculator that fit in the hand was released. But by then the computer age had already begun, although humanity had not yet had time to feel it.

Computers

Many consider the French weaver Joseph Marie Jacquard to be the person who laid the foundations of computer technology. It’s hard to say whether this is a joke or not. However, it was Jacquard who invented the punched card. In those days people did not yet know what a memory card was; Jacquard’s invention may well claim that title. The weaver devised it to control a loom: a punched card encoded the pattern for the fabric, so that once the card was loaded, the pattern was applied without human intervention — automatically.

Punch card. (wikipedia.org)

Jacquard’s punched card, of course, was not an electronic device. The appearance of such devices was still very far off, for Jacquard lived at the turn of the 18th and 19th centuries. However, punched cards later became widely used in other areas, going far beyond the famous loom.

In 1835, Charles Babbage described an analytical engine that could be based on punched cards. Its key operating principle was programming; thus, the English mathematician predicted the appearance of the computer. Alas, Babbage himself was never able to build the machine he invented. The world’s first analog computer was born in 1927. It was created by Massachusetts Institute of Technology professor Vannevar Bush.

Vannevar Bush. (wikipedia.org)

The machine could solve differential equations. The next step was taken by the German engineer Konrad Zuse, who managed to model and build the first programmable computer.

US Treasury employees at computers. (wikipedia.org)

It went down in history as the Z1, and many call it the first computer. However, the Z1 had little in common with modern computers; if anything, the Z3 should be considered the first such device, since it actually had many of the features of today’s computers. Building on his first invention, Zuse went on to design new models. As for the Z1 itself, it suffered a sad fate.

Z3 by Konrad Zuse. (wikipedia.org)

The machine was destroyed during one of the bombings of Berlin in 1945, and Zuse’s drawings burned with it.

The age of mass production of computers and similar computing devices began after the war. In 1968, Intel was founded, which was to revolutionize this area. But this, however, is a completely different story.

Technical devices have become firmly embedded in the lives of modern people, who can no longer imagine solving everyday problems without them. Silent vacuum cleaners, refrigerators with built-in TVs, steam generators instead of irons, and microwave ovens are the realities of our life.

These household appliances are no longer a dream or fantasy; they are present in the life of any person. Let's plunge into the past and ask how these inventions were created and what they looked like in the past.

In 1870, the first mechanical whisk was developed. It took about 50 years for this invention to take the form of the mixer, which went on mass sale in America in 1910. Like any technical novelty, it was very expensive — about $3,000 — and it weighed about 30 kilograms. For these reasons it was not in great demand.

In 1782, the first manual washing machine was developed. It was operated with a special handle, and the electric washing machines we are so accustomed to appeared in 1906.

In 1922, another miracle of technology appeared: the blender. Initially it was used to mix water and syrup with carbon dioxide. Thirteen years later, in 1935, the world saw blenders that could puree and chop.

The first coffee maker was invented in 1806, and it was even equipped with filters. The basic principle of its operation was as follows: a little ground coffee was placed in a metal sieve, and boiling water was poured over it.

Percy Spencer is the creator of the well-known microwave oven. Spencer worked all his life in a laboratory, and one day he noticed an interesting fact: when one of the laboratory assistants approached a magnetron, all the metal objects on his clothes began to heat up, and a chocolate bar in his pocket began to melt. After a great number of complex experiments, a metal box with a magnetron mounted inside was developed; its purpose was to heat food. In 1945, Percy Spencer received a patent for his invention, and by 1947 the first models of the device went on sale. In those days a “microwave” weighed 340 kilograms and stood 175 cm tall.

As soon as man discovered the concept of “quantity,” he began to look for tools that would optimize and facilitate counting. Today, super-powerful computers, based on the principles of mathematical calculation, process, store and transmit information — the most important resource and engine of human progress. It is not difficult to get an idea of how computing technology developed by briefly considering the main stages of this process.

The main stages of the development of computer technology

The most popular classification proposes to highlight the main stages of the development of computer technology on a chronological basis:

  • Manual stage. It began at the dawn of human history and continued until the middle of the 17th century. During this period the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus, counting boards, and later the slide rule) that made digit-by-digit calculation possible.
  • Mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of scientific development during this period made it possible to create mechanical devices that performed basic arithmetic operations and automatically carried over the higher digits.
  • The electromechanical stage is the shortest in the history of computing technology, lasting only about 60 years: from the invention of the first tabulator in 1887 to 1946, when the very first computer (ENIAC) appeared. New machines, based on electric drives and electric relays, made it possible to perform calculations with much greater speed and accuracy, but the counting process still had to be controlled by a human.
  • The electronic stage began in the second half of the last century and continues today. This is the story of six generations of electronic computers - from the very first giant units, which were based on vacuum tubes, to the ultra-powerful modern supercomputers with a huge number of parallel working processors, capable of simultaneously executing many commands.

The stages of development of computer technology are divided according to a chronological principle rather arbitrarily. At a time when some types of computers were in use, the prerequisites for the emergence of the following were actively being created.

The very first counting devices

The earliest counting tool known to the history of the development of computer technology is the ten fingers on human hands. Counting results were initially recorded using fingers, notches on wood and stone, special sticks, and knots.

With the advent of writing, various ways of writing numbers appeared and developed, and positional number systems were invented (decimal in India, sexagesimal in Babylon).

Around the 4th century BC, the ancient Greeks began to count using an abacus. Initially, it was a clay flat tablet with stripes applied to it with a sharp object. Counting was carried out by placing small stones or other small objects on these stripes in a certain order.

In China, in the 4th century AD, a seven-bead abacus appeared: the suanpan. Wires or ropes — nine or more — were stretched across a rectangular wooden frame. Another wire (rope), stretched perpendicular to the others, divided the suanpan into two unequal parts. In the larger compartment, called “earth,” five beads were strung on each wire; in the smaller one, called “sky,” there were two. Each wire corresponded to a decimal place.
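The suanpan’s layout maps neatly onto positional notation: by the usual reading convention (an assumption here, since the text doesn’t state bead values), each moved “sky” bead counts as 5 and each moved “earth” bead as 1, while each rod is one decimal place. A small sketch:

```python
def suanpan_digit(heaven, earth):
    """Value of one suanpan rod: each 'heaven' (sky) bead counts 5,
    each 'earth' bead counts 1 (the conventional reading)."""
    return 5 * heaven + earth

def suanpan_value(rods):
    """Read rods from most- to least-significant digit;
    each rod is a (heaven, earth) pair of moved beads."""
    value = 0
    for heaven, earth in rods:
        value = value * 10 + suanpan_digit(heaven, earth)
    return value

# Two rods: digit 2 (0*5 + 2), then digit 8 (1*5 + 3)
print(suanpan_value([(0, 2), (1, 3)]))   # 28
```

The perpendicular divider thus plays the same role as the carry structure in later mechanical calculators: it keeps the 5s and 1s of each decimal place separate.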

The traditional soroban abacus became popular in Japan in the 16th century, having arrived there from China. Around the same time, the abacus appeared in Russia.

In the 17th century, building on the logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmund Gunter invented the slide rule. This device was constantly improved and has survived to this day. It allows you to multiply and divide numbers, raise to powers, and determine logarithms and trigonometric functions.
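The slide rule works because of the identity log(a·b) = log(a) + log(b): sliding the scales adds lengths proportional to logarithms, turning multiplication into addition. The same principle in a minimal sketch:

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does:
    add their logarithms, then take the antilogarithm."""
    return 10 ** (math.log10(a) + math.log10(b))

# 2 x 8 = 16 (up to floating-point rounding)
print(slide_rule_multiply(2, 8))
```

Division works the same way with subtraction of logarithms, which is why one instrument could handle both operations with a single pair of sliding scales.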

The slide rule became a device that completed the development of computer technology at the manual (pre-mechanical) stage.

The first mechanical calculating devices

In 1623, the German scientist Wilhelm Schickard created the first mechanical "calculator", which he called a counting clock. The mechanism of this device resembled an ordinary clock, consisting of gears and sprockets. However, this invention became known only in the middle of the last century.

A quantum leap in the field of computing technology was the invention of the Pascalina adding machine in 1642. Its creator, French mathematician Blaise Pascal, began work on this device when he was not even 20 years old. "Pascalina" was a mechanical device in the form of a box with a large number of interconnected gears. The numbers that needed to be added were entered into the machine by turning special wheels.

In 1673, the Saxon mathematician and philosopher Gottfried von Leibniz invented a machine that performed the four basic mathematical operations and could extract the square root. The principle of its operation was based on the binary number system, specially invented by the scientist.

In 1818, the Frenchman Charles (Karl) Xavier Thomas de Colmar, taking Leibniz’s ideas as a basis, invented an adding machine that could multiply and divide. Two years later, the Englishman Charles Babbage began constructing a machine capable of performing calculations to an accuracy of 20 decimal places. That project remained unfinished, but in 1830 its author developed another: an analytical engine for accurate scientific and technical calculations. The machine was to be controlled by a program, with perforated cards of differently placed holes used to input and output information. Babbage’s project anticipated electronic computing technology and the problems it would be able to solve.

It is noteworthy that the fame of being the world’s first programmer belongs to a woman: Lady Ada Lovelace (née Byron). It was she who created the first programs for Babbage’s computer. One of the computer languages was subsequently named after her.

Development of the first computer analogues

In 1887, the history of computing technology entered a new stage. The American engineer Herman Hollerith designed the first electromechanical calculating machine: the tabulator. Its mechanism included a relay, counters and a special sorting box. The device read and sorted statistical records made on punched cards. Subsequently, the company founded by Hollerith became the backbone of the world-famous computer giant IBM.

In 1930, the American Vannevar Bush created a differential analyzer. It was powered by electricity, and vacuum tubes were used to store data. This machine could quickly find solutions to complex mathematical problems.

Six years later, the English scientist Alan Turing developed the concept of an abstract machine that became the theoretical basis of modern computers. It had all the main properties of modern computing technology: it could step through operations programmed in its internal memory.
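Turing’s idea can be illustrated with a toy machine: a tape of symbols, a head, and a fixed table of rules applied one step at a time. This hypothetical example (my illustration, not Turing’s original construction) increments a binary number by scanning from the right, following two simple rules:

```python
def tm_increment(bits):
    """A tiny Turing-machine-style routine: increment a binary number.
    Scan from the rightmost cell, flipping 1s to 0s (carrying left)
    until a 0 -- or the blank left of the tape -- can be set to 1."""
    tape = list(bits)
    head = len(tape) - 1
    while head >= 0 and tape[head] == "1":
        tape[head] = "0"        # rule: (carry, 1) -> write 0, move left
        head -= 1
    if head >= 0:
        tape[head] = "1"        # rule: (carry, 0) -> write 1, halt
    else:
        tape.insert(0, "1")     # ran off the tape: extend it with a new cell
    return "".join(tape)

print(tm_increment("1011"))  # 1100
print(tm_increment("111"))   # 1000
```

However trivial the rules look, Turing showed that a single machine of this kind, given the right rule table, can carry out any computation at all — which is exactly what makes it the theoretical model of the computer.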

A year after this, George Stibitz, a scientist from the United States, invented the country’s first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra — the mathematical logic created in the mid-19th century by George Boole, using the logical operators AND, OR and NOT. Later, the binary adder would become an integral part of the digital computer.
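Binary addition really can be built from exactly the three operators the text names. A one-bit full adder sketched in code (XOR is composed from AND, OR and NOT rather than taken as a primitive):

```python
def XOR(a, b):
    # x xor y = (x OR y) AND NOT (x AND y)
    return (a or b) and not (a and b)

def full_adder(a, b, carry_in):
    """Add three bits using only AND/OR/NOT logic;
    return (sum_bit, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = (a and b) or (carry_in and XOR(a, b))
    return int(s), int(carry_out)

# 1 + 1 with no incoming carry -> sum 0, carry 1
print(full_adder(1, 1, 0))   # (0, 1)
```

Chaining such adders bit by bit, feeding each stage’s carry into the next, adds numbers of any width — which is precisely why the binary adder became a building block of the digital computer.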

In 1938, Claude Shannon, then at the Massachusetts Institute of Technology, outlined the principles of the logical design of a computer that uses electrical circuits to solve Boolean algebra problems.

The beginning of the computer era

The governments of the countries involved in World War II were aware of the strategic role of computing in the conduct of military operations. This was the impetus for the development and parallel emergence of the first generation of computers in these countries.

A pioneer in the field of computer engineering was Konrad Zuse, a German engineer. In 1941, he created the first computer controlled by a program. The machine, called the Z3, was built on telephone relays, and programs for it were encoded on perforated tape. This device was able to work in the binary system, as well as operate with floating point numbers.

The next model of Zuse’s machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

In 1942, the American researchers John Atanasoff and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

In 1943, in a British government laboratory, in an atmosphere of secrecy, the first computer, called “Colossus,” was built. Instead of electromechanical relays, it used 2,000 vacuum tubes for storing and processing information. It was intended to crack and decrypt secret messages produced by German cipher machines, which were widely used by the Wehrmacht. The existence of this device was kept in the strictest confidence for a long time; after the end of the war, the order for its destruction was signed personally by Winston Churchill.

Architecture development

In 1945, the Hungarian-American mathematician John von Neumann (born János Lajos Neumann) created the prototype of the architecture of modern computers. He proposed writing a program in the form of code directly into the machine’s memory, implying joint storage of programs and data in the computer’s memory.

Von Neumann’s architecture formed the basis of ENIAC, the first universal electronic computer, then being created in the United States. This giant weighed about 30 tons and occupied 170 square meters. The machine used 18,000 vacuum tubes and could perform 300 multiplications or 5,000 additions per second.

Europe's first universal programmable computer was created in 1950 in the Soviet Union, in Kyiv, Ukraine. A group of scientists led by Sergei Alekseevich Lebedev designed the MESM (Small Electronic Calculating Machine). Its speed was 50 operations per second, and it contained about 6 thousand vacuum tubes.

In 1952, domestic computer technology gained the BESM (Large Electronic Calculating Machine), also developed under Lebedev's leadership. This computer performed up to 10 thousand operations per second and was at the time the fastest in Europe. Data was entered into the machine's memory from punched paper tape, and results were output by photographic printing.

During the same period, a series of large computers called "Strela" was produced in the USSR (the lead developer was Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal "Ural" computer began in Penza under the leadership of Bashir Rameev. Later Ural models were hardware- and software-compatible with one another and offered a wide selection of peripheral devices, allowing machines of various configurations to be assembled.

Transistors. Release of the first serial computers

However, vacuum tubes failed very quickly, making the machines difficult to operate. The transistor, invented in 1947, solved this problem. Exploiting the electrical properties of semiconductors, it performed the same tasks as vacuum tubes while occupying far less space and consuming far less energy. Together with the advent of ferrite-core memory, the use of transistors made it possible to shrink machines significantly and make them more reliable and faster.

In 1954, the American company Texas Instruments began mass-producing transistors, and two years later the TX-0, the first transistor-based second-generation computer, appeared in Massachusetts.

By the middle of the last century, many government organizations and large companies were using computers for scientific, financial, and engineering calculations and for working with large amounts of data. Computers gradually acquired features familiar to us today: during this period, plotters, printers, and storage media on magnetic disks and tape appeared.

The active use of computer technology expanded its areas of application and demanded new software technologies. High-level programming languages appeared (Fortran, Cobol, and others) that made it possible to transfer programs from one machine to another and simplified the process of writing code. Special translator programs appeared that convert code written in these languages into commands the machine can execute directly.
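
The role of such a translator can be illustrated with a toy sketch in modern code. It has no relation to any real Fortran or Cobol compiler: it simply turns a human-readable arithmetic expression into low-level stack-machine commands and then executes them.

```python
# Toy "translator": compile a high-level arithmetic expression into
# low-level stack-machine instructions, then interpret those instructions.
# Operators are applied strictly left to right (no precedence rules).

def translate(expr):
    """Compile expressions like '2 + 3 * 4' into stack-machine commands."""
    tokens = expr.split()
    code = [("PUSH", int(tokens[0]))]
    for i in range(1, len(tokens), 2):
        op, operand = tokens[i], int(tokens[i + 1])
        code.append(("PUSH", operand))
        code.append(("ADD",) if op == "+" else ("MUL",))
    return code

def execute(code):
    """Interpret the low-level commands on a stack, as a machine would."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        elif instr[0] == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif instr[0] == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[0]

program = translate("2 + 3 * 4")  # left to right here: (2 + 3) * 4
print(execute(program))           # prints 20
```

The programmer writes the readable first form; the machine sees only the PUSH/ADD/MUL commands. Real compilers of the era did the same kind of conversion, only from far richer languages into actual machine instructions.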

The emergence of integrated circuits

In 1958-1960, thanks to the American engineers Jack Kilby and Robert Noyce, the world learned of integrated circuits: miniature transistors and other components (at first a handful, eventually hundreds and thousands of them) mounted on a single crystal of germanium or silicon. The chips, around a centimeter in size, were much faster than discrete-transistor designs and consumed far less power. The history of the development of computer technology links their appearance with the emergence of the third generation of computers.

In 1964, IBM released the first computers of the System/360 family, which were based on integrated circuits. Mass production of computers is usually dated from this point; in total, more than 20 thousand machines of this family were produced.

In 1972, production of the ES (Unified System) computers began in the USSR. These were standardized complexes for computer centers with a common command set, based on the American IBM System/360.

Earlier, in 1965, DEC had released the PDP-8 minicomputer, the first commercially successful project in this area. The relatively low cost of minicomputers made it possible for small organizations to use them.

Software also improved constantly during this period. Operating systems were developed to support the maximum number of external devices, and new programs appeared. In 1964, BASIC was created, a language designed specifically for teaching novice programmers. In 1970, Pascal appeared, which proved very convenient for solving many applied problems.

Personal computers

After 1970, production of the fourth generation of computers began, characterized by the introduction of large-scale integrated circuits. Such machines could now perform thousands of millions of computational operations per second, and their RAM capacity grew to 500 million bits. The falling cost of microcomputers gradually put them within reach of the average person.

Apple was among the first manufacturers of personal computers. Its founders, Steve Jobs and Steve Wozniak, designed their first PC model in 1976 and named it the Apple I; it sold for $666.66. A year later the company presented its next model, the Apple II.

For the first time, a computer resembled a household appliance: besides its compact size, it had an elegant design and a user-friendly interface. The spread of personal computers at the end of the 1970s noticeably reduced demand for mainframes. This seriously worried their principal manufacturer, IBM, and in 1979 the company began developing a personal computer of its own.

Two years later, in 1981, IBM's first microcomputer with an open architecture appeared, based on Intel's 16-bit 8088 microprocessor. The computer was equipped with a monochrome display, two drives for five-inch floppy disks, and 64 kilobytes of RAM. At IBM's request, Microsoft developed the operating system for this machine. Numerous IBM PC clones soon appeared on the market, stimulating the industrial production of personal computers.

In 1984, Apple developed and released a new computer, the Macintosh. Its operating system was extremely user-friendly: commands were presented as graphic images and could be entered with a mouse. This made the computer even more accessible, since no special skills were now required of the user.

Some sources date the fifth generation of computing technology to 1992-2013. Briefly, its main concept is this: computers built on highly complex microprocessors with a parallel-vector structure, able to execute dozens of program instructions simultaneously. Machines with several hundred processors working in parallel allow data to be processed faster and more accurately and make efficient networks possible.

The development of modern computer technology already allows us to speak of sixth-generation computers: electronic and optoelectronic machines running on tens of thousands of microprocessors, characterized by massive parallelism and by architectures modeled on biological neural systems, which lets them successfully recognize complex images.

Looking back over all the stages in the development of computing, one interesting fact stands out: inventions that proved themselves at each stage have survived to this day and continue to be used successfully.

Classes of Computing Equipment

There are various options for classifying computers.

By purpose, computers are divided into:

  • general-purpose (universal) computers, capable of solving a wide variety of mathematical, economic, engineering, scientific, and other problems;
  • problem-oriented computers, which solve a narrower range of problems, usually connected with controlling particular processes (recording data, accumulating and processing small amounts of information, performing calculations according to simple algorithms); they have more limited software and hardware resources than the first group;
  • specialized computers, which solve strictly defined tasks; their highly specialized structure makes them, despite the relative simplicity of the device and its control, quite reliable and productive in their field. Examples include controllers and adapters that manage particular devices, as well as programmable microprocessors.

By size and computing power, modern electronic computing equipment is divided into:

  • supercomputers (ultra-large);
  • large computers (mainframes);
  • small computers (minicomputers);
  • microcomputers (ultra-small).

Thus, we have seen that the devices invented by man, first to keep track of resources and valuables and later to carry out complex calculations quickly and accurately, have been constantly developed and improved.
