A Complete History of Computers to the Present Day

History of Computer

In the earlier days of human history, counting probably played few roles. For primitive humans, the first realization of the need for counting probably arose from their loneliness and weakness against the many beasts around them. So the first counting system possibly started with just two quantities: one and many.

Communicating quantities by fingers must have come naturally because each hand has five fingers. "Three elephants are coming this way" could easily be conveyed to others by waving three fingers.

10,000 Years Ago

It is believed that agrarian society came into being around 11,000 years ago. The use of numbers then became much more than simply conveying quantities. Numbers became useful for explaining nature, astronomical happenings, the seasons, the heavenly bodies, and so on. This strengthened the human capacity to investigate and explain the world. The relationships among numbers became important, and awareness of them brought in the knowledge of mathematical science.

Effective written cardinal numbers took a long time to take root in human civilization. To start with, there were many systems, but most were cumbersome. Writing numbers down was possible but hardly helped mathematical manipulation. Almost every culture developed its own number system. One of the more complicated systems was that used by the Egyptians. In this system, 1 to 9 were represented by corresponding numbers of vertical lines, 10 was represented by a U or a circle, 100 by a coiled rope, 1,000 by a lotus, 10,000 by a pointed finger, 100,000 by a tadpole, and 1,000,000 by a picture of a man with his arms stretched towards heaven in astonishment. To represent 5,943,768, an Egyptian would draw 5 men standing in astonishment, 9 tadpoles, 4 pointed fingers, 3 lotuses, 7 coiled ropes, 6 circles, and 8 vertical lines. Adding two numbers represented in this way by any mathematical process would be unthinkable. The Greeks and Romans faced a similar problem; they also used letters for numbers. We are all aware of the Roman numerals I, II, III, IV, V, VI, VII, VIII, IX, and X for the first ten natural numbers. Since the symbols are unrelated to each other, this representation does not help in numeric calculations.
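The additive character of the Egyptian system can be made concrete with a short sketch: repeatedly divide by each power of ten and record how many of each symbol are needed. The symbol names here are informal labels taken from the description above, not a transliteration of actual hieroglyphs.

```python
# Decompose a number into Egyptian-style symbol counts (an additive,
# non-positional base-10 system). Symbol names follow the prose above.
SYMBOLS = [
    (1_000_000, "man in astonishment"),
    (100_000, "tadpole"),
    (10_000, "pointed finger"),
    (1_000, "lotus"),
    (100, "coiled rope"),
    (10, "circle"),
    (1, "vertical line"),
]

def to_egyptian(n):
    """Return (count, symbol) pairs whose values sum to n."""
    result = []
    for value, name in SYMBOLS:
        count, n = divmod(n, value)
        if count:
            result.append((count, name))
    return result

print(to_egyptian(5_943_768))
# → [(5, 'man in astonishment'), (9, 'tadpole'), (4, 'pointed finger'),
#    (3, 'lotus'), (7, 'coiled rope'), (6, 'circle'), (8, 'vertical line')]
```

The output matches the drawing described in the text: 5 men, 9 tadpoles, 4 fingers, 3 lotuses, 7 ropes, 6 circles, 8 lines.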

4000 Years Ago

Mechanical aids like stone pieces for keeping track of numbers and manipulating them arose out of necessity in human societies. Scratching a tick mark on the ground with a stick for each domesticated animal released for grazing, and crossing the marks off as the animals returned, could have been a procedure for keeping track of possessions. A clay tablet dated 2300-1600 BC, found at Senkereh in Babylonia and now kept in the British Museum, contains the squares of numbers up to 24. It is possible that this was used as a lookup table.

Abacus Computer

The use of stone counters was very common. The stones were naturally replaced by pebbles, and the abacus, the world's first digital computer, came into being. There are many stories about the abacus, and many societies built their own versions. Over the years the abacus underwent several modifications. As a computing tool it has been tremendously successful and is still in use in many societies. It has been so popular that the US Army arranged a contest as recently as 1946 between the Army's fastest mechanical adding machine operator and a Japanese clerk working with an abacus. The abacus operator won 4 of the 5 categories.

1000 Years Ago

It appears that India had developed a nine-digit number system between the first and second centuries. Later, Indian mathematicians invented 0 and added it to the system. This made possible a notation by which any number could be represented systematically. The date of this remarkable invention is not known. It is also believed that the Arabs had considerable trade relations with India. Indian astronomical tables that used the decimal system were taken to the Arab world around 773 and translated into Arabic.

900 Years Ago

For many years the Arabic system and notation were kept secret from the Europeans. Around the year 1120, a monk translated Al-jabr for the Europeans, and the Arabic system soon spread throughout Europe. In the course of time, the printing press was invented, and it helped the spread of knowledge tremendously. By 1448 there were books that explained arithmetic and the abacus, among other topics.

400 Years Ago

The abacus did its work well for addition and subtraction. However, it had a fairly cumbersome process for multiplication and division. The Scottish mathematician John Napier invented an easy arithmetic process to perform these tasks. He invented logarithms by studying the patterns made by the products of numbers multiplied by themselves, and produced his logarithm system in 1614. Henry Briggs later published logarithm tables, which mathematicians instantly accepted.
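Napier's key identity is that the logarithm of a product is the sum of the logarithms: log(ab) = log(a) + log(b). A computer (human or machine) could therefore look up two logs in a table, add them, and look up the antilog, replacing a hard multiplication with an easy addition. A minimal sketch of the idea, using arbitrary example values:

```python
import math

# Napier's insight: log(a*b) = log(a) + log(b), so a multiplication
# can be replaced by a table lookup, an addition, and an antilog.
a, b = 387.0, 29.5

product_via_logs = math.exp(math.log(a) + math.log(b))  # "antilog" of the sum
product_direct = a * b

# The two agree up to floating-point rounding error.
print(product_via_logs, product_direct)
```

In Napier's and Briggs's day the `log`/`exp` steps were table lookups rather than function calls, but the arithmetic shortcut is exactly this one.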

An English clergyman, William Oughtred, invented the slide rule in 1621. It had numbered scales marked on two sliding wooden pieces in such a manner that multiplication and division results could be found by sliding the two pieces and reading the numbers from the coinciding scale marks. In later years other features were added, and it became a symbol of engineers and mathematicians.

The slide rule did not count numbers the way the abacus did. It measured the physical property of distance, which was made analogous to numbers. A device that depends on such a physical analogy for calculation is called an analog computer. In contrast, an abacus actually counts, and so can claim to be a digital computer.

There were many efforts to build mechanical computers, and many came from mathematicians. Wilhelm Schickard, a German astronomer and mathematician, and a friend of the great astronomer Johannes Kepler, built a machine in 1623. Blaise Pascal, as a teenager, built a mechanical computing machine in the years 1642 to 1644 for his father, who was a tax collector. The machine was mainly an adder. Gottfried Wilhelm von Leibniz made a mechanical multiplier in 1673.

Binary numbers had been known for quite a long time, but they were hardly used. In the seventeenth century, Leibniz, the German philosopher and mathematician, took a great interest in them, though he did not use them in building his multiplier. His interest in binary numbers lay in the search for a "universal calculus" by which, he thought, all human reasoning could be reduced to a simple mathematical form. However, he never went beyond the philosophical use of the binary system.

200 Years Ago

The use of binary symbols for representing and reducing human reasoning came through George Boole, an English mathematician. In 1854, he published his work in a book called "Laws of Thought", in which he explained formal logic in terms of unambiguous mathematical symbols. Computer hardware architecture had its beginning some years before Boole's publication, in the work of Charles Babbage, another English mathematician. In 1821 Babbage announced that he would develop an automatic calculating machine to solve polynomial equations. He named this machine the "Difference Engine". The machine could not be built in spite of Government grants, partly because the precision engineering it required was not available. In 1833 the project was suspended.
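Boole's contribution was to make logical reasoning mechanically checkable: statements take one of two values, and laws of reasoning become algebraic identities over those values. As an illustrative sketch (not Boole's own notation), De Morgan's laws, identities of the kind his algebra formalized, can be verified exhaustively over both truth values:

```python
from itertools import product

# Boolean algebra treats truth values as a two-element algebra, so a
# logical law can be checked by trying every combination of inputs.
# Here we verify De Morgan's laws for all four (p, q) combinations.
for p, q in product([False, True], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))

print("De Morgan's laws hold for all truth values")
```

This exhaustive-checking style is exactly what makes two-valued logic suitable for machinery: a circuit, like this loop, only has to distinguish two states.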

Charles Babbage Analytical Engine

In spite of this failure, Babbage planned an elegant machine, which he called the "Analytical Engine". Unlike earlier attempts, this was a plan for a general-purpose computer, an idea conceived for the first time. The architecture of the machine incorporated the important elements that modern computers also have: a "store" (memory), a "mill" (central processor), and a means of incorporating patterns of actions (a program) by taking input through punched cards like those used in Jacquard looms. Babbage did not get proper attention and support from funding authorities or the Government, possibly because of his earlier failures and a certain eccentricity in his behavior. The only admiration and support Babbage got came from Ada, Countess of Lovelace, the daughter of Lord Byron. Both of them tried hard to make the project a success with their own money and effort. Babbage tried to raise money by developing and selling mechanical game-playing machines, and Ada supported the effort with her husband's family jewels. Unfortunately, the project did not succeed. Ada died in her thirties, and the project had to be abandoned. Babbage's work was forgotten for quite some time. In retrospect, Babbage is given the honor of "father of the computer", and Ada that of the "world's first programmer".

120 Years Ago

Many years later, a number of calculating machines were built in America. Commercially, the most successful was the automatic tabulating machine developed by Herman Hollerith, used in the 1890 census. These machines also took input from punched cards, as Babbage had planned. They were not really computers and could do nothing other than catalog huge amounts of data, but they were so successful that Hollerith was encouraged to start his own tabulating machine company. Hollerith's company was bought by a financier and merged with several other companies; in 1924, IBM (International Business Machines) was born out of this.

90 Years Ago

In 1936 and 1937, two papers provided the necessary theory and foundation for building general-purpose computers. Alan Turing, the famous English mathematician from Cambridge University, wrote a paper entitled "On Computable Numbers". It introduced the theory of an imaginary computer, now known as the Turing machine, and described how such a machine would work. He suggested that the machine would have a long tape passing through it, divided into segments, each marked with a slash or left blank. The machine was to perform only four operations: move the tape one segment left, move the tape one segment right, print a new slash in a blank segment, and erase an old slash. Turing claimed that such a machine would be able to do what any other computer could do. This was a remarkable thing to perceive: it meant that, if properly programmed, any computer could do the same work as any other computer.
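The four operations described above can be sketched in a few lines of code. This is an illustrative toy, not Turing's own formalism: slashes on the tape represent a number in unary, and an "add one" routine is built purely from the four primitives, moving right past the slashes and printing one more in the first blank segment.

```python
# A toy tape machine with only the four operations Turing described:
# move left, move right, print a slash in a blank segment, erase a slash.
class Tape:
    def __init__(self, cells):
        self.cells = dict(enumerate(cells))  # segment index -> "/" or " "
        self.head = 0                        # current segment under the head

    def read(self):
        return self.cells.get(self.head, " ")  # unvisited segments are blank

    def left(self):           # move the tape one segment left
        self.head -= 1

    def right(self):          # move the tape one segment right
        self.head += 1

    def print_slash(self):    # print a new slash in a blank segment
        self.cells[self.head] = "/"

    def erase(self):          # erase an old slash
        self.cells[self.head] = " "

def add_one(tape):
    """Unary increment: scan right past the slashes, print one more."""
    while tape.read() == "/":
        tape.right()
    tape.print_slash()

tape = Tape("///")            # the number 3 in unary
add_one(tape)
print("".join(tape.cells[i] for i in sorted(tape.cells)))  # → "////"
```

Even from primitives this spare, arbitrarily complex routines can be composed, which is the heart of Turing's universality claim.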

The second paper was actually the master's thesis of an MIT student, Claude Shannon, titled "A Symbolic Analysis of Relay and Switching Circuits". This paper provided the impetus to relate binary mathematics, logic, and computer circuits.

Turing was entrusted by the British Government with breaking the encrypted messages created by a secret German machine called "Enigma", a top-secret operation under the name "Ultra". By the end of 1943, the codebreakers at Bletchley Park had an electronic machine, designed by Tommy Flowers and named "Colossus", attacking German cipher traffic. Colossus is considered the first electronic computer in the world.

Almost at the same time, in America, IBM supported a machine-building project at Harvard called the "Harvard Mark I", under the leadership of Howard Aiken. This was completed in 1943. The machine worked on relays, not electronic switching circuits.

In 1943, another project, to develop a computer for the war department, was started at the University of Pennsylvania. The purpose of this machine was to calculate the trajectories of shells. Two researchers, John Mauchly and J. Presper Eckert, set out to build a vacuum-tube computer. The machine, named the Electronic Numerical Integrator And Computer (ENIAC), proved to be much faster than the Harvard Mark I. The birth of this machine is considered the start of a new era.

The first stored-program computers came into being in the form of the Electronic Discrete Variable Automatic Computer (EDVAC) in America and the Electronic Delay Storage Automatic Calculator (EDSAC) in Great Britain. The concept of the stored-program computer is usually attributed to the Hungarian mathematician John von Neumann, who also worked on the ENIAC project, though it is also claimed that the concept was discussed well before von Neumann at the Moore School.

70 Years Ago

By around the end of 1947, the transistor was invented at Bell Laboratories. It soon replaced the vacuum tube as a switching device: it was faster, smaller, consumed less power, and produced less heat. In the mid-fifties, transistors were used in building computers, making the machines smaller, cheaper, and more powerful.

Computers became even smaller with the advent of integrated circuits (ICs), also called microchips. The success of chips called for further efforts at miniaturization, and Large Scale Integration (LSI) and then Very Large Scale Integration (VLSI) technologies followed. VLSI made the computer-on-a-chip possible, which resulted in the overwhelming demand for personal computers. Today's personal computers have far more power than yesterday's supposedly powerful machines, occupy a small fraction of the space, and cost unbelievably less than their predecessors.

Modern Computers

The history of computers is not only that of hardware like the CPU, monitor, mouse, motherboard, and so on. Programming has undergone many transformations too, the more striking of which concern programming languages and styles of programming. From simple machine-code programming, the transformation came to assembly language programming and then high-level language programming, with the introduction of assemblers and compilers. Operating procedures changed from entering the program through console switches to the use of an operating system, and the operating system itself saw a big change, from command-level usage to graphical user interfaces. The style of programming changed from procedural code full of goto statements to structured programming, in which the program was broken into small, manageable chunks. Then came object-oriented programming, which helped in hiding details and reusing objects, and visual programming became state of the art. The process of building big software was formalized and matured by the introduction of software engineering. In every decade since the first computer, there have been enormous changes in the hardware, software, uses, and application areas of computers. The history of computers is the history of electrical engineering, an offshoot of physics and mathematics, and possibly that of the human imagination.

Frequently Asked Questions

1. Who invented the modern computer?

A: Charles Babbage is considered the inventor of the modern computer.

2. Who is called the father of the computer?

A: Charles Babbage.

3. Who was the world's first programmer?

A: Ada Lovelace, the daughter of Lord Byron, was the world's first programmer.

4. Who invented the Slide rule?

A: William Oughtred, in 1621, invented the slide rule.

5. Who invented the transistor?

A: William Shockley, John Bardeen, and Walter Brattain invented the transistor at Bell Laboratories around 1947.

6. Who invented the Colossus computer?

A: Tommy Flowers was the inventor of the Colossus computer.

7. When was the Colossus computer invented?

A: On 8th December 1943, the Colossus computer was invented by Tommy Flowers.

8. Who is the father of modern computers?

A: Alan Turing is the father of modern computers.

9. Who is called the mother of the computer?

A: Many consider Ada Lovelace the mother of the computer because she was the world's first programmer.

10. What was the world's first electronic computer?

A: The Electronic Numerical Integrator And Computer (ENIAC) is generally considered the world's first general-purpose electronic computer.
