History of the Computer



The computer is a very valuable machine, and it is part of our daily routine. We solve everyday problems with it: we store our personal data and we compute results. Today, professional programmers have written many different programs, and through these programs we can solve many different problems. In ancient times there were other kinds of computing devices; the abacus was the first computer, used for calculations. It represents data in discrete form.

Let us discuss the history of the computer:

Start of Counting:

In ancient times, early man did calculations on his fingers and wrote with sticks on the ground. Then came some development: the methods of notched sticks and knotted cords were introduced. A notched stick (a tally stick) records a count as notches cut into the wood, while a knotted cord served as a primitive surveyor's tool for measuring distances. Man kept exploring new tools for calculation, and at last he arrived at writing on hides, on parchment, and then on paper. Man is a wonderful creation of God: he thinks deeply about things, and he invented the concept of numbers and different tools for storing and calculating those numbers.

Roman Empire:


It was the ancient Romans who developed the grooved abacus, one of the first calculating machines, used for simple calculations. There is also said to have been an ancient Chinese abacus. Counters in the upper groove each stand for 5 × 10^n, and counters in the lower groove each stand for 1 × 10^n.
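
To make those groove values concrete, here is a minimal sketch (my own illustration, not part of the original article), assuming each column n holds lower-groove beads worth 1 × 10^n and upper-groove beads worth 5 × 10^n:

```python
# Hypothetical model of a Roman-style abacus: each column n holds lower-groove
# beads worth 1 * 10**n and upper-groove beads worth 5 * 10**n.
def abacus_value(columns):
    """columns: list of (lower_beads, upper_beads) pairs, index 0 = ones column."""
    total = 0
    for n, (lower, upper) in enumerate(columns):
        total += (lower * 1 + upper * 5) * 10 ** n
    return total

# 2 lower + 1 upper bead in the ones column, 3 lower beads in the tens column
print(abacus_value([(2, 1), (3, 0)]))  # 7 ones + 3 tens = 37
```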

Industrial Era-1600:

In the industrial era there was a great Scottish mathematician, John Napier. He devoted much of his time to finding easier ways to do computations, and it was he who invented the concept of the logarithm. He etched logarithmic measurements onto a set of ten wooden rods (Napier's bones) and made it possible to carry out multiplication and division with the rods.
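
The key idea behind the logarithm is that it turns multiplication into addition, which is what the rods and later log tables exploited. A small sketch (my own example, not from the article):

```python
# Logarithms turn multiplication into addition: log(a*b) = log(a) + log(b).
import math

a, b = 37.0, 52.0
log_sum = math.log10(a) + math.log10(b)   # add the logarithms
product = 10 ** log_sum                   # take the antilogarithm
print(product)    # ~1924.0
print(a * b)      # 1924.0, the same result by direct multiplication
```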

William Oughtred's Slide Rule:

A scientist, Edmund Gunter, designed a logarithmic scale for doing logarithmic calculations. Another scientist, William Oughtred, then designed the slide rule, an easy calculating device for doing computations. It remained in use until the mid-1970s.

Blaise Pascal-1642:

Blaise Pascal was a French mathematician; his father was a mathematician as well. To help his father, he invented a machine called the Pascaline, which could do simple addition and subtraction. A programming language, Pascal, was later named in his honor. The machine consists of a series of gears, each with 10 teeth.

Each gear, on completing a full turn, trips the next gear ahead by 1/10 of a revolution, carrying the digit. This machine was the foundation of mechanical adding machines.
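
As a rough sketch of that carry idea (my own illustration, not a description of the actual machine's construction), each wheel can be modelled as a decimal digit, with a full revolution advancing the next wheel by one tooth:

```python
# Hypothetical model of the Pascaline's carry: wheels[0] is the ones wheel,
# and a full revolution of one wheel trips the next wheel by 1/10 revolution.
def pascaline_add(wheels, amount, position=0):
    """wheels: list of digits, lowest place first. Adds `amount` at `position`."""
    wheels = wheels[:]                      # work on a copy
    wheels[position] += amount
    for i in range(position, len(wheels)):  # propagate carries to higher wheels
        if wheels[i] >= 10:
            carry, wheels[i] = divmod(wheels[i], 10)
            if i + 1 < len(wheels):
                wheels[i + 1] += carry      # the "trip" to the next gear
    return wheels

# 278 + 9 -> 287: the ones wheel overflows and trips the tens wheel
print(pascaline_add([8, 7, 2], 9))  # [7, 8, 2], i.e. 287
```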

Gottfried Wilhelm von Leibniz-1673:

In 1673 there came a great addition to mathematics: the differential and integral calculus, invented by Gottfried Wilhelm von Leibniz independently of Sir Isaac Newton. Leibniz also invented a calculating machine, the Leibniz wheel, also known as the Step Reckoner. This machine could add, subtract, multiply and divide; it carried out multiplication and division by repeated addition and subtraction. And it was Leibniz who invented something essential for modern computers: binary arithmetic. Today's computers are digital computers, and they understand binary.
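
A minimal sketch (my own example, not from the article) of multiplication by repeated addition, the way a Step Reckoner-style machine works, with the binary form Leibniz described shown alongside:

```python
# Multiplication by repeated addition: one addition per turn of the crank.
def step_reckoner_multiply(a, b):
    total = 0
    for _ in range(b):   # b turns of the crank
        total += a       # each turn adds a once more
    return total

print(step_reckoner_multiply(6, 7))   # 42
print(bin(42))                        # '0b101010' -- the same value in binary
```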


The Bouchon Loom - 1725:

Basile Bouchon, the son of an organ maker, worked in the textile industry. At this time fabrics with very intricate patterns woven into them were very much in vogue. Weaving a complex pattern, however, involved somewhat complicated manipulations of the threads in a loom, which frequently became tangled, broken, or out of place. Bouchon observed the paper rolls with punched holes that his father made to program his player organs and adapted the idea as a way of "programming" a loom. The paper passed over a section of the loom, and where the holes appeared certain threads were lifted. As a result, the pattern could be woven repeatedly. This was the first punched-paper stored program. Unfortunately the paper tore and was hard to advance, so Bouchon's loom never really caught on and eventually ended up in the back room collecting dust.

Falcon Loom-1728:

                                           

In 1728 Jean-Baptiste Falcon substituted a deck of punched cardboard cards for the paper roll of Bouchon's loom. This was much more durable, but the deck of cards tended to get shuffled, and it was tedious to continuously switch cards. So Falcon's loom ended up collecting dust next to Bouchon's loom.

1804 - Joseph Marie Jacquard (1752-1834)

It took the inventor Joseph M. Jacquard to bring together Bouchon's idea of a continuous punched roll and Falcon's idea of durable punched cards to produce a really workable programmable loom. Weaving operations were controlled by punched cards tied together to form a long loop, and you could add as many cards as you wanted. Each time a thread was woven in, the loop was clicked forward by one card. The results revolutionized the weaving industry and made a lot of money for Jacquard. This idea of punched data storage was later adapted for computer data input.
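
As a toy sketch of that control idea (my own illustration, with made-up card patterns), a loop of punched cards can be modelled as rows of 0s and 1s, where a hole lifts the corresponding warp thread and the loop advances one card per woven thread:

```python
# Hypothetical card loop: each card controls 8 warp threads; 1 = hole = lift.
cards = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
]

def weave(cards, rows):
    """Advance the card loop by one card per woven thread and print the lifts."""
    for i in range(rows):
        card = cards[i % len(cards)]          # the loop clicks forward one card
        lifted = [t for t, hole in enumerate(card) if hole]
        print(f"row {i}: lift warp threads {lifted}")

weave(cards, 6)   # the 3-card loop repeats twice over 6 rows, repeating the pattern
```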

First Generation Computers (1940s – 1950s)

First-generation computers used vacuum tubes, and they were huge and very complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer). This computer was digital, although it did not operate with binary code (it worked in decimal), and it was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supported input from an IBM card reader, and sent output to an IBM card punch.

Second Generation Computers (1955 – 1960)

Second-generation computers used the transistor, which began replacing vacuum tubes in computer design. These transistor computers consumed much less power, produced much less heat, and were much smaller than the first generation, though still big by today's standards.

The first transistor computer was built at the University of Manchester in 1953. The most popular of the transistor computers was the IBM 1401.

Third Generation Computers (1960s)

The third generation of computers started with the invention of integrated circuits (ICs), also called microchips, which paved the way for computers as we know them today. Making the circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the still-ongoing process of integrating an ever-larger number of transistors onto a single microchip. During the sixties, microchips started making their way into computers, but the process was gradual, and second-generation machines held on for a while.

Then the first minicomputers appeared. The earliest of them were still based on non-microchip transistors, and later versions were hybrids based on both transistors and microchips, like IBM's System/360. They were much smaller and less expensive than the first- and second-generation computers, also called mainframes.

Fourth Generation Computers (1971 – present)

The first microchip-based central processing units consisted of many microchips covering the different CPU components. The drive for ever-greater integration and miniaturization led toward single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, known as a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.

First Generation of Microcomputers (1971 – 1976)

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers. It is arguable which of the early microcomputers could be called the first. The CTC Datapoint 2200 is one candidate, although it actually didn't contain a microprocessor (being based on a multi-chip CPU design instead) and wasn't meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became the basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days.

 





