HISTORY OF COMPUTERS

Early Times
People used natural instruments, such as their fingers, for counting. As time passed, life became more complex. People began using rocks to store information and carved wood for record keeping, until the abacus was invented.

The Abacus
• The abacus differed from earlier recording devices because it allowed the manipulation of data.
• In 1854, archaeologists found a clay tablet resembling a primitive abacus.
• The discovery of this artifact indicates that some form of calculation existed as early as 3000 B.C. (The tablet now resides in the British Museum.)
• The abacus user manipulates beads in a wooden frame to keep track of numbers and place values.
• Skilled users can perform calculations almost as quickly as people using calculators.
• Of all the early aids to calculation, the abacus is the only one still used today.

THE TRADITIONAL PIONEERS OF THE COMPUTER

John Napier
• A Scottish mathematician, Napier invented Napier's Rods, or Bones, in 1617.
• These devices simplified tedious calculations and made them faster and more accurate.

Wilhelm Schickard
• A German mathematician, Schickard was the first to attempt to devise a calculator.
• In 1623, he built a mechanism that could add, subtract, multiply, and divide.
• He intended to send his friend Johannes Kepler a copy of his invention, but a fire destroyed the parts before they could be assembled.
• The prototype was never found, but a rough sketch of the machine survived, and a model was built in the 1970s.

Blaise Pascal
• A child prodigy, Pascal was born in France in 1623.
• As a young man he discovered an error in Descartes' geometry.
• In 1642, he built the first operating model of his calculating machine, which he called the Pascaline.
• He assembled fifty more of these machines over the next ten years.
• The Pascaline was a shoe-sized brass box that operated with a system of gears and wheels.
• Considered the first mechanical calculator, it could handle numbers up to 999,999.99.
• Because it was expensive to reproduce, and because people feared it would put them out of work, the Pascaline was not a commercial success.

Gottfried Wilhelm von Leibniz
• A German mathematician, Leibniz designed an instrument called the Stepped Reckoner, which he completed in 1694.
• His machine was more versatile than Pascal's because it could multiply and divide as well as add and subtract.
• Neither the Stepped Reckoner nor the Pascaline was reliable, because the technology of the time could not produce parts with the necessary precision.
• Leibniz's most important contribution to the computer's evolution was not his machine but his binary arithmetic, a system of counting that uses only two digits, 0 and 1 (a small sketch follows this list).
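To make Leibniz's idea concrete, here is a minimal Python sketch (modern code, obviously not anything Leibniz wrote) showing how any whole number can be counted using only the digits 0 and 1; the helper name to_binary is our own illustration.

def to_binary(n: int) -> str:
    """Convert a non-negative integer to its base-2 digit string.

    Repeatedly divide by 2; the remainders (0 or 1) are the binary
    digits, read from least significant to most significant.
    """
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

# Counting with only two digits, as Leibniz proposed:
for value in range(6):
    print(value, "->", to_binary(value))  # 0, 1, 10, 11, 100, 101

This two-digit system is exactly what later made electronic computers practical, since a switch that is either on or off maps directly onto 1 and 0.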
Joseph Marie Jacquard
• Jacquard's loom was the next invention of great significance in the development of the computer.
• In 1804, he used punched cards to create patterns on fabric woven on a loom.

Charles Babbage
• The Difference Engine, a machine designed by Babbage, computed mathematical tables using the method of finite differences.
• Using government funds and his own resources, he labored on the machine for nineteen years but was unable to complete it. People referred to it as Babbage's Folly.
• After the government withdrew its funding, Babbage proceeded to work on a more sophisticated version of the machine, which he called the Analytical Engine.
• Augusta Ada Byron, Countess of Lovelace, raised money for his invention and wrote a demonstration program for the engine.
• She is considered the first computer programmer, and the programming language Ada was named after her.
• In 1853, Babbage designed a system with provisions for printed output, a control unit, and an information storage unit, but the engine was still never completed.
• His son Henry Babbage was able to complete part of the machine, get it to compute, and publish samples of its work.
• Babbage was responsible for the two classic divisions of a computer: the store, or memory, and the mill, a processing unit that carries out arithmetic calculations.
• He is called the "Father of the Computer."

Herman Hollerith
• An American inventor, Hollerith worked at the Census Bureau and then left to teach at MIT while continuing work on his Tabulating Machine.
• He was later offered a job in the Census Department but refused it, because he was intent on winning the contract to tabulate the 1890 census.
• Hollerith's invention helped the census finish its work in two years, compared with the seven years the previous census had required.
• His company went through a series of name changes; the last came in 1924, when it became known as International Business Machines (IBM).

Computing Devices Before the Twentieth Century

Inventor                 Invention                Year
Unknown                  Abacus                   3000 B.C.
John Napier              Napier's Bones           1617
Wilhelm Schickard        Mechanical Calculator    1623
Blaise Pascal            Pascaline                1642
Gottfried Leibniz        Stepped Reckoner         1672
Joseph Marie Jacquard    Punched Card Loom        1804
Charles Babbage          Analytical Engine        1835
Herman Hollerith         Tabulating Machine       1887

The Modern Computer
In 1944, the age of the modern computer began.

Howard Aiken
• In 1937, Aiken was working at Harvard to complete the research for his Ph.D.
• Because he had to do tedious calculations on nonlinear differential equations, he decided that he needed an automatic calculating machine to make the chore less arduous.
• Harvard and IBM backed him in his effort.
• In 1943, the Mark I, also called the Automatic Sequence Controlled Calculator, was completed at the IBM Development Laboratories.
• The machine was 51 feet long and 8 feet high; it had 750,000 parts and 500 miles of wire; it weighed 5 tons; and it was capable of three calculations per second.
• This first electromechanical computer was responsible for making IBM a giant in computer technology.
• IBM had invested over $500,000 in the Mark I; in return for that investment, Thomas J. Watson, the head of IBM, wanted the prestige of being associated with Harvard University.
• Aiken went on to build a series of machines (the Mark II, Mark III, and Mark IV).
• Another interesting aside on Aiken pertains to the coining of the word debug: a moth found in a relay of one of his machines was taped into the logbook as the "first actual case of bug being found."

John Atanasoff
• In 1939, Atanasoff designed and built the first electronic digital computer while working with Clifford Berry.
• Atanasoff and Berry then went to work on an operational model called the ABC, the Atanasoff-Berry Computer.
• The computer, completed in 1942, used binary logic circuitry and had regenerative memory.
• No one paid much attention to Atanasoff's computer except John Mauchly, a physicist and faculty member at the University of Pennsylvania.

John Mauchly and J. Presper Eckert
• With the emergence of World War II, the military wanted an extremely fast computer capable of doing the thousands of computations necessary for compiling ballistic tables for new guns and missiles.
• John Mauchly and J. Presper Eckert believed the only way to meet this need was with an electronic digital machine.
• In 1946, they completed an operational electronic digital computer called the ENIAC (Electronic Numerical Integrator and Calculator), which drew on Atanasoff's ideas.
• It worked on a decimal system and had many of the features of today's computers.
• The ENIAC was tremendous in size, filling a very large room and weighing 30 tons. It conducted electricity through 18,000 vacuum tubes, generating tremendous heat; it needed special air conditioning to keep it cool.

John von Neumann
• After John von Neumann arrived in Philadelphia, he helped the Moore School group win the contract for the EDVAC.
• As a result of the Moore team's collaboration, a major breakthrough came in the form of the stored-program concept (see the sketch after this section).
• Until this time, a computer stored its program externally, either on plugboards, punched tapes, or cards.
• Although the ENIAC used 18,000 vacuum tubes, Mauchly and Eckert discovered that one mercury delay line could replace dozens of them.
• They figured that the delay lines would mean gigantic savings in the cost of tubes and in memory space.
• Though he was not its sole creator, there is no question that it was von Neumann's idea to have the computer store its numbers serially and process them that way, an innovation that made the EDVAC design much faster, simpler, and smaller, even though it could still do only one thing at a time.
• In 1951, with the arrival of the UNIVAC, the era of the commercial computer began.
• Two years later, IBM started distributing its IBM 701, and other companies manufactured computers such as the Burroughs E101 and the Honeywell Datamatic 1000.
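To illustrate the stored-program concept, here is a minimal Python sketch of a toy machine in which instructions and data share a single memory, so the program is fetched from memory just like any other stored value. The instruction set (LOAD, ADD, STORE, HALT) and the memory layout are an invented example for illustration, not the actual EDVAC design.

# Toy stored-program machine: instructions and data live in the same memory.
# Each instruction is a (opcode, operand) pair stored at an address.

memory = {
    0: ("LOAD", 10),   # acc = memory[10]
    1: ("ADD", 11),    # acc = acc + memory[11]
    2: ("STORE", 12),  # memory[12] = acc
    3: ("HALT", None),
    10: 2,             # data
    11: 3,             # data
    12: 0,             # result goes here
}

acc = 0  # accumulator register
pc = 0   # program counter

while True:
    opcode, operand = memory[pc]  # fetch the next instruction from memory
    pc += 1
    if opcode == "LOAD":
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "HALT":
        break

print(memory[12])  # prints 5: the machine computed 2 + 3

Because the program sits in ordinary memory, loading a new program is simply a matter of writing new values into memory, which is what made stored-program machines so much more flexible than externally wired ones.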
Generations of Computers

First Generation of Computers
• The first generation of computers began in the 1940s and extended into the 1950s.
• During this period, computers used vacuum tubes to conduct electricity.
• The use of vacuum tubes made computers big, bulky, and expensive, because the tubes were continually burning out and having to be replaced.
• At this time, computers were classified by the main memory storage device they used.
• The UNIVAC I used an ingenious device called the mercury delay line, which relied on ultrasonic pulses.
• Mercury delay line storage was reliable, but it was very slow compared with modern storage devices.
• During the first generation, pioneering work was also done in the area of magnetic storage.

Second Generation of Computers
• The second generation of computers began when the transistor replaced the vacuum tube in the late 1950s.
• Transistors conduct electricity more efficiently, consume less energy, need less space, and generate less heat than vacuum tubes.
• Computers with transistors became smaller, more reliable, faster, and less expensive than computers with vacuum tubes.
• Small and medium-sized businesses now found it economical to buy computers.

Third Generation of Computers
• The third generation began in 1964 with the introduction of the IBM 360, the computer that pioneered the use of the integrated circuit on a chip.
• Computers became smaller, more reliable, and less expensive than ever before.
• The integrated circuit chip made it possible for minicomputers to find their way into classrooms, homes, and businesses.
• They were almost a thousand times faster than the first generation of computers, and manufacturers mass-produced them at a low price, making them accessible to small companies.
• The 1970s began with the development of large-scale integration (LSI), a method that put hundreds of thousands of transistors on a single silicon chip.
• The chip was as minute as a speck of dust and so delicate that miniature scientific instruments were devised to create it.
• The development of LSI led to the insertion of computers into cameras, television sets, and cars.
• Another result of LSI was the personal computer.

Fourth Generation of Computers
• The fourth generation is marked by the development of microprocessor technology.
• The microprocessor chip is a central processing unit, the brains of the computer, built on a single chip.
• In 1971, a group working at Intel introduced the 4004 microprocessor.
• Apple Computer came into being in the 1970s: Steve Wozniak and Steve Jobs worked out of a garage, where they began selling Apples for the price of $666.66.
• They hired professional help and support, and in 1977, in what was a historic moment for computing, introduced a new, fully assembled version of their machine called the Apple II.
• The Apple II was the first computer widely accepted by business users, largely because of a spreadsheet program called VisiCalc.

Fifth Generation of Computers
• The fifth generation features incredibly fast computer chips.
• There are computers that carry out thousands of operations simultaneously, and the execution rates of these machines are measured in teraflops.
• A teraflop is equivalent to 1 trillion floating-point operations per second (see the short calculation after this list).
• These powerful computers can solve the most complex problems in science, finance, and technology.
• There is extensive use of "artificial intelligence," a broad range of applications that exhibit human-like intelligence and behavior.
• There are already some accomplishments in this area: medical programs aid in diagnosing various diseases, and mining programs help mining companies.
• AI (Artificial Intelligence)
  o Elements of AI already exist in educational software programs.
  o AI has been used in board games such as chess and backgammon, producing programs that can defeat human players.
• Computers keep decreasing in size, and there seems to be no limit to how small they can become.
• Many systems have touch screens and handwriting recognition software that let the user employ a pencil-like stylus as the input device.
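As a quick sanity check on the teraflop figure above, here is a small Python sketch estimating how long a 1-teraflop machine would take on a given workload; the workload size is an invented number used purely for illustration.

TERAFLOP = 1e12  # floating-point operations per second for a 1-teraflop machine

def seconds_needed(total_operations: float, flops: float = TERAFLOP) -> float:
    """Time (in seconds) to finish a workload at a given sustained rate."""
    return total_operations / flops

# A hypothetical simulation needing 3.6e15 floating-point operations:
ops = 3.6e15
print(f"{seconds_needed(ops):,.0f} seconds")      # 3,600 seconds
print(f"{seconds_needed(ops) / 3600:.1f} hours")  # 1.0 hours

At one trillion operations per second, even a workload of several quadrillion operations finishes in about an hour, which is what makes these machines practical for the large scientific problems mentioned above.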