
History of Computers and Generations

History of Computers

Computing rests on three concepts: following defined instructions, performing calculations, and storing and retrieving massive amounts of data. Initially, computers were humans doing math by hand; the word "computer" was first recorded in 1613, referring to a person who carried out calculations, and that usage persisted until the mid-20th century. At the time, such calculations were complex and required years of mathematical training.

 

Origin: 

 

The tally stick was an ancient memory aid. The abacus, an early mechanical calculator, dates back to around 2400 BC; the form commonly known today appeared around 500 BC and helped people solve basic arithmetic. 

 

John Napier introduced logarithms in 1614 and created the calculating rods known as Napier's bones. In 1622, William Oughtred invented the slide rule, which applied Napier's logarithm principles: it could handle multiplication, division, and trigonometric calculations, but not addition or subtraction. 

 

In 1642, Blaise Pascal created the Pascaline, a machine that could only add and subtract. Later, in 1801, Joseph Marie Jacquard built an automatic loom controlled by punched cards, an idea that would shape early computing.

 

In 1820, Thomas de Colmar invented the Arithmometer, the first reliable, usable, and commercially successful calculator. The device could perform all four basic arithmetic operations: addition, subtraction, multiplication, and division. 

 

Next came Charles Babbage's Difference Engine, a mechanical calculator for evaluating polynomials, designed between 1822 and 1834, followed by his more general Analytical Engine. 

In 1843, Augusta Ada Byron (Ada Lovelace) published notes on the Analytical Engine containing the first published algorithm, which made her the first programmer. Around the same time, Per Georg Scheutz built a working difference engine based on Babbage's design. 

 

In 1943, Howard H. Aiken and IBM built the Automatic Sequence Controlled Calculator (ASCC), better known as the Harvard Mark I, an electromechanical computer. 

 

Konrad Zuse built the Z1, the first programmable computer, between 1936 and 1938. Users programmed the Z1 with a punched-tape reader, and all output was likewise on punched tape. John Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC) at Iowa State University between 1939 and 1942. 

 

In 1946, John Presper Eckert and John W. Mauchly completed the ENIAC, the first general-purpose electronic computer, and went on to found the Eckert-Mauchly Computer Corporation. Together they launched the first commercial computer, the UNIVAC I (UNIVersal Automatic Computer I). The EDVAC (Electronic Discrete Variable Automatic Computer), built on John von Neumann's stored-program design, held both program and data in memory. In 1981, the Osborne Computer Corporation debuted the Osborne 1, the first commercially successful portable computer. 

 

Generations: 


1st Generation Computers: 

 

Early computers were massive, filling entire rooms with hardware and memory. Built on vacuum tubes, first-generation computers (1946-59) were costly to run, consumed a great deal of electricity, and produced a great deal of heat. They relied on machine language, the lowest-level programming language, and could solve only one problem at a time.

 

2nd Generation Computers: 

 

In the second generation of computers (1959-64), a single transistor replaced the equivalent of roughly 40 vacuum tubes, reducing computer size and improving efficiency and reliability. Even so, the machines ran scalding hot. Assembly languages let programmers write instructions as words rather than raw binary. Input and output were handled with punched cards. These were also the first computers to use magnetic core memory instead of a magnetic drum. 

 

3rd Generation Computers: 

 

The integrated circuit appeared in the third generation of computers (1965-70). Miniaturized transistors on silicon chips improved computer performance and efficiency. Third-generation computers were the first to be used with keyboards and monitors. An operating system allowed a machine to run several programs at once while managing memory. Smaller, cheaper computers opened computing to a broader audience. 

 

4th Generation Computers: 

 

In the fourth generation (1971 to date), a single chip could hold thousands of integrated circuits: the microprocessor. As these more powerful small computers were networked together, the groundwork was laid for the Internet. Fourth-generation PCs also introduced graphical user interfaces (GUIs), the mouse, and portable machines. 

 

5th Generation Computers: 

 

The fifth generation centers on artificial intelligence (AI) and is still a work in progress. Parallel processing and superconductors are helping make AI practical. The goal is to create devices that organize themselves and learn from natural-language input. As of 2021, voice recognition is built into current systems to make them easier to use.
