13 significant events in the history of computers.
Answer:
1. Zero
800 AD – India. Since you can't have a computer without 1s and 0s, I think the invention of the number zero is significant. You can argue whether this happened in Egypt, Mesopotamia or India; in my opinion it was India, as Indian mathematicians were the first to treat zero as a number in its own right and had used decimal notation since the year 595.
2. Pascal adding machine
1642 – France. Blaise Pascal builds the Pascal Adding Machine – the first workable calculator. To me this is more significant than Napier’s bones, the development of logarithm tables or some mechanical devices like the watch or the quadrant, because the device does the computing.
3. Binary number system
1679 – Germany. Gottfried Leibniz perfects the binary number system.
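Leibniz's system is exactly how modern machines represent numbers. A minimal sketch (the helper name `to_binary` is mine, purely for illustration) of converting a decimal number to binary by repeated division by two:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary digit string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder gives the lowest bit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))  # 1101
print(bin(13)[2:])    # Python's built-in bin() agrees: 1101
```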
4. Electricity
1751 – USA. Computers don't work without electricity, so Ben Franklin's electrical experiments, published in 1751, should make the list.
5. Textile loom
1801 – France. Joseph Jacquard builds his textile loom using the concept of a punch card to weave intricate designs into cloth. This is the foundation of a programmable machine.
6. Analytical engine
1833 – UK. Charles Babbage conceives the Analytical Engine, and although he never built it, it set the foundations for all modern computers. Augusta Ada Byron, AKA Ada Lovelace, who worked with him, proposed using punched cards like those of Jacquard's loom to make it programmable.
7. Boolean algebra
1854 – UK. George Boole creates Boolean algebra, laying a foundation of information theory. This is where "and", "or" and "not" enter mathematical formulas. Charles Sanders Peirce later developed the idea that Boole's logic lends itself to electrical switching circuits. It would be some 50 years before Bertrand Russell presented the idea that this logic is the foundation of all mathematics, and another 30 before Claude Shannon incorporated the symbolic "true or false" logic into electrical switching circuits.
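Boole's "and", "or" and "not" can be sketched directly in code. Here is a minimal illustration (the function names are mine) that checks one of Boole's laws, De Morgan's law, over every truth assignment, the same exhaustive style of reasoning Shannon applied to switching circuits:

```python
from itertools import product

# Boole's three basic operations, written as plain functions
def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

# De Morgan's law: NOT(a AND b) == (NOT a) OR (NOT b)
# Verify it for all four combinations of True/False inputs.
for a, b in product([False, True], repeat=2):
    assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
print("De Morgan's law holds for all truth assignments")
```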
8. Thermionic emissions
1883 – USA. Thomas Edison discovers thermionic emission, the basis of the vacuum tube, which in turn becomes the building block of the entire electronics industry. When the vacuum tube was invented in 1907 it enabled amplified radio and telephone technology.
9. Nipkow disk
1925 – UK. You could argue that television has its roots in fax transmission back in 1843, but once amplification made television practical, Scottish inventor John Logie Baird employed the Nipkow disk in his prototype video systems.
10. Automatic programming
1936 – UK. I have watched a few documentaries on Alan Turing and visited a museum exhibit on him here in the UK. Pretty amazing guy. His 1936 paper provided the basis for the development of automatic programming, showing that a single universal computing machine can simulate any other. If it wasn't for him, the Bombe, the electromechanical machine used to break the German Enigma cipher, would not have been built.
11. Transistor
1948 – USA. John Bardeen, together with Walter Brattain and William Shockley at Bell Labs, invents the transistor.
12. Magnetic core memory
1949 – USA. An Wang invents magnetic core memory. He doesn't build it himself but sells the patent to IBM for $400K to fund the start of his own company. His original concept is not practical until Jay Forrester at MIT arranges the cores into a matrix, giving the idea far greater practical application. In this form it becomes the standard computer memory, displacing the earlier cathode-ray memory of Freddie Williams.
13. COBOL
1952 – USA. Grace Hopper pioneers the idea of higher-level computer languages and builds the first compiler, letting us program in words rather than numbers. This work later gave rise to COBOL, the first language to run on multiple types of computers.