The computer has become an essential part of modern life. Although the ancient counting device, the abacus, marks the beginning of the computer's history, a computer's functions reach far beyond counting. No single individual can claim to be the inventor of the computer; instead, its history shows a sequence of gradual evolution. It is also important to note that hardware alone does not make a computer functional: without software, the instructions that run the machine, it cannot operate. Continuous advances in hardware and software have made computing easier and more widespread, bringing computers into everyone's hands. So let us learn how this excellence was achieved and who the pioneers behind it were.
In 1671, the German mathematician Gottfried Wilhelm Leibniz began building a more advanced calculating machine based on Pascal's mechanical calculator. His device could perform addition, subtraction, multiplication, and division, and he was the first to devise a method for carrying out multiplication through repeated addition. In 1786, Johann Helfrich Müller proposed the idea of an automatic calculating device, and in 1801 Joseph Marie Jacquard of France began using punched cards to control his looms.
In 1822, Professor Charles Babbage began work on an automatic calculating machine, the Difference Engine. He later conceived a far more ambitious design, the Analytical Engine: a device that would receive input through punched cards, store it in a memory system, process it in a mathematical section, and print its output automatically. His contemporaries often ridiculed these ideas, yet if we compare his vision with today's computer, it is truly astonishing. The Analytical Engine, conceived in 1833, is considered the blueprint for the modern computer, and Charles Babbage is regarded as the "Father of the Computer."
In 1936, Konrad Zuse designed the first freely programmable computer. In 1942, John Atanasoff and Clifford Berry completed the Atanasoff–Berry Computer, or "ABC." In 1944, Howard Aiken, assisted by programmers including Grace Hopper, built the machine known as the "Harvard Mark I."
The journey of the computer thus began with calculating devices, and the size and nature of computers gradually evolved. During World War II, John Presper Eckert and John W. Mauchly at the University of Pennsylvania designed ENIAC for the Ballistic Research Laboratory of the US Army; one of its first uses was calculations for the hydrogen bomb. Completed in 1946, ENIAC used thousands of vacuum tubes as its enormous electronic "brain." This made the computer suitable for a wide variety of tasks, and computers of this type continued to develop further.
The computers that followed were much faster, far more efficient, and significantly smaller than their predecessors. Transistors replaced vacuum tubes, shrinking computers further still, and engineers then sought to invent new kinds of circuits. In 1958, multiple transistors and other components were first combined on a small semiconductor chip, creating the integrated circuit.
Around 1953, International Business Machines, or IBM, entered the history of the computer: that year it launched the IBM 701 EDPM, its first commercially successful electronic computer. In 1954, John Backus and his team at IBM began developing the high-level programming language FORTRAN. In 1959, the Stanford Research Institute in America was the first to develop the Magnetic Ink Character Recognition (MICR) system for banking applications.
Microchips succeeded discrete transistors. In 1958, Jack Kilby, an engineer at Texas Instruments, invented the microchip; Texas Instruments patented the invention, and commercial production began in 1961. By the 1970s, microchips were in widespread use, and in 1971 Intel Corporation released the Intel 4004, the first commercially available microprocessor. In 1981, IBM introduced a computer for personal use that ran the DOS-based operating system MS-DOS. In 1983, Apple created the first personal computer with a graphical user interface, and in 1985 the software giant Microsoft launched the Windows operating system, which, much like Apple's interface, was user-friendly.
Through this steady evolution, it has become possible to build the technologically advanced computers we use today: much faster, far smaller, and with a scope of work that continues to grow day by day.
Published by Technology Today: “The Story of Invention”
Also visit our website: http://e-learningbd.com
