COMPUTING FUNDAMENTALS AND C PROGRAMMING
UNIT – I: Fundamentals of Computers : Introduction – History of Computers-Generations of
Computers- Classification of Computers-Basic Anatomy of a Computer System-Input Devices-
Processor-Output Devices-Memory Management – Types of Software- Overview of Operating
System- Programming Languages-Translator Programs-Problem Solving Techniques - Overview
of C.
UNIT – I
INTRODUCTION
Let us begin with the word ‘compute’, which means ‘to calculate’. We are all familiar with
calculations in our day-to-day life: we apply mathematical operations such as addition, subtraction
and multiplication, along with many other formulae. Simple calculations take little time, but
complex calculations take much longer, and accuracy is a further concern. People therefore
explored the idea of developing a machine that could perform such arithmetic calculations faster
and with complete accuracy. This gave birth to the device we call the ‘computer’.
The computer we see today is quite different from the earliest machines. The number of
applications has grown, and the speed and accuracy of calculation have increased. Consider the
impact of computers on our day-to-day life: reservation of airline and railway tickets, payment of
telephone and electricity bills, deposits and withdrawals at banks, business data processing,
medical diagnosis and weather forecasting are some of the areas where the computer has become
extremely useful.
However, the computer has one limitation. Human beings can calculate on their own, but
a computer is a dumb machine: it must be given proper instructions to carry out any calculation.
This is why we should know how a computer works.
HISTORY OF COMPUTERS
In 19th-century England, Charles Babbage, a mathematician, proposed the construction of
a machine he called the Difference Engine. It would not only calculate numbers but also print
mathematical tables. The Computer History Museum in Mountain View, California (near San
Jose) built a working replica from the original drawings, and visitors can see the device in
operation there. Unable to construct the actual device, Babbage earned quite a few detractors
among England’s literate citizens. Nevertheless, he made a place for himself in history as the
father of computing. Not satisfied with the machine’s limitations, he drafted plans for the
Analytical Engine. He intended this computing device to use punched cards as the control
mechanism for its calculations, a feature that would allow his computer to reuse previously
performed calculations in new ones.
Babbage’s idea caught the attention of Ada Byron Lovelace, who had an undying passion
for mathematics. She also saw the possibility that the Analytical Engine could produce graphics
and music. She helped Babbage move his project from idea to reality by documenting how the
device would calculate Bernoulli numbers, and she later received recognition for writing the
world’s first computer program. The United States Department of Defense named a computer
language, Ada, in her honor in 1979.
The computers that followed built on each previous success and improved upon it. In 1943,
the first programmable computer, the Turing COLOSSUS, appeared; it was pressed into service
to decipher coded German messages during World War II. ENIAC, nicknamed ‘the Giant Brain’,
became the first electronic computer in 1946. In 1951, the U.S. Census Bureau became the first
government agency to buy a computer, the UNIVAC.
The Apple II brought computers to consumers in 1977. The IBM PC for consumers followed in
1981, although IBM mainframes had long been in use by government and corporations.
• 8,500 BC Bone carved with prime numbers found
• 1000 BC to 500 BC Abacus invented
• 1642 Blaise Pascal invented the adding machine, France
• 1822 Charles Babbage drafted the Difference Engine, England
• 1835 Babbage proposed the Analytical Engine, England
• 1843 Ada Byron Lovelace wrote a program to calculate Bernoulli numbers, England
• 1943 Turing COLOSSUS, the first programmable computer, England
• 1946 ENIAC, the first electronic computer, U.S.A.
• 1951 UNIVAC, the first computer used by the U.S. government, U.S.A.
• 1968 Gordon Moore and Robert Noyce founded Intel, U.S.A.
• 1969 ARPANET: the Department of Defense lays the groundwork for the Internet, U.S.A.
• 1977 Apple computers sold to consumers, U.S.A.
• 1981 IBM personal computers sold, U.S.A.
• 1991 World Wide Web created by Tim Berners-Lee at CERN, Switzerland/France
• 2000 Y2K bug programming errors discovered
• Current technologies include word processing, games, email, maps, and streaming
GENERATIONS OF COMPUTERS
The Five Generations of Computers
The history of computer development is often described in terms of generations of computing
devices. Each generation is characterized by a major technological development that
fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper,
more powerful, more efficient and more reliable devices. Read about each generation and the
developments that led to the devices we use today.