BITM: 1st Sem : Introduction to Computing and Information Technology :- Unit 1: Computer Fundamentals

A computer is an electronic machine that can be programmed to do different tasks. It takes data as input, processes it, and produces useful information. After processing, it can either show the results or store them for later use. A computer works with two main parts: hardware, which includes the physical components like the CPU, keyboard, and mouse, and software, which is made of the programs and instructions that tell the hardware what to do.

Core Functions of a Computer

Input:
This is when the computer receives data or instructions. Devices like the keyboard, mouse, or microphone help us give input to the computer.

Processing:
The CPU (Central Processing Unit) is the “brain” of the computer. It follows instructions from software and performs all the calculations and logical operations needed to turn the input into useful output.

Output:
Once the data is processed, the computer presents the result. A monitor, printer, or speaker can be used to show or deliver this information to the user.

Storage:
Computers store data so it can be used now or later. Some storage, like RAM, is temporary and clears when the computer is turned off. Other storage devices, like hard drives and SSDs, keep data safe for long-term use.


The Working Principle of a Computer (IPO)

A computer follows the IPO cycle: data enters through input devices, the CPU processes it into useful information, and the result is delivered through output devices, with storage holding data at any stage of the cycle.
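The IPO cycle can be sketched as a tiny program. The processing step here (squaring a number) is just an illustrative choice, not anything prescribed by the notes:

```python
# A minimal sketch of the Input -> Process -> Output cycle.

def process(data: int) -> int:
    # Processing: the CPU-like step that transforms input into output
    return data * data

def ipo(raw_input: str) -> str:
    number = int(raw_input)        # Input: accept data
    result = process(number)       # Process: compute
    return f"Result: {result}"     # Output: present information

print(ipo("7"))  # Result: 49
```

The same three stages appear in every program, however complex: accept data, transform it, and present or store the outcome.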
Core Characteristics of a Computer

Speed:
Computers work incredibly fast. They can handle billions of operations in just a tiny fraction of a second—much faster than any human could.

Unit               Fraction of a Second    How Fast It Is
Millisecond (ms)   1/1,000                 Faster than a blink
Microsecond (µs)   1/1,000,000             Much faster; common in computing
Nanosecond (ns)    1/1,000,000,000         Used to time CPU operations
Picosecond (ps)    1/1,000,000,000,000     Ultra-fast scientific timing


Accuracy:
When given correct data and instructions, computers produce highly accurate, error-free results. They do not make mistakes unless the input itself is wrong; this principle is known as GIGO (Garbage In, Garbage Out).

Automation:
Once a computer is programmed, it can complete tasks on its own without needing constant help from a person. This makes it great for doing repetitive or complicated work.

Versatility:
A single computer can do many different things. You can type documents, calculate numbers, design graphics, play games, browse the internet, or even do scientific research—all on the same machine.

Storage:
Computers can store a huge amount of data. They keep information both temporarily and permanently, and they can pull it up instantly whenever you need it.

Unit    Full Form              Size
b       Bit (Binary Digit)     Smallest unit
B       Byte                   8 bits
KB      Kilobyte               1,024 Bytes
MB      Megabyte               1,024 KB
GB      Gigabyte               1,024 MB
TB      Terabyte               1,024 GB
PB      Petabyte               1,024 TB
EB      Exabyte                1,024 PB
ZB      Zettabyte              1,024 EB
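A common way to use the 1,024 factor from the table above is to convert a raw byte count into a human-readable unit. This is a sketch, not a standard-library function:

```python
# Convert a raw byte count to a human-readable unit using the
# 1,024 factor from the table above.

UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB"]

def human_readable(num_bytes: float) -> str:
    for unit in UNITS:
        # Stop once the value fits under 1,024, or we run out of units
        if num_bytes < 1024 or unit == UNITS[-1]:
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024

print(human_readable(1536))         # 1.5 KB
print(human_readable(3 * 1024**3))  # 3.0 GB
```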


Applications of Computers 

Computers are used almost everywhere in today’s world because they can work fast, store large amounts of information, and perform many different tasks. Here are some common areas where computers are used:


1. Education

  • Used for teaching, online classes, digital learning, and research.

  • Students use computers to prepare notes, presentations, and projects.

  • Teachers use them for smart classes and result preparation.


2. Business

  • Helps in accounting, billing, inventory control, and communication.

  • Used in offices for storing customer data, sending emails, and making reports.


3. Communication

  • Computers make communication easy through email, social media, video calls, and messaging apps.

  • Helps people connect globally within seconds.


4. Healthcare

  • Used for storing patient records, diagnosing diseases, and running medical equipment.

  • Hospitals use computers for X-rays, MRI scans, and online appointments.


5. Banking

  • ATMs, online banking, money transfers, and maintaining customer accounts are done using computers.

  • Provides quick and secure transactions.


6. Entertainment

  • Used for watching movies, listening to music, gaming, animation, and creating videos.

  • Plays a big role in the film and music industry.


7. Science and Research

  • Scientists use computers for simulations, data analysis, weather forecasting, and space research.

  • Helps in discovering new medicines and technologies.


8. Government

  • Used for maintaining citizen records, tax collection, online services, and digital documentation.

  • Supports e-governance and making services quicker.


9. Transportation

  • Helps in ticket booking, traffic control, vehicle navigation (GPS), and airline operations.

  • Used to track trains, buses, and flights in real time.


10. Home Use

  • Used for online learning, entertainment, managing home budgets, shopping, and working from home.

  • Also used for controlling smart home devices.


Generation of Computer

The generation of computers is categorized by the technological shift in their core components, with the five main generations being: first (vacuum tubes), second (transistors), third (integrated circuits), fourth (microprocessors), and fifth (artificial intelligence). Each generation is marked by increased speed, efficiency, and smaller size.

                

Generation            Time Period        Key Technology          Characteristics
First Generation      1940–1956          Vacuum Tubes            Large, power-intensive, slow, machine language
Second Generation     1956–1963          Transistors             Smaller, faster, used assembly language
Third Generation      1964–1971          Integrated Circuits     Compact, introduced keyboards and monitors
Fourth Generation     1971–Present       Microprocessors         Personal computers, GUIs, networking
Fifth Generation      Present & Beyond   AI and Nanotechnology   Self-learning, natural language processing
Sixth Generation      Emerging & Future  Quantum Computing       Qubits, superposition, future potential


First Generation of Computers (1940–1956)

  • Used vacuum tubes for circuitry.

  • Used magnetic drums for main memory.

  • Very large in size, occupying entire rooms.

  • High electricity consumption and produced a lot of heat, causing frequent malfunctions.

  • Maximum storage capacity was about 20,000 characters.

  • Used machine language (lowest-level programming language).

  • Could solve only one problem at a time.

  • Required days or weeks to set up a new problem.

  • Input was through punched cards and paper tape.

  • Output was printed on printouts.

  • Von Neumann architecture was introduced in this generation.

  • Early examples include ENIAC and UNIVAC.

  • UNIVAC became the first commercially available computer; it was delivered to the U.S. Census Bureau in 1951.


Second Generation of Computers (1956–1963)

  • Transistors replaced vacuum tubes in computers.

  • Transistors were invented in 1947 (Bell Labs) but widely used in computers in the late 1950s.

  • Included new hardware:

    • Magnetic core memory

    • Magnetic tape

    • Magnetic disks

  • Computers became smaller, faster, cheaper, more reliable, and more energy-efficient than first-generation machines.

  • Still produced heat, but much less compared to vacuum tubes.

  • Continued using punched cards for input and printouts for output.

  • Introduced assembly languages, replacing binary machine code with symbolic instructions.

  • Early high-level languages like COBOL and FORTRAN were developed.

  • Instructions were stored in memory, moving from magnetic drums to magnetic core technology.

  • Early second-generation computers were used mainly in the atomic energy industry.


Third Generation of Computers (1964–1971)

  • Introduced integrated circuits (ICs) as the key technology.

  • Transistors were miniaturized and placed on silicon chips (semiconductors).

  • Greatly improved speed, efficiency, and performance of computers.

  • Shifted from punched cards to keyboards, monitors, and operating systems.

  • Operating systems allowed multiple applications to run at once (multitasking).

  • Became smaller, cheaper, and more powerful, making computers accessible to the general public for the first time.

  • Improved user interaction and overall usability.


Did You Know… ? Integrated circuit (IC) chips are small electronic devices made out of semiconductor material. The integrated circuit was developed independently in the late 1950s by Jack Kilby of Texas Instruments (1958) and Robert Noyce of Fairchild Semiconductor (1959).


Fourth Generation of Computers (1971–Present)

  • Marked by the invention of the microprocessor.

  • Thousands of integrated circuits were placed on a single silicon chip.

  • Room-sized computer technology from early generations could now fit in the palm of a hand.

  • Intel 4004 (1971) was the first microprocessor, containing the CPU, memory, and I/O controls on one chip.

  • IBM launched its first personal computer (PC) in 1981.

  • Apple introduced the Macintosh in 1984.

  • Microprocessors began appearing in everyday devices, not just computers.

  • Development of computer networks, which later led to the Internet.

  • Introduction and growth of GUIs (Graphical User Interfaces), the mouse, and handheld devices.

  • Increased computing power and decreased size led to widespread use of computers in homes, offices, and industries.


Fifth Generation of Computers (Present and Beyond)

  • Focused on Artificial Intelligence (AI) and advanced machine learning.

  • Goal: Develop machines that can learn, adapt, predict outcomes, and improve through self-learning.

  • Computers can now process natural language, enabling human-like interactions.

  • Early public AI tools included virtual assistants like Siri and Alexa (voice recognition–based).

  • Major leap in 2022 with ChatGPT, the first widely accessible Large Language Model (LLM) by OpenAI.

  • LLMs use Natural Language Processing (NLP) to understand questions and generate accurate responses.

  • AI is expanding into physical fields:

    • Self-driving cars (e.g., Tesla)

    • Trading bots

    • AI-powered robots

  • Aim: Create systems that can think and reason similarly to the human brain.

  • AI development is globally competitive, with major contributions from the United States and rising players like China’s DeepSeek.

  • Expected to transform everyday life—similar to how the internet revolutionized the world decades ago.

Sixth Generation of Computers (Quantum Computing – Future and Emerging)

  • Focuses on Quantum Computing, nanotechnology, and molecular technology.

  • Classical computers use bits (0 or 1), while quantum computers use qubits.

  • Qubits can exist in multiple states at the same time due to superposition.

  • Enables quantum computers to perform extremely complex calculations beyond classical computer capabilities.

  • Holds major potential in fields such as:

    • Cryptography

    • Material science

    • Artificial intelligence

    • Molecular modeling

    • Large-scale optimization problems

  • Can help simulate molecular interactions, design new medicines, and solve advanced scientific problems.

  • Still in an early development stage, not widely practical yet.

  • Tech giants like IBM, Google, and others are developing scalable quantum systems.

  • Future advancements may allow processing of massive datasets and solving problems classical computers cannot handle.


Types Of Computer

Based on the kind of data they work with, computers are classified as analog, digital, and hybrid.
1. Analog Computer — (Continuous Data)

An analog computer works with real-world, continuous data.
This means the information it handles keeps changing smoothly, like temperature, speed, sound, or pressure.
Instead of giving exact numbers, an analog computer shows results in a continuous format, such as a moving needle or a rising line. A seismograph is an example of an analog computer: it detects earthquakes and measures their magnitude on the Richter scale.


2. Digital Computer — (Discontinuous Data)

A digital computer is a general-purpose computer that accepts discrete (discontinuous) data such as letters, numbers, and symbols to perform a given task. It works using binary digits (0 and 1). It accepts data and instructions, manipulates them using logic and arithmetic operations, and produces results. The types of digital computer are:

1. Micro Computer

2. Mini Computer

3. Mainframe Computer

4. Super Computer


1. Micro Computer:-

A microcomputer is a small, low-cost computer designed for use by a single person. It uses a microprocessor as its central processing unit, which makes it compact and affordable. Microcomputers are commonly found in homes, schools, and offices. They include desktop computers, laptops, tablets, and even smartphones. Although they are smaller and less powerful than larger computer systems, they are excellent for everyday tasks such as browsing the internet, creating documents, watching videos, playing games, and performing simple programming activities. Because of their affordability, portability, and ease of use, microcomputers are the most widely used type of computer today.


2. Mini Computer:-

A minicomputer is a medium-sized computer that is more powerful than a microcomputer but less powerful than a mainframe computer. It can support multiple users at the same time, usually ranging from 10 to 50 users. Minicomputers were widely used in organizations, laboratories, and small industries where multiple people needed access to shared data or applications. These computers can handle more complex tasks such as database management, industrial control, scientific calculations, and business operations. Although they are smaller and cheaper than mainframe computers, they still offer high performance and multitasking capabilities. Examples include IBM AS/400, DEC PDP series, and VAX systems.


3. Mainframe Computer:-

A mainframe computer is a large, powerful computer that can handle many tasks and many users at the same time.

It is mainly used by big organizations, banks, airlines, and government offices to process huge amounts of data reliably.

  • Main Points:

    • Can serve hundreds or thousands of users simultaneously.

    • Very reliable, secure, and fast.

    • Expensive and large in size.

    • Handles tasks like banking transactions, airline reservations, insurance records, and census data.

Example: IBM Z series, UNIVAC.      


4. Super Computer:-

A supercomputer is the fastest and most powerful type of computer, designed to perform complex calculations very quickly.

It is used for scientific research, weather forecasting, space exploration, and AI.

  • Main Points:

    • Can perform billions or trillions of calculations per second.

    • Handles very large and complex problems.

    • Usually used by scientists and researchers, not everyday users.

    • Extremely expensive and requires a lot of electricity.

Example: IBM Summit, Cray series, Fugaku (Japan).


3. Hybrid Computer — (Mix of both Analog and Digital)

A hybrid computer is a combination of analog and digital computers.
It takes advantage of both types:

  • Analog part → Measures real-world continuous data quickly (like speed, temperature, or heart rate).

  • Digital part → Processes the measured data accurately and displays results in numbers or graphs.

So, a hybrid computer is fast like an analog computer and accurate like a digital computer.

Features

  • Combines analog and digital functions

  • Fast measurement with precise output

  • Can handle both continuous and discrete data

  • Used in specialized applications

Examples

  1. ECG (Electrocardiogram) Machines – Measure heart signals (analog) and display them digitally.

  2. ICU Monitoring Systems – Monitor patient vitals and give accurate readings.

  3. Petrol Pumps – Measure fuel flow (analog) and calculate cost digitally.

  4. Scientific Research Computers – For experiments that need both fast measurement and precise computation.


Architecture Of Computer

Computer architecture is the conceptual design of a computer system, defining its structure and how its components, such as the CPU, memory, and input/output devices, work together. It acts as the blueprint, outlining the instruction set and the functional behavior of the machine, while computer organization is the physical implementation of that blueprint, detailing how the components are connected and function in practice.

Key Aspects of Computer Architecture:

1. Central Processing Unit (CPU)

  • The brain of the computer.

  • Performs arithmetic and logical operations.

  • Controls the flow of data between memory, input, and output devices.

  • Key parts:

    • ALU (Arithmetic Logic Unit) → Performs calculations and logic operations

    • CU (Control Unit) → Directs operations and manages data flow

    • Registers → Small, fast storage locations inside the CPU


2. Memory

  • Stores data and instructions temporarily or permanently.

  • Primary Memory (RAM, Cache): Fast, temporary storage for current tasks

  • Secondary Memory (HDD, SSD): Permanent storage for data and programs


3. Input/Output (I/O) Devices

  • Input Devices: Allow the computer to receive data (keyboard, mouse, scanner)

  • Output Devices: Allow the computer to present information (monitor, printer, speakers)


4. Bus System

  • A set of wires or pathways that carry data, instructions, and control signals between CPU, memory, and I/O devices.

  • Types of buses:

    • Data Bus → Carries data

    • Address Bus → Carries memory addresses

    • Control Bus → Carries control signals


5. Instruction Set

  • The basic commands a CPU can understand and execute.

  • Determines the CPU’s capabilities and performance.


6. Storage Hierarchy

  • Organizes storage from fastest and smallest to slowest and largest:

    • Registers → Cache → RAM → Secondary Storage (HDD/SSD) → Tertiary Storage


Types of Computer Architecture:

1. Von Neumann Architecture: Uses a single memory for both data and instructions

2. Harvard Architecture: Uses separate memories for data and instructions

1. Von Neumann Architecture:

The Von Neumann Architecture is a design model for computers, proposed by John von Neumann in 1945. It is the foundation for most modern computers.

It describes how a computer’s hardware components are organized and how they interact to process data and instructions.

Key Features

  1. Single Memory for Data and Instructions

    • Both program instructions and data are stored in the same memory unit.

    • This allows the CPU to fetch both from one place.

  2. Central Processing Unit (CPU)

    • Acts as the brain of the computer.

    • Contains:

      • ALU (Arithmetic Logic Unit): Performs calculations and logical operations.

      • Control Unit (CU): Directs the flow of data and instructions.

      • Registers: Small, high-speed storage inside the CPU.

  3. Input/Output Devices

    • Input devices (keyboard, mouse, scanner) provide data to the computer.

    • Output devices (monitor, printer) display the results.

  4. Sequential Execution

    • Instructions are executed one after another in a sequence.

    • CPU fetches an instruction, decodes it, executes it, then moves to the next.

  5. Stored Program Concept

    • Programs are stored in memory just like data, allowing the computer to change programs without altering hardware.
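The stored-program concept and the fetch-decode-execute cycle described above can be illustrated with a toy machine. One memory list holds both instructions and data, exactly as in the Von Neumann model; the instruction set (LOAD/ADD/HALT) is invented purely for this sketch:

```python
# A toy stored-program machine: one memory holds the instructions,
# and the CPU loop fetches, decodes, and executes them in sequence.

memory = [
    ("LOAD", 5),   # address 0: put 5 in the accumulator
    ("ADD", 3),    # address 1: add 3 to the accumulator
    ("HALT", 0),   # address 2: stop and return the result
]

def run(mem):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, arg = mem[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":           # decode + execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc

print(run(memory))  # 8
```

Because the program lives in memory like any other data, changing what the machine does only requires changing the contents of `memory`, not the hardware.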



Introduction to Number System

A number system is a way of representing numbers using a set of symbols or digits.
It tells us how numbers are written, read, and interpreted in mathematics and computing.

In computers, number systems are very important because all data is ultimately represented in numbers, especially in binary form (0s and 1s).

Types of Number Systems:

1. Binary Number System -- Base2  -- (0,1)

2. Octal Number System -- Base8 -- (0,1,2,3,4,5,6,7)

3. Decimal Number System -- Base10 -- (0,1,2,3,4,5,6,7,8,9)

4. Hexadecimal Number System -- Base16 -- (0–9, A–F)


Simple Formulas for Conversion or calculation:

1. Any Number System to Decimal   (Multiplication by powers of the base)

2. Decimal to Any Other Number System (Repeated Division by the base)

3. Octal to Binary or Vice Versa    (3-Bit Grouping)

4. Hexadecimal to Binary or Vice Versa (4-Bit Grouping)
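The four rules above can be checked with Python's built-ins: `int(text, base)` converts any base to decimal, and `format()` converts decimal back to binary, octal, or hexadecimal:

```python
# 1. Any base -> decimal (repeated multiplication by the base)
assert int("1011", 2) == 11      # binary: 8 + 0 + 2 + 1
assert int("17", 8) == 15        # octal
assert int("1F", 16) == 31       # hexadecimal

# 2. Decimal -> any base (repeated division by the base)
assert format(11, "b") == "1011"
assert format(15, "o") == "17"
assert format(31, "X") == "1F"

# 3. Octal <-> binary: each octal digit maps to a 3-bit group
assert format(int("17", 8), "b") == "1111"    # 1 -> 001, 7 -> 111

# 4. Hexadecimal <-> binary: each hex digit maps to a 4-bit group
assert format(int("1F", 16), "b") == "11111"  # 1 -> 0001, F -> 1111

print("all conversions check out")
```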


Data Representation

Data representation is the method of storing and expressing data inside a computer in a way that the computer can understand and process.

Computers only understand binary language (0s and 1s), so all types of data—numbers, text, images, sound, and instructions—must be converted into binary before processing.

Why Data Representation is Important

  1. Computers cannot understand human-readable forms (like letters or symbols).

  2. To process, store, or transmit data efficiently, it must be represented in a standard format.

  3. Makes data accurate, fast, and easy to manipulate.


Types of Data Representation in Computers

1. ASCII (American Standard Code for Information Interchange)

  • Represents characters using 7 or 8 bits.

  • Can encode 128 characters (7-bit) or 256 characters (8-bit extended ASCII).

  • Includes:

    • Letters: A-Z, a-z

    • Digits: 0-9

    • Symbols: @, #, $, etc.

    • Control characters: Enter, Backspace, Tab

Example:

  • 'A' → 65 (binary: 01000001)

  • 'a' → 97 (binary: 01100001)

Use: Mainly in older systems and simple text files.
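The ASCII examples above can be reproduced in Python, where `ord()` gives a character's code and `chr()` reverses it:

```python
# ASCII codes: ord() gives the numeric code, chr() converts back.
print(ord("A"))                 # 65
print(format(ord("A"), "08b"))  # 01000001 (the 8-bit binary form)
print(ord("a"))                 # 97
print(chr(65))                  # A
```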


2. Unicode (UTF-8, UTF-16, UTF-32)

  • Developed to represent all characters from all languages.

  • Can store thousands of symbols from different languages (English, Hindi, Chinese, Arabic, emojis, etc.)

  • Uses UTF (Unicode Transformation Format) to encode characters:

    • UTF-8: 1–4 bytes per character (most common on the web)

    • UTF-16: 2–4 bytes per character

    • UTF-32: 4 bytes per character

Example:

  • 'A' → U+0041

  • 'अ' → U+0905

  • '😊' → U+1F60A

Use: Modern systems, websites, mobile apps, and international applications
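The variable width of UTF-8 and the U+ code points listed above can both be observed directly in Python:

```python
# UTF-8 uses 1-4 bytes per character; encoding shows the width directly.
assert len("A".encode("utf-8")) == 1    # basic Latin letter: 1 byte
assert len("अ".encode("utf-8")) == 3    # Devanagari letter: 3 bytes
assert len("😊".encode("utf-8")) == 4   # emoji: 4 bytes

# Code points match the U+ notation used above.
assert ord("A") == 0x0041
assert ord("अ") == 0x0905
assert ord("😊") == 0x1F60A

print("Unicode checks pass")
```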


3. EBCDIC (Extended Binary Coded Decimal Interchange Code)

  • Developed by IBM for mainframe computers.

  • Uses 8 bits per character.

  • Mainly used in older IBM mainframes.


In Simple Words

  • ASCII: Only basic English letters, numbers, and symbols.

  • Unicode / UTF: Supports all languages and special symbols.

  • EBCDIC: Older IBM-specific code.


Written by sandesh_Shiwakoti • Helping Students Succeed • Free Book Solutions and Study Guides