// English for IT Students — Block 1

The Story of
Programming

From a single switch flipped inside a machine to millions of lines of human-readable code — the journey of how we learned to talk to computers.

Era 01 · 1940s

It begins with a
single switch.

Deep inside the first computers — machines the size of rooms — were thousands of vacuum tubes: electronic switches that could be either ON or OFF. That's it. Two states. Two numbers. (Transistors, their tiny successors, arrived in the following decade.)

Engineers discovered that combining these two states — 0 and 1 — was enough to represent any number, letter, or instruction. This is binary, the native language of all computers.

The letter "H" in 8-bit binary
Binary · "Hello" starts with 'H' = 72
// 8 bits (one switch each) encode one character
// Position: 128 64 32 16 8 4 2 1
0 1 0 0 1 0 0 0 // = 64+8 = 72 = 'H'
0 1 1 0 0 1 0 1 // = 64+32+4+1 = 101 = 'e'
0 1 1 0 1 1 0 0 // = 64+32+8+4 = 108 = 'l'
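
You can check those rows yourself in modern Python: ord() turns a character into its code number, and format(code, "08b") renders that number as eight bits. A quick sketch, using the same three letters as above:

Python · verify the 8-bit encodings above
# Character → code number → 8-bit pattern, matching the rows above
for ch in "Hel":
    code = ord(ch)  # 'H' → 72, 'e' → 101, 'l' → 108
    print(f"{ch!r} = {code:3d} = {format(code, '08b')}")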

To program the first computers, engineers had to write every instruction as a long string of 0s and 1s — then physically punch holes into paper tape or cards to feed those instructions into the machine. A program of 1000 instructions meant 1000 rows of holes.

"We were not programming. We were speaking the machine's own language, digit by digit, hole by hole." — Early ENIAC programmer, 1940s
Era 00 · 1801 → 1970s

Before computers,
there were looms.

🧵 Origin Story — This did not start with computers

In 1801, a French weaver named Joseph-Marie Jacquard wanted to automate his loom — a machine that wove complex silk patterns. Hand-weaving a pattern required a master craftsman to manually raise and lower hundreds of individual threads in exactly the right sequence. One mistake meant a ruined pattern.

Jacquard's solution was revolutionary: he punched holes in stiff cardboard cards. Each hole told the loom: "raise this thread." No hole meant: "keep it down." Sound familiar? Hole = 1. No hole = 0. The loom read the cards automatically, weaving the same perfect pattern every time, with no human error.

Jacquard Loom · 1801 · Hole = thread UP (1), No hole = thread DOWN (0)
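
In modern terms, a card is just rows of bits. Below is a toy Python sketch of the same idea; the five-thread card is invented for illustration, but the rule is the loom's own: a hole raises the thread, no hole leaves it down.

Python · toy Jacquard card reader
# 1 = hole (thread UP), 0 = solid card (thread DOWN)
card_rows = ["10110", "01101", "11000"]  # invented sample rows, five threads each
for row in card_rows:
    raised = [i for i, bit in enumerate(row) if bit == "1"]
    print(f"card row {row}: raise threads {raised}")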

A few decades later, mathematician Charles Babbage saw Jacquard's cards and had a vision: what if you could use the same idea to control a calculating machine? His collaborator Ada Lovelace — widely considered the world's first programmer — wrote the first algorithm designed to be processed by a machine, describing it in notes longer than the original paper she was translating.

"The Analytical Engine weaves algebraic patterns just as the Jacquard-loom weaves flowers and leaves." — Ada Lovelace, 1843

To program a computer,
you punched holes in cards.

By the 1940s and 1950s, every major computer used punch cards as its primary input method. A programmer would sit at a keypunch machine — essentially a typewriter that punched holes instead of printing ink — and prepare a deck of cards, one card per line of code.

Each card held 80 columns, each column representing one character. The position and pattern of holes in that column encoded the character as binary. A program of 500 lines was a physical stack of 500 cards, held together with a rubber band, carried carefully to the computer operator, and fed into a card reader one by one.

Example punch card · IBM 80-column card · "HELLO" encoded as holes · card 001 of 500
Each column = 1 character. Dark holes = 1 (punched through). Light = 0 (solid card).
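
A rough model of that layout in Python. One simplification to flag: real IBM cards used the Hollerith zone-and-digit code rather than raw 8-bit ASCII, so this sketch keeps the simplified hole = 1, no hole = 0 picture used throughout this page.

Python · simplified punch-card renderer
# One column per character, one row per bit.
# (Real IBM cards used Hollerith zone/digit codes, not raw ASCII bits.)
def render_card(text: str) -> str:
    columns = [format(ord(ch), "08b") for ch in text]
    rows = []
    for bit in range(8):
        rows.append(" ".join("█" if col[bit] == "1" else "·" for col in columns))
    return "\n".join(rows)

print(render_card("HELLO"))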

Drop the deck? Game over. The cards would scatter across the floor — 500 cards that, unless you had marked them somehow, gave no clue which came first. Programmers quickly learned to draw a diagonal line across the top edge of a sorted deck so it could be re-sorted instantly if it was ever dropped. Debugging was physical. Programming was manual labor.

The programmer's nightmare — a dropped deck
// Your 500-card program, sorted correctly:
001: LOAD A, 0
002: LOAD B, 5
003: ADD A, B
...
500: PRINT RESULT

// After dropping the deck (same cards, scrambled):
✖ 500: PRINT RESULT
✖ 003: ADD A, B
✖ 001: LOAD A, 0
✖ 002: ??? ████
// Good luck.
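
If a deck did carry sequence numbers, as the listing above assumes, recovery was simply a sort on those numbers, which is exactly the job mechanical card sorters were built for. A toy sketch in Python, reusing the invented cards above:

Python · re-sorting a numbered deck
# Each card carries its sequence number; restoring program order is a sort.
import random

deck = [(1, "LOAD A, 0"), (2, "LOAD B, 5"), (3, "ADD A, B"), (500, "PRINT RESULT")]
random.shuffle(deck)                 # the drop
deck.sort(key=lambda card: card[0])  # the card sorter's job
for number, instruction in deck:
    print(f"{number:03d}: {instruction}")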

Punch cards remained the dominant form of programming input well into the 1970s. They were only gradually replaced by terminals — screens with keyboards where you could type and see your code, edit a single character without re-punching an entire card, and run it instantly. For programmers of that era, the terminal felt like a miracle.

Era 02 · Early 1950s

Humans invent
a shortcut.

Writing pure binary was exhausting and error-prone. So programmers invented Assembly language — a system of short, human-readable abbreviations called mnemonics that mapped directly to binary instructions.

Instead of 10110000 01100001, you could write MOV AL, 97. A special program called an assembler would then translate your readable code back into binary for the machine.

Assembly · print "Hi" to screen
; Print "Hi" using DOS interrupt 21h
MOV AH, 02h ; function 02h = print the character in DL
MOV DL, 72  ; 72 = ASCII 'H'
INT 21h     ; call DOS to print it
MOV DL, 105 ; 105 = ASCII 'i'
INT 21h     ; print again
MOV AH, 4Ch ; function 4Ch = exit program
INT 21h
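
What the assembler itself does can be sketched in a few lines of Python. The two encodings below are real x86 (MOV AL, imm8 assembles to byte B0, INT imm8 to byte CD); everything else about this toy is invented for illustration.

Python · a toy two-instruction assembler
# Translate mnemonics into machine-code bytes, as an assembler does.
def assemble(lines):
    out = bytearray()
    for line in lines:
        parts = line.replace(",", " ").split()
        if parts[:2] == ["MOV", "AL"]:
            out += bytes([0xB0, int(parts[2])])                  # MOV AL, imm8 → B0 xx
        elif parts[0] == "INT":
            out += bytes([0xCD, int(parts[1].rstrip("h"), 16)])  # INT imm8 → CD xx
        else:
            raise ValueError(f"unknown instruction: {line}")
    return bytes(out)

print(assemble(["MOV AL, 97", "INT 21h"]).hex(" "))  # prints: b0 61 cd 21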

Assembly was a revolution — but it was still machine-specific. Code written for one type of processor was useless on another. And even the simplest task required dozens of individual steps. The dream of writing code closer to human thought was just beginning.

Era 03 · Mid 1950s–1970s

The first languages
that think like people.

In 1954, IBM engineer John Backus led a team to create FORTRAN — the first widely used high-level programming language. For the first time, a programmer could write a mathematical formula almost exactly as it appeared on paper, and a program called a compiler would translate it all the way down to binary automatically.

In 1959, a committee building on Grace Hopper's pioneering work created COBOL, designed to read almost like English — making programs readable by business managers, not just engineers. The idea was radical: code should communicate with humans.

FORTRAN · 1957 · Calculate an average
C Calculate average of three numbers
      REAL A, B, C, AVG
      A = 10.0
      B = 20.0
      C = 30.0
      AVG = (A + B + C) / 3.0
      PRINT *, 'Average = ', AVG
      END
COBOL · 1959 · Business logic in plain English
IDENTIFICATION DIVISION.
PROGRAM-ID. SALARY-CALC.

DATA DIVISION.
WORKING-STORAGE SECTION.
01 ANNUAL-SALARY PIC 9(6)V99 VALUE 60000.
01 MONTHLY-PAY   PIC 9(6)V99.

PROCEDURE DIVISION.
  COMPUTE MONTHLY-PAY = ANNUAL-SALARY / 12
  DISPLAY "Monthly pay: " MONTHLY-PAY
  STOP RUN.

Then came C in 1972 — powerful enough to write operating systems, yet readable enough for humans. C became the parent of an entire family of modern languages. Almost every language you use today has C's DNA in it.

Era 04 · 1990s–Today

Code becomes
almost human.

By the 1990s, languages grew increasingly expressive. Python (1991) let programmers write logic that looked almost like plain English. JavaScript (1995) brought programming into every web browser. Compilers and interpreters became invisible — you wrote the idea, the machine did the rest.

Python · 2024 · The same "average" as before
# One line. Same result. 70 years of progress.
numbers = [10, 20, 30]
print(f"Average = {sum(numbers) / len(numbers)}")

Today, a single line of Python does what once required hundreds of lines of Assembly. High-level languages handle memory, type-checking, and data structures automatically. Programmers focus entirely on what they want to build — not how the transistors should switch.

The abstraction ladder — harder for humans to read at the top, easier at the bottom

Machine code · raw binary — what the CPU actually runs
  10110000 01001000 10110100 00001001 10111010 00001101 00000000 10110110 00000000 11001101 00100001
Assembly · short words — still very close to hardware
  MOV AH, 09h / MOV DX, msg / INT 21h
C / C++ · first truly readable syntax, 1972
  printf("Hello, world\n");
COBOL · English-like business language, 1959
  DISPLAY "Hello".
Java / C# · object-oriented, structured, 1990s
  System.out.println("Hello");
Python · almost plain English — 1 line
  print("Hello")

▲ closest to hardware · ▼ closest to human thought

One truth, endless languages.

Every Python script, every website, every AI model — at the very bottom, it is all transistors switching between 0 and 1. Programming languages are just layers of translation, built by humans so we don't have to think in binary.

binary · transistor · bit · byte · Jacquard loom · punch card · keypunch · card reader · Ada Lovelace · machine code · assembly · mnemonic · assembler · compiler · interpreter · high-level language · abstraction · FORTRAN · COBOL · Python · syntax