Clueless Pundit · Computer History

Coding in the 1840s: Who Wrote the First Computer Program?

Three names collide in 1842 and 1843: Charles Babbage, Luigi Menabrea, and Ada Lovelace. The twist is simple: the computer they were “programming” was mostly a blueprint. So the real question is not just who was first, but what we even mean by “program”.

Analytical Engine · Note “G” · First Program Debate
Video + Article · 1842 → 1843

Everyone’s talking about AI like it is about to take over the world. Fair enough. But if you want to understand how we got here, you have to go back to the first step. In the last episode, a loom taught us that machines can follow instructions. Now we go hunting for a bigger claim: who wrote the first computer program?

There is one awkward detail. In 1842 and 1843 there was no working computer in the modern sense. There was a plan for one: the Analytical Engine. So this becomes a very Victorian puzzle. Can you write a program for a machine that does not exist yet?

The key idea

“First program” is not a single question. It depends on the standard you use. Earliest written? First published? First fully worked algorithm you can reproduce step by step? Different standards, different winners.

The machine and the rules

To write a program, you need rules. You need a machine model that is specific enough that someone could, in theory, take your instructions and implement them in hardware or punch them onto cards.

Babbage’s Analytical Engine was designed around three big ideas:

  • The Store (memory) to hold numbers.
  • The Mill (the arithmetic unit) to do operations.
  • Punched cards to control operations and data movement, inspired by industrial automation.

In modern terms, the translation is almost rude in its simplicity:

  • Store = RAM
  • Mill = CPU
  • Cards = Program + I/O

The important part is not the metaphor. It is the control structure. The card deck can specify an operation, point to memory locations, introduce constants, and it can also support repetition and conditional jumps. That is the difference between a fancy calculator and a general-purpose computer.
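To make that control structure concrete, here is a deliberately toy sketch of the card-deck model in Python. The card format and operation names are invented for illustration (Babbage's actual card scheme was far more elaborate); the point is only that operations, Store addresses, constants, and a conditional backward jump are enough to get repetition.

```python
# A toy sketch of the Engine's control model. The card format and
# operation names here are our invention, not Babbage's actual scheme.
def run(cards, store):
    pc = 0  # current position in the card deck
    while pc < len(cards):
        card = cards[pc]
        op = card[0]
        if op == "const":            # introduce a constant into the Store
            _, value, dst = card
            store[dst] = value
        elif op == "add":            # Mill operation: store[dst] = store[a] + store[b]
            _, a, b, dst = card
            store[dst] = store[a] + store[b]
        elif op == "jump_if_pos":    # conditional jump: the general-purpose ingredient
            _, src, target = card
            if store[src] > 0:
                pc = target
                continue
        pc += 1
    return store

# Sum 3 + 2 + 1 by looping: repetition via a backward conditional jump.
store = run([
    ("const", 3, "n"),
    ("const", 0, "total"),
    ("const", -1, "neg1"),
    ("add", "total", "n", "total"),  # card 3: total += n
    ("add", "n", "neg1", "n"),       #         n -= 1
    ("jump_if_pos", "n", 3),         # loop back to card 3 while n > 0
], {})
print(store["total"])  # prints 6
```

Strip out the `jump_if_pos` card and the machine above is just a calculator with a long to-do list. With it, the deck can decide where to go next, and that is the whole difference.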

Why this matters

A “program” is not a vibe. It is a sequence of machine operations that can be executed, repeated, and checked. Once the rules exist, code can exist on paper.

Menabrea’s linear equations (1842)

In 1842, Luigi Menabrea published a report describing Babbage’s machine and included a worked method for solving simultaneous linear equations. Think of it like finding where two lines meet, but written in a form that maps neatly onto Store and Mill operations.

This is a serious candidate for “first program in print”, because it goes beyond vague description. It outlines steps and intermediate results in a way that looks very much like a procedure designed for the Engine. It also proves something important: yes, you can write programs without having the machine in the room.

The catch is that it reads more like a walkthrough than a fully developed, reusable algorithm. It hints at repetition, but it does not lean hard into explicit looping and full trace tables. If your standard is “first published program”, Menabrea is strong. If your standard is “first fully worked algorithm you can run step by step without guessing”, we keep digging.
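Menabrea's method can be paraphrased in modern terms as a fixed sequence of Mill-style operations on Store variables. The sketch below uses the elimination (determinant) form of the two-equation problem; the layout and names are ours, not his published table, but each line corresponds to the kind of operation the Engine was meant to perform.

```python
from fractions import Fraction

def solve_2x2(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 as a fixed
    sequence of multiply/subtract/divide steps.
    A modern paraphrase of the method, not Menabrea's exact layout."""
    d = a1 * b2 - a2 * b1               # common denominator (must be nonzero)
    x = Fraction(c1 * b2 - c2 * b1, d)  # eliminate y, solve for x
    y = Fraction(a1 * c2 - a2 * c1, d)  # eliminate x, solve for y
    return x, y

# Where do x + y = 3 and x - y = 1 meet?
x, y = solve_2x2(1, 1, 3, 1, -1, 1)
print(x, y)  # prints: 2 1
```

Notice what makes this "program-like" rather than "maths-like": the same fixed sequence of operations works for any coefficients you load into the Store. That is exactly the quality Menabrea's walkthrough has, even without explicit loops.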

Lovelace’s Note “G” (1843) and the Bernoulli algorithm

In 1843, Ada Lovelace translated Menabrea’s paper into English and added Notes A to G. The notes are the main event. In Note “G”, she outlines a complete procedure for computing Bernoulli numbers.

You do not need to love the maths to appreciate what Note “G” does well. It is machine-oriented. It is precise about what is read from the Store, what the Mill does, and where results go. Most importantly, it includes two ingredients that make programmers relax their shoulders: explicit looping and a trace table that records values after each step.

That combination is why Note “G” is often treated as the first published, fully worked algorithm for a general-purpose computer. Not the first time anyone ever thought about procedures, of course, but a particularly clear “this is what the machine would do” presentation.

Why Note “G” is treated differently
Sequence: clear step-by-step Engine operations.
Looping: repetition is explicit, not implied.
Trace: values are recorded after steps, so you can verify.
Collaboration: Babbage reviewed and suggested changes.
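For flavour, here is a modern sketch of a Bernoulli-number computation with a printed trace, in the spirit of Note "G". It uses the standard recurrence (sum of C(m+1, j)·B_j over j = 0..m equals zero, with B_0 = 1), which is a modern restatement, not Lovelace's exact operation sequence or her numbering of the B's.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers via the standard recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1.
    A modern restatement, not Lovelace's exact operation sequence."""
    B = [Fraction(1)]  # B_0
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
        # trace in the spirit of Note "G": record the value after each step
        print(f"step m={m}: B_{m} = {B[m]}")
    return B[n]

print(bernoulli(8))  # prints -1/30
```

The two shoulder-relaxing ingredients are both visible: the `for` loop is the explicit repetition, and the printed lines are the trace table. Exact fractions stand in for the Engine's fixed-point decimal arithmetic.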

Babbage’s drafts and correspondence (1830s to 1843)

Before anything hit print, there were drafts. Babbage filled notebooks with Engine procedures in the late 1830s and early 1840s. If your standard is priority (earliest surviving program, even unpublished), that evidence matters. It suggests Babbage was writing “code on paper” years before the public versions existed.

Then there is the 1843 correspondence. Lovelace is shaping a publishable exposition, asking questions, adding structure. Babbage reviews, proposes corrections, and suggests improvements. It looks like co-development, the way early work often is. If you have ever shipped software, this will feel familiar. The first program, like many modern ones, did not emerge from a single genius in a vacuum.

The standard of proof (your verdict)

Here is the part that annoys people who want a simple answer. Different standards give different answers, and all three standards below can be defended.

Three reasonable standards
  • Priority (earliest surviving program, even if unpublished): Babbage.
  • First published program (any worked procedure in print): Menabrea (1842).
  • First published, fully worked algorithm (sequence + explicit looping + trace): Lovelace, Note “G” (1843).

So, who wrote the first computer program? The honest answer is: it depends on what you mean by “first”. If you value earliest writing, Babbage is hard to ignore. If you value first in print, Menabrea has a strong claim. If you want the first complete, reproducible algorithm in print, Note “G” is the headline act.

Why this matters today

Software began as writing. Not code running on hardware, but careful instructions written for a machine that was still a concept. That is a strange origin story, and it also explains something modern programmers keep rediscovering: the hard part is not the silicon, it is the clarity.

Also, collaboration is not a modern invention. The first “code” was shaped in conversation, drafts, and revisions. So when people argue about credit in tech today, we are not dealing with a new problem. We are replaying a Victorian argument with better Wi-Fi.

Want the full investigation in video form?

The YouTube episode walks the evidence clue by clue, shows why “first” depends on the standard you choose, and makes the three standards much easier to compare.

References and further reading

Primary sources first, then modern commentary. If you read only one thing, read Note “G” and enjoy the sheer audacity of programming a machine that was not built.

  1. L. F. Menabrea (1842), Notions sur la machine analytique de M. Charles Babbage. Archive.org scan.
  2. Menabrea (1843 English translation) with Notes A–G by Ada Lovelace, Sketch of the Analytical Engine Invented by Charles Babbage. PDF (includes Note “G”).
  3. Charles Babbage (1864), Passages from the Life of a Philosopher. Archive.org PDF.
  4. Stephen Wolfram (2015), Untangling the Tale of Ada Lovelace. Wolfram’s original article.
  5. Wikipedia quick links (for context and rabbit holes): Analytical Engine, Ada Lovelace, Charles Babbage, Luigi Menabrea, Bernoulli numbers.

About Clueless Pundit

Clueless Pundit is where we take big tech-history ideas and drag them into the light, preferably without worshipping Silicon Valley like it is a sacred mountain.

If you enjoy computer history, weird inventions, paper “software”, and the occasional African analogy that hits a bit too close to home, you are in the right place.

Subscribe. Or don’t. The Victorians already did the hard part with pencil and paper.