takuranyagumbo.com

Spreading the Gospel of the Computer & Other Side Quests


Computer History · Clueless Pundit

Why I Am Building a Computer History Library

I started studying computer history to write better programs. Somewhere along the way, the project became a training ground for storytelling, design, branding, and understanding the machines we now trust with almost everything.

Published: 13 May 2026 · Blog + playlist bundle · 8 min read
Computer History · Programming · Software Engineering · Storytelling · Computing Origins · Clueless Pundit

A gateway into the Clueless Pundit computer history library: old machines, programming roots, and the ideas underneath modern software.

I started building a computer history library because I am trying to write better computer programs.

That sounds like a strange place to begin.

Most programmers who want to improve reach for a new framework, a new language, a new tool, a new certification, or a YouTube tutorial.

I decided to go backwards.

Not because I hate modern tools. I use them every day. But the more I write software, the more I realise that modern computing is a tower of abstractions stacked so high that most of us are writing code from somewhere near the clouds, waving our little JavaScript flags, while barely understanding the machinery underneath.

It is also dangerous.

If you only understand the surface, every problem starts to look like a framework problem. Every solution becomes another dependency. Every mistake becomes something to patch over instead of understand. It is like using a megaphone at a whispering contest: impressive equipment, completely wrong instinct.

So I wanted to understand the computer itself.

And there is no better place to start than the start.

Not the polished version of computing, where machines sit quietly on desks and pretend to be obedient. I mean the awkward, mechanical, noisy, frustrating beginning. Punched cards. Looms. Gears. Tabulators. Relays. Switches. Vacuum tubes. Mathematical obsession. Impossible patience. People looking at physical machinery and seeing the first shadows of computation.

The first major video I made was about Joseph-Marie Jacquard. I released it on the 21st of August 2025. Since then, I have made four more videos covering some of the major contributions to computing between 1806 and 1937.

On paper, that sounds neat. In reality, it has been brutal. I nearly quit several times. Not once. Not twice. Several times.

Making these videos has tested me in ways I did not expect. Research is hard. Writing is hard. Recording is hard. Editing is hard. Explaining technical history without turning it into academic cough syrup is very hard. Trying to make old machines feel alive on screen is a full-contact sport.

There were moments when the whole thing felt like trying to type while someone kept jumping in and hitting the Backspace key. Every video exposed another weakness. My writing needed work. My voice needed work. My visuals needed work. My pacing needed work. My understanding of the subject needed work. My branding needed work. My patience needed a blood transfusion.

I am glad I did not quit, because the project is starting to give something back. The obvious benefit is that I now understand computing history better than I did before. I have learned about early innovators, their inventions, their constraints, their blind spots, and their moments of genius. But the deeper benefit is this: I am starting to see modern computing differently. That matters.

History is not just a museum of dead ideas. It is a record of problems people had to solve. One of my favourite lines says, "tradition is a set of solutions for which we have forgotten the problems." Throw away the solution, and the problem comes back. Sometimes it has changed shape. Often, it is still there, waiting patiently like a debt collector with excellent memory. Computing history works the same way.

  • Why do we separate hardware and software?
  • Why do computers use binary?
  • Why did programming languages become necessary?
  • Why did storage, memory, input, and output evolve the way they did?
  • Why do we still talk about algorithms written before electronic computers existed?
  • Why are modern developers still wrestling with old problems wearing newer trousers?

These are not trivia questions. They are foundation questions, and foundations matter.

A programmer who understands only the latest tool can be useful. A programmer who understands the history underneath the tool has a better chance of developing judgement. That is a different animal. Tools change. Judgement compounds.

That is where the project stops being just my little rabbit hole. Software now runs banks, schools, hospitals, businesses, farms, cars, phones, side hustles, entertainment, government systems, and whatever chaos people are cooking on WhatsApp at midnight. Yet many of us who build software only understand the layer we touch. That is not because modern developers are stupid. It is because modern tools are extremely good at hiding complexity. They let us move fast, but they also make it easy to forget what we are standing on.

History slows you down in the right way. It forces you to ask better questions. It reminds you that every convenience was once a breakthrough. Every abstraction was once a fight. Every "obvious" idea was once strange, expensive, controversial, or close to impossible.

When you see that, you stop treating technology as magic. You start seeing it as accumulated human effort. That shift is important, especially from where I stand.

In many African tech conversations, we often focus on jobs, certificates, frameworks, and “how to break into tech.” Those things matter. People need work. People need income. People need practical routes into the industry. But if that is the only conversation, we risk producing tool users without a technical culture. I do not want that.

I want to understand computing as a craft, an industry, a history, and a civilisation-scale story. I want young developers, especially here, to see that they are not merely learning syntax. They are entering a long chain of human problem-solving.

That chain runs through Jacquard, Babbage, Lovelace, Hollerith, Shannon, Turing, Zuse, von Neumann, Hopper, Booth, and many others. Some were mathematicians. Some were engineers. Some were eccentrics. Some were fighting wartime constraints. Some were trying to automate drudgery. Some were trying to make machines think. Some were probably one bad committee meeting away from levitating off the ground with rage. But they all contributed to the world programmers now inherit. That inheritance is worth understanding.

The other surprise is that this library is teaching me far more than computer history. It is teaching me video production.

Every episode is a small workshop in storytelling, structure, pacing, visuals, sound, editing, and restraint. I am learning that explaining an idea clearly is not the same thing as knowing it. Knowledge can sit in your head looking impressive. Communication forces it to put on work boots.

It is also teaching me branding and design. A video is not just information. It is packaging, rhythm, visual identity, tone, trust, and memory. A good subject can still fail if presented poorly. A strong idea can be buried under weak visuals, bad pacing, or a title with all the charm of wet cardboard. That has been an education of its own.

And quietly, there are business lessons everywhere.

  • Consistency matters.
  • Positioning matters.
  • Audience trust matters.
  • Execution matters far more than the fantasy version of the idea in your head.

It is easy to imagine a beautiful project. It is much harder to sit down and build the damn thing when the research is messy, the edit is fighting you, the script is misbehaving, and your confidence has gone out for milk and not returned.

The library is not just a collection of videos. It is a training ground. A place where I am sharpening my understanding of computers while learning how to explain difficult things in public. A place where programming, history, design, storytelling, and business keep colliding in useful ways. That collision is the point.

I still have a long way to go. The library is young. The process is rough. The standard is rising faster than my ability, which is deeply inconvenient, but probably healthy.

Next, I am looking forward to making videos about Konrad Zuse, the ENIAC, John von Neumann, Alan Turing, Kathleen Booth, Grace Hopper, and many others. Those stories are going to be a lot of fun to make. Not easy. Fun. There is a difference.

Easy is scrolling through other people’s work and calling it research. Fun is wrestling with a difficult idea until it starts speaking clearly. Easy is consuming. Fun is building. That is why I am building a computer history library.

I want to become a better programmer, yes. But I also want to become the kind of person who understands the tools he uses instead of decorating ignorance with syntax highlighting. I want to become a better storyteller. I want to understand the machines I work with every day. I want to preserve the origins of an industry that is shaping the future.

And maybe, if I do the work properly, someone else will see what I am starting to see: computing is not just code on a screen. It is people trying to solve problems with whatever tools they had available. Sometimes that tool was a loom. Sometimes it was a stack of punched cards. Sometimes it was a room-sized machine that needed more care and feeding than a royal baby. Today, it might be a laptop, a compiler, a cloud service, or an AI assistant confidently hallucinating its way into your blood pressure medication.

Different tools. Same old human problem: How do we turn thought into machinery?

That journey is worth documenting.

Watch the library

Computer History Playlist

The companion playlist collects the Clueless Pundit computer history episodes, from Jacquard and Lovelace to Hollerith, Shannon, and the next machines in the chain.

About Clueless Pundit

Let’s Debug History

Clueless Pundit digs through computer history, old machines, strange inventions, forgotten rivalries, and the chain of brilliant accidents that eventually gave us modern computing. Less museum whispering. More electricity in the wallpaper.