
From Scaffolding to Symphonies

Is AI coding a revolution or just the next layer of abstraction? From Assembly to Low-Code, we’ve spent decades moving further from the machine to focus on the idea. This article explores the history of “scaffolding” and the trade-off between speed and control.

In the previous article, we explored vibe coding, guiding AI to produce software without necessarily knowing every detail underneath.

It might feel like something entirely new. But in reality, software development has been evolving in this direction for decades. Every so often, a new layer of abstraction appears that makes programming less difficult — and more accessible.

Another Layer of Abstraction

Early programming required working very close to the machine. You wrote assembly (1949–1950s), managed memory manually, and thought in terms of registers and instructions.

Then higher-level languages and tools appeared, each raising the level of abstraction:

  • FORTRAN (1957) — simplified math-heavy computation for scientific purposes.
  • COBOL (1959) — English-like syntax for business programming.
  • BASIC (1964) — beginner-friendly language for students and hobbyists.
  • SQL (1974) — declarative database queries, replacing record-by-record operations.
  • Excel (1985) — spreadsheets and macros made programming concepts accessible to millions of non-programmers.
  • Visual Basic (1991) — rapid GUI application development with drag-and-drop forms.
  • Low-code / No-code platforms (mid-2000s onward) — visual environments for building applications without writing traditional code.
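The jump a tool like SQL represents can be sketched with Python’s built-in sqlite3 module: one declarative statement replaces an explicit record-by-record loop. The table and data here are made up purely for illustration.

```python
import sqlite3

# In-memory database with a made-up orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 30.0), ("bob", 20.0), ("alice", 50.0)],
)

# Record-by-record: walk every row and aggregate by hand.
totals = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + amount

# Declarative: describe the result you want; the engine decides how.
query_totals = dict(
    conn.execute("SELECT customer, SUM(amount) FROM orders GROUP BY customer")
)

assert totals == query_totals  # same answer, far less mechanical work
```

The second version says nothing about iteration order, accumulators, or lookups — that mechanical work moves below the abstraction line.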

Each step reduced friction, broadened access, and allowed developers to focus more on ideas and composition than on repetitive mechanical work.

Scaffolding Changes How We Work

Once scaffolding exists, the approach to development changes.

Instead of starting from scratch:

  • You create models.
  • You generate migrations.
  • You write controllers and services.
  • You wire together prebuilt pieces.

The basic architecture appears quickly, and you can focus on what actually makes the software unique. In other words, the orchestra is already assembled.
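At its core, a scaffolder is a template filler: given a model name and its fields, it emits the boilerplate you would otherwise write by hand. The toy function below is not any real framework’s API, just a minimal sketch of the idea.

```python
def scaffold_migration(model: str, fields: dict[str, str]) -> str:
    """Generate a CREATE TABLE migration for a model (toy example)."""
    columns = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in fields.items())
    return (
        f"CREATE TABLE {model.lower()}s (\n"
        f"  id INTEGER PRIMARY KEY,\n"
        f"  {columns}\n"
        f");"
    )

# One short description produces a ready-made migration.
print(scaffold_migration("Post", {"title": "TEXT", "body": "TEXT"}))
```

Real scaffolders do the same thing at a larger scale — models, controllers, views, and tests stamped out from a one-line description.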

AI Pushes This One Step Further

AI-assisted development continues the same progression.

Frameworks removed the need to write repetitive infrastructure code.
AI removes the need to write repetitive application code.

Describe the feature you want, and the scaffolding appears for:

  • Controllers
  • Validation
  • Migrations
  • Tests
  • Other boilerplate logic

The scaffolding now applies to the music itself, not just the concert hall.

The Trade-off Behind Scaffolding

Every abstraction layer in programming comes with a trade-off. You gain speed and accessibility, but you lose some control and visibility into what’s happening underneath.

Spreadsheets made programming accessible to millions, but large Excel models can become incredibly difficult to debug. Low-code and no-code platforms allow applications to be built quickly, until you hit the edges of what the platform allows.

Frameworks follow the same pattern. They remove a lot of repetitive work, but they also hide complexity behind conventions.

AI-assisted development pushes this pattern even further.

When AI generates code for you, the system may work — but the architecture might not always be obvious. As projects grow larger, the AI may struggle to keep the full system in context, producing solutions that work locally but don’t always fit the broader design.

The orchestra becomes larger and the instruments more powerful, which makes the role of the conductor even more important.

A Familiar Risk: Dependency

Nearly every application depends on something: frameworks, libraries, databases, infrastructure. We rely on them because they save time and let us build more complex systems.

But that convenience always comes with a trade-off.

Low-code platforms made this very visible. If your application lives entirely inside one platform, you are constrained by what it allows — and dependent on how it evolves.

The same applies to frameworks and AI tools.

If your workflow or application depends heavily on a single platform, you become exposed to decisions outside your control.

  • Prices may change.
  • API limits may appear.
  • The service might go down.
  • The model itself may evolve in ways that change its behavior.

When you vibe code, you aren't just inheriting a library; you're inheriting a black box. If the 'vibe' shifts because a model is updated, your foundation moves without you touching a single line of code.

These are not reasons to avoid these tools. But with AI, the dependency becomes more immediate: it lives not just in your runtime, but in how you build the software itself.

Learning the Composition

The pattern repeats throughout the history of programming: we moved from low-level programming like assembly to higher-level languages such as FORTRAN, COBOL, and BASIC. Later, tools like SQL, Excel, and Visual Basic made it possible to express ideas without writing every step manually.

This doesn’t eliminate the need for understanding. It shifts where the understanding is most valuable.

If you want to be a conductor — guiding the AI orchestra — you still need to understand the instruments, the music, and the composition. The faster the orchestra plays, the more critical your guidance becomes.

You still have to do the work, study the system, debug when necessary, and shape the music. AI can help you produce the notes, but understanding the symphony requires experience.

In the next article, we’ll look at where AI truly shines in day-to-day development — the features, tasks, and workflows where having an orchestra at your disposal becomes genuinely transformative.

Let’s explore!
