
Engineering the Economy: Reflections on Growth and Systems Thinking

Applying engineering principles and systems thinking to economic policy—sharing concerns about the risks of unchecked complexity




Introduction

Economic growth has long been the central metric of success in modern societies. Politicians promise it, investors demand it, and the media celebrates it. But we rarely stop to ask: growth of what, and at what cost?

From an engineering perspective that views the economy as a complex system, growth is not the enemy. In fact, growth has been and remains essential—particularly for emerging economies or those recovering from crisis. However, in highly developed societies, I believe that the function and relevance of growth must evolve. Just as a startup cannot rely on the same structures and processes as it scales, advanced economies must adapt their assumptions, institutions, and incentives to reflect new constraints and priorities once basic needs are met and development has plateaued.

This is a familiar problem in tech: as teams and codebases grow, the processes that worked during early-stage hustle often become liabilities. What scaled you up won’t sustain you later. (See: Blitzscaling by Reid Hoffman).

There’s also a dangerous structural dependency: tax systems and welfare models are often growth-dependent. When growth slows—due to demographics, saturation, or resource limits—budgets tighten, promises falter, and political tensions rise. Developed societies risk instability if they fail to decouple basic public financing from endless expansion.

That doesn’t mean we should stop striving. Just as athletes continue training not to grow indefinitely but to maintain form, avoid injury, and sustain performance, mature economies must continue investing, evolving, and improving—without chasing growth for its own sake.

In this post, I offer a systems-oriented view of why our traditional growth paradigm needs refactoring, and what it might mean to build something more adaptive, resilient, and sane.

Lessons from Systems Thinking

As someone trained in engineering and software architecture, I see systems thinking as a powerful lens—one that applies far beyond code or infrastructure. It’s a framework for understanding complexity, dependencies, and failure modes in any domain, including companies, teams, or entire societies.

In a software organization, for example, the “system” isn’t just the product—it’s the people, processes, incentives, tools, and norms. If one part is over-optimized (say, output metrics) while others degrade (team morale, architectural integrity), the entire system suffers.

We should apply the same scrutiny to entire economies.

First, overly centralized systems become fragile and rigid. Planned economies fail to respond to localized needs or to incorporate decentralized feedback, making them prone to inefficiencies and collapse. One core issue is informational: central planners cannot possibly access or process the real-time, granular knowledge distributed throughout a complex economy. The required data isn’t just hard to gather—it’s emergent, context-dependent, and often tacit. The sheer cognitive and computational complexity of optimizing an entire economic system centrally exceeds any feasible planning capability. The result is overfitting to abstract goals and underperformance in real-world dynamics.

This mirrors the challenges faced by large software monoliths: as systems grow and more teams are involved, the tight coupling and global dependencies begin to stifle agility. Coordination becomes a bottleneck, local changes are slowed by centralized constraints, and the cost of evolving the system skyrockets. At some point, the rigidity of the monolith becomes more costly than the complexity of breaking it apart. That’s when microservices—with their modularity, team autonomy, and domain boundaries—become not just viable, but necessary. Similarly, decentralized economies introduce localized optimization, faster feedback, and greater resilience by allowing parts of the system to evolve independently without awaiting central approval.

Second, overly fragmented systems become incoherent. In hyper-financialized capitalism, for instance, the real economy becomes decoupled from financial markets. Capital flows chase speculative returns, not productive investment. Layers of abstraction—derivatives, financial instruments, algorithmic trading—amplify volatility and create systemic risk, often with little connection to underlying value creation.

Third, a lack of feedback correction creates echo chambers. We see this in social media ecosystems and ideological silos, where reinforcing loops prevent systems from learning or adapting meaningfully. There’s also a broader trend: ideologies are increasingly followed with the fervor of religions. When loyalty to a worldview takes precedence over observation, critical thinking suffers. Instead of asking “what works?” and analyzing real-world data, people become committed to defending theoretical purity. This undermines the very feedback loops that allow complex systems to self-correct and evolve.

Finally, monocultures die from lack of diversity. This principle holds across domains. In agriculture, dependence on a single crop—like the Irish reliance on potatoes in the 19th century or today’s uniform corn and wheat—leads to systemic vulnerability (see: Michael Pollan’s The Omnivore’s Dilemma). In business, lack of innovation and rigid cultural thinking are linked to stagnation and collapse (see: Clayton M. Christensen’s The Innovator’s Dilemma). Diversity—ecological, organizational, or cognitive—acts as insurance against shocks and a catalyst for resilience.
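The feedback-correction point above can be made concrete with a toy simulation. This is a minimal sketch, not a model of any real economy: a reinforcing loop amplifies deviations from a target, while a balancing (corrective) feedback term pulls the system back. The function name and parameters are illustrative.

```python
def simulate(steps: int, gain: float, correction: float,
             target: float = 1.0, x: float = 1.1) -> float:
    """Iterate x, where each step amplifies the deviation from the
    target by `gain` and damps it by `correction`."""
    for _ in range(steps):
        deviation = x - target
        x = x + gain * deviation - correction * deviation
    return x

# Pure reinforcement: the deviation compounds and runs away.
runaway = simulate(steps=50, gain=0.2, correction=0.0)

# With corrective feedback stronger than the gain, the system
# converges back toward its target.
stable = simulate(steps=50, gain=0.2, correction=0.3)
```

The arithmetic is trivial, but the asymmetry is the point: without a correcting loop, even a small initial deviation grows without bound; with one, the same system settles.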

In complex software systems, we don’t assume success—we measure it. We deploy monitoring, tracing, and analytics. We define KPIs and OKRs to track whether interventions are effective. We test, iterate, and refine. We know that blindly scaling complexity without insight leads to technical debt and eventual failure.
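In code, that "measure, don't assume" discipline often reduces to something as simple as gating a change on a KPI with an explicit tolerance. A hedged sketch, with hypothetical names (`Kpi`, `evaluate`) not drawn from any specific tool:

```python
from dataclasses import dataclass

@dataclass
class Kpi:
    name: str
    target: float     # desired value, e.g. a success rate
    tolerance: float  # how far below target is still acceptable

def evaluate(kpi: Kpi, measured: float) -> str:
    """Return a decision: keep the change, watch it, or roll it back."""
    if measured >= kpi.target:
        return "keep"
    if measured >= kpi.target - kpi.tolerance:
        return "watch"
    return "rollback"

availability = Kpi(name="checkout_success_rate", target=0.99, tolerance=0.02)
```

Nothing here is sophisticated; what matters is that the rollback path exists and is triggered by data, not by debate.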

Legislators, on the other hand, rarely monitor the effectiveness of their laws once passed. The legal corpus tends to accumulate like unmaintained legacy code—layer upon layer of outdated, conflicting, or redundant regulation—with few mechanisms for systematic review, rollback, or cleanup. No single stakeholder feels accountable for pruning this complexity, and without clear metrics or feedback loops, there’s little incentive to refactor failing parts of the system.

And when a codebase grows too large without refactoring, it becomes a monster—fragile, opaque, and hostile to change. The same applies to economic systems. Without modularity and evaluative checkpoints, things break silently and systemically.

Resilience comes not from maximizing throughput, but from building slack, modularity, diversity, and graceful degradation.
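Graceful degradation, in particular, has a very direct software expression: when the rich answer is unavailable, serve a degraded-but-useful one instead of failing hard. A minimal sketch, with hypothetical names (`fetch_recommendations`, `POPULAR_ITEMS`); here the primary dependency is simulated as always timing out:

```python
POPULAR_ITEMS = ["a", "b", "c"]  # cheap, cached fallback content

def fetch_recommendations(user_id: str) -> list[str]:
    # Stand-in for a call to a personalization service that is down.
    raise TimeoutError("personalization service unavailable")

def recommendations_with_fallback(user_id: str) -> list[str]:
    """Prefer the rich answer; degrade gracefully when it fails."""
    try:
        return fetch_recommendations(user_id)
    except TimeoutError:
        return POPULAR_ITEMS
```

The system stays up at reduced quality rather than propagating the failure outward—which is exactly the property brittle, throughput-maximized designs lack.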

The Myth of Costless Expansion

One increasingly dangerous design flaw in modern economies is the reliance on central bank-fueled spending. Governments can accumulate debt seemingly without limits, aided by monetary policies that obscure the true cost of capital. This creates a dynamic where long-term responsibility is sacrificed for short-term convenience—a pattern that undermines fiscal integrity and democratic accountability.

From a systems perspective, it’s as if the cost function has been decoupled from system inputs. The feedback loop is broken. The result is a growing pile of sovereign debt, financed through monetary expansion, which distorts markets and creates the illusion of stability.

This critique echoes concerns raised by the Austrian School of Economics, which has long criticized the role of central banks in enabling unsustainable government spending and distorting market signals. Thinkers like Carl Menger, Ludwig von Mises, and Friedrich Hayek argued for the importance of decentralized knowledge, stable currency, and spontaneous order in complex economic systems. Their rejection of the labor theory of value in favor of marginal utility theory reinforced their belief that prices should emerge organically from individual preferences—not be manipulated through centralized planning or monetary policy.

For them, a sound economy relies on a stable, non-manipulated currency—one that governments cannot inflate at will. Otherwise, market signals become distorted, incentives misaligned, and meaningful evaluation of policy outcomes impossible. Without a stable unit of account, it’s difficult to assess whether a given intervention is truly working or just temporarily masked by monetary expansion. This undermines our ability to make informed adjustments and iterate effectively.

These ideas remain controversial, but they highlight an important question: Can any complex system remain coherent if the constraints that define its stability are persistently undermined?

When Growth Becomes Fragile

Every system has a carrying capacity. When pushed past its thresholds—whether through resource depletion, social inequality, or institutional overreach—the feedback loops become unstable. The resulting symptoms—mass migrations, political polarization, widespread burnout—aren’t isolated events. They’re interconnected failures that emerge when a system operates far beyond its sustainable limits.

Even if the root causes are debated, the pattern is familiar: strained systems behave erratically. They lose their capacity to adapt, to self-correct, or to absorb shocks. What once looked like resilience is revealed as a fragile equilibrium—one that collapses under accumulated stress.

From a technical standpoint, it’s like increasing throughput by bypassing all throttling mechanisms. It might work in the short term. But eventually, your buffers overflow, your system crashes, or worse—it silently corrupts.
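To make the throttling analogy concrete, here is a minimal token-bucket sketch (illustrative only): the bucket caps how many requests a burst can push through, and removing it is exactly the "bypass all throttling" failure mode described above.

```python
class TokenBucket:
    """A minimal token-bucket throttle: requests spend tokens,
    which are replenished at a fixed rate per tick."""

    def __init__(self, capacity: int, refill_per_tick: int):
        self.capacity = capacity
        self.tokens = capacity
        self.refill = refill_per_tick

    def tick(self) -> None:
        # Replenish tokens, never exceeding capacity (the "buffer").
        self.tokens = min(self.capacity, self.tokens + self.refill)

    def allow(self) -> bool:
        # Admit a request only if a token is available.
        if self.tokens > 0:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=5, refill_per_tick=1)
# A burst of 20 requests arrives before any refill tick:
accepted = sum(bucket.allow() for _ in range(20))  # only 5 get through
```

The throttle turns an unbounded burst into a bounded load; without it, every one of those 20 requests would hit the system at once.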

Toward Meaningful Growth

This isn’t a romantic call for a return to pre-industrial life. Growth can be good—when it’s purposeful, bounded, and aligned with long-term system integrity.

Some of the emerging alternatives are worth watching. Metrics like the Genuine Progress Indicator (GPI) or Human Development Index (HDI) attempt to track actual well-being rather than just financial throughput. They’re not silver bullets, but they shift the focus from scale to quality.

Rather than prescribing a new orthodoxy, I’m more interested in the spirit of iterative reform: test different models, measure their effects, and adjust.

The Case for Incrementalism

I consider myself a reformist, not a revolutionary. Complex systems—from software architectures to national economies—are dangerous to change abruptly. One wrong move and everything crumbles. History is full of failed utopias launched with grand theories and no rollback plan.

Instead, I believe in small, measurable experiments. Change, observe, adapt, repeat. It’s how we build stable systems and resilient societies.

These principles are already visible in the real world. The Swiss model is one example: cantons compete to create attractive environments within a shared federal framework. It’s a demonstration of how competition, subsidiarity, and distributed experimentation can improve system outcomes without centralized coercion.

China provides another interesting case. Its special economic zones allowed it to test policy and market reforms in limited, controlled areas before expanding successful experiments nationwide. These zones acted as sandboxes—allowing for feedback, adaptation, and course correction.

What makes the comparison meaningful is how different China and Switzerland are—culturally, politically, ideologically. And yet both applied a similar principle: experiment locally, measure carefully, and scale only what works. This convergence across such divergent contexts points to something deeper: that effective policy design requires intellectual humility and empirical rigor. We should care less about where ideas come from, and more about whether they work.

Too often, ideologies function like belief systems, discouraging open inquiry and honest evaluation. If we want better outcomes, we must be willing to test, to fail, to learn—and above all, to let evidence, not dogma, guide our next iteration.

These examples show that in complex systems, learning and progress rarely emerge from top-down master plans. They come from controlled experiments, tight feedback loops, and iterative improvements. Start small. Observe. Adapt.
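The "start small, observe, adapt" loop can be sketched as a staged rollout that gates each expansion on evidence. A hedged sketch, with all names and numbers hypothetical:

```python
def staged_rollout(stages: list[str], measure, threshold: float) -> list[str]:
    """Expand a change stage by stage, adopting each stage only if
    its measured effect clears the threshold; stop at the first miss."""
    adopted = []
    for stage in stages:
        effect = measure(stage)
        if effect < threshold:
            break  # stop expanding; keep what already works
        adopted.append(stage)
    return adopted

# Hypothetical measured effects for three pilot zones:
effects = {"zone_a": 0.12, "zone_b": 0.08, "zone_c": 0.01}
adopted = staged_rollout(["zone_a", "zone_b", "zone_c"],
                         measure=effects.get, threshold=0.05)
```

The design choice worth noting is the early `break`: a failed stage halts expansion without rolling back the stages that already proved themselves, which is the rollback-friendly property grand top-down plans lack.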

Closing

I’m not an economist, but I work with complex systems. And from where I stand, endless growth looks less like a solution and more like a legacy bug. If we want to design a system that lasts, we’ll need to refactor our assumptions.

