AI is being touted as a huge productivity boon. It’s going to lead to unprecedented good times as AI transforms industries worldwide, just like the computer before it. Economists have created frameworks that help model these kinds of disruptions. Let’s explore one, look at some historical trends and make some educated guesses as to whether AI is going to transform the world.
The Solow Growth Model.
The Solow Growth Model provides a framework for understanding how economies grow over time. The key assumption of this model is that capital is subject to diminishing returns. That makes sense to me. When capital is scarce, each new unit of capital generates bigger returns. When capital is abundant, it has less impact. The marginal productivity of new capital declines. Makes sense, right?
In this model, the key driver of sustained labour productivity growth is technological progress: technology improves the efficiency of both capital and labour. In the long run, productivity growth is determined only by technological progress (assuming everything else remains constant). Without technological improvements, labour productivity growth eventually stagnates.
The Solow model is made up of the following components:
Capital (K) - factories, equipment, computers, tools
Labour (L) - workers who operate the capital
Productivity (A) - this represents total factor productivity (TFP)
The model examines how these ingredients combine to create economic output, and how it grows over time. In the equation below, Y is the total output (e.g. GDP) and A represents productivity:

Y = A · F(K, L)
We’re going to look at the model over time, and we’ll introduce a new term α, “the elasticity of output with respect to capital”:

Y_t = A_t · K_t^α · L_t^(1−α)

Or to put that in terms I understand, α is the percentage increase in output you get from a 1% increase in capital. For example, if α is 0.3, a 10% increase in capital leads to roughly a 3% increase in output. Essentially it represents bang per buck!
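To make that concrete, here’s a quick sanity check in Python (a sketch with made-up numbers, not real data):

```python
# Sanity check of the elasticity claim: with alpha = 0.3, a 10% increase
# in capital should raise output by roughly 3%, holding A and L fixed.

def output(A, K, L, alpha=0.3):
    """Cobb-Douglas production: Y = A * K^alpha * L^(1 - alpha)."""
    return A * K**alpha * L**(1 - alpha)

y_before = output(A=1.0, K=100.0, L=50.0)
y_after = output(A=1.0, K=110.0, L=50.0)  # 10% more capital, same A and L

print(f"Output rises by {100 * (y_after / y_before - 1):.1f}%")  # ~2.9%
```

(It comes out at 2.9% rather than exactly 3% because the elasticity is a local approximation: 1.1^0.3 ≈ 1.029.)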
For labour, we introduce another scaling factor n, which represents how the number of workers grows over time:

L_{t+1} = (1 + n) · L_t
And for productivity (A) we introduce another factor g (the rate of growth of technology):

A_{t+1} = (1 + g) · A_t
One more equation and then we’ll do something interesting, I promise. How does capital change over time? Well, capital both depreciates (loses value over time) and grows through investment. We capture this dynamic with a depreciation constant δ and a saving rate s:

K_{t+1} = (1 − δ) · K_t + s · Y_t

This one’s a fair bit more complicated than the rest (because it depends on the Y values too).
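Pulling those equations together, here’s a minimal sketch of one period of the model in Python (all parameter and starting values are invented for illustration, not calibrated to any real economy):

```python
# One period of the Solow model in discrete time:
#   Y_t     = A_t * K_t^alpha * L_t^(1 - alpha)
#   K_{t+1} = (1 - delta) * K_t + s * Y_t
#   L_{t+1} = (1 + n) * L_t
#   A_{t+1} = (1 + g) * A_t

def solow_step(K, L, A, alpha=0.3, s=0.2, delta=0.05, n=0.01, g=0.02):
    """Advance capital, labour and productivity by one period."""
    Y = A * K**alpha * L**(1 - alpha)   # output this period
    K_next = (1 - delta) * K + s * Y    # depreciation plus new investment
    L_next = (1 + n) * L                # labour force growth
    A_next = (1 + g) * A                # technological progress
    return K_next, L_next, A_next

K, L, A = 100.0, 50.0, 1.0  # invented starting values
for _ in range(5):
    K, L, A = solow_step(K, L, A)
print(f"After 5 periods: K = {K:.1f}, L = {L:.1f}, A = {A:.2f}")
```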
Let’s start with a really dull graph. The graph illustrates output per worker at different levels of productivity: the higher the productivity, the more output per worker.
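If you want to reproduce the graph, here’s a rough matplotlib sketch using the same update rules (the numbers are invented; the shape of the curves is the point). With no technological progress (g = 0), every curve flattens out at a level set by A:

```python
import matplotlib.pyplot as plt

def output_per_worker(A, periods=100, alpha=0.3, s=0.2,
                      delta=0.05, n=0.01, K=10.0, L=10.0):
    """Output per worker over time with a fixed productivity level (g = 0)."""
    path = []
    for _ in range(periods):
        Y = A * K**alpha * L**(1 - alpha)
        path.append(Y / L)
        K = (1 - delta) * K + s * Y   # capital accumulation
        L = (1 + n) * L               # labour force growth
    return path

for A in (0.5, 1.0, 1.5):
    plt.plot(output_per_worker(A), label=f"A = {A}")
plt.xlabel("Period")
plt.ylabel("Output per worker")
plt.legend()
plt.show()
```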
The key point is that without technology improvements, labour productivity growth eventually stagnates
What’s AI going to do to all of this?
Well, let’s model two scenarios; we’ll sketch both in code after the list.
Gradual productivity boost - AI gets linearly better year in, year out, constantly adding to productivity.
Continuous technological progress - AI gets exponentially better year on year.
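Here’s a rough sketch of both scenarios in the same framework (the 2% increments and every other number are invented for illustration; swap in your own):

```python
import matplotlib.pyplot as plt

def simulate(productivity_rule, periods=100, alpha=0.3, s=0.2,
             delta=0.05, n=0.01, K=10.0, L=10.0, A=1.0):
    """Output per worker over time under a given rule for updating A."""
    path = []
    for _ in range(periods):
        Y = A * K**alpha * L**(1 - alpha)
        path.append(Y / L)
        K = (1 - delta) * K + s * Y
        L = (1 + n) * L
        A = productivity_rule(A)   # how AI changes productivity each period
    return path

# Scenario 1: AI gets linearly better -- a fixed increment to A each year.
gradual = simulate(lambda A: A + 0.02)
# Scenario 2: AI gets exponentially better -- A compounds at 2% a year.
runaway = simulate(lambda A: A * 1.02)

plt.plot(gradual, label="Gradual productivity boost")
plt.plot(runaway, label="Continuous technological progress")
plt.xlabel("Period")
plt.ylabel("Output per worker")
plt.legend()
plt.show()
```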
A gradual productivity boost lifts the productivity level (total factor productivity) to a new high, but it just raises the baseline, and again labour productivity growth eventually stagnates.
In the event that tech keeps getting better, the economy becomes a runaway train and we’re going to the moon.
Which one is more likely?
What’s the history of productivity growth?
We introduced computers in the 1980s and they’ve had comparatively little impact on productivity. Let’s look at that over time.
Just think about the world since 1980. We’ve introduced computers, connected them all together with the internet, 3G, 4G, 5G, built advanced robotics and sequenced the human genome yet none of this has led to a step-change in productivity! As Robert Solow himself quipped “you can see the computer age everywhere but in the productivity statistics”.
So, is AI going to be different?
Sometimes, the best prediction of the future comes from looking at what happened in the past. Despite all the technical innovation of the last 40 years, productivity has barely moved.
The US is currently putting $500B into Project Stargate. Sadly, this isn’t a return for Colonel O’Neill and team; instead it’s a bet that this technology will cause a bump in productivity. This sounds like a crazy amount of investment, but the GDP of the US is about $25 trillion, so it amounts to roughly 2% of one year’s GDP. Very roughly, for the investment to pay back, it’s got to add about 0.4% to GDP each year for the next five years or so. Seems plausible to me given the historical growth rate.
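For transparency, here’s the back-of-envelope arithmetic behind those numbers (the five-year horizon and the simple “extra GDP must cover the investment” payback criterion are my own assumptions):

```python
# Back-of-envelope payback check for the Stargate bet.
investment = 500e9   # Project Stargate, dollars
gdp = 25e12          # rough US GDP, dollars
years = 5            # assumed payback horizon

share_of_gdp = investment / gdp             # ~2% of one year's GDP
extra_gdp_per_year = share_of_gdp / years   # ~0.4% of GDP per year

print(f"Investment = {share_of_gdp:.1%} of one year's GDP")
print(f"Pays back if GDP is ~{extra_gdp_per_year:.1%} higher each year for {years} years")
```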
What makes it more complicated is the belief that AI is going to reshape the labour market, and fewer people will be employed (Goldman Sachs suggest 300 million jobs are at risk of automation!). If the rate of job losses is greater than or equal to the gain in productivity, we’re in trouble (overall GDP will fall).
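A toy example of that tipping point (numbers invented): GDP is roughly output per worker times the number of workers, so the two effects multiply, and an equal-sized productivity gain and employment loss actually nets out slightly negative:

```python
# Toy check: GDP ~ output per worker * number of workers.
productivity_gain = 0.05   # output per worker up 5% (invented)
employment_loss = 0.05     # workers down 5% (invented)

gdp_change = (1 + productivity_gain) * (1 - employment_loss) - 1
print(f"GDP change: {gdp_change:+.2%}")  # -0.25%: even a 'draw' leaves GDP down a touch
```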
Of course, there are counterarguments.
AI is different - perhaps AI is able to perform cognitive tasks for cash, acting more like labour than capital, in which case we probably need to throw the model out.
Job displacement, not job loss - perhaps new industries will emerge as more of the workforce becomes available.
Other models treat technological progress as something generated inside the economy (e.g. through positive externalities like knowledge spillovers) rather than handed down from outside, so perhaps the whole premise is flawed!
Regardless of the answer, exciting times ahead. It definitely feels like we’re living in a future chapter of a history book!