

The following article appeared in Left Business Observer #79, October 1997. It retains its copyright and may not be reprinted or redistributed in any form - print, electronic, facsimile, anything - without the permission of LBO.


Where's the payoff?

Daniel E. Sichel, The Computer Revolution: An Economic Perspective (Washington: Brookings Institution Press, 164 pp., $38.95 hardcover, $16.95 paperback).

Adherents of the New Paradigm - a fashionable doctrine holding that new technology has kicked the U.S. economy onto a permanently higher productivity path, meaning perpetual growth without inflation, and, most importantly, an exuberant, eternal bull market in stocks - took cheer from revisions to the second quarter productivity figures. Instead of rising at a mere 0.7% annual rate, as the Bureau of Labor Statistics initially reported, output per hour worked blossomed this past spring at a 2.7% rate. Finally, heavy investments in computers and other electronic gadgetry were showing up in the productivity numbers, where they'd been MIA for years.

Or, putting the emphasis somewhat differently, as Business Week observed, "[T]he new industrial revolution [computers] herald has hardly begun. Their real potential has been snagged in false starts in use - but they're on their way." Finally! And, computers might also put an end to the business cycle. The only problem with these arguments is that they come from the magazine's June 21, 1958, issue.

That BW quote is marshalled by Daniel Sichel, along with quite a few others from the business press over the last 40 years, to show that the payoff lurking just around the corner - or finally arriving - is an ancient theme in popular commentary on the economic possibilities of computers. Yet despite the latest BLS revision of the productivity figures, the long-term payoff still seems deferred; the spring 1997 burst looks like a blip if you inspect any time period longer than three months. The present business cycle expansion, now over six years old, has the lowest rate of productivity growth of any since modern numbers begin in 1947, and the 1990s are the weakest decade of the five shown in the nearby chart. Yes, there's a pickup in manufacturing productivity compared with earlier decades, but this is almost certainly the result of closing weaker plants rather than upgrading existing ones, and of contracting out for janitorial, accounting, and similar services, meaning that these low-productivity functions are no longer pulling down the manufacturing numbers. Computers were supposed to bring a productivity blessing to the service industries, but with rare exceptions, they haven't.

Why not? Sichel's little book gives lots of good answers.


Why it matters

Before reviewing Sichel's arguments, it might be worth recalling just why productivity is important at all. Leaving aside all qualitative questions for the moment, how much workers can produce in an hour of labor is the fundamental determinant of the level of material wealth. Or, more precisely, it marks a limit on the level of a society's material welfare; many other things, like politics and social institutions, determine how the produce of an hour's labor is divided. It can go into wages, social welfare spending, salaries paid to bosses, or interest and dividends paid to rentiers. Over the last 20 years in the U.S., wages have grown even more slowly than productivity, which is another way of looking at the polarization of incomes: money has been trickling upwards, thanks to higher managerial salaries and growing financial claims, because workers have been getting a smaller share of what they produce. But there's no question that the productivity slowdown has also contributed to wage stagnation. So if computers were capable of the productivity miracles attributed to them, there's a chance that average incomes could rise again without making lots of institutional changes (like unionization and an attack on the rentiers).

At first, and even second, glance, it might seem that computers could contribute mightily to productivity, especially since the price of computing power has been dropping like a stone for decades. The Department of Commerce's official price measure, the price index that goes into the GDP accounts, has fallen an average of 16% a year since figures began in 1965 - and persistently. The decline was actually steeper in the 1960s and 1970s than it was in the 1980s or so far in the 1990s.
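
To get a feel for what a decline like that compounds to, here's a back-of-the-envelope calculation of ours - the 16% rate is the Commerce Department's, but the 1965-1997 endpoints are our choice:

# Back-of-the-envelope compounding of the quoted 16%-a-year price decline.
# The 16% rate comes from the article; the 1965-1997 span is our assumption.
annual_decline = 0.16
years = 1997 - 1965
remaining = (1 - annual_decline) ** years
print(f"A dollar's worth of 1965 computing power costs about "
      f"${remaining:.4f} today - a {(1 - remaining):.1%} cumulative decline.")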

But to focus on that price decline alone is to miss many significant points, as Sichel shows. It says nothing about the costs of software, training, repairs, or support, which are declining much more slowly if at all. And their levels can be quite high; private consultants estimate, for example, that a $2,500 PC costs a typical big business from $6,000 to $13,000 a year in such secondary expenditures. Miracles can easily disappear in overhead like that.
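
Stack the consultants' numbers up over a machine's working life and the hardware sticker price nearly vanishes. A rough tally - the three-year service life is our illustrative assumption, not Sichel's or the consultants':

# Rough total-cost-of-ownership tally using the figures quoted above:
# a $2,500 PC plus $6,000-$13,000 a year in software, training, and support.
# The three-year service life is our illustrative assumption.
hardware = 2_500
overhead_low, overhead_high = 6_000, 13_000
years_in_service = 3
total_low = hardware + overhead_low * years_in_service
total_high = hardware + overhead_high * years_in_service
print(f"Three-year cost per PC: ${total_low:,} to ${total_high:,}")
print(f"Hardware's share of the bill: {hardware / total_high:.0%} to {hardware / total_low:.0%}")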

And the hardware price decline is itself a financial pitfall: it means that an investment made today depreciates very rapidly. Though a 1992 computer may still "work" in the literal sense, it probably can't run 1997's best software. For an existing business, computers that come in the front door are often replacements for ones going out the back door. That pace of obsolescence can be very expensive, a fact that barely penetrates the cybertopians' prose.


Growth models

Another reason that the computer payoff is so elusive is that computers account for a surprisingly small share of both current spending and the total stock of capital equipment. For the first half of 1997, just a hair over 1% of GDP was devoted to business investment in computers; in 1996, they accounted for only 2% of the value of all capital equipment (a very dicey thing to estimate, it must be conceded). And, also surprisingly, these numbers have risen only slightly since the early 1980s.

If money spent on computers is such a small portion of the economy, it makes sense that their contribution to overall productivity and growth can't be all that large. Sichel takes that intuition and mathematizes it, using orthodox neoclassical growth models. In these models, economic growth is the combined result of new capital investments plus the increase in work effort (more workers and/or longer hours). Higher investments should mean higher profits, and more work should mean more wages - together these yield a higher GDP. Such models have their limits - for one, the intense difficulty of measuring the monetary value of capital equipment, especially something as rapidly changing and prone to obsolescence as computers, not to mention their indifference to intangibles like education, attitudes, and social institutions - but they're a good place to start.

In this instance, let's assume that computers earn a "normal" rate of profit - the average rate of return on capital of all kinds throughout the entire U.S. economy. Since those profits count as part of GDP, computers' contribution to economic growth would equal the profits earned on new computer investments as a share of growth through the entire economy. (More complications: estimating a "normal" rate of profit, and then assuming that these profits come from the machinery rather than the workers who operate them - but again we're leaving these aside for the sake of argument.) From this, Sichel estimates that computers contributed just 5% to total U.S. economic growth between 1970 and 1992. Other forms of capital investment contributed nearly seven times as much. Worse, notes Sichel, "as firms boosted their purchases of computers [since 1980], they scaled back investments in other capital." Adding in software and computer-related services boosts the growth contribution a bit, but not by much. Even if you assume that computers yield superprofits, above the "normal" rate (as some studies have claimed), their overall contribution would still be minimal, given their small share of the overall capital stock. But if they were yielding such big returns, it's a safe bet that firms would be investing a lot more in computers than they are.
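
The arithmetic behind that conclusion is simple enough to sketch. Here's a stylized version of the growth-accounting bookkeeping - all the numbers below are our illustrative assumptions, not Sichel's estimates. The point survives the made-up figures: an input that earns only a sliver of national income can't add much to growth, no matter how fast its own stock is growing.

# Stylized neoclassical growth accounting: an input's contribution to GDP
# growth is roughly its income share times the growth rate of its stock.
# Every number below is an illustrative assumption, not one of Sichel's figures.
gdp_growth = 0.025              # assumed total GDP growth rate
computer_income_share = 0.01    # computers' tiny slice of national income
computer_stock_growth = 0.20    # even if the computer stock grows very fast
other_capital_share = 0.29
other_capital_growth = 0.03

computer_contribution = computer_income_share * computer_stock_growth
other_contribution = other_capital_share * other_capital_growth

print(f"Computers: {computer_contribution * 100:.2f} points of growth "
      f"({computer_contribution / gdp_growth:.0%} of the total)")
print(f"Other capital: {other_contribution * 100:.2f} points "
      f"({other_contribution / gdp_growth:.0%} of the total)")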


The long view

OK, so maybe we're just being impatient by expecting the productivity payoff to arrive so soon. This is the view of Stanford University economist Paul David, who claims that it took several decades for the productivity advance offered by the electric motor to kick in (an analysis endorsed by Alan Greenspan). But computers are hardly new arrivals anymore. Though their ancestors go back into the 19th century, what is usually called the first real computer, the Manchester University Mark 1, was built in 1949, and the basics of modern computer architecture were established by the 1960s. Obviously they've gotten much faster and slicker over the last 50 years, but they've been a fact of life for quite a long time now.

More broadly, as James Beniger shows in his book The Control Revolution, the management of "information" has been playing an increasingly important role for over 150 years, as capitalist industrialization has jacked up the speed and scope of economic life. Past technological advances increased the ability to collect and distribute information in ways we may find old hat today, but were eye-popping in their own time. Typewriters, printing presses, adding machines, punched-card tabulators, and the telegraph all increased the speed of information processing in their day by factors comparable to what computers have done since 1949. And, Sichel shows, none of this seemed to increase the rate of growth; you could read Beniger as arguing that the purpose of info tech has been to cope with the blooming, buzzing confusion of industrial life, managing growth rather than accelerating it.

Of course, just because computers haven't yielded a great productivity payoff doesn't mean that they're not changing the way we live and work. To take one very dear example, this newsletter could not exist in its present form - produced by a full-time equivalent staff of less than two, with up-to-date stats and reasonably presentable graphics - in a pre-microcomputer world. Nor would our music be so heavily sampled, nor public and private databases so full of information on all of us, nor new financial instruments be invented or traded with such vigor. But these sorts of qualitative issues are generally beyond the ken of most economists.

Nor does Sichel's sort of analysis say anything about the political possibilities of using computers in liberating ways - to reduce the burden of tedious work (though computers have certainly created their share of tedious work too), or to rehabilitate unfashionable ideas about worker self-management and broader economic planning. On that latter point, conservative critiques of planning frequently center on the inability of planners to get their heads around the vast quantity of information in a complex economy. That may have been true in the past, but it's harder to accept in the time of Deep Blue. Computers don't deliver the quantitative productivity boost they're alleged to, but that may be the wrong thing to ask of them.


SIDEBAR: Myths of origin

While we're fact-checking some of the mythology that's grown up around computers, it's worth recalling the message of an older Brookings book, Kenneth Flamm's historical overview, Creating the Computer. In the received version of this history, equally popular on Wall Street and in Silicon Valley, the machines were developed by plucky entrepreneurs tapping our wondrously munificent and flexible capital markets. While there's no denying the role of upstarts and venture capitalists in the evolution of the computer, especially over the last couple of decades, it's hard to imagine the machines would exist in their present form without several decades of support from the U.S. government, especially the military.

It's hard to overstate the government's role in the first few decades of the computer era, and even before. Ancestors like radar and code-breaking machinery were developed under government contracts as early as the First World War, and the Second accelerated the effort. Even such modern-seeming gadgets as video terminals, the light pen, the drawing tablet, and the mouse evolved from Pentagon-sponsored research in the 1950s, 1960s, and 1970s. And the Internet, today celebrated as proof of the superiority of American capitalism, owes its very existence to the Pentagon's interest in having a communications network that could survive a nuclear war. The military's influence on software was less pervasive - though no one would have even developed a programming language without a machine to run it on. But even here, Washington's generous hand is visible: database software has its roots in Air Force and Atomic Energy Commission projects, artificial intelligence in military contracts going back to the 1950s, and airline reservation systems in 1950s air-defense systems. More than half of IBM's R&D budget came from government contracts in the 1950s and 1960s, and IBM's corporate ancestor got its start providing punched-card technology for the 1890 Census.

Point these facts out to the libertarians who populate the Internet, and they often respond by saying it all would have happened anyway. But, as Flamm writes, "Key players in the military first tried to convince established businesses and investment bankers that a new and potentially profitable business opportunity was presenting itself. They did not succeed, and, consequently, the Defense Department committed itself to financing an enormously expensive development program...." Europe's weakness in computers is often attributed to its stodgy business culture and thin financial markets, but, as Flamm shows, European governments were too stingy in their subsidies in the 1950s to get an industry going. By the 1960s, the U.S. lead was unbeatable.

