Worry over America’s recent economic stagnation, however justified, shouldn’t obscure the fact that the American economy remains Number One in the world. The United States holds 4.5 percent of the world’s population but produces a staggering 22 percent of the world’s output—a fraction that has remained fairly stable for two decades, despite growing competition from emerging countries. Not only is the American economy the biggest in absolute terms, with a GDP twice the size of China’s; it’s also near the top in per-capita income, currently a bit over $48,000 per year. Only a few small countries blessed with abundant natural resources or a concentration of financial services, such as Norway and Luxembourg, can claim higher averages.
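A back-of-the-envelope check, using only the figures above, shows just how large that lead is:

\[
\frac{22\%\ \text{of world output}}{4.5\%\ \text{of world population}} \approx 4.9
\]

In other words, the average American produces nearly five times the output of the average person worldwide.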
America’s predominance isn’t new; indeed, it has existed since the early nineteenth century. But where did it come from? And is it in danger of disappearing?
By the 1830s, the late British economist Angus Maddison showed, American per-capita income was already the highest in the world. One might suppose that the nation could thank its geographical size and abundance of natural resources for its remarkable wealth. Yet other countries in the nineteenth century—Brazil is a good example—had profuse resources and vast territories but failed to turn them to comparable economic advantage.
A major reason that they failed to compete was their lack of strong intellectual property rights. The U.S. Constitution, by contrast, was the first in history to protect intellectual property rights: it empowered Congress “to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.” As Thomas Jefferson, who administered the nation’s first patent law as secretary of state, observed, the absence of accumulated wealth in the new nation meant that its most important economic resource was innovation—and America’s laws encouraged that innovation from the outset. Over two centuries later, the United States has more patents in force—1.8 million—than any other nation (Japan, with 1.2 million, holds second place). America is also the leader in “triadic patents” (that is, those filed in the United States, Europe, and Japan) registered every year—with 13,715 in 2009, the most recent year for which statistics are available, ahead of Japan’s 13,322 and Germany’s 5,764.
Another reason for early American prosperity was that the scarcity of population in a vast territory had pushed labor costs up from the very beginning of the colonial era. By the early nineteenth century, American wages were significantly higher than those in Europe. This meant that landowners, to make a profit, needed high levels of productivity—and that, in turn, meant the mechanization of agriculture, which got under way in America before it did overseas.
The replacement of labor with capital investment helped usher in the American industrial revolution, as the first industrial entrepreneurs took advantage of engineering advances developed in the fields. The southern states made a great economic as well as moral error in deciding to keep exploiting slaves instead of hiring well-paid workers and embracing new engineering technologies. The South started to catch up with the rest of the nation economically only after turning fully to advanced engineering in the 1960s as a response to rising labor costs.
The enormous American territory and the freedom that people had to move and work across it—guilds were nonexistent in the new country—also encouraged an advanced division of labor, which is essential to high productivity, as Adam Smith argued in The Wealth of Nations. And Americans’ mobility had a second benefit: by allowing entrepreneurs and workers to shift from location to location and find the best uses of their talents, it reduced prices, following David Ricardo’s law of comparative advantage. Today, globalization has the same effect, making prices drop by assigning the production of goods to countries that are relatively efficient at making them. But in nineteenth-century America, the effect was concentrated within a single large nation. Both the extended division of labor and the law of comparative advantage reduced prices to a level lower than any seen before, despite America’s high wages.
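Ricardo’s logic can be seen in a stylized two-region example (the numbers here are hypothetical, chosen purely for illustration). Suppose region A needs 1 hour of labor per yard of cloth and 2 hours per bushel of wheat, while region B needs 3 hours and 4 hours, respectively:

\[
\text{A: opportunity cost of 1 wheat} = \frac{2}{1} = 2\ \text{cloth}, \qquad
\text{B: opportunity cost of 1 wheat} = \frac{4}{3} \approx 1.33\ \text{cloth}
\]

Although A is more productive at both goods, B gives up less cloth for each bushel of wheat, so total output rises, and prices fall, when B specializes in wheat and A in cloth. Mobility within nineteenth-century America let this kind of reallocation happen continuously across regions.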
Democracy, too, encouraged ever-cheaper products. In Europe, an entrepreneur could thrive by serving a limited number of wealthy aristocrats—or even just one, provided that he was a king or a prince. Not so in the democratic United States, where entrepreneurs had to satisfy the needs of a large number of clients who compared prices among various vendors. America’s leading entrepreneurs haven’t always been the greatest innovators, but they have been the greatest cheapeners and tinkerers. Henry Ford didn’t invent the automobile, but he figured out how to make it less expensive—a mass product for a democratic market, at first American and then global.
The ultimate American economic invention was standardization, which further reduced production costs. Standardization evolved in America because consumers there tended to share a taste for the same products and services. Companies consequently began providing similarly priced goods and services of the same general quality to citizens constantly on the move across the American expanse. Not only did Coca-Cola, Hilton hotels, and McDonald’s become successful companies; they became forces for stability in a remarkably mobile society.
Immigration has been another component of American economic dynamism, for evident quantitative reasons: national GDP grows when total population and productivity increase simultaneously. But this effect has worked particularly well in the United States because its immigrants have tended to be young, energetic, and open to American values. Immigration is a self-selecting process: those who find the courage to leave behind their roots, traditions, and family often have an entrepreneurial spirit. (Indeed, prior to the emergence of the modern welfare state, it was tough to survive in America without such a spirit.) The newcomers, from Irish workingmen in the nineteenth century to Russian scientists in the twentieth, have continually reenergized the economy with their skills and knowledge.
They have also added a wild variety to American life, which helps explain why American culture—highbrow or lowbrow, sophisticated or pop—has dominated the world. In the cultural arena, at least, the globalization of the modern world is actually its Americanization. Roughly 80 percent of the movies seen in the world every year, for instance, are produced in the United States. This surely has something to do with the fact that, from the first days of the film industry, Hollywood’s producers and directors hailed from all parts of the globe, intuitively knowing what kind of movies would appeal not just to Americans but to people across the planet.
The entrenched rule of law, the absence of guilds, the unfettered competition, the democratic mass market, the immigration effect—Europeans took little notice of these striking American developments or of the expansion of the American economy generally. Not until the St. Louis World’s Fair in 1904, which brought European business delegations to the United States for the first time, did Europeans understand how far American entrepreneurs had leaped ahead of them. According to Nobel laureate Douglass North, the fair marked a turning point; from then on, the American economy was widely recognized as the global leader in per-capita income and overall output.
The American drive for innovation intensified with the growing cooperation of venture capital, business, and academia in the twentieth century. The defining moment occurred in the 1950s, when Frederick Terman, the dean of engineering at Stanford University, launched the first “industrial park”—a low-rent space where start-up firms could cluster and grow. Built on Stanford’s campus, it remains in existence; many consider it the origin of Silicon Valley. The collaborative “Stanford model” has been a trademark of what New York University economist Paul Romer calls the New Growth, in which the association of capital, labor, and ideas produces economic development. New York City, hoping to spur New Growth, has just awarded Cornell University the right to open an applied-science campus on Roosevelt Island in the East River.
In America, the three-sided nature of modern capitalism—capital, labor, ideas—has given the economy a sharp competitive edge. Other countries have tried to replicate the Stanford model, but they have little to show for it so far, partly because the best American universities have unique advantages in funding and in top research faculty and students. The failure to reproduce the model elsewhere has encouraged widespread infringement of American intellectual property, especially by China (see “Patently American,” Autumn 2011). But piracy, a short-term fix at best, doesn’t foster innovation.
Another ingredient in America’s recent prosperity is the Federal Reserve’s success at maintaining a stable, predictable currency. Thanks to its relative independence from the government, the Fed—except during its brief Keynesian periods, such as the late seventies and the current stimulus era—has been able to protect the dollar from politically expedient inflationary pressure. That has encouraged Americans to invest in production. In parts of Europe, by contrast, a long history of inflation taught residents to grab short-term returns by speculating in money markets. Indeed, private investment is always lower in inflationary countries than in noninflationary ones; think of struggling pre-euro Italy versus booming pre-euro Germany.
The American economy has also been spared the aggressions that anti-capitalist ideologues, both fascist and Marxist, unleashed in Europe. True, Washington has diverged from free-market principles at times, usually by imposing high tariffs on goods at the request of industrial lobbies. But the normal, publicly accepted form of American production has always been free-market capitalism. American investors and entrepreneurs, unlike their European counterparts, have never lived with the fear that the state would nationalize their investments or factories.
The overall level of taxation has remained lower in the United States than in Europe, and this has benefited growth as well. Americans and Europeans spend approximately the same percentage of their incomes on personal consumption, housing, education, health, and retirement. In European countries, though, these expenditures are often funded through taxes; in the United States, they’re more frequently paid for by consumers making free choices. The European redistributionist model leads to a more egalitarian society, while the American model is based on the individual’s assumed capacity to make decisions that are right for him. The proper balance between equality and freedom remains the subject of debate between liberals and free-market conservatives. But free choice does appear to be more economically efficient: as economists like Nobel laureate Gary Becker have shown, individual investments tend to be made more rationally than collective, government-directed investments. And when public expenditure grows, it may reduce the share of private investment and diminish what another Nobel economist, Columbia University’s Edmund Phelps, calls the dynamism of an economy.
Does this claim apply even to long-term investments traditionally made by the government, such as infrastructure? Was the Eisenhower administration’s decision to fund an interstate highway network a more rational investment than the creation of such a network through private funding would have been? No one can know for sure. In statist France, it’s worth noting, the freeway system is privately run, funded by tolls, and in better condition than its American counterpart. In any case, to argue that more public spending would accelerate American economic growth is to ignore the fact that all major European nations have higher levels of public spending than the United States does—and that all are poorer.
A final reason for American prosperity involves what Joseph Schumpeter called “creative destruction.” As he explained the concept in his 1942 book Capitalism, Socialism and Democracy, for economic progress to occur, obsolete activities and technologies must disappear (the destruction), and capital must shift from old uses to more productive ones (the creation). Government efforts to save or bail out companies that stick with outmoded products, services, or management methods protect the existing order at the expense of innovation, growth, and future jobs. European governments resist creative destruction by means of extensive labor regulations, which economists blame for the fact that over the long term, unemployment has been higher in Europe than in the United States. Slower growth rates don’t account for this difference: in fact, the European economy has at times grown faster than the American one. Of course, endorsing creative destruction doesn’t mean abandoning workers displaced by this harsh process—and the American safety net, while much criticized in Europe and far from perfect, has provided extended unemployment insurance for millions seeking work.
Fixing an ailing economy can be difficult in a democracy. Politicians running for office, pundits, and incumbent administrations will always be tempted to promote quick fixes, which aren’t really fixes at all. Indeed, as history shows, many popular responses to economic crises—closing borders to immigration and free trade, hiking taxes, or printing money excessively and driving up inflation—can do incredible damage to long-term growth.
In the current sluggish economic environment, the remarkable history of American dynamism is thus more instructive than ever. America’s economic might is rooted in an entrepreneurial culture and a passion for innovation and risk-taking, traits nourished by the nation’s commitment to the rule of law, property rights, and a predictable set of tax and regulatory policies. Policymakers have lost sight of these fundamental principles in recent years. The next era of American prosperity will be hastened when they return to them.
Guy Sorman is the author of Economics Does Not Lie: A Defense of the Free Market in a Time of Crisis.
Do we know how economies develop? Obviously not, it seems; otherwise, every country would be doing better than it currently is in these low-growth times. In fact, cases of sustained rapid growth, like Japan beginning in the 1960s, or other East Asian countries a decade later, are so rare that they are often described as “economic miracles.”
Yet when Patrick Collison of software infrastructure company Stripe and Tyler Cowen of George Mason University recently wrote an article in The Atlantic calling for a bold new interdisciplinary “science of progress,” they stirred up a flurry of righteous indignation among academics.
Many pointed to the vast amount of academic and applied research that already addresses what Collison and Cowen propose to include in a new discipline of “Progress Studies.” Today, armies of economists are researching issues such as what explains the location of technology clusters like Silicon Valley, why the Industrial Revolution happened when it did, or why some organizations are much more productive and innovative than others. As the University of Oxford’s Gina Neff recently remarked on Twitter, the Industrial Revolution even gave birth to sociology, or what she called “Progress Studies 1.0.”
This is all true, and yet Collison and Cowen are on to something. Academic researchers clearly find it hard to work together across disciplinary boundaries, despite repeated calls for them to do so more often. This is largely the result of incentives that encourage academics to specialize in ever-narrower areas, so that they can produce the publications that will lead to promotion and professional esteem. The world has problems, as the old saying puts it, but universities have departments. Interdisciplinary research institutes like mine and Neff’s therefore have to consider carefully how best to advance the careers of younger colleagues. The same silo problem arises in government, which is likewise organized by departments.
Moreover, fashions in research can lead to hugely disproportionate intellectual efforts in specific areas. To take one example, the ethics of artificial intelligence is clearly an important subject, but is it really the dominant research challenge today, even in the fields of AI or ethics? The financial incentives embedded in technology companies’ business models seem to me at least as important as morality in explaining these firms’ behavior.
At the same time, some important economic questions are curiously underexplored. For example, in his recent book The Technology Trap, Carl Frey expands on his gloomy view of what automation will mean for the jobs of the future, pointing to the adverse effects that the original Industrial Revolution had on the typical worker. Yet Frey also notes that a later period of automation, the era of mass production in the mid-twentieth century, was one of high employment and increasingly broad-based prosperity. What explains the great difference between those two eras?
Today, the role of research in changing behavior – whether that of government officials or of businesses and citizens – is part of the broader crisis of legitimacy in Western democracies. By the early 2000s, technocrats – and economists in particular – ruled the roost, and governments delegated large swaths of policy to independent expert bodies such as central banks and utility regulators. But then came the 2008 global financial crisis. With real incomes stagnating for many, and “deaths of despair” increasing, it is not surprising that expertise has lost its luster for much of the public.
This leads to a final point about the need for a science of progress: what do we actually mean by “progress”? How should it be measured and monitored, and who experiences it? For many reasons, the standard indicator of real GDP growth, which leaves out much of what people value, will no longer do.
The debate about progress therefore raises profound political and philosophical questions about the kind of societies we want. If the global economy falls into recession, as now seems likely, then social divisions and political polarization will intensify further. And the clear message since the turn of the millennium is that if most people do not experience progress, then society isn’t really progressing at all.
Current academic research – into the impact of new technologies, the economics of innovation, and the quality of management, for example – may be providing ever more pieces of the puzzle. But many crucial questions about economic progress remain unanswered, and others have not yet even been properly posed.