Income inequality in times of war and revolution: the city of Moscow in 1916

by Elizaveta Blagodeteleva (National Research University Higher School of Economics)

This research will be presented during the EHS Annual Conference in Belfast, April 5th – 7th 2019. Conference registration can be found on the EHS website.


Voznesenskaya Square, 1900s. Available at Wikimedia Commons.

In the autumn of 1916, a big scandal roiled the Moscow public: local landlords petitioned the municipal government for permission to raise rents, which the military administration had prohibited a year earlier amid the escalating refugee crisis. Newspapers fumed at the selfishness of the rich, who not only avoided serving their country on the battlefield but exploited wartime hardships to get even richer. Health inspectors, lessees and workers at the largest industrial plants publicly objected to the proposal.

Although the concerted effort of the city landlords to increase revenue eventually failed, the public outrage persisted. The occasional evidence of huge war profits and rumours about the luxurious life of industrialists and rentiers stoked anger among the urbanites, who struggled to make ends meet under the increasing pressure of galloping inflation and food shortages. The rent scandal highlighted the growing animosity towards the rich that the Bolsheviks would later channel into fully-fledged class warfare.

In 1916, Moscow residents sincerely believed that the gap between the wealthy and the rest of the population was enormous and that it kept widening at an alarming pace. But did their beliefs match reality? In other words, how unequal was urban society in Russia in the last year of the old regime? To answer this question, a student of social and economic inequality would usually refer to income tax records. Unfortunately, very few of them survive in the case of imperial Russia.

The Russian authorities had been extremely wary of income taxation up until the beginning of the Great War, when national political mobilisation elevated the issue of the personal responsibility of each and every subject of the tsar. As a result, the legislature passed an income tax in the spring of 1916. Its political objectives overwhelmed fiscal practicalities, as lawmakers wanted it to bring the state closer to the ‘pockets’ and ‘hearts’ of the people. The progressivity of the new tax was supposed to level the great fortunes and make the body politic more cohesive.

Since tax collection began in March 1917 and continued through a period of intense power struggle and regime change, surviving records are patchy. Neither the tsar’s local treasuries nor the early Soviet fiscal authorities left comprehensive accounts of the sums collected in 1917. Nevertheless, Moscow archives have preserved some tax rolls that document personal incomes for the year 1916, reported by taxpayers and then verified by tax collectors in the first half of 1917.

The records allow a tentative reconstruction of the level of income inequality in the city. Given that the adult population of Moscow amounted to 1.1 million in the spring of 1917, the estimates show that the wealthiest 1% and 5% must have received, and then reported, about 45.9% and 58.8% of the city’s total income respectively. With a Gini coefficient of 0.75, those shares indicate an extremely high level of income inequality among city residents in 1916. The huge gap between the rich and the rest not only felt real but was real.
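As a rough arithmetic check (a sketch, not the authors’ method, which rests on the full tax rolls), the quoted top shares pin down points on the city’s Lorenz curve, and a piecewise-linear interpolation between them gives a lower bound on the Gini coefficient:

```python
# Lower-bound Gini from the two top-share figures in the text, assuming a
# piecewise-linear Lorenz curve between the known points (illustrative only).
def gini_from_lorenz(points):
    """Gini = 1 - 2 * area under the Lorenz curve, via the trapezoid rule.
    points: (population share, cumulative income share), sorted from (0, 0) to (1, 1)."""
    area = sum((p1 - p0) * (s0 + s1) / 2
               for (p0, s0), (p1, s1) in zip(points, points[1:]))
    return 1 - 2 * area

# The top 1% took 45.9% and the top 5% took 58.8% of income, so the bottom
# 95% and bottom 99% held 41.2% and 54.1% respectively.
lorenz = [(0.0, 0.0), (0.95, 0.412), (0.99, 0.541), (1.0, 1.0)]
print(round(gini_from_lorenz(lorenz), 2))  # 0.56
```

The bound (about 0.56) sits well below the study’s 0.75 because linear segments ignore all inequality within each bracket; the tax-roll micro-data capture it.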

Stealing for the market: the illegitimacy of enslavement in the early modern Atlantic world

by Judith Spicksley (Wilberforce Institute, University of Hull)




The Slave Trade (Slaves on the West Coast of Africa), oil on canvas by Auguste-François Biard, 1840. Available at Wikimedia Commons.

Slavery was understood to be illegitimate long before anti-slavery activists called for the abolition of the slave trade in the eighteenth century. Slavery is now prohibited in international law, but at some point in the past it was a legal institution in the vast majority of societies.

A range of legal methods were used to enslave people, of which the most common were birth, capture in war, judicial punishment, debt, and poverty. But there was another method of enslavement that historians include in their list: the kidnap and theft of persons for sale on the market.

These practices were never considered acceptable forms of enslavement. Among the earliest law codes that survive, from Old Babylonia in the second millennium BCE to Israel in the first, are punishments for the theft of a person, a crime that attracted the death penalty.

But demand for slaves created opportunities for traders to sell those they had stolen as if they were slaves proper, and increase their wealth in the process. These cases of illegal enslavement ran alongside bona fide sales throughout the period in which slavery was legitimate.

Examples include the activities of Cilician-based pirates in the eastern Mediterranean in the late Roman Republic and early Empire, and the violent sourcing of labour in Africa for the American plantations in the early modern Atlantic world. But it was the raiding bands that scoured the Slav lands of Eastern Europe for captives in the high medieval period that encouraged the west to understand slavery as illegal.

The term ‘slave’ appeared in English, and in the languages of Western Europe more generally, from the late medieval period via the ethnonym Slav. This was the name given to members of the Slavic peoples living in Eastern Europe, whose communities were frequently raided for persons who could be sold as slaves.

But the term ‘slavery’ did not enter the English language until the mid-sixteenth century. At that point, it was applied as a metaphor for the tyranny of Catholicism, as the development of Protestantism created a major religious schism.

The term ‘slavery’ was also applied to the activities of the earliest English slave traders. During his first voyage in 1562, John Hawkins is reputed to have violently captured around 400 Africans in Guinea, whom he later sold in the West Indies. He repeated these activities over the next five years with the support of Queen Elizabeth.

Hawkins was following in the footsteps of other Europeans, most notably Lançarote de Freitas, the Portuguese explorer, who is recognised as having set the transatlantic slave trade in motion. De Freitas returned from North Africa to Lagos in 1444 with a cargo of 235 Berber captives seized in a series of raids, who were subsequently sold into slavery.

From the mid-seventeenth century, with the challenge to the divine right of kings, ‘slavery’ became a metaphor for, and a weapon of, political tyranny in England. It also became a reality for travellers.

The seventeenth century saw an increased level of activity by the so-called Barbary pirates, operating out of North Africa, who seized European sailors and travellers and held them as ‘slaves’ for ransom. Englishmen and women were captured and enslaved in the Americas too, as the Atlantic economy underwent expansion.

As a result, the meaning of ‘slavery’ as a system of illegal subjection, linked to tyranny, violence and theft, had become deeply embedded in English thought before the abolitionists were established as an organised force from the late eighteenth century.

Family standards of living in England over the very long run

by Sara Horrell (University of Cambridge), Jane Humphries (University of Oxford), and Jacob Weisdorf (University of Southern Denmark and Centre for Economic Policy Research)



‘Over London by Rail’, from London: A Pilgrimage (1872). Available at Wikimedia Commons.

The secular evolution of human wellbeing, measured by unskilled workers’ real wages, has long been the subject of scholarly debate. Attention focuses on whether modern economic growth is a relatively recent phenomenon in human history, prompted by the Industrial Revolution, or whether workers in England experienced economic progress well before the Industrial Revolution, even if on a more modest scale. The answer will help inform policy-makers in the developing world about alternative routes to economic growth.

Thanks to recent archival work, we now have information on payments made to working-class men, women and children across 600 years of English history – from before the Black Death through to the classic years of the Industrial Revolution. In a study to be presented at the Economic History Society’s 2019 annual conference, we bring all of these payments together to provide a first-ever account of the earning possibilities of working-class families in historical England.

By estimating how much a typical lower-class family consumed, in terms of basic consumption goods such as calories, clothing, heating and housing, we are able to calculate how much work was needed from the husband, his wife and their children to achieve this. Also, because historical families were rather large (four to five children were not uncommon), we pay particular attention to the ‘family squeeze’ – that is, stages in the family lifecycle when the ratio of dependants to earners peaked.

Although the post-Black Death period is regarded as a ‘golden age of labour’, even assuming plentiful work, the husband’s earnings in the fourteenth century were not enough to satisfy a typical family’s basic consumption needs during the family squeeze. The work of women and children was regularly needed to make ends meet and, even then, this was not enough to avoid insolvency during a couple’s old age.

But as we move forward through the medieval and early-modern periods, progressively less work from women and children was required to ensure a stable standard of living, and old-age poverty became less severe. In this sense, we conclude that the quality of life of an average lower-class family gradually improved in the centuries leading up to the Industrial Revolution.

We also argue that, in the run-up to the Industrial Revolution, a surplus in the family budget after necessities had been bought enabled families to allocate a growing fraction of their income to market goods rather than homemade products.

This served as a stimulus to the Industrial Revolution because it motivated producers to innovate and profit from satisfying this increased demand. A widening market appears important in combination (or competition) with the hypothesis that industrialisation sprang from entrepreneurial efforts to save labour.

The economic assimilation of Irish famine migrants to the United States

by William Collins and Ariell Zimran (Vanderbilt University)



Engraving of Emigrants leaving Ireland, 1868. Available at Wikimedia Commons.

Restrictive immigration policy is often justified by claims that immigrants and refugees are slow to assimilate culturally and economically in the receiving country. Our new research, to be presented at the Economic History Society’s 2019 annual conference, shows that the largest wave of refugees to ever arrive in the United States experienced rapid economic assimilation, closing much of the gap in occupational status relative to native-born Americans in a single generation.

These refugees were victims of the Great Irish Famine, which killed an estimated one million Irish between 1846 and 1850 and drove another million to flee abroad, mostly to the United States. Their poverty and predominantly Catholic religion set them apart from the typical American of the day, and led many politicians and commentators to argue that the Irish could not and would not assimilate, and were thus dangerous to American democracy and the American economy.

Notwithstanding these claims, data limitations have masked how the Irish immigrants’ labour market outcomes evolved after their arrival. To study these patterns, we use data from the US censuses of 1850 and 1880.

We begin by identifying Irish men in 1850 and using information on the birthplaces and ages of their children to determine whether they had arrived in the United States during the famine or before. We then locate their sons in the 1880 census, enabling the comparison of the sons’ adult occupations with those of their fathers. Similar links are constructed for the sons of native-born Americans.

The new data enable the documentation of three facts. First, in 1850, the Irish famine-era migrants had considerably lower levels of human capital, as measured by their literacy, than earlier Irish arrivals and native-born Americans. They were also 57% more likely than natives to hold an unskilled occupation.

The poor conditions faced by the famine Irish migrants thus did not bode well for the success of the next generation. Indeed, a simple comparison of the sons of the famine-era Irish to the sons of US natives reveals the second fact: as late as 1880, the sons of the famine Irish still fared worse than the sons of natives. These first two facts would seem, at first glance, to support claims of failure to assimilate (that is, ‘catch up’ to natives) in labour markets.

But a more detailed analysis reveals that the gap had shrunk considerably over the generation. In 1880, the sons of famine-era Irish were only 24% more likely than the sons of natives to hold an unskilled occupation. Thus, in a single generation, they closed much of the gap in status faced by their fathers.
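The ‘X% more likely’ figures are ratios of the two groups’ unskilled-occupation rates. The rates in this sketch are purely hypothetical, chosen only so that the implied gaps reproduce the 57% and 24% figures quoted above:

```python
def relative_gap(group_rate, native_rate):
    """How much more likely the group is to hold an unskilled occupation,
    expressed as a fraction of the native rate."""
    return group_rate / native_rate - 1

# Hypothetical fathers' generation (1850): 55% of famine Irish vs 35% of
# natives in unskilled work would imply a 57% relative gap.
print(round(relative_gap(0.55, 0.35), 2))  # 0.57

# Hypothetical sons' generation (1880): 31% vs 25% implies a 24% gap.
print(round(relative_gap(0.31, 0.25), 2))  # 0.24
```

The point of the comparison is that the ratio, not the raw percentage-point difference, is what shrinks across the generation.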

Moreover, when the sons of the famine Irish were compared only to the sons of natives whose households were similar in 1850, only an 8% gap between the two groups remained. Thus, despite experiencing poverty, a nativist backlash against migrants, especially Catholics, and in some cases the trauma of the famine itself, the largest wave of refugee immigrants ever to arrive in the United States experienced almost the same intergenerational mobility as natives.

It is difficult to extend conclusions from these results to modern waves of refugees seeking sanctuary in Europe and the United States. The American and European economies have changed radically over the intervening centuries, and the open border policy of the United States – responsible for saving untold lives during the Irish famine – has long since ended.

But the parallels in the rhetoric faced by modern and historic refugees suggest that the results of our new research can provide a useful lens through which to view modern debates.

Slavery and Anglo-American capitalism revisited

by Gavin Wright (Stanford University)

This research will be presented in the Tawney Lecture during the EHS Annual Conference in Belfast, April 5th – 7th 2019. Conference registration can be found on the EHS website.


Slaves cutting sugar cane, taken from ‘Ten Views in the Island of Antigua’ by William Clark. Available at Wikimedia Commons.

For decades, scholars have debated the role of slavery in the rise of industrial capitalism, from the British Industrial Revolution of the eighteenth century to the acceleration of the American economy in the nineteenth century.

Most recent studies find an important element of truth in the thesis associated with Eric Williams that links the slave trade and slave-based commerce with early British industrial development. Long-distance markets were crucial supports for technological progress and for the infrastructure of financial markets and the shipping sector.

But the eighteenth-century Atlantic economy was dominated by sugar, and sugar was dominated by slavery. The slave trade was central to the process because it would have been all but impossible to attract a free labour force to the brutal and deadly conditions that prevailed in sugar cultivation. As the mercantilist Sir James Steuart asked in 1767: ‘Could the sugar islands be cultivated to any advantage by hired labour?’

Adherents of an insurgency known as the New History of Capitalism have extended this line of analysis to nineteenth-century America, maintaining that: ‘During the eighty years between the American Revolution and the Civil War, slavery was indispensable to the economic development of the United States.’ A crucial linkage in this perspective is between slave-grown cotton and the cotton textile industries of both Britain and the United States, as asserted by Marx: ‘Without slavery you have no cotton; without cotton you have no modern industry.’

My research, to be presented in this year’s Tawney Lecture to the Economic History Society’s annual conference, argues, to the contrary, that such analyses overlook the second part of the Williams thesis, which held that industrial capitalism abandoned slavery because it was no longer needed for continued economic expansion. We need not ascribe cynical or self-interested motives to the abolitionists to assert that these forces succeeded because the political-economic consensus that supported slavery in the eighteenth century no longer prevailed in the nineteenth.

Between the American Revolution in 1776 and the end of the Napoleonic Wars in 1815, the demands of industrial capitalism changed in fundamental ways: expansion of new export markets in non-slave areas; streamlined channels for migration of free labour; the shift of the primary raw material from sugar to cotton. Unlike sugar, cotton was not confined to unhealthy locations, did not require large fixed capital investment, and would have spread rapidly through the American South, with or without slavery.

These historic shifts were recognised in the United States as in Britain, as indicated by the post-Revolutionary abolitions in the northern states and territories. To be sure, southern slavery was highly profitable to the owners, and the slave economy experienced considerable growth in the antebellum period. But the southern regional economy seemed increasingly out of step with the US mainstream, its centrality for national prosperity diminishing over time.

Indeed, my study asserts that on balance the persistence of slavery actually reduced the growth of cotton supply compared with a free-labour alternative. The truth of this proposition is most clearly demonstrated by the expansion of production after the Civil War and emancipation, and the return of world cotton prices to their pre-war levels.

The comfortable, the rich and the super-rich: what really happened to top British incomes during the first half of the twentieth century?

by Peter Scott and James T Walker (Henley Business School, University of Reading)




Road junction at Clapham Common, London, England. Available at Wikimedia Commons.

Long-run analysis of British income inequality has been severely hampered by historical income distribution data that are poor relative to those of other western countries. Until 1937, there were no official peacetime income distribution estimates for Britain, despite considerable contemporary interest in how much of the national income was taken by the rich.

In research to be presented at the Economic History Society’s 2019 annual conference, we address this question, focusing on changes in the incomes of the top 0.001-5% of the income distribution. This group is important for two reasons. First, because top incomes accounted for a substantial slice of total personal incomes, with the top 1% and top 5% taking around 30% and 45% of total income in 1911, according to our estimates.

Second, income redistribution in western countries is typically dominated by changes in the shares of the top 5% and, especially, within the top percentile. Thus examining higher incomes is crucial to explaining the apparent paradox between a relatively stagnant income distribution among the bulk of the British population and the generally assumed trend towards a more equal pre-tax income distribution.

Using a newly rediscovered Inland Revenue survey of personal incomes for taxpayers in 1911, we show that Britain had particularly high income inequality compared with other western countries. Top British income shares fell considerably over the following decades, though British incomes remained more unequal than those in the United States or France in 1949.

Inequality reduction was driven principally by a collapse in unearned incomes, reflecting economic shocks and government policy responses. Over the period from 1911 to 1949, there was a general downward trend in rent, dividend and interest income, with particularly sharp falls during the two world wars and the recessions of 1920-21 and 1929-32.

War-time inflation eroded the real income received from fixed interest securities; new London Stock Exchange issues of overseas securities were restricted by the Treasury (to protect Britain’s foreign exchange position), reducing rentiers’ ability to invest their income overseas; and the agricultural depression lowered real (inflation-adjusted) land rents.

These trends reflected a progressive collapse of the globalised world economy from 1914 to 1950, which both reduced the incomes of the rich and redistributed income to the bottom 95% of the income spectrum.

For example, rent control (introduced in 1915 and continuing throughout the period of our study) depressed the incomes of landlords, but it substantially reduced the real cost of a major household expenditure burden in a country where around 90% of households were private tenants. Rent control also led to extensive house sales by landlords, mainly to sitting tenants, at prices reflecting the low controlled rents.

Meanwhile, the scarcity of low-risk, high-yielding assets during the interwar years led to substantial deposits in building societies by high-income individuals, funding the house-building boom of the 1930s. Restrictions on overseas new issues also led the City of London to become increasingly involved in British industrial finance – expanding industrial growth and employment.

Conversely, the policy liberalisations of the 1980s that heralded the start of the new globalisation (and the resumption of growing income inequality in western nations) have made it far easier for the rich to offshore their assets, or themselves, either in search of better investment opportunities or of jurisdictions more suited to protecting their wealth. This has produced a strong international trend towards rising income inequality, with Britain returning to its position as one of the most unequal western nations.

Demographic shocks and women’s labour market participation: evidence from the 1918 influenza pandemic in India

by James Fenske, Bishnupriya Gupta and Song Yuan (University of Warwick)




Women washing clothes at the Sabarmati river, Ahmedabad. Available at Wikimedia Commons.

Women’s labour force participation is an important driver of economic development and gender equality. Historically, India has had low female participation in economic activities outside the home. Researchers have cited early marriage, social conservatism and limited comparative advantage of women in certain types of agriculture among the potential explanations.

Despite economic growth, female labour force participation in India has fallen in recent years, prompting several important studies. Our research, to be presented at the Economic History Society’s 2019 annual conference, considers the response of female labour force participation to one major demographic shock – the 1918 influenza pandemic.

Past studies have looked at the wartime mortality of men as a demographic shock affecting the total supply of labour, and have found that this affects women’s labour market participation. The empirical evidence of other types of demographic shock due to epidemics is more limited.

Our research focuses on India and aims to understand the impact of the large-scale demographic shock that was the 1918 influenza pandemic, in the context of a society in which female labour market participation had typically been low due to cultural norms.

From 1918 to 1919, a deadly influenza epidemic hit India, causing more than 13 million deaths, equivalent to 5% of the population. In contrast to typical epidemics, which are disproportionately deadly to immunologically weak individuals such as infants and the very old, this one primarily caused deaths among young adults between the ages of 20 and 40.

The mortality rate varied greatly across districts, ranging from 1.4% to 17.9% of the population in our sample. We focus on three questions: did the 1918 influenza pandemic increase or decrease female employment? If so, why? Was this effect persistent?

To answer these questions, we combine detailed district-level historical census data on occupations by gender from 1901 to 1931 with data from multiple sources on influenza mortality, marital status by age and gender, and wages.

Using an event-study approach, we find that a 1% increase in the mortality rate raised the female labour force participation rate by 1.2% in 1921, with the change concentrated in the service sector. But the effect was transitory, disappearing by 1931. By contrast, the pandemic did not affect the labour force participation of men at either the district or the district-by-sector level.
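The 1921 coefficient in such an event study amounts to regressing the change in log female participation between censuses on district-level influenza mortality. Below is a stripped-down sketch on synthetic data; it is illustrative only, not the authors’ specification, which would include district and year fixed effects and further controls:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # hypothetical districts
mortality = rng.uniform(0.014, 0.179, n)  # death shares spanning 1.4%-17.9%, as in the text

# Synthetic outcome: each percentage point of mortality raises log female
# LFP by 1.2 points, i.e. 1% higher mortality -> roughly 1.2% higher participation.
true_beta = 1.2
d_log_flfp = true_beta * mortality + rng.normal(0, 0.005, n)

# OLS of the 1911->1921 change on mortality: the 1921 event-study coefficient
X = np.column_stack([np.ones(n), mortality])
beta = np.linalg.lstsq(X, d_log_flfp, rcond=None)[0][1]
print(round(beta, 1))  # ≈ 1.2
```

The wide cross-district variation in mortality is what identifies the coefficient: districts hit harder by the pandemic show larger jumps in female participation at the 1921 census.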

How do we explain the labour market effects of the pandemic? Possible causal channels will have affected either the supply of or the demand for female labour. One possible channel is that the death of men increased the share of widows in the population. As household income was generally earned by men, widows were pushed to participate in the labour market in order to mitigate the negative economic shock.

On the other hand, the rise in the proportion of widowers had no impact on male labour force participation, as most men worked before the disaster, whether widowed or not. In addition, the pandemic led to a shortage of labour, potentially raising wages and drawing women out of the home and into the labour force.

Our findings provide evidence that negative demographic shocks alter the working behaviour of women, at least in the short run. In contrast with previous research on events such as the slave trade and the two world wars, which considered sex-biased demographic shocks, we show that shocks that are not sex-biased can also play a role in determining female employment. Further, our analysis helps us to understand the historical dynamics and determinants of women’s economic and social status in India.


The flexibility of the gold standard – any lessons for the Eurozone?

by Guillaume Bazot (University Paris 8), Eric Monnet (Banque de France, Paris School of Economics & CEPR) and Matthias Morys (University of York)



Cover image of Puck Magazine, v. 47, no. 1201 (1900 March 14). Available at Wikimedia Commons.

A great deal of research has drawn on the classical gold standard (1870s-1914) – an earlier and highly successful system of fixed exchange rates – to provide guidance for Europe’s monetary union (EMU). Such historical inspiration initially informed the design of the euro and then, since the outbreak of the Eurozone crisis in 2010, how to fix the flaws of the common currency.

Researchers have offered very different lessons from the past, yet they all share one key assumption: that the gold standard was a highly rigid system, robbing countries of their monetary policy and leaving them potentially unable to respond to major shocks.

Our research, to be presented at the Economic History Society’s 2019 annual conference, takes the opposite perspective: the extraordinary stability of the gold standard stems from the fact that it was precisely not a rigid framework. Central banks retained far more room for manoeuvre than is conventionally acknowledged, and they knew how to use this policy space for their own purposes.

Drawing on a unique high-frequency data set for all 21 central banks of the gold standard period (arduously collected by Bank of France statisticians at the time but then shelved in the bank’s archives, where we found them), we estimate how other central banks reacted when ‘the conductor of the international orchestra’ (as Keynes named the Bank of England) raised interest rates.

Our results show that there were no ‘rules of the game’ that countries automatically followed after a rate rise. Instead, we document that central banks pursued a variety of economically well-defined and econometrically clearly identifiable strategies, with the aim of mitigating the impact of a foreign central bank’s decision on the domestic economy.

Intriguingly, two important strategies pursued at the time – sterilisation and capital controls – have clear parallels with key policies implemented in the Eurozone since 2010.

This is especially the case with the (highly) asymmetric distribution of loans for the ‘long-term refinancing operations’ (LTROs) across countries (in order to improve the convergence of credit conditions), macroprudential policies (to tame asymmetric credit booms), the use of ‘emergency liquidity assistance’ (ELA) by national central banks and the use of capital controls as a last resort in Cyprus and Greece.

It is often said that the euro started out as a modern-day gold standard (clear rules, common monetary policy, separate fiscal policies) but has developed into something else in recent years. By contrast, our research shows that the gold standard always possessed certain features of flexibility – characteristics that the euro area had to introduce in response to the Eurozone crisis. In its move away from one-size-fits-all monetary policy, today’s EMU arguably resembles the classical gold standard more than it did before 2010.

Foreign intervention can bring stability…but stability is overrated: the United States in Latin America 1905-31

by Leticia Abad (City University of New York) and Noel Maurer (George Washington University)



‘The Birth of the Monroe Doctrine’ – US President James Monroe presides over a cabinet meeting in 1823, discussing the Monroe Doctrine. Available at Wikimedia Commons.

Every time American intervention in some faraway conflict appears to be off the table, another call for it rises. Today the example is the Bolivarian Republic of Venezuela, where President Nicolás Maduro has neutered the elected parliament while causing the economy to collapse to the point of famine. The Trump administration has openly backed the opposition and the president himself has floated the possibility of military intervention.

Donald Trump is following in the footsteps of President Theodore Roosevelt, who ushered in an earlier era of American intervention with his famous 1904 call: ‘The adherence of the United States to the Monroe Doctrine may force the United States, however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power.’

Roosevelt’s declaration began a three-decade period of extensive American intervention, in which the United States used its tools of national power to intervene in the affairs of Latin American governments. The United States took over the fiscal institutions of no fewer than eight (out of 19) Latin American states over the period; deployed military forces in five; and took over the entire state (for a time) in three.

What can that earlier period tell us about the wisdom of modern-day calls for intervention?

In research to be presented at the Economic History Society’s 2019 annual conference, we find that US intervention in the period 1905-31 indeed reduced political instability. The number of coups and coup attempts fell, as did the severity of political violence.

More specifically, US intervention reduced the probability of an unconstitutional regime change by 10% and the intensity of political violence (measured by the number of deaths per day in political violence) by almost half. In that sense, intervention worked.

But the United States failed to accomplish its other goals. American aims were not just to reduce violence; they hoped to decrease corruption, increase government revenue and promote investment in the intervened nations.

We find that the efficiency of government institutions – measured by the ability to collect customs revenue – did not improve: in fact, it appears to have fallen. That is to say, intervened governments got worse at carrying out their basic governing functions.

Nor did foreign investors commit more capital to the intervened countries: neither portfolio investment, foreign direct investment nor domestic investment budged, and the volume of trade did not grow. What did change were the short-term profits of existing bondholders, which rose, and the direction of trade, which was diverted towards the United States.

In short, intervention provided political stability to the target countries. It also generated private benefits for American investors and traders. But it did little to improve those countries’ quality of governance or long-term growth prospects.

Two general lessons emerge from our work.

First, political stability may be overrated. Investors don’t seem to care, at least inasmuch as instability doesn’t directly affect them. The reason, we postulate, is that investors have many ways of protecting their interests in poor and unstable countries.

Foreign intervention therefore provides a nice benefit to creditors, who believe that it raises their chances of being repaid, but it does little to prompt them to put new money at risk.

Second, improving governance is hard. It is one thing to prevent rebels from sacking customs houses or stop governments from massacring their opponents; it is quite another to stamp out corruption and make government more efficient. The United States failed at its attempts to promote long-term growth.

To strip out reverse causality and the influence of third factors, we exploit the opening of the Panama Canal, which brought the seven Pacific coast countries of Latin America effectively closer to Washington. Being closer to Washington made the United States more likely to intervene, but it had no independent effect on countries’ political stability or government corruption.
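The identification logic behind this natural experiment can be sketched in a few lines of simulated code. Everything below is hypothetical illustration – the variable names, coefficients and data are invented, not the authors’ dataset: a canal-proximity shock (Z) shifts the likelihood of intervention (D) but affects instability (Y) only through D, so two-stage least squares recovers the causal effect that naive regression misses.

```python
import numpy as np

# Simulated illustration of the instrumental-variables strategy described
# above. All numbers are made up; none come from the paper.
rng = np.random.default_rng(0)
n = 10_000

U = rng.normal(size=n)   # unobserved confounder (e.g. weak state capacity)
Z = rng.normal(size=n)   # instrument: proximity shock from the canal opening
# Intervention is more likely where Z and U are high:
D = (0.8 * Z + 0.5 * U + rng.normal(size=n) > 0).astype(float)
# Instability falls with intervention (true effect -0.5) but rises with U:
Y = -0.5 * D + 0.7 * U + rng.normal(size=n)

def ols(y, X):
    """Least-squares coefficients; X already includes a constant column."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS is biased because D is correlated with the confounder U.
naive = ols(Y, np.column_stack([np.ones(n), D]))[1]

# Two-stage least squares: first stage D on Z, then Y on fitted D.
Zmat = np.column_stack([np.ones(n), Z])
D_hat = Zmat @ ols(D, Zmat)
iv = ols(Y, np.column_stack([np.ones(n), D_hat]))[1]

print(f"naive OLS estimate: {naive:+.2f}")  # pulled towards zero by U
print(f"2SLS estimate:      {iv:+.2f}")     # close to the true effect -0.5
```

The design choice mirrors the paper’s argument: the instrument must predict intervention (relevance) while having no direct channel to stability (exclusion), which is what the canal-induced change in effective distance to Washington is claimed to satisfy.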

Urbanisation and regional GDP growth in Europe over the twentieth century

by Kerstin Enflo (Lund University), Anna Missiaia (Lund University) and Joan Rosés (LSE)

This research will be presented during the EHS Annual Conference in Belfast, April 5th – 7th 2019. Conference registration can be found on the EHS website.



‘Palace of the People’ in Amsterdam. Photo dates back to about 1890. Available at Wikimedia Commons.

Fast urbanisation is a phenomenon often associated with the image of African or Asian mega-cities, but migration from rural to urban areas is also a European phenomenon (witness the growth of large capitals such as London and Paris, as well as smaller ones such as Stockholm and Copenhagen). And according to United Nations forecasts, the urbanisation trend will continue, with an estimated 2.5 billion people added to the world’s urban population by 2050.

The first question that comes to mind is whether urbanisation triggers economic growth, and therefore should be favoured by policy-makers, as suggested by eminent scholars such as Ed Glaeser (The Triumph of the City: How Our Best Invention Makes Us Richer, Smarter, Greener, Healthier, and Happier, 2011) or Richard Florida (The Rise of the Creative Class, 2002).

Although this relationship is overall positive, the paradigm has been challenged with respect to African mega-cities: their urbanisation rate takes off in periods of growth but it does not immediately decrease in periods of recession. As cities continue to grow in size, but fail to grow in GDP per capita, their inhabitants experience falling income levels, ultimately leading to falling living standards (Fay and Opal, 2000).

If we look at Europe, urbanisation without growth does not appear to be an issue when countries are the units of analysis. But national figures could conceal both the success stories of the large capitals and the less happy fortunes of middle-sized declining cities. Net of the big successful capitals, many cities that thrived during the post-war period are now struggling, with clear economic, social and political consequences.

Our work, to be presented at the Economic History Society’s 2019 annual conference, contributes to the debate by looking at this relationship for the first time at the regional, rather than national level, using urbanisation rates and GDP per capita in EU regions in the twentieth century. The regional dimension makes it possible to disentangle the effects of urbanisation from the effects of being the capital’s region.

Our main findings are that the relationship between urbanisation and growth is positive and significant until the middle of the twentieth century, while it is not significant in recent years. We therefore observe a progressive decoupling of regional urbanisation and economic growth. The effect on growth of having the capital in the region is very large: until the mid-twentieth century, it amounts to between 60% and 70% of the effect of urbanisation itself.

When looking at macro areas, both Southern Europe and Northern Europe show no statistically significant relationship between urbanisation and economic growth, suggesting that regions containing urban areas without the status of capital do not necessarily grow more than regions without such urban areas.

This is consistent with the idea of a ‘winner-take-all urbanism’ presented by Florida (The New Urban Crisis, 2017) in which there is a growing divide between the winner cities (London, New York, Paris, San Francisco) and the rest.

In the winner cities, the middle class, the service class and the working class are priced out by highly paid creative workers. In the rest of the cities, where creative workers are not based, the middle class declines without being replaced by the new rich.

Our results are relevant for policy-makers as they challenge the view that urbanisation per se is a strong channel for economic growth regardless of the period and geographical area considered.