The market turn: From social democracy to market liberalism. By Avner Offer, All Souls College, University of Oxford. Abstract: Social democracy and market liberalism offered different solutions to the same problem: how to provide for life-cycle dependency. Social democracy makes lateral transfers from producers to dependents by means of progressive taxation. Market liberalism uses […]
From Timothy Hatton, Professor of Economics, Australian National University and University of Essex. Originally published on 9 May 2014
The heights of today’s populations tell us little about which factors matter for long-run trends in health and height. This column identifies the correlates of height in the past using a sample of British army soldiers from World War I. While the socioeconomic status of the household mattered, the local disease environment mattered even more. Better education and modest medical advances led to an improvement in average health, despite the war and the depression.
The last century has seen unprecedented increases in the heights of adults (Bleakley et al., 2013). Among young men in western Europe, that increase amounts to about four inches. On average, sons have been taller than their fathers for the last five generations. These gains in height are linked to improvements in health and longevity.
Increases in human stature have been associated with a wide range of improvements in living conditions, including better nutrition, a lower disease burden, and some modest improvement in medicine. But looking at the heights of today’s populations provides limited evidence on the socioeconomic determinants that can account for long-run trends in health and height. For that, we need to understand the correlates of height in the past. Instead of asking why people are so tall now, we should be asking why they were so short a century ago.
In a recent study, Roy Bailey, Kris Inwood and I (Bailey et al. 2014) took a sample of soldiers who joined the British army around the time of World War I. They were randomly selected from a vast archive of two million service records made available by the National Archives, mainly for the benefit of genealogists searching for their ancestors.
For this study, we draw a sample of servicemen who were born in the 1890s and who would therefore have been in their late teens or early twenties when they enlisted. About two thirds of this cohort enlisted in the armed services, so the sample suffers much less from selection bias than would be likely during peacetime, when only a small fraction joined the forces. But the sample does not include officers, who were taller than the men they commanded. And at the other end of the distribution, it also misses some of the least fit, who were likely to be shorter than average.
Cutting the welfare budget is unlikely to lead to an increase in private voluntary work and charitable giving, according to research by Nina Boberg-Fazlic and Paul Sharp.
Their study of England in the late eighteenth and early nineteenth centuries, published in the February 2017 issue of the Economic Journal, shows that parts of the country where there was increased spending under the Poor Laws actually enjoyed higher levels of charitable income.
The authors conclude:
‘Since the end of the Second World War, the size and scope of government welfare provision has come increasingly under attack.’
‘There are theoretical justifications for this, but we believe that the idea of ‘crowding out’ – public spending deterring private efforts – should not be one of them.’
‘On the contrary, there even seems to be evidence that government can set an example for private donors.’
Why does Europe have considerably higher welfare provision than the United States? One long debated explanation is the existence of a ‘crowding out’ effect, whereby government spending crowds out private voluntary work and charitable giving. The idea is that taxpayers feel that they are already contributing through their taxes and thus do not contribute as much privately.
Crowding out makes intuitive sense if people are only concerned with the total level of welfare provided. But many other factors might play a role in the decision to donate privately and, in fact, studies on this topic have led to inconclusive results.
The idea of crowding out has also caught the imagination of politicians, most recently as part of the flagship policy of the UK’s Conservative Party in the 2010 General Election: the so-called ‘big society’. If crowding out holds, spending cuts could be justified by the notion that the private sector will take over.
The new study shows that this is not necessarily the case. In fact, the authors provide historical evidence for the opposite. They analyse data on per capita charitable income and public welfare spending in England between 1785 and 1815. This was a time when welfare spending was regulated locally under the Poor Laws, which meant that different areas in England had different levels of spending and generosity in terms of who received how much relief for how long.
The research finds no evidence of crowding out; rather, it finds that parts of the country with higher state provision of welfare actually enjoyed higher levels of charitable income. At the time, Poor Law spending was increasing rapidly, largely due to strains caused by the Industrial Revolution. This increase occurred despite there being no changes in the laws regulating relief during this period.
The increase in Poor Law spending worried contemporary commentators and economists. Many believed that it reflected the disincentive effects of poor relief, and that mandatory contributions through the poor rate would crowd out voluntary giving, thereby undermining social virtue. That debate is now largely repeating itself, two hundred years later.
Summary of the article ‘Does Welfare Spending Crowd Out Charitable Activity? Evidence from Historical England under the Poor Laws’ by Nina Boberg-Fazlic (University of Duisburg-Essen) and Paul Sharp (University of Southern Denmark). Published in the Economic Journal, February 2017
Professor of Economic History, University of Oxford
Published on 8 April 2014
The plague known as the Black Death, which tore through fourteenth-century Europe, is traditionally held to have had at least one upside. Women, the theory runs, were able to exploit the labour shortages of post-plague England to find themselves in a richer and more stable position than before. However, the idea that women of the era were forerunners of the post-World War I generation doesn’t stand up to much scrutiny, as new research shows.
Medievalists have long debated the extent to which women shared in the “golden age” of the English peasantry that followed the demographic catastrophe of the Black Death. The plague killed between 30% and 45% of the population in its first wave of 1348-49. Recurrences meant that by the 1370s England’s population had been halved.
The silver lining, for the peasantry at least, was the dramatic increase in workers’ remuneration as landowners struggled to recruit and retain labourers. The results are apparent in a rapid increase in male casual (nominal and real) wages from about 1349.
Some historians have argued that women’s gains were even more marked as they could find employment in hitherto male-dominated jobs, or migrate to towns to work in the growing textile industries and commercial services and so enjoy “economic independence”.
Others however have suggested that whatever the implications of the Black Death for male workers, the sexual division of labour prevented women from seizing the opportunities created by the labour shortage. As one account puts it: “Women tended to work in low-skilled, low-paid jobs … This was true in 1300 and it remained true in 1700”.
The debate has significant implications, as optimists have gone further, arguing that women’s improved wages changed demographic behaviour by delaying marriage, promoting celibacy and reducing fertility; the resulting so-called north-west European Marriage Pattern raised incomes and promoted further growth.
The phylloxera crisis in nineteenth century France destroyed 40% of the country’s vineyards, devastating local economies. According to research by Vincent Bignon, Eve Caroli, and Roberto Galbiati, the negative shock to wine production led to a substantial increase in property crime in the affected regions. But their study, published in the February 2017 issue of the Economic Journal, also finds that there was a significant fall in violent crimes because of the reduction in alcohol consumption.
It has long been debated whether crime responds to economic conditions. In particular, do crime rates increase because of financial crises or major downsizing events in regions heavily specialised in some industries?
Casual observation and statistical evidence suggest that property crimes are more frequent during economic crises. For example, the United Nations Office on Drugs and Crime has claimed that in a sample of 15 countries, theft increased sharply during the last economic crisis.
These issues are important because crime is also known to have a damaging impact on economic growth by discouraging business and talented workers from settling in regions with high rates of crime. If an economic downturn triggers an increase in the crime rate, it could have long-lasting effects by discouraging recovery.
But since multiple factors can simultaneously affect economic conditions and the propensity to commit crime, identifying a causal effect of economic conditions on crime rates is challenging.
The new research addresses the issue by examining how crime rates were affected by a major economic crisis that massively hit wine production, France’s most iconic industry, in the nineteenth century.
The crisis was triggered by a near-microscopic insect, Phylloxera vastatrix. Native to North America, it did not reach Europe in the era of sailing ships: the transatlantic journey took so long that the insects died before arrival.
Steam power provided the greater speed that phylloxera needed to survive the trip, and it arrived in France in 1863 on imported US vines. Innocuous in its original ecology, phylloxera proved highly destructive in French vineyards, sucking the sap of the vines. Between 1863 and 1890, it destroyed about 40% of them, causing a significant loss of GDP.
Because phylloxera took time to spread, not all districts were hit at the same moment; and because districts differed widely in their suitability for wine-growing, not all were hit equally hard. The phylloxera crisis is therefore an ideal natural experiment for identifying the impact of an economic crisis on crime, because it generated exogenous variation in economic activity across 75 French districts.
To show the effect quantitatively, the researchers collected local administrative data on the evolution of property and violent crime rates, as well as minor offences. They use these data to study whether crime increased significantly after the arrival of phylloxera and the destruction of vineyards that it entailed.
The results suggest that the phylloxera crisis caused a substantial increase in property crime rates and a significant decrease in violent crimes. The effect on property crime was driven by the negative income shock induced by the crisis. People coped with the negative income shock by engaging in property crimes. At the same time, the reduction in alcohol consumption induced by the phylloxera crisis had a positive effect on the reduction of violent crimes.
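The staggered, exogenous arrival of the blight is what makes this design work: districts hit earlier can be compared with districts hit later or not at all, before and after the shock. As a purely illustrative sketch (not the authors' code or data), the logic can be mimicked with a two-way fixed-effects difference-in-differences regression on synthetic district-year data; the effect size, timing, and noise below are invented.

```python
# Illustrative difference-in-differences sketch on synthetic data,
# mimicking a staggered shock across 75 districts. Hypothetical numbers;
# not the authors' estimation.
import numpy as np

rng = np.random.default_rng(0)
n_districts, n_years = 75, 30
true_effect = 0.4  # assumed jump in property crime after the shock

# Each district is "hit" in a random year (staggered treatment);
# draws beyond the sample window mean the district is never hit.
hit_year = rng.integers(5, n_years + 10, size=n_districts)

rows = []
for d in range(n_districts):
    d_fe = rng.normal()  # district fixed effect (e.g. baseline crime level)
    for t in range(n_years):
        treated = float(t >= hit_year[d])
        y = d_fe + 0.02 * t + true_effect * treated + rng.normal(scale=0.3)
        rows.append((d, t, treated, y))

d_idx, t_idx, treat, y = map(np.array, zip(*rows))

# Design matrix: treatment dummy + district dummies + year dummies
# (year 0 dropped to avoid collinearity with the district dummies).
X = np.column_stack([
    treat,
    (d_idx[:, None] == np.arange(n_districts)).astype(float),
    (t_idx[:, None] == np.arange(1, n_years)).astype(float),
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated treatment effect: {beta[0]:.2f}")  # close to true_effect
```

The two sets of dummies absorb anything constant within a district and anything common to all districts in a year, so the treatment coefficient is identified purely from the timing of each district's shock.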
From a policy point of view, these results suggest that crises and downsizing events can have long-lasting effects. By showing that the near-disappearance of an industry (in this case only a temporary one) can have long-run negative consequences for local districts through higher crime, the study underlines that this issue should be high on the policy agenda in times of crisis.
Summary of the article ‘Stealing to Survive? Crime and Income Shocks in Nineteenth Century France’ by Vincent Bignon, Eve Caroli and Roberto Galbiati. Published in the Economic Journal, February 2017
‘Monitoring the impact of economic crisis on crime’, United Nations Office on Drugs and Crime, 2012. This effect was also noted by the French Observatoire national de la délinquance et des réponses pénales, which observed that burglaries increased sharply in France between 2007 and 2012.
In my post on French economic history last week, I claimed that Robert Allen’s 2001 paper in Explorations in Economic History was one of the ten most important papers of the last twenty-five years. In reaction, economic historian Benjamin Guilbert asked me “what are the other nine”? As I started thinking about the best articles, I realized that […]