Revisiting the changing body

by Bernard Harris (University of Strathclyde)

The Society has arranged with Cambridge University Press for a 20% discount on this book, valid until 11 November 2018. The discount page is: www.cambridge.org/wm-ecommerce-web/academic/landingPage/EHS20

The last century has witnessed unprecedented improvements in survivorship and life expectancy. In the United Kingdom alone, infant mortality fell from over 150 deaths per thousand births at the start of the last century to 3.9 deaths per thousand births in 2014 (see the Office for National Statistics for further details). Average life expectancy at birth increased from 46.3 to 81.4 years over the same period (see the Human Mortality Database). These changes reflect fundamental improvements in diet and nutrition, and in environmental conditions.

The changing body: health, nutrition and human development in the western world since 1700 attempted to understand some of the underlying causes of these changes. It drew on a wide range of archival and other sources covering not only mortality but also height, weight and morbidity. One of our central themes was the extent to which long-term improvements in adult health reflected the beneficial effect of improvements in earlier life.

The changing body also outlined a very broad schema of ‘technophysio evolution’ to capture the intergenerational effects of investments in early life. This is represented in simple form in Figure 1, which shows how improvements in the nutritional status of one generation increase its capacity to invest in the health and nutritional status of the next generation, and so on ‘ad infinitum’ (Floud et al. 2011: 4).

Figure 1. Technophysio evolution: a schema. Source: See Floud et al. 2011: 3-4.

We also looked at some of the underlying reasons for these changes, including the role of diet and ‘nutrition’. As part of this process, we included new estimates of the number of calories that could be derived from the amount of food available for human consumption in the United Kingdom between circa 1700 and 1913. However, our estimates contrasted sharply with others published at the same time (Muldrew 2011) and were subsequently challenged by a number of other authors. Broadberry et al. (2015) thought that our original estimates were too high, whereas both Kelly and Ó Gráda (2013) and Meredith and Oxley (2014) regarded them as too low.

Given the importance of these issues, we revisited our original calculations in 2015. We corrected an error in the original figures, used Overton and Campbell’s (1996) data on extraction rates to recalculate the number of calories, and included new information on the importation of food from Ireland to other parts of what became the UK. Our revised Estimate A suggested that the number of calories rose by just under 115 calories per head per day between 1700 and 1750 and by more than 230 calories between 1750 and 1800, with little change between 1800 and 1850. Our revised Estimate B suggested that there was a much bigger increase during the first half of the eighteenth century, followed by a small decline between 1750 and 1800 and a bigger increase between 1800 and 1850 (see Figure 2). However, both sets of figures were still well below the estimates prepared by Kelly and Ó Gráda, Meredith and Oxley, and Muldrew for the years before 1800.

Figure 2. Source: Harris et al. 2015: 160.

These calculations have important implications for a number of recent debates in British economic and social history (Allen 2005, 2009). Our data do not necessarily resolve the debate over whether Britons were better fed than people in other countries, although they do compare quite favourably with relevant French estimates (see Floud et al. 2011: 55). However, they do suggest that a significant proportion of the eighteenth-century population was likely to have been underfed.

Our data also raise some important questions about the relationship between nutrition and mortality. Our revised Estimate A suggests that food availability rose slowly between 1700 and 1750 and then more rapidly between 1750 and 1800, before levelling off between 1800 and 1850. These figures are still broadly consistent with Wrigley et al.’s (1997) estimates of the main trends in life expectancy and our own figures for average stature. However, it is not enough simply to focus on averages; we also need to take account of possible changes in the distribution of foodstuffs within households and the population more generally (Harris 2015). Moreover, it is probably a mistake to examine the impact of diet and nutrition independently of other factors.

To contact the author: bernard.harris@strath.ac.uk

References

Allen, R. (2005), ‘English and Welsh agriculture, 1300-1850: outputs, inputs and income’. URL: https://www.nuffield.ox.ac.uk/media/2161/allen-eandw.pdf.

Allen, R. (2009), The British industrial revolution in global perspective, Cambridge: Cambridge University Press.

Broadberry, S., Campbell, B., Klein, A., Overton, M. and Van Leeuwen, B. (2015), British economic growth, 1270-1870, Cambridge: Cambridge University Press.

Floud, R., Fogel, R., Harris, B. and Hong, S.C. (2011), The changing body: health, nutrition and human development in the western world since 1700, Cambridge: Cambridge University Press.

Harris, B. (2015), ‘Food supply, health and economic development in England and Wales during the eighteenth and nineteenth centuries’, Scientia Danica, Series H, Humanistica, 4 (7), 139-52.

Harris, B., Floud, R. and Hong, S.C. (2015), ‘How many calories? Food availability in England and Wales in the eighteenth and nineteenth centuries’, Research in Economic History, 31, 111-91.

Kelly, M. and Ó Gráda, C. (2013), ‘Numerare est errare: agricultural output and food supply in England before and during the industrial revolution’, Journal of Economic History, 73 (4), 1132-63.

Meredith, D. and Oxley, D. (2014), ‘Food and fodder: feeding England, 1700-1900’, Past and Present, 222, 163-214.

Muldrew, C. (2011), Food, energy and the creation of industriousness: work and material culture in agrarian England, 1550-1780, Cambridge: Cambridge University Press.

Overton, M. and Campbell, B. (1996), ‘Production et productivité dans l’agriculture anglaise, 1086-1871’, Histoire et Mesure, 11 (3-4), 255-97.

Wrigley, E.A., Davies, R., Oeppen, J. and Schofield, R. (1997), English population history from family reconstitution, Cambridge: Cambridge University Press.

Surprisingly gentle confinement

Tim Leunig (LSE), Jelle van Lottum (Huygens Institute) and Bo Poulsen (Aalborg University) have been investigating the treatment of prisoners of war in the Napoleonic Wars.

 

Napoleonic Prisoner of War. Available at <https://blog.findmypast.com.au/explore-our-fascinating-new-napoleonic-prisoner-of-war-records-1406376311.html>

For most of history, life as a prisoner of war was nasty, brutish and short. There were no regulations on the treatment of prisoners until the 1899 Hague Convention and the later Geneva Conventions. Many prisoners were killed immediately; others were enslaved to work in mines and other undesirable places.

The poor treatment of prisoners of war was partly intentional: they were the hated enemy, after all. And it was partly economic: it costs money to feed and shelter prisoners, and countries in the past, especially in times of war and conflict, were much poorer than today.

Nineteenth-century prisoner death rates were horrific. Between one half and six sevenths of Napoleon’s 17,000 troops who surrendered to the Spanish in 1808 after the Battle of Bailén died as prisoners of war. The American Civil War saw death rates rise to 27%, even though the average prisoner was captive for less than a year.

The Napoleonic Wars saw the British capture 7,000 Danish and Norwegian sailors, military and merchant. Britain did not desire war with Denmark (which ruled Norway at the time), but declared it to prevent Napoleon from seizing the Danish fleet. Prisoners were incarcerated on old, unseaworthy “prison hulks” moored in the Thames Estuary, near Rochester. Conditions were crowded: each man was given just 2 feet (60 cm) in width to hang his hammock.

Were these prison hulks floating tombs, as some contemporaries claimed? Our research shows otherwise. The Admiralty kept exemplary records, now held in the National Archives at Kew. These show the date of arrival in prison, and the date of release, exchange, escape – or death. They also tell us each prisoner’s age, where he came from, the type of ship he served on, and whether he was an officer, craftsman or regular sailor. We can use these records to look at how many died, and why.

The prisoners ranged in age from 8 to 80, with half aged 22 to 35. The majority sailed on merchant vessels, with a sixth on military vessels and a quarter on licensed pirate boats permitted to harass British shipping. The amount of time spent in prison varied dramatically, from 3 days to over 7 years, with an average of 31 months. About two thirds were released before the end of the war.

Taken as a whole, 5% of prisoners died. This is a remarkably low number, given how long they were held, and given experience elsewhere in the nineteenth century. Being held prisoner for longer increased your chance of dying, but not by much: those who spent three years on a prison hulk had only a 1% greater chance of dying than those who served just one year.

Death was (almost) random. Being captured at the start of the war was neither better nor worse than being captured at the end. The number of prisoners held at any one time did not increase the death rate. The old were no more likely to die than the young – anyone fit enough to go to sea was fit enough to withstand the rigours of prison life. Despite extra space and better rations, officers were no less likely to die, implying that conditions were reasonable for common sailors.

There was only one exception: sailors from licensed pirate boats were twice as likely to die as merchant or navy sailors. We cannot know the reason. Perhaps they were treated less well by their guards, or by other prisoners. Perhaps they were risk takers who gambled away their rations. Even for this group, however, the death rates were very low compared with those of prisoners captured in other places and in other wars.

The British had rules on prisoners of war, covering food and hygiene. Each prisoner was entitled to 2.5 lbs (just over 1 kg) of beef, 1 lb of fish, 10.5 lbs of bread, 2 lbs of potatoes, 2.5 lbs of cabbage, and 14 pints (8 litres) of (very weak) beer a week. This is not far short of Danish naval rations, and prisoners are less active than sailors. We cannot be sure that they received their rations in full every week, but the death rates suggest that they were not systematically hungry. The absence of epidemics suggests that hygiene was also good. Remarkably, and despite a national debt that peaked at a still unprecedented 250% of GDP, the British appear to have obeyed their own rules on how to treat prisoners.

Far from being floating tombs, therefore, this was a surprisingly gentle confinement for the Danish and Norwegian sailors captured by the British in the Napoleonic Wars.

Small Bills and Petty Finance: co-creating the history of the Old Poor Law

by Alannah Tomkins (Keele University) 

Alannah Tomkins and Professor Tim Hitchcock (University of Sussex) won an AHRC award to investigate ‘Small Bills and Petty Finance: co-creating the history of the Old Poor Law’. It is a three-year project running from January 2018. The application was for £728K, which has been raised, through indexing, to £740K. The project website can be found at: thepoorlaw.org.

 

Twice in my career I’ve been surprised by a brick – or more precisely by bricks, hurtling into my research agenda. In the first instance I found myself supervising a PhD student working on the historic use of brick as a building material in Staffordshire (from the sixteenth to the eighteenth centuries). The second time, the bricks snagged my interest independently.

The AHRC-funded project ‘Small bills and petty finance’ did not set out to look for bricks. Instead it promises to explore a little-used source for local history, the receipts and ‘vouchers’ gathered by parish authorities as they relieved or punished the poor, to write multiple biographies of the tradesmen and others who serviced the poor law. A parish workhouse, for example, exerted a considerable influence over a local economy when it routinely (and reliably) paid for foodstuffs, clothing, fuel and other necessaries. This influence or profit-motive has not been studied in any detail for the poor law before 1834, and vouchers’ innovative content is matched by an exciting methodology. The AHRC project calls on the time and expertise of archival volunteers to unfold and record the contents of thousands of vouchers surviving in the three target counties of Cumbria, East Sussex and Staffordshire. So where do the bricks come in?

The project started life in Staffordshire as a pilot in advance of AHRC funding. The volunteers met at the Stafford archives and started by calendaring the contents of vouchers for the market town of Uttoxeter, near the Staffordshire/Derbyshire border. And the Uttoxeter workhouse did not confine itself to accommodating and feeding the poor. Instead in the 1820s it managed two going concerns: a workhouse garden producing vegetables for use and sale, and a parish brickyard. Many parishes under the poor law embedded make-work schemes in their management of the resident poor, but no others that I’m aware of channelled pauper labour into the manufacture of bricks.


The workhouse and brickyard were located just to the north of the town of Uttoxeter, in an area known as The Heath. The land was subsequently used to build the Uttoxeter Union workhouse in 1837-8 (after the reform of the poor law in 1834), so no signs of the brickyard remain in the twenty-first century. It was, however, one of several such yards identified at The Heath in the tithe map for Uttoxeter of 1842, and it probably made use of a fixed kiln rather than a temporary clamp. This can be deduced from the parish’s sale of both bricks and tiles to brickyard customers: tiles were more refined products than bricks and required more control over the firing process, whereas clamp firings were more difficult to regulate. The yard provided periodic employment to the adult male poor of the Uttoxeter workhouse, in accordance with the seasonal pattern imposed on all brick manufacture at the time. Firings typically began in March or April each year and continued until September or October, depending on the weather.

This is important because the variety of vouchers relating to the parish brickyard allows us to understand something of its place in the town’s economy, both as a producer and as a consumer of other products and services. Brickyards needed coal, so it is no surprise that one of the major expenses in supporting the yard lay in bringing coal to the town from elsewhere via the canal. The Uttoxeter canal wharf was also at The Heath, and access to transport by water may explain the development of a number of brickyards in its proximity. The yard also required wood and other raw materials in addition to clay, and specific products to protect the bricks after cutting but before firing. The parish bought quantities of archangel mats, rough woven pieces that could be used like a modern protective fleece to guard against frost damage. We surmise that Uttoxeter used the mats to cover both the bricks and any tender plants in the workhouse garden.


Similarly the bricks were sold chiefly to local purchasers, including members of the parish vestry. Some men who were owed money by the parish for their work as suppliers allowed the debt to be offset by bricks. Finally the employment of workhouse men as brickyard labourers gives us, when combined with some genealogical research, a rare glimpse of the place of workhouse work in the life-cycle of the adult poor. More than one man employed at the yard in the 1820s and 1830s went on to independence as a lodging-house keeper in the town by the time of the 1841 census.

As I say, I’ve been surprised by brick. I had no idea that such a mundane product would prove so engaging. All this goes to show that it’s not the stolidity of the brick but its deployment that matters, historically speaking.

 

To contact the author: a.e.tomkins@keele.ac.uk


Judges and the death penalty in Nazi Germany: New research evidence on judicial discretion in authoritarian states

The German People’s Court. Available at https://www.foreignaffairs.com/reviews/review-essay/good-germans

Do judicial courts in authoritarian regimes act as puppets for the interests of a repressive state – or do judges act with greater independence? How much do judges draw on their political and ideological affiliations when imposing the death sentence?

A study of Nazi Germany’s notorious People’s Court, recently published in the Economic Journal, reveals direct empirical evidence of how judges in one of the world’s most notoriously politicised courts were influenced in their life-and-death decisions.

The research provides important empirical evidence that the political and ideological affiliations of judges do come into play – a finding that has applications for modern authoritarian regimes and also for democracies that administer the death penalty.

The research team – Dr Wayne Geerling (University of Arizona), Prof Gary Magee, Prof Russell Smyth, and Dr Vinod Mishra (Monash Business School) – explore the factors influencing the likelihood of imposing the death sentence in Nazi Germany for crimes against the state – treason and high treason.

The authors examine data compiled from official records of individuals charged with treason and high treason who appeared before the People’s Court up to the end of the Second World War.

Established by the Nazis in 1934 to hear cases of serious political offences, the People’s Court has been vilified as a ‘blood tribunal’ in which judges meted out pre-determined sentences.

But in recent years, while not contending that the People’s Court judgments were impartial or that its judges were not subservient to the wishes of the regime, a more nuanced assessment has emerged.

For the first time, the new study presents direct empirical evidence of the reasons behind the use of judicial discretion and why some judges appeared more willing to implement the will of the state than others.

The researchers find that judges with a deeper ideological commitment to Nazi values – typified by being members of the Alte Kampfer (‘Old Fighters’ or early members of the Nazi party) – were indeed more likely to impose the death penalty than those who did not share it.

These judges were more likely to hand down death penalties to members of the most organised opposition groups, those involved in violent resistance against the state and ‘defendants with characteristics repellent to core Nazi beliefs’:

‘The Alte Kampfer were thus more likely to sentence devout Roman Catholics (24.7 percentage points), defendants with partial Jewish ancestry (34.8 percentage points), juveniles (23.4 percentage points), the unemployed (4.9 percentage points) and foreigners (42.3 percentage points) to death.’

Judges who came of age during two distinct historical periods (the Revolution of 1918-19 and the hyperinflation of June 1921 to January 1924), periods that may have shaped their views of Nazism, were also more likely to impose the death sentence.

Alte Kampfer members whose hometown or suburb lay near a centre of the Revolution of 1918-19 were more likely to sentence a defendant to death.

Previous economic research on sentencing in capital cases has focused mainly on gender and racial disparities, typically in the United States. But the understanding of what determines whether courts in modern authoritarian regimes outside the United States impose the death penalty is scant. By studying a politicised court in an historically important authoritarian state, the authors of the new study shed light on sentencing more generally in authoritarian states.

The findings are important because they offer insights into the practical realities of judicial empowerment, providing rare empirical evidence on how the exercise of judicial discretion in authoritarian states is reflected in sentencing outcomes.

To contact the authors:
Russell Smyth (russell.smyth@monash.edu)

THE ‘WITCH CRAZE’ OF 16th & 17th CENTURY EUROPE: Economists uncover religious competition as driving force of witch hunts

“The Pendle Witches”. Available at https://www.theanneboleynfiles.com/witchcraft-in-tudor-and-stuart-times/

Economists Peter Leeson (George Mason University) and Jacob Russ (Bloom Intelligence) have uncovered new evidence to resolve the longstanding puzzle posed by the ‘witch craze’ that ravaged Europe in the sixteenth and seventeenth centuries and resulted in the trial and execution of tens of thousands for the dubious crime of witchcraft.

 

In research forthcoming in the Economic Journal, Leeson and Russ argue that the witch craze resulted from competition between Catholicism and Protestantism in post-Reformation Christendom. For the first time in history, the Reformation presented large numbers of Christians with a religious choice: stick with the old Church or switch to the new one. And when churchgoers have religious choice, churches must compete.

In an effort to woo the faithful, competing confessions advertised their superior ability to protect citizens against worldly manifestations of Satan’s evil by prosecuting suspected witches. Similar to how Republicans and Democrats focus campaign activity in political battlegrounds during US elections to attract the loyalty of undecided voters, Catholic and Protestant officials focused witch trial activity in religious battlegrounds during the Reformation and Counter-Reformation to attract the loyalty of undecided Christians.

Analysing new data on more than 40,000 suspected witches whose trials span Europe over more than half a millennium, Leeson and Russ find that when and where confessional competition, as measured by confessional warfare, was more intense, witch trial activity was more intense too. Furthermore, factors such as bad weather, formerly thought to be key drivers of the witch craze, were not in fact important.

The new data reveal that the witch craze took off only after the Protestant Reformation in 1517, following the new faith’s rapid spread. The craze reached its zenith between around 1555 and 1650, years co-extensive with peak competition for Christian consumers, evidenced by the Catholic Counter-Reformation, during which Catholic officials aggressively pushed back against Protestant successes in converting Christians throughout much of Europe.

Then, around 1650, the witch craze began its precipitous decline, with prosecutions for witchcraft virtually vanishing by 1700.

What happened in the middle of the seventeenth century to bring the witch craze to a halt? The Peace of Westphalia, a treaty concluded in 1648, ended decades of European religious warfare and much of the confessional competition that had motivated it, by creating permanent territorial monopolies for Catholics and Protestants – regions of exclusive control in which one confession was protected from the competition of the other.

The new analysis suggests that the witch craze should also have been focused geographically, concentrated where Catholic-Protestant rivalry was strongest and rarer where it was weakest. And indeed it was: Germany alone, which was ground zero for the Reformation, laid claim to nearly 40% of all witchcraft prosecutions in Europe.

In contrast, Spain, Italy, Portugal and Ireland – each of which remained a Catholic stronghold after the Reformation and never saw serious competition from Protestantism – collectively accounted for just 6% of Europeans tried for witchcraft.

Religion, it is often said, works in unexpected ways. The new study suggests that the same can be said of competition between religions.

 

To contact the authors:  Peter Leeson (PLeeson@GMU.edu)

From VoxEU – Wellbeing inequality in retrospect

Rising trends in GDP per capita are often interpreted as reflecting rising levels of general wellbeing. But GDP per capita is at best a crude proxy for wellbeing, neglecting important qualitative dimensions.

via Wellbeing inequality in retrospect — VoxEU.org: Recent Articles

To elaborate further on the topic, Prof. Leandro Prados de la Escosura has made available several databases on inequality, accessible here, as well as a book on long-term Spanish economic growth, available in open access here.

 

Perpetuating the family name: female inheritance, in-marriage and gender norms

by Duman Bahrami-Rad (Simon Fraser University)

Tartanspartan: Muslim wedding, Lahore, Pakistan — Frank Horvat, 1952. Available on Pinterest <https://www.pinterest.co.uk/pin/491947959265621479/>

Why is it so common for Muslims to marry their cousins (more than 30% of all marriages in the Middle East)? Why, despite explicit injunctions in the Quran to include women in inheritance, do women in the Middle East generally face unequal gender relations, and why does their labour force participation remain the lowest in the world (less than 20%)?

This study presents a theory, supported by empirical evidence, concerning the historical origins of such marriage and gender norms. It argues that in patrilineal societies that nevertheless mandate female inheritance, cousin marriage becomes a way to preserve property in the male line and prevent fragmentation of land.

In these societies, female inheritance also leads to the seclusion and veiling of women as well as restrictions on their sexual freedom in order to encourage cousin marriages and avoid out-of-wedlock children as potential heirs. The incompatibility of such restrictions with female participation in agriculture has further influenced the historical gender division of labour.

Analyses of data on pre-industrial societies, Italian provinces and women in Indonesia show that, consistent with these hypotheses, female inheritance is associated with lower female labour force participation, greater stress on female virginity before marriage, and higher rates of endogamy, consanguinity and arranged marriage.

The study also uses the recent reform of inheritance regulations in India – which greatly enhanced Indian women’s right to inherit property – to provide further evidence of the causal impact of female inheritance. The analysis shows that among women affected by the reform, the rate of cousin marriage is significantly higher, and that of premarital sex significantly lower.

The implications of these findings are important. It is believed that cousin marriage helps create and maintain kinship groups such as tribes and clans, which impair the development of an individualistic social psychology, undermine social trust, large-scale cooperation and democratic institutions, and encourage corruption and conflict.

This study contributes to this literature by highlighting a historical origin of clannish social organisation. It also sheds light on the origins of gender inequality as both a human rights issue and a development issue.

Land reform and agrarian conflict in 1930s Spain

Jordi Domènech (Universidad Carlos III de Madrid) and Francisco Herreros (Institute of Public Goods and Policies, Spanish National Research Council, CSIC)

Government intervention in land markets is always fraught with potential problems. Intervention generates clearly demarcated groups of winners and losers as land is the main asset owned by households in predominantly agrarian contexts. Consequently, intervention can lead to large, generally welfare-reducing changes in the behaviour of the main groups affected by reform, and to policies being poorly targeted towards potential beneficiaries.

In this paper (available here), we analyse the impact of tenancy reform in the early 1930s on Spanish land markets. Adapting general laws to local and regional variation in land tenure patterns and to heterogeneity in rural contracts was one of the problems of agricultural policy in 1930s Spain. In the case of Catalonia, the interest lies in the adaptation of a centralized tenancy reform, aimed at fixed-rent contracts, to the sharecropping contracts that predominated in Catalan agriculture. This was especially true of sharecropping contracts on vineyards, notably the customary rabassa morta contract, which had been subject to various legal changes in the late eighteenth and early nineteenth centuries. The 1930s are widely seen as the culmination of a long period of conflict between the so-called rabassaires (sharecroppers under rabassa morta contracts) and landowners.

The division between landowners and tenants was one of the central cleavages of twentieth-century Catalonia, even in an area that had seen substantial industrialization. In the early 1920s, work started on a Catalan law of rural contracts, aimed especially at sharecroppers. A law passed on 21 March 1934 allowed the renegotiation of existing rural contracts and prohibited the eviction of tenants who had held the same contract for less than six years. More importantly, it opened the door to forced sales of land to long-term tenants. Such legislative changes posed a threat to the status quo, and the Spanish Constitutional Court ruled the law unconstitutional.

The comparative literature on the impact of land reforms argues that land reform, in this case tenancy reform, can in fact change agrarian structures. When property rights are threatened, landowners react by selling land or interrupting existing tenancy contracts, mechanizing and hiring labourers. Agrarian structure is therefore endogenous to existing threats to property rights. The extent of insecurity in property rights in 1930s Catalonia can be seen in the wave of litigation over sharecropping contracts: over 30,000 contracts were revised in the courts in late 1931 and 1932, a wave that provoked satirical cartoons (Figure 1).

Figure 1. Revisions and the share of the harvest. Source: L’Esquella de la Torratxa, 2nd August 1932, p. 11.
Translation: The rabassaire question. Peasant: ‘You sweat coming here to claim your part of the harvest; you would sweat even more if you had to grow it yourself.’

The first wave of petitions to revise contracts was largely unsuccessful, with most petitions nullified by the courts. This was most pronounced in the Spanish Supreme Court, which ruled against the sharecropper in most of the roughly 30,000 petitions for contract revision. Nonetheless, sharecroppers were protected by the Catalan autonomous government. The political context in which the Catalan government operated became even more charged in October 1934. That month, with signs that the Centre-Right government was moving towards more reactionary positions, the Generalitat participated in a rebellion orchestrated by the Spanish Socialist Party (PSOE) and Left Republicans. It was in this context of suspended civil liberties that landowners had a freer hand to evict unruly peasants. The fact that some sharecroppers did not surrender their harvest meant they could be evicted straight away under the new rules set by the new military governor of Catalonia.

We use the number of completed and initiated tenant evictions from October 1934 to around mid-1935 as the main dependent variable in the paper. Data were collected from a report produced by the main Catalan tenant union, Unió de Rabassaires (Rabassaires’ Union), published in late 1935 to publicize and denounce tenant evictions and attempts to evict tenants.

Combining the spatial analysis of eviction cases with individual information on evictors and evicted, we can be reasonably confident about several facts concerning evictions and terminated contracts in 1930s Catalonia. Our data show that rabassa morta legacies were not the main determinant of evictions. About 6 per cent of terminated contracts were open-ended rabassa morta contracts (arbitrarily set at 150 years in the graph). About 12 per cent of evictions were linked to contracts longer than 50 years, which were probably oral contracts (since Spanish legislation set a maximum of 50 years). Figure 2 gives the contract lengths of terminated and threatened contracts.

Figure 2. Histogram of contract lengths. Source: Own elaboration from Unió de Rabassaires, Els desnonaments rústics.

The spatial distribution of evictions is also consistent with the lack of historical legacies of conflict. Evictions were not more common in historical rabassa morta areas, nor were they typical of areas with a larger share of land planted with vines.

Our study provides a substantial revision of claims by unions and historians about very high levels of conflict in the Catalan countryside during the Second Republic. In many cases, there had been a long process of adaptation and fine-tuning of contractual forms to crops and to soil and climatic conditions, which increased the costs of altering existing institutional arrangements.

To contact the authors:

jdomenec@clio.uc3m.es

francisco.herreros@csic.es

EHS 2018 special: London’s mortality decline – lessons for modern water policy

Werner Troesken (University of Pittsburgh)
Nicola Tynan (Dickinson College)
Yuanxiaoyue (Artemis) Yang (Harvard T.H. Chan School of Public Health)

 

The United Nations Sustainable Development Goals aim to ensure access to water and sanitation for all. This means not just treating water but supplying it reliably. Lives are at stake because epidemiological research shows that a reliable, constant supply of water reduces water-borne illness.

Available at <https://heartheboatsing.com/2015/08/13/death-on-the-water/>

Nineteenth century London faced the same challenge. Not until 1886 did more than half of London homes have water supplied 24 hours a day, 7 days a week. The move to a constant water supply reduced mortality. For every 5% increase in the number of households with a constant supply, deaths from water-borne illnesses fell 3%.

During Victoria’s reign, eight water companies supplied the metropolis with water: 50% from the river Thames, 25% from the river Lea and 25% from wells and springs. By the 1860s, the companies filtered all surface water and Bazalgette’s intercepting sewer was under construction. Still, more than 80% of people received water intermittently, storing it in cisterns often located outside the house, uncovered or beside the toilet.

Rapid population and housing growth required the expansion of the water network and companies found it easier to introduce constant service in new neighbourhoods. Retrofitting older neighbourhoods proved challenging and risked a substantial waste of scarce water. The Metropolis Water Act of 1871 finally gave water companies the power to require waste-limiting fixtures. After 1871, new housing estates received a constant supply of water immediately, while old neighbourhoods transitioned slowly.

As constant water supply reached more people, mortality from diarrhoea, dysentery, typhoid and cholera combined fell. With 24-hour supply, water was regularly available for everyone without risk of contamination. Unsurprisingly, poorer, crowded districts had higher mortality from water-borne diseases.

Even though treated piped water was available to all by the mid-nineteenth century, everyone benefitted from the move to constant service. By the time the Metropolitan Water Board acquired London’s water infrastructure, 95% of houses in the city received their water directly from the mains.

According to Sergio Campus, water and sanitation head at the Inter-American Development Bank, the current challenge in many places is providing a sustainable and constant supply of water. In line with this, the World Bank’s new Water Supply, Sanitation, and Hygiene (WASH) poverty diagnostic has added frequency of delivery as a measure of water quality, in addition to access, water source and treatment.

Regularity of supply varies substantially across locations. London’s experience during the late Victorian years suggests that increased frequency of water supply has the potential to deliver further reductions in mortality in developing countries, beyond the initial gains from improved water sources and treatment.

EHS 2018 special: How the Second World War promoted racial integration in the American South

by Andreas Ferrara (University of Warwick)

African American and White Employees Working Together during WWII. Available at <https://www.pinterest.com.au/pin/396950154628232921/>

European politicians face the challenge of integrating the 1.26 million refugees who arrived in 2015. Integration into the labour market is often discussed as key to social integration but empirical evidence for this claim is sparse.

My research contributes to the debate with a historical example from the American South where the Second World War increased the share of black workers in semi-skilled jobs such as factory work, jobs previously dominated by white workers.

I combine census and military records to show that the share of black workers in semi-skilled occupations in the American South increased as they filled vacancies created by wartime casualties among semi-skilled whites.

A fallen white worker in a semi-skilled occupation was replaced by 1.8 black workers on average. This raised the share of African Americans in semi-skilled jobs by 10% between 1940 and 1950.

Survey data from the South in 1961 reveal that this increased integration in the workplace led to improved social relations between black and white communities outside the workplace.

Individuals living in counties where war casualties brought more black workers into semi-skilled jobs between 1940 and 1950 were 10 percentage points more likely to have an interracial friendship, 6 percentage points more likely to live in a mixed-race neighbourhood, and 11 percentage points more likely to favour integration over segregation in general, as well as at school and at church. These positive effects are reported by both black and white respondents.

Additional analysis using county-level church membership data from 1916 to 1971 shows similar results. Counties where wartime casualties resulted in a more racially integrated labour force saw a 6 percentage point rise in the membership shares of churches that had already held mixed-race services before the war.

The church-related results are especially striking. In several of his speeches, Dr Martin Luther King stated that 11am on Sunday is the most segregated hour in American life. And yet my analysis shows that workplace exposure between two groups can overcome even strongly embedded social divides such as churchgoing, which is particularly important in the South, the so-called Bible Belt.

This historical case study of the American South in the mid-twentieth century, where race relations were often tense, demonstrates that excluding refugees from the workforce may be ruling out a promising channel for integration.

Currently, almost all European countries forbid refugees from participating in the labour market. Arguments put forward to justify this include fear of competition for jobs, concern about downward pressure on wages and a perceived need to deter economic migration.

While the mid-twentieth century American South is not Europe, the policy implication is to experiment more extensively with social integration through workplace integration measures. This not only concerns the refugee case but any country with socially and economically segregated minority groups.