Missing girls in 19th-century Spain

by Francisco J. Beltrán Tapia (Norwegian University of Science and Technology)

This article is published by the Economic History Review, and it is available here

Gender discrimination, in the form of sex-selective abortion, female infanticide and the mortal neglect of young girls, is a pervasive feature of many contemporary developing countries, especially in South and East Asia and Africa. Son preference stems from economic and cultural factors that have long shaped the perceived relative value of women in these regions, and it has resulted in millions of “missing girls”. But were there “missing girls” in historical Europe? The conventional narrative argues that there is little evidence for this kind of gender discrimination: according to this view, the European household formation system, together with prevailing ethical and religious values, limited female infanticide and the mortal neglect of young girls.

However, several studies suggest that parents treated their sons and daughters differently in 19th-century Britain and continental Europe (see, for instance, here, here or here). These authors stress that an unequal allocation of food, care and/or workload harmed girls’ nutritional status and morbidity, which translated into shorter heights and higher mortality rates. In order to provide more systematic historical evidence of this type of behaviour, our research (with Domingo Gallego-Martínez) relies on sex ratios at birth and at older ages. In the absence of gender discrimination, the number of boys per hundred girls in different age groups is remarkably regular, so comparing the observed figure with the expected (gender-neutral) sex ratio allows us to assess the cumulative impact of gender bias in peri-natal, infant and child mortality and, consequently, the importance of potential discriminatory practices. However, although non-discriminatory sex ratios at birth hover around 105-106 boys per hundred girls in most developed countries today, historical sex ratios cannot be compared directly with modern ones.

We have shown here that non-discriminatory infant and child sex ratios were much lower in the past. The biological survival advantage of girls was more visible in the high-mortality environments that characterised pre-industrial Europe, owing to poor living conditions, lack of hygiene and the absence of public health systems. Consequently, boys suffered relatively higher mortality rates both in utero and during infancy and childhood. Historical infant and child sex ratios were therefore relatively low, even in the presence of gender-discriminatory practices. This is illustrated in Figure 1 below, which plots the relationship between child sex ratios and infant mortality rates using information from seventeen European countries between 1750 and 2001. In particular, in societies where infant mortality rates were around 250 deaths per 1,000 live births, a gender-neutral child sex ratio should have been slightly below parity (around 99.5 boys per hundred girls).

Figure 1. Infant mortality rates and child sex ratios in Europe, 1750-2001
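As a rough illustration of how the benchmark in Figure 1 is used, the sketch below compares an observed child sex ratio with a gender-neutral expectation that falls as infant mortality rises. The linear form and the low-mortality anchor point are assumptions made here purely for exposition; only the figure of 99.5 boys per hundred girls at an infant mortality rate of 250 per 1,000 comes from the text above.

```python
# Minimal sketch: compare an observed child sex ratio with a gender-neutral
# benchmark that falls as infant mortality rises (as in Figure 1).
# The linear form and the low-mortality anchor are assumptions for
# illustration; the 99.5-boys-per-100-girls benchmark at an infant
# mortality rate of 250 per 1,000 comes from the text.

def expected_child_sex_ratio(imr_per_1000: float) -> float:
    """Gender-neutral child sex ratio (boys per 100 girls) for a given
    infant mortality rate, from a straight line through two anchor points."""
    low_imr, low_ratio = 5.0, 104.0     # assumed modern, low-mortality anchor
    high_imr, high_ratio = 250.0, 99.5  # benchmark cited in the text
    slope = (high_ratio - low_ratio) / (high_imr - low_imr)
    return low_ratio + slope * (imr_per_1000 - low_imr)

def excess_ratio(observed_ratio: float, imr_per_1000: float) -> float:
    """Observed minus expected ratio: positive values suggest excess
    female mortality (or under-registration of girls)."""
    return observed_ratio - expected_child_sex_ratio(imr_per_1000)

# Example: a district with 115 boys per 100 girls and an infant mortality
# rate of 250 per 1,000 sits far above the gender-neutral benchmark.
print(round(expected_child_sex_ratio(250), 1))   # ~99.5
print(round(excess_ratio(115, 250), 1))          # ~15.5
```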

 

Compared with this benchmark, infant and child sex ratios in 19th-century Spain were abnormally high (see the black dots in Figure 1 above; the number refers to the year of the observation), suggesting that some sort of gender discrimination was unduly increasing female mortality rates at those ages. This pattern, which is not the result of under-enumeration of girls in the censuses, mostly disappeared at the turn of the 20th century. Although average sex ratios remained relatively high in nineteenth-century Spain, some regions exhibited even more extreme figures: in 1860, 54 districts (out of 471) had infant sex ratios above 115, figures that are extremely unlikely to have occurred by chance. Relying on a rich district-level dataset, our research analyses this regional variation in order to examine what lies behind the unbalanced sex ratios. Our results show that the presence of wage labour opportunities for women, and the prevalence of extended families in which different generations of women cohabited, had beneficial effects on girls’ survival. Likewise, infant and child sex ratios were lower in denser, more urbanised areas.

This evidence thus suggests that discriminatory practices with lethal consequences for girls were a veiled feature of pre-industrial Spain. Excess female mortality was not necessarily the result of outright ill-treatment of young girls; it could have arisen simply from an unequal allocation of resources within the household, a disadvantage that probably accumulated as infants grew older. In contexts where infant and child mortality was high, slight discrimination in how young girls were fed or treated when ill, or in the amount of work they were entrusted with, was likely to result in more girls dying from the combined effect of undernutrition and illness. Although female infanticide or other extreme forms of mistreatment of young girls may not have been a systematic feature of historical Europe, this line of research points to more passive, but pervasive, forms of gender discrimination that also resulted in a significant fraction of missing girls.

To contact the author:

francisco.beltran.tapia@ntnu.no

Twitter: @FJBeltranTapia

How well off were the occupants of early modern almshouses?

by Angela Nicholls (University of Warwick).

Almshouses in Early Modern England is published by Boydell Press. SAVE 25% when you order direct from the publisher – offer ends on the 13th December 2018. See below for details.


Almshouses, charitable foundations providing accommodation for poor people, are a feature of many towns and villages. Some are very old, with their roots in medieval England as monastic infirmaries for the sick, pilgrims and travellers, or as chantries offering prayers for the souls of their benefactors. Many survived the Reformation to be joined by a remarkable number of new foundations between around 1560 and 1730. For many of these, the principal purpose was memorialisation and display: they were tangible representations of the philanthropy of their wealthy donors. But they are also some of the few examples of poor people’s housing to have survived from the early modern period, so can they tell us anything about the material lives of the people who lived in them?

Paul Slack famously referred to almspeople as ‘respectable, gowned Trollopian worthies’, and there are many examples to justify that view, for instance Holy Cross Hospital, Winchester, refounded in 1445 as the House of Noble Poverty. But these are not typical. Nevertheless, many early modern almshouse buildings are instantly recognisable, with the ubiquitous row of chimneys often the first indication of the identity of the building.

 

Burghley Almshouses, Stamford (1597)

 

Individual chimneys and, often, separate front doors are evidence of private domestic space, far removed from the communal halls of the earlier medieval period or the institutional dormitories of the nineteenth-century workhouses which came later. Accommodating almspeople in their own rooms was not just a reflection of general changes in domestic architecture at the time, which placed greater emphasis on comfort and privacy, but represented a change in how almspeople were viewed and how they were expected to live their lives. Instead of living communally with meals provided, in the majority of post-Reformation almshouses the residents would have lived independently, buying their own food, cooking it themselves on their own hearth and eating it by themselves in their rooms. The hearth mattered not only as the practical means of heating and cooking, but also as something central to questions of identity and social status. Together with individual front doors, these features gave occupants a degree of independence and autonomy; they enabled almspeople to live independently despite their economic dependence, and to adopt the appearance, if not the reality, of independent householders.

 

Stoneleigh Old Almshouses, Warwickshire (1576)

 

The retreat from communal living also meant that almspeople had to support themselves rather than have all their needs met by the almshouse. This was achieved in many places by a transition to monetary allowances or stipends with which almspeople could purchase their own food and necessities, but the existence and level of these stipends varied considerably. Late medieval almshouses often specified an allowance of a penny a day, which would have provided a basic but adequate living in the fifteenth century, but was seriously eroded by sixteenth-century inflation. Thus when Lawrence Sheriff, a London mercer, established in 1567 an almshouse for four poor men in his home town of Rugby, his will gave each of them the traditional penny a day, or £1 10s 4d a year. Yet with inflation, if these stipends were to match the real value of their late-fifteenth-century counterparts, his almsmen would actually have needed £4 5s 5d a year.[1]

The nationwide system of poor relief established by the Tudor Poor Laws, and the survival of poor relief accounts from many parishes by the late seventeenth century, provide an opportunity to see the actual amounts disbursed in relief by overseers of the poor to parish paupers. From the level of payments made to elderly paupers no longer capable of work, it is possible to calculate the barest minimum which an elderly person living rent free in an almshouse might have needed to feed and clothe themselves and keep warm.[2] Such a subsistence level in the 1690s equates to an annual sum of £3 17s, which can be adjusted for inflation and compared with a range of known almshouse stipends from the late sixteenth and seventeenth centuries.
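The stipend comparison rests on pre-decimal arithmetic (12 pence to the shilling, 20 shillings to the pound) plus an inflation adjustment. A minimal sketch of that calculation follows; reckoning the penny a day over 52 weeks of 7 days reproduces the £1 10s 4d quoted above, while the inflation multiplier below is a placeholder for illustration rather than the Phelps Brown and Hopkins index actually used in the research.

```python
# Minimal sketch of the pre-decimal arithmetic behind the stipend comparison:
# 12 pence (d) to the shilling (s), 20 shillings (s) to the pound (£).
# The inflation multiplier is a placeholder for illustration only.

def pence_to_lsd(pence: int) -> str:
    pounds, rem = divmod(pence, 240)   # 240d = £1
    shillings, d = divmod(rem, 12)     # 12d = 1s
    return f"£{pounds} {shillings}s {d}d"

# Lawrence Sheriff's penny a day, reckoned over 52 weeks of 7 days:
annual_pence = 1 * 7 * 52
print(pence_to_lsd(annual_pence))          # £1 10s 4d, as in the text

# Adjusting the penny-a-day stipend for price inflation since the late
# fifteenth century (illustrative multiplier, not the published index):
assumed_inflation_factor = 2.8
print(pence_to_lsd(round(annual_pence * assumed_inflation_factor)))
```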

The results of this comparison are interesting, even surprising. Using data from 147 known almshouse stipends in six different counties (Durham, Yorkshire, Norfolk, Warwickshire, Buckinghamshire and Kent) it seems that less than half of early modern almshouses provided their occupants with stipends which were sufficient to live on. Many provided no financial assistance at all.


The inescapable conclusion is that the benefits provided to early modern almspeople were in many cases only a contribution towards their subsistence. In this respect almshouse occupants were no different from the recipients of parish poor relief, who rarely had their living costs met in full.

Yet even in the poorer establishments, almshouse residents had distinct advantages over other poor people: principally the security of their accommodation, the permanence and regularity of any financial allowance, no matter how small, and the autonomy this gave them. Almshouse residents may also have had an enhanced status as ‘approved’, deserving poor. The location of many almshouses, beside the church, in the high street, or next to the guildhall, seems to have been purposely designed to solicit alms from passers-by at a time when begging was officially discouraged.

SAVE 25% when you order direct from the publisher. Discount applies to print and eBook editions. Click the link, add to basket and enter offer code BB500 in the box at the checkout. Alternatively call Boydell’s distributor, Wiley, on 01243 843 291 and quote the same code. Offer ends one month after the date of upload. Any queries please email marketing@boydell.co.uk

 

NOTES

[1] Inflation index derived from H. Phelps Brown and S. V. Hopkins, A Perspective of Wages and Prices (London and New York, 1981) pp. 13-59.

[2] L. A. Botelho, Old Age and the English Poor Law, 1500 – 1700 (Woodbridge, 2004) pp. 147-8.

Revisiting the changing body

by Bernard Harris (University of Strathclyde)

The Society has arranged with CUP that a 20% discount is available on this book, valid until the 11th November 2018. The discount page is: www.cambridge.org/wm-ecommerce-web/academic/landingPage/EHS20

The last century has witnessed unprecedented improvements in survivorship and life expectancy. In the United Kingdom alone, infant mortality fell from over 150 deaths per thousand births at the start of the last century to 3.9 deaths per thousand births in 2014 (see the Office for National Statistics for further details). Average life expectancy at birth increased from 46.3 to 81.4 years over the same period (see the Human Mortality Database). These changes reflect fundamental improvements in diet and nutrition and in environmental conditions.

The changing body: health, nutrition and human development in the western world since 1700 attempted to understand some of the underlying causes of these changes. It drew on a wide range of archival and other sources covering not only mortality but also height, weight and morbidity. One of our central themes was the extent to which long-term improvements in adult health reflected the beneficial effect of improvements in earlier life.

The changing body also outlined a very broad schema of ‘technophysio evolution’ to capture the intergenerational effects of investments in early life. This is represented in a very simple way in Figure 1. The Figure tries to show how improvements in the nutritional status of one generation increase its capacity to invest in the health and nutritional status of the next generation, and so on ‘ad infinitum’ (Floud et al. 2011: 4).

Figure 1. Technophysio evolution: a schema. Source: See Floud et al. 2011: 3-4.

We also looked at some of the underlying reasons for these changes, including the role of diet and ‘nutrition’. As part of this process, we included new estimates of the number of calories which could be derived from the amount of food available for human consumption in the United Kingdom between circa 1700 and 1913. However, our estimates contrasted sharply with others published at the same time (Muldrew 2011) and were challenged by a number of other authors subsequently. Broadberry et al. (2015) thought that our original estimates were too high, whereas both Kelly and Ó Gráda (2013) and Meredith and Oxley (2014) regarded them as too low.

Given the importance of these issues, we revisited our original calculations in 2015. We corrected an error in the original figures, used Overton and Campbell’s (1996) data on extraction rates to recalculate the number of calories, and included new information on the importation of food from Ireland to other parts of what became the UK. Our revised Estimate A suggested that the number of calories rose by just under 115 calories per head per day between 1700 and 1750 and by more than 230 calories between 1750 and 1800, with little change between 1800 and 1850. Our revised Estimate B suggested that there was a much bigger increase during the first half of the eighteenth century, followed by a small decline between 1750 and 1800 and a bigger increase between 1800 and 1850 (see Figure 2). However, both sets of figures were still well below the estimates prepared by Kelly and Ó Gráda, Meredith and Oxley, and Muldrew for the years before 1800.

Figure 2. Source: Harris et al. 2015: 160.
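In outline, a food-availability calorie estimate multiplies the quantity of each foodstuff available for consumption by an extraction (processing) rate and a calorie content, sums across foodstuffs, and divides by population and days in the year. The sketch below shows only the shape of that calculation: the foodstuffs, quantities, extraction rates, calorie values and population are invented for illustration and are not the figures used in Harris et al. (2015).

```python
# Shape of a food-availability calorie estimate: for each foodstuff,
#   calories = quantity available * extraction rate * calories per unit,
# summed across foods and divided by population and days in the year.
# All numbers below are invented for illustration; they are NOT the
# quantities, extraction rates or calorie values in Harris et al. (2015).

FOODS = {
    # food: (million lbs available per year, extraction rate, kcal per lb)
    "wheat":    (5000.0, 0.75, 1500.0),
    "potatoes": (1500.0, 1.00, 350.0),
    "meat":     (800.0, 0.70, 1200.0),
}

def kcal_per_head_per_day(foods: dict, population_millions: float) -> float:
    total_kcal_millions = sum(
        qty * extraction * kcal for qty, extraction, kcal in foods.values()
    )
    return total_kcal_millions / population_millions / 365.0

print(round(kcal_per_head_per_day(FOODS, population_millions=9.0)))  # ~2077
```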

These calculations have important implications for a number of recent debates in British economic and social history (Allen 2005, 2009). Our data do not necessarily resolve the debate over whether Britons were better fed than people in other countries, although they do compare quite favourably with relevant French estimates (see Floud et al. 2011: 55). However, they do suggest that a significant proportion of the eighteenth-century population was likely to have been underfed.
Our data also raise some important questions about the relationship between nutrition and mortality. Our revised Estimate A suggests that food availability rose slowly between 1700 and 1750 and then more rapidly between 1750 and 1800, before levelling off between 1800 and 1850. These figures are still broadly consistent with Wrigley et al.’s (1997) estimates of the main trends in life expectancy and our own figures for average stature. However, it is not enough simply to focus on averages; we also need to take account of possible changes in the distribution of foodstuffs within households and the population more generally (Harris 2015). Moreover, it is probably a mistake to examine the impact of diet and nutrition independently of other factors.

To contact the author: bernard.harris@strath.ac.uk

References

Allen, R. (2005), ‘English and Welsh agriculture, 1300-1850: outputs, inputs and income’. URL: https://www.nuffield.ox.ac.uk/media/2161/allen-eandw.pdf.

Allen, R. (2009), The British industrial revolution in global perspective, Cambridge: Cambridge University Press.

Broadberry, S., Campbell, B., Klein, A., Overton, M. and Van Leeuwen, B. (2015), British economic growth, 1270-1870, Cambridge: Cambridge University Press.

Floud, R., Fogel, R., Harris, B. and Hong, S.C. (2011), The changing body: health, nutrition and human development in the western world since 1700, Cambridge: Cambridge University Press.

Harris, B. (2015), ‘Food supply, health and economic development in England and Wales during the eighteenth and nineteenth centuries’, Scientia Danica, Series H, Humanistica, 4 (7), 139-52.

Harris, B., Floud, R. and Hong, S.C. (2015), ‘How many calories? Food availability in England and Wales in the eighteenth and nineteenth centuries’, Research in Economic History, 31, 111-91.

Kelly, M. and Ó Gráda, C. (2013), ‘Numerare est errare: agricultural output and food supply in England before and during the industrial revolution’, Journal of Economic History, 73 (4), 1132-63.

Meredith, D. and Oxley, D. (2014), ‘Food and fodder: feeding England, 1700-1900’, Past and Present, 222, 163-214.

Muldrew, C. (2011), Food, energy and the creation of industriousness: work and material culture in agrarian England, 1550-1780, Cambridge: Cambridge University Press.

Overton, M. and Campbell, B. (1996), ‘Production et productivité dans l’agriculture anglaise, 1086-1871’, Histoire et Mesure, 1 (3-4), 255-97.

Wrigley, E.A., Davies, R., Oeppen, J. and Schofield, R. (1997), English population history from family reconstitution, Cambridge: Cambridge University Press.

Surprisingly gentle confinement

Tim Leunig (LSE), Jelle van Lottum (Huygens Institute) and Bo Poulsen (Aalborg University) have been investigating the treatment of prisoners of war in the Napoleonic Wars.

 

Napoleonic Prisoner of War. Available at <https://blog.findmypast.com.au/explore-our-fascinating-new-napoleonic-prisoner-of-war-records-1406376311.html>

For most of history, life as a prisoner of war was nasty, brutish and short. There were no regulations on the treatment of prisoners until the 1899 Hague Convention and the later Geneva Conventions. Many prisoners were killed immediately; others were enslaved to work in mines and other undesirable places.

The poor treatment of prisoners of war was partly intentional – they were the hated enemy, after all. And partly it was economic. It costs money to feed and shelter prisoners. Countries in the past – especially in times of war and conflict – were much poorer than today.

Nineteenth-century prisoner death rates were horrific. Between one-half and six-sevenths of the 17,000 of Napoleon’s troops who surrendered to the Spanish in 1808 after the Battle of Bailén died as prisoners of war. The American Civil War saw death rates rise to 27%, even though the average prisoner was captive for less than a year.

The Napoleonic Wars saw the British capture 7,000 Danish and Norwegian sailors, military and merchant. Britain did not desire war with Denmark (which ruled Norway at the time), but went to war to prevent Napoleon from seizing the Danish fleet. Prisoners were incarcerated on old, unseaworthy “prison hulks”, moored in the Thames Estuary near Rochester. Conditions were crowded: each man was given just 2 feet (60 cm) in width to hang his hammock.

Were these prison hulks floating tombs, as some contemporaries claimed? Our research shows otherwise. The Admiralty kept exemplary records, now held in the National Archives in Kew. These show the date of arrival in prison, and the date of release, exchange, escape – or death. They also tell us the age of the prisoner, where they came from, the type of ship they served on, and whether they were an officer, craftsman or regular sailor. We can use these records to look at how many died, and why.

The prisoners ranged in age from 8 to 80, with half aged 22 to 35. The majority sailed on merchant vessels, with a sixth on military vessels and a quarter on licensed pirate boats, permitted to harass British shipping. The amount of time in prison varied dramatically, from 3 days to over 7 years, with an average of 31 months. About two thirds were released before the end of the war.

Taken as a whole, 5% of prisoners died. This is a remarkably low number, given how long they were held, and given experience elsewhere in the nineteenth century. Being held prisoner for longer increased your chance of dying, but not by much: those who spent three years on a prison hulk had only a 1% greater chance of dying than those who served just one year.

Death was (almost) random. Being captured at the start of the war was neither better nor worse than being captured at the end. The number of prisoners held at any one time did not increase the death rate. The old were no more likely to die than the young – anyone fit enough to go to sea was fit enough to withstand the rigours of prison life. Despite extra space and better rations, officers were no less likely to die, implying that conditions were reasonable for common sailors.

There is only one exception: sailors from licensed pirate boats were twice as likely to die as merchant or official navy sailors. We cannot know the reason. Perhaps they were treated less well by their guards, or by other prisoners. Perhaps they were risk takers, who gambled away their rations. Even for this group, however, the death rates were very low compared with those captured in other places, and in other wars.
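The kind of question asked of the Admiralty registers — did time held, age or ship type raise the chance of dying? — can be illustrated with a logistic regression on individual records. The sketch below runs such a regression on randomly generated toy data whose patterns loosely mimic the findings described (low baseline mortality, a weak effect of time held, no effect of age, roughly doubled odds for crews of the licensed pirate boats, coded `privateer` here); the variable names and coding are assumptions, not the authors’ actual dataset or specification.

```python
# Illustrative sketch only: a logistic regression of death in captivity on
# time held, age and ship type, in the spirit of the analysis described.
# The data are randomly generated; variable names and coding are assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 7000
months_held = rng.integers(1, 90, n)     # 3 days to 7+ years, in months
age = rng.integers(8, 80, n)             # age plays no role in the simulation
privateer = rng.integers(0, 2, n)        # 1 = licensed pirate boat crew

# Simulated outcome: low baseline mortality, slightly rising with time held,
# and roughly doubled odds for privateer crews.
logit_p = -3.3 + 0.004 * months_held + 0.7 * privateer
died = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([months_held, age, privateer]))
result = sm.Logit(died, X).fit(disp=False)
print(result.summary(xname=["const", "months_held", "age", "privateer"]))
print(result.get_margeff().summary())    # average marginal effects
```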

The British had rules on prisoners of war, for food and hygiene. Each prisoner was entitled to 2.5 lbs (~1 kg) of beef, 1 lb of fish, 10.5 lbs of bread, 2 lbs of potatoes, 2.5lbs of cabbage, and 14 pints (8 litres) of (very weak) beer a week. This is not far short of Danish naval rations, and prisoners are less active than sailors. We cannot be sure that they received their rations in full every week, but the death rates suggest that they were not hungry in any systematic way. The absence of epidemics suggests that hygiene was also good. Remarkably, and despite a national debt that peaked at a still unprecedented 250% of GDP, the British appear to have obeyed their own rules on how to treat prisoners.

Far from being floating tombs, therefore, this was a surprisingly gentle confinement for the Danish and Norwegian sailors captured by the British in the Napoleonic Wars.

Small Bills and Petty Finance: co-creating the history of the Old Poor Law

by Alannah Tomkins (Keele University) 

Alannah Tomkins and Professor Tim Hitchcock (University of Sussex), won an AHRC award to investigate ‘Small Bills and Petty Finance: co-creating the history of the Old Poor Law’.  It is a three-year project from January 2018. The application was for £728K, which has been raised, through indexing, to £740K.  The project website can be found at: thepoorlaw.org.

 

Twice in my career I’ve been surprised by a brick – or more precisely by bricks, hurtling into my research agenda. In the first instance I found myself supervising a PhD student working on the historic use of brick as a building material in Staffordshire (from the sixteenth to the eighteenth centuries). The second time, the bricks snagged my interest independently.

The AHRC-funded project ‘Small bills and petty finance’ did not set out to look for bricks. Instead it promises to explore a little-used source for local history, the receipts and ‘vouchers’ gathered by parish authorities as they relieved or punished the poor, to write multiple biographies of the tradesmen and others who serviced the poor law. A parish workhouse, for example, exerted a considerable influence over a local economy when it routinely (and reliably) paid for foodstuffs, clothing, fuel and other necessaries. This influence or profit-motive has not been studied in any detail for the poor law before 1834, and vouchers’ innovative content is matched by an exciting methodology. The AHRC project calls on the time and expertise of archival volunteers to unfold and record the contents of thousands of vouchers surviving in the three target counties of Cumbria, East Sussex and Staffordshire. So where do the bricks come in?

The project started life in Staffordshire as a pilot in advance of AHRC funding. The volunteers met at the Stafford archives and started by calendaring the contents of vouchers for the market town of Uttoxeter, near the Staffordshire/Derbyshire border. And the Uttoxeter workhouse did not confine itself to accommodating and feeding the poor. Instead in the 1820s it managed two going concerns: a workhouse garden producing vegetables for use and sale, and a parish brickyard. Many parishes under the poor law embedded make-work schemes in their management of the resident poor, but no others that I’m aware of channelled pauper labour into the manufacture of bricks.


The workhouse and brickyard were located just to the north of the town of Uttoxeter, in an area known as The Heath. The land was subsequently used to build the Uttoxeter Union workhouse in 1837-8 (after the reform of the poor law in 1834), so no signs of the brickyard remain in the twenty-first century. It was, however, one of several such yards identified at The Heath in the tithe map for Uttoxeter of 1842, and it probably made use of a fixed kiln rather than a temporary clamp. This can be deduced from the parish’s sale of both bricks and tiles to brickyard customers: tiles were more refined products than bricks and required more control over the firing process, whereas clamp firings were more difficult to regulate. The yard provided periodic employment to the adult male poor of the Uttoxeter workhouse, in accordance with the seasonal pattern imposed on all brick manufacture at the time. Firings typically began in March or April each year and continued until September or October, depending on the weather.

This is important because the variety of vouchers relating to the parish brickyard allows us to understand something of its place in the town’s economy, both as a producer and as a consumer of other products and services. Brickyards needed coal, so it is no surprise that one of the major expenses for the support of the yard lay in bringing coal to the town from elsewhere via the canal. The Uttoxeter canal wharf was also at The Heath, and access to transport by water may explain the development of a number of brickyards in its proximity. The yard also required wood and other raw materials in addition to clay, and specific products to protect the bricks after cutting but before firing. The parish bought quantities of archangel mats, rough woven pieces that could be used like a modern protective fleece to guard against frost damage. We are surmising that Uttoxeter used the mats to cover both the bricks and any tender plants in the workhouse garden.


Similarly the bricks were sold chiefly to local purchasers, including members of the parish vestry. Some men who were owed money by the parish for their work as suppliers allowed the debt to be offset by bricks. Finally the employment of workhouse men as brickyard labourers gives us, when combined with some genealogical research, a rare glimpse of the place of workhouse work in the life-cycle of the adult poor. More than one man employed at the yard in the 1820s and 1830s went on to independence as a lodging-house keeper in the town by the time of the 1841 census.

As I say, I’ve been surprised by brick. I had no idea that such a mundane product would prove so engaging. All this goes to show that it’s not the stolidity of the brick but its deployment that matters, historically speaking.

 

To contact the author: a.e.tomkins@keele.ac.uk


Judges and the death penalty in Nazi Germany: New research evidence on judicial discretion in authoritarian states

The German People’s Court. Available at https://www.foreignaffairs.com/reviews/review-essay/good-germans

Do judicial courts in authoritarian regimes act as puppets for the interests of a repressive state – or do judges act with greater independence? How much do judges draw on their political and ideological affiliations when imposing the death sentence?

A study of Nazi Germany’s notorious People’s Court, recently published in the Economic Journal, reveals direct empirical evidence of how the judiciary in one of the world’s most notoriously politicised courts were influenced in their life-and-death decisions.

The research provides important empirical evidence that the political and ideological affiliations of judges do come into play – a finding that has applications for modern authoritarian regimes and also for democracies that administer the death penalty.

The research team – Dr Wayne Geerling (University of Arizona), Prof Gary Magee, Prof Russell Smyth, and Dr Vinod Mishra (Monash Business School) – explore the factors influencing the likelihood of imposing the death sentence in Nazi Germany for crimes against the state – treason and high treason.

The authors examine data compiled from official records of individuals charged with treason and high treason who appeared before the People’s Courts up to the end of the Second World War.

Established by the Nazis in 1934 to hear cases of serious political offences, the People’s Courts have been vilified as ‘blood tribunals’ in which judges meted out pre-determined sentences.

But in recent years a more nuanced assessment has emerged – one that does not contend that the People’s Court’s judgments were impartial, or that its judges were not subservient to the wishes of the regime.

For the first time, the new study presents direct empirical evidence of the reasons behind the use of judicial discretion and why some judges appeared more willing to implement the will of the state than others.

The researchers find that judges with a deeper ideological commitment to Nazi values – typified by being members of the Alte Kampfer (‘Old Fighters’ or early members of the Nazi party) – were indeed more likely to impose the death penalty than those who did not share it.

These judges were more likely to hand down death penalties to members of the most organised opposition groups, those involved in violent resistance against the state and ‘defendants with characteristics repellent to core Nazi beliefs’:

‘The Alte Kampfer were thus more likely to sentence devout Roman Catholics (24.7 percentage points), defendants with partial Jewish ancestry (34.8 percentage points), juveniles (23.4 percentage points), the unemployed (4.9 percentage points) and foreigners (42.3 percentage points) to death.’

Judges who became adults during two distinct historical periods (the Revolution of 1918-19 and the period of hyperinflation from June 1921 to January 1924), which may have shaped their views with respect to Nazism, were also more likely to impose the death sentence.

Alte Kampfer members whose hometown or suburb lay near a centre of the Revolution of 1918-19 were more likely to sentence a defendant to death.
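The percentage-point figures quoted above come from the study’s regression estimates, but they can be read as gaps in the predicted probability of a death sentence between Alte Kampfer judges and other judges for a given type of defendant. The toy calculation below simply shows what such a gap means in terms of raw sentencing rates; the case counts are invented for illustration and ignore the controls used in the actual analysis.

```python
# Reading a "percentage point" gap as a difference in death-sentence rates
# between Alte Kampfer judges and other judges for one defendant group.
# The case counts below are invented purely for illustration.

def death_rate(death_sentences: int, cases: int) -> float:
    return death_sentences / cases

# Hypothetical counts for one defendant group (e.g. devout Roman Catholics):
alte_kampfer = death_rate(death_sentences=62, cases=100)   # 62%
other_judges = death_rate(death_sentences=37, cases=100)   # 37%

gap_pp = (alte_kampfer - other_judges) * 100
print(f"Gap: {gap_pp:.1f} percentage points")   # 25.0 on these invented counts
```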

Previous economic research on sentencing in capital cases has focused mainly on gender and racial disparities, typically in the United States. But the understanding of what determines whether courts in modern authoritarian regimes outside the United States impose the death penalty is scant. By studying a politicised court in an historically important authoritarian state, the authors of the new study shed light on sentencing more generally in authoritarian states.

The findings are important because they provide insights into the practical realities of judicial empowerment by providing rare empirical evidence on how the exercise of judicial discretion in authoritarian states is reflected in sentencing outcomes.

To contact the authors:
Russell Smyth (russell.smyth@monash.edu)

THE ‘WITCH CRAZE’ OF 16th & 17th CENTURY EUROPE: Economists uncover religious competition as driving force of witch hunts

“The Pendle Witches”. Available at https://www.theanneboleynfiles.com/witchcraft-in-tudor-and-stuart-times/

Economists Peter Leeson (George Mason University) and Jacob Russ (Bloom Intelligence) have uncovered new evidence to resolve the longstanding puzzle posed by the ‘witch craze’ that ravaged Europe in the sixteenth and seventeenth centuries and resulted in the trial and execution of tens of thousands for the dubious crime of witchcraft.

 

In research forthcoming in the Economic Journal, Leeson and Russ argue that the witch craze resulted from competition between Catholicism and Protestantism in post-Reformation Christendom. For the first time in history, the Reformation presented large numbers of Christians with a religious choice: stick with the old Church or switch to the new one. And when churchgoers have religious choice, churches must compete.

In an effort to woo the faithful, competing confessions advertised their superior ability to protect citizens against worldly manifestations of Satan’s evil by prosecuting suspected witches. Similar to how Republicans and Democrats focus campaign activity in political battlegrounds during US elections to attract the loyalty of undecided voters, Catholic and Protestant officials focused witch trial activity in religious battlegrounds during the Reformation and Counter-Reformation to attract the loyalty of undecided Christians.

Analysing new data on more than 40,000 suspected witches whose trials span Europe over more than half a millennium, Leeson and Russ find that when and where confessional competition, as measured by confessional warfare, was more intense, witch trial activity was more intense too. Furthermore, factors such as bad weather, formerly thought to be key drivers of the witch craze, were not in fact important.

The new data reveal that the witch craze took off only after the Protestant Reformation in 1517, following the new faith’s rapid spread. The craze reached its zenith between around 1555 and 1650, years co-extensive with peak competition for Christian consumers, evidenced by the Catholic Counter-Reformation, during which Catholic officials aggressively pushed back against Protestant successes in converting Christians throughout much of Europe.

Then, around 1650, the witch craze began its precipitous decline, with prosecutions for witchcraft virtually vanishing by 1700.

What happened in the middle of the seventeenth century to bring the witch craze to a halt? The Peace of Westphalia, a treaty signed in 1648, ended decades of European religious warfare and much of the confessional competition that motivated it by creating permanent territorial monopolies for Catholics and Protestants – regions of exclusive control, in which one confession was protected from the competition of the other.

The new analysis suggests that the witch craze should also have had a clear geographical focus: most intense where Catholic-Protestant rivalry was strongest, and muted where it was weak. And indeed it was: Germany alone, which was ground zero for the Reformation, laid claim to nearly 40% of all witchcraft prosecutions in Europe.

In contrast, Spain, Italy, Portugal and Ireland – each of which remained a Catholic stronghold after the Reformation and never saw serious competition from Protestantism – collectively accounted for just 6% of Europeans tried for witchcraft.

Religion, it is often said, works in unexpected ways. The new study suggests that the same can be said of competition between religions.

 

To contact the authors:  Peter Leeson (PLeeson@GMU.edu)

From VoxEU – Wellbeing inequality in retrospect

Rising trends in GDP per capita are often interpreted as reflecting rising levels of general wellbeing. But GDP per capita is at best a crude proxy for wellbeing, neglecting important qualitative dimensions.

via Wellbeing inequality in retrospect — VoxEU.org: Recent Articles

To elaborate further on the topic, Prof. Leandro Prados de la Escosura has made available several databases on inequality, accessible here, as well as a book on long-term Spanish economic growth, available open access here.

 

Perpetuating the family name: female inheritance, in-marriage and gender norms

by Duman Bahrami-Rad (Simon Fraser University)

Tartanspartan: Muslim wedding, Lahore, Pakistan — Frank Horvat, 1952. Available on Pinterest <https://www.pinterest.co.uk/pin/491947959265621479/>

Why is it so common for Muslims to marry their cousins (more than 30% of all marriages in the Middle East)? And why, despite explicit injunctions in the Quran to include women in inheritance, do women in the Middle East generally face unequal gender relations, and why does their labour force participation remain the lowest in the world (less than 20%)?

This study presents a theory, supported by empirical evidence, concerning the historical origins of such marriage and gender norms. It argues that in patrilineal societies that nevertheless mandate female inheritance, cousin marriage becomes a way to preserve property in the male line and prevent fragmentation of land.

In these societies, female inheritance also leads to the seclusion and veiling of women as well as restrictions on their sexual freedom in order to encourage cousin marriages and avoid out-of-wedlock children as potential heirs. The incompatibility of such restrictions with female participation in agriculture has further influenced the historical gender division of labour.

Analyses of data on pre-industrial societies, Italian provinces and women in Indonesia show that, consistent with these hypotheses, female inheritance is associated with lower female labour participation, greater stress on female virginity before marriage, and higher rates of endogamy, consanguinity and arranged marriages.

The study also uses the recent reform of inheritance regulations in India – which greatly enhanced Indian women’s right to inherit property – to provide further evidence of the causal impact of female inheritance. The analysis shows that among women affected by the reform, the rate of cousin marriage is significantly higher, and that of premarital sex significantly lower.
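In spirit, the Indian evidence compares outcomes for women exposed to the reform with outcomes for women who were not. The sketch below shows such a comparison as a raw difference in group means and as an OLS regression on a treatment indicator (which returns the same number); the data are randomly generated, and the paper’s actual specification is richer, with controls and variation in exposure across states and cohorts.

```python
# Illustrative sketch: comparing cousin-marriage rates between women exposed
# to the inheritance reform and those not exposed. Data and variable names
# are invented; this is not the paper's actual specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 4000
exposed = rng.integers(0, 2, n)                       # 1 = affected by reform
cousin_marriage = (rng.random(n) < 0.10 + 0.05 * exposed).astype(float)

# Raw difference in group means:
diff = cousin_marriage[exposed == 1].mean() - cousin_marriage[exposed == 0].mean()
print(f"Raw difference in cousin-marriage rates: {diff:.3f}")

# OLS on a treatment dummy recovers the same difference as the slope:
X = sm.add_constant(exposed.astype(float))
ols = sm.OLS(cousin_marriage, X).fit()
print(f"Regression coefficient on exposure: {ols.params[1]:.3f}")
```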

The implications of these findings are important. It is believed that cousin marriage helps create and maintain kinship groups such as tribes and clans, which impair the development of an individualistic social psychology, undermine social trust, large-scale cooperation and democratic institutions, and encourage corruption and conflict.

This study contributes to this literature by highlighting a historical origin of clannish social organisation. It also sheds light on the origins of gender inequality, both as a human rights issue and as a development issue.

Land reform and agrarian conflict in 1930s Spain

Jordi Domènech (Universidad Carlos III de Madrid) and Francisco Herreros (Institute of Policies and Public Goods, Spanish Higher Scientific Council)

Government intervention in land markets is always fraught with potential problems. Intervention generates clearly demarcated groups of winners and losers as land is the main asset owned by households in predominantly agrarian contexts. Consequently, intervention can lead to large, generally welfare-reducing changes in the behaviour of the main groups affected by reform, and to policies being poorly targeted towards potential beneficiaries.

In this paper (available here), we analyse the impact of tenancy reform in the early 1930s on Spanish land markets. Adapting general laws to local and regional variation in land tenure patterns, and to heterogeneity in rural contracts, was one of the problems of agricultural policy in 1930s Spain. The interest of the Catalan case lies in the adaptation of a centralized tenancy reform, aimed at fixed-rent contracts, to the sharecropping contracts that predominated in Catalan agriculture. This was especially true of sharecropping contracts on vineyards, above all the customary rabassa morta contract, which had been subject to various legal changes in the late 18th and early 19th centuries. The 1930s are generally seen as the culmination of a long period of conflict between the so-called rabassaires (sharecroppers under rabassa morta contracts) and landowners.

The division between landowners and tenants was one of the central cleavages of 20th-century Catalonia, even in a region that had seen substantial industrialization. In the early 1920s, work started on a Catalan law of rural contracts, aimed especially at sharecroppers. A law passed on 21 March 1934 allowed the renegotiation of existing rural contracts and prohibited the eviction of tenants who had been under the same contract for less than 6 years. More importantly, it opened the door to forced sales of land to long-term tenants. Such legislative changes threatened the status quo, and the Spanish Constitutional Court ruled the law unconstitutional.

The comparative literature on the impacts of land reform argues that land reform, in this case tenancy reform, can in fact change agrarian structures. When property rights are threatened, landowners react by selling land or interrupting existing tenancy contracts, mechanizing and hiring labourers. Agrarian structure is therefore endogenous to existing threats to property rights. The extent of insecurity of property rights in 1930s Catalonia can be seen in the wave of litigation over sharecropping contracts: over 30,000 contracts were revised in the courts in late 1931 and 1932, a wave that provoked satirical cartoons (Figure 1).

Figure 1. Revisions and the share of the harvest. Source: L’Esquella de la Torratxa, 2nd August 1932, p. 11.
Translation: The rabassaire question. Peasant: ‘You sweat just coming here to claim your share of the harvest; you would sweat far more if you had to grow it yourself.’

The first wave of petitions to revise contracts was overwhelmingly unsuccessful: most petitions were nullified by the courts. This was most pronounced in the Spanish Supreme Court, which ruled against the sharecropper in most of the roughly 30,000 petitions for contract revision. Nonetheless, sharecroppers were protected by the Catalan autonomous government. The political context in which the Catalan government operated became even more charged in October 1934. That month, with signs that the centre-right government in Madrid was moving towards more reactionary positions, the Generalitat participated in a rebellion orchestrated by the Spanish Socialist Party (PSOE) and Left Republicans. It was in this context of suspended civil liberties that landowners gained a freer hand to evict unruly peasants. The fact that some sharecroppers did not surrender their harvest meant that they could be evicted straight away under the new rules set by the new military governor of Catalonia.

We use the number of completed and initiated tenant evictions from October 1934 to around mid-1935 as the main dependent variable in the paper. The data were collected from a report produced by the main Catalan tenant union, the Unió de Rabassaires (Rabassaires’ Union), published in late 1935 to publicize and denounce tenant evictions and attempts to evict tenants.

Combining the spatial analysis of eviction cases with individual information on evictors and evicted, we can be reasonably confident about several facts concerning evictions and terminated contracts in 1930s Catalonia. Our data show that rabassa morta legacies were not the main determinant of evictions. About 6 per cent of terminated contracts were open-ended rabassa morta contracts (arbitrarily set at 150 years in the graph). About 12 per cent of evictions were linked to contracts longer than 50 years, which were probably oral contracts (since Spanish legislation had set a maximum of 50 years). Figure 2 gives the lengths of terminated and threatened contracts.

Figure 2. Histogram of contract lengths. Source: Own elaboration from Unió de Rabassaires, Els desnonaments rústics.
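The shares quoted above are simple proportions over the recorded contract lengths, with open-ended rabassa morta contracts coded at 150 years as described. A minimal sketch of that tabulation, on an invented list of contract lengths:

```python
# Tabulating terminated or threatened contracts by length, as in Figure 2.
# Open-ended rabassa morta contracts are coded at 150 years, following the
# convention described in the text. The list of lengths is invented.
contract_lengths = [4, 7, 12, 25, 60, 80, 150, 150, 5, 3, 9, 55, 15, 6, 150, 70]

n = len(contract_lengths)
share_rabassa_morta = sum(1 for y in contract_lengths if y == 150) / n
share_over_50 = sum(1 for y in contract_lengths if 50 < y < 150) / n  # likely oral

print(f"Open-ended rabassa morta: {share_rabassa_morta:.0%}")
print(f"Over 50 years (probably oral): {share_over_50:.0%}")
```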

The spatial distribution of evictions is also consistent with the limited role of historical legacies of conflict. Evictions were not more common in historical rabassa morta areas, nor were they typical of areas with a larger share of land planted with vines.

Our study provides a substantial revision of claims by unions and historians about very high levels of conflict in the Catalan countryside during the Second Republic. In many cases, there had been a long process of adaptation and fine-tuning of contractual forms to crops and to soil and climatic conditions, which increased the costs of altering existing institutional arrangements.

To contact the authors:

jdomenec@clio.uc3m.es

francisco.herreros@csic.es