Shoplifting in Eighteenth-Century England

by Shelley Tickell (University of Hertfordshire)

Shoplifting in Eighteenth-Century England is published by Boydell & Brewer. SAVE 25% when you order direct from the publisher – offer ends on the 5th March 2019. See below for details.

 


What would you choose to buy from a store if money was no object? This was a decision eighteenth-century shoplifters made in practice on a daily basis. We might assume them to be attracted to the novel range of silk and cotton textiles, foodstuffs, ornaments and silver toys that swelled the consumer market in this period. Demand for these home-manufactured and imported goods was instrumental in a trebling of the number of English shops in the first half of the century, escalating the scale of the crime. However, as my book Shoplifting in Eighteenth-Century England shows, this was not the case. Consumer desire was by no means shoplifters’ major imperative.

 

Shoplifting occurred nationwide, but it was disproportionately a problem in the capital. A study of a sample of the many thousand prosecutions at the Old Bailey reveals that linen drapers, shoemakers, hosiers and haberdashers were the retailers most at risk. Over 70% of goods stolen, particularly by women, were fabrics, clothing and trimmings. Though thefts were highly gendered, men also stole these items far more frequently than the food, jewellery and household goods which were largely their preserve. Yet items stolen were not predominantly the most fashionable. Traditional linens, wool stockings and leather shoes were stolen as often as silk handkerchiefs and cotton prints. A prolific shoplifter who confessed to her crime found it profitable over the course of a year to steal printed linen at four times the quantity of the more stylish cotton, lawns, muslins and silk handkerchiefs she also took.

The shoplifters prosecuted were overwhelmingly from plebeian backgrounds. Professional gangs did exist but for most the crime was a source of occasional subsistence. Shop thieves came from the most economically vulnerable sections of society, seeking to weather an urban economy of low-paid and insecure work; many were older women or children. As the stolen goods needed to be convertible to income they were very commonly sold. So thieves sought the items which were most negotiable, those in greatest demand and least conspicuous in the working neighbourhoods in which they lived. A parcel of handkerchiefs stolen unopened was found to be ‘too fine’ for a market seller to whom it was offered. While there was undoubtedly an eagerness for popular fashion, the call for neat and appropriate daily dress in working communities was as insistent. We find the frequency with which shoplifters stole different types of clothing is consistent with a market demand governed in great part by the customary turnover of clothing items in labouring families. Handkerchiefs, shoes and stockings, which were replaced regularly, were stolen frequently; jackets and stays more rarely.

There were also some practical reasons why shoplifters avoided the high-fashion goods that elite shops sold. To enter the emporiums in which the rich shopped added a heightened degree of risk. Testimony confirms shopkeepers’ deep reluctance to suspect any customer who appeared genteel, but in elite areas such as London’s West End retailers had an established clientele and a new face was likely to draw attention. A few shoplifters did try their luck by making an effort to dress the part and their polite fashioning and acting skill, witnesses recall, was often masterly. But an accidental slip into plebeian manners was easily done. Three customers dressed in silk drew the suspicion of a Covent Garden shopwoman as, she explained, ‘they called me my dear in a very sociable way’.

In general, shoplifters restricted themselves to plundering smaller local shops that were convenient to reconnoitre and with fewer staff to mount surveillance. A mapping of incidents in London shows this bias towards poorer and less fashionable districts, particularly to the north and east of the capital. The research found that within these working neighbourhoods shoplifted goods played an instrumental role in the intricate social and economic relations that underpinned community survival. Local associates earned money selling or pawning goods for the thief, their reputation serving to give the transaction an added credibility. Neighbours were informally sold stolen items on favourable terms, often including an element of exchange and credit, which acted to secure their complicity and future loyalty. We also come across shoplifted goods that were pawned to fund the shoplifter’s ongoing business or even recommodified as stock for their small retail concerns. Need rather than consumption fever motivated these shoplifters. Shoplifting was a capital crime throughout the century but this seems to have been of very little moment when the dictate was economic survival. As a shoplifter bluntly testified of her friend in 1747, ‘The prisoner came to me to go with her to the prosecutor’s shop, she wanted money, and she should go to the gallows’.

 

SAVE 25% when you order direct from the publisher using the offer code BB500 online at https://boydellandbrewer.com/shoplifting-in-eighteenth-century-england-pb.html. Offer ends 5th March 2019. Discount applies to print and eBook editions. Alternatively call Boydell’s distributor, Wiley, on 01243 843 291, and quote the same code. For any queries, please email marketing@boydell.co.uk

 

To contact Shelley Tickell: s.g.tickell@herts.ac.uk

The Price of the Poor’s Words: Social Relations and the Economics of Deposing for One’s “Betters” in Early Modern England

by Hillary Taylor (Jesus College, Cambridge)

This article is published in The Economic History Review and is available on the EHS website

William Powell Frith, Poverty and Wealth. Available at Wikimedia Commons

Late sixteenth- and early seventeenth-century England was one of the most litigious societies on record. If much of this litigation was occasioned by debt disputes, a sizeable proportion involved gentlemen suing each other in an effort to secure claims to landed property. In this genre of suits, gentlemen not infrequently enlisted their social inferiors and subordinates to testify on their behalf.[1] These labouring witnesses were usually qualified to comment on the matter at hand as a result of their employment histories. When they deposed, they might recount their knowledge of the boundaries of some land, of a deed or the like. In the course of doing so, they might also comment on all sorts of quotidian affairs. Because testifying enabled illiterate and otherwise anonymous people to speak on record about all sorts of issues, historians have rightly regarded depositions as a singularly valuable source: for all their limitations, they offer us access to worlds that would otherwise be lost.

But we don’t know much about what labouring people thought about the prospect of testifying for (and against) their superiors, or how they came to testify in the first place. Did they think that it presented an opportunity to assert themselves? Did it – as some contemporary legal commentators claimed – provide them with an opportunity to make a bit of money on the side by ‘selling’ dubious evidence to their litigious superiors?[2] Or were they reluctant to depose in such circumstances and, if so, why? Where subordinated individuals deposed for their ‘betters’, what was the relationship between the ‘pull’ of economic reward and the ‘push’ of extra-economic coercion?

I wrote an article that considers these questions. It doesn’t have any tables or graphs; the issues with which it’s concerned don’t readily lend themselves to quantification. Rather, this piece tries to think about how members of the labouring population conceived of the possibilities that were afforded to and the constraints that were imposed upon them by dint of their socio-economic position.

In order to reconstruct these areas of popular thought, I read loads of late sixteenth- and early seventeenth-century suits from the court of Star Chamber. In these cases, labouring witnesses who had deposed for one superior against another were subsequently sued for perjury (this was typically done in an effort to have a verdict that they had helped to secure overturned). Allegations against these witnesses got traction because it was widely assumed that people who worked for their livings were poor and, as a result, would lie under oath for anyone who would pay them for doing so. Where these suits advanced to the deposition-taking phase, labouring witnesses who were accused of swearing falsely under oath and witnesses of comparable social position provided accounts of their relationship with the litigious superiors in question, or commentaries on the perceived risks and benefits of giving evidence. They discussed the economic dispensations (or the promise thereof) which they had been given, or the coercion which had been used to extract their testimony.

Taken in aggregate, this evidence suggests that members of the labouring population had a keen sense of the politics of testimony. In a dynamic and exacting economy such as that of late sixteenth- and early seventeenth-century England, where labouring people’s material prospects were irrevocably linked to their reputation and ‘honesty,’ deposing could be risky. Members of the labouring population were aware of this, and many were hesitant to depose at all. Their reluctance may well have been born of an awareness that doubt was likely to be cast upon their testimony as a result of their subordinated and dependent social position, which lent credibility to accusations that they had sworn falsely for gain. More immediately, it reflected concerns about the material repercussions that they feared would follow from commenting on the affairs of their ‘betters.’ Such projections were not merely the stuff of paranoid speculation. In 1601, a carpenter from Buckinghamshire called Christopher Badger had put his mark to a statement defending a gentleman, Arthur Wright, who had frustrated efforts to impose a stinting arrangement on the common to, as many locals claimed, the ‘damadge of the poorer sorte and to the comoditie of the riche.’ Badger recalled that one of Wright’s opponents – also a gentleman – later approached him and said ‘You have had my worke and the woorke of divers’ other pro-stinting individuals. To discourage Badger from further involvement, he added a thinly veiled threat: ‘This might be an occasion that you maie have lesse worke then heretofore you have had.’[3] For members of the labouring population, material circumstance often militated against opening their mouths.

But there was an irony to the politics of testimony, which was not lost on common people. If material conditions made some prospective witnesses reluctant to depose, they all but compelled others to do so (even when they expressed reservations). In some instances, labouring people’s poverty made compelling the rewards that they were promised (and less often given) in return for their testimony – a bit of coal, a cow, promises of work that was not dictated by the vagaries of seasonal employment, or nebulous offers of a life freed from want. In others, the dependency, subordination and obligation that characterized their relations with their superiors necessitated that they speak as required, or face the consequences. In the face of such pressures, a given individual’s reservations about testifying were all but irrelevant.

To contact Hillary Taylor: Hat27@cam.ac.uk

Notes

[1] For debt and debt-related litigation, see Craig Muldrew, The Economy of Obligation: The Culture of Credit and Social Relations in Early Modern England (Basingstoke, 1998).

[2] For suspicions surrounding the testimony of poor and/or labouring witnesses, see Alexandra Shepard, Accounting for Oneself: Worth, Status, and the Social Order in Early Modern England (Oxford, 2015).

[3] TNA, STAC 5/W17/32.

Missing girls in 19th-century Spain

by Francisco J. Beltrán Tapia (Norwegian University of Science and Technology)

This article is published in the Economic History Review, and it is available here

Gender discrimination, in the form of sex-selective abortion, female infanticide and the mortal neglect of young girls, constitutes a pervasive feature of many contemporary developing countries, especially in South and East Asia and Africa. Son preference stems from economic and cultural factors that have long influenced the perceived relative value of women in these regions, and has resulted in millions of “missing girls”. But were there “missing girls” in historical Europe? The conventional narrative argues that there is little evidence for this kind of gender discrimination. According to this view, the European household formation system, together with prevailing ethical and religious values, limited female infanticide and the mortal neglect of young girls.

However, several studies suggest that parents treated their sons and daughters differently in 19th-century Britain and continental Europe (see, for instance, here, here or here). These authors stress that an unequal allocation of food, care and/or workload negatively affected girls’ nutritional status and morbidity, which translated into reduced heights and higher mortality rates. In order to provide more systematic historical evidence of this type of behaviour, our research (with Domingo Gallego-Martínez) relies on sex ratios at birth and at older ages. In the absence of gender discrimination, the number of boys per hundred girls in different age groups is remarkably regular, so comparing the observed figure with the expected (gender-neutral) sex ratio makes it possible to assess the cumulative impact of gender bias in peri-natal, infant and child mortality and, consequently, the importance of potential discriminatory practices. However, although non-discriminatory sex ratios at birth revolve around 105-106 boys per hundred girls in most developed countries today, historical sex ratios cannot be compared directly to modern ones.

We have shown here that non-discriminatory infant and child sex ratios were much lower in the past. The biological survival advantage of girls was more visible in the high-mortality environments that characterised pre-industrial Europe due to poor living conditions, lack of hygiene and the absence of public health systems. Consequently, boys suffered relatively higher mortality rates both in utero and during infancy and childhood. Historical infant and child sex ratios were therefore relatively low, even in the presence of gender-discriminatory practices. This is illustrated in Figure 1 below, which plots the relationship between child sex ratios and infant mortality rates using information from seventeen European countries between 1750 and 2001. In particular, in societies where infant mortality rates were around 250 deaths (per 1,000 live births), a gender-neutral child sex ratio should have been slightly below parity (around 99.5 boys per hundred girls).

Figure 1. Infant mortality rates and child sex ratios in Europe, 1750-2001
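To make the benchmark comparison concrete, here is a minimal sketch (in Python, purely for illustration) that interpolates a gender-neutral child sex ratio from the infant mortality rate, using only the two reference points quoted above: roughly 105-106 boys per 100 girls where infant mortality is very low, and roughly 99.5 where it reaches 250 per 1,000 live births. The linear interpolation and the example figures are assumptions for illustration only; the relationship in Figure 1 is estimated from the European data themselves.

```python
# Illustrative only: a gender-neutral child sex ratio benchmark interpolated
# linearly between the two reference points quoted in the text. The curve in
# Figure 1 is estimated from data for seventeen European countries, 1750-2001.

def expected_child_sex_ratio(imr_per_1000: float) -> float:
    """Expected boys per 100 girls in the absence of discrimination."""
    low_imr, low_ratio = 0.0, 105.5      # low-mortality (modern) benchmark
    high_imr, high_ratio = 250.0, 99.5   # high-mortality benchmark from the text
    slope = (high_ratio - low_ratio) / (high_imr - low_imr)
    return low_ratio + slope * imr_per_1000

def excess_ratio(observed_ratio: float, imr_per_1000: float) -> float:
    """How far an observed child sex ratio sits above the benchmark."""
    return observed_ratio - expected_child_sex_ratio(imr_per_1000)

# e.g. a hypothetical district recording 108 boys per 100 girls with an IMR of 250
print(round(expected_child_sex_ratio(250), 1))   # ~99.5
print(round(excess_ratio(108, 250), 1))          # ~8.5 'excess' boys per 100 girls
```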

 

Compared to this benchmark, infant and child sex ratios in 19th-century Spain were abnormally high (see the black dots in Figure 1 above; the number refers to the year of the observation), thus suggesting that some sort of gender discrimination was unduly increasing female mortality rates at those ages. This pattern, which is not the result of under-enumeration of girls in the censuses, mostly disappeared at the turn of the 20th century. Although average sex ratios remained relatively high in nineteenth-century Spain, some regions exhibited even more extreme figures. In 1860, 54 districts (out of 471) had infant sex ratios above 115, figures that are extremely unlikely to have occurred by chance. Relying on an extremely rich dataset at the district level, our research analyses regional variation in order to examine what lies behind the unbalanced sex ratios. Our results show that the presence of wage labour opportunities for women and the prevalence of extended families in which different generations of women cohabited had beneficial effects on girls’ survival. Likewise, infant and child sex ratios were lower in dense, more urbanized areas.
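To see why ratios above 115 are so unlikely to have occurred by chance, a back-of-the-envelope binomial calculation helps. The sketch below is illustrative only: it assumes a hypothetical district of 2,000 surviving infants and a gender-neutral benchmark of roughly 100 boys per 100 girls (a boy share of about one half); the study itself works with the actual district populations and the mortality-adjusted benchmark described above.

```python
# Illustrative only: probability of observing an infant sex ratio of 115 or
# more boys per 100 girls in a hypothetical district of 2,000 infants, under a
# gender-neutral benchmark of ~100 boys per 100 girls (boy share p = 0.5).
from math import lgamma, log, exp

def prob_ratio_at_least(n_infants: int, ratio_threshold: float, p_boy: float = 0.5) -> float:
    """P(boys per 100 girls >= ratio_threshold) under a binomial model."""
    boy_share = ratio_threshold / (100.0 + ratio_threshold)   # 115 -> ~0.535
    min_boys = int(-(-(n_infants * boy_share) // 1))          # ceiling
    total = 0.0
    for k in range(min_boys, n_infants + 1):
        # binomial pmf computed in log space to avoid overflow for large n
        log_pmf = (lgamma(n_infants + 1) - lgamma(k + 1) - lgamma(n_infants - k + 1)
                   + k * log(p_boy) + (n_infants - k) * log(1 - p_boy))
        total += exp(log_pmf)
    return total

print(prob_ratio_at_least(2000, 115))   # roughly 0.001, i.e. about one in a thousand
```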

This evidence thus suggests that discriminatory practices with lethal consequences for girls constituted a veiled feature of pre-industrial Spain. Excess female mortality was not necessarily the result of outright ill-treatment of young girls; it could have arisen simply from an unequal allocation of resources within the household, a disadvantage that probably accumulated as infants grew older. In contexts where infant and child mortality was high, a slight discrimination in the way that young girls were fed or treated when ill, as well as in the amount of work with which they were entrusted, was likely to have resulted in more girls dying from the combined effect of undernutrition and illness. Although female infanticide or other extreme versions of mistreatment of young girls may not have been a systematic feature of historical Europe, this line of research points to more passive, but pervasive, forms of gender discrimination that also resulted in a significant fraction of missing girls.

To contact the author:

francisco.beltran.tapia@ntnu.no

Twitter: @FJBeltranTapia

How well off were the occupants of early modern almshouses?

by Angela Nicholls (University of Warwick).

Almshouses in Early Modern England is published by Boydell Press. SAVE 25% when you order direct from the publisher – offer ends on the 13th December 2018. See below for details.


Almshouses, charitable foundations providing accommodation for poor people, are a feature of many towns and villages. Some are very old, with their roots in medieval England as monastic infirmaries for the sick, pilgrims and travellers, or as chantries offering prayers for the souls of their benefactors. Many survived the Reformation to be joined by a remarkable number of new foundations between around 1560 and 1730. For many of them their principal purpose was as sites of memorialisation and display, tangible representations of the philanthropy of their wealthy donors. But they are also some of the few examples of poor people’s housing to have survived from the early modern period, so can they tell us anything about the material lives of the people who lived in them?

Paul Slack famously referred to almspeople as ‘respectable, gowned Trollopian worthies’, and there are many examples to justify that view, for instance Holy Cross Hospital, Winchester, refounded in 1445 as the House of Noble Poverty. But these are not typical. Nevertheless, many early modern almshouse buildings are instantly recognisable, with the ubiquitous row of chimneys often the first indication of the identity of the building.

 

Burghley Almshouses, Stamford (1597)

 

Individual chimneys and, often, separate front doors are evidence of private domestic space, far removed from the communal halls of the earlier medieval period, or the institutional dormitories of the nineteenth-century workhouses which came later. Accommodating almspeople in their own rooms was not just a reflection of general changes in domestic architecture at the time, which placed greater emphasis on comfort and privacy, but represented a change in how almspeople were viewed and how they were expected to live their lives. Instead of living communally with meals provided, in the majority of post-Reformation almshouses the residents would have lived independently, buying their own food, cooking it themselves on their own hearth and eating it by themselves in their rooms. The hearth was important not only as the practical means of heating and cooking but also as something central to questions of identity and social status. Together with individual front doors, these features gave occupants a degree of independence and autonomy; they enabled almspeople to live independently despite their economic dependence, and to adopt the appearance if not the reality of independent householders.

 

Stoneleigh Old Almshouses, Warwickshire (1576)

 

The retreat from communal living also meant that almspeople had to support themselves rather than have all their needs met by the almshouse. This was achieved in many places by a transition to monetary allowances or stipends with which almspeople could purchase their own food and necessities, but the existence and level of these stipends varied considerably. Late medieval almshouses often specified an allowance of a penny a day, which would have provided a basic but adequate living in the fifteenth century, but was seriously eroded by sixteenth-century inflation. Thus when Lawrence Sheriff, a London mercer, established in 1567 an almshouse for four poor men in his home town of Rugby, his will gave each of them the traditional penny a day, or £1 10s 4d a year. Yet with inflation, if these stipends were to match the real value of their late-fifteenth-century counterparts, his almsmen would actually have needed £4 5s 5d a year.[1]
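For readers unused to pre-decimal money, the arithmetic behind these two figures can be reconstructed in a few lines. The sketch below (in Python, purely for illustration) assumes 12 pence to the shilling, 20 shillings to the pound, and a year reckoned as 52 weeks of 7 days, which is what the quoted total of £1 10s 4d implies; the roughly 2.8-fold multiplier is simply the ratio between the two sums given in the text, whereas the book itself works from the Phelps Brown and Hopkins index cited in note 1.

```python
# Illustrative arithmetic only. Assumes pre-decimal money (12d = 1s, 20s = £1)
# and a year reckoned as 52 weeks x 7 days = 364 days, which matches the quoted
# total of £1 10s 4d. The ~2.8x multiplier is the ratio implied by the two sums
# in the text; the book itself uses the Phelps Brown and Hopkins price index.

PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20
PENCE_PER_POUND = PENCE_PER_SHILLING * SHILLINGS_PER_POUND  # 240

def to_lsd(pence: float) -> str:
    """Express an amount in pence as pounds, shillings and pence."""
    pence = round(pence)
    pounds, rest = divmod(pence, PENCE_PER_POUND)
    shillings, pennies = divmod(rest, PENCE_PER_SHILLING)
    return f"£{pounds} {shillings}s {pennies}d"

annual_pence = 1 * 52 * 7                        # a penny a day for 52 weeks
print(to_lsd(annual_pence))                      # £1 10s 4d, as in Sheriff's will

implied_multiplier = (4 * 240 + 5 * 12 + 5) / annual_pence   # £4 5s 5d / £1 10s 4d
print(round(implied_multiplier, 2))              # ~2.82: the erosion wrought by inflation
print(to_lsd(annual_pence * implied_multiplier)) # recovers ~£4 5s 5d
```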

The nationwide system of poor relief established by the Tudor Poor Laws, and the survival of poor relief accounts from many parishes by the late seventeenth century, provide an opportunity to see the actual amounts disbursed in relief by overseers of the poor to parish paupers. From the level of payments made to elderly paupers no longer capable of work it is possible to calculate the barest minimum which an elderly person living rent free in an almshouse might have needed to feed and clothe themselves and keep warm.[2] Such a subsistence level in the 1690s equates to an annual sum of £3 17s, which can be adjusted for inflation and compared with a range of known almshouse stipends from the late sixteenth and seventeenth centuries.
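A minimal sketch of that comparison, with entirely hypothetical numbers, is given below: the 1690s benchmark of £3 17s is rescaled to the price level of each stipend’s date using a placeholder price index before asking whether the stipend clears it. The 147 documented stipends and the inflation index actually used in the book (see note 1) are, of course, different.

```python
# A minimal sketch of the stipend comparison, with hypothetical numbers.
# The 1690s subsistence benchmark of £3 17s (= £3.85) is rescaled to the price
# level of each stipend's date using a placeholder price index, then each
# stipend is classed as adequate or not. The real stipends and the Phelps
# Brown and Hopkins index differ from the illustrative values below.

SUBSISTENCE_1690S = 3 + 17 / 20          # £3 17s in decimal pounds

# Hypothetical price index (1690s = 1.00); these are NOT the published figures.
price_index = {1570: 0.55, 1620: 0.80, 1670: 0.95, 1690: 1.00}

# Hypothetical almshouse stipends: (foundation date, annual stipend in £).
stipends = [(1570, 1.5), (1620, 2.0), (1670, 4.0), (1690, 5.2)]

adequate = 0
for year, stipend in stipends:
    benchmark = SUBSISTENCE_1690S * price_index[year]   # benchmark at that date's prices
    sufficient = stipend >= benchmark
    adequate += sufficient
    print(year, f"stipend £{stipend:.2f}", "vs benchmark", f"£{benchmark:.2f}",
          "adequate" if sufficient else "insufficient")

print(f"{adequate}/{len(stipends)} stipends sufficient to live on")
```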

The results of this comparison are interesting, even surprising. Using data from 147 known almshouse stipends in six different counties (Durham, Yorkshire, Norfolk, Warwickshire, Buckinghamshire and Kent) it seems that less than half of early modern almshouses provided their occupants with stipends which were sufficient to live on. Many provided no financial assistance at all.


The inescapable conclusion is that the benefits provided to early modern almspeople were in many cases only a contribution towards their subsistence. In this respect almshouse occupants were no different from the recipients of parish poor relief, who rarely had their living costs met in full.

Yet, even in one of the poorer establishments, almshouse residents had distinct advantages over other poor people. Principally these were the security of their accommodation, the permanence and regularity of any financial allowance, no matter how small, and the autonomy this gave them. Almshouse residents may also have had an enhanced status as ‘approved’, deserving poor. The location of many almshouses, beside the church, in the high street, or next to the guildhall, seems to have been purposely designed to solicit alms from passers-by, at a time when begging was officially discouraged.

SAVE 25% when you order direct from the publisher. Discount applies to print and eBook editions. Click the link, add to basket and enter offer code BB500 in the box at the checkout. Alternatively call Boydell’s distributor, Wiley, on 01243 843 291 and quote the same code. Offer ends one month after the date of upload. For any queries, please email marketing@boydell.co.uk

 

NOTES

[1] Inflation index derived from H. Phelps Brown and S. V. Hopkins, A Perspective of Wages and Prices (London and New York, 1981) pp. 13-59.

[2] L. A. Botelho, Old Age and the English Poor Law, 1500 – 1700 (Woodbridge, 2004) pp. 147-8.

Revisiting the changing body

by Bernard Harris (University of Strathclyde)

The Society has arranged with CUP that a 20% discount is available on this book, valid until the 11th November 2018. The discount page is: www.cambridge.org/wm-ecommerce-web/academic/landingPage/EHS20

The last century has witnessed unprecedented improvements in survivorship and life expectancy. In the United Kingdom alone, infant mortality fell from over 150 deaths per thousand births at the start of the last century to 3.9 deaths per thousand births in 2014 (see the Office for National Statistics for further details). Average life expectancy at birth increased from 46.3 to 81.4 years over the same period (see the Human Mortality Database). These changes reflect fundamental improvements in diet, nutrition and environmental conditions.

The changing body: health, nutrition and human development in the western world since 1700 attempted to understand some of the underlying causes of these changes. It drew on a wide range of archival and other sources covering not only mortality but also height, weight and morbidity. One of our central themes was the extent to which long-term improvements in adult health reflected the beneficial effect of improvements in earlier life.

The changing body also outlined a very broad schema of ‘technophysio evolution’ to capture the intergenerational effects of investments in early life. This is represented in a very simple way in Figure 1. The Figure tries to show how improvements in the nutritional status of one generation increase its capacity to invest in the health and nutritional status of the next generation, and so on ‘ad infinitum’ (Floud et al. 2011: 4).

Figure 1. Technophysio evolution: a schema. Source: See Floud et al. 2011: 3-4.

We also looked at some of the underlying reasons for these changes, including the role of diet and ‘nutrition’. As part of this process, we included new estimates of the number of calories which could be derived from the amount of food available for human consumption in the United Kingdom between circa 1700 and 1913. However, our estimates contrasted sharply with others published at the same time (Muldrew 2011) and were challenged by a number of other authors subsequently. Broadberry et al. (2015) thought that our original estimates were too high, whereas both Kelly and Ó Gráda (2013) and Meredith and Oxley (2014) regarded them as too low.

Given the importance of these issues, we revisited our original calculations in 2015. We corrected an error in the original figures, used Overton and Campbell’s (1996) data on extraction rates to recalculate the number of calories, and included new information on the importation of food from Ireland to other parts of what became the UK. Our revised Estimate A suggested that the number of calories rose by just under 115 calories per head per day between 1700 and 1750 and by more than 230 calories between 1750 and 1800, with little change between 1800 and 1850. Our revised Estimate B suggested that there was a much bigger increase during the first half of the eighteenth century, followed by a small decline between 1750 and 1800 and a bigger increase between 1800 and 1850 (see Figure 2). However, both sets of figures were still well below the estimates prepared by Kelly and Ó Gráda, Meredith and Oxley, and Muldrew for the years before 1800.

Figure 2. Source: Harris et al. 2015: 160.

These calculations have important implications for a number of recent debates in British economic and social history (Allen 2005, 2009). Our data do not necessarily resolve the debate over whether Britons were better fed than people in other countries, although they do compare quite favourably with relevant French estimates (see Floud et al. 2011: 55). However, they do suggest that a significant proportion of the eighteenth-century population was likely to have been underfed.

Our data also raise some important questions about the relationship between nutrition and mortality. Our revised Estimate A suggests that food availability rose slowly between 1700 and 1750 and then more rapidly between 1750 and 1800, before levelling off between 1800 and 1850. These figures are still broadly consistent with Wrigley et al.’s (1997) estimates of the main trends in life expectancy and our own figures for average stature. However, it is not enough simply to focus on averages; we also need to take account of possible changes in the distribution of foodstuffs within households and the population more generally (Harris 2015). Moreover, it is probably a mistake to examine the impact of diet and nutrition independently of other factors.

To contact the author: bernard.harris@strath.ac.uk

References

Allen, R. (2005), ‘English and Welsh agriculture, 1300-1850: outputs, inputs and income’. URL: https://www.nuffield.ox.ac.uk/media/2161/allen-eandw.pdf.

Allen, R. (2009), The British industrial revolution in global perspective, Cambridge: Cambridge University Press.

Broadberry, S., Campbell, B., Klein, A., Overton, M. and Van Leeuwen, B. (2015), British economic growth, 1270-1870, Cambridge: Cambridge University Press.

Floud, R., Fogel, R., Harris, B. and Hong, S.C. (2011), The changing body: health, nutrition and human development in the western world since 1700, Cambridge: Cambridge University Press.

Harris, B. (2015), ‘Food supply, health and economic development in England and Wales during the eighteenth and nineteenth centuries’, Scientia Danica, Series H, Humanistica, 4 (7), 139-52.

Harris, B., Floud, R. and Hong, S.C. (2015), ‘How many calories? Food availability in England and Wales in the eighteenth and nineteenth centuries’, Research in Economic History, 31, 111-91.

Kelly, M. and Ó Gráda, C. (2013), ‘Numerare est errare: agricultural output and food supply in England before and during the industrial revolution’, Journal of Economic History, 73 (4), 1132-63.

Meredith, D. and Oxley, D. (2014), ‘Food and fodder: feeding England, 1700-1900’, Past and Present, 222, 163-214.

Muldrew, C. (2011), Food, energy and the creation of industriousness: work and material culture in agrarian England, 1550-1780, Cambridge: Cambridge University Press.

Overton, M. and Campbell, B. (1996), ‘Production et productivité dans l’agriculture anglaise, 1086-1871’, Histoire et Mésure, 1 (3-4), 255-97.

Wrigley, E.A., Davies, R., Oeppen, J. and Schofield, R. (1997), English population history from family reconstitution, Cambridge: Cambridge University Press.

Surprisingly gentle confinement

Tim Leunig (LSE), Jelle van Lottum (Huygens Institute) and Bo Poulsen (Aalborg University) have been investigating the treatment of prisoners of war in the Napoleonic Wars.

 

Napoleonic Prisoner of War. Available at <https://blog.findmypast.com.au/explore-our-fascinating-new-napoleonic-prisoner-of-war-records-1406376311.html>

For most of history, life as a prisoner of war was nasty, brutish and short. There were no regulations on the treatment of prisoners until the 1899 Hague Convention and the later Geneva Conventions. Many prisoners were killed immediately; others were enslaved to work in mines and other undesirable places.

The poor treatment of prisoners of war was partly intentional – they were the hated enemy, after all. And partly it was economic. It costs money to feed and shelter prisoners. Countries in the past – especially in times of war and conflict – were much poorer than today.

Nineteenth-century prisoner death rates were horrific. Between one-half and six-sevenths of the 17,000 Napoleonic troops who surrendered to the Spanish in 1808 after the Battle of Bailén died as prisoners of war. The American Civil War saw death rates rise to 27%, even though the average prisoner was captive for less than a year.

The Napoleonic Wars saw the British capture 7,000 Danish and Norwegian sailors, military and merchant. Britain did not desire war with Denmark (which ruled Norway at the time), but went to war to prevent Napoleon from seizing the Danish fleet. Prisoners were incarcerated on old, unseaworthy “prison hulks”, moored in the Thames Estuary, near Rochester. Conditions were crowded: each man was given just 2 feet (60 cm) in width to hang his hammock.

Were these prison hulks floating tombs, as some contemporaries claimed? Our research shows otherwise. The Admiralty kept exemplary records, now held in The National Archives at Kew. These show the date of arrival in prison, and the date of release, exchange, escape – or death. They also tell us the age of the prisoner, where they came from, the type of ship they served on, and whether they were an officer, craftsman, or regular sailor. We can use these records to look at how many died, and why.

The prisoners ranged in age from 8 to 80, with half aged 22 to 35. The majority sailed on merchant vessels, with a sixth on military vessels, and a quarter on licensed pirate boats, permitted to harass British shipping. The amount of time in prison varied dramatically, from 3 days to over 7 years, with an average of 31 months. About two thirds were released before the end of the war.

Taken as a whole, 5% of prisoners died. This is a remarkably low number, given how long they were held, and given experience elsewhere in the nineteenth century. Being held prisoner for longer increased your chance of dying, but not by much: those who spent three years on a prison hulk had only a 1% greater chance of dying than those who served just one year.

Death was (almost) random. Being captured at the start of the war was neither better nor worse than being captured at the end. The number of prisoners held at any one time did not increase the death rate. The old were no more likely to die than the young – anyone fit enough to go to sea was fit enough to withstand the rigours of prison life. Despite extra space and better rations, officers were no less likely to die, implying that conditions were reasonable for common sailors.

There is only one exception: sailors from licensed pirate boats were twice as likely to die as merchant or official navy sailors. We cannot know the reason. Perhaps they were treated less well by their guards, or other prisoners. Perhaps they were risk takers, who gambled away their rations. Even for this group, however, the death rates were very low compared with those captured in other places, and in other wars.

The British had rules on prisoners of war, for food and hygiene. Each prisoner was entitled to 2.5 lbs (~1 kg) of beef, 1 lb of fish, 10.5 lbs of bread, 2 lbs of potatoes, 2.5 lbs of cabbage, and 14 pints (8 litres) of (very weak) beer a week. This is not far short of Danish naval rations, and prisoners are less active than sailors. We cannot be sure that they received their rations in full every week, but the death rates suggest that they were not hungry in any systematic way. The absence of epidemics suggests that hygiene was also good. Remarkably, and despite a national debt that peaked at a still unprecedented 250% of GDP, the British appear to have obeyed their own rules on how to treat prisoners.

Far from being floating tombs, therefore, this was a surprisingly gentle confinement for the Danish and Norwegian sailors captured by the British in the Napoleonic Wars.

Small Bills and Petty Finance: co-creating the history of the Old Poor Law

by Alannah Tomkins (Keele University) 

Alannah Tomkins and Professor Tim Hitchcock (University of Sussex), won an AHRC award to investigate ‘Small Bills and Petty Finance: co-creating the history of the Old Poor Law’.  It is a three-year project from January 2018. The application was for £728K, which has been raised, through indexing, to £740K.  The project website can be found at: thepoorlaw.org.

 

Twice in my career I’ve been surprised by a brick – or more precisely by bricks, hurtling into my research agenda. In the first instance I found myself supervising a PhD student working on the historic use of brick as a building material in Staffordshire (from the sixteenth to the eighteenth centuries). The second time, the bricks snagged my interest independently.

The AHRC-funded project ‘Small bills and petty finance’ did not set out to look for bricks. Instead it promises to explore a little-used source for local history, the receipts and ‘vouchers’ gathered by parish authorities as they relieved or punished the poor, to write multiple biographies of the tradesmen and others who serviced the poor law. A parish workhouse, for example, exerted a considerable influence over a local economy when it routinely (and reliably) paid for foodstuffs, clothing, fuel and other necessaries. This influence or profit-motive has not been studied in any detail for the poor law before 1834, and vouchers’ innovative content is matched by an exciting methodology. The AHRC project calls on the time and expertise of archival volunteers to unfold and record the contents of thousands of vouchers surviving in the three target counties of Cumbria, East Sussex and Staffordshire. So where do the bricks come in?

The project started life in Staffordshire as a pilot in advance of AHRC funding. The volunteers met at the Stafford archives and started by calendaring the contents of vouchers for the market town of Uttoxeter, near the Staffordshire/Derbyshire border. And the Uttoxeter workhouse did not confine itself to accommodating and feeding the poor. Instead in the 1820s it managed two going concerns: a workhouse garden producing vegetables for use and sale, and a parish brickyard. Many parishes under the poor law embedded make-work schemes in their management of the resident poor, but no others that I’m aware of channelled pauper labour into the manufacture of bricks.


The workhouse and brickyard were located just to the north of the town of Uttoxeter, in an area known as The Heath. The land was subsequently used to build the Uttoxeter Union workhouse in 1837-8 (after the reform of the poor law in 1834) so no signs of the brickyard remain in the twenty-first century. It was, however, one of several such yards identified at The Heath in the tithe map for Uttoxeter of 1842, and probably made use of a fixed kiln rather than a temporary clamp. This can be deduced from the parish’s sale of both bricks and tiles to brickyard customers. Tiles were more refined products than bricks and require more control over the firing process, whereas clamp firings were more difficult to regulate. The yard provided periodic employment to the adult male poor of the Uttoxeter workhouse, in accordance with the seasonal pattern imposed on all brick manufacture at the time. Firings typically began in March or April each year, and continued until September or October depending on the weather.

This is important because the variety of vouchers relating to the parish brickyard allow us to understand something of its place in the town’s economy, both as a producer and as a consumer of other products and services. Brickyards needed coal, so it is no surprise that one of the major expenses for the support of the yard lay in bringing coal to the town from elsewhere via the canal. The Uttoxeter canal wharf was also at The Heath, and access to transport by water may explain the development of a number of brickyards in its proximity. The yard also required wood and other raw materials in addition to clay, and specific products to protect the bricks after cutting but before firing. The parish bought quantities of archangel mats, rough woven pieces that could be used like a modern protective fleece to protect against frost damage. We are surmising that Uttoxeter used the mats to cover both the bricks and any tender plants in the workhouse garden.


Similarly the bricks were sold chiefly to local purchasers, including members of the parish vestry. Some men who were owed money by the parish for their work as suppliers allowed the debt to be offset by bricks. Finally the employment of workhouse men as brickyard labourers gives us, when combined with some genealogical research, a rare glimpse of the place of workhouse work in the life-cycle of the adult poor. More than one man employed at the yard in the 1820s and 1830s went on to independence as a lodging-house keeper in the town by the time of the 1841 census.

As I say, I’ve been surprised by brick. I had no idea that such a mundane product would prove so engaging. All this goes to show that it’s not the stolidity of the brick but its deployment that matters, historically speaking.

 

To contact the author: a.e.tomkins@keele.ac.uk


Judges and the death penalty in Nazi Germany: New research evidence on judicial discretion in authoritarian states

The German People’s Court. Available at https://www.foreignaffairs.com/reviews/review-essay/good-germans

Do judicial courts in authoritarian regimes act as puppets for the interests of a repressive state – or do judges act with greater independence? How much do judges draw on their political and ideological affiliations when imposing the death sentence?

A study of Nazi Germany’s notorious People’s Court, recently published in the Economic Journal, reveals direct empirical evidence of how the judiciary in one of the world’s most notoriously politicised courts were influenced in their life-and-death decisions.

The research provides important empirical evidence that the political and ideological affiliations of judges do come into play – a finding that has applications for modern authoritarian regimes and also for democracies that administer the death penalty.

The research team – Dr Wayne Geerling (University of Arizona), Prof Gary Magee, Prof Russell Smyth, and Dr Vinod Mishra (Monash Business School) – explore the factors influencing the likelihood of imposing the death sentence in Nazi Germany for crimes against the state – treason and high treason.

The authors examine data compiled from official records of individuals charged with treason and high treason who appeared before the People’s Courts up to the end of the Second World War.

Established by the Nazis in 1934 to hear cases of serious political offences, the People’s Courts have been vilified as ‘blood tribunals’ in which judges meted out pre-determined sentences.

But in recent years, while not contending that the People’s Court judgments were impartial or that its judges were not subservient to the wishes of the regime, a more nuanced assessment has emerged.

For the first time, the new study presents direct empirical evidence of the reasons behind the use of judicial discretion and why some judges appeared more willing to implement the will of the state than others.

The researchers find that judges with a deeper ideological commitment to Nazi values – typified by being members of the Alte Kampfer (‘Old Fighters’ or early members of the Nazi party) – were indeed more likely to impose the death penalty than those who did not share it.

These judges were more likely to hand down death penalties to members of the most organised opposition groups, those involved in violent resistance against the state and ‘defendants with characteristics repellent to core Nazi beliefs’:

‘The Alte Kampfer were thus more likely to sentence devout Roman Catholics (24.7 percentage points), defendants with partial Jewish ancestry (34.8 percentage points), juveniles (23.4 percentage points), the unemployed (4.9 percentage points) and foreigners (42.3 percentage points) to death.’

Judges who became adults during two distinct historical periods (the Revolution of 1918-19 and the period of hyperinflation from June 1921 to January 1924), which may have shaped these judges’ views with respect to Nazism, were more likely to impose the death sentence.

 Alte Kampfer members whose hometown or suburb lay near a centre of the Revolution of 1918-19 were more likely to sentence a defendant to death.

Previous economic research on sentencing in capital cases has focused mainly on gender and racial disparities, typically in the United States. But the understanding of what determines whether courts in modern authoritarian regimes outside the United States impose the death penalty is scant. By studying a politicised court in an historically important authoritarian state, the authors of the new study shed light on sentencing more generally in authoritarian states.

The findings are important because they provide insights into the practical realities of judicial empowerment by providing rare empirical evidence on how the exercise of judicial discretion in authoritarian states is reflected in sentencing outcomes.

To contact the authors:
Russell Smyth (russell.smyth@monash.edu)

THE ‘WITCH CRAZE’ OF 16th & 17th CENTURY EUROPE: Economists uncover religious competition as driving force of witch hunts

“The Pendle Witches”. Available at https://www.theanneboleynfiles.com/witchcraft-in-tudor-and-stuart-times/

Economists Peter Leeson (George Mason University) and Jacob Russ (Bloom Intelligence) have uncovered new evidence to resolve the longstanding puzzle posed by the ‘witch craze’ that ravaged Europe in the sixteenth and seventeenth centuries and resulted in the trial and execution of tens of thousands for the dubious crime of witchcraft.

 

In research forthcoming in the Economic Journal, Leeson and Russ argue that the witch craze resulted from competition between Catholicism and Protestantism in post-Reformation Christendom. For the first time in history, the Reformation presented large numbers of Christians with a religious choice: stick with the old Church or switch to the new one. And when churchgoers have religious choice, churches must compete.

In an effort to woo the faithful, competing confessions advertised their superior ability to protect citizens against worldly manifestations of Satan’s evil by prosecuting suspected witches. Similar to how Republicans and Democrats focus campaign activity in political battlegrounds during US elections to attract the loyalty of undecided voters, Catholic and Protestant officials focused witch trial activity in religious battlegrounds during the Reformation and Counter-Reformation to attract the loyalty of undecided Christians.

Analysing new data on more than 40,000 suspected witches whose trials span Europe over more than half a millennium, Leeson and Russ find that when and where confessional competition, as measured by confessional warfare, was more intense, witch trial activity was more intense too. Furthermore, factors such as bad weather, formerly thought to be key drivers of the witch craze, were not in fact important.

The new data reveal that the witch craze took off only after the Protestant Reformation in 1517, following the new faith’s rapid spread. The craze reached its zenith between around 1555 and 1650, years co-extensive with peak competition for Christian consumers, evidenced by the Catholic Counter-Reformation, during which Catholic officials aggressively pushed back against Protestant successes in converting Christians throughout much of Europe.

Then, around 1650, the witch craze began its precipitous decline, with prosecutions for witchcraft virtually vanishing by 1700.

What happened in the middle of the seventeenth century to bring the witch craze to a halt? The answer is the Peace of Westphalia, a treaty concluded in 1648, which ended decades of European religious warfare and much of the confessional competition that motivated it by creating permanent territorial monopolies for Catholics and Protestants – regions of exclusive control, in which one confession was protected from the competition of the other.

The new analysis suggests that the witch craze should also have been focused geographically, concentrated where Catholic-Protestant rivalry was strongest and sparse where it was weakest. And indeed it was: Germany alone, which was ground zero for the Reformation, laid claim to nearly 40% of all witchcraft prosecutions in Europe.

In contrast, Spain, Italy, Portugal and Ireland – each of which remained a Catholic stronghold after the Reformation and never saw serious competition from Protestantism – collectively accounted for just 6% of Europeans tried for witchcraft.

Religion, it is often said, works in unexpected ways. The new study suggests that the same can be said of competition between religions.

 

To contact the authors:  Peter Leeson (PLeeson@GMU.edu)

From VoxEU – Wellbeing inequality in retrospect

Rising trends in GDP per capita are often interpreted as reflecting rising levels of general wellbeing. But GDP per capita is at best a crude proxy for wellbeing, neglecting important qualitative dimensions.

via Wellbeing inequality in retrospect — VoxEU.org: Recent Articles

To elaborate further on the topic, Prof. Leandro Prados de la Escosura has made available several databases on inequality, accessible here, as well as a book on long-term Spanish economic growth, available open access here.