Child workers and industrial health in Britain 1780-1850

Peter Kirby, Child workers and industrial health in Britain 1780-1850 (Woodbridge: Boydell Press, 2013. Pp. xi + 212. 8 tabs. 6 figs. ISBN 9781843838845 Pbk. £19.99)

Review by Alysa Levene (Oxford Brookes University)


‘Child workers and industrial health in Britain 1780-1850’ is published by Boydell and Brewer. SAVE 25% when you order direct from the publisher – offer ends 18 July 2019. See below for details.



The physical horrors endured by child workers in the early industrial workplace are well known to historians – or at least, we think they are. The regulations of the various Factory Acts and the testimony of sub-commissioners, doctors and factory workers to the parliamentary enquiries of the 1830s and 1840s are common reference points for those of us working or teaching in this area. However, over the last few years, several in-depth studies of child labour in industrial England have appeared which have started to challenge and nuance what we think we know. First, Katrina Honeyman, Child Workers in England, 1780-1820 (2007) suggested that apprentices to cotton mills were often better looked after than we have thought. Then, Jane Humphries, Childhood and Child Labour in the British Industrial Revolution (2010) set industrial work in a wider context of schooling and family life, as evidenced in over 600 working-class autobiographies. And now, Peter Kirby has added the first monograph study of occupational health among child workers in the first half of the nineteenth century, and has again knocked down many of the key points we have been telling students for years.

The book is organised thematically, starting with an Introduction which sets out in detail the historical background to child labour in industry, and the sources we have for studying it. Here, Kirby points out the problems with the medical evidence collected for the parliamentary enquiries in the 1830s and 1840s; namely that many of the doctors concerned did not have first-hand experience of occupational health and so tended to attribute any health issues to working conditions rather than environmental ones. This leads him to place more emphasis on the writings of non-medical men, shifting the perspective away from doctors and children and towards health and conditions of work in the round. The main chapters consider child health in industrial cities generally; the key issues affecting the health of child industrial workers (deformities; ‘materials’ – see more below; and injuries); heights and ages, and how these were measured; and finally, corporal punishment and murder.

One of Kirby’s key conclusions is that it was environmental rather than working conditions which were responsible for most of the health problems experienced by child workers. He states that many began work in factories and mines already compromised by poor nutrition, environmental pollution and the impact of parental loss (which led to work at a young age), and that in fact, stunted and disabled children may have been preferentially admitted to the factory workforce because they were suited to the lighter tasks found there. To a certain degree this is convincing, and it is certainly instructive and worthwhile to draw attention to the relationship between the conditions of home life and working life so clearly. The discussion of environmental pollution and its impact on health is particularly detailed. However, it seems hard to believe either that so many children would have suffered from conditions like byssinosis, scoliosis or poliomyelitis as Kirby suggests, or that pre-existing disability could have been so widespread among child workers given the need to stand upright and bear a load in so many areas of work.

The discussion of ‘materials’ is another area where Kirby provides an impressive level of detail, and which advances our understanding of the realities of working life in mills. In particular, he draws attention to the pollutants which can be carried in raw cotton, and ties this to changes in supply during this period, for example, away from imports from the West Indies, and towards those from North America, which were less likely to be contaminated (this coincided with a fall in ‘mill fevers’). This is something which has not been much considered in previous work (although it was noted by contemporaries) and which has a bearing on both adult and child workers.

Kirby attempts to bring a similarly new perspective to the discussion of workplace violence, suggesting that corporal punishment was common only in specific circumstances (such as where safety or productivity demanded it, or where child workers were particularly vulnerable, like parish apprentices), and that it was in any case a more accepted part of daily life than it is now. These two points do not necessarily sit easily together; certainly the evidence of violence in the commissioners’ reports suggests that it was not condoned. He is more confident on the system of medical inspection, and provides a detailed discussion of its scale and potential pitfalls, particularly the difficulty of assessing children’s ages (vital for ensuring that factories and mines adhered to the changing laws on age at starting work). Ultimately this led to the development of standard charts for growth and dentition.

Overall, this is an excellent and comprehensive study of the occupational health of child workers in the most high-profile areas of the industrial sector. It makes a significant contribution to debates on child labour, and the impact of industry on health and daily life. Kirby paints a notably more optimistic picture of the industrial workplace than we are used to, certainly in terms of the impact on health and stature of its youngest workers. He ends by calling for more work on other areas of the industrial workforce, and this would certainly be welcome. The book is an excellent introduction to the topic for students and researchers alike; it remains to be seen whether it sparks a new wave of debate over the ‘optimistic’ versus the ‘pessimistic’ schools of thought on the industrial revolution.


SAVE 25% when you order direct from the publisher using the offer code B125 online. Offer ends 18 July 2019. Alternatively, call Boydell’s distributor, Wiley, on 01243 843 291, and quote the same code.


Nineteenth century savings banks, their ledgers and depositors

by Linda Perriton (University of Stirling)


If you look up as you walk along the streets of British towns and cities, you will see the proud and sometimes colourful traces of nineteenth century savings banks. But evidence of the importance of savings banks to working- and middle-class savers is harder to locate in economic history research.

English and Welsh savings banks operated on a ‘savings only’ model that funded interest payments to savers by purchasing government bonds and, in doing so, placed themselves outside the history of productive financialisation (Horne, 1947). This is a matter of regret, because whatever minor role trustee savings banks played in the productive economy, there is little doubt that they helped to financialise segments of society previously detached from such activities.

Image: Author’s own. A mosaic over the door of the former Fountainbridge branch of the Edinburgh Savings Bank.


The research that Stuart Henderson (Ulster University) and I presented at the EHS 2019 annual conference looks in detail at the financial activity of depositors in one savings bank – the Limehouse Savings Bank, situated in the East End of London.

Savings bank ledgers are a rich source of social history data in addition to the financial, especially in socially diverse larger cities. The apostils (marginal notes) of clerks reveal amusement at the names chosen for local clubs (for example, the Royal Order of the Jolly Cocks merits an exclamation mark) or record love gone wrong (for example, a woman who returns the passbook of a lover from whom she has not heard for two years).

We also want to look beyond the aggregate deposit figures for Limehouse recorded in the government reports to discover how individuals used the bank over the period 1830-76.

As a start, we have recorded the account transactions for each of the 195 new accounts opened in 1830, from the first deposit to the last withdrawal – a total of 3,598 transactions. Using the account header information, we have also compiled the personal details of the account holder – such as gender, occupation and place of residence. We use the header profile to trace individual savers in the historical record in order to establish their age and any notable life events, such as marriage and the birth of children.

Apart from 12 accounts, which were registered to individuals who gave addresses other than East End parishes, all the 1830 savers were registered at addresses within a four-mile by one-mile strip of urban development, which also enabled us to record the residential clustering of savers.

Summary statistics enable us to establish the differences between the categories of savers across several different indicators of transaction activity.

Perhaps unsurprisingly, the men in our 1830 sample tended to make larger deposits and larger withdrawals than the women, with the difference in magnitude masked somewhat by large transactions undertaken by widows. Widows in our sample tended to have a relatively large opening balance and a higher number of withdrawals, suggesting that their accounts functioned more as a ‘draw down’ fund (Perriton and Maltby, 2015). Men also tended to make more transactions than women.

We also see a significant portion of accounts where activity was very limited. The median number of deposits across our 195 accounts was just two, suggesting that a large proportion of accounts acted as something of a (very) temporary financial warehouse. Minors and servants tended to have smaller transactions, but appear to have accumulated more – relatively speaking – than others.

But our interest in the savers goes beyond summary statistics. We know that very few accounts were managed in the way that the sponsors of savings bank legislation intended; the low median of deposits is testament to that.

The basic information in the ledger headers for each account provides a starting point for thinking about when in the life-cycle saving was most successful. Even with the compulsory registration of births, deaths and marriages after 1837 and census data after 1841, the ability to trace an individual saver is not guaranteed.

With so few data points, it is easy to lose individuals at the periphery of the professional and skilled working classes, even in a relatively well documented city like London. Yet the ability to build individual case studies of savers is important to our understanding of savings banks in terms of establishing who were the ‘successful’ savers, and also when – relative to the overall life-cycle of the saver – accounts were held.

Our research presents ten case study accounts from our larger sample to challenge the proposition in social history research on household finances that savings increased when teenage and young adult children were contributing wages to the household. We also look at the evidence for any savings in anticipation of significant life events such as marriage or childbirth. The evidence is weak on both counts.

The distribution of age at account opening among the ten case studies is varied: under 20 years old (3), 21-29 (2), 30-39 (2), 40-49 (0) and 50-59 (3). The three cases of accounts opened after the age of 50 relate to a widow and two married couples, who all had children aged 10-25. But the majority of the accounts we examined were opened by younger adults with young children and growing families.

There is no obvious case for suggesting that savings were possible because expenses could be offset against the wages of teenage or young adult children. Nor can we see any obvious anticipatory or responsive saving for life events in the case studies.

One of our sample account holders did open her account soon after being widowed, but another widow opened her account seven years after the death of her husband. Two men opened accounts when their children were very young, but not in anticipation of their arrival. The only evidence we have in the case studies for changed behaviour as a result of a life event is in the case of marriage – where all account activity ceased for one of our men in the first years of his union.

The mixed quantitative and biographical approach that we use in our study of the Limehouse Savings Bank points to a promising alternative direction for historical savings bank research – one that reconnects savings bank history with the wider history of retail banking and allows for a much richer interplay between social history and financial history.

By looking at the patterns of use by the Limehouse account holders, it is possible to see the ways in which working families and individuals interacted with a standard product and standard service offering, sometimes adding layers of complexity in order to create a different banking product, or using the accounts to budget within a short-term cycle rather than saving for a significant purchase or event.


Further reading:

Horne, HO (1947) A History of Savings Banks, Oxford University Press.

Perriton, L, and J Maltby (2015) ‘Working-class Households and Savings in England, 1850-1880’, Enterprise and Society 16(2): 413-45.



How the Bank of England managed the financial crisis of 1847

by Kilian Rieder (University of Oxford)

Image: New Branch Bank of England, Manchester, antique print, 1847.

What drives a central bank’s decision to grant or refuse liquidity provision during a financial crisis? How does the central bank manage counterparty risk during such periods of high demand for liquidity, when time constraints make it hard to process all relevant information? How does a central bank juggle the provision of large amounts of liquidity with its monetary policy obligations?

All of these questions were live issues for the Bank of England during the financial crisis of 1847 just as they would be in 2007. My research uses archival data to shed light on these questions by looking at the Bank’s discount window policies in the crisis year of 1847.

The Bank had to manage the 1847 financial crisis despite a legal constraint on its monetary policy: the Bank Charter Act of 1844 required it to back any expansion of its note issue with gold. The 1847 crisis is often cited as the last episode of financial distress during which the Bank rationed central bank liquidity before fully assuming its role as a lender of last resort (Bignon et al, 2012).

We find that the Bank did not engage in any kind of simple threshold rationing, but rather monitored and managed its private sector asset holdings in ways similar to those central banks have developed since the financial crisis of 2007. In another echo of the recent crisis, the Bank of England also required an indemnity from the UK government in 1847, allowing it to supply more liquidity than the law permitted. This indemnity became part of the ‘reaction function’ in future financial crises.

Most importantly, the year 1847 witnessed the introduction of a sophisticated discount ledger system at the Bank. The Bank used the ledger system to record systematically its day-to-day transactions with key counterparties. Discount loan applicants submitted bills in parcels, sometimes containing a hundred or more, which the Bank would have to analyse collectively ‘on the fly’.

The Bank would reject those it didn’t like and then discount the remainder, typically charging a single interest rate. Subsequently, the parcels were ‘unpacked’ into individual bills in the separate customer ‘with and upon ledgers’, where they were classified under the name of their discounter and acceptor alongside several other characteristics at the bill level (drawer, place of origin, maturity, amount, etc.). By analysing these bills and their characteristics, we are better able to understand the Bank’s discount window policies.

We first find evidence that during crisis weeks the Bank was more likely to reject demands for credit from bill brokers – the money market mutual funds of their time – while favouring a small group of regular large discounters. Equally, firms associated with the commercial crisis and the corn price speculation in 1847 (many of which subsequently failed) were less likely to obtain central bank credit. The Bank was discerning about whom it lent to and the discount window was not entirely ‘frosted’ as suggested by Capie (2001).

But our findings support Capie’s main hypothesis that the decision whether to accept or reject a bill depended largely on individual bill characteristics. The Bank appeared to use a set of rules to decide on this, which it applied consistently in both crisis weeks and non-crisis weeks. Most ‘collateral characteristics’ – inter alia, the quality of the names endorsing a bill – were highly significant factors driving the Bank’s decision to reject.

This finding supports the idea that the Bank needed to be active in monitoring key counterparties in the financial system well before formal methods of supervision in the twentieth century, echoing results obtained by Flandreau and Ugolini (2011) for the later 1866 crisis.


Spinning the Industrial Revolution

by Jane Humphries (All Souls College, University of Oxford) and Benjamin Schneider (Merton College, University of Oxford)

The full paper is published in The Economic History Review.


The wages of hand spinners have been pushed to the forefront of economic history in recent years by Robert Allen’s ‘high-wage economy’ interpretation of British industrialization. Allen contends that rising earnings of hand spinners in the mid-18th century can explain why the spinning innovations of the industrial revolution were invented and adopted first in Britain. This is an extension of his broader argument that the ratio of wages to capital and energy costs in Britain was higher than in other parts of the world and served as a general spur to innovative activity. However, many gender historians and scholars of women’s work have contended that spinning, like other occupations dominated by women, was systematically underpaid. We set out to resolve this dispute by constructing a large dataset of spinners’ earnings from primary sources.

Spinning is the process by which raw fiber (cotton, flax, wool, or synthetic fibers) is turned into yarn. Hand spinners undertook this work on spinning wheels, imparting twist and draft into the fibers with their fingers. Qualitative sources suggest that spinning was a very common employment in the early modern period, especially for women and children. It was organized along the lines of the putting-out system, with many spinners receiving fiber from yarn merchants, spinning it in their homes, and returning the finished yarn in return for payment.

Fig. 01 Pieter Nys, Woman Spinning, 1652 (Dulwich Picture Gallery)

Allen presents a set of claims regarding spinners’ time rates which are taken from the work of Craig Muldrew and Charles Feinstein. Muldrew brought spinning into the limelight with a 2012 article that presented much larger estimates of the number of women employed in yarn production in the 17th and 18th centuries. However, the primary sources underlying Allen’s composite wage series are mostly claims about earning levels provided by commentators interested in emphasizing the value of Britain’s textile industry or showing the reduction in spinners’ earnings produced by mechanization at the end of the 18th century.

To address the question of spinners’ wages with firmer evidence, we collected more than 2500 observations of hand spinners’ earnings from the late 16th to the early 19th century. Spinners were generally paid by piece rates—payments per weight of yarn—which made constructing their remuneration challenging in many cases. In addition to sources that provided recorded earnings per day or a total amount earned over a known time span, we also collected data on their output per time in order to estimate their productivity. We then combined the productivity estimates with piece rates in primary sources to construct daily wages. These constructed wages supplemented the observed earnings per time and claims about remuneration. Our series incorporates the claims of interested parties, the writings of social commentators, and, more importantly, a very large body of new evidence on direct payments to spinners in the account books of putting-out enterprises, spinning schools, and the writings of putting-out merchants.

Fig 02
Figure 2: Nominal daily wages, decadal averages. Source: See online appendix S4 and table 2 in the paper

Our results show that Allen’s claim for high wages in spinning cannot be supported by the contemporary evidence. Productivity in hand spinning was far below the optimistic claims of social commentators, who wrote that a “sturdy woman” could spin a pound of fiber in a day. Direct evidence of spinners’ output shows that most of them produced less than half of this level each day. Unsurprisingly, we also find that earnings per day (even when discounting the constructed wages that use our productivity estimates with piece rates) were also substantially lower than previously claimed. Spinners barely earned enough to support themselves throughout most of the 17th and 18th centuries, and their wages did not rise precipitously in the middle of the 18th century.

At the same time, we know that cloth production expanded substantially in early modern England. We present several possible explanations to resolve this apparent paradox of rising labor demand and stagnant wages. First, we know that the geographical extent of spinning grew in the 18th century. Flax spinning, for example, spread further into the Scottish Highlands. We also provide extensive documentation regarding the involvement of the Poor Law and charitable enterprises in spinning. These entities allowed production to expand while taking advantage of the low wage demands of impoverished families, particularly in the countryside. Finally, we present evidence from contemporary descriptive sources that suggest most spinners faced monopsony power: the putters-out could act as a cartel and hold down spinners’ wages.

The growth of hand spinning provided modest but valuable household earnings for a growing number of poor families in 18th century Britain. However, spinning wages did not rise in the 18th century and therefore they cannot explain the invention of the spinning machines of the Industrial Revolution.




Delusions of competence: the near-death of Lloyd’s of London 1980-2002

by Robin Pearson (University of Hull) 
This paper was presented at the EHS Annual Conference 2019 in Belfast. 

Rapid structural change resulting from system collapse seems to be a less common phenomenon in insurance than in the history of other financial services. One notable exception is the crisis that rocked Lloyd’s of London, the world’s oldest continuous insurance market, in the late twentieth century.

Hitherto, explanations for the crisis have focused on catastrophic losses and problems of internal governance. My study argues that while these factors were important, they may not have resulted in institutional collapse had it not been for multiple delusions of competence among the various parties involved. 

Lloyd’s was a self-governing market that comprised investors – known as names – who put up their personal assets to back the insurance written on their behalf, and accepted unlimited individual liability for losses. Names were organised into syndicates led by an underwriter and a managing agency. Business could only be brought to syndicates by brokers licensed by Lloyd’s. Large broking firms owned most of the managing agencies and thereby controlled the syndicates, giving rise to serious conflicts of interest.  

In 1970, Lloyd’s resolved to expand capacity by lowering property qualifications for new names. As a result, the membership exploded from 6,000 to over 32,000 by 1988. Many new names were less well-heeled than their predecessors and largely ignorant of the insurance business. Despite a series of scandals involving underwriters siphoning off syndicate funds for their own personal use, the number of entrants kept rising thanks to double-digit investment returns, the tax advantages of membership, and aggressive recruiting.

While capacity was increasing, underwriters competed vigorously to write long-tail liability and catastrophe business in the form of excess loss (XL) reinsurance. Under these contracts, the reinsurer agreed to indemnify the reinsured in the event of the latter sustaining a loss in excess of a pre-determined figure. The reinsurer in turn usually retroceded (laid off) some of the amount reinsured to another insurer. 

Many Lloyd’s underwriters went into this market despite having little experience of the business. Some syndicates doing XL reinsurance retroceded to other XL syndicates, so that instead of the risks being dispersed, they circulated around the same market, becoming increasingly opaque and concentrated in a few syndicates. This became the infamous London Market Excess of Loss (LMX) spiral. 

By 1990, over one quarter of business at Lloyd’s was XL reinsurance. The spiral offered brokers, underwriters and managing agents the opportunity to earn commission and fees on every reinsurance and retrocession written. 

It also enabled underwriters to arbitrage the differential between the premiums they charged for the original insurance and the lower premiums they paid for reinsurance and retrocessions. A later inquiry also showed that those writing at the top of the spiral accepted, out of ignorance or carelessness, premium rates that were far too low for the higher layers, in the belief that these were virtually risk-free.

Unscrupulous underwriters could also offload the worst risks onto ‘dustbin’ syndicates of outsider names, while picking the best risks to be reinsured with so-called ‘baby’ syndicates of insiders. Poor information recording made it difficult to track the risks insured in the LMX spiral. 

Lloyd’s membership peaked in 1988, which also marked the first of five years of unprecedented losses. Long-tail risks on liability insurance generated many of the losses, as well as a series of storms, earthquakes, hurricanes, oil industry disasters and the Gulf war. Asbestosis and industrial pollution claims in the United States poured in, some from policies dating as far back as the 1930s.

The tsunami of claims overwhelmed Lloyd’s. Groups of names resisted calls and sued on the grounds that Lloyd’s market supervision had failed. Most political opinion moved towards accepting the need for fundamental reform, despite a fierce rearguard action from traditionalists.  

In 1993, for the first time in its history, Lloyd’s permitted the entry of corporate investors with limited liability, and these soon accounted for 80% of market capacity. The number of individual names collapsed. A vehicle was created – Equitas – to reinsure all liabilities incurred prior to 1993, funded by a levy on members. 

In 1996, Lloyd’s achieved a £3.1 billion settlement with its litigants. In 1998, the new Labour government announced that Lloyd’s would be independently regulated by the Financial Services Authority.

Studies of decision-making under uncertainty and the fallacies of experts are helpful in explaining the behaviour at Lloyd’s revealed by the crisis, which included arrogance, elitism, greed, corruption and stubborn resistance to reform in defence of vested interests. Politically entrenched ideas about the virtues of self-regulation, and an exaggerated faith in the ability of insider experts to know what was best for the institution, also played a role.

The practice of syndicate underwriters ‘following’ the premium rate set by a recognised ‘lead’ underwriter reinforced behavioural traits such as herding, the desire to avoid being an outlier in one’s predictions; ‘cognitive dissonance’, the inability to know the limits of one’s expertise; overconfidence and optimistic bias. 

The combined effect of these behaviours on XL underwriting at Lloyd’s was a heightened tendency to ignore ‘black swans’, the unknown or unimagined events that can deliver catastrophic losses. There are obvious parallels with the behaviour of investors in the market for sub-prime mortgage default risk, the collapse of which brought about the global financial crisis of 2007/08. 


Medieval Clothiers and their workers: an early ‘gig’ economy?

by John S. Lee (University of York)

The Medieval Clothier is published by Boydell Press. SAVE 25% when you order direct from the publisher – offer ends 31 March 2019. See below for details.


Dyers soaking red cloth in a heated barrel. Available at Wikimedia Commons

Casual wage-earners dependent on wealthy entrepreneurs for their work are not just a modern phenomenon. A new book by John S. Lee, The Medieval Clothier, charts the rise of clothiers in the period 1350-1550, and the innovative ways in which they organised their workforce.

Clothiers co-ordinated the different stages of textile production and found markets for their finished cloth. They increasingly managed all the stages of cloth-making, operating what historians have called the ‘putting-out’ system. In this method of organising work, clothiers put out raw materials for spinners, weavers, fullers and other cloth-workers to process. Clothiers paid these cloth-workers, who were often based in their own homes, on a piece-rate basis rather than a regular wage.

Like the modern ‘gig’ economy, the benefits of this system were hotly contested. Clothworkers enjoyed the independence to choose their own hours, and combine their craft with other activities; clothiers incurred no overheads for work done in the homes of their workers and benefitted from lower labour costs. When clothworkers protested in Suffolk in 1525, their spokesman, John Green, explained that the clothiers

‘give us so little wages for our workmanship that scarcely we be able to live, and thus in penury we pass the time, we our wives and children.’

Another, a group of weavers, accused ‘the rich men, the clothiers’ of setting a single price for their work. Others complained that clothiers reimbursed their workers in ‘pins, girdles, and other unprofitable wares’ rather than in cash. Clothiers were even accused in 1549 of paying poor labourers with ‘soap, candles, rotten cloth, stinking fish, and such like baggage’.

Local and national governments responded with ambivalence. On the one hand, cloth exports brought welcome revenue through customs. Governments were also aware though, that disruptions to overseas cloth sales created unemployment and unrest. The putting-out system relied on outworkers, whom the clothier only paid when work was available, and on keeping labour costs low. Following disruption to overseas markets, the government tried to prevent clothiers from laying off their workers. Even the king’s leading minister, Cardinal Wolsey, pressurised London merchants to continue buying cloth from the clothiers.

A few clothiers were able to amass great wealth from this industry, construct lavish mansions and erect elaborate church memorials, which can still be seen today. Thomas Paycocke’s house at Coggeshall, Essex, built to impress in 1509-10 with its stunning woodcarving and elaborate panelling, is now a National Trust property. The wealth of Thomas Spring III, ‘the rich clothier’ of Lavenham, Suffolk, caught the attention of the royal court’s poet, John Skelton, in 1522. The screen constructed to surround Spring’s tomb in Lavenham church in Suffolk engaged craftsmen familiar with commissions for the royal court.

Clothiers who profited from their trade often remembered their workers in their wills. Thomas Paycocke, who died in 1518, left bequests in his will to ‘my weavers, fullers and shearmen’. He gave additional sums for those ‘that have wrought me very much work’. Paycocke’s bequests to his workers, which totalled £4, may have stretched to as many as 240 workers, while those of Thomas Spring II of Lavenham, who died in 1486, may have supported nearly 4,000 workers. Both these clothiers operated large-scale production through the putting-out system, although exactly how large must remain a matter for discussion.

Cloth-making became England’s leading industry in the late Middle Ages – no other industry created as much employment or generated as much wealth. By the 1540s, as many as 1 in 7 of the country’s workforce were probably making cloth and 1 in 4 households were involved in spinning. This book offers the first modern survey of this hugely important trade and its practitioners, examining the clothiers and their impact within the industry and in their wider communities.

Intended for the general reader, as well as students and academics, this book is the first in a new series – Working in the Middle Ages – which will examine different trades, professions and industries. The series aims to provide authoritative, accessible guides to medieval trades and professions, offering surveys of their origins and development, alongside the practicalities of the occupation.

New proposals for the series are welcomed, and should be sent to the series editor, Dr James Davis, School of History, Queen’s University Belfast, or to the Editorial Director (Medieval Studies), Boydell and Brewer.


SAVE 25% when you order direct from the publisher using the offer code BB500. Offer ends 31 March 2019. Discount applies to print and eBook editions. Alternatively call Boydell’s distributor, Wiley, on 01243 843 291, and quote the same code.



Wages in the Middle Ages

by Jordan Claridge (London School of Economics)


Historical research on labour and wages has been an object of considerable attention in both industrial and post-industrial societies. Even in the contemporary period, issues surrounding work and remuneration, such as growing inequality and the gender pay gap, are regularly debated. Indeed, modern English society is currently dealing with the fallout of deindustrialisation in these areas, and the idea of a universal basic income is gaining traction.

But for the more distant past, understanding these issues often becomes a battle with shadows. My research uses a new method for computing real wages (income adjusted for cost of living) for agricultural labourers in medieval England.

An accurate understanding of these wages is critically important for our conception of historic economic development, especially as existing scholarship on medieval wage rates is incompatible with the most recent estimates of historical GDP, and therefore with our understanding of precisely how, when and why Western Europe grew rich while other parts of the world did not.

Current scholarship on wages and labour before 1500 tends to rely heavily on extrapolation and interpolation, and lacks systematic analyses grounded in precise evidence. My project employs a methodology that connects wage payments to precise data on the number of days worked by individual labourers and on the prices of the goods those same labourers needed to purchase, facilitating the creation of a wage series based entirely on accurate historical data.

The systematic analysis and quantification of wage levels for the medieval period has been frustrated by the relative scarcity of records or, even where records are plentiful, by the inconsistent or obscure ways in which wage levels are framed. As a result, current discussions of wages and labour before 1500 lack the bite of systematic analysis grounded in precise evidence, and this explains the divergence in results that we currently see in the literature.

My study attempts to break through this impasse by adopting a new method for determining the wage profile of workers on medieval English demesnes (the home farms of lords as against those of their tenants). It uses uniquely detailed agricultural accounts from these demesnes, which survive in tens of thousands for the period of this study (c. AD 1250 – AD 1450).

The method depends on connecting precise data on wages paid both in cash and ‘in kind’ in a manner that allows wages to be calculated without the distorting effect of proxy measurements. This approach promises to facilitate the creation of an accurate wage series for medieval England, based entirely on historical data both over region and over time and to allow surveys of the degree of both female and male labour evident in medieval demesne agriculture.
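The calculation described above can be sketched in a few lines. This is a minimal illustration with entirely hypothetical figures (the function name, wage components and basket cost are my own assumptions, not the project’s data): a labourer’s daily cash wage and an in-kind grain allowance, valued at a market price, are combined and divided by the daily cost of a consumption basket.

```python
def real_wage(cash_pence_per_day, grain_bushels_per_day,
              grain_price_pence_per_bushel, basket_cost_pence_per_day):
    """Daily real wage, expressed in consumption baskets per day worked.

    The in-kind grain component is valued at the market price, so cash
    and in-kind payments enter the nominal wage on the same footing.
    """
    nominal = (cash_pence_per_day
               + grain_bushels_per_day * grain_price_pence_per_bushel)
    return nominal / basket_cost_pence_per_day

# Hypothetical figures, for illustration only:
w = real_wage(cash_pence_per_day=1.5,
              grain_bushels_per_day=0.1,
              grain_price_pence_per_bushel=6.0,
              basket_cost_pence_per_day=1.0)
print(round(w, 2))  # 2.1
```

The point of the sketch is the design choice it embodies: because in-kind payments are converted at observed prices rather than through a proxy series, the resulting figure moves only with the underlying wage and price data.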

How well off were the occupants of early modern almshouses?

by Angela Nicholls (University of Warwick).

Almshouses in Early Modern England is published by Boydell Press. SAVE 25% when you order direct from the publisher – offer ends on the 13th December 2018. See below for details.


Almshouses, charitable foundations providing accommodation for poor people, are a feature of many towns and villages. Some are very old, with their roots in medieval England as monastic infirmaries for the sick, pilgrims and travellers, or as chantries offering prayers for the souls of their benefactors. Many survived the Reformation to be joined by a remarkable number of new foundations between around 1560 and 1730. For many of them their principal purpose was as sites of memorialisation and display, tangible representations of the philanthropy of their wealthy donors. But they are also some of the few examples of poor people’s housing to have survived from the early modern period, so can they tell us anything about the material lives of the people who lived in them?

Paul Slack famously referred to almspeople as ‘respectable, gowned Trollopian worthies’, and there are many examples to justify that view, for instance Holy Cross Hospital, Winchester, refounded in 1445 as the House of Noble Poverty. But these are not typical. Nevertheless, many early modern almshouse buildings are instantly recognisable, with the ubiquitous row of chimneys often the first indication of the identity of the building.


Burghley Almshouses, Stamford (1597)


Individual chimneys and, often, separate front doors are evidence of private domestic space, far removed from the communal halls of the earlier medieval period, or the institutional dormitories of the nineteenth-century workhouses which came later. Accommodating almspeople in their own rooms was not just a reflection of general changes in domestic architecture at the time, which placed greater emphasis on comfort and privacy, but represented a change in how almspeople were viewed and how they were expected to live their lives. Instead of living communally with meals provided, in the majority of post-Reformation almshouses the residents would have lived independently, buying their own food, cooking it themselves on their own hearth and eating it by themselves in their rooms. The hearth was important not only as the practical means of heating and cooking, but also as something central to questions of identity and social status. Together with individual front doors, these features gave occupants a degree of independence and autonomy; they enabled almspeople to live independently despite their economic dependence, and to adopt the appearance if not the reality of independent householders.


Stoneleigh Old Almshouses, Warwickshire (1576)


The retreat from communal living also meant that almspeople had to support themselves rather than have all their needs met by the almshouse. This was achieved in many places by a transition to monetary allowances or stipends with which almspeople could purchase their own food and necessities, but the existence and level of these stipends varied considerably. Late medieval almshouses often specified an allowance of a penny a day, which would have provided a basic but adequate living in the fifteenth century, but was seriously eroded by sixteenth-century inflation. Thus when Lawrence Sheriff, a London mercer, established in 1567 an almshouse for four poor men in his home town of Rugby, his will gave each of them the traditional penny a day, or £1 10s 4d a year. Yet with inflation, if these stipends were to match the real value of their late-fifteenth-century counterparts, his almsmen would actually have needed £4 5s 5d a year.[1]
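For readers unused to pre-decimal currency, the sums quoted above can be checked with a small helper, assuming only the standard ratios (£1 = 20s, 1s = 12d; the function names are mine):

```python
def to_lsd(pence):
    """Convert a total in pence to (pounds, shillings, pence)."""
    pounds, remainder = divmod(pence, 240)      # £1 = 240d
    shillings, pence_left = divmod(remainder, 12)  # 1s = 12d
    return pounds, shillings, pence_left

def from_lsd(pounds, shillings, pence):
    """Convert pounds, shillings and pence to a total in pence."""
    return pounds * 240 + shillings * 12 + pence

# Sheriff's stipend of £1 10s 4d is 364 pence a year; the
# inflation-adjusted £4 5s 5d is 1,025 pence -- roughly 2.8 times
# the nominal sum, showing how far inflation had eroded the
# traditional penny a day.
nominal = from_lsd(1, 10, 4)
adjusted = from_lsd(4, 5, 5)
print(nominal, adjusted, round(adjusted / nominal, 2))
```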

The nationwide system of poor relief established by the Tudor Poor Laws, and the survival of poor relief accounts from many parishes by the late seventeenth century, provide an opportunity to see the actual amounts disbursed in relief by overseers of the poor to parish paupers. From the level of payments made to elderly paupers no longer capable of work it is possible to calculate the barest minimum which an elderly person living rent free in an almshouse might have needed to feed and clothe themselves and keep warm.[2] Such a subsistence level in the 1690s equates to an annual sum of £3 17s, which can be adjusted for inflation and compared with a range of known almshouse stipends from the late sixteenth and seventeenth centuries.

The results of this comparison are interesting, even surprising. Using data from 147 known almshouse stipends in six different counties (Durham, Yorkshire, Norfolk, Warwickshire, Buckinghamshire and Kent) it seems that less than half of early modern almshouses provided their occupants with stipends which were sufficient to live on. Many provided no financial assistance at all.


The inescapable conclusion is that the benefits provided to early modern almspeople were in many cases only a contribution towards their subsistence. In this respect almshouse occupants were no different from the recipients of parish poor relief, who rarely had their living costs met in full.

Yet, even in one of the poorer establishments, almshouse residents had distinct advantages over other poor people. Principally these were the security of their accommodation, the permanence and regularity of any financial allowance, no matter how small, and the autonomy this gave them. Almshouse residents may also have had an enhanced status as ‘approved’, deserving poor. The location of many almshouses, beside the church, in the high street, or next to the guildhall, seems to have been purposely designed to solicit alms from passers-by, at a time when begging was officially discouraged.

SAVE 25% when you order direct from the publisher. Discount applies to print and eBook editions. Add the book to your basket and enter offer code BB500 in the box at the checkout. Alternatively call Boydell’s distributor, Wiley, on 01243 843 291 and quote the same code. Offer ends one month after the date of upload.



[1] Inflation index derived from H. Phelps Brown and S. V. Hopkins, A Perspective of Wages and Prices (London and New York, 1981) pp. 13-59.

[2] L. A. Botelho, Old Age and the English Poor Law, 1500 – 1700 (Woodbridge, 2004) pp. 147-8.

Wheels of change: skill-biased factor endowments and industrialisation in eighteenth century England

by Joel Mokyr (Northwestern University), Assaf Sarid (Haifa University), Karine van der Beek (Ben-Gurion University)

Shorrocks Lancashire Loom with a weft stop, The Museum of Science and Industry in Manchester. Available at Wikimedia Commons

The main manifestation of the industrial revolution taking place in Britain in the second half of the eighteenth century was the shift of textile production (that is, the spinning process) from a cottage-based manual system to a factory-based, capital-intensive system, with machinery driven by waterpower and later by steam.

The initial shift in production technology in the 1740s took place in all the main textile centres (the Cotswolds, East Anglia, and in the middle Pennines in Lancashire and the West-Riding). But towards the end of the century, as the intensity of production and the application of Watt’s steam engine increased, the supremacy of the cotton industry of the northwestern parts of the country began to show, and this is where the industrial revolution eventually took place and persisted.

Our research examines the role of factor endowments in determining the location of technology adoption in the English textile industry and its persistence since the Middle Ages. In line with recent research on economic growth, which emphasises the role of factor endowments on long run economic development, we claim that the geographical and institutional environment determined the location of watermill technology adoption in the production of foodstuffs.

In turn, the adoption of the watermill for grain grinding (around the tenth and eleventh centuries), affected the area’s path of development by determining the specialisation and skills that evolved, and as a result, its suitability for the adoption of new textile technologies, textile fulling (thirteenth and fourteenth centuries) and, later on, spinning (eighteenth century).

The explanation for this path dependence is that all these machines, like other machinery developed for various production processes (sawing mills, forge mills, paper mills and so on), were based on the same mechanical principles as the grinding watermills. Their implementation therefore required no additional resources or skills, and it was more profitable to invest in them, and to expand textile production, in places that were already specialised and experienced in the construction and maintenance of grinding watermills.

As textile exports expanded in the second half of the eighteenth century (both woollen and cotton textiles), Watt’s steam engine was introduced. The watermills that operated the newly introduced spinning machinery began to be replaced with the more efficient steam engines, and almost disappeared by the beginning of the nineteenth century. This stage of technological change took place in Lancashire’s textile centre, which enjoyed both the proximity of coal as well as of strong water flows, and was therefore suitable for the implementation of steam engine technology.

We use information from a variety of sources, including the Apprenticeship Stamp-Tax Records (eighteenth century), Domesday Book (eleventh century) and geographical databases, and show that the important English textile centres of the eighteenth century evolved in places that had more grinding watermills at the time of the Domesday Survey (1086).

To be more precise, we find that, on average, an area with three more watermills in 1086 had one additional textile merchant in 1710. The magnitude of this effect is notable given that an area had on average only 1.2 textile merchants (the maximum was 34).

We also find that textile centres in these areas persisted well into the eighteenth century and specialised in skilled mechanical human capital (measured by the number of apprentices to masters specialising in watermill technology, that is, wrights, in the eighteenth century), which was essential for the development, implementation and maintenance of waterpower as well as mechanical machinery.

The number of such workers increased from the 1750s in all the main textile centres until the 1780s, when it began to decline in Lancashire as the region adopted a new technology that no longer depended on their skills.

Revisiting the changing body

by Bernard Harris (University of Strathclyde)

The Society has arranged with CUP that a 20% discount is available on this book, valid until the 11th November 2018.

The last century has witnessed unprecedented improvements in survivorship and life expectancy. In the United Kingdom alone, infant mortality fell from over 150 deaths per thousand births at the start of the last century to 3.9 deaths per thousand births in 2014 (see the Office for National Statistics for further details). Average life expectancy at birth increased from 46.3 to 81.4 years over the same period (see the Human Mortality Database). These changes reflect fundamental improvements in diet and nutrition, and in environmental conditions.

The changing body: health, nutrition and human development in the western world since 1700 attempted to understand some of the underlying causes of these changes. It drew on a wide range of archival and other sources covering not only mortality but also height, weight and morbidity. One of our central themes was the extent to which long-term improvements in adult health reflected the beneficial effect of improvements in earlier life.

The changing body also outlined a very broad schema of ‘technophysio evolution’ to capture the intergenerational effects of investments in early life. This is represented in a very simple way in Figure 1. The Figure tries to show how improvements in the nutritional status of one generation increase its capacity to invest in the health and nutritional status of the next generation, and so on ‘ad infinitum’ (Floud et al. 2011: 4).

Figure 1. Technophysio evolution: a schema. Source: See Floud et al. 2011: 3-4.

We also looked at some of the underlying reasons for these changes, including the role of diet and ‘nutrition’. As part of this process, we included new estimates of the number of calories which could be derived from the amount of food available for human consumption in the United Kingdom between circa 1700 and 1913. However, our estimates contrasted sharply with others published at the same time (Muldrew 2011) and were challenged by a number of other authors subsequently. Broadberry et al. (2015) thought that our original estimates were too high, whereas both Kelly and Ó Gráda (2013) and Meredith and Oxley (2014) regarded them as too low.

Given the importance of these issues, we revisited our original calculations in 2015. We corrected an error in the original figures, used Overton and Campbell’s (1996) data on extraction rates to recalculate the number of calories, and included new information on the importation of food from Ireland to other parts of what became the UK. Our revised Estimate A suggested that the number of calories rose by just under 115 calories per head per day between 1700 and 1750, and by more than 230 calories between 1750 and 1800, with little change between 1800 and 1850. Our revised Estimate B suggested that there was a much bigger increase during the first half of the eighteenth century, followed by a small decline between 1750 and 1800 and a bigger increase between 1800 and 1850 (see Figure 2). However, both sets of figures were still well below the estimates prepared by Kelly and Ó Gráda, Meredith and Oxley, and Muldrew for the years before 1800.

Source: Harris et al. 2015: 160.

These calculations have important implications for a number of recent debates in British economic and social history (Allen 2005, 2009). Our data do not necessarily resolve the debate over whether Britons were better fed than people in other countries, although they do compare quite favourably with relevant French estimates (see Floud et al. 2011: 55). However, they do suggest that a significant proportion of the eighteenth-century population was likely to have been underfed.
Our data also raise some important questions about the relationship between nutrition and mortality. Our revised Estimate A suggests that food availability rose slowly between 1700 and 1750 and then more rapidly between 1750 and 1800, before levelling off between 1800 and 1850. These figures are still broadly consistent with Wrigley et al.’s (1997) estimates of the main trends in life expectancy and our own figures for average stature. However, it is not enough simply to focus on averages; we also need to take account of possible changes in the distribution of foodstuffs within households and the population more generally (Harris 2015). Moreover, it is probably a mistake to examine the impact of diet and nutrition independently of other factors.



Allen, R. (2005), ‘English and Welsh agriculture, 1300-1850: outputs, inputs and income’.

Allen, R. (2009), The British industrial revolution in global perspective, Cambridge: Cambridge University Press.

Broadberry, S., Campbell, B., Klein, A., Overton, M. and Van Leeuwen, B. (2015), British economic growth, 1270-1870, Cambridge: Cambridge University Press.

Floud, R., Fogel, R., Harris, B. and Hong, S.C. (2011), The changing body: health, nutrition and human development in the western world since 1700, Cambridge: Cambridge University Press.

Harris, B. (2015), ‘Food supply, health and economic development in England and Wales during the eighteenth and nineteenth centuries’, Scientia Danica, Series H, Humanistica, 4 (7), 139-52.

Harris, B., Floud, R. and Hong, S.C. (2015), ‘How many calories? Food availability in England and Wales in the eighteenth and nineteenth centuries’, Research in Economic History, 31, 111-91.

Kelly, M. and Ó Gráda, C. (2013), ‘Numerare est errare: agricultural output and food supply in England before and during the industrial revolution’, Journal of Economic History, 73 (4), 1132-63.

Meredith, D. and Oxley, D. (2014), ‘Food and fodder: feeding England, 1700-1900’, Past and Present, 222, 163-214.

Muldrew, C. (2011), Food, energy and the creation of industriousness: work and material culture in agrarian England, 1550-1780, Cambridge: Cambridge University Press.

Overton, M. and Campbell, B. (1996), ‘Production et productivité dans l’agriculture anglaise, 1086-1871’, Histoire et Mésure, 1 (3-4), 255-97.

Wrigley, E.A., Davies, R., Oeppen, J. and Schofield, R. (1997), English population history from family reconstitution, Cambridge: Cambridge University Press.