Land reform and agrarian conflict in 1930s Spain

Jordi Domènech (Universidad Carlos III de Madrid) and Francisco Herreros (Institute of Public Goods and Policies, Spanish National Research Council)

Government intervention in land markets is always fraught with potential problems. Because land is the main asset owned by households in predominantly agrarian contexts, intervention creates clearly demarcated groups of winners and losers. It can consequently trigger large, generally welfare-reducing changes in the behaviour of the main groups affected by reform, and it can leave policies poorly targeted towards their intended beneficiaries.

In this paper (available here), we analyse the impact of tenancy reform on Spanish land markets in the early 1930s. One of the problems of agricultural policy in 1930s Spain was adapting general laws to local and regional variation in land tenure patterns and to the heterogeneity of rural contracts. The interest of the Catalan case lies in the adaptation of a centralised tenancy reform, aimed at fixed-rent contracts, to the sharecropping contracts that predominated in Catalan agriculture. This was especially true of sharecropping contracts on vineyards, above all the customary rabassa morta contract, which had been subject to various legal changes in the late 18th and early 19th centuries. The 1930s are generally seen as the culmination of a long period of conflict between the so-called rabassaires (sharecroppers under rabassa morta contracts) and landowners.

The division between landowners and tenants was one of the central cleavages of 20th-century Catalonia, even in an area that had seen substantial industrialisation. In the early 1920s, work started on a Catalan law of rural contracts aimed especially at sharecroppers. A law passed on 21 March 1934 allowed the renegotiation of existing rural contracts and prohibited the eviction of tenants who had been under the same contract for less than six years. More importantly, it opened the door to forced sales of land to long-term tenants. These legislative changes posed a threat to the status quo, and the Spanish Constitutional Court ruled the law unconstitutional.

The comparative literature on the impact of land reforms argues that land reform, in this case tenancy reform, can in fact change agrarian structures. When property rights are threatened, landowners react by selling land or by interrupting existing tenancy contracts, mechanising and hiring labourers. Agrarian structure is therefore endogenous to existing threats to property rights. The extent of property-rights insecurity in 1930s Catalonia can be seen in the wave of litigation over sharecropping contracts: over 30,000 contracts were taken to the courts for revision in late 1931 and 1932, provoking satirical cartoons (Figure 1).

Figure 1. Revisions and the share of the harvest. Source: L’Esquella de la Torratxa, 2 August 1932, p. 11.
Translation: The rabassaire question. Peasant: ‘You sweat coming here to claim your part of the harvest; you would sweat far more if you had to grow it yourself.’

The first wave of petitions to revise contracts was overwhelmingly unsuccessful: the courts nullified most of them. This was most pronounced in the Spanish Supreme Court, which ruled against the sharecropper in most of the roughly 30,000 petitions for contract revision. Nonetheless, sharecroppers were protected by the Catalan autonomous government. The political context in which that government operated became even more charged in October 1934. That month, amid signs that the centre-right government was moving towards more reactionary positions, the Generalitat participated in a rebellion orchestrated by the Spanish Socialist Party (PSOE) and Left Republicans. In the ensuing suspension of civil liberties, landowners had a freer hand to evict unruly peasants. Under the new rules set by the new military governor of Catalonia, sharecroppers who did not surrender their harvest could be evicted straight away.

We use the number of completed and initiated tenant evictions from October 1934 to around mid-1935 as the main dependent variable in the paper. The data come from a report produced by the main Catalan tenant union, the Unió de Rabassaires (Rabassaires’ Union), published in late 1935 to publicise and denounce completed and attempted evictions.

Combining the spatial analysis of eviction cases with individual information on evictors and evicted, we can be reasonably confident about several facts concerning evictions and terminated contracts in 1930s Catalonia. Our data show that rabassa morta legacies were not the main determinant of evictions. About 6 per cent of terminated contracts were open-ended rabassa morta contracts (arbitrarily set at 150 years in the graph). About 12 per cent of evictions were linked to contracts longer than 50 years, which were probably oral contracts (since Spanish legislation had set a maximum of 50 years). Figure 2 gives the contract lengths of terminated and threatened contracts.

Figure 2. Histogram of contract lengths. Source: Own elaboration from Unió de Rabassaires, Els desnonaments rústics.

The spatial distribution of evictions is also consistent with the lack of historical legacies of conflict. Evictions were not more common in historical rabassa morta areas, nor were they typical of areas with a larger share of land planted with vines.

Our study substantially revises claims by unions and historians of very high levels of conflict in the Catalan countryside during the Second Republic. In many cases, there had been a long process of adaptation and fine-tuning of contractual forms to crops and to soil and climatic conditions, which increased the costs of altering existing institutional arrangements.

To contact the authors:

jdomenec@clio.uc3m.es

francisco.herreros@csic.es

EHS 2018 special: London’s mortality decline – lessons for modern water policy

Werner Troesken (University of Pittsburgh)
Nicola Tynan (Dickinson College)
Yuanxiaoyue (Artemis) Yang (Harvard T.H. Chan School of Public Health)


The United Nations Sustainable Development Goals aim to ensure access to water and sanitation for all. This means not just treating water but supplying it reliably. Lives are at stake because epidemiological research shows that a reliable, constant supply of water reduces water-borne illness.

The river Thames. Available at <https://heartheboatsing.com/2015/08/13/death-on-the-water/>

Nineteenth-century London faced the same challenge. Not until 1886 did more than half of London homes have water supplied 24 hours a day, 7 days a week. The move to a constant water supply reduced mortality: for every 5% increase in the number of households with a constant supply, deaths from water-borne illnesses fell by 3%.
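As a back-of-the-envelope reading of that estimate (our arithmetic, not a calculation reported in the study), the implied elasticity of water-borne mortality with respect to constant-supply coverage is roughly

\[
\varepsilon \approx \frac{-3\%}{+5\%} = -0.6,
\]

so, taken at face value, a 10% expansion in the number of households with constant supply would be associated with a fall of about 6% in deaths from water-borne illness.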

During Victoria’s reign, eight water companies supplied the metropolis with water: 50% from the river Thames, 25% from the river Lea and 25% from wells and springs. By the 1860s, the companies filtered all surface water and Bazalgette’s intercepting sewer was under construction. Still, more than 80% of people received water intermittently, storing it in cisterns often located outside the house, uncovered or beside the toilet.

Rapid population and housing growth required the expansion of the water network and companies found it easier to introduce constant service in new neighbourhoods. Retrofitting older neighbourhoods proved challenging and risked a substantial waste of scarce water. The Metropolis Water Act of 1871 finally gave water companies the power to require waste-limiting fixtures. After 1871, new housing estates received a constant supply of water immediately, while old neighbourhoods transitioned slowly.

As constant water supply reached more people, mortality from diarrhoea, dysentery, typhoid and cholera combined fell. With 24-hour supply, water was regularly available to everyone without the contamination risk that came with cistern storage. Unsurprisingly, poorer, crowded districts had higher mortality from water-borne diseases.

Even though treated, piped water was available to all by the mid-nineteenth century, everyone benefitted from the move to constant service. By the time the Metropolitan Water Board acquired London’s water infrastructure, 95% of houses in the city received their water directly from the mains.

According to Sergio Campus, water and sanitation head at the Inter-American Development Bank, the current challenge in many places is providing a sustainable and constant supply of water. In line with this, the World Bank’s new Water Supply, Sanitation, and Hygiene (WASH) poverty diagnostic has added frequency of delivery as a measure of water quality, in addition to access, water source and treatment.

Regularity of supply varies substantially across locations. London’s experience during the late Victorian years suggests that increased frequency of water supply has the potential to deliver further reductions in mortality in developing countries, beyond the initial gains from improved water sources and treatment.

EHS 2018 special: How the Second World War promoted racial integration in the American South

by Andreas Ferrara (University of Warwick)

African American and White Employees Working Together during WWII. Available at <https://www.pinterest.com.au/pin/396950154628232921/>

European politicians face the challenge of integrating the 1.26 million refugees who arrived in 2015. Integration into the labour market is often discussed as key to social integration but empirical evidence for this claim is sparse.

My research contributes to the debate with a historical example from the American South where the Second World War increased the share of black workers in semi-skilled jobs such as factory work, jobs previously dominated by white workers.

I combine census and military records to show that the share of black workers in semi-skilled occupations in the American South increased as they filled vacancies created by wartime casualties among semi-skilled whites.

A fallen white worker in a semi-skilled occupation was replaced by 1.8 black workers on average. This raised the share of African Americans in semi-skilled jobs by 10% between 1940 and 1950.
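To see how a replacement rate translates into a change in the black share of semi-skilled employment, consider a purely illustrative county (the numbers below are ours, not the paper’s) with \(N = 1{,}000\) semi-skilled workers, \(B = 100\) of them black, which loses \(C = 5\) white semi-skilled workers to the war:

\[
s_{\text{before}} = \frac{B}{N} = 10.0\%,
\qquad
s_{\text{after}} = \frac{B + 1.8C}{N - C + 1.8C} = \frac{109}{1{,}004} \approx 10.9\%,
\]

a relative rise of roughly 9%, in line with the order of magnitude reported above.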

Survey data from the South in 1961 reveal that this increased integration in the workplace led to improved social relations between black and white communities outside the workplace.

Individuals living in counties where war casualties brought more black workers into semi-skilled jobs between 1940 and 1950 were 10 percentage points more likely to have an interracial friendship, 6 percentage points more likely to live in a mixed-race neighbourhood, and 11 percentage points more likely to favour integration over segregation in general, as well as at school and at church. These positive effects are reported by both black and white respondents.

Additional analysis using county-level church membership data from 1916 to 1971 shows similar results. Counties where wartime casualties resulted in a more racially integrated labour force saw a 6 percentage point rise in the membership shares of churches that had already held mixed-race services before the war.

The church-related results are especially striking. In several of his speeches, Dr Martin Luther King stated that 11am on Sunday is the most segregated hour in American life. And yet my analysis shows that workplace exposure between two groups can overcome even strongly embedded social divides such as churchgoing, which is particularly important in the South, the so-called Bible Belt.

This historical case study of the American South in the mid-twentieth century, where race relations were often tense, demonstrates that excluding refugees from the workforce may be ruling out a promising channel for integration.

Currently, almost all European countries forbid refugees from participating in the labour market. Arguments put forward to justify this include fear of competition for jobs, concern about downward pressure on wages and a perceived need to deter economic migration.

While the mid-twentieth century American South is not Europe, the policy implication is to experiment more extensively with social integration through workplace integration measures. This not only concerns the refugee case but any country with socially and economically segregated minority groups.

WHEN ART BECAME AN ATTRACTIVE INVESTMENT: New evidence on the valuation of artworks in wartime France

by Kim Oosterlinck (Université Libre de Bruxelles)


Scene from the Degenerate Art auction, spring 1938, published in a Swiss newspaper; works by Pablo Picasso: Head of a Woman (lot 117) and Two Harlequins (lot 115). ‘Paintings from the degenerate art action will now be offered on the international art market. In so doing we hope at least to make some money from this garbage,’ wrote Joseph Goebbels in his diaries. From Wikipedia.

The art market in France during the Nazi occupation provided one of the best available investment opportunities, according to research published in the Economic Journal. Using an original database to recreate an art market price index for the period 1937-1947, the study shows that, in a risk-return framework, gold was the only serious alternative to art.
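One simple way to operationalise such a risk-return comparison is to annualise the mean and volatility of log returns from each price index and compare their ratio. The sketch below does this in Python on invented index values; the numbers, and the use of a plain mean-volatility ratio, are our assumptions for illustration, not the paper’s data or method.

```python
import numpy as np

# Hypothetical annual price indices, 1937-1947 (1937 = 100).
# Purely illustrative values -- not the study's reconstructed index.
art  = np.array([100, 103, 110, 130, 180, 260, 310, 330, 300, 280, 270])
gold = np.array([100, 102, 108, 125, 170, 240, 280, 300, 285, 275, 265])

def risk_return(index):
    """Mean and standard deviation of annual log returns."""
    r = np.diff(np.log(index))
    return r.mean(), r.std(ddof=1)

for name, idx in [("art", art), ("gold", gold)]:
    mu, sigma = risk_return(idx)
    print(f"{name}: mean {mu:.1%}, volatility {sigma:.1%}, ratio {mu/sigma:.2f}")
```

On this metric, an asset dominates when it offers a higher mean return per unit of volatility; the study’s point is that, among the assets available in occupied France, only gold came close to art on such a comparison.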

The research indicates that discretion, the inflation-proof character of art, the absence of market intervention and the possibility of reselling works abroad all played a crucial role in the valuation of artworks. Some investors were ready to go to the black market to acquire assets that could easily be resold abroad. But for those who preferred to stay on the side of legality, the art market provided an attractive alternative.

The author notes that the French art market during the occupation has been the subject of numerous publications. But most of these focus on the fate of looted artworks, with limited attention given to the art market itself.

What’s more, previous research on the economics of art usually considers artworks as a poor investment. But the case of occupied France shows that in extreme circumstances, artworks may prove extremely attractive investment vehicles.

During wartime, illegal activities and the risk of being forced to flee the country increased the appeal of ‘discreet assets’ – ones that allow the storage of a large amount of value in small and easily transportable goods.

By comparing the price index for small and large artworks, the new study establishes that investors were looking for smaller artworks, especially just before the German invasion and during the period 1942-1943, when the black market flourished.

Non-pecuniary motives for buying art, such as ‘conspicuous consumption’, are often thought of as playing an important role in art valuation. The new research analyses this point for occupied France by exploiting the distinction made by the Nazis between ‘degenerate’ and ‘non-degenerate’ artworks.

Pricing of ‘degenerate’ works was indeed affected by the impossibility of engaging in their conspicuous consumption. The price difference between these two categories of artworks is clear at the beginning of the occupation, when the Nazi policy towards ‘degenerate’ artworks held in France had not been clearly spelled out.

The difference gradually vanished as it became known that Hitler took a favourable view of French ‘artistic decadence’ and was not planning to have these works destroyed as long as they remained in France.

Discretion does not only concern artworks, the researcher notes. Other discreet assets, such as collectible stamps, also experienced sharp price increases during the Nazi occupation of France. Assets that are easy to transport and hide therefore have characteristics that are valued by some investors during troubled times.

The interest in discreet artworks goes beyond wartime. At any point, tax evaders may be willing to buy art or other discreet assets to hide illicit profits or to diminish their tax burden. As a result, when wealth and wealth inequality increase, so does demand for discreet assets.

Whereas previous research traditionally attributes these price increases to social competition, the new study suggests an alternative explanation: assets that facilitate tax evasion should fetch a higher price in an environment characterised by increasing wealth inequality. The research thus opens the door to a different interpretation of the high demand for artworks in Japan in the 1990s or in China today.

To contact the author: koosterl@ulb.ac.be

Modelling regional imbalances in English plebeian migration

by Adam Crymble (University of Hertfordshire)


John Thomas Smith, Vagabondiana, 1817

We often hear complaints of migrant groups negatively influencing British life. Grievances against them are many: migrants bring with them their language, their cultural values, and sometimes a tendency to stick together rather than integrate. The story is never that simple, but these issues can get under the skin of the locals, leading to tension. Britain has always been home to migrants, and the tensions are nothing new, but two hundred years ago those outsiders came from much closer to home. Often they came from just down the road, as close as the next parish over. And yet they were still treated as outsiders by the law. Under the vagrancy laws, poor migrants in particular ran the risk of being arrested, whipped, put to hard labour, and expelled back home.

It was a way to make sure that welfare was only spent on local people. But thanks to this system, we’ve got a unique way to tell which parts of Britain were particularly connected to one another, and which bits just weren’t that interested in each other. Each of those expelled individuals left a paper trail, and that means we can calculate which areas sent more or fewer vagrants to places like London than we would expect. And that in turn tells us which parts of the country had the biggest potential to impact on the culture, life, and economy of the capital.

As it happens, Bristol sent more paupers to London than anywhere else in England between 1777 and 1786: at least 312 individuals. They did not arrive through any plan to overwhelm the metropolis, but through hundreds of individual decisions by Bristolians who thought they’d have a go at London life.

From a migration perspective, this tells us that the connectedness between London and Bristol was particularly strong at this time. Even when we correct for factors such as distance, cost of living, and population, Bristol was still substantially over-sending lower class migrants to the capital.
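A minimal sketch of this kind of correction is a gravity-style count model: regress the number of migrants each town sends on its (log) distance to London and (log) population, then look for towns sending far more than the model predicts. Everything below, from the towns chosen to the counts and covariates, is invented for illustration; it is not the study’s data or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative data: vagrants expelled from London by origin town.
# All figures are invented for the sketch, not taken from the study.
df = pd.DataFrame({
    "origin":     ["Bristol", "Birmingham", "York", "Exeter", "Norwich"],
    "vagrants":   [312, 180, 40, 95, 60],
    "distance":   [190, 180, 320, 280, 180],      # km to London
    "population": [55000, 74000, 17000, 16000, 36000],
})

# Gravity-style Poisson model: expected senders grow with population
# and decay with distance (both entered in logs).
X = sm.add_constant(np.log(df[["distance", "population"]]))
fit = sm.GLM(df["vagrants"], X, family=sm.families.Poisson()).fit()

# Towns sending far more than the model predicts are 'over-senders'
# relative to what size and distance alone would explain.
df["expected"] = fit.fittedvalues
df["ratio"] = df["vagrants"] / df["expected"]
print(df[["origin", "vagrants", "expected", "ratio"]].round(2))
```

A ratio well above one flags the kind of over-sending described for Bristol; further covariates such as cost of living would enter the model the same way.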

There are many possible explanations for this close connection. The tendency for migrants to move towards larger urban centres meant that Bristolians, already living in one of England’s largest cities, had few ‘bigger’ destinations to choose from other than London. Improvements to the road network also meant that the trip was both cheaper and more comfortable by the 1780s. And the beginning of a general decline in Bristol’s domestic service economy was met with a rise in opportunities in the growing metropolis. Together, these factors may have made the connections between London and Bristol particularly strong.

Other urban pockets of the country showed a similarly strong connection to London, particularly in the West Midlands and the West Country. Birmingham, Coventry, Worcester, Bath, Exeter, and Gloucester were all sending peculiarly high numbers of paupers to eighteenth-century London. So too were Newcastle-upon-Tyne and Berwick-upon-Tweed, despite being located far to the north and almost certainly requiring a sea journey.

But not everywhere saw London as a draw. Yorkshire, Lincolnshire, Derbyshire, and Cheshire – a band of counties within walking distance of the sprouting mills of the industrialising North – all sent fewer people to London than we would expect. This suggests that the North was able to retain people, uniquely acting as a competitor to London at this time. It also means that places like Bristol and Newcastle-upon-Tyne may have had a bigger impact on the culture of the metropolis in the eighteenth century than places such as York and Sheffield. And that may have had a lasting impact that we do not yet fully understand. Each of these migrants brought with them remnants of their local culture and belief systems: recipes, phrases, and mannerisms, as well as connections to people back home, all of which may mean that the London of today is a bit more like Bristol or Newcastle than it might otherwise have been. There is more research to be done, but with a clear map of how London was and was not connected to the rest of the country, we can now turn to understanding how those connections sculpted the country.

To contact the author on Twitter: @adam_crymble

Late Marriage as a Contributor to the Industrial Revolution in England

by James Foreman-Peck and Peng Zhou (Cardiff University)

A Wedding at St George’s Church in London. Source: http://www.abc.net.au/news/2017-04-17/wedding-at-st-georges-church-in-london/8443430

A central question of economics is why some nations experienced economic growth and are now rich, while others have not and are poor. We go some way towards answering this core question by estimating and testing a model of the English economy beginning four or five centuries before the first Industrial Revolution. Western Europe experienced the earliest modern economic growth and also showed a uniquely high female age at first marriage (around 25) from the 15th century at the latest. Although real wages did begin a sustained rise during the first Industrial Revolution, without the contribution of late marriage average living standards in England would not have risen by 1870.

We utilise long time-series evidence, some dating back to 1300, to test the hypothesis that this West European Marriage Pattern was an essential reason for England’s precocious economic development. Persistently high mortality in the 14th and 15th centuries, and massive mortality shocks such as the Black Death, lowered life expectancy. Subsequently, as survival chances improved, especially for children, a given completed family size could be achieved with a smaller number of births. In an environment without artificial birth control, a rise in the female age at first marriage ensured this reduction in fertility.

Later marriage not only constrained the number of births but also provided greater opportunities for female informal learning, especially through ‘service’. A high proportion of unmarried females between the ages of 15 and 25 left home and worked elsewhere, instead of bearing children, as in other societies. This widened female horizons compared with a passage from the parental household directly into demanding motherhood and housekeeping. Throughout this period the family was the principal institution for educating and training future workers. Schooling was not compulsory until 1880 in England. In the early nineteenth century few children attended any school regularly and few remained at school for more than one and a half years. Such skills and work discipline as were learned were passed on and built up over the generations primarily by the family. Our paper shows how, over the centuries, the gradual rise of this human capital raised productivity and eventually brought about the Industrial Revolution.

Over past centuries, marriage and the family were an important engine of economic growth. Whether they can still make any comparable contribution in an economy where the state has assumed so much responsibility for education and training remains an open question.


To contact the authors:

James Foreman-Peck, Cardiff Business School, Cardiff University, CF10 3EU (foreman-peckj@cardiff.ac.uk, tel: 07947 031945)

Peng Zhou, Cardiff Business School, Cardiff University, CF10 3EU (ZhouP1@cardiff.ac.uk)

Patterns of rural infant mortality

By Paul Atkinson (University of Liverpool) – research conducted at Lancaster University thanks to ERC funding.

This work looked at the variation in infant mortality across time and place in the country districts of England and Wales between 1851 and 1911. It used statistical methods to find patterns in data from nearly 90% of rural places, showing that, far from being one undifferentiated whole, the countryside was divided into zones with their own infant mortality trends. Broadly, infant mortality in the 1850s was worst in an eastern zone of England, but improved fastest there. Across a large zone of south and central England, infant mortality was somewhat lower in the 1850s (especially in the far south) but dropped somewhat more slowly. In northern and western England, and in Wales, infant mortality began at lower levels than in the rest of the country but stagnated or even increased, above all in the remotest districts.
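One way to recover such zones from the data is to cluster districts on their infant mortality trajectories, so that places with similar levels and trends group together. The sketch below does this with k-means on invented decadal series; the district names, figures and choice of three clusters are illustrative assumptions, not the study’s data or exact method.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical decadal infant mortality rates (deaths per 1,000 births),
# 1850s-1900s, for six registration districts. Illustrative values only.
districts = ["Ely", "Thetford", "Dorchester", "Ludlow", "Penrith", "Bala"]
imr = np.array([
    [170, 165, 158, 148, 134, 118],   # high but falling fast (eastern)
    [166, 161, 155, 146, 131, 117],
    [140, 138, 134, 129, 121, 111],   # lower, slower fall (south-central)
    [138, 137, 133, 128, 122, 112],
    [110, 112, 115, 118, 121, 119],   # low but stagnating (north/west)
    [106, 108, 112, 116, 120, 118],
])

# Group districts with similar trajectories into three zones.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(imr)
for district, zone in zip(districts, labels):
    print(district, "-> zone", zone)
```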


How infant mortality changed in seven clusters of Registration Districts (for their locations, see map). The eastern zone is made up of Fenland and Mercia; Wessex, Severn and Trent form the south-central zone; and Heath and Moor and Upland the final zone.


The obvious question is: what made these patterns? Mainly factors different from those operating in towns, where the combination of crowding and poor sanitation made diarrhoeal disease the major killer, and where falling fertility was associated with decreasing infant mortality. This research statistically identified three other factors associated with infant mortality across time.

First, maternal health – plainly a factor in towns as well, but partly masked there by stronger influences. This work confirms – using a much larger dataset – Millward and Bell’s finding that the mortality of females from tuberculosis at reproductive ages, a good indicator of their general health, predicted infant mortality, explaining about a quarter of the variation in it. So, what makes mothers sick makes babies sick: probably poor nutrition above all, though we could not test that directly.

Second, maternal education, again relevant in towns too, but obscured there. Horrell, Oxley and Humphries have shown how a disadvantaged status within the household for women could produce excess female mortality: the research extends this argument to their babies. Literate women had higher status and more access to resources including food. What makes mothers vulnerable – in our study, their illiteracy – makes babies vulnerable. Female literacy predicted about a sixth of the variation in infant mortality.

The third factor linked with rising infant mortality in this period was remoteness, measured as the distance from the centre of each district to London. This was not just a characteristic of very remote locations but applied at all distances above 100km. Exactly why infant mortality in the remotest places improved most slowly – and even went backwards until the 1890s – is not very clear. This research argues that it was a mixture of three things: large-scale out-migration stripping regions of their healthier inhabitants; possibly, the gradual way new ideas about infant care diffused from the biggest cities into the country; and, probably, features of rural social organisation. We argue elsewhere that the general trend to force women out of the agricultural labour market across the later nineteenth century excluded them from forms of labour that had benefited their status, and their babies’ welfare, in the northern and western upland, pastoral farming areas, though such labour had harmed them in the arable south and east.
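The ‘share of variation explained’ claims above correspond to the R² of simple regressions of district infant mortality on each factor. Below is a minimal sketch of that exercise on simulated data; the variable definitions and all values are our assumptions, not the paper’s dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated district-level data; all values are illustrative.
# imr: infant deaths per 1,000 births; tb: female TB mortality at
# reproductive ages; literacy: share of literate women; dist: km to London.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "tb": rng.normal(3.0, 0.8, n),
    "literacy": rng.uniform(0.4, 0.95, n),
    "dist": rng.uniform(20, 500, n),
})
df["imr"] = (80 + 15 * df["tb"] - 40 * df["literacy"]
             + 0.05 * np.maximum(df["dist"] - 100, 0)
             + rng.normal(0, 10, n))

# R^2 of one-regressor models: the 'share of variation explained'.
for var in ["tb", "literacy", "dist"]:
    r2 = smf.ols(f"imr ~ {var}", data=df).fit().rsquared
    print(var, round(r2, 2))
```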

This amounts to an argument for two things: attention to ‘the mother as medium’ when explaining infant mortality rates, and attention to the diversity and particularity of local economies and cultures as we study the countryside of the past.


The full paper is available here

To contact the author: @PaulAtk43202349


Five hundred years of French economic stagnation: from Philippe Le Bel to the Revolution, 1280-1789

by Leonardo Ridolfi (IMT School for Advanced Studies Lucca)

In 2008, output per capita in France amounted to around $22,000 per year. In 1950, after the Second World War, annual average income per capita reached $5,000, while in 1820, at the beginning of the first official national statistics, GDP per capita averaged $1,100 (Maddison, 2010). But precise knowledge of economic growth in France stops at 1820; before that date, the quantitative reconstruction of economic development is shrouded in mystery.

The mystery lies in the difficulty of uncovering sufficient source material, devising adequate measures of past economic performance, and ultimately interpreting the complexity of the dynamics involved. These dynamics stretch far beyond the economic sphere and concern the way a society is itself organised and structured. Several questions spring to mind.

What was the level of material living standards between the thirteenth and the late eighteenth century, from the early stages of state formation to the French Revolution? How did per capita incomes evolve over time? And were French workers richer or poorer than their European counterparts during the pre-industrial period?

This research provides answers to these questions by estimating the first long-run series of output per capita for France from 1280 to 1789.

The study reveals one important conclusion: the dominant pattern was stagnation in levels of output per capita. For the first time, these estimates document quantitatively, and in the aggregate, what was previously known only qualitatively or for particular regions from the classic works of French historiography (Goubert, 1960; Le Roy Ladurie, 1966): the French economy was an inherently stagnant, growthless system, a ‘société immobile’, which at the beginning of the eighteenth century was not much different from what it had been five centuries earlier.

At the time of the death of King Philip the Fair in 1314, France was a leading economy in Europe and output per capita averaged $900 per year. Almost five centuries later, this threshold was largely unchanged, but the France of King Louis XVI now belonged to the group of the least developed countries in Western Europe. In the 1780s, per capita income was slightly above $1,000, about half the level registered in England and the Low Countries.

Nevertheless, stagnation was not the same as stability. The French economy was highly volatile and experienced multiple peaks and troughs. Moreover, these results reject the argument that there was no long-run improvement in living standards before the Industrial Revolution, demonstrating that GDP per capita rose by more than 30% between the 1280s and the 1780s.
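Spread over five centuries, though, that cumulative gain is minute in annual terms; a quick calculation (ours, for perspective) gives the implied average growth rate:

\[
(1+g)^{500} = 1.3
\quad\Longrightarrow\quad
g = 1.3^{1/500} - 1 \approx 0.0005,
\]

that is, roughly 0.05% a year, far below the growth rates of industrial economies.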

Yet most of that rise is explained by a single episode of economic growth, which took place before the Black Death, between the 1280s and the 1340s, and shifted the trajectory of growth onto a higher path.

Overall, these estimates suggest that the evolution of the French economy can be suitably interpreted as an intermediate case between the successful examples of England and the Low Countries and the declining patterns of Italy and Spain. Neither a southern country nor a northern one, France seems to reflect this geographical heterogeneity in its growth experience.


References

Goubert, Pierre (1960) Beauvais et le Beauvaisis de 1600 à 1730: contribution à l’histoire sociale de la France du 17e siècle. Paris: SEVPEN.

Le Roy Ladurie, Emmanuel (1966) Les paysans de Languedoc, Vol. 1. Paris: Mouton.

Maddison, Angus (2010) Historical Statistics of the World Economy: 1-2008 AD. Paris.

Safe-haven asset: property speculation in medieval England

by Adrian Bell, Chris Brooks and Helen Killick (ICMA Centre, University of Reading)

Neuadd y Penrhyn (Penrhyn Hall)

While we might imagine the medieval English property market to have been predominantly ‘feudal’ in nature and therefore relatively static, this research reveals that in the fourteenth and fifteenth centuries, it demonstrated increasing signs of commercialisation.

The study, funded by the Leverhulme Trust, finds that a series of demographic crises in the fourteenth century caused an increase in market activity, as opportunities for property ownership were opened up to new sections of society.

Chief among these was the Black Death of 1348-50, which wiped out over a third of the population. In contrast with previous research, this study shows that after a brief downturn in the immediate aftermath of the plague, the English market in freehold property experienced a surge in activity: between 1353 and 1370, the number of transactions per year almost doubled.

The Black Death prompted aristocratic landowners to dispose of their estates, as the high death toll meant that they no longer had access to the labour with which to cultivate them. At the same time, the gentry and professional classes sought to buy up land as a means of social advancement, resulting in a widespread downward redistribution of land.

In light of the fact that during this period labour shortages made land much less profitable in terms of agricultural production, we might expect property prices to have fallen.

Instead, this research demonstrates that this was not the case: the price of freehold land remained robust and certain types of property (such as manors and residential buildings) even rose in value. This is attributed to the fact that increasing geographical and social mobility during this period allowed for greater opportunities for property acquisition, and thus the development of property as a commercial asset.

This is indicated by changes in patterns of behaviour among buyers. The data suggest that an increasing number of people from the late fourteenth century onwards engaged in property speculation – in other words, purchase for the purposes of investment rather than consumption.

These investors purchased multiple properties, often at a considerable distance from their place of residence, and sometimes clubbed together with other buyers to form syndicates. They were often wealthy London merchants, who had acquired large amounts of disposable capital through their commercial activities.

The commodification of housing is a subject that has been much debated in recent years. By exploring the origins of property as an ‘asset class’ in the pre-modern economy, this research draws inevitable comparisons with the modern context: in medieval times, much as now, ‘bricks and mortar’ were viewed as a secure financial investment.

Child labour in 18th century England: evidence from the Foundling Hospital

by Alice Dolan (University of Hertfordshire)

Foundling Hospital: Captain Coram and several children, the latter carrying implements of work, a church and ships in the distance. Steel engraving by H. Setchell after W. Hogarth. Wellcome Images.

Every few years, a child labour scandal in the clothing industry hits the British press, provoking widespread public condemnation. This reaction is a modern phenomenon: 250 years ago, child labour in textile production was commonplace, not worthy of a headline.

Attitudes changed in the nineteenth century, leading to the passing of the 1833 Factory Act and 1842 Mines Act. But before this change, child labour was believed to have positive benefits for children.

One notable example was the Foundling Hospital, a charitable institution that supported abandoned children and was a keen believer in the benefits of child labour. The Hospital sought to produce upright citizens who would be able to support themselves as adults.

A key aim of the Hospital was therefore to train children to be ‘industrious’ from a young age. One governor wrote that the Hospital aimed ‘to give [the Foundlings] an early Turn to Industry by giving them constant employment’. This ‘Turn’ would train the children into economic self-sufficiency, stopping them from relying on parish poor relief as adults.

The Foundling Hospital opened its doors in 1741. Parliament recognised the value of its work and funded the acceptance of all children presented to it aged 12 months or under over the period 1756-60. This ‘General Reception’ brought 14,934 children into the Hospital.

The London Hospital could not cope with these unprecedentedly high numbers, and new branches were founded, including one at Ackworth in Yorkshire, which received 2,664 children between 1757 and 1772. Ackworth closed after Parliament, regretting its generosity, stopped funding the General Reception generation in 1771.

Thousands of children required thousands of uniforms and Ackworth chose to make as many garments as possible in-house. On-site production both trained children to be industrious and offered financial benefits for the Hospital. Work completed on-site was cheap and reliable, and there was greater quality control.

The Ackworth ‘manufactory’ produced woollen cloth. The children prepared the fibre for spinning, spun it and wove the yarn into cloth that was worn by their peers at Ackworth and sold to the London Hospital and externally. Some cloth manufacturing work was outsourced, particularly finishing processes that required a higher level of skill.

Few concessions were made for the age of the makers, and the London branch criticised, and sent back, orders considered to be of insufficient quality or inappropriate size. These were primarily business rather than charitable transactions.

The skill division also applied to the making of clothing. Underwear, stockings and girls’ clothing were made in-house because this was less skilled work. Garments were produced in high volumes: from 1761 to 1770, the children made 13,442 pieces of underwear (shirts and shifts) and 19,148 pairs of stockings.

Tasks such as tailoring, and hat and shoe making required long apprenticeships to develop the necessary skill – this work was therefore outsourced. But external supply had its problems. It was difficult to source enough garments for the hundreds of children at the branch. Products were more expensive because labour was not free and the Hospital had little influence on suppliers’ timeframes.

Foundlings started work young, at the age of 4 or 5, and continued to work throughout their residence at the Hospital. Despite this, they were luckier than their peers in the workhouse, who endured worse conditions.

Many parents chose to send their children to the Foundling Hospital to give them better life chances through the greater educational and apprenticeship opportunities offered. Putting the children to work, which seems cruel to us, was a key educational strategy to help them achieve economic independence in adulthood. Its financial and logistical benefits were welcome too.