Early-life disease exposure and occupational status

by Martin Saavedra (Oberlin College and Conservatory)

This blog is part H of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. This blog is based on the article ‘Early-life disease exposure and occupational status: The impact of yellow fever during the 19th century’, in Explorations in Economic History, 64 (2017): 62-81. https://doi.org/10.1016/j.eeh.2017.01.003   

 

A girl suffering from yellow fever. Watercolour. Available at Wellcome Images.

Epidemics, like other shocks to public health, have the potential to affect human capital accumulation. A literature in health economics known as the ‘fetal origins hypothesis’ has examined how in utero exposure to infectious disease affects labor market outcomes. Individuals may be more sensitive to health shocks during the developmental stage of life than during later stages of childhood. For good reason, much of this literature focuses on the 1918 influenza pandemic, which was a huge shock to mortality and one of the few events that can be observed directly in life expectancy trends for the United States. However, there are limitations to studying the 1918 influenza pandemic because it coincided with the First World War. Another complication in this literature is that cities with outbreaks of infectious disease often engaged in many forms of social distancing by closing schools and businesses. This is true of the 1918 influenza pandemic, but also of other diseases: for example, many schools were closed during the polio epidemic of 1916.

So, how can we estimate the long-run effects of infectious disease when cities simultaneously respond to outbreaks? One possibility is to look at a disease that differentially affected some groups within the same city, such as yellow fever during the nineteenth century. Yellow fever is a viral infection spread by the Aedes aegypti mosquito and is still endemic in parts of Africa and South America. The disease kills roughly 50,000 people per year, even though a vaccine has existed for decades. Symptoms include fever, muscle pain, chills, and jaundice, from which the disease derives its name.

During the eighteenth and nineteenth centuries, yellow fever plagued American cities, particularly port cities that traded with the Caribbean. In 1793, over 5,000 Philadelphians likely died of yellow fever. This would be a devastating toll in any city, even by today’s standards, but it is even more so considering that in 1790 Philadelphia had a population of less than 29,000.

By the mid-nineteenth century, Southern port cities had grown, and yellow fever stopped occurring in cities as far north as Philadelphia. The graph below displays the number of yellow fever fatalities in four southern port cities — New Orleans, LA; Mobile, AL; Charleston, SC; and Norfolk, VA — during the nineteenth century. Yellow fever was sporadic, devastating a city in one year and often leaving it untouched in the next. For example, yellow fever killed nearly 8,000 New Orleanians in 1853, and over 2,000 in both 1854 and 1855. In the next two years, yellow fever killed fewer than 200 New Orleanians per year; it then returned, killing over 3,500 in 1858. Norfolk, VA, was struck only once, in 1855. Since yellow fever had never struck Norfolk during milder years, the population lacked immunity, and approximately 10 percent of the city’s population died in 1855. Charleston and Mobile show similarly sporadic patterns. Likely due to the Union’s naval blockade, yellow fever did not visit any American port city in large numbers during the Civil War.

 

Yellow fever fatalities in New Orleans, Mobile, Charleston, and Norfolk during the nineteenth century.
Source: As per original article.

 

Immigrants were particularly prone to yellow fever because they often came from European countries rarely visited by yellow fever. Native New Orleanians, however, typically caught yellow fever during a mild year as children and were then immune to the disease for the rest of their lives. For this reason, yellow fever earned the name the “stranger’s disease.”

Data from the full count of the 1880 census show that yellow fever fatality rates during an individual’s year of birth negatively affected adult occupational status, but only for individuals with foreign-born mothers. Those with US-born mothers were relatively unaffected by the disease. There are also effects for those exposed to yellow fever one or two years after birth, but there are no effects, not even for those with immigrant mothers, from exposure three or four years after birth. These results suggest that early-life exposure to infectious disease, not just city-wide responses to disease, influences human capital development.
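For readers curious about what such an analysis looks like in practice, the sketch below illustrates the general shape of an interaction specification of this kind. It is only an illustration, not the article’s actual model: the file name, variable names, and fixed effects are hypothetical.

    # Hypothetical sketch of an interaction design: adult occupational status
    # regressed on the city-level yellow fever fatality rate in the year of
    # birth, interacted with an indicator for having a foreign-born mother.
    # All names below are illustrative, not the article's actual data.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("linked_1880_census_sample.csv")  # hypothetical file

    # occ_score: adult occupational status score
    # yf_rate_birth_year: yellow fever fatality rate in the year of birth
    # foreign_mother: 1 if the individual's mother was foreign-born
    model = smf.ols(
        "occ_score ~ yf_rate_birth_year * foreign_mother"
        " + C(city) + C(birth_year)",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["city"]})

    # The coefficient on the interaction term captures the differential
    # effect of early-life exposure for children of immigrant mothers.
    print(model.summary())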

 


 

Martin Saavedra

Martin.Saavedra@oberlin.edu

 

Give Me Liberty Or Give Me Death

by Richard A. Easterlin (University of Southern California)

This blog is  part G of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article from this blog is “How Beneficent Is the Market? A Look at the Modern History of Mortality.” European Review of Economic History 3, no. 3 (1999): 257-94. https://doi.org/10.1017/S1361491699000131

 

A child is vaccinated, Brazil, 1970.

Patrick Henry’s memorable plea for independence unintentionally also captured the long history of conflict between the free market and public health, evidenced in the United States’ current struggle with the coronavirus. Efforts to contain the virus have centered on measures to forestall transmission of the disease, such as stay-at-home orders, social distancing, and avoiding large gatherings, each of which infringes on individual liberty. These measures have given birth to a resistance movement objecting to violations of one’s freedom.

My 1999 article posed the question “How Beneficent is the Market?” The answer, based on “A Look at the Modern History of Mortality”, was straightforward: because of the ubiquity of market failure, public intervention was essential to achieve control of major infectious disease. This intervention centered on the creation of a public health system. “The functions of this system have included, in varying degrees, health education, regulation, compulsion, and the financing or direct provision of services.”

Regulation and compulsion, and the consequent infringement of individual liberties, have always been critical building blocks of the public health system. Even before the formal establishment of public health agencies, regulation and compulsion were features of measures aimed at controlling the spread of infectious disease in mid-19th century Britain. The “sanitation revolution” led to the regulation of water supply and sewage disposal and, in time, to the regulation of slum building conditions. As my article notes, there was fierce opposition to these measures:

“The backbone of the opposition was made up of those whose vested interests were threatened: landlords, builders, water companies, proprietors of refuse heaps and dung hills, burial concerns, slaughterhouses, and the like … The opposition appealed to the preservation of civil liberties and sought to debunk the new knowledge cited by the public health advocates …”

The greatest achievement of public health was the eradication of smallpox, the one disease that has been eliminated from the face of the earth. Smallpox was the scourge of humankind until Edward Jenner’s discovery of a vaccine in 1798. Throughout the 19th and 20th centuries, requirements for smallpox vaccination were fiercely opposed by anti-vaccinationists. In 1959 the World Health Organization embarked on a program to eradicate the disease. Over the ensuing two decades its efforts to persuade governments worldwide to require vaccination of infants were eventually successful, and in 1980 the WHO officially declared the disease eradicated. Eventually public health triumphed over liberty, but it took almost two centuries to realize Jenner’s hope that vaccination would annihilate smallpox.

In the face of the coronavirus pandemic, the U.S. market-based health care system has demonstrated once again the inability of the market to deal with infectious disease, and the need for forceful public intervention. The current health care system requires that:

 “every player, from insurers to hospitals to the pharmaceutical industry to doctors, be financially self-sustaining, to have a profitable business model. It excels in expensive specialty care. But there’s no return on investment in being positioned for the possibility of a pandemic” (Rosenthal 2020).

Commercial and hospital labs were slow to respond to the need to develop a test for the virus. Once tests became available, conducting them was handicapped by insufficient testing capacity: kits, chemical reagents, swabs, masks, and other personal protective equipment were in short supply, as were ventilators in hospitals. These deficiencies reflected the lack of profitability in responding to these needs, and a government reluctant to compensate for market failure.

At the current time, the halting efforts of federal public health authorities and state and local public officials to impose quarantine and “shelter at home” measures have been seriously handicapped by public protests over infringement of civil liberties, reminiscent of the dissidents of the 19th and 20th centuries and their current-day heirs. States are opening for business well in advance of the guidelines of the Centers for Disease Control. The lesson of history regarding such actions is clear: The cost of liberty is sickness and death. But do we learn from history? Sadly, one is put in mind of Warren Buffett’s aphorism: “What we learn from history is that people don’t learn from history.”

 

Reference

Rosenthal, Elizabeth, “A Health System Set Up to Fail”, New York Times, May 8, 2020, p. A29.

 

To contact the author: easterl@usc.edu

Airborne diseases: Tuberculosis in the Union Army

by Javier Birchenall (University of California, Santa Barbara)

This is Part F of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article from this blog was published in Explorations in Economic History and is available here.

1910 advertising postcard for the National Association for the Prevention of Tuberculosis. 

Tuberculosis (TB) is one of the oldest and deadliest diseases. Traces of TB in humans can be found as early as 9,000 years ago, and written accounts date back 3,300 years in India. Untreated, TB’s case-fatality rate is as high as 50 percent. It was a dreaded disease. TB is an airborne disease caused by the bacterium Mycobacterium tuberculosis. Tuberculosis spreads through the air when a person who has an active infection coughs, sneezes, speaks, or sings. Most cases remain latent and never develop symptoms. Activation of tuberculosis is particularly influenced by undernutrition.

Tuberculosis played a prominent role in the secular mortality decline. Of the 27 years of life expectancy gained in England and Wales between 1871 and 1951, TB accounts for about 40 percent of the improvement, a 12-year gain. Modern medicine, the usual suspect used to explain this mortality decline, could not have been the culprit. As Thomas McKeown famously pointed out, TB mortality started its decline long before the tubercle bacillus was identified and long before an effective treatment was available (Figure 1). McKeown viewed improvements in economic and social conditions, especially improved diets, as the principal factor in combatting tuberculosis. A healthy diet, however, is not the only factor behind nutritional status. Infections, no matter how mild, reduce nutritional status and increase susceptibility to further infection.

Figure 1. Mortality rate from TB.


Source: as per original article

In “Airborne Diseases: Tuberculosis in the Union Army” I studied the determinants of diagnosis, discharge, and mortality from tuberculosis in the past. I examined the medical histories of 25,000 soldiers and veterans in the Union Army using data collected under the direction of Robert Fogel. The Civil War brought together soldiers from many socioeconomic conditions and ecological backgrounds into an environment which was ideal for the spread of this disease. The war also provided a unique setting to examine many of the factors which were  likely responsible for the decline in TB mortality. Before enlistment, individuals had differential exposure to harmful dust and fumes. They also faced different disease environments and living conditions. By housing recruits in confined spaces, the war exposed soldiers to a host of waterborne and airborne infections. In the Civil War, disease was far more deadly than battle.

The Union Army data contain detailed medical records and measures of nutritional status. Height at enlistment measures net nutritional experiences at early ages. Weight, needed to measure current nutritional status using the Body Mass Index (BMI), is available for war veterans. My estimates use a hazard model and a variety of controls aligned with existing explanations proposed for the decline in TB prevalence and fatality rates. By how much would the diagnosis of TB have declined if the average Union Army soldier had the height of the current U.S. male population, and if all his relevant infections diagnosed prior to TB were eradicated? Figure 2 presents the contribution of the predictors of TB diagnosis in soldiers who did not engage in battle, and Figure 3 reports the corresponding contributions for soldiers discharged because of TB. Nutritional experiences in early life provided a protective effect against TB. Between 25 and 50 per cent of the predictable decline in tuberculosis could be associated with the modern increase in height. Declines in the risk of waterborne and airborne diseases are as important as the predicted changes in height.
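To make the counterfactual exercise concrete, the sketch below shows how a proportional hazards model of this general kind might be estimated and then used for a height counterfactual. It is only an illustration under assumed names and values: the dataset, variable names, and the modern height figure are hypothetical, not the article’s actual data or specification.

    # Hypothetical sketch of a proportional hazards model of TB diagnosis,
    # with height and prior infections as covariates. Dataset and variable
    # names are illustrative, not the article's actual data.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("union_army_soldiers.csv")  # hypothetical file

    # time_at_risk: months of service observed; tb_diagnosed: 1 if diagnosed
    # with TB; height_in: height at enlistment (inches); prior_infections:
    # count of waterborne/airborne infections diagnosed before TB.
    cph = CoxPHFitter()
    cph.fit(
        df[["time_at_risk", "tb_diagnosed", "height_in", "prior_infections"]],
        duration_col="time_at_risk",
        event_col="tb_diagnosed",
    )
    cph.print_summary()

    # Counterfactual in the spirit of the question above: predicted hazard if
    # every soldier had roughly the height of the modern U.S. male population
    # (69.3 inches is an assumed value) and no prior infections.
    counterfactual = df.assign(height_in=69.3, prior_infections=0)
    baseline_hazard = cph.predict_partial_hazard(df).mean()
    cf_hazard = cph.predict_partial_hazard(counterfactual).mean()
    print("Predicted proportional reduction:", 1 - cf_hazard / baseline_hazard)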

 

Figure 2. Contribution of various factors to the decline in TB diagnosis

Source: as per original article

 

Figure 3. Contribution of various factors to the decline in discharges because of TB.

Source: as per original article

My analysis showed that a wartime diagnosis of TB increased the risk of tuberculosis mortality. Because of the chronic nature of the disease, infected soldiers likely developed a latent or persistent infection that became active only when resistance failed in old age. Nutritional status provided some protection against mortality. For veterans, height was not as robust a predictor as BMI. If a veteran’s BMI increased from its historical value of 23 to current levels of 27, his mortality risk from tuberculosis would have been reduced by 50 per cent. Overall, the contributions of changes in ‘pure’ diets and of changes in infectious-disease exposure were probably equal.

What lessons can be drawn for the current Covid-19 pandemic? Covid-19 is also an airborne disease. Airborne diseases (e.g., influenza, measles, smallpox, and tuberculosis) are difficult to control. In unfamiliar populations, they often wreak havoc. But influenza, measles, smallpox, and tuberculosis are mostly killers from the past. The findings in my paper suggest that the conquest of tuberculosis happened through both individual and public health efforts. Improvements in diets and public health worked simultaneously and synergistically. There was no silver bullet to defeat the great white plague, tuberculosis. Diets are no longer as inadequate as in the past. Still, Covid-19 has exposed differential susceptibility to the disease. Success in combatting Covid-19 is likely to require simultaneous and synergistic private and public efforts.

Economic History Review – An introduction to the history of infectious diseases, epidemics and the early phases of the long-run decline in mortality

by Leigh Shaw-Taylor.

Below is the abstract for the Economic History Review virtual issue, Epidemics, Diseases and Mortality in Economic History.

The full introduction and the virtual issue are both available online for free for a limited time.

 

This article, written during the COVID-19 epidemic, provides a general introduction to the long-term history of infectious diseases, epidemics and the early phases of the spectacular long-term improvements in life expectancy since 1750, primarily with reference to English history. The story is a fundamentally optimistic one. In 2019 global life expectancy was approaching 73 years. In 1800 it was probably about 30. To understand the origins of this transition, we have to look at the historical sequence by which so many causes of premature death have been vanquished over time. In England that story begins much earlier than often supposed, in the years around 1600.  The first two ‘victories’ were over famine and plague. However, economic changes with negative influences on mortality meant that, despite this, life expectancies were either falling or stable between the late sixteenth and mid eighteenth centuries. The late eighteenth and early nineteenth century saw major declines in deaths from smallpox, malaria and typhus and the beginnings of the long-run increases in life expectancy. The period also saw urban areas become capable of demographic growth without a constant stream of migrants from the countryside: a necessary precondition for the global urbanization of the last two centuries and for modern economic growth. Since 1840 the highest national life expectancy globally has increased by three years in every decade.

After the Black Death: labour legislation and attitudes towards labour in late-medieval western Europe

by Samuel Cohn Jr. (University of Glasgow)

This blog forms part F in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History

The full article from this blog post was published in the Economic History Review, and it is available for members at this link.

Poussin, The plague of Ashdod, 1630. Available at <https://en.wikipedia.org/wiki/Plague_of_Ashdod_(Poussin)#/media/File:The_plague_of_ashdod_1630.jpg>

In the summer of 1999, I presented the kernel of this article at a conference in San Miniato al Tedesco in memory of David Herlihy. It was then limited to labour decrees I had found in the Florentine archives from the Black Death to the end of the fourteenth century. A few years later I thought of expanding the paper into a comparative essay. The prime impetus came from teaching the Black Death at the University of Glasgow. Students (and, I would say, many historians) think that England was unique in promulgating price and wage legislation after the Black Death, the famous Ordinance and Statute of Labourers of 1349 and 1351. In fact, I did not then know how extensive this legislation was, and details of its geographical distribution remain unknown today.

A second impetus for writing the essay concerned a consensus in the literature on this wage legislation, principally in England: that these decrees followed the logic of the laws of supply and demand. In short, with the colossal mortalities of the Black Death, 1347-51, the greatly diminished supply of labour meant that wage earners in cities and the countryside could demand excessive increases that threatened the livelihoods of elite rentiers — the church, nobility, and merchants. After all, this is what chroniclers and other contemporary commentators across Europe — Henry Knighton, Matteo Villani, William Langland, Giovanni Boccaccio, and many others — tell us in their scornful reproaches to greedy labourers. As ‘Hunger’ in book IV of Piers the Ploughman sneered:

And draught-ale was not good enough for them, nor a hunk of bacon, but they must have fresh meat or fish, fried or baked and chaud or plus chaud at that.

In addition, across the political spectrum from Barry Dobson to Rodney Hilton, Bertha Putnam’s study of the English laws (published in 1908) continued to be proclaimed the definitive authority on these laws, despite her lack of quantitative analysis and despite her central conclusion: that the peasants were guilty of ‘extortionate greed’ and that, for this reason, ‘these laws were necessary and just’ (Enforcement of the Statutes of Labourers, pp. 219–20). Yet, across Europe, while nominal wages may have trebled through the 1370s, prices for basic commodities rose faster, leaving the supposedly heartless labourers worse off than they had been before the Black Death. As George Holmes discovered in 1957, the class that profited most in England from the post-plague demographics was the supposedly victimized nobility.

Through primary and secondary sources, my article then researched wage and price legislation across a wide ambit of Europe — England, the Ile de France, the Low Countries, Provence, Aragon, Castile, Catalonia, Florence, Bologna, Siena, Orvieto, Milan, and Venice. Certainly, research needs to be extended further, to places where these laws were enacted and to where they appear not to have been, as in Scotland and the Low Countries, and to ask what difference the laws may have meant for economic development. However, from the places I examined, no simple logic arose, whether of supply and demand or the aims that might have been expected given differences in political regimes. Instead, municipal and royal efforts to control labour and artisans’ prices splintered in numerous and often contradictory directions, paralleling anxieties and needs to attribute blame as seen in other Black Death horrors: the burning of Jews and the murder of Catalans, beggars, and priests.

In conclusion, a history of the Black Death and its long-term consequences for labour can provide insights for perceiving our present predicament with Covid-19. We can anticipate that the outcomes of the present pandemic will not be the same across countries or continents. Similarly, for the Black Death and successive waves of plague through the fourteenth century, there were winners and losers. Yet, surprisingly, few historians have attempted to chart these differences, and fewer still to explain them. Instead, medievalists have concentrated more on the Black Death’s grand transformations, and these may serve as a welcome tonic for our present pandemic despair, especially as concerns labour. Eventually, post-Black-Death populations experienced increases in caloric intake, greater variety in diet, better housing, consumption of more luxury goods, increased leisure time, and leaps in economic equality. Moreover, governments such as Florence, even before revolts or regime change, could learn from their initial economic missteps. With the second plague in 1363, they abandoned their laws entitled ‘contra laboratores’, which had endangered the supplies of their thin resources of labour, and passed new decrees granting tax exemptions to attract labourers into their territories. Other regions, such as the Ile de France, Aragon, Provence, and Siena, abandoned these initial prejudicial laws almost immediately. Even in England, despite legislating ever more stringent laws against labourers that lasted well into the fifteenth century, law enforcers learnt to turn a blind eye, allowing landlords and peasants to cut mutually beneficial deals that enabled steady work, rising wages, improved working conditions, and greater freedom of movement.

Let us remember that these grand transformations did not occur overnight. The switch in economic policies to benefit labourers and the narrowing of the gap between rich and poor did not begin to show effects until a generation after the Black Death; in some regions not until the early fifteenth century.

from VoxEU.org — Coronavirus from the perspective of 17th century plague

by Neil Cummins (LSE), Morgan Kelly (University College Dublin), Cormac Ó Gráda (University College Dublin)

A repost from VoxEU.org

Between 1563 and 1665, London experienced four plagues that each killed one fifth of the city’s inhabitants. This column uses 790,000 burial records to track the plagues that recurred across London (epidemics typically endured for six months). Possibly carried and spread by body lice, plague always originated in the poorest parishes; self-segregation by the affluent gradually halved their death rate compared with poorer Londoners. The population rebounded within two years, as new migrants arrived in the city “to fill dead men’s shoes”.

Full article available here: Coronavirus from the perspective of 17th century plague — VoxEU.org: Recent Articles


Pandemics and Institutions: Lessons from Plague

by Guido Alfani (Bocconi University, Milan) & Tommy Murphy (Universidad de San Andrés, Buenos Aires)

This blog forms part E in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History


 

 

The plague of Florence in 1348, as described in Boccaccio’s Decameron. Available at the Wellcome Library.

In a recent article[i] we reviewed research on preindustrial epidemics. We focused on large-scale, lethal events: those that have a deeper and more long-lasting impact on economy and society, thereby producing the historical documentation that allows for systematic study. Almost all of these lethal pandemics were caused by plague: from ‘Justinian’s plague’ (540-41) and the Black Death (1347-52) to the last great European plagues of the seventeenth century (1623-32 and 1647-57). These epidemics were devastating. The Black Death killed between 35 and 60 per cent of the population of Europe and the Mediterranean (approximately 50 million victims).

These epidemics also had large-scale and persistent consequences. The Black Death might have positively influenced the development of Europe, even playing a role in the Great Divergence.[ii] Conversely, it is arguable that seventeenth-century plagues in Southern Europe (especially Italy) precipitated the Little Divergence.[iii] Clearly, epidemics can have asymmetric economic effects. The Black Death, for example, had negative long-term consequences for relatively under-populated areas of Europe, such as Spain or Ireland.[iv] More generally, the effects of an epidemic depend upon the context in which it happens. Below we focus on how institutions shaped the spread and the consequences of plagues.

 

Preindustrial epidemics and institutions

In preindustrial times, as today, institutions played a crucial role in determining the final intensity of epidemics. When the Black Death appeared, European societies were unprepared for the threat. But when it became apparent that plague was a recurrent scourge, institutional adaptation commenced — typical of the human reaction to a changing biological environment. From the late fourteenth century permanent health boards were established, able to take quicker action than the ad hoc commissions created during the emergency of 1348. These boards constantly monitored the international situation and provided the early warning necessary for implementing measures to contain epidemics.[v] From the late fourteenth century, quarantine procedures for suspected cases were developed, and in 1423 Venice built the first permanent lazzaretto (isolation hospital) on a lagoon island. By the early sixteenth century, at least in Italy, central and local governments had implemented a broad range of anti-plague policies, including health controls at river and sea harbours, mountain passes, and political boundaries. Within each Italian state, infected communities or territories were isolated, and human contact was limited by quarantines.[vi] These, and other instruments developed against the plague, are the direct ancestors of those currently employed to contain Covid-19. However, such policies were not always successful: in 1629, for example, plague entered Northern Italy as infected armies from France and Germany arrived to fight in the War of the Mantuan Succession. Nobody has ever been able to quarantine an enemy army.

It is no accident that these policies were first developed in Italian trading cities which, because of their commercial networks, had good reason to fear infection. Such policies were quickly imitated in Spain and France.[vii] However, England in particular “was unlike many other European countries in having no public precautions against plague at all before 1518”.[viii] Even in the seventeenth century, England was still trying to introduce institutions that had long since been consolidated in Mediterranean Europe.

The development of institutions and procedures to fight plague has been extensively researched. Nonetheless, other aspects of preindustrial epidemics are less well known: for example, how institutions tended to shift mortality towards specific socio-economic groups, especially the poor. Once doctors and health officials noticed that plague mortality was higher in the poorest parts of the city, they began to see the poor themselves as responsible for the spread of the infection. As a result, during the early modern period their presence in cities was increasingly resented,[ix] and, as a precautionary measure, vagrants and beggars were expelled. The death of many poor people was even regarded by some as one of the few positive consequences of plague. The friar Antero Maria di San Bonaventura wrote immediately after the 1656-57 plague in Genoa:

“What would the world be, if God did not sometimes touch it with the plague? How could he feed so many people? God would have to create new worlds, merely destined to provision this one […]. Genoa had grown so much that it no longer seemed a big city, but an anthill. You could neither take a walk without knocking into one another, nor was it possible to pray in church on account of the multitude of the poor […]. Thus it is necessary to confess that the contagion is the effect of divine providence, for the good governance of the universe”.[x]

 

While it seems certain that the marked socio-economic gradient of plague mortality was partly due to the action of health institutions, there is no clear evidence that officials were actively trying to kill the poor by infection. Sometimes, the anti-poor behaviour of the elites might even have backfired. Our initial research on the 1630 epidemic in the Italian city of Carmagnola suggests that, while poor households were more likely to be interned wholesale in the lazzaretto for isolation at the mere suspicion of plague, this might have reduced, not increased, their individual risk of death compared to richer strata. Possibly, this was the combined result of effective isolation of the diseased, assured provisioning of victuals, basic care, and forced within-household distancing.[xi]

Different health treatment for rich and poor, and economic elites making wrong and self-harming decisions: it would be nice if, occasionally, we learned something from history!

 

[i] Alfani, G. and T. Murphy. “Plague and Lethal Epidemics in the Pre-Industrial World.” Journal of Economic History 77 (1), 2017, 314–343.

[ii] Clark, G. A Farewell to Alms: A Brief Economic History of the World. Princeton: Princeton University Press, 2007; Broadberry, S. Accounting for the Great Divergence, LSE Economic History Working Papers No. 184, 2013.

[iii] Alfani, G. “Plague in Seventeenth Century Europe and the Decline of Italy: An Epidemiological Hypothesis.” European Review of Economic History 17 (3), 2013, 408–430; Alfani, G. and M. Percoco. “Plague and Long-Term Development: the Lasting Effects of the 1629-30 Epidemic on the Italian Cities.” Economic History Review 72 (4), 2019, 1175–1201.

[iv] For a recent synthesis of the asymmetric economic consequences of plague, Alfani, G. Pandemics and asymmetric shocks: Lessons from the history of plagues, VoxEu, 9 April 2020, https://voxeu.org/article/pandemics-and-asymmetric-shocks

[v] Cipolla, C.M. Public Health and the Medical Profession in the Renaissance. Cambridge: CUP, 1976; Cohn, S.K. Cultures of Plague: Medical Thought at the End of the Renaissance. Oxford: OUP, 2009; Alfani, G. Calamities and the Economy in Renaissance Italy: The Grand Tour of the Horsemen of the Apocalypse. Basingstoke: Palgrave, 2013.

[vi] Alfani, G. Calamities and the Economy, cit.; Cipolla, C.M. Public Health and the Medical Profession, cit.; Henderson, J. Florence Under Siege: Surviving Plague in an Early Modern City. Yale University Press, 2019.

[vii] Cipolla, C.M. Public Health and the Medical Profession, cit.

[viii] Slack, P. The Impact of Plague in Tudor and Stuart England. London: Routledge, 1985, pp. 201–26.

[ix] Pullan, B. “Plague and Perceptions of the Poor in Early Modern Italy.” In T. Ranger and P. Slack (eds.), Epidemics and Ideas. Essays on the Historical Perception of Pestilence. Cambridge: CUP, 1992, 101-23; Alfani, G., Calamities and the Economy.

[x] Alfani, Calamities and the Economy, cit., p. 106.

[xi] Alfani, G., M. Bonetti and M. Fochesato, Pandemics and socio-economic status. Evidence from the plague of 1630 in northern Italy, Mimeo.


 

Guido Alfani – guido.alfani@unibocconi.it

Tommy Murphy – tmurphy@udesa.edu.ar

Plague and Renaissance in Tuscany

by Paolo Malanima (Magna Græcia University of Catanzaro)

This blog forms part D in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History.

The full paper from this blog post was published in the Economic History Review and is available here.

 

The Triumph of Death. Source: Francesco Petrarca, Trionfi: Ms., 1081. Biblioteca Nazionale, Rome.

In 1346, astrologers announced that the conjunction of three planets foretold great and serious events (“grandi e gravi novitadi”).[3] This was quickly confirmed: early in January 1348, two Genoese galleys landed at the port of Pisa, a few kilometres from the city centre. These two galleys had begun their voyage in Caffa (Teodosia) on the Black Sea, had stopped earlier in Messina, and were en route to Genoa. After landing at Pisa, the mariners went to the marketplace, where, subsequently, many of the locals became ill and quickly died. Panic gripped the city’s inhabitants (“fu sparto lo grande furore per tucta la cictà di Pisa”).[4]

From Pisa, the Black Death commenced its march on Europe. In a matter of months, it was chronicled, nearly 80 per cent of Pisa’s inhabitants had died. By March 1348 the first cases of plague had occurred in Florence, and the plague spread to other nearby Tuscan cities, progressing at a speed of one kilometre per day.

At the time of the Black Death, Tuscany was one of the most populated areas of Europe, with approximately one million inhabitants.[1] In the fourteenth century, the population density of Tuscany was approximately three times that of Europe (excluding Russia), and roughly twice that  of England and Wales.

The first wave of the plague was followed by several other attacks: according to a seventeenth-century writer, between 1348 and 1527, there were at least 19 further outbreaks.[2] The Tuscan population reached its minimum around 1440, with barely 400,000 inhabitants. It began to slowly recover from the middle of the fifteenth century, reaching 900,000 inhabitants by 1600. The birthplace of the European Renaissance was one of the most devastated regions. The last assault by the plague occurred in 1629-30, after which the plague disappeared from Tuscany.

Figure 2. Part A: Florence price index, 1310-1450; Part B: Daily real wage rates of masons, 1310-1450. Please note that in A the price index has a base of 1 for the period 1420-40; in B the nominal wage is divided by the price of the consumption basket.
Source: P. Malanima, ‘Italy in the Renaissance: a Leading Economy in the European context, 1350-1550’, Economic History Review, 71 (2018), pp. 3-30.

What were the economic effects of these outbreaks of plague? The main effect was a sudden change in the ratio of factors of production. The plague destroyed humans, but not the capital stock (buildings, tools), natural capital (that is, physical resources), or human capital (knowledge). Consequently, the capital stock per worker increased and, therefore, so did labour productivity. With few exceptions, the consequences of the Black Death were similar across Europe.[5]

In Tuscany, which suffered frequent and powerful outbreaks of the plague, the ratio between production factors changed the most, leading to a decline in output prices (Figure 2, A). The fall in prices was immediate following the Black Death. However, because of bad harvests and military events, an apparent reversal of the trend occurred at the end of the century. Similarly, the price of labour only rose above its base-period level from about 1450-70 (Figure 2, B). These changes were known to the Florentine government when it noted, in a decree of 1348, that, while “many citizens had suddenly become poor, the poor had become rich”.[6]

The curve of Tuscan GDP per capita is shown in Figure 3. The trend began to rise soon after the main outbreak of the plague, but started to decline once the population began recovering after the middle of the fifteenth century. It reached its maximum around 1410-60. At the time, per capita GDP in Tuscany was higher than elsewhere in Europe. In the first half of the fifteenth century, its annual level was about 2,500 present-day euros, compared to 2,000 euros in 1861 (the date of national unification).

Figure 3. Real per capita GDP (Florentine Lire 1420-40). Notes: The lower curve refers to the yearly percentage changes from the trend of GDP per capita.
Source: as Figure 2.

Was there real growth in Tuscany after the Black Death? The blunt answer is: no. Following Simon Kuznets’s seminal work, we know that modern economic growth is characterised by simultaneous growth in population and product, with the latter growing relatively faster. Furthermore, modern growth implies the continuous growth of product per capita. However, as this case study demonstrates, product per capita rose because the population declined so dramatically, and Tuscan GDP per capita was highly volatile. Indeed, in some years the latter could fluctuate by 10 to 20 per cent, which would be highly unusual by present standards (although the current COVID outbreak might mean that there will be even greater fluctuations in standards of living and mortality). Another difference between modern growth and growth in the Ancien Régime concerns structural change. Modern growth implies a relative rise in the product of industries and services, and, consequently, a rise in urbanisation. In Renaissance Tuscany exactly the opposite occurred. In 1400, the urbanisation rate was half the level reached in about 1300. Approximately 450 years later, the pre-plague level had not yet been attained. The rate achieved in 1300 was only surpassed at the start of the twentieth century.

 

To contact the author: malanima@unicz.it

 

Notes:

[1] M. Breschi, P. Malanima, Demografia ed economia in Toscana: il lungo periodo (secoli XIV-XIX), in M. Breschi, P. Malanima (eds.), Prezzi, redditi, popolazioni in Italia: 600 anni, Udine, Forum, 2002, pp. 109-42 (the demographic information cited here is taken from this paper).

[2] F. Rondinelli, Relazione del contagio stato in Firenze l’anno 1630 e 1633, Firenze, G.B. Landini, 1634.

[3] M. Villani, Cronica, in G. Villani, Cronica con le continuazioni di Matteo e Filippo, Torino, Einaudi, 1979, p. 295.

[4] R. Sardo, Cronaca di Pisa, O. Banti (ed.), Roma, Istituto Storico Italiano per il Medio Evo, 1963, p. 96.

[5] P. Malanima, The Economic Consequences of the Black Death, in E. Lo Cascio (ed.), L’impatto della “Peste Antonina”, Bari, Edipuglia, 2012, pp. 311-30.

[6] Quoted in S. Cohn, ‘After the Black Death: Labour Legislation and Attitudes towards Labour in late-medieval Western Europe’, Economic History Review, 60 (2007), p. 480.

 

 

 

Demand slumps and wages: History says prepare to bargain

by Judy Z. Stephenson (Bartlett Faculty of the Built Environment, UCL)

This blog is part of the EHS series on The Long View on Epidemics, Disease and Public Health: Research from Economic History.

Big shifts and stops in supply, demand, and output hark back to pre-industrial days, and they carry lessons for today’s employment contracts and wage bargains.

Canteen at the National Projectile Factory, a munitions factory in Lancaster, c. 1917.
Image courtesy of Lancaster City Museum. Available at <http://www.documentingdissent.org.uk/munitions-factories-in-lancaster-and-morecambe/>

Covid-19 has brought the world to a slump of unprecedented proportions. Beyond the immediate crises in healthcare and treatment, the biggest impact is on employment. Employers, shareholders and policymakers are struggling to come to terms with the implications of ‘closed for business’ for an unspecified length of time, and laying off workers seems the most common response, even though unprecedented government support packages for firms and workers have heralded the ‘return of the state’, and the fiscal implications have provoked wartime comparisons.

There is one very clear difference between war and the current pandemic: mobilisation. Historians tend to look on times of war as times of full employment and high demand (1). A concomitant slump in demand and a huge surplus of demobilised labour were associated with the depression in real wages and labour markets in the peacetime years after 1815. That slump accompanied increasing investment in large-scale factory production, particularly in the textile industry. The decades afterwards are some of the best documented in labour history (2), and they are characterised by frequent stoppages, down-scaling and restarts in production. They should be of interest now because they are the story of how modern capitalist producers learned to set and bargain for wages to ensure they had the skills they needed when they needed to produce efficiently. Much of what employers and workers learned over the nineteenth century is directly pertinent to problems that currently face employers, workers, and the state.

Before the early nineteenth century in England – or elsewhere, for that matter – most people were simply not paid a regular weekly wage, or in fact paid for their time at all (3). Very few people had a ‘job’. Shipwrights, building workers and some common labourers (in all maybe 15% of workers in early modern economies) were paid ‘by the day’, but the hours or output that a ‘day’ involved were varied and indeterminate. The vast majority of pre-industrial workers were not paid for their time, but for what they produced.

These workers earned piece rates, much as today’s delivery riders earn ‘per drop’, Uber drivers earn ‘per ride’, and garment workers are paid per unit made. When the supply of materials failed, or demand for output stalled, workers were not paid, irrespective of whether they could work or not. Blockades, severe weather, famine, plague, financial crises, and unreliable supplies all stopped work, and so payment of wages ended. Stoppages were natural and expected. Historical records indicate that in many years commercial activity and work slowed to a trickle in January and February. Households subsisted on savings or credit before they could start earning again, or parishes and the poor law provided bare subsistence in the interim. Notable characteristics of pre-industrial wages – by piecework and otherwise – were wage posting and nominal rate rigidity, or lack of wage bargaining. Rates for some work did not change for almost a century, and the risk of no work seems to have been accounted for on both sides (4).

Piecework, or payment for output, is a system of wage formation of considerable longevity, and its purpose was always to protect employers from labour costs in uncertain conditions. It seems attractive because it transfers the risks associated with output volatility from the employer to the worker. Such practices are the basis of today’s ‘gig’ economy. Some workers – those in their prime who are skilled and strong – tend to do well out of the system and enjoy being able to increase their earnings with effort. This is the flexibility of the gig economy that some relish today. But it is less effective for those who need to be trained or managed, older workers, or anyone who has to limit their hours.

However, piecework or gig wage systems carry risks for the employer. In the long run, we know that piece bargains break down, or become unworkably complex, as both workers and employers behave opportunistically (5). Where firms need skilled workers to produce quickly, or want to invest in firm- or industry-specific human capital to increase competitiveness through technology, they can suddenly find themselves outpriced by competitors, or left with a labour force with a strong leisure preference or, indeed, a labour shortage. Such conditions characterised early industrialisation. In the British textile industry this opportunism created and exacerbated stoppages throughout the nineteenth century. After each stoppage both employers and workers sought to change rates. But new bargains were difficult to agree. Employers tried to cut costs. Labour struck. Bargaining for wages impeded efficient production.

Eventually, piecework bargains formed implicit, more stable contracts, and ‘invisible handshakes’ paved the way to the relative stability of hourly wages and the hierarchy of skills in factories (though the mechanism by which this happened is contested) (6). The form of the wage slowly changed to payment by the hour or unit of time. Employers worked out that ‘fair’ regular wages (or efficiency wages) and a regular workforce served them better in the long run than trying to save labour costs through stoppages. Unionisation bettered working conditions and the security of contracts. The Trade Boards Act of 1909 regulated the wages of industries still operating minimal piece rates and ushered in the era of collective wage bargaining as the norm, which only ended with the labour market policies of Thatcherism and subsequent governments.

So far in the twenty-first century, although there has been a huge shift to self-employment, gig wage formation and non-traditional jobs (7), we have not experienced the bitter bargaining that characterised the shift from piecework to time work two hundred years ago, or the unrest of the 1970s and early 1980s. Some of this is probably down to the decline in output volatility that accompanied increased globalisation since the ‘Great Moderation’, and to the extraordinarily low levels of unemployment in most economies in the last decade (8). Covid-19 brings output volatility back, in a big, unpredictable way, and the history of wage bargaining indicates that when factors of production are subject to shocks, bargaining is costly. Employers who want to rehire workers who have been unpaid for months may find that established wage bargains no longer hold. Now, shelf stackers who have risked their lives on zero-hours contracts may think that their pay rate per hour should reflect this risk. Well-paid professionals incentivised by performance-related pay are discovering the precarity of ‘eat what you kill’, and may find that their basic pay does not reflect the preparatory work they need to do in conditions that will not let them perform. Employers facing the same volatility might try to change rates, and many have already moved to cut wages.

Today’s state guarantees of many workers’ incomes, unthinkable in the nineteenth-century laissez-faire state, are welcome and necessary. That today’s gig economy workers have made huge strides towards attaining full employment rights would also appear miraculous to most pre-industrial workers. Yet contracts and wage formation matter. With increasing numbers of workers without job security, and essential services suffering demand and supply shocks, many workers and employers are likely to confront significant shifts in employment. History suggests that bargaining over them is not as easy a process as the last thirty years have led us to believe.

 

To contact the author: 

j.stephenson@ucl.ac.uk

@judyzara

 

References:

(1). Allen, R. (2009). Engels’ pause: Technical change, capital accumulation, and inequality in the British industrial revolution. Explorations in Economic History, 46(4), 418-435; Broadberry, S. et al. (2015). British Economic Growth, 1270-1870. CUP.

(2). Huberman. M., (1996) Escape from the Market, CUP, chapter 2.

(3). Hatcher, J., and Stephenson, J.Z. (Eds.), (2019) Seven Centuries of Unreal Wages, Palgrave Macmillan

(4). J. Stephenson and P. Wallis, ‘Imperfect competition’, LSE Working Paper (forthcoming).

(5). Brown, W. (1973) Piecework Bargaining, Heinemann.

(6). See debates between Huberman, Rose, Taylor and Winstanley in Social History, 1987-89.

(7). Katz, L., & Krueger, A. (2016). The Rise and Nature of Alternative Work Arrangements in the United States, 1995-2015. NBER Working Paper Series.

(8). Fang, W., & Miller, S. (2014). Output Growth and its Volatility: The Gold Standard through the Great Moderation. Southern Economic Journal, 80(3), 728-751.