COVID-19 and the food supply chain: Impacts on stock price returns and financial performance

This blog is part of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’.

By Julia Höhler (Wageningen University)

As evidence accumulates about COVID-19, its effects on the human body, and its transmission mechanisms, economists are also making progress in understanding the impact of the global pandemic on the food supply chain. While it is apparent that many companies were affected, the nature and magnitude of the effects require further investigation. A special issue of the Canadian Journal of Agricultural Economics on ‘COVID-19 and the Canadian agriculture and food sectors’ was among the first publications to examine the possible effects of COVID-19 on the food supply. In our ongoing work we take the next step and ask: how can we quantify the effects of COVID-19 on companies in the food supply chain?

Figure 1. Stylized image of supermarket shopping. Source: Oleg Magni, Pexels.

Stock prices as a proxy for the impact of COVID-19

One way to quantify the initial effects of COVID-19 on companies in the food supply chain is to analyse stock prices and their reaction over time. According to the theory of efficient markets, stock prices reflect investors’ expectations regarding future dividends. Strong fluctuations in stock prices signal lower expected returns and higher risks. Volatile stock markets can increase businesses’ financing costs and, in the worst case, threaten their liquidity. At the macroeconomic level, stock prices can also indicate the likelihood of a future recession. For our analysis of stock price reactions, we combined data from different countries and regions. In total, we collected stock prices for 71 large stock-listed companies from the US, Japan and Europe. The companies in our sample cover the entire supply chain: farm equipment and supplies, agriculture, trade, food processing, distribution, and retailing.

Impact on stock price returns comparable to the 2008 financial crisis

We began by calculating the logarithmic daily returns for the companies’ stocks and averaging them. We then compared these average returns with the performance of the S&P 500. Figure 2, below, shows the development of average daily returns from 2005 to 2020. Companies in the S&P 500 (top) achieved higher returns on average, but also exhibited larger fluctuations than the average of the companies we examined (bottom). Stock price returns fluctuated particularly strongly during the 2008 financial crisis. The fluctuations between the first notification of COVID-19 to the WHO in early January and the end of April 2020 (red area) are comparable in magnitude, although the negative fluctuations in this period are somewhat larger than in 2008. The comparison of the two charts suggests that stock price returns of large companies in the food supply chain were, on average, less affected by both crises than the broader market. Nevertheless, the long-term consequences of the 2008 financial crisis suggest that a wave of bankruptcies, lower financial performance, and a loss of food security may still follow.
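The first step can be sketched in a few lines of Python. The prices below are hypothetical placeholders, not the series used in our analysis:

```python
import math

# Hypothetical daily closing prices for three food-supply stocks
# (illustrative values only, not the actual data behind the article)
prices = {
    "A": [100.0, 101.5, 99.8, 102.3],
    "B": [50.0, 49.2, 49.9, 50.5],
    "C": [75.0, 76.1, 74.8, 75.9],
}

def log_returns(series):
    """Logarithmic daily returns: r_t = ln(P_t / P_{t-1})."""
    return [math.log(p1 / p0) for p0, p1 in zip(series, series[1:])]

returns = {name: log_returns(p) for name, p in prices.items()}

# Equal-weighted average daily return across the sample, as plotted in Figure 2
n_days = len(prices["A"]) - 1
avg_daily = [sum(r[t] for r in returns.values()) / len(returns)
             for t in range(n_days)]
```

Logarithmic returns are used rather than simple percentage changes because they are additive over time, which makes averaging and long-horizon comparisons straightforward.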

Figure 2. Average daily returns for the S&P 500 (top panel) and 71 food-supply companies (FSC, lower panel), 2005-2020. Source: data derived from multiple sources; for further information, please contact the author.

Winners and losers in the sub-sectors

To obtain a more granular picture of the impact of COVID-19, we divided the companies in our sample into sub-sectors and calculated their stock price volatility between January and April 2020. Whereas food retailers and breweries experienced relatively low stock price volatility, food distributors and manufacturers of fertilizers and chemicals experienced relatively high volatility. To cross-validate these results, we collected information on realized profits and losses from the companies’ financial reports. The trends observed in stock prices are also reflected in company results for the first quarter of 2020: food retailers were able to increase their profits during the crisis, while food distributors recorded high losses compared to the previous period. These results are likely related to the lockdowns and social distancing measures, which altered food distribution channels.
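The sub-sector comparison can be sketched by taking the standard deviation of each company’s daily log returns over the window and averaging within sub-sectors. The return series below are made up for illustration:

```python
import statistics

# Hypothetical daily log returns for companies grouped by sub-sector
# (illustrative values only)
sub_sectors = {
    "food retail": [
        [0.001, -0.002, 0.003, 0.000],
        [0.000, 0.001, -0.001, 0.002],
    ],
    "food distribution": [
        [0.02, -0.05, 0.04, -0.01],
        [-0.03, 0.06, -0.04, 0.02],
    ],
}

# A company's volatility is the sample standard deviation of its daily
# returns; sub-sector volatility is the average over its companies
sector_vol = {
    sector: sum(statistics.stdev(r) for r in companies) / len(companies)
    for sector, companies in sub_sectors.items()
}
```

In this toy sample the distributors’ returns swing far more widely than the retailers’, reproducing in miniature the pattern described above.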

Longer-term effects

Just as the vaccine for COVID-19 is still in the pipeline, research into the effects of COVID-19 needs time to show what makes companies resilient to unpredictable shocks of this magnitude. Possible research topics include whether local value chains are better suited to cushioning the effects of a pandemic and maintaining food security. Further work is also needed to fully understand the associated trade-offs between food security, profitability, and climate change objectives. Another research question concerns the effects of government protective measures and company support programmes; cross-country studies can provide important insights here. Our project lays the groundwork for future research into the effects of shocks on companies in the food value chain. By combining different data sources, we were able to compare stock returns in times of COVID-19 with those of the 2008 crisis and identify differences between sub-sectors. In a next step we will use company characteristics, such as profitability, to explain differences in returns.

To contact the author: julia.hoehler[at]wur.nl

Early-life disease exposure and occupational status

by Martin Saavedra (Oberlin College and Conservatory)

This blog is part H of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. This blog is based on the article ‘Early-life disease exposure and occupational status: The impact of yellow fever during the 19th century’, in Explorations in Economic History, 64 (2017): 62-81. https://doi.org/10.1016/j.eeh.2017.01.003   

 

A girl suffering from yellow fever. Watercolour. Available at Wellcome Images.

Epidemics, like other shocks to public health, have the potential to affect human capital accumulation. A literature in health economics built around the ‘fetal origins hypothesis’ has examined how in utero exposure to infectious disease affects labor market outcomes. Individuals may be more sensitive to health shocks during the developmental stage of life than during later stages of childhood. For good reason, much of this literature focuses on the 1918 influenza pandemic, which was a huge shock to mortality and one of the few events visible in life expectancy trends in the United States. However, there are limitations to studying the 1918 influenza pandemic because it coincided with the First World War. Another complication is that cities with outbreaks of infectious disease often engaged in many forms of social distancing by closing schools and businesses. This is true of the 1918 influenza pandemic, but also of other diseases: for example, many schools were closed during the polio epidemic of 1916.

So, how can we estimate the long-run effects of infectious disease when cities simultaneously respond to outbreaks? One possibility is to look at a disease that differentially affected some groups within the same city, such as yellow fever during the nineteenth century. Yellow fever is a viral infection that spreads from the Aedes aegypti mosquito and is still endemic in parts of Africa and South America.  The disease kills roughly 50,000 people per year, even though a vaccine has existed for decades. Symptoms include fever, muscle pain, chills, and jaundice, from which the disease derives its name.

During the eighteenth and nineteenth centuries, yellow fever plagued American cities, particularly port cities that traded with Caribbean islands. In 1793, over 5,000 Philadelphians likely died of yellow fever. This would be a devastating number in any city, even by today’s standards, but it is even more so considering that in 1790 Philadelphia had a population of less than 29,000.

By the mid-nineteenth century, Southern port cities grew, and yellow fever stopped occurring in cities as far north as Philadelphia. The graph below displays the number of yellow fever fatalities in four southern port cities — New Orleans, LA; Mobile, AL; Charleston, SC; and Norfolk, VA — during the nineteenth century. Yellow fever was sporadic, devastating a city in one year and often leaving it untouched in the next. For example, yellow fever killed nearly 8,000 New Orleanians in 1853, and over 2,000 in both 1854 and 1855. In the next two years, yellow fever killed fewer than 200 New Orleanians per year, then it came back, killing over 3,500 in 1858. Norfolk, VA was struck only once, in 1855. Since yellow fever never struck Norfolk during milder years, the population lacked immunity, and approximately 10 percent of the city died in 1855. Charleston and Mobile show similar sporadic patterns. Likely due to the Union’s naval blockade, yellow fever did not visit any American port cities in large numbers during the Civil War.

 

Figure. Yellow fever fatalities in New Orleans, Mobile, Charleston, and Norfolk during the nineteenth century. Source: as per original article.

 

Immigrants were particularly prone to yellow fever because they often came from European countries rarely visited by yellow fever. Native New Orleanians, however, typically caught yellow fever during a mild year as children and were then immune to the disease for the rest of their lives. For this reason, yellow fever earned the name the “stranger’s disease.”

Data from the full count of the 1880 census show that yellow fever fatality rates during an individual’s year of birth negatively affected adult occupational status, but only for individuals with foreign-born mothers. Those with US-born mothers were relatively unaffected by the disease. There are also effects for those exposed to yellow fever one or two years after birth, but no effects, not even for those with immigrant mothers, for exposure three or four years after birth. These results suggest that early-life exposure to infectious disease, and not just city-wide responses to disease, influences human capital development.


Martin Saavedra

Martin.Saavedra@oberlin.edu

 

Give Me Liberty Or Give Me Death

by Richard A. Easterlin (University of Southern California)

This blog is part G of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article from this blog is “How Beneficent Is the Market? A Look at the Modern History of Mortality”, European Review of Economic History 3, no. 3 (1999): 257-94. https://doi.org/10.1017/S1361491699000131

 

A child is vaccinated, Brazil, 1970.

Patrick Henry’s memorable plea for independence unintentionally also captured the long history of conflict between the free market and public health, evidenced in the current struggle of the United States with the coronavirus. Efforts to contain the virus have centered on measures to forestall transmission of the disease, such as stay-at-home orders, social distancing, and avoiding large gatherings, each of which infringes on individual liberty. These measures have given birth to a resistance movement objecting to violations of individual freedom.

My 1999 article posed the question “How Beneficent is the Market?” The answer, based on “A Look at the Modern History of Mortality”, was straightforward: because of the ubiquity of market failure, public intervention was essential to achieve control of major infectious disease. This intervention centered on the creation of a public health system. “The functions of this system have included, in varying degrees, health education, regulation, compulsion, and the financing or direct provision of services.”

Regulation and compulsion, and the consequent infringement of individual liberties, have always been critical building blocks of the public health system. Even before the formal establishment of public health agencies, regulation and compulsion were features of measures aimed at controlling the spread of infectious disease in mid-19th century Britain. The “sanitation revolution” led to the regulation of water supply and sewage disposal and, in time, to the regulation of slum building conditions. As my article notes, there was fierce opposition to these measures:

“The backbone of the opposition was made up of those whose vested interests were threatened: landlords, builders, water companies, proprietors of refuse heaps and dung hills, burial concerns, slaughterhouses, and the like … The opposition appealed to the preservation of civil liberties and sought to debunk the new knowledge cited by the public health advocates …”

The greatest achievement of public health was the eradication of smallpox, the only disease that has been eliminated from the face of the earth. Smallpox was the scourge of humankind until Edward Jenner’s discovery of a vaccine in 1798. Throughout the 19th and 20th centuries, requirements for smallpox vaccination were fiercely opposed by anti-vaccinationists. In 1959 the World Health Organization embarked on a program to eradicate the disease. Over the ensuing two decades its efforts to persuade governments worldwide to require vaccination of infants were eventually successful, and in 1980 the WHO officially declared the disease eradicated. Public health eventually triumphed over liberty, but it took almost two centuries to realize Jenner’s hope that vaccination would annihilate smallpox.

In the face of the coronavirus pandemic, the U.S. market-based health care system has demonstrated once again the inability of the market to deal with infectious disease, and the need for forceful public intervention. The current health care system requires that:

 “every player, from insurers to hospitals to the pharmaceutical industry to doctors, be financially self-sustaining, to have a profitable business model. It excels in expensive specialty care. But there’s no return on investment in being positioned for the possibility of a pandemic” (Rosenthal 2020).

Commercial and hospital labs were slow to respond to the need to develop a test for the virus. Once tests became available, conducting them was handicapped by insufficient supplies: kits, chemical reagents, swabs, masks and other personal protective equipment. In hospitals, ventilators were also in short supply. These deficiencies reflected the lack of profitability in responding to these needs, and a government reluctant to compensate for market failure.

At the current time, the halting efforts of federal public health authorities and state and local public officials to impose quarantine and “shelter at home” measures have been seriously handicapped by public protests over infringement of civil liberties, reminiscent of the dissidents of the 19th and 20th centuries and their current-day heirs. States are opening for business well in advance of the guidelines of the Centers for Disease Control and Prevention. The lesson of history regarding such actions is clear: the cost of liberty is sickness and death. But do we learn from history? Sadly, one is put in mind of Warren Buffett’s aphorism: “What we learn from history is that people don’t learn from history.”

 

Reference

Rosenthal, Elizabeth, “A Health System Set up to Fail”, New York Times, May 8, 2020, p. A29.

 

To contact the author: easterl@usc.edu

Airborne diseases: Tuberculosis in the Union Army

by Javier Birchenall (University of California, Santa Barbara)

This is Part F of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article from this blog was published in Explorations in Economic History and is available here.

1910 advertising postcard for the National Association for the Prevention of Tuberculosis. 

Tuberculosis (TB) is one of the oldest and deadliest diseases. Traces of TB in humans can be found as early as 9,000 years ago, and written accounts date back 3,300 years in India. Untreated, TB’s case-fatality rate is as high as 50 percent; it was a dreaded disease. TB is an airborne disease caused by the bacterium Mycobacterium tuberculosis. Tuberculosis spreads through the air when a person with an active infection coughs, sneezes, speaks, or sings. Most cases remain latent and never develop symptoms. Activation of tuberculosis is particularly influenced by undernutrition.

Tuberculosis played a prominent role in the secular mortality decline. Of the 27 years of life expectancy gained in England and Wales between 1871 and 1951, TB accounts for about 40 percent of the improvement, a 12-year gain. Modern medicine, the usual suspect invoked to explain this mortality decline, could not have been responsible. As Thomas McKeown famously pointed out, TB mortality started its decline long before the tubercle bacillus was identified and long before an effective treatment was available (Figure 1). McKeown viewed improvements in economic and social conditions, especially improved diets, as the principal factor in combatting tuberculosis. A healthy diet, however, is not the only factor behind nutritional status. Infections, no matter how mild, reduce nutritional status and increase susceptibility to further infection.

Figure 1. Mortality rate from TB.


Source: as per original article

In “Airborne Diseases: Tuberculosis in the Union Army” I studied the determinants of diagnosis, discharge, and mortality from tuberculosis in the past. I examined the medical histories of 25,000 soldiers and veterans in the Union Army, using data collected under the direction of Robert Fogel. The Civil War brought together soldiers from many socioeconomic conditions and ecological backgrounds into an environment that was ideal for the spread of this disease. The war also provided a unique setting to examine many of the factors likely responsible for the decline in TB mortality. Before enlistment, individuals had differential exposure to harmful dust and fumes. They also faced different disease environments and living conditions. By housing recruits in confined spaces, the war exposed soldiers to a host of waterborne and airborne infections. In the Civil War, disease was far more deadly than battle.

The Union Army data contain detailed medical records and measures of nutritional status. Height at enlistment measures net nutritional experiences at early ages. Weight, needed to measure current nutritional status using the Body Mass Index (BMI), is available for war veterans. My estimates use a hazard model and a variety of controls aligned with existing explanations for the decline in TB prevalence and fatality rates. By how much would the diagnosis of TB have declined if the average Union Army soldier had the height of the current U.S. male population, and if all his relevant infections diagnosed prior to TB were eradicated? Figure 2 presents the contribution of the predictors of TB diagnosis in soldiers who did not engage in battle, and Figure 3 reports the corresponding results for soldiers discharged because of TB. Nutritional experiences in early life provided a protective effect against TB. Between 25 and 50 per cent of the predictable decline in tuberculosis can be associated with the modern increase in height. Declines in the risk of waterborne and airborne diseases are as important as the predicted changes in height.

 

Figure 2. Contribution of various factors to the decline in TB diagnosis

Source: as per original article

 

Figure 3. Contribution of various factors to the decline in discharges because of TB.

Source: as per original article

My analysis showed that a wartime diagnosis of TB increased the risk of tuberculosis mortality. Because of the chronic nature of the disease, infected soldiers likely developed a latent or persistent infection that became active when resistance failed in old age. Nutritional status provided some protection against mortality. For veterans, height was not as robust a predictor as BMI. If a veteran’s BMI increased from its historical value of 23 to current levels of 27, his mortality risk from tuberculosis would have been reduced by 50 per cent. Overall, the contributions of changes in ‘pure’ diets and of changes in infectious disease exposure were probably equal.
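The arithmetic behind a counterfactual of this kind can be sketched under a proportional-hazards assumption. The coefficient below is implied by the reported 50 per cent figure, not taken from the article’s estimates:

```python
import math

# Under a proportional-hazards model, h(t | x) = h0(t) * exp(beta * x),
# the relative risk between two covariate profiles does not depend on the
# baseline hazard h0(t).  A 50 per cent risk reduction from raising BMI
# from 23 to 27 therefore pins down an implied coefficient per BMI unit:
bmi_historical, bmi_modern = 23, 27
relative_risk = 0.5

beta_bmi = math.log(relative_risk) / (bmi_modern - bmi_historical)

# The implied coefficient can then be used for other counterfactuals,
# e.g. the relative risk from a smaller, two-point BMI gain
rr_two_points = math.exp(beta_bmi * 2)
```

Because the model is multiplicative, half of the BMI gain yields the square root of the full risk ratio, not half of the risk reduction.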

What lessons can be drawn for the current Covid-19 pandemic? Covid-19 is also an airborne disease. Airborne diseases (e.g., influenza, measles, smallpox, and tuberculosis) are difficult to control. In unfamiliar populations, they often wreak havoc. But influenza, measles, smallpox, and tuberculosis are mostly killers of the past. The findings in my paper suggest that the conquest of tuberculosis happened through both individual and public health efforts. Improvements in diets and public health worked simultaneously and synergistically. There was no silver bullet to defeat the great white plague, tuberculosis. Diets are no longer as inadequate as in the past. Still, Covid-19 has exposed differential susceptibility to the disease. Success in combatting Covid-19 is likely to require simultaneous and synergistic private and public efforts.

Economic History Review – An introduction to the history of infectious diseases, epidemics and the early phases of the long-run decline in mortality

by Leigh Shaw-Taylor.

Below is the abstract for the Economic History Review virtual issue, Epidemics, Diseases and Mortality in Economic History.

The full introduction and the virtual issue are both available online for free for a limited time.

 

This article, written during the COVID-19 epidemic, provides a general introduction to the long-term history of infectious diseases, epidemics and the early phases of the spectacular long-term improvements in life expectancy since 1750, primarily with reference to English history. The story is a fundamentally optimistic one. In 2019 global life expectancy was approaching 73 years. In 1800 it was probably about 30. To understand the origins of this transition, we have to look at the historical sequence by which so many causes of premature death have been vanquished over time. In England that story begins much earlier than often supposed, in the years around 1600.  The first two ‘victories’ were over famine and plague. However, economic changes with negative influences on mortality meant that, despite this, life expectancies were either falling or stable between the late sixteenth and mid eighteenth centuries. The late eighteenth and early nineteenth century saw major declines in deaths from smallpox, malaria and typhus and the beginnings of the long-run increases in life expectancy. The period also saw urban areas become capable of demographic growth without a constant stream of migrants from the countryside: a necessary precondition for the global urbanization of the last two centuries and for modern economic growth. Since 1840 the highest national life expectancy globally has increased by three years in every decade.

After the Black Death: labour legislation and attitudes towards labour in late-medieval western Europe

by Samuel Cohn Jr. (University of Glasgow)

This blog forms part F in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History

The full article from this blog post was published in the Economic History Review, and is available for members at this link

Poussin, The plague of Ashdod, 1630. Available at <https://en.wikipedia.org/wiki/Plague_of_Ashdod_(Poussin)#/media/File:The_plague_of_ashdod_1630.jpg>

In the summer of 1999, I presented the kernel of this article at a conference in San Miniato al Tedesco in memory of David Herlihy. It was then limited to labour decrees I had found in the Florentine archives from the Black Death to the end of the fourteenth century. A few years later I thought of expanding the paper into a comparative essay. The prime impetus came from teaching the Black Death at the University of Glasgow. Students (and, I would say, many historians) think that England was unique in promulgating price and wage legislation after the Black Death, with the famous Ordinance and Statute of Labourers in 1349 and 1351. In fact, I did not then know how extensive this legislation was, and details of its geographical distribution remain unknown today.

A second impetus for writing the essay concerned a consensus in the literature on this wage legislation, principally in England: that these decrees followed the logic of the laws of supply and demand. In short, with the colossal mortalities of the Black Death, 1347-51, the greatly diminished supply of labour meant that wage earners in cities and the countryside could demand excessive increases that threatened the livelihoods of elite rentiers — the church, nobility, and merchants. After all, this is what chroniclers and other contemporary commentators across Europe — Henry Knighton, Matteo Villani, William Langland, Giovanni Boccaccio, and many others — tell us in their scornful reproaches to greedy labourers. As ‘Hunger’ in book IV of Piers the Ploughman sneered:

And draught-ale was not good enough for them, nor a hunk of bacon, but they must have fresh meat or fish, fried or baked and chaud or plus chaud at that.

In addition, across the political spectrum from Barry Dobson to Rodney Hilton, Bertha Putnam’s study of the English laws (published in 1908) continued to be proclaimed the definitive authority, despite her lack of quantitative analysis and her central conclusion that the peasants were guilty of ‘extortionate greed’ and that, for this reason, ‘these laws were necessary and just’ (Enforcement of the Statutes of Labourers, pp. 219–20). Yet, across Europe, while nominal wages may have trebled through the 1370s, prices for basic commodities rose faster, leaving the supposedly heartless labourers worse off than they had been before the Black Death. As George Holmes discovered in 1957, the class to profit most in England from the post-plague demographics was the supposedly victimized nobility.

Through primary and secondary sources, my article then researched wage and price legislation across a wide ambit of Europe — England, the Ile de France, the Low Countries, Provence, Aragon, Castile, Catalonia, Florence, Bologna, Siena, Orvieto, Milan, and Venice. Certainly, research needs to be extended further, to places where these laws were enacted and to where they appear not to have been, as in Scotland and the Low Countries, and to ask what difference the laws may have meant for economic development. However, from the places I examined, no simple logic arose, whether of supply and demand or the aims that might have been expected given differences in political regimes. Instead, municipal and royal efforts to control labour and artisans’ prices splintered in numerous and often contradictory directions, paralleling anxieties and needs to attribute blame as seen in other Black Death horrors: the burning of Jews and the murder of Catalans, beggars, and priests.

In conclusion, a history of the Black Death and its long-term consequences for labour can provide insights into our present predicament with Covid-19. We can anticipate that the outcomes of the present pandemic will not be the same across countries or continents. Similarly, for the Black Death and successive waves of plague through the fourteenth century, there were winners and losers. Yet, surprisingly, few historians have attempted to chart these differences, and fewer still to explain them. Instead, medievalists have concentrated more on the Black Death’s grand transformations, and these may serve as a welcome tonic for our present pandemic despair, especially as concerns labour. Eventually, post-Black-Death populations experienced increases in caloric intake, greater variety in diet, better housing, consumption of more luxury goods, increased leisure time, and leaps in economic equality. Moreover, governments such as Florence, even before revolts or regime change, could learn from their initial economic missteps. With the second plague in 1363, Florence abandoned its laws entitled ‘contra laboratores’, which had endangered its thin supplies of labour, and passed new decrees granting tax exemptions to attract labourers into its territory. Other regions (the Ile de France, Aragon, Provence, and Siena) abandoned these initial prejudicial laws almost immediately. Even in England, despite ever more stringent legislation against labourers lasting well into the fifteenth century, law enforcers learnt to turn a blind eye, allowing landlords and peasants to cut mutually beneficial deals that enabled steady work, rising wages, improved working conditions, and greater freedom of movement.

Let us remember that these grand transformations did not occur overnight. The switch in economic policies to benefit labourers and the narrowing of the gap between rich and poor did not begin to show effects until a generation after the Black Death; in some regions not until the early fifteenth century.

from VoxEU.org — Coronavirus from the perspective of 17th century plague

by Neil Cummins (LSE), Morgan Kelly (University College Dublin), Cormac Ó Gráda (University College Dublin)

A repost from VoxEU.org

Between 1563 and 1665, London experienced four plagues that each killed one fifth of the city’s inhabitants. This column uses 790,000 burial records to track the plagues that recurred across London (epidemics typically endured for six months). Possibly carried and spread by body lice, plague always originated in the poorest parishes; self-segregation by the affluent gradually halved their death rate compared with poorer Londoners. The population rebounded within two years, as new migrants arrived in the city “to fill dead men’s shoes”.

Full article available here: Coronavirus from the perspective of 17th century plague — VoxEU.org: Recent Articles


Pandemics and Institutions: Lessons from Plague

by Guido Alfani (Bocconi University, Milan) & Tommy Murphy (Universidad de San Andrés, Buenos Aires)

This blog forms part E in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History



The plague of Florence in 1348, as described in Boccaccio’s Decameron. Available at the Wellcome Library.

In a recent article[i] we reviewed research on preindustrial epidemics. We focused on large-scale, lethal events: those that have a deeper and more long-lasting impact on economy and society, thereby producing the historical documentation that allows for systematic study. Almost all these lethal pandemics were caused by plague: from the “Justinian’s plague” (540-41) and the Black Death (1347-52) to the last great European plagues of the seventeenth century (1623-32 and 1647-57). These epidemics were devastating: the Black Death killed between 35 and 60 per cent of the population of Europe and the Mediterranean (approximately 50 million victims).

These epidemics also had large-scale and persistent consequences. The Black Death might have positively influenced the development of Europe, even playing a role in the Great Divergence.[ii] Conversely, it is arguable that the seventeenth-century plagues in Southern Europe (especially Italy) precipitated the Little Divergence.[iii] Clearly, epidemics can have asymmetric economic effects. The Black Death, for example, had negative long-term consequences for relatively under-populated areas of Europe, such as Spain or Ireland.[iv] More generally, the effects of an epidemic depend upon the context in which it happens. Below we focus on how institutions shaped the spread and the consequences of plagues.

 

Preindustrial epidemics and institutions

In preindustrial times, as today, institutions played a crucial role in determining the final intensity of epidemics. When the Black Death appeared, European societies were unprepared for the threat. But when it became apparent that plague was a recurrent scourge, institutional adaptation commenced, in a way typical of human reactions to a changing biological environment. From the late fourteenth century, permanent health boards were established, able to take quicker action than the ad hoc commissions created during the emergency of 1348. These boards constantly monitored the international situation and provided the early warning necessary for implementing measures to contain epidemics.[v] From the late fourteenth century, quarantine procedures for suspected cases were developed, and in 1423 Venice built the first permanent lazzaretto (isolation hospital) on a lagoon island. By the early sixteenth century, at least in Italy, central and local governments had implemented a broad range of anti-plague policies, including health controls at river and sea harbours, mountain passes, and political boundaries. Within each Italian state, infected communities or territories were isolated, and human contact was limited by quarantines.[vi] These, and other instruments developed against the plague, are the direct ancestors of those currently employed to contain COVID-19. However, such policies were not always successful: in 1629, for example, plague entered Northern Italy with the infected armies arriving from France and Germany to fight in the War of the Mantuan Succession. Nobody has ever been able to quarantine an enemy army.

It is no accident that these policies were first developed in Italian trading cities which, because of their commercial networks, had good reason to fear infection. Such policies were quickly imitated in Spain and France.[vii] England, however, “was unlike many other European countries in having no public precautions against plague at all before 1518”.[viii] Even in the seventeenth century, England was still trying to introduce institutions that had long since been consolidated in Mediterranean Europe.

The development of institutions and procedures to fight plague has been extensively researched. Other aspects of preindustrial epidemics are less well known: for example, how institutions tended to shift mortality towards specific socio-economic groups, especially the poor. Once doctors and health officials noticed that plague mortality was higher in the poorest parts of the city, they began to see the poor themselves as responsible for the spread of the infection. As a result, during the early modern period their presence in cities was increasingly resented,[ix] and, as a precautionary measure, vagrants and beggars were expelled. The death of many poor people was even regarded by some as one of the few positive consequences of plague. The friar Antero Maria di San Bonaventura wrote immediately after the 1656-57 plague in Genoa:

“What would the world be, if God did not sometimes touch it with the plague? How could he feed so many people? God would have to create new worlds, merely destined to provision this one […]. Genoa had grown so much that it no longer seemed a big city, but an anthill. You could neither take a walk without knocking into one another, nor was it possible to pray in church on account of the multitude of the poor […]. Thus it is necessary to confess that the contagion is the effect of divine providence, for the good governance of the universe”.[x]

 

While it seems certain that the marked socio-economic gradient of plague mortality was partly due to the action of health institutions, there is no clear evidence that officials were actively trying to kill the poor by infection. Sometimes the anti-poor behaviour of the elites may even have backfired. Our initial research on the 1630 epidemic in the Italian city of Carmagnola suggests that while poor households were more likely to be interned wholesale in the lazzaretto at the mere suspicion of plague, this may have reduced, not increased, their individual risk of death compared with richer strata. Possibly, this was the combined result of effective isolation of the diseased, assured provisioning of victuals, basic care, and forced within-household distancing.[xi]

Different health treatment for rich and poor, and economic elites making wrong and self-harming decisions: it would be nice if, occasionally, we learned something from history!

 

[i] Alfani, G. and T. Murphy. “Plague and Lethal Epidemics in the Pre-Industrial World.” Journal of Economic History 77 (1), 2017, 314–343.

[ii] Clark, G. A Farewell to Alms: A Brief Economic History of the World. Princeton: Princeton University Press, 2007; Broadberry, S. Accounting for the Great Divergence, LSE Economic History Working Papers No. 184, 2013.

[iii] Alfani, G. “Plague in Seventeenth Century Europe and the Decline of Italy: An Epidemiological Hypothesis.” European Review of Economic History 17 (3), 2013, 408–430; Alfani, G. and M. Percoco. “Plague and Long-Term Development: the Lasting Effects of the 1629-30 Epidemic on the Italian Cities.” Economic History Review 72 (4), 2019, 1175–1201.

[iv] For a recent synthesis of the asymmetric economic consequences of plague, Alfani, G. Pandemics and asymmetric shocks: Lessons from the history of plagues, VoxEu, 9 April 2020, https://voxeu.org/article/pandemics-and-asymmetric-shocks

[v] Cipolla, C.M. Public Health and the Medical Profession in the Renaissance. Cambridge: CUP, 1976; Cohn, S.K. Cultures of Plague: Medical Thought at the End of the Renaissance. Oxford: OUP, 2009; Alfani, G. Calamities and the Economy in Renaissance Italy: The Grand Tour of the Horsemen of the Apocalypse. Basingstoke: Palgrave, 2013.

[vi] Alfani, G. Calamities and the Economy, cit.; Cipolla, C.M. Public Health and the Medical Profession, cit.; Henderson, J. Florence Under Siege: Surviving Plague in an Early Modern City. New Haven: Yale University Press, 2019.

[vii] Cipolla, C.M. Public Health and the Medical Profession, cit.

[viii] Slack, P. The Impact of Plague in Tudor and Stuart England. London: Routledge, 1985, 201–26.

[ix] Pullan, B. “Plague and Perceptions of the Poor in Early Modern Italy.” In T. Ranger and P. Slack (eds.), Epidemics and Ideas: Essays on the Historical Perception of Pestilence. Cambridge: CUP, 1992, 101–23; Alfani, G. Calamities and the Economy, cit.

[x] Alfani, G. Calamities and the Economy, cit., p. 106.

[xi] Alfani, G., M. Bonetti and M. Fochesato, Pandemics and socio-economic status. Evidence from the plague of 1630 in northern Italy, Mimeo.


 

Guido Alfani – guido.alfani@unibocconi.it

Tommy Murphy – tmurphy@udesa.edu.ar

Plague and Renaissance in Tuscany

by Paolo Malanima (Magna Græcia University of Catanzaro)

This blog forms part D in the EHS series: The Long View on Epidemics, Disease and Public Health: Research from Economic History.

The full paper underlying this blog post was published in the Economic History Review and is available here.

 

The Triumph of Death. Source: Francesco Petrarca, Trionfi: Ms., 1081. Biblioteca Nazionale, Rome.

In 1346, astrologers announced that the conjunction of three planets foretold great and serious events (“grandi e gravi novitadi”).[3] This was quickly confirmed: early in January 1348, two Genoese galleys landed at the port of Pisa, a few kilometres from the city centre. The galleys had begun their voyage in Caffa (Teodosia) on the Black Sea, had stopped earlier in Messina, and were en route to Genoa. After landing at Pisa, the mariners went to the marketplace, where, subsequently, many of the locals became ill and quickly died. Panic gripped the city’s inhabitants (“fu sparto lo grande furore per tucta la cictà di Pisa”).[4]

From Pisa, the Black Death commenced its march on Europe. Within months, chroniclers recorded that nearly 80 per cent of Pisa’s inhabitants had died. By March 1348, the first cases of plague had occurred in Florence, and the plague spread to other nearby Tuscan cities, progressing at a speed of one kilometre per day.

At the time of the Black Death, Tuscany was one of the most populated areas of Europe, with approximately one million inhabitants.[1] In the fourteenth century, the population density of Tuscany was approximately three times that of Europe (excluding Russia), and roughly twice that of England and Wales.

The first wave of the plague was followed by several further attacks: according to a seventeenth-century writer, between 1348 and 1527 there were at least 19 further outbreaks.[2] The Tuscan population reached its minimum around 1440, with barely 400,000 inhabitants. It began to recover slowly from the middle of the fifteenth century, reaching 900,000 inhabitants by 1600. The birthplace of the European Renaissance was thus one of the most devastated regions. The last assault of the plague occurred in 1629-30, after which it disappeared from Tuscany.

Figure 2. Part A: Florence price index, 1310-1450; Part B: daily real wage rates of masons, 1310-1450. Note that in A the price index has a base of 1 for the period 1420-40; in B the nominal wage is divided by the price of the basket.
Source: P. Malanima, ‘Italy in the Renaissance: a Leading Economy in the European context, 1350-1550’, Economic History Review , 71 (2018), pp. 3-30.

What were the economic effects of these outbreaks of plague? The main effect was a sudden change in the ratio of factors of production. The plague destroyed humans, but not the capital stock (buildings, tools), natural capital (that is, physical resources), or human capital (knowledge). Consequently, the capital stock per worker increased and, therefore, so did labour productivity. With few exceptions, the consequences of the Black Death were similar across Europe.[5]
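The mechanism can be illustrated with a small calculation. This is a minimal sketch, not the author’s model: it assumes a standard Cobb-Douglas production function and purely hypothetical numbers, chosen only to show why output per worker rises when labour falls but capital survives.

```python
# Illustrative only: a Cobb-Douglas production function Y = A * K^a * L^(1-a),
# with hypothetical values for capital (K) and labour (L).

def output(K, L, alpha=0.3, A=1.0):
    """Output from capital K and labour L (Cobb-Douglas, share alpha)."""
    return A * K**alpha * L**(1 - alpha)

K = 100.0                          # capital stock: unchanged by the plague
L_before, L_after = 100.0, 50.0    # plague kills half the workforce

y_before = output(K, L_before) / L_before   # output per worker, pre-plague
y_after = output(K, L_after) / L_after      # output per worker, post-plague

# Capital per worker doubles, so labour productivity rises (here by ~23%,
# i.e. by a factor of 2**alpha).
print(y_after / y_before)
```

With these assumed numbers, halving the workforce doubles capital per worker and raises output per worker by a factor of 2^0.3 (about 23 per cent), without any technological change: exactly the shift in factor ratios described above.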

In Tuscany, which suffered frequent and powerful outbreaks of the plague, the ratio between production factors changed the most, leading to a decline in output prices (Figure 2, A). The fall in prices was immediate following the Black Death. However, because of bad harvests and military events, an apparent reversal of the trend occurred at the end of the century. Similarly, the price of labour only rose above its base-year level from about 1450-70 (Figure 2, B). These changes were known to the Florentine government when it noted, in a decree of 1348, that while “many citizens had suddenly become poor, the poor had become rich”.[6]

The curve of Tuscan GDP per capita is shown in Figure 3. The trend began to rise soon after the main outbreak of the plague, reached its maximum around 1410-60, and started to decline once the population began to recover, after the middle of the century. At the time, per capita GDP in Tuscany was higher than elsewhere in Europe. In the first half of the fifteenth century, its annual level was about 2,500 present-day euros, compared with 2,000 euros in 1861 (the date of national unification).

Figure 3. Real per capita GDP (Florentine Lire 1420-40). Notes: The lower curve refers to the yearly percentage changes from the trend of GDP per capita.
Source: as Figure 2.

Was there real growth in Tuscany after the Black Death? The blunt answer is: no. Following Simon Kuznets’s seminal work, we know that modern economic growth is characterised by simultaneous growth in population and product, with the latter growing relatively faster. Furthermore, modern growth implies the continuous growth of product per capita. However, as this case study demonstrates, product per capita rose only because the population declined so dramatically, and Tuscan GDP per capita was highly volatile. Indeed, in some years it could fluctuate by 10 to 20 per cent, which would be highly unusual by present standards (although the current COVID-19 outbreak may yet produce even greater fluctuations in standards of living and mortality). Another difference between modern growth and growth in the Ancien Régime concerns structural change. Modern growth implies a relative rise in the product of industry and services and, consequently, a rise in urbanisation. In Renaissance Tuscany exactly the opposite occurred. In 1400, the urbanisation rate was half the level reached around 1300. Even 450 years later, the pre-plague level had not been regained; the rate achieved in 1300 was only surpassed at the start of the twentieth century.

 

To contact the author: malanima@unicz.it

 

Notes:

[1] M. Breschi, P. Malanima, Demografia ed economia in Toscana: il lungo periodo (secoli XIV-XIX), in M. Breschi, P. Malanima (eds.), Prezzi, redditi, popolazioni in Italia: 600 anni, Udine, Forum, 2002, pp. 109-42 (the demographic information that follows is taken from this paper).

[2] F. Rondinelli, Relazione del contagio stato in Firenze l’anno 1630 e 1633, Firenze, G.B. Landini, 1634.

[3] M. Villani, Cronica, in G. Villani, Cronica con le continuazioni di Matteo e Filippo, Torino, Einaudi, 1979, p. 295.

[4] R. Sardo, Cronaca di Pisa, O. Banti (ed.), Roma, Istituto Storico Italiano per il Medio Evo, 1963, p. 96.

[5] P. Malanima, The Economic Consequences of the Black Death, in E. Lo Cascio (ed.), L’impatto della “Peste Antonina”, Bari, Edipuglia, 2012, pp. 311-30.

[6] Quoted in S. Cohn, ‘After the Black Death: Labour Legislation and Attitudes towards Labour in late-medieval Western Europe’, Economic History Review, 60 (2007), p. 480.