COVID-19 and the food supply chain: Impacts on stock price returns and financial performance

This blog is part of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’.

By Julia Höhler (Wageningen University)

As growing evidence emerges about COVID-19, its effects on the human body, and its transmission mechanisms, economists are making progress in understanding the impact of the global pandemic on the food supply chain. While it is apparent that many companies were affected, the nature and magnitude of the effects continue to require investigation. A special issue of the Canadian Journal of Agricultural Economics on ‘COVID-19 and the Canadian agriculture and food sectors’ was among the first publications to examine the possible effects of COVID-19 on the food supply. In our ongoing work we take the next step and ask: how can we quantify the effects of COVID-19 on companies in the food supply chain?

Figure 1. Stylized image of supermarket shopping. Source: Oleg Magni, Pexels.

Stock prices as a proxy for the impact of COVID-19

One way to quantify the initial effects of COVID-19 on companies in the food supply chain is to analyse stock prices and their reaction over time. The theory of efficient markets states that stock prices reflect investors’ expectations regarding future dividends. If stock prices fluctuate strongly, this is a sign of lower expected returns and higher risks. Volatile stock markets can increase businesses’ financing costs and, in the worst case, threaten their liquidity. At the macroeconomic level, stock prices can also be useful in indicating the likelihood of a future recession. For our analysis of stock price reactions, we combined data from different countries and regions. In total, we collected stock prices for 71 large stock-listed companies from the US, Japan and Europe. The companies’ activities in our sample cover the entire supply chain, from farm equipment and supplies to agriculture, trade, food processing, distribution, and retailing.

Impact on stock price returns comparable to the 2008 financial crisis

We began by calculating the logarithmic daily returns for the companies’ stocks and their average. Second, we compared these average returns with the performance of the S&P 500. Figure 2, below, shows the development of average daily returns from 2005 to 2020. Companies in the S&P 500 (top) achieved higher returns on average, but also exhibited higher fluctuations than the average of the companies we examined (bottom). Stock price returns fluctuated particularly strongly during the 2008 financial crisis. The fluctuations from the first notification of COVID-19 to the WHO in early January 2020 to the end of April 2020 (red area) are comparable in magnitude. The negative fluctuations in this period are somewhat larger than in 2008. The comparison of the two charts suggests that stock price returns of large companies in the food supply chain were, on average, less affected by the two crises than the broader market. Nevertheless, a look at the long-term consequences of the 2008 financial crisis suggests that a wave of bankruptcies, lower financial performance and a loss of food security may still follow.
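As a minimal sketch of this first step, the calculation of logarithmic daily returns and their cross-sectional average looks like the following. The prices are invented for illustration; the actual dataset of 71 companies is not reproduced here.

```python
import numpy as np

def log_daily_returns(prices):
    """Logarithmic daily returns: r_t = ln(P_t / P_{t-1})."""
    prices = np.asarray(prices, dtype=float)
    return np.log(prices[1:] / prices[:-1])

# Invented closing prices for two hypothetical food-supply stocks
stock_a = [100.0, 102.0, 99.0, 101.0]
stock_b = [50.0, 49.5, 50.5, 51.0]

returns = np.vstack([log_daily_returns(stock_a),
                     log_daily_returns(stock_b)])
# Cross-sectional average return per trading day
average_returns = returns.mean(axis=0)
```

Averaging across companies day by day, as above, produces the kind of series plotted in Figure 2, which can then be set alongside the S&P 500.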

Figure 2. Average daily returns for the S&P 500 (top panel) and 71 food-supply companies (FSC, bottom panel), 2005-2020. Source: data derived from multiple sources; for further information, please contact the author.

Winners and losers in the sub-sectors

In order to obtain a more granular picture of the impact of COVID-19, we divided the companies in our sample into sub-sectors and calculated their stock price volatility between January and April 2020. Whereas food retailers and breweries experienced relatively low volatility in stock prices, food distributors and manufacturers of fertilizers and chemicals experienced relatively high volatility. In order to cross-validate these results, we collected information on realized profits or losses from the companies’ financial reports. The trends observed in stock prices are also reflected in company results for the first quarter of 2020. Food retailers were able to increase their profits in times of crisis, while food distributors recorded high losses compared to the previous period. These results are likely related to the lockdowns and social-distancing measures which altered food distribution channels.
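A common way to measure the volatility of a stock over a window (the exact estimator behind our sub-sector figures is not spelled out in this post) is the sample standard deviation of its daily log returns. A minimal sketch with invented numbers:

```python
import numpy as np

def volatility(log_returns):
    """Sample standard deviation of daily log returns over a window."""
    return np.std(np.asarray(log_returns, dtype=float), ddof=1)

# Invented daily log returns for a January-April window
retailer    = [0.002, -0.001, 0.003, 0.000, 0.001]   # a hypothetical food retailer
distributor = [0.040, -0.060, 0.050, -0.070, 0.030]  # a hypothetical food distributor

low_vol  = volatility(retailer)
high_vol = volatility(distributor)
```

In this toy example the distributor’s returns swing far more widely than the retailer’s, mirroring the pattern described above.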

Longer-term effects

Just as a vaccine for COVID-19 is still in the pipeline, research into the effects of COVID-19 needs time to show what makes companies resilient to unpredictable shocks of this magnitude. Possible research topics include whether local value chains are better suited to cushioning the effects of a pandemic and maintaining food security. Further work is also needed to understand fully the associated trade-offs between food security, profitability, and climate-change objectives. Another research question relates to the effects of government protective measures and company support programmes. Cross-country studies can provide important insights here. Our project lays the groundwork for future research into the effects of shocks on companies in the food value chain. By combining different data sources, we were able to compare stock returns in times of COVID-19 with those of the 2008 crisis, and to identify differences between sub-sectors. In the next step we will use company characteristics such as profitability to explain differences in returns.

To contact the author: julia.hoehler[at] wur.nl

Airborne diseases: Tuberculosis in the Union Army

by Javier Birchenall (University of California, Santa Barbara)

This is Part F of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article from this blog was published in Explorations in Economic History and is available here.

1910 advertising postcard for the National Association for the Prevention of Tuberculosis. 

Tuberculosis (TB) is one of the oldest and deadliest diseases. Traces of TB in humans can be found as early as 9,000 years ago, and written accounts date back 3,300 years in India. Untreated, TB’s case-fatality rate is as high as 50 per cent. It was a dreaded disease. TB is an airborne disease caused by the bacterium Mycobacterium tuberculosis. Tuberculosis spreads through the air when a person who has an active infection coughs, sneezes, speaks, or sings. Most cases remain latent and do not develop symptoms. Activation of tuberculosis is particularly influenced by undernutrition.

Tuberculosis played a prominent role in the secular mortality decline. Of the 27 years of life expectancy gained in England and Wales between 1871 and 1951, TB accounts for about 40 per cent of the improvement, a 12-year gain. Modern medicine, the usual suspect invoked to explain this mortality decline, could not have been the cause. As Thomas McKeown famously pointed out, TB mortality started its decline long before the tubercle bacillus was identified and long before an effective treatment was available (Figure 1). McKeown viewed improvements in economic and social conditions, especially improved diets, as the principal factor in combatting tuberculosis. A healthy diet, however, is not the only factor behind nutritional status. Infections, no matter how mild, reduce nutritional status and thereby increase susceptibility to further infection.

Figure 1. Mortality rate from TB.


Source: as per original article

In “Airborne Diseases: Tuberculosis in the Union Army” I studied the determinants of diagnosis, discharge, and mortality from tuberculosis in the past. I examined the medical histories of 25,000 soldiers and veterans in the Union Army using data collected under the direction of Robert Fogel. The Civil War brought together soldiers from many socioeconomic conditions and ecological backgrounds into an environment which was ideal for the spread of this disease. The war also provided a unique setting to examine many of the factors which were likely responsible for the decline in TB mortality. Before enlistment, individuals had differential exposure to harmful dust and fumes. They also faced different disease environments and living conditions. By housing recruits in confined spaces, the war exposed soldiers to a host of waterborne and airborne infections. In the Civil War, disease was far more deadly than battle.

The Union Army data contain detailed medical records and measures of nutritional status. Height at enlistment measures net nutritional experiences at early ages. Weight, needed to measure current nutritional status via the Body Mass Index (BMI), is available for war veterans. My estimates use a hazard model and a variety of controls aligned with existing explanations for the decline in TB prevalence and fatality rates. By how much would the diagnosis of TB have declined if the average Union Army soldier had the height of the current U.S. male population, and if all relevant infections diagnosed prior to TB had been eradicated? Figure 2 presents the contribution of the predictors of TB diagnosis in soldiers who did not engage in battle, and Figure 3 reports the corresponding contributions for soldiers discharged because of TB. Nutritional experiences in early life provided a protective effect against TB. Between 25 and 50 per cent of the predictable decline in tuberculosis can be associated with the modern increase in height. Declines in the risk of waterborne and airborne diseases are as important as the predicted changes in height.
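For readers unfamiliar with hazard models, the core idea can be sketched in discrete time: the hazard at time t is the share of those still at risk who experience the event then. The data below are invented, and the actual model in the paper is far richer, with covariates such as height and prior infections.

```python
import numpy as np

# Invented follow-up data: years observed, and whether a TB diagnosis
# occurred (1) or the soldier left observation without one (0).
times  = np.array([1, 2, 2, 3, 4, 4, 5, 5])
events = np.array([1, 0, 1, 1, 0, 1, 0, 1])

def discrete_hazard(times, events):
    """Empirical discrete-time hazard: events at t divided by those still at risk at t."""
    return {int(t): ((times == t) & (events == 1)).sum() / (times >= t).sum()
            for t in np.unique(times)}

hazard = discrete_hazard(times, events)
```

Regression-style hazard models generalise this by letting the hazard depend on covariates, which is what allows the counterfactual questions about height and prior infections posed above.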

 

Figure 2. Contribution of various factors to the decline in TB diagnosis

Source: as per original article

 

Figure 3. Contribution of various factors to the decline in discharges because of TB.

Source: as per original article

My analysis showed that a wartime diagnosis of TB increased the risk of tuberculosis mortality. Because of the chronic nature of the disease, infected soldiers likely developed a latent or persistent infection that became active when resistance failed in old age. Nutritional status provided some protection against mortality. For veterans, height was not as robust a predictor as BMI. If a veteran’s BMI increased from its historical value of 23 to the current level of 27, his mortality risk from tuberculosis would have been reduced by 50 per cent. Overall, changes in ‘pure’ diets and changes in infectious-disease exposure probably contributed in equal measure.

What lessons can be drawn for the current Covid-19 pandemic? Covid-19 is also an airborne disease. Airborne diseases (e.g., influenza, measles, smallpox, and tuberculosis) are difficult to control. In populations unfamiliar with them, they often wreak havoc. But influenza, measles, smallpox, and tuberculosis are mostly killers of the past. The findings in my paper suggest that the conquest of tuberculosis happened through both individual and public health efforts. Improvements in diets and public health worked simultaneously and synergistically. There was no silver bullet to defeat the great white plague, tuberculosis. Diets are no longer as inadequate as in the past. Still, Covid-19 has exposed differential susceptibility to disease. Success in combatting Covid-19 is likely to require simultaneous and synergistic private and public efforts.

After the Black Death: labour legislation and attitudes towards labour in late-medieval western Europe

by Samuel Cohn Jr. (University of Glasgow)

This blog forms Part F of the EHS series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’.

The full article from this blog post was published in the Economic History Review and is available for members at this link.

Poussin, The Plague of Ashdod, 1630. Available at <https://en.wikipedia.org/wiki/Plague_of_Ashdod_(Poussin)#/media/File:The_plague_of_ashdod_1630.jpg>

In the summer of 1999, I presented the kernel of this article at a conference in San Miniato al Tedesco in memory of David Herlihy. It was then limited to labour decrees I had found in the Florentine archives from the Black Death to the end of the fourteenth century. A few years later I thought of expanding the paper into a comparative essay. The prime impetus came from teaching the Black Death at the University of Glasgow. Students (and, I would say, many historians) think that England was unique in promulgating price and wage legislation after the Black Death: the famous Ordinance and Statute of Labourers of 1349 and 1351. In fact, I did not then know how extensive this legislation was, and the details of its geographical distribution remain unknown today.

A second impetus for writing the essay concerned a consensus in the literature on this wage legislation principally in England: that these decrees followed the logic of the laws of supply and demand. In short, with the colossal mortalities of the Black Death, 1347-51, the greatly diminished supply of labour meant that wage earners in cities and the countryside could demand excessive increases that threatened the livelihoods of elite rentiers — the church, nobility, and merchants. After all, this is what chroniclers and other contemporary commentators across Europe — Henry Knighton, Matteo Villani, William Langland, Giovanni Boccaccio, and many others — tell us in their scorning reproaches to greedy labourers. As ‘Hunger’ in book IV of Piers the Ploughman sneered:

And draught-ale was not good enough for them, nor a hunk of bacon, but they must have fresh meat or fish, fried or baked and chaud or plus chaud at that.

In addition, across the political spectrum from Barry Dobson to Rodney Hilton, Bertha Putnam’s study of the English laws (published in 1908) continued to be proclaimed as the definitive authority on these laws, despite her lack of quantitative analysis and despite her central conclusion: that the peasants were guilty of ‘extortionate greed’ and that for this reason ‘these laws were necessary and just’ (Enforcement of the Statutes of Labourers, pp. 219-20). Yet, across Europe, while nominal wages may have trebled through the 1370s, prices for basic commodities rose faster, leaving the supposedly heartless labourers worse off than they had been before the Black Death. As George Holmes discovered in 1957, the class to profit most in England from the post-plague demographics was the supposedly victimized nobility.

Through primary and secondary sources, my article then researched wage and price legislation across a wide ambit of Europe — England, the Ile de France, the Low Countries, Provence, Aragon, Castile, Catalonia, Florence, Bologna, Siena, Orvieto, Milan, and Venice. Certainly, research needs to be extended further, to places where these laws were enacted and to where they appear not to have been, as in Scotland and the Low Countries, and to ask what difference the laws may have meant for economic development. However, from the places I examined, no simple logic arose, whether of supply and demand or the aims that might have been expected given differences in political regimes. Instead, municipal and royal efforts to control labour and artisans’ prices splintered in numerous and often contradictory directions, paralleling anxieties and needs to attribute blame as seen in other Black Death horrors: the burning of Jews and the murder of Catalans, beggars, and priests.

In conclusion, a history of the Black Death and its long-term consequences for labour can provide insights into our present predicament with Covid-19. We can anticipate that the outcomes of the present pandemic will not be the same across countries or continents. Similarly, for the Black Death and successive waves of plague through the fourteenth century, there were winners and losers. Yet, surprisingly, few historians have attempted to chart these differences, and fewer still to explain them. Instead, medievalists have concentrated more on the Black Death’s grand transformations, and these may serve as a welcome tonic for our present pandemic despair, especially as concerns labour. Eventually, post-Black-Death populations experienced increases in caloric intake, greater variety in diet, better housing, consumption of more luxury goods, increased leisure time, and leaps in economic equality. Moreover, governments such as Florence, even before revolts or regime change, could learn from their initial economic missteps. With the second plague in 1363, they abandoned their laws entitled ‘contra laboratores’, which had endangered their thin supplies of labour, and passed new decrees granting tax exemptions to attract labourers into their territories. Other regions (the Ile de France, Aragon, Provence, and Siena) abandoned these initial prejudicial laws almost immediately. Even in England, despite ever more stringent laws against labourers that lasted well into the fifteenth century, law enforcers learnt to turn a blind eye, allowing landlords and peasants to cut mutually beneficial deals that enabled steady work, rising wages, improving working conditions, and greater freedom of movement.

Let us remember that these grand transformations did not occur overnight. The switch in economic policies to benefit labourers and the narrowing of the gap between rich and poor did not begin to show effects until a generation after the Black Death; in some regions not until the early fifteenth century.

Demand slumps and wages: History says prepare to bargain

by Judy Z. Stephenson (Bartlett Faculty of the Built Environment, UCL)

This blog is part of the EHS series on The Long View on Epidemics, Disease and Public Health: Research from Economic History.

Big shifts and stops in supply, demand, and output hark back to pre-industrial days, and they carry lessons for today’s employment contracts and wage bargains.

Canteen at the National Projectile Factory
Munitions factory in Lancaster, c. 1917.
Image courtesy of Lancaster City Museum. Available at <http://www.documentingdissent.org.uk/munitions-factories-in-lancaster-and-morecambe/>

Covid-19 has brought the world to a slump of unprecedented proportions. Beyond immediate crises in healthcare and treatment, the biggest impact is on employment. Employers, shareholders and policymakers are struggling to come to terms with the implications of being ‘closed for business’ for an unspecified length of time, and laying off workers seems the most common response, even though unprecedented government support packages for firms and workers have heralded the ‘return of the state’, and the fiscal implications have provoked wartime comparisons.

There is one very clear difference between war and the current pandemic: mobilisation. Historians tend to look on times of war as times of full employment and high demand (1). A concomitant slump in demand and a huge surplus of demobilised labour were associated with the depression in real wages and labour markets in the peacetime years after 1815. That slump accompanied increasing investment in large-scale factory production, particularly in the textile industry. The decades that followed are some of the best documented in labour history (2), and they are characterised by frequent stoppages, down-scalings and restarts in production. They should be of interest now because they are the story of how modern capitalist producers learned to set and bargain for wages, to ensure they had the skills they needed, when they needed them, to produce efficiently. Much of what employers and workers learned over the nineteenth century is directly pertinent to problems that currently face employers, workers, and the state.

Before the early nineteenth century in England – or elsewhere for that matter – most people were simply not paid a regular weekly wage, or in fact paid for their time at all (3). Very few people had a ‘job’. Shipwrights, building workers and some common labourers (in all, maybe 15% of workers in early modern economies) were paid ‘by the day’, but the hours or output that a ‘day’ involved were varied and indeterminate. The vast majority of pre-industrial workers were not paid for their time, but for what they produced.

These workers earned piece rates, much as today’s delivery riders earn ‘per drop’, Uber drivers earn ‘per ride’, and garment workers earn per unit made. When the supply of materials failed, or demand for output stalled, workers were not paid, irrespective of whether they could work or not. Blockades, severe weather, famine, plague, financial crises, and unreliable supplies all stopped work, and so payment of wages ended. Stoppages were natural and expected. Historical records indicate that in many years commercial activity and work slowed to a trickle in January and February. Households subsisted on savings or credit before they could start earning again, or parishes and the poor law provided bare subsistence in the interim. Notable characteristics of pre-industrial wages – by piecework and otherwise – were wage posting and nominal rate rigidity, or lack of wage bargaining. Rates for some work did not change for almost a century, and the risk of no work seems to have been accounted for on both sides (4).

Piecework, or payment for output, is a system of wage formation of considerable longevity, and its purpose was always to protect employers from labour costs in uncertain conditions. It seems attractive because it transfers the risks associated with output volatility from the employer to the worker. Such practices are the basis of today’s ‘gig’ economy. Some workers – those in their prime who are skilled and strong – tend to do well out of the system, and enjoy being able to increase their earnings with effort. This is the flexibility of the gig economy that some relish today. But it is less effective for those who need to be trained or managed, older workers, or anyone who has to limit their hours.

However, piecework or gig wage systems have risks for the employer. In the long run, we know piece bargains break down, or become unworkably complex, as both workers and employers behave opportunistically (5). Where firms need skilled workers to produce quickly, or want to invest in firm- or industry-specific human capital to increase competitiveness through technology, they can suddenly find themselves outpriced by competitors, or with a labour force with a strong leisure preference or, indeed, a labour shortage. Such conditions characterised early industrialisation. In the British textile industry this opportunism created and exacerbated stoppages throughout the nineteenth century. After each stoppage both employers and workers sought to change rates. But new bargains were difficult to agree. Employers tried to cut costs. Labour struck. Bargaining for wages impeded efficient production.

Eventually, piecework bargains formed implicit, more stable contracts, and ‘invisible handshakes’ paved the way to the relative stability of hourly wages and the hierarchy of skills in factories (though the mechanism by which this happened is contested) (6). The form of the wage slowly changed to payment by the hour or unit of time. Employers worked out that ‘fair’ regular wages (or efficiency wages) and a regular workforce served them better in the long run than trying to save labour costs through stoppages. Unionisation bettered working conditions and the security of contracts. The Trade Boards Act of 1909 regulated the wages of industries still operating minimal piece rates, and ushered in the era of collective wage bargaining as the norm, which ended only with the labour-market policies of Thatcherism and subsequent governments.

So far in the twenty-first century, although there has been a huge shift to self-employment, gig wage formation and non-traditional jobs (7), we have not experienced the bitter bargaining that characterised the shift from piecework to time work two hundred years ago, or the unrest of the 1970s and early 1980s. Some of this is probably down to the decline in output volatility that accompanied increased globalisation since the ‘Great Moderation’, and to the extraordinarily low levels of unemployment in most economies in the last decade (8). Covid-19 brings output volatility back, in a big, unpredictable way, and the history of wage bargaining indicates that when factors of production are subject to shocks, bargaining is costly. Employers who want to rehire workers who have been unpaid for months may find established wage bargains no longer hold. Shelf stackers who have risked their lives on zero-hours contracts may now think that their pay rate per hour should reflect this risk. Well-paid professionals incentivised by performance-related pay are discovering the precarity of ‘eat what you kill’, and may find that their basic pay does not reflect the preparatory work they need to do in conditions that will not let them perform. Employers facing the same volatility might try to change rates, and many have already moved to cut wages.

Today’s state guarantee of many workers’ incomes, unthinkable in the nineteenth-century laissez-faire state, is welcome and necessary. That today’s gig-economy workers have made huge strides towards attaining full employment rights would also appear miraculous to most pre-industrial workers. Yet contracts and wage formation matter. With increasing numbers of workers without job security, and essential services suffering demand and supply shocks, many workers and employers are likely to confront significant shifts in employment. History suggests that bargaining over them is not as easy a process as the last thirty years have led us to believe.

 

To contact the author: 

j.stephenson@ucl.ac.uk

@judyzara

 

References:

(1). Allen, R. (2009). Engels’ pause: Technical change, capital accumulation, and inequality in the British industrial revolution. Explorations in Economic History, 46(4), 418-435; Broadberry et al, (2015). British Economic Growth, 1270-1870. CUP.

(2). Huberman. M., (1996) Escape from the Market, CUP, chapter 2.

(3). Hatcher, J., and Stephenson, J.Z. (Eds.), (2019) Seven Centuries of Unreal Wages, Palgrave Macmillan

(4). J. Stephenson and P. Wallis, ‘Imperfect competition’, LSE Working Paper (forthcoming).

(5). Brown, W. (1973) Piecework Bargaining, Heinemann.

(6). See debates between Huberman, Rose, Taylor and Winstanley in Social History, 1987-89.

(7). Katz, L., & Krueger, A. (2016). The Rise and Nature of Alternative Work Arrangements in the United States, 1995-2015. NBER Working Paper Series.

(8). Fang, W., & Miller, S. (2014). Output Growth and its Volatility: The Gold Standard through the Great Moderation. Southern Economic Journal, 80(3), 728-751.