Give Me Liberty Or Give Me Death

by Richard A. Easterlin (University of Southern California)

This blog is part G of the Economic History Society’s blog series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’. The full article on which this blog is based is “How Beneficent Is the Market? A Look at the Modern History of Mortality”, European Review of Economic History 3, no. 3 (1999): 257-94. https://doi.org/10.1017/S1361491699000131

 

A child is vaccinated, Brazil, 1970.

Patrick Henry’s memorable plea for independence unintentionally also captured the long history of conflict between the free market and public health, a conflict evident in the United States’ current struggle with the coronavirus. Efforts to contain the virus have centered on measures to forestall transmission of the disease, such as stay-at-home orders, social distancing, and avoiding large gatherings, each of which infringes on individual liberty. These measures have given birth to a resistance movement objecting to violations of personal freedom.

My 1999 article posed the question “How Beneficent is the Market?” The answer, based on “A Look at the Modern History of Mortality”, was straightforward: because of the ubiquity of market failure, public intervention was essential to achieve control of major infectious diseases. This intervention centered on the creation of a public health system. “The functions of this system have included, in varying degrees, health education, regulation, compulsion, and the financing or direct provision of services.”

Regulation and compulsion, and the consequent infringement of individual liberties, have always been critical building blocks of the public health system. Even before the formal establishment of public health agencies, regulation and compulsion were features of measures aimed at controlling the spread of infectious disease in mid-19th-century Britain. The “sanitation revolution” led to the regulation of water supply and sewage disposal and, in time, to the regulation of slum building conditions. As my article notes, there was fierce opposition to these measures:

“The backbone of the opposition was made up of those whose vested interests were threatened: landlords, builders, water companies, proprietors of refuse heaps and dung hills, burial concerns, slaughterhouses, and the like … The opposition appealed to the preservation of civil liberties and sought to debunk the new knowledge cited by the public health advocates …”

The greatest achievement of public health was the eradication of smallpox, the only disease so far eliminated from the face of the earth. Smallpox was the scourge of humankind until Edward Jenner’s discovery of a vaccine in 1798. Throughout the 19th and 20th centuries, requirements for smallpox vaccination were fiercely opposed by anti-vaccinationists. In 1959 the World Health Organization embarked on a program to eradicate the disease. Over the ensuing two decades its efforts to persuade governments worldwide to require vaccination of infants were eventually successful, and in 1980 WHO officially declared the disease eradicated. Public health eventually triumphed over liberty, but it took almost two centuries to realize Jenner’s hope that vaccination would annihilate smallpox.

In the face of the coronavirus pandemic, the U.S. market-based health care system has demonstrated once again the inability of the market to deal with infectious disease, and the need for forceful public intervention. The current health care system requires that:

 “every player, from insurers to hospitals to the pharmaceutical industry to doctors, be financially self-sustaining, to have a profitable business model. It excels in expensive specialty care. But there’s no return on investment in being positioned for the possibility of a pandemic” (Rosenthal 2020).

Commercial and hospital labs were slow to respond to the need to develop a test for the virus. Once tests became available, conducting them was handicapped by insufficient supplies: kits, chemical reagents, swabs, masks and other personal protective equipment. In hospitals, ventilators were also in short supply. These deficiencies reflected the lack of profit in meeting such needs, and the reluctance of government to compensate for market failure.

At the present time, the halting efforts of federal public health authorities and of state and local officials to impose quarantine and “shelter at home” measures have been seriously handicapped by public protests over the infringement of civil liberties, mounted by the current-day heirs of the 19th- and 20th-century dissidents. States are opening for business well in advance of the guidelines of the Centers for Disease Control and Prevention. The lesson of history regarding such actions is clear: the cost of liberty is sickness and death. But do we learn from history? Sadly, one is put in mind of Warren Buffett’s aphorism: “What we learn from history is that people don’t learn from history.”

 

Reference

Rosenthal, Elisabeth, “A Health System Set Up to Fail”, New York Times, May 8, 2020, p. A29.

 

To contact the author: easterl@usc.edu

Coronavirus from the perspective of 17th century plague

by Neil Cummins (LSE), Morgan Kelly (University College Dublin), Cormac Ó Gráda (University College Dublin)

A repost from VoxEU.org

Between 1563 and 1665, London experienced four plagues that each killed one fifth of the city’s inhabitants. This column uses 790,000 burial records to track the plagues that recurred across London (epidemics typically endured for six months). Possibly carried and spread by body lice, plague always originated in the poorest parishes; self-segregation by the affluent gradually halved their death rate compared with poorer Londoners. The population rebounded within two years, as new migrants arrived in the city “to fill dead men’s shoes”.

Full article available at VoxEU.org: “Coronavirus from the perspective of 17th century plague”.


Plague and Renaissance in Tuscany

by Paolo Malanima (Magna Græcia University of Catanzaro)

This blog forms part D of the EHS series: ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’.

The full paper on which this blog post is based was published in the Economic History Review and is available here.

 

The Triumph of Death. Source: Francesco Petrarca, Trionfi: Ms., 1081. Biblioteca Nazionale, Rome.

In 1346, astrologers announced that the conjunction of three planets foretold great and serious events (“grandi e gravi novitadi”).[3] This was quickly confirmed: early in January 1348, two Genoese galleys landed at the port of Pisa, a few kilometres from the city centre. The galleys had begun their voyage at Caffa (Teodosia) on the Black Sea, had stopped earlier in Messina, and were en route to Genoa. After landing at Pisa, the mariners went to the marketplace, where many of the locals subsequently became ill and quickly died. Panic gripped the city’s inhabitants (“fu sparto lo grande furore per tucta la cictà di Pisa”).[4]

From Pisa, the Black Death commenced its march across Europe. Within months, it was chronicled that nearly 80 per cent of Pisa’s inhabitants had died. By March 1348 the first cases of plague had occurred in Florence, and the disease went on to affect other nearby Tuscan cities, progressing at a speed of one kilometre per day.

At the time of the Black Death, Tuscany was one of the most populated areas of Europe, with approximately one million inhabitants.[1] In the fourteenth century, the population density of Tuscany was approximately three times that of Europe (excluding Russia), and roughly twice that of England and Wales.

The first wave of the plague was followed by several other attacks: according to a seventeenth-century writer, between 1348 and 1527 there were at least 19 further outbreaks.[2] The Tuscan population reached its minimum around 1440, with barely 400,000 inhabitants. It began to recover slowly from the middle of the fifteenth century, reaching 900,000 inhabitants by 1600. The birthplace of the European Renaissance was thus one of the most devastated regions of Europe. The last assault of the plague occurred in 1629-30, after which the disease disappeared from Tuscany.

Figure 2. Part A: Florence price index, 1310-1450; Part B: daily real wage rates of masons, 1310-1450. Note that in Part A the price index is set to 1 on average over the period 1420-40, and in Part B the nominal wage is deflated by the price of the consumption basket.
Source: P. Malanima, ‘Italy in the Renaissance: a Leading Economy in the European context, 1350-1550’, Economic History Review , 71 (2018), pp. 3-30.
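The caption’s two constructions can be written out explicitly; the following is a minimal sketch in my own notation (the symbols are not the article’s). Let $c_t$ denote the money cost of the consumption basket in year $t$ and $w_t$ the nominal daily wage of masons. Then, roughly,

$$P_t = \frac{c_t}{\bar{c}_{1420\text{-}40}}, \qquad \omega_t = \frac{w_t}{c_t},$$

where $\bar{c}_{1420\text{-}40}$ is the average cost of the basket over 1420-40, so the price index $P_t$ averages 1 over the base period, and the real wage $\omega_t$ is measured in baskets per day.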

What were the economic effects of these outbreaks of plague? The main effect was a sudden change in the ratio of the factors of production. The plague destroyed humans, but not the capital stock (buildings, tools), natural capital (that is, physical resources), or human capital (knowledge). Consequently, the capital stock per worker increased and, therefore, so did labour productivity. With few exceptions, the consequences of the Black Death were similar across Europe.[5]
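To see why the logic runs from a higher capital-labour ratio to higher labour productivity, a deliberately simple sketch helps; the Cobb-Douglas functional form below is my illustrative assumption, not the article’s:

$$Y = A\,K^{\alpha}L^{1-\alpha}, \quad 0<\alpha<1 \quad\Longrightarrow\quad \frac{Y}{L} = A\left(\frac{K}{L}\right)^{\alpha}.$$

A plague that destroys labour $L$ while leaving the combined stock of physical, natural and human capital $K$ intact raises $K/L$, and with it output per worker $Y/L$.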

In Tuscany, which suffered frequent and powerful outbreaks of the plague, the ratio between production factors changed the most, leading to a decline in output prices (Figure 2, Part A). The fall in prices was immediate after the Black Death, although, because of bad harvests and military events, an apparent reversal of the trend occurred at the end of the century. Similarly, the price of labour only rose above its base-period level from about 1450-70 (Figure 2, Part B). These changes were recognised by the Florentine government, which noted in a decree of 1348 that, while “many citizens had suddenly become the poor, the poor had become rich”.[6]

The curve of Tuscan GDP per capita is shown in Figure 3. The trend began to rise soon after the main outbreak of the plague, reached its maximum around 1410-60, and started to decline once the population began to recover after the middle of the fifteenth century. At its peak, per capita GDP in Tuscany was higher than elsewhere in Europe: in the first half of the fifteenth century its annual level was about 2,500 present-day euros, compared with 2,000 euros in 1861 (the date of national unification).

Figure 3. Real per capita GDP (Florentine lire, 1420-40 prices). Note: the lower curve shows the yearly percentage deviations of GDP per capita from its trend.
Source: as Figure 2.

Was there real growth in Tuscany after the Black Death? The blunt answer is: no. Following Simon Kuznets’s seminal work, we know that modern economic growth is characterised by the simultaneous growth of population and product, with the latter growing relatively faster; it also implies the continuous growth of product per capita. As this case study demonstrates, however, product per capita rose only because the population declined so dramatically, and Tuscan GDP per capita was highly volatile. Indeed, in some years it could fluctuate by 10 to 20 per cent, which would be highly unusual by present standards (although the current COVID outbreak might mean that there will be even greater fluctuations in standards of living and mortality).

Another difference between modern growth and growth in the Ancien Régime concerns structural change. Modern growth implies a relative rise in the product of industry and services and, consequently, a rise in urbanisation. In Renaissance Tuscany exactly the opposite occurred: in 1400, the urbanisation rate was half the level reached around 1300, and even 450 years later the pre-plague level had not been attained. The rate achieved in 1300 was only surpassed at the start of the twentieth century.

 

To contact the author: malanima@unicz.it

 

Notes:

[1] M. Breschi, P. Malanima, ‘Demografia ed economia in Toscana: il lungo periodo (secoli XIV-XIX)’, in M. Breschi, P. Malanima (eds.), Prezzi, redditi, popolazioni in Italia: 600 anni, Udine, Forum, 2002, pp. 109-42 (the demographic information given above is drawn from this paper).

[2] F. Rondinelli, Relazione del contagio stato in Firenze l’anno 1630 e 1633, Firenze, G.B. Landini, 1634.

[3] M. Villani, Cronica, in G. Villani, Cronica con le continuazioni di Matteo e Filippo, Torino, Einaudi, 1979, p. 295.

[4] R. Sardo, Cronaca di Pisa, O. Banti (ed.), Roma, Istituto Storico Italiano per il Medio Evo, 1963, p. 96.

[5] P. Malanima, The Economic Consequences of the Black Death, in E. Lo Cascio (ed.), L’impatto della “Peste Antonina”, Bari, Edipuglia, 2012, pp. 311-30.

[6] Quoted in S. Cohn, ‘After the Black Death: Labour Legislation and Attitudes towards Labour in late-medieval Western Europe’, Economic History Review, 60 (2007), p. 480.

 

 

 

Demand slumps and wages: History says prepare to bargain

by Judy Z. Stephenson (Bartlett Faculty of the Built Environment, UCL)

This blog is part of the EHS series on ‘The Long View on Epidemics, Disease and Public Health: Research from Economic History’.

Big shifts and stops in supply, demand, and output hark back to pre-industrial days, and they carry lessons for today’s employment contracts and wage bargains.

Canteen at the National Projectile Factory
Munitions factory in Lancaster, c. 1917.
Image courtesy of Lancaster City Museum. Available at <http://www.documentingdissent.org.uk/munitions-factories-in-lancaster-and-morecambe/>

Covid-19 has brought the world to a slump of unprecedented proportions. Beyond the immediate crises in healthcare and treatment, the biggest impact is on employment. Employers, shareholders and policymakers are struggling to come to terms with the implications of being ‘closed for business’ for an unspecified length of time, and laying off workers seems the most common response, even though unprecedented government support packages for firms and workers have heralded the ‘return of the state’, and their fiscal implications have provoked wartime comparisons.

There is one very clear difference between war and the current pandemic: mobilisation. Historians tend to look on times of war as times of full employment and high demand (1). A concomitant slump in demand and a huge surplus of demobilised labour were associated with the depression in real wages and labour markets in the peacetime years after 1815. That slump accompanied increasing investment in large-scale factory production, particularly in the textile industry. The decades that followed are some of the best documented in labour history (2), and they are characterised by frequent stoppages, down-scaling and restarts in production. They should be of interest now because they tell the story of how modern capitalist producers learned to set and bargain for wages so as to ensure they had the skills they needed, when they needed them, to produce efficiently. Much of what employers and workers learned over the nineteenth century is directly pertinent to the problems that currently face employers, workers, and the state.

Before the early nineteenth century in England – or elsewhere for that matter – most people were simply not paid a regular weekly wage, or in fact paid for their time at all (3). Very few people had a ‘job’. Shipwrights, building workers and some common labourers (in all, maybe 15% of workers in early modern economies) were paid ‘by the day’, but the hours or output that a ‘day’ involved were varied and indeterminate. The vast majority of pre-industrial workers were paid not for their time but for what they produced.

These workers earned piece rates, much as today’s delivery riders earn ‘per drop’, Uber drivers earn ‘per ride’, and garment workers are paid per unit made. When the supply of materials failed, or demand for output stalled, workers were not paid, irrespective of whether they could work or not. Blockades, severe weather, famine, plague, financial crises and unreliable supplies all stopped work, and with it the payment of wages. Stoppages were natural and expected: historical records indicate that in many years commercial activity and work slowed to a trickle in January and February. Households subsisted on savings or credit before they could start earning again, or parishes and the poor law provided bare subsistence in the interim. Notable characteristics of pre-industrial wages – by piecework and otherwise – were wage posting and nominal rate rigidity, or the absence of wage bargaining. Rates for some work did not change for almost a century, and the risk of no work seems to have been accounted for on both sides (4).

Piecework, or payment for output, is a system of wage formation of considerable longevity, and its purpose was always to protect employers from labour costs in uncertain conditions. It seems attractive because it transfers the risks associated with output volatility from the employer to the worker. Such practices are the basis of today’s ‘gig’ economy. Some workers – those in their prime who are skilled and strong – tend to do well out of the system, and enjoy being able to increase their earnings with effort. This is the flexibility of the gig economy that some relish today. But it is less effective for those who need to be trained or managed, older workers, or anyone who has to limit their hours.
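A stylised expected-value calculation makes the risk transfer explicit (the numbers and notation are illustrative, not drawn from the article). Suppose a stoppage halts work in a given week with probability $p$. A pieceworker paid rate $r$ per unit, producing $q$ units in a normal week, expects

$$E[\text{pay}] = (1-p)\,r\,q,$$

so the full cost of a stoppage falls on the worker, whose pay is zero whenever output is zero. Under a fixed time wage $w$, expected pay is $w$ regardless of $p$: the employer now bears the cost of idle weeks.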

However, piecework or gig wage systems also carry risks for the employer. In the long run, we know that piece bargains break down, or become unworkably complex, as both workers and employers behave opportunistically (5). Where firms need skilled workers to produce quickly, or want to invest in firm- or industry-specific human capital to increase competitiveness through technology, they can suddenly find themselves outpriced by competitors, or left with a labour force with a strong leisure preference or, indeed, a labour shortage. Such conditions characterised early industrialisation. In the British textile industry this opportunism created and exacerbated stoppages throughout the nineteenth century. After each stoppage both employers and workers sought to change rates, but new bargains were difficult to agree. Employers tried to cut costs; labour struck. Bargaining for wages impeded efficient production.

Eventually, piecework bargains formed implicit, more stable contracts, and ‘invisible handshakes’ paved the way to the relative stability of hourly wages and the hierarchy of skills in factories (though the mechanism by which this happened is contested) (6). The form of the wage slowly changed to payment by the hour or unit of time. Employers worked out that ‘fair’ regular wages (or efficiency wages) and a regular workforce served them better in the long run than trying to save labour costs through stoppages. Unionisation improved working conditions and the security of contracts. The Trade Boards Act of 1909 regulated the wages of industries still operating minimal piece rates, and ushered in the era of collective wage bargaining as the norm, which ended only with the labour market policies of Thatcherism and subsequent governments.

So far in the twenty-first century, although there has been a huge shift to self-employment, gig wage formation and non-traditional jobs (7), we have not experienced the bitter bargaining that characterised the shift from piecework to time work two hundred years ago, or the unrest of the 1970s and early 1980s. Some of this is probably down to the decline in output volatility that has accompanied increased globalisation since the ‘Great Moderation’, and to the extraordinarily low levels of unemployment in most economies over the last decade (8). Covid-19 brings output volatility back in a big, unpredictable way, and the history of wage bargaining indicates that when factors of production are subject to shocks, bargaining is costly. Employers who want to rehire workers who have been unpaid for months may find that established wage bargains no longer hold. Shelf stackers who have risked their lives on zero-hours contracts may now think that their hourly rate should reflect that risk. Well-paid professionals incentivised by performance-related pay are discovering the precarity of ‘eat what you kill’, and may find that their basic pay does not reflect the preparatory work they must do in conditions that will not let them perform. Employers facing the same volatility might try to change rates, and many have already moved to cut wages.

Today’s state guarantee of many workers’ incomes, unthinkable in the nineteenth-century laissez-faire state, is welcome and necessary. That today’s gig economy workers have made huge strides towards attaining full employment rights would also appear miraculous to most pre-industrial workers. Yet contracts and wage formation matter. With increasing numbers of workers lacking job security, and essential services suffering demand and supply shocks, many workers and employers are likely to confront significant shifts in employment. History suggests that bargaining over those shifts is not as easy a process as the last thirty years have led us to believe.

 

To contact the author: 

j.stephenson@ucl.ac.uk

@judyzara

 

References:

(1). Allen, R. (2009). ‘Engels’ pause: Technical change, capital accumulation, and inequality in the British industrial revolution’, Explorations in Economic History, 46(4), 418-435; Broadberry, S. et al. (2015). British Economic Growth, 1270-1870. CUP.

(2). Huberman, M. (1996). Escape from the Market. CUP, chapter 2.

(3). Hatcher, J. and Stephenson, J.Z. (eds.) (2019). Seven Centuries of Unreal Wages. Palgrave Macmillan.

(4). Stephenson, J. and Wallis, P., ‘Imperfect competition’, LSE Working Paper (forthcoming).

(5). Brown, W. (1973) Piecework Bargaining, Heinemann.

(6). See debates between Huberman, Rose, Taylor and Winstanley in Social History, 1987-89.

(7). Katz, L., & Krueger, A. (2016). The Rise and Nature of Alternative Work Arrangements in the United States, 1995-2015. NBER Working Paper Series.

(8). Fang, W., & Miller, S. (2014). Output Growth and its Volatility: The Gold Standard through the Great Moderation. Southern Economic Journal, 80(3), 728-751.