The long-term negative impact of slavery on economic development in Brazil

by Andrea Papadia (London School of Economics)

Jean Baptiste Debret (1826). From “The Atlantic Slave Trade and Slave Life in the Americas: A Visual Record”.


Slavery has been at the centre of many heated debates in the social sciences, yet there are few systematic studies relating slavery to economic outcomes in receiving countries. Moreover, most existing work on Brazil – which was the largest slave importer during the African slave trade and the last country to abolish the practice – has failed to identify any clear legacies of this institution.

This research overcomes this impasse by highlighting a distinctly negative impact of slavery on economic development in Brazil. More precisely, it illustrates that in the municipalities of the states of Rio de Janeiro and São Paulo, where slave labour was more prevalent in the nineteenth century, fiscal development was lower in the early twentieth century, long after slavery was abolished.

The identification of this negative effect is tied to separating the true effect of slavery on fiscal development from the fact that the huge expansion of coffee production that Brazil underwent from the 1830s attracted large numbers of slaves to booming regions. In fact, the research shows that:

  • A naïve analysis of the data would suggest that, at relatively low levels, more slavery in the nineteenth century was associated with higher subsequent fiscal development.
  • For population shares of slaves above 30-35%, more slavery was clearly associated with lower fiscal development.
  • Taking account of the impact of the coffee boom on both the demand for slave labour and development, slavery was unambiguously associated with worse developmental outcomes later on.
  • Comparing two hypothetical municipalities – equal in all respects except for their reliance on slave labour – one with 30% of slaves among its citizens would have had revenues 70% lower than one with 20%.
  • These results persist even when taking account of a wide variety of other factors that could explain differences in fiscal development across municipalities.

Fiscal development is widely considered as an essential building block in the creation of modern states able to foster economic growth by providing public goods and protecting the rule of law. While the historical process of fiscal development on the European continent is relatively well understood, in other parts of the world the study of the evolution of fiscal institutions is still in its early stages.

There are many reasons why a high incidence of slavery would hamper fiscal development and the provision of public goods:

  • First, a higher incidence of slaves in the population will translate into lower political representation for the masses, even in only partially democratic regimes such as nineteenth and early twentieth century Brazil.
  • Second, the provision of key public goods, such as education, will be less salient in areas that rely heavily on slave labour. These areas will also be less keen to attract workers from other areas of the country and abroad, thus making the provision of public services to their citizens less important.
  • Finally, slavery might make resource sharing through taxation more difficult due to increased ethnic, geographical and class cleavages in the population.

The history of Brazil, which was characterised by large-scale use of slave labour from the sixteenth century until the nineteenth century, provides an ideal testing ground to investigate how this clearly extractive institution affected the developmental path of countries and their subdivisions.

The research shows that by accounting for confounding effects due to Brazil’s coffee boom, the pernicious effects of slavery on a key factor for economic growth – fiscal development – can be strongly identified.

Industrialisation and the origins of modern prosperity: evidence from the United States in the 19th century

by Ori Katz (Tel Aviv University)

Wiki Commons. Market scene by Pieter Aertsen, c.1550


The largest economic mystery is the modern prosperity of humankind. For thousands of years since the Neolithic revolution, most humans lived in small communities, working as farmers, and their average standard of living did not change much.

But in the nineteenth century, things changed: large parts of the world became industrialised. In those parts, people moved to live in huge cities, where they worked in manufacturing and commerce, had fewer children, invested more in schooling, and their standard of living began to rise, and then to rise dramatically, and it has never stopped since. Whether you look at life expectancy, infant mortality, income per person or any other measure, the trend is the same. And we don’t really know why.

We have a lot of theories. Some believe that this dramatic change has something to do with a geopolitical environment that encouraged competition and maintained stability in property rights. Others talk about a change in human preferences, maybe even in human biology. But in every theory, two of the main ingredients are the dramatic reduction in fertility and the increasing investment in human capital during the late nineteenth century.

This research examines the effect of industrialisation on human capital and fertility in the United States during the period from 1850 to 1900. This effect is hard to identify: human capital also affects industrialisation, and other variables such as ‘culture’ may affect both.

To deal with those problems, the study uses the westward expansion of the country as a ‘natural experiment’. The appearance of new large cities such as Chicago and Buffalo led to the development of new transport routes, and the study looks at counties that happened to be close to those new routes.

Those counties experienced industrialisation only because of their geographical location, and not because of the human capital of the local population or other variables. This means that analysing them is similar to a laboratory experiment, where it is possible to change only one parameter and leave the others intact.

Results show a very large effect of industrialisation on both fertility and human capital. These results contrast with an old theory according to which industrialisation was a ‘de-skilling’ process that increased the demand for unskilled labour. It seems that industrialisation was conducive to human capital.

The study also finds that the effects of industrialisation on both fertility and human capital were larger in counties that were already more developed in the first place. This led to a divergence between them and less developed counties. Indeed, when we look at the country level, we see increasing gaps between the industrialised countries and the rest of the world, starting in the nineteenth century, just like the gaps shown at the county level.

The modern period of growth is still a mystery, but these research results tell us that the effects of industrialisation on fertility and human capital are an important piece of the puzzle. These effects might be the reason for the great divergence between nineteenth century economies that created the modern wealth gaps between nations.

Employment, retirement and pensions: the Victorian era as a golden age for the elderly

by Tom Heritage (University of Southampton)

Irish spinning wheel – around 1900
Library of Congress collection

For far too long, our elderly ancestors have been viewed through the prism of the National Health Service and the modern welfare state: old people are regarded as a burden, taking out of society rather than contributing. In contrast, this study of census data for five counties across England and Wales from 1851 to 1911 reveals a reciprocal relationship between those living in old age and wider society.

First, across the whole period, 86-93% of men aged 60 and over were in employment. Even if we exclude those in workhouses, the figure is 80-85%.

Most old men worked in agricultural and general labouring, although an increase was evident by 1911 in the mining industry in Glamorgan and metal manufacturing in Sheffield. Bricklaying, house painting, dock labouring and commercial sales were also pursued in urban areas. Labour force participation rates were higher among men in their sixties than among men in their seventies and eighties.

Second, from 1851 to 1911, between a sixth and a third of women aged over 60 were in employment. Although their occupations were less diverse than those of men, the majority were based in domestic service.

Old women were also involved in cotton and silk textiles and in the manufacture of straw hats. Over time, though, the employment rates of old women did not increase like those of men, owing partly to foreign competition in Asian straw imports and French silks.

Third, retirement was not an innovation brought about by the creation of old age pensions. As early as 1891, over 13% of old men were described in the census as ‘retired’, with high rates in the areas favoured by today’s retirees: the coastal areas of Christchurch and Portsmouth in southern England. More old people retired than went into the workhouse.

But retirement was only an option for those who had inherited or managed to accumulate wealth, such as former smallholders, grocers, innkeepers, civil servants or military officers. Others who lacked land or capital – agricultural labourers, for example, or boot and shoe makers – were forced to resort to the Poor Law.

Even then, this did not always, or usually, mean the workhouse. Welfare assistance to old people in their own homes was common, especially for women. ‘Outdoor relief’, usually around 2s 6d per week, was issued as a weekly ‘pension’.

Moreover, the women who received it were not always as old as those entitled to a pension in the modern era: in Yorkshire in 1891, over 10% of old women described as ‘on relief’ were under 66, which will be the minimum pension age for women by 2020.

So is it really true to say that nowadays, ‘the elderly have never had it so good’? In a sense it is, as old people lead healthier and longer lives today than they have ever done.

But it would be wrong to conclude that old people in Victorian times were largely condemned to lives of pain and poverty. They had a wide range of experiences, and many had access to employment opportunities and sources of assistance that are no longer offered.

In terms of present day policy, we might learn something from our Victorian forebears about ways to integrate the general population in their sixties into the workforce, so that they can contribute to society as well as receive welfare.

Child labour in 18th century England: evidence from the Foundling Hospital

by Alice Dolan (University of Hertfordshire)

Wellcome Images. Foundling Hospital: Captain Coram and several children, the latter carrying implements of work, a church and ships in the distance. Steel engraving by H. Setchell after W. Hogarth

Every few years a child labour scandal in the clothing industry hits the British press, invoking wide public condemnation. This reaction is a modern phenomenon: 250 years ago, child labour in textile production was commonplace, not worthy of a headline.

Attitudes changed in the nineteenth century, leading to the passing of the 1833 Factory Act and 1842 Mines Act. But before this change, child labour was believed to have positive benefits for children.

One notable example was the Foundling Hospital, a charitable institution that supported abandoned children and was a keen believer in the benefits of child labour. The Hospital sought to produce upright citizens who would be able to support themselves as adults.

A key aim of the Hospital was therefore to train children to be ‘industrious’ from a young age. One governor wrote that the Hospital aimed ‘to give [the Foundlings] an early Turn to Industry by giving them constant employment’. This ‘Turn’ would train the children into economic self-sufficiency, stopping them from relying on parish poor relief as adults.

The Foundling Hospital opened its doors in 1741. Parliament recognised the value of its work and funded the acceptance of all children presented to it aged 12 months or under over the period 1756-60. This ‘General Reception’ brought 14,934 children into the Hospital.

The London Hospital could not cope with these unprecedentedly high numbers and new branches were founded, including one in Ackworth, Yorkshire, which received 2,664 children in the period 1757-72. Ackworth closed because Parliament regretted its generosity and stopped funding the General Reception generation in 1771.

Thousands of children required thousands of uniforms and Ackworth chose to make as many garments as possible in-house. On-site production both trained children to be industrious and offered financial benefits for the Hospital. Work completed on-site was cheap and reliable, and there was greater quality control.

The Ackworth ‘manufactory’ produced woollen cloth. The children prepared the fibre for spinning, spun it and wove the yarn into cloth that was worn by their peers at Ackworth and was sold to the London Hospital and externally. Some cloth manufacturing work was outsourced, particularly finishing processes that required a higher level of skill.

Few concessions were made for the age of the makers: the London branch criticised orders and sent back those considered to be of insufficient quality or inappropriate size. These were primarily business rather than charitable transactions.

The skill division also applied in the making of clothing. Underwear, stockings and girls’ clothing were made in-house because it was less skilled work. Garments were produced in high volumes. From 1761 to 1770, 13,442 pieces of underwear (shirts and shifts) and 19,148 pairs of stockings were made by the children.

Tasks such as tailoring, and hat and shoe making required long apprenticeships to develop the necessary skill – this work was therefore outsourced. But external supply had its problems. It was difficult to source enough garments for the hundreds of children at the branch. Products were more expensive because labour was not free and the Hospital had little influence on suppliers’ timeframes.

A Foundling started work young, aged 4 or 5, and continued to work through their residence at the Hospital. Despite this, they were luckier than their peers in the workhouse who endured worse conditions.

Many parents chose to send their children to the Foundling Hospital to give them better life chances through the greater educational and apprenticeship opportunities offered. Putting the children to work, which seems cruel to us, was a key educational strategy to help them achieve economic independence in adulthood. Its financial and logistical benefits were welcome too.

Trading parliamentary votes for private gain: logrolling in the approval of new railways in 19th century Britain

by Rui Esteves and Gabriel Geisler Mesevage (University of Oxford)

Railways in early nineteenth century Britain

The possibility that politicians might act to further their private financial interests, as opposed to the general public interest, has led to the creation of conflict-of-interest rules in modern democracies. For example, the code of conduct of the British Parliament requires that MPs disclose private interests related to their public duties.

In the mid-nineteenth century, Parliament went further, and created a system for the approval of new major public works projects in which MPs with a conflict were barred from voting. But the effectiveness of these rules can be undermined if politicians agree to trade votes with their colleagues — a practice known as ‘logrolling’.

This research uses a unique episode from the mid-nineteenth century to determine whether, and to what extent, British politicians traded their votes to further their private interests.

In the mid-1840s, hundreds of new railway companies petitioned the British Parliament for the right to build railway lines. It was Parliament’s responsibility to pick the railway lines they wanted to see built, and in this way shape the development of the modern British transport network.

Since many MPs were also investors in railways, Parliament created a system of subcommittees in which railway applications would be considered only by MPs who had no financial conflicts and who did not represent a constituency that the railway intended to serve.

As a result of this system, MPs with vested interests could not vote for their preferred projects directly. But they could further their interests indirectly by trading their vote on another project with the vote of the MP overseeing the project in which they had an interest.

Drawing on methods from social network analysis, the study identifies all of the potential trades between MPs, and then tests statistically for evidence of vote trading. The statistical evidence reveals significant collusion in the voting patterns of MPs who were deciding which railway lines to approve.
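The flavour of such a test can be sketched in miniature. The example below is a deliberately simplified illustration on invented data (the MP names, projects, votes and the permutation statistic are all assumptions, not the study’s actual network methods): it asks whether MPs in reciprocal conflict-of-interest pairs approve each other’s projects more often than chance would predict.

```python
import random

random.seed(42)

# Hypothetical committee data (invented for illustration): each MP oversees
# one project and holds a financial interest in another, which conflict-of-
# interest rules bar them from voting on directly.
oversees = {"A": "p1", "B": "p2", "C": "p3", "D": "p4", "E": "p5", "F": "p6"}
interest = {"A": "p2", "B": "p1", "C": "p4", "D": "p3", "E": "p1", "F": "p3"}

# approve[mp] = 1 if the MP voted to approve the project they oversee.
approve = {"A": 1, "B": 1, "C": 1, "D": 1, "E": 0, "F": 0}

def reciprocal_pairs():
    """Pairs (x, y) where each oversees the project the other has an interest in."""
    mps = sorted(oversees)
    return [(x, y) for i, x in enumerate(mps) for y in mps[i + 1:]
            if oversees[x] == interest[y] and oversees[y] == interest[x]]

def statistic(app):
    """Number of reciprocal pairs in which both MPs voted to approve."""
    return sum(app[x] and app[y] for x, y in reciprocal_pairs())

observed = statistic(approve)  # here both reciprocal pairs approved -> 2

# Permutation test: shuffle approval decisions across MPs; how often does
# reciprocal approval at least this strong arise by chance?
labels = list(approve.values())
exceed = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(labels)
    if statistic(dict(zip(approve, labels))) >= observed:
        exceed += 1
p_value = exceed / trials  # a small value suggests coordinated voting
```

A low p-value here indicates that the pattern of mutual approvals among conflicted pairs is unlikely under random voting, which is the basic logic behind the collusion evidence described above.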

These findings reveal significant levels of vote-trading, with politicians coordinating their behaviour so as to ensure that the projects they preferred – which they were banned from influencing directly – were nonetheless approved by their colleagues. As much as a quarter of all of the approved projects were likely the result of this logrolling, and the economic costs of this behaviour were significant, leading to Britain creating a less efficient railway network.

This research highlights the importance of understanding politicians’ private interests. Moreover, it illustrates how merely acknowledging conflicts of interest, and abstaining from voting when conflicted, may not resolve the problem of vested interests if politicians are able to collude. The findings shed light on a perennial problem, and the methods developed to detect logrolling in this setting may prove useful for detecting vote-trading in other contexts.

The making of New World individualism and Old World collectivism: international migrants as carriers of cultural values

by Anne Sofie Beck Knudsen (University of Copenhagen)



The Sunday magazine of the New York World appealed to immigrants with this 1906 cover page celebrating their arrival at Ellis Island.

Although a hotly debated topic, we know surprisingly little of the long-term cultural impact of international migration. Does it boil down to the risk of clashes between different cultures; or do we see cultural changes in migrant-sending and migrant-receiving countries along other dimensions as well?

Using novel empirical data, this research documents how past mass migration flows carried values of individualism across the Atlantic ocean from the mid-nineteenth to early twentieth century. This inter-cultural exchange was so significant that its impact is still observed today.

When talking about individualism versus collectivism, this study refers to the emphasis that these cultures place on independence from society. With this in mind, it becomes clear why culture has a role to play. The act of migration involves leaving familiar surroundings to embark on a journey where you are bound to rely on yourself, and an individual with strong ties to those surroundings will be less likely to undertake it. Collectivists are thus less likely to migrate, while the opposite is true for individualists.

To test the idea of individualistic migration and its long-term impact empirically, this research constructs novel indicators of culture, which make it possible to go back and study the past. It looks at two everyday cultural manifestations: how we name our children; and how we speak our language.

Giving a child a commonplace name like ‘John’ reflects a more conformist motivation among parents, who, perhaps unconsciously, are more concerned about their child fitting in rather than standing out. Likewise, the relative use of singular (‘I’, ‘mine’, ‘me’) over plural (‘we’, ‘ours’, ‘us’) personal pronouns tells us something about the focus on the individual over the collective.

The study constructs historical indicators of culture from the distribution of names in historical birth registers and from the written language of local newspapers at the time.
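As a rough illustration of the pronoun-based measure, a minimal sketch is given below. The scoring rule (share of singular first-person pronouns among all first-person pronouns) and the word lists are assumptions for illustration, not the study’s actual procedure:

```python
import re

# Simplified individualism indicator: the share of singular first-person
# pronouns among all first-person pronouns in a text sample (the study
# applies this idea to historical newspaper text).
SINGULAR = {"i", "mine", "me"}
PLURAL = {"we", "ours", "us"}

def individualism_score(text: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    singular = sum(w in SINGULAR for w in words)
    plural = sum(w in PLURAL for w in words)
    total = singular + plural
    return singular / total if total else float("nan")

print(individualism_score("We built ours together."))          # 0.0
print(individualism_score("I sold it; it was mine to sell."))  # 1.0
```

Applied to large corpora of local newspapers, a score closer to 1 would mark a more individualistic register, closer to 0 a more collectivist one.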

With new data in hand, the research can document the prevalence of individualistic migration during the settlement of the United States around the turn of the twentieth century. Among inhabitants of major migrant-sending countries like Norway and Sweden, those with more uncommon names were more likely to actually migrate. This cultural effect remains even when considering a host of other potential explanations related to economic prospects and family background.

If more individualistic types are more likely to migrate, we would expect to observe an impact on the overall culture of a given location. That is exactly what this research finds. Districts in Sweden and Norway that experienced high emigration flows of people with an individualistic spirit did indeed become more collectivistic – both in terms of child naming trends and in written language pronoun use.

This leaves us with the question of whether an impact from this historical event is still visible today. Does international migration have long-term cultural consequences other than the risk of producing cultural clashes?

In this study, this seems to be the case. Scandinavian districts that experienced more emigration are still relatively more collectivist today than those that experienced less. Moreover, it is widely agreed that New World countries like the United States are the most individualistic in the world today – a fact that seems to be explained by the type of migrants they once received.

Mortality in economic downturns: unobserved migration can create the false impression that recessions are good for health

by Vellore Arthi (University of Essex), Brian Beach (College of William & Mary), and Walker Hanlon (University of California, Los Angeles)


Are recessions good for health? A number of recent studies suggest that mortality actually goes down during recessions – at least in developed countries, where social safety nets help cushion the blow of unemployment and income loss.

This striking conclusion rests on one of two assumptions: either that people do not respond by migrating away from recession-stricken areas; or that if they move, these population flows can be perfectly measured. But are these assumptions realistic?

Migrant movements can be notoriously difficult to track, and famous episodes such as the Depression-era migration from the US Great Plains to California suggest that these sorts of internal population movements may indeed be a natural response to changes in local economic conditions. This raises the question: what does unaccounted migration mean for our assessment of the recession-mortality relationship?

Our research shows that unobserved migration from recession-stricken regions may actually lead us to underestimate systematically how deadly recessions really are.
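The mechanism behind this bias can be sketched with hypothetical numbers (a stylised illustration, not figures from the study): a mortality rate is deaths divided by population, so if out-migration goes unobserved, the denominator is stale and the measured rate can fall even when true mortality among those who stayed has risen.

```python
# Stylised illustration (hypothetical numbers, not figures from the study).
pre_population = 100_000   # last census count, used as the denominator
baseline_rate = 0.020      # pre-recession death rate: 2.0%

stayers = 85_000           # 15,000 residents leave, unobserved
true_rate = 0.022          # mortality among stayers worsens to 2.2%
deaths = stayers * true_rate

# An analyst dividing deaths by the stale census population concludes that
# mortality fell during the recession:
measured_rate = deaths / pre_population  # ~0.0187, below the 2.0% baseline
```

Even though mortality among those who stayed rose from 2.0% to 2.2%, the naive estimate shows an apparent improvement, which is exactly the direction of bias described above.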

To test how migration influences estimates of the relationship between recessions and mortality, we draw on a unique historical natural experiment: the temporary but severe economic downturn in the cotton textile-producing regions of Britain that resulted from the American civil war (1861-65).

The cotton textile industry was England’s largest industrial sector in the second half of the nineteenth century and, prior to the civil war, received the majority of its raw cotton inputs from the American South. The onset of the civil war sharply reduced these supplies, leading to a severe but temporary economic downturn that left several hundred thousand workers unemployed.

Digitising a wealth of historical data on births, deaths and population, and exploiting variation in both the geographical distribution of the British cotton textile industry and the timing of the civil war, we show that standard approaches yield the familiar result: the downturn, popularly termed the ‘cotton famine,’ reduced mortality.

But we also find evidence that migratory responses to this event were substantial, with much of this mobility occurring over short distances, as displaced cotton workers sought opportunities for work in nearby districts.

After making a series of methodological adjustments that account for this recession-induced migration, we show that the sign of the recession-mortality relationship flips: this downturn in fact appears to have been bad for health, raising mortality in both cotton regions and in the regions to which unemployed cotton operatives fled.

After accounting for migration bias, we find that:

  • The civil war-era downturn in the cotton textile regions of Britain increased total mortality in the affected districts.
  • But the downturn appears to have led to improved infant and maternal mortality outcomes, probably by freeing up maternal time for breastfeeding, childcare, and other health-improving behaviours.
  • Gains in infant health were offset by large and significant increases in mortality among the elderly.
  • There was no net effect on mortality among working-age adults, who were also the most mobile during the downturn.
  • This null effect appears to reflect worsening mortality due to the deteriorating nutrition and living conditions associated with income loss, offset by improvements in maternal mortality and by fewer deaths from accidents and violence. (The latter finding is further supported by evidence that alcohol consumption and industrial accident rates fell during the recession.)

Our study provides both a methodological and factual contribution to our understanding of the relationship between recessions and health. The methodological contribution consists of showing that migration undertaken in response to a recession has the potential to introduce substantial bias into estimates of the recession-mortality relationship using the standard approach – particularly if these population flows are not well measured.

This bias is likely to be greater in settings, such as developing countries, where labour forces are more mobile, where weak social safety nets induce migration in response to recessions, and where the intercensal population data used to track these movements are poor. Studies applying the standard approach in these settings are likely to generate misleading results, which may lead to poorly targeted public health responses.

On a factual level, our study also contributes new evidence on the relationship between recessions and mortality in a historical setting, with the implication that studies focused on just one age group, such as infants, may generate results that are not representative of other segments of the population, or indeed of the overall relationship between recessions and mortality.



Ottoman stock returns during the Turco-Italian and Balkan Wars of 1910-1914

by Avni Önder Hanedar (Dokuz Eylül University and Sakarya University, Turkey) and Elmas Yaldız Hanedar (Yeditepe University, Turkey)


Were the military conflicts of 1910-1914 associated with higher risks for investors at the İstanbul Stock Exchange? Wars are often perceived as bad news, correlated with increased risks for investors and fluctuations in volatility: stock prices would be expected to fall due to anticipated macroeconomic costs, such as higher inflation and lower production, as companies’ activities and expected returns decrease. On the other hand, if the outcomes of the wars were perceived as unimportant for companies’ activities and expected returns, there would be no significant changes in stock prices and volatility.

Financial economists have produced a large literature on the effects of different wars, with mixed findings. A pioneering study of the political crises of 1880-1914 is Ferguson (2006), which asks how investors at the London Stock Exchange viewed the conflicts on the eve of the First World War. He found no evidence of higher war risk in the bonds of the Great Powers[1] traded on the London Stock Exchange. In addition, Hanedar et al. (2015) show that the outbreaks of the Turco-Italian and Balkan wars were correlated with a lower perceived likelihood of Ottoman debt repayment, using data on two Ottoman government bonds traded on the İstanbul bourse. As the literature on the İstanbul bourse is limited, shedding new light on this question required exploring the risk perceived by stock investors during these historical conflicts.

A column of Tanin presenting the value of bonds and stocks on 14 November 1910

We focus on the behaviour of stock returns at the İstanbul bourse during the Turco-Italian and Balkan wars, using unique data on the stock prices of nine popular domestic joint-stock companies in the Ottoman Empire. All of these companies played a crucial role in the Ottoman economy and operated in the most attractive sectors, i.e. banking, mining, agriculture and transportation. Among them are the Ottoman General Insurance company (Osmanlı Sigorta Şirket-i Umûmiyesi), the Regie (Tobacco) company (Tütün Rejisi), and the Imperial Ottoman Bank (Bank-ı Osmanî-i Şâhâne). The data were manually collected from Tanin, a widely circulated daily Ottoman newspaper. This research is the first to provide a historical narrative explaining the changes in Ottoman stock returns due to the wars that took place on the eve of the First World War. It observes only small reactions to the Turco-Italian war, and only for three of the stocks examined (see Table 1). This is interesting, as we previously observed (Hanedar et al., 2015) a higher responsiveness of government bond prices during the same period.



It could be argued that investors believed the war would not be that harmful to the non-governmental economic and financial sectors. An important aspect supporting this finding is that the companies were either established or supported by foreign investors. The Great Powers protected their home countries’ investments both economically and politically, and the companies obtained revenue guarantees and privileges from the Ottoman state, making these investments secure. The Great Powers that invested in the Ottoman Empire were expecting its imminent demise; investors were therefore likely to invest in the companies for the sake of establishing a territorial claim, without much consideration of risk. During the nineteenth century, wars were important sources of solvency problems, which could explain the sensitivity of government bond prices to the conflicts studied here.


[1] The UK, France, Germany, Italy, and Austria-Hungary.

How new technology affects educational choices: lessons from English apprenticeships after the arrival of steam power

by Alexandra de Pleijt (Utrecht University), Chris Minns and Patrick Wallis (London School of Economics)


Many workers today worry whether robots will do away with their jobs. Most economists argue that the effect of automation is likely to depend on what workers do. Robots may replace some types of manual work, but new jobs will also be created to design, maintain and manage automated production.

A shift towards ‘new jobs’ would mean that different skills will be valued in the future, and many policy experts have argued that secondary and post-secondary education will have to change in response. But if young people and their parents anticipate how automation will affect their job prospects, the choices made among current educational opportunities could shift ahead of any changes in what is offered.

The effects of automation on educational choice will be seen in the future. But past experience can offer some ideas as to whether the arrival of new technology affects these choices, even before the technology is widespread.

This research examines how the arrival of a new production technology affected educational choices in late eighteenth-century England. The period between 1760 and 1810 marks the beginning of the largest shift in history from hand- to machine-powered production, driven by the invention and spread of the steam engine that powered the British Industrial Revolution.

Our research combines detailed evidence on the location and timing of the adoption of steam engines with the records of over 300,000 English apprenticeships from the rolls of the Commissioner of Stamps.

The main finding is that the arrival of steam power changed the willingness of young people to pursue apprenticeships, which for centuries had been the main route to acquiring the skills required for the production of manufactured goods. Counties saw a fall of 40-50% in the share of the population entering textile apprenticeships once a steam engine was present.

Despite their possible association with machine design and maintenance, mechanical apprenticeships also declined by just under 20% following the arrival of steam. Merchant and professional apprentices, who traded the goods produced by craft or industry, were mostly unaffected.

These findings show that the workforce responded to the emergence of technology that would dramatically change the nature of production and work in the future, but that much of the response was local. Apprenticeships fell first in northern counties where industrial towns and cities with factory-based production had emerged earlier. A similar decline in how workers were trained was not seen in southern and eastern England in the early part of the Industrial Revolution.


Agency House Crises in India: What Role Did Indigo Play?

by Tehreem Husain

English, Dutch, and Danish factories at Mocha, c. 1680. Public domain picture


History provides many examples of asset bubbles that have led to systemic crises in the economy. Popular examples include the Dutch tulip mania and the South Sea Bubble. This blog post discusses the case of an indigo price bubble in nineteenth-century India, perhaps the first of its kind, which led to a contagion-like crisis in the economy.

Around 17.4% of Indian GDP was derived from the agricultural sector in 2015-16, with nearly half of the Indian population depending on agriculture and allied activities for their livelihood. This makes the smooth functioning of commodity markets of considerable importance to policymakers. Over time, there have been many episodes of commodity price surges and ensuing market volatility, driven by traditional demand-supply gaps, monetary stress and the financialization of commodity markets, including speculation (Varadi, 2012). What role did agriculture play in commodity market volatility during the late eighteenth and early nineteenth centuries? Little is known about perhaps the first asset bubble of its kind in India – the indigo crisis – the reasons attributed to it, and the cost it imposed on different sectors of the economy.

With the advent of the East India Company, India became a global trade destination for a number of commodities, including cotton, silk, indigo, saltpetre and tea. To trade these commodities with global markets, European traders needed banks to finance foreign trade. Since indigenous bankers in India did not provide this particular banking function, the East India Company diversified its business by introducing agency houses in Calcutta, which, among other activities, also performed banking functions. These agency houses carried out all the banking functions of receiving deposits, making advances and issuing paper money. Their responsibility for note circulation crucially helped them carry out their diversified lines of business as ship-owners, landowners, farmers, manufacturers, money lenders and bankers (Cooke, 1830). It was the agency house of Messrs. Alexander & Co. that started the first European bank in India, the Bank of Hindostan, in 1770 (Singh, 1966).

In the early nineteenth century, these agency houses were tested for their endurance and continuance by three factors. First, and most importantly, during the early 1820s the agency houses borrowed money at low interest rates and invested it prodigally in indigo concerns – the crop being the only profitable means of remittance to Europe. The crisis multiplied when newly formed agency houses, besides investing capital in their own indigo concerns, fiercely competed with the old houses in making indiscriminate advances to indigo planters, paying little regard to the actual state of the market. Excessive demand for indigo fuelled prices in the mid-1820s and encouraged increased production of the commodity, which eventually led to a glut in the market and a sharp decline in its price. This rise and fall is evident in the fact that the indigo price shot up from Rs. 130 per maund in 1813 to Rs. 300 in 1824, and then fell to Rs. 145 per maund in 1832 (Singh, 1966).
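As a back-of-envelope illustration of the scale of this boom and bust, the price figures quoted above from Singh (1966) imply a rise of roughly 131% to the 1824 peak, followed by a fall of roughly 52% by 1832. A minimal sketch of the arithmetic:

```python
# Indigo prices in Rs per maund, as quoted from Singh (1966).
prices = {1813: 130, 1824: 300, 1832: 145}

# Percentage rise from 1813 to the 1824 peak.
boom = (prices[1824] - prices[1813]) / prices[1813] * 100

# Percentage fall from the 1824 peak to 1832.
bust = (prices[1832] - prices[1824]) / prices[1824] * 100

print(f"1813-1824 rise: {boom:.0f}%")  # about +131%
print(f"1824-1832 fall: {bust:.0f}%")  # about -52%
```

Even after the collapse, the 1832 price remained slightly above its 1813 level; it was the speculative advances made near the peak, not the long-run price trend, that ruined the houses.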

The second challenge, alongside indigo price volatility, was the outbreak of the first Anglo-Burmese war in 1825. This further stressed monetary conditions, resulting in a scarcity of metal in Calcutta (Sinha, 1927).

Thirdly, in terms of the global landscape, this period marked the peak of an investment boom in Britain, characterized by an explosion of company promotions and bond issues by foreign governments, mining companies, railways, utilities, docks and steamships. In total, during 1824-25 some 624 companies hoping to raise £372 million were brought to the market. However, with the investment boom peaking in 1825, market conditions changed. Interest rates rose, making borrowing more expensive, and investor sentiment became more cautious, eventually leading to panic, bank failures and bankruptcies (Brunnermeier & Schnabel, 2015).

In these times of local and global economic stress, several minor agency houses failed in 1827, shaking investor confidence in the remaining houses. A notable case is the agency house of Messrs. Palmer and Co., known as the ‘indigo king of Bengal’, which faced heavy withdrawals by its partners, leading to the closure of its private bank and finally to its own demise in 1830. This panicked the market and led to further withdrawals of capital.

During this period the agency houses made desperate appeals to the government for financial relief, highlighting their importance in the Indian financial system of the time. In a minute dated 14 May 1830, Lord William Bentinck, Governor-General of India from 1828 to 1835, stressed the systemic importance of the agency houses. He noted that not only would there be a dislocation of trade in some staple commodities, but that any damage to the ‘conglomerate’ nature of the agency houses would cause severe disruptions in other industries, most notably shipping. Finally, loans were granted to these houses in the form of treasury notes bearing 6 per cent interest.

Despite the monetary aid provided by the government, the wave of agency house failures could not be stemmed. More agency houses failed in January 1832. In addition, the unexpected fall in the price of indigo created difficulties for one of the biggest agency houses, Messrs. Alexander & Co. Importantly, the relief package came with stringent conditions: the houses were obliged to withdraw their bank notes from circulation, and were given an extended period for the payment of their debts provided they ended their banking operations (Savkar, 1938). This resulted in the demise of the Bank of Hindostan and the Commercial Bank.

Overall, seven great agency houses of Calcutta failed within a span of four years, with detrimental effects on the Indian economy of the time. In sum, speculation in indigo and the mixing of trading and agency business were the pivotal reasons behind the failure of these agency houses. More importantly, this episode of a commodity price bubble spreading its tentacles across the entire economy had a profound impact on the structure of business. It is recorded that from a handful of firms before 1850, there were 170 firms operating as joint stock organizations by 1868. The first commercial register to identify firms with tradable stock was established in 1843 and listed eight firms (Aldous, 2015). The joint stock organizational form also entered banking; a key example is the rise of the Union Bank of Calcutta (Cooke, 1830). The crisis also led to the establishment of a number of private banks by British expatriates (Jones, 1995).