Health Problems in Industrial Towns (Commentary)

This commentary is based on the classroom activity: Health Problems in Industrial Towns

Q1: What health problems did Edwin Chadwick identify in his reports published in 1842 and 1843?

A1: In his report, The Sanitary Condition of the Labouring Population, published in 1842, Chadwick claimed that slum housing, inadequate sewage disposal and impure water supplies in industrial towns were causing the unnecessary deaths of about 60,000 people every year. The following year, Chadwick identified another health hazard. In Interment in Towns, he argued that the tradition of keeping dead bodies in the home until the funeral took place was responsible for the spread of infectious diseases.

Q2: Study sources 1, 3, 4, 7, 9, 11 and 12. (a) Why were dunghills more of a problem in the summer than the winter? (b) Explain why dunghills were responsible for a considerable amount of disease. (c) Give reasons why many British streets had dunghills in the first half of the 19th century. Which one of these reasons is the most important?

A2: (a) The warmer weather in the summer increased the bad smell of the dunghills. (b) Dunghills were full of germs that caused disease. Source 1 shows children playing in the dunghill while a woman searches for objects that she might find useful. This contact with dunghills often resulted in people becoming ill. Disease was also spread by flies that picked up germs from the dunghill and left them on food in the houses. (c) Human waste was piled up in the street because most houses did not have pipes to carry the sewage away. Every so often, this waste was taken away by nightmen (source 9). As William Thorn points out in source 11, it was then sold to farmers as manure. Although money was made from dunghills, the main reason they existed was the lack of sewer pipes.

Q3: Study source 6. What does it tell us about industrial towns and public health? Explain why change is not always the same as progress.

A3: In his 1842 report, Edwin Chadwick compared the average age of death in different areas of Britain. His figures show that people were more likely to die at an earlier age in industrial towns (Bolton, Liverpool and Manchester) than in rural areas. There was also a great difference between the average age of death in different social groups living in the same area. Chadwick claimed that some aspects of industrial life, like polluted air, affected all the people that lived in the area. However, other things, such as contaminated water supplies, mainly affected the poor.

Some historians have used this information to argue that change is not the same as progress. In the 19th century, large numbers of people moved from villages to industrial towns. As a result, the average age of death went down; in this respect the change was an example of regression rather than progress.

Q4: Study sources 5 and 13. (a) Why were back-to-back terraced houses cheap to build? (b) Why has a ditch been dug between the two rows of houses? (c) Give two reasons why houses were often built close to rivers and canals.

A4: (a) Back-to-back terraced houses were cheap to build as this design ensured that the houses shared as many walls as possible. This saved space and materials. (b) The ditch allowed the sewage to run away from the houses. (c) Dr. Robertson points out that factories were usually built on the banks of rivers and canals. As the houses were built close to the factories (so that the workers did not have to travel very far to work), they were also close to the rivers and canals. This also provided the workers with a convenient water supply.

Q5: Read source 13. Why does the author believe that some historians have been too critical of Edwin Chadwick?

A5: R. A. Lewis argues that Chadwick's ideas on public health created a great deal of hostility. His opponents often resorted to attacking Chadwick's character as well as his ideas. This has resulted in a lot of sources that are critical of Chadwick. Lewis believes that some historians have treated these sources as being accurate rather than part of a propaganda campaign. As a result, Lewis believes that historians have often provided an unfavourable impression of Edwin Chadwick's character.

Learning Objectives

After completing this module, the student will be able to:

  • Explain the evolution of concepts about the cause and prevention of disease.
  • Describe the importance of studying the factors associated with outcomes in a systematic way in human populations.
  • Discuss some of the major historical figures and events that played a role in the evolution of public health and epidemiology.

Recognized as ‘environmental racism’

Rene Miller is currently involved in a lawsuit against the hog farm which sprays hog waste on to a field across the street from her home. Photograph: Alex Boerner

To understand Rene Miller’s predicament, you have to start with the pigs.

Their population in North Carolina increased more than threefold in just one decade, from 2.8 million in 1990 to 9.3 million in 2000 – where it’s stayed, more or less, ever since.

In 1986, North Carolina ranked seventh in the country in pork production; 30 years later, it’s second only to Iowa, with an estimated 9 million pigs on 2,217 hog farms, according to the US Department of Agriculture’s quarterly hog survey and the 2012 US Census of Agriculture. The pigs have ushered in a $2.9bn-a-year industry that employs more than 46,000 people in North Carolina. But those hogs also produce millions of tons of feces. In one year alone, an estimated 7.5 million hogs in five eastern North Carolina counties produced more than 15.5m tons of feces, according to a 2008 report by the General Accounting Office.

Nowhere are the impacts more profound than in Duplin County, where Miller and about 2.3 million hogs live – more than anywhere else in the state, according to the Environmental Working Group, a research and advocacy organization.

A recent analysis of county and satellite data by the EWG found that roughly 160,000 North Carolinians live within a half-mile of a pig or poultry farm; in Duplin, nearly 12,500 people – more than 20% of its residents – live within that range. If you extend the radius to three miles, as many as 960,000 North Carolinians fall into that category. That’s nearly 10% of the state’s population.

For Miller, these numbers aren’t abstractions. They’re her life.

“That scent is so bad,” she says. “You can’t go outside. You can’t go outside and cook anything because the flies and mosquitoes take over.”

Within a mile of her property, Murphy-Brown LLC – a subsidiary of Smithfield Foods, the largest hog producer in the world – owns 5,280 hogs, according to the NC Department of Environmental Quality. Within two miles, there are more than 80,000 Murphy-Brown-owned hogs at seven different farms, according to a lawsuit Miller filed in 2014.

Fifty yards from Miller’s family graveyard is a massive open-air cesspool storing the pigs’ waste – a stagnant pool containing their feces, urine, blood and other bodily fluids – often referred to as a “lagoon”, one of about 3,300 lagoons across the state. When the cesspool reaches its capacity, its contents are liquefied and sprayed into a field across the street from Miller’s house via a large, sprinkler-like apparatus. The sprayer releases a mist of waste on to the field, which, according to court documents, is about 200ft from Miller’s home at its closest rotation.

That system prevents the cesspool from overflowing, but Miller says it also makes her life miserable.

It’s more than just the smell, she says. The liquefied waste mist drifts on to her property, and “dead boxes” filled with rotting hogs sit near her family’s cemetery, attracting buzzards, gnats and swarms of large black flies. After spending time outside, she says, her eyes burn and her nose waters.

She says she also suffers from asthma, which she began to develop shortly after she returned to her childhood home from New Jersey in the late 80s to care for her ailing mother.

Research published by the late Steven Wing, a professor of epidemiology at the University of North Carolina’s Gillings School of Global Public Health, linked similar health concerns to proximity to hog farms.

Wing, who passed away in November, described his research in a 2013 TED Talk:

“In 1995, I began to meet neighbors of industrial hog operations,” he said. “I saw how close some neighborhoods are to hog operations. People told me about contaminated wells, the stench from hog operations that woke them at night, and children who were mocked at school for smelling like hog waste. I studied the medical literature and learned about the allergens, gases, bacteria, and viruses released by these facilities – all of them capable of making people sick.”

Young hogs are gathered in pens at Butler Farms in Lillington, NC. The hogs live on slatted flooring through which their waste is washed and gathered before being pumped into covered lagoons. Photograph: Alex Boerner

Wing’s research showed a correlation between air pollution from hog farms and higher rates of nausea, increases in blood pressure, respiratory issues such as wheezing and increased asthma symptoms for children and overall diminished quality of life for people living nearby.

“Air pollutants from the routine operation of confinement houses, cesspools, and waste sprayers affect nearby neighborhoods where they cause disruption of activities of daily living, stress, anxiety, mucous membrane irritation, respiratory conditions, reduced lung function, and acute blood pressure elevation,” Wing and fellow UNC researcher Jill Johnston wrote in a 2014 study.

They also found that the state’s industrial hog operations disproportionately affect African Americans, Hispanics and Native Americans. That pattern, they concluded, “is generally recognized as environmental racism”.

The environmental racism argument has won some powerful allies, including US Senator Cory Booker, a New Jersey Democrat who in a recent podcast interview denounced the North Carolina hog industry, which he called “evil” for exploiting its African American neighbors.

“They fill massive lagoons with [waste] and they take that lagoon stuff and spray it over fields,” he told Pod Save America, recalling a trip to North Carolina late last year. “I watched it mist off of the property of these massive pig farms into black communities. And these African American communities are like, ‘We’re prisoners in our own home.’ The biggest company down there [Smithfield] is a Chinese-owned company, and so they’ve poisoned black communities, land value is down, abhorrent … This corporation is outsourcing its pain, its costs, on to poor black people in North Carolina.”

Booker, whose father grew up in Hendersonville and graduated from NC Central, told the INDY in a statement: “I saw firsthand in North Carolina how corporate interests are disproportionately placing environmental and public health burdens on low-income communities of color that they would never accept in their own neighborhoods. In North Carolina, large corporate pork producers are mistreating small contract farmers and externalizing their costs on to vulnerable communities, polluting the air, water, and soil, and making kids and families sick while reaping large financial rewards.

“And unfortunately, we know this is not just a problem in North Carolina. Similar environmental injustices are occurring right now all over the United States. This is unacceptable to me, and I’m in the process of finding ways for the federal government to start to meaningfully address this problem.”

In May, US Representative David Price, a Democrat who represents parts of Wake and Orange counties, took his own stab at a legislative solution. He introduced the Swine Act, a bill intended to improve environmental standards for North Carolina’s hog industry.

“It’s a problem our state has a huge stake in,” Price says. “It’s a matter of finding the political will to get ahead of the curve here. Because if we don’t do something like this, if we don’t get these farms on to a sounder waste-disposal system, we’re going to live to regret it.”

While Price’s bill is currently languishing in committee, this issue is already making its way through the courts.

Three years ago, Miller and more than 500 other North Carolina residents, mostly poor and African American, filed 26 federal lawsuits against Murphy-Brown, alleging its behavior adversely affects their health and quality of life. The lawsuits argue that Murphy-Brown’s parent company, Smithfield – which was purchased by the multinational Chinese corporation WH Group in 2013 for an estimated $4.7bn – has the financial resources to manage the pigs’ waste in a way that minimizes the odor and nuisance to nearby property owners.

The industry dismisses these claims.

“North Carolina’s hog farmers are under a coordinated attack by predatory lawyers, anti-farm activists and their allies,” Smithfield Foods told the INDY in an email. “The lawsuits are about one thing and one thing only: a money grab.”

Smithfield points to the fact that between 2012 and 2016, the DEQ received only 25 odor complaints, none of which resulted in fines or notices of violation.

“More than 80% of hog farms are owned and operated by families,” Smithfield argues. “They produce good products, they do it the right way, and they strive to be good neighbors.”

Other industry advocates have also alleged that greed is at the heart of these claims. Hog farmers are conscientious neighbors, the industry argues, and Smithfield and the NC Pork Council both point out that the lawsuits don’t ask farmers to change specific behaviors. The NC Pork Council, a trade group funded by commercial hog operations, has blamed the lawsuits on avaricious attorneys who “like to sue farmers for as much money as possible”.

“Most farmers live on or adjacent to their farms and work hard to take good care of the land,” says Andy Curliss, the CEO of the Pork Council. “They are an integral part of the communities in which they live. They do things the right way and strive to be good neighbors.”

In an email, Mark Anderson, an attorney representing Murphy-Brown, says the company “is aggressively contesting the plaintiffs’ claims. After careful study, we concluded that the claims are not valid and have no merit.”

But Miller says she knows what she’s experienced – and that life on Veachs Mill Road has deteriorated since the hog houses came.

“Right now,” she says, “my life is the worst it’s ever been.”


Although science provided a foundation for public health, social values have shaped the system. The task of the public health agency has been not only to define objectives for the health care system based on facts about illness and health, but also to find means to implement health goals within a social structure. “The boundaries of public health [have changed] over time with the perception of new health and social problems and with political, economic, and ideological shifts within the government and the nation.” (Fee, 1987) The history of public health has been one of identifying health problems, developing knowledge and expertise to solve problems, and rallying political and social support around the solutions.

Despite the huge successes brought about by scientific discovery and social reforms, and despite a phenomenal growth of government activities in health, the solving of public health problems has not taken place without controversy. Repeatedly, the role of the government in regulating individual behavior has been challenged. For example, as early as 1853, Britain's Board of Health was disbanded because Chadwick, its director, "claimed a wide scope for state intervention in an age when laissez-faire was the doctrine of the day." (Chave, 1984) The relationship between public health and private medical practice has also been much debated. In 1920, the New York Medical Society vehemently opposed and succeeded in defeating a proposal for a system of public rural clinics throughout the state. (Starr, 1982) Arguments about the scope of public health and the extent of public sector responsibility for health continue to this day.

The development of a scientific base for public health allowed some consistency in the public health system across the country. All of the states in the United States are involved in some manner in sanitation, laboratory investigation, collecting vital statistics, regulation of the environment, epidemiology, administering vaccines, maternal and child health, mental health, and care of the poor. How local systems conduct these programs differs greatly from area to area. Changing values over both time and place have allowed great variety in the implementation of public health programs across the country.

The following chapter, which summarizes the current public health system in the United States and public health activities in six states visited by the committee, illustrates the variety of approaches to public health which have evolved throughout the current system.

Diseases in industrial cities in the Industrial Revolution

Disease accounted for many deaths in industrial cities during the Industrial Revolution. With a chronic lack of hygiene, little knowledge of sanitary care and no knowledge of what caused diseases (let alone how to cure them), diseases such as cholera, typhoid and typhus could be devastating. As the cities became more populated, the problem got worse.

A filthy “Father Thames”

Cholera was a greatly feared disease. Caused by contaminated water, it could spread with speed and with devastating consequences. Not for nothing did the disease get the nickname “King Cholera”. Industrial Britain was hit by outbreaks of cholera in 1831-32, 1848-49, 1854 and 1867. The cause was simple: sewage was being allowed to come into contact with drinking water, contaminating it. As many people used river water as their source of drinking water, the disease spread with ease.

An attack of cholera is sudden and painful – though not necessarily fatal. In London it is thought 7,000 people died of the disease in the 1831-32 outbreak, which represented a 50% death rate among those who caught it. 15,000 people died in London in the 1848-49 outbreak. The disease usually affected those in a city’s poorer areas, though the rich did not escape it.

Smallpox made a major recurrence in industrial cities even after Edward Jenner’s vaccine. The reason was simple: many in the industrial cities were ignorant of the fact that Jenner had developed a vaccine. As Britain’s population became increasingly centred in cities and the agricultural regions became less populated, traditional folk knowledge and the developments linked to it (cowpox, milkmaids, Jenner and so on) became less well known. The overcrowded tenements of the cities were also a perfect breeding ground for smallpox.

Typhoid and typhus were as feared as cholera. Both were also fairly common in the Industrial Revolution. Typhoid was caused by infected water whereas typhus was carried by lice. Both were found in abundance in industrial cities.

The greatest killer in the cities was tuberculosis (TB). The disease caused a wasting of the body, with the lungs being attacked. The lungs attempt to defend themselves by producing what are called tubercles. The disease causes these tubercles to become yellow and spongy, and coughing fits cause them to be spat out by the sufferer.

TB affected those who had been poorly fed and were undernourished. It also affected those who lived in dirty and damp homes. TB can be spread by a person breathing in the exhaled sputum of someone who already has the disease. In the overcrowded tenements of the industrial cities, one infected person could spread the disease very easily.

Though accurate records are difficult to acquire, it is believed that TB killed one-third of all those who died in Britain between 1800 and 1850.

The role of microbes in causing disease was only demonstrated by Louis Pasteur in the 1860s. Until that time, all manner of theories were put forward as to what caused diseases. A common belief – one that dated back to Medieval England – was that disease was spread by bad smells and invisible poisonous clouds (miasmas). Industrial cities were certainly plagued by foul smells from sewage, industrial pollutants and the like, and the majority of deaths were in the industrial cities. Therefore, doctors concluded, the two went together: death and bad smells and gases.

Such beliefs caused serious problems. In Croydon, typhoid swept through the town in 1852. The local Board of Health went about looking for a smell that caused the disease but found nothing. In fact, sewage had seeped into the town’s water supplies and contaminated the water. It did not occur to the health officials that the water could be the cause of the disease as medical wisdom of the time dictated another cause.

Even a great reformer like Edwin Chadwick was convinced that disease was carried in the atmosphere, which had been poisoned by foul smells. In 1849, he persuaded the authorities in London to clean up the sewers in their districts. This, so Chadwick believed, would get rid of the bad smells and therefore disease. Each week an estimated 6,000 cubic yards of filth were swept into the River Thames – London’s main source of water. Cholera was given a chance to spread: 30,000 Londoners caught the disease in 1849, and 15,000 died as a result.

Industrialization and health

Throughout history and prehistory trade and economic growth have always entailed serious population health challenges. The post-war orthodoxies of demographic and epidemiological transition theory and the Washington consensus have each encouraged the view that industrialization necessarily changes all this and that modern forms of rapid economic growth will reliably deliver enhanced population health. A more careful review of the historical demographic and anthropometric evidence demonstrates that this is empirically false, and a fallacious oversimplification. All documented developed nations endured the ‘four Ds’ of disruption, deprivation, disease and death during their historic industrializations. The well-documented British historical case is reviewed in detail to examine the principal factors involved. This shows that political and ideological divisions and conflict—and their subsequent resolution in favour of the health interests of the working-class majorities—were key factors in determining whether industrialization exerted a positive or negative net effect on population health.

Industrialization refers to a process which has occurred in the history of all economically ‘developed’ nation states and which remains an aspiration for most of the governments of those many populations which remain today relatively undeveloped. Through industrialization the economy of a country is dramatically transformed so that the means whereby it produces material commodities is increasingly mechanized, as human or animal labour is increasingly replaced by other, predominantly mineral sources of energy in direct application to the production of useful commodities 1. Industrialization is a special case of the near-universal phenomenon of human trade and economic change. It refers to a period of marked intensification of such activity, which in all known cases has resulted in an irreversible change in a country’s economy, after which the production and international trading of commodities remains permanently at a much higher level of intensity. This is largely because the factorial increase in productive capacities made possible by the technological shift in power supply simultaneously entails a wide range of accompanying transformations in the social relations of work, trade, communications, consumption and human settlement patterns and so, inevitably, also implies profound cultural, ideological and political change.

It would be extraordinary if such a thoroughgoing process did not have a range of significant health implications. Two of the oldest, most well-established relationships between economic activity, or trade, and population health are recognized to be mediated through the epidemiological implications of, firstly, regular social interaction between populations previously not exposed to each other’s disease ecology, and, secondly, the increasingly dense permanent settlement of populations, which occurs in the form of towns occupying nodal or strategic points in trading networks. Both of these relationships have always been understood to be negative, in terms of the health of the populations exposed 2–4. It has always been realized that the lure and the material benefits of economic exchange between peoples possessing different resources and producing different commodities carry enhanced risks of the accompanying exchange of potentially fatal diseases. The historical records of the early modern city-states of Italy, for instance, demonstrate their governments’ attentions to a range of public health issues to do with the sanitary problems of packed, urban living and the periodic threats of imported epidemics 5. The gradual expansion of international and intercontinental trade, including of course in persons themselves, throughout the subsequent centuries was characterized by a sequence of extraordinarily lethal epidemics of infectious disease, most tragic of all for the indigenous populations of the Americas. Thus one of France’s most eminent historians has famously written of the era of rising world trade from the 14th century to the 17th as the era of ‘l’unification microbienne du monde’ 6.

However, despite these well-understood, long-standing negative health risks associated with urbanization and with trade, by contrast the process of industrialization has in general been considered to have a much more positive relationship with human health. There is of course a very obvious intuitive reason for this. It is widely understood that industrialization was a necessary initiating historical process experienced by all today’s ‘successful’, high per capita income societies. These are generally among the populations with the highest life expectancy at birth in the world today. This has been made possible by the advanced medical technology, better food supply, and increased material living standards as a result of the continuous process of economic growth they have all experienced ever since industrialization. The apparently compelling logical inference is that industrialization has improved human welfare and health. This conclusion has been repeatedly supported during the course of the 20th century by a succession of research-based interpretations of the relationship between health and the kind of sustained economic growth made possible by industrialization 7–13. The study of British economic history has played a particularly crucial role in informing this generally positive evaluation, partly because it was the first nation-state ever to industrialize but also because of the exceptionally high quality and quantity of its historical medical, epidemiological and demographic as well as economic data. This is due principally to the fact that the British nation-state, as a record-creating and preserving entity, has maintained its integrity throughout many centuries, resulting in the survival of a relative abundance of evidence.

The preponderant importance of a secular fall in mortality as being the first and foremost welfare dividend to flow from industrialization has been a central feature of the orthodox consensus throughout the last century. By the beginning of the 20th century, it was obvious that rapid population growth had accompanied the process of industrialization in each modern country’s history. In Sweden, the only country whose official vital statistics reliably reached back to the 18th century, it was also evident that the population growth of the 19th century had been principally as a result of falling mortality, reflecting improving population health. In 1926, two independent research monographs on Britain appeared 7, 8, each documenting all the important advances in medical knowledge and institutions which occurred from the late 17th through to the early 19th century. These were portrayed as the health-enhancing first fruits of the same burgeoning spirit of rational scientific enquiry which had produced concomitant advances in technology and industry. By 1929, a grand general theory of ‘demographic transition’ had been sketched, which was to become the dominant international ‘development’ orthodoxy throughout the post-war era 14–16. This envisaged all industrializing countries necessarily passing through a linear evolutionary pattern of three stages. The primum mobile of economic growth directly caused a fall in the high mortality rates characterizing stage one, by raising living standards and through the society’s enhanced ability to benefit from medical science, hygiene and sanitation. Consequently, during the transitional stage two, population growth rates increased rapidly until, in the final stage three, parents adjusted their traditional pro-fertility behaviour by reducing their birth rates to reflect the new circumstances of much higher survival rates for their offspring.

In the 1970s, transition theory was apparently further elaborated by two influential contributions. Firstly, Omran’s concept of the epidemiological transition specified three types of epidemiological regime typical of the three stages of demographic transition 17. Famines and pestilence dominated the pre-industrial high mortality stage, followed by ‘receding pandemics’ as transitional societies industrialized, became wealthier and their medical technology advanced. Finally, the most developed, high life expectancy societies of stage three were afflicted primarily by a residual of ‘degenerative and man-made diseases’. Secondly, Thomas McKeown’s widely-read The Modern Rise of Population argued that the principal cause of the mortality decline consequent on industrialization, as specified in the transition model, was not medical science and technology but primarily rising living standards 10. The beneficial effect of economic growth on population health was initially transmitted primarily through a gradually rising per capita nutritional intake made possible by a better food supply and rising real incomes (purchasing power). McKeown founded this conclusion on his pioneering epidemiological analysis of the historical series of detailed cause-of-death data available for the whole population of England and Wales since the mid-19th century.

Although McKeown’s thesis, to the extent that it was evidence-based, applied only to the epidemiological history of one country, his findings were nevertheless taken to be broadly generalizable. This was partly because of McKeown’s persuasive skills and his impressively detailed epidemiological data. It was also the result of a widespread assumption, which pervaded the post-war era and which continues to be influential, that the demographic or epidemiological transition is itself a singular, generic process, which has occurred repeatedly following industrialization in all developed countries’ histories. It follows from this assumption that it can therefore be adequately studied through a single well-documented example. It also followed that the currently non-industrialized countries of the 1970s might profitably learn from such a model and fashion their development policies accordingly.

The 1970s also witnessed the emergence of a resurgent monetarism and neo-classical economics, which, during the course of the 1980s, replaced the social democratic ‘Keynesian’ consensus with the neo-liberal ‘Washington’ consensus as the dominant programmatic set of policy prescriptions informing the macro-economic and lending policies of western governments and banks and the major Bretton Woods institutions of the World Bank and the IMF, located in Washington. The existence of McKeown’s well-publicized work made it much easier to press forward the neo-liberal economic agenda in the course of the 1980s, with its focus on maximizing capitalist, free market economic growth, not only in the ‘First World’ but also in the world’s least developed countries, since McKeown had apparently proved that the rising living standards facilitated by industrialization had been the principal cause of epidemiological transition in the past.

There had always been important dissenting voices which disputed the general validity of McKeown’s work, notably Sam Preston’s important cross-national statistical research. This indicated that during the course of the 20th century rises in societies’ overall investments in health-promoting technology and services—much of it state-organized and funded—were a more significant source of gains in average life expectancy than their rising per capita incomes 18, 19 . However, this was not the message that neo-liberal economists wanted to hear, intent as they were on ‘rolling back’ the state and freeing-up the market. Furthermore, during the 1980s, McKeown’s emphasis on the importance of nutrition also caught the eye of the highest-profile practitioner of economic history. The Nobel prizewinner Robert Fogel b published a series of research papers during the late 1980s and early 1990s which presented a new source of long-run historical health data—the anthropometric evidence of American military recruits’ heights and weights 12, 20, 21 . He argued, along McKeownite lines, that this also showed that nutritional inputs were the most important driver of population health during the initial stages of industrialization. Thus, in the important World Development Report for 1991, compiled under the general direction of the leading neo-liberal Lawrence Summers, Fogel’s work was given prominence and McKeown was cited, but there was no reference to Preston’s alternative analyses 22 .

However, in Britain the 1980s also saw the publication of a major new work of long-term historical demographic reconstruction, which radically undercut the crucial assumptions of ‘transition’ theory and so, also of McKeown’s interpretation of the British epidemiological data from the mid-19th century onwards. The Cambridge Group for the History of Population and Social Structure succeeded in reconstructing the population history of England, including national trends in mortality and fertility, on the basis of a 4% sample of the data held in the 10,000 parish registers of England back to their instigation by Henry VIII in 1538 23 . Their work demonstrated, firstly, that England before industrialization was not a regime of high famine and pestilence mortality as envisioned in transition thinking. Secondly, the quadrupling of English population, which took place during industrialization between 1700 and 1870, was driven principally by the increased fertility of marriage and only to a relatively slight extent by a modest fall in mortality. Around 1700, expectation of life at birth had been approximately 36 years and by 1871 it still stood at no more than 41 years. Following this pioneering effort, there has been an enormous flow of further primary research exploiting Britain’s parish registers and much other relevant evidence, which has confirmed these two principal findings 24 .

McKeown had supposed, from within the perspective of modernization and transition thinking, that in addressing the epidemiological patterns of falling mortality, which he could track from the Registrar-General’s official cause of death data from ca. 1851 onwards, he was analysing a single secular trend, which would have started during the late 18th century when it was believed that the British industrial revolution had begun. However, one of the further important conclusions to emerge from the research of the demographic historians was that McKeown’s data series began in the middle of a strange, half-century-long period of stasis in the nation’s mortality. The national average expectation of life at birth had improved fitfully and gradually during the 18th century to reach a level of about 41 years by 1811 but thereafter it failed to register any further improvement above that level until the 1870s. This meant that during the whole of the period when the British economy experienced its historically unprecedented, sustained economic growth rates, while its steam-driven economy powered its way to global trading predominance during the long mid-Victorian boom, overall mortality rates failed to improve at all. Although health had apparently improved moderately during the initial phases of slow economic growth in the 18th century, when full-scale industrialization arrived with the diffusion of steam technology, factories and rail transport, there were then no further net gains in health for two generations. This is despite the fact that workers’ average real wages, which showed no overall improvement before 1811, now began definitely to rise throughout the rest of the 19th century 25 . This chronology is all wrong for the McKeown thesis. 
Mortality fell in the 18th century without the benefit of increased purchasing power for food (the fluctuating cost of food was the major budgetary item influencing the reconstructed average real wage trend), whereas overall health failed to improve between 1811 and 1871, despite enhanced purchasing power.

Further research on an independent body of evidence, British anthropometric data, has confirmed that late 18th-century improvements in height attainments were curtailed and then even reversed during the second quarter of the 19th century 26 . It is now clear from this, and from other detailed demographic research on urban patterns of mortality during this period, that the failure of the national average life expectancy to register any further gains between 1811 and 1871 was due mainly to deteriorating health conditions in Britain’s industrializing towns and cities (Szreter and Mooney 27 ). All the available evidence for a variety of towns of very different sizes, from a Carlisle or a Wigan to Glasgow, exhibits the same patterns and trends. Urban life expectancies, though they had probably improved during the late 18th century, were well below the national average by the end of the first quarter of the 19th century. Thereafter they experienced a particularly deep crisis persisting for two decades during the 1830s and 1840s, followed by a return to pre-crisis levels (i.e. still well below the static national average) in the 1850s and 1860s. From the 1870s onwards, urban life expectancies finally began to climb above the levels of the early 19th century and, in so doing, pushed the national average onto an upward trend, too (Britain by this time having become a predominantly urban society).

Thus, quite to the contrary of the dominant 20th-century consensus, the only abundantly documented historical case, Britain, shows that industrialization had a powerfully negative direct impact on population health, concentrated particularly among the families of the relatively disempowered, displaced migrants who provided a large part of the workforces in the fast-growing industrial towns and cities 28 . According to this viewpoint, industrialization is not a special case, but conforms to the more general pattern, throughout human history, that periods of increasing economic activity, because they are associated with increasing trade and urban settlement, are also intrinsically productive of increased health risks. Indeed, industrialization, because it is so extensive in its economic scale of transformation, may well exert its negative health effects more dramatically and rapidly than any of the historically earlier forms of more moderate increases in trade and economic activity.

There are a number of ways of seeking to explain these findings about 19th-century Britain so as to reject this conclusion and to preserve instead the conviction that industrialization is, still, a special case and has been a positive influence on health. However, each of these collapses on closer examination. It is, for instance, not the case that the negative health effects which Britain’s towns experienced in the 1830s and 1840s were ‘merely’ the result of urban size, speed of growth or inadequate knowledge of health-preserving technology at that time. Towns of all different sizes, from just 20,000 to over 100,000 inhabitants, were affected. Most cities grew no faster in these two decades than in any two of the previous six or seven decades. Nor was there an inevitable knowledge or ‘learning’ deficit. The technology for constructing urban water supplies and the importance of sanitation and sewering were well understood, as Edwin Chadwick’s summation of knowledge published in 1842 shows 29 . The importance of personal hygiene, good food and cleanliness of the personal environment was also well understood, as Haines et al. have ingeniously demonstrated 30 .

The heterodox thesis is that industrialization itself, like all forms of economic growth, exerts intrinsically negative population health effects among those communities most directly involved in the transformations which it entails. The case for this apparently paradoxical proposition grows much stronger when it is realized that in virtually all known cases of the industrialization of today’s successful developed economies, their historical demographic or anthropometric trends exhibit the same ‘trademark’ pattern of a negative inflection in the health trends during the decades in which industrialization most affected their populations. This is true, for instance, of studies which have been published on populations in the USA, Germany, France, Holland, Japan, Australia, Canada and Sweden 31 . Sweden has sometimes been considered an exception, but the most recent research has shown that the landless Swedish rural populace did suffer significant health consequences during the second quarter of the 19th century, when their agricultural economy was first exposed to commercial pressures necessitating raised productivity. Later in the century, it was the crucial role played by advanced government public health measures in the 1870s, anticipating the health problems of industrial urbanization, which minimized such negative effects when Sweden experienced its own industrialization 32, 33 .

However, it is also true that in each of these cases, as in Britain, a period during which the health of the population was compromised by industrialization was ultimately resolved, so that continuing economic growth came eventually to be accompanied by generally rising health—even in the largest most densely populated cities—resulting in the high life expectancy societies of the present day. The crucial analytical point, of enormous policy relevance, is that this potential capacity of post-industrial economic growth to provide the material basis for generally enhanced population health is not intrinsic to the process of industrialization or of economic growth in itself.

As careful attention to the historical relationship between industrialization and health in the case of Britain and most other countries shows, the direct consequences of rapid economic growth on health are likely to be negative, for a set of long-understood epidemiological reasons. In fact the kind of dramatic transformation associated with the industrialization of an economy is especially likely to be negative in its immediate impact on health and welfare because of the profoundly disruptive nature of this change. The disruption is simultaneously multi-dimensional: social and familial relations, moral codes, ethical standards of behaviour, the physical and the built environments, forms of government, political ideologies and the law itself are all thrown into flux and tumult when a society experiences industrialization and the consequent population movements that are entailed. Such disruptions tend to cause forms of social deprivation to arise, which can lead to disease and ultimately to death for the most unfortunate and marginalized individuals—often children, migrants or ethnic minorities. These are the ‘four Ds’ of rapid economic growth: disruption, deprivation, disease and death 34 . They can only be addressed through political mobilization of the society to devise new structures, which can respond to the forces of disruption and remedy their consequences. This typically requires, at a minimum, massive investment in urban preventive health infrastructure, and an accompanying regulatory and inspection system, along with a humane social security system.

The classic catch-22 problem for societies experiencing the disorienting transformations of industrialization is that politics itself is profoundly disrupted, since the process throws up, by definition, a variety of newly powerful commercial and business ‘interest’ groups, typically very divided among themselves on ethnic, regional, industrial or religious lines, to challenge the incumbent governing classes. In British society and its industrial towns, an effective paralysis of the political will occurred for two generations between approximately 1830 and 1870, as successive national and local governments doggedly dodged the expensive issue of investment in urban preventive health infrastructure, even in the face of recurrent cholera visitations. The default ideology of this era, ‘laissez-faire, laissez passer’, reflected the political wisdom that in such a socially fissured society of vigorous competing interests, ‘every man for himself’ was the only general proposition which could command assent. In an as yet undemocratic ‘shopocracy’, dominated by the votes of those precariously trying to keep their heads above water in a roller-coaster market economy, the only electable governments were those which promised to keep national income tax or local rates to an absolute minimum—the most common electoral battle cries were ‘retrenchment’ and ‘economy’ 35 . As a result, whereas the ‘winners’ in this society invested and gambled huge amounts of capital in the railway mania, there was no adequate collective investment in even the basic urban health infrastructure of sewers, clean water and street paving (crucial for health in a horse-drawn economy) 36 .
Whereas the paternalistic landed governing class had presided in the late 18th century over an increasingly generous national social security system, the Old Poor Law, spending was slashed under the deterrent ‘workhouse’ system of the New Poor Law of 1834, reflecting the evaporation of social trust between the classes in this disrupted and divided society 37, 38 .

After delaying for as long as they dared, from 1867 to 1928, in response to organized male working-class and subsequent feminist political pressure, the British propertied governing class passed a sequence of four major enfranchisement acts which ultimately granted the vote to all adults of both sexes on an equal basis. From 1867 onwards, this began to transform the electoral arithmetic and the politics of the health and social security needs of the wage labour class in society. The shift in political economy occurred first at municipal level. Under its visionary Mayor Joseph Chamberlain, an industrial magnate, the city of Birmingham pioneered a programme of what its opponents vilified as ‘gas and water socialism’ 39, 40 . Local monopoly services were bought, built and run by the city to provide revenue for an expanding preventive health and social services infrastructure. Once Chamberlain had proved both the electoral and the practical viability of this new political economy, all other major cities, and eventually smaller towns too, followed suit over the next three decades 41 . The towns were beautified but also, crucially, the urban death-rates came tumbling down as local authorities’ expenditure on the health and environmental needs of their mass electorates multiplied, to the point where in 1905 the total amount spent by vigorous local governments actually exceeded (for the only time in Britain’s recorded history) the total spent by central government 42 . At the general election of January 1906, the ‘New Liberal’ administration won a landslide victory and ushered in an entirely new era of state activism with a host of centrally-organized and funded measures, such as old age pensions, labour exchanges, a school medical inspection service, free school meals for the needy, and national insurance against sickness and unemployment for workers.
The politics of working-class interests had thus transmuted from the municipal to the national stage in Britain, something which would ultimately lead to the enactment of the welfare state.

The lessons of history, therefore, are that all economic exchange entails health risks and that industrialization typically results in a particularly concentrated cocktail of such health risks. From a policy point of view, it is particularly important that currently non-industrialized societies are neither encouraged nor forced to enter the industrialization process without a clear understanding of the difficult prospects which they face for at least a generation while undergoing this profoundly disruptive process. It may well be possible to avoid the undesirable fourth ‘D’ of death and possibly even the third ‘D’ of disease, given a sufficiently careful and thoroughgoing effort to manage and respond to the forms of deprivation which rapid economic growth produces as it transforms communities and relationships—something which Sweden may well have achieved during the last quarter of the 19th century. Like the Swedish case, the British historical case also suggests that extremely committed, well-informed, well-funded, devolved and democratically responsive forms of local government may be more important than the central state in effectively managing the immediate negative health consequences of industrialization. However, ultimately, the redistributive resources and authority of the central state in a democratic society will undoubtedly become important in ensuring that long-term sustained economic growth continues to be a benefit to the health and welfare of the whole population, rather than merely a source of ever-increasing private wealth to a small proportion of individuals favoured by birth and by chance, which is a tendency inherent in the normal working of unregulated, free market capitalism.

The apparently intuitively obvious notion that the economic growth of industrialization must be straightforwardly beneficial for health has, thus, been shown to be based on a misleading simplification of economic and demographic history, though one which was apparently supported by now-obsolete historical and epidemiological interpretations of history. It is now increasingly emphasized by historical researchers that politics and government have played an all-important role in ensuring that the wealth accumulated by the socially divisive and competitive processes of market economic growth is recycled and redistributed throughout a society to ensure that it contributes more equitably to the overall population health and welfare of the vast majority of the citizens involved in the process as producers and consumers 43, 44 . Unfortunately there is insufficient sign as yet that this understanding is informing the strategy of the most important international institutions which influence the future course of world development, notably the IMF and the WTO (the World Bank has been notably more ambivalent in its approach since the World Development Report of 1997). Policy prescriptions for the world’s poorest countries need to recognize that their state and local government capacity has been dangerously decimated during the last two decades of neo-liberal, free market fundamentalism 45, 46 .

a. Such transition thinking is an integral part of a more general, encompassing ‘modernization’ ideology: a set of ideas which traces its genealogy to the post-Enlightenment project to spread liberty, scientific reason and democracy to the world, and which remains a profoundly influential motivating force in contemporary global history, in particular providing the ethical rationale for the project of international ‘development’.

b. Fogel had shot to fame in the 1970s with his co-author Stanley Engerman through their pioneering quantitative econometric history of slavery, which startlingly concluded that slavery was an efficient economic system and that most black southern slaves had enjoyed a higher standard of living than free wage-earners in the industrial north in the pre-Civil War era: Fogel RW, Engerman SL. Time on the Cross. London: Wildwood House, 1974.


In the United States, chronic illnesses and health problems either wholly or partially attributable to diet represent by far the most serious threat to public health. Sixty-five percent of adults aged ≥20 y in the United States are either overweight or obese (13), and the estimated number of deaths ascribable to obesity is 280,184 per year (14). More than 64 million Americans have one or more types of cardiovascular disease (CVD), which represents the leading cause of mortality (38.5% of all deaths) in the United States (15). Fifty million Americans are hypertensive, 11 million have type 2 diabetes, and 37 million adults maintain high-risk total cholesterol concentrations (>240 mg/dL) (15). In postmenopausal women aged ≥50 y, 7.2% have osteoporosis and 39.6% have osteopenia (16). Osteoporotic hip fractures are associated with a 20% excess mortality in the year after fracture (17). Cancer is the second leading cause of death (25% of all deaths) in the United States, and an estimated one-third of all cancer deaths are due to nutritional factors, including obesity (18).

Developments from 1875

The work of Italian bacteriologist Agostino Bassi with silkworm infections early in the 19th century prepared the way for the later demonstration that specific organisms cause a number of diseases. Some questions, however, were still unanswered. These included problems related to variations in transmissibility of organisms and in susceptibility of individuals to disease. Light was thrown on these questions by discoveries of human and animal carriers of infectious diseases.

In the last decades of the 19th century, French chemist and microbiologist Louis Pasteur, German scientists Ferdinand Julius Cohn and Robert Koch, and others developed methods for isolating and characterizing bacteria. During this period, English surgeon Joseph Lister developed concepts of antiseptic surgery, and English physician Ronald Ross identified the mosquito as the carrier of malaria. In addition, French epidemiologist Paul-Louis Simond provided evidence that plague is primarily a disease of rodents spread by fleas, and the Americans Walter Reed and James Carroll demonstrated that yellow fever is caused by a filterable virus carried by mosquitoes. Thus, modern public health and preventive medicine owe much to the early medical entomologists and bacteriologists. A further debt is owed bacteriology because of its offshoot, immunology.

In 1881 Pasteur established the principle of protective vaccines and thus stimulated an interest in the mechanisms of immunity. The development of microbiology and immunology had immense consequences for community health. In the 19th century the efforts of health departments to control contagious disease consisted in attempts to improve environmental conditions. As bacteriologists identified the microorganisms that cause specific diseases, progress was made toward the rational control of specific infectious diseases.

In the United States the diagnostic bacteriologic laboratory was developed—a practical application of the theory of bacteriology, which evolved largely in Europe. These laboratories, established in many cities to protect and improve the health of the community, were a practical outgrowth of the study of microorganisms, just as the establishment of health departments was an outgrowth of an earlier movement toward sanitary reform. And just as the health department was the administrative mechanism for dealing with community health problems, the public health laboratory was the tool for the implementation of the public health program. Evidence of the effectiveness of this new phase of public health may be seen in statistics of immunization against diphtheria—in New York City the mortality rate due to diphtheria fell from 785 per 100,000 in 1894 to 1.1 per 100,000 in 1940.

The Centers for Disease Control and Prevention (CDC; originally the Communicable Disease Center), an agency of the U.S. Department of Health and Human Services, was founded in 1946 and was tasked with the mission of preventing and controlling disease and promoting public health. The CDC serves a key role in gathering and disseminating information on disease and disease prevention to the general public. Today it is a leading center of epidemiology.

While improvements in environmental sanitation during the first decade of the 20th century were valuable in dealing with some problems, they were of only limited usefulness in solving the many health problems found among the poor. In the slums of England and the United States, malnutrition, venereal disease, alcoholism, and other diseases were widespread. Nineteenth-century economic liberalism held that increased production of goods would eventually bring an end to scarcity, poverty, and suffering. By the turn of the century, it seemed clear that deliberate and positive intervention by reform-minded groups, including the state, also would be necessary. For this reason many physicians, clergymen, social workers, public-spirited citizens, and government officials promoted social action. Organized efforts were undertaken to prevent tuberculosis, lessen occupational hazards, and improve children’s health.

The first half of the 20th century saw further advances in community health care, particularly in the welfare of mothers and children and the health of schoolchildren, the emergence of the public health nurse, and the development of voluntary health agencies, health education programs, and occupational health programs.

In the second half of the 19th century, two significant attempts were made to provide medical care for large populations. One was made by Russia, which established a system of medical services in rural districts; after the communist revolution, this was expanded to include complete government-supported medical and public health services for everyone. Similar programs have since been adopted by a number of European and Asian countries. The other attempt was prepayment for medical care, a form of social insurance first adopted toward the close of the 19th century in Germany, where such arrangements had long been familiar. A number of other European countries adopted similar insurance programs.

In the United Kingdom a royal-commission examination of the Poor Law in 1909 led to a proposal for a unified state medical service. This service was the forerunner of the 1946 National Health Service Act, which represented an attempt by a modern industrialized country to provide services to all people.

Later, prenatal care made a substantial contribution to preventive medicine, with the education of mothers influencing the physical and psychological health of families and being passed on to succeeding generations. Prenatal care provides the opportunity to educate the mother in personal hygiene, diet, exercise, the damaging effects of smoking, the careful use of alcohol, and the dangers of drug abuse.

Public health interests also have turned to disorders such as cancer, cardiovascular disease, thrombosis, lung disease, and arthritis, among others. There is increasing evidence that several of these disorders are caused by factors in the environment. For example, there exists a clear association between cigarette smoking and the eventual onset of certain lung and cardiovascular diseases. Theoretically, these disorders are preventable if the environment can be altered. Health education, particularly aimed at disease prevention, is of great importance and is a responsibility of national and local government agencies as well as voluntary bodies. Life expectancy has increased in almost every country that has taken steps toward reducing the incidence of preventable disease.

City Life in the Late 19th Century

Between 1880 and 1900, cities in the United States grew at a dramatic rate. Owing most of their population growth to the expansion of industry, U.S. cities grew by about 15 million people in the two decades before 1900. Many of those who helped account for the population growth of cities were immigrants arriving from around the world. A steady stream of people from rural America also migrated to the cities during this period. Between 1880 and 1890, almost 40 percent of the townships in the United States lost population because of migration.

Industrial expansion and population growth radically changed the face of the nation's cities. Noise, traffic jams, slums, air pollution, and sanitation and health problems became commonplace. Mass transit, in the form of trolleys, cable cars, and subways, was built, and skyscrapers began to dominate city skylines. New communities, known as suburbs, began to be built just beyond the city. Commuters, those who lived in the suburbs and traveled in and out of the city for work, began to increase in number.

Many of those who resided in the city lived in rental apartments or tenement housing. Neighborhoods, especially for immigrant populations, were often the center of community life. In the enclave neighborhoods, many immigrant groups attempted to hold onto and practice precious customs and traditions. Even today, many neighborhoods or sections of some of the great cities in the United States reflect those ethnic heritages.



Ontario’s Board of Health was first established in 1882. In 1884, the province’s first medical officer of health started his job. By 1886, 400 boards of health were in operation throughout the province, in communities large and small. The promotion of healthy living in Ontario had begun.

During the late 1800s and early 1900s, governments were busy building and maintaining hospitals, cleaning up the urban environment, and making the water supply safe. Private organizations, such as benevolent societies and church groups, provided relief to the sick and did their best to control outbreaks of disease.

[Image: Disinfecting railway cars for foot and mouth disease, 1908. John Boyd fonds, Reference Code: C 7-3-1672, Archives of Ontario, I0003363]

[Image: The sale of “unsanitary” ice cream, ca. 1905. Public Health Nursing Branch, Reference Code: RG 10-30-2, 3.02.5, Archives of Ontario, I0005187]

Their work was part of the Victorian notion of social reform that flourished in Canada between 1880 and 1920: the belief that by promoting social causes (temperance, protection of children, improved working conditions, better schooling and medical care), traditional Christian values were being advanced.

The promotion of good health, to many reformers, pointed the way toward social progress and the advancement of society.

"M.H.O. Hastings: I had no idea you needed cleaning up so badly". A caricature
of Charles Hastings, Toronto's Medical Health Officer, and commentary
on his attempts to make Toronto cleaner and healthier, [ca. 1910-14]
Newton McConnell fonds
Reference Code: 301, 61
Archives of Ontario, I0006074

But by the 1900s, major outbreaks of diseases such as typhoid, cholera, and smallpox had overwhelmed the private system of care and the reformers' efforts. Governments at all levels had to step in.

In Ontario, the Provincial Board of Health took the lead. Soon, through pamphlets, lectures, bulletins, and regular visits from public health inspectors, Ontarians learned how to prevent disease and live healthier lives.

This new emphasis on prevention and education followed on the heels of the bacteriological revolution of the 1880s, as science began to uncover the mystery of what caused disease. Vaccines were discovered around the world; the first smallpox vaccine in Ontario was produced in 1886.

A log building at a work camp, with a "Smallpox here" sign affixed to it, [between 1900 and 1920]
Porcupine area photograph collection.
Reference Code: C 320-1-0-2-5
Archives of Ontario, I0022414

Influenza poster, 1918
Secretary of the Board of Health and Chief
Medical Officer of Health subject files
Reference Code: RG 62-4-9-450a.1
Archives of Ontario

The importance of clean water, pasteurized milk, and sanitary food practices was also now understood. By the early 1900s, most cities and towns in Ontario had by-laws to regulate the inspection of meat and milk, and inspectors to enforce those laws. Toronto, for example, instituted a city-wide milk campaign in 1921, to alert the public to the dangers of unpasteurized milk.

Nurses giving out free milk and weighing a child at a booth, T. Eaton Co. store, as part of the Toronto Milk Campaign, 1921
Public Health Nursing Branch
Reference Code: RG 10-30-2, 1.8.3
Archives of Ontario, I0005259

Class of boys drinking milk, Toronto milk campaign, 1921
Public Health Nursing Branch
Reference Code: RG 10-30-2, 1.8.15
Archives of Ontario, I0005262

Toronto's water system was first chlorinated in 1910, and other municipalities quickly followed that city's lead. Public health workers of all kinds, including doctors, nurses, and building and food inspectors, became more organized and professionalized.

The Honourable Manning Doherty milking a cow in front of the
Ontario Legislature, Toronto milk campaign, 1921
Public Health Nursing Branch
Reference Code: RG 10-30-2, 1.8.2
Archives of Ontario, I0005265

Thus, by 1900, the real beginnings of health education, or public hygiene as it was then called, had taken root in Ontario.

The gospel of public hygiene spread throughout the province. Medical inspection of public schools began after 1908, so that children could get the medical care and preventive education they needed. Centralized disease reporting in the province helped health workers and governments target the communities most in need. And the establishment of various health promotion agencies, such as the Canadian Association for the Prevention of Tuberculosis, formed in 1900, meant that the government could work with a wide array of health professionals to educate Ontarians about how to prevent and treat many diseases.

Children being measured at the school clinic, [ca. 1905]
Public Health Nursing Branch
Reference Code: RG 10-30-2, 3.03.2
Archives of Ontario, I0005191

By the 1920s, divisions of Preventable Diseases, Public Health Education, Laboratories, Sanitary Engineering, Industrial Hygiene, and Maternal and Child Hygiene and Public Health Nursing had all been established by the provincial government. And, in 1921, the Ontario Division of Public Health Education was formed, marking the start of a new era in health education.
