Innovation

Heroes of Progress: Paul Hermann Müller

Paul Hermann Müller was a 20th century Swiss chemist who discovered the insecticidal properties of dichloro-diphenyl-trichloroethane (DDT). The effectiveness of DDT in killing mosquitoes, lice, fleas and sand flies that carry malaria, typhus, the plague and some tropical diseases, respectively, has saved countless millions of lives.

Müller was born on January 12, 1899 in Solothurn, Switzerland. At the age of 17, Müller left school and, one year later, began working as an assistant chemist for Lonza – one of the world’s largest chemical and biotechnology companies.

In 1918, Müller returned to school. In 1919, he started studying chemistry with a minor in botany and physics at Basel University. Upon completing his undergraduate degree in 1922, Müller stayed at Basel University to study toward a PhD in organic chemistry. Müller completed his PhD in 1925. That year, he started working for J.R. Geigy, a company that specialized in “chemicals, dyes, and drugs of all kind.” It would be here that Müller made his great discovery.

In 1935, Switzerland began experiencing major food shortages caused by crop infestations, and the Soviet Union experienced the most extensive and lethal typhus epidemic in history. These two events had a profound impact on Müller. Before the 1940s, insecticides were either expensive natural products or inexpensive arsenic compounds that were poisonous to humans and other mammals.

Motivated by the need to create a cheap and long-lasting plant-protection agent that was not harmful to plants or warm-blooded animals, Müller decided to switch the focus of his work at J.R. Geigy from research on vegetable dyes and natural tanning agents, to plant-protection agents.

By 1937, Müller had developed a successful seed disinfectant named Graminone, which, when applied to seeds, protected them from soil-borne pathogens and insects. After this initial success, Müller turned his attention to insecticides. After four years of intensive work and 349 failed experiments, Müller found the compound he was looking for in September 1939.

DDT had first been synthesized by Viennese pharmacologist Othmar Zeidler in 1874. Unfortunately, Zeidler failed to recognize DDT’s value as an insecticide. J.R. Geigy took out a Swiss patent on DDT in 1940 and British, American and Australian patents followed in the early 1940s.

The discovery of DDT came at an important moment in history. It played a crucial role in protecting Allied troops in the Far East, where the shirts of British and U.S. troops were often impregnated with the compound. In 1943, DDT was used in Naples to bring a typhus epidemic under control in just three weeks. Between the 1950s and the 1970s, DDT was used to eradicate malaria from many countries, including the United States and most of Southern Europe.

The use of DDT declined after 1972, when it was banned, due to environmental concerns, by the U.S. Environmental Protection Agency. As Richard Tren of the public health advocacy group called Africa Fighting Malaria noted, “while there is evidence that the widespread, virtually unregulated agricultural use of DDT … did harm the environment, no study… has shown DDT to be the cause of any human health problem.”

In 2006, the World Health Organization reversed its stance on DDT. The WHO now recommends “the use of [DDT in] indoor residual spraying” as “DDT presents no health risk when used properly.”

After his discovery, Müller went on to become J.R. Geigy’s deputy director of research for plant-protection. In 1948, Müller received the Nobel Prize in Physiology or Medicine. The fact Müller was accorded this award even though he was not a physiologist or a medical researcher highlights the immense impact that DDT had in the fight against disease.

Later in life, Müller received many awards and honorary doctorates. In 1961, Müller retired from J.R. Geigy, but continued doing research in his home laboratory. Müller died in Basel in 1965 at the age of 66.

Thanks to Müller’s work, billions of people have been able to avoid exposure to deadly diseases that have plagued humanity since the dawn of our species.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Malcom McLean

Malcom McLean was an American truck driver and, later, businessman who developed the modern intermodal shipping container. McLean’s development of standardized shipping containers significantly reduced the cost of transporting cargo across the world. Lower shipping costs greatly boosted international trade which, in turn, helped to lift hundreds of millions of people out of poverty. McLean’s “containerization” remains a vital pillar of our interconnected global economy today.

Before McLean developed the standardized shipping container, nearly all the world’s cargo was transported in a diverse assortment of barrels, boxes, bags, crates and drums. A typical ship in the pre-container era contained as many as 200,000 individual pieces of cargo that were loaded onto the ship by hand. The time it took to load and unload the cargo often equaled the time that the ship needed to sail between ports. That inefficiency contributed to keeping the cost of shipping very high. This is where McLean enters our story.

Malcolm (later Malcom) McLean was born in November 1913 in Maxton, North Carolina. When he graduated from high school in 1935, his family lacked the necessary funds to send him to college. Instead he began working as a driver for his siblings’ trucking company.

In 1937, McLean made a routine delivery of cotton bales to a port in North Carolina for shipment to New Jersey. As McLean couldn’t leave until his cargo had been loaded onto the ship, he sat for hours watching dozens of dock hands load thousands of small packages onto the ship. McLean realized that the current loading process wasted enormous amounts of time and money, and he began to wonder if there could be a more productive alternative.

In 1952, McLean thought of loading entire trucks onboard a ship to be transported along the American Atlantic coast (i.e., from North Carolina to New York). Although this idea would dramatically reduce loading times, he soon realized that these “trailer ships” would not be very efficient due to the large amount of wasted cargo space.  

McLean modified his original design so that just the containers – and not the trucks’ chassis – were loaded onto the ship. He also developed a way for the containers to be stacked on top of one another. That was the origin of the modern-day shipping container.

In 1956, McLean secured a bank loan for $22 million. He used the money to buy two World War II tanker ships and convert them to carry his containers. Later that year, one of his two ships, the SS Ideal-X, was loaded with 58 containers and sailed from New Jersey to Houston, Texas. At the time, McLean’s shipping company offered transport prices that were 25 percent lower than those of his competitors, and his lockable containers helped to prevent cargo theft, which also appealed to many new customers.

By 1966, McLean launched his first transatlantic service and three years later, McLean had started a transpacific shipping line. As the advantages of McLean’s container system became clear, bigger ships, more sophisticated containers and larger cranes to load cargo were developed. 

In 1969, McLean sold his first shipping company for $530 million ($3.8 billion in today’s money) and went on to start a string of other business ventures. Most notably, he purchased the shipping company United States Lines in 1978 and built a fleet of 4,400 container ships. McLean continued to refine his shipping containers for the rest of his life. He died at the age of 87 in Manhattan in 2001. When he died, Forbes Magazine called McLean “one of the few men who changed the world.”

In 1956, hand-loading cargo onto a ship in a U.S. port cost $5.86 per ton ($55.58 in today’s money). By 2006, shipping containers had reduced that price to just 16 cents per ton ($0.21 in today’s money). HumanProgress board member Matt Ridley has noted that “the development of containerization in the 1950s made the loading and unloading of ships roughly twenty times as fast and thereby dramatically lowered the cost of trade.”

This dramatic reduction in shipping costs boosted international trade. That means that consumers now have access to goods from around the world at a price lower than was previously thought possible. Similarly, reduced shipping costs have helped to boost the living standards of hundreds of millions of people in export-oriented developing countries over the last few decades.

Without McLean’s containers, global trade would be far below the level that it is today, and nearly all of us would be less well-off.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Abel Wolman and Linn Enslow

Abel Wolman and Linn Enslow were two 20th century American scientists who discovered how to safely use chlorine to purify drinking water. Enslow and Wolman perfected their formula in 1923, and their discovery has so far saved more than 190 million lives worldwide.

Using chlorine to purify water did not start with Wolman and Enslow. During a cholera epidemic in 1854, chlorine had been used to purify London’s drinking water. Similarly, the first American patent for a water chlorination system was granted in 1888. While it was accepted that chlorine could kill bacteria, little was understood about the cleansing process and, since chlorine can be poisonous to humans, using the chemical for water purification purposes remained dangerous.

In the early 1900s, cities across America were expanding at a rapid pace and luxuries such as indoor running water were becoming more widespread. With no safe or effective measures to clean their drinking water, city water suppliers often became unwitting disseminators of an array of diseases, including cholera, dysentery and typhoid. This is where Wolman and Enslow enter our story.

Abel Wolman was born in June 1892, in Baltimore, Maryland. Wolman was one of six children of Polish-Jewish immigrants to America. Although Wolman had initially wanted to go into medicine, his parents encouraged him to study engineering. In 1915, Wolman became the fourth person to receive a BS degree from Johns Hopkins University’s newly established Engineering School.

Linn Enslow was born in February 1891 in Richmond, Virginia. Enslow studied chemistry at Johns Hopkins University and it was there that he first met Wolman. After graduation, both Enslow and Wolman began working at the Maryland Department of Public Health. In 1918, the pair teamed up to study chlorine’s effect on water purification.

In creating their method of water purification, Enslow and Wolman analyzed chlorine’s effect on the acidity, bacterial content and taste of drinking water. By 1923, the pair had created a standard formula detailing the amount of chlorine needed to safely purify water supplies. Enslow and Wolman’s rigorous scientific research laid the foundation for water purification across the world.

After their breakthrough, Wolman took a more active role than Enslow in encouraging states and countries to adopt the formula. Eventually Wolman was able to apply the new water purification method to Maryland’s drinking water supply. By 1930, typhoid cases in the state had declined by 92 percent. By 1941, 85 percent of all U.S. water systems used the Enslow-Wolman formula. The rest of the world followed America’s lead.

Wolman’s career flourished. He became chair of the state Planning Commission in his early 30s, acted as a consultant to the U.S. Public Health Service, served as the Chief Engineer at the Maryland Department of Public Health, and established the Department of Sanitary Engineering at The Johns Hopkins University in 1937.

Throughout his life, Wolman sat on numerous boards and advised governments on water purification systems throughout the world. Wolman eventually retired in 1962. He died in 1989 in his native Baltimore at the age of 96.

In the meantime, Enslow went on to become the editor of the magazine Water and Sewage Works. He worked in that role until his sudden death from a heart attack in 1957.

Thanks to the work of Enslow and Wolman, billions of people now have access to drinking water that is free from an array of potentially deadly diseases. It is estimated that the adoption of their formula in water systems worldwide has saved almost 200 million lives.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Kendrick and Eldering

Pearl Kendrick and Grace Eldering were two American scientists who created the first effective vaccine for whooping cough. Thanks to their work, whooping cough has become preventable, and Eldering and Kendrick’s vaccine has been credited with saving over 15 million lives so far.

Whooping cough is an upper respiratory infection that typically afflicts infants. Although early symptoms are often quite mild, over time coughing bouts cause those infected to lose their breath, turn red, and vomit. At the end of a coughing bout, the child will often be desperately sucking in air, which results in a “whooping” noise. The disease, known to scientists as pertussis after the bacterium Bordetella pertussis that causes it, can result in life-threatening complications such as pneumonia, bacterial infections, and dehydration.

At its height in the 1930s, whooping cough killed more American infants than polio, measles, tuberculosis, and all other childhood diseases combined. This is where Kendrick and Eldering enter our story.

Pearl Kendrick was born in August 1890 in Wheaton, Illinois. When she was just three years old, she developed a case of whooping cough. Kendrick was lucky enough to survive the illness and went on to have a happy childhood. She received her BS in Liberal Arts from Syracuse University in 1914.

Kendrick initially began her career as a high-school science teacher, but after a short time she started studying bacteriology at Columbia University, focusing on whooping cough. In 1917, she was recruited to work at the Michigan Department of Health. It would be here she would meet Grace Eldering.

Eldering was born in 1900, in Rancher, Montana. Like Kendrick, Eldering contracted and survived the whooping cough at a young age. Eldering studied Biology and English at the University of Montana and graduated in 1927.

In 1928, Eldering moved to Michigan and began volunteering at the Department of Health’s Bureau of Laboratories. After six months of volunteering, Eldering was placed on the payroll. In 1932, she transferred to the lab Kendrick ran in Grand Rapids.

Kendrick and Eldering instantly bonded and began working together on a whooping cough vaccine. However, as their efforts coincided with the Great Depression, funding for their vaccine was virtually nonexistent. As a result, Kendrick and Eldering developed their vaccine primarily during their off hours. This arrangement worked for a few years but by 1936 the pair was in desperate need of additional funds to continue trials for their test vaccine.

In an attempt to raise funds, Kendrick invited the First Lady, Eleanor Roosevelt, to their laboratory. To everyone’s surprise, Mrs. Roosevelt accepted the invitation and spent over thirteen hours with Kendrick in a single day. Soon after her visit, Mrs. Roosevelt helped to find the funds that allowed Kendrick and Eldering to continue the large-scale trial that they had begun in 1934.

Their trial eventually involved 5,800 children and the results were groundbreaking. The children who received the vaccine immediately demonstrated a strong immunity against the disease. In 1942, to reduce the discomfort for children receiving vaccines, Eldering and Kendrick combined three vaccines into a single shot. The use of the Diphtheria, Pertussis, and Tetanus (DPT) vaccine became routine throughout the United States in 1943. Thereafter, its use quickly spread across the world.

Both scientists received their PhDs from Johns Hopkins University; Kendrick in 1934 and Eldering in 1942.

Later in life, Kendrick left the Michigan Department of Public Health to teach at the University of Michigan, and Eldering succeeded her as the head of the department. Eldering retired in 1969 and died in 1988; Kendrick died in 1980. In 1983, both women were inducted into the Michigan Women’s Hall of Fame.

Tragically, each year 160,000 children continue to die after contracting whooping cough in the developing world. Although this figure continues to fall, there is much to be done before whooping cough is completely eradicated.

However, thanks to Eldering and Kendrick’s work, more than 15 million lives have already been saved, and it is likely that their vaccine will continue to save millions more.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Johannes Gutenberg

Johannes Gutenberg was a 15th century German goldsmith and inventor who created the first metal movable-type printing press. Gutenberg’s inventions included a process for mass-producing movable type, the use of oil-based ink for printing books, adjustable molds, mechanical movable type and the use of a wooden printing press similar to the agricultural screw presses of that time.

Gutenberg's ideas started a printing revolution, which greatly improved the spread of information. The printing press helped to fuel the later part of the Renaissance, the Reformation, and the Scientific Revolution, thus setting the stage for the start of the Industrial Revolution in the second half of the 18th century.

Relatively little is known about Gutenberg’s early life. It is believed he was born sometime between 1394 and 1404 in the city of Mainz in the Holy Roman Empire (in today’s Germany). We do know that Gutenberg was born into a wealthy patrician merchant family and that he grew up learning the trade of goldsmithing.

In 1411, the Gutenbergs were exiled from Mainz following an uprising against the patrician class. We don’t know much about Gutenberg’s life over the following fifteen years, but a letter written by him in 1434 indicates he was living in Strasbourg (today’s France). Moreover, legal records from that same year indicate that he was a goldsmith and a member of the Strasbourg militia.

Whilst in Strasbourg, Gutenberg created metal hand mirrors that pilgrims bought and used when visiting holy sites (it was thought hand mirrors could capture the holy light from religious relics). Gutenberg’s metal working skills proved useful when he developed the metal movable-type used in the printing press.

In 1439, Gutenberg encountered financial problems. Unable to placate his investors, Gutenberg is said to have shared a “secret” with them. It is speculated that the secret was a much-improved process of printing. A year later, Gutenberg supposedly declared that he had perfected the art of printing. That said, a workable prototype of his printing press remained a long way off.  

In 1448, Gutenberg moved back to Mainz. With the help of a loan from his brother-in-law, Arnold Gelthus, he was able to build an operating printing press in 1450. A working press enabled Gutenberg to convince Johann Fust, a wealthy moneylender, to lend him more capital to fund further refinement of the printing process. Peter Schöffer, Fust's son-in-law, also joined the enterprise and it is likely that Schöffer designed some of the press’ first typefaces.

It is widely accepted that Gutenberg had two presses, one for lucrative commercial texts, and another reserved for printing the Bible. In 1455, the first 180 copies of The Gutenberg Bible were completed. However, in the same year, Fust sued Gutenberg and demanded his money back, accusing Gutenberg of the misallocation of funds. The court ruled in favor of Fust, which gave him possession of the printing workshop and half of all the printed Bibles.

The court’s ruling left Gutenberg effectively bankrupt. Undeterred, Gutenberg managed to open a small printing shop in Bamberg (Bavaria) in 1459, where he continued printing Bibles. In 1465, the Prince-Archbishop of Mainz recognized Gutenberg’s accomplishments by naming him a Hofmann, or gentleman of the court. That meant that until his death in 1468, Gutenberg could live comfortably on the court’s large annual stipend.

Gutenberg's innovation quickly spread throughout Europe and beyond. That meant that books and pamphlets became much cheaper and more easily accessible. The deluge of printed texts helped to increase literacy rates throughout the continent. Medical, scientific and technical knowledge proliferated, improving the lives of millions. Philosophical, religious and political treatises abounded. The monopolistic controls that guilds and nobility held over Europe's economic and social life for centuries were broken.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

The Existential Threat of Pandemic Pathogens: Part 2

There's lots of other things to think about when you think about infectious disease: helminths, ectoparasites, amoebae, even non-carbon-based life. That's something to consider when you talk about the mission to Mars, and how you're going to deal with organisms that may or may not come back from there.

So there's lots of different things to think about, but really our consensus was that viruses are the most likely GCBR agent, because of their mutability, their rapidity of spread, and the lack of antivirals.

But when you get into viruses, there's so many viruses. There's lots of different rules and exceptions. Would the virus's genes be RNA, or DNA? Will it copy itself in the cytoplasm of a cell or will it be in the nucleus? Will it have a segmented genome like flu? Flu is one of the most prolific viruses, and the reason it's so good at infecting people is that its genes are each on a separate segment, so it can shuffle them like a deck of cards. So that's something that is really important to think about, whether it's segmented versus non-segmented. Does it have a really big genome like MERS and SARS? Or is it a very small genome?

And then what about the viruses that are spread by mosquitoes? They reach very high levels in the blood of infected people. Are those the kind of viruses that are going to be able to cause a GCBR? Just look at recent headlines about monkeypox, a DNA virus that people are very nervous about in the wake of smallpox eradication: people aren't vaccinating for smallpox routinely anymore, and monkeypox has resurged because the smallpox vaccine was also protective against monkeypox.

So when you think about viruses, there's lots of different things to think about. Probably when you think about GCBR-level risks, flu goes to the top of this list, because it's done it so many times before. And we do have this scare right now that's been going on since probably the 1990s, regarding avian influenza. That's a picture from my local county fair in my hometown outside of Pittsburgh.

And when you think of avian flu, one of the scariest versions of this is H7N9. We're right now in the sixth wave, but in the fifth wave we saw some very, very scary things happen. We saw changes suggesting this virus was becoming more likely to transmit between humans, we've seen antiviral resistance in this strain, we've seen the genetics of the strain change so much that there was a mismatch with the stockpiled vaccine, and we've seen it evolve high pathogenicity in chickens.

This is the CDC's ranking of viruses. A and B are both H7N9. So it's ranked at the highest risk for emergence and public health impact. This is probably one of the scariest viruses that we face, and it meets a lot of those criteria that we talked about.

When you look at the steps in pandemic emergence, there's a bunch of steps that a virus has to go through; this is specific to influenza. With H7N9 we're already down to around step 3 or 4: the virus is replicating sufficiently to produce infectious virus, but we're seeing only stuttering human transmission. Still, we're getting down that road with H7N9.

So this isn't something that's very theoretical, what I'm talking about, this is something that we're dealing with today, now, in China, with H7N9.

I think what the CDC does is they actually rank the different properties of the virus, which I think is a very good thing to do. It might not always be accurate, but it does definitely give you some framework for how to evaluate viruses, and that's what we were doing with this framework, was trying to take this type of an assessment tool of viruses, of influenza, and apply it to the whole microbial world. And that's what this new paradigm that we made was going to try to accomplish.

So what do you do when you come up with these ideas, when you come up with these types of lists of things that might cause this? You can do what CEPI did. CEPI is the Coalition for Epidemic Preparedness Innovations. It's a major funder for vaccine research, and what they did is they came up with a list that they took from the WHO R&D Blueprint and picked some diseases to actually go after.

I think that's one way to do this: where you have diseases that meet these criteria, you invest money to develop vaccines against them.

People did think about space bacteria, but it's unlikely that space bacteria, if they're adapted to Mars, are going to be able to do very well in humans on Earth, because there's totally different conditions that would allow them to flourish on one planet, that wouldn't apply to another planet.

We talked to lots of people that were thinking about salamanders and frogs, which are being decimated by these sapronotic pathogens, but don't necessarily affect humans.

So a couple more conclusions there. Any microbe is capable of causing a GCBR, but we believe that RNA viruses are the most pressing and likely threat because of their mutability and their zoonotic potential. Bacterial antimicrobial resistance is unlikely to reach GCBR levels, a GCBR level of widespread fungal disease is unlikely due to temperature restrictions, and only very select conditions would allow a prion-caused GCBR.

The last part of the talk, I just want to emphasize a few things. We're always surprised by infectious disease, and I list a bunch of examples here: H1N1 coming from Mexico, Zika, SARS, MERS. And there are lots of people investing in surveillance and prediction approaches.

I think there are two basic approaches. There is the global virome approach, where people go out and sequence everything they can, and try to build a list of the viruses that are out there. It tells you maybe what's coming, but it's very expensive, and 99 percent of those viruses are probably not going to pose any threat to humans. Also, what if the next GCBR or the next pandemic is not viral?

So that's one way to do it. Or there's another way to do it. I think this is the way I favor. It's looking at people that are getting infected by novel diseases. Looking at people like bush meat hunters, people who work in abattoirs, looking for what's called viral chatter, things that go from the first forays of a pathogen into humans. And then, looking at different hot spots. Instead of trying to sequence things, focus on things that are actually causing infections.

And then you think about unknown diagnoses. 50% of our septic shock cases, even in the United States, don't have a diagnosis, and I think people just treat the symptoms rather than chase a specific cause. So I think that we're going about this a little bit wrong. Going after these unknown etiologies all over the world, trying to figure out where these infections are occurring and actually running things down to specific diagnoses, instead of just saying, "You've got some viral syndrome." I think that's the way to go about it.

An article just came out in Nature a couple of days ago that really validates what we're saying, which is that you should spend much more on surveying actual infections, not on trying to predict things by viral cataloging. So, what's out there is lots of biological dark matter. We've got lots of viruses and lots of bacteria out there that nobody knows about. And I think we're at the stage now where we have this tricorder, the device Captain Kirk is holding there from Star Trek, where Bones, the doctor, would just scan someone and know exactly what they have.

We've got lots of new technologies, and I think that you can start and figure out what disease X is going to be, and that's the new WHO nomenclature for the unknown unknown. And I do think we're at that point now, but only if we actually harness these diagnostic tests.

When you think about infectious disease risk, you have to also think about human actions and that there are human actions that can enhance pandemic potential. Mistakes, political or scientific, fears of the unknown and also, complex disasters. So, if there's a war, for example, at the same time as an infectious disease, it can raise something to a GCBR-level.

So I'm going to conclude here, people keep asking me for books to read. There are lots of people who are first trying to get into this field. These are some of the ones that I could think of, that I think are interesting. This is an article I wrote for The Atlantic, Why Hasn't Disease Wiped Out the Human Race? If you just Google that, you'll find that under my name in the Atlantic, where I try to summarize some of the stuff that I talked about here. And then, these are some interesting books that I think really give you a flavor for this, and a few other ones that I didn't have pictures of.

The Viral Storm by Nathan Wolfe and Level 4: Virus Hunters of the CDC by McCormick, that's the book that really got me interested in all of this; it came out in 1996. And this is the book that really started it for me when I was a little child. That was the one that my parents read me over and over and over again, which is the story of the rabies vaccine. So, thank you for your interest and I'm happy to take any questions in the time that's remaining, and I have office hours as well, and feel free to follow me on Twitter or read my blog. Thanks again for your attention.


About the Author

Effective Altruism is a project of The Centre for Effective Altruism, a registered charity in England and Wales. Effective altruism is changing the way we do good. Effective altruism is about answering one simple question: how can we use our resources to help others the most? Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on. But it's no use answering the question unless you act on it. Effective altruism is about following through. It's about being generous with your time and your money to do the most good you can.

The Existential Threat of Pandemic Pathogens: Part 1

A sufficiently bad disease could have disastrous effects for humanity, like the 1918 flu, black plague, or even worse. But to protect ourselves as a species from global catastrophic biological risks (GCBRs), we need to know which features make a pathogen more or less likely to pose an existential threat. In this advanced talk from EA Global 2018: San Francisco, Dr. Amesh Adalja covers what types of pathogens are most likely, which transmission methods to worry about the most, and many more considerations for anticipating and preventing GCBRs. A transcript of Amesh's talk is below, including questions from the audience.

I'm going to talk about the characteristics and traits of pandemic pathogens. You just heard my colleague Crystal's talk, where she introduced the concept of Global Catastrophic Biological Risk (GCBR). I'm going to try and delve very deep into that concept, to try to understand what it is about certain pathogens that allows them to cause a GCBR. I think the theme of this conference is to be curious, and that's really what motivated this project: to be very curious.

The aim of the project I'll discuss today was to develop a whole new framework around the traits of naturally occurring pandemic pathogens that could constitute a GCBR, in order to change preparedness activities. In the past we really focused on list-based approaches, that are largely derived from the former Soviet Union's biological weapons program. There hadn't been a lot of fresh thinking. It was very static. In this project, we tried to bring more fresh thinking. First I'll provide a couple of basic definitions, just because I don't know if everybody has a biological background.

Pandemics are infectious disease outbreaks that occur over a wide area and affect a large proportion of the population. They don't necessarily have to be severe; 2009 H1N1 was a mild pandemic, but it was still a pandemic. An epidemic is an infectious disease outbreak that affects a large number of individuals within a population; the SARS outbreak in 2003 would be an epidemic. An endemic disease is one that occurs regularly within a population. For example, the common cold is endemic in the human population. Those are just things to keep in mind. What we're talking about are specific types of pandemics that are severe enough to meet GCBR criteria.

I'm not going to spend much time on this definition because Crystal really went into some great detail, but what I'm going to do is expand on what's special about certain biological agents, and what types of biological agents can cause GCBRs. Everybody is very focused on viruses, some people are focused on bacteria, some people on other things, so what I'm trying to understand is what traits does a biological agent have to have in order to be able to cause something this severe, to cause this massive catastrophic loss of life, disruption of society, etc.

I just want to draw a distinction, because what I'm talking about here are pathogens of pandemic potential, versus pathogens that are merely of critical regional importance, a distinction made in Mike Osterholm's recent book. When you have an outbreak like Middle East Respiratory Syndrome, just because it's not a GCBR doesn't mean it's not important or that it's not going to be very disruptive to people's lives, to societies and to governments. But what we're talking about in a GCBR is something that's going to be global, like the 1918 flu. Something that's a lot different in scale than even Middle East Respiratory Syndrome or SARS. Something much different. There's lots of pathogens of critical regional importance. Even the Ebola outbreak in 2014 in West Africa falls under the criteria for critical regional importance, versus GCBR.

The specific aim with this project was really to, like I said, move away from a list-based historical approach. People had really just taken the Soviet Union's biological weapons program and added a couple of things here and there, but hadn't thought much about why each thing was on there. They hadn't really challenged the assumptions that put them on there, or really tried to understand certain questions. What was it that made smallpox so scary? What is it that makes pandemic flu so scary? We were really trying to go about this in an inductive manner, trying to make a whole new paradigm, looking at the actual attributes of GCBRs.

We tried to do this by being totally microbe-agnostic. When I say microbe-agnostic, I mean we didn't go into this project saying, "This has to be a virus. This has to be a bacterium". We said it could be anything. It could be a parasite, a protozoan, it could be a prion. So that was something that was totally new. We really wanted to challenge thinking and then take this paradigm and hopefully use it to move forward when we think about preparedness, and to think about new infectious disease outbreaks with this new paradigm in mind, to get better at being prepared, because we're constantly surprised, which I'll get to later in the talk.

What are the essential traits? We thought about what a GCBR pathogen would have to have. We talked to people and did a lot of literature review; there's a whole bunch of different things that make up the alchemy of a pandemic pathogen. I'm going to try and explain what this equation means as best I can.

The first thing is that you have to have a pathogen that can efficiently transmit from human to human. You can have a disease that is really bad, like tetanus, but since tetanus doesn't transmit between humans, it can't be a pandemic pathogen. When you're talking about a pandemic pathogen, it has to be able to get from person to person, so that's number one.

It has to have a moderate fatality rate. It doesn't have to be really, really horrible like a 90% fatality rate or 100% like rabies. It has to be something that's kind of in a sweet spot that it allows enough death to occur that it causes disruption in the society. Remember that the 1918 influenza pandemic, which killed 50 to 100 million people, only had a fatality rate of two percent. But because it was so widespread, it led to disruption.

Contagious during the incubation period. In multiple modeling studies, and in experience - and Crystal alluded to this earlier when she talked about smallpox - if a disease is contagious during the incubation period, before you're visibly sick, then it's very, very hard to control. That's why the H1N1 pandemic had spread everywhere before anybody even knew about it: people were contagious a day before symptoms appeared. If a disease is contagious in that period, it becomes very, very difficult for public health interventions to have any impact. The same goes for mild illness combined with contagiousness. When you have the flu or the common cold and you're out shopping, going about your normal daily life, you spread it to other people. It's very hard to stop that, versus something like Ebola, where by the time you're sick and highly contagious you are in bed and can't really move about in society. So this is another key factor.

An immunologically naïve population. Again, reflecting back on Crystal's talk, when she showed the map of the indigenous populations in the Americas in the pre-Columbian era, in 1492: that was a population immunologically naïve to smallpox, which allowed smallpox to spread very rapidly through it. That's what a pandemic pathogen would require.

No vaccine or treatment. You don't have any way to stop this. That's another thing that fits into GCBRs.

Correct atmospheric and environmental context. Infectious diseases happen in a context. Is it happening during World War I, like the pandemic flu did in 1918? Is it happening where there's been societal disruption, like, for example, Yemen right now and the cholera outbreaks? All of that's going to play a major role in how prolific an infectious disease outbreak is.

There's a lot of biology that has to go into this too. Not every pathogen can infect every type of human. You have to have a proper receptor. So you've got to have some receptor that lots of humans have, that this virus or this bacterium can actually cling onto. It's also going to explain which organs it affects, because obviously some organs are more important. If it affects your brain, your kidneys, your liver, your lungs, those are what you really see with these pandemic pathogens. And then it has to be able to evade the host immune response. It has to be something that the immune system cannot easily mount an effective response against.

You can take all of these things, give them values, and come up with the pandemic potential of a pathogen, and you can vary them and see how the picture changes. You can look at some other things too. For example, the more host types a pathogen has, the more likely it is to emerge. That's another thing that comes up: these things can infect more than one type of species. That's where the concept of zoonosis comes in, where something comes from an animal species into humans. The more hosts something has, the more likely it is to be able to infect humans and cause a problem.
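To make the scoring idea concrete, here is a minimal sketch in Python of what "give the traits values and combine them" could look like. Every trait name, weight, and example score below is a hypothetical assumption for illustration only; the talk does not specify an actual formula.

```python
# Minimal sketch of the trait-scoring idea described above. All trait names,
# weights, and example scores are hypothetical assumptions for illustration;
# they are not the framework or numbers used in the actual project.

TRAIT_WEIGHTS = {
    "human_to_human_transmission": 3.0,   # efficient spread between people
    "moderate_fatality_rate": 2.0,        # deadly enough to disrupt society
    "contagious_during_incubation": 2.5,  # spreads before symptoms appear
    "immunologically_naive_population": 1.5,
    "no_vaccine_or_treatment": 2.0,
    "favorable_context": 1.0,             # war, disrupted societies, etc.
    "receptor_and_immune_evasion": 2.0,   # can bind human cells, evade immunity
}


def pandemic_potential(trait_scores):
    """Combine per-trait scores (each 0.0 to 1.0) into a weighted 0-to-1 index."""
    total = sum(TRAIT_WEIGHTS[t] * trait_scores.get(t, 0.0) for t in TRAIT_WEIGHTS)
    return total / sum(TRAIT_WEIGHTS.values())


if __name__ == "__main__":
    # Hypothetical respiratory RNA virus, scored against the traits above.
    example = {
        "human_to_human_transmission": 0.9,
        "moderate_fatality_rate": 0.6,
        "contagious_during_incubation": 0.8,
        "immunologically_naive_population": 0.7,
        "no_vaccine_or_treatment": 0.9,
        "favorable_context": 0.5,
        "receptor_and_immune_evasion": 0.7,
    }
    print(f"Pandemic potential index: {pandemic_potential(example):.2f}")
```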

When you think deeper, there are a couple of things that come out of this. When you look at the way these things transmit, the most likely way for a pathogen to cause a pandemic is through the respiratory or airborne route. There's much, much less you can do to stop an airborne virus or a respiratory/droplet-spread virus. If I had measles right now, you would all be exposed; it's very, very hard to stop that. But if it was something spread through, for example, the fecal/oral route, like hepatitis A or cholera, you can really limit that with sanitation. Remember, there were a couple of cases of cholera in Mexico about five years ago, and everybody panicked. But there was just even a modicum of sanitation, and that stopped cholera; it did not spread in Mexico. You can't do that with respiratory and airborne viruses. It's much, much harder with respiratory or airborne pathogens.

Vector-borne transmission, meaning transmission through mosquitoes or ticks, is something that's very interesting, and it is something that I think we struggle with when trying to figure out exactly how bad a vector-borne outbreak can get. These outbreaks are, in general, limited by the vector's range. For example, mosquitoes frolic in places that are much more conducive to their habitat, so it's going to be very hard for them to live in a temperate climate. I live in Pennsylvania; we don't have mosquitoes year round there. But there are parts of the world where mosquitoes thrive all year round. That's why I talked specifically about Aedes mosquitoes, which are the mosquitoes responsible for spreading dengue, chikungunya, Zika and yellow fever. They basically cover half the population of the globe. That is something that's a little bit different with vector-borne disease. In general, we don't think vector-borne pathogens could do this, because the host range, the range of the vector, is kind of limited.

The other thing to think about, talking about global catastrophic biological risk, is that we talk about deaths, but is fatality everything? This is a question to pose to yourself. There are two purposes for an organism: to survive and to reproduce. What if you had a disease like Zika, but one that was much more widespread? Or rubella, pre-vaccine. If you decrease the reproductive fitness of a species, could that lead to a GCBR? I think the answer is yes. I don't think that Zika or rubella really meet those criteria, but maybe if rubella occurred now, it would fit the GCBR criteria. Back in the 1960s, people coped with rubella, but this is one way to think about a GCBR that doesn't end up killing everybody.

The other thing is, there's another virus, HTLV, which has been in the headlines a lot in the last couple of weeks because of some studies that have come out of Australia. HTLV is the most carcinogenic virus; it causes human T-cell leukemia. It was the first human retrovirus to be discovered. What if an infectious agent caused cancer in everybody? Would that lead to a GCBR? That's something just to think about: it's not always going to necessarily be death. I think you have to be very broad and active minded about this.

There's also something I wanted to talk about called sapronotic disease. That's a term that we found from the plant world and the animal world, where you have this idea that if an infectious agent is killing everybody that it infects, it's not very good because it's going to run out of people to infect. But what if it doesn't care? What if people are just a secondary source for it? What if it's an amoeba that eats things in pond scum, but only intermittently infects humans, like the brain-eating amoebas that always grab headlines? How does that fit into this? If it doesn't necessarily need to be in humans to thrive. That it has other things. It could eat dead bark on a tree, or it can eat stuff on the bottom of the forest floor. That's another way to think about a pandemic pathogen, that I think is really interesting.

The first set of conclusions we drew from this project was that the traits most likely to be possessed by a GCBR pathogen include respiratory/droplet transmission, with fecal/oral transmission much less likely. The Aedes vector-borne agents, those mosquitoes, have a special status because there's widespread mosquito prevalence and there are certain viruses that reach very high levels in your blood that these mosquitoes can just pick off. That's why we've seen explosive outbreaks of dengue, chikungunya and Zika, and why we've seen yellow fever resurface. So they are a special category. They probably don't rank as high as respiratory/droplet, but that's something to keep in mind: these could possibly do that.

So then the next thing we try to do is think about, okay, we've said respiratory/droplet transmission. So we have to make a choice, we have to think: is it gonna be virus, bacteria, fungi, parasite?

So I say there's no agnostics in a foxhole. We had to push people that we talked to in this thing, and push ourselves to think, what would it really be? So I think that viruses are very formidable in this realm. They mutate much, much more rapidly than bacteria. The transmission and replication cycle in a human is much faster than in bacteria. And you heard this earlier today, that there's no real broad spectrum antiviral agent.

So when we think about bacterial infections, we can usually, even in the face of antibiotic resistance, craft together some regimen that works. And lots of them are nonspecific, they kill wide varieties of bacteria. They have a spectrum.

We don't have that so much with antiviral agents. We've got a certain drug for flu, a certain drug for hepatitis C, a certain drug for HIV. We don't have broad spectrum antivirals. And that's a major chink in the armor against viruses.

When you think about bacteria, they really are limited. Because of broad spectrum antibiotics, they're slower, they're less mutable. There are some caveats. We do have multi-drug resistant bacteria that can be challenging, and antibiotic resistance is probably one of the most pressing public health threats that we face.

But they still don't rise to the level of causing a GCBR. Because even if you think about everything becoming resistant, we would be pulled back to the pre-penicillin era. But people still lived during the pre-penicillin era in a way that they were still flourishing; human populations were still growing pre-penicillin. So I think that's important to remember.

But we have seen major outbreaks. For example, we talked about Yemen and the cholera outbreak that's occurring there amid the bombing and the infrastructure problems; it's been the worst cholera outbreak in history. And we had a bad plague outbreak just a couple of years ago in Madagascar, with 1200 people infected.

So in certain resource-poor areas, you can see bacteria get very close to causing a GCBR, but it would be hard for them to go beyond that. Think back to the Black Death in the 1300s and 1400s. There were no antibiotics, and I think that's why the Black Death probably does qualify as a GCBR for that time.

So the other thing, what about fungi? Fungi are everywhere in the environment. But the fact is, they're temperature restricted. They don't like to grow at human body temperatures. And there are a lot of hypotheses about why this is. They think that humans actually came through what's called a mammalian filter, that we evolved the ability to maintain a 98.6 degree Fahrenheit, or 37 degree Celsius, body temperature as a way to avoid fungi. And we see fungi decimating salamanders and frogs, and all of these other types of species that have lower body temperatures.

And it's very, very hard for fungi to infect humans; the host usually has to be immunocompromised. There have been a few scary things, like Candida auris, which is a multi-drug resistant fungus that preys on hospitalized patients, and Cryptococcus gattii, which is one that lives around soil and trees. And then there was the Exserohilum outbreak, which you might have heard about, related to contaminated steroids that were injected directly into people's backs. But again, the fungus needed something like that to be able to infect people.

Some of these fungi are sapronotic, and they can have a very, very high mortality rate, but they don't transmit well between humans. The temperature restriction makes it very hard for them to cause a pandemic.

For prions, you've probably heard of mad cow disease, which is probably the most famous prion disease. Prions don't spread very well between humans unless there's cannibalism. This is graffiti in Pittsburgh; there's a guy who writes "kuru" all over the city. Kuru was a disease found in Papua New Guinea among the indigenous tribes there, who were engaging in ritualistic cannibalism of their dead, and kuru was really decimating that population.

But unless you have people directly exposing themselves in that manner, prions are unlikely to do that. We've seen scrapie, for example, in sheep, and chronic wasting disease in elk. Those diseases have spread because of the amount of saliva contact involved in the way these animals eat and interact with each other, conditions that don't really apply to humans.

It's interesting, because with chronic wasting disease they actually had to burn down a forest to stop the prion from spreading, because it was so endemic to the elk and deer in that region. So that's a testament to what a prion can do, but the conditions are very restricted when you think about how prions could do this in humans.

This is something that most people don't know about. We looked to see whether there had ever been an extinction event from an infectious disease in any animal species, and there's one. This is the Christmas Island rat, which was driven to extinction by a vector-borne, mosquito-borne, trypanosome. Trypanosomes are parasites, much bigger than bacteria or viruses. And the Christmas Island rat was basically rendered extinct because of it.

But does this apply to humans? I don't think it does, because the poor Christmas Island rat couldn't go anywhere. It was on an island; there was nowhere for it to run. Humans can outrun a vector, and like I said, vectors are limited in where they can actually cause disease and spread infections. If you can outrun that vector, if you can move to places where it isn't, you can probably avoid it.

I think that when you look at the human trypanosome diseases, they haven't done anything near what they did to the Christmas Island rat. There are some concerns when you talk about parasites and protozoa, about certain drug-resistant forms of malaria, especially if they make it from Asia to Africa, being able to maybe cause something on a GCBR level. But we haven't seen that yet.


About the Author

Effective Altruism is a project of The Centre for Effective Altruism, a registered charity in England and Wales. Effective altruism is changing the way we do good. Effective altruism is about answering one simple question: how can we use our resources to help others the most? Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on. But it's no use answering the question unless you act on it. Effective altruism is about following through. It's about being generous with your time and your money to do the most good you can.

Heroes of Progress: James Watt

James Watt was an 18th century Scottish engineer and inventor who enhanced the design of the steam engine. Watt’s steam engine made energy supply more efficient and reliable than ever before, and it was fundamental to kick-starting the Industrial Revolution.

James Watt was born on January 19, 1736 in Renfrewshire, Scotland. His father was a successful shipbuilder and Watt later reminisced that growing up around his father’s workshop proved a profound influence in his educational goals and career trajectory. Due to suffering from bouts of illness as a child, Watt was mostly homeschooled.

When he was 18, Watt’s mother passed away, and the future inventor travelled to London to study mathematical instrument making, which involved learning to build and repair devices such as quadrants, compasses, and scales. After a year in London, Watt returned to Scotland where he produced and repaired mathematical instruments. Watt eventually opened a mathematical instrument shop in 1757 at the University of Glasgow.  

In 1764, Watt was given a Newcomen steam engine to repair at his workshop. This older engine, invented in 1712, operated by condensing steam inside a cylinder, creating a partial vacuum that allowed atmospheric pressure to push a piston. While fixing the engine, Watt observed that a great deal of steam was wasted due to the machine’s single-cylinder design. Because the engine’s working stroke depended on cooling the steam, Watt realized that having to repeatedly heat and cool the same cylinder wasted more than three-quarters of the steam’s thermal energy.

To remedy this inefficiency, in 1765 Watt created a design in which the steam was condensed in a chamber separate from the cylinder. This was revolutionary. Unlike the Newcomen engine, which wasted energy by repeatedly heating and cooling the same cylinder, Watt’s engine kept the cylinder at a stable temperature while the steam condensed in a separate chamber.

However, due to lack of capital, Watt faced difficulties in constructing a full scale-engine. With investment from Joseph Black, a University of Glasgow physician, Watt was successful in creating a small test engine in 1766. A year later Watt entered a business partnership with John Roebuck. In 1769, Watt and Roebuck took out their famous patent for “A New Invented Method of Lessening the Consumption of Steam and Fuel in Fire Engines.”

Unfortunately, acquiring the patent bled Watt’s funds dry. He was therefore forced to take on other employment, first as a surveyor and then as a civil engineer.

Seven years later, Watt’s old business partner went bankrupt and an English manufacturer named Matthew Boulton acquired Roebuck’s patent rights. Through Boulton, Watt returned to working full-time on his engine.

Together, the two men founded the Boulton and Watt manufacturing firm, and Watt spent the next several years improving the efficiency and cost of his engine. Watt's first profitable dual-cylinder steam engine came on the market on March 8, 1776, a day before Adam Smith's Wealth of Nations was first published. Little did the two Scotsmen know that they were about to change the world forever. 

The demand for Watt’s engine grew and it was quickly adopted across multiple industries, including rotary machines that were used in cotton mills, which supplied cheap clothing to the masses for the first time. Ultimately Watt’s design turned the steam engine from a machine of “marginal efficiency into the mechanical workhorse of the Industrial Revolution.”

In 1800, once the patent on the steam engine expired, Watt retired. Watt died on August 25, 1819 at the age of 83 in Birmingham, England. He was honored with numerous awards during his lifetime, including fellowship of both the Royal Society of London and the Royal Society of Edinburgh. In 1960, the watt (W) was adopted as the unit of power in the International System of Units, in his honor. In 2009, the Bank of England announced that Watt’s face would appear on the new British £50 note.

Industrialization has lifted hundreds of millions of people out of poverty. Today, all developed countries have gone through the process of industrialization, a phenomenon that would not have occurred without the Watt engine, and it is for that reason that James Watt is deservedly our 13th Hero of Progress.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Joseph Lister

Joseph Lister was a 19th century British surgeon who is commonly dubbed “the father of modern surgery.” Lister introduced new methods of sterilization to hospitals as well as medical equipment that transformed hospitals from death traps into places of genuine healing. In hospitals that adopted his suggestions, infection rates fell drastically and since then Lister’s ideas have been adopted as surgical norms across the world.

Joseph Lister was born on April 5, 1827 in Essex, England, to a prosperous Quaker family. His father was a successful wine merchant and amateur scientist, who pioneered achromatic (i.e., non-color distorting) lenses that are used in microscopes today. Lister attended the University College London, which was one of very few schools that accepted Quakers, to study botany. He graduated with a Bachelor of Arts in 1847.

Upon finishing his undergraduate degree, Lister immediately enrolled as a medical student at the same institution. He graduated with honors in 1852 and became a house surgeon at University College Hospital. A year later, Lister moved to Edinburgh to be the assistant to James Syme, a renowned surgical teacher of the time. In 1859, Lister assumed the prestigious position of Regius Professor of Surgery at Glasgow University.

In the mid-19th century, it was thought that bad air was responsible for infections in patients’ wounds. Hospitals would air out their wards at midday to reduce the spread of “miasma,” a poisonous vapor filled with particles from decomposed matter that supposedly caused infections. Ignorance about the spread of disease was rife. Surgeons did not wash their hands or clothes, boasted about the stains on their dirty operating gowns, and referred to the resulting stench as the “good old surgical stink.”

Whilst working at the University of Glasgow, Lister read a paper written by the French chemist Louis Pasteur, who showed that food spoilage may occur under anaerobic (i.e., airless) conditions if micro-organisms were present. Pasteur debunked the notion that bad air caused infections and recommended filtration, exposure to heat, or chemical solutions to arrest the spread of harmful bacteria.

Since Pasteur’s first two suggestions would damage human tissue, Lister started to spray patients’ wounds, surgical instruments, and dressings with carbolic acid (phenol). He soon noticed that the carbolic acid dramatically reduced infection rates among his patients. He published his findings in the scientific journal The Lancet in a series of six articles in 1867.

Lister ordered all surgeons under his direction to wear clean gloves, wash their hands and instruments before and after operations, and spray the carbolic acid solution in the operating room. His methods quickly caught on across the world. As hospitals and operating rooms became cleaner and wounds were sterilized, infection rates fell, and millions of lives were saved.

In 1869, Lister returned to Edinburgh as Syme’s successor and continued to develop his methods of sterilization. Lister’s most famous patient was Queen Victoria, who called on him to attend to an “orange-sized abscess in her armpit” in 1871. Armed with carbolic acid and a small incision tool, he treated the Queen’s wound. Lister would later joke to his students, “Gentlemen, I am the only man who has ever stuck a knife into the queen!”

Lister continued teaching, researching and treating patients until his wife passed away in 1893. After that, he said that studying and writing lost their appeal to him. Lister died on February 10, 1912 at his country home in Kent, at the age of 84.

Lister was decorated with numerous awards and honorary degrees throughout his life. Most notably he was made President of the Royal Society in 1894 and a baron by Queen Victoria in 1897. Lister’s sterilization methods transformed modern surgery and saved untold millions of lives.



Heroes of Progress: Maurice Hilleman

Maurice Hilleman was an American microbiologist who developed over 40 lifesaving vaccines. Of the fourteen vaccines recommended in current vaccine schedules, Hilleman developed eight. He is credited with saving more lives than any other medical scientist of the 20th century.

Hilleman was born August 30, 1919, in Montana. His mother died two days after he was born, leaving his father facing the prospect of raising eight children alone. His childless aunt and uncle therefore agreed to raise Maurice on their nearby chicken farm. Hilleman attributed much of his later success to his boyhood work on the farm – since the 1930s, chicken eggs have been used to grow viruses for vaccines.

Due to a lack of funds, Hilleman almost didn’t make it to college. Thankfully, his eldest brother interceded and lent him the money to pay the tuition fees. Hilleman graduated first in his class from Montana State University in 1941 and won a fellowship for postgraduate study in microbiology at the University of Chicago. He received his doctoral degree in 1944.

Upon graduation, Hilleman joined the virus lab of E. R. Squibb & Sons in New Jersey. Soon after he started working in the lab, Hilleman successfully developed a vaccine for Japanese B encephalitis. This infection, which is native to Asia and the western Pacific, had begun to spread to American troops fighting in the Pacific during World War II.

In 1948, Hilleman began working as the Chief of the Department of Respiratory Diseases at the Army Medical Center in Silver Spring, Maryland. In 1957, Hilleman spotted the first signs of an impending flu pandemic spreading in Hong Kong. He and his colleagues raced to produce a vaccine, and he oversaw the production of over 40 million doses that were immediately distributed across the United States.

Although 69,000 Americans died after catching the virus, if it hadn’t been for Hilleman’s efforts, the pandemic could have caused millions of deaths. In recognition of his work, the American military awarded Hilleman the Distinguished Service Medal.

In 1963, whilst working at Merck & Co (one of the world’s largest pharmaceutical companies), Hilleman’s daughter, Jeryl Lynn, became ill with the mumps. Hilleman quickly drove to his lab to pick up the necessary equipment so that he could cultivate material from his daughter’s infection.

In 1967, the original sample taken from Jeryl Lynn’s throat became the basis for the newly approved mumps vaccine, which came to be known as the "Jeryl Lynn Strain." Hilleman later combined his mumps vaccine with the measles and rubella vaccines – which he had also developed – to create the MMR vaccine.

Apart from the vaccines mentioned above, Hilleman also developed vaccines for hepatitis A, hepatitis B, chickenpox, meningitis, pneumonia and Haemophilus influenzae type B. He also played a part in discovering the cold-producing adenoviruses, the hepatitis viruses, and the cancer-causing SV40 virus.

In 1984, at the mandatory retirement age of 65, Hilleman resigned as Senior Vice President of Merck Research Labs. Not satisfied with retirement, he began directing the newly created Merck Institute for Vaccinology only a few months later. Hilleman continued working at the Institute for Vaccinology until his death in 2005, at the age of 85.

Throughout his life, Hilleman received a stream of awards, including the National Medal of Science (the United States’ highest scientific honor), and the lifetime achievement award from the World Health Organization. Hilleman is often described as the most successful vaccinologist in history.



Heroes of Progress: Francoise Barre-Sinoussi

Francoise Barre-Sinoussi is a French virologist who discovered that the human immunodeficiency virus (HIV) is the cause of acquired immune deficiency syndrome (AIDS). Barre-Sinoussi’s discovery has led to the development of medical treatments that slow the progression of HIV and decrease the risk of its transmission.

Barre-Sinoussi was born July 30, 1947 in Paris, France. From an early age, Barre-Sinoussi showed an interest in science and decided to continue her passion for knowledge at the University of Paris. Initially Barre-Sinoussi wanted to study medicine, but as she came from a humble background, she decided to be pragmatic and study natural science. Natural science was a shorter course than a medical degree, thus saving her family money in tuition and boarding fees.

After studying at the University of Paris for a couple of years, Barre-Sinoussi began working part-time at the Pasteur Institute – a Parisian research center focused on the study of biology, diseases and vaccines. She soon began working there full-time, returning to the university only to take her exams. Barre-Sinoussi received her PhD in 1975 and, after a brief internship in the United States, began working on a group of viruses known as retroviruses.

During the AIDS epidemic of the early 1980s, scientists were perplexed as to what was causing the outbreak of the disease, and Barre-Sinoussi decided to apply her knowledge of retroviruses to the problem. In 1983, Barre-Sinoussi and her colleague Luc Montagnier made the groundbreaking discovery that HIV is the cause of AIDS.

Barre-Sinoussi’s discovery led to many medical breakthroughs that have helped in the fight against AIDS, including numerous HIV testing and diagnosis technologies, and lifesaving antiretroviral therapy.

In 1988, Barre-Sinoussi took charge of her own laboratory at the Pasteur Institute and began intensive research aimed at creating an HIV vaccine. Although no vaccine has yet been discovered, her team continues to research different mechanisms to protect people against HIV infection.

In 2008, Barre-Sinoussi and Montagnier were awarded the Nobel Prize in Physiology or Medicine for their discovery of HIV. Barre-Sinoussi has received numerous awards and honorary doctorates. In 2006, she was named Grand Officier de la Légion d’Honneur – France’s highest order of merit. Between 2012 and 2016, Barre-Sinoussi served as President of the International AIDS Society. She retired from active research in 2017.

As HumanProgress.org has noted before, thanks to the discovery of HIV and the creation of different treatments, humanity is now winning the war on AIDS. Since the peak of the HIV pandemic in the mid-2000s, when some 1.9 million people died of AIDS each year, deaths have fallen: fewer than one million people died from the disease in 2017. New infections are also down. In the mid-1990s, there were 3.4 million new HIV infections each year, but in 2017, there were only 1.8 million new infections. That’s a decline of 47 percent.

Without the contributions of Francoise Barre-Sinoussi, humanity’s crusade against AIDS would not be as advanced or successful as it is today, and millions more people would be dying from the virus each year.



Heroes of Progress: Richard Cobden

Richard Cobden was a 19th century British politician and textile manufacturer. Cobden’s work turned Britain, the global hegemon at the time, into a free trading nation – an act that set in motion a global trade liberalization that has lifted millions of people out of poverty.

Richard Cobden was born June 3, 1804, in rural Sussex, England. He was the son of a poor farmer and spent his early years in abject poverty. Cobden received little formal education and, at the age of 14, he became a clerk in a textile factory. In 1828, Cobden and two other young men started a company selling calico prints in London. The business was an immediate success and within a few years he was living an affluent life in Manchester.

In 1833, the now-prosperous Cobden began travelling the world. He visited much of Europe, the United States, and the Middle East. While on his travels in 1835, Cobden wrote an influential pamphlet titled England, Ireland and America. In the pamphlet, he advocated for a new approach to foreign policy based on free-trade, peace and non-interventionism.

Cobden returned to England in 1839 to advocate for the repeal of the Corn Laws. Enacted in 1815, the Corn Laws were tariffs on food and grain imported into Britain. They kept grain prices artificially high to favor domestic producers. Cobden argued that these laws raised the price of food and the cost of living for the British public, and hampered the growth of other economic sectors.

In March 1839, Cobden created the Anti-Corn Law League – an organization dedicated to securing the repeal. Cobden, with the support of the talented orator John Bright, spoke to audiences across the country. He presented a petition to Parliament urging an end to protectionism. After it was rejected, Cobden realized that petitions would achieve little; direct political action was needed.

In 1841, Cobden became a Member of Parliament for Stockport. The economic hardship of the recession that lasted from 1840 to 1842 pushed more people toward free trade, and the Corn Laws were eventually repealed in 1846.

Prime Minister Robert Peel acknowledged Cobden as the man responsible for enabling those who lived in extreme poverty to access cheaper foodstuffs from abroad. Moreover, the repeal of the Corn Laws forced many of Britain’s colonies to embrace free trade.

In 1859, with tensions between Britain and France running high, Michel Chevalier, a French statesman, urged Cobden to persuade the French Emperor Napoleon III of the benefits of free trade. Cobden, with the blessing of the Chancellor of the Exchequer William Gladstone, met with the Emperor to discuss a potential Anglo-French free trade deal.

The Emperor was receptive to Cobden’s arguments and, on January 23, 1860, Britain and France signed the Cobden-Chevalier Treaty. Princeton University economist Gene Grossman described the treaty as the “first modern trade agreement.” Cobden died in London on April 2, 1865.

The repeal of the Corn Laws marked a fundamental shift of the British Empire toward free trade. That policy alleviated the hunger and suffering of millions of people and set a precedent for the free-trade treaties that followed. Cobden’s role in the creation of the Cobden-Chevalier Treaty laid the foundation for the modern trade agreements that continue to shape and enrich the world today.



Heroes of Progress: William Wilberforce

William Wilberforce was a leading 18th century British abolitionist and politician. Wilberforce’s efforts helped to ban the slave trade in 1807 and abolish slavery in the British Empire in 1833, thus freeing millions of enslaved people.

Wilberforce was born on August 24, 1759, in Kingston upon Hull, England. His father was a wealthy merchant and, at the age of 17, Wilberforce began studying at Cambridge University. The death of his grandfather and uncle left Wilberforce independently wealthy and, while at Cambridge, he lived a relatively carefree life. He was well-known within the university’s social scene and became friends with William Pitt the Younger, who later became Prime Minister.

After graduating in 1780, Wilberforce decided to seek political office and, at the age of 21, he became the Member of Parliament for Hull. He was independent of any political party, stating he was a “no party man.” In his first four years in parliament, Wilberforce admitted he “did nothing to any purpose. My own distinction was my own darling object.” Similar to his university days, Wilberforce was known in many social circles and was fond of drinking and gambling.

In 1785, Wilberforce travelled to Europe with his sister and mother for a vacation. During his time abroad, he read Rise and Progress of Religion in the Soul. This book had a profound impact on Wilberforce’s life. He embraced evangelical Christianity, lost interest in card games and drinking, began to get up early to read the Bible, and decided to commit his future life to work in the service of God.

Thereafter his political views were guided by his faith and his desire to promote Christian ethics. And so began his lifelong concern with social reform.

In 1786, Wilberforce began to play an active role in the abolitionist movement. In 1787, he wrote in his journal that God had set before him the objective of suppressing the slave trade. A group of evangelical abolitionists known as the Clapham Sect soon acknowledged Wilberforce as their leader.

In 1789, he introduced 12 different resolutions against the slave trade in the British Parliament’s House of Commons. Even though he was often supported by Pitt and by the famous Member of Parliament and philosopher Edmund Burke, Wilberforce’s measures failed to gain majority support. Wilberforce remained resolute and introduced anti-slavery bills in 1791, 1792, 1793, 1797, 1798, 1799, 1804 and 1805. All were defeated.

After the death of Pitt in 1806, Wilberforce tried once more, but this time, rather than calling for an outright ban on the slave trade, he strategically backed a bill that would make it illegal for British subjects to trade slaves with the French colonies. The bill passed, and this smaller step undermined the power of slave-ship owners, making it easier for Wilberforce to pass more significant legislation in the future.

In 1807, Wilberforce managed to pass the Slave Trade Act through both Houses of Parliament. However, the 1807 act only banned slave trading and many slaves continued to be held in bondage.

For the remainder of his life, Wilberforce campaigned for the rights of slaves and, despite failing health, he remained integral to the abolitionist movement. In 1825, Wilberforce declined a peerage and resigned his seat due to health reasons.

On July 26, 1833, the Whig government under the leadership of Earl Grey introduced a Bill for the Abolition of Slavery and formally acknowledged Wilberforce in the process. The bill would outlaw slavery in most parts of the British Empire. Having heard the happy news, Wilberforce died just three days later, on July 29, 1833.

Wilberforce’s work was integral to the outlawing of slavery throughout the British Empire, the global hegemon of the day. Thereafter, British ships and Royal Marines proceeded to extinguish slavery throughout much of the world. For the first time in human history, the suffering of millions was alleviated and the dignity of every human being affirmed.



Heroes of Progress: Ronald Ross

Ronald Ross is the man who first discovered that malaria is spread via mosquitoes. Ross’ work laid the foundation for new methods of combatting the disease, thus saving untold millions of lives.

Ronald Ross was born May 13, 1857, in Almora, British India. His father, a general in the British Indian Army, enrolled Ross at St Bartholomew’s Hospital Medical College in London in 1874. After graduating in 1879, Ross joined the Indian Medical Service and was posted across British India between 1881 and 1894.

While on leave in 1888, Ross began to study bacteriology in London. In 1894, Sir Patrick Manson, a pioneering Scottish physician who came to be known as “the father of tropical medicine,” introduced Ross to malaria research. It is said that even before his luggage cleared the customs office when he returned to India in 1895, Ross was in Bombay Civil Hospital looking for malaria patients. However, Ross’ experiments were interrupted when he was deployed to Bangalore to investigate an outbreak of cholera. In June 1896, he was transferred back to Secunderabad to continue malaria research.

In July 1897, after almost two years of failure, Ross found the malarial parasite inside the gut of a dissected mosquito. His findings were published in the British Medical Journal in 1897. In 1898, Ross showed that the malarial parasite was stored in the mosquito’s salivary glands and released through biting. He was also able to demonstrate the transmission of malaria from infected birds, via the mosquito, to healthy birds – thus establishing the complete life cycle of the malarial parasite and explaining how malaria was transmitted to humans.

Ross received the Nobel Prize in Physiology or Medicine in 1902, becoming the first Nobel laureate born outside of Europe. Soon thereafter, the Italian zoologist Giovanni Battista Grassi identified Anopheles as the genus of mosquito responsible for spreading the disease. Once scientists understood the role of mosquitoes, they were able to develop programs aimed at halting the spread of the disease.

After leaving the Indian Medical Service in 1899, Ross became a lecturer at the Liverpool School of Tropical Medicine and continued to work on the prevention of malaria in different parts of the world. He received a knighthood from King George V in 1911. Later in life, he founded the Ross Institute and Hospital for Tropical Diseases in 1926 and was its Director-in-Chief until his death in 1932.

Although much work remains to be done, Ross’ discovery has allowed for immense progress in combatting malaria. The incidence of malaria continues to fall throughout the world. The World Health Organization estimates that malaria deaths fell by more than 47 percent between 2000 and 2015. Without Ross’ work, the fight against malaria would be far less advanced, and millions of people wouldn’t be alive today.



Heroes of Progress: Alexander Fleming

Alexander Fleming is the man who first discovered penicillin. Fleming’s discovery paved the way for the invention of antibiotic drugs, which have been credited with saving over 80 million lives so far.

Alexander Fleming was born on August 6, 1881, in Ayrshire, Scotland. At the age of 13, Fleming moved to London to attend the Royal Polytechnic Institution. After inheriting some money from an uncle at the age of 21, he enrolled at St. Mary’s Hospital Medical School in London. Fleming graduated with distinction in 1906 and stayed at the medical school as a researcher in bacteriology under Sir Almroth Wright, a pioneer in vaccine therapy and immunology.

When World War One began, Fleming enrolled in the Royal Army Medical Corps. He returned to St. Mary’s to work as a lecturer in 1918. However, it would be another 10 years before his world-changing discovery.

On September 3, 1928, Fleming returned to his lab having spent August on holiday with his family. Fleming was notorious for keeping a messy lab, and upon his return he discovered that he had left out a stack of petri dishes containing cultures of staphylococci (a common bacterium found in roughly 25 percent of healthy people). Upon investigation, he noticed that one of the plates had become contaminated with a fungus and that the colonies of staphylococci surrounding the fungus had been destroyed, while those farther away were unaffected.

After some initial experiments, Fleming isolated the substance responsible for killing the bacteria. Because the mold that produced it belonged to the Penicillium genus, he named the substance penicillin.

Fleming’s further investigations revealed that penicillin was able to fight a wide range of gram-positive bacteria (a class of bacteria that lacks the protective outer membrane found in gram-negative bacteria), including those that cause diphtheria, meningitis, scarlet fever and pneumonia. Penicillin fights bacteria by attaching itself to the cell wall and interfering with the bacteria’s ability to produce new cell walls when they divide.

Fleming published his discovery in 1929 in the British Journal of Experimental Pathology. However, little attention was paid to his findings at the time. He continued with his experiments but found that cultivation of penicillin was difficult. After having grown the mold, isolating the antibiotic agent proved strenuous. Lacking the funds and manpower needed for more in-depth research, Fleming abandoned his pursuit after a series of inconclusive experiments.

During World War Two, Howard Florey and Ernst Boris Chain of Oxford University obtained a carefully preserved strain of Fleming’s Penicillium mold. Florey and Chain began large-scale research, hoping to be able to mass-produce the antibiotic.

Mass production began after the bombing of Pearl Harbor, and by D-Day in 1944, enough penicillin had been produced to treat all wounded allied troops.

In 1944, Fleming was knighted by King George VI and became “Sir Alexander Fleming.” The next year, Fleming, Florey and Chain jointly won the Nobel Prize in Physiology or Medicine for their contribution to the development of the antibiotic.

Looking back on the day of his discovery, Fleming once said, “One sometimes finds what one is not looking for. When I woke up… I certainly didn’t plan to revolutionize all medicine by discovering the world’s first antibiotic or bacteria killer. But I suppose that’s exactly what I did.”

Later in life, Fleming was decorated with numerous awards: he was an honorary member of almost all medical and scientific societies across the world, he became a “Freeman” of many boroughs and cities, and he was awarded honorary doctorates from almost thirty universities across Europe and America. He died in 1955, aged 73.

Fleming’s discovery of penicillin laid the foundation for the development of the antibiotic “wonder drug” that has been credited with saving over 80 million lives. Penicillin revolutionized the medical field and it is likely that most people reading this today have benefited from Fleming’s discovery at some point in their lives.



Heroes of Progress: Jonas Salk

Jonas Salk is the man who pioneered the world’s first effective polio vaccine.

Polio is a highly infectious viral disease that is most often transmitted by drinking water that has been contaminated with the feces of someone carrying the virus. The virus spreads easily in regions with poor sanitation. Symptoms include fever, fatigue, headache, vomiting, stiffness and pain in the limbs. Most infected patients recover, but in one out of two hundred cases, the virus attacks the nervous system, leading to irreversible paralysis. Of those paralyzed, between 5 and 10 percent die when their breathing muscles become immobilized.

Polio can spread for many months without being detected – most infections cause few or no symptoms – which makes the disease extremely difficult to monitor. According to Max Roser from Oxford University, “Up to the 19th century, populations experienced only relatively small outbreaks [of polio]. This changed around the beginning of the 20th century. Major epidemics occurred in Norway and Sweden around 1905 and later also in the United States.”

The first major outbreak of polio in the United States happened in 1916, when the disease infected 27,000 people and killed more than 7,000. The second major outbreak of polio in 20th century America happened in the 1950s. It is here that Jonas Salk enters our story.

Jonas Edward Salk was born on October 28, 1914, in New York. Salk became passionate about biochemistry and bacteriology during his time at the New York University School of Medicine. After he graduated in 1939, he started working at the prestigious Mount Sinai Hospital. Salk’s focus shifted to researching polio vaccinations in 1948, when he was head-hunted to work at the National Foundation for Infantile Paralysis – an organization that President Franklin D. Roosevelt, himself a polio sufferer, helped to set up.

After a large outbreak of polio across the United States in 1952, donations poured in to the foundation, and in the spring of 1953, Salk put forward a promising anti-polio vaccine. The foundation quickly began trials on 1.83 million children across the United States, who became known as the “polio pioneers.” The foundation received donations from two-thirds of the American population, and a poll even suggested that more Americans knew about these field trials than knew the then-president’s full name (Dwight David Eisenhower).

On April 12, 1955, Salk’s former mentor, Thomas Francis, announced that Salk’s vaccine was safe and effective in preventing polio. Just two hours later, the U.S. Public Health Service issued a production license for the vaccine and a national immunization program began.

Shortly thereafter, Dr. Albert Sabin, a Polish American medical researcher, introduced a polio vaccine that could be administered orally, making vaccination efforts less expensive because trained health workers weren’t needed to administer injections. From a record 58,000 cases in 1952, polio declined until the United States was declared polio-free in 1979.

In 1988, the Global Polio Eradication Initiative (GPEI) was founded to administer the vaccine worldwide. When the GPEI began its efforts, polio paralyzed 10 children for life every 15 minutes across 125 countries. Since 1988, more than 2.5 billion children have been immunized, and the incidence of polio has fallen by more than 99.99 percent – from 350,000 annual cases to just 22 cases across three countries in 2017. Next year, Africa is due to be declared free of polio – that is, if no new cases are found in Nigeria, the last country in the region to report new polio infections.

Following the success of his vaccine, Salk received dozens of awards, a presidential citation, four honorary degrees, half a dozen foreign decorations, and letters from thousands of thankful fellow citizens. In 1963, Salk established the Salk Institute for Biological Studies – a world-class research facility that focuses on molecular biology and genetics, neuroscience, and plant biology. Salk devoted his later years to researching a vaccine for HIV/AIDS. He died on June 23, 1995.

Salk’s work has saved hundreds of millions of people from crippling paralysis, and millions from death. Thanks to his vaccine, a disease that has plagued humanity since pharaonic Egypt is almost completely eradicated, and within a few years, the disease will (hopefully) be consigned to history.



Heroes of Progress: Landsteiner and Lewisohn

Austrian scientist Karl Landsteiner discovered the existence of different blood groups and German surgeon Richard Lewisohn developed procedures that allowed blood to be stored outside of the body without clotting. The two breakthroughs made blood transfusions far more practical and are credited with saving over 1 billion lives.

Karl Landsteiner was born in Vienna, Austria in 1868. At the age of 23, Landsteiner finished medical school and, unable to find any research jobs, he made his living by doing autopsies at “deadhouses” (i.e., morgues). In 1898, he became an assistant in the Department of Pathological Anatomy at the University of Vienna. It was there that he would make his world-changing discovery.

Before Landsteiner discovered the four different blood types (A, B, AB, and O) in 1901, the success of attempted blood transfusions was a matter of pure luck. Many patients were given an incompatible blood type and died when their bodies rejected the donor’s blood.

The discovery of different blood groups meant that blood transfusions became a safer procedure. However, one major drawback remained. Blood transfusions still used the so-called “direct method” (i.e., the donor and the recipient had to be side by side for the transfusion to take place). This is where Richard Lewisohn enters the story.

Born in Germany in 1875, Lewisohn studied medicine at the University of Freiburg. After graduating in 1906, he moved to New York to work at Mount Sinai Hospital. Lewisohn’s great challenge was to find a way to store blood outside of the body without the blood clotting. Any clotting (i.e., coagulating) would render blood useless for transfusions.

Lewisohn built on the work of the Belgian physician Albert Hustin who, in 1914, proved that sodium citrate could be added to blood as an anticoagulant. In 1915, after continuous experimentation, Lewisohn found that the optimal concentration of sodium citrate was 0.2 percent of the total mass of blood, with no more than 5 grams per transfusion. Lewisohn’s optimization allowed blood to be stored safely for two days prior to transfusion. The following year, others refined Lewisohn’s method and pushed that timeframe to 14 days.

Later in life, Landsteiner moved to New York to work for the Rockefeller Institute, and in 1930 he received the Nobel Prize in Physiology or Medicine. In 1939, Landsteiner became Professor Emeritus at the Rockefeller Institute, and died pipette in hand on June 24, 1943. Lewisohn meanwhile retired from surgery in 1937 to focus on cancer research. In 1955 he received the American Association of Blood Banks’ Karl Landsteiner Memorial Award. He died in 1961.

Landsteiner and Lewisohn’s work helped make blood transfusions far safer and more practical. Before their breakthrough people would regularly bleed to death from ulcers, accidents, and childbirth. The work of these two men led to the establishment of the worldwide system of blood banks which is credited with saving over 1 billion lives so far.



Heroes of Progress: Edward Jenner

Edward Jenner is an 18th century English physician who pioneered the smallpox vaccination - the world’s first vaccine. 

Before it was eradicated in 1979, smallpox was one of humanity’s oldest and most devastating scourges. The virus, which can be traced back to pharaonic Egypt, is thought to have killed between 300 and 500 million people in the 20th century alone.

The “speckled monster,” as smallpox was known in 18th century England, was highly contagious and left victims’ bodies covered with abscesses that caused immense scarring. If the viral infection was strong enough, the patient’s immune system collapsed, and the person died.

The mortality rate for smallpox was between 20 and 60 percent, and a third of those lucky enough to survive were left blind. Among infants, the mortality rate was around 80 percent.

Enter, Edward Jenner. 

Born in Gloucestershire in 1749, Jenner was successfully inoculated against smallpox at the age of 8. Between the ages of 14 and 21, he was apprenticed to a county surgeon in Devon. In 1770, he enrolled as a pupil at St. George’s Hospital in London.

At the hospital Jenner had a variety of interests: he studied geology, conducted experiments on human blood, built and twice launched his own hydrogen balloons, and conducted a particularly lengthy study on the cuckoo bird.

In May 1796, Jenner turned his attention to smallpox. For many years Jenner had heard stories that dairymaids were immune to smallpox because they had already contracted cowpox – a mild disease from cows that resembles smallpox – when they were children. 

Jenner found a young dairymaid named Sarah Nelms who had recently been infected with cowpox by Blossom, a cow whose hide still hangs on the wall of St. George’s medical school. Jenner extracted pus from one of Nelms’ pustules and inserted it into an 8-year-old boy named James Phipps – the son of Jenner’s gardener.

Phipps developed a mild fever, but no infection. Two months later, Jenner inoculated the boy with matter from a fresh smallpox lesion and no disease developed. Jenner concluded that the experiment had been a success and named the new procedure vaccination, from the Latin word vacca, meaning cow.

The American physician Donald Hopkins has noted, "Jenner's unique contribution was not that he inoculated a few persons with cowpox, but that he then proved that they were immune to smallpox.” 

The success of Jenner’s discovery quickly spread around Europe. Napoleon, who was at war with Britain at the time, had all his troops vaccinated, awarded Jenner a medal, and even released two English prisoners at Jenner’s request. Napoleon is cited as having said he could not “refuse anything to one of the greatest benefactors of mankind." 

Jenner made no attempt to enrich himself through his discovery. He even built a small one-room hut in his garden, where he would vaccinate the poor free of charge – he called it the “Temple of Vaccinia.” Later in life, he was appointed Physician Extraordinary to King George IV and was made mayor of Berkeley, Gloucestershire. He died on January 26, 1823, aged 73.

In 1979, the World Health Organization officially declared smallpox an eradicated disease. 

The smallpox vaccine laid the foundation for other discoveries in immunology and the amelioration of diseases such as measles (rubeola), influenza (the flu), tuberculosis, diphtheria, tetanus (lockjaw), pertussis (whooping cough), hepatitis A and B, polio, yellow fever and rotavirus.

Jenner’s work has saved untold millions of lives from a disease that has plagued humanity for millennia. 



Heroes of Progress: Fritz Haber and Carl Bosch

Two German Nobel Prize-winning scientists, Fritz Haber and Carl Bosch, created the “Haber-Bosch process,” which efficiently converts nitrogen from the air into ammonia (i.e., a compound of nitrogen and hydrogen). Ammonia is then used as a fertilizer to dramatically increase crop yields. The impact of Haber and Bosch’s work on global food production transformed the world forever.

Throughout the 19th century, farmers used guano (i.e., the accumulated excrement of seabirds and bats) as highly effective fertilizer due to its exceptionally high content of nitrogen, phosphate and potassium - nutrients that are essential for plant growth. But by the beginning of the 20th century, guano deposits started to run out, and the price of the fertilizer began to increase. If a solution to the depletion of guano hadn’t come soon, famine would have followed.

Enter, Fritz Haber. Born in 1868 in Breslau, Germany (now Wrocław, Poland), Haber began studying chemistry at the age of 18 at the University of Heidelberg. By 1894, Haber was working at the University of Karlsruhe, researching methods of synthesizing ammonia from atmospheric nitrogen. Nitrogen is very common in the atmosphere, but the chemical element is difficult to extract from the air and turn into a liquid or solid form (a process known as “fixing” nitrogen).

After thousands of experiments over almost 15 years, Haber succeeded in producing ammonia on July 3rd, 1909. That proved that commercial production was possible. However, Haber’s breakthrough occurred in a small tube, 75 centimetres tall and 13 centimetres in diameter. At the start of the 20th century, large containers that could handle the pressures and temperatures required for industrial scale production of ammonia did not yet exist. 

That is where Carl Bosch enters the story. Born in Cologne in 1874, Bosch studied metallurgy at the University of Charlottenburg in 1894, before transferring to the University of Leipzig to receive his doctorate in chemistry in 1898. Bosch met Haber in 1908 and after finding out about the latter’s breakthrough the following year, Bosch took on the challenge of developing suitable containers that could manage Haber’s process on the industrial level.

Within four years Bosch was producing ammonia in 8-meter-tall containers. The Haber-Bosch process was born. By 1913, Bosch had opened a factory that kick-started the fertilizer industry that we know today. 

The discovery of the Haber-Bosch process meant that, for the first time in human history, it became possible to produce synthetic fertilizers on a scale sufficient to sustain the Earth’s increasing population. It is nearly impossible to say how many lives this breakthrough saved, but the expansion of the world’s population from 1.6 billion in 1900 to more than 7.3 billion today “would not have been possible without the synthesis of ammonia,” notes the Czech-Canadian scientist Vaclav Smil.

After their revolutionary contribution to human progress, the two scientists worked to help Germany during World War I. Bosch focused on producing nitrates for explosives, while Haber became instrumental in developing chlorine gas as a weapon. When Adolf Hitler came to power in 1933, Haber fled Germany to take up a position at Cambridge University, and he died shortly after, in 1934. Meanwhile, in 1937, Bosch was appointed President of the Kaiser Wilhelm Society – Germany’s highest scientific position. A staunch critic of Nazi policies, Bosch was soon removed from that position and died in 1940.

Today, more than 159 million tonnes of ammonia are produced annually, and while ammonia is also used for cleaning and as a refrigerant, 88 percent of ammonia is used for fertilizer. It is estimated that if average crop yields remained at their 1900 level, the crop harvest in the year 2000 would have required nearly four times more cultivated land than was actually cultivated. That equates to an area equal to almost half of all land on ice-free continents – rather than just the 15 percent that is needed today.

Without the combined efforts of Fritz Haber and Carl Bosch, the world’s population would be much smaller than it is today. The two have truly changed the world for the better.



Heroes of Progress: Norman Borlaug

Norman Borlaug is the man commonly dubbed the “Father of the Green Revolution.”   

Norman Ernest Borlaug was an American agronomist and humanitarian born in Iowa in 1914. After receiving a PhD from the University of Minnesota in 1944, Borlaug moved to Mexico to work on agricultural development for the Rockefeller Foundation. Although Borlaug’s task force was set up to teach Mexican farmers methods of increasing food productivity, he quickly became obsessed with developing better (i.e., higher-yielding and pest- and climate-resistant) crops.

As Johan Norberg notes in his 2016 book Progress:

“After thousands of crossing of wheat, Borlaug managed to come up with a high-yield hybrid that was parasite resistant and wasn’t sensitive to daylight hours, so it could be grown in varying climates. Importantly it was a dwarf variety, since tall wheat expended a lot of energy growing inedible stalks and collapsed when it grew too quickly. The new wheat was quickly introduced all over Mexico.”

In fact, by 1963, 95 percent of Mexico’s wheat was Borlaug’s variety and Mexico’s wheat harvest grew six times larger than it had been when he first set foot in the country nineteen years earlier.

Norberg continues, “in 1963, Borlaug moved on to India and Pakistan, just as it found itself facing the threat of massive starvation. Immediately, he ordered thirty-five trucks of high-yield seeds to be driven from Mexico to Los Angeles, in order to ship them from there.” Unfortunately, Borlaug’s convoy faced problems from the start; it was held up by Mexican police, blocked at the US border due to a ban on seed imports, and was then stalled by race-riots that obstructed the LA harbor.

Eventually Borlaug’s shipment began its voyage to India, but it was far from plain sailing.

Before the seeds had reached the sub-continent, Indian state monopolies began lobbying against Borlaug’s shipment and then, once it was ashore, it was discovered that half the seeds had been killed due to over-fumigation at customs. If that wasn’t enough, Borlaug learnt that the Indian government was planning to refuse fertilizer imports as they “wanted to build up their domestic fertilizer industry.” Luckily that policy was abandoned once Borlaug famously shouted at India’s deputy Prime Minister.

Borlaug later noted, “I went to bed thinking the problem was at last solved and woke up to the news that war had broken out between India and Pakistan.” Amid the war, Borlaug and his team continued to work tirelessly planting seeds. Often the fields were within sight of artillery flashes.

Despite the late planting, yields in India rose by seventy percent in 1965. The proven success of his harvests, coupled with the fear of wartime starvation, meant that Borlaug got the go-ahead from the Pakistani and Indian governments to roll out his program on a larger scale. The following harvest was even more bountiful, and wartime famine was averted.

Both nations praised Borlaug immensely. The Pakistani Agriculture Minister took to the radio to applaud the new crop varieties, while the Indian Agriculture Minister went as far as to plough up his cricket pitch and sow it with Borlaug’s wheat. After a huge shipment of seed in 1968, the harvests in both countries boomed. It is recorded that there were not enough people, carts, trucks, or storage facilities to cope with the bountiful crop.

This extraordinary transformation of Asian agriculture in the 1960s and 1970s almost banished famine from the entire continent. By 1974, wheat harvests had tripled in India and, for the first time, the sub-continent became a net exporter of the crop. Norberg notes, “today they (India and Pakistan) produce seven times more wheat than they did in 1965. Despite a rapidly growing population, both countries are much better fed than they used to be.”

Borlaug’s wheat, and the dwarf rice varieties that followed, are credited for ushering in the Green Revolution. After the Indo-Pakistani war, Borlaug spent years working in China and later in life, Africa.

In 1970, Borlaug was awarded the Nobel Peace Prize for his accomplishments. He is one of only seven people to have received the Congressional Gold Medal, the Presidential Medal of Freedom, and the Nobel Peace Prize. It is said that he was particularly satisfied when the people of Sonora, Mexico, where he did some of his first experiments, named a street after him.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.