Innovation

Heroes of Progress: Maurice Hilleman

Maurice Hilleman was an American microbiologist who developed over 40 lifesaving vaccines. Of the fourteen vaccines recommended in the current vaccine schedules, Hilleman developed eight. He is credited with saving more lives than any other medical scientist of the 20th century.

Hilleman was born August 30, 1919, in Montana. His mother died two days after he was born. Following her death, his father was faced with the prospect of raising eight children alone, so his childless aunt and uncle agreed to raise Maurice on their nearby chicken farm. Hilleman attributed much of his later success to his boyhood work on the farm – beginning in the 1930s, chicken eggs were used to grow viruses for vaccines.

Due to a lack of funds, Hilleman almost didn’t make it to college. Thankfully, his eldest brother interceded and loaned him the money to pay the tuition fees. Hilleman graduated first in his class from Montana State University in 1941 and won a fellowship to do postgraduate study in microbiology at the University of Chicago. He received his doctoral degree in 1944.

Upon graduation, Hilleman joined E. R. Squibb & Sons, working in the company’s virus lab in New Jersey. Soon after he started working in the lab, Hilleman successfully developed a vaccine for Japanese B encephalitis. The infection, which is endemic to Asia and the western Pacific, had begun to spread to American troops fighting in the Pacific during World War II.

In 1948, Hilleman began working as the Chief of the Department of Respiratory Diseases at the Army Medical Center in Silver Spring, Maryland. In 1957, Hilleman discovered the first signs of an impending flu pandemic that was spreading in Hong Kong. Hilleman and his colleagues raced to produce a vaccine, and he oversaw the production of over 40 million doses of the vaccine that were immediately distributed across the U.S.A.

Although 69,000 Americans died after catching the virus, the pandemic could have caused millions of deaths had it not been for Hilleman’s efforts. In recognition of his work, the American military awarded Hilleman the Distinguished Service Medal.

In 1963, whilst working at Merck & Co (one of the world’s largest pharmaceutical companies), Hilleman’s daughter, Jeryl Lynn, became ill with the mumps. Hilleman quickly drove to his lab to pick up the necessary equipment so that he could cultivate material from his daughter’s infection.

In 1967, the original sample taken from Jeryl Lynn’s throat became the basis for the newly approved mumps vaccine. It came to be known as the "Jeryl Lynn Strain." Hilleman later combined his mumps vaccine with the measles and rubella vaccines – which he had also developed – to create the MMR vaccine.

Apart from the vaccines mentioned above, Hilleman also developed vaccines for hepatitis A, hepatitis B, chickenpox, meningitis, pneumonia and Haemophilus influenzae type B. He also played a part in discovering the cold-producing adenoviruses, the hepatitis viruses, and the cancer-causing SV40 virus.

In 1984, at the mandatory retirement age of 65, Hilleman resigned as Senior Vice President of Merck Research Labs. Not satisfied with retirement, he began directing the newly created Merck Institute for Vaccinology only a few months later. Hilleman continued working at the Institute for Vaccinology until his death in 2005, at the age of 85.

Throughout his life, Hilleman received a stream of awards, including the National Medal of Science (the United States’ highest scientific honor), and the lifetime achievement award from the World Health Organization. Hilleman is often described as the most successful vaccinologist in history.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Francoise Barre-Sinoussi

Francoise Barre-Sinoussi is a French virologist who discovered that the human immunodeficiency virus (HIV) is the cause of acquired immune deficiency syndrome (AIDS). Barre-Sinoussi’s discovery has led to the development of medical treatments that slow the progression of HIV and decrease the risk of its transmission.

Barre-Sinoussi was born July 30, 1947, in Paris, France. From an early age, Barre-Sinoussi showed an interest in science and decided to pursue her passion for knowledge at the University of Paris. Initially, Barre-Sinoussi wanted to study medicine but, coming from a humble background, she decided to be pragmatic and study natural science instead. The natural science course was shorter than a medical degree, saving her family money in tuition and boarding fees.

After studying at the University of Paris for a couple of years, Barre-Sinoussi began working part-time at the Pasteur Institute – a Parisian research center focused on the study of biology, diseases and vaccines. She soon began working full-time at the institute, returning to the university only to take her exams. Barre-Sinoussi received her PhD in 1975 and, after a brief internship in the United States, began working on a group of viruses known as retroviruses.

During the 1980s AIDS epidemic, scientists were perplexed as to what was causing the outbreak of the disease, and Barre-Sinoussi decided to use her knowledge of retroviruses to search for the cause. In 1983, Barre-Sinoussi and her colleague Luc Montagnier made the groundbreaking discovery that HIV is the cause of AIDS.

Barre-Sinoussi’s discovery led to many medical breakthroughs that have helped in the fight against AIDS, including numerous HIV testing and diagnosis technologies, and lifesaving antiretroviral therapy.

In 1988, Barre-Sinoussi took charge of her own laboratory at the Pasteur Institute and began intensive research aimed at creating an HIV vaccine. Although no vaccine has yet been discovered, her team continues to research different mechanisms to protect people against HIV infection.

In 2008, Barre-Sinoussi and Montagnier were awarded the Nobel Prize in Physiology or Medicine for their discovery of HIV. Barre-Sinoussi has received a number of other awards and honorary doctorates. In 2006, she was named Grand Officier de la Légion d’Honneur – France’s highest order of merit. Between 2012 and 2016, Barre-Sinoussi served as President of the International AIDS Society. She retired from active research in 2017.

As HumanProgress.org has noted before, thanks to the discovery of HIV and the creation of different treatments, humanity is now winning the war on AIDS. Since the peak of the HIV pandemic in the mid-2000s, when some 1.9 million people died of AIDS each year, the death toll has fallen to fewer than one million in 2017. New infections are also down: in the mid-1990s, there were 3.4 million new HIV infections each year, but in 2017 there were only 1.8 million – a decline of 47 percent.

Without the contributions of Francoise Barre-Sinoussi, humanity’s crusade against AIDS would not be as advanced or successful as it is today, and millions more people would be dying from the virus each year.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Richard Cobden

Richard Cobden was a 19th century British politician and textile manufacturer. Cobden’s work turned Britain, the global hegemon at the time, into a free trading nation – an act that set in motion global trade liberalization that has lifted millions of people out of poverty.

Richard Cobden was born June 3, 1804, in rural Sussex, England. He was the son of a poor farmer and spent his early years in abject poverty. Cobden received little formal education and, at the age of 14, he became a clerk in a textile factory. In 1828, Cobden and two other young men started a company selling calico prints in London. The business was an immediate success and within a few years he was living an affluent life in Manchester.

In 1833, the now-prosperous Cobden began travelling the world. He visited much of Europe, the United States, and the Middle East. While on his travels in 1835, Cobden wrote an influential pamphlet titled England, Ireland and America, in which he advocated a new approach to foreign policy based on free trade, peace and non-interventionism.

Cobden returned to England in 1839 to advocate for the repeal of the Corn Laws. Enacted in 1815, the Corn Laws were tariffs on food and grain imported into Britain. They kept grain prices artificially high to favor domestic producers. Cobden argued that these laws raised the price of food and the cost of living for the British public and hampered the growth of other economic sectors.

In March 1839, Cobden created the Anti-Corn Law League – an organization advocating in favor of the repeal. Cobden, with the support of the talented orator John Bright, spoke to audiences across the country. He presented a petition to Parliament urging the end of protectionism. After it was rejected, Cobden realized that petitions would achieve little. It was direct political action that was needed.

In 1841, Cobden became a Member of Parliament for Stockport. The economic hardship associated with the recession that lasted from 1840 to 1842 pushed more people in favor of free trade, and the Corn Laws were eventually repealed in 1846.

Prime Minister Robert Peel acknowledged Cobden as the man responsible for enabling those who lived in extreme poverty to access cheaper foodstuffs from abroad. Moreover, the repeal of the Corn Laws forced many of Britain’s colonies to embrace free trade.

In 1859, with tensions between Britain and France running high, Michel Chevalier, a French statesman, urged Cobden to persuade the French Emperor Napoleon III of the benefits of free trade. Cobden, with the blessing of the Chancellor of the Exchequer William Gladstone, met with the Emperor to discuss a potential Anglo-French free trade deal.

The Emperor was receptive to Cobden’s arguments and, on January 23, 1860, Britain and France signed the Cobden-Chevalier Treaty. Princeton University economist Gene Grossman described the treaty as the “first modern trade agreement.” Cobden died in London on April 2, 1865.

The repeal of the Corn Laws marked a fundamental shift of the British Empire toward free trade. That policy alleviated the hunger and suffering of millions of people and set a precedent for the free trade treaties that followed. Cobden’s influence on the creation of the Cobden-Chevalier Treaty laid the foundation for modern trade agreements that continue to shape and enrich the world today.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: William Wilberforce

William Wilberforce was a leading 18th century British abolitionist and politician. Wilberforce’s efforts helped to ban the slave trade in 1807 and abolish slavery in the British Empire in 1833, thus freeing millions of enslaved people.

Wilberforce was born on August 24, 1759, in Kingston upon Hull, England. His father was a wealthy merchant and, at the age of 17, Wilberforce began studying at Cambridge University. The death of his grandfather and uncle left Wilberforce independently wealthy and, while at Cambridge, he lived a relatively carefree life. He was well-known within the university’s social scene and became friends with William Pitt the Younger, who later became Prime Minister.

After graduating in 1780, Wilberforce decided to seek political office and, at the age of 21, he became the Member of Parliament for Hull. He was independent of any political party, stating he was a “no party man.” In his first four years in parliament, Wilberforce admitted he “did nothing to any purpose. My own distinction was my own darling object.” Similar to his university days, Wilberforce was known in many social circles and was fond of drinking and gambling.

In 1785, Wilberforce travelled to Europe with his sister and mother for a vacation. During his time abroad, he read The Rise and Progress of Religion in the Soul. The book had a profound impact on Wilberforce’s life. He embraced evangelical Christianity, lost interest in card games and drinking, began to get up early to read the Bible, and decided to commit his future life to the service of God.

Thereafter his political views were guided by his faith and his desire to promote Christian ethics. And so began his lifelong concern with social reform.

In 1786, Wilberforce began to play an active role in the abolitionist movement. In 1787, he wrote in his journal that God had set before him the objective of suppressing the slave trade. A group of evangelical abolitionists known as the Clapham Sect soon acknowledged Wilberforce as their leader.

In 1789, he introduced 12 resolutions against the slave trade in the British Parliament’s House of Commons. Even though he was often supported by Pitt and by the famous Member of Parliament and philosopher Edmund Burke, Wilberforce’s measures failed to gain majority support. Wilberforce remained resilient and introduced anti-slave-trade bills in 1791, 1792, 1793, 1797, 1798, 1799, 1804 and 1805. All were defeated.

After the death of Pitt in 1806, Wilberforce tried once more but, rather than calling for an outright ban on the slave trade, he strategically pushed a bill that would make it illegal for British traders to sell slaves to the French colonies. The bill passed, and this smaller step undermined and weakened the power of the slave ship owners, making it easier for Wilberforce to pass more significant legislation in the future.

In 1807, Wilberforce managed to get the Slave Trade Act passed through both Houses of Parliament. However, the 1807 act banned only the trading of slaves; slavery itself remained legal, and many people continued to be held in bondage.

For the remainder of his life, Wilberforce campaigned for the rights of slaves and, despite failing health, he remained integral to the abolitionist movement. In 1825, Wilberforce declined a peerage and resigned his seat due to health reasons.

On July 26, 1833, the Whig government under the leadership of Earl Grey introduced a Bill for the Abolition of Slavery and formally acknowledged Wilberforce in the process. The bill would outlaw slavery in most parts of the British Empire. Wilberforce died just three days after hearing the happy news, on July 29, 1833.

Wilberforce’s work was integral to the outlawing of slavery throughout the British Empire, the global hegemon of the day. Thereafter, British ships and Royal Marines proceeded to extinguish the slave trade throughout much of the world. For the first time in human history, the suffering of millions was alleviated and the dignity of every human being affirmed.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Ronald Ross

Ronald Ross is the man who first discovered that malaria is spread via mosquitoes. Ross’ work laid the foundation for new methods of combatting the disease, thus saving untold millions of lives.

Ronald Ross was born May 13, 1857, in Almora, British India. His father, who was a general in the British Indian Army, enrolled Ross at St Bartholomew’s Hospital Medical College in London in 1874. After graduating in 1879, Ross joined the Indian Medical Service and was posted across the Indian subcontinent between 1881 and 1894.

While on leave in 1888, Ross began to study bacteriology in London. In 1894, Sir Patrick Manson, a pioneering Scottish physician who came to be known as “the father of tropical medicine,” introduced Ross to malaria research.  It is said that even before his luggage cleared the customs office when he returned to India in 1895, Ross was in Bombay Civil Hospital looking for malaria patients. However, Ross’ experiments were interrupted when he was deployed to Bangalore to investigate an outbreak of cholera. In June 1896, he was transferred back to Secunderabad to continue malaria research.

In July 1897, after almost two years of failure, Ross found the malarial parasite inside the gut of a dissected mosquito. His findings were published in the British Medical Journal in 1897. In 1898, Ross found that the malarial parasite was stored in the mosquito’s salivary glands and released through biting. He was also able to demonstrate the transmission of malaria from infected birds, via the mosquito, to healthy birds – thus establishing the complete life cycle of the malarial parasite and explaining how malaria was transmitted to humans.

Ross received the Nobel Prize in Physiology or Medicine in 1902, becoming the first Nobel laureate born outside of Europe. Soon thereafter, the Italian zoologist Giovanni Battista Grassi identified Anopheles as the genus of mosquito responsible for spreading the disease. Once scientists understood the role of mosquitoes, they were able to develop programs aimed at halting the spread of the disease.

After leaving the Indian Medical Service in 1899, Ross became a lecturer at the Liverpool School of Tropical Medicine and continued to work on the prevention of malaria in different parts of the world. He received a knighthood from King George V in 1911. Later in life, he founded the Ross Institute and Hospital for Tropical Diseases in 1926 and was its Director-in-Chief until his death in 1932.

Although much work remains to be done, Ross’ discovery has allowed for immense progress in combatting malaria. The incidence of malaria continues to fall throughout the world. The World Health Organization estimates that malaria deaths fell by more than 47 percent between 2000 and 2015. Without Ross’ work, the fight against malaria would be far less advanced, and millions of people wouldn’t be alive today.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Alexander Fleming

Alexander Fleming is the man who first discovered penicillin. Fleming’s discovery paved the way for the invention of antibiotic drugs, which have been credited with saving over 80 million lives so far.

Alexander Fleming was born on August 6, 1881, in Ayrshire, Scotland. At the age of 13, Fleming moved to London to attend the Royal Polytechnic Institution. After inheriting some money from an uncle at the age of 21, he enrolled at St. Mary’s Hospital Medical School in London. Fleming graduated with distinction in 1906 and stayed at the medical school as a researcher in bacteriology under Sir Almroth Wright – a pioneer in vaccine therapy and immunology.

When World War One began, Fleming joined the Royal Army Medical Corps. He returned to St. Mary’s to work as a lecturer in 1918. However, it would be another 10 years before his world-changing discovery.

On September 3, 1928, Fleming returned to his lab having spent August on holiday with his family. Fleming was notorious for keeping a messy lab, and upon his return he discovered that he had left out a stack of petri dishes containing staphylococci (a common bacterium carried by around 25 percent of healthy people). Upon investigation, he noticed that one of the plates had been contaminated with a fungus and that the bacteria surrounding the mold had been destroyed – a clear ring had formed where no bacteria would grow.

After some initial experiments, Fleming isolated the substance produced by the mold that was inhibiting bacterial growth. He identified the mold as a member of the genus Penicillium, so he named the substance penicillin.

Fleming’s further investigations revealed that penicillin was able to fight gram-positive bacteria (a class of bacteria distinguished by the structure of their cell walls), including those that cause diphtheria, meningitis, scarlet fever and pneumonia. Penicillin fights bacteria by binding to their cell walls and interfering with their ability to build new cell walls as they divide.

Fleming published his discovery in 1929 in the British Journal of Experimental Pathology. However, little attention was paid to his findings at the time. He continued with his experiments but found that cultivation of penicillin was difficult. After having grown the mold, isolating the antibiotic agent proved strenuous. Lacking the funds and manpower needed for more in-depth research, Fleming abandoned his pursuit after a series of inconclusive experiments.

During World War Two, Howard Florey and Ernst Boris Chain of Oxford University obtained a carefully preserved strain of Fleming’s Penicillium mold. Florey and Chain began large-scale research, hoping to be able to mass-produce the antibiotic.

Mass production began after the bombing of Pearl Harbor, and by D-Day in 1944, enough penicillin had been produced to treat all wounded allied troops.

In 1944, Fleming was knighted by King George VI and became “Sir Alexander Fleming.” The next year, Fleming, Florey and Chain jointly won the Nobel Prize in Physiology or Medicine for their contributions to developing the antibiotic.

Looking back on the day of his discovery, Fleming once said, “One sometimes finds what one is not looking for. When I woke up… I certainly didn’t plan to revolutionize all medicine by discovering the world’s first antibiotic or bacteria killer. But I suppose that’s exactly what I did.”

Later in life, Fleming was decorated with numerous awards: he was an honorary member of almost all medical and scientific societies across the world, became a “Freeman” of many boroughs and cities, and was awarded honorary doctorates from almost thirty universities across Europe and America. He died in 1955, aged 73.

Fleming’s discovery of penicillin laid the foundation for the development of the antibiotic “wonder drug” that has been credited with saving over 80 million lives. Penicillin revolutionized the medical field and it is likely that most people reading this today have benefited from Fleming’s discovery at some point in their lives.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Jonas Salk

Jonas Salk is the man who pioneered the world’s first effective polio vaccine.

Polio is a highly infectious viral disease that is most often transmitted by drinking water contaminated with the feces of someone carrying the virus. The virus spreads easily in regions with poor sanitation. Symptoms include fever, fatigue, headache, vomiting, and stiffness and pain in the limbs. Most infected patients recover, but in one out of two hundred cases the virus attacks the nervous system, leading to irreversible paralysis. Of those paralyzed, between 5 and 10 percent die when their breathing muscles become immobilized.

Polio has a relatively long incubation period – it can spread for many months without being detected – making it extremely difficult to monitor. According to Max Roser of Oxford University, “Up to the 19th century, populations experienced only relatively small outbreaks [of polio]. This changed around the beginning of the 20th century. Major epidemics occurred in Norway and Sweden around 1905 and later also in the United States.”

The first major outbreak of polio in the United States happened in 1916, when the disease infected 27,000 people and killed more than 7,000. The second major outbreak of polio in 20th century America happened in the 1950s. It is here that Jonas Salk enters our story.

Jonas Edward Salk was born on October 28, 1914, in New York. Salk became passionate about biochemistry and bacteriology during his time at the New York University School of Medicine. After he graduated in 1939, he started working at the prestigious Mount Sinai Hospital. Salk’s focus shifted to researching polio vaccinations in 1948, when he was head-hunted to work at the National Foundation for Infantile Paralysis – an organization that President Franklin D. Roosevelt, himself a polio sufferer, helped to set up.

After a large outbreak of polio across the United States in 1952, donations began pouring into the foundation, and in the spring of 1953 Salk put forward a promising anti-polio vaccine. The foundation quickly began trials on 1.83 million children across the United States. These children became known as the “polio pioneers.” The foundation received donations from two-thirds of the American population, and a poll even suggested that more Americans knew about these field trials than knew the then-president’s full name (Dwight David Eisenhower).

On April 12, 1955, Salk’s former mentor, Thomas Francis, announced that Salk’s vaccine was safe and effective in preventing polio. Just two hours later, the U.S. Public Health Service issued a production license for the vaccine, and a national immunization program began.

Shortly thereafter, Dr. Albert Sabin, a Polish American medical researcher working at the National Institutes of Health, introduced a polio vaccine that could be administered orally, thereby making vaccination efforts less expensive as trained health workers weren’t needed to administer injections. From a record 58,000 cases in 1952, the United States was declared polio free in 1979.

In 1988, the Global Polio Eradication Initiative (GPEI) was founded to administer the vaccine worldwide. When the GPEI began its efforts, polio paralyzed 10 children for life every 15 minutes across 125 countries. Since 1988, more than 2.5 billion children have been immunized, and the incidence of polio has fallen by more than 99.99 percent – from 350,000 annual cases to just 22 new cases across three countries in 2017. Next year, Africa is due to be declared free of polio – that is, if no new cases are found in Nigeria, the last country in the region to report new polio infections.

Following the success of the vaccine, Salk received dozens of awards, a presidential citation, four honorary degrees, half a dozen foreign decorations, and letters from thousands of thankful fellow citizens. In 1963, Salk established the Salk Institute for Biological Studies – a world-class research facility that focuses on molecular biology and genetics, neurosciences, and plant biology. Salk devoted his later years to researching a vaccine for HIV/AIDS. He died on June 23, 1995.

Salk’s work has saved hundreds of millions of people from crippling paralysis, and millions from death. Thanks to his vaccine, a disease that has plagued humanity since pharaonic Egypt is almost completely eradicated, and within a few years, the disease will (hopefully) be consigned to history.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Landsteiner and Lewisohn

Austrian scientist Karl Landsteiner discovered the existence of different blood groups and German surgeon Richard Lewisohn developed procedures that allowed blood to be stored outside of the body without clotting. The two breakthroughs made blood transfusions far more practical and are credited with saving over 1 billion lives.

Karl Landsteiner was born in Vienna, Austria in 1868. At the age of 23, Landsteiner finished medical school and, unable to find any research jobs, he made his living by doing autopsies at “deadhouses” (i.e., morgues). In 1898, he became an assistant in the Department of Pathological Anatomy at the University of Vienna. It was there that he would make his world-changing discovery.

Before Landsteiner discovered the four different blood types (A, B, AB, and O) in 1901, the success of attempted blood transfusions was a matter of pure luck. Many patients were given an incompatible blood type and died when their bodies rejected the donor’s blood.
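
To make concrete why transfusion before Landsteiner was a gamble, here is a minimal illustrative sketch in Python of the ABO compatibility rule that his discovery made possible. The equally weighted pairing count is an assumption for illustration only; real blood-type frequencies vary by population, and the Rh factor (also later discovered by Landsteiner) is ignored.

    # ABO rule: a transfusion is safe when the donor's blood carries no
    # antigens that the recipient's antibodies would attack.
    ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

    def compatible(donor: str, recipient: str) -> bool:
        # Safe if the donor's antigens are a subset of the recipient's.
        return ANTIGENS[donor] <= ANTIGENS[recipient]

    pairs = [(d, r) for d in ANTIGENS for r in ANTIGENS]
    ok = sum(compatible(d, r) for d, r in pairs)
    print(f"{ok} of {len(pairs)} equally weighted pairings are compatible")
    # -> 9 of 16: picking a donor blindly carried a substantial chance
    #    of a fatal mismatch on every attempt.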

The discovery of different blood groups meant that blood transfusion became a safer procedure. However, one major drawback remained: transfusions still used the so-called “direct method” (i.e., the donor and the recipient had to be side by side for the transfusion to take place). This is where Richard Lewisohn enters the story.

Born in Germany in 1875, Lewisohn studied medicine at the University of Freiburg. After graduating in 1906, he moved to New York to work at Mount Sinai Hospital. Lewisohn’s great challenge was to find a way to store blood outside of the body without the blood clotting. Any clotting (i.e., coagulating) would render blood useless for transfusions.

Lewisohn built on the work of the Belgian physician Albert Hustin who, in 1914, showed that sodium citrate could be added to blood as an anticoagulant. In 1915, after continuous experimentation, Lewisohn found that the optimal concentration of sodium citrate was 0.2 percent of the total mass of the blood, not exceeding 5 grams per transfusion. Lewisohn’s optimization allowed blood to be stored safely for two days prior to transfusion. The following year, others refined Lewisohn’s method and pushed that timeframe to 14 days.
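
As a worked example of Lewisohn’s dosing arithmetic, here is a minimal sketch in Python. The blood density figure is an assumption added purely for illustration, and this is historical arithmetic, not medical guidance.

    # Lewisohn's 1915 finding: citrate at 0.2 percent of the blood's
    # mass, capped at 5 grams per transfusion.
    BLOOD_DENSITY_G_PER_ML = 1.06  # assumed approximate density of whole blood

    def citrate_grams(volume_ml: float) -> float:
        mass_g = volume_ml * BLOOD_DENSITY_G_PER_ML
        return min(0.002 * mass_g, 5.0)

    print(citrate_grams(500))   # ~1.06 g for a 500 ml transfusion
    print(citrate_grams(3000))  # hits the 5 g cap for very large volumes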

Later in life, Landsteiner moved to New York to work for the Rockefeller Institute, and in 1930 he received the Nobel Prize in Physiology or Medicine. In 1939, Landsteiner became Professor Emeritus at the Rockefeller Institute, and died pipette in hand on June 24, 1943. Lewisohn meanwhile retired from surgery in 1937 to focus on cancer research. In 1955 he received the American Association of Blood Banks’ Karl Landsteiner Memorial Award. He died in 1961.

Landsteiner and Lewisohn’s work helped make blood transfusions far safer and more practical. Before their breakthroughs, people would regularly bleed to death from ulcers, accidents, and childbirth. The work of these two men led to the establishment of the worldwide system of blood banks, which is credited with saving over 1 billion lives so far.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Edward Jenner

Edward Jenner was an 18th century English physician who pioneered the smallpox vaccination – the world’s first vaccine.

Before it was eradicated in 1979, smallpox was one of humanity’s oldest and most devastating scourges. The virus, which can be traced back to pharaonic Egypt, is thought to have killed between 300 and 500 million people in the 20th century alone.

The “speckled monster,” as it was known in 18th century England, was highly contagious and left the victim’s body covered with abscesses that caused immense scarring. If the viral infection was strong enough, the patient’s immune system collapsed, and the person died.

The mortality rate for smallpox was between 20 and 60 percent and, of those lucky enough to survive, a third were left blind. Among infants, the mortality rate was 80 percent.

Enter, Edward Jenner. 

Born in Gloucestershire in 1749, Jenner was successfully inoculated against smallpox at the age of 8. Between the ages of 14 and 21, he was apprenticed to a country surgeon in Devon. In 1770, he enrolled as a pupil at St. George’s Hospital in London.

At the hospital Jenner had a variety of interests: he studied geology, conducted experiments on human blood, built and twice launched his own hydrogen balloons, and conducted a particularly lengthy study on the cuckoo bird.

In May 1796, Jenner turned his attention to smallpox. For many years Jenner had heard stories that dairymaids were immune to smallpox because they had already contracted cowpox – a mild disease from cows that resembles smallpox – when they were children. 

Jenner found a young dairymaid by the name of Sarah Nelms, who had recently been infected with cowpox by Blossom, a cow whose hide still hangs on the wall of St. George’s medical school. Jenner extracted pus from one of Nelms’ pustules and inserted it into an 8-year-old boy named James Phipps – the son of Jenner’s gardener.

Phipps developed a mild fever but no infection. Two months later, Jenner inoculated the boy with material from a fresh smallpox lesion, and no disease developed. Jenner concluded that the experiment had been a success, and he named the new procedure “vaccination,” from the Latin word vacca, meaning cow.

The American physician Donald Hopkins has noted, "Jenner's unique contribution was not that he inoculated a few persons with cowpox, but that he then proved that they were immune to smallpox.” 

The success of Jenner’s discovery quickly spread around Europe. Napoleon, who was at war with Britain at the time, had all his troops vaccinated, awarded Jenner a medal, and even released two English prisoners at Jenner’s request. Napoleon is cited as having said he could not “refuse anything to one of the greatest benefactors of mankind." 

Jenner made no attempt to enrich himself through his discovery; he even built a small one-room hut in his garden, where he would vaccinate the poor free of charge – he called it the “Temple of Vaccinia.” Later in life, he was appointed Physician Extraordinary to King George IV and was made mayor of Berkeley, Gloucestershire. He died on January 26, 1823, aged 73.

In 1979, the World Health Organization officially declared smallpox an eradicated disease. 

The smallpox vaccine laid the foundation for other discoveries in immunology and the amelioration of diseases such as measles (rubeola), influenza (the flu), tuberculosis, diphtheria, tetanus (lockjaw), pertussis (whooping cough), hepatitis A and B, polio, yellow fever and rotavirus.

Jenner’s work has saved untold millions of lives from a disease that has plagued humanity for millennia. 


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Fritz Haber and Carl Bosch

Two German Nobel Prize-winning scientists, Fritz Haber and Carl Bosch, created the “Haber-Bosch process,” which efficiently converts nitrogen from the air into ammonia (i.e., a compound of nitrogen and hydrogen). Ammonia is then used as a fertilizer to dramatically increase crop yields. The impact of Haber and Bosch’s work on global food production transformed the world forever.

Throughout the 19th century, farmers used guano (i.e., the accumulated excrement of seabirds and bats) as highly effective fertilizer due to its exceptionally high content of nitrogen, phosphate and potassium - nutrients that are essential for plant growth. But by the beginning of the 20th century, guano deposits started to run out, and the price of the fertilizer began to increase. If a solution to the depletion of guano hadn’t come soon, famine would have followed.

Enter, Fritz Haber. Born in 1868 in Breslau, Germany (now Wrocław, Poland), Haber began studying chemistry at the age of 18 at the University of Heidelberg. By 1894, Haber was working at the University of Karlsruhe, researching methods of fixing nitrogen. Nitrogen is very common in the atmosphere, but the chemical element is difficult to extract from the air and turn into a liquid or solid form (a process known as “fixing” nitrogen).

After thousands of experiments over almost 15 years, Haber succeeded in producing ammonia on July 3, 1909, proving that commercial production was possible. However, Haber’s breakthrough occurred in a small tube, 75 centimeters tall and 13 centimeters in diameter. At the start of the 20th century, large containers that could handle the pressures and temperatures required for industrial-scale production of ammonia did not yet exist.
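
For reference, the underlying chemistry is the reversible synthesis of ammonia from nitrogen and hydrogen, run over a metal catalyst at high temperature and pressure (industrial plants later standardized on an iron-based catalyst):

    $\mathrm{N_2 + 3H_2 \rightleftharpoons 2NH_3}$

It is the punishing pressures this equilibrium demands that made the industrial-scale vessels, described next, so difficult to engineer.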

That is where Carl Bosch enters the story. Born in Cologne in 1874, Bosch studied metallurgy at the University of Charlottenburg in 1894, before transferring to the University of Leipzig to receive his doctorate in chemistry in 1898. Bosch met Haber in 1908 and after finding out about the latter’s breakthrough the following year, Bosch took on the challenge of developing suitable containers that could manage Haber’s process on the industrial level.

Within four years Bosch was producing ammonia in 8-meter-tall containers. The Haber-Bosch process was born. By 1913, Bosch had opened a factory that kick-started the fertilizer industry that we know today. 

The discovery of the Haber-Bosch process meant that, for the first time in human history, it became possible to produce synthetic fertilizers that could be used on enough crops to sustain the Earth’s increasing population. It is nearly impossible to say how many lives this breakthrough saved, but the expansion of the world’s population from 1.6 billion in 1900 to more than 7.3 billion today “would not have been possible without the synthesis of ammonia,” notes the Czech scientist Vaclav Smil.

After their revolutionary contribution to human progress, the two scientists worked to help Germany during World War I. Bosch focused on bomb making, while Haber became instrumental in developing chlorine gas. When Adolf Hitler came to power in 1933, Haber fled Germany to teach at Cambridge University; he died shortly thereafter, in 1934. Meanwhile, in 1937, Bosch was appointed President of the Kaiser Wilhelm Society – Germany’s highest scientific position. A staunch critic of Nazi policies, Bosch was soon removed from that position and died in 1940.

Today, more than 159 million tonnes of ammonia are produced annually and, while ammonia is also used for cleaning and as a refrigerant, 88 percent of it is used as fertilizer. It is estimated that if average crop yields had remained at their 1900 level, the crop harvest in the year 2000 would have required nearly four times more cultivated land than was actually cultivated. That equates to an area equal to almost half of all land on ice-free continents – rather than just the 15 percent that is needed today.

Without the combined efforts of Fritz Haber and Carl Bosch, the world’s population would be much smaller than it is today. The two have truly changed the world for the better.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Heroes of Progress: Norman Borlaug

Norman Borlaug is the man commonly dubbed the “Father of the Green Revolution.”   

Norman Ernest Borlaug was an American agronomist and humanitarian born in Iowa in 1914. After receiving a PhD from the University of Minnesota in 1944, Borlaug moved to Mexico to work on agricultural development for the Rockefeller Foundation. Although Borlaug’s taskforce was set up to teach Mexican farmers methods to increase food productivity, he quickly became obsessed with developing better (i.e., higher-yielding and pest- and climate-resistant) crops.

As Johan Norberg notes in his 2016 book Progress:

“After thousands of crossings of wheat, Borlaug managed to come up with a high-yield hybrid that was parasite resistant and wasn’t sensitive to daylight hours, so it could be grown in varying climates. Importantly, it was a dwarf variety, since tall wheat expended a lot of energy growing inedible stalks and collapsed when it grew too quickly. The new wheat was quickly introduced all over Mexico.”

In fact, by 1963, 95 percent of Mexico’s wheat was Borlaug’s variety and Mexico’s wheat harvest grew six times larger than it had been when he first set foot in the country nineteen years earlier.

Norberg continues, “in 1963, Borlaug moved on to India and Pakistan, just as it found itself facing the threat of massive starvation. Immediately, he ordered thirty-five trucks of high-yield seeds to be driven from Mexico to Los Angeles, in order to ship them from there.” Unfortunately, Borlaug’s convoy faced problems from the start; it was held up by Mexican police, blocked at the US border due to a ban on seed imports, and was then stalled by race-riots that obstructed the LA harbor.

Eventually Borlaug’s shipment began its voyage to India, but it was far from plain sailing.

Before the seeds had reached the sub-continent, Indian state monopolies began lobbying against Borlaug’s shipment and then, once it was ashore, it was discovered that half the seeds had been killed by over-fumigation at customs. If that wasn’t enough, Borlaug learnt that the Indian government was planning to refuse fertilizer imports because it “wanted to build up their domestic fertilizer industry.” Luckily, that policy was abandoned once Borlaug famously shouted at India’s deputy Prime Minister.

Borlaug later noted, “I went to bed thinking the problem was at last solved and woke up to the news that war had broken out between India and Pakistan.” Amid the war, Borlaug and his team continued to work tirelessly planting seeds. Often the fields were within sight of artillery flashes.

Despite the late planting, yields in India rose by seventy percent in 1965. The proven success of his harvests, coupled with the fear of wartime starvation, meant that Borlaug got the go-ahead from the Pakistani and Indian governments to roll out his program on a larger scale. The following harvest was even more bountiful, and wartime famine was averted.

Both nations praised Borlaug immensely. The Pakistani Agriculture Minister took to the radio applauding the new crop varieties, while the Indian Agriculture Minister went as far as to plough his cricket pitch with Borlaug’s wheat.  After a huge shipment of seed in 1968, the harvest in both countries boomed. It is recorded that there were not enough people, carts, trucks, or storage facilities to cope with the bountiful crop.

This extraordinary transformation of Asian agriculture in the 1960s and 1970s almost banished famine from the entire continent. By 1974, wheat harvests had tripled in India and, for the first time, the sub-continent became a net exporter of the crop. Norberg notes, “today they (India and Pakistan) produce seven times more wheat than they did in 1965. Despite a rapidly growing population, both countries are much better fed than they used to be.”

Borlaug’s wheat, and the dwarf rice varieties that followed, are credited with ushering in the Green Revolution. After the Indo-Pakistani war, Borlaug spent years working in China and, later in life, in Africa.

In 1970, Borlaug was awarded the Nobel Peace Prize for his accomplishments. He is one of only seven people to have received the Congressional Gold Medal and the Presidential Medal of Freedom in addition to the Nobel Peace Prize. It is said that he was particularly satisfied when the people of Sonora, Mexico, where he did some of his first experiments, named a street after him.


About the Author

Alexander C. R. Hammond is a researcher at a Washington DC think tank.

Desalination Shows Tech Can Solve Environmental Problems

Since he published his bestselling book, Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, Steven Pinker has been criticized for excessive optimism. Jeremy Lent, for example, argues that Pinker is insufficiently concerned about depletion of the planet’s natural resources, including freshwater reserves. He faults the Harvard University psychologist for embracing a “neoliberal, technocratic belief that a combination of market-based solutions and technological fixes will magically resolve all ecological problems.” As Israel’s desalination efforts show, however, technological fixes and market-based solutions are an important part of humanity’s efforts to overcome environmental challenges.

Lent begins his piece by noting some worrying environmental trends, including “the rise in CO2 emissions; the decline in available freshwater; and the increase in the number of ocean dead zones from artificial fertilizer runoff.” Pinker does not deny that, in addition to many positive trends, including reduction of absolute poverty, increasing lifespans and declining violence, humanity faces important challenges. “Progress,” he writes, “is not the same as magic. There are always blips and setbacks… Clearly we have to be mindful of the worst possible setback, namely nuclear war, and of the risk of permanent reversals, such as the worst-case climate change scenarios.”

Take, for example, the freshwater supply. Between 1962 and 2014, renewable water resources per person declined from 17,220 cubic meters to 7,462 cubic meters. However, note that 71 percent of the Earth’s surface is covered by water. What’s needed in the areas most affected by drought, such as North Africa and the Middle East, is an affordable process of desalination that separates salt particles from water molecules. Israel has pioneered a desalination method that makes freshwater consumed by Israeli households 48 percent cheaper than that consumed by the people of Los Angeles. Desalination, writes Rowan Jacobsen in Scientific American,

“works by pushing saltwater into membranes containing microscopic pores. The water gets through, while the larger salt molecules are left behind. But microorganisms in seawater quickly colonize the membranes and block the pores, and controlling them requires periodic costly and chemical-intensive cleaning. But [Israeli scientist] Bar-Zeev and colleagues developed a chemical-free system using porous lava stone to capture the microorganisms before they reach the membranes... Israel now gets 55 percent of its domestic water from desalination, and that has helped to turn one of the world’s driest countries into the unlikeliest of water giants.”

Lent critiques Pinker for failing “to take into account the structural drivers of [environmental] overshoot: a growth-based global economy reliant on ever-increasing monetization of natural resources and human activity.” In reality, free enterprise is not the problem. It is the solution. Relative scarcity leads to higher prices, higher prices create incentives for innovations, and innovations lead to abundance. Scarcity gets converted to abundance through the price system. The price system functions as long as the economy is based on property rights, rule of law and free exchange. In relatively free economies, resources do not get “depleted” in the way that Lent fears – as witnessed by the fact that Earth is yet to run out of a single non-renewable resource.

That’s because the totality of our resources, including freshwater, are not fixed. Yes, the total number of atoms on Earth is finite, but the ways in which those atoms can be combined and recombined are infinite. What matters, then, are not the physical limits of our planet, but human freedom to experiment and reimagine the use of the resources that we have. As New York University economics professor Paul Romer writes,

“To get some sense of how much scope there is for more such discoveries, we can calculate as follows. The periodic table contains about a hundred different types of atoms. If a recipe is simply an indication of whether an element is included or not, there will be 100 x 99 recipes like the one for bronze or steel that involve only two elements. For recipes that can have four elements, there are 100 x 99 x 98 x 97 recipes, which is more than 94 million… Mathematicians call this increase in the number of combinations ‘combinatorial explosion.’ Once you get to 10 elements, there are more recipes than seconds since the big bang created the universe. As you keep going, it becomes obvious that there have been too few people on earth and too little time since we showed up, for us to have tried more than a minuscule fraction of all the possibilities.”
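
Romer’s arithmetic can be checked in a few lines of Python. The ordered-permutation count mirrors his 100 x 99 phrasing, and the age of the universe in seconds is an approximation added for the comparison.

    import math

    # Ordered "recipes" choosing k distinct elements from a periodic
    # table of about 100 entries: 100 * 99 * ... for k slots.
    def recipes(k: int, elements: int = 100) -> int:
        return math.perm(elements, k)

    print(recipes(2))   # 9,900 two-element recipes (bronze, steel, ...)
    print(recipes(4))   # 94,109,400 -- Romer's "more than 94 million"
    print(recipes(10))  # ~6.3e19, far more than the ~4.4e17 seconds
                        # that have elapsed since the Big Bang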

In contrast to free economies, statist societies without property rights, rule of law and free exchange tend to be much worse stewards of the planet. The Soviet Union and Maoist China, for example, were reckless abusers of their resources, including the most precious resource of all - human beings. The biggest distinction between free and statist societies is the value they place on human life. Free societies treat human beings as a valued resource, because it is only humans who have ideas and creative energy to convert those ideas into innovations. In contrast, statist societies tend to consider members of the human race as liabilities. As such, the road to statist utopias is strewn with corpses.

Lent is wrong to dismiss technological fixes and market-based solutions to our environmental problems. Within the context of a market economy, human beings not only use resources, but replenish and enlarge them. As such, Israel’s desalination plants supply drinking water not only to the Israelis but also to the inhabitants of the West Bank, and diplomatic efforts are afoot to supply Israeli drinking water to the surrounding Arab countries. That’s progress.


About the Author

Marian L. Tupy is a senior policy analyst at the Cato Institute and editor of HumanProgress.org

The Master and His Emissary: A Conversation between Dr. Jordan B. Peterson and Dr. Iain McGilchrist

Watch the full conversation here

Dr. Jordan Peterson: Well, I have a question. I’d like to know a little bit more about why you specifically chose the title "The Master and His Emissary".

 

Dr. Iain McGilchrist: It was an attempt to explain what I believed was the relationship between the two brain hemispheres. Like most other things in life, they’re unequal and asymmetrical. One of the brain hemispheres sees more than the other. That is the one I’ve designated the "master", and it is the right hemisphere.

 

JP: That’s a weird inversion, because people often think of the left hemisphere as the one that’s dominant.

 

IM: They do, they do. Traditionally, that’s been the case, but, as is becoming ever clearer, the right hemisphere—this has been a real steep learning curve for some people—is in many ways more reliable, sees more, understands more than the left hemisphere, which is like a high-functioning bureaucrat, in a way. The idea of the story was simply that certain matters needed to be delegated, not only because, as it were, the master couldn’t do everything—he needed an emissary to go abroad and do some of it—but also that he must not get involved with a certain point of view. Otherwise, he’d lose what it was that [inaudible]. What I’m really saying there is that there’s a good reason why, evolutionarily speaking, the two brain hemispheres are separate.

 

JP: When you say "doesn’t get involved", what’s the advantage of that detachment from the involvement?

 

IM: Well, it’s that… Ramón y Cajal, who, as you know, was a great histopathologist—one of his findings was that, in primates, there are more inhibitory neurons than in any other animals, and there are more in humans than in any other primate.

 

JP: And that’s speaking proportionally.

 

IM: Proportionally, and there are more kinds, as well. We think that about 25 per cent of the entire cortex is inhibitory, so it’s a very strong effect. The corpus callosum seems to be very largely, in the end, inhibiting function in the other hemisphere. That is, I think, because, over time, the two hemispheres have had to specialize. There are reasons why, actually, it can’t be. I’m not going to go into it now, but I was talking just a few days ago at the evolutionary psychiatry meeting. But there are reasons why the corpus callosum has had to become more selective, and to inhibit quite a lot of what’s going on in the other hemisphere, because that enables the two to do distinct things. Of course, they have to work together, but, usually, good teamwork doesn’t mean everyone trying to do the same job. So differentiation is very important for two elements to work together. Inhibition is one way of doing that. Effectively, the two takes on the world, if you like, that the hemispheres have are not easily compatible; and we’re not aware of that, because at the level below consciousness there’s a meta-control center that is bringing them together. So in ordinary experience we don’t feel that we’re in two different worlds, but effectively we are. They have different qualities and different goals, different values, different takes on what is important in the world.

 

JP: I’ve developed a conceptual scheme for thinking about the relationship between the two hemispheres, and I’ve been curious about what you think about it, and how it might map onto, or not, your ideas. I’ve been really interested in the orienting reflex, discovered by Eugene Sokolov, I think, back in about 1962. He was a student of Luria's. The orienting reflex manifested when something, at least in their terminology, unpredictable happened. I’ve thought much more recently that it’s actually when something undesired happens, and the laboratory constraints obscured that, and that actually turned out to be important. I kind of put together the ideas of the orienting reflex with some of the things I learned from Jung’s observations on the function of art and dreams. So imagine that you have a conceptual scheme laid out, and we could say it’s linguistically mediated, it’s enforced on the world, and then there are exceptions to that conceptual scheme, and those are anomalies, things that are unexpected. The orienting reflex orients you towards those.

 

IM: Yes.

 

JP: So those are things that aren’t fitting properly in your conceptual scheme that you have to figure out. The first thing you do is act defensively, essentially, because it might be dangerous. And then your exploratory systems are active, and the exploratory systems, first of all, are enhanced attention, from an attentional perspective. But then—and this is where the art issue sort of creeps into it—the idea would be something like "the right hemisphere generates an imaginative landscape of possibility that could map that anomaly." So you can kind of experience that if it’s at night, you know?

 

IM: Yeah.

 

JP: Say you’re sitting alone at night. It’s two or three in the morning. You’re kind of tired. Maybe you’re in an unfamiliar place, and there’s a noise that happens that shouldn’t happen in another room. You can play with that. So, for example, if you open the door slightly and put your hand in to turn on the light, and you watch what happens, your mind will fill with imaginative representations of what might be in the room.

 

IM: Yes, yes.

 

JP: It’s like the landscape of anomaly will be populated with something like imaginative demons, and that’s a first-pass approximation. It seems to me that’s a right hemispheric function, and that, as you explore further, that imaginative domain, which circumscribes what might be, is constrained and constrained and constrained and constrained, until you get what it actually is, and that’s specialized and routinized. It’s something like that.

 

IM: Yes, yes.

 

JP: Does that seem like a reasonable—what do you think about that?

 

IM: I love that for a whole host of reasons. One is, you mentioned "defense", and one of the ideas behind my hypothesis is that the right hemisphere is on the lookout for predators, whereas the left hemisphere is looking for prey. This has been confirmed in many species of amphibians and mammals.

 

JP: I’d never heard that second part. When you’re in left hemispheric mode, you’re more in predator mode; and when you’re in right hemispheric mode, you’re more in prey mode.

 

IM: Of course, we are not lizards or toads or marmosets, or whatever. But in animals, generally speaking, this is the case. The left hemisphere’s the one that controls the grasping [inaudible], and exploring, which you mentioned, is more right hemisphere. When a frontal function is deficient, people often go into an automatic mode of the hand on that side. With the left hand, it’s usually exploratory motions. Meaningless ones, but trying to explore the environment. And with the right hand, it’s grasping pointlessly at things. They’re automatic things. With the left hand, the right hemisphere explores; with the right hand, the left hemisphere grasps. So when you said "exploratory", and you said "defensive", and you said, also, "opening up to possibilities"—these are all aspects of the way the right hemisphere… I often say the right hemisphere opens up to possibility, while the left hemisphere wants to close down to a certainty, and you need both of these.

 

JP: Right, right. It’s a chaos and order issue.

 

IM: Chaos and order. In your talk, you talked about chaos and order, but, if I may say so, you seem—and maybe you’d like to gloss that a little—to suggest that it would be good… We can’t get rid of chaos, but you seem to imply that it would be better if we could, whereas my view is that chaos and order are necessary to one another, and there is a proper harmony or balance.

 

JP: Yeah, well, OK. I think that’s as deep a question as you could possibly ask, I would say, in some sense.

 

IM: Good.

 

JP: I would say there’s a central theological issue there.

 

IM: Yes.

 

JP: The issue there is that, in Genesis, the proper environment of humanity is construed as a garden.

 

IM: Yes.

 

JP: I see that as an optimal balance of chaos and order. Nature flourishing, and it’s prolific and it’s chaotic. If you add harmony to that, you have a garden, so you live in a garden. You’re supposed to tend a garden. OK, so then a garden is created. It’s a walled space, because Eden is a walled space. It’s "paradeisos". It’s a walled garden.

 

IM: That’s it.

 

JP: The thing is, as soon as you make a wall, you try to keep outside out, but you can’t, because the boundaries between things are permeable. So, if you’re going to have reality and you’re going to have a bounded space, you’re going to have a snake in the garden. Then the question is, "what the hell should you do about that? Should you make the walls so high that no snake can possibly get in? or should you allow for the possibility of snakes, but make yourself strong enough so that you can contend with them?" I think there’s an answer, there, that goes deep to the question of, maybe, even why God allowed evil to exist in the world.

 

IM: I agree with you.

 

JP: It’s like, "well, do you make people safe or strong?" Strong is better, and safe might not be commensurate with Being. It might not be possible to exist and to be safe.

 

IM: Our existence is predicated on the fact that we die, so it’s never safe.

 

JP: Well, it’s certainly bounded, right? It’s inevitably wrapped up with that sort of finitude. There’s a lovely, lovely Jewish idea, an ancient idea. It’s one of the most profound ideas I’ve ever come across. It’s kind of a zen koan. It’s a question about the classic attributes of God: omniscient, omnipresent, and omnipotent. What does a being with those three attributes lack? The answer is "limitation," and the second answer is, "that’s the justification for being," that the unlimited lacks the limited.

 

IM: Exactly, exactly.

 

JP: And so the limited is us.

 

IM: For anything to come into existence, there needs to be an element of resistance, so things are never predicated on one pole of what is always a dipole. Everything always has that dipole structure.

 

JP: Yeah, it’s like a prerequisite for Being.

 

IM: It is, and it’s imaged in the yin-yang idea. But it seems to me very important, because, in our culture, we often seem to suppose that certain things are just good and other things are just bad, and it would be good if we could get rid of the bad ones. But, actually, when things that are good within measure are pursued too far, they become bad, and so forth. But let’s go back to your anomaly thing, because Ramachandran calls the right hemisphere the "anomaly detector".

 

JP: Yes, yes.

 

IM: So I think that’s a very important point, because there are two ways you can react to an anomaly. One is to—and both have to be explored—try and prove that it’s not really an anomaly, and therefore you can carry on with things as normal.

 

JP: Yes. That’s what you hope will happen.

 

IM: That’s the typical left hemisphere approach. It doesn’t want anything to have to shift, and quite reasonably. You don’t want to be chaotically shifting, if you’re on to a good thing.

 

JP: Yeah. It’s too stressful.

 

IM: Exactly.

 

JP: It takes too much work.

 

IM: And you might actually be mistaken.

 

JP: Yes, that too.

 

IM: In a way, it’s perfectly correct to be wary, but it’s not correct to be so wary that you blot out anomalies, and there’s lots of evidence of that. The left hemisphere simply blocks out everything that doesn’t fit with its take. It doesn’t see it, actually, at all. So there’s a hugely important element in the right hemisphere going, "hang on: there may be another way of thinking that will accommodate this better." And, actually, good science needs to be skeptical about anomalies, otherwise there would be chaos, but it also needs to be able to shift when an anomaly is large enough, or there are quite a lot of them and they really don’t fit well.

 

JP: Exactly, yes. There’s another observation that Jung made. I loved this observation. He was trying to account for radical personality transformation. His idea was this, and I think it’s commensurate with the ideas of inhibition between the two hemispheres. So let’s imagine the left is habitually inhibiting the function of the right, to keep fear under control. It does that in all sorts of ways. Imagine the right is reacting to anomalies, and it’s aggregating them. The left can’t deal with them, so the right is aggregating anomalies. Maybe that’s starting to manifest itself in nightmarish dreams, for example. These anomalies are piling up. It’s indicating that you’re on shifting sand. So then, imagine that the right hemisphere aggregates anomalies, and then it starts to detect patterns in the anomalies. Now it starts to generate what you might consider a counterhypothesis to the left’s hypothesis.

 

IM: Exactly.

 

JP: If that counterhypothesis gets to the point where the total sum, in some sense, of the anomalies, plus the already mapped territory, can be mapped by that new pattern, then at some point it will shift.

 

IM: Yes.

 

JP: And the person will kick into a new personality configuration. It’s like a Piagetian stage transition, except more dramatic.

 

IM: It is, and what a Piagetian stage transition is also like—and it subsumes both—is Hegelian [inaudible], the idea that a thing is opposed by something else, but when there is a synthesis, it’s not that one of them is annihilated: they’re both transformed and taken up into the whole, which embraces what before looked like an opposition.

 

JP: OK, here’s a question for you. When I read Thomas Kuhn, I was reading Piaget at the same time—and I knew that Piaget was aware of Kuhn’s work, by the way. The problem I had with Kuhn and the interpreters of Kuhn—who interpret Kuhn as a moral relativist, in some sense—is that they don’t seem to get the idea of the increased generalizability of a "plan". So let’s say I have a theory, and a bunch of anomalies accrue, and I have to wipe out the theory. So then I wipe out the theory, and I incorporate the anomalies, and now I have another theory. That’s a descent into chaos. That’s my estimation. That’s the oldest of stories. So the anomaly disruption is the mythological descent into chaos, and then you reconfigure the theory with the chaos, and you come up with a better theory.

 

IM: Yes, the resurrection.

 

JP: "Why is it better?" Well, the answer is, "it accounts for everything that the previous theory accounted for, plus the anomalies."

 

IM: Exactly.

 

JP: So there’s progress.

 

IM: Always.

 

JP: Yes, exactly. But Kuhn is often read as stating there is no progress. There are just incommensurable paradigms, and you have to shift between them. But there isn’t cumulative knowledge, in some sense.

 

IM: I think one thing that we would both probably agree about is that we don’t buy the story that because nothing can be demonstrated definitely, utterly, to be the case, there is no truth. I mean, I think we both believe that there are truths, things that are truer than other things.

 

JP: We certainly act that way.

 

IM: We couldn’t even talk, could we, if we… And even to say that there are no truths is itself a truth statement, which is that it’s truer than the statement "there are truths". So everyone automatically has truths, whether they know it or not.

 

JP: It’s because—well, you said why. It’s not only that you can’t talk—you can’t even see.

 

IM: No.

 

JP: Because you wouldn’t know how to point your eyes.

 

IM: You wouldn’t know how to discriminate what’s coming into your brain at all. So it’s inevitable. I think we would agree about that. But I think there may be a slight point of difference between us in that I’m very willing to embrace the idea of uncertainty, and I may be wrong—perhaps you could expand on that—but sometimes you come across as a man who has certainties that…


 

JP: Well, it’s a peculiar kind of certainty. I’m certain that standing on the border of order and chaos is a good idea.

 

IM: Good.

 

JP: That’s a weird certainty, eh?

 

IM: Exactly, exactly. You need to be in the sort of slightly unstable position.

 

JP: Yeah. You have to be… what would you say… encountering as much uncertainty as you can voluntarily tolerate.

 

IM: Yes.

 

JP: I think that’s the equivalent of Vygotsky's "zone of proximal development". So when we talked a little bit earlier about an instinct for meaning, I think what meaning is, is the elaborated form of the orienting reflex. But what meaning does—its function, its biological function, which I think is more real, in some sense, than any other biological function—is to tell you when you’re in the place where you’ve balanced the stability, let’s say, of your left hemisphere systems with the exploratory capacity of your right, so that not only are you master of your domain, but you’re expanding that domain simultaneously.

 

IM: Yes, yes.

 

JP: I think that when you’re there, it’s kind of a metaphysical place, in some sense. You’re imbued with a sense of meaning and purpose. That’s an indication that you’ve actually optimized your neurological function.

 

IM: Yes, and perhaps we could gloss the idea of "purpose", because I think people get very confused about it, particularly the worry that to say there is a purpose, or that there are purposes to the cosmos, not just to my life (and I believe there are), is to suggest that it’s all been predetermined by God. But this is a misunderstanding of the nature of time: that there are static slices, and God is there, and he’s sorted it all out, and the whole thing’s just unfolding, as Bergson says, "like a lady’s fan being unfurled." That would be an extremely boring, entirely static, and uncreative universe. But, actually, something is at stake. Things are unfolding. They have, overall, a direction. But exactly what that direction is isn’t known.

 

JP: That’s what it looks like to me.

 

IM: It’s a fool who says anything positive about the nature of God, but I’m not convinced that God is omniscient and omnipotent, either. I think God is in the process, is becoming. God is not only being, but is becoming, if you see what I mean.

 

JP: Yeah, so Being and Becoming.

 

IM: More becoming.

 

JP: More becoming.

 

IM: Becoming is the important thing.

 

JP: Why do you think that? It’s also a strange segue. I mean, I’m not criticizing you, but I’m curious. What drove you to that conclusion?

 

IM: An awful lot of things, really. I think that everything is a process. In fact, I’m writing a book called "There are No Things".

 

JP: Oh, what are there instead?

 

IM: There are processes.

 

JP: Yes.

 

IM: And there are patterns.

 

JP: Patterns. That’s why music is so powerful.

 

IM: Music is one of the most mysterious and wonderful things in the universe, and I don’t think it is at all foolish of people to have thought that the planetary motions were in some way like a kind of music.

 

JP: No, it’s not at all. It’s a great insight.

 

IM: I think it is a very important insight.

 

JP: Well, music… I’ve said this in public lectures, that music is the most representative of the arts.

 

IM: It is.

 

JP: Because the world is made out of patterns, and music describes how those patterns should be arranged.

 

IM: You’re using "representative" in a very different way.

 

JP: I know, I know, but it depends on what you mean by "representative". It’s representing the ultimate reality of the cosmos. That’s pretty representative.

 

IM: Well, I would like to say "presentative", in that it’s not "representing" anything. When we’re in the presence of music, something is coming into Being, which is at the core of the whole cosmic process.

 

JP: I think that’s why people love music.

 

IM: They do, and there’s hardly any originality in the idea, because lots of physicists say this, but the movement of atoms and movement of planets, and so forth, are more like a dance, or more like music, than they are like things bumping into one another.

 

JP: Right, right. I thought of "things" as patterns that people have made into tools.

 

IM: I agree with you, and tools are what the left hemisphere is always looking for. It’s always looking for something to grasp. It reifies processes that… It’s all a matter of time. Every single thing, including the mountain behind my house, which is billions of years old—if you were able to take, as it were, a time-lapse series of it—you’d see the thing morphing and changing and flowing. As Eric [inaudible] once said, "everything flows. It’s just over the time period that you consider it."

 

JP: It’s a question of tempo.

 

IM: It’s a question of the tempo. So taking time out of things and considering them in the abstract, deracinated from context, particularly from the flow and the context of time, changes them into something else. I think that what, in brief, Plato has done, and what a lot of the history of more recent Christianity has done, is to thing-ify God and heaven—perfect states that are unaltered, and so on. I think it is an ever more wonderfully self-exploring, self-actualizing process that requires a degree of opposition: as with a stream, in order to have the movement and the ideas and patterns in it, it has to be constrained.

 

JP: I’ve had intimations like that about death. It’s hard to describe these experiences, but when I’ve contemplated death deeply, it’s struck me as a fundamental repair mechanism, part of the mechanism by which new things that are better are brought into Being.

 

IM: Absolutely.

 

JP: You see that in your own Being, because, of course, without death, you couldn’t live, because you’re dying. The things about you that aren’t right, even at a physiological level, are dying all the time. Unfortunately, you also completely die. But more cosmically speaking, it does seem to me that death is… I don’t know, man. I’ve had intuitions or intimations that death is the friend of Being. That’s… It’s hard to get my head around that.

 

IM: I completely agree with you, and that’s been said by many wiser people than myself—maybe even than yourself.

 

JP: I would suspect so. Hopefully so.

 

IM: I think that’s right, that death is predicated on life, but also that it shouldn’t be seen as something that’s a negative. It’s a necessary stage in Being becoming what it is. Since everything is ramified, since nothing is just isolated, you and I may look [inaudible] or feel [inaudible], but as you often eloquently say, we all have a history: in time, we come from a place, but also, as a culture, we have a history. We can’t detach ourselves from it. We’re expressions of it. But we’re also inevitably dependent, as all organisms are, on the environment: it’s hard to say where I end and where the "environment" begins. I don’t like the word "environment", so "nature".

 

JP: No, it’s a mythological concept.

 

IM: It suggests something that’s always being born, whereas "environment" suggests something around me, from which I’m separate.

 

JP: And even opposed to.

 

IM: Yes, yes. So I would see us as like an eddy in a stream, or like a wave in the sea. It’s never separate.

 

JP: Schrödinger talked about life in terms of such things.

 

IM: The convergences between physics and process philosophy are very strong.

 

JP: So when does that book come out?

 

IM: When I finish writing it.

 

JP: Oh, hah.

 

IM: And I’m very worried that it’s getting bigger, and is… All the time I’m writing it, I’m seeing more and more things that I really must get to know more about, and it’s an ever-receding…

 

JP: Well, that’s the danger of a book that aims at something fundamental, because you never hit the proper boundaries.

 

IM: That’s it. I need that wall.

 

JP: I’ve also had imaginative experiences, I would say, when I was trying to understand, say, the necessity of evil—because that’s also a fundamental theological conundrum and a metaphysical conundrum: "why is it that Being is constituted such that evil is allowed to exist," right? It’s Ivan Karamazov’s critique of Alyosha’s Christianity, essentially. "What kind of God would allow for this sort of thing?"

 

IM: It’s an ancient question.

 

JP: Yeah, it’s an ancient question. I’ve thought about the adversarial aspect of that, which is that you need a challenge. You’re not forced to bring forth what you could bring forth without a challenge, and the greater the thing you’re supposed to bring forth, the greater the challenge has to be, so you need an adversary. Something like that. But I also thought, "it’s possible that Being requires limitation." You might say optimal Being requires free choice. I know I’m going through a lot of things quickly. Free choice requires the real distinction between good and evil.

 

IM: It does.

 

JP: Without that, you don’t have choice. Also, maybe it’s possible to set up a world where evil is a possibility, but where it isn’t something that has to be manifest: it’s an option open to you, and a real option (it has to be, given the challenge that’s presented to you), but it’s something that you can choose not to move towards, if you so desire. That seems to be something like the ethical requirement. That’s the fundamental ethical requirement: to avoid evil. That doesn’t mean it shouldn’t exist. That’s not the same issue.

 

IM: No, it isn’t. I wonder, one could recast it as the need for otherness. God needs something other; and that other, if it’s not going to be just part of God, has got to be free. Otherwise, there will be no creation. The fact that there is something other than God… It may, in the end, come from and come back to that God, or that divine essence, or whatever.

 

JP: That’s something I can’t figure out, either. In the Christian idea, there’s the end of time, where the evil is separated from God forever. I think about that as a metaphysical… Well, you might think, if it’s a form of… Imagine it’s a form of perfection, a form of striving for perfection. You fragment yourself; you challenge yourself; you throw what’s not worthy into the fire everlasting. Something like that. So what you end up with retained is much better than what you started with, through the trials. Something like that.

 

IM: That sounds a bit like the dialectical process that we’ve been talking about.

 

JP: Right.

 

IM: You have alluded to a couple of very good Jewish myths, and there’s one in the Lurianic Kabbalah about the creation. I don’t know if you know it, but it’s absolutely riveting to me. The idea is that the primary being—Ein Sof, the ground of all Being—needs something other to come into being: the creation. To bring about that creation, what does the Ein Sof do? What is its first act? Is it to stretch out a hand and make something? Not a bit. The first act is to withdraw, to create a place in which there can be something other than the Ein Sof. And so the first stage is called Tzimtzum. It sounds negative, as so many creative things do. It withdraws, and, in that creative space, there are vessels. A spark falls off of Ein Sof and falls into the vessels, and they all shatter. That’s called "the shattering of the vessels".

 

JP: Yes. I came across that in Jung.

 

IM: Right, yes. And then there is the third stage, "repair", in which what has just been fragmented is restored into something greater. And so this process carries on. In my terms, that’s very much like what happens with the hemispheres. The right hemisphere is the one that’s first accepting. It’s sort of actively receptive, if you can put it that way, to whatever is new. You were talking about Elkhonon Goldberg’s work, and so on. And then, whatever that is, is then sort of processed by the left hemisphere at the next stage into categories—"this is that"—to try to understand it. Of course, whatever it is, is much bigger than any of the categories. So they all break down, and it gets restored in the right hemisphere into a new whole: the tikkun; the repair.

 

JP: That’s "tikkun", right? T-i-k-k-u-n.

 

IM: T-i-k-k-u-n. I think the kind of easy way of thinking about it is learning a piece of music. You’re first sort of attracted to it as a whole. You then realize you need to practice that piece at bar 28, and you realize that at bar 64 there’s a return to the dominant, or something. And then, actually, when you go on stage, you’ve got to forget all about that. But it’s not that that work is lost. It’s just that it’s no longer present.

 

JP: Right.


About the Author

Dr. Jordan B. Peterson is a public intellectual, clinical psychologist, and professor of psychology at the University of Toronto. He is the author of the multi-million copy bestseller 12 Rules for Life: An Antidote to Chaos, #1 for nonfiction in 2018.


Are You Really Conservative or Liberal?

When I read about clashes around the world – political clashes, economic clashes, cultural clashes – I am reminded that it is within our power to build a bridge to be crossed. Even if my neighbor doesn’t understand my religion or understand my politics, he can understand my story. If he can understand my story, then he’s never too far from me. It is always within my power to build a bridge. There is always a chance for reconciliation, a chance that one day he and I will sit around a table together and put an end to our history of clashes. And on this day, he will tell me his story and I will tell him mine.
— Paulo Coelho

Humanity exists in polarity, and we live in a polarized world, but we all contain within ourselves an imperfect balance of seemingly opposing forces. The most contentious of the forces that divide humanity is morality.

Our personal moral foundations cause the greatest struggles for balance between individual liberty and social order. There are three factors that determine our moral foundations and why we are, or think we are, either conservative or liberal: biology, psychology, and our worldview.

A Divided Brain

Why is the brain divided? This is what psychiatrist Dr. Iain McGilchrist has sought to understand in over twenty years of research. He aims to prove there is a growing imbalance in our brains and to help us understand how this makes us increasingly unable to grapple with critical economic, environmental, and social issues, ones that shape our very future as a species. He believes that one half of our brain – the left hemisphere – is slowly taking power, and we in the Western world are simultaneously feeding its ambitions. This half of the brain is very proficient at creating technologies, procedures, and systems, but it cannot understand the implications of these for the people and the world around it.

The right hemisphere understands the world. It sees the big picture of an interconnected world and understands relationships and body language. Its attention is sustained, broad, open, vigilant, and alert, and it gives rise to art, intuition, interest, and imagination. The left hemisphere manipulates the world. It does not make connections: it sees the world as separate parts, where details matter but relationships do not, where things and people are not unique individuals, and where the world can be organized into rules and bureaucracy. Its attention is narrow and sharply focused on detail; it sorts and files things into systems, perceives people as body parts, and cannot see how it all fits together. As human beings, we could not exist with only one hemisphere; we need both perspectives through which to view and understand the world.

Dr. Helen Fisher’s research led her to understand how brain chemistry shapes our personality and politics. Serotonin is more abundant in conservatives, whose traits include preference for the familiar; caution without fearfulness; calm, control, structure, and orderliness; fact-orientation and precision; more close friends, networks, and community, with an emphasis on belonging; respect for and adherence to rules; and conscientiousness, loyalty, and dependability. Dopamine is more abundant in liberals, with novelty-seeking and risk-taking behavior, curiosity, restlessness, independence and self-reliance, impulsive and spontaneous decisions, physical and mental exploration, idea generation, mental flexibility, and open-mindedness. Estrogen leans liberal, favoring economic regulation and personal freedom, whereas testosterone leans conservative, favoring economic freedom and personal regulation. As we know, we all have these hormones in our bodies, and imbalances create physical, mental, and emotional health issues.

 

Five Moral Foundations

Dr. Jonathan Haidt and a group of social and cultural psychologists sought to understand why morality varies so much across cultures yet still shows so many similarities and recurrent themes. Their theory, the five moral foundations, proposes that several innate and universally available psychological systems are the foundations of “intuitive ethics.” Each culture then constructs virtues, narratives, and institutions on top of these foundations, thereby creating the unique moralities we see around the world, and the conflicts within nations too.

  1. Care/Harm: our mammalian evolution for empathy, attachment, kindness, gentleness, and nurturance.

  2. Fairness/Cheating: the evolutionary process of reciprocal altruism, justice, rights, autonomy, and proportionality.

  3. Loyalty/Betrayal: our tribal history in forming shifting coalitions, patriotism, and self-sacrifice.

  4. Authority/Subversion: our primate history of hierarchical social interactions, leadership, followership, deference to legitimate authority, and respect for traditions.

  5. Sanctity/Degradation: shaped by the psychology of disgust and contamination, underlying religious notions of living in an elevated, less carnal, more noble way, the body is a temple that can be desecrated by immoral activities and contaminants.

Dr. Haidt and his colleagues’ research applied this theory to the political "cultures" of liberals and conservatives. They discovered that the current American culture war can be viewed as arising from the fact that liberals try to build a morality relying primarily on the Care/Harm foundation, supported by the Fairness/Cheating foundation and a sixth foundation added later to the theory, Liberty/Oppression. Conservatives, especially religious conservatives, determine morality using all six foundations.

 

Your Worldview

In his 1987 book, A Conflict of Visions, economist Thomas Sowell argues that the opposing moral values of conservatives and liberals are intimately linked to the vision a person holds about human nature, either as constrained (conservative) or unconstrained (liberal). Sowell argues that controversies over seemingly unrelated social issues such as taxes, welfare, social security, health care, criminal justice, and war repeatedly reveal a consistent ideological dividing line along these two conflicting visions, the Constrained Vision and the Unconstrained Vision. Which view of human nature you believe to be true will largely determine how you think issues should be addressed:

If human options are not inherently constrained, then the presence of such repugnant and disastrous phenomena virtually cries out for explanation—and for solutions. But if the limitations and passions of man himself are at the heart of these painful phenomena, then what requires explanation are the ways in which they have been avoided or minimized… In the unconstrained vision, there are no intractable reasons for social evils and therefore no reason why they cannot be solved, with sufficient moral commitment. But in the constrained vision, whatever artifices or strategies restrain or ameliorate inherent human evils will themselves have costs, some in the form of other social ills created by these civilizing institutions, so that all that is possible is a prudent trade-off.

Balance in a Complex World

Essentially, we are born predisposed to being either liberal or conservative at an intuitive and instinctual level, with our moral values predetermined by our brain structure, mix of hormones, moral emotions and reactions, and temperament. This explains why people can be predictably partisan about a range of issues that are seemingly unconnected. Yet both sides of the equation are valid, necessary, and true and can complement and balance out the negative extremes of the other.

As human beings, we have far more in common than what differentiates and divides us. We are all somewhat liberal and somewhat conservative, and a person need no longer be both an economic and a social liberal, or both an economic and a social conservative. All humans hold multiple contradictory beliefs and opinions at once, even when we recognize this inherent hypocrisy within ourselves.

Traditional partisan lines are changing in an increasingly polarized political arena. Most people, and likewise most Millennials, are broadly libertarian-minded progressive conservatives: they do not necessarily feel compelled to hold others to the same value system they hold for themselves; they just want to live their lives and allow others to live theirs, so long as no harm is done. When we understand where the other person is coming from with respect to moral foundations, we are more likely to find a path forward that deals with the issues in a fact-based manner and can actually lead to positive developments and results.

Humankind's Progress: Explaining Our Miraculous Flourishing

There is no God in Jonah Goldberg’s new book, Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics is Destroying American Democracy. But the book nonetheless revolves around a miracle. “The Miracle” is the shorthand Goldberg, a bestselling author, syndicated columnist, senior editor at National Review and a fellow at the American Enterprise Institute, uses to describe the escape of our species from the depths of ignorance, poverty and every-day conflict to the heights of scientific achievement, material abundance and relative peace.

To appreciate Goldberg’s Miracle, consider the following. Homo sapiens are between 200,000 and 300,000 years old. Yet the modern world, with all the conveniences that we take for granted (I wrote this article sitting on a plane 8 kilometers above ground, using an internet connection provided by a satellite orbiting 37,000 kilometers above the surface of the Earth), is merely 250 years old. Put differently, for the first 99.9 percent of our time on earth, progress was painfully slow. Then everything suddenly changed. Why? That’s the question that Goldberg strives to answer.

Goldberg has written two previous and popular books, Liberal Fascism: The Secret History of the American Left, From Mussolini to the Politics of Change and The Tyranny of Clichés: How Liberals Cheat in the War of Ideas. Goldberg’s writing style, as the titles of his books suggest, is geared towards a mainstream audience, but the AEI fellow researches his books with care. Of human nature, Goldberg writes:

“Humans were not designed to live in the market order of contracts, money, or impersonal rules, never mind huge societies governed by a centralised state. We were designed to live in bands, or what most people think of as tribes. The human brain is designed so that we can manage stable social relationships with roughly 150 people… We were designed by evolution to be a part of a group, but that group was very limited in size. These groups took on a variety of structures, but the basic anatomy was generally the same. There was a Big Man or some other form of chieftain or ‘alpha.’ …In the most basic sense, these bands were socialist or communist in that resources were generally shared. But the genetic programming clearly emphasised us over me. We still hold on to that programming and it rubs up against modernity constantly.”

Based on the above paragraph, readers will be able to deduce the crux of Goldberg’s argument. The Miracle happened not because of, but in spite of, hundreds of thousands of years of evolution. Our rule-based society, where equality before the law takes precedence over the social and economic status of the individual, a staggeringly complex global economy that turns strangers from different continents into instant business partners, and a meritocratic system of social and economic advancement that ignores people’s innate features, such as race and gender, are both very new and extremely fragile.

The Miracle emerged, probably by chance and after hundreds of years of trial and error, in the splendidly quirky island of Great Britain. It then spread, however imperfectly, into other parts of the world. Today, the outposts of the Miracle can be found not only in Western Europe, North America and Oceania (Australia and New Zealand), but also in Asia (Hong Kong), Africa (Botswana) and Latin America (Chile). An extraordinary achievement.

In a refreshingly non-relativistic manner, which is one of Goldberg’s trademarks, he writes, “I believe that, conceptually, we have reached the end of history. We are at the summit, and at this altitude [political] left and right lose most of their meaning. Because when you are at the top of the mountain, any direction you turn – be it left toward socialism or right toward nationalism … the result is the same: You must go down, back whence you came.”

And that descent (decline, if you will) is the key threat that we all ought to keep in mind. The forces of tribalism always linger just below the surface and are never permanently subdued. From Russia and China to Turkey and, to some extent, the United States, the all-mighty chieftain is back in charge. From the darkest corners of the web, where nationalists and anti-Semites thrive, to the university campuses, where identity politics flourish, group loyalty takes precedence over the individual. These dangerous sentiments originate, it is true, in human nature. But their renewed lease on life springs, as Goldberg reminds us, from something much more banal – ingratitude defined as “forgetfulness of, or poor return for, kindness received.”

We are the luckiest generation that has ever lived. On average, we live longer and are richer, healthier, more educated, safer, and even happier than any other people who have ever lived. Is it too much to ask that we start behaving in a manner that is commensurate with our good fortune?

Suicide of the West is a rich and highly readable book. The progress of our species is described with an appropriate sense of marvel and, perhaps, a little bit of well-earned pride. It is also a deeply humane work, with the author genuinely concerned about the threats to human progress that lurk ahead and about the well-being of his fellow creatures.


About the Author

Marian L. Tupy is a senior policy analyst at the Cato Institute and editor of HumanProgress.org

The History of Progress: Part 2


Sources of Progress

When it comes to the standard of living, human history resembles a hockey stick, with a long straight shaft and an upward-facing blade. For thousands of years, economic growth was negligible (resembling that long straight shaft). At the end of the 18th century, however, economic growth and, consequently, the standard of living, started to accelerate in Great Britain and then in the rest of the world (resembling that upward-facing blade). What was responsible for that acceleration?

The Nobel Prize–winning economist Douglass North, for example, argued that changing institutions, including the evolution of constitutions, laws, and property rights, were instrumental to economic development. The University of Illinois at Chicago economist Deirdre McCloskey attributed the origins of "the great enrichment" to changing attitudes about markets and innovation. The Harvard University psychologist Steven Pinker contended that material and spiritual progress were rooted in the Enlightenment and the concomitant rise of reason, science and humanism.

Whatever the case may be, a confluence of fortuitous events allowed for the breakout of the industrial revolution and enhanced the process of globalization. Let us look at those two in greater detail. According to Encyclopedia Britannica, the industrial revolution was “the process of change from an agrarian, handicraft economy to one dominated by industry and machine manufacture.” It started in Great Britain in the 18th century and then spread to other parts of the world.

“The technological changes included the following: (1) the use of new basic materials, chiefly iron and steel, (2) the use of new energy sources, including both fuels and motive power, such as coal, the steam engine, electricity, petroleum, and the internal-combustion engine, (3) the invention of new machines, such as the spinning jenny and the power loom that permitted increased production with a smaller expenditure of human energy, (4) a new organization of work known as the factory system, which entailed increased division of labor and specialization of function. . . . These technological changes made possible a tremendously increased use of natural resources and the mass production of manufactured goods. . . . There were also many new developments in nonindustrial spheres, including the following: (1) agricultural improvements that made possible the provision of food for a larger nonagricultural population, (2) economic changes that resulted in a wider distribution of wealth, the decline of land as a source of wealth in the face of rising industrial production, and increased international trade, (3) political changes reflecting the shift in economic power, as well as new state policies corresponding to the needs of an industrialized society, (4) sweeping social changes, including the growth of cities, the development of working-class movements, and the emergence of new patterns of authority, and (5) cultural transformations of a broad order.”

Not everyone was happy about the industrial revolution, however. Contemporary observers such as Charles Dickens commented on the squalor of 19th-century cities and the backbreaking labor of the people, including children, in the factories in much the same way that our journalists today comment on the squalor of the rapidly industrializing Indian cities and the backbreaking labor of people, including children, in Bangladeshi factories.

But things should be kept in a proper perspective. Life on an 18th-century farm was extremely difficult, and the city offered the former country dwellers higher wages and new opportunities. In time, sanitation, health care, and other benefits of civilized life caught up with the cities’ rising population, giving us the modern metropolis. In a similar vein, legislation caught up with rising standards of living and codified in law what already was happening in practice—as productivity increased and wages rose, fewer children were needed to supplement their parents’ incomes. Increased productivity of workers led to greater competition for workers, and factory owners started taking better care of their employees. Working conditions improved, and work injuries declined.

Another major criticism of the industrial revolution concerns the spoliation of the environment and exploitation of natural resources. In his famous 1808 poem Jerusalem, for example, William Blake bemoans the “satanic mills” (i.e., factories) that in his view pockmarked the bucolic face of the English countryside. But, as was the case with other writers of the Romantic era, Blake’s description of pre-industrial society was highly idealized. The reality, alas, was much less appealing. Most pre-industrial societies, Great Britain included, were heavily dependent on agricultural output. Agricultural production was labor intensive, but productivity was very low. Before the arrival of machines powered by steam and combustion engines, agriculture depended on much less efficient human and animal labor. People and animals had to be fed, which meant that most of the calories produced on the farm were immediately consumed by the laborers.

Before the industrial revolution, there were no synthetic fertilizers, such as manufactured nitrogen, and crop yields were much lower than they are today. As a consequence, more land was required to feed people and pack animals. Land clearing was usually accomplished by the burning of forests. Yet more trees were cut down to heat houses and to cook food. Environmental damage aside, a major reason for switching from wood to coal was the simple fact that there were very few trees left. Thanks to the use of fossil fuels, global forest coverage has stabilized and is expanding in the world’s richest and most industrialized countries.

Let us now turn to globalization, which the Organisation for Economic Co-operation and Development defines as “an increasing internationalization of markets for goods and services, the means of production, financial systems, competition, corporations, technology, and industries. Amongst other things this gives rise to increased mobility of capital, faster propagation of technological innovations, and an increasing interdependency . . . of national markets.”

Contrary to the common misperception, globalization is not a new phenomenon. The trade links between the Sumer and Indus Valley civilizations go back to the third millennium BCE. Then there was the Silk Road between Europe and Asia, and European voyages into India and the Americas. Clearly, trade has been fundamental to the process of globalization from antiquity. But, why do people trade?

Trade delivers goods and services to people who value them most. An additional ton of corn produced in Kansas may be of little importance to the people in the American Midwest, but it can be crucial to the people living in drought-stricken East Africa. Trade, to use economic jargon, improves efficiency in the allocation of scarce resources. Another reason for trade is the principle of comparative advantage. As the Nobel Prize–winning economist Paul Samuelson noted,

“The gains from trade follow from allowing an economy to specialize. If a country is relatively better at making wine than wool, it makes sense to put more resources into wine, and to export some of the wine to pay for imports of wool. This is even true if that country is the world’s best wool producer, since the country will have more of both wool and wine than it would have without trade. A country does not have to be best at anything to gain from trade. The gains follow from specializing in those activities which, at world prices, the country is relatively better at, even though it may not have an absolute advantage in them. Because it is relative advantage that matters, it is meaningless to say a country has a comparative advantage in nothing. The term is one of the most misunderstood ideas in economics, and is often wrongly assumed to mean an absolute advantage compared with other countries.”
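
To make the principle concrete, here is a minimal numeric sketch in Python. The countries, goods, and productivity figures are invented purely for illustration (they are not Samuelson's numbers); the point is only that world output of both goods can rise when each country leans toward the good it is relatively better at.

```python
# A minimal numeric sketch of comparative advantage (illustrative figures).
# Portugal is better at producing BOTH goods (an absolute advantage), yet
# world output of both goods still rises when each country leans toward
# the good it is RELATIVELY better at.

LABOR = 10  # units of labor available in each country

# Output per unit of labor: (wool, wine)
productivity = {"England": (4, 1), "Portugal": (6, 3)}

def output(labor_split):
    """Total world (wool, wine) given {country: labor devoted to wool}."""
    wool = wine = 0.0
    for country, labor_on_wool in labor_split.items():
        wool_rate, wine_rate = productivity[country]
        wool += labor_on_wool * wool_rate
        wine += (LABOR - labor_on_wool) * wine_rate
    return wool, wine

# No trade: each country splits its labor evenly between the two goods.
autarky = output({"England": 5, "Portugal": 5})

# Trade: England (lower opportunity cost of wool) specializes fully in wool;
# Portugal (lower opportunity cost of wine) shifts most labor into wine.
specialized = output({"England": 10, "Portugal": 3})

print(f"Autarky:        wool={autarky[0]:.0f}, wine={autarky[1]:.0f}")
print(f"Specialization: wool={specialized[0]:.0f}, wine={specialized[1]:.0f}")
# Wool rises from 50 to 58 and wine from 20 to 21: trade plus specialization
# leaves the world with more of both goods.
```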

Moreover, trade allows consumers to benefit from more efficient production methods. For example, without large markets for goods and services, large production runs would not be economical. Large production runs are instrumental to reducing product costs. For example, early cars had to be individually handcrafted. The Model T assembly line revolutionized car manufacturing and allowed the Ford Motor Company to slash the price of the Model T from $850 in 1909 to $260 in the 1920s. Lower production costs, in other words, lead to cheaper goods and services, and that raises real living standards.

Besides better resource allocation and greater specialization and economies of scale, trade encourages technological and cultural exchanges between previously disconnected civilizations. It is for those reasons that great commercial cities like Florence and Venice during the Renaissance, and London and New York today, also tend to be centers of cultural life and technological progress.

The development of the steam engine and the opening of the Suez Canal in the 19th century made seafaring faster and cheaper. The volume of traded goods greatly increased. Through the process of price convergence, prices fell and consumers benefited. The gold standard and the invention of the telegraph—and later the telephone—also allowed for massive transfers of capital. Attracted by higher profits, capital flowed from more developed to less developed countries, thus stimulating global economic development. As the British economist John Maynard Keynes recalled, before World War I,

“The inhabitant of London could order by telephone, sipping his morning tea in bed, the various products of the whole earth. . . . He could at the same moment and by the same means adventure his wealth in the natural resources and new enterprises of any quarter of the world, and share, without exertion or even trouble, in their prospective fruits and advantages. . . . He could secure . . . cheap and comfortable means of transit to any country or climate without passport or other formality.”

This “golden age” of globalization ended with the outbreak of World War I and the concomitant disruption of world trade. By some estimates, globalization did not reach its pre–World War I levels until the 1970s or even 1980s. In fact, it was the 1980s that marked the beginning of the period of globalization that we live in today. Spurred by economic deregulation within countries, trade liberalization between countries, privatization of state-owned companies, further improvements in transport, and the arrival of the internet, globalization got a new lease on life.

The World Is Getting Better, But Skepticism Remains

As can be seen, Europe and America and, later, other regions of the world, experienced previously unimaginable improvements in standards of living. The process of rapid improvement that started in the early 1800s continues to this day. Accordingly, historical evidence makes a potent case for optimism. Yet optimism about the current state and future well-being of humankind is difficult to come by. As author Matt Ridley writes in his book, The Rational Optimist,

“If . . . you say catastrophe is imminent, you may expect a MacArthur genius award [the MacArthur Fellowship or Genius Grant] or even the Nobel Peace Prize. The bookshops are groaning under ziggurats of pessimism. The airwaves are crammed with doom. In my own adult lifetime, I have listened to the implacable predictions of growing poverty, coming famines, expanding deserts, imminent plagues, impending water wars, inevitable oil exhaustion, mineral shortages, falling sperm counts, thinning ozone, acidifying rain, nuclear winters, mad-cow epidemics, Y2K computer bugs, killer bees, sex-change fish, global warming, ocean acidification, and even asteroid impacts that would presently bring this happy interlude to a terrible end. I cannot recall a time when one or other of these scares was not solemnly espoused by sober, distinguished, and serious elites and hysterically echoed by the media. I cannot recall a time when I was not being urged by somebody that the world could only survive if it abandoned the foolish goal of economic growth. The fashionable reason for pessimism changed, but the pessimism was constant. In the 1960s the population explosion and global famine were top of the charts, in the 1970s the exhaustion of resources, in the 1980s acid rain, in the 1990s pandemics, in the 2000s global warming. One by one these scares came and (all but the last) went. Were we just lucky? Are we, in the memorable image of the old joke, like the man who falls past the first floor of the skyscraper and thinks ‘So far so good!’? Or was it the pessimism that was unrealistic?”

Ridley’s recollection raises an interesting question: Why are we as a species so willing to believe in doomsday scenarios that never quite materialize in practice?

In their 2012 book, Abundance: The Future Is Better Than You Think, Peter H. Diamandis and Steven Kotler offer one plausible explanation. Human beings are constantly bombarded with information. Because our brains have a limited computing power, they have to separate what is important, such as a lion running toward us, from what is mundane, such as a bed of flowers. Because survival is more important than all other considerations, most information enters our brains through the amygdala—a part of the brain that, as Diamandis and Kotler write, is “responsible for primal emotions like rage, hate, and fear.” Information relating to those primal emotions gets our attention first because, as the authors write, the amygdala “is always looking for something to fear.” Our species, in other words, has evolved to prioritize bad news.

As Harvard’s Pinker writes in his 2018 book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, “the nature of news . . . interact[s] with the nature of cognition to make us think that the world is getting worse. News is about things that happen, not things that don’t happen. We never see a journalist saying to the camera, ‘I’m reporting live from a country where a war has not broken out. . . .’” (p. 44). Consequently, newspapers and other media tend to focus on the negative. As the old saying among journalists goes, “If it bleeds, it leads.” To make matters worse, social media makes bad news immediate and more intimate. Until relatively recently, most people never learned about the countless wars, plagues, famines, and natural catastrophes that happened in distant parts of the world. Contrast that with the 2011 Japanese tsunami disaster, which people throughout the world watched unfold as it was happening on their smart phones.

The human brain also tends to overestimate danger because of what psychologists call “the availability heuristic.” Simply put, “people estimate the probability of an event by the ease with which instances come to mind. . . . But whenever a memory turns up high in the result list of the mind’s search engine for reasons other than frequency—because it is recent, vivid, gory, distinctive, or upsetting—people will overestimate how likely it is in the world. . . . People rank tornadoes (which kill about 50 Americans a year) as a more common cause of death than asthma (which kills more than 4,000 Americans a year), presumably because tornadoes make for better television” (Pinker, p. 44).

Another bias “can be summarized in the slogan ‘Bad is stronger than good.’ . . . How much better can you imagine yourself feeling than you are feeling right now? How much worse can you imagine yourself feeling? The answer to the second question is: it’s bottomless. The asymmetry in mood can be explained by an asymmetry in life. . . . How many things could happen to you today that would leave you much better off? How many things could happen that would leave you much worse off? The answer to the second question is: it’s endless. . . . The psychological literature confirms that people dread losses more than they look forward to gains, that they dwell on setbacks more than they savor good fortune, and that they are stung more by criticism than they are heartened by praise” (Pinker, p. 50).

Furthermore, good and bad things “unfold on different timelines” (Pinker, p. 44). Bad things, such as plane crashes, can happen quickly. Good things, such as humanity’s struggle against HIV/AIDS, tend to happen incrementally and over a long period of time. As Kevin Kelly, founding executive editor of Wired magazine, wrote, “Ever since the Enlightenment and the invention of Science, we’ve managed to create a tiny bit more than we’ve destroyed each year. But that few percent positive difference is compounded over decades into what we might call civilization. . . . [Progress] is a self-cloaking action seen only in retrospect” (The Inevitable: 12 Technological Forces That Will Shape Our Future, p. 13).
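
Kelly's compounding point is easy to check with a few lines of arithmetic. The 1 percent and 2 percent net annual gains below are assumed figures, purely to illustrate how a "few percent positive difference" compounds over decades:

```python
# If each year we create slightly more than we destroy, the cumulative
# surplus grows geometrically. The 1% and 2% net annual gains are assumed
# figures, for illustration only.

for annual_gain in (0.01, 0.02):
    for years in (50, 100, 200):
        multiple = (1 + annual_gain) ** years
        print(f"{annual_gain:.0%} per year over {years} years "
              f"-> {multiple:.1f}x the starting stock")
# Even a 1% annual surplus multiplies the stock about 7-fold over two
# centuries; a 2% surplus multiplies it over 50-fold.
```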

Clearly, humanity suffers from a negativity bias. Consequently, there is a market for purveyors of bad news, be they doomsayers who claim that overpopulation will cause mass starvation or scaremongers who claim that we are running out of natural resources. Politicians, too, have realized that banging on about “crises” increases their power and can get them reelected and may also lead to prestigious prizes and lucrative speaking engagements. Thus politicians on both left and right play on our fears—from crime supposedly caused by playing violent computer games to health maladies supposedly caused by the consumption of genetically modified foods.

Conclusion

By definition, a world that is populated by flawed human beings cannot be a perfect place. As long as there are people who go hungry or die from preventable diseases, there will always be room for improvement. To that end, everyone has a role to play in helping those in need. By focusing on long-term trends and comparing living standards between two or more generations, however, it is possible to observe much improvement. That improvement is not linear or inevitable, but it is real.


About the Author

HumanProgress.org is a project of the Cato Institute with major support from the John Templeton Foundation and the Searle Freedom Trust, as well as additional funding from the Brinson Foundation and the Dian Graves Owen Foundation.

The History of Progress: Part 1

For most of human history, life was very difficult for most people. People lacked basic medicines and died relatively young. They had no painkillers, and people with ailments spent much of their lives in agonizing pain. Entire families lived in bug-infested dwellings that offered neither comfort nor privacy. They worked in the fields from sunrise to sunset, yet hunger and famines were commonplace. Transportation was primitive, and most people never traveled beyond their native villages or nearest towns. Ignorance and illiteracy were rife. The “good old days” were, by and large, very bad for the great majority of humankind.

Average global life expectancy at birth hovered around 30 years from the Upper Paleolithic to 1900. Even in the richest countries, such as those of Western Europe, life expectancy at the start of the 20th century rarely exceeded 50 years. Incomes were quite stagnant, too. At the beginning of the Common Era (CE), annual GDP per person around the world ranged from $600 to $800. As late as 1820, it was only $712 (measured in 1990 international dollars).

Humanity has made enormous progress—especially over the course of the past two centuries. For example, average life expectancy in the world today is almost 72 years. In 2010, global GDP per person stood at $7,814—over 10 times more than two centuries ago (measured in 1990 international dollars).
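
A quick back-of-the-envelope calculation shows just how stark that "hockey stick" is. The sketch below uses the figures quoted above ($712 in 1820 and $7,814 in 2010, in 1990 international dollars, with an assumed $700 midpoint of the $600 to $800 range given for the start of the Common Era) and the standard compound-annual-growth-rate formula:

```python
# Back-of-the-envelope check of the "hockey stick," using the GDP-per-person
# figures quoted in the text (1990 international dollars). The $700 starting
# value is an assumed midpoint of the $600-$800 range given for 1 CE.

def cagr(start_value, end_value, years):
    """Compound annual growth rate implied by start and end values."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

pre_industrial = cagr(700, 712, 1820 - 1)    # 1 CE -> 1820
modern = cagr(712, 7814, 2010 - 1820)        # 1820 -> 2010

print(f"Implied growth, 1 CE to 1820: {pre_industrial:.4%} per year")
print(f"Implied growth, 1820 to 2010: {modern:.2%} per year")
# Roughly 0.001% per year for eighteen centuries, then about 1.3% per year:
# near-zero growth for millennia followed by sustained compounding is
# exactly the long shaft and upward blade of the hockey stick.
```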

It is not only income and life expectancy that are improving. Steven Pinker, professor of psychology at Harvard University, has noted a propitious decline in physical violence. As Pinker writes,

“Tribal warfare was nine times as deadly as war and genocide in the 20th century. The murder rate in medieval Europe was more than thirty times what it is today. Slavery, sadistic punishments, and frivolous executions were unexceptionable features of life for millennia, then were suddenly abolished. Wars between developed countries have vanished, and even in the developing world, wars kill a fraction of the numbers they did a few decades ago. Rape, hate crimes, deadly riots, child abuse—all substantially down.”

If anything, the speed of human progress seems to be accelerating. As Charles Kenny of the Center for Global Development writes,

4.9 billion people—the considerable majority of the planet—[live] . . . in countries where GDP [gross domestic product] has increased more than fivefold over 50 years. Those countries include India, with an economy nearly 10 times larger than it was in 1960, Indonesia (13 times), China (17 times), and Thailand (22 times larger than in 1960). And 5.1 billion people live in countries where we know incomes have more than doubled since 1960, and 4.1 billion—well more than half the planet—live in countries where average incomes have tripled or more …

According to a 2011 paper by Brookings Institution researchers Laurence Chandy and Geoffrey Gertz,

[The] rise of emerging economies has led to a dramatic fall in global poverty . . . [The authors] estimate that between 2005 and 2010, the total number of poor people around the world fell by nearly half a billion, from over 1.3 billion in 2005 to under 900 million in 2010. Poverty reduction of this magnitude is unparalleled in history: never before have so many people been lifted out of poverty over such a brief period of time.

Similarly, the world’s daily caloric intake per person has increased from an average of 2,264 in 1961 to 2,850 in 2013. In Sub-Saharan Africa, the caloric intake increased from 2,001 to 2,448 over the same time period. To put these figures in perspective, the U.S. Department of Agriculture recommends that moderately active adult men consume between 2,200 and 2,800 calories a day and moderately active women consume between 1,800 and 2,000 calories a day.

The internet, cell phones, and air travel are connecting ever more people—even in poor countries. More children, including girls, attend schools at all levels of education. There are more women holding political office and more female CEOs (chief executive officers). In wealthy countries, the wage gap between genders is declining. Our lives are not only longer, but also healthier. The global prevalence rate of people infected with HIV/AIDS (human immunodeficiency virus infection and acquired immune deficiency syndrome) has been stable since 2001 and deaths from the disease are declining because of the increasing availability of anti-retroviral drugs. In wealthy countries, most cancer rates have started to fall. That is quite an accomplishment considering that people are living much longer and the risk of cancer increases with longevity. In parts of the world, dwellings are growing ever larger. Workers tend to work fewer hours and suffer from fewer injuries. That’s especially true in developed countries, like the United States. Shops are bursting with a mind-boggling array of goods that are, normally, less expensive and of higher quality than in the past. We enjoy more leisure and travel to more exotic destinations.

Is everything getting better? Not exactly. In recent years, the world has witnessed a sustained attack on political and economic freedoms, as well as on freedom of religion and expression. Considering that human freedom is an integral part of human progress, these worrying developments are worth bearing in mind.

Progress Is Not Linear

Unfortunately, progress is not linear. Europe, for example, experienced an unprecedented period of peace and rapidly improving standards of living between the conclusion of the Napoleonic Wars in 1815 and the outbreak of World War I in 1914. Between 1820 and 1914, real (inflation-adjusted) GDP per person rose by 127 percent in Western Europe. In Great Britain, life expectancy at birth rose from 41 years in 1818 to 53 years in 1914. In Sweden, the improvement was even more dramatic, with life expectancy rising from 39 years in 1814 to 58 years in 1914.

The period between the start of the 20th century and the outbreak of World War I saw the introduction of such life-changing technologies as the radio, the vacuum cleaner, air conditioning, the neon light, the airplane, sonar, the first plastics, the Ford Model T automobile, and cornflakes.

As a result of World War I, which raged between 1914 and 1918 and killed some 16 million people, per-person GDP in Western Europe fell by 11 percent between 1916 and 1919. Life expectancy in Great Britain, one of the war’s main participants, dropped from 53 years in 1914 to 47 years in 1918. Other horrors followed.

The devastation of World War I undermined the Russian monarchy, leading to the rise of communism and the establishment of the USSR. Globally, some 100 million people died because of purges and socialist economic mismanagement in communist countries. Defeat in World War I and harsh reparation demands led to resentment in Germany. That contributed to the rise of National Socialism (Nazism), the outbreak of World War II, and the subsequent Holocaust. Some 73 million people died in World War II. After the war ended, communist dictatorships and free-market democracies fought in a variety of proxy conflicts as part of the Cold War, including the Korean War and the Vietnam War.

In spite of all that suffering, humanity rebounded. New technologies were introduced. They included the microwave oven, the mobile phone, the transistor, the video recorder, the credit card, the television, solar cells, optical fiber, microchips, lasers, the calculator, fuel cells, the World Wide Web, and the computer. Medical advances included penicillin; cortisone; the pacemaker; artificial hearts; the MRI scan; HIV protease inhibitors; and vaccines for hepatitis, smallpox, and polio.

Over the course of the 20th century, the GDP of an average Western European rose by 517 percent. As for life expectancy, a typical Frenchman could expect to live 34 years longer in 1999 than in 1900.

The United States escaped much of the devastation of the two world wars, but suffered the Great Depression and carried many of the burdens of the Cold War. Between 1929 and 1933, for example, the average U.S. GDP per person declined by 31 percent. It was not until 1940 that it returned to its pre-Depression levels. Over the course of the 20th century, however, average American GDP per person rose by 581 percent and life expectancy by 28 years.

In Asia, average GDP per person rose by 96 percent between 1913 and 1999. Over the same time period, Chinese GDP per person rose by 473 percent and Indian by 173 percent. In China, life expectancy rose from 32 years in 1930 to 71 years in 1999—an increase of 39 years. Indian life expectancy increased from 24 years in 1901 to 61 years in 1999—an increase of 37 years.

The story of Africa is more complex, but still, on balance, positive. Between the start of European colonization in 1870 and African independence in 1960, a typical inhabitant of the continent saw his or her GDP rise by 63 percent. Per-person GDP increased by a further 41 percent between 1960 and 1999. Although Africa underperformed relative to the rest of the world, Africans were better off at the end of the 20th century than they were at its beginning. Moreover, since the start of the new millennium, Africa has been making up for lost time. According to the World Bank, average per-person GDP in Sub-Saharan Africa rose by 38 percent between 2001 and 2016.

When it comes to life expectancy, Africa has experienced much progress. However, increases in life expectancy vary, depending on the harm caused by the spread of AIDS. Life expectancy in hard-hit South Africa, for example, rose from 45 years in 1950 to an all-time high of 62 years in 1990. It dropped to 53 years in 2005, before rebounding to 62 years in 2015.

Progress Is Not Inevitable

When reflecting on the world today, it is important to keep human development in proper perspective. The present, for all of its imperfections, is a vast improvement on the past. Understanding and appreciating the progress that humanity has made does not mean that we stop trying to make the future even better than the present. As the University of Oxford philosopher Isaiah Berlin once wrote, "The children have obtained what their parents and grandparents longed for—greater freedom, greater material welfare, a juster society: but the old ills are forgotten, and the children face new problems, brought about by the very solutions of the old ones, and these, even if they can in turn be solved, generate new situations, and with them new requirements—and so on, forever—and unpredictably."

That said, we should avoid two mistakes. First, we must correctly identify, preserve, and expand the policies and institutions that made human progress possible. If we misidentify the causes of human progress, we could put the well-being of future generations at risk. One way of avoiding serious policy mistakes in the future is to avoid concentrating power in a single pair of hands or in the hands of a small elite. Instead, we should trust the choices made by free-acting individuals. No doubt some of those individual choices will turn out to be bad, but the aggregate wisdom of millions of free-acting individuals is more likely to be correct than incorrect.

Second, we should beware of utopian idealism. Utopians compare the present with, so to speak, future perfect, not past imperfect. Instead of seeing the present as a vast improvement on the past, they see the present as failing to live up to some sort of an imagined utopia. Unfortunately, the world will never be a perfect place because the human beings who inhabit it are themselves imperfect. Today, it is difficult to imagine the emergence of a powerful new utopian movement. But few people in 1900 foresaw the destruction brought on by communism and Nazism. We cannot rule out that utopian demagogues akin to Lenin, Stalin, Hitler, Mao Zedong, or Pol Pot will emerge in the future.



About the Author

HumanProgress.org is a project of the Cato Institute with major support from the John Templeton Foundation and the Searle Freedom Trust, as well as additional funding from the Brinson Foundation and the Dian Graves Owen Foundation.

Neuroscience shows gratitude makes us healthier and happier

Recent studies have concluded that the expression of gratitude can have profound and positive effects on our health and moods. As Drs. Blair and Rita Justice reported for the University of Texas Health Science Center, "a growing body of research shows that gratitude is truly amazing in its physical and psychosocial benefits."

In one study on gratitude, conducted by Robert A. Emmons, Ph.D., at the University of California, Davis, and his colleague Mike McCullough at the University of Miami, participants were randomly assigned one of three tasks. Each week, participants kept a short journal. One group briefly described five things they were grateful for that had occurred in the past week; another recorded five daily hassles from the previous week that displeased them; and a neutral group was asked to list five events or circumstances that affected them, without being told whether to focus on the positive or the negative. Ten weeks later, participants in the gratitude group felt better about their lives as a whole and were a full 25 percent happier than the hassled group. They reported fewer health complaints and exercised an average of 1.5 hours more.

In a later study by Dr. Emmons, people were asked to write every day about things for which they were grateful. This daily practice led to greater increases in gratitude than did the weekly journaling in the first study. The results also revealed another benefit: participants in the gratitude group reported offering others more emotional support or help with a personal problem, indicating that the gratitude exercise increased their goodwill towards others, or, more technically, their "pro-social" motivation.

Several studies have shown depression to be inversely correlated with gratitude: the more grateful a person is, the less depressed they tend to be. Philip Watkins, a clinical psychologist at Eastern Washington University, found that clinically depressed individuals showed significantly lower gratitude (nearly 50 percent less) than non-depressed controls.

Another study on gratitude was conducted with adults with congenital and adult-onset neuromuscular disorders (NMDs), the majority of whom had post-polio syndrome (PPS). Compared to those who were not jotting down their gratitude nightly, participants in the gratitude group reported more hours of sleep each night and feeling more refreshed upon awakening. The gratitude group also reported more satisfaction with their lives as a whole, felt more optimism about the upcoming week, and felt considerably more connected with others than did participants in the control group.

The positive changes were markedly noticeable to others. According to the researchers, "Spouses of the participants in the gratitude (group) reported that the participants appeared to have higher subjective well-being than did the spouses of the participants in the control (group)." Dr. John Gottman at the University of Washington has been researching marriages for two decades and concludes that unless a couple is able to maintain a high ratio of positive to negative encounters (5:1 or greater), the marriage is likely to end. With 90 percent accuracy, Gottman says he can predict, often after only three minutes of observation, which marriages are likely to flourish and which are likely to fail, based on a simple formula: for every negative expression (a complaint, frown, put-down, or expression of anger) there need to be about five positive ones (smiles, compliments, laughter, and expressions of appreciation and gratitude).
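
To see the arithmetic behind that 5:1 heuristic, consider the following minimal sketch in Python. The function names and the hard 5.0 cutoff are simplifications introduced here for illustration; they are not Gottman's published method.

    def positivity_ratio(positives: int, negatives: int) -> float:
        # Ratio of positive to negative encounters; treated as
        # infinite when no negative encounters were observed.
        if negatives == 0:
            return float("inf")
        return positives / negatives

    def likely_to_flourish(positives: int, negatives: int,
                           threshold: float = 5.0) -> bool:
        # True when the couple meets or exceeds the roughly 5:1
        # ratio described above (an assumed cutoff, for illustration).
        return positivity_ratio(positives, negatives) >= threshold

    print(likely_to_flourish(12, 3))  # False: 12/3 = 4.0, short of 5:1
    print(likely_to_flourish(25, 4))  # True: 25/4 = 6.25

As the example shows, even a couple with far more positive than negative encounters (12 against 3) still falls short of the 5:1 threshold the research describes.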

Switzerland is the Global Hub for Innovation

Switzerland consistently ranks among the countries with the highest quality of life in the world, and the Global Innovation Index has ranked it first for innovation for each of the past eight years, drawing on data that ranges from disruptive developments in digital markets to finance.

Culture of Innovation

A decentralized political system, a highly skilled workforce, and a dense concentration of global banks and companies all contribute to Switzerland's economic resilience and competitiveness. Swiss schools and universities are among the best in the world and form the foundation of the nation's innovative strength, bringing researchers, talented individuals, and entrepreneurs together to share ideas and create breeding grounds for innovation.

The Swiss are an engaged citizenry, committed to public dialogue and to a politico-economic environment that supports a thriving innovation ecosystem. Though not part of the European Union, Switzerland borders Germany, France, Italy, and Austria, providing access to the European market and an international environment that attracts top-tier talent from across the world, creating a genuinely diverse workforce.

Switzerland is a digital hub offering abundant opportunities for cross-sector partnerships, propelled by a tradition of innovation that has sparked an intense start-up culture. International positioning and marketing, financing support, and coordination and networking with federal authorities and the Conference of Cantonal Directors of Economic Affairs provide a consistent and coherent umbrella of standards that ensures quality and continuous improvement.

Tradition of Freedom and Democracy

Since the founding of the Old Swiss Confederacy in 1291, Switzerland's political system has rested on decentralized direct democracy, which gives citizens an extraordinary degree of participation in the legislative process and grants them a maximum of political self-determination.

In 2012, the Swiss Parliament approved a total revision of the Research and Innovation Promotion Act (RIPA), thereby creating the legal framework for federal funding of a Swiss Innovation Park while leaving plenty of leeway as to how the park should be set up. Shaped by the cantons, higher-education institutions, and private-sector partners, five parks were established across the country.

To move Switzerland forward, the Federal Council established a two-year Digital Action Plan in 2017. It encompasses six action fields, each filled with concrete implementation projects and specific recommendations: society and digital transformation; business ecosystem and innovation; education and research; regulation; infrastructure and data policy; and data security.

Technology Start-Ups

More than 930 million Swiss francs were invested in more than 175 Swiss start-ups in 2017. Intensive exchange among business, academia, and government stakeholders is reducing barriers to market entry, even for emerging fintech companies.

Two technical universities, ETH Zurich and EPFL, rank among the best in the world, and CERN is where the World Wide Web was invented. Companies have been establishing technology research and development centers in Switzerland since 1956; Google's main office outside the U.S. is in Switzerland; and the country has become known as the Silicon Valley of robotics.

From incubators to seed and start-up financing through the cantonal banks, entrepreneurs enjoy fast access to the resources they need to found, grow, and scale their companies. Companies foster cross-sector collaboration to enable innovation, improving digital skills both for current employees and for the next generation.

The Luxury Industry

Combining tradition and innovation, Switzerland is poised to lead the luxury sector, with creative talent, a prime geographical location, and a focus on business partnerships positioning it to become the sector's center of innovation.

Switzerland is famously home to the watch brands Patek Philippe, Rolex, Breitling, and Omega, and many high-end fashion houses are now establishing a presence in the country: Armani, Gucci, Hugo Boss, Michael Kors, Prada, and Tom Ford all have physical properties in the southern, Italian-speaking region. According to Deloitte, Switzerland claims 9 of the world's 100 leading luxury goods companies, led by Richemont, Swatch, and Rolex. While Italy holds the top spot in terms of the number of leading luxury goods companies, France has the highest share of sales. This puts three of the most powerful countries in luxury goods in close proximity; and although Switzerland is not part of the European Union, it inevitably maintains close ties with it.

Loomish is Switzerland’s global driver of digital innovation in the fashion sector, developing an ecosystem of investors, digital start-ups, fashion brands, retailers, and enterprises united to drive digital transformation in the industry. Through its Fashion Innovation Award, Loomish aims to give successful applicants face time with leading investors; unparalleled visibility at several international events; one-to-one meetings with top brands looking for new solutions; and an advisory and services package worth thousands of euros.