Modern-day volunteer infection studies are carefully managed. The aim is to deliberately expose healthy volunteers to an infection – they might get sick, but in a safe and manageable way, with healthcare support on hand.
This might seem an unlikely thing for doctors to want to do. One of the founding principles of medicine is ‘Do No Harm’, but such experiments are important for understanding human disease. They provide essential information about how infections cause illness, how they spread and how they can be prevented or treated.
In this first blog post, we take a look back at some of the biggest breakthroughs to come from research involving people and reflect on the darker side of past studies. Finally, we'll talk about how we run studies safely nowadays, protecting volunteers from undue harm.
Early vaccine triumphs
According to the World Health Organisation, vaccines save around 2-3 million lives every year – a huge triumph of modern-day medicine.
The history of vaccines can be said to begin with a doctor called Edward Jenner and his work that led to the development of the smallpox vaccine. And he wouldn’t have succeeded without using a volunteer infection study (although it’s doubtful his research subject freely ‘volunteered’).
Smallpox was a vicious virus, spreading rapidly and claiming many lives, but those lucky enough to survive were immune to the disease for the rest of their lives. As early as the 17th century doctors began experimenting to try and give people this immunity by inoculating healthy people with pus from smallpox sores. This approach did work, but unfortunately some people who were inoculated got full-blown smallpox and died; others passed on the infection and fuelled new outbreaks.
Edward Jenner, himself inoculated with smallpox as a boy, had noticed that dairymaids exposed to the cow version of smallpox – cowpox – were also immune to the disease. In 1796 Jenner carried out his famous experiment and inoculated his gardener’s young son, James Phipps, with cowpox. The boy had nothing worse than a mild illness and, once recovered, Jenner tested his theory by repeatedly exposing young Phipps to smallpox. Luckily for Phipps, the experiment was a success and he proved immune to smallpox. Using cowpox to give people immunity worked and was safer than inoculating people with the human virus.
Jenner’s proof-of-principle study on young Phipps was the foundation of the first global vaccination programme, which led to the eradication of smallpox and saved millions of lives.
This is the first and most famous example of a volunteer infection study being used to test a vaccine, but many other vaccines followed in the wake of smallpox. Vaccines now protect people against 26 diseases, and together they have saved millions of people from debilitating and deadly illnesses including measles, whooping cough, tetanus and polio. In most cases, doctors tested them by vaccinating healthy people and then purposefully exposing them to the disease to check that the vaccine protected them.
Understanding infections
It’s not just vaccine testing that volunteer infection studies have been instrumental for – most of today’s vaccines and treatments wouldn't exist without the fundamental research that pinpointed the causes of infectious diseases and how they spread.
Up until the 1800s it wasn't common knowledge that microbes (germs), invisible to the naked eye, were the cause of many diseases. Thanks to the work of pioneering biologists including Pasteur, Cohn and Koch, there followed a golden age where many bacteria and viruses were discovered and suspected of causing disease.
But detecting a microbe in someone with an illness isn’t the same as proving it causes the disease – it could be coincidence. So Koch came up with four principles that, at the time, were accepted as proof. One of these principles says that the “microorganism should cause disease when introduced into a healthy organism”. In some cases, animal studies helped prove that a microbe was the root cause of a disease, but sometimes researchers had no alternative but to study the infection in people. And historically, it wasn't uncommon for researchers to use self-experimentation, or test on their families. One of the earliest reports is that of John Hunter, who in 1767 allegedly inoculated himself to show that gonorrhoea was contagious – accidentally contracting syphilis at the same time (which later proved fatal, although this version of events is contested).
Much later, in the 1930s, a married couple, the MacDonalds, purposefully infected four of their children to prove which bacteria caused whooping cough, and to show that a vaccine stopped two of them from getting ill (a bit unlucky for the younger two, who weren't vaccinated and became very poorly with whooping cough).
Scientist Max Joseph von Pettenkofer reportedly had a fortunate escape when he only developed a mild illness after drinking a broth laced with cholera bugs – he’d been trying to prove that the bacteria weren’t the cause of the disease (he was wrong).
An Australian researcher called Barry Marshall wanted to challenge the dogma that stomach ulcers were caused by lifestyle, and to show that they were due to a bacterial infection and therefore treatable. He offered the first proof of this theory by drinking the bacteria himself: he developed serious inflammation of the stomach lining (showing that the bacteria could infect the stomach) and needed powerful antibiotics to get better. He and his colleague Robin Warren later shared a Nobel Prize in recognition of the discovery.
Stubbins Ffirth rose to notoriety by trying every means possible to show yellow fever wasn’t contagious, including drinking and injecting vomit and other bodily fluids from sick people. Despite his best efforts, yellow fever was shown to be transmissible, and Dr Jesse William Lazear was the first to prove that it’s spread by mosquitoes. Unfortunately, either deliberately or accidentally, he was bitten himself and died of yellow fever.
Self-experimentation and testing in people helped researchers understand much more about human diseases through the ages, and led to important measures to control the spread of infections. This type of research also led to significant treatment breakthroughs, notably drugs to treat malaria and respiratory viruses like flu. But early volunteer infection studies had a dark side, too.
The human cost of early volunteer infection studies
Most of these early research studies on people, including self-experimentation, would never be allowed to happen today. The most notoriously unethical studies left some people with lifelong disfigurements or disabilities, and in the worst cases led to people dying.
In the early 20th century, researchers looking for large groups of people to study often took advantage of vulnerable sections of society. These people usually lacked the freedom or capacity to make an informed choice to take part in medical research, and doctors often withheld information about what experiments they were doing.
Prison inmates were often the target of unethical experimentation because they were a captive group offered little protection with respect to their human rights. For example, in the 40s people imprisoned in a US prison called Stateville Penitentiary were infected with malaria and then given completely untested drugs. Doctors had no data about the possible toxicities or side effects of the drugs, and they caused at least one death.
Orphanages and institutions for disabled children and adults were also easy targets for rogue researchers looking to carry out medical studies. In two of the best-known instances, mentally disabled children attending the Willowbrook State School in New York were intentionally given hepatitis to help develop a vaccine, and adult patients at a Michigan state mental hospital were deliberately sprayed with flu to test new treatments – one of the doctors leading this study was Jonas Salk, who later became famous for inventing the first successful polio vaccine.
In some cases studies were carried out in other countries because of their lax regulations – researchers crossed ethical boundaries that they would not have been allowed to cross in their own country. In the 40s, a US team led by John Cutler carried out experiments on people in Guatemala, with agreement from both governments. The doctors deliberately infected around 1,300 soldiers, prostitutes, prisoners and people living in poverty with sexually transmitted infections without their knowledge or consent. The aim was to test penicillin as a treatment, but many weren't given the antibiotic and at least 83 people died.
The Guatemala study was quickly stopped, but one of the most infamous studies – the Tuskegee syphilis experiment – spanned 40 years. Beginning in the 30s, around 600 African-American men with low incomes and low levels of education agreed to join the study in return for free healthcare and burial costs. Around two-thirds of the men already had syphilis before the study began, but weren't informed they had the disease. Over the course of the study, scientists discovered that early syphilis was treatable with penicillin. Although it was probably too late to help most men in the Tuskegee study (who had been infected for many years), they weren't told about the possible treatment because doctors wanted to monitor the natural course of untreated disease.
These examples focus on studies of infectious disease, but human experiments to find out more about biological and chemical weapons, radiation exposure and surgical techniques were happening in other parts of the world, including the UK.
Yet all these studies pale in comparison with one of the most systematic and wide-scale abuses of people's rights in history. In Nazi Germany, prisoners were subjected to all sorts of unimaginable horrors, including being deliberately given various infections. The atrocities were so severe that they catalysed one of the first significant steps towards drawing up ethical guidelines to safeguard people taking part in medical research.
Prioritising safety and ethical conduct
The Nuremberg Code, created in 1947, is one of the most important documents in the history of medical research ethics. It was written by the judges overseeing the trials of Nazi doctors who conducted human experiments in the concentration camps.
It listed 10 principles that should be followed by those wishing to carry out studies involving people, in order to protect their rights. It was the first time that the emphasis was placed on the rights of the participants themselves (instead of focusing on doctors’ actions), and the first time that informed consent was made a requirement for a study to be morally acceptable.
Several ground-breaking declarations followed – for example, the Declaration of Geneva (1948) and the Declaration of Helsinki (1964). Over time these ethical frameworks for human research have been refined and, while they aren't legally binding, most countries continue to use them as the basis for policies governing human medical studies.
Thanks to many years of work and international cooperation, there are now robust principles and procedures that doctors have to follow when carrying out medical research. These checks are in place to ensure the safety, privacy and dignity of everyone taking part.
We’ve come a long way since some of the atrocities of the past, and there are barriers in place to stop unethical studies from happening again.
In our next blog post, we’ll be looking at modern-day volunteer infection studies: why we need them, and how they are designed and run to keep people as safe as possible.
Emma