Monday, 30 July 2012

Just A Sniff To Prevent Flu

On 25th July it was announced that the UK is to implement a programme of vaccination against seasonal influenza for children aged 2-17 years old. The proposed change won’t come into effect until 2014 and it is believed that it will cost in the region of £100 million a year. There has been much talk about this level of spending with the economy the way it is, so I thought I would write a blog post giving some details on the proposal and attempting to convey the benefits of the spending.

Firstly, a little background on flu vaccination. We currently have an annual flu vaccine which is given to ‘at risk’ individuals just before the onset of the flu season (winter). The vaccine contains three different strains of influenza which are predicted to be circulating during the upcoming flu season. Straight away we hit a slight snag here: the vaccine has to be based on the strains we expect to be circulating in the coming months. We need to rely on this ‘guess work’ since it can take as long as six months to produce enough vaccine for everyone who needs it; this is also the reason why we only target ‘at risk’ individuals: we simply can’t make enough for everyone at present. The production of flu vaccine is slow because it relies on growing the virus in eggs, from which it is then harvested and made into the vaccine. I say the vaccine is based on ‘guess work’, but this does a huge disservice to the vaccinologists and epidemiologists. The three chosen strains of the virus are picked on the basis of months of surveillance and a lot of data crunching, and the ‘guesses’ that are made regarding which strains to use are rarely incorrect. If the wrong strains are chosen, or the circulating strains change, then we are likely to see an outbreak.

The current flu vaccine is given to the over-65s, pregnant women, chronically ill patients and other ‘at risk’ groups. The vaccine is given as an injection and is made of inactivated virus. There are two major categories of classical vaccine, these being inactivated and live-attenuated vaccines (I’ll come back to the latter shortly). Inactivated vaccines contain virus particles that are completely inert; they have been ‘killed’ by treatment with chemicals. This version of the flu vaccine is injected into the arm and delivers dead influenza particles which trigger our immune system to develop a so-called ‘memory response’ against the flu virus. This memory response means that the next time a live influenza virus (of the same strain as was in the vaccine) enters the patient, the immune system mounts a very rapid and strong response, protecting them from any severe disease.

Live-attenuated vaccines are slightly different as they do not use killed virus. These vaccines use a weakened (attenuated) form of the virus which is less capable of causing disease. One example of attenuation is the cold-adapted virus. To produce this form of the vaccine, the virus is grown at temperatures much lower than human body temperature, and it evolves to grow effectively at these low temperatures. When the virus then finds itself in the human body, at a much higher temperature, it does not grow as well as it usually would. The immune system clears the virus before it is able to evolve back to growing at human body temperature, and in doing so develops a protective memory response. This is just one example of how a live-attenuated virus can be produced; there are many other methods.

Both the inactivated and live-attenuated vaccines have their pros and cons, but the main point is that live-attenuated vaccines tend to give a better level of immunity, since they more closely mimic a real infection, but they also carry more risk. The risk with live-attenuated vaccines comes from the possibility of ‘reversion’, whereby the virus mutates back to its normal (wild type) form. If reversion occurs then the vaccine has a chance of causing the very disease it is designed to protect against. We currently use an inactivated virus for the seasonal flu vaccine since there is no chance of reversion, making it safe to give to people who are at the most risk of a severe infection. Children tend to have much stronger immune systems than those classed as ‘at risk’. It is therefore safer to give children a live-attenuated vaccine, since they should be able to fight off the virus before it reverts and, should it revert, they are better able to recover from the disease without any severe complications.

A shot of FluMist
That covers the background, so now we can move on to the proposal to vaccinate all 2-17 year olds here in the UK. The first important point to note about the proposal is that it plans to use a vaccine that is given as a nasal spray, instead of an injection, making it much easier to give en masse (and much less painful). The nasal spray vaccine has been used in the United States since 2003, under the name FluMist, and uses live-attenuated virus.

The next big thing to consider is the reason behind targeting children with this vaccination. Flu is predominantly a disease of the old and the young; however, it is much less likely that infection with influenza will cause severe disease in children, so some may ask why bother? There are a lot of reasons to bother. The main reason lies with the concept of herd immunity (which isn’t the easiest thing, so bear with me). In an ideal world everyone would be vaccinated against every disease possible. If this were the case, the virus would never cause any infections (and would die out). However, it is impossible to get 100% coverage of a vaccine in the real world. What is instead attempted is to achieve a level of coverage high enough that everyone is, in effect, protected. Let’s say there is a group of 1 million people. If 90% of these are vaccinated, only 100,000 of the 1 million can get the disease. If one of these ‘at risk’ individuals gets infected, the chances of them meeting one of the other 99,999 who could get the disease are very slim (99,999 who could get infected against another 900,000 who can’t), making spread unlikely. Those people who haven’t been vaccinated are therefore protected simply because such a high proportion of the population has been. With a high level of coverage there is a good level of herd immunity.
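To put rough numbers on that, here is a minimal sketch in Python of the toy scenario above. The 1 million people and 90% coverage come straight from the paragraph; the reproduction number used at the end is an illustrative assumption, not a real influenza figure.

```python
# Toy herd immunity sums using the numbers from the paragraph above.
population = 1_000_000
coverage = 0.90                                # 90% vaccinated
susceptible = population * (1 - coverage)      # 100,000 people who can still catch flu

# Chance that a random contact of an infected person is also susceptible
p_contact_susceptible = susceptible / population
print(f"Chance a random contact can be infected: {p_contact_susceptible:.0%}")   # ~10%

# If each case would infect, say, 1.5 others in a fully susceptible population
# (an assumed, illustrative reproduction number), then with 90% coverage each
# case infects far fewer than one person on average, so outbreaks fizzle out.
r0 = 1.5
print(f"Average onward infections per case: {r0 * p_contact_susceptible:.2f}")
```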

If we vaccinate all children aged 2-17 we reduce the number of people who can become infected and subsequently spread disease. This will help to protect people who are at even greater risk of severe disease, for instance a child’s grandparents, whom they may visit while carrying infective virus. If you have read any reports about this new proposal you may have seen the Chief Medical Officer, Professor Dame Sally Davies, quoted as saying that “even with moderate uptake of 30% it's estimated that this should result in 11,000 fewer hospitalisations and 2,000 fewer deaths each year.” These estimated figures are based around the concept of herd immunity, as well as the direct effects of vaccinating the children.

Herd immunity is not the only reason to vaccinate children. A child who gets flu will have to take time off school, potentially up to a couple of weeks. This means a parent would have to take time off work. Being at home with a sick child also makes the parent more likely to contract the disease themselves and have to take further time off work. Missing work is clearly not good for the parent, nor is it good for the economy as a whole. I have only used the examples of reduced hospitalisation and reduced time off work, but hopefully you can already see the economic logic behind the proposed move. I’d hate to have been the person to do these sums, but it has been estimated that the roughly £100 million a year cost of the vaccination campaign is cost-effective when weighed against the other economic benefits it could bring.
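For a flavour of those sums, here is a back-of-the-envelope sketch that simply divides the campaign cost by the benefits quoted earlier. It uses only figures already mentioned in this post; the real cost-effectiveness analysis also counts avoided treatment costs, lost working days and much more.

```python
# Back-of-the-envelope division using only the figures quoted above:
# £100 million a year, and the 30%-uptake estimates of 11,000 fewer
# hospitalisations and 2,000 fewer deaths.
annual_cost = 100_000_000
averted_hospitalisations = 11_000
averted_deaths = 2_000

print(f"Cost per hospitalisation avoided: £{annual_cost / averted_hospitalisations:,.0f}")  # ~£9,100
print(f"Cost per death avoided: £{annual_cost / averted_deaths:,.0f}")                      # £50,000
```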

For the majority of children, influenza is not a killer disease. However, starting in 2014 it is highly likely that we will see a huge campaign to get as many 2-17 year olds vaccinated against the virus as possible. The vaccine will help to reduce the number of influenza infections in children, which, while rarely fatal, are still fairly nasty and not something you’d want if you can avoid it. The campaign will also help to protect those at higher risk of severe influenza by increasing the level of herd immunity. It won’t be a cheap campaign, but the simple fact that it will prevent many cases of flu will reduce the costs of treating those people, making it cost-effective. Cost-effectiveness aside, if you could prevent yourself getting a nasty disease you are at risk from simply by having something sprayed up your nose, wouldn’t you want to?

Wednesday, 4 July 2012

From antibiotics to bacon sandwiches

We are at war with an invisible enemy; the foe is bacteria. This war began in 1928 with the accidental discovery of penicillin by Alexander Fleming, though it didn’t fully kick into action until the work of Chain and Florey to purify and mass produce the chemical by 1945 (ironically, a significant year in another war). We now have a huge array of antibiotics to fight the bacterial enemy, and yet the bacteria are still winning the war through their ability to develop resistance and continue to cause infections that are becoming harder to treat. In this blog post I want to look at a well-known resistant bacterium and how resistance emerges. I also want to look at the livestock industry, not just human use of antibiotics, as this has a major part to play in our war with resistance.

I’m sure many readers will have heard of MRSA (short for methicillin-resistant Staphylococcus aureus), though not necessarily all will know what it is. S. aureus is a Gram-positive bacterium, meaning it takes up the Gram stain used to classify bacteria. It is just one of the 500-1000 bacterial species that call us humans home and often makes up part of our skin flora (the bacterium is commonly found living on the skin of healthy people). As a slight aside, it is interesting to note that humans are in fact more bacterial than they are human, with roughly ten times as many bacterial cells as human cells on, and in, the average human being; S. aureus is just one of the many bacteria that can make up part of this so-called microbiome. That takes care of the SA part of MRSA, so onto the MR part. Methicillin is a drug in the penicillin class of antibiotics. This class of antibiotics is known as the beta-lactams, and they are/were effective against Gram-positive bacteria (I say ‘were’ due to the prevalence of resistance). As Gram-positive bacteria grow and divide they need to build a cell wall made of many peptidoglycan molecules cross-linked with each other (peptidoglycans are amino acids, the building blocks of proteins, linked to sugar molecules). This wall protects the bacterium from its environment so it can survive. The penicillin class of antibiotics blocks construction of the wall, which causes a bacterium to swell, as water enters, and eventually burst. When first described, MRSA were a group of bacteria capable of blocking the action of methicillin, so as to allow the production of a stable cell wall in the presence of the drug. MRSA are now resistant to most penicillin-based antibiotics, yet the name has stuck.

MRSA is not the only bacterium to have become resistant to antibiotics, but it is constantly hitting the headlines due to its prevalence in hospitals, which is why I have used it as an example here. The question to ask now is why we have resistant bacteria in the first place; the answer lies with natural selection and evolution. Natural selection can be broadly defined as the selection of advantageous traits that promote survival of an organism. Normally I would give an example of this and then move back to my main point of bacterial antibiotic resistance; however, in my view, bacterial resistance is one of the best examples of natural selection at work. As I mentioned earlier, S. aureus is often found on the skin of healthy people without causing any issues. However, if it gets under the skin or into other areas of the body it can cause an infection (impetigo, for instance). If this happens a doctor would prescribe an antibiotic to the patient which, over time, would kill the bacteria and cure the infection. The aim of the bacteria is to survive and grow, so the presence of a killer drug is bad news. The antibiotic is a ‘selection pressure’ that forces the bacteria to evolve in order to survive. If even a single bacterium develops resistance to the antibiotic (an advantageous trait) through a mutation, it will be at a huge advantage, since all the fellow, non-resistant bacteria that are clogging up space and using up resources will be wiped out by the drug. This leaves the lone mutant bacterium with all the space and resources it needs to thrive. This resistant bacterium will rapidly multiply and form a whole new colony of bacteria that are completely unaffected by the drug, giving a resistant strain.
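To make that selection pressure concrete, here is a minimal toy model in Python. The starting population, kill rate, growth rate and the single pre-existing mutant are all illustrative assumptions rather than measured values for S. aureus; the point is simply how quickly the resistant lineage takes over once the drug removes its competitors.

```python
# Toy model of antibiotic selection pressure. All numbers are illustrative
# assumptions, not measured values for any real bacterium or drug.
susceptible = 1_000_000   # bacteria the antibiotic can kill
resistant = 1             # a single mutant that happens to carry resistance

kill_fraction = 0.9       # fraction of susceptible bacteria killed per day of treatment
growth_factor = 2         # surviving bacteria of either kind double each day

for day in range(1, 8):
    susceptible = int(susceptible * (1 - kill_fraction) * growth_factor)
    resistant *= growth_factor
    print(f"Day {day}: susceptible={susceptible:>9,}  resistant={resistant:>5,}")

# Within a week the resistant lineage outnumbers the susceptible one: the drug
# has cleared away its competition, leaving space and resources for the mutant.
```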

The use of antibiotics can lead to the development of resistant strains of bacteria; however, it is an inescapable fact that we need to treat patients who have bacterial infections. What is avoidable is the unnecessary overuse of antibiotics seen around the globe. Before the issue of antibiotic resistance came to the forefront of medical thinking it was common for antibiotics to be used for the wrong thing, such as a cold. Most common colds are caused by rhinovirus which, needless to say, is a viral infection. Antibiotics are completely useless against viruses since the drugs interfere with bacterial replication; a virus replicates inside a host cell, hidden away from any effects of an antibiotic. The use of antibiotics to treat common colds and other viral infections has, for the most part, stopped since it was realised that this stupid overuse was leading to resistance.

The overuse of antibiotics in humans has, to an extent, been brought under control since we now understand the risk it poses. However, the story does not stop here. At the end of the Second World War, as the antibiotic era was blooming, people began to notice that feeding antibiotics to livestock made them grow faster (we still aren’t sure exactly why). This was a fantastic finding given the obvious shortage of quality animal feed at the time and the need for a kick-start to the agriculture and livestock industries following the war. Unfortunately, just as with humans, the use of antibiotics in farm animals promotes the emergence of resistant bacteria. These bacteria can easily cause infection within the livestock, which comes with obvious economic issues, but, somewhat more worryingly, they could also get into the human population and cause disease that we would struggle to treat. This was recognised fairly early on, when in the 1970s the EU banned the use in livestock of antibiotics that are used to treat humans. Even with this ban, and numerous others around the use of antibiotics to promote growth, the amount of antibiotics consumed by livestock is still double that used by humans, creating a dangerous environment for the emergence of resistant bacteria.


The overuse of antibiotics is something that needs to stop; otherwise MRSA is likely to become the least of our worries. To that end, it seems the world needs to follow Denmark’s lead. Denmark is the world’s largest exporter of pork, with 90% of its produce being shipped off elsewhere. At the turn of the century the Danish pork industry decided to follow the lead set by its poultry industry two years earlier and agreed to stop using antibiotics to promote growth. Since then, the use of antibiotics has dropped by around 60%. Many argued that stopping the use of antibiotics would lead to a collapse of the industry, but since 1994 Denmark’s pork production has actually risen by 50%. So not only are they giving the world more bacon sandwiches, they are also reducing the risk of giving the world highly resistant bacteria.

We are in danger of losing our war against bacteria, since resistance is emerging faster than we can discover new antibiotics. If we continue to unnecessarily overuse antibiotics and drive the evolution of resistant bacteria, we are shooting ourselves in the foot. The obvious place to start reducing antibiotic use is the livestock industry since, as Denmark has shown, antibiotics are not essential for the industry to survive. The first step would be to stop vets making a profit from the sale of antibiotics to farmers (a blatant conflict of interest), followed by the implementation of a good surveillance system to monitor the use of the drugs. If we can curb our use of antibiotics we are likely to extend their useful life, which will in turn buy us more time to find new and improved antibiotics or to work towards alternatives. If we continue down the path of promoting the emergence of antibiotic-resistant bacteria, we are in danger of heading back to a pre-antibiotic era, in which what is currently a simple and treatable infection could become life-threatening.