In the 2000 book "Bowling Alone," political scientist Robert D. Putnam argued that social capital in America was declining. To support that point, Putnam pointed to falling membership in community organizations. People simply saw each other less, according to Putnam: fewer chances to meet the neighbors down the street, socialize with other members of the community or get to know anyone outside of your own household. And Putnam argued that technologies such as television and, later, the internet were privatizing leisure time and crowding out face-to-face interaction even within the home.
Putnam's point was summed up in the very title of the book: People in the United States were bowling more, but they were heading to the local alley by themselves. The old days of joining a league and fraternizing with the same group of people every week were fading. Increasingly, people were cut off from those social connections and bowling alone.
But beyond the societal problems that Putnam believes can arise from declining social capital, an "every man for himself" approach can have serious consequences for public health. Keeping a community free of viral disease depends in part on the success of herd immunity, which works on the principle of safety in numbers: The more people in a population who are immune to a certain virus, whether through vaccination or through a previous bout of the disease, the better protected everyone else is, even those who aren't immune themselves. That's because each immune person is a dead end for the virus, breaking chains of transmission before they can reach the susceptible.
To illustrate the point, let's return to that bowling alley where people are bowling by themselves. Say the guy on the first lane contracts influenza and passes it along to the woman on the second lane. If that woman isn't immune to influenza, the disease will likely continue lane by lane until every person in the bowling alley is suffering. But if she is immune, the disease stops with her, because the virus has nowhere else to go (assuming the guy in our example didn't have contact with anyone else). Through her immunity, she protects all the people on the subsequent lanes, even those who didn't get a flu shot that year.
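To make that lane-by-lane picture concrete, here's a minimal Python sketch of the same idea. The function name, lane setup and numbers are illustrative assumptions, not anything from the article: an infection simply travels down the row until it hits the first immune bowler.

```python
# A minimal sketch of the bowling-alley example: an infection moves
# lane by lane and stops at the first immune bowler it meets.
# The function name and lane setup are illustrative assumptions.

def spread_down_lanes(immune):
    """Count how many bowlers get sick if the bowler on lane 0
    starts out infected. `immune` is a list of booleans."""
    infected = 0
    for is_immune in immune:
        if is_immune:
            break       # the chain of transmission is broken here
        infected += 1   # a susceptible bowler catches it and passes it on
    return infected

# Ten lanes; only the bowler on lane 2 is immune.
lanes = [False, False, True] + [False] * 7
print(spread_down_lanes(lanes))  # -> 2: lanes 0 and 1 get sick;
                                 # lanes 3-9 never see the virus
```

Notice that the bowlers on lanes 3 through 9 are never marked immune, yet they stay healthy: that's herd immunity in miniature.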
Sounds simple, right? In practice, it's not quite so straightforward.
Vaccination and Herd Immunity
For many of us, chickenpox (known in clinical circles as varicella) was a routine part of childhood: a minor annoyance, but rarely life-threatening. For that reason, the chickenpox vaccine was met with some skepticism when it was introduced in 1995. Still, there was a push in the United States to get children vaccinated, and studies have shown that the effort had a tremendous impact on the disease and its associated costs. By 2012, chickenpox hospitalizations had dropped by a massive 93 percent compared with the pre-vaccine period. Between 2006 and 2012, after a second dose was added to the recommended schedule, the hospitalization rate declined by another 38 percent, cutting costs all around [source: CDC].
Not only does a push for the chickenpox vaccine protect a child and his or her classmates, it can also protect grandparents who may not have had chickenpox. Protecting the elderly is also the idea behind another vaccination many of us line up for each winter: the flu shot. Between 12,000 and 61,000 Americans have died of the flu each year since 2010 [source: CDC]. About 90 percent of flu-related deaths, and 50 to 70 percent of flu-related hospitalizations, occur in people over the age of 65 [source: CDC]. While the elderly are encouraged to get a flu shot, it's actually more effective if the herd around them is vaccinated too, including caretakers and visitors such as germy grandchildren.
The success of the polio vaccine demonstrated the benefits of using immunization to protect a population, and herd immunity can be achieved for a whole host of diseases beyond chickenpox and the flu, including measles, mumps and smallpox. And while the term herd immunity seems to imply that the whole herd must be vaccinated, the whole herd is protected as long as a certain percentage is immunized. Mathematical models can estimate what percentage of the population needs to be vaccinated to keep a communicable disease from spreading. For example, if roughly 80 to 85 percent of a population is vaccinated against polio, herd immunity is achieved. But measles is more contagious than polio, so 90 to 95 percent of a population needs to be vaccinated for herd immunity to be achieved [source: Oxford Vaccine Group].
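The models behind those percentages often reduce to a simple rule of thumb: if one sick person infects R0 others on average in a fully susceptible population, an outbreak fizzles once more than 1 - 1/R0 of the herd is immune. Here's a quick sketch of that arithmetic in Python; the R0 ranges used (about 5 to 7 for polio, 12 to 18 for measles) are commonly cited estimates rather than figures from this article, but they land close to the percentages above.

```python
# Herd immunity threshold: a standard approximation is 1 - 1/R0, where
# R0 is the average number of people one sick person infects in a fully
# susceptible population. The R0 ranges below are commonly cited
# estimates, not figures from this article.

def herd_immunity_threshold(r0):
    return 1 - 1 / r0

for disease, (low, high) in {"polio": (5, 7), "measles": (12, 18)}.items():
    print(f"{disease}: roughly {herd_immunity_threshold(low):.0%} to "
          f"{herd_immunity_threshold(high):.0%} of the herd must be immune")

# polio: roughly 80% to 86% of the herd must be immune
# measles: roughly 92% to 94% of the herd must be immune
```

The formula also explains why measles demands more of the herd than polio: the bigger R0 is, the closer 1 - 1/R0 creeps toward 100 percent.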
However, just because herd immunity is achieved doesn't mean it's foolproof. Vaccines themselves aren't 100 percent effective, and diseases can still strike those who aren't immune. In some instances, an immunization is only effective for a few years, so the protection may have worn off for a majority of the herd by the time of an outbreak. Childhood vaccinations may also simply delay the age at which outbreaks occur, and some diseases are far more dangerous in adulthood. For example, if a pregnant woman contracts rubella, or German measles, she and her unborn baby face far more serious complications than a child who endures the disease.
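One way to see why imperfect vaccines matter: only the vaccinated people whose shots actually worked count toward the threshold, so the coverage needed is the threshold divided by the vaccine's efficacy. Here's a quick sketch under those assumptions; the efficacy figures are illustrative rather than from the article, though roughly 97 percent is often cited for two doses of measles vaccine.

```python
# With an imperfect vaccine, only (coverage x efficacy) of the herd is
# actually immune, so the coverage needed is threshold / efficacy.
# Efficacy values here are illustrative assumptions.

def required_coverage(r0, efficacy):
    threshold = 1 - 1 / r0        # fraction that must be immune
    return threshold / efficacy   # fraction that must be vaccinated

# Measles-like R0 of 15 with roughly 97% two-dose efficacy:
print(f"{required_coverage(15, 0.97):.0%}")  # -> 96%

# A hypothetical vaccine that's only 90% effective would demand more
# than 100% coverage -- vaccination alone couldn't reach the threshold:
print(f"{required_coverage(15, 0.90):.0%}")  # -> 104%
```

That second result is the sobering one: for a contagious enough disease, a weak vaccine can't achieve herd immunity no matter how many people roll up their sleeves.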
So, if vaccines aren't foolproof, or merely put off a mass outbreak in the herd, do we even need them at all? Some people don't think a shot in the arm is worth the good of the herd.