Imagine a time when people thought of wine and beer as "hygienic" beverages that protected them from water-borne diseases. In early 19th-century France, wine accompanied every meal and was considered a medicinal tonic. Beer was thought to be healthier than water because it contained nutrients. Those were the days, right?
What were people to do, then, when wine and other fermented beverages began to develop their own diseases? Alcohol could turn sour, smelly or bitter, or even lose its flavor altogether. It might take on an oily sheen or become cloudy. The only food preservation practices at the time were curing, canning and fermenting: people knew how to delay food spoilage, but knew little about what caused it. It would take a deeper understanding of what made foods go bad before the method of pasteurization could be developed.
The theory of spontaneous generation was a popular explanation at the time for why certain forms of life seemed to appear suddenly out of decaying matter (think maggots emerging from rotting flesh). But as scientists gained a better understanding of how reproduction worked, it became clear that spontaneous generation didn't explain everything. Reproduction by cell division had not yet been discovered, however, so scholars still believed that smaller organisms like bacteria and fungi grew from inanimate matter.
So what does all of this have to do with pasteurization? It was in this environment of scientific uncertainty that Louis Pasteur was called upon to study the diseases of wine. Read on to learn the difference between wine and vinegar, and how that discovery led to mass pathogenocide.