
The History of Vitamins

Part Two
by Chris A. Knobbe, MD

The History of Vitamins

Let’s examine the history of vitamins, just briefly. We need this perspective and context in order to assess any potential benefits of consuming multivitamins for the management of AMD.

Bear with me, but I believe this history of vitamins is absolutely critical to understanding how and why we remain obsessed with synthetic vitamins and somehow still convinced that consuming them will cure or treat many ills, including macular degeneration, even as we are constantly faced with failures in this regard.

Biochemist Kazimierz Funk, whose name is commonly anglicized to Casimir Funk, is credited with formulating the concept of vitamins in 1912.20 Funk proposed the existence of these “vital amines,” or “vitamines,” a term later shortened to “vitamins,” while taking part in research that attempted to discover the cause of the neurological condition beriberi.

Beriberi, which is a dreaded condition of neuropathy, has both “wet” and “dry” forms. “Wet beriberi” is characterized by congestive heart failure with massive edema (swelling) of the legs, while “dry beriberi” is characterized by peripheral neuropathy with severe pain and weakness in the legs, numbness, and extreme emaciation with near-complete appetite loss. The condition often led to death by suffocating convulsions.

Between 1878 and the early 1900s, one thing gradually became clear regarding beriberi: the disease seemed to occur primarily in those consuming a monotonous diet consisting mostly of polished white rice. Funk proposed that beriberi was the result of a deficiency of one of these “vitamines.”

In this particular case, Funk believed the missing “vitamine” was a substance present in brown rice, but not in white rice – and he was right.

At that time, in fact, Funk described the now obscure conditions of beriberi, scurvy, rickets, and pellagra all as “deficiency disorders,” meaning that each condition was caused by the lack of a certain “vital amine,” where vita indicates “life” and amine referred to “a nitrogenous substance essential for life.”20

Vitamin Deficiency and Disease

In 1933, Robert Runnels Williams (1886 – 1965), a dedicated chemist, finally isolated the nutrient missing in the diet of those afflicted with beriberi. That vitamin was thiamine, or vitamin B1.21 This vitamin was eventually mass-produced and used to “fortify” foods to prevent beriberi.

Notably, beriberi was a disease caused by an extremely monotonous, bland diet consisting largely of polished white rice. Interestingly, it was the polishing of the rice that removed the bran and germ and, with them, the B vitamins.

Similarly, scurvy was a disease primarily of sailors and, due to the lack of vitamin C, could affect nearly half of any ship’s crew traveling from Europe to the Caribbean. This occurred after many weeks or months at sea without any fresh fruit or vegetables. But it wasn’t until 1932 that vitamin C was first identified and isolated.22

Rickets, a condition characterized by weakening and bowing of the bones, is secondary to vitamin D deficiency. In 1824, D. Scheutte first prescribed cod-liver oil to treat rickets.23 Scheutte, of course, had no idea exactly what nutrient was in the cod-liver oil that led to its beneficial effect on the disease, but he knew it worked. Cod-liver oil had already been used for centuries for its many health-promoting benefits. Interestingly, when the oil of fish liver is properly extracted and stored, it is a tremendous source of both vitamin D and vitamin A.

Although Scheutte had properly treated and cured rickets with cod-liver oil, it would be more than a century after his effective treatment, in 1931, before three different teams of researchers finally discovered vitamin D2, isolating at least one of the active forms of vitamin D. One of those teams included the German chemist Adolf Windaus, who received the Nobel Prize in Chemistry in 1928 for his contributions to the understanding of sterols in relation to vitamins.

Along the road to this discovery, however, various researchers determined that rickets was curable with certain components of an ancestral diet, including butterfat and cod-liver oil, and, oddly enough, with sunshine.23 Of course, we now understand that 7-dehydrocholesterol, a precursor to active vitamin D, is converted to vitamin D3 in the skin upon exposure to ultraviolet light.

Pellagra, a disease characterized by the “Four D’s” of photosensitive (sunlight-induced) dermatitis, diarrhea, dementia, and death, became endemic in the U.S. after 1902. Curiously, the disease primarily affected people in the Southern United States. With thousands sick and dying, a physician by the name of Joseph Goldberger, M.D., employed by the U.S. Public Health Service, was commissioned to discover the “infectious cause” of this disease. Goldberger had already shown great success in teasing out the causes of other infectious diseases, including yellow fever, dengue, typhus, hookworm, and diphtheria.

But what Goldberger found was that pellagra seemed to affect those consuming the typical diet of the Southern poor, which was meat, meal, and molasses. The meat was generally pork fatback, while the meal was cornmeal. This cheap, filling, monotonous diet, Goldberger was convinced, was missing a vitamin or vitamins, leading to the disease.

In 1915, Goldberger successfully treated the disease by providing those affected with fresh vegetables, meats, milk, and eggs. This cured those already afflicted and prevented the disease in others.24 But Goldberger’s discovery wasn’t accepted by the Southerners and, in fact, pride kept most from admitting that their diet might be at fault. Some even accused Goldberger of perpetrating a hoax.

So while Goldberger knew how to prevent and successfully treat pellagra with a healthy diet, the fact that he hadn’t isolated the missing nutrient prevented his theory and treatment from being accepted. Eleven years later, in 1926, Goldberger and his colleagues determined, on the basis of animal experiments, that the “pellagra preventive factor” was a water-soluble B vitamin. Goldberger died in 1929, and it wasn’t until 1937 that Conrad A. Elvehjem, a biochemist at the University of Wisconsin, finally isolated the nutrient, which was then called niacin, or vitamin B3.25

Nutrient-Deficiency Diseases

By the 1930s, four major disease conditions, which had plagued, maimed, and killed hundreds of thousands of people, in some cases for centuries, were all found to be nutrient-deficiency diseases. Throughout most of history, the causes of these diseases remained highly elusive, primarily because physicians and scientists were searching for causative “germs.”

As Catherine Price, author of the book Vitamania, so eloquently wrote:

Germ theory’s central tenet – that disease is caused by the presence of something – hid the idea that disease could also be caused by something that is lacking. Germ theory’s light was so bright, so illuminating, that it blinded scientists to the idea that disease could be caused by something that wasn’t there.26

As it turned out, beriberi could be prevented and treated by the consumption of unprocessed brown rice rather than processed, “polished” rice. Scurvy could be prevented and treated with small amounts of fresh fruits or vegetables. Rickets could be prevented and treated with butter or cod-liver oil, or even sunshine. And finally, pellagra could be prevented and treated with a diet containing fresh vegetables, meats, eggs, and dairy.

Imagine: all of these diseases were preventable and treatable with nothing other than healthy, well-rounded, whole-food-based diets of both plant and animal origin. This should not be taken as a trivial historical footnote.

The question is, will this have implications for macular degeneration?

Despite the fact that it was simply whole food that could both prevent and treat such diseases, the U.S. Government, and the many other nations’ governments that followed suit, didn’t promote healthy diets to their people. Instead, they began to “fortify” the very foods and diets that were causing such diseases with artificially created or laboratory-extracted vitamins, in bulk.27,28

Fortification, the process of adding vitamins or minerals to food, included the addition of vitamin D to milk in the late 1930s; the “enrichment” of otherwise nutrient-deficient white bread, made of refined white flour, with thiamine (B1) and niacin (B3) in the 1940s; ready-to-eat cereals (all of which are nutrient-deficient) made with added iron, B vitamins, and folate; and eventually the addition of calcium to a variety of products beginning in the 1980s, just to name a few examples.28,29

This truly has led to an odd paradox. The addition of synthetic vitamins to food — fortification or “enrichment” — has allowed us to consume the nutrient-deficient diets that have caused the nutrient-deficiency diseases in the first place.

As such, the Standard American Diet (SAD), as of 2009, is made up of at least 63% refined, processed, nutrient-deficient foods in the form of refined white flour, refined and added sugars, polyunsaturated vegetable oils, and synthetically produced trans fats.30

This, in turn, has led to epidemics of “Westernized” diseases, such as heart disease, cancer, type 2 diabetes, Alzheimer’s disease, metabolic syndrome, hypertension (high blood pressure), obesity, and even autoimmune diseases (I’ll reference all of this briefly), while the fortification has indeed prevented diseases such as beriberi, pellagra, scurvy, and rickets.

So began the fascination with vitamins in the form of pills. Despite failure after failure of such vitamins to achieve their intended benefits in literally thousands of studies in recent decades, our obsession with them continues. Even the medical orthodoxy has “taken” to this obsession.

Returning to our original question: clearly, our 19th-century ancestors weren’t consuming synthetic vitamins. The concept of vitamins didn’t even exist until 1912. Yet only a few cases of macular degeneration existed anywhere on the planet.

Think about that…

“Westernized” Diets, Vitamin Deficiencies, and Degenerative Diseases

For those who haven’t read my book or the Introduction, it is worth reviewing here that, in 1939, Weston A. Price, a scientist, researcher, and dentist often referred to as the “Isaac Newton of Nutrition,” published a treatise of nutritional and anthropological studies in his book, Nutrition and Physical Degeneration.31

During the period from approximately 1931 to 1937, Price scientifically and systematically studied populations on five continents, in fourteen nations, among hundreds of different tribes and cultures, and among many thousands of people. He set out to determine exactly which diets led to excellent dental health versus dental decay, and to good general physical health versus physical degeneration.

Price found, in virtually every single individual and population, that both dental health and general physical condition remained superb, as long as people consumed their native, traditional diets.

However, when people began to consume the foods that Price referred to as the ‘displacing foods of modern commerce,’ which he defined as consisting of refined white flour, sugar, canned goods, sweets, confectionery, and vegetable oils, they soon developed rampant dental decay, which was then followed by the onset of many degenerative conditions, such as arthritis and cancer.

Young, growing children consuming processed-food diets developed facial deformities, with narrowing of the dental arches and, consequently, misaligned teeth. Dental caries (cavities) often affected 20 to 30 percent or more of the teeth in those consuming the “displacing foods,” but generally only around one percent or less in those consuming native, traditional diets.

Those consuming significant amounts of the “displacing foods” were also far more prone to infectious diseases to which they once had immunity, such as tuberculosis. Finally, Price found an onslaught of mental disorders in those consuming the processed foods, including suicidal ideation in those severely affected by dental disease, abscesses, and loss of teeth. Most of these societies had no dentists, nor any access to one, and the dental decay and abscesses meant chronic, severe, unrelenting pain and difficulty consuming the foods they once enjoyed.

Price analyzed the “primitive” people’s native, traditional diets by sending thousands of food samples back to his labs in the United States. What he discovered was, by any analysis, shocking.

The “primitive” diets, which kept people brilliantly healthy, happy, and free of chronic Westernized diseases, contained at least ten times more fat-soluble vitamins (primarily A, D, and what Price referred to as the “X-factor,” now believed to be K2), four times more water-soluble vitamins (all the B vitamins and C),32 and approximately 1.5 to 60 times more minerals than did the processed-food-laden, Westernized (American) diets of his day.33

A boatload of more recent research supports Price’s principles. I’ll mention only some of the major research in this article…

Loren Cordain and colleagues at Colorado State University concluded in a review that the Industrial Revolution, with its attendant widespread production and consumption of refined vegetable oils, refined cereal grains (especially refined white wheat flour), and refined sugars, combined with the Modern Age “junk food” era, has resulted in numerous “maladaptive developments,” including a host of pathophysiological and degenerative diseases.34

Cordain’s group found that, currently in the U.S., pasteurized dairy, refined cereal grains (again, primarily refined white wheat flour), refined sugars, polyunsaturated vegetable oils, and alcohol account for some 70% of total daily caloric consumption.35 These five food groups are almost entirely devoid of micronutrients, that is, vitamins and minerals.35

This is the recipe for metabolic disaster.

It’s worth repeating.

This is the recipe for metabolic disaster.

________

This is Part Two of an eight-part series.

Find Part One here.

Find Part Three here.

References: See References Page Here.
