The Supplement Question
An Essay on Vitamins, Minerals, and the Business of Deficiency
In the 1930s, a dentist named Weston A. Price traveled the world studying isolated populations still eating traditional diets. He found fourteen distinct groups—in Swiss Alpine villages, the Scottish Outer Hebrides, Alaska, the South Pacific, the African interior—with virtually no tooth decay, no chronic disease, and robust physical development. Their diets varied dramatically. Some ate primarily animal products, others relied heavily on plants, some consumed grains, others didn’t. But all shared certain characteristics: whole foods, traditional preparation methods, and complete absence of refined and processed products.
None of them took supplements.
The Inuit of the Arctic lived for months in complete darkness, with no sunlight exposure whatsoever. According to modern vitamin D theory, they should have been crippled by rickets. They weren’t. Swiss villagers in the Lötschental Valley had no tuberculosis during an era when TB was Switzerland’s leading cause of death—their fat-soluble vitamin intake was ten times higher than the American average, not from supplements, but from butter, cheese, and organ meats from animals grazing on mineral-rich alpine pastures. African tribesmen seemed immune to the diseases that devastated European visitors, despite going barefoot, drinking “unsanitary” water, and living in mosquito-infested areas. The Europeans required complete coverage and protective netting; the Africans did not.
The pattern was consistent across every population Price studied: robust health on traditional diets, regardless of macronutrient composition or environmental hardship; rapid deterioration when industrial foods were introduced. The first generation to eat processed foods showed narrowed facial structure, crowded teeth, and increased susceptibility to infection. Subsequent generations showed progressively worse deformity and disease.
These populations thrived for countless generations without knowing what vitamins were. They didn’t need to. So why do we assume we need them now?
The Manufacturing Reality
Shannon Rowan, a researcher who has investigated vitamin manufacturing processes, describes what’s actually in the bottles lining pharmacy shelves. The images on labels—oranges, sunlight, leafy greens—bear no resemblance to the contents.
Vitamin D3 begins as lanolin extracted from sheep’s wool. The lanolin undergoes a complex industrial process: treatment with solvents including chloroform and benzene, exposure to ultraviolet radiation, purification through aluminum oxide columns, and crystallization. The resulting cholecalciferol is chemically defined, but it arrives divorced from the way a human body makes vitamin D when sunlight strikes skin. The body produces vitamin D through a cascade of enzymatic reactions in skin, liver, and kidneys—a process involving multiple intermediate compounds and regulatory feedback loops. A factory process bypasses all of this, delivering an isolated end-product directly into a system designed to regulate its own production.
Vitamin B12 marketed as cyanocobalamin contains cyanide—the “cyano” in the name is not decorative. The manufacturing process involves bacterial fermentation in “proteinaceous material from waste effluents” (industrial documentation’s euphemism for sewage sludge), followed by treatment with cobalt salts and cyanide to produce the stable commercial form. The body must cleave the cyanide group from the molecule to use the vitamin, creating a small but continuous toxic burden with each dose.
Vitamin C supplements do not come from citrus fruits. Commercial ascorbic acid is produced through industrial fermentation of corn-derived glucose by the black mold Aspergillus niger, followed by chemical processing. A 2020 report estimated that 80% of the world’s ascorbic acid is manufactured in China, often in facilities that also produce industrial chemicals—sometimes using shared equipment and supply chains.
Vitamin A (retinol) is synthesized from beta-ionone, itself derived from acetone and acetylene. The Merck Index lists over 100 synonyms for vitamin A compounds, many with distinct toxic profiles. Retinol is a known teratogen—it causes birth defects—yet it is added to prenatal vitamins on the assumption that deficiency poses greater risk than toxicity. The assumption may be backward: liver biopsy studies have found elevated vitamin A in 21.9% of Canadians, 31% of Londoners, 33% of Americans, 40% of Singaporeans, and 49% of Ghanaians studied. What we call “deficiency” may often be toxicity presenting with overlapping symptoms.
Adverse reactions to vitamin supplementation are more common than the industry acknowledges. Users report severe joint and hip pain after vitamin D3 supplementation, resolving after discontinuation. Hypercalcemia symptoms—muscle cramps, heart palpitations, anxiety, eye twitches—can persist for months after stopping supplements. One mother reported her child’s chronic skin breakouts cleared completely after stopping vitamin D supplementation: “Perhaps his body was trying to eject the supplement.” Kidney stones linked to vitamin D supplementation are documented in medical literature but rarely mentioned in marketing. These problems occur even in people taking K2 and magnesium as recommended cofactors—the cofactor cascade doesn’t always prevent harm.
Many users report benefits from supplementation—improved energy, better mood, stronger immunity. The question is not whether anyone has ever benefited from a supplement. The questions are: What percentage benefit versus what percentage are harmed? What are the long-term effects of chronic supplementation across decades? Why do we assume industrial chemicals are required for health when traditional populations thrived without them?
Dawn Lester and David Parker, in What Really Makes You Ill?, document the pharmaceutical manufacturing processes that produce these substances. The EPA’s Pharmaceutical Waste document lists “priority pollutants” used in drug synthesis: “benzene, chlorobenzene, chloroform, chloromethane, o-dichlorobenzene, 1,2-dichloroethane, methylene chloride, phenol, toluene and cyanide.” These are classified as “extremely hazardous chemical compounds.” They are also the solvents and reagents used to manufacture the vitamins marketed as essential for health.
The safety documentation tells a story the marketing obscures. The pharmaceutical-grade vitamin C sold by Merck carries a Material Safety Data Sheet. Under “Hazard Identification,” it reads: “May be harmful if swallowed.” The document classifies the substance as a skin and eye irritant. Under “Precautionary Statements,” it advises: “Do not eat, drink, or smoke when using this product.”
Spectrum Chemical’s documentation for USP-grade vitamin D3 (cholecalciferol) is more direct. The hazard classification lists it as “Acute toxicity, Category 2.” The signal word is “Danger.” The substance “may be fatal if swallowed.”
Critics will note that safety data sheets address industrial handling—bulk powders, inhalation risks, concentrated forms. This is true. But the same substance is classified as hazardous when shipped between factories and reclassified as essential nutrition when placed in a bottle with a cheerful label. The chemistry does not change. Only the marketing does.
The same cholecalciferol sold as vitamin D3 supplements is also marketed as rat poison under brand names like Quintox. The mechanism of action is identical: it causes hypercalcemia, softening bones until they fracture and inducing fatal cardiac arrhythmias. The only difference is dosage—and the confidence of the consumer.
Defenders of supplementation invoke the “chemically identical” argument: if synthetic ascorbic acid has the same molecular structure as ascorbic acid in an orange, the body cannot distinguish between them.
The argument has an attractive simplicity—and a fundamental flaw. For two substances to be chemically identical, the natural form must first be isolated and characterized. But isolation itself changes a substance’s context. Vitamin C in an orange exists within a matrix of bioflavonoids, fiber, enzymes, and hundreds of other compounds. These cofactors affect absorption, utilization, and metabolism. Isolated ascorbic acid is not vitamin C in its biological context—it is one component, extracted and purified, operating without the support system that makes it functional in living systems.
The scurvy evidence is instructive. Lind’s concentrated citrus juice—which should have contained the same ascorbic acid as fresh fruit—failed to prevent scurvy while fresh fruit succeeded. Something was lost in processing. The “chemically identical” substance behaved differently from its whole-food source. Modern studies continue to find different outcomes from whole-food vitamin sources versus isolated supplements. The body apparently can distinguish what chemistry professors insist is identical.
The Cofactor Problem
Even advocates of supplementation acknowledge a fundamental difficulty. Jeff Bowles, one of the most prominent proponents of high-dose vitamin D3, concedes that D3 supplementation depletes K2, magnesium, boron, zinc, and vitamin A. His defense: “D3 itself is completely non-toxic.” The cofactor depletion, in his view, is a separate problem to be managed with additional supplements.
This is the supplement treadmill in its purest form. Take D3, and you need K2 to prevent arterial calcification. Take K2, and you need magnesium to activate it. Take magnesium, and you may need B6 for absorption. The cascade never ends because isolated industrial chemicals lack the cofactor matrix that whole foods provide.
When you eat liver, you receive vitamin A alongside zinc, copper, B vitamins, and dozens of other compounds in proportions your body recognizes. When you take a retinol capsule, you receive an isolated molecule at concentrations that may be hundreds of times higher than any food would provide, without the cofactors that regulate its metabolism. The body must rob Peter to pay Paul—depleting stored nutrients to process the supplement—or store the excess in tissues where accumulation causes harm.
The fortification of foods operates on the same logic. Milk fortification with vitamin D began in the 1930s as a solution to rickets. By the 1950s, osteoporosis had emerged as a major public health concern—affecting the first generation raised on fortified foods. The official response was not to question fortification but to increase it, adding calcium to the equation. Today, the United States fortifies flour, milk, cereals, juice, and dozens of other products, yet rates of the conditions fortification was designed to prevent have not improved. As one commenter observed: “Most flour, cereals, milks, some juices, and children’s foods are fortified. Yet we still have all the health issues despite the added vitamins. In fact, with fortification health has gone down.”
The circular logic is difficult to escape: create the test, define deficiency by the test, sell the solution, treat the side effects of the solution with additional solutions. The goalposts shift constantly—what was considered an adequate vitamin D level in 1990 had become “severe deficiency” by 2024, ensuring an expanding market for supplementation.
Two Variables, One Relationship
The cofactor cascade reveals something deeper than a technical problem with supplement formulation. It reveals the failure of reductionism itself—the assumption that complex biological systems can be understood by isolating their components.
Chemistry is reductionism’s perfect language. It reduces a lemon to ascorbic acid, sunlight to cholecalciferol, liver to retinol. The reduction feels like progress—messy reality clarified into clean molecular structures. But the clarity is an illusion. The lemon is not ascorbic acid. The lemon is a living system containing thousands of compounds in relationships we barely understand, organized by biological processes we cannot replicate. When we extract one molecule and call it “the active ingredient,” we have not discovered the lemon’s secret. We have destroyed it.
This matters because reductionism creates the conditions for a particular kind of cognitive capture.
Daniel Kahneman, building on his work with Amos Tversky, distinguished between fast thinking and slow thinking—what he called System 1 and System 2. Fast thinking connects two variables: cause and effect, problem and solution. Slow thinking holds multiple variables, weighs conditional factors, tolerates ambiguity. Individuals can slow-think. Collectives cannot. The collective mind—the public, the herd, the market—runs on heuristics, on pattern-matching shortcuts that evolved to help social primates navigate complexity without exhausting cognitive resources on every decision.
The supplement industry installs a two-variable heuristic: deficiency causes disease; supplement prevents disease. Vitamin D deficiency causes rickets; vitamin D supplement prevents rickets. Vitamin C deficiency causes scurvy; vitamin C supplement prevents scurvy. Two variables, one relationship. The formula is easy to hold, easy to repeat, easy to act on. It asks nothing—no weighing of probabilities, no assessment of context, no tolerance for complexity.
The “chemically identical” argument exists to protect this heuristic. If synthetic ascorbic acid is identical to ascorbic acid in an orange, then the two-variable formula holds. If they are not identical—if the body distinguishes between isolated molecules and whole-food matrices—then the formula collapses. Suddenly you need slow thinking: Which form? What cofactors? What dose? What context? What toxic burden is the person carrying? What is their diet? Their sun exposure? Their stress level? The variables multiply. The heuristic fails.
This is why the supplement paradigm resists correction despite accumulating evidence of harm. The evidence requires slow thinking to interpret. The heuristic requires only repetition. When a study shows that vitamin E supplementation increases mortality, the heuristic absorbs the blow: wrong form, wrong dose, wrong population. The two-variable structure remains intact. Vitamin E deficiency still causes disease; the right vitamin E supplement still prevents it. The failure is always in the implementation, never in the premise.
The same pattern appears in other captured domains. Cholesterol causes heart disease; statin prevents heart disease. Virus causes infection; vaccine prevents infection. The heuristic structure is identical. Two variables, one relationship, installed at the foundation of trillion-dollar industries. The structure survives because the collective mind cannot perform the slow thinking required to dismantle it.
Traditional peoples had no need for this heuristic because they had no need for supplements. They ate food. The complexity was handled by living systems—plants and animals organizing nutrients into forms the body recognized. The reductionist project attempts to replace this biological intelligence with chemical analysis. It has been running for a century. The results are in: obesity, diabetes, heart disease, cancer, “autoimmune conditions,” mental illness—all rising in populations consuming the most supplements and fortified foods. The two-variable heuristic says this is impossible. The evidence says it is happening.
The way out is not a better heuristic. The way out is slow thinking—holding multiple variables, tolerating uncertainty, recognizing that living systems cannot be reduced to molecular inventories. This is difficult. It is also necessary.
The Fortification Trap
Codex Alimentarius—the international food standards body created by the FAO and WHO—has promoted folic acid fortification of grain products, and dozens of countries have made it mandatory. The policy operates on the assumption that mass medication through the food supply is safer than allowing individuals to make their own nutritional choices. There are no long-term studies of populations consuming fortified foods across multiple generations. There are no control groups. There is no monitoring of total intake when fortified foods, supplements, and naturally occurring nutrients are consumed together.
The metallic iron added to breakfast cereals since the 1940s is actual metal—iron filings that can be extracted from a bowl of cereal using a magnet. This is not the bioavailable iron found in liver or red meat. It is industrial metal added to food products under the assumption that “iron is iron.” The body disagrees. Metallic iron is poorly absorbed and, when absorbed, creates different effects than heme iron from animal sources. Excess accumulates in tissues where it acts as a prooxidant, contributing to the oxidative stress underlying most chronic disease.
The impossible opt-out is perhaps the most troubling aspect. If you want to avoid synthetic vitamins, you must avoid commercial flour, milk, most cereals, many juices, and an expanding list of processed foods. Fortification operates as mass medication without consent—a population-level experiment conducted without the infrastructure to identify adverse effects or the willingness to acknowledge them if they appeared.
The business model is elegant in its circularity. Industrial agriculture is blamed for depleting soil nutrients—a claim that conveniently justifies industrial supplements as the solution. Industrial food processing removes whatever nutrients whole foods contained. Industrial fortification adds synthetic replacements. When health problems persist, industrial supplements are marketed as the answer. Each step generates profit while ensuring the problem is never solved—only managed.
Defenders of supplementation often cite “the degradation of nutrients from industrial farming techniques” as justification for taking supplements. The claim that modern crops contain fewer nutrients than those grown a century ago is widely repeated but rarely examined. What is certain is that the claim serves the supplement industry regardless of its accuracy—the same industrial system that allegedly depletes soil minerals happens to sell the mineral supplements positioned as the solution. Whether soil depletion is real, exaggerated, or invented, the framing keeps the conversation inside the industrial paradigm: industrial problem, industrial solution. The answer to industrial agriculture—if it is indeed the problem—is not industrial supplements. It is food grown differently.
How Vitamins Were “Discovered”
Vitamins were “discovered” roughly a century ago through a process that should sound familiar to anyone who has examined virus isolation methodology. Researchers fed animals severely restricted diets, observed disease symptoms, identified substances that reversed the symptoms, and named those substances “vitamins”—from Casimir Funk’s 1912 coinage “vitamine,” for “vital amine,” though most vitamins are not amines.
The early vitamin A research illustrates the problem. Researchers used heated casein (a milk protein) as the base of a “vitamin A-free” diet. Animals on this diet developed symptoms attributed to vitamin A deficiency. When the researchers added foods containing vitamin A, the symptoms resolved. The experiment appeared to prove that vitamin A deficiency caused disease.
But heated casein is not merely vitamin A-free. It is a denatured, toxic substance that would cause illness regardless of vitamin content. The experiment proved that adding certain foods could partially offset the damage from a deliberately harmful diet. It did not prove that vitamin A deficiency, in isolation, causes disease under normal conditions.
The scurvy story is more instructive than usually acknowledged. For centuries, scurvy was believed to be contagious—sailors developed identical symptoms one after another, the classic pattern of “outbreak.” More than two million sailors died between 1500 and 1800, victims of this supposed infection. James Lind’s 1747 trial showed that citrus fruits cured scurvy, but he did not conclude that scurvy was a simple vitamin C deficiency. He believed citrus protected against the “true” cause: dampness.
Here is the detail that vitamin C enthusiasts rarely mention: Lind developed a concentrated citrus juice called “rob” that should have contained the same vitamin C as fresh fruit. It didn’t work. Sailors drinking rob daily still developed scurvy. Only fresh citrus was effective. Something in the whole fruit—destroyed by the concentration process—was essential. In 1928, researchers isolated a compound later named “vitamin C” and declared the mystery solved, but they never explained why isolated ascorbic acid behaves differently from the complex matrix of a lemon.
Daniel Roytas, in Can You Catch a Cold?, documents how pellagra, beriberi, and rickets were also initially believed to be contagious diseases. People living together developed identical symptoms simultaneously—the pattern that “proves” infection. The germ theory framework delayed recognition of nutritional factors for decades. Robert Koch himself convinced Japanese researchers that beriberi was caused by a microorganism, sending them on a fruitless search that cost thousands of lives.
The lesson should be clear: viewing health through a single lens—whether germs or vitamins—obscures the complex reality. Scurvy, pellagra, and beriberi were conditions of profound malnutrition and toxic exposure, not simple deficiencies of isolated molecules.
The Toxic Triad
Dr. Thomas Levy, a cardiologist who has spent decades researching vitamin C, has identified a pattern in three nutrients heavily promoted for supplementation: calcium, iron, and copper. Each follows a “little good, a little more bad” dose-response curve—essential at low levels, toxic at the levels achieved through supplementation and fortification.
Calcium supplementation has been associated with increased cardiovascular events. The coronary artery calcium score—measuring calcium deposits in arteries—is one of the most reliable predictors of heart attack risk. Elevated intracellular calcium is both a marker and cause of oxidative stress, the underlying mechanism of most chronic disease. Yet calcium fortification and supplementation continue at population scale.
Iron presents a similar picture. Cancer cells accumulate iron preferentially. The “normal” ferritin range of 25-400 ng/mL was established by measuring sick populations; optimal levels may be below 25 ng/mL. The metallic iron added to breakfast cereals compounds the exposure: a 1981 study found that enriched bread produced higher serum ferritin levels and more frequent abdominal complaints in women.
Copper toxicity has been linked to hypertrophic cardiomyopathy. Chelation to remove excess copper improves cardiac function in affected patients. Yet copper is routinely added to multivitamins and fortified foods.
The common thread: each of these nutrients is essential for life, but the body maintains tight homeostatic control over tissue levels. Oral supplementation bypasses these controls, delivering doses that exceed the body’s regulatory capacity. The result is accumulation in tissues where these nutrients become prooxidants rather than nutrients.
Magnesium stands as a notable contrast. Even researchers critical of supplementation often make an exception for magnesium. It blocks calcium channels rather than adding to calcium burden. Observational studies consistently associate higher magnesium intake with reduced all-cause mortality. Unlike fat-soluble vitamins, excess magnesium is excreted rather than stored.
The case for magnesium supplementation is substantial. It is a cofactor in over 600 enzymatic reactions. It is essential for ATP synthesis—the body’s energy currency. It regulates ion channels affecting nerve and muscle function. It influences protein synthesis and bone metabolism. An estimated 68% of American adults don’t consume the recommended daily intake.
Modern conditions conspire to create magnesium deficiency. Processed foods strip magnesium during refining—white flour contains only 16% of the magnesium in whole wheat. Industrial agriculture is accused of depleting soil magnesium through decades of monoculture and synthetic fertilizers. Glyphosate chelates magnesium, making it unavailable even when present in soil. Stress increases magnesium excretion. Caffeine, alcohol, and many medications—especially proton pump inhibitors and diuretics—deplete magnesium further.
Standard blood tests measure serum magnesium, but only 1% of the body’s magnesium circulates in blood. The body maintains serum levels by robbing intracellular and bone stores, meaning serum tests can appear normal while tissue depletion is severe. More accurate tests exist—RBC magnesium, ionized magnesium—but are rarely ordered.
Does this mean magnesium supplementation is justified? The evidence suggests it may be the genuine exception to the critique of supplementation. Magnesium citrate, glycinate, and threonate are well-absorbed forms without the cofactor cascade problems of fat-soluble vitamins. Toxicity is rare because excess is excreted.
Yet even here, the central question remains: why does modern life create conditions requiring supplementation? The magnesium “deficiency epidemic” is a symptom of industrial agriculture and food processing—problems that cannot ultimately be solved by adding another industrial product to the supply chain. Supplementation may be a necessary adaptation to a toxic environment, but it does not address the underlying causes. It manages a symptom while the disease—industrial civilization’s assault on biological systems—continues unchecked.
The honest position on magnesium: it may be necessary under current conditions, it appears safer than most supplements, but its necessity is an indictment of those conditions rather than a validation of the supplementation paradigm.
The Paradox: When Poison Seems to Heal
Any honest examination of the vitamin question must confront evidence that appears to contradict the critique. Dr. Frederick Klenner, a North Carolina physician, reported in 1949 that he had treated sixty consecutive cases of poliomyelitis with high-dose intravenous vitamin C. All sixty recovered without paralysis. He presented his findings to the American Medical Association. They were ignored.
Klenner also documented 300 consecutive uncomplicated births in women who followed his vitamin C protocol during pregnancy. The babies were notably robust: eyes open, strong, alert, free of the birth complications common in his era. Later researchers have documented high-dose intravenous vitamin C reducing mortality in sepsis and improving outcomes in cancer patients.
Dr. Thomas Levy has compiled extensive documentation of vitamin C’s therapeutic effects. The evidence cannot be dismissed. Something real is happening when high-dose vitamin C is administered during acute illness.
But context matters. Klenner believed he was treating a viral disease. He operated within the germ theory framework of his time. What if the framework was wrong? To understand why therapeutic megadoses might work while daily supplementation may not, we need to examine what these interventions were actually treating—and whether the “deficiency” and “infection” frameworks have obscured a simpler explanation: poisoning.
The polio epidemics of the 1940s and 1950s correlated precisely with pesticide use—first lead arsenate, then DDT. Large-scale production of DDT began during World War II. In 1945, DDT was released for civilian use, marketed as safe despite existing evidence of neurotoxicity. Polio cases exploded. The pattern was not random: summer outbreaks correlated with summer spraying. Agricultural regions showed higher rates than cities. Children—smaller bodies, higher surface-area-to-weight ratio, more time playing outdoors in sprayed areas—were disproportionately affected.
The National Institutes of Health demonstrated in the mid-1940s that “DDT evidently damaged the same part of the spinal cord as polio.” Endocrinologist Morton Biskind testified to Congress in 1951 that DDT could “produce degeneration of the anterior horn cells of the spinal cord in animals”—the exact pathology attributed to poliovirus. Harrison’s Principles of Internal Medicine noted that “lameness resulting from heavy metal poisoning is clinically sometimes difficult to differentiate from polio.” Lead arsenate, the pesticide DDT replaced, also caused paralysis clinically indistinguishable from polio.
In 1878—before the “virus” was proposed as cause—neurologist Alfred Vulpian demonstrated that dogs poisoned with lead developed symptoms identical to human polio. In 1883, the Russian researcher Popow showed arsenic produced the same paralysis. The arsenic-based pesticide Paris Green had been widely used in agriculture since 1870. Lead arsenate replaced it in Massachusetts in 1892. Two years later, neighboring Vermont suffered the first major polio epidemic recorded in the United States. Dr. Charles Caverly, the Vermont physician who investigated it, concluded: “We are very certainly not dealing with a contagious disease.”
But the microbe hunters prevailed. In 1908, Landsteiner and Popper injected brain tissue from a deceased polio patient—mixed with undefined “soup”—directly into monkeys’ brains. The monkeys became ill. This was declared proof of viral causation. No one asked whether injecting foreign tissue and unknown substances into brain cavities might cause inflammation and paralysis regardless of viral presence.
Polio cases peaked in 1952 and began declining before the Salk vaccine was introduced in 1955. The decline tracked with restrictions on DDT use. In the Philippines, the first tropical polio epidemic occurred when US forces sprayed massive quantities of DDT after World War II. Populations in neighboring unsprayed areas had no paralysis. When DDT was exported to developing countries after US restrictions, polio “epidemics” followed. When DDT use was finally banned, polio declined—often before vaccination programs began.
In 1951, Irwin Eskwith successfully treated a child with bulbar paralysis—a severe form of “polio” affecting cranial nerves—using dimercaprol, a chelating agent that binds heavy metals like arsenic and lead. The child recovered. If polio were viral, chelation therapy would have no effect. If polio were heavy metal poisoning, chelation would be the logical treatment. Eskwith’s success was ignored.
If polio was not primarily a viral disease but a toxicological syndrome, Klenner’s success takes on an entirely different meaning. High-dose vitamin C is a documented detoxification agent: it donates electrons, supports glutathione synthesis, and assists the body in processing toxic burden. Klenner wasn’t curing a viral infection—he was supporting detoxification from environmental poisoning. His observations remain valid; his interpretation was limited by the scientific framework of his era.
This reframe has significant implications. It allows the clinical evidence for high-dose vitamin C to stand while questioning whether it validates daily supplementation with synthetic ascorbic acid. IV administration during acute toxic crisis is a fundamentally different intervention than oral supplementation to address supposed deficiency. The former supports an overwhelmed detoxification system; the latter assumes the body requires industrial chemicals to function normally.
Niacin presents the same pattern. Dr. Abram Hoffer, beginning in the 1950s, treated over 5,000 schizophrenia patients with megadose niacin—3,000 to 18,000 milligrams daily, doses hundreds of times higher than the RDA. His results were remarkable: recovery rates double those of conventional treatment, with patients who had been institutionalized returning to normal function. Hoffer described schizophrenia as “chronic encephalitis”—brain inflammation—not a psychological disorder requiring talk therapy or a chemical imbalance requiring pharmaceutical management. His framework was metabolic and toxicological.
The mechanism supports this interpretation. Niacin converts to NAD (nicotinamide adenine dinucleotide), which powers over 450 biochemical reactions including cellular energy production and detoxification pathways. A brain struggling with toxic burden or metabolic dysfunction lacks the NAD to clear the damage. High-dose niacin floods the system with raw material for recovery. As Hoffer observed: “Niacin works so good that nobody believes it.”
But Hoffer’s success with therapeutic megadoses does not validate the addition of niacin to breakfast cereals or the sale of 100 mg capsules to prevent “deficiency.” Pharmacological intervention during metabolic crisis operates by different mechanisms than nutritional supplementation. The schizophrenia patient taking 10,000 mg of niacin daily is not correcting a dietary deficiency—they are using a molecule as medicine to support overwhelmed biological systems. The distinction matters.
The honest position acknowledges unresolved tension. The mechanisms by which high-dose vitamin C produces therapeutic effects during acute illness may operate by pathways entirely distinct from nutritional supplementation. Hormesis—benefit from controlled stress—may explain some effects. Pro-oxidant activity at high doses may trigger adaptive responses. The complete picture remains unclear, and intellectual honesty requires sitting with that uncertainty rather than forcing premature resolution.
What Traditional Peoples Did Instead
The evidence from Weston Price’s research deserves closer examination, because it demolishes the deficiency paradigm from multiple angles.
The Inuit ate their marine mammals raw or fermented—not because they lacked cooking technology but because they understood something modern nutrition has forgotten. Raw and fermented foods preserved the fat-soluble nutrients in forms the body recognized and could regulate. The fat-soluble vitamins in marine mammal blubber arrived with their natural cofactors: vitamin A alongside D alongside K2, in proportions determined by living biological systems rather than laboratory calculations. They developed “deficiency diseases” only when industrial foods replaced their traditional diet.
Inhabitants of the Isle of Lewis in the Outer Hebrides ate primarily seafood, including fish livers and fish liver oil, along with oat porridge and oatcakes. They lived in thatched houses with no chimneys, breathing smoky air day and night. According to modern public health theory, the smoke should have destroyed their lungs, and their limited diet should have caused multiple deficiency diseases. Instead, they had no TB, no heart disease, no cancer to speak of. When modern foods arrived, so did the diseases. Health authorities blamed the smoky houses and mandated chimneys. The chimneys made no difference; the dietary change, not the smoke, had caused the deterioration.
Once Africa became “coca-colonized”—once Coca-Cola, white flour, and sugar reached populations that had previously thrived on traditional foods—the diseases that had devastated European colonizers began proliferating among native peoples who had been immune. The pattern repeated everywhere Price looked: robust health gave way to deformity and disease within a single generation of dietary change.
This is not deficiency being corrected by fortification. This is toxicity and malnutrition caused by industrial food being incompletely masked by industrial supplements.
The Question We Should Be Asking
The vitamin framework asks: “How much of this isolated industrial chemical does the body require?”
The terrain framework asks: “What did humans do for hundreds of thousands of years that we’ve stopped doing? What toxic exposures have we added that may be causing the symptoms we attribute to deficiency?”
The answers to the second set of questions are not mysterious. Traditional populations ate whole foods: organ meats, fermented vegetables, raw dairy, bone broths, properly prepared grains, wild-caught fish, pastured animal fats. They avoided—because they didn’t exist—white flour, refined sugar, industrial seed oils, synthetic additives, pesticide residues, and environmental pollutants. They spent time outdoors, in sunlight, in contact with soil. They moved their bodies as part of daily life rather than sitting in climate-controlled boxes under artificial light.
Modern life has inverted every element of this picture. We eat substances that would not be recognized as food by any traditional culture. We avoid sunlight, fearing skin cancer while slathering ourselves with chemical sunscreens. We live indoors, breathing recirculated air, exposed to electromagnetic fields at intensities billions of times higher than the natural background. We dose ourselves with pharmaceuticals that deplete nutrients and burden detoxification systems.
Then we take vitamins to address the resulting symptoms—adding another industrial product to a lifestyle already saturated with industrial products.
The supplement industry now generates over $150 billion annually worldwide. It operates with minimal regulation compared to pharmaceuticals, allowing health claims that would never survive FDA scrutiny for drugs. Bottles promise immune support, energy enhancement, cognitive function, heart health—all based on the theory that isolated chemicals can substitute for whole foods and healthy living.
The same industrial system that depletes soil minerals, processes nutrients out of food, and pollutes the environment with toxins then sells the “solution” to the problems it creates. It is a perfect business model: manufacture both the disease and the cure.
Unresolved Questions
If synthetic vitamins are truly equivalent to natural forms, why has health declined as fortification increased?
If vitamin D deficiency is epidemic, why didn’t traditional populations in low-sunlight regions suffer the same?
If the body cannot distinguish synthetic from natural, why do studies consistently show different outcomes from whole-food sources versus supplements?
If supplementation is essential, how did humans thrive for hundreds of thousands of years without it?
If high-dose vitamin C works therapeutically during acute illness, does that validate daily supplementation—or suggest a completely different mechanism that has nothing to do with nutritional deficiency?
These questions do not have simple answers. The purpose of raising them is not to provide a new orthodoxy but to prompt better inquiry. The vitamin industry, like the pharmaceutical industry from which it emerged, has not earned unquestioning trust. Safety data sheets exist for a reason. Traditional populations thrived without supplements for a reason. The questions deserve serious consideration rather than dismissal.
Genetics as Blame-Shifting
Mainstream medicine has identified genetic variants in the MTHFR gene that impair the body’s ability to convert synthetic folic acid into its usable methylated form. These so-called “mutations” are common, affecting perhaps 40% of the population to varying degrees. The supplement industry’s response has been to sell methylated B vitamins at premium prices, marketed specifically to those with MTHFR “mutations.”
The framing reveals the assumption. When 40% of a species shares a trait, that trait is not a defect—it is normal variation or, more likely, normal function. The human body did not evolve processing synthetic folic acid. Synthetic folic acid was invented in 1943. For the entire history of human existence before that date, humans obtained folate from food: liver, leafy greens, legumes, fermented foods. These whole-food sources provide folate in forms the body recognizes, accompanied by cofactors that support its metabolism.
What gets labeled an MTHFR “mutation” may simply be the body correctly refusing to process an industrial chemical as though it were food. The “defect” is not in human genetics. The defect is in the assumption that synthetic folic acid—added to flour, cereals, and supplements since mandatory fortification began—is equivalent to dietary folate. The body apparently disagrees.
This is genetics as blame-shifting. Rather than question whether synthetic folic acid belongs in the food supply, medicine identifies a genetic flaw in the humans who react poorly to it. Rather than acknowledge that industrial fortification creates problems, the solution becomes more supplementation—methylated versions, now, at higher cost. The individual is diagnosed as defective. The industrial food system is protected. The supplement industry gains a new market segment. Everyone profits except the person told their genome is broken.
Traditional populations required no genetic testing to know which form of B vitamins to consume. They ate food.
What Life Requires
What traditional peoples understood—and what industrial civilization has forgotten—is that health emerges from a matrix of factors: clean food, clean water, clean air, sunlight, movement, rest, community, purpose. No pill can substitute for this matrix. No industrial product can recreate what only living systems provide.
The vitamins in a lemon are not the lemon. The nutrients in liver are not liver. Whole foods contain thousands of compounds in relationships we barely understand, organized by living processes we cannot replicate. When we reduce nutrition to a checklist of isolated molecules, we have already lost the essential insight: life sustains life. Dead products—however cleverly synthesized—cannot perform this function.
The body is not a machine requiring specific chemical inputs. It is a living system that maintains itself when provided appropriate conditions and sufficient resources. The task is not to calculate the correct dose of cholecalciferol or ascorbic acid. The task is to provide the conditions under which the body can do what it has done successfully for hundreds of thousands of years: heal itself.
The practical implications are simpler than the vitamin industry would have you believe. Eat whole foods—preferably organic, locally grown, in season. Include organ meats, bone broths, pastured animal fats, fermented vegetables, properly prepared grains. Reduce toxic burden: avoid processed foods, industrial seed oils, pesticide residues, unnecessary pharmaceuticals. Get outside: sunlight, fresh air, contact with soil. Move your body as part of daily life. Sleep in darkness. Build genuine community. Find purpose.
These interventions cost nothing to the pharmaceutical industry. They cannot be patented or sold in bottles. They require no laboratory synthesis, no safety data sheets, no supply chains from Chinese factories. They are available to anyone willing to align their lives with the conditions under which human biology evolved.
The vitamin industry offers a different proposition: continue living in ways that make you sick, and buy products to manage the resulting symptoms. It is the same proposition as pharmaceutical medicine, with different packaging and a more “natural” aesthetic. The business model depends on never solving the underlying problem.
There is another way. It requires more effort than swallowing a pill, but it has sustained human health for hundreds of thousands of years. Traditional peoples didn’t know what vitamins were. They didn’t need to.
References
Bailey, Mark. The Final Pandemic: An Antidote to Medical Tyranny. Dr. Sam Bailey, 2023.
Cowan, Thomas S., and Sally Fallon Morell. The Contagion Myth: Why Viruses (Including “Coronavirus”) Are Not the Cause of Disease. Skyhorse Publishing, 2020.
Engelbrecht, Torsten, Claus Köhnlein, Samantha Bailey, and Stefano Scoglio. Virus Mania: How the Medical Industry Continually Invents Epidemics, Making Billion-Dollar Profits at Our Expense, 3rd ed. Books on Demand, 2021.
Gober, Mark, et al. An End to Upside Down Medicine: Contagion, Viruses, and Vaccines. Waterside Productions, 2023.
Hoffer, Abram, Andrew W. Saul, and Harold D. Foster. Niacin: The Real Story. Basic Health Publications.
Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
Lester, Dawn, and David Parker. What Really Makes You Ill? Why Everything You Thought You Knew About Disease Is Wrong. Independently published, 2019.
Levy, Thomas E. Interview on vitamin C, calcium, iron, and copper toxicity. Unbekoming Substack, 2024.
Price, Weston A. Nutrition and Physical Degeneration. Price-Pottenger Nutrition Foundation, 1939.
Rowan, Shannon. “The Vitamin Deception.” Interview. Unbekoming Substack, 2024.
Roytas, Daniel. Can You Catch a Cold? Untold History and Human Experiments. Independently published, 2024.
West, Jim. “Pesticides and Polio: A Critique of Scientific Literature.” Weston A. Price Foundation, 2003.
Williams, Ulric, and Samantha Bailey. Terrain Therapy. Dr. Sam Bailey, 2022.
EPA. Pharmaceutical Waste Analysis. Blacksmith Institute / Pure Earth, 2006.
Fisher Scientific. Material Safety Data Sheet: Cholecalciferol (Vitamin D3).
Merck. Material Safety Data Sheet: Ascorbic Acid USP.