Research & Education

Basic Nutrition – Not So Basic After All

It may surprise you, but a case of scurvy was identified in the U.S. in summer 2016.

Not on a British naval ship three hundred years ago.

In the U.S.

In the 21st century.

Where we have ample access to vitamin C-rich foods.

Scurvy.

Here’s how it went down:

A two-and-a-half-year-old boy in Michigan presented with a rash, mild anemia, bleeding gums, an inability to walk (his legs were locked in a “frog-like” position), and a great deal of pain when anyone touched his legs. After numerous tests and evaluations by multiple medical teams, several possibilities—some very serious—were considered and dismissed: heavy metal poisoning, juvenile arthritis, polio, osteomyelitis, Guillain-Barré syndrome, and more.

Things were growing increasingly dire, until finally, finally, someone thought to inquire about the boy’s diet. According to an account of the situation from The Washington Post:

“The boy’s parents, who had been at his bedside around the clock, had mentioned his picky eating. But when doctors zeroed in on the subject, the boy’s mother responded that his diet consisted exclusively of chocolate milk — he drank about a quart and a half a day — and two to four graham crackers. He refused to eat anything else. A blood test revealed that his vitamin C level was less than 0.1 milligrams per deciliter; the low end of normal is 0.6 mg.”

Once the scurvy was identified, it was no problem to get this poor little guy back on his feet and doing little-boy things again. But why did it take so long to identify the problem? Why were so many doctors stumped by the presentation, and so many expensive tests ordered, when any good first-semester nutrition student might have seen the bleeding gums and thought, “scurvy”?

Why, when you take a sick dog or cat to the vet, is the first question, “What are you feeding it?” but when it comes to humans, diet is so often the last consideration—if it’s considered at all?

Perhaps we can excuse entire teams of medical professionals for missing the scurvy because they had no reason to suspect such a basic, elementary situation in an industrialized nation in the 21st century. But maybe this isn’t so basic and elementary after all. Maybe it’s more common than we realize. Maybe people of any age, not just toddlers, are walking around with overt or subclinical nutrient deficiencies that are interfering with quality of life and resulting in a laundry list of completely avoidable symptoms simply because no one suspects they’re caused by diet.

And if no one suspects the diet, no one asks about diet. But maybe we should. Or if we’re already asking, maybe we should dig deeper.

After all, it’s not just toddlers who subsist on extremely limited diets. With orthorexia becoming more common, the number of people on very restrictive diets is growing. Veganism is probably the dietary paradigm with the greatest risk for nutrient insufficiencies, but no approach is immune. Unbalanced nutrient intake can occur on any type of diet, whether Paleo, low-carb, ketogenic, Mediterranean, or something else.   

And with malabsorption issues on the rise, nutrient deficiencies are possible even in people who think they’re eating a healthy diet. If someone isn’t absorbing the nutrients from their food, they’re not reaping the benefits they expect from whatever diet they follow. Malabsorption is becoming more common owing to the rising incidence of inflammatory bowel disorders, celiac disease, “leaky gut,” and the growing number of patients living life after bariatric surgery.

Beyond malabsorption issues, let’s not forget that patients increasingly live in a haze of polypharmacy—taking multiple pharmaceutical drugs for years, sometimes decades, some of which may interfere with absorption or increase excretion of specific nutrients.

So, it could well be that basic nutrient insufficiency isn’t as rare as we might think. But we don’t have to wait until things progress to the point of full-blown deficiency diseases. Barring a rare occurrence like the one described earlier, it’s unlikely that a patient would present with pellagra, scurvy, or beriberi in modern North America. But a patient doesn’t need to have a severe, crippling deficiency to be experiencing signs and symptoms of a mild or moderate insufficiency. Lack of scurvy doesn’t automatically mean someone’s getting enough vitamin C, and lack of rickets isn’t proof someone is vitamin D replete.

Keen-eyed and knowledgeable physicians and nutritionists should be able to identify subclinical insufficiencies long before these entirely preventable problems progress to debilitation and diminished quality of life.

Of course, this depends on a thorough history and patient intake, a good physical examination, and guidance from labwork. All three are often needed together: labwork is wonderful for corroborating or ruling out hypotheses, but it doesn’t exist in a vacuum. Numbers on a lab printout are no substitute for a good ol’ set of eyes on a patient’s body, especially when the lab reference ranges for certain nutrients are unreliable.

For example, the reference range for B12 is extremely wide—as wide as 200–900 pg/mL, depending on the source you consult. And while some sources suggest levels below 200 pg/mL may indicate deficiency, other experts recommend, depending on the context, treating anything below 400 pg/mL as a potential deficiency. Plus, with such a wide range accepted as “normal,” it’s entirely possible for someone to fall within the normal range yet exhibit clear signs and symptoms of deficiency. In the specific case of B12, tragically, these signs and symptoms may be mistaken for much more serious conditions, including Parkinson’s disease, multiple sclerosis, anxiety, and various psychiatric disorders. (And if left untreated, the neurological and psychiatric damage induced by B12 deficiency may be irreversible.)
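To make that “normal but insufficient” gap concrete, here is a minimal sketch in Python. The 200 and 400 pg/mL thresholds come from the ranges discussed above; the function name and the wording of the categories are illustrative assumptions, not a clinical tool.

```python
# Minimal illustrative sketch (not a clinical tool): classifying a serum B12
# value against the two cutoffs discussed above. The thresholds (200 and
# 400 pg/mL) come from the text; names and category wording are assumptions.

CONVENTIONAL_LOW = 200.0  # pg/mL: common lower bound of the lab reference range
FUNCTIONAL_LOW = 400.0    # pg/mL: stricter cutoff some experts recommend

def interpret_b12(level_pg_ml: float) -> str:
    """Return a rough interpretation of a serum B12 level in pg/mL."""
    if level_pg_ml < CONVENTIONAL_LOW:
        return "deficient by conventional lab criteria"
    if level_pg_ml < FUNCTIONAL_LOW:
        return "'normal' on the lab printout, but possibly insufficient"
    return "replete by both criteria"

# A value of 250 pg/mL falls inside the 200-900 reference range, yet below
# the stricter 400 pg/mL cutoff -- exactly the gap described above.
print(interpret_b12(250))  # -> "'normal' on the lab printout, but possibly insufficient"
```

The point isn’t the code itself but the gap it exposes: depending on which cutoff a clinician uses, the very same lab value can read as reassuring or as a red flag.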

So, aside from labwork and a thorough physical examination, a detailed dietary history is a critical component of the patient intake. What patients eat—and what they don’t eat—provides a more informative window into their health than we may typically expect. Of course, the accuracy of the information depends on the patient being honest about their diet, which might be a tall order. But as dietitians, nutritionists, and nutrition-minded doctors, we can save ourselves and our patients a great deal of frustration, and possibly time and money, by deepening our appreciation for the role of diet in health outcomes and making dietary evaluation and counseling a prominent part of what we do.

 

By Amy Berger, MS, CNS