After years of conflicting research and often extreme opinions on iron, it turns out that, like anything else beneficial in moderation, iron in excess is a detriment.
By Dr. Richard G. “Bugs” Stevens
Professor, School of Medicine
University of Connecticut
Introduction
Iron is a most versatile element. It is essential to many of the enzymes that are the engines for life, and in mammals is also used to carry oxygen on hemoglobin in blood. Remember Popeye and his spinach: all that iron made him strong.
But the very quality that makes iron so useful also makes it dangerous. Iron can easily lose or gain one electron, shuttling between the ferrous (Fe2+) and ferric (Fe3+) states indefinitely. This is how it carries oxygen, for example.
It also means iron can be a potent pro-oxidant: it catalyzes the production of free radicals, which can destroy cells and tissue and thereby contribute to cancer and heart disease.
Life forms like us have developed extensive defense mechanisms that allow us to use iron for life’s work while keeping it away from anywhere it is not immediately needed within cells and in the body in general.
Iron Fortification Sweeps the World
Severe iron deficiency is a health problem in much of the world, but in the US it is uncommon.
The recommended daily allowance for adult men is about 8 mg per day, and for adult women under 50 it is about 18 mg per day (for pregnant women, 27 mg per day is recommended). Recommended daily allowances are higher for vegetarians. Most Americans get all the iron they need from their diet. And some foods are supplemented with iron.
During the first half of the 20th century, medical and public health authorities began to aggressively promote iron fortification of food to fight iron-deficiency anemia. The push was strongest in the developing world, where the problem was most acute and as much as half the population of some areas met the definition of anemia.
Of the many harmful effects of severe iron deficiency, perhaps the greatest concern is developmental problems in children.
Severe iron deficiency is harmful, but that isn’t the whole story.
Researchers found that Somali nomads who ate iron-restricted diets (very low in meat, but rich in dairy) had a lower prevalence of infectious diseases than those who ate more meat.
Dietary iron falls into two categories: heme iron which is easily absorbed from our gut, and non-heme iron which is not absorbed nearly as well. Iron from plants and dairy products is almost entirely non-heme, whereas, of course, red meat contains a lot of heme iron.
Almost all infectious agents that cause disease – including bacteria, fungi, and protozoa – require iron for their growth. They have developed molecules called siderophores that scavenge iron from their human hosts so they can thrive and multiply. One defense mechanism we have against a bacterial infection is to develop a fever; this is because siderophores don’t work at temperatures above 104°F, whereas they work very well at our normal body temperature of 98.6°F.
‘Fortification’ or Adulteration?
For a long time iron was sacrosanct as a nutrient: if a little is good, then more must be better.
This was the worldwide mantra to fight iron-deficiency anemia. Food was iron “fortified” as much as possible. For example, widespread fortification of flour in the United States began in the 1930s.
The tipping point for the iron fortification debate came in 1978, when an eminent physician and scientist named William Crosby published a paper in the Journal of the American Medical Association called “The Safety of Iron-Fortified Food.”
He argued that although there are some groups at risk of anemia such as pregnant women, adding iron to the food supply in general exposes many who are not at risk of iron deficiency, and who might therefore be harmed.
It was a provocative idea at the time. And it was taken seriously because of who he was: a World War II veteran who received a Bronze Star, a scientist with over 300 scientific publications to his name who had established hematology and oncology specialties at Walter Reed Army Hospital in the early 1950s.
Evidence of potential harm appeared the same year Crosby’s paper came out: in 1978, researchers in Sweden found increasing rates of early-stage hemochromatosis among men.
Hemochromatosis is an iron overload condition that in its later stages kills by heart attack or cancer. Further studies in Sweden suggested that iron fortification was harmful for people with genetic hemochromatosis. Iron fortification of food in Sweden was the highest in the world until it was withdrawn in 1995.
Cancer and the Battle Over Iron
My PhD advisor was a Nobel Prize winner named Barry Blumberg. He was interested in whether body iron level interacted with hepatitis B virus in causing liver cancer.
So he sent me all over the world to conduct studies of this possibility, and we found some support.
Later, after obtaining my PhD, I took it a little further and published a paper in 1988 in the New England Journal of Medicine that became a turning point in how the medical and public health communities viewed iron.
Unbeknownst to me – a young researcher at the time – the battle over iron had been brewing for some time, and my paper provided the first hard evidence in humans that elevated body iron level was linked to increased cancer risk in general (not just liver). It became very highly cited by other scientists, and the battle was on.
Iron may help cause cancer as a pro-oxidant, but it may also play an important role in progression as a nutrient for existing cancer cells.
Now, these many years later, it has become clear from work by molecular biologists that heme is the ligand for the nuclear receptor Rev-erb alpha. Translation into English: body iron level helps regulate our circadian rhythms and their link to metabolism. The implications of this exciting finding for our health are not yet clear.
Too much iron is bad, but so is too little – what’s just right?
After all these years of conflicting research and often extreme opinions on iron, it turns out that, like anything else beneficial in moderation, iron in excess is a detriment.
Severe iron deficiency anemia is still a real problem in the developing world and requires treatment, particularly for children and pregnant women. However, we must avoid over-treatment because too much iron is dangerous. There needs to be a balance.
Mild iron deficiency in non-pregnant adults may have some benefits, such as lower risk of cancer and heart disease, and lower susceptibility to infectious disease.
The best way to attain mild deficiency is to donate blood at the Red Cross with some regularity. Not that bloodletting is a cure-all, but maybe doctors back in the ancient days of medicine had inadvertently stumbled onto something.
Originally published by The Conversation, 05.11.2015, under the terms of a Creative Commons Attribution/No derivatives license.