Coastal Living and Ancient Diet: How Did Humans Get Iodine Before Iodized Salt?

For millennia, many populations living away from coastal regions faced the debilitating effects of iodine deficiency, a problem effectively solved for many by the introduction of iodized salt. This raises the question: how did humans get iodine before iodized salt, and what were the consequences of not having a consistent source?

Quick Summary

Before the advent of iodized salt, humans obtained iodine from seafood, plants grown in mineral-rich soil, and certain ancient remedies. Inland populations often suffered from chronic deficiencies and related health issues like goiter.

Key Points

  • Coastal Diet: Historically, coastal populations received ample iodine from seafood like fish, shellfish, and seaweed.

  • Geographic Deficiency: Inland and mountainous areas often had iodine-poor soil, leading to widespread deficiencies and endemic goiter.

  • Ancient Remedies: Ancient Chinese and Roman civilizations used iodine-rich sources like seaweed and burnt sea sponges to treat goiter, long before the element itself was identified.

  • Natural Sources: The iodine content of plants and animals was directly tied to the mineral content of the local soil and water.

  • Pre-1920s Health Crisis: Before modern fortification, regions like the U.S. "goiter belt" experienced high rates of goiter, cretinism, and other iodine deficiency disorders.

  • Public Health Intervention: Salt iodization, introduced in the 1920s, largely eliminated endemic iodine deficiency in many countries.

The Ocean's Natural Bounty

Long before modern dietary science, coastal populations benefited from a natural and abundant source of iodine: the ocean. Marine organisms concentrate iodine from seawater, making seafood a naturally rich dietary source. This consistent intake protected those living near the sea from the widespread iodine deficiency issues that plagued inland communities.

Seafood: A Primary Source for Coastal Dwellers

Marine life, including both fish and shellfish, is an excellent source of iodine. Iodine content varies with species, location, and feeding depth, but for communities with consistent access, seafood was a reliable supply.

  • Fish: Saltwater fish like cod and haddock are among the best sources of iodine. Consuming fish regularly, whether fresh or preserved, helped maintain healthy iodine levels.
  • Shellfish: Creatures such as oysters, shrimp, and clams also contain significant amounts of iodine, contributing to the dietary needs of coastal communities.

The Power of Seaweed

Seaweed and other marine algae are exceptional sources of iodine, often containing concentrated amounts far greater than fish. The specific iodine content varies dramatically by species and region. Ancient records from as far back as 3600 B.C. in China document the use of seaweed and burnt sea sponge for treating goiter, demonstrating a long-standing recognition of its therapeutic value. Some popular types of edible seaweed include:

  • Kombu kelp: Known for having the highest iodine concentration.
  • Nori: The red seaweed used in sushi rolls.
  • Wakame: A brown seaweed often used in miso soup.

Land-Based Sources and Geographic Inequality

For inland populations, obtaining sufficient iodine was far more challenging. The iodine content of soil is highly variable, largely depending on geological history. During the last Ice Age, glaciation and repeated flooding leached iodine from surface soils into the sea. As a result, inland areas, especially mountainous regions like the Alps, Himalayas, and the U.S. Great Lakes region, became notorious for their iodine-deficient soils.

Plants and Soil Dependent Animals

Just as modern crops reflect the mineral content of the soil they grow in, ancient diets were shaped by regional geology. Vegetables and fruits grown in iodine-rich soils would naturally contain more of the mineral, while those from deficient regions would contain very little. Animals grazing in these areas would also have lower iodine content in their meat and milk. This created a natural and significant health disparity between coastal and inland populations, leading to endemic goiter and other iodine deficiency disorders in places like the American "goiter belt". Recent studies on bonobos in the Congo Basin, a historically iodine-poor region, reveal that even non-human primates forage for specific iodine-rich aquatic herbs, suggesting this may have been a survival strategy for early humans as well.

Ancient and Medieval Iodine Remedies

Before the discovery of iodine as an element in the 19th century, numerous folk remedies were used to treat goiter, a visible symptom of deficiency. Unbeknownst to their practitioners, the efficacy of these treatments depended on their iodine content. Beyond the use of seaweed in ancient China, other examples include:

  • In ancient Rome and Greece, burnt sea sponges were used to treat goiter. The process of burning concentrated the iodine, making it an effective remedy. Hippocrates and other physicians documented these methods.
  • In some mountainous regions, people may have consumed naturally occurring rock salt imported from certain mines, though its iodine content was inconsistent.
  • Traditional European practices sometimes involved the use of herbs or plants grown in iodine-sufficient regions, or near mineral springs.

The High Price of Deficiency

The consequences of inadequate iodine intake were severe and widespread, impacting not only physical health but also cognitive development. This chronic deficiency is one of the most significant yet under-recognized public health issues in history.

  • Endemic Goiter: The most obvious sign of iodine deficiency was the enlargement of the thyroid gland, known as goiter. It was a common sight in inland communities for centuries.
  • Cretinism: Severe deficiency during fetal development could lead to cretinism, characterized by profound intellectual disability and stunted growth. This had devastating consequences for affected populations.
  • Cognitive Impairment: Even mild to moderate iodine deficiency during pregnancy and early childhood can lead to reduced cognitive function and lower IQ scores.

The Impact of Salt Iodization

While ancient methods provided some localized relief, they were inconsistent and inaccessible to many. The true public health revolution came in the 1920s when universal salt iodization began in Switzerland and the United States. By adding a standardized, controlled amount of iodine to salt, a widely consumed commodity, public health officials could ensure adequate intake for a massive population, virtually eliminating endemic goiter and cretinism in many countries. The contrast between the pre-fortification era and today highlights the incredible impact of this simple measure.
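To put the fortification arithmetic in concrete terms, here is a minimal sketch in Python. The figures are illustrative assumptions, not regulatory values: a fortification level of roughly 45 mg of iodine per kilogram of salt (broadly in line with historical U.S. practice) and the adult Recommended Dietary Allowance of 150 micrograms per day. Actual fortification levels vary by country and era.

```python
# Back-of-the-envelope check: does iodized salt cover the adult iodine RDA?
# All figures below are illustrative assumptions, not regulatory values.

FORTIFICATION_MG_PER_KG = 45.0  # assumed iodine added per kg of salt (~historical U.S. practice)
SALT_INTAKE_G_PER_DAY = 3.4     # assumed moderate daily salt consumption, in grams
RDA_UG_PER_DAY = 150.0          # adult Recommended Dietary Allowance, in micrograms

# Unit conversion: 1 mg of iodine per kg of salt equals 1 µg of iodine per g of salt.
iodine_ug_per_g_salt = FORTIFICATION_MG_PER_KG
daily_iodine_ug = iodine_ug_per_g_salt * SALT_INTAKE_G_PER_DAY

print(f"Estimated iodine from salt: {daily_iodine_ug:.0f} µg/day")
print(f"Meets the {RDA_UG_PER_DAY:.0f} µg/day RDA: {daily_iodine_ug >= RDA_UG_PER_DAY}")
```

Under these assumptions, about 3.4 g of iodized salt supplies roughly 153 µg of iodine, just above the adult RDA, which illustrates why fortifying such a universally consumed commodity proved so effective.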

| Feature | Pre-Iodized Salt Era (Inland) | Post-Iodized Salt Era (Fortified) |
| --- | --- | --- |
| Primary Iodine Source | Inconsistent soil-dependent foods, occasional herbs, unreliable remedies | Iodized table salt, dairy, fortified foods |
| Health Status | High risk of goiter, cretinism, cognitive impairment | Dramatically reduced incidence of iodine deficiency disorders |
| Geographic Impact | Severe regional disparities, with inland "goiter belts" | More uniform health outcomes across regions due to universal fortification |
| Historical Examples | Goiter belts in the U.S. Great Lakes region, Alpine valleys | Successful reduction of goiter in places like Michigan post-1924 |

Conclusion

Before the systematic fortification of salt, access to iodine was a matter of geography and diet. Coastal populations thrived on the natural bounty of the sea, while inland communities struggled with chronic deficiency, leading to goiter and severe developmental issues. The history of how humans got iodine before iodized salt is a tale of reliance on natural, but often inconsistent, food sources and the devastating health disparities that resulted. The eventual introduction of iodized salt stands as one of the simplest yet most impactful public health interventions, demonstrating how a small change in our food system can have a monumental effect on global health. For more on this history, see the review from PMC: History of U.S. Iodine Fortification and Supplementation.

Frequently Asked Questions

Why did inland and mountainous regions have such high rates of iodine deficiency?

Inland and mountainous regions often have soil depleted of iodine due to glaciation and leaching over millennia. This means crops grown in these soils are low in iodine, leading to widespread deficiency in the local population.

What are goitrogens, and how do they affect iodine levels?

Goitrogens are substances in certain plant foods, like cruciferous vegetables (cabbage, broccoli) and soy, that can interfere with the thyroid's ability to use iodine. They can aggravate an existing deficiency, though consuming them in moderation is generally not a concern for those with adequate iodine intake.

Did ancient people know about iodine?

Ancient people did not know about the element iodine, but they did recognize goiter. Early Chinese physicians as far back as 3600 B.C. successfully treated the condition with iodine-rich remedies like seaweed and burnt sea sponge.

How was iodine discovered?

The element iodine was accidentally discovered in 1811 by French chemist Bernard Courtois, who noticed a purple vapor rising from seaweed ash treated with sulfuric acid. He was attempting to extract sodium salts for making gunpowder.

When did salt iodization begin?

Large-scale salt iodization programs began in the 1920s, first in Switzerland and then in the United States, to combat endemic goiter. In the U.S., iodized salt first appeared on grocery store shelves in Michigan in 1924.

What was a 'goiter belt'?

A 'goiter belt' was a historical term for inland regions, such as the Great Lakes and Appalachian areas in the U.S., where endemic iodine deficiency was rampant and goiter was a common health problem prior to salt iodization.

Are dairy products a source of iodine?

Yes. Modern dairy products can contain iodine, but this is often an unintentional result of iodine-based disinfectants used on milking equipment. Dairy is now a significant, though variable, source of iodine in the diet.


Medical Disclaimer

This content is for informational purposes only and should not replace professional medical advice.