
How did they figure out calories in food?


The modern concept of tracking food energy began with the groundbreaking work of 18th-century French chemist Antoine Lavoisier. He was the first to propose that respiration is a form of combustion, and his early experiments laid the foundation for measuring the energy content of food.

Quick Summary

The measurement of food energy, known as calories, originated with early experiments involving bomb calorimeters, which burned food samples and measured the heat released. This approach was refined by American chemist Wilbur Olin Atwater, who developed an indirect system for calculating calories based on macronutrient content.

Key Points

  • Early Calorimetry: Antoine Lavoisier was a pioneer in measuring metabolic heat, demonstrating that respiration was a form of combustion and quantifying the energy produced by living organisms.

  • Direct Measurement: The bomb calorimeter was developed in the late 19th century to burn food samples and measure the heat released directly, providing the total potential energy.

  • Indirect System: Chemist Wilbur Olin Atwater corrected the bomb calorimeter's overestimation by factoring in human digestive efficiency, establishing the 4-4-9 calorie conversion system for protein, carbohydrates, and fat.

  • Standardized Labels: Food manufacturers use the standardized Atwater system to calculate the 'Calories' (kilocalories) on nutrition labels, which are legally permitted a margin of error of up to 20%.

  • Ongoing Refinement: Modern nutritional science recognizes that factors beyond macronutrient content, like food processing and gut bacteria, can influence actual calorie absorption, continuing to refine our understanding.


From Combustion to Calculation: The History of Calorie Measurement

The ability to quantify the energy in food, a practice central to modern diet and nutrition, is the result of centuries of scientific inquiry. The story begins not with dietitians, but with chemists and physicists who were fascinated by the nature of heat and combustion. The journey from burning a food sample to the calorie count on a modern nutrition label involves both ingenious, albeit messy, direct measurements and sophisticated, widely-adopted calculation methods.

The Birth of Calorimetry with Lavoisier

In the late 1700s, French chemist Antoine Lavoisier pioneered the use of a device called an ice calorimeter to study respiration. By placing a guinea pig inside the insulated apparatus, he could measure the amount of ice that melted from the heat produced by the animal's respiration. This experiment demonstrated that respiration was a form of slow combustion, similar to a candle burning, and confirmed that living organisms produce measurable heat. This work was a crucial early step, establishing the fundamental principle that the energy stored in food could be quantified as heat.

The Rise of Bomb Calorimetry

While Lavoisier’s experiments were foundational, the tool that truly allowed for the precise measurement of food's caloric content was the bomb calorimeter. Introduced by French chemist Pierre Eugène Marcellin Berthelot in the 1870s, this device directly measures the heat of combustion.

The process for bomb calorimetry involves:

  • Placing a dried food sample into a sealed, stainless-steel vessel, also known as the "bomb."
  • Pressurizing the bomb with pure oxygen.
  • Submerging the sealed bomb in a known volume of water.
  • Igniting the food sample with an electrical current.
  • Measuring the resulting rise in the water's temperature as the food burns completely.

The temperature increase is then used to calculate the total energy, or calories, in the food sample. The principle is simple: more heat released means a higher temperature rise and a higher caloric value. This method directly measures the total potential energy in food, but does not account for the human digestive process.
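The conversion from temperature rise to energy follows directly from the definition of the kilocalorie (the heat needed to warm 1 kg of water by 1 °C). The sketch below shows that arithmetic; the water mass, temperature rise, and calorimeter constant are illustrative values, not figures from the article.

```python
# Sketch: converting a bomb calorimeter's temperature rise into food energy.
# All numeric inputs below are made-up illustrative values.

WATER_SPECIFIC_HEAT = 1.0  # kcal per kg per °C (the definition of the kilocalorie)

def gross_energy_kcal(water_kg, delta_t_c, calorimeter_constant_kcal_per_c=0.0):
    """Total heat released by the burned sample, in kilocalories.

    q = m * c * ΔT for the water bath, plus the heat absorbed by the
    calorimeter hardware itself (its "constant", determined in practice
    by first burning a standard such as benzoic acid).
    """
    water_heat = water_kg * WATER_SPECIFIC_HEAT * delta_t_c
    hardware_heat = calorimeter_constant_kcal_per_c * delta_t_c
    return water_heat + hardware_heat

# A 2 kg water bath warming by 3 °C implies the sample released 6 kcal
# (ignoring the hardware term).
print(gross_energy_kcal(water_kg=2.0, delta_t_c=3.0))  # 6.0
```

This gross (total combustion) energy is exactly what the Atwater system later corrects downward for digestibility.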

The Atwater System: From Combustion to Calculation

Recognizing that the body does not absorb all the energy available in food, American chemist Wilbur Olin Atwater refined the system in the late 19th and early 20th centuries. He conducted extensive experiments with a human-sized respiration calorimeter, feeding human subjects a specific diet and meticulously measuring their waste. By comparing the energy content of the food consumed with the energy lost through urine and feces, he was able to determine the body's net energy gain.

This work led to the development of the "Atwater System," which provides the standard caloric conversion factors still used today. He established the average values for the three major macronutrients:

  • Carbohydrates: 4 calories per gram
  • Proteins: 4 calories per gram
  • Fats: 9 calories per gram

These correction factors account for the fact that certain components, like fiber, are not fully digestible. They also capture the differing energy densities of the macronutrients: one gram of fat supplies more than twice the energy of one gram of protein or carbohydrate. Food manufacturers now use these factors, rather than burning each product, to calculate the calorie counts on nutrition labels.
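The label calculation itself is a simple weighted sum. This sketch applies the standard general Atwater factors from the list above; the example food's macronutrient grams are invented for illustration.

```python
# Sketch: the Atwater 4-4-9 calculation behind nutrition-label "Calories".
# The factors are the standard general Atwater values; the sample food is made up.

ATWATER_KCAL_PER_G = {"protein": 4.0, "carbohydrate": 4.0, "fat": 9.0}

def label_calories(protein_g, carbohydrate_g, fat_g):
    """Kilocalories as a nutrition label would report them."""
    return (protein_g * ATWATER_KCAL_PER_G["protein"]
            + carbohydrate_g * ATWATER_KCAL_PER_G["carbohydrate"]
            + fat_g * ATWATER_KCAL_PER_G["fat"])

# A hypothetical snack with 3 g protein, 20 g carbohydrate, 10 g fat:
# 3*4 + 20*4 + 10*9 = 12 + 80 + 90 = 182 kcal
print(label_calories(3, 20, 10))  # 182.0
```

Refinements such as separate factors for specific carbohydrates (discussed later in the article) amount to swapping in different per-gram values for particular components.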

Comparing Direct Calorimetry and the Atwater System

  • Methodology: Direct bomb calorimetry burns food in a sealed chamber to measure released heat; the Atwater indirect system uses standardized macronutrient conversion factors.

  • Accuracy: Bomb calorimetry measures total potential energy but overestimates absorbable energy for humans; the Atwater system accounts for digestive losses, providing a more realistic estimate for human consumption.

  • Practicality: Bomb calorimetry requires specialized, expensive laboratory equipment; the Atwater system relies on chemical analysis of macronutrients, making it practical for mass food production.

  • Limitations: Bomb calorimetry does not factor in digestive efficiency, which varies by food and individual; the Atwater system provides average values, so the exact absorbable calories can vary slightly by food type.

Refinements and Modern Considerations

Over the years, the Atwater system has undergone refinements. For instance, specific factors for different types of carbohydrates have been developed, as not all are digested equally. Modern nutritional science also recognizes that factors like an individual's gut bacteria and how food is processed (e.g., cooking, grinding) can impact the number of calories extracted. While the principles established by Atwater and earlier calorimetrists remain the foundation, the field continues to evolve with our understanding of human metabolism.

Today, the use of calorie counts on food labels is standardized by government regulations, such as the Nutrition Labeling and Education Act in the United States. However, the legal margin of error (up to 20%) means the stated number is a reliable estimate, not an absolute guarantee.

Conclusion: A History of Scientific Refinement

The process of figuring out calories in food is a story of scientific progression, from the fundamental principles of combustion observed by Lavoisier to the refined calculation methods developed by Atwater. The initial direct approach of burning food in a bomb calorimeter was a critical step, but it took Atwater’s physiological experiments to create a system that accurately reflects how the human body utilizes energy. This historical evolution has given us the dietary information we rely on today, albeit with the important understanding that these numbers are averages, not immutable truths. The continued study of human metabolism and digestion further refines our understanding, building upon the work of these pioneering scientists.

The Atwater System: An Enduring Legacy

The Atwater system provides the caloric conversion factors of 4, 4, and 9 for proteins, carbohydrates, and fats, respectively, that are still used today on nutrition labels.

Antoine Lavoisier's Breakthrough

Antoine Lavoisier was the first to demonstrate that the heat produced by living organisms was a form of combustion, using an ice calorimeter to measure a guinea pig's respiration.

How Bomb Calorimetry Works

Bomb calorimetry is the direct method of measuring the heat released from burning a food sample, providing its total potential energy.

Refining the Calorimetry Method

Wilbur Olin Atwater corrected the bomb calorimeter's findings for human digestion by measuring the energy in human waste, creating the more accurate system we use now.

The Journey of the Calorie

The journey from laboratory experiment to everyday food labels is a testament to the refinement of science over centuries, moving from crude combustion to standardized, indirect calculations.

Limitations of Calorie Counts

Modern science acknowledges that calorie counts are estimates, influenced by factors like food processing, fiber content, and even an individual's gut microbiome, meaning they are not perfectly precise.

The Standard for Food Labels

Today's nutrition labels legally rely on the Atwater system, providing a standardized and reliable, though not exact, guide to food energy.

The Origin of the Term 'Calorie'

French physicist Nicolas Clément coined the term "calorie" in the 1820s, defining it in relation to heat engines, though the term was later adopted and applied to food energy.

FAQs

Q: Who was the first person to measure calories in food? A: While Antoine Lavoisier performed early calorimetric experiments on living organisms, American chemist Wilbur Olin Atwater was the first to systematically measure the caloric content of a wide range of foods and apply it to human nutrition in the late 19th century.

Q: Is the Atwater system perfectly accurate? A: No, the Atwater system uses average values and does not account for individual variations in digestion, gut bacteria, or the impact of food processing, meaning the actual number of absorbed calories can vary slightly.

Q: Why don't food manufacturers just use bomb calorimetry? A: Bomb calorimetry measures the total potential energy, which overestimates the calories a human body can actually absorb from food. The Atwater system provides a more physiologically relevant number and is more practical for large-scale food production.

Q: How does a bomb calorimeter work? A: A bomb calorimeter measures the heat released when a food sample is burned inside a sealed, oxygen-filled chamber submerged in water. The increase in the water's temperature is used to calculate the food's total energy content.

Q: Why are there two types of 'calories' mentioned sometimes? A: There is often confusion between the 'small calorie' (cal), which is the energy to heat 1 gram of water 1°C, and the 'large calorie' or kilocalorie (kcal), which is the energy to heat 1 kilogram of water 1°C. The 'Calories' on food labels are actually kilocalories.

Q: Do calories in food account for heat lost during digestion? A: The Atwater system, used for food labels, specifically accounts for digestive losses by subtracting the unabsorbed energy found in waste from the total energy value.

Q: Why is there a legal margin of error for calorie counts? A: There is a recognized 20% margin of error on nutrition labels due to the inherent variability in food products and the use of average conversion factors, as permitted under the Nutrition Labeling and Education Act.

Q: Has the way we measure calories changed over time? A: Yes, the methods have evolved significantly. The earliest work involved direct measurement through combustion, but this was later refined by Wilbur Olin Atwater into the indirect calculation system based on macronutrient content that is used today for nutritional labeling.


Medical Disclaimer

This content is for informational purposes only and should not replace professional medical advice.