Butter, Margarine and Heart Disease

Shortly after World War II, margarine replaced butter in the U.S. food supply. Margarine consumption exceeded that of butter in the 1950s. By 1975, we were eating one-fourth the amount of butter eaten in 1900 and ten times the amount of margarine. Margarine was made primarily of hydrogenated vegetable oils, as many still are today. This makes it one of our primary sources of trans fat. The consumption of trans fats from other sources also likely tracked closely with margarine intake.


Coronary heart disease (CHD) resulting in a loss of blood flow to the heart muscle (a heart attack) was first described in detail in 1912 by Dr. James B. Herrick. Sudden cardiac death due to CHD was considered rare in the 19th century, although other forms of heart disease were diagnosed regularly by symptoms and autopsies. Heart attacks remain rare in many non-industrial cultures today. Their earlier rarity could not have resulted from massive underdiagnosis, because heart attacks have characteristic symptoms, such as chest pain that extends along the arm or neck, and physicians up to that time were regularly diagnosing heart conditions other than CHD. The following graph shows total heart disease mortality in the U.S. from 1900 to 2005. It represents all types of heart disease mortality, including non-CHD disorders such as heart failure, arrhythmia and myocarditis.

The graph above is not age-adjusted, meaning it doesn't reflect the fact that lifespan has increased since 1900. I couldn't compile the raw data myself without a lot of effort, but the age-adjusted graph is here. It looks similar to the one above, just a bit less pronounced. I think it's interesting to note the close similarity between the graph of margarine intake and the graph of heart disease deaths. The butter intake graph is also essentially the inverse of the heart disease graph.
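
For anyone who wants to reproduce the age-adjusted curve, the standard method is direct age standardization: weight each age group's death rate by the age distribution of a fixed standard population. Here's a minimal sketch in Python; the age groups, rates and weights are hypothetical placeholders, not the CDC figures behind the graphs.

```python
# Minimal sketch of direct age standardization for death rates.
# The age groups, rates, and standard-population weights are hypothetical;
# real data would come from CDC vital statistics.

rates_by_age = {"0-24": 5, "25-44": 30, "45-64": 300, "65+": 2500}            # deaths per 100,000
standard_weights = {"0-24": 0.35, "25-44": 0.30, "45-64": 0.22, "65+": 0.13}  # sums to 1

# Weight each age-specific rate by the standard population, so years with
# older populations aren't penalized simply for having more old people.
age_adjusted = sum(rates_by_age[g] * standard_weights[g] for g in rates_by_age)
print(f"Age-adjusted death rate: {age_adjusted:.0f} per 100,000")
```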

Here's where it gets really interesting. The U.S. Centers for Disease Control has also been tracking CHD deaths specifically since 1900. Again, it would be a lot of work for me to compile the raw data, but it can be found here, and a graph is in Anthony Colpo's book The Great Cholesterol Con. Here's the gist of it: there was essentially no CHD mortality until 1925, at which point it skyrocketed until about 1970, becoming the leading cause of death. After that, it began to fall due to improved medical care. There are some discontinuities in the data due to changes in diagnostic criteria, but even accounting for those, the pattern is crystal clear.

The age-adjusted heart disease death rate (all forms of heart disease) has been falling since the 1950s, largely due to improved medical treatment. Heart disease incidence has not declined substantially, according to the Framingham Heart study. We're better at keeping people alive in the 21st century, but we haven't successfully addressed the root cause of heart disease.

Was the shift from butter to margarine involved in the CHD epidemic? We can't make any firm conclusions from these data, because they're purely correlations. But there are nevertheless mechanisms that support a protective role for butter, and a detrimental one for margarine. Butter from pastured cows is one of the richest known sources of vitamin K2. Vitamin K2 plays a central role in protecting against arterial calcification, which is an integral part of arterial plaque and the best single predictor of cardiovascular death risk. In the early 20th century, butter was typically from pastured cows.

Margarine is a major source of trans fat. Trans fat is typically found in vegetable oil that has been hydrogenated, rendering it solid at room temperature. Hydrogenation is a chemical reaction that is truly disgusting. It involves heat, oil, hydrogen gas and a metal catalyst. I hope you give a wide berth to any food that says "hydrogenated" anywhere in the ingredients. Some modern margarine is supposedly free of trans fat, but in the U.S., anything less than 0.5 grams of trans fat per serving can be rounded down to zero, so the nutrition label is not a reliable guide. Only by looking at the ingredients can you be sure the oils haven't been hydrogenated. Even if they haven't been, I still don't recommend margarine, which is an industrially processed pseudo-food.
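
To make the labeling loophole concrete, here's a hypothetical worked example in Python. The 0.5-gram rounding threshold is the actual U.S. rule; the product numbers are invented.

```python
# How the U.S. "round anything below 0.5 g to zero" rule can hide trans fat.
# The per-serving content and number of servings are invented for illustration.

trans_fat_per_serving_g = 0.4   # actual trans fat per serving (hypothetical)
servings_per_day = 5            # toast, baking, cooking, etc. (hypothetical)

label_value_g = 0 if trans_fat_per_serving_g < 0.5 else trans_fat_per_serving_g
actual_daily_g = trans_fat_per_serving_g * servings_per_day

print(f"Label: {label_value_g} g trans fat per serving")
print(f"Actual intake: {actual_daily_g:.1f} g per day")   # 2.0 g/day
```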

One of the strongest explanations of CHD is the oxidized LDL hypothesis. The idea is that LDL lipoprotein particles ("LDL cholesterol") become oxidized and stick to the vessel walls, creating an inflammatory cascade that results in plaque formation. Chris Masterjohn wrote a nice explanation of the theory here. Several things influence the amount of oxidized LDL in the blood, including the total amount of LDL in the blood, the antioxidant content of the particle, the polyunsaturated fat content of LDL (more PUFA = more oxidation), and the size of the LDL particles. Small LDL is considered more easily oxidized than large LDL. Small LDL is also associated with elevated CHD mortality. Trans fat shrinks your LDL compared to butter.

In my opinion, it's likely that both the decrease in butter consumption and the increase in trans fat consumption contributed to the massive incidence of CHD seen in the U.S. and other industrial nations today. I think it's worth noting that France has the highest per-capita dairy fat consumption of any industrial nation, along with a comparatively low intake of hydrogenated fat, and also has the second-lowest rate of CHD, behind Japan.

Leptin Resistance and Sugar

Leptin is a major hormonal regulator of fat mass in vertebrates. It's a frequent topic on this blog because I believe it's central to overweight and modern metabolic disorders. Here's how it works. Leptin is secreted by fat tissue, and its blood level is proportional to fat mass: the more fat tissue, the more leptin. Leptin reduces appetite, increases fat release from fat tissue and increases the metabolic rate. Normally, this creates a "feedback loop" that keeps fat mass within a fairly narrow range. Any increase in fat tissue causes an increase in leptin, which causes fat to be burned at an accelerated rate. This continues until fat mass has decreased enough to return leptin to its original level.
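
To illustrate the feedback loop, here's a toy simulation. It's purely conceptual: the set point, gain and linear relationships are invented for illustration, not a physiological model.

```python
# Toy simulation of the leptin negative-feedback loop described above.
# All numbers are invented; this is a conceptual sketch, not physiology.

set_point = 15.0    # fat mass (kg) the loop defends (hypothetical)
fat_mass = 20.0     # start above the set point, e.g. after overfeeding
gain = 0.2          # how strongly excess leptin drives fat loss per week

for week in range(1, 11):
    leptin = fat_mass  # leptin secretion is proportional to fat mass
    # Leptin above the defended level suppresses appetite and raises
    # energy expenditure, shrinking fat mass back toward the set point.
    fat_mass -= gain * (leptin - set_point)
    print(f"week {week}: fat mass = {fat_mass:.2f} kg")

# With gain = 0 (complete leptin resistance, like the Zucker rat),
# fat mass would never return to the set point.
```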

Leptin was first identified through research on the "obese" mutant mouse. The obese strain arose by a spontaneous mutation, and is extremely fat. The mutation turned out to be in a protein investigators dubbed leptin. When researchers first discovered leptin, they speculated that it could be the "obesity gene", and supplemental leptin a potential treatment for obesity. They later discovered (to their great chagrin) that obese people produce much more leptin than thin people, so a deficiency of leptin was clearly not the problem, as it was in the obese mouse. They subsequently found that obese people scarcely respond to injected leptin by reducing their food intake, as thin people do. They are leptin resistant. This makes sense if you think about it. The only way a person can gain significant fat mass is if the leptin feedback loop isn't working correctly.

Another rodent model of leptin resistance arose later, the "Zucker fatty" rat. Zucker rats have a mutation in the leptin receptor gene. They secrete leptin just fine, but they don't respond to it because they have no functional receptor. This makes them an excellent model of complete leptin resistance. What happens to Zucker rats? They become obese, hypometabolic, hyperphagic, hypertensive, insulin resistant, and they develop blood lipid disturbances. It should sound familiar; it's the metabolic syndrome and it affects 24% of Americans (CDC NHANES III). Guess what's the first symptom of impending metabolic syndrome in humans, even before insulin resistance and obesity? Leptin resistance. This makes leptin an excellent contender for the keystone position in overweight and other metabolic disorders.

I've mentioned before that the two most commonly used animal models of the metabolic syndrome are both sugar-fed rats. Fructose, which accounts for 50% of table sugar and 55% of high-fructose corn syrup, is probably the culprit. Glucose, which is the remainder of table sugar and high-fructose corn syrup, and the product of starch digestion, does not have the same effects. I think it's also relevant that refined sugar contains no vitamins or minerals whatsoever. Sweetener consumption in the U.S. has increased from virtually nothing in 1850, to 84 pounds per year in 1909, to 119 pounds in 1970, to 142 pounds in 2005 (source).
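
For a sense of scale, here's a rough conversion of those per-capita figures from pounds per year to calories per day, using the standard 4 kcal per gram for sugars. These are raw disappearance-style figures that ignore waste, so they overstate actual intake.

```python
# Rough conversion of per-capita sweetener consumption (pounds per year)
# into calories per day, using the figures cited above and 4 kcal/g for sugar.

GRAMS_PER_POUND = 453.6
KCAL_PER_GRAM = 4

for year, pounds in [(1909, 84), (1970, 119), (2005, 142)]:
    kcal_per_day = pounds * GRAMS_PER_POUND * KCAL_PER_GRAM / 365
    print(f"{year}: ~{kcal_per_day:.0f} kcal/day from sweeteners")
# 1909: ~418, 1970: ~592, 2005: ~706 kcal/day (not adjusted for waste)
```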

In a recent paper, Dr. Philip Scarpace's group (in collaboration with Dr. Richard Johnson), showed that a high-fructose diet causes leptin resistance in rats. The diet was 60% fructose, which is extreme by any standards, but it caused a complete resistance to the effect of leptin on food intake. Normally, leptin binds receptors in a brain region called the hypothalamus, which is responsible for food intake behaviors (including in humans). This accounts for leptin's ability to reduce food consumption. Fructose-fed rats did not reduce their food intake at all when injected with leptin, while rats on a normal diet did. When subsequently put on a high-fat diet (60% lard), rats that started off on the fructose diet gained more weight.

I think it's worth mentioning that rodents don't respond to high-fat diets in the same way humans do, as judged by the efficacy of low-carbohydrate diets for weight loss. Industrial lard also has a very poor ratio of omega-6 to omega-3 fats (especially if it's hydrogenated), which may also contribute to the observed weight gain.

Fructose-fed rats had higher cholesterol and twice the triglycerides of control-fed rats. Fructose increases triglycerides because it goes straight to the liver, which makes it into fat that's subsequently exported into the bloodstream. Elevated triglycerides impair leptin transport from the blood to the hypothalamus across the blood-brain barrier, which separates the central nervous system from the rest of the body. Fructose also impaired the response of the hypothalamus to the leptin that did reach it. Both effects may contribute to the leptin resistance Dr. Scarpace's group observed.

Just four weeks of fructose feeding in humans (1.5g per kg body weight) increased leptin levels by 48%. Body weight did not change during the study, indicating that more leptin was required to maintain the same level of fat mass. This may be the beginning of leptin resistance.
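
For perspective on the dose, here's the arithmetic for a hypothetical 70 kg adult. Treating the 1.5 g/kg as a daily dose is my assumption about the protocol.

```python
# What 1.5 g fructose per kg body weight works out to for a 70 kg adult.
# The body weight and the per-day interpretation are assumptions.

body_weight_kg = 70
dose_g_per_kg = 1.5
KCAL_PER_GRAM = 4

fructose_g = body_weight_kg * dose_g_per_kg      # 105 g
fructose_kcal = fructose_g * KCAL_PER_GRAM       # 420 kcal
print(f"{fructose_g:.0f} g fructose, ~{fructose_kcal:.0f} kcal")
```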

The Fundamentals

I heard an interview of Michael Pollan yesterday on Talk of the Nation. He made some important points about nutrition that bear repeating. He's fond of saying "don't eat anything your grandmother wouldn't recognize as food". That doesn't mean your grandmother specifically, but anyone's grandmother, whether she was Japanese, American or African. The point is that commercial food processing has taken us away from the foods, and traditional food preparation methods, on which our bodies evolved to thrive. At this point, we don't know enough about health to design a healthy synthetic diet. Diet and health are too complex for reductionism at our current level of understanding. For that reason, any departure from natural foods and traditional food processing techniques is suspect.

Mainstream nutrition science has repeatedly contradicted itself and led us down the wrong path. This means that traditional cultures still have something to teach us about health. Hunter-gatherers and certain other non-industrial cultures are still the healthiest people on Earth, from the perspective of non-communicable disease. Pollan used the example of butter. First we thought it was healthy, then we were told it contains too much saturated fat and should be replaced with hydrogenated vegetable margarine. Now we learn that trans fats are unhealthy, so we're making new margarines that are low in trans fats, but are still industrially processed pseudo-foods. How long will it take to show these new fats are harmful? What will be the next industrial fat to replace them? This game can be played forever as the latest unproven processed food replaces the previous one, and it will never result in something as healthy as real butter.

The last point of Pollan's I'll mention is that the world contains (or contained) a diversity of different cultures, living in dramatically different ways, many of which do not suffer from degenerative disease. These range from carnivores like the Inuit, to plant-heavy agriculturalists like the Kitavans, to pastoralists like the Masai. The human body is adapted to a wide variety of foodways, but the one it doesn't seem to like is the modern Western diet.

Pollan's new book is In Defense of Food. I haven't read it, but I think it would be a good introduction to the health, ethical and environmental issues that surround food choices. He's a clear and accessible writer.

Merry Christmas, happy Hanukkah, and happy holidays to everyone!

U.S. Weight, Lifestyle and Diet Trends, 1970-2007

For this post, I compiled statistics on U.S. weight, health and lifestyle trends, and graphed them as consistently as possible. They span the period from 1970 to 2007, during which the obesity rate doubled. The data come from the National Health and Nutrition Examination Survey (NHANES), the Behavioral Risk Factor Surveillance System (BRFSS), and the U.S. Department of Agriculture (USDA). Some of the graphs are incomplete, either because the data don't exist, or because I wasn't able to find them. Obesity is defined as a body mass index (BMI) of 30+; overweight is a BMI of 25+. Yes, it's frightening. It has affected adults and children (NHANES).
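
For reference, BMI is simply weight in kilograms divided by height in meters squared. Here's a minimal sketch using the cut-offs above; the example measurements are arbitrary.

```python
# BMI = weight (kg) / height (m)^2, with the cut-offs used above:
# 25+ = overweight, 30+ = obese. Example measurements are arbitrary.

def bmi_category(weight_kg: float, height_m: float) -> str:
    bmi = weight_kg / height_m ** 2
    if bmi >= 30:
        label = "obese"
    elif bmi >= 25:
        label = "overweight"
    else:
        label = "not overweight"
    return f"BMI {bmi:.1f} ({label})"

print(bmi_category(95, 1.75))   # BMI 31.0 (obese)
print(bmi_category(70, 1.75))   # BMI 22.9 (not overweight)
```
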
The percentage of Americans who report exercising in their spare time has actually increased since 1988 (BRFSS).
We're eating about 250 more calories per day, according to NHANES.
The 250 extra calories are coming from carbohydrate (NHANES).

We're eating more vegetables and fruit (USDA).
We're eating more meat by weight, although calories from meat have probably gone down because the meat has gotten leaner (USDA). This graph represents red meat, fish and poultry. The increase comes mostly from poultry. Boneless, skinless chicken breasts anyone?
We're eating more sugar (USDA). The scale of the graph doesn't allow you to fully appreciate that sweetener consumption had increased by a full 100 calories per day by 1999, although it has dropped a bit since then. This is based on food disappearance data. In other words, the amount consumed is estimated using the amount sold domestically, minus a percentage that approximates waste. High-fructose corn syrup has seized nearly 50% of the sweetener market since 1970.
Again, the scale of the graph doesn't allow you to fully appreciate the magnitude of the change here. In 2000, we ate approximately 2.5 ounces, or 280 calories, more processed grains per day than in 1970 (USDA). That has since decreased slightly (34 calories). You might be saying to yourself right now "hey, that plus the 100 calories from sugar adds up to more of an increase than the NHANES data show!" Yes, and I think that points to the fact that the data sets are not directly comparable. NHANES data are self-reported whereas USDA data are collected from vendors. Regardless of the absolute numbers, our processed grain consumption has gone way up since 1970.
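
Here's the arithmetic behind that comparison, along with a toy illustration of how disappearance estimates are derived. The 30% waste fraction is a made-up placeholder, not the USDA's actual adjustment.

```python
# The arithmetic behind the comparison above, plus a toy illustration of
# "food disappearance" estimation. The waste fraction is a placeholder.

extra_grain_kcal = 280   # increase in processed grain calories, 1970 to 2000
extra_sugar_kcal = 100   # increase in sweetener calories over the same span
nhanes_increase = 250    # self-reported increase in total daily calories

print(f"USDA-derived increase: {extra_grain_kcal + extra_sugar_kcal} kcal/day")
print(f"NHANES self-reported increase: {nhanes_increase} kcal/day")

# Disappearance method: intake is inferred from domestic sales minus an
# assumed waste percentage, not measured directly.
supply_kcal = 600        # hypothetical calories sold per capita per day
waste_fraction = 0.30    # hypothetical spoilage / plate-waste factor
print(f"Estimated intake: {supply_kcal * (1 - waste_fraction):.0f} kcal/day")
```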

Wheat is still king. Although we grow a lot of corn in this country, most of it gets fed to animals. We prefer eating wheat without first feeding it to an intermediary. In absolute quantity, wheat consumption has increased more than any other grain (not including corn syrup).
Bye bye whole milk. Hello skim milk (USDA).

This graph represents "added fats", as opposed to fats that occur naturally in meat or milk (the USDA does not track the latter). Added fats include salad oil, cooking oil, deep-fry oil, butter, lard, tallow, etc. We are eating a lot more vegetable oil than we were in 1970. It comes chiefly from the industrial, omega-6-rich oils such as soybean, corn and canola. Added animal fats have increased slightly, but the increase is insignificant in terms of calories.

There is an artifact in this graph that I have to point out. In 2000, the USDA changed the way it gathered vegetable oil data. This led to an abrupt, apparent increase in its consumption that is obvious on the graph. So it's difficult to make any quantitative conclusions, but I think it's clear nevertheless that vegetable oil intake has increased considerably.

Between 1970 and 1980, something changed in the U.S. that caused a massive increase in obesity and other health problems. Some combination of factors reached a critical mass that our metabolism could no longer tolerate. The three biggest changes in the American diet since 1970:
  • An increase in cereal grain consumption, particularly wheat.
  • An increase in sweetener consumption.
  • The replacement of meat and milk fat with industrial vegetable oils, with total fat intake remaining the same.
Mainstream America has done to itself what it did to Native American and other indigenous cultures worldwide, with the same result.

The Myth of the High-Protein Diet

The phrase "low-carbohydrate diet" is a no-no in some circles, because it implies that a diet is high in fat. Often, the euphemism "high-protein diet" is used to avoid the mental image of a stick of butter wrapped in bacon. It's purely a semantic game, because there is no such thing as a diet in which the majority of calories come from protein. The ability of the human body to metabolize protein ends at about 1/3 of calories (1, 2), and the long-term optimum may be lower still. Low-carbohydrate diets (yes, the ones that are highly effective for weight loss and general health) are high-fat diets.

Healthy cultures around the world tend to consume roughly 10 to 20% of calories from protein:

Masai - 19%
Kitava - 10%
Tokelau - 12%
Inuit - 20%, according to Stefansson
Kuna - 12%
Sweden - 12%
United States - 15%
Human milk - 6%

The balance comes from fat and carbohydrate. Ask a traditional Inuit. If there's no fat on your meat, you may as well starve. Literally. "Rabbit starvation" was a term coined by American explorers who quickly realized that living on lean game is somewhere between unhealthy and fatal.

In the early 1900s, anthropologist and explorer Vilhjalmur Stefansson lived for several years among completely isolated Canadian Inuit (Eskimo) who had never seen a white person before. They were literally a stone-age culture, completely uninfluenced by the modern world. They are representative of how some of our paleolithic ancestors would have lived. Here's Stefansson, quoted from My Life With the Eskimo (1913):
In certain places and in certain years, rabbits are an important article of diet, but even when there is an abundance of this animal, the Indians consider themselves starving if they get nothing else, - and fairly enough, as my own party can testify, for any one who is compelled in winter to live for a period of several weeks on lean meat will actually starve, in this sense: that there are lacking from his diet certain necessary elements, notably fat, and it makes no difference how much he eats, he will be hungry at the end of each meal, and eventually he will lose strength or become actually ill. The Eskimo who have provided themselves in summer with bags of seal oil can carry them into a rabbit country and can live on rabbits satisfactorily for months.
Dr. Loren Cordain, in his excellent paper "Plant-Animal Subsistence Ratios and Macronutrient Energy Estimations in Worldwide Hunter-Gatherer Diets", argues based on calculated estimates that historical hunter-gatherers generally consumed between 19 and 35% of calories from protein:
This high reliance on animal-based foods coupled with the relatively low carbohydrate content of wild plant foods produces universally characteristic macronutrient composition ratios in which protein is elevated (19-35% of energy) at the expense of carbohydrates (22-40% of energy).
Later, he states that the most plausible range of fat intakes is 28-58%. I agree with his assertion that hunter-gatherer diets tended to be relatively high in protein compared with contemporary diets, but I think his protein numbers are a bit high. Why? Because he calculates macronutrient composition based on the whole-carcass fat content of "representative" animals such as deer.

It's clear from the anthropological literature that hunter-gatherers did not go after representative animals. They went after the fattest animals they could find. They knew exactly which animals were fattest in which seasons, which individuals were likely to be fattest within a herd, and which body parts were fattest on an individual animal. For example, Stefansson describes how the Inuit relied on (extremely fat) seal in the spring, wolf in the summer, and caribou and bear in the fall and early winter. If necessary, they would discard lean meat in favor of tongue, marrow, internal organs, back fat and other fat-rich body parts. This was in order to obtain a minimum of 65% of calories from fat.

Hunter-gatherers would sometimes even provision themselves with enough fat in advance to last a lean season or two. This was true for dozens of tribes along the Northwest coast of North America that relied chiefly on animal foods. Here's another excerpt from My Life With the Eskimo:
...[spring] is the season which the Eskimo give up to the accumulation of blubber for the coming year. Fresh oil is not nearly so palatable or digestible as oil that has been allowed to ferment in a sealskin bag through the summer, and besides that it is difficult often to get seals in the fall... Each family will in the spring be able to lay away from three to seven bags of oil. Such a bag consists of the whole skin of the common seal... This makes a bag which will hold about three hundred pounds of blubber, so that a single family's store of oil for the fall will run from nine hundred to two thousand pounds.
That's a lot of oil! Some of it would have been used to light oil lamps, but much of it would have been eaten. I think Cordain's estimate of the protein intake of hunter-gatherers is a bit high due to his underestimating fat intake. His paper shows that if you break historical hunter-gatherer cultures into 10 groups based on their reliance on animal foods, the most numerous group (46 out of 229) obtained 85-100% of their food from animal sources. In other words, approximately 20% of historical hunter-gatherers were carnivorous or nearly so. If the human protein ceiling is 35% of calories, that means roughly one fifth of hunter-gatherers ate 65% or more of their calories as fat. It also means carnivory and high-fat diets are not just anomalies, they are part of the human ecological niche. Zero out of 229 groups obtained less than 16% of calories from animal foods. Vegetarianism is not part of our niche.
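
The arithmetic behind that last inference is simple; the 35% protein ceiling and the near-carnivorous group come from the text above, and the rest follows from them.

```python
# If a group gets essentially all of its calories from animal foods, and
# protein cannot exceed ~35% of calories, the remainder must come from fat,
# since animal foods contain little carbohydrate.

animal_food_share = 1.00    # the 85-100% group, taking the upper end
protein_ceiling = 0.35      # maximum tolerable protein share of calories
carbohydrate_share = 0.00   # negligible carbohydrate in animal foods

minimum_fat_share = animal_food_share - protein_ceiling - carbohydrate_share
print(f"Minimum fat intake: {minimum_fat_share:.0%} of calories")   # 65%
```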

Further, although the human body can theoretically tolerate up to 35% protein by calories, even that amount is probably not optimal in the long term. I think that's suggested by the fact that diverse cultures tend to find a source of fat and/or carbohydrate that keeps their protein intake roughly between 10 and 20%. I think it's fine to eat plenty of protein, and there's no need to deliberately restrict it, because your tastes will tell you if you're eating too much. However, "high-protein diet" as a euphemism for low-carbohydrate diet is a misnomer. Low-carbohydrate diets are, and have always been, high-fat diets.

Gluten Sensitivity: Celiac Disease is the Tip of the Iceberg

Celiac disease is a degeneration of the lining of the small intestine caused by a sensitivity to gluten. Gluten is the protein portion of wheat, rye, barley, and wheat relatives (spelt, kamut, emmer, einkorn and triticale). I found an interesting paper recently on the impact of celiac disease on nutrient status and bone density. Researchers compared 54 Northern Italian children with untreated celiac disease to 60 presumably healthy children. The celiac patients had extremely poor vitamin D status, with a deficiency rate of 35.18% compared to 5% in the control group. This was using the lenient cut-off point of 20 ng/mL. Average serum 25(OH)D3 in celiac patients was less than half the level of the control group. The celiac patients also had low serum calcium and magnesium, and elevated parathyroid hormone. Celiac children had lower bone mineral density. All parameters returned to normal after 6 months on a gluten-free diet.

This confirms what has been shown numerous times before: celiac disease interferes with nutrient status, including the all-important fat-soluble vitamins. It's not surprising, since it flattens the villi, finger-like structures necessary for efficient nutrient absorption in the small intestine. But wait, the overwhelming majority of our vitamin D comes from the effect of sunlight on our skin, not through our small intestine! So gluten sensitivity must be doing something besides just flattening villi. Perhaps it does. Feeding wheat bran to "healthy" volunteers caused them to burn through their vitamin D reserves at an accelerated rate. I think this underlines what I've come to believe about wheat: it's problematic for a large proportion of the population, perhaps the majority.

Approximately 12% of Americans can be diagnosed as gluten sensitive using blood antibody tests (anti-gliadin IgA or IgG). A subset of these have full-blown celiac disease. The vast, vast majority are undiagnosed. Gluten sensitivity associates with a dizzying array of diseases, including autoimmune disorders, cancer, and neurological problems. The problem with the blood tests is they aren't very sensitive. The most common blood tests for celiac disease look for a class of antibody called IgA. IgA is produced by the mucosa, including the gut. Unless gut damage is already extensive, the majority of IgA stays in the gut. This may cause the assay to overlook many cases of gluten sensitivity. A negative blood antibody test does not rule out gluten sensitivity!

I recently discovered the work of Dr. Kenneth Fine of EnteroLab. He has developed an assay that detects anti-gliadin IgA in stool. Gliadin is one of the problematic proteins in gluten that is implicated in gluten sensitivity. Dr. Fine has been conducting informal research using his fecal anti-gliadin IgA test (data here). He has found that:
  • 100% of untreated celiac patients are anti-gliadin IgA positive by fecal test, compared to only 76% by blood (n = 17).
  • 76% of microscopic colitis (a type of chronic diarrhea) patients are positive by the fecal test, compared to 9% by blood (n = 57).
  • 57% of symptomatic people (digestive problems?) are positive by the fecal test, compared to 12% by blood (n = 58).
  • 62% of people with autoimmune disease are positive by the fecal test.
  • 29% of asymptomatic (healthy) people are positive by the fecal test, compared to 11-12% by blood (n = 240).
  • Baby and cow feces are 0% positive by the stool assay.
It gets worse. Gluten sensitivity is determined in large part by genetics. A gene called HLA-DQ is intimately involved. It encodes a protein that is expressed on the surface of cells and serves to activate immune cells when certain foreign substances are present. Different versions of the gene are activated by different substances. HLA-DQ2 and HLA-DQ8 are classically associated with celiac disease. Roughly 42% of the U.S. population carries DQ2 or DQ8. According to Dr. Fine, every allele except DQ4 has some association with gluten-related problems! Only 0.4% of the U.S. population carries HLA-DQ4 and no other allele.

Not everyone who is genetically susceptible will end up developing health problems due to gluten, but it's impossible to estimate how many of the problems we attribute to other causes are in fact caused or exacerbated by gluten.

The immune system can be divided into two parts: innate and adaptive. The innate immune system is a nonspecific, first-line reaction to a perceived threat. The adaptive immune system is a more sophisticated, but slower system that produces a powerful response by particular cell types to a very specific threat. Antibody production is part of the adaptive immune system. Thus, if your gluten sensitivity test is looking for antibodies, it could still be missing an immune reaction to gluten mediated by the innate immune system!

This question has been addressed in a preliminary study. Researchers took gut biopsies from celiac patients and asymptomatic controls. Five out of six asymptomatic controls showed elevated interleukin-15, a marker of innate immune activation, upon exposure to gliadin. An activated innate immune system (commonly called 'inflammation') is associated with a wide array of chronic diseases, from obesity to cancer to cardiovascular disease. Inflammatory cytokines are elevated in celiac patients and may play a role in their bone pathology. What I would like to see is some negative controls-- would the gut biopsies have produced interleukin-15 in response to benign foods or is it truly specific to gluten?

I don't intend to imply that everyone has gluten sensitivity, but I do think the totality of the data are thought-provoking. They also include the association between the introduction of wheat to non-industrial populations and the development of widespread health problems. Another thing to keep in mind is that traditional sourdough fermentation breaks down a portion of gluten, possibly explaining the rise in gluten sensitivity that has paralleled a shift to quick-rise yeast breads. I believe that gluten sensitivity is behind many modern ills, and should be on the short list of suspects in the case of unexplained health problems. This is particularly true of digestive, autoimmune and neurological disorders. Gluten sensitivity is easy to address: stop eating gluten for a few weeks. See how you feel. Reintroduce gluten and see what happens. You might learn something about yourself.

Peripheral vs. Ectopic Fat

I went to an interesting presentation the other day by Dr. George Ioannou of the University of Washington, on obesity and liver disease. He made an interesting distinction between the health effects of two types of body fat. The first is called subcutaneous fat (or peripheral fat). It accumulates right under the skin and is evenly distributed over the body's surface area, including extremities. The second is called ectopic fat. Ectopic means "not where it's supposed to be". It accumulates in the abdominal region (beer belly), the liver, muscle tissue including the heart, the pancreas, and perhaps in lipid-rich deposits in the arteries. Subcutaneous fat can be measured by taking skinfold thickness in different places on the body, or sometimes by measuring arm or leg circumference. Ectopic fat can be measured by taking waist circumference.
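
As a minimal sketch of how the waist-circumference measure is typically applied, here's a classification using the commonly cited abdominal-obesity cut-offs (roughly 102 cm for men and 88 cm for women). The presentation didn't specify thresholds, so treat these as illustrative.

```python
# Flagging likely excess visceral/ectopic fat by waist circumference.
# The ~102 cm (men) / ~88 cm (women) cut-offs are commonly cited
# abdominal-obesity thresholds, used here only for illustration.

def abdominal_obesity(waist_cm: float, sex: str) -> bool:
    threshold_cm = 102 if sex == "male" else 88
    return waist_cm > threshold_cm

print(abdominal_obesity(waist_cm=108, sex="male"))     # True
print(abdominal_obesity(waist_cm=80, sex="female"))    # False
```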

It's an absolutely critical distinction, because ectopic fat associates with poor health outcomes while subcutaneous fat does not. In this recent study, waist circumference was associated with increased risk of death while arm and leg circumference were associated with a reduced risk of death. I think the limb circumference association in this particular study is probably confounded by muscle mass, but other studies have also shown a strong, consistent association between ectopic fat and risk of death, but not subcutaneous fat. The same goes for dementia and a number of other diseases. I think it's more than an epidemiological association. Surgically removing the abdominal fat from mice prevents insulin resistance and prolongs their lifespan.

People with excess visceral fat are also much more likely to have fatty liver and cirrhosis. It makes sense if you think of them both as manifestations of ectopic fat. There's a spectrum of disorders that goes along with excess visceral fat and fatty liver: it's called the metabolic syndrome, and it affects a quarter of Americans (NHANES III). We already have a pretty good idea of what causes fatty liver, at least in lab animals: industrial vegetable oils and sugar. What's the most widely used animal model of metabolic syndrome? The sugar-fed rat. What are two of the main foods whose consumption has increased in recent decades? Vegetable oil and sugar. Hmm... Fatty liver is capable of causing insulin resistance and diabetes, according to a transgenic mouse that expresses a hepatitis C protein in its liver.

You want to keep your liver happy. All those blood tests they do in the doctor's office to see if you're healthy-- cholesterol levels, triglycerides, insulin, glucose-- reflect liver function to varying degrees.

Abdominal fat is a sign of ectopic fat distribution throughout the body, and its associated metabolic consequences. I think we know it's unhealthy on a subconscious level, because belly fat is not attractive whereas nicely distributed subcutaneous fat can be. If you have excess visceral fat, take it as a sign that your body does not like your current lifestyle. It might be time to think about changing your diet and exercise regime. Here are some ideas.