I'm revisiting the topic of the omega-6/omega-3 balance and total polyunsaturated fat (PUFA) intake because of some interesting studies I've gotten hold of lately (thanks Robert). Two of the studies are in pigs, which I feel are a decent model for studying the effect of diet on health as it relates to humans. Pigs are omnivorous (although more slanted toward plant foods), have a digestive system similar to ours (although sturdier), are of similar size and fat composition to humans, and have been eating grains for about as long as we have.

In the last post on the omega-6/omega-3 balance, I came to the conclusion that a roughly balanced but relatively low intake of omega-6 and omega-3 fats is consistent with the diets of healthy non-industrial cultures. There were a few cultures that had a fairly high long-chain omega-3 intake from seafood (10% of calories), but none ate much omega-6.
The first study explores the effect of omega-6 and omega-3 fats on heart function. Dr. Sheila Innis and her group fed young male pigs three different diets (their approximate omega-6 : omega-3 ratios are worked out in the short sketch after the list):
- An unbalanced, low PUFA diet. Pig chow with 1.2% linoleic acid (LA; the main omega-6 plant fat) and 0.06% alpha-linolenic acid (ALA; the main omega-3 plant fat).
- A balanced, low PUFA diet. Pig chow with 1.4% LA and 1.2% ALA.
- An unbalanced, but better-than-average, "modern diet". Pig chow with 11.6% LA and 1.2% ALA.
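To put those chow compositions in ratio terms, here's a quick back-of-the-envelope sketch I've added for illustration; the ratios are simply LA divided by ALA using the percentages above, and the diet labels are mine:

```python
# Omega-6 : omega-3 ratios of the three pig chows, computed from the LA and
# ALA percentages listed above (percent of the diet). Labels are mine.
diets = {
    "unbalanced, low PUFA": (1.2, 0.06),
    "balanced, low PUFA": (1.4, 1.2),
    "unbalanced 'modern'": (11.6, 1.2),
}

for name, (la, ala) in diets.items():
    ratio = la / ala  # omega-6 (LA) divided by omega-3 (ALA)
    print(f"{name}: n-6/n-3 ~ {ratio:.1f}")

# unbalanced, low PUFA: n-6/n-3 ~ 20.0
# balanced, low PUFA: n-6/n-3 ~ 1.2
# unbalanced 'modern': n-6/n-3 ~ 9.7
```

Only the second chow comes close to a 1:1 balance; the first and third are tilted roughly 20-fold and 10-fold toward omega-6, respectively.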
After 30 days, they took a look at the pigs' hearts. Hearts from the first and third (unbalanced) groups contained more of the "pro-inflammatory" fat arachidonic acid (AA) and less of the "anti-inflammatory" fats EPA and DHA than hearts from the second group. The first and third groups also showed excessive activation of "pro-inflammatory" proteins such as COX-2, the enzyme inhibited by aspirin, ibuprofen and other NSAIDs.
The most striking finding of all was the difference in lipid peroxidation between groups. Lipid peroxidation is a measure of oxidative damage to cellular fats. In the balanced-diet hearts, peroxidation was half the level found in the first group, and one-third the level found in the third group! This shows that omega-3 fats exert a powerful antioxidant effect, but one that can be more than counteracted by excessive omega-6. Nitrosative stress, another type of damage, tracked with omega-6 intake regardless of omega-3, with the third group almost tripling the first two. I think this result is highly relevant to the long-term development of cardiac problems, and perhaps cardiovascular disease in general.
In another study by the same lead author, Sanjoy Ghosh, rats fed a diet enriched in omega-6 from sunflower oil showed an increase in nitrosative damage, damage to mitochondrial DNA, and a decrease in maximum cardiac work capacity (i.e., their hearts were weaker). This is consistent with the previous study and shows that the mammalian heart does not like too much omega-6! The amount of sunflower oil these rats were eating (20% of food by weight) is not far off from the amount of industrial oil the average American eats.
A third paper by Dr. Sheila Innis' group studied the effect of the omega-6 : omega-3 balance on the brain fat composition of pigs, and on the development of neurons in vitro (in a culture dish). There were four diets, the first three similar to those in the first study:
- Deficient. 1.2% LA and 0.05% ALA.
- Contemporary. 10.7% LA and 1.1% ALA.
- Evolutionary. 1.2% LA and 1.1% ALA.
- Supplemented. The contemporary diet plus 0.3% AA and 0.3% DHA.
The first thing they looked at was the animals' ability to convert ALA to DHA and concentrate it in the brain. DHA is critical for brain and eye development and maintenance. The evolutionary diet was the most effective at putting DHA in the brain, with the supplemented diet a close second and the other two lagging behind. The evolutionary diet was also the only one capable of elevating EPA, another important fatty acid derived from ALA. If typical fish oil, rather than isolated DHA and AA, had been given as the supplement, that might not have been the case. Overall, the fatty acid composition of the brain was quite different in the evolutionary group than in the other three groups, which will certainly translate into a variety of effects on brain function.
The researchers then cultured neurons and showed that they require DHA to develop properly in culture, and that long-chain omega-6 fats are a poor substitute. Overall, the paper shows that the modern diet causes a major fatty acid imbalance in the brain, which is expected to lead to developmental problems and probably others as well. This can be partially corrected by supplementing with fish oil.

Together, these studies are a small glimpse of the countless effects we are having on every organ system by eating fats that are unfamiliar to our pre-industrial bodies. In the next post, I'll put this information into the context of the modern human diet.
Thanks to commenter Brock for pointing me to this very interesting paper, "Effects of fish oil on hypertension, plasma lipids, and tumor necrosis factor-alpha in rats with sucrose-induced metabolic syndrome". As we know, sugar gives rats metabolic syndrome when it's added to regular rat chow, probably the same thing it does to humans when added to a processed food diet.
One thing has always puzzled me about sugar. It doesn't appear to cause major metabolic problems when added to an otherwise healthy diet, yet it wreaks havoc in other contexts. One example of the former situation is the Kuna, who are part hunter-gatherer, part agricultural. They eat a lot of refined sugar, but in the context of chocolate, coconut, fish, plantains, root vegetables and limited grains and beans, they are relatively healthy. Perhaps not quite on the same level as hunter-gatherer groups, but healthier than the average modernized person from the point of view of the diseases of civilization.
This paper really sheds light on the matter. The researchers gave a large group of rats access to drinking water containing 30% sucrose, in addition to their normal rat chow, for 21 weeks. The rats drank 4/5 of their calories in the form of sugar water. There's no doubt that this is an extreme treatment. They subsequently developed metabolic syndrome, including abdominal obesity, elevated blood pressure, elevated fasting insulin, elevated triglycerides, elevated total cholesterol and LDL, lowered HDL, greatly increased serum uric acid, greatly elevated liver enzymes suggestive of liver damage, and increased tumor necrosis factor-alpha (TNF-alpha). TNF-alpha is an inflammatory cytokine secreted by visceral (abdominal) fat tissue that may play a role in promoting insulin resistance.
After this initial 12-week treatment, they divided the metabolic syndrome rats into two groups:
- One that continued the sugar treatment, along with a diet enriched in corn and canola oil (increased omega-6).
- A second that continued the sugar treatment, along with a diet enriched in fish oil (increased omega-3).
The two diets contained the same total amount of polyunsaturated fat (PUFA), but had very different omega-6 : omega-3 ratios. The first had a ratio of 9.3 (still better than the average American), while the second had a ratio of 0.02, with most of the omega-3 in the second group coming from EPA and DHA (long-chain, animal omega-3s). The second diet also contained four times as much saturated fat as the first, mostly in the form of palmitic acid.
Compared to the vegetable oil group, the fish oil group had lower fasting insulin, lower blood pressure, lower triglycerides, lower cholesterol, and lower LDL. As a matter of fact, the fish oil group looked as good or better on all of these parameters than a non-sugar-fed control group receiving the extra vegetable oil alone (although the control group isn't a perfect comparison, because it inevitably ate more vegetable oil-containing chow to make up for the calories it wasn't consuming in sugar). The only things that reducing vegetable oil and increasing fish oil didn't fix were the excess weight and the elevated TNF-alpha, although they didn't report liver enzyme levels in these groups. The TNF-alpha finding is not surprising, since it's secreted by visceral fat, which did not decrease in the fish oil group.
I think this is a powerful result. It may have been done in rats, but the evidence is there for a similar mechanism in humans. The Kuna have a very favorable omega-6 : omega-3 ratio, with most of their fat coming from highly saturated coconut and cocoa. This may protect them from their high sugar intake. The Kitavans also have a very favorable omega-6 : omega-3 ratio, with most of their fat coming from coconuts and fish. They don't eat refined sugar, but they do eat a tremendous amount of starch and a generous amount of fruit.
The paper also suggests that the metabolic syndrome is largely reversible.
I believe that both excessive sugar and excessive omega-6 from modern vegetable oils are a problem individually. But if you want to have a much bigger problem, try combining them!
Bone marrow is a food that has been prized throughout history-- from hunter-gatherer tribes to haute cuisine chefs. It's not hard to understand why, once you've tasted it. It's delicate, meaty and fatty. It's also rich in fat-soluble vitamins, including vitamins K1 and K2, although this will depend on what the animal has eaten.
Roasted marrow bones make a simple appetizer. Beef bones are the best because of their size. Select wide bones that are cut about three inches long. They should be from the femur or the humerus, called the "shank bones". These are sometimes available in the frozen meats section of a grocery store; otherwise, a butcher can procure them. If you have access to a farmer's market that sells meat, vendors will typically cut bones to order if you ask.
Recipe
- Preheat oven to 450 F (230 C).
- Place bones, cut side up, in a baking dish or oven-proof skillet.
- Bake for about 15 minutes, until the marrow begins to separate from the bone, but not much longer because it will turn to mush.
- Scoop out and eat the marrow by itself, on sourdough rye toast or however you please.
- Make soup stock from the leftover bones.
I'm always on the lookout for foods rich in vitamin K2 MK-4, because it's so important and so rare in the modern food system. I heard some internet rumors that marrow might be rich in fat-soluble vitamins. Google let me down, so I decided to look through the rat studies that examined the tissue distribution of K2 MK-4.
I found one that looked at the K2 MK-4 content of different tissues in rats fed vitamin K1. Marrow was rich in K2 MK-4, along with testes: it contained 10-20 times more MK-4 than liver by weight, and more than any of the other tissues they tested (serum, liver, spleen, kidney, heart, testes, marrow, brain) except testes. They didn't include values for salivary gland and pancreas, the two richest sources.
If we assume beef marrow has the same amount of MK-4 as rat marrow per weight (I have no idea if this is really the case, but it's probably in the ballpark), two ounces of beef marrow would contain about 10 micrograms MK-4. Not a huge source, but significant nevertheless.
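For anyone who wants to see the arithmetic, here's a minimal sketch. The assumed concentration is simply back-calculated from the "about 10 micrograms per 2 ounces" guess above; it's an extrapolation from rat tissue data, not a measured value for beef marrow:

```python
# Back-of-the-envelope MK-4 estimate for marrow. The assumed concentration is
# back-calculated from the "about 10 micrograms per 2 ounces" figure above;
# it's an extrapolation from rat data, not a measured value for beef.
GRAMS_PER_OUNCE = 28.35
ASSUMED_MK4_UG_PER_GRAM = 10 / (2 * GRAMS_PER_OUNCE)  # ~0.18 micrograms/gram

def mk4_in_marrow_ug(ounces):
    """Estimated micrograms of K2 MK-4 in a given weight of marrow."""
    return ounces * GRAMS_PER_OUNCE * ASSUMED_MK4_UG_PER_GRAM

for oz in (2, 4, 8):
    print(f"{oz} oz marrow: ~{mk4_in_marrow_ug(oz):.0f} micrograms MK-4")

# 2 oz marrow: ~10 micrograms MK-4
# 4 oz marrow: ~20 micrograms MK-4
# 8 oz marrow: ~40 micrograms MK-4
```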
Bone marrow was a prized food in many hunter-gatherer societies. Let's see what Dr. Weston Price has to say about it (from Nutrition and Physical Degeneration):
For the Indians living inside the Rocky Mountain Range in the far North of Canada, the successful nutrition for nine months of the year was largely limited to wild game, chiefly moose and caribou. During the summer months the Indians were able to use growing plants. During the winter some use was made of bark and buds of trees. I found the Indians putting great emphasis upon the eating of the organs of the animals, including the wall of parts of the digestive tract. Much of the muscle meat of the animals was fed to the dogs. It is important that skeletons are rarely found where large game animals have been slaughtered by the Indians of the North. The skeletal remains are found as piles of finely broken bone chips or splinters that have been cracked up to obtain as much as possible of the marrow and nutritive qualities of the bones. These Indians obtain their fat-soluble vitamins and also most of their minerals from the organs of the animals. An important part of the nutrition of the children consisted in various preparations of bone marrow, both as a substitute for milk and as a special dietary ration.
Here's a bit more about these same groups, also from Nutrition and Physical Degeneration:
The condition of the teeth, and the shape of the dental arches and the facial form, were superb. Indeed, in several groups examined not a single tooth was found that had ever been attacked by tooth decay. In an examination of eighty-seven individuals having 2,464 teeth only four teeth were found that had ever been attacked by dental caries. This is equivalent to 0.16 per cent. As we came back to civilization and studied, successively, different groups with increasing amounts of contact with modern civilization, we found dental caries increased progressively, reaching 25.5 per cent of all of the teeth examined at Telegraph Creek, the point of contact with the white man's foods. As we came down the Stikine River to the Alaskan frontier towns, the dental caries problem increased to 40 per cent of all of the teeth.
Evidently, the traditionally-living groups were doing something right.
I stumbled upon an interesting editorial recently in the American Journal of Clinical Nutrition from Dr. Richard Johnson's group, entitled "How Safe is Fructose for Persons With or Without Diabetes?" It was a response to a meta-analysis in the same journal pronouncing fructose safe up to 90 grams per day. That's the amount in eight apples or four cans of soda. Not quite what our hunter-gatherer ancestors were eating! The editorial outlined the case against excessive fructose, which I feel is quite strong.

That led me to another, more comprehensive paper from Dr. Johnson's group, which argues that the amount of fructose found in a food, which they call the "fructose index", is more relevant to health than the food's glycemic index. The glycemic index is a measure of the blood sugar response to a fixed amount of carbohydrate from a particular food. For example, white bread has a high glycemic index because it raises blood sugar more than another food containing the same amount of carbohydrate, say, lentils. Since chronically elevated blood sugar and its natural partner, insulin resistance, are part of the metabolic syndrome, it made sense that the glycemic index would be a good predictor of the metabolic effect of a food. I believed this myself for a long time.

My faith in the concept began to erode when I learned more about the diets of healthy traditional cultures. For example, the Kitavans get 69% of their calories from high-glycemic index carbohydrates (mostly starchy root vegetables), with little added fat-- that's a lot of fast-digesting carbohydrate! Overweight, elevated insulin and other symptoms of the metabolic syndrome are essentially nonexistent. Throughout Africa, healthy cultures make dishes from grains or starchy tubers that are soaked, pounded, fermented and then cooked. The result is a pile of mush that is very easily absorbed by the digestive tract, which is exactly the point of going through all the trouble.

The more I thought about the glycemic index and its relationship to insulin resistance and the metabolic syndrome, the more I realized there is a disconnect in the logic: elevated post-meal glucose and insulin do not necessarily lead to chronically elevated glucose and insulin. Here's what Dr. Mark Segal from Dr. Johnson's group had to say:

We suggest that the [glycemic index] is better aimed at identifying foods that stimulate insulin secretion rather than foods that stimulate insulin resistance. The underlying concept is based on the principle that it is the ingestion of foods that induce insulin resistance that carries the increased risk for obesity and cardiovascular disease and not eating foods that stimulate insulin secretion.
Well said! I decided to take a look through the literature to see if there had been any trials on the relationship between a diet's glycemic index and its ability to cause satiety (fullness) and affect weight. I found a meta-analysis from 2007. Two things are clear from the paper: 1) in the short term, given an equal amount of carbohydrate, a diet with a low glycemic index is more satiating (filling) than one with a high glycemic index, leading to a lower intake of calories; 2) this effect disappears in the long term, and the three trials (1, 2, 3) lasting 10 weeks or longer found no consistent effect on caloric intake or weight*. As a matter of fact, the only statistically significant (p < 0.001) weight difference was a greater weight loss in one of the high-glycemic index groups!
As I've said many times, the body has mechanisms for maintaining weight and caloric intake where they should be in the long term. As long as those mechanisms are working properly, weight and caloric intake will be appropriate. The big question is, how does the modern lifestyle derail those mechanisms?
Dr. Johnson believes fructose is a major contributor. Table sugar, fruit, high-fructose corn syrup and honey are all roughly 50% fructose by calories. Total fructose consumption has increased about 19% in the U.S. since 1970, currently accounting for almost one eighth of our total calorie intake (total sugars account for one quarter!). That's the average, so many people actually consume more.
Fructose, but not starch or its component sugar glucose, causes insulin resistance, elevated serum uric acid (think gout and kidney stones), poorer blood glucose control, increased triglycerides and LDL cholesterol in animal studies and controlled human trials. All of these effects relate to the liver, which clearly does not like excessive fructose (or omega-6 oils). Some of these trials were conducted using doses that are near the average U.S. intake. The effect seems to compound over time both in humans and animals. The overweight, the elderly and the physically unfit are particularly vulnerable. I find this pretty damning.
Drs. Johnson and Segal recommend limiting fructose to 15-40 grams per day, which is the equivalent of about two apples or one soda (choose the apples!). They also recommend temporarily eliminating fructose for two weeks, to allow the body to recover from the negative long-term metabolic adaptation that can persist even when intake is low. I think this makes good sense.
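To make the serving math explicit, here's a quick sketch I've added. The per-serving fructose figures are back-calculated from the equivalences mentioned earlier (90 grams is roughly eight apples or four cans of soda); they're rough averages, not measured values:

```python
# Converting a daily fructose budget into food servings. The per-serving
# figures are back-calculated from the equivalences above (90 g of fructose
# is roughly eight apples or four cans of soda); rough averages only.
FRUCTOSE_PER_APPLE_G = 90 / 8   # ~11 g per apple
FRUCTOSE_PER_SODA_G = 90 / 4    # ~22.5 g per can

def servings_for(fructose_g):
    """Return (apples, sodas) corresponding to a daily fructose budget."""
    return fructose_g / FRUCTOSE_PER_APPLE_G, fructose_g / FRUCTOSE_PER_SODA_G

for budget_g in (15, 40):
    apples, sodas = servings_for(budget_g)
    print(f"{budget_g} g fructose/day ~ {apples:.1f} apples or {sodas:.1f} sodas")

# 15 g fructose/day ~ 1.3 apples or 0.7 sodas
# 40 g fructose/day ~ 3.6 apples or 1.8 sodas
```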
The glycemic index may still be a useful tool for people with poor glucose control, like type II diabetics, but I'm not sure how much it adds to simply restricting carbohydrate. Reducing fructose may be a more effective way to address insulin resistance than eating a low glycemic index diet.
*Here was the author's way of putting it in the abstract: "Because of the increasing number of confounding variables in the available long-term studies, it is not possible to conclude that low-glycaemic diets mediate a health benefit based on body weight regulation. The difficulty of demonstrating the long-term health benefit of a satietogenic food or diet may constitute an obstacle to the recognition of associated claims." In other words, the data not supporting our favorite hypothesis is an obstacle to its recognition. You don't say?
Several commenters have asked for my opinion on recent statements by prominent health researchers that many Americans are suffering from unrecognized vitamin A toxicity. Dr. John Cannell of the Vitamin D Council is perhaps the most familiar of them. Dr. Cannell's mission is to convey the benefits of vitamin D to the public, and the Vitamin D Council's website is a great resource.

Vitamin A is a very important nutrient. Like vitamin D, it has its own nuclear receptors, which alter the transcription of a number of genes in a wide variety of tissues. Thus, it is a very fundamental nutrient to health. It's necessary for proper development, vision, mineral metabolism, bone health, immune function, the integrity of skin and mucous membranes, and many other things. Vitamin A is a fat-soluble vitamin, and as such, it is possible to overdose. So far, everyone is in agreement.

The question of optimal intake is where opinions begin to diverge. Hunter-gatherers and healthy non-industrial cultures, who almost invariably had excellent dental and skeletal development and health, often had a very high intake of vitamin A (according to Dr. Weston Price and others). This is not surprising, considering their fondness for organ meats. A meager 2 ounces of beef liver contains about 9,500 IU, or almost 200% of the U.S. and Canadian recommended daily allowance (RDA). Kidney and eye are rich in vitamin A, as are many of the marine oils consumed by the Inuit and other arctic groups. If we can extrapolate from historical hunter-gatherers, our ancestors didn't waste organs. In fact, in times of plenty, some groups discarded the muscle tissue and ate the organs and fat. Carnivorous animals often eat the organs first, because they know exactly where the nutrients are. Zookeepers know that if you feed a lion nothing but muscle, it does not thrive.

This is the background against which we must consider the question of vitamin A toxicity. Claims of toxicity must be reconciled with the fact that healthy cultures often consumed large amounts of vitamin A without any ill effects. Well, you might be surprised to hear me say that I do believe some Americans and Europeans suffer from what you might call vitamin A toxicity. There is a fairly consistent association between vitamin A intake and bone mineral density, osteoporosis and fracture risk. It holds true across cultures and sources of vitamin A. Chris Masterjohn reviewed the epidemiology here; I recommend reading his very thorough article if you want more detail. The optimum intake in some studies is 2,000-3,000 IU, corresponding to about 50% of the RDA. People who eat more or less than this amount tend to suffer from poorer bone health. This is where Dr. Cannell and others are coming from when they say vitamin A toxicity is common.

The only problem is, this position ignores the interactions between fat-soluble vitamins. Vitamin D strongly protects against vitamin A toxicity and vice versa. As a matter of fact, "vitamin A toxicity" is almost certainly a relative deficiency of vitamin D. Vitamin D deficiency is also tightly correlated with low bone mineral density, osteoporosis and fracture risk. A high vitamin A intake requires vitamin D to balance it. The epidemiological studies showing an association between high-normal vitamin A intake and reduced bone health all drew on populations that were moderately to severely vitamin D deficient on average. At optimal vitamin D levels (40-70 ng/mL 25(OH)D), it would take a whopping dose of vitamin A to induce toxicity. You might get there if you ate nothing but beef liver for a week or two.

The experiment hasn't been done under controlled conditions in humans, but if you believe the animal studies, the optimal intake for bone mineral density is a high intake of both vitamins A and D. And guess what? A high intake of vitamins A and D also increases the need for vitamin K2. That's because they work together. For example, vitamin D3 increases the secretion of matrix Gla protein, and vitamin K2 activates it. Is it any surprise that the optimal proportions of A, D and K occur effortlessly in a lifestyle that includes outdoor activity and whole, natural animal foods? This is the blind spot of the researchers who have warned of vitamin A toxicity: uncontrolled reductionism. Vitamins do not act in a vacuum; they interact with one another. If your theory doesn't agree with empirical observations from healthy cultures, it's back to the drawing board.

High-vitamin cod liver oil is an excellent source of vitamins A and D because it contains a balanced amount of both. Unfortunately, many brands use processing methods that reduce the amount of one or more vitamins. See the Weston Price Foundation's recommendations for the highest quality cod liver oils; they also happen to be the cheapest per dose. I order Green Pasture high-vitamin cod liver oil through Live Superfoods (it's cheaper than ordering directly). So is vitamin A toxicity a concern? Not really; the concern is vitamin D deficiency.
I just discovered a wonderful new tool from Google.org, Google Flu Trends. Google.org is the philanthropic branch of Google. Flu Trends gives you real-time data on flu incidence in your U.S. state, as well as for the country as a whole. Here's how it works:

We've found that certain search terms are good indicators of flu activity. Google Flu Trends uses aggregated Google search data to estimate flu activity in your state up to two weeks faster than traditional flu surveillance systems.
Each week, millions of users around the world search for online health information. As you might expect, there are more flu-related searches during flu season, more allergy-related searches during allergy season, and more sunburn-related searches during the summer.
Google's data match up well with U.S. Centers for Disease Control and Prevention (CDC) data on flu incidence, but are available 1-2 weeks before CDC data. Here's a comparison of Flu Trends and CDC data for previous years. Plus, Google makes the information easily accessible and user-friendly.
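To make the general idea concrete, here's a toy sketch of the kind of model that can turn search volume into a flu estimate: fit a simple linear relationship between historical flu-related query frequency and CDC influenza-like-illness (ILI) rates, then apply it to the current week's queries. This is only my illustration of the approach; the numbers are invented, and Google's actual model is surely more sophisticated:

```python
# A toy version of the Flu Trends idea: relate the historical share of
# flu-related searches to CDC influenza-like-illness (ILI) rates with an
# ordinary least-squares line, then estimate this week's flu activity from
# this week's search share. All numbers are invented for illustration.
import statistics

# (flu-related share of searches, CDC ILI rate %) for a handful of past weeks
history = [(0.010, 1.1), (0.014, 1.6), (0.020, 2.4), (0.028, 3.5), (0.022, 2.7)]

xs = [share for share, _ in history]
ys = [ili for _, ili in history]
x_mean, y_mean = statistics.mean(xs), statistics.mean(ys)

slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

this_weeks_share = 0.025  # current flu-related search share
estimated_ili = intercept + slope * this_weeks_share
print(f"Estimated ILI rate this week: ~{estimated_ili:.1f}%")  # ~3.1%
```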
I think this is a fantastic use of the massive amount of raw information on the internet. It's amazing what a person can do with a brain and an internet connection these days.
It certainly can in rats. In April 2007, Dr. Cees Vermeer and his group published a paper on the effect of vitamin K on arterial calcification (the accumulation of calcium in the arteries). As I mentioned two posts ago, arterial calcification is tightly associated with the risk of heart attack and death. Warfarin-treated rats are an established model of arterial calcification. Warfarin also causes calcification in humans. The drug is a "blood thinner" that inhibits vitamin K recycling, and inhibits the conversion of vitamin K1 (phylloquinone) to K2 MK-4 (menaquinone-4). This latter property turns out to be the critical one in the calcification process.
Rats are able to convert vitamin K1 to K2 MK-4, whereas humans don't seem to convert well. Conversion efficiency varies between species. Dr. Vermeer's group treated rats with warfarin for 6 weeks, during which they developed extensive arterial calcification. They also received vitamin K1 to keep their blood clotting properly. At 6 weeks, the warfarin-treated rats were broken up into several groups:
- One continued on the warfarin and K1 diet
- One was placed on a diet containing a normal amount of K1 (no warfarin)
- One was placed on a high K1 diet (no warfarin)
- The last was placed on a high K2 MK-4 diet (no warfarin)
After 6 more weeks, the first two groups developed even more calcification, while the third and fourth groups lost about 40% of their arterial calcium. The high vitamin K groups also saw a decrease in cell death in the artery wall, a decrease in uncarboxylated (inactive) matrix Gla protein (MGP, a protein that protects arteries against calcification), and an increase in arterial elasticity. They also measured the vitamin K content of the aortas from each group. The group that received the 12-week warfarin treatment had a huge amount of K1 accumulated in the aorta, but no K2 MK-4. This is expected, because warfarin inhibits the conversion of K1 to K2 MK-4. It's notable that when conversion to K2 was blocked, K1 alone was totally ineffective at activating MGP and preventing calcification.
In the group fed high K1 but no warfarin, there was about three times more K2 MK-4 in the aortas than K1, suggesting that they had converted it effectively and that vascular tissue selectively accumulates K2 MK-4. A high K1 intake was required for this effect, however, since the normal K1 diet did not reverse calcification. The rats fed high K2 MK-4 had only K2 MK-4 in their aortas, as expected.

What does this mean for us? K2 MK-4 appears to be the form of vitamin K that arteries prefer (although not enough is known about the longer menaquinones, such as MK-7, to rule out a possible effect). Humans don't seem to be very good at making the conversion from K1 to K2 MK-4 (at normal intakes; there are suggestions that at artificially large doses we can do it). That means we need to ensure an adequate K2 MK-4 intake to prevent or reverse arterial calcification; eating K1-rich greens won't cut it. It's worth noting that the amounts of K1 and K2 used in the paper were very large, far beyond what is obtainable through food. But the regression took only 6 weeks, so it's possible that a smaller amount of K2 MK-4 over a longer period could have the same effect in humans.
K2 MK-4 (and perhaps other menaquinones like MK-7) may turn out to be an effective treatment for arterial calcification and cardiovascular disease in general. It's extremely effective at preventing osteoporosis-related fractures in humans. That's a highly significant fact. Osteoporosis and arterial calcification often come hand-in-hand: calcium is lost from bone even as it accumulates in the arteries. Thus, they are not a result of insufficient or excessive calcium, but of a failure to use the available calcium effectively. In the warfarin-treated rats described above, the serum (blood) calcium concentration was the same in all groups. Osteoporosis and arterial calcification are two sides of the same coin, and the fact that one can be addressed with K2 MK-4 means that the other may be as well.
Both osteoporosis and arterial calcification may turn out to be symptoms of vitamin K2 deficiency, resulting from the modern fear of animal fats and organs, and the deterioration of traditional animal husbandry practices. So eat your pastured dairy, organs, fish roe and shellfish! And if you have arterial calcification, as judged by a heart scan, you may want to consider supplementing with additional K2 MK-4 (also called menaquinone-4 and menatetrenone).
The osteoporosis studies were done with 45 milligrams per day, which was well tolerated but seems excessive to me. Smaller doses were not tested. From the limited information available on the K2 content of foods, 1 milligram of K2 MK-4 per day seems like the upper limit of what you can get from food. That's about 40 times more than the average person eats. Anything more and you're outside your body's operating parameters. Make sure you're getting adequate vitamin D3 and A if you supplement with K2. Vitamin D3 in particular increases the secretion of MGP, so the two work in concert.
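Just to put those amounts on one scale, here's a quick comparison I've added. The "average intake" figure is only what the 40-fold statement above implies (1 milligram divided by 40), not an independently measured value:

```python
# The K2 MK-4 amounts above, on a single scale (micrograms per day). The
# average-intake figure is only what the "about 40 times" statement implies
# (1 mg divided by 40), not an independently measured value.
TRIAL_DOSE_UG = 45_000                            # 45 mg/day in the trials
FOOD_CEILING_UG = 1_000                           # ~1 mg/day, rough food ceiling
IMPLIED_AVERAGE_INTAKE_UG = FOOD_CEILING_UG / 40  # ~25 micrograms/day

print(f"Trial dose vs. food ceiling: ~{TRIAL_DOSE_UG / FOOD_CEILING_UG:.0f}x")
print(f"Trial dose vs. average intake: ~{TRIAL_DOSE_UG / IMPLIED_AVERAGE_INTAKE_UG:.0f}x")

# Trial dose vs. food ceiling: ~45x
# Trial dose vs. average intake: ~1800x
```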
It certainly can in rats. In April 2007, Dr. Cees Vermeer and his group published a paper on the effect of vitamin K on arterial calcification (the accumulation of calcium in the arteries). As I mentioned two posts ago, arterial calcification is tightly associated with the risk of heart attack and death. Warfarin-treated rats are an established model of arterial calcification. Warfarin also causes calcification in humans. The drug is a "blood thinner" that inhibits vitamin K recycling, and inhibits the conversion of vitamin K1 (phylloquinone) to K2 MK-4 (menaquinone-4). This latter property turns out to be the critical one in the calcification process.
Conversion efficiency varies between species: rats convert vitamin K1 to K2 MK-4 readily, whereas humans don't seem to convert it well. Dr. Vermeer's group treated rats with warfarin for 6 weeks, during which they developed extensive arterial calcification. The rats also received vitamin K1 to keep their blood clotting properly. At 6 weeks, the warfarin-treated rats were divided into several groups: - One continued on the warfarin and K1 diet
- One was placed on a diet containing a normal amount of K1 (no warfarin)
- One was placed on a high K1 diet (no warfarin)
- The last was placed on a high K2 MK-4 diet (no warfarin)
After 6 more weeks, the first two groups developed even more calcification, while the third and fourth groups lost about 40% of their arterial calcium. The high vitamin K groups also saw a decrease in cell death in the artery wall, a decrease in uncarboxylated (inactive) matrix Gla protein (MGP, a vitamin K-dependent inhibitor of calcification), and an increase in arterial elasticity. The researchers also measured the vitamin K content of the aortas from each group. The group that received the 12-week warfarin treatment had a huge amount of K1 accumulated in the aorta, but no K2 MK-4. This is expected, because warfarin inhibits the conversion of K1 to K2 MK-4. It's notable that when that conversion was blocked, K1 alone was totally ineffective at activating MGP and preventing calcification.
In the group fed high K1 but no warfarin, there was about three times more K2 MK-4 in the aortas than K1, suggesting that they had converted it effectively and that vascular tissue selectively accumulates K2 MK-4. A high K1 intake was required for this effect, however, since the normal K1 diet did not reverse calcification. The rats fed high K2 MK-4 had only K2 MK-4 in their aortas, as expected.

What does this mean for us? K2 MK-4 appears to be the form of vitamin K that arteries prefer (although not enough is known about the longer menaquinones, such as MK-7, to rule out a possible effect). Humans don't seem to be very good at making the conversion from K1 to K2 MK-4 (at normal intakes; there are suggestions that at artificially large doses we can do it). That means we need to ensure an adequate K2 MK-4 intake to prevent or reverse arterial calcification; eating K1-rich greens won't cut it. It's worth noting that the amounts of K1 and K2 used in the paper were very large, far beyond what is obtainable through food. But the regression took only 6 weeks, so it's possible that a smaller amount of K2 MK-4 over a longer period could have the same effect in humans.
K2 MK-4 (and perhaps other menaquinones like MK-7) may turn out to be an effective treatment for arterial calcification and cardiovascular disease in general. It's extremely effective at preventing osteoporosis-related fractures in humans, and that's a highly significant fact, because osteoporosis and arterial calcification often come hand-in-hand: calcium is scarce where it's needed (bone) while it accumulates where it isn't (arteries). That pattern suggests the problem is not insufficient or excessive calcium, but a failure to put the available calcium where it belongs. Consistent with this, in the warfarin-treated rats described above, the serum (blood) calcium concentration was the same in all groups. Osteoporosis and arterial calcification are two sides of the same coin, and the fact that one can be addressed with K2 MK-4 means that the other may be as well.
Both osteoporosis and arterial calcification may turn out to be symptoms of vitamin K2 deficiency, resulting from the modern fear of animal fats and organs, and the deterioration of traditional animal husbandry practices. So eat your pastured dairy, organs, fish roe and shellfish! And if you have arterial calcification, as judged by a heart scan, you may want to consider supplementing with additional K2 MK-4 (also called menaquinone-4 and menatetrenone).
The osteoporosis studies used 45 milligrams per day, which was well tolerated but seems excessive to me; smaller doses were not tested. From the limited information available on the K2 content of foods, 1 milligram of K2 MK-4 per day seems like the upper limit of what you can get from food, and that's already about 40 times more than the average person eats. Anything beyond that and you're outside the range of intakes your body has ever had to handle from food. Make sure you're getting adequate vitamin D3 and vitamin A if you supplement with K2; vitamin D3 in particular increases the secretion of MGP, so the two work in concert.
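To put those numbers side by side, here's a rough back-of-the-envelope calculation in Python. It's my own sketch, not from any study: the 45 mg/day trial dose, the ~1 mg/day food ceiling, and the "about 40 times the average intake" figure come from the paragraph above, while the implied average intake of roughly 25 micrograms per day is simply arithmetic from those numbers.

# Back-of-the-envelope comparison of the K2 MK-4 intakes discussed above.
# Source figures: 45 mg/day (osteoporosis trials), ~1 mg/day (rough ceiling
# from food), and the food ceiling being ~40x average intake. The implied
# average intake (~25 micrograms/day) is inferred, not measured.

osteoporosis_dose_mg = 45.0               # clinical trial dose
food_ceiling_mg = 1.0                     # rough upper limit obtainable from food
average_intake_mg = food_ceiling_mg / 40  # implied average intake

print(f"Implied average intake: {average_intake_mg * 1000:.0f} micrograms/day")
print(f"Food ceiling vs. average intake: {food_ceiling_mg / average_intake_mg:.0f}x")
print(f"Trial dose vs. food ceiling: {osteoporosis_dose_mg / food_ceiling_mg:.0f}x")
print(f"Trial dose vs. average intake: {osteoporosis_dose_mg / average_intake_mg:.0f}x")

Run it and you get 25 micrograms/day, 40x, 45x and 1800x: the trial dose is roughly 45 times the most you could plausibly get from food, and on the order of 1,800 times what the average person actually eats, which is part of why it strikes me as excessive.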
Traditional cultures throughout the world went to great lengths to maximize the nutritional value of the ingredients they had. Fermentation is a technique that was widely used for preparing grains and legumes. Humans are not well adapted to grains or legumes, in large part due to their assortment of anti-nutrients (substances that prevent the absorption of nutrients) and other toxins. Fermentation is a very effective way to eliminate anti-nutrients, making grains and legumes more nutritious and easily digested.

Idlis are steamed, naturally leavened cakes made from a fermented mixture of ground rice and beans. They're mild, savory and fluffy, and pair well with nearly any dish. I think they fill in well for bread. Due to the combination of rice and beans, they contain a fair amount of high-quality complete protein. They are also very economical. Idlis have their roots in Southern Indian cuisine more than 1,000 years ago. They may have originated as a fermented bean dish, with rice added to the recipe later in history.

The recipe takes 2-3 days to complete, but actually doesn't require much work. First, the beans and rice are soaked separately, then they are ground and mixed, then they are allowed to ferment for 24-48 hours and steamed. This type of days-long soaking and fermentation process is common in many grain-based cultures worldwide.

The recipe traditionally calls for short-grain white rice and urad dal (split black gram). I've been using short-grain brown rice with good results. You will only be able to find urad dal in an Indian grocer, specialty store or online. If you can't find urad dal, try experimenting with other types of mild dry beans.

Ingredients and materials
- One cup urad dal or other dried bean
- Two cups short-grain brown or white rice
- One teaspoon fenugreek (optional)
- Two teaspoons non-iodized salt
- Filtered or otherwise dechlorinated water
- Muffin tray
- Large pot for steaming (optional)
Recipe
- Soak urad dal and rice separately for 6 hours (longer if you're using a different type of bean). Add fenugreek to the rice before soaking (optional). It's used traditionally to speed fermentation.
- Pour water off the urad dal and rice/fenugreek mixture. Don't rinse.
- Grind the urad dal in a food processor or blender with a minimum amount of water until it's a smooth paste. The water must not be chlorinated or it will kill our bacteria! Brita-type water filters remove chlorine, as does boiling or leaving water uncovered overnight.
- Grind the rice/fenugreek mixture coarsely with a minimum amount of dechlorinated water.
- Mix the ground urad dal, ground rice and salt. The salt must be non-iodized, or the batter will not ferment! Pickling salt, kosher salt and unrefined sea salt work well. Add dechlorinated water until it's a thick paste, stirrable but not liquid.
- Ferment for 24-48 hours. You know it's ready when the dough has risen significantly, and the odor has gone from harsh and beany to mild and savory. Fermentation time will depend on the ambient temperature.
- Fill muffin trays about half-way with batter and steam until a knife inserted into them comes out clean, 15-20 minutes. You can also bake them at 350 F. It's not traditional, but I like them baked almost as much. If you really want to be traditional, you can buy an idli steamer.
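If you want to make a bigger or smaller batch, the proportions are easy to keep straight: one part urad dal to two parts rice, with about two teaspoons of salt and one teaspoon of fenugreek per standard batch. Here's a small, hypothetical Python helper that scales those quantities; the function and its names are mine, and only the base amounts come from the ingredient list above.

# Hypothetical helper for scaling the idli recipe above.
# Base quantities (1 cup urad dal, 2 cups rice, 2 tsp salt, 1 tsp fenugreek)
# come from the ingredient list; the rest is just arithmetic.

BASE_RECIPE = {
    "urad dal (cups)": 1.0,
    "short-grain rice (cups)": 2.0,
    "non-iodized salt (tsp)": 2.0,
    "fenugreek (tsp, optional)": 1.0,
}

def scale_recipe(batches):
    """Return ingredient quantities for the given number of batches."""
    return {name: qty * batches for name, qty in BASE_RECIPE.items()}

if __name__ == "__main__":
    # Example: a half batch for a small household.
    for name, qty in scale_recipe(0.5).items():
        print(f"{name}: {qty:g}")

Soaking, fermentation and steaming times shouldn't need adjusting when you scale the ingredients; as noted above, fermentation time is driven mainly by the ambient temperature.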
Here are photos of my last batch.
Soaking the urad dal and rice:
Batter, pre-fermentation:
Batter, post-fermentation (48 hours). It more than doubled in volume. The color didn't actually change, that's just my camera.
Ready to steam or bake.
After baking. One escaped! Into my belly.
Thanks to Soumya Dey and Wikipedia for the top photo.