Sunday, April 28, 2019

Subcutaneous versus visceral fat: How to tell the difference?

The photos below, from Wikipedia, show two patterns of abdominal fat deposition. The one on the left shows predominantly subcutaneous abdominal fat deposition. The one on the right shows visceral abdominal fat deposition, around internal organs, together with a significant amount of subcutaneous fat deposition as well.


Body fat is not an inert mass used only to store energy. Body fat can be seen as a “distributed organ”, as it secretes a number of hormones into the bloodstream. For example, it secretes leptin, which regulates hunger. It secretes adiponectin, which has many health-promoting properties. It also secretes tumor necrosis factor-alpha (more recently referred to as simply “tumor necrosis factor” in the medical literature), which promotes inflammation. Inflammation is necessary to repair damaged tissue and deal with pathogens, but too much of it does more harm than good.

How does one differentiate subcutaneous from visceral abdominal fat?

Subcutaneous abdominal fat shifts position more easily as one’s body moves. When one is standing, subcutaneous fat often tends to fold around the navel, creating a “mouth” shape. Subcutaneous fat is also easier to hold in one’s hand, as shown in the left photo above. Because subcutaneous fat tends to “shift” more easily as one changes the position of the body, a large difference between your waist circumference measured lying down and standing up (a one-inch difference can be considered large) probably indicates a significant amount of subcutaneous fat.

Waist circumference is a variable that reflects individual changes in body fat percentage fairly well. This is especially true as one becomes lean (e.g., around 14-17 percent of body fat or less for men, and 21-24 for women), because as that happens abdominal fat accounts for an increasingly large proportion of total body fat. For people who are lean, a 1-inch reduction in waist circumference will frequently translate into a 2-3 percent reduction in body fat percentage. Having said that, waist circumference comparisons between individuals are often misleading, because waist-to-fat ratios vary a lot from one individual to another (like almost any trait). This means that someone with a 34-inch waist (measured at the navel) may have a lower body fat percentage than someone with a 33-inch waist.
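
For illustration, here is a minimal sketch of the two rules of thumb above (the lying-versus-standing waist test, and the 1-inch-to-2-3-percent conversion). The thresholds are the rough heuristics from this post, not validated clinical formulas, and the example measurements are made up:

    def waist_shift(standing_in, lying_in):
        """Difference between standing and lying waist circumference (inches).
        Per the rule of thumb above, a difference of 1 inch or more suggests
        a significant amount of subcutaneous abdominal fat."""
        diff = standing_in - lying_in
        return diff, diff >= 1.0

    def est_bodyfat_drop(waist_reduction_in):
        """Rough estimate for lean individuals: each 1-inch waist reduction
        maps to about a 2-3 percent drop in body fat (midpoint 2.5 used)."""
        return 2.5 * waist_reduction_in

    diff, significant = waist_shift(standing_in=36.0, lying_in=34.8)
    print(f"Shift: {diff:.1f} in; significant subcutaneous fat: {significant}")
    print(f"Est. body fat drop per inch off the waist: ~{est_bodyfat_drop(1.0):.1f}%")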

Subcutaneous abdominal fat is hard to mobilize; that is, it is hard to burn through diet and exercise. This is why it is often called the “stubborn” abdominal fat. One reason for the difficulty in mobilizing subcutaneous abdominal fat is that the network of blood vessels is not as dense in the area where this type of fat occurs as it is with visceral fat. Another reason, related to the degree of vascularization, is that subcutaneous fat is farther away from the portal vein than visceral fat. As such, mobilized subcutaneous fat has to travel a longer distance to reach the main “highway” that will take it to other tissues (e.g., muscle) for use as energy.

In terms of health, excess subcutaneous fat is not nearly as detrimental as excess visceral fat. Excess visceral fat typically occurs together with excess subcutaneous fat, but not necessarily the other way around. For instance, sumo wrestlers frequently have excess subcutaneous fat, but little or no visceral fat. The greater health-detrimental effect of excess visceral fat is probably related to its proximity to the portal vein, which amplifies the negative health effects of excessive pro-inflammatory hormone secretion. Those hormones reach a major transport “highway” rather quickly.

Even though excess subcutaneous body fat is more benign than excess visceral fat, excess body fat of any kind is unlikely to be health-promoting. From an evolutionary perspective, excess body fat impaired agile movement and decreased circulating adiponectin levels, with the latter leading to a host of negative health effects. In modern humans, negative health effects may be much less pronounced with subcutaneous than visceral fat, but they will still occur.

Based on studies of isolated hunter-gatherers, it is reasonable to estimate “natural” body fat levels among our Stone Age ancestors, and thus optimal body fat levels in modern humans, to be around 6-13 percent in men and 14-20 percent in women.

If you think that being overweight probably protected some of our Stone Age ancestors during times of famine, here is one interesting factoid to consider. It would take over a month for a man weighing 150 lbs and with 10 percent body fat to die from starvation, and death would typically not be caused by too little body fat being left for use as a source of energy. In starvation, death is normally caused by heart failure, as the body slowly breaks down muscle tissue (including heart muscle) to maintain blood glucose levels.
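
A back-of-the-envelope check of that time frame, assuming roughly 9 kcal per gram of fat and a daily energy expenditure of about 2,000 kcal (round-number assumptions of mine, not figures from the post), and ignoring the fact that adipose tissue is not pure lipid:

    LBS_TO_GRAMS = 453.6     # grams per pound
    KCAL_PER_G_FAT = 9.0     # approximate energy density of fat
    DAILY_KCAL = 2000.0      # assumed daily energy expenditure

    weight_lbs = 150.0
    body_fat_fraction = 0.10

    fat_kcal = weight_lbs * body_fat_fraction * LBS_TO_GRAMS * KCAL_PER_G_FAT
    print(f"Fat reserve: ~{fat_kcal:,.0f} kcal")                       # ~61,000 kcal
    print(f"Days covered by fat alone: ~{fat_kcal / DAILY_KCAL:.0f}")  # ~31 days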

References:

Arner, P. (1984). Site differences in human subcutaneous adipose tissue metabolism in obesity. Aesthetic Plastic Surgery, 8(1), 13-17.

Brooks, G.A., Fahey, T.D., & Baldwin, K.M. (2005). Exercise physiology: Human bioenergetics and its applications. Boston, MA: McGraw-Hill.

Fleck, S.J., & Kraemer, W.J. (2004). Designing resistance training programs. Champaign, IL: Human Kinetics.

Taubes, G. (2007). Good calories, bad calories: Challenging the conventional wisdom on diet, weight control, and disease. New York, NY: Alfred A. Knopf.

Friday, March 22, 2019

Total cholesterol and cardiovascular disease: A U-curve relationship

The hypothesis that blood cholesterol levels are positively correlated with heart disease (the lipid hypothesis) dates back to Rudolf Virchow in the mid-1800s.

One famous study that supported this hypothesis was Ancel Keys's Seven Countries Study, conducted between the 1950s and 1970s. This study eventually served as the foundation on which much of the advice that we receive today from doctors is based, even though several studies published since then provide little support for the lipid hypothesis.

The graph below (from O Primitivo) shows the results of one study, involving many more countries than Keys's Seven Countries Study, that actually suggests a NEGATIVE linear correlation between total cholesterol and cardiovascular disease.


Now, most relationships in nature are nonlinear, with quite a few following a pattern that looks like a U-curve (plain or inverted); sometimes called a J-curve pattern. The graph below (also from O Primitivo) shows the U-curve relationship between total cholesterol and mortality, with cardiovascular disease mortality indicated through a dotted red line at the bottom.

This graph was obtained through a nonlinear analysis, and I think it provides a better picture of the relationship between total cholesterol (TC) and mortality. Based on this graph, the best TC range is somewhere between 210, where cardiovascular disease mortality is minimized, and 220, where total mortality is minimized.

The total mortality curve is the one indicated by the solid blue line at the top. In fact, it suggests that mortality increases sharply as TC decreases below 200.

Now, these graphs relate TC with disease and mortality, and say nothing about LDL cholesterol (LDL). In my own experience, and that of many people I know, a TC of about 200 will typically be associated with a slightly elevated LDL (e.g., 110 to 150), even if one has a high HDL cholesterol (i.e., greater than 60).

Yet, most people who have an LDL greater than 100 will be told by their doctors, usually with the best of intentions, to take statins, so that they can "keep their LDL under control". (LDL levels are usually calculated, not measured directly, which itself creates a whole new set of problems.)

Alas, reducing LDL to 100 or less will typically reduce TC below 200. If we go by the graphs above, especially the one showing the U-curves, these folks' risk for cardiovascular disease and mortality will go up - exactly the opposite effect that they and their doctors expected. And that will cost them financially as well, as statin drugs are expensive, in part to pay for all those TV ads.

Wednesday, February 27, 2019

Want to improve your cholesterol profile? Replace refined carbs and sugars with saturated fat and cholesterol in your diet

An interesting study by Clifton and colleagues (1998; full reference and link at the end of this post) looked at whether LDL cholesterol particle size distribution at baseline (i.e., at the beginning of the study) was a determinant of lipid profile changes in each of two diets – one low and the other high in fat. This study highlights a few interesting points made in a previous post, which are largely unrelated to the main goal or findings of the study, but are supported by its side findings:

- As one increases dietary cholesterol and fat consumption, particularly saturated fat, circulating HDL cholesterol increases significantly. This happens whether one is taking niacin or not, although niacin seems to help, possibly as an independent (not moderating) factor. Increasing serum vitamin D levels, which can be done through sunlight exposure and supplementation, is also known to increase circulating HDL cholesterol.

- As one increases dietary cholesterol and fat consumption, particularly saturated fat, triglycerides in the fasting state (i.e., measured after an 8-hour fast) decrease significantly, particularly on a low carbohydrate diet. Triglycerides in the fasting state are negatively correlated with HDL cholesterol; they go down as HDL cholesterol goes up. This happens whether one is taking niacin or supplementing omega 3 fats or not, although these seem to help, possibly as independent factors.

- If one increases dietary fat intake, without also decreasing carbohydrate intake (particularly in the form of refined grains and sugars), LDL cholesterol will increase. Even so, LDL particle sizes will shift to more benign forms, which are the larger forms. Not all LDL particles change to benign forms, and there seem to be some genetic factors that influence this. LDL particles larger than 26 nm in diameter simply cannot pass through the gaps in the endothelium, which is a thin layer of cells lining the interior surface of arteries, and thus do not induce plaque formation.

The study by Clifton and colleagues (1998) involved 54 men and 51 women with a wide range of lipid profiles. They first underwent a 2-week low fat period, after which they were given two liquid supplements in addition to their low fat diet, for a period of 3 weeks. One of the liquid supplements contained 31 to 40 g of fat, and 650 to 845 mg of cholesterol. The other was fat and cholesterol free.

Studies that adopt a particular diet at baseline have the advantage of starting all participants from a uniform diet across conditions. They also typically have one common characteristic: the baseline diet reflects the authors’ beliefs about what an ideal diet is. That is not always the case, of course; but if it was the case here, we have a particularly interesting study, because the side findings discussed below contradicted the authors’ beliefs.

The table below shows the following measures for the participants in the study: age, body mass index (BMI), waist-to-hip ratio (WHR), total cholesterol, triglycerides, low-density lipoprotein (LDL) cholesterol, and three subtypes of high-density lipoprotein (HDL) cholesterol. LDL cholesterol is colloquially known as the “bad” type, and HDL as the good one (which is an oversimplification). In short, the participants were overweight, middle-aged men and women, with relatively poor lipid profiles.


At the bottom of the table is the note “P < 0.001”, following a small “a”. This means that in the rows marked with an “a”, like the “WHR” row, the difference in the averages (e.g., 0.81 for women and 0.93 for men, in the WHR row) was statistically significant. More precisely, the likelihood that the difference was due to chance alone was lower than 0.001, or 0.1 percent. Usually a difference between averages (a.k.a. means) associated with a P < 0.05 is considered statistically significant.

Since the LDL cholesterol concentrations (as well as other lipoprotein concentrations) are listed in the table in mmol/L, and many people receive those measures in mg/dL in blood lipid profile test reports, below is a conversion table for LDL cholesterol (from Wikipedia).
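
The conversion itself is linear, so a short sketch can stand in for values not on the table. The 38.67 factor for cholesterol is standard; the triglyceride factor (about 88.57) is included here for completeness, even though it is not part of the LDL table:

    def chol_mmoll_to_mgdl(mmol_per_l):
        """Convert a cholesterol concentration from mmol/L to mg/dL."""
        return mmol_per_l * 38.67

    def trig_mmoll_to_mgdl(mmol_per_l):
        """Convert a triglyceride concentration from mmol/L to mg/dL."""
        return mmol_per_l * 88.57

    print(f"LDL of 3.0 mmol/L = {chol_mmoll_to_mgdl(3.0):.0f} mg/dL")  # ~116 mg/dL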


The table below shows the dietary intake in the low and high fat diets. Note that in the high fat diet, not only is the fat intake higher, but so is the cholesterol intake. The latter is significantly higher, more than 4 times the intake in the low fat diet, and about 2.5 times the recommended daily value by the U.S. Food and Drug Administration. The total calorie intake is reported as slightly lower in the high fat diet than in the low fat diet.


Note that the largest increase was in saturated fat, followed by an almost equally large increase in monounsaturated fat. This, together with the increase in cholesterol, mimics a move to a diet where fatty meat and organs are consumed in higher quantities, with a corresponding reduction in the intake of refined carbohydrates (e.g., bread, pasta, sugar, potatoes) and lean meats.

Finally, the table below shows the changes in lipid profiles in the low and high fat diets. Note that all subtypes of HDL (or "good") cholesterol concentrations were significantly higher in the high fat diet, which is very telling, because HDL cholesterol concentrations are much better predictors of cardiovascular disease than LDL or total cholesterol concentrations. The higher the HDL cholesterol, the lower the risk of cardiovascular disease.


In the table above, we also see that triglycerides are significantly lower in the high fat diet, which is also good, because high fasting triglyceride concentrations are associated with cardiovascular disease and also insulin resistance (which is associated with diabetes).

However, the total and LDL cholesterol were also significantly higher in the high fat compared to the low fat diet. Is this as bad as it sounds? Not when we look at other factors that are not clear from the tables in the article.

One of those factors is the likely change in LDL particle size. LDL particle sizes almost always increase with significant increases in HDL; frequently going up in diameter beyond 26 nm, and thus passing the threshold beyond which an LDL particle can no longer penetrate the endothelium and help form a plaque.

Another important factor to take into consideration is the somewhat strange decision by the authors to use the Friedewald equation to estimate the LDL concentrations in the low and high fat diets. Through the Friedewald equation, LDL is calculated as follows (where TC is total cholesterol):

    LDL = TC – HDL – Triglycerides / 5

Here is one of the problems with the Friedewald equation. Let us assume that an individual has the following lipid profile numbers: TC = 200, HDL = 50, and trigs. = 150. The calculated LDL will be 120. Let us assume that this same individual reduces trigs. to 50, from the previous 150, keeping all of the other measures constant. This is a major improvement. Yet, the calculated LDL will now be 140, and a doctor will tell this person to consider taking statins!
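
The same worked example as a short script (all values in mg/dl):

    def friedewald_ldl(tc, hdl, trigs):
        """Calculated LDL per the Friedewald equation: LDL = TC - HDL - trigs/5."""
        return tc - hdl - trigs / 5.0

    before = friedewald_ldl(tc=200, hdl=50, trigs=150)
    after = friedewald_ldl(tc=200, hdl=50, trigs=50)
    print(before, after)  # 120.0 140.0; trigs improved, yet calculated LDL went up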

By the way, most people who do a blood test and get their lipid profile report also get their LDL calculated through the Friedewald equation. Usually this is indicated through a "CALC" note next to the description of the test or the calculated LDL number.

Finally, total cholesterol is not a very useful measure, because an elevated total cholesterol may be primarily reflecting an elevated HDL, which is healthy. Also, a slightly elevated total cholesterol seems to be protective, as it is associated with reduced overall mortality and also reduced mortality from cardiovascular disease, according to U-curve regression studies comparing mortality and total cholesterol levels in different countries.

We do not know for sure that the participants in this study were consuming a lot of refined carbohydrates and/or sugars at baseline. But it is a safe bet that they were, since they were consuming 214 g of carbohydrates per day. It is difficult, although not impossible, to eat that many carbohydrates per day by eating only vegetables and fruits, which are mostly water. Consumption of starches makes it easier to reach that level.

This is why when one goes on a paleo diet, he or she significantly reduces the amount of dietary carbohydrates; even more so on a targeted low carbohydrate diet, such as the Atkins diet. Richard K. Bernstein, who is a type 1 diabetic and has followed a strict low carbohydrate diet for most of his adult life, had the following lipid profile at 72 years of age: HDL = 118, LDL = 53, trigs. = 45. His fasting blood sugar was reportedly 83 mg/dl. Click here to listen to an interview with Dr. Bernstein on The Livin' La Vida Low-Carb Show.

The lipid profile improvement observed (e.g., a 14 percent increase in HDL from baseline for men, and about half that for women, in only 3 weeks) was very likely due to an increase in dietary saturated fat and cholesterol combined with a decrease in refined carbohydrates and sugars. The improvement would probably have been even more impressive with a higher increase in saturated fat, as long as it was accompanied by the elimination of refined carbohydrates and sugars from the participants’ diets.

Reference:

Clifton, P.M., Noakes, M., & Nestel, P.J. (1998). LDL particle size and LDL and HDL cholesterol changes with dietary fat and cholesterol in healthy subjects. Journal of Lipid Research, 39, 1799-1804.

Monday, January 28, 2019

What should be my HDL cholesterol?

HDL cholesterol levels are a rough measure of HDL particle quantity in the blood. They actually tell us next to nothing about HDL particle type, although HDL cholesterol increases are usually associated with increases in LDL particle size. This is a good thing, since small-dense LDL particles are associated with increased cardiovascular disease.

Most blood lipid panels reviewed by family doctors with patients give information about HDL status through measures of HDL cholesterol, provided in one of the standard units (e.g., mg/dl).

Study after study shows that HDL cholesterol levels, although imprecise, are a much better predictor of cardiovascular disease than LDL or total cholesterol levels. How high should one’s HDL cholesterol be? The answer to this question is somewhat dependent on each individual’s health profile, but most data suggest that a level greater than 60 mg/dl (1.55 mmol/l) is close to optimal for most people.

The figure below (from Eckardstein, 2008; full reference at the end of this post) plots incidence of coronary events in men (on the vertical axis), over a period of 10 years, against HDL cholesterol levels (on the horizontal axis). Note: IFG = impaired fasting glucose. This relationship is similar for women, particularly post-menopausal women. Pre-menopausal women usually have higher HDL cholesterol levels than men, and a low incidence of coronary events.


From the figure above, one can say that a diabetic man with about 55 mg/dl of HDL cholesterol will have approximately the same chance, on average, of having a coronary event (a heart attack) as a man with no risk factors and about 20 mg/dl of HDL cholesterol. That chance will be about 7 percent. With 20 mg/dl of HDL cholesterol, the chance of a diabetic man having a coronary event would approach 50 percent.

We can also conclude from the figure above that a man with no risk factors will have a 5 percent chance of having a coronary event if his HDL cholesterol is about 25 mg/dl; and about 2 percent if his HDL cholesterol is greater than 60 mg/dl. This is a 60 percent reduction in risk, a risk that was low to start with because of the absence of risk factors.
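
That 60 percent figure is a relative risk reduction, computed as in this quick check:

    def relative_risk_reduction(baseline, new):
        """Fractional reduction in risk relative to the baseline risk."""
        return (baseline - new) / baseline

    print(f"{relative_risk_reduction(0.05, 0.02):.0%}")  # 60% (from 5% down to 2%)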

HDL cholesterol levels greater than 60 are associated with significantly reduced risks of coronary events, particularly for those with diabetes (the graph does not take diabetes type into consideration). Much higher levels of HDL cholesterol (beyond 60) do not seem to be associated with much lower risk of coronary events.

Conversely, a very low HDL cholesterol level (below 25) is a major risk factor when other risk factors are also present, particularly: diabetes, hypertension (high blood pressure), and familial hypercholesterolemia (gene-induced very elevated LDL cholesterol).

It is not yet clear whether HDL cholesterol is a cause of reduced cardiovascular disease, or just a marker of other health factors that lead to reduced risk for cardiovascular disease. Much of the empirical evidence suggests a causal relationship, and if this is the case then it may be a good idea to try to increase HDL levels. Even if HDL cholesterol is just a marker, the same strategy that increases it may also have a positive impact on the real causative factor of which HDL cholesterol is a marker.

What can one do to increase his or her HDL cholesterol? One way is to replace refined carbs and sugars with saturated fat and cholesterol in one’s diet. (I know that this sounds counterintuitive, but it seems to work.) Another is to increase one’s vitamin D status, through sun exposure or supplementation.

Other therapeutic interventions can also be used to increase HDL; some more natural than others. The figure below (also from Eckardstein, 2008) shows the maximum effects of several therapeutic interventions to increase HDL cholesterol.


Among the therapeutic interventions shown in the figure above, taking nicotinic acid (niacin) in pharmacological doses, of 1 to 3 g per day (higher dosages may be toxic), is by far the most effective way of increasing one’s HDL cholesterol. Only the niacin that causes flush is effective in this respect. No-flush niacin preparations may have some anti-inflammatory effects, but do not cause increases in HDL cholesterol.

Rimonabant, which is second to niacin in its effect on HDL cholesterol, is an appetite suppressant that has been associated with serious side effects and, to the best of my knowledge, has been largely banned from use in pharmaceutical drugs.

Third in terms of effectiveness, among the factors shown in the figure, is moderate alcohol consumption. Running about 19 miles per week (2.7 miles per day) and taking fibrates are tied in fourth place.

Many people think that they are having a major allergic reaction, and have a panic attack, when they experience the niacin flush. This usually happens several minutes after taking niacin, and depends on the dose and whether niacin was consumed with food or not. It is not uncommon for one’s entire torso to turn hot red, as though the person had had major sunburn. This reaction is harmless, and usually disappears after several minutes.

One could say that, with niacin: no “pain” (i.e., flush), no gain.

Reference:

von Eckardstein, A. (2008). HDL – a difficult friend. Drug Discovery Today: Disease Mechanisms, 5(3), 315-324.

Saturday, December 22, 2018

Applied evolutionary thinking: Darwin meets Washington

Charles Darwin, perhaps one of the greatest scholars of all time, thought about his theory of mutation, inheritance, and selection of biological traits for more than 20 years, and finally published it as a book in 1859.  At that time, many animal breeders must have said something like this: “So what? We knew this already.”

In fact, George Washington, who died in 1799 (many years before Darwin’s famous book came out), had tried his hand at what today would be called “genetic engineering.” He produced at least a few notable breeds of domestic animals through selective breeding, including a breed of giant mules, the “Mammoth Jackstock” breed. Those mules are so big and strong that they were used to pull large boats filled with coal along artificial canals in Pennsylvania.

Washington learned the basic principles of animal breeding from others, who learned it from others, and so on. Animal breeding has a long tradition.

So, not only had animal breeders like George Washington known about the principles of mutation, inheritance, and selection of biological traits; they had also been putting that knowledge into practice for quite some time before Darwin’s famous book “The Origin of Species” was published.

Yet, Darwin’s theory has applications that extend well beyond animal breeding. There are thousands of phenomena that would look very “mysterious” today without Darwin’s theory. Many of those phenomena apply to nutrition and lifestyle, as we have been seeing lately with the paleo diet movement. Among the most amazing and counterintuitive are those in connection with the design of our brain.

Recent research, for instance, suggests that “surprise” improves cognition. Let me illustrate this with a simple example. If you were studying a subject online that required memorization of key pieces of information (say, historical facts) and a surprise stimulus was “thrown” at you (say, a video clip of an attacking rattlesnake was shown on the screen), you would remember the key pieces of information (about historical facts) much better than if the surprise stimulus was not present!

The underlying Darwinian reason for this phenomenon is that it is adaptively advantageous for our brain to enhance our memory in dangerous situations (e.g., an attack by a poisonous snake), because that would help us avoid those situations in the future (Kock et al., 2008; references listed at the end of this post). Related mental mechanisms increased our ancestors’ chances of survival over many generations, and became embedded in our brain’s design.

Animal breeders knew that they could apply selection, via selective breeding, to any population of animals, and thus make certain traits evolve in a matter of a few dozen generations or less. This is known as artificial selection. Among those traits were metabolic traits. For example, a population of lambs may be bred to grow fatter on the same amount of food as leaner breeds.

Forced natural selection may have been imposed on some of our ancestors, as I argue in this post, leading metabolic traits to evolve in as little as 396 years, or even less, depending on the circumstances.

In a sense, forced selection would be a bit like artificial selection. If a group of our ancestors became geographically isolated from others, in an environment where only certain types of food were available, physiological and metabolic adaptations to those types of food might evolve. This is also true for the adoption of cultural practices; culture can also strongly influence evolution (see, e.g., McElreath & Boyd, 2007).

This is why it is arguably a good idea for people to look at their background (i.e., learn about their ancestors), because they may have inherited genes that predispose them to function better with certain types of diets and lifestyles. That can help them better tailor their diets to their genetic makeup, and also understand why certain diets work for some people but not for others. (This is essentially what medical doctors do, on a smaller time scale, when they take a patient's family health history into consideration when dispensing medical advice.)

By ancestors I am not talking about Homo erectus here, but ancestors that lived 3,000; 1,000; or even 500 years ago, at times when medical care and other modern amenities were not available, and thus selection pressures were stronger. For example, if your not-so-distant ancestors consumed plenty of dairy, chances are you are better adapted to consume dairy than people whose ancestors did not.

Very recent food inventions, like refined carbohydrates, refined sugars, and hydrogenated fats are too new to have influenced the genetic makeup of anybody living today. So, chances are, they are bad for the vast majority of us. (A small percentage of the population may not develop any hint of diseases of civilization after consuming them for years, but they are not going to be as healthy as they could be.) Other, not so recent, food inventions, such as olive oil, certain types of bread, certain types of dairy, may be better for some people than for others.

References:

Kock, N., Chatelain-Jardón, R., & Carmona, J. (2008). An experimental study of simulated web-based threats and their impact on knowledge communication effectiveness. IEEE Transactions on Professional Communication, 51(2), 183-197.

McElreath, R., & Boyd, R. (2007). Mathematical models of social evolution: A guide for the perplexed. Chicago, IL: The University of Chicago Press.

Thursday, November 22, 2018

Does protein leach calcium from the bones? Yes, but only if it is plant protein

The idea that protein leaches calcium from the bones has been around for a while. It is related to the notion that protein, especially from animal foods, increases blood acidity. The body then uses its main reservoir of calcium, the bones, to reduce blood acidity. This post generally supports the opposite view, and adds a twist to it, related to plant protein consumption.

The “eat-meat-lose-bone” idea has apparently become popular due to the position taken by Loren Cordain on the topic. Dr. Cordain, who has made several important and invaluable contributions to our understanding of the diets of our Paleolithic ancestors, has argued in his book, The Paleo Diet, and elsewhere that to counter the acid load of protein one should eat fruits and vegetables, which are believed to have an alkaline load.

If the idea that protein leaches calcium from the bones is correct, one would expect to see a negative association between protein consumption and bone mineral density (BMD). This negative association should be particularly strong in people aged 50 and older, who are more vulnerable to BMD losses.

As it turns out, this idea appears to be correct only for plant protein. Animal protein seems to be associated with an increase in BMD, at least according to a widely cited study by Promislow et al. (2002). The study shows that there is a positive multivariate association between animal protein consumption and BMD; an association that becomes negative when plant protein consumption is considered.

The study focused on 572 women and 388 men aged 55–92 years living in Rancho Bernardo, California. Food frequency questionnaires were administered in the 1988–1992 period, and BMD was measured 4 years later. The bar chart below shows the approximate increases in BMD (in g/cm^2) for each 15 g/d increment in protein intake.


The authors reported increments in BMD for different increments of protein (15 and 5 g/d), so the results above are adjusted somewhat from the original values reported in the article. Keeping that in mind, the increment in BMD for men due to animal protein was not statistically significant (P=0.20). That is the smallest bar on the left.
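
The adjustment is just a linear rescaling; the sketch below assumes an approximately linear dose-response over this range, and the example increment is illustrative rather than taken from the paper:

    def rescale_bmd_increment(delta_bmd, from_g_per_day=5.0, to_g_per_day=15.0):
        """Rescale a BMD increment reported per `from_g_per_day` of daily
        protein intake to a per-`to_g_per_day` basis, assuming linearity."""
        return delta_bmd * (to_g_per_day / from_g_per_day)

    # Illustrative value only (not from Promislow et al.):
    print(f"{rescale_bmd_increment(0.004):.3f} g/cm^2 per 15 g/d")  # 0.012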

Does protein leach calcium from the bones? Based on this study, the reasonable answers to this question are yes for plant protein, and no for animal protein. For animal protein, it seems to be quite the opposite.

Even more interesting, calcium intake did not seem to be much of a factor. BMD gains due to animal protein seemed to converge to similar values whether calcium intake was high, medium or low. The convergence occurred as animal protein intake increased, and the point of convergence was between 85 and 90 g/d of animal protein intake.

And high calcium intakes did not seem to protect those whose plant protein consumption was high.

The authors do not discuss specific foods, but one can guess the main plant protein that those folks likely consumed. It was likely gluten from wheat products.

Are the associations above due to: (a) the folks eating animal protein consuming more fruits and vegetables than the folks eating plant protein; or (b) something inherent to animal foods that stimulates an increase in the absorption of dietary calcium, even in small amounts?

This question cannot be answered based on this study; it should have controlled for fruit and vegetable consumption for that.

But if I were to bet, I would bet on (b).

Reference:

Promislow, J.H.E., Goodman-Gruen, D., Slymen, D.J., & Barrett-Connor, E. (2002). Protein consumption and bone mineral density in the elderly. American Journal of Epidemiology, 155(7), 636–644.

Sunday, October 21, 2018

Blood glucose control before age 55 may increase your chances of living beyond 90

I have recently read an interesting study by Yashin and colleagues (2009) at Duke University’s Center for Population Health and Aging. (The full reference to the article, and a link, are at the end of this post.) This study is a gem with some rough edges, and some interesting implications.

The study uses data from the Framingham Heart Study (FHS). The FHS, which started in the late 1940s, recruited 5209 healthy participants (2336 males and 2873 females), aged 28 to 62, in the town of Framingham, Massachusetts. At the time of Yashin and colleagues’ article publication, there were 993 surviving participants.

I rearranged figure 2 from the Yashin and colleagues article so that the two graphs (for females and males) appear side by side. The result is shown below; the caption at the bottom-right corner refers to both graphs. The figure shows the age-related trajectory of blood glucose levels, grouped by lifespan (LS), starting at age 40.


As you can see from the figure above, blood glucose levels increase with age, even for long-lived individuals (LS > 90). The increases follow a U-curve (a.k.a. J-curve) pattern; the beginning of the right side of a U curve, to be more precise. The main difference in the trajectories of the blood glucose levels is that as lifespan increases, so does the width of the U curve. In other words, in long-lived people, blood glucose increases slowly with age; particularly up to 55 years of age, when it starts increasing more rapidly.

Now, here is one of the rough edges of this study. The authors do not provide standard deviations. You can ignore the error bars around the points on the graph; they are not standard deviations. They are standard errors, which are much lower than the corresponding standard deviations. Standard errors are calculated by dividing the standard deviations by the square root of the sample sizes for each trajectory point (which the authors do not provide either), so they go up with age since progressively smaller numbers of individuals reach advanced ages.
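
The relationship between the two quantities is simple, and explains why the error bars look so tight; a sketch with made-up numbers:

    import math

    def standard_error(sd, n):
        """Standard error of the mean: the standard deviation divided by
        the square root of the sample size."""
        return sd / math.sqrt(n)

    # Made-up illustrative numbers: the same 10 mg/dl standard deviation yields
    # a tight error bar with many participants, and a wider one as survivors dwindle.
    print(f"{standard_error(10.0, 500):.2f}")  # 0.45 (n = 500, younger ages)
    print(f"{standard_error(10.0, 20):.2f}")   # 2.24 (n = 20, advanced ages)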

So, no need to worry if your blood glucose levels are higher than those shown on the vertical axes of the graphs. (I will comment more on those numbers below.) Not everybody who lived beyond 90 had a blood glucose of around 80 mg/dl at age 40. I wouldn't be surprised if about 2/3 of the long-lived participants had blood glucose levels in the range of 65 to 95 at that age.

Here is another rough edge. It is pretty clear that the authors’ main independent variable (i.e., health predictor) in this study is average blood glucose, which they refer to simply as “blood glucose”. However, the measure of blood glucose in the FHS is a very rough estimation of average blood glucose, because they measured blood glucose levels at random times during the day. These measurements, when averaged, are closer to fasting blood glucose levels than to average blood glucose levels.

A more reliable measure of average blood glucose levels is glycated hemoglobin (HbA1c). Blood glucose, like most sugary substances, glycates (i.e., sticks to) hemoglobin, a protein found in red blood cells. Since red blood cells are relatively long-lived, with a turnover of about 3 months, HbA1c (given in percentages) is a good indicator of average blood glucose levels (if you don’t suffer from anemia or a few other blood abnormalities). Based on HbA1c, one can then estimate his or her average blood glucose level over the 3 months preceding the test, using one of the following equations, depending on whether the measurement is in mg/dl or mmol/l.

    Average blood glucose (mg/dl) = 28.7 × HbA1c − 46.7

    Average blood glucose (mmol/l) = 1.59 × HbA1c − 2.59
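
In code form, the two equations above are a direct transcription (HbA1c given in percent):

    def avg_glucose_mgdl(hba1c_pct):
        """Estimated average blood glucose (mg/dl) from HbA1c (percent)."""
        return 28.7 * hba1c_pct - 46.7

    def avg_glucose_mmoll(hba1c_pct):
        """Estimated average blood glucose (mmol/l) from HbA1c (percent)."""
        return 1.59 * hba1c_pct - 2.59

    print(f"HbA1c 5.0% -> {avg_glucose_mgdl(5.0):.0f} mg/dl")  # ~97 mg/dl
    print(f"HbA1c 7.0% -> {avg_glucose_mgdl(7.0):.0f} mg/dl")  # ~154 mg/dl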

The table below, from Wikipedia, shows average blood glucose levels corresponding to various HbA1c values. As you can see, they are generally higher than the corresponding fasting blood glucose levels would normally be (the latter is what the values on the vertical axes of the graphs above from Yashin and colleagues’ study roughly measure). This is to be expected, because blood glucose levels vary a lot during the day, and are often transitorily high in response to food intake and fluctuations in various hormones. Growth hormone, cortisol and noradrenaline are examples of hormones that increase blood glucose. Only one hormone, insulin, effectively decreases blood glucose levels, by stimulating glucose uptake and storage as glycogen and fat.


Nevertheless, one can reasonably expect fasting blood glucose levels to have been highly correlated with average blood glucose levels in the sample. So, in my opinion, the graphs above showing age-related blood glucose trajectories are still valid, in terms of their overall shape, but the values on the vertical axes should have been measured differently, perhaps using the formulas above.

Ironically, those who achieve low average blood glucose levels (measured based on HbA1c) by adopting a low carbohydrate diet (one of the most effective ways) frequently have somewhat high fasting blood glucose levels because of physiological (or benign) insulin resistance. Their body is primed to burn fat for energy, not glucose. Thus when growth hormone levels spike in the morning, so do blood glucose levels, as muscle cells are in glucose rejection mode. This is a benign version of the dawn effect (a.k.a. dawn phenomenon), which happens with quite a few low carbohydrate dieters, particularly with those who are deep in ketosis at dawn.

Yashin and colleagues also modeled relative risk of death based on blood glucose levels, using a fairly sophisticated mathematical model that takes into consideration U-curve relationships. What they found is intuitively appealing, and is illustrated by the two graphs at the bottom of the figure below. The graphs show how the relative risks (e.g., 1.05, on the topmost dashed line on the both graphs) associated with various ranges of blood glucose levels vary with age, for both females and males.


What the graphs above are telling us is that once you reach old age, controlling for blood sugar levels is not as effective as doing it earlier, because you are more likely to die from what the authors refer to as “other causes”. For example, at the age of 90, having a blood glucose of 150 mg/dl (corrected for the measurement problem noted earlier, this would be perhaps 165 mg/dl, from HbA1c values) is likely to increase your risk of death by only 5 percent. The graphs account for the facts that: (a) blood glucose levels naturally increase with age, and (b) fewer people survive as age progresses. So having that level of blood glucose at age 60 would significantly increase relative risk of death at that age; this is not shown on the graph, but can be inferred.

Here is a final rough edge of this study. From what I could gather from the underlying equations, the relative risks shown above do not account for the effect of high blood glucose levels earlier in life on relative risk of death later in life. This is a problem, even though it does not completely invalidate the conclusion above. As noted by several people (including Gary Taubes in his book Good Calories, Bad Calories), many of the diseases associated with high blood sugar levels (e.g., cancer) often take as much as 20 years of high blood sugar levels to develop. So the relative risks shown above underestimate the effect of high blood glucose levels earlier in life.

Do the long-lived participants have some natural protection against accelerated increases in blood sugar levels, or was it their diet and lifestyle that protected them? This question cannot be answered based on the study.

Assuming that their diet and lifestyle protected them, it is reasonable to argue that: (a) if you start controlling your average blood sugar levels well before you reach the age of 55, you may significantly increase your chances of living beyond the age of 90; (b) it is likely that your blood glucose levels will go up with age, but if you can manage to slow down that progression, you will increase your chances of living a longer and healthier life; (c) you should focus your control on reliable measures of average blood glucose levels, such as HbA1c, not fasting blood glucose levels (postprandial glucose levels are also a good option, because they contribute a lot to HbA1c increases); and (d) it is never too late to start controlling your blood glucose levels, but the more you wait, the bigger is the risk.

References:

Taubes, G. (2007). Good calories, bad calories: Challenging the conventional wisdom on diet, weight control, and disease. New York, NY: Alfred A. Knopf.

Yashin, A.I., Ukraintseva, S.V., Arbeev, K.G., Akushevich, I., Arbeeva, L.S., & Kulminski, A.M. (2009). Maintaining physiological state for exceptional survival: What is the normal level of blood glucose and does it change with age? Mechanisms of Ageing and Development, 130(9), 611-618.