(Note: My apologies for the sarcastic tone of this post. I am not really congratulating anybody here!)
Dr. Nick Delgado shows us in this YouTube video how to "become diabetic" in 6 hours!
I must admit that I liked the real-time microscope imaging, and wish he had shown us more of that.
But really!
After consulting with my mentor, the MIMIW, I was reminded that there is at least one post on this blog that shows how one can "become diabetic" in just over 60 minutes – that is, about 6 times faster than using the technique described by Dr. Delgado.
The technique used in the post mentioned above is called "intense exercise", which is even believed to be health-promoting! (Unlike drinking olive oil as if it were water, or eating white bread.)
The advantage of this technique is that one can "become diabetic" by doing something healthy!
Thanks, Dr. Delgado: your video ranks high up there, together with this Ali G. video, as a fine example of how to bring real science to the masses.
Friday, August 13, 2010
The evolution of costly traits: Competing for women can be unhealthy for men
There are human traits that evolved in spite of being survival handicaps. These counterintuitive traits are often called costly traits, or Zahavian traits (in animal signaling contexts), in honor of the evolutionary biologist Amotz Zahavi (Zahavi & Zahavi, 1997). I have written a post about this type of trait, and also an academic article (Kock, 2009). The full references and links to these publications are at the end of this post.
The classic example of a costly trait is the peacock’s train, which is used by males to signal health to females. (Figure below from: animals.howstuffworks.com.) The male peacock’s train (often incorrectly called “tail”) is a costly trait because it impairs the ability of a male to flee predators. It decreases a male’s survival success, even though it has a positive net effect on the male’s reproductive success (i.e., the number of offspring it generates). It is used in sexual selection; the females find big and brightly colored trains with many eye spots "sexy".
So costly traits exist in many species, including the human species, but we have not identified them all yet. The implication for human diet and lifestyle choices is that our ancestors might have evolved some habits that are bad for human survival, and moved away from others that are good for survival. And I am not only talking about survival among modern humans; I am talking about survival among our human ancestors too.
The simple reason for the existence of costly traits in humans is that evolution tends to maximize reproductive success, not survival, and that applies to all species. (Inclusive fitness theory goes a step further, placing the gene at the center of the selection process, but this is a topic for another post.) If that were not the case, rodent species, as well as other species that specialize in fast reproduction within relatively short life spans, would never have evolved.
Here is an interesting piece of news about research done at the University of Michigan. (I have met the lead researcher, Dan Kruger, a couple of times at HBES conferences. My impression is that his research is solid.) The research illustrates the evolution of costly traits, from a different angle. The researchers argue, based on the results of their investigation, that competing for a woman’s attention is generally bad for a man’s health!
Very romantic ...
References:
Kock, N. (2009). The evolution of costly traits through selection and the importance of oral speech in e-collaboration. Electronic Markets, 19(4), 221-232.
Zahavi, A. & Zahavi, A. (1997). The Handicap Principle: A missing piece of Darwin’s puzzle. Oxford, England: Oxford University Press.
Tuesday, August 10, 2010
Nonexercise activities like fidgeting may account for a 1,000 percent difference in body fat gain! NEAT eh?
Some studies become classics in their fields and yet are largely missed by the popular media. This seems to be what happened with a study by Levine and colleagues (1999; full reference and link at the end of this post), which looked at the role that nonexercise activity thermogenesis (NEAT) plays in fat gain suppression. Many thanks go to Lyle McDonald for posting on this.
You have probably seen on the web claims that overeating leads to fat loss, because overeating increases one’s basal metabolic rate. There are also claims that food has a powerful thermic effect, due to the energy needed for digestion, absorption and storage of nutrients; this is also claimed to lead to fat loss. There is some truth to these claims, but the related effects are very small compared with the effects of NEAT.
Ever wonder why there are some folks who seem to eat whatever they want, and never get fat? As it turns out, it may be primarily due to NEAT!
NEAT is associated with fidgeting, maintenance of posture, shifting position, pacing, and other involuntary light physical activities. The main finding of this study was that NEAT accounted for a massive amount of the difference in body fat gain among the participants in the study. The participants were 12 males and 4 females, ranging in age from 25 to 36 years. These healthy and lean participants were fed 1,000 kilocalories per day in excess of their weight-maintenance requirements, for a period of 8 weeks. See figure below; click on it to enlarge.
Fat gain varied more than 10-fold among the participants (or more than 1,000 percent), ranging from a gain of only 0.36 kg (0.79 lbs) to a gain of 4.23 kg (9.33 lbs). As you can see, NEAT explains a lot of the variance in the fat gain variable, which is indicated by the highly statistically significant negative correlation (-0.77). Its effect dwarfs those related to basal metabolic rate and food-induced thermogenesis, neither of which was statistically significant.
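For readers who want to see the mechanics of that statistic, here is a minimal Python sketch of the kind of correlation analysis behind it. The per-participant numbers below are invented to mimic the study's pattern (the raw data are not reproduced here), so the resulting r will not match the reported -0.77 exactly; the real data were noisier.

```python
# Illustrative sketch only: these numbers are made up to mimic the pattern
# reported by Levine et al. (1999), not the study's actual data.
import statistics

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-participant values for the 16 participants:
# change in NEAT (kcal/day) and fat gain (kg) over the 8 weeks.
neat_change = [692, 610, 520, 450, 400, 350, 300, 250,
               200, 150, 100, 60, 30, 10, -50, -100]
fat_gain = [0.36, 0.8, 1.1, 1.3, 1.6, 1.9, 2.1, 2.4,
            2.7, 3.0, 3.2, 3.5, 3.8, 4.0, 4.1, 4.23]

r = pearson_r(neat_change, fat_gain)
print(f"r = {r:.2f}, variance explained = {r**2:.0%}")  # strongly negative, as in the study
```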
How can one use this finding in practice? This research indirectly suggests that moving often throughout the day may have a significant additive long-term effect on fat gain suppression. It is reasonable to expect a similar effect on fat loss. And this effect may be stealthy enough to prevent the body from reacting to fat loss by significantly lowering its basal metabolic rate. (Yes, while the increase in basal metabolic rate is trivial in response to overfeeding, the decrease in this rate is nontrivial in response to underfeeding. Essentially the body is much more “concerned” about starving than fattening up.)
The bad news is that it is not easy to mimic the effects of NEAT through voluntary activities. The authors of the study estimated that the maximum increase in NEAT detected in the study (692 kcal/day) would be equivalent to a 15-minute walk every waking hour of every single day! (This other study focuses specifically on fidgeting.) Clearly NEAT has a powerful effect on weight loss, which is not easy to match with voluntary pacing, standing up etc. Moreover, females seem to benefit less from NEAT, because they seem to engage in fewer NEAT-related activities than men. The four lowest NEAT values in the study corresponded to the four female participants.
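Their walking equivalence is easy to sanity-check. In the sketch below, the waking-hours and walking-cost figures are my own assumptions, not numbers from the paper:

```python
# Back-of-the-envelope check of the authors' walking equivalence.
# Assumptions (mine, not the paper's): ~16 waking hours per day and
# ~2.9 kcal burned per minute of casual walking, net of resting expenditure.
max_neat_increase = 692        # kcal/day, the largest increase in the study
waking_hours = 16              # assumed
kcal_per_min_walking = 2.9     # assumed

kcal_per_waking_hour = max_neat_increase / waking_hours
minutes_per_hour = kcal_per_waking_hour / kcal_per_min_walking
print(f"{kcal_per_waking_hour:.0f} kcal per waking hour "
      f"= about {minutes_per_hour:.0f} min of walking every hour")
# About 43 kcal/hour, or roughly a 15-minute walk every waking hour.
```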
Nevertheless, if you have a desk job, like I do, you may want to stand up and pace for a few seconds every 30 minutes. You may also want to stand up while you talk on the phone. You may want to shift position from time to time; e.g., sitting at the edge of the chair for a few minutes every hour, without back support. And so on. These actions may take you a bit closer to the lifestyle of our Paleolithic ancestors, who were not sitting down motionless the whole day. Try also eating more like they did and, over a year, the results may be dramatic!
Reference:
Levine, J.A., Eberhardt, N.L., & Jensen, M.D. (1999). Role of nonexercise activity thermogenesis in resistance to fat gain in humans. Science, 283(5399), 212-214.
Saturday, August 7, 2010
Cortisol, surprise-enhanced cognition, and flashbulb memories: Scaring people with a snake screen and getting a PhD for it!
Cortisol is a hormone that has a number of important functions. It gets us out of bed in the morning, it cranks up our metabolism in preparation for intense exercise, and it also helps us memorize things and even learn. Yes, it helps us learn. Memorization in particular, and cognition in general, would be significantly impaired without cortisol. When you are surprised, particularly with something unpleasant, cortisol levels increase and enhance cognition. This is in part what an interesting study suggests; a study in which I was involved. The study was properly “sanctified” by the academic peer-review process (Kock et al., 2009; full reference and link at the end of this post).
The main hypothesis tested through this study is also known as the “flashbulb memorization” hypothesis. Interestingly, up until this study was conducted no one seemed to have used evolution to provide a basis on which flashbulb memorization can be explained. The basic idea here is that enhanced cognition within the temporal vicinity of animal attacks (i.e., a few minutes before and after) allowed our hominid ancestors to better build and associate memories related to the animals and their typical habitat markers (e.g., vegetation, terrain, rock formations), which in turn increased their survival chances. Their survival chances increased because the memories helped them avoid a second encounter; if they survived the first, of course. And so flashbulb memorization evolved. (In fact, it might have evolved earlier than at the hominid stage, and it may also have evolved in other species.)
The study involved 186 student participants. The participants were asked to review web-based learning modules and subsequently take a test on what they had learned. Data from 6 learning modules in 2 experimental conditions were contrasted. In the treatment condition a web-based screen with a snake in attack position was used to surprise the participants; the snake screen was absent in the control condition. See schematic figure below (click on it to enlarge). The “surprise zone” in the figure comprises the modules immediately before and after the snake screen (modules 3 and 4); those are the modules in which higher scores were predicted.
The figure below (click on it to enlarge) shows a summary of the results. The top part of the figure shows the percentage differences between average scores obtained by participants in the treatment and control conditions. The bottom part of the figure shows the average scores obtained by participants in both conditions, as well as the scores that the participants would have obtained by chance. The chance scores would likely have been the ones obtained by the participants if their learning had been significantly impaired for any of the modules; this could have happened due to distraction, for example. As you can see, the scores for all modules are significantly higher than chance.
In summary, the participants who were surprised with the snake screen obtained significantly higher scores for the two modules immediately before (about 20 percent higher) and after (about 40 percent higher) the snake screen. The reason is that the surprise elicited by the snake screen increased cortisol levels, which in turn improved learning for modules 3 and 4. Adrenaline and noradrenaline (epinephrine and norepinephrine) may also be involved. This phenomenon is so odd that it seems to defy the laws of physics; note that Module 3 was reviewed before the snake screen. And, depending on the size of a test, this could have turned a “C” into an “A” grade!
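To see how the arithmetic of that last claim might work, here is a toy calculation. The baseline score, the 90/80/70 grade cutoffs, and the assumption that the test draws mostly on the two surprise-zone modules are all mine, for illustration only:

```python
# Toy illustration: how boosts of ~20% and ~40% on the surprise-zone
# modules could move a letter grade, assuming the test covers mostly
# those two modules and a standard 90/80/70 grading scale.
baseline = 72.0        # a "C" on a 90/80/70 scale (assumed)
boost_before = 1.20    # ~20% higher scores on the module before the screen
boost_after = 1.40     # ~40% higher scores on the module after it

avg_boost = (boost_before + boost_after) / 2
boosted = min(100.0, baseline * avg_boost)
print(f"{baseline:.0f} -> {boosted:.0f}")  # 72 -> 94: a "C" turns into an "A"
```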
Similarly, it is because of this action of cortisol that Americans reading this post, especially those who lived on the East Coast in 2001, remember vividly where they were, what they were doing, and who they were with, when they first heard about the September 11, 2001 attacks. I was living in Philadelphia at the time, and I remember those details very vividly, even though the attacks happened almost 10 years ago. That is one of the fascinating things that cortisol does; it instantaneously turns short-term contextual memories temporally associated with a surprise event (i.e., a few minutes before and after the event) into vivid long-term memories.
This study was part of the PhD research project of one of my former doctoral students, now Dr. Ruth Chatelain-Jardon. Her PhD was granted in May 2010. She expanded the study through data collection in two different countries, and a wide range of analyses. (It is not that easy to get a PhD!) Her research provides solid evidence that flashbulb memorization is a real phenomenon, and also that it is a human universal. Thanks are also due to Dr. Jesus Carmona, another former doctoral student of mine who worked on a different PhD research project, but who also helped a lot with this one.
Reference:
Kock, N., Chatelain-Jardón, R., & Carmona, J. (2009). Scaring them into learning!? Using a snake screen to enhance the knowledge transfer effectiveness of a web interface. Decision Sciences Journal of Innovative Education, 7(2), 359-375.
Wednesday, August 4, 2010
The baffling rise in seasonal allergies: Global warming or obesity?
The July 26, 2010 issue of Fortune has an interesting set of graphs on page 14. It shows the rise of allergies in the USA, together with figures on lost productivity, doctor visits, and medical expenditures. (What would you expect? This is Fortune, and money matters.) It also shows some cool maps with allergen concentrations, and how they are likely to increase with global warming. (See below; click on it to enlarge; use the "CTRL" and "+" keys to zoom in, and "CTRL" and "-" to zoom out.)
The implication: A rise in global temperatures is causing an increase in allergy cases. Supposedly the spring season starts earlier, with more pollen being produced overall, and thus more allergy cases.
Really!?
I checked their numbers against population growth, because as the population of a country increases, so will the absolute number of allergy cases (as well as cancer cases, and cases of almost any disease). What is important is whether there has been an increase in allergy rates, or the percentage of the population suffering from allergies. Well, indeed, allergy rates have been increasing.
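The normalization step is trivial but worth spelling out: divide case counts by population so that population growth does not masquerade as a rising trend. A minimal sketch, with all numbers invented for illustration:

```python
# Convert absolute allergy case counts into per-capita rates, so that
# population growth is factored out. All numbers below are invented.
population = {2000: 282_000_000, 2005: 295_000_000, 2010: 309_000_000}
allergy_cases = {2000: 28_000_000, 2005: 31_500_000, 2010: 36_000_000}

for year in sorted(population):
    rate = allergy_cases[year] / population[year]
    print(f"{year}: {rate:.1%} of the population")
# If the percentages climb, the trend is not an artifact of population growth.
```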
Now, I don’t know about your neck of the woods, but temperatures have been unusually low this year in South Texas. Global warming may be happening, but given recent fluctuations in temperature, I am not sure it explains the increase in allergy rates. This is particularly true of the spike in allergy rates in 2010, which seems very unlikely to have been caused by global warming.
And I have my own experience of going from looking like a seal to looking more like a human being. When I was a seal (i.e., looked like one), I used to have horrible seasonal pollen allergies. Then I lost 60 lbs, and my allergies diminished dramatically. Why? Body fat secretes a number of pro-inflammatory hormones (see, e.g., this post, and also this one), and allergies are essentially exaggerated inflammatory responses.
So I added obesity rates to the mix, and came up with the table and graph below (click on it to enlarge).
Obesity rates and allergies do seem to go hand in hand, don’t you think? The correlation between obesity and allergy rates is a high 0.87!
Assuming that this correlation reflects reasonably well the relationship between obesity and allergy rates (something that is not entirely clear given the small sample), obesity would still explain only 75.7 percent of the variance in allergy rates (this number is the correlation squared). That is, about 24.3 percent of the variance in allergy rates would be due to other missing factors.
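For the record, the variance-explained figure is just the squared correlation:

```python
# Variance explained is the squared Pearson correlation.
r = 0.87                       # obesity-allergy correlation from the table
r_squared = r ** 2
print(f"r^2 = {r_squared:.3f}: {r_squared:.1%} explained, "
      f"{1 - r_squared:.1%} left to other factors")
# r^2 = 0.757: 75.7% explained, 24.3% left to other factors
```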
A strong candidate for a missing factor is something that makes people obese in the first place, namely the consumption of foods rich in refined grains, seeds, and sugars. Again, in my experience, removing these foods from my diet reduced the intensity of allergic reactions, but not as much as losing a significant amount of body fat. We are talking about things like cereals, white bread, doughnuts, pasta, pancakes covered with syrup, regular sodas, and fruit juices. Why? These foods also seem to increase serum concentrations of pro-inflammatory hormones within hours of their consumption.
Other candidates are vitamin D levels, and lack of exposure to natural environments during childhood, just to name a few. People seem to avoid the sun like the plague these days, which can lower their vitamin D levels. This is a problem because vitamin D modulates immune responses; so it is important in the spring, as well as in the winter. The lack of exposure to natural environments during childhood may make people more sensitive to natural allergens, like pollen.
Sunday, August 1, 2010
Growth hormone, insulin resistance, body fat accumulation, and glycogen depletion: Making sense of a mysterious hormone replacement therapy outcome
Hormone replacement therapies are prescribed in some cases, for medical reasons. They usually carry some risks. The risks come in part from the body down-regulating its own production of hormones when hormones are taken orally or injected. This could be seen as a form of compensatory adaptation, as the body tries to protect itself from abnormally high hormone levels.
More often than not the down-regulation can be reversed by interrupting the therapy. In some cases, the down-regulation becomes permanent, leading to significant health deterioration over the long run. One can seriously regret having started the hormone replacement therapy in the first place. The same is true (if not more so) for hormone supplementation for performance enhancement, where normal hormone secretion levels are increased to enhance (mostly) athletic performance.
Rosenfalck and colleagues (1999) conducted an interesting study linking growth hormone (GH) replacement therapy with insulin resistance. Their conclusions are not very controversial. What I find interesting is what their data analysis unveiled and was not included in their conclusions. Also, they explain their main findings by claiming that there was a deterioration of beta cell function. (Beta cells are located in the pancreas, and secrete insulin.) While they may be correct, their explanation is not very plausible, as you will see below.
Let us take a quick look at what past research says about GH therapy and insulin resistance. One frequent finding is a significant but temporary impairment of insulin sensitivity, which usually normalizes after a period of a few months (e.g., 6 months). Another not so frequent finding is a significant and permanent impairment of insulin sensitivity; this is not as frequent in healthy individuals.
The researchers did a good job at reviewing this literature, and concluded that in many cases GH therapy is not worth the risk. They also studied 24 GH-deficient adults (18 males and 6 females). All of them had known pituitary pathology, which caused the low GH levels. The participants were randomly assigned to two groups. One received 4 months of daily treatment with biosynthetic GH (n=13); the other received a placebo (n=11).
The table below (click on it to enlarge) shows various measures before and after treatment. Note the significant reduction in abdominal fat mass in the GH group. Also note that, prior to the treatment, the GH group folks (who were GH-deficient) were overall much heavier and much fatter, particularly at the abdominal area, than the folks in the placebo (or control) group.
From the measures above one could say that the treatment was a success. But the researchers point out that it was not, because insulin sensitivity was significantly impaired. They show some graphs (below), and that is where things get really interesting, but not in the way intended by the researchers.
In the figure above, the graphs on the left refer to the placebo group, and those on the right to the GH group. The solid lines reflect pre-treatment numbers and the dotted lines post-treatment numbers. Indeed, GH therapy is making the GH-deficient folks significantly more insulin resistant.
But look carefully. The GH folks are more insulin sensitive than the controls prior to the treatment, even though they are much fatter, particularly in terms of abdominal fat. The glucose response is significantly lower for the GH-deficient folks, and that is not due to them secreting more insulin. The insulin response is also significantly lower. This is confirmed by glucose and insulin “area under the curve” measures provided by the researchers.
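For readers unfamiliar with the measure, an "area under the curve" is typically computed from the serial readings of a glucose tolerance test using the trapezoidal rule. A minimal sketch, with invented readings (the paper's actual sampling times and values may differ):

```python
# Glucose "area under the curve" via the trapezoidal rule.
# Sample times and readings below are invented for illustration.
times = [0, 30, 60, 90, 120]           # minutes after the glucose load
glucose = [5.0, 7.8, 7.2, 6.1, 5.4]    # mmol/L, hypothetical readings

def trapezoid_auc(xs, ys):
    """Area under the curve by the trapezoidal rule."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
               for i in range(len(xs) - 1))

print(f"glucose AUC = {trapezoid_auc(times, glucose):.0f} mmol*min/L")  # 789 here
```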
In fact, after treatment both groups seem to have generally the same insulin and glucose responses. This means that the GH treatment made insulin-sensitive folks a bit more like their normal counterparts in the placebo group. But obviously the change for the worse occurred only in the GH group, which is what the researchers concluded.
Now to the really interesting question, at least in my mind: What could have improved insulin sensitivity in the GH-deficient group prior to the treatment?
The GH-deficient folks had more body fat, particularly around the abdominal area. High serum GH is usually associated with low body fat, particularly around the abdominal area, because high GH folks burn it easily. So, looking at it from a different perspective, the GH-deficient folks seem to have been more effective at making body fat, and less effective at burning it.
Often we talk about insulin sensitivity as though there were only one type. But there is more than one type of insulin sensitivity. Insulin signals to the liver to take up glucose from the blood and turn it into glycogen or fat. Insulin also signals to body fat tissue to take up glucose from the blood and make fat with it. (GLUT4 is an insulin-sensitive glucose transporter present in both fat and muscle cells.)
Therefore, it is reasonable to assume that folks with fat cells that are particularly insulin-sensitive would tend to make body fat quite easily based on glucose. While this is a type of insulin sensitivity that most people probably do not like to have, it may play an important role in reducing blood glucose levels under certain conditions. This appears to be true in the short term. Down the road, having very insulin-sensitive fat cells seems to lead to obesity, the metabolic syndrome, and diabetes.
In fact, in individuals without pituitary pathology, increased insulin sensitivity in fat cells could be a compensatory adaptation in response to a possible decrease in liver and muscle glucose uptake. Lack of exercise will shift the burden of glucose clearance to tissues other than liver and muscle, because with glycogen stores full both liver and muscle will usually take up much less blood glucose than they would otherwise.
I am speculating here, but I think that in individuals without pituitary pathology, an involuntary decrease in endogenous GH secretion may actually be at the core of this compensatory adaptation mechanism. In these individuals, low GH levels may be an outcome, not a cause of problems. This would explain two apparently contradictory findings: (a) GH levels drop dramatically in the 40s, particularly for men; and (b) several people in their 50s and 60s, including men, have much higher levels of circulating GH than people in their 40s, and even than much younger folks.
Vigorous exercise increases blood glucose uptake, inside and outside the exercise window; this is an almost universal effect among humans. Exercise depletes muscle and liver glycogen. (Fasting and low-carbohydrate dieting alone deplete liver, but not muscle, glycogen.) As glycogen stores become depleted, the activity of glycogen synthase (an enzyme involved in the conversion of glucose to glycogen) increases acutely. This activity remains elevated for several days in muscle tissue; the liver replenishes its glycogen in a matter of hours. With glycogen synthase activity elevated, glucose is quickly used to replenish glycogen stores, and not to make fat.
Depleting glycogen stores on a regular basis (e.g., once every few days) may over time reverse the adaptations that made fat cells particularly insulin-sensitive in the first place. Those adaptations become a protection that is not only no longer needed but also detrimental to health, since they lead to obesity. This could be the reason why many people initially find it difficult to lower their body fat set point, but once they lose body fat and stay lean for a while, they seem to become able to maintain their leanness without much effort.
Well, perhaps glycogen-depleting exercise is more important than many people think. It can help make you thin, but through a circuitous path.
And, incidentally, glycogen-depleting exercise causes a temporary but dramatic spike in GH secretion. This natural increase in GH secretion does not seem to be associated with any significant impairment in overall insulin sensitivity, even though glycogen-depleting exercise increases blood glucose levels a lot during the exercise window. This is a temporary and physiological, not pathological, phenomenon.
Reference:
Rosenfalck, A.M., Fisker, S., Hilsted, J., Dinesen, B., Vølund, A., Jørgensen, J.O., Christiansen, J.S., & Madsbad, S. (1999). The effect of the deterioration of insulin sensitivity on beta-cell function in growth-hormone-deficient adults following 4-month growth hormone replacement therapy. Growth Hormone & IGF Research, 9(2), 96-105.