Kinder and his colleagues assessed the prevalence of vitamin D deficiency in a cohort of patients with interstitial lung disease, who are often treated with corticosteroids. The detrimental effect of chronic corticosteroid use on bone health has been well established, the researchers noted. Of the patients included in the study, 67 had interstitial lung disease related to autoimmune connective tissue diseases and 51 had other forms of interstitial lung disease.
A vitamin D insufficiency was defined as a serum level of less than 30 ng/mL; a level of less than 20 ng/mL was considered deficient. Both insufficient and deficient levels were prevalent in the study. In the overall sample, lower vitamin D levels were associated with reduced forced vital capacity (P=0.01). When the analysis was restricted to patients with connective tissue disease, both forced vital capacity and the diffusing capacity of the lung for carbon monoxide (a measure of the lung's ability to transfer gases from the air to the blood) were significantly reduced (P<0.05 for both). After adjustment for several potential confounders, including age, corticosteroid use, race, and season, the presence of connective tissue disease was a strong predictor of vitamin D insufficiency (OR 11.8, 95% CI 3.5 to 40.6).
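The two cutoffs above define a simple three-way classification of vitamin D status. As an illustration only (the function name and structure are my own, not part of the study), the thresholds can be sketched as:

```python
def vitamin_d_status(serum_25ohd_ng_ml: float) -> str:
    """Classify a serum 25-hydroxyvitamin D level using the cutoffs
    reported in the article (illustrative sketch, not the study's code):
    below 20 ng/mL is deficient, below 30 ng/mL is insufficient,
    and 30 ng/mL or above is sufficient."""
    if serum_25ohd_ng_ml < 20:
        return "deficient"
    if serum_25ohd_ng_ml < 30:
        return "insufficient"
    return "sufficient"
```

Note that a deficient level (under 20 ng/mL) also falls under the broader insufficiency cutoff of 30 ng/mL, which is why the stricter threshold is checked first.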
According to the researchers, a pathogenic role of low vitamin D in the development of autoimmune diseases such as interstitial lung disease is plausible because of the immunoregulatory role of the biologically active form of vitamin D, 1,25-(OH)2D. “All cells of the adaptive immune system express vitamin D receptors and are sensitive to the action of 1,25-(OH)2D,” they wrote. “High levels of 1,25-(OH)2D are potent inhibitors of dendritic cell maturation with lower expression of major histocompatibility complex class II molecules, down-regulation of costimulatory molecules, and lower production of proinflammatory cytokines.” “A common theme in the immunomodulatory functions of vitamin D is that higher levels are immunosuppressive,” they continued, “which is consistent with a potential role for hypovitaminosis D in the pathogenesis of autoimmune disorders.”
In a statement, Len Horovitz, MD, a pulmonary specialist at Lenox Hill Hospital in New York City, commented that “vitamin D is known to promote wound healing, and to benefit the immune system. So it is not surprising to find that patients with immune lung disorders are vitamin D deficient.” He said that all of his patients are screened and treated for vitamin D deficiency with supplements. The study authors noted that further research is needed to determine whether supplementation is associated with improved outcomes. The study was limited, Kinder and his colleagues wrote, by its use of patients from a single center in Cincinnati.
In addition, because of its cross-sectional design, the study could not evaluate whether vitamin D supplementation is associated with any improved clinical outcomes. To examine that issue, the team called for prospective controlled interventional studies to determine whether vitamin D supplements can ameliorate symptoms and improve outcomes in connective tissue disease-related interstitial lung disease.
Source reference: Hagaman J, et al “Vitamin D deficiency and reduced lung function in connective tissue-associated interstitial lung diseases” Chest 2011; DOI: 10.1378/chest.10-0968.
Teens whose diets include lots of sugary drinks and foods show physical signs that they are at increased risk for heart disease as adults, researchers from Emory University report. Among 2,157 teens who took part in the National Health and Nutrition Examination Survey, the average amount of added sugar eaten in a day was 119 grams (476 calories), which was 21.4 percent of all the calories these teens consumed daily, the researchers noted. “We need to be aware of sugar consumption,” said lead researcher and postdoctoral fellow Jean Welsh. “It's a significant contributor of calories to our diet and there are these associations that may prove to be very negative,” she said. “Sugar-sweetened soft drinks and sodas are the major contributor of added sugar and are a major source of calories without other important nutrients.” Awareness of the negative effects of added sugar may help people, particularly teens, cut down on the amount of sugar they consume, Welsh added. “Parents and adolescents need to become aware of the amount of added sugar they are consuming and be aware that there may be some negative health implications if not now, then down the line,” she said.
The report is published in the Jan. 10 online edition of Circulation.
Welsh's team found that teens who consumed the most added sugar had 9 percent higher LDL (“bad”) cholesterol levels, and 10 percent higher triglyceride levels (another type of blood fat), compared with those who consumed the least added sugar. Teens who took in the highest amount of added sugar also had lower levels of HDL (“good”) cholesterol than those who consumed the least amount of added sugar. In addition, teens who consumed the highest amount of added sugar showed signs of insulin resistance, which can lead to diabetes and its associated risk of heart disease, the researchers found.
The American Heart Association has recommended an upper limit for added sugars intake, based on the number of calories you need. “Most American women [teens included] should consume no more than 100 calories of added sugars per day; most men, no more than 150 calories,” the association states.
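The study's figures follow from the standard conversion of roughly 4 calories per gram of sugar. A small sketch (the names and helper functions are mine, for illustration) shows the arithmetic behind the 476-calorie, 21.4 percent figure and how it compares to the AHA limits:

```python
CALORIES_PER_GRAM_SUGAR = 4  # standard conversion for carbohydrate

def added_sugar_calories(grams: float) -> float:
    """Calories contributed by a given mass of added sugar."""
    return grams * CALORIES_PER_GRAM_SUGAR

def percent_of_daily_intake(sugar_calories: float, total_calories: float) -> float:
    """Share of total daily calories coming from added sugar."""
    return 100.0 * sugar_calories / total_calories

# The study's average teen consumed 119 g of added sugar per day.
sugar_cal = added_sugar_calories(119)  # 476 calories

# 476 calories was 21.4% of daily intake, implying roughly
# 2,224 total calories per day for the average teen in the sample.
implied_total_cal = sugar_cal / 0.214
```

On these numbers, the average teen's added-sugar intake (476 calories) is more than three times the AHA's 150-calorie upper limit for men and well over four times the 100-calorie limit for women.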
One caveat to these findings is that, because of the study's observational design, it cannot show that added sugars caused the differing cholesterol levels, only that the two are linked. In addition, the dietary data cover only one day and may not reflect the teens' usual diets, the researchers noted. Commenting on the study, Dr. David L. Katz, director of the Prevention Research Center at Yale University School of Medicine, said that “this study does not prove that dietary sugar is a cardiac risk factor in this population, but it strongly suggests it.”
The paper has three important messages, he said. First, dietary sugar intake in a representative population of teenagers is nearly double the recommended level. Second, the higher the intake of sugar, the greater the signs of cardiac risk, including elevated LDL (“bad”) cholesterol and low HDL (“good”) cholesterol. Third, the apparent harms of excess sugar are greater in overweight than in lean adolescents.
“Sugar is by no means the sole dietary threat to the health of adolescents, or adults,” Katz said. “But we now have evidence it certainly counts among the important threats to both. Reducing sugar intake by adolescents, to prevent them becoming adults with diabetes or heart disease, is a legitimate priority in public health nutrition,” he said.
SOURCES: HealthDay; Jean Welsh, M.P.H., Ph.D., R.N., postdoctoral fellow, Emory University, Atlanta; David L. Katz, M.D., M.P.H., director, Prevention Research Center, Yale University School of Medicine, New Haven, Conn.; Jan. 10, 2011, Circulation, online
To test their hypothesis that environmental influences experienced by the father can be passed down to the next generation in the form of changed epigenetic information, Rando and colleagues fed different diets to two groups of male mice. The first group received a standard diet, while the second received a low-protein diet. To control for maternal influences, all females were fed the same, standard diet. Rando and colleagues observed that offspring of the mice fed the low-protein diet exhibited a marked increase in the genes responsible for lipid and cholesterol synthesis in comparison to offspring of the control group fed the standard diet.
These observations are consistent with epidemiological data from two well-known human studies suggesting that parental diet has an effect on the health of offspring. One of these studies, called the Överkalix Cohort Study, conducted among residents of an isolated community in the far northeast of Sweden, found that poor diet during the paternal grandfather’s adolescence increased the risk of diabetes, obesity and cardiovascular disease in second-generation offspring. However, because these studies are retrospective and involve dynamic populations, they are unable to completely account for all social and economic variables. “Our study begins to rule out the possibility that social and economic factors, or differences in the DNA sequence, may be contributing to what we’re seeing,” said Rando. “It strongly implicates epigenetic inheritance as a contributing factor to changes in gene function.”
The results also have implications for our understanding of evolutionary processes, says Hans A. Hofmann, PhD, associate professor of integrative biology at the University of Texas at Austin and a co-author of the study. “It has increasingly become clear in recent years that mothers can endow their offspring with information about the environment, for instance via early experience and maternal factors, and thus make them possibly better adapted to environmental change. Our results show that offspring can inherit such acquired characters even from a parent they have never directly interacted with, which provides a novel mechanism through which natural selection could act in the course of evolution.” Such a process was first proposed by the early evolutionist Jean-Baptiste Lamarck, but then dismissed by 20th century biologists when genetic evidence seemed to provide a sufficient explanation.
Taken together, these studies suggest that a better understanding of the environment experienced by our parents, such as diet, may be a useful clinical tool for assessing disease risk for illnesses, such as diabetes or heart disease. “We often look at a patient’s behavior and their genes to assess risk,” said Rando. “If the patient smokes, they are going to be at an increased risk for cancer. If the family has a long history of heart disease, they might carry a gene that makes them more susceptible to heart disease. But we’re more than just our genes and our behavior. Knowing what environmental factors your parents experienced is also important.”
The next step for Rando and colleagues is to explore how and why this epigenetic reprogramming is being transmitted from generation to generation. “We don’t know why these genes are being reprogrammed or how, precisely, that information is being passed down to the next generation,” said Rando. “It’s consistent with the idea that when parents go hungry, it’s best for offspring to hoard calories; however, it’s not clear if these changes are advantageous in the context of a low-protein diet.”
Cystatin C, a blood marker of kidney function, proved significantly more accurate than the standard blood marker, creatinine, in predicting serious complications of kidney disease, in a study by researchers at the San Francisco VA Medical Center and the University of California, San Francisco. Among adults who were identified as having chronic kidney disease by high creatinine levels, the researchers found that only patients who also had abnormally high levels of cystatin C were at high risk for death, cardiovascular disease, heart failure, or kidney failure. People with high creatinine but normal cystatin C levels had risks similar to those with normal creatinine levels.
The researchers also found that a small but important segment of the study population was missed by creatinine but identified by cystatin C as being at significant risk of serious complications, according to lead author Carmen A. Peralta, MD, MAS, an SFVAMC researcher and an assistant professor of medicine in residence in the division of nephrology at UCSF.
The study of 11,909 participants appears online on December 16, 2010, in the JASN Express section of the Journal of the American Society of Nephrology. The authors analyzed patient data from two prospective studies: the Multi-Ethnic Study of Atherosclerosis and the Cardiovascular Health Study, both sponsored by the National Heart, Lung, and Blood Institute.
Principal investigator Michael G. Shlipak, MD, MPH, chief of general internal medicine at SFVAMC, said that the current study highlights a potential clinical use for cystatin C as a method for confirming a diagnosis of chronic kidney disease. Shlipak has been a leader among physicians in identifying cystatin C as an alternative, accurate, and reliable marker of kidney function.
Both cystatin C and creatinine are substances made in the body and filtered by the kidneys. High levels of the substances in the blood indicate that the kidneys are losing the ability to filter them, and thus are losing function. However, explained Peralta, creatinine is a byproduct made in muscles, so it is affected by what you eat and especially by how much muscle you have. Thus, a bodybuilder with healthy kidneys might have an elevated creatinine level because of high muscle mass, whereas a frail elderly person might have normal or even low levels of creatinine, but in fact this person's kidneys are not working well – it's just that there's not much creatinine because there's not much muscle.
In contrast, cystatin C is a protein made in cells throughout the body. “In studies so far, it does not seem to be that affected by age or muscle mass or diet,” said Shlipak, who is also a professor in residence of medicine and epidemiology and biostatistics at UCSF.
Shlipak proposes that cystatin C, which can cost as little as $17 per test, be added as a method for confirming or staging chronic kidney disease in guidelines that are currently being formulated by nephrologists. “It's vital that we have an accurate diagnostic test, because kidney disease does not show symptoms until it's too late, when your kidneys have almost failed completely,” he said. “Being missed by creatinine is an important limitation in our current method of diagnosing kidney disease,” said Peralta. Yet, she added, being falsely identified with kidney disease through inaccurate test results can be disastrous as well. “There is fear and psychological stress, particularly in communities of color, where people have a lot of friends and family members who are on dialysis,” she noted. “You can also be subjected to unnecessary and expensive tests and medications.”
Older adults who eat a healthy diet tend to live longer than those who indulge in desserts and high-fat dairy products, according to a new study in the Journal of the American Dietetic Association. With the projected doubling of our older population by 2030, what people put on their plates may be even more important.
For 10 years, researchers followed the eating habits of 2,500 healthy seniors aged 70 to 79. They found that people who ate ice cream, whole milk and other high-fat dairy items had a 40% higher risk of dying during the decade of the study than those who ate a healthful diet. People who ate sweets such as doughnuts, cakes, and cookies had a 37% higher risk of dying in the same 10-year study period.
The seniors were placed into one of six dietary categories depending on what they ate: 1) healthy foods; 2) high-fat dairy products; 3) meat, fried foods and alcohol; 4) breakfast cereal; 5) refined grains; and 6) sweets and desserts. The people with the more healthful diets not only lived longer, they also reported having a better quality of life for a longer period of time than others.
“Our study and several previous studies suggest that it may be important what people eat at any age and that people can perhaps increase their quality of life and survival by following a healthy diet,” explains lead author Amy Anderson, a postdoctoral researcher in the University of Maryland's Department of Nutrition and Food Science.
Eating healthy meant including more low-fat dairy products, fruits and vegetables, whole grains, poultry and fish in the diet as opposed to meat, fried foods, sweets, highly sugared drinks and other fatty foods. The healthy group got only 3% of their calories from high-fat dairy products such as cheese and ice cream, for example, while the high-fat dairy group got 17% of their calories from these foods. The healthier group also ate fewer sweets with only 6% of their calories coming from these treats compared to 25% by those in the desserts group.
The study noted that in the past century, the leading causes of death were from infectious diseases. Now people are dying from chronic illnesses such as heart disease and cancer, which are often tied to what we eat.
“I think this research is important, especially now with the baby boomers entering these older age groups. So if people have a higher quality of life and survival, if they're healthier, this can reduce the cost of health care and improve people's daily lives in general,” says Anderson.
The authors used survey data from Project EAT (Eating Among Teens), in which two groups of adolescents (1608 middle school and 3074 high school students) completed surveys in 1999 and 2004 regarding eating habits, parental styles, and various socioeconomic variables.
Cross-sectional results for adolescent girls indicated a positive association between maternal and paternal authoritative parenting style and frequency of family meals. For adolescent boys, maternal authoritative parenting style was associated with more frequent family meals. Longitudinal results indicated that authoritative parenting style predicted higher frequency of family meals five years later, but only between mothers and sons or between fathers and daughters.
Of the participants, 15.5%, 29.7%, 28.1%, and 21.1% reported being physically inactive as teenagers, at age 30, at age 50, and in late life, respectively; for those who were inactive, the increase in cognitive impairment was between 50% and 100% at each time point. When physical activity measures for all four ages were entered into a single model and adjusted for variables such as age, education, marital status, diabetes, hypertension, depressive symptoms, smoking, and BMI, only teenage physical activity status remained significantly associated with cognitive performance in old age.
Middleton added, “As a result, to minimize the risk of dementia, physical activity should be encouraged from early life. And all is not lost: people who were inactive as teenagers can reduce their risk of cognitive impairment by becoming active in later life.”
The researchers concluded that the mechanisms by which physical activity across the life course is related to late-life cognition are likely to be multifactorial. There is evidence that physical activity has a positive effect on brain plasticity and cognition. In addition, physical activity reduces the rates and severity of vascular risk factors, such as hypertension, obesity, and type II diabetes, each of which is associated with an increased risk of cognitive impairment.
“Low physical activity levels in today's youth may mean increased dementia rates in the future. Dementia prevention programs and other health promotion programs encouraging physical activity should target people starting at very young ages, not just in mid- and late life,” said Middleton.
“Once we know the exact causes of type 2 diabetes, we can develop more effective prevention and therapy strategies,” said Dr. Thomas Illig, research group leader at the Institute of Epidemiology of Helmholtz Zentrum München and one of the corresponding authors of the study. Dr. Cornelia Huth, who played a key role in the selection of the study participants and the analyses of Helmholtz Zentrum München, added: “What enabled us to identify these factors with a high level of confidence is the large number of investigated subjects in this collaborative study. Each factor by itself contributes only slightly to the entire diabetes risk. But to find out more about the pathogenic mechanisms of the disease, even these slight contributions are important.” Dr. Christian Herder and Dr. Wolfgang Rathmann, both of whom are research group leaders at the German Diabetes Center, pointed out: “One important finding of the new study is that some of the gene loci associated with increased type 2 diabetes risk are also risk variants for other diseases such as coronary heart disease, autoimmune diseases and cancer. This suggests that specific proteins could be relevant for several diseases at the same time.”
Type 2 diabetes is a disorder of glucose homeostasis. Its characteristic features are a diminished response to the hormone insulin and the loss of sufficient insulin production. The pathogenic mechanisms of the disease are not yet fully understood. It is known, however, that a combination of genetic susceptibility and lifestyle factors leads to diabetes. In Germany alone, at least seven percent of the population has been diagnosed with the disease – altogether almost six million people. Additionally, studies show that several million men and women in Germany suffer from as yet undiagnosed and thus untreated diabetes.
The team looked at how NK cells (natural killer cells – a type of immune cell) reacted to Helicobacter pylori. These cells are an important part of the immune system as they can both recognise and kill cells that are infected by viruses and bacteria as well as tumour cells.
“We found that a special type of NK cells was active against the stomach ulcer bacterium,” says Åsa Lindgren. “These NK cells produced cytokines, which are the immune system's signal substances and act as a defence against the intruder.”
The researchers' results suggest that NK cells can play an important role in the immune defence against Helicobacter pylori. Previous research has also shown that a high proportion of NK cells in tumour tissue is associated with a better prognosis and longer survival for patients with stomach cancer, as these cells help to eliminate the tumour cells.
The researchers therefore believe that activation of the NK cells can play a key role in stopping tumours from developing, and that reduced NK-cell activity can increase the risk of cancer developing. Åsa Lindgren hopes that these findings can be used to develop new ways of diagnosing and treating stomach cancer.
“This would make it possible to diagnose stomach cancer at an early stage, which, in turn, could mean a better prognosis for the patients.”
Thirty years ago Maria de Sousa, then at the beginning of her career, noticed that lymphocytes were attracted to places with a surplus of iron. This, together with the fact that the vertebrate immune system (IS) is vastly more complex than those of its ancestors (and evolution rarely increases complexity, which is energetically costly, unless something is gained), led her to a revolutionary new idea: could this new complexity be evolutionarily sound because it allowed the IS to perform some important new function, maybe protecting the body against iron toxicity?
In fact iron, although an essential element for most life forms, can also be toxic to these same organisms when free (not attached to proteins). This means that in this form it needs to be “watched” and regulated around the clock. In vertebrates, this is done through hepcidin, a liver protein that “moves” iron between cells and plasma according to the body's needs (or potential dangers). The problem is that the hepcidin-producing liver cells have limited mobility, so a complementary, far-reaching iron control system was needed. Lymphocytes, with their unique capacity to move throughout the body, were the perfect candidates, and since 1978, de Sousa and her group have been chasing this idea.
Much of their work has been done on hemochromatosis – a disease in which problems in the absorption of iron through the digestive tract lead to too much iron in the organism and to its toxic accumulation in the organs.
From this work we now know that hemochromatosis patients also have a defective IS and, moreover, that their iron overload levels correlate with their lymphocyte deficiency – the fewer lymphocytes they have, the more severe the disease. Work in animal models with iron overload problems or, conversely, with lymphocyte deficiencies has again found links between excess iron in the body and a deficient IS, further supporting de Sousa's “immuno-iron idea”.
Meanwhile, human lymphocytes were shown to produce several proteins crucial for the regulation of iron levels: ferritin, which acts as the body's iron store (holding on to iron when there is too much in the body and releasing it when there is a deficiency), and ferroportin, which is the cells' iron “exit door” (again releasing or retaining iron as necessary). The fact that lymphocytes had both proteins gave them the potential to be a “mobile” and easily “mobilizable” iron-storage compartment, characteristics perfect for an important role in iron homeostasis.
But hepcidin, the central piece of iron regulation, is also known to be an important player in the immune response, which raised the possibility that it could hold the clue to this problem. In fact, during infection hepcidin shuts down the “door” through which iron leaves the cell (ferroportin), reducing iron availability in the plasma and thus helping to control infection, as bacteria need iron to divide. Several studies have now shown that hepcidin is produced by a variety of cells involved in the immune response. Finally, last year, a study suggested, for the first time, that lymphocytes were also capable of producing the protein, raising the possibility that hepcidin could actually be “the missing link” in de Sousa's theory.
To test this hypothesis, Jorge Pinto, Maria de Sousa and colleagues at the Institute for Molecular and Cell Biology (IBMC) of Porto University looked at hepcidin production in human lymphocytes under toxic iron concentrations or immune activation, as de Sousa's theory proposed that lymphocytes could play a role in both situations. They found not only that hepcidin was produced by all classes of lymphocytes, but also that its production increased both in the presence of high quantities of iron and when lymphocytes were activated, backing de Sousa's proposals.
Pinto explains: “We show, for the first time, that lymphocytes can ‘feel’ the toxic levels of iron in circulation and respond by increasing their own capacity to retain it within, restoring ‘normality’. The same mechanism is seen being used in situations of (iron) demand, such as when the cells are activated by the occurrence of an infection and need to divide.”
They also found something else totally unexpected – that hepcidin was involved in this second mechanism, suggesting an even closer dependence between the two systems than de Sousa had thought.
To Hal Drakesmith, a researcher at the University of Oxford working on the possibility of manipulating iron transport as a way to combat infections such as HIV, malaria and hepatitis C, these results raise particularly interesting questions. As he explains, “This seems to suggest that control of iron metabolism may be an integral component of lymphocyte immunity. Withholding iron from pathogens is an accepted part of our defence against infection, but a role for lymphocytes in controlling iron transport has not been proposed before.”
“Crucially,” says Pinto, “we still believe that the main regulator of systemic iron levels is the liver. But not only are lymphocytes (and not liver cells) able to sense toxic forms of iron, they are also able to travel to and be activated in the specific places where pathogens accumulate, helping to control infection.”
These results are a major step toward understanding the link between the IS and iron and, if confirmed in live organisms (all this work was done on human cells in the laboratory), could be the beginning of a totally different view of what the immune system is and of how to approach immunologic problems.
As Hal Drakesmith says, “the paper describes several new findings which are highly likely to be of interest and importance to the iron and immunity fields of research.” A simple example is the anaemia that usually accompanies chronic inflammatory diseases and that so far cannot be clearly explained. Pinto and de Sousa's results suggest that the chronic lymphocyte activation characteristic of these diseases, by driving hepcidin production, could be part of the phenomenon, as iron is an integral part of red blood cells.
Pinto, de Sousa and colleagues now plan to return to those diseases of iron overload associated with immune abnormalities and see whether hepcidin proves to be, in fact, the connection between them. Another possibility is the construction of mice lacking the hepcidin gene in the bone marrow – where lymphocytes develop – to analyse the changes this would bring to both iron homeostasis and the immune response.
Whatever happens, this is a strikingly interesting story, with decades of persistence and belief behind it, and one which, I am sure, still has much to tell us.