Scientists may have discovered a way of identifying dieters who are prone to piling the pounds back on after weight loss. A study at Maastricht University’s Department of Human Biology found a link between a gene involved in regulating blood pressure and post-diet weight gain in women. Women who regained weight after slimming showed a marked change in the concentration of a particular protein in their blood during dieting, the research showed. Researchers now hope to develop a test to indicate how prone people are to yo-yo dieting.
Edwin Mariman, professor of functional genetics at Maastricht, said: “It was a surprising discovery, because until now there has been no clear link between this protein and obesity. We do not yet have an explanation for the results, but it does appear that it should be possible within a few years to use this finding to develop a test to show who is at high risk of putting weight back on after a diet.”
Hospitals already conduct tests for the protein, known as the angiotensin I converting enzyme (ACE). But the test is currently carried out to check its activity in regulating blood pressure, rather than its concentration. Up to 80% of dieters suffer from the yo-yo effect, returning to their original weight within a year.
The study looked at around 100 women aged 20 to 45, half of whom had maintained their post-diet weight and half of whom had put weight back on. The findings of the research have been published by Dr Ping Wang, a scientist in Professor Mariman’s research group, in the online scientific journal PloS ONE.
The most common type of breast cancer in older women — estrogen and progesterone receptor (ER/PR) positive breast cancer — has been linked to a protein that fends off aging-related cellular damage. A new study led by Vanderbilt-Ingram Cancer Center researcher David Gius, M.D., Ph.D., now shows how a deficiency in this aging-associated protein may set the stage for these tumors to develop.
The findings, published in Molecular Cell, provide information that could assist in the screening, prevention and treatment of these common age-related cancers. While the young are certainly not spared cancer’s wrath, cancer is primarily a disease of aging, with the majority of cases occurring in people over 50. However, the biological processes that underlie this association are not clear.
“The connection between aging and cancer is one of the most established phenomena in cancer research,” said Gius, associate professor of Cancer Biology, Pediatrics and Radiation Oncology. “The problem to address this clinically significant question is that this field lacks in vivo models to study this.”
In the late-1990s, proteins called “sirtuins” were linked to extended lifespan observed in several species maintained on a calorically restricted diet. These nutrient-sensing sirtuin proteins seemed to defend against aging-related cellular damage. Sirtuins are present in all living organisms, with humans having seven different sirtuin proteins. “When (the sirtuins) were discovered, it seemed obvious to conclude that there might be a mechanistic connection between the genes that determine length of survival and cancer,” Gius said. Previously, while at the National Cancer Institute, Gius and colleagues created mice lacking some of these sirtuins.
They reported last January in Cancer Cell that when they knocked out Sirt3 — a sirtuin localized in the mitochondria, the cellular “power plants” — the mice developed ER/PR positive breast tumors, the most common type of breast cancer in postmenopausal women. These tumors also exhibited increased levels of damaging free radicals and “reactive oxygen species” (ROS) — including superoxide, the primary metabolite of oxygen in the mitochondria — which provided an important clue as to how Sirt3 deficiency might permit these tumors to develop. “The mechanism, at least in part, for why these mice develop receptor positive breast cancer is altered mitochondrial ROS, including superoxide,” Gius said. But how deficiency in a longevity gene led to increased ROS was not clear. Since superoxide is generally removed from the cell with the help of a detoxifying enzyme called manganese superoxide dismutase (MnSOD), Gius hypothesized that the Sirt3 deficiency may abnormally regulate MnSOD.
In the current study, the researchers show that Sirt3 knockout mice have decreased MnSOD activity despite having normal levels of the protein. Gius and colleagues determined that the MnSOD in Sirt3 knockout mice was abnormally modified (with a chemical “acetyl” group) at a specific amino acid (lysine 122). This aberrant modification of MnSOD reduced the enzyme’s ability to detoxify superoxide and appeared to explain the increase in ROS in Sirt3 knockout mouse tumors. “These results suggest that aberrant regulation of MnSOD plays a role in receptor positive breast cancer,” said Gius.
Gius and colleagues also developed an antibody that can assess the acetylation status of MnSOD, which he says can potentially be used “to screen breast tissue samples to determine what women are at risk for (receptor positive) cancer or for recurrence because of this dysregulation of MnSOD.” Additionally, agents that target the acetylation of this amino acid on MnSOD may be useful as chemopreventive therapies in women at risk of these cancers and of recurrence, he noted. The research was supported by grants from the National Cancer Institute and the Department of Defense.
A new study finds that drinking orange juice, soda and other beverages high in the sugar fructose could increase the small risk that middle-aged and elderly women have of developing gout. Gout is a painful form of arthritis caused by too much uric acid in the blood. For women in the study who drank two or more servings of these beverages per day, the risk of gout was more than double that for women who drank sugary sodas and juices less than once per month. Because gout is relatively rare among women, the drinks probably contribute only moderately to a woman's chances of developing it. Still, this is the first study linking sodas and sweetened fruit juices to women's gout risk. Previous research found such a link for men.
The study will be published in the Nov. 24 issue of the Journal of the American Medical Association, and is being presented today (Nov. 10) at the American College of Rheumatology annual scientific meeting.
Gout occurs when levels of uric acid in the blood become too high, and uric acid crystallizes around the joints, leading to inflammation, swelling and pain. Foods that can increase the levels of uric acid in the blood include organ meats (such as kidneys and livers), asparagus and mushrooms, according to the Mayo Clinic. Fructose is also known to increase blood uric acid levels, the researchers said. While gout is not common in the United States, its incidence more than doubled over a 20-year period, from 16 cases per 100,000 Americans in 1977 to 42 per 100,000 in 1996. Over this period, the researchers noted, the population also consumed increasing amounts of soda and other drinks sweetened with fructose.
The new study followed 78,906 women for 22 years, from 1984 to 2006, as part of the Nurses' Health Study. At the beginning of the study, none of the women had gout. By the end, 778 had developed it. Women who drank one serving of soda per day were 1.74 times more likely to develop gout than those who drank less than one serving per month. Those who drank two or more servings per day were 2.4 times more likely to develop gout. Drinking two or more servings of soda per day caused an additional 68 cases of gout per 100,000 women per year, compared with drinking less than one serving of soda per month, the researchers said. Drinking orange juice also increased the risk. Women who drank one serving of orange juice per day were 1.41 times more likely to develop gout, and those who drank two or more servings were 2.4 times more likely to report gout.
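The relative risks above can be put in absolute terms with standard excess-risk arithmetic. A minimal sketch, using the rounded figures reported in the article (the crude baseline rate derived here is an illustration, not a reanalysis of the study data):

```python
# Convert a relative risk into excess cases per 100,000 women per year.
# Figures are the rounded values reported in the article; this is
# illustrative arithmetic, not a reanalysis of the study.

def excess_cases(baseline_per_100k: float, relative_risk: float) -> float:
    """Excess annual cases per 100,000 attributable to the exposure."""
    return baseline_per_100k * (relative_risk - 1)

# Crude baseline: 778 cases among 78,906 women followed for 22 years.
baseline = 778 / (78_906 * 22) * 100_000

print(round(baseline, 1))                     # → 44.8 cases per 100,000/yr
print(round(excess_cases(baseline, 2.4), 1))  # → 62.7 (two+ sodas/day, RR 2.4)
```

The crude figure of roughly 63 extra cases per 100,000 is close to the adjusted 68 the researchers report; the gap reflects the study's statistical adjustments.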
Lifestyle and diet
The rise in gout cases is most likely due to changes in lifestyle and diet and an increase in conditions associated with gout, such as metabolic syndrome, said study researcher Dr. Hyon K. Choi of the Boston University School of Medicine. The results held even after the researchers took into account factors that could have influenced the findings, such as age, body mass index and whether the women had gone through menopause, Choi said.
The findings suggest diets to prevent gout should reduce fructose intake, the researchers said.
The study was funded by the National Institutes of Health.
Mothers who did not breastfeed their children have significantly higher rates of type 2 diabetes later in life than moms who breastfed, report University of Pittsburgh researchers in a study published in the September issue of the American Journal of Medicine.
“We have seen dramatic increases in the prevalence of type 2 diabetes over the last century,” said Eleanor Bimla Schwarz, MD, MS, assistant professor of medicine, epidemiology, and obstetrics, gynecology and reproductive sciences at the University of Pittsburgh. She also has a secondary appointment in epidemiology at the GSPH. “Diet and exercise are widely known to impact the risk of type 2 diabetes, but few people realize that breastfeeding also reduces mothers’ risk of developing the disease later in life by decreasing maternal belly fat.”
The study included 2,233 women between the ages of 40 and 78. Overall, 56 percent of mothers reported they had breastfed an infant for at least one month. Twenty-seven percent of the mothers who did not breastfeed developed type 2 diabetes; they were almost twice as likely to develop the disease as women who had breastfed or never given birth. In contrast, mothers who breastfed all of their children were no more likely to develop diabetes than women who never gave birth. These long-term differences were notable even after considering age, race, physical activity and tobacco and alcohol use.
“Our study provides another good reason to encourage women to breastfeed their infants, at least for the infant’s first month of life,” said Schwarz. “Clinicians need to consider women’s pregnancy and lactation history when advising women about their risk for developing type 2 diabetes.”
Schwarz also is an assistant investigator at the Magee-Womens Research Institute. Co-authors of the study include Jeanette Brown, MD, Jennifer M. Creasman, MPH, and David Thom, MD, PhD, University of California, San Francisco; Alison Stuebe, MD, MSc, University of North Carolina School of Medicine; Candace K. McClure, PhD, University of Pittsburgh; and Stephen K. Van Den Eeden, PhD, Kaiser Permanente, CA.
The research was funded by grants from the National Institutes of Health’s National Institute of Diabetes and Digestive and Kidney Diseases and the National Institute of Child Health and Human Development.
The idea that disease-causing genes can be beneficial is not new. The most clear-cut case involves a gene variant that, when present in two copies, causes sickle cell anaemia, which can result in severe pain, organ damage and death. Although it seems that natural selection would work to eliminate the disorder, the variant remains prevalent in some areas of Africa because people with just a single copy are less susceptible to malaria. Evolutionarily the trade-off is worth it: Far more people are protected from malaria than ever develop sickle cell anaemia even in today’s environment.
Unlike sickle cell anaemia, which is caused by a mutation in just one gene, many complex diseases are associated with several variants – specific locations in the DNA where the nucleotide ‘letters’ vary between individuals. These locations are known as SNPs, for single nucleotide polymorphisms. Some of these SNPs are associated with an increased disease risk, while others protect against developing the disease. When calculating an individual’s overall genetic risk, it’s necessary to consider the net effect of all of his or her variants.
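Netting risk-raising and protective variants into one number can be sketched with a simple additive log-odds model, a common convention in polygenic risk scoring. Note that this model and the example odds ratios are assumptions for illustration, not the Stanford group's actual method:

```python
import math

# Sketch of netting risk and protective SNPs into one score. An odds
# ratio (OR) above 1 raises risk, below 1 protects; taking logs lets the
# two kinds of variant cancel in a plain sum. Illustrative only.

def net_genetic_risk(odds_ratios, copies):
    """Sum per-allele log odds ratios weighted by allele count (0, 1 or 2)."""
    return sum(n * math.log(or_) for or_, n in zip(odds_ratios, copies))

# Hypothetical individual: one copy each of two risk variants (OR 1.3,
# 1.2) and one protective variant (OR 0.8).
score = net_genetic_risk([1.3, 1.2, 0.8], [1, 1, 1])
print(score > 0)  # → True: the risk variants slightly outweigh protection
```

A positive score means the risk alleles outweigh the protective ones; a neutral variant (OR 1.0) contributes nothing, since log(1) = 0.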
Researchers at Stanford University picked seven well-known conditions to study: type-1 and type-2 diabetes, rheumatoid arthritis, hypertension, Crohn’s disease, coronary artery disease and bipolar disorder. Previous genome-wide association studies have identified several hundred SNPs associated with each disorder. Corona found that of the top SNPs associated with type-1 diabetes, 80 have recently been increasing in prevalence, meaning that they underwent positive selection. Of these, a surprising 58 are associated with an increased risk of the disorder, while 22 appear protective. Similarly, SNPs associated with an increased risk for rheumatoid arthritis were found to be positively selected. In contrast to type-1 diabetes and rheumatoid arthritis, Corona found that we’re evolving away from a tendency to develop Crohn’s disease (that is, more protective SNPs than risky SNPs have been positively selected).
Results for the other three disorders – type-2 diabetes, coronary artery disease and bipolar disorder – showed that protective and risky SNPs were positively selected in about equal proportions. ‘Now we’re starting to see little hints as to why this might be the case,’ said Butte. For example, a recent study in another lab showed that genetic variations in an antiviral response gene called IFIH1 that improve its ability to protect against enterovirus infection (and the resulting severe, potentially deadly, abdominal distress) also increase a carrier’s risk for type-1 diabetes. And scientists who study global disease patterns have long noted that the prevalence of tuberculosis varies inversely with that of rheumatoid arthritis.
‘It’s possible that, in areas of the world where associated triggers for some of these complex conditions are lacking, carriers would experience only the protective effect against some types of infectious disease,’ said Butte, who pointed out that the cumulative effect of many SNPs in a person’s genome may buffer the effect of any one variant, even if it did raise a person’s risk for a particular condition.
Regardless of the reason, some evolutionary tenets still apply. Healthier people are, presumably, more likely to reproduce and pass those same genes – be they protective or risky – to their offspring. When conditions changed because of differences in diet, exposures or location as populations moved around the globe, carriers of the risky SNPs began to develop the conditions we struggle with today.
Corona and Butte are now expanding their investigation to include even more SNPs and diseases. They are also looking at the genetic profile of various types of tumours to see if there’s evidence for positive evolutionary pressure there as well.
‘Even though we’ve been finding more and more genetic contributions to disease risk,’ said Butte, ‘that’s not really an appealing answer. There have got to be some other reasons why we have these conditions.’
Source: Stanford University Medical Centre
Researchers have long speculated that the diet may help explain why nations in the Mediterranean region have historically had lower rates of heart disease and some cancers, including breast cancer, compared with other European countries and the U.S.
Until now, only two other studies have looked at the relationship between Mediterranean-style eating and the risk of breast cancer, both done in the U.S. Each found a connection between the diet and lower breast cancer risk, although in one the link was limited to breast cancers that lack receptors for the hormone estrogen — which account for about one-quarter of breast tumors.
The current study focused on women in Greece, as it is the “cradle” of the Mediterranean diet, and a large segment of the population still adheres to it, Dr. Dimitrios Trichopoulos, the senior researcher on the work, told Reuters Health by email.
At the outset, the study participants completed detailed dietary questionnaires and gave information on their lifestyle habits and demographics. Each woman was given a Mediterranean diet score, ranging from 0 to 9, based on how often they consumed vegetables, legumes, fruit and nuts, whole grains, fish and olive oil or other sources of monounsaturated fats; they also won points by limiting meat and dairy.
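The 0-to-9 score described above can be sketched as a simple tally. The article lists eight components; the classic version of this score also awards a point for moderate alcohol intake, which is assumed here as the ninth, and the median cutoffs are placeholders for the study population's own medians:

```python
# Minimal sketch of the 0-to-9 Mediterranean diet score described above.
# The ninth component (moderate alcohol intake) and the median cutoffs
# are assumptions; the study used its own cohort's intake distributions.

BENEFICIAL = ["vegetables", "legumes", "fruit_nuts", "whole_grains",
              "fish", "monounsaturated_fat"]  # +1 if at or above the median
LIMITED = ["meat", "dairy"]                   # +1 if below the median

def med_diet_score(intake, medians, alcohol_moderate=False):
    """Return a 0-9 adherence score from daily intake vs. cohort medians."""
    score = sum(intake[g] >= medians[g] for g in BENEFICIAL)
    score += sum(intake[g] < medians[g] for g in LIMITED)
    score += int(alcohol_moderate)            # assumed ninth component
    return score

# Hypothetical woman above the median on every beneficial group, below it
# on meat and dairy, with moderate alcohol intake: maximum adherence.
medians = {g: 1.0 for g in BENEFICIAL + LIMITED}
intake = {g: 2.0 for g in BENEFICIAL} | {g: 0.5 for g in LIMITED}
print(med_diet_score(intake, medians, alcohol_moderate=True))  # → 9
```

Each component is all-or-nothing relative to the cohort median, so the score rewards an overall eating pattern rather than any single food.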
Of the 14,800 women included, 240 were diagnosed with breast cancer over an average follow-up of 10 years.
Overall, postmenopausal women whose Mediterranean diet scores were in the 6-to-9 range were 22 percent less likely to develop breast cancer than their counterparts with scores between 0 and 3. That was with factors such as age, education, smoking history, weight and exercise habits taken into account.
The findings show an association between Mediterranean eating and lower breast cancer risk, but do not prove cause-and-effect, according to Trichopoulos, who is with the Harvard School of Public Health in Boston and the Bureau of Epidemiologic Research at the Academy of Athens in Greece.
Further studies are needed to confirm the results, he said.
However, other evidence suggests ways the Mediterranean diet might curb cancer risk.
Research has found, for instance, that women who closely follow the diet tend to have lower levels of estrogen, which fuels the growth of the majority of breast cancers, than other women do. Other studies in the lab suggest that the fats found in the Mediterranean diet — both olive oil and the omega-3 fats in oily fish — may slow the growth of cancer cells.
The diet is also typically rich in antioxidants, which protect body cells from damage that can eventually lead to disease, including cancer. Trichopoulos said that if the Mediterranean diet does have a protective effect against cancer, it is “likely” to involve that antioxidant component.
It also makes sense, said the researcher, that the diet could affect the risk of postmenopausal, but not premenopausal, breast cancer.
Younger women who develop breast cancer, he explained, often have a genetic vulnerability to the disease, whereas in older women, lifestyle and environmental exposures may be relatively more important contributors to risk.
Based on their findings, Trichopoulos and his colleagues write, the association between the Mediterranean diet and breast cancer is of “modest, but not negligible, strength.”
In the U.S., a woman's chance of being diagnosed with breast cancer rises from about half a percent, or one in 233, during her 30s, to about four percent, or one in 27, during her 60s.
Established risk factors for breast cancer include older age and having had a first-degree relative diagnosed with the disease. Research has also linked obesity, sedentary lifestyle, use of hormone replacement therapy and high alcohol intake to an increased risk.
SOURCE: Journal of Clinical Nutrition
The prospective Rotterdam Study involved 5,395 people over age 55 with no dementia at baseline. All of the participants, who lived in one section of the Rotterdam area, provided dietary information when the study began in 1990.
The researchers previously reported a similar association of vitamin E intake with a lower risk of dementia and Alzheimer's disease over six years of follow-up among the cohort.
The current study found that after 9.6 years of follow-up, 465 of the participants had developed dementia; 365 of these cases were classified as Alzheimer's disease.
Higher baseline vitamin E consumption correlated with lower long-term risk of dementia in models minimally adjusted for age only and those adjusted for age, education, apolipoprotein genotype, total caloric intake, alcohol and smoking habits, body mass index, and use of supplements (both P=0.02 for trend).
Dietary surveys indicated that margarine was by far the biggest contributor to vitamin E intake at 43.4%, followed by sunflower oil at 18.5%, butter at 3.8%, and cooking fats at 3.4%.
Participants with vitamin E intakes in the top third, averaging 18.5 mg per day, were 25% less likely to develop dementia of any kind over almost 10 years of follow-up than those in the bottom third, who averaged only 9.0 mg per day.
While the top versus bottom tertile comparison was significant, the middle group with vitamin E intake averaging 13.5 mg per day was no less likely to develop dementia than the lowest intake group.
For Alzheimer's disease alone, the multivariate-adjusted risk was 26% lower among those with the highest intake compared with the lowest (95% confidence interval 3% to 44%, P=0.03 for trend). But intermediate intake again appeared to have no impact.
Other antioxidants — vitamin C, beta-carotene, and flavonoids — held no significant associations with dementia or Alzheimer's disease risk (multivariate adjusted P=0.50 to >0.99 for trend).
Sensitivity analyses excluding participants who reported taking supplements at baseline showed similar results.
The researchers noted that the vitamin intakes seen in the study were consistent with a typical Western diet but cautioned about the possibility of residual confounding in the observational results.
Eating more olive oil could help prevent ulcerative colitis, according to a new study co-ordinated by medical researchers at the University of East Anglia (UEA).
Presented today at the Digestive Disease Week conference in New Orleans, the findings show that people with a diet rich in oleic acid – which is present in olive oil – are far less likely to develop ulcerative colitis. Oleic acid is a monounsaturated fatty acid found in olive oil, peanut oil and grapeseed oil, as well as in butter and certain margarines.
The researchers, led by Dr Andrew Hart of UEA's School of Medicine, studied more than 25,000 people aged 40-65 living in Norfolk, UK. The volunteers were recruited to the EPIC study (European Prospective Investigation into Diet and Cancer) between 1993 and 1997. The participants, none of whom had ulcerative colitis at the outset, completed detailed food diaries which were later analysed by specially trained nutritionists working in Cambridge.
By 2004, 22 participants in the study had developed ulcerative colitis, and the researchers compared their diets with those of participants who did not develop the disease. They found that those with the highest intake of oleic acid had a 90 per cent lower risk of developing the disease.
“Oleic acid seems to help prevent the development of ulcerative colitis by blocking chemicals in the bowel that aggravate the inflammation found in this illness,” said Dr Hart.
“We estimate that around half of the cases of ulcerative colitis could be prevented if larger amounts of oleic acid were consumed. Two-to-three tablespoons of olive oil per day would have a protective effect,” said Dr Hart.
Ulcerative colitis is a distressing disease affecting 120,000 people of all ages in the UK and 1 million in the US. It is characterized by inflammation of the lining of the colon or large bowel, which causes abdominal pain, diarrhoea and weight loss.
Similar work in other countries is now required to determine if these results are reproducible there, before the link can be said to be definite. If it is confirmed that oleic acid is truly protective, dietary modifications should be considered to prevent colitis. Additionally, the use of oleic acid supplements should also be assessed in the future as a possible treatment for colitis sufferers.
People with mild cognitive impairment can be affected by a reduction in their ability to think, such as reduced memory and a short attention span.
“We wanted to find out whether highly educated patients with mild cognitive impairment differed in terms of tolerance of the disease from patients with intermediate and low levels of education,” says Rolstad.
By analysing the patients' spinal fluid, the researchers were able to examine whether there were signs of dementia in the brain.
“Highly educated patients with mild cognitive impairment who went on to develop dementia over the next two years had more signs of disease in their spinal fluid than those with intermediate and low levels of education,” says Rolstad.
Despite having more disease in the brain, the highly educated patients showed the same symptoms of the disease as their less well educated counterparts. This means that patients with higher levels of education tolerate more disease in the brain.
The researchers also studied patients with mild cognitive impairment who did not go on to develop dementia over the next two years.
“We found that the highly educated patients who did not develop dementia during the course of the study showed signs of better nerve function than those with lower levels of education,” says Rolstad. “This finding means that the highly educated not only tolerate more disease in the brain but also sustain less nerve damage during the early stages of the disease.”
The results indicate that a higher reserve capacity delays the symptoms of dementia and the progress of the disease. This can help the care sector to be more aware of dementia in highly educated patients, and thus increase the chances of the correct treatment being given.