The study was carried out by researchers from the University of Otago Medical School, New Zealand. Funding was provided by the Genesis Oncology Trust, the Dean’s Bequest Funds of the Dunedin School of Medicine, the Gisborne East Coast Cancer Research Trust and the Director’s Cancer Research Trust. The research was published in the peer-reviewed American Journal of Epidemiology. This was a case-control study in New Zealand that compared a group of adults with bowel cancer with a group without bowel cancer, looking at whether they drank milk at school. School milk was freely available in most schools in New Zealand until 1967, when the government programme was stopped. Many schools in the Southland region had stopped free milk as long ago as 1950.
Case-control studies are appropriate for looking at whether people with and without a disease have had a particular exposure (milk in this case). The difficulty is in accounting for all potential confounding factors, particularly other health and lifestyle factors that could be related to both diet and bowel cancer risk. For example, regular childhood milk consumption could be a reflection of a ‘healthy’ diet and other healthy lifestyle behaviours that may reduce the risk of cancer. In addition, when examining such a specific dietary factor – i.e. milk consumed at school – it is difficult to account for all the milk and other dairy products consumed outside of school.
In this case-control study, 562 adults (aged 30 to 69) with newly diagnosed bowel cancer were identified from the New Zealand Cancer Registry in 2007. For a control group, 571 age-matched adults without cancer were randomly selected from the electoral register. All participants were mailed a questionnaire that asked about any previous illness, use of aspirin or dietary supplements in childhood, participation in school milk programmes, other childhood milk consumption, childhood diet (including other milk and dairy), smoking, alcohol consumption before 25 years of age, screening tests for bowel cancer, family history of cancer, education and sociodemographic characteristics. Childhood weight and height were not asked about. For school milk consumption, participants were specifically asked:
- Whether they drank school milk
- How many half-pint bottles they drank a week
- What age they first drank school milk
- When they stopped drinking school milk
Statistical associations between school milk participation and cancer risk were calculated. The calculations took into account several risk factors for bowel cancer, including age, sex, ethnicity and family history.
What were the basic results?
Data on school milk consumption was available for 552 cases and 569 controls. As expected, people who started school before 1967 were more likely to have had free school milk than those who began school after 1968. Seventy-eight percent of cases participated in the school milk programme compared with 82% of controls. School milk consumption was associated with a 30% reduced risk of developing bowel cancer (odds ratio 0.70, 95% confidence interval 0.51 to 0.96).
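The published odds ratio of 0.70 is adjusted for confounders, but a crude (unadjusted) odds ratio can be sketched directly from the participation figures quoted above (78% of 552 cases versus 82% of 569 controls). The counts below are rounded back-calculations for illustration, not the authors' exact data, so the result differs from the adjusted figure:

```python
# Illustrative 2x2 counts, back-calculated from the quoted percentages.
# The published OR of 0.70 is adjusted for age, sex, ethnicity and
# family history, so this crude figure will not match it exactly.
cases_exposed = round(0.78 * 552)      # cases who drank school milk
cases_unexposed = 552 - cases_exposed
controls_exposed = round(0.82 * 569)   # controls who drank school milk
controls_unexposed = 569 - controls_exposed

# Odds ratio = odds of exposure among cases / odds among controls
odds_cases = cases_exposed / cases_unexposed
odds_controls = controls_exposed / controls_unexposed
crude_or = odds_cases / odds_controls
print(f"Crude odds ratio: {crude_or:.2f}")  # ~0.78
```

An odds ratio below 1 indicates that cases were less likely than controls to have had the exposure, which is why it is reported as a reduced risk.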
When looking at the effect of the number of bottles consumed per week, the researchers found that, compared with no bottles, five bottles per week were associated with a 32% significantly decreased risk, and 10 or more bottles with a 61% significantly decreased risk. However, there was no significant association with one to four bottles or six to nine bottles. The researchers found a similar trend when total school milk consumption was compared with no consumption: 1,200-1,599 bottles were associated with a 38% significantly decreased risk; 1,600-1,799 bottles with a 43% decreased risk; and 1,800 or more bottles with a 38% significantly decreased risk. There was no significant association with fewer than 1,200 bottles. The researchers calculated that for every 100 half-pint bottles consumed at school there was a 2.1% reduction in the risk of bowel cancer. Outside of school, there was a significantly reduced risk of bowel cancer with more than 20 dairy products a week compared with none to nine dairy products a week.
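The total-consumption categories above come from multiplying each participant's reported weekly intake by the number of weeks in the school year and their years at school. A minimal sketch of that arithmetic follows; the 40-week school year is an assumption for illustration, as the study's exact figure is not given here:

```python
def total_school_bottles(bottles_per_week: float, years_at_school: float,
                         weeks_per_year: int = 40) -> float:
    """Estimate lifetime school milk consumption in half-pint bottles.

    weeks_per_year defaults to an assumed 40-week school year.
    """
    return bottles_per_week * weeks_per_year * years_at_school

# e.g. 5 bottles a week over 8 school years -> 1,600 bottles,
# which would fall in the 1,600-1,799 category quoted above
print(total_school_bottles(5, 8))
```

Because small errors in the remembered weekly figure are multiplied by hundreds of school weeks, misclassification between the total-consumption bands is easy, which is one of the recall concerns raised below.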
The researchers conclude that their national case-control study ‘provides evidence that school milk consumption was associated with a reduction in the risk of adult colorectal cancer in New Zealand. Furthermore, a dose-dependent relation was evident’. This study has strengths in its relatively large size, its reliable and nationally representative identification of cases and controls, and its thorough data collection. However, the conclusion that school milk consumption is associated with a reduced risk of bowel cancer in adulthood must be interpreted in light of a number of considerations:
- The analysis took into account established risk factors for bowel cancer including age, sex, ethnicity and family history. However, many other potential confounders were not considered, including diet, physical activity, overweight and obesity, smoking and alcohol consumption. Diet in particular has been implicated in bowel cancer risk, with diets high in saturated fat, red meat and processed foods and low in fibre, fruit and vegetables thought to increase risk. Any of these lifestyle behaviours could be confounding the relationship between school milk consumption and bowel cancer, and regular childhood milk consumption could be a reflection of a ‘healthy’ diet and other healthy lifestyle behaviours that reduce the risk of cancer.
- The dose-response trend is not very clear. Compared with no bottles, five bottles per week were associated with a 32% significantly decreased risk and 10 or more bottles with a 61% significantly decreased risk, but there was no significant association with one to four or six to nine bottles. As only 16 cases and 31 controls drank 10 or more bottles a week, statistical comparisons between such small numbers should be viewed with caution.
- As with many food questionnaires, there is the potential for recall bias. Adults may have difficulty remembering how many bottles of school milk they drank many years before, their weekly estimates may be inaccurate, and their consumption may have varied from week to week and year to year. Because the researchers combined this response with the number of weeks in the school year and total years at school to give a total number of bottles consumed (figures in the 100s or 1,000s), participants could have been incorrectly categorised, making risk calculations by category of total milk bottles consumed less reliable.
- Cancer prevalence, and particularly environmental and lifestyle risk factors for cancer, can vary between countries, so these findings from New Zealand may not apply elsewhere. Notably, the researchers acknowledge that a cohort study in the UK found the opposite: increased childhood dairy consumption was associated with an increased risk of bowel cancer.
- Case-control studies are most appropriate for rare diseases, where only a small number of cases would be expected to develop among a large number of people. For a common disease such as bowel cancer, the slightly more reliable cohort design could also have been used, following children who did and did not drink milk at school over time to see whether they developed cancer. However, such a cohort would need extensive long-term follow-up.
The possible association between milk/dairy consumption or calcium intake, whether in childhood or in later years, and bowel cancer is worthy of further study. However, from this study alone, it cannot be concluded that school milk prevents bowel cancer later in life.
A new study has claimed that an ingredient of dark chocolate could help control severely high cholesterol levels, a major problem for people with diabetes. Previous research has suggested that chocolate containing a high level of cocoa solids rich in polyphenols may reduce the risk of heart disease, and this study, published in the journal Diabetic Medicine, saw a reduction in cholesterol in a small number of diabetics given polyphenol-enriched chocolate.
The researchers, from Hull University, examined 12 patients with type 2 diabetes who were given identical chocolate bars, some of which were enriched with polyphenols. The patients who consumed the enriched bars experienced a small improvement in their overall cholesterol profile, with a drop in total cholesterol while the level of good cholesterol increased.
Steve Atkin, who led the study, said “Chocolate with a high cocoa content should be included in the diet of individuals with type 2 diabetes as part of a sensible, balanced approach to diet and lifestyle.” However, the charity Diabetes UK warned that these findings may mislead people into eating too much chocolate, arguing that the high fat and sugar content probably outweighed any benefits.
Dr Iain Frame, director of research at Diabetes UK, commented “On no account should people take away the message from this study, conducted in only 12 people, that eating even a small amount of dark chocolate is going to help reduce their cholesterol levels.” He added “It would, however, be interesting to see if further research could find a way of testing whether polyphenols could be added to foods which weren’t high in sugar and saturated fat such as chocolate”.
The link between obesity and cardiac disease is not merely anecdotal; there is evidence for it. Now there is further evidence that even being overweight causes a clustering of risk factors for cardiovascular abnormalities. A recent publication in Heart Asia, a British Medical Journal title, showed that there is not much difference between the cardiovascular risk factors of obese and overweight people. “The clutch of risk factors – glucose intolerance, hypertension, high cholesterol – are all significantly higher among overweight and obese subjects than among normal subjects,” said Vijay Viswanathan of the MV Hospital for Diabetes and Prof. M. Viswanathan Diabetes Research Center. He co-authored the article with Shabana Tharkar, also from the Indian hospital.
The study, conducted among two groups – 2,021 subjects aged over 20 years, and 1,289 subjects aged 8-19 years – indicated that even among overweight, 'non-obese' people, the prevalence of major cardiovascular risk factors was not significantly different. Total diabetes prevalence among the obese population was 28.4 per cent, while among the overweight population it was 25 per cent. Similarly, for hypertension, the value for the obese group was 34.2 per cent, while for the overweight population it was 27.6 per cent. By contrast, the corresponding values in the normal-weight group were 16.2 per cent (diabetes) and 20.2 per cent (hypertension).
Similarly, the study showed higher values for triglycerides and unfavourable HDL cholesterol levels in both these groups.
Overweight was defined as a body mass index (BMI) of 25 kg/m2 or above, and obesity as a BMI of 30 kg/m2 or above. Also worrying is the increasing rate of overweight and obesity among both men and women from 1995 to 2008, across all age groups. Dr. Viswanathan attributed this to rapid urbanisation. “Obesity has already hit the Western world and it is time for Indians to wake up to the alarm bells,” according to the article. Results from previous studies show a lower risk of developing diabetes with just a five per cent initial reduction in weight, Dr. Viswanathan said.
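The cut-offs quoted above can be sketched as a simple classification function. This is an illustration of the study's two thresholds only; the function name is ours, and everything below the overweight cut-off is lumped together as "normal" here (the underweight category is omitted):

```python
def bmi_category(weight_kg: float, height_m: float) -> str:
    """Classify weight status using the study's cut-offs:
    overweight = BMI of 25 kg/m2 or above, obese = 30 kg/m2 or above.
    Anything below 25 is returned as "normal" in this sketch.
    """
    bmi = weight_kg / height_m ** 2
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal"

# e.g. 85 kg at 1.70 m gives a BMI of about 29.4
print(bmi_category(85, 1.70))  # -> overweight
```

These are the widely used WHO cut-offs; note that some Indian guidelines use lower thresholds for Asian populations, which the article does not address.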
The findings highlight the urgent need to frame direct and indirect strategies to control the rising levels of obesity in the population, in order to substantially reduce the country's non-communicable disease burden, he added. Regulating the diet, reducing intake of fast foods and high-calorie meals, and increasing regular physical activity and exercise would go a long way towards keeping weight under control, diabetologists advise.
Could the Mediterranean diet actually help prevent diabetes? The Mediterranean Diet, which is rich in vegetables, fruits, whole grains, healthy fats from nuts and olive oil, with moderate amounts of fish, low-fat dairy, and wine, and minimal red meat and processed meats, is considered to be an especially healthy eating plan.
Previous research conducted on newly diagnosed diabetic participants showed the diet did indeed help control blood sugar spikes. That study found that diabetics eating the Mediterranean diet had better glycemic control and less need for diabetes medications than those following a simple low-fat diet.
Recently, a team of researchers in Spain conducted a study using data from a large clinical trial to determine the effects of the Mediterranean Diet on preventing the onset of Type-2 diabetes. Participants were followed for an average of 4 years. By the end of the study, 54 participants had developed diabetes, but the split varied significantly among groups. The researchers found that the risk of developing diabetes was reduced by 52% among the two groups of people who followed the Mediterranean Diet plans compared with the low-fat control group. In analyzing diet adherence, the researchers further noted that the closer an individual followed the Med-Diet plan, the lower their risk of developing diabetes.
The participants were divided into one of three groups: adherence to the Med-Diet with 1 liter per week of extra virgin olive oil, adherence to the Med-Diet with 1 oz per day of mixed nuts, or a standard low-fat diet as a control. No calorie restrictions were imposed on any of the groups. The two Med-Diet groups were instructed to increase fruit and vegetable intake, decrease meat intake, stay away from refined sweets and unhealthy fats such as butter, and consume red wine in moderation, if desired. The control group was given general instructions to lower overall fat intake. Baseline measurements and annual follow-up involved an oral glucose tolerance test and interviews to assess diet adherence.
Interestingly, the weight of all participants did not change significantly throughout the study, nor did it vary significantly among the three groups.
This study reinforces prior study results suggesting that the Mediterranean Diet – even without weight loss – may be protective against Type-2 diabetes. The researchers suggest that future studies should focus on the Med-Diet’s effects on younger people, and point out the possible benefits of the Mediterranean Diet as an effective intervention against complications of Type-2 diabetes.
Dieters often try to avoid thinking about the foods they crave, but maybe that's the wrong approach. Imagining yourself biting into a luscious piece of chocolate cake – thinking about the way it smells, the creamy texture of frosting on your tongue – may make you eat less of it, a new study suggests. This finding challenges the age-old conventional wisdom that thinking about goodies increases our cravings and ultimately our consumption, according to a study from Carnegie Mellon.
Drawing on research showing that mental imagery and perception affect emotion and behavior, the research team – led by assistant professor of social and decision sciences Carey Morewedge – found that repeatedly imagining indulging in a treat decreases one's desire for it.
“These findings suggest that trying to suppress one's thoughts of desired foods in order to curb cravings for those foods is a fundamentally flawed strategy,” Morewedge said in a statement.
The researchers conducted five experiments in which 51 people were asked to imagine themselves doing a series of repetitive actions – including, in one experiment, eating different amounts of M&Ms. A control group imagined putting coins into a washing machine.
Subjects were then invited to eat their fill of M&Ms. Those who had imagined eating the most ultimately ate fewer candies than the others. Subsequent experiments confirmed the results.
The researchers say their results, which were published in the December 10 issue of Science, could have wide-ranging effects.
Says Morewedge: “We think these findings will help develop future interventions to reduce cravings for things such as unhealthy food, drugs and cigarettes, and hope they will help us learn how to help people make healthier food choices.”
Eating a Mediterranean-style diet may help people with type 2 diabetes keep their disease under control without drugs better than following a typical low-fat diet. A new study from Italy shows that people with type 2 diabetes who ate a Mediterranean diet rich in vegetables and whole grains, with at least 30% of daily calories from fat (mostly olive oil), were better able to manage their disease without diabetes medications than those who ate a low-fat diet with no more than 30% of calories from fat (and less than 10% from saturated fat choices).
After four years, researchers found that 44% of people on the Mediterranean diet ended up requiring diabetes medications to control their blood sugars compared with 70% of those who followed the low-fat diet.
It’s one of the longest-term studies of its kind, and researchers, including Katherine Esposito, MD, of the Second University of Naples, say the results “reinforce the message that benefits of lifestyle interventions should not be overlooked.”
Best Diet for Diabetes Control
In the study, researchers randomly assigned 215 overweight people recently diagnosed with type 2 diabetes who had never been treated with diabetes medications to either a Mediterranean-style diet or a low-fat diet.
The Mediterranean diet was rich in vegetables and whole grains and low in red meat, which was replaced with fish or poultry. Overall, the diet consisted of no more than 50% of daily calories from carbohydrates and no less than 30% of calories from fat.
The low-fat diet was based on American Heart Association guidelines and was rich in whole grains and limited in sweets with no more than 30% of calories from fat and 10% from saturated fats, such as animal fats.
After four years of follow-up, the Mediterranean diet group had better glycemic (blood sugar) control and were less likely to require diabetes medications to bring their blood sugar within healthy levels.
In addition, people in the Mediterranean diet group also experienced improvements in other heart disease risk factors. Interestingly, weight loss was relatively comparable between the two groups by the end of the trial, suggesting that attributes of the Mediterranean diet beyond weight loss affect blood sugar control.
SOURCES: Esposito, K. Annals of Internal Medicine, Sept. 1, 2009; vol 151: pp 306-315. News release, American College of Physicians.
Individuals with either type 1 or type 2 diabetes know that maintaining a nutritious diet is one of the most important things they can do to control their disease. The findings of a new study suggest that the services of a registered dietitian may help individuals accomplish this goal.
A team of investigators from the American Dietetic Association reviewed evidence from previous research and summarized their findings in a report published in the Journal of the American Dietetic Association.
In their write-up, researchers laid out a set of exhaustive dietary guidelines for individuals affected by diabetes. Researchers said that the services of registered dietitians may be key in helping individuals follow the guidelines, which could help them significantly improve their condition.
“The evidence is strong that medical nutrition therapy provided by registered dietitians is an effective and essential therapy in the management of diabetes. Registered Dietitians are uniquely skilled in this process,” said Marion Franz, who led the investigation.
The guidelines developed by the research team lay out 29 nutritional points that can help diabetics improve their blood sugar control.
Thirty years ago Maria de Sousa, then at the beginning of her career, noticed that lymphocytes were attracted to places with a surplus of iron. This, together with the fact that the vertebrate immune system (IS) is vastly more complex than those of its ancestors (and evolution rarely increases complexity, which is energetically costly, unless something is gained), led her to a revolutionary new idea: could this extra complexity be evolutionarily sound because it allowed the IS to perform some important new function, perhaps protecting the body against iron toxicity?
In fact iron, although an essential element for most life forms, can also be toxic to these same organisms when free (not attached to proteins). This means that in this form it needs to be “watched” and regulated around the clock. In vertebrates, this is done through hepcidin, a liver protein that “moves” iron between cells and plasma according to the body's needs (or potential dangers). The problem is that the hepcidin-producing liver cells have limited mobility, so a complementary, far-reaching iron control system was needed. Lymphocytes, with their unique capacity to move throughout the body, were the perfect candidates, and since 1978 de Sousa and her group have been chasing this idea.
Much of their work has been done on hemochromatosis – a disease in which problems in the absorption of iron through the digestive tract lead to too much iron in the organism and to its toxic accumulation in the organs.
From this work we now know that hemochromatosis patients also have a defective IS and, moreover, that their iron overload levels correlate with their lymphocyte deficiency – the fewer lymphocytes they have, the more severe the disease. Work in animal models with iron overload problems or, conversely, with lymphocyte deficiencies has again found links between excess iron in the body and a deficient IS, further supporting de Sousa's “immuno-iron idea”.
Meanwhile, human lymphocytes were shown to produce several proteins crucial for the regulation of iron levels: ferritin, which acts as the body's iron store (holding on to iron when there is too much in the body and releasing it when there is a deficiency), and ferroportin, which is the cells' iron “exit door” (again releasing or retaining iron as necessary). The fact that lymphocytes had both proteins gave them the potential to be a “mobile” and easily “mobilizable” iron-storage compartment – characteristics perfect for an important role in iron homeostasis.
But hepcidin, the central piece of iron regulation, is also known to be an important player in the immune response, raising the possibility that it could hold the clue to this problem. In fact, during infection hepcidin shuts down the “door” through which iron leaves the cell (ferroportin), reducing iron availability in the plasma and thus helping to control infection, as bacteria need iron to divide. Several studies have now shown that hepcidin is produced by a variety of cells involved in the immune response. Finally, last year, a study suggested for the first time that lymphocytes were also capable of producing the protein, suggesting that hepcidin could actually be the “missing link” of de Sousa's theory.
To test this hypothesis, Jorge Pinto, Maria de Sousa and colleagues at the Institute for Molecular and Cell Biology (IBMC) of Porto University looked at hepcidin production in human lymphocytes under toxic iron concentrations and under immune activation, since de Sousa's theory proposed that lymphocytes could play a role in both situations. They found not only that hepcidin was produced by all classes of lymphocytes, but also that its production increased both in the presence of high quantities of iron and when lymphocytes were activated, backing de Sousa's proposals.
Pinto explains: “We show, for the first time, that lymphocytes can “feel” the toxic levels of iron in circulation and respond by increasing their own capacity to retain it within, restoring “normality”. The same mechanism is seen being used in situations of (iron) demand, such as when the cells are activated by the occurrence of an infection and need to divide.”
They also found something else totally unexpected – that hepcidin was involved in this second mechanism, suggesting an even closer dependence between the two systems than de Sousa had thought.
To Hal Drakesmith, a researcher at the University of Oxford working on the possibility of manipulating iron transport as a way to combat infections such as HIV, malaria and hepatitis C, these results raise particularly interesting questions. As he explains: “This seems to suggest that control of iron metabolism may be an integral component of lymphocyte immunity. Withholding iron from pathogens is an accepted part of our defence against infection, but a role for lymphocytes in controlling iron transport has not been proposed before.”
“Crucially,” says Pinto, “we still believe that the main regulator of systemic iron levels is the liver, but not only are lymphocytes (and not liver cells) able to sense toxic forms of iron, they are also able to travel and be activated in the specific places where pathogens accumulate, helping to control infection.”
These results are a major step towards understanding the link between the IS and iron and, if confirmed in live organisms – all this work was done on human cells in the laboratory – could be the beginning of a totally different view of what the immune system is and how to approach immunologic problems.
As Hal Drakesmith says, “the paper describes several new findings which are highly likely to be of interest and importance to the iron and immunity fields of research”. A simple example is the anaemia that usually accompanies chronic inflammatory diseases and that so far cannot be clearly explained. Pinto and de Sousa's results suggest that the chronic lymphocyte activation so characteristic of these diseases, by leading to hepcidin production, could be part of the phenomenon, as iron is an integral part of red blood cells.
Pinto, de Sousa and colleagues now plan to go back to those diseases of iron overload associated with immune abnormalities and see whether hepcidin proves to be, in fact, the connection between them. Another possibility is the construction of mice without the hepcidin gene in the bone marrow – where lymphocytes develop – to analyse the changes this would bring to both iron homeostasis and the immune response.
Whatever happens, this is a strikingly interesting story with decades of persistence and belief behind it, and one which, I am sure, still has much to tell us.
For this study, Chenchen Wang, M.D., M.Sc., and colleagues recruited 40 patients from the greater Boston area with confirmed knee OA who were in otherwise good health. The mean age of participants was 65 years, with a mean body mass index of 30.0 kg/m2. Patients were randomly assigned: 20 were asked to participate in 60-minute Yang-style Tai Chi sessions twice weekly for 12 weeks. Each session included a 10-minute self-massage and review of Tai Chi principles; 30 minutes of Tai Chi movement; 10 minutes of breathing technique; and 10 minutes of relaxation.
The remaining 20 participants assigned to the control group attended two 60-minute class sessions per week for 12 weeks. Each control session included 40 minutes of instruction covering OA as a disease, diet and nutrition, therapies to treat OA, or physical and mental health education. The final 20 minutes consisted of stretching exercises involving the upper body, trunk, and lower body, with each stretch being held for 10-15 seconds.
At the end of the 12-week period, patients practicing Tai Chi exhibited a significant decrease in knee pain compared with those in the control group. Using the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain scale, researchers noted a between-group difference of -118.80 points in the reduction in pain from baseline, favouring the Tai Chi group. Researchers also observed improvements in physical function, self-efficacy, depression and health status for knee OA in subjects in the Tai Chi group. “Our observations emphasize a need to further evaluate the biologic mechanisms and approaches of Tai Chi to extend its benefits to a broader population,” concluded Dr. Wang.
To this end, the following emotional variables were specified: those relating to emotional experience (the frequency of positive and negative emotions, anxiety, low self-esteem, and the influence of diet, weight and body shape on the emotional state); negative perception of emotions; a negative attitude to emotional expression; alexithymia (the inability to identify one's own emotions and to express them verbally); and the manner of controlling negative emotions.
Moreover, another variable was also taken into account: the need for control. This variable is not strictly emotional, but it has a clear emotional component, given that people with a high need for control experience anxiety and distress when they perceive a lack of control.
Study of women
A total of 433 women took part in the study; 143 of them suffered from some kind of eating disorder and 145 were at risk of developing one. The results show that, in general, the majority of the variables put forward can be used as predictors of suffering an eating disorder. The variables that signal the greatest risk of developing an eating disorder are an emotional state excessively influenced by diet, weight and body shape; low self-esteem; and a tendency, in anxiety situations, not to express emotions and to act impulsively.
These results have important implications, above all when drawing up prevention programmes for eating disorders. With the data obtained, it can be said that many of the emotional variables dealt with in Ms Pascual's work should be taken into account when drawing up these prevention programmes.
Eating disorders are very serious illnesses with dire consequences for the sufferer, physically as well as psychologically and socially, and they are becoming ever more widespread. Much research has been undertaken to identify the factors involved in their development, but the role played by the various emotional variables at the onset of these disorders has hardly been investigated. A thesis presented at the UPV/EHU has examined this matter in depth.