The study was carried out by researchers from the University of Otago Medical School, New Zealand. Funding was provided by the Genesis Oncology Trust, the Dean’s Bequest Funds of the Dunedin School of Medicine, the Gisborne East Coast Cancer Research Trust and the Director’s Cancer Research Trust. The research was published in the peer-reviewed American Journal of Epidemiology. This was a case-control study in New Zealand that compared a group of adults with bowel cancer with a group without bowel cancer, and looked at whether they had drunk milk at school. School milk was freely available in most schools in New Zealand until 1967, when the government programme was stopped. Many schools in the Southland region had stopped free milk as long ago as 1950.
Case-control studies are appropriate for looking at whether people with and without a disease have had a particular exposure (milk, in this case). The difficulty is in accounting for all potential confounding factors, particularly other health and lifestyle factors, which could be related to both diet and bowel cancer risk. For example, regular childhood milk consumption could be a reflection of a ‘healthy’ diet and other healthy lifestyle behaviours that may reduce the risk of cancer. In addition, when examining such a specific dietary factor – i.e. milk consumed at school – it is difficult to account for all possible milk or other dairy products consumed outside of school.
In this case-control study, 562 adults (aged 30 to 69) with newly diagnosed bowel cancer were identified from the New Zealand Cancer Registry in 2007. For a control group, 571 age-matched adults without cancer were randomly selected from the electoral register. All participants were mailed a questionnaire that asked about any previous illness, use of aspirin or dietary supplements in childhood, participation in school milk programmes, other childhood milk consumption, childhood diet (including other milk and dairy), smoking, alcohol consumption prior to 25 years of age, screening tests for bowel cancer, family history of cancer, education and sociodemographic characteristics. Childhood weight and height were not asked about. For school milk consumption, participants were specifically asked:
- Whether they drank school milk
- How many half-pint bottles they drank a week
- What age they first drank school milk
- When they stopped drinking school milk
Statistical risk associations between school milk participation and cancer were calculated. The calculations took into account several risk factors for bowel cancer risk including age, sex, ethnicity and family history.
What were the basic results?
Data on school milk consumption was available for 552 cases and 569 controls. As expected, people who started school before 1967 were more likely to have had free school milk than those who began school after 1968. Seventy-eight percent of cases participated in the school milk programme compared with 82% of controls. School milk consumption was associated with a 30% reduced risk of developing bowel cancer (odds ratio 0.70, 95% confidence interval 0.51 to 0.96).
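For readers unfamiliar with case-control statistics, the participation percentages above can be turned into a rough, unadjusted odds ratio with simple arithmetic. The sketch below is illustrative only: the counts are back-calculated from the reported percentages, and the published odds ratio of 0.70 is adjusted for confounders (age, sex, ethnicity, family history), so it differs from this crude figure.

```python
# Illustrative back-calculation of an approximate crude odds ratio
# from the reported participation rates: 78% of 552 cases and
# 82% of 569 controls drank school milk. The published OR (0.70)
# is confounder-adjusted, so it will not match this crude value.

cases_total, controls_total = 552, 569
cases_exposed = round(0.78 * cases_total)        # ~431 cases drank school milk
controls_exposed = round(0.82 * controls_total)  # ~467 controls drank school milk

# Odds of exposure among cases divided by odds of exposure among controls
odds_cases = cases_exposed / (cases_total - cases_exposed)
odds_controls = controls_exposed / (controls_total - controls_exposed)
crude_or = odds_cases / odds_controls

print(f"Approximate crude odds ratio: {crude_or:.2f}")
```

An odds ratio below 1 indicates the exposure was less common among cases than controls, consistent with the reported protective association.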
When looking at the effect of the number of bottles consumed per week, they found that, compared with no bottles, five bottles per week were associated with a significant 32% decrease in risk, and 10 or more bottles with a significant 61% decrease. However, there was no significant association with one to four bottles or six to nine bottles. The researchers found a similar trend when total school milk consumption was compared with no consumption: 1,200-1,599 bottles were associated with a significant 38% decrease in risk; 1,600-1,799 bottles with a 43% decrease; and 1,800 or more bottles with a significant 38% decrease. There was no significant association with fewer than 1,200 bottles. The researchers calculated that for every 100 half-pint bottles consumed at school there was a 2.1% reduction in the risk of bowel cancer. Outside of school, consuming more than 20 dairy products a week was associated with a significantly reduced risk of bowel cancer compared with none to nine dairy products a week.
The researchers conclude that their national case-control study ‘provides evidence that school milk consumption was associated with a reduction in the risk of adult colorectal cancer in New Zealand. Furthermore, a dose-dependent relation was evident’. This study has strengths in its relatively large size, its reliable and nationally representative identification of cases and controls, and its thorough data collection. However, the conclusion that school milk consumption is associated with a reduced risk of bowel cancer in adulthood must be interpreted in light of a number of considerations:
- The analysis took into account established risk factors for bowel cancer, including age, sex, ethnicity and family history. However, many other potential confounders were not considered, including diet, physical activity, overweight and obesity, smoking and alcohol consumption. Diet in particular has been implicated in bowel cancer risk, with diets high in saturated fat, red meat and processed foods and low in fibre, fruit and vegetables thought to increase risk. Any of these lifestyle behaviours could be confounding the relationship between school milk consumption and bowel cancer; regular childhood milk consumption could be a reflection of a ‘healthy’ diet and other healthy lifestyle behaviours that reduce the risk of cancer.
- When looking at the effect of the number of bottles consumed per week, the researchers found that, compared with no bottles, five bottles were associated with a significant 32% decrease in risk and 10 or more bottles with a significant 61% decrease. However, there was no significant association with one to four bottles or six to nine bottles, so the trend here is not very clear. Furthermore, only 16 cases and 31 controls drank 10 or more bottles a week, and statistical comparisons between such small numbers should be viewed with caution.
- As with many food questionnaires, there is the potential for recall bias. Adults may have difficulty remembering how many bottles of school milk they drank many years before, and their estimates of average weekly consumption could easily be inaccurate, or their actual consumption may have varied from week to week and year to year. Because the researchers combined this response with the number of weeks in the school year and total years at school to give a total number of bottles consumed (figures in the 100s or 1,000s), participants may have been incorrectly categorised, making the risk calculations by category of total bottles consumed less reliable.
- Cancer prevalence, and particularly environmental and lifestyle risk factors for cancer, can vary between countries, so these New Zealand findings may not be replicated elsewhere. Of note, the researchers acknowledge that a cohort study in the UK found the opposite: increased childhood dairy consumption was associated with an increased risk of bowel cancer.
- Case-control studies are most appropriate for rare diseases, where only a small number of cases would be expected to develop among a large number of people. For a common disease such as bowel cancer, the slightly more reliable cohort design could also have been used, following children who did and did not drink milk at school over time to see whether they developed cancer. However, such a cohort would need extensive long-term follow-up.
The possible association between milk/dairy consumption or calcium intake, in childhood or in later years, and bowel cancer is worthy of further study. However, from this study alone, it cannot be concluded that school milk prevents bowel cancer later in life.
Eating more fruits and vegetables may not protect children from developing allergies, according to a large Swedish study that questions earlier hints of benefit. Fruits and vegetables are rich in antioxidants, which are thought to reduce airway inflammation. So recent studies reporting less asthma, wheezing and hay fever among children who consumed more produce appeared to make sense.
But not all research has found that link, and the studies that did may have had a surprising flaw, said Helen Rosenlund of Karolinska Institutet in Stockholm, who led the new study. She said some proteins in fruits like apples and pears resemble the pollen proteins that trigger hay fever, meaning that kids might react to both. In other words, existing allergies may have caused children to avoid those fruits, rather than the other way around. “This could confuse research findings,” explained Rosenlund, “falsely suggesting that diets with fewer fruits and vegetables result in more allergic disease.”
To find out if this was the case, Rosenlund and her colleagues looked at data on nearly 2,500 eight-year-olds who had participated since birth in a larger Swedish study. Based on blood tests and questionnaires filled out by parents, the researchers found that seven percent of the children had asthma. The rates of hay fever and skin rashes were more than twice as high. The average child ate between one and two servings of fruit, and between two and three servings of vegetables each day.
At first glance, some produce did seem helpful: Kids with the biggest appetite for fruit had less than two-thirds the odds of developing hay fever compared with those who ate the least. Apples, pears and carrots appeared to be particularly helpful, the researchers report in the Journal of Allergy and Clinical Immunology, but there was no such link for vegetables overall. However, it turned out that half the children with hay fever were sensitive to birch tree pollen, one of the pollens known to resemble the proteins in apples and carrots. And sure enough, after the team repeated their analysis excluding the 122 kids with food-related allergy symptoms, the hay fever link disappeared as well. “Fruits do not seem to offer protection against allergic disease if diet modifications are considered,” said Rosenlund.
The researchers say more studies are needed, particularly in other parts of the world that may have a different variety of allergy triggers, or allergens. And they advise those studies should not forget to look at how allergies might influence what participants eat. “Studying diet it is not so easy when it comes to the relation with allergic disease,” Rosenlund said, “because it is such a complex disease pattern.”
SOURCE: bit.ly/g3DpI7 The Journal of Allergy and Clinical Immunology, online January 10, 2011.
In research described as “a stark warning” to those tempted to start smoking, scientists are reporting that cigarette smoke begins to cause genetic damage within minutes — not years — after inhalation into the lungs. Their report, the first human study to detail the way certain substances in tobacco cause DNA damage linked to cancer, appears in Chemical Research in Toxicology, one of 38 peer-reviewed scientific journals published by the American Chemical Society.
Stephen S. Hecht, Ph.D., and colleagues point out in the report that lung cancer claims a global toll of 3,000 lives each day, largely as a result of cigarette smoking. Smoking also is linked to at least 18 other types of cancer. Evidence indicates that harmful substances in tobacco smoke termed polycyclic aromatic hydrocarbons, or PAHs, are one of the culprits in causing lung cancer. Until now, however, scientists had not detailed the specific way in which the PAHs in cigarette smoke cause DNA damage in humans.
The scientists added a labeled PAH, phenanthrene, to cigarettes and tracked its fate in 12 volunteers who smoked the cigarettes. They found that phenanthrene quickly forms a toxic substance in the blood known to trash DNA, causing mutations that can cause cancer. The smokers developed maximum levels of the substance in a time frame that surprised even the researchers: Just 15-30 minutes after the volunteers finished smoking. Researchers said the effect is so fast that it’s equivalent to injecting the substance directly into the bloodstream.
“This study is unique,” writes Hecht, an internationally recognized expert on cancer-causing substances found in cigarette smoke and smokeless tobacco. “It is the first to investigate human metabolism of a PAH specifically delivered by inhalation in cigarette smoke, without interference by other sources of exposure such as air pollution or the diet. The results reported here should serve as a stark warning to those who are considering starting to smoke cigarettes,” the article notes. The authors acknowledged funding from the National Cancer Institute.
Prostate cancer is the second most common cause of cancer-related deaths in men. Previous cell and animal research suggests that genistein, a potent soy isoflavone, may prevent the spread of prostate cancer. Now research reports that a genistein-derived drug may help prevent the disease from spreading in men with prostate cancer.
The study, presented at the Ninth Annual American Association for Cancer Research Frontiers in Cancer Prevention Research Conference, investigated the effect of the genistein-drug in men with localized prostate cancer. Researchers at the Robert H. Lurie Comprehensive Cancer Center of Northwestern University administered the genistein-drug once daily to 38 men with localized prostate cancer one month before prostate surgery.
The participants’ prostate cancer cells were analyzed after surgery. The researchers found an increased expression of genes that stop cancer cell spread (metastasis). Furthermore, there was a decreased expression of genes that enhance metastasis.
“The first step is to see if the drug has the effect that you want on the cells and the prostate, and the answer is ‘yes, it does,'” says lead researcher Raymond Bergan, MD, professor of hematology and oncology at Northwestern University Feinberg School of Medicine, in a news release. “All therapies designed to stop cancer cell movement that have been tested to date in humans have basically failed because they have been ineffective or toxic. If this drug can effectively stop prostate cancer from moving in the body, theoretically, a similar therapy could have the same effect on the cells of other cancers.”
Choosing to eat tomatoes not only reduces a man’s risk of developing prostate cancer, but also shrinks the existing tumors, claims a new Italian study. Researchers theorize that the secret may lie in lycopene, the powerful anti-oxidant that makes tomatoes red. Lycopene helps neutralize harmful free radicals that are implicated in various kinds of cancer, cardiovascular problems, macular degeneration and other age-related illnesses. However, the benefit was strongest for prostate cancer.
In a bid to assess the prostate cancer-fighting properties of tomatoes, researchers at the University of Naples conducted an experiment on rodents. For the purpose of the study, they fed laboratory rats implanted with prostate cancer cells either a normal diet or one containing 10 percent tomato powder. The tomato powder was made from whole tomatoes so that the effects of eating the entire vegetable, rather than an isolated nutritional supplement, could be assessed. The investigators noted that the animals fed tomato powder exhibited slower progression of the disease and lower rates of prostate cancer. In contrast, those fed a normal diet displayed no such benefits.
Joanna Owens, from Cancer Research UK, disagreed, stating: “This study doesn’t provide enough evidence that tomatoes can reduce the risk of prostate cancer or prevent progression of the disease in humans. Other risks such as age, family history and ethnicity are likely to play a much greater role than diet alone.” The study has been published in the journal Cancer Prevention Research.
Consumption of B vitamins during pregnancy does not increase the risk of allergy in infants, says a new study from Japan that challenges previous findings. Maternal consumption of folate and vitamins B2, B6, and B12 during pregnancy was not associated with the risk of the infant developing asthma or eczema, according to findings from 763 infants published in Pediatric Allergy and Immunology.
The link between folate and folic acid, the synthetic form of the vitamin, and respiratory health is not clear cut, with contradictory results reported in the literature. A study from Johns Hopkins Children’s Center found that higher levels of folate were associated with a 16 per cent reduction in asthma in children (Journal of Allergy & Clinical Immunology, June 2009, Vol. 123, pp. 1253-1259.e2). However, a Norwegian study reported that folic acid supplements during the first trimester were associated with a 6 per cent increase in wheezing, a 9 per cent increase in infections of the lower respiratory tract, and a 24 per cent increase in hospitalisations for such infections (Archives of Diseases in Childhood, doi:10.1136/adc.2008.142448). In addition, researchers from the University of Adelaide in Australia reported that folic acid supplements in late pregnancy may increase the risk of asthma by about 25 per cent in children aged between 3 and 5 years (American Journal of Epidemiology, 2010, doi:10.1093/aje/kwp315).
Illumination from the Land of the Rising Sun?
The new study, performed by researchers from Fukuoka University, the University of Tokyo, and Osaka City University, goes beyond folate and folic acid, and reports no link between B vitamin intake and the risk of asthma or eczema in children. “To the best of our knowledge, there has been no birth cohort study on the relationship between maternal consumption of Vitamin B during pregnancy and the risk of allergic disorders in the offspring,” wrote the researchers. The findings were based on data from 763 Japanese mother-child pairs. A diet history questionnaire was used to assess maternal intakes of the various B vitamins during pregnancy, and the infants were followed until they were 16 to 24 months old. Japan has no mandatory fortification of flour with folic acid.
Results showed that, according to criteria from the International Study of Asthma and Allergies in Childhood, 22 and 19 percent of the children had symptoms of wheeze and eczema, respectively, but there was no association between these symptoms and the dietary intakes of the various B vitamins by their mothers. “Our results suggest that maternal intake of folate, vitamin B12, vitamin B6, and vitamin B2 during pregnancy was not measurably associated with the risk of wheeze or eczema in the offspring,” said the researchers. “Further investigation is warranted to draw conclusions as to the question of whether maternal Vitamin B intake during pregnancy is related to the risk of childhood allergic disorders,” they concluded.
According to the European Federation of Allergy and Airway Diseases Patients Association (EFA), over 30m Europeans suffer from asthma, costing Europe €17.7bn every year. The cost due to lost productivity is estimated to be around €9.8bn. The condition is on the rise in the Western world and is the most common long-term condition in the UK today. According to the American Lung Association, almost 20m Americans suffer from asthma. The condition is reported to be responsible for over 14m lost school days in children, while the annual economic cost of asthma is said to be over $16.1bn.
Source: Pediatric Allergy and Immunology. Volume 22, Issue 1-Part-I, February 2011, Pages: 69–74 DOI: 10.1111/j.1399-3038.2010.01081.x
“Maternal B vitamin intake during pregnancy and wheeze and eczema in Japanese infants aged 16–24 months: The Osaka Maternal and Child Health Study”. Authors: Y. Miyake, S. Sasaki, K. Tanaka, Y. Hirota
OTHERWISE healthy teenage girls who diet regularly show worrying signs of malnutrition, Sydney researchers have found. The largest study of its kind shows pressure to be thin could be causing teenage girls serious harm, potentially preventing them from developing properly. The study of 480 girls, aged between 14 and 17, attending school in Sydney’s northern suburbs and on the central coast, found those who dieted often showed subtle but chronic signs of undernourishment compared to those who occasionally, or never, dieted. The girls were deficient in a number of nutrients and biochemicals, including calcium and protein, as well as haemoglobin, which is vital for transporting oxygen in the blood.
The study leader, Dr Ross Grant, said the teenagers were not getting the nutrients they needed to build their bodies. “When you get through your adolescent years you should be the healthiest you are ever going to be, and these girls are not giving themselves the best chance to be healthy,” he said. Many students in the study were dieting even though, on average, they were not overweight. “These are pretty much your average girls on the north shore. They are going to school and they are not unwell in any other way,” Dr Grant said. The low levels of calcium were particularly worrying, he said. “Calcium is used as a signalling molecule for every cell in the body. If you are not getting enough calcium in your diet then your body starts to get it from wherever it can, which is the bones.”
Most researchers believe the amount of calcium consumed in a person’s teenage years sets the basic level available for the rest of their life. Media messages presenting excessively thin women as having an ideal body shape, or public health campaigns making girls overly aware of not consuming too many calories, could be to blame for dieting, said Dr Grant, who is the head of the Australasian Research Institute at the Sydney Adventist Hospital.
Christine Morgan, the chief executive of the Butterfly Foundation, an eating disorders advocacy group, said she was horrified, but not surprised, by the findings. “Diets, by their very nature, are telling you to disregard your physiological appetite,” she said. “These homespun diets result in us not putting the nutrients we need into our bodies.” Disordered eating – irregular eating behaviours that do not fall into the category of an eating disorder – had more than doubled in the past 10 years. “It has become the norm,” Ms Morgan said.
A new study shows following a Mediterranean-style diet rich in vegetables, olive oil, and fish may keep the mind sharp and slow age-related cognitive decline. The diet typified by the Italians, Greeks, and other Mediterranean cultures has already been shown to reduce the risk of heart disease, diabetes, and some types of cancer. But this and other studies are now suggesting that the diet may also have healthy benefits for the mind.
The Mediterranean diet emphasizes fruits and vegetables, fish, legumes, non-refined cereals, olive oil, and moderate wine consumption, usually at meals. Researchers found older adults who followed the diet more closely had slower rates of age-related cognitive decline than those who didn't, even after adjusting for other factors such as educational level. “The more we can incorporate vegetables, olive oil, and fish into our diets and moderate wine consumption, the better for our aging brains and bodies,” says Christy Tangney, PhD, associate professor of clinical nutrition at Rush University, in a news release.
In the study, published in the American Journal of Clinical Nutrition, researchers analyzed information gathered by the ongoing Chicago Health and Aging Project, which follows 3,759 adults over the age of 65 living on the South Side of Chicago. Every three years, the participants took tests of memory and basic math skills and filled out a questionnaire on how often they eat 139 different foods. The study follow-up time was 7.6 years on average.
Researchers looked at how closely the participants followed a Mediterranean diet and then compared it to their scores on age-related cognitive decline. Out of a maximum score of 55 for total adherence to a Mediterranean diet, the average score was 28. The results showed those with higher than average scores had a slower rate of age-related mental decline than those with lower scores. Researchers also looked at how closely the participants followed the Healthy Eating Index-2005, which is based on the 2005 Dietary Guidelines for Americans. They found no relationship between adherence to this type of diet and the rate of age-related cognitive decline.
Teens whose diets include lots of sugary drinks and foods show physical signs that they are at increased risk for heart disease as adults, researchers from Emory University report. Among 2,157 teens who took part in the National Health and Nutrition Examination Survey, the average amount of added sugar eaten in a day was 119 grams (476 calories), which was 21.4 percent of all the calories these teens consumed daily, the researchers noted. “We need to be aware of sugar consumption,” said lead researcher and postdoctoral fellow Jean Welsh. “It's a significant contributor of calories to our diet and there are these associations that may prove to be very negative,” she said. “Sugar-sweetened soft drinks and sodas are the major contributor of added sugar and are a major source of calories without other important nutrients.” Awareness of the negative effects of added sugar may help people, particularly teens, cut down on the amount of sugar they consume, Welsh added. “Parents and adolescents need to become aware of the amount of added sugar they are consuming and be aware that there may be some negative health implications if not now, then down the line,” she said.
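The calorie figures quoted above can be checked with simple arithmetic, using the standard energy value of 4 kcal per gram of sugar. The sketch below is a rough consistency check only; the implied total daily intake is a back-calculation from the reported percentage, not a figure from the study.

```python
# Checking the reported figures: 119 g of added sugar per day at
# 4 kcal per gram, stated to be 21.4% of total daily calories.

grams_added_sugar = 119
kcal_per_gram = 4  # standard energy value of sugar (carbohydrate)
sugar_kcal = grams_added_sugar * kcal_per_gram  # 476 kcal, matching the report

share_of_diet = 0.214
# Back-calculated total daily intake implied by the two reported numbers
implied_total_kcal = sugar_kcal / share_of_diet

print(f"Added sugar: {sugar_kcal} kcal")
print(f"Implied total daily intake: {implied_total_kcal:.0f} kcal")
```

The implied total of roughly 2,200 kcal per day is plausible for adolescents, which suggests the reported figures are internally consistent.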
The report is published in the Jan. 10 online edition of Circulation.
Welsh's team found that teens who consumed the most added sugar had 9 percent higher LDL (“bad”) cholesterol levels, and 10 percent higher triglyceride levels (another type of blood fat), compared with those who consumed the least added sugar. Teens who took in the highest amount of added sugar also had lower levels of HDL (“good”) cholesterol than those who consumed the least amount of added sugar. In addition, teens who consumed the highest amount of added sugar showed signs of insulin resistance, which can lead to diabetes and its associated risk of heart disease, the researchers found.
The American Heart Association has recommended an upper limit for added sugars intake, based on the number of calories you need. “Most American women [teens included] should consume no more than 100 calories of added sugars per day; most men, no more than 150 calories,” the association states.
One caveat to these findings is that because of the way the study was done it is not clear if added sugars caused the differing cholesterol levels, only that they are linked. In addition, the data are only for one day and may not reflect the teen's usual diet, the researchers noted. Commenting on the study, Dr. David L. Katz, director of the Prevention Research Center at Yale University School of Medicine, said that “this study does not prove that dietary sugar is a cardiac risk factor in this population, but it strongly suggests it.”
The paper has three important messages, he said. First, dietary sugar intake in a representative population of teenagers is nearly double the recommended level. Second, the higher the intake of sugar, the greater the signs of cardiac risk, including elevated LDL (“bad”) cholesterol and low HDL (“good”) cholesterol. Third, the apparent harms of excess sugar are greater in overweight than in lean adolescents.
“Sugar is by no means the sole dietary threat to the health of adolescents, or adults,” Katz said. “But we now have evidence it certainly counts among the important threats to both. Reducing sugar intake by adolescents, to prevent them becoming adults with diabetes or heart disease, is a legitimate priority in public health nutrition,” he said.
SOURCES: HealthDay; Jean Welsh, M.P.H., Ph.D., R.N., postdoctoral fellow, Emory University, Atlanta; David L. Katz, M.D., M.P.H., director, Prevention Research Center, Yale University School of Medicine, New Haven, Conn.; Jan. 10, 2011, Circulation, online
Women suffering from diabetes plus depression have a greater risk of dying, especially from heart disease, a new study suggests. In fact, women with both conditions have a twofold increased risk of death, researchers say. “People with both conditions are at very high risk of death,” said lead researcher Dr. Frank B. Hu, a professor of medicine at Harvard Medical School. “Those are double whammies.” When people are afflicted by both diseases, these conditions can lead to a “vicious cycle,” Hu said. “People with diabetes are more likely to be depressed, because they are under long-term psychosocial stress, which is associated with diabetes complications.”
People with diabetes plus depression are less likely to take care of themselves and effectively manage their diabetes, he added. “That can lead to complications, which increase the risk of mortality.” Hu stressed that it is important to manage both the diabetes and the depression to lower the mortality risk. “It is possible that these two conditions not only influence each other biologically, but also behaviorally,” he said.
Type 2 diabetes plus depression are often related to unhealthy lifestyles, including smoking, poor diet and lack of exercise, according to the researchers. In addition, depression may trigger changes in the nervous system that adversely affect the heart, they said.
The report is published in the January issue of the Archives of General Psychiatry.
Commenting on the study, Dr. Luigi Meneghini, an associate professor of clinical medicine and director of the Eleanor and Joseph Kosow Diabetes Treatment Center at the Diabetes Research Institute of the University of Miami Miller School of Medicine, said the findings were not surprising.
“The study highlights that there is a clear increase in risk to your health and to your life when you have a combination of diabetes and depression,” he said. Meneghini noted there are many diabetics with undiagnosed depression. “I am willing to bet that there are quite a number of patients with diabetes and depression walking around without a clear diagnosis.” Patients and doctors need to be more aware that depression is an issue, Meneghini added.
For the study, Hu’s team collected data on 78,282 women who were aged 54 to 79 in 2000 and who were participants in the Nurses’ Health Study. Over six years of follow-up, 4,654 women died, including 979 who died of cardiovascular disease, the investigators found. Women who had diabetes had about a 35 percent increased risk of dying, and those with depression had about a 44 percent increased risk, compared with women with neither condition, the researchers calculated. Those with both conditions had about twice the risk of dying, the study authors found.
When Hu’s team looked only at deaths from heart disease, they found that women with diabetes had a 67 percent increased risk of dying and those with depression had a 37 percent increased risk of death. But women who had both diabetes and depression had a 2.7-fold increased risk of dying from heart disease, the researchers noted.
In the United States, some 15 million people suffer from depression and 23.5 million have diabetes, the researchers say. Up to one-fourth of people with diabetes also experience depression, which is nearly twice as many as among people who don’t have diabetes, they added. “The combination of diabetes and depression needs to be addressed,” Meneghini concluded. He added that patients need to tell their doctors if they are feeling depressed, and doctors also need to be on the lookout for signs of depression in their diabetic patients.
SOURCES: HealthDay News; Frank B. Hu, M.D., Ph.D., professor, medicine, Harvard Medical School, Boston; Luigi Meneghini, M.D., associate professor, clinical medicine and director, Eleanor and Joseph Kosow Diabetes Treatment Center, Diabetes Research Institute, University of Miami Miller School of Medicine; January 2011, Archives of General Psychiatry