Higher intakes of the B vitamins thiamine and riboflavin from the diet may reduce the incidence of premenstrual syndrome (PMS) by about 35 percent, suggest new findings. According to a new paper published in the American Journal of Clinical Nutrition, the link between B vitamins and PMS is biologically plausible since B vitamins such as thiamine and riboflavin are known to play important roles in the synthesis of various neurotransmitters involved in PMS.
While most women experience mild emotional or physical premenstrual symptoms, as many as 8 to 20 percent of women experience symptoms severe enough to meet the definition of premenstrual syndrome, which can substantially interfere with daily activities and relationships. The new study, performed by researchers from the University of Massachusetts, Harvard, and the University of Iowa, indicates that increased intakes of certain B vitamins from food sources may help reduce the incidence of PMS.
Using data from 1,057 women with PMS and 1,968 women without PMS participating in the Nurses' Health Study II cohort, the researchers found that women with the highest average intakes of riboflavin two to four years prior to diagnosis had a 35 percent lower incidence of PMS than women with the lowest average intakes. On the other hand, the researchers did not observe any benefits with other B vitamins, including niacin, folate, B6, and B12. In addition, supplemental intakes of these vitamins were not linked to PMS incidence, they added. “We observed a significantly lower risk of PMS in women with high intakes of thiamine and riboflavin from food sources only,” wrote the researchers. “Further research is needed to evaluate the effects of B vitamins in the development of premenstrual syndrome.”
Beyond the B vitamins, there is also some evidence for the potential of a diet rich in calcium and vitamin D to lower the risk of developing PMS, a condition that affects up to a fifth of all women. According to a study published in 2005 in the Archives of Internal Medicine (Vol. 165, pp. 1246-1252), researchers from the University of Massachusetts and GlaxoSmithKline reported for the first time that calcium and vitamin D may help prevent the initial development of PMS.
Source: The American Journal of Clinical Nutrition. Published online ahead of print, doi: 10.3945/ajcn.110.009530 “Dietary B vitamin intake and incident premenstrual syndrome” Authors: P.O. Chocano-Bedoya, J.E. Manson, S.E. Hankinson, W.C. Willett, S.R. Johnson, L. Chasan-Taber, A.G. Ronnenberg, C. Bigelow, E.R. Bertone-Johnson
Children with food allergies are found to suffer from anxiety and increased loneliness: one allergic child in five never attends peers’ parties, while one in four always brings along “safe” food. The burden of food allergies, and the risk that they can escalate into life-threatening reactions, is particularly heavy on children, whose normally active and sociable lifestyle can be severely limited and frustrated by the effort to keep them away from potentially dangerous food.
According to a study presented at the 2011 Food Allergy and Anaphylaxis Meeting of the European Academy of Allergy and Clinical Immunology (EAACI), held Feb 17-19 in Venice, Italy, 23 percent of allergic children are no longer curious to try new foods to vary their diet, which most of them consider too monotonous. One child in ten also gives up crucial physical activity for fear of anaphylactic shock triggered by exercise.
“About 17 percent of allergic children, regardless of their age, never go to a party or a picnic with friends, while 24 percent are forced to bring along something to eat,” says Prof. Maria Antonella Muraro, Chair of the EAACI Meeting. The study, headed by Prof. Muraro, was carried out by the Center for the Study and Treatment of Allergies and Food Intolerances at the hospital of the University of Padua, Italy, on 107 young patients and their mothers.
“Also, 5 to 15 percent of cases of anaphylactic shock can be triggered by physical activity following the consumption of small amounts of allergenic food that would otherwise be harmless, so one allergic child in ten also stops every kind of exercise,” Prof. Muraro added. “Allergies are often downplayed as a minor problem, but the life of an allergic person can be hell. Allergic children show more fear of being sick and higher anxiety about food than children with diabetes. The constant alarm surrounding them is taking a toll on their development and well-being.”
Another worrisome problem adding to the poor quality of life of allergic patients, especially the younger ones, is the need to carry life-saving devices at all times, such as epinephrine auto-injectors, “loaded” with enough drug to prevent death in case of severe anaphylactic shock. They are easy to use, light to carry and discreet, but one out of three patients still leaves home without them.
“Within 8 or 10 minutes the shot reverses the symptoms, ranging from urticaria to respiratory distress, cardiovascular collapse and gastrointestinal problems including vomiting and diarrhoea,” explains Prof. Muraro. “It can cause minor side effects, such as irritability or tremors that end as soon as the adrenaline is processed by the body, generally within a couple of hours. Patients should not be scared, even those who have a heart disease: the possible side effects are negligible in comparison to the opportunity to save your life.”
There is no scientific evidence that complementary therapies or kits sold through websites can identify allergies, the UK NHS watchdog NICE says. It says sites for services such as hair analysis use plausible stories but are not backed up by scientific evidence. It is publishing new guidance to help doctors in England and Wales identify when a child may have allergy problems. NICE says some parents end up turning to alternative therapies after a perceived lack of help from their GPs.
It is estimated that one in 20 young children has a food allergy. Dr Adam Fox, an allergy specialist based at the Evelina Children's Hospital in London, says not all children suffer immediate and obvious symptoms. “Food allergies can actually be extremely subtle. Lots of children have eczema, colic or spit up more food than usual. For some of those children the underlying problem is an allergy to something within their diet.”
The guidelines include detailed advice about how to recognise symptoms and when to refer to specialists. Dr Fox, who helped write the guidelines for National Institute for Health and Clinical Excellence (NICE), says he often sees parents in his specialist clinic who have wasted money on complementary or alternative tests.
The review by NICE looked for any scientific research on the usefulness of approaches including hair analysis, Vega testing (which uses mild electric currents) and kinesiology in diagnosing allergies in children. “The websites are very well put together, the stories behind them are plausible, but we were unable to find any evidence to support them,” says Dr Fox. He says there are two types of testing used in NHS clinics – skin prick and blood sample – which are backed by scientific research. NICE is warning that parents sometimes turn to alternative tests when they have failed to convince their family doctor to listen to their concerns.
It took Alison Berthelson more than two years to get an allergy diagnosis for her first son Harris. She had been to the local surgery several times when he suffered rashes and stomach upsets without any particular cause being identified. After Harris ate a small piece of chocolate containing nuts he suffered a more extreme reaction, becoming agitated, with an extreme rash covering his entire body. The out-of-hours GP gave her son a medicine to reduce swelling, but did not send him on to hospital as an emergency. “It was really very terrifying, terrifying at the time because we didn't know what was happening, and terrifying later when we did know what had happened and how lucky we were.” A new GP correctly diagnosed possible food allergies, and sent Harris for testing at a specialist NHS clinic. He now has to avoid nuts, sesame and some other ingredients used in prepared foods.
Allergies on the rise

The number of children suffering from food allergies appears to be increasing, although experts are at a loss to understand exactly why. Family doctors are now more likely to see very young children suffering allergic reactions. Dr Joanne Walsh, a GP involved in drafting the advice, says she now sees several children a week with suspected allergic reactions. Some are babies just a couple of weeks old. By gradually eliminating, and reintroducing, different foods, she can help parents manage the allergy without the need for hospital visits. “There's nothing more rewarding than a parent coming back and saying it's like having a different child.”
One of Australia’s leading juvenile justice services providing secure and safe care of up to 500 young offenders.
Review adequacy of summer and winter menus to address concerns raised to the State by the public.
Ensure compliance with Nutrient Reference Values for Australia and New Zealand
Ensure compliance with Dietary Guidelines for Children and Adolescents in Australia
Ensure compliance with Standards for Juvenile Custodial Facilities
Review of custodial health findings in various State jurisdictions around Australia and overseas.
Computer-based macro- and micronutrient analysis of menus and individual recipes, including protein, fat and carbohydrate percentages
Computer-based energy analysis of menus and individual recipes
Analysis of menus against nutrient reference values and appropriate recommendations.
Analysis of menus against dietary guidelines and appropriate recommendations
Analysis of food variety and appropriate recommendations
Analysis of special dietary needs and appropriate recommendations
Analysis of food choice and satisfaction and appropriate recommendations
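The macronutrient analysis described in the list above can be sketched in a few lines. This is a minimal illustration, not the actual software used in the review; it assumes only the standard Atwater energy factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat), and the menu figures are hypothetical:

```python
# Atwater energy factors (kcal per gram) -- standard values
ATWATER = {"protein": 4, "carbohydrate": 4, "fat": 9}

def energy_breakdown(grams):
    """Return total energy (kcal) and each macronutrient's share of energy (%)."""
    kcal = {n: grams[n] * ATWATER[n] for n in ATWATER}
    total = sum(kcal.values())
    shares = {n: round(100 * kcal[n] / total, 1) for n in kcal}
    return total, shares

# Hypothetical daily menu totals, for illustration only
total, shares = energy_breakdown({"protein": 90, "carbohydrate": 310, "fat": 80})
print(total, shares)  # 2320 kcal; protein 15.5%, carbohydrate 53.4%, fat 31.0%
```

The resulting energy shares are what would then be compared against the nutrient reference values and dietary guidelines listed above.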
Addiction researchers at Washington University School of Medicine in St. Louis have found that a risk for alcoholism also may put individuals at risk for obesity. The researchers noted that the association between a family history of alcoholism and obesity risk has become more pronounced in recent years. Both men and women with such a family history were more likely to be obese in 2002 than members of that same high-risk group had been in 1992. “In addiction research, we often look at what we call cross-heritability, which addresses the question of whether the predisposition to one condition also might contribute to other conditions,” says first author Richard A. Grucza, PhD. “For example, alcoholism and drug abuse are cross-heritable. This new study demonstrates a cross-heritability between alcoholism and obesity, but it also says — and this is very important — that some of the risks must be a function of the environment. The environment is what changed between the 1990s and the 2000s. It wasn’t people’s genes.”
Obesity in the United States has doubled in recent decades from 15 percent of the population in the late 1970s to 33 percent in 2004. Obese people – those with a body mass index (BMI) of 30 or more – have an elevated risk for high blood pressure, diabetes, heart disease, stroke and certain cancers.
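The BMI threshold mentioned above is easy to check directly. As a minimal sketch (the figures below are illustrative; the cutoff of 30 kg/m² for obesity is the standard definition used in the study):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# An illustrative 95 kg adult who is 1.75 m tall
value = bmi(95, 1.75)
print(round(value, 1))  # 31.0 -- above the obesity threshold of 30
```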
Reporting in the Archives of General Psychiatry, Grucza and his team say individuals with a family history of alcoholism, particularly women, have an elevated obesity risk. In addition, that risk seems to be growing. He speculates that may result from changes in the food we eat and the availability of more foods that interact with the same brain areas as addictive drugs. “Much of what we eat nowadays contains more calories than the food we ate in the 1970s and 1980s, but it also contains the sorts of calories — particularly a combination of sugar, salt and fat — that appeal to what are commonly called the reward centers in the brain,” says Grucza, an assistant professor of psychiatry. “Alcohol and drugs affect those same parts of the brain, and our thinking was that because the same brain structures are being stimulated, overconsumption of those foods might be greater in people with a predisposition to addiction.”
Grucza hypothesized that as Americans consumed more high-calorie, hyper-palatable foods, those with a genetic risk for addiction would face an elevated risk of obesity because of the effects of those foods on the reward centers in the brain. His team analyzed data from two large alcoholism surveys from the last two decades. The National Longitudinal Alcohol Epidemiologic Survey was conducted in 1991 and 1992. The National Epidemiologic Survey on Alcohol and Related Conditions was conducted in 2001 and 2002. Almost 80,000 people took part in the two surveys.
“We looked particularly at family history of alcoholism as a marker of risk,” Grucza explains. “And we found that in 2001 and 2002, women with that history were 49 percent more likely to be obese than those without a family history of alcoholism. We also noticed a relationship in men, but it was not as striking in men as in women.” Grucza says a possible explanation for obesity in those with a family history of alcoholism is that some individuals may substitute one addiction for another. After seeing a close relative deal with alcohol problems, a person may shy away from drinking, but high-calorie, hyper-palatable foods also can stimulate the reward centers in their brains and give them effects similar to what they might experience from alcohol.
“Ironically, people with alcoholism tend not to be obese,” Grucza says. “They tend to be malnourished, or at least under-nourished because many replace their food intake with alcohol. One might think that the excess calories associated with alcohol consumption could, in theory, contribute to obesity, but that’s not what we saw in these individuals.” Grucza says other variables, from smoking, to alcohol intake, to demographic factors like age and education levels don’t seem to explain the association between alcoholism risk and obesity. “It really does appear to be a change in the environment,” he says. “I would speculate, although I can’t really prove this, that a change in the food environment brought this association about. There is a whole slew of literature out there suggesting these hyper-palatable foods appeal to people with addictive tendencies, and I would guess that’s what we’re seeing in our study.” The results, he says, suggest there should be more cross-talk between alcohol and addiction researchers and those who study obesity. He says there may be some people for whom treating one of those disorders also might aid the other.
Regularly drinking green tea could protect the brain against developing Alzheimer’s and other forms of dementia. The study, published in the academic journal Phytomedicine, also suggests this ancient Chinese remedy could play a vital role in protecting the body against cancer. Led by Dr Ed Okello, the Newcastle team wanted to know if the protective properties of green tea – which have previously been shown to be present in the undigested, freshly brewed form of the drink – were still active once the tea had been digested. Digestion is a vital process which provides our bodies with the nutrients we need to survive. But, says Dr Okello, it also means that just because the food we put into our mouths is generally accepted to contain health-boosting properties, we can’t assume these compounds will ever be absorbed by the body.
“What was really exciting about this study was that we found when green tea is digested by enzymes in the gut, the resulting chemicals are actually more effective against key triggers of Alzheimer’s development than the undigested form of the tea,” explains Dr Okello, based in the School of Agriculture, Food and Rural Development at Newcastle University and executive director of the university’s Medicinal Plant Research Group. “In addition to this, we also found the digested compounds had anti-cancer properties, significantly slowing down the growth of the tumour cells which we were using in our experiments.”
As part of the research, the Newcastle team worked in collaboration with Dr Gordon McDougall of the Plant Products and Food Quality Group at the Scottish Crop Research Institute in Dundee, who developed technology which simulates the human digestive system. It is this which made it possible for the team to analyse the protective properties of the products of digestion. Two compounds are known to play a significant role in the development of Alzheimer’s disease – hydrogen peroxide and a protein known as beta-amyloid. Previous studies have shown that compounds known as polyphenols, present in black and green tea, possess neuroprotective properties, binding with the toxic compounds and protecting the brain cells.
When ingested, the polyphenols are broken down to produce a mix of compounds and it was these the Newcastle team tested in their latest research. “It’s one of the reasons why we have to be so careful when we make claims about the health benefits of various foods and supplements,” explains Dr Okello. “There are certain chemicals we know to be beneficial and we can identify foods which are rich in them but what happens during the digestion process is crucial to whether these foods are actually doing us any good.” Carrying out the experiments in the lab using a tumour cell model, they exposed the cells to varying concentrations of the different toxins and the digested green tea compounds.
Dr Okello explained: “The digested chemicals protected the cells, preventing the toxins from destroying the cells. “We also saw them affecting the cancer cells, significantly slowing down their growth. Green tea has been used in Traditional Chinese medicine for centuries and what we have here provides the scientific evidence why it may be effective against some of the key diseases we face today.”
The next step is to discover whether the beneficial compounds are produced during digestion after healthy human volunteers consume tea polyphenols. The team has already received funding from the Biotechnology and Biological Sciences Research Council (BBSRC) to take this forward. Dr Okello adds: “There are obviously many factors which together have an influence on diseases such as cancer and dementia – a good diet, plenty of exercise and a healthy lifestyle are all important. “But I think it’s fair to say that at least one cup of green tea every day may be good for you and I would certainly recommend it.”
(Source: Newcastle University: Phytomedicine)
In the past, positive blood and skin tests were often mistaken for evidence of a food allergy because they indicate the presence of immunoglobulin E antibodies; it is important to remember, however, that these antibodies are typically elevated in patients with atopic dermatitis, according to a speaker at the 69th Annual American Academy of Dermatology Meeting, held in New Orleans this week. “Those antibodies are not diagnostic, and the only way to diagnose food allergy is with a strong history of reactions or a challenge,” Jon M. Hanifin, MD, of Oregon Health & Science University, said in a press release. “This is done in a doctor’s office, using small increments of the food in question and increasing the amount until an allergic reaction occurs or does not occur. Usually a parent can pinpoint if a child has a true food allergy because the allergic reaction will appear so quickly with lip swelling or hives, quite distinct from simple food intolerance.”
Between 6% and 10% of children have atopic dermatitis, and about one-third of these children have food allergy. Recent research examining the genetic basis of atopic dermatitis has shown that this chronic skin condition is likely related to a defect in the epidermal barrier, which allows irritants, microbes and allergens (such as food) to penetrate the skin and cause adverse reactions. Because the skin barrier in patients with atopic dermatitis is compromised and open to absorb proteins, it allows sensitization to certain foods, leading to a positive skin or blood test.
New guidelines recently issued by the National Institute of Allergy and Infectious Diseases established a protocol for the proper evaluation and management of food allergy. The guidelines recommend that children younger than 5 years with moderate to severe atopic dermatitis be considered for food allergy evaluation if they have persistent atopic dermatitis despite optimized management, or if the child has a reliable history of an immediate reaction after eating a specific food.
Hanifin said research is also ongoing into whether withholding foods is leading to more allergies than an unrestricted diet in young children. This may provide future insight in potential ways to prevent food allergies. He said children in Israel seldom get peanut allergy, which may potentially be attributed to the use of peanut proteins in pacifiers in that country. In the United States and Europe, where peanut allergies are more common, infants are not usually exposed to this food until they are toddlers – the time when most peanut allergies are noticed.
“There is some thinking that withholding foods might actually be causing more allergies, and that an unrestricted diet may help tolerize babies to foods that could potentially cause a problem later in life,” Hanifin said. “Ongoing studies in this country using oral immunotherapy appear promising, and physicians hope that we may discover how to prevent food allergies in the future while continuing to provide successful treatment for children with atopic dermatitis.”
Source: Hanifin J. Food allergy and dermatology. Presented at: The American Academy of Dermatology 69th Annual Meeting; Feb. 4-8, 2011; New Orleans
A controversial new Dutch study may have found a link between food allergies and ADHD. However, many experts are dismissing the findings. The study found that in children with ADHD, putting them on a restrictive diet to eliminate possible, previously unknown food allergies or sensitivities decreased hyperactivity for 64% of them. “There is a longstanding, somewhat inconsistent story about diet and ADHD,” said Jan Buitelaar, the lead author of the Dutch study and a psychiatrist at the Radboud University Nijmegen Medical Centre. “On the one hand, people think it’s sugar that’s the trigger, others think that food coloring could be causing ADHD. Our approach was quite different. We went [with] the idea that food may give some kind of allergic or hyperactivity reaction to the brain.”
There have been previous studies in this field, but they were limited. “This has long been viewed as a kind of a controversial approach,” Buitelaar said. “When we started the research, I was skeptical, but the results convinced me.”
In the study, of the 41 kids who completed the elimination diet, 32 saw decreased symptoms. When certain foods thought to be “triggers” for each child were reintroduced, most of the children relapsed. Among 50 kids given a “control” diet that was just a standard, healthy diet for children, no significant changes were noted. Given these findings, Buitelaar recommended that the elimination diet become part of the standard of care for children with ADHD. However, while pediatricians acknowledged some effectiveness, they opposed making the elimination diet part of routine care for children with ADHD.
“People seem to think that dietary modification is essentially ‘free,’ but it is difficult, socially disruptive, and presents the risk for nutritional deficiency,” said Dr. Michael Daines, a pediatric allergist-immunologist at the University of Arizona. Though Daines is willing to work with families who want to try an elimination diet for treating ADHD, he feels it will only have an effect if the child is having a true food allergy or intolerance.
A new study has found that the leading causes of death are no longer infectious diseases but chronic diseases such as cardiovascular disease and cancer – which may be affected by food habits. Researchers investigated the eating patterns of more than 2,500 adults between the ages of 70 and 79 over a ten-year period and found that certain diets were associated with reduced mortality.
By determining the consumption frequency of 108 different food items, researchers were able to group the participants into six different groups as per their food choices:
- Healthy foods: 374 participants
- High-fat dairy products: 332
- Meat, fried foods, and alcohol: 693
- Breakfast cereal: 386
- Refined grains: 458
- Sweets and desserts: 339
The ‘Healthy foods’ group ate more low-fat dairy products, fruit, whole grains, poultry, fish, and vegetables, and consumed less meat, fried foods, sweets, high-calorie drinks, and added fat. The ‘High-fat dairy products’ group had a higher intake of foods such as ice cream, cheese, and 2 percent and whole milk and yoghurt, and a lower intake of poultry, low-fat dairy products, rice, and pasta.
The results indicated that the ‘High-fat dairy products’ group had a 40 percent higher risk of mortality than the ‘Healthy foods’ cluster, and the ‘Sweets and desserts’ group had a 37 percent higher risk. No significant differences in risk of mortality were seen between the ‘Healthy foods’ cluster and the ‘Breakfast cereal’ or ‘Refined grains’ clusters.
The “results of this study suggest that older adults who follow a dietary pattern consistent with current guidelines to consume relatively high amounts of vegetables, fruit, whole grains, low-fat dairy products, poultry and fish, may have a lower risk of mortality,” said Amy L. Anderson of the Department of Nutrition and Food Science, University of Maryland.
“Because a substantial percentage of older adults in this study followed the ‘Healthy foods’ dietary pattern, adherence to such a diet appears a feasible and realistic recommendation for potentially improved survival and quality of life in the growing older adult population.” The study will be published in the January 2011 issue of the Journal of the American Dietetic Association.
Scientists have upgraded their opinion of Neanderthal cuisine after spotting traces of cooked food on the fossilised teeth of our long-extinct cousins. The researchers found remnants of date palms, seeds and legumes – which include peas and beans – on the teeth of three Neanderthals uncovered in caves in Iraq and Belgium. Among the scraps of food embedded in the plaque on the Neanderthals' teeth were particles of starch from barley and water lilies that showed tell-tale signs of having been cooked. The Ice Age leftovers are believed to be the first direct evidence that the Neanderthal diet included cooked plants as well as meat obtained by hunting wild animals.
Dolores Piperno, who led the study at the archaeobiology laboratory at the Smithsonian National Museum of Natural History in Washington, said the work showed Neanderthals were more sophisticated diners than many academics gave them credit for. Piperno said the discoveries even raised the possibility that male and female Neanderthals had different roles in acquiring and preparing food. “The plants we found are all foods associated with early modern human diets, but we now know Neanderthals were exploiting those plants and cooking them, too. When you cook grains it increases their digestibility and nutritional value,” she added.
The findings bring fresh evidence to the long debate over why Neanderthals and not our direct ancestors, the early modern humans, went extinct. The last of the Neanderthals are thought to have died out around 28,000 years ago, but it is unclear what role – if any – modern humans played in their demise. “The whole question of why Neanderthals went extinct has been controversial for a long time and dietary issues play a significant part in that,” Piperno said. “Some scholars claim the Neanderthals were specialised carnivores hunting large game and weren't able to exploit a diversity of plant foods. “As far as we know, there has been until now no direct evidence that Neanderthals cooked their foods and very little evidence they were consuming plants routinely.”
Piperno's team was given permission to study the remains of three Neanderthal skeletons. One was unearthed at the Shanidar cave in Iraq and lived 46,000 years ago. The other two were recovered from the Cave of Spy in Belgium, and date to around 36,000 years ago. The scientists examined three teeth from the Iraqi Neanderthal and two from each of the Belgian specimens. To look for traces of food on them, they scraped fossilised plaque from each tooth and examined it under a microscope. Grains from plants are tiny, but have distinct shapes that the scientists identified by comparing them with a collection at the Smithsonian's herbarium. The researchers also cooked a range of plants to see how their appearance changed.
They collected 73 starch grains from the Iraqi Neanderthal's teeth. Some of these belonged to barley or a close relative, and appeared to have been boiled in water. “The evidence for cooking is strong. The starch grains are gelatinised, and that can only come from heat associated with cooking,” Piperno said. Similar tests on the Belgian Neanderthals' teeth revealed traces of cooked starch that probably came from parts of water lilies that store carbohydrates. Other cooked starch grains were traced back to sorghum, a kind of grass.
The study is published in the Proceedings of the National Academy of Sciences journal.
In Piperno's opinion, the research undermines one theory that suggests early modern humans drove the Neanderthals to extinction by having a more sophisticated and robust diet. The work also raises questions about whether Neanderthals organised themselves in a similar way to early hunter-gatherer groups, she said. “When you start routinely to exploit plants in your diet, you can arrange your settlements according to the season. In two months' time you want to be where the cereals are maturing, and later where the date palms are ready to pick. It sounds simplistic, but this is important in terms of your overall cognitive abilities. “In early human groups, women typically collected plants and turned them into food while men hunted. To us, and it is just a suggestion, this brings up the possibility that there was some sexual division of labour in the Neanderthals and that is something most people did not think existed.”