Wednesday, December 30, 2015

Eating when we are not hungry is bad for our health



In the modern food environment, where convenient foods engineered for maximum tastiness -- potato chips, chocolates, bacon double cheeseburgers -- are widely available and heavily advertised, the contemporary consumer is incessantly bombarded with the temptation to eat. This means that, in contrast to people in traditional societies, people in contemporary societies often eat not on account of hunger but because tasty food is available and beckoning at all hours of the day.

New research published in the Journal of the Association for Consumer Research found that the tendency of today's consumers to eat when they are not hungry might be less advantageous for health than eating when they are hungry.

The individuals participating in the study were 45 undergraduate students. The participants were first asked to rate their level of hunger and then to consume a meal rich in carbohydrates. To measure how the meal affected participants' health, their blood glucose levels were measured at regular intervals after the meal. Blood glucose tends to rise after a meal containing carbohydrates, and it is generally healthier for it to rise by a relatively small amount, because elevated blood glucose is damaging to the body's cells.

The results of the study showed that individuals who were moderately hungry before the meal tended to have lower blood glucose levels after consuming the meal than individuals who were not particularly hungry before consuming the meal. These findings suggest that it might be healthier for individuals to eat when they are moderately hungry than when they are not hungry.

Monday, December 28, 2015

Overconsumption of "healthy" foods which consumers often perceive as less filling leads to weught gain


Eating too much is typically considered one of the prime culprits of obesity. A new study published in the Journal of the Association for Consumer Research looked specifically at overconsumption of "healthy" foods, which consumers often perceive as less filling. The researchers found evidence to support their hypothesis that when people eat what they consider to be healthy food, they eat more than the recommended serving size because they associate "healthy" with less filling.

The research utilizes a multi-method approach to investigate the "healthy = less filling" intuition. The first study was conducted with 50 undergraduate students at a large public university and employed the well-established Implicit Association Test to provide evidence for an inverse relationship between the concepts of healthy and filling. The second study was a field study conducted with 40 graduate students at a large public university; it measured participants' hunger levels after they consumed a cookie portrayed as either healthy or unhealthy, to test the effect of health portrayals on experienced hunger. The third study was conducted with 72 undergraduate students in a realistic scenario to measure the impact of health portrayals on the amount of food ordered before watching a short film and the actual amount of food consumed during the film. The set of three studies converges on the idea that consumers hold an implicit belief that healthy foods are less filling than unhealthy foods.

Specifically, the researchers demonstrate that portraying a food as healthy as opposed to unhealthy using a front-of-package nutritional scale impacts consumer judgment and behavior. When a food is portrayed as healthy, as opposed to unhealthy, consumers report lower hunger levels after consumption, order greater portion sizes of the food, and consume greater amounts of the food. Surprisingly, even consumers who say they disagree with the idea that healthy foods are less filling than unhealthy foods are subject to the same biases. In addition, the researchers introduce a novel tactic for reversing consumers' habit of overeating foods portrayed as healthy: highlighting the nourishing aspects of healthy food mitigates the belief that it is less filling.

These findings add to the burgeoning body of work on the psychological causes of weight gain and obesity and point to a way of overturning the pernicious effects of the "healthy = less filling" assumption. Specifically, the findings suggest that the recent proliferation of healthy food labels may ironically be contributing to the obesity epidemic rather than reducing it. Consumers can use this knowledge to avoid overeating foods presented as healthy and to seek out foods portrayed as nourishing when they want to feel full without overeating.

Childhood asthma may increase risks of shingles


Nearly 1 million cases of herpes zoster, also known as shingles, occur every year in the U.S., with an estimated one-third of all adults affected by age 80. Despite its prevalence, particularly between ages 50 and 59, it is still unclear why some individuals develop shingles and others do not. In a population-based study published in the Journal of Allergy and Clinical Immunology (JACI), Mayo Clinic researchers build on their previous research from 2013, which linked asthma in childhood with an increased risk of shingles.

"Asthma represents one of the five most burdensome chronic diseases in the U.S., affecting up to 17 percent of the population," says lead author Young Juhn, M.D., who is a general academic pediatrician and asthma epidemiologist at the Mayo Clinic Children's Research Center. "The effect of asthma on the risk of infection or immune dysfunction might very well go beyond the airways."

Medical records of potential shingles patients in Olmsted County, Minnesota, were reviewed; 371 cases of shingles -- age 67 on average -- were identified during the study period and compared against 742 control subjects. Of the 371 shingles cases, 23 percent (87 individuals) had asthma, compared with 15 percent (114 of 742) of the control group. The authors found that adults with asthma were at about a 70 percent greater risk of developing shingles than those without asthma.

The researchers also noted that, with asthma and other atopic conditions accounted for, both asthma and atopic dermatitis were independently associated with a higher risk of shingles. Atopic dermatitis was present in 12 percent of the shingles cases (45 of 371) versus 8 percent of the control subjects (58 of 742).

The underlying mechanisms are not clear; however, impairment in innate immune functions in the skin and airways is well-documented in patients with asthma or atopic dermatitis. Researchers believe that, because asthma helps suppress adaptive immunity, it may increase the risk of varicella zoster virus reactivation.

"As asthma is an unrecognized risk factor for zoster in adults, consideration should be given to immunizing adults aged 50 years and older with asthma or atopic dermatitis as a target group for zoster vaccination," Dr. Juhn concludes.

The researchers note that neither inhaled corticosteroids nor vaccinations were associated with a higher risk of shingles. Rather, zoster vaccination was associated with a lower risk of shingles.


Monday, December 21, 2015

Obesity more dangerous than lack of fitness, new study claims


A new study, published in the International Journal of Epidemiology, has dismissed the concept of 'fat but fit'. Instead, the results suggest that the protective effects of high fitness against early death are reduced in obese people.

Although the detrimental effects of low aerobic fitness have been well documented, this research has largely been performed in older populations. Few studies have investigated the direct link between aerobic fitness and health in younger populations. This study by academics in Sweden followed 1,317,713 men for a median of 29 years to examine the association between aerobic fitness and death later in life, as well as how obesity affected these results. The subjects' aerobic fitness was tested by asking them to cycle until they had to stop due to fatigue.

Men in the highest fifth of aerobic fitness had a 48 per cent lower risk of death from any cause compared with those in the lowest fifth. Stronger associations were observed for deaths related to suicide and to abuse of alcohol and narcotics. Unexpectedly, the authors also noted a strong association between low aerobic fitness and deaths related to trauma. Co-author Peter Nordström has no explanation for this finding: "We could only speculate, but genetic factors could have influenced these associations given that aerobic fitness is under strong genetic control."

The study also evaluated the notion that 'fat but fit is ok'. Men of normal weight, regardless of their fitness level, were at lower risk of death than obese individuals in the highest quarter of aerobic fitness. It might still be argued that the relative benefits of high fitness are greater in obese people. However, in this study the beneficial effect of high aerobic fitness was actually reduced with increasing obesity, and in those with extreme obesity there was no significant effect at all.

With the limitation that the study cohort included only men, and relatively early deaths, these data do not support the notion that 'fat but fit' is a benign condition.

Periodontal disease associated with increased breast cancer risk in postmenopausal women


Bottom Line: Postmenopausal women with periodontal disease were more likely to develop breast cancer than women who did not have the chronic inflammatory disease. A history of smoking significantly affected the women's risk.

Journal in Which the Study was Published: Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research

Author: Jo L. Freudenheim, PhD, distinguished professor in the Department of Epidemiology and Environmental Health in the University at Buffalo's School of Public Health and Health Professions.

Background: Periodontal disease is a common condition that has been associated with heart disease, stroke, and diabetes. Previous research has found links between periodontal disease and oral, esophageal, head and neck, pancreatic, and lung cancers, so the researchers wanted to see if there was any relationship with breast cancer.

How the Study Was Conducted: Freudenheim and colleagues monitored 73,737 postmenopausal women enrolled in the Women's Health Initiative Observational Study, none of whom had previous breast cancer. Periodontal disease was reported in 26.1 percent of the women. Because prior studies have shown that the effects of periodontal disease vary depending on whether a person smokes, researchers examined the associations stratified by smoking status.

Results: After a mean follow-up time of 6.7 years, 2,124 women were diagnosed with breast cancer. The researchers found that among all women, the risk of breast cancer was 14 percent higher in women who had periodontal disease.

Among women who had quit smoking within the past 20 years, those with periodontal disease had a 36 percent higher risk of breast cancer. Women who were smoking at the time of the study had a 32 percent higher risk if they had periodontal disease, but the association was not statistically significant. Those who had never smoked and those who had quit more than 20 years ago had a 6 percent and an 8 percent increased risk, respectively, if they had periodontal disease.

Author Comment: "We know that the bacteria in the mouths of current and former smokers who quit recently are different from those in the mouths of non-smokers," Freudenheim explained. One possible explanation for the link between periodontal disease and breast cancer is that those bacteria enter the body's circulation and ultimately affect breast tissue. However, further studies are needed to establish a causal link, Freudenheim said.

Study Limitations: Women self-reported their periodontal disease status, after being asked whether a dentist had ever told them they had it. Also, since the study focused on women who were already enrolled in a long-term national health study, they were more likely to be receiving regular medical and dental care, and were likely more health-conscious, than the general population.

Almonds are a good source of plant protein, essential fatty acids, vitamin E and magnesium


Eating a moderate amount of almonds each day may enrich the diets of adults and their young children, according to a new study by researchers at the University of Florida Institute of Food and Agricultural Sciences.

"Almonds are a good source of plant protein -- essential fatty acids, vitamin E and magnesium," said Alyssa Burns, a doctoral student in the UF/IFAS food science and human nutrition department. Burns conducted the study as part of her graduate work.

Her statement is backed by the federal government's Dietary Guidelines for Americans, which recommend people eat unsalted nuts.

For the 14-week study, published in the journal Nutrition Research, UF/IFAS nutrition scientists gave almonds daily to 29 pairs of parents and children. Most of the adults were mothers with an average age of 35, while their children were between 3 and 6 years old. The children were encouraged to consume 0.5 ounces of almond butter daily. Parents were given 1.5 ounces of almonds per day.
Participants ate the almonds for a few weeks and then returned to their typical diets, which included other foods as snacks.

Researchers based their conclusions about improved dietary intake on participants' scores on the Healthy Eating Index (HEI), a tool used to measure diet quality and adherence to the 2010 Dietary Guidelines for Americans.

UF/IFAS researchers used an online dietary recall to find out what adults had eaten and how much. That way, researchers could measure diet quality, Burns said.

When parents and children were eating almonds, their HEI scores increased for total protein foods, seafood and plant proteins, and fatty acids, Burns said, while they ate fewer empty calories. Parents also decreased their sodium intake. Parents and children consumed more vitamin E and magnesium when eating almonds, she said.

The HEI is based on 12 dietary components that should be consumed adequately or in moderation, Burns said. Each component is scored on its own scale, and the component scores sum to a maximum of 100. For all components, a higher score indicates higher diet quality.

When parents and children ate almonds, their HEI score increased from 53.7 to 61.4, Burns said.

A morning cup of coffee could help improve athletic endurance


The caffeine in a morning cup of coffee could help improve athletic endurance, according to a new University of Georgia review study.

Authored by Simon Higgins, a third-year doctoral student in kinesiology in the College of Education, the study was published in this month's issue of the International Journal of Sport Nutrition and Exercise Metabolism.

To research the issue, Higgins reviewed more than 600 scholarly articles and screened them for those that focused only on caffeinated-coffee conditions, measured the caffeine dose, and measured endurance performance. Of these, nine randomized controlled trials specifically used coffee to improve endurance.

"Previous research has focused on caffeine itself as an aid to improve endurance," Higgins said. "Coffee is a popular source of caffeine, so this paper looked at the research surrounding its ergogenic benefits."

Looking at the nine trials, Higgins found that between 3 and 7 milligrams of caffeine from coffee per kilogram of body weight increased endurance performance by an average of 24 percent. The amount of caffeine in a cup of coffee can vary from 75 mg to more than 150 mg, depending on the variety and how it's roasted and brewed.
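
To put those figures in perspective, here is a minimal back-of-the-envelope sketch that converts the 3-7 mg/kg range into total milligrams and approximate cups of coffee, using the article's own 75-150 mg per cup estimate. The 70 kg body weight is an assumed, illustrative figure, not a recommendation.

```python
# Rough illustration of the dose range discussed above.
# Assumes a hypothetical 70 kg runner; caffeine per cup varies widely (75-150+ mg).

DOSE_RANGE_MG_PER_KG = (3, 7)      # ergogenic range reported in the review
CAFFEINE_PER_CUP_MG = (75, 150)    # typical caffeine content of one cup of coffee

def cups_for_dose(body_weight_kg: float) -> None:
    """Print the total caffeine dose and the approximate cups of coffee it implies."""
    for dose in DOSE_RANGE_MG_PER_KG:
        total_mg = dose * body_weight_kg
        low_cups = total_mg / CAFFEINE_PER_CUP_MG[1]   # strong coffee -> fewer cups
        high_cups = total_mg / CAFFEINE_PER_CUP_MG[0]  # weak coffee -> more cups
        print(f"{dose} mg/kg x {body_weight_kg} kg = {total_mg:.0f} mg "
              f"(~{low_cups:.1f}-{high_cups:.1f} cups)")

cups_for_dose(70)  # hypothetical 70 kg athlete
# 3 mg/kg x 70 kg = 210 mg (~1.4-2.8 cups)
# 7 mg/kg x 70 kg = 490 mg (~3.3-6.5 cups)
```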

"This is helpful for athletes because coffee is a naturally occurring compound," Higgins said. "There's the potential that getting your caffeine by drinking coffee has similar endurance benefits as taking caffeine pills."

In the nine trials, participants drank coffee and then either cycled or ran vigorously, and the results were measured. In a majority of cases, endurance was noticeably improved after the use of coffee.

When researching the effects of caffeine from coffee, Higgins made two important findings: caffeine from coffee has ergogenic benefits -- it enhances physical performance -- and more research is needed on the use of caffeine from coffee versus pure caffeine.

"While there is a lack of high-quality research on coffee as a source of caffeine, there is an abundance of research on pure caffeine," he said. "It's surprising how little we know about caffeine from coffee when its endurance effects could be just as beneficial as pure caffeine."

Higgins said that coffee shouldn't be dismissed as less beneficial for endurance. He found that coffee appears to be just as helpful as taking caffeine in the form of powder or tablets.

"There's a perception that coffee won't give you the same benefits as pure caffeine," he said. "New research could mean that athletes could have a cup of coffee versus taking a pill."

Higgins says that more research is needed before giving official recommendations to athletes, especially since the amount of caffeine in a cup of coffee can vary depending on how it's prepared.
"There is a caveat to athletes using coffee: Be careful because you don't know how much caffeine is in some coffee, especially when it's prepared by someone else," he said. "Athletes should run their caffeine use through their sports dietician as the NCAA lists it as a banned substance."

Friday, December 18, 2015

Magnesium intake may be beneficial in preventing pancreatic cancer



Indiana University researchers have found that magnesium intake may be beneficial in preventing pancreatic cancer.

Their study, "Magnesium intake and incidence of pancreatic cancer: The VITamins and Lifestyle study," recently appeared in the British Journal of Cancer.

Pancreatic cancer is the fourth leading cause of cancer-related death in both men and women in the United States. The overall occurrence of pancreatic cancer has not significantly changed since 2002, but the mortality rate has increased annually from 2002 to 2011, according to the National Cancer Institute.

"Pancreatic cancer is really unique and different from other cancers," said study co-author Ka He, chair of the Department of Epidemiology and Biostatistics at the IU School of Public Health-Bloomington. "The five-year survival rate is really low, so that makes prevention and identifying risk factors or predictors associated with pancreatic cancer very important."

Previous studies have found that magnesium is inversely associated with the risk of diabetes, which is a risk factor of pancreatic cancer. But few studies have explored the direct association of magnesium with pancreatic cancer; of those that did, their findings were inconclusive, said Daniel Dibaba, a Ph.D. student at the School of Public Health-Bloomington, who led the IU study.

Using information from the VITamins and Lifestyle study, Dibaba and the other co-authors analyzed an enormous trove of data on more than 66,000 men and women, ages 50 to 76, looking at the direct association between magnesium and pancreatic cancer and at whether age, gender, body mass index, non-steroidal anti-inflammatory drug use and magnesium supplementation play a role.

Of those followed, 151 participants developed pancreatic cancer. The study found that every 100-milligram-per-day decrease in magnesium intake was associated with a 24 percent increase in the occurrence of pancreatic cancer. The study also found that the effect of magnesium on pancreatic cancer did not appear to be modified by age, gender, body mass index or non-steroidal anti-inflammatory drug use, but was limited to those taking magnesium supplements, either from a multivitamin or an individual supplement.
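
To see what a "24 percent increase per 100 mg/day decrease" could mean at different intake shortfalls, here is a small illustrative sketch. It assumes the association scales multiplicatively with each 100 mg decrement, which is a common modelling convention but an assumption on our part, not something stated in the article.

```python
# Illustration only: how a "24% increase per 100 mg/day decrease" would compound
# IF the association scales multiplicatively. That assumption is ours, not the study's.

PER_100MG_FACTOR = 1.24  # 24% increase per 100 mg/day decrease in magnesium intake

def relative_occurrence(shortfall_mg_per_day: float) -> float:
    """Relative occurrence of pancreatic cancer for a given daily intake shortfall."""
    return PER_100MG_FACTOR ** (shortfall_mg_per_day / 100)

for shortfall in (50, 100, 200):
    print(f"{shortfall} mg/day below reference: "
          f"{relative_occurrence(shortfall):.2f}x the occurrence")
# 50 mg/day below reference: 1.11x the occurrence
# 100 mg/day below reference: 1.24x the occurrence
# 200 mg/day below reference: 1.54x the occurrence
```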

"For those at a higher risk of pancreatic cancer, adding a magnesium supplement to their diet may prove beneficial in preventing this disease," Dibaba said. "While more study is needed, the general population should strive to get the daily recommendations of magnesium through diet, such as dark, leafy greens or nuts, to prevent any risk of pancreatic cancer."


Thursday, December 17, 2015

Bone drug protects stem cells from aging


Stem cells can be protected from the effects of aging by a drug currently used to treat patients with osteoporosis, a breakthrough study has found.

Scientists from the University of Sheffield discovered that the drug zoledronate (common brand names: Zometa, Reclast) is able to extend the lifespan of mesenchymal stem cells by reducing DNA damage.

DNA damage is one of the most important mechanisms of ageing, through which stem cells lose their ability to maintain and repair the tissues in which they live and to keep them working correctly.

The pioneering research shows the drug protects the stem cells from DNA damage, enhancing their survival and maintaining their function. Professor Ilaria Bellantuono, from the University's Department of Metabolism, said: "The drug enhances the repair of the damage in DNA occurring with age in stem cells in the bone. It is also likely to work in other stem cells too.

"This drug has been shown to delay mortality in patients affected by osteoporosis but until now we didn't know why. These findings provide an explanation as to why it may help people to live longer.

"Now we want to understand whether the drug can be used to delay or revert the ageing in stem cells in older people and improve the maintenance of tissues such as the heart, the muscle and immune cells, keeping them healthier for longer.

"We want to understand whether it improves the ability of stem cells to repair those tissues after injury, such as when older patients with cancer undergo radiotherapy."

Approximately 50 per cent of over-75-year-olds have three or more diseases at the same time, such as cardiovascular disease, infections, muscle weakness and osteoporosis. In the future it is hoped this drug could be used to treat, prevent or delay the onset of such diseases rather than using a mixture of drugs.

Dr Bellantuono added: "We are hopeful that this research will pave the way for a better cure for cancer patients and keeping older people healthier for longer by reducing the risk of developing multiple age-related diseases."

The study, published today in the journal Stem Cells, was funded by the Medical Research Council and Arthritis Research UK.

Prostate cancer survival rates better with surgery vs radiotherapy


A rigorous evaluation of survival rates has shown that cancer patients with localised prostate cancer -- the most common form of prostate cancer -- have a better chance of survival if treated by surgery than by radiotherapy. These findings hold true even after accounting for type of radiation and the aggressiveness of cancer. This is the most robust analysis (meta-analysis) to date of published literature comparing surgery and radiotherapy for localised prostate cancer. The study is published in the peer-reviewed journal, European Urology.

According to senior author, Dr Robert Nam (Odette Cancer Centre, Sunnybrook Research Institute, University of Toronto, Canada):

"In the past, studies that have compared the success rates of surgery or radiation have been confusing because of their methods. We have evaluated all the good-quality data comparing surgery and radiotherapy, and the results are pretty conclusive; in general, surgery results in better mortality rates than radiotherapy. Nevertheless, there are times when radiotherapy may be more appropriate than surgery, so it is important that a patient discusses treatment options with his clinician."

Localised prostate cancer -- where the cancer is confined to the prostate -- accounts for around 80% of prostate cancers. Around 400,000 men are diagnosed with prostate cancer each year in Europe, meaning that around 320,000 will suffer from localised prostate cancer. The most common ways of treating localised prostate cancer are radiotherapy and surgery. The choice between radiotherapy and surgery varies according to country; for example, in England and Wales, radiotherapy is used more often than surgery.

The researchers conducted a meta-analysis (a 'study of studies') which compared 19 studies including up to 118,830 patients who had undergone treatment with either surgery or radiation.
The analysis had to consider a variety of studies that compared different parameters (such as the duration of the study). Fifteen of the studies examined deaths from prostate cancer after surgery or radiation; over the duration of these studies, patients were twice as likely to die from prostate cancer after being treated with radiation as after surgery (hazard ratio 2.08, 95% confidence interval 1.76-2.47, p < 0.00001).

Ten of the studies also looked at overall mortality (where the cause of death was not necessarily prostate cancer) and found that patients treated with radiation were about one and a half times more likely to die sooner than patients who had surgery (HR 1.63, 95% confidence interval 1.54-1.73, p < 0.00001).
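
As a rough guide to what these hazard ratios imply: under the proportional-hazards assumption used in such analyses, survival in the radiation group at a given time equals survival in the surgery group raised to the power of the hazard ratio. The sketch below applies that relationship to a purely hypothetical 90% baseline survival after surgery; it is illustrative only and not a figure from the study.

```python
# Hypothetical illustration of what the reported hazard ratios imply.
# Under proportional hazards: S_radiation(t) = S_surgery(t) ** HR.
# The 90% baseline survival after surgery is an assumed, illustrative figure.

BASELINE_SURVIVAL_SURGERY = 0.90   # hypothetical survival at some time t after surgery

for label, hazard_ratio in [("prostate cancer-specific mortality", 2.08),
                            ("overall mortality", 1.63)]:
    survival_radiation = BASELINE_SURVIVAL_SURGERY ** hazard_ratio
    print(f"{label}: HR {hazard_ratio} -> "
          f"{survival_radiation:.1%} survival with radiation "
          f"vs {BASELINE_SURVIVAL_SURGERY:.0%} with surgery")
# prostate cancer-specific mortality: HR 2.08 -> 80.3% survival with radiation vs 90% with surgery
# overall mortality: HR 1.63 -> 84.2% survival with radiation vs 90% with surgery
```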

"Both treatment approaches should be discussed with patients prior to the start of therapy," says Dr Robert Nam, "the important thing about this research is that it gives physicians and patients additional, information to consider when making the decision about how to treat localised prostate cancer.

Commenting, Professor Nicolas Mottet (St Étienne, France), Chairman of the European Association of Urology Prostate Guideline Panel, said:

"This systematic review suggests that survival is better after surgery compared to various forms of radiotherapy. It deserves attention, as it is based on the best available data. However, definitive proof needs a large well-conducted randomized control trial, such as the upcoming PROTECT trial which is due to report next year. So we certainly need to take this analysis into account, but it doesn't yet give us a definitive answer as to the best treatment. Although this paper should not change clinical practice, I agree with the authors, this analysis gives us important, additional information."

Carbs, not fats, boost half-marathon race performance


Carbohydrates are the body's main energy source during high-intensity, prolonged running, a new study published in the Journal of Applied Physiology reports.

Muscles use carbohydrate and fat stored in the body as fuel during exercise, but the fuel sources differ in availability. Carbohydrates can be used immediately but have limited stores. Fats require additional processing steps before they can be used but have larger reserves in the body. Although carbohydrates are the main energy source during high-intensity exercise, recent studies have examined strategies to improve the muscles' ability to burn fat instead of carbohydrates during prolonged exercise, proposing that this approach will enhance performance because fat stores in the body are larger than carbohydrate stores and can supply significantly more energy.

Researchers at Australian Catholic University's Mary Mackillop Institute for Health Research tested the importance of fuel source to endurance sports performance by blocking the body's use of fat. Male competitive half-marathon runners ran on a treadmill until exhausted at 95 percent of their best half-marathon pace. They ate either a calorie-free or a carbohydrate-rich meal before and during the run and took nicotinic acid to prevent the use of fat stores.

The researchers found that blocking the body's use of fat did not affect the distance the runners covered before becoming exhausted. Blocking fat use also did not affect the use of carbohydrates. Carbohydrates contributed 83 to 91 percent of the total energy used, the research team wrote. The study shows that for high-intensity, long-duration runs, exercising muscles prefer carbohydrates as their fuel source, regardless of whether the runner has eaten or not, says Jill Leckey, primary author of the study.
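
For a sense of scale, the sketch below converts that 83-91 percent carbohydrate share into grams of carbohydrate for a hypothetical energy expenditure; the 1,200 kcal figure is an assumed, illustrative value for a hard run of roughly 90 minutes, and the ~4 kcal per gram of carbohydrate is the standard conversion factor.

```python
# Back-of-the-envelope illustration of the carbohydrate contribution reported above.
# The 1,200 kcal expenditure is a hypothetical figure for a hard ~90-minute run;
# ~4 kcal per gram of carbohydrate is the standard conversion factor.

TOTAL_ENERGY_KCAL = 1200          # hypothetical energy cost of the run
CARB_SHARE_RANGE = (0.83, 0.91)   # share of energy from carbohydrate in the study
KCAL_PER_GRAM_CARB = 4.0

for share in CARB_SHARE_RANGE:
    grams = TOTAL_ENERGY_KCAL * share / KCAL_PER_GRAM_CARB
    print(f"{share:.0%} from carbohydrate -> about {grams:.0f} g of carbohydrate oxidised")
# 83% from carbohydrate -> about 249 g of carbohydrate oxidised
# 91% from carbohydrate -> about 273 g of carbohydrate oxidised
```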

"Competitive runners should focus on dietary strategies that will increase carbohydrate availability before and during competition to optimize race performance in events lasting up to 90 minutes in duration," according to Leckey.

Although the study was conducted in competitive runners, the findings apply to recreational runners as well, says Leckey. "It's the relative exercise intensity, for instance the percentage of an individual's maximal oxygen uptake or maximum heart rate, that determines the proportion of carbohydrate and fat fuels used by the exercising muscles, not simply the pace they are running."


Wednesday, December 16, 2015

Why the flu vaccine is less effective in the elderly


Around this time every year, the flu virus infects up to one-fifth of the U.S. population and kills thousands of people, many of them elderly. A study published by Cell Press on Dec. 15, 2015 in Immunity now explains why the flu vaccine is less effective at protecting older individuals. More broadly, the findings reveal novel molecular signatures that could be used to predict which individuals are most likely to respond positively to vaccination.

"We provide novel evidence of a potential connection between the baseline state of the immune system in the elderly and reduced responsiveness to vaccination," say co-senior study authors Shankar Subramaniam of the University of California, San Diego, and Bali Pulendran of Emory University. "By providing a more complete picture of how the immune system responds to vaccination, our findings may help guide the development of next-generation vaccines that offer long-lasting immunity and better protection of at-risk populations."

Flu vaccines, which contain proteins found in circulating viral strains, offer protection by eliciting the production of antibodies -- proteins that help the immune system identify pathogens and protect against infectious disease. While vaccination is considered the most effective method for preventing influenza, it is less effective in the elderly. But until now, the molecular mechanisms underlying this decrease in vaccine efficacy were unknown.

To address this question, Subramaniam and Pulendran identified molecular signatures of immunity to flu vaccination using systems biology approaches, which involve the computational and mathematical modeling of complex biological systems. They vaccinated 212 subjects, including 54 elderly individuals, across five influenza vaccine seasons, from 2007 to 2011, and analyzed blood samples to identify molecular pathways associated with protective antibody responses elicited by vaccination. They also analyzed previously published data for 218 additional subjects.

Using this approach, the researchers identified molecular signatures present in blood samples collected a few days after vaccination that predicted with 80% accuracy whether the vaccine would elicit immune protection approximately four weeks later. Within one week of flu vaccination, young individuals showed high levels of antibody-producing B cells, whereas the elderly showed high levels of immune cells called monocytes, which elicit inflammatory responses in the body. These age-related differences predicted impaired vaccine-induced immune responses observed in the elderly three weeks later. "Together, these results suggest potential mechanisms by which changes to the innate response in the elderly may result in diminished antibody responses to vaccination," Subramaniam says.

Even before vaccination, high baseline levels of B cells, in conjunction with low levels of monocytes and related inflammatory molecules, predicted vaccine-induced immune protection four weeks later. "This supports the concept that inflammatory responses at baseline may be detrimental to the induction of vaccine-induced antibody responses," Subramaniam says. "While it is early to suggest, supplementary therapeutic approaches, such as reducing the inflammatory response in elderly patients after vaccination, would be valuable avenues to pursue. However, this warrants longer and more detailed investigations."

For their part, the researchers plan on applying similar systems biology approaches to study other viral infections, such as shingles and yellow fever. "Analyzing these myriad 'omics' data in conjunction with physiological measurements is novel and will serve as a paradigm for future studies on influenza and other infections."

In the meantime, they urge caution against over-generalizing their new findings. "This is obviously a complex problem, and the study reveals responses that are averaged across populations," Subramaniam says. "As is true in every medical diagnosis, prognosis, and treatment, there is a distribution of responses with a majority conforming to the mean predicted response. So the important thing for the general audience to recognize is that there will be exceptions and variations."


Shingles increases short-term risk of stroke in older adults


More than 95% of the world's adult population is infected with the virus that causes chickenpox. Up to one third of these individuals will develop shingles (herpes zoster) in their lifetime. A new U.S. study has found that there is a short-term increased risk of stroke after having shingles, reports Mayo Clinic Proceedings.

Shingles (herpes zoster) is a viral disease caused by reactivation of the chickenpox virus, varicella zoster virus (VZV), which causes a painful skin rash to erupt in a limited area. Typically, the rash is on one side of the body in a single stripe. It is more common in older adults and those with weak immune systems, but anyone who has had chickenpox can develop shingles. Studies in Europe and Asia have suggested an increased risk of stroke and myocardial infarction (MI) after shingles, but there have not been any previous studies in the U.S.

In the current study, investigators assessed the risk of stroke and MI in a U.S. community-based population. About 5,000 adults over 50 in Olmsted County, Minnesota, who had a confirmed episode of shingles were compared with age- and sex-matched individuals from the same community who had no history of shingles.

The risks for stroke and MI were assessed separately. Patients with a previous stroke were excluded from the stroke analyses, and those with a previous MI were excluded from the MI analyses. The short-term risks of stroke and MI were assessed at three months, six months, one year, and three years after shingles.

"We found there was a 50% increased risk of stroke for three months after shingles, but we also found that people who had shingles had many more risk factors for stroke than those who had not, suggesting they had worse health overall," explained lead investigator Barbara P. Yawn, MD, MSc, of the Department of Research, Olmsted Medical Center, Rochester, MN. "The bottom line however is that shingles was still associated with an increased risk of stroke for three months afterwards even when we made allowances for these multiple risk and confounding factors."

Investigators found that the association between shingles and MI at three months was neither strong nor robust across different analytic methods used. "There did appear to be a small increased risk for MI, but when you take other risk factors into consideration, it disappears," noted Dr Yawn.

There was no increased risk of either stroke or MI at any point beyond three months.

The investigators raise the question of why stroke would be more common after an episode of shingles. "Recent studies have shown that the zoster virus appears to affect vascular tissues as well as the central nervous system and that it may therefore be a systemic illness," stated Dr. Yawn. "Another possible explanation is that stroke is a consequence of the inflammatory response that occurs with an acute zoster episode. This increased risk of stroke may be preventable by vaccinating against the zoster virus."

Herpes zoster (also called "shingles") is linked to a transient increased risk of stroke and myocardial infarction (MI) in the months following initial zoster diagnosis, according to a study by Caroline Minassian and colleagues from the London School of Hygiene and Tropical Medicine, UK, published in this week's PLOS Medicine.

The researchers identified 42,954 Medicare beneficiaries aged 65 years or older who had had a herpes zoster diagnosis and an ischemic stroke, and 24,237 beneficiaries who had had a herpes zoster diagnosis and an MI, during a 5-year period. They then calculated age-adjusted incidence ratios for stroke and MI during pre-defined periods of up to 12 months after a diagnosis of zoster, relative to time periods when the patient did not have recent zoster (the baseline period). Compared to the baseline period, there was a 2.4-fold increased rate of ischemic stroke and a 1.7-fold increased rate of MI in the first week after herpes zoster. The increased rate of acute cardiovascular events reduced gradually over the 6 months following herpes zoster. There was no evidence that MI or ischemic stroke incidence ratios varied between individuals who had been vaccinated against zoster and those who had not been vaccinated.
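
The core logic of this kind of self-controlled comparison is to divide the event rate in a post-zoster risk window by the event rate in each person's own baseline time. The sketch below shows that calculation in its simplest form; all counts and person-time values are invented for illustration, and the published analysis additionally adjusts for age, which this toy version does not.

```python
# Toy sketch of the incidence-ratio calculation used in self-controlled designs.
# All counts and person-time values below are invented for illustration only;
# the real analysis also adjusts for age, which is omitted here.

def incidence_rate_ratio(events_risk: int, persontime_risk: float,
                         events_baseline: int, persontime_baseline: float) -> float:
    """Rate in the post-zoster risk window divided by the rate in baseline time."""
    rate_risk = events_risk / persontime_risk
    rate_baseline = events_baseline / persontime_baseline
    return rate_risk / rate_baseline

# Hypothetical: 12 strokes in 500 person-weeks during the first week after zoster,
# versus 520 strokes in 52,000 person-weeks of baseline time.
irr = incidence_rate_ratio(12, 500, 520, 52_000)
print(f"Incidence rate ratio: {irr:.1f}")  # 2.4, i.e. a 2.4-fold increased rate
```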

While the researchers used a self-controlled case series design that controls for fixed confounders, residual confounding by time-varying factors such as major life events or stress may limit the accuracy of the findings. Furthermore, only a few participants in the study were vaccinated, which limits the study's power to detect an effect of vaccination.

The authors say "These findings enhance our understanding of the temporality and magnitude of the association between zoster and acute cardiovascular events."

At menopause, weight, exercise, education, income play big roles in metabolic risks


At midlife, overweight and obesity, lack of exercise, less education, and low income put women at much higher risk of having metabolic syndrome, the cluster of conditions that predisposes people to diabetes and heart disease, shows a large study published today in Menopause, the journal of The North American Menopause Society.

The researchers from Yonsei University in Seoul and Hallym University in Chuncheon, Korea, analyzed four years of data from the Korean Genetic Epidemiologic Survey on some 1,200 healthy women 45 to 55 years old who did not use hormones and looked for characteristics that predisposed the women to having metabolic syndrome or developing it as they went through menopause.

Metabolic syndrome includes excess body fat around the waist, increased blood pressure, high blood sugar, and abnormal cholesterol levels. Weight gain and a higher risk of metabolic syndrome are known to be common at menopause. But what has not been as well understood is how much social and economic conditions and the transition through menopause influence that risk.

For the women overall in the study, transitioning through menopause or becoming postmenopausal (reaching or exceeding 1 year after their final period) during the study did not significantly increase the risk of metabolic syndrome. But for overweight, obese, sedentary, undereducated, and disadvantaged women, the picture was very different.

In contrast to normal-weight women, overweight women in the study had more than 4 times the risk, and obese women more than 12 times the risk, of metabolic syndrome. Women who didn't exercise had a 1.6 times greater risk than exercisers. Among the women in perimenopause, the time of irregular periods before menopause, those who were overweight had 3 times the risk of metabolic syndrome of normal-weight women, and those who were obese had 9 times the risk. Overweight women who became postmenopausal during the study had 3 times, and obese women 8.5 times, the risk of those with normal body weight. And postmenopausal women who did not exercise had a 1.6 times greater risk than high-level exercisers.

For women in the study who had less than 10 years of education, the risk of metabolic syndrome was 1.4 times greater than for more educated women, and the risk for low income women was 1.6 to 1.7 times greater than for wealthier women. Among the women who experienced menopause during the study, those who did not have more than a high-school education had 1.7 times the risk of better educated women. In addition, disadvantaged women who went through menopause during the study had 2.5 times the risk and middle-income women 2 times the risk of their wealthier counterparts.

"As women make the transition from regular cycles through the transition to menopause and after, it is more difficult to maintain a healthy weight, not just because of hormonal changes but also because of aging , less muscle mass, and life stressors, too. This study underscores how important it is to work hard to stave off weight gain as much as possible," says NAMS Executive Director JoAnn V. Pinkerton, MD, NCMP.

So what helps? "Decrease food intake and move more," she suggests. "If women continue to eat as they always have and don't increase their physical activity as their metabolism decreases, they are likely to gain weight."

The study data showed that at least some women heard the exercise message and increased their activity later on during the study. But much more needs to be done, and the authors called for emphasizing weight management and encouraging high levels of physical activity in women before menopausal changes occur and for policy measures that are sensitive to the health needs of economically disadvantaged women.


Iron supplements could help avoid anemia and mild cognitive impairment



In a large population-based study of randomly selected participants in Germany, researchers found that participants with anemia showed lower performance in verbal memory and executive functions. Furthermore, mild cognitive impairment (MCI) occurred almost twice as often in participants diagnosed with anemia.

Anemia is a condition that develops when your blood lacks enough healthy red blood cells or hemoglobin. Hemoglobin is a main part of red blood cells and binds oxygen. If you have too few or abnormal red blood cells, or your hemoglobin is abnormal or low, the cells in your body will not get enough oxygen. Older adults also may have a greater risk of developing anemia because of poor diet and other medical conditions. Iron-deficiency anemia, the most common type, is very treatable with diet changes and iron supplements.

The study is published in the Journal of Alzheimer's Disease.

Because dementia is the end stage of many years of accumulation of pathological changes in the brain, researchers focus on early stages of cognitive impairment. MCI represents an intermediate and possibly modifiable stage between normal cognitive aging and dementia. Although persons with MCI have an increased risk of developing dementia or Alzheimer's disease (AD), they can also remain stable for many years or even revert to a cognitively normal state over time. This modifiable characteristic makes the concept of MCI a promising target in the prevention of dementia.

What criteria determine MCI? The following four criteria were used to diagnose MCI: 

First, participants have to report a decline in cognitive performance over the past two years. Second, the participants have to show a cognitive impairment in objective cognitive tasks that is greater than one would expect taking their age and education into consideration. Third, this impairment is not as pronounced as in demented individuals since persons with MCI can perform normal daily living activities or are only slightly impaired in carrying out complex instrumental functions. Fourth, the cognitive impairment is insufficient to fulfil criteria for dementia.

The concept of MCI distinguishes between amnestic MCI (aMCI) and non-amnestic MCI (naMCI). In the former, impairment in the memory domain is evident, most likely reflecting AD pathology. In the latter, impairment in non-memory domains is present, mainly reflecting vascular pathology but also frontotemporal dementia or dementia with Lewy bodies.

The Heinz Nixdorf Recall (Risk Factors, Evaluation of Coronary Calcium and Lifestyle) study is an observational, population-based, prospective study that examined 4,814 participants (50% men) between 2000 and 2003 in the metropolitan Ruhr Area. After five years, a second examination was conducted with 92% of the participants taking part. The publication reports cross-sectional results of the second examination.

First, 163 participants with anemia and 3,870 participants without anemia were included to compare the performance in all cognitive subtests. Interestingly, anemic participants showed more pronounced cardiovascular risk profiles and lower cognitive performance in all administered cognitive subtests. After adjusting for age, anemic participants showed a significantly lower performance specifically in the immediate recall task and the verbal fluency task.

Second, 579 participants diagnosed with MCI (299 participants with aMCI and 280 with naMCI) and 1,438 cognitively normal participants were included to compare the frequency of MCI and MCI-subtype diagnoses in anemic and non-anemic participants. MCI occurred almost twice as often in anemic as in non-anemic participants. Similar results were found for the MCI subtypes, indicating that low hemoglobin levels may contribute to cognitive impairment via different pathways.

These results suggest that anemia is associated with an increased risk of MCI independent of traditional cardiovascular risk factors. The association between anemia and MCI has important clinical relevance because, depending on etiology, anemia can be treated effectively. This might provide a means to prevent or delay cognitive decline.

Drinking 2 to 3 units of alcohol every day is linked to a reduced risk of death among people with early stage Alzheimer's disease


Drinking 2 to 3 units of alcohol every day is linked to a reduced risk of death among people with early stage Alzheimer's disease, finds research published in the online journal BMJ Open.

Moderate drinking has been associated with a lower risk of developing and dying from heart disease and stroke. But alcohol is known to damage brain cells, and given that dementia is a neurodegenerative disorder, drinking might be harmful in those with the condition.

The researchers therefore wanted to find out whether the same potentially positive association between alcohol and a reduced risk of cardiovascular death applied to 321 people with early stage Alzheimer's disease, defined as a score of 20 or more on the Mini Mental State Exam (MMSE).
The research team analysed data originally collected on 330 people with early stage dementia or Alzheimer's disease and their primary carers from across Denmark as part of the Danish Alzheimer's Intervention Study (DAISY).

DAISY set out to assess the impact of a 12 month programme of psychosocial counselling and support, and tracked progress for three years afterwards, accumulating a considerable amount of data.
This included information on how much alcohol people with early stage dementia or Alzheimer's drank every day. Around one in 10 (8%) drank no alcohol, while at the other end of the scale around one in 20 (4%) drank more than 3 units daily.

Most of the sample (71%) drank 1 or fewer units a day; 17% drank 2-3 units.

During the monitoring period, 53 (16.5%) of those with mild Alzheimer's disease died. Consumption of 2-3 units of alcohol every day was associated with a 77% lowered risk of death compared with a tally of 1 or fewer daily units.

There was no significant difference in death rates among those drinking no alcohol or more than 3 units every day compared with those drinking 1 or fewer daily units.

These results held true after taking account of influential factors: gender, age, other underlying conditions, whether the individual lived alone or with their primary carer, educational attainment, smoking, quality of life, and MMSE result.

The researchers say there could be several explanations for the findings, including that people who drink moderately have a richer social network, which has been linked to improved quality, and possibly length, of life.

Another explanation may lie in the fact that the seemingly protective effect of alcohol may have been caused by reverse causality, whereby those drinking very little alcohol were in the terminal phase of their life, which would have artificially inflated the positive association.

In a bid to correct for this, the researchers re-analysed the data, omitting the first year of monitoring. But this made no difference to the findings.

"The results of our study point towards a potential, positive association of moderate alcohol consumption on mortality in patients with Alzheimer's disease. However, we cannot solely, on the basis of this study, either encourage or advise against moderate alcohol consumption in [these] patients," they caution.

They suggest that further research looking at the impact of alcohol on cognitive decline and disease progression in patients with mild Alzheimer's disease would be particularly informative.

Recommendations for use of aspirin to prevent preeclampsia


To prevent preeclampsia, new research suggests that low-dose aspirin should be given prophylactically to all women at high risk (those with diabetes or chronic hypertension) and any woman with two or more moderate risk factors (including obesity, multiple gestation and advanced maternal age).

Preeclampsia, a potentially dangerous complication of pregnancy characterized by high blood pressure and a high level of protein in the urine or other end-organ effects, complicates between three and seven percent of births in the U.S. One in seven preterm births and one in 10 maternal deaths in the U.S. can be directly attributed to preeclampsia. Currently, the only intervention that has been shown to reduce the risk of preeclampsia is the use of prophylactic low-dose aspirin.

Erika Werner, MD, of the Division of Maternal-Fetal Medicine at Women & Infants Hospital of Rhode Island, a Care New England hospital, and an assistant professor of obstetrics and gynecology at The Warren Alpert Medical School of Brown University; Dwight Rouse, MD, of Women & Infants' Division of Maternal-Fetal Medicine, principal investigator for the Eunice Kennedy Shriver National Institute of Child Health and Human Development Maternal-Fetal Medicine Units Network (MFMU), and a professor of obstetrics and gynecology at the Alpert Medical School; and Alisse Hausperg, MD, a chief resident at Women & Infants, have published research in the December 2015 edition of Obstetrics & Gynecology, now available online. The research is entitled 'A Cost-Benefit Analysis of Low-Dose Aspirin Prophylaxis for the Prevention of Preeclampsia in the United States.'

The researchers developed a decision model to evaluate the risks, benefits and costs of four different approaches to aspirin prophylaxis -- no prophylaxis, prophylaxis per recommendations of the American College of Obstetricians and Gynecologists (only for a narrow segment of pregnant women -- namely, those with a history of preeclampsia necessitating delivery before 34 weeks gestation and those with preeclampsia in more than one prior pregnancy), prophylaxis per the U.S. Preventive Task Force recommendations, and universal prophylaxis for all women.

The researchers concluded, "Both the U.S. Preventive Task Force approach and universal prophylaxis would reduce morbidity, save lives, and lower health care costs in the United States to a much greater degree than the approach currently recommended by ACOG."


Monday, December 14, 2015

Antidepressants during pregnancy associated with increased autism risk


The use of antidepressants, especially selective serotonin reuptake inhibitors, during the final two trimesters of pregnancy was associated with increased risk for autism spectrum disorder in children, according to an article published online by JAMA Pediatrics.

Antidepressants (ADs) are widely used during pregnancy to treat depression. Autism spectrum disorder (ASD) is a neurodevelopmental syndrome characterized by altered communication, language and social interaction and by particular patterns of interests and behaviors. Few studies have investigated the effect of AD use during pregnancy on the risk of ASD in children. A better understanding of the long-term neurodevelopmental effects of ADs on children when used during gestation is a public health priority.

Anick Bérard, Ph.D., of the University of Montreal, Canada, and coauthors used data on all pregnancies and children in Québec between January 1998 and December 2009. The authors identified 145,456 full-term singleton infants born alive. Of the infants, 1,054 (0.72 percent) had at least one ASD diagnosis; the average age at first ASD diagnosis was 4.6 years and the average age of children at the end of follow-up was 6.2 years. Boys with ASD outnumbered girls 4 to 1.

The authors identified 4,724 infants (3.2 percent) who were exposed to ADs in utero; 4,200 (88.9 percent) infants were exposed during the first trimester and 2,532 (53.6 percent) infants were exposed during the second and/or third trimester. There were 31 infants (1.2 percent) exposed to ADs during the second and/or third trimester diagnosed with ASD and 40 infants (1.0 percent) exposed during the first trimester diagnosed with ASD, according to the results.

The use of ADs during the second and/or third trimester was associated with an 87 percent increased risk of ASD (32 exposed infants), while no association was observed between the use of ADs during the first trimester or during the year before pregnancy and the risk of ASD.

Results indicate the increased risk of ASD was observed with selective serotonin reuptake inhibitors (22 exposed infants) and with the use of more than one class of AD during the second and/or third trimester (five exposed infants). In children of mothers with a history of depression, the use of ADs during the second and/or third trimester was associated with an increased risk for ASD in the study (29 exposed infants).

The authors suggest several mechanisms may account for the increased risk of ASD associated with maternal use of ADs during pregnancy. Limitations to the study include its use of prescription filling data, which may not reflect actual use. The data also contained no information on maternal lifestyle.

"Further research is needed to specifically assess the risk of ASD associated with antidepressant types and dosages during pregnancy," the study concludes.

Using antidepressants during pregnancy greatly increases the risk of autism, Professor Anick Bérard of the University of Montreal and its affiliated CHU Sainte-Justine children's hospital revealed today. Prof. Bérard, an internationally renowned expert in the field of pharmaceutical safety during pregnancy, came to her conclusions after reviewing data covering 145,456 pregnancies. "The variety of causes of autism remains unclear, but studies have shown that both genetics and environment can play a role," she explained. "Our study has established that taking antidepressants during the second or third trimester of pregnancy almost doubles the risk that the child will be diagnosed with autism by age 7, especially if the mother takes selective serotonin reuptake inhibitors, often known by the acronym SSRIs." Her findings were published today in JAMA Pediatrics.

Bérard and her colleagues worked with data from the Quebec Pregnancy Cohort and studied 145,456 children from the time of their conception up to age ten. In addition to information about the mother's use of antidepressants and the child's eventual diagnosis of autism, the data included a wealth of details that enabled the team to tease out the specific impact of the antidepressant drugs. For example, some people are genetically predisposed to autism (i.e., there is a family history of it). Maternal age and depression are known to be associated with the development of autism, as are certain socio-economic factors such as exposure to poverty, and the team was able to take all of these into consideration. "We defined exposure to antidepressants as the mother having had one or more prescriptions for antidepressants filled during the second or third trimester of the pregnancy. This period was chosen because the infant's critical brain development occurs during this time," Prof. Bérard said. "Amongst all the children in the study, we then identified which children had been diagnosed with a form of autism by looking at hospital records indicating diagnosed childhood autism, atypical autism, Asperger's syndrome, or a pervasive developmental disorder. Finally, we looked for a statistical association between the two groups, and found a very significant one: an 87% increased risk." The results remained unchanged when only considering children who had been diagnosed by specialists such as psychiatrists and neurologists.

The findings are hugely important as six to ten percent of pregnant women are currently being treated for depression with antidepressants. In the current study, 1,054 children were diagnosed with autism (0.72% of the children in the study), on average at 4.5 years of age. Moreover, the prevalence of autism amongst children has increased from 4 in 10,000 children in 1966 to 100 in 10,000 today. While that increase can be attributed to both better detection and widening criteria for diagnosis, researchers believe that environmental factors are also playing a part. "It is biologically plausible that anti-depressants are causing autism if used at the time of brain development in the womb, as serotonin is involved in numerous pre- and postnatal developmental processes, including cell division, the migration of neurons, cell differentiation and synaptogenesis - the creation of links between brain cells," Prof. Bérard explained. "Some classes of anti-depressants work by inhibiting serotonin (SSRIs and some other antidepressant classes), which will have a negative impact on the ability of the brain to fully develop and adapt in utero."
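
To put the reported 87% figure in perspective, the sketch below converts that relative increase into a rough absolute rate using only the numbers quoted above (the cohort-wide rate of 0.72% as a crude baseline). It is an illustrative back-of-envelope calculation, not a re-analysis of the study's adjusted hazard ratios.

    # Rough illustration of relative vs. absolute risk, using only figures quoted above.
    # Assumption: the cohort-wide diagnosis rate of 0.72% is treated as a crude baseline;
    # the study itself reports adjusted hazard ratios, so this is illustrative only.
    baseline_rate = 0.0072      # ~0.72% of children in the cohort diagnosed with autism
    relative_increase = 0.87    # 87% increased risk reported for 2nd/3rd-trimester exposure

    exposed_rate = baseline_rate * (1 + relative_increase)

    print(f"Baseline rate: {baseline_rate:.2%}")              # 0.72%
    print(f"Illustrative exposed rate: {exposed_rate:.2%}")   # about 1.35%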

The World Health Organization indicates that depression will be the second leading cause of global disease burden by 2020, which leads the researchers to believe that antidepressants will likely remain widely prescribed, including during pregnancy. "Our work contributes to a better understanding of the long-term neurodevelopmental effects of anti-depressants on children when they are used during gestation. Uncovering the outcomes of these drugs is a public health priority, given their widespread use," Prof. Bérard said.



Vegetarian and 'healthy' diets are more harmful to the environment than some meat-based diets


Contrary to recent headlines -- and a talk by actor Arnold Schwarzenegger at the United Nations Paris Climate Change Conference -- eating a vegetarian diet could contribute to climate change.

In fact, according to new research from Carnegie Mellon University, following the USDA recommendations to consume more fruits, vegetables, dairy and seafood is more harmful to the environment because those foods have relatively high resource uses and greenhouse gas (GHG) emissions per calorie. Published in Environment Systems and Decisions, the study measured the changes in energy use, blue water footprint and GHG emissions associated with U.S. food consumption patterns.

"Eating lettuce is over three times worse in greenhouse gas emissions than eating bacon," said Paul Fischbeck, professor of social and decisions sciences and engineering and public policy. "Lots of common vegetables require more resources per calorie than you would think. Eggplant, celery and cucumbers look particularly bad when compared to pork or chicken."

Fischbeck, Michelle Tom, a Ph.D. student in civil and environmental engineering, and Chris Hendrickson, the Hamerschlag University Professor of Civil and Environmental Engineering, studied the food supply chain to determine how the obesity epidemic in the U.S. is affecting the environment. Specifically, they examined how growing, processing and transporting food, food sales and service, and household storage and use take a toll on resources in the form of energy use, water use and GHG emissions.

On one hand, the results showed that getting our weight under control and eating fewer calories has a positive effect on the environment and reduces energy use, water use and GHG emissions from the food supply chain by approximately 9 percent.

However, eating the recommended "healthier" foods -- a mix of fruits, vegetables, dairy and seafood -- increased the environmental impact in all three categories: Energy use went up by 38 percent, water use by 10 percent and GHG emissions by 6 percent.

"There's a complex relationship between diet and the environment," Tom said. "What is good for us health-wise isn't always what's best for the environment. That's important for public officials to know and for them to be cognizant of these tradeoffs as they develop or continue to develop dietary guidelines in the future."

Low levels of vitamin D may increase risk of stress fractures in active individuals



Vitamin D plays a crucial role in ensuring appropriate bone density. Active individuals who enjoy participating in higher impact activities may need to maintain higher vitamin D levels to reduce their risk of stress fractures, report investigators in The Journal of Foot & Ankle Surgery.

The role of vitamin D in the body has recently become a subject of increasing interest owing to its many physiologic effects throughout multiple organ systems. Vitamin D is an essential nutrient that can behave as a hormone. It is obtained through diet and through the skin when exposed to the sun's rays. It is essential for bone development and remodeling to ensure appropriate bone mass density. Low levels of vitamin D can lead to osteoporosis, osteomalacia, decreased bone mineral density, and risk of acute fracture.

Investigators tested the serum concentration of 25(OH)D, which is used to determine vitamin D status, in patients with confirmed stress fractures. "By assessing the average serum vitamin D concentrations of people with stress fractures and comparing these with the current guidelines, we wanted to encourage a discussion regarding whether a higher concentration of serum vitamin D should be recommended for active individuals," explained lead investigator Jason R. Miller, DPM, FACFAS, Fellowship Director of the Pennsylvania Intensive Lower Extremity Fellowship, foot and ankle surgeon from Premier Orthopedics and Sports Medicine, in Malvern, Pennsylvania, and Fellow Member of the American College of Foot and Ankle Surgeons.
The investigators reviewed the medical records of patients who experienced lower extremity pain, with a suspected stress fracture, over a three-year period from August 2011 to July 2014. All patients had x-rays of the affected extremity and were then sent for magnetic resonance imaging (MRI) if no acute fracture had been seen, yet concern for the presence of a stress fracture remained based on the physical examination findings. Musculoskeletal radiologists independently reviewed all the MRI scans, and the investigators then confirmed the diagnosis of a stress fracture after a review of the images.

The serum vitamin D level was recorded within three months of diagnosis for 53 (42.74%) of these patients. Using the standards recommended by the Vitamin D Council (sufficient range 40 to 80 ng/mL), more than 80% of these patients would have been classified as having insufficient or deficient vitamin D levels. According to the standards set by the Endocrine Society (sufficient range 30 to 100 ng/mL), over 50% had insufficient levels.
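
As a minimal sketch, the snippet below classifies a serum 25(OH)D value against the two guideline ranges quoted above; the category labels are simplified (everything below the sufficient range is lumped together rather than split into the guidelines' insufficient and deficient tiers).

    # Classify a serum 25(OH)D level (ng/mL) against the two guideline ranges above.
    # Simplification: values below the sufficient range are not split into the
    # guidelines' finer "insufficient" vs. "deficient" categories.
    GUIDELINES = {
        "Vitamin D Council": (40, 80),
        "Endocrine Society": (30, 100),
    }

    def classify_25ohd(level_ng_ml: float, guideline: str) -> str:
        low, high = GUIDELINES[guideline]
        if level_ng_ml < low:
            return "below sufficient range"
        if level_ng_ml > high:
            return "above sufficient range"
        return "sufficient"

    print(classify_25ohd(25, "Vitamin D Council"))   # below sufficient range
    print(classify_25ohd(25, "Endocrine Society"))   # below sufficient range
    print(classify_25ohd(45, "Vitamin D Council"))   # sufficient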

"Based on these findings, we recommend a serum vitamin D level of at least 40 ng/mL to protect against stress fractures, especially for active individuals who enjoy participating in higher impact activities," explained Dr. Miller. "This correlates with an earlier study of 600 female Navy recruits who were found to have a twofold greater risk of stress fractures of the tibia and fibula with a vitamin D level of less than 20 ng/mL compared with females with concentrations above 40 ng/mL

"However, vitamin D is not the sole predictor of a stress fracture and we recommend that individuals who regularly exercise or enjoy participating in higher impact activities should be advised on proper and gradual training regimens to reduce the risk of developing a stress fracture," he concluded.


Wednesday, December 9, 2015

Getting the most from your stretching routine



For over 30 years, from the 1960s to the late 1990s, fitness professionals, enthusiasts and athletes were told that static stretching (stretching muscles while the body is at rest) was important for increased flexibility, improved performance and injury reduction. This period was followed by 15 years of being told that static stretching could cause performance impairments and that it does not reduce injury risk, resulting in a dramatic switch to dynamic stretching, where movements are performed through large ranges of motion usually at a fast speed. As a result, many people no longer perform static stretching before exercise or playing sports.


A comprehensive review of the literature published today brings new recommendations to fitness enthusiasts, athletes, coaches and rehabilitation practitioners. Upon reviewing hundreds of studies, the researchers found that static stretching, when incorporated into a full warm-up routine that includes an initial aerobic component, static and dynamic stretching, and then active and dynamic sport-specific activities, should not result in significant performance impairments and may reduce muscle strain injury risk. The systematic review also highlighted the lack of scientific data regarding the effects of dynamic stretching on injury risk.
"It is important for fitness professionals and enthusiasts, coaches, rehabilitation professionals and other scientists to critically assess the findings of fitness studies" says Dr. David Behm, Memorial University of Newfoundland and lead author of the study. "Many studies over the last 15 years did not include a full warm-up, something that most athletes do regularly. Many studies also tested stretches that were held much longer than what is typically done," continued Dr. Behm. "Before incorporating new findings into your fitness activities, think about how the study applies to your situation and activities". 
"CSEP strongly supports promoting physical activity for healthy outcomes and equally important to that are warm up routines that increase range of motion and decrease muscle injury," says Dr. Phil Chilibeck, CSEP Chair. "The recommendation in the CSEP Position Stand is that all components of a warmup be included with appropriate duration of stretching. The inclusion of static, or Proprioceptive Neuromuscular Facilitation (PNF), stretching is recommended and has the potential to positively influence the standard warmup routines of a large number of athletes."


Research linking healthy lifestyle factors, such as maintaining a healthy weight and exercise, with improved breast cancer survival


It is well documented that a healthy diet and exercise are key in cancer prevention and management, but the exact mechanism hasn't been clear. Now, Yale Cancer Center researchers have found an explanation in the tiny protective ends of chromosomes called telomeres. The findings will be presented Dec. 11 at the 2015 San Antonio Breast Cancer Symposium.
The researchers used a previously published Yale weight-loss intervention study called LEAN to examine how body fat and weight loss through lifestyle changes are associated with telomere length in breast cancer survivors enrolled in a weight-loss trial. Telomeres shorten with cell division and are associated with aging and increased risk of breast cancer mortality.
The Yale study -- among the few to explore a link between weight loss and telomere length in breast cancer survivors -- found that telomeres were slower to shorten in breast cancer survivors who lost weight through diet and exercise. In some cases, telomere shortening even reversed, said the study's first author Tara Sanft, M.D., assistant professor of medical oncology.
"Our results indicate that having higher body fat levels is associated with shorter telomere length, and weight loss was associated with an increase in telomere length," Sanft said. "This suggests that telomere length may be a mechanism mediating the relationship between obesity and breast cancer risk and mortality."
The study's senior author, Melinda Irwin, professor of epidemiology and associate director for Population Sciences at the Yale Cancer Center, said the growing body of research linking healthy lifestyle factors, such as maintaining a healthy weight and exercise, with improved breast cancer survival is compelling.
"With the consistent findings of weight loss and exercise improving potential mechanisms related to breast cancer mortality, we feel there should be a shift in how breast cancer care is delivered, as well as increased access to and reimbursement of lifestyle behavioral counseling and programs," Irwin said.

Stereotypes around aging can negatively impact memory and hearing


A study led by researchers at the University of Toronto shows that when older adults feel negatively about aging, they may lack confidence in their abilities to hear and remember things, and perform poorly at both.


"People's feelings about getting older influence their sensory and cognitive functions," said Alison Chasteen, professor in U of T's Department of Psychology and lead author of the study published in Psychology and Aging. "Those feelings are often rooted in stereotypes about getting older and comments made by those around them that their hearing and memory are failing. So, we need to take a deeper and broader approach to understanding the factors that influence their daily lives."
In the study, the researchers examined three variables - views on aging, self-perceptions of one's abilities to hear and remember, and one's actual performance of both functions - to uncover connections between them. It marks the first time all three factors were studied together using the same group of subjects.
A sample of 301 adults between the ages of 56 and 96 completed standard hearing tests to determine their ability to hear. These were followed by a series of recall tasks to test their memory. Subjects viewed a list of 15 words on a computer screen and listened to a different list of words on headphones. They then wrote down as many words as they could recall. A third test required them to listen to and repeat a list of five words, and then recall them after a five-minute delay. This provided an accurate measurement of each participant's performance in both functions.
Participants then responded to a series of questions and statements relating to their own perceptions of their hearing and memory abilities. They were asked to agree or disagree with statements such as: "I am good at remembering names," or "I can easily have a conversation on the telephone."
To assess their views on aging, they were asked to imagine 15 scenarios and rate their concerns about each based on age. In one, they were asked to imagine they were involved in a car accident in which it was unclear who was at fault, and indicate how worried they would be about being blamed for the accident because of their age. They were also asked how much they worried about being alone as they got older, losing their independence, becoming more forgetful, and finding contentment in their lives.
"Those who held negative views about getting older and believed they had challenges with their abilities to hear and remember things, also did poorly on the hearing and memory tests," said Chasteen.
"That's not to say all older adults who demonstrate poor capacities for hearing and memory have negative views of aging," said Chasteen. "It's not that negative views on aging cause poor performance in some functions, there is simply a strong correlation between the two when a negative view impacts an individual's confidence in the ability to function."
Chasteen said the perceptions older people have about their abilities to function and how they feel about aging must be considered when determining their cognitive and sensory health. She recommends educating older people about ways in which they can influence their aging experience, including providing them with training exercises to enhance their cognitive and physical performance, and dispelling stereotypes about aging.
"Knowing that changing how older adults feel about themselves could improve their abilities to hear and remember will enable the development of interventions to improve their quality of life."
The results are described in the article titled "Do Negative Views of Aging Influence Memory and Auditory Performance Through Self-Perceived Abilities?" published in the December issue of Psychology and Aging. The study was coauthored by researchers at the University of Toronto, Baycrest Health Sciences, the James H. Quillen VA Medical Center, and Phonak AG. The research was supported by a Catalyst Grant from the Canadian Institutes of Health Research.

Long nights and lazy days could send you to an early grave


Sleeping more than nine hours a night, and sitting too much during the day could be a hazardous combination, particularly when added to a lack of exercise, according to new findings to emerge from the Sax Institute's 45 and Up Study. 
The findings, published today in the journal PLOS Medicine, show that a person who sleeps too much, sits too much and isn't physically active enough is more than four times as likely to die early as a person without those unhealthy lifestyle habits. (Too much sitting equates to more than 7 hours a day and too little exercise is defined as less than 150 minutes a week.)
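
As a minimal sketch, the snippet below encodes those three thresholds as a simple check; the parameter names are illustrative and not the study's variable definitions.

    # Flag the risky combination described above: >9 hours of sleep a night,
    # >7 hours of sitting a day, and <150 minutes of physical activity a week.
    # Parameter names are illustrative, not the study's variable definitions.
    def has_risky_combination(sleep_hours_per_night: float,
                              sitting_hours_per_day: float,
                              activity_minutes_per_week: float) -> bool:
        too_much_sleep = sleep_hours_per_night > 9
        too_much_sitting = sitting_hours_per_day > 7
        too_little_exercise = activity_minutes_per_week < 150
        return too_much_sleep and too_much_sitting and too_little_exercise

    print(has_risky_combination(9.5, 8, 60))   # True: all three thresholds crossed
    print(has_risky_combination(7.5, 8, 200))  # False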
"Evidence has increased in recent years to show that too much sitting is bad for you and there is growing understanding about the impact of sleep on our health but this is the first study to look at how those things might act together," said lead author Dr Melody Ding.
"When you add a lack of exercise into the mix, you get a type of 'triple whammy' effect. Our study shows that we should really be taking these behaviours together as seriously as we do other risk factors such as levels of drinking and unhealthy eating patterns. "
Dr Ding and her colleagues from the University of Sydney analysed the health behaviours of more than 230,000 of the participants in the 45 and Up Study - Australia's largest study - which is looking at the health of our population as we age.
They looked at lifestyle behaviours that are already known to increase the risk of death and disease - smoking, high alcohol intake, poor diet and being physically inactive - and added excess sitting time and too little/too much sleep into the equation. They then looked at different combinations of all of these risk factors to see which groupings had the most impact on a person's risk of dying prematurely from any cause. 
As well as new evidence on the risky combination of prolonged sleep, sitting and lack of exercise, the researchers also found another problematic triple threat: smoking, high alcohol intake and lack of sleep (less than 7 hours a night) is also linked to a more than four-times greater risk of early death.
And several other combinations led to more than double the risk of early death:
  • Being physically inactive + too much sleep
  • Being physically inactive + too much sitting
  • Smoking + high alcohol intake
"The take-home message from this research - for doctors, health planners and researchers - is that if we want to design public health programs that will reduce the massive burden and cost of lifestyle-related disease we should focus on how these risk factors work together rather than in isolation," said study co-author Professor Adrian Bauman.
"These non-communicable diseases (such as heart disease, diabetes and cancer) now kill more than 38 million people around the world - and cause more deaths than infectious disease. Better understanding what combination of risk behaviours poses the biggest threat will guide us on where to best target scarce resources to address this major - and growing - international problem."