Tuesday, October 31, 2017

Being unaware of memory loss predicts Alzheimer's disease


While memory loss is an early symptom of Alzheimer's disease, its presence doesn't mean a person will develop dementia. A new study at the Centre for Addiction and Mental Health (CAMH) has found a clinically useful way to predict who won't develop Alzheimer's disease, based on patients' awareness of their memory problems.

People who were unaware of their memory loss, a condition called anosognosia, were more likely to progress to Alzheimer's disease, according to the study, published today in the Journal of Clinical Psychiatry. Those who were aware of memory problems were unlikely to develop dementia.

"If patients complain of memory problems, but their partner or caregiver isn't overly concerned, it's likely that the memory loss is due to other factors, possibly depression or anxiety," says lead author Dr. Philip Gerretsen, Clinician Scientist in CAMH's Geriatric Division and Campbell Family Mental Health Research Institute. "They can be reassured that they are unlikely to develop dementia, and the other causes of memory loss should be addressed."

In other cases, the partner or caregiver is more likely to be distressed while patients don't feel they have any memory problems. In Alzheimer's disease, lack of awareness is linked to more burden on caregivers. Both unawareness of illness (anosognosia) and memory loss (known as mild cognitive impairment) can be objectively assessed using questionnaires.

The study, believed to be the largest of its kind on illness awareness, had data on 1,062 people aged 55 to 90 from the Alzheimer's Disease Neuroimaging Initiative (ADNI). This included 191 people with Alzheimer's disease, 499 with mild cognitive impairment and 372 as part of the healthy comparison group.

The researchers also wanted to identify which parts of the brain were affected in impaired illness awareness. They examined the brain's uptake of glucose, a type of sugar. Brain cells need glucose to function, but glucose uptake is impaired in Alzheimer's disease.

Using PET brain scans, they showed that those with impaired illness awareness also had reduced glucose uptake in specific brain regions, even when accounting for other factors linked to reduced glucose uptake, such as age and degree of memory loss.

As the next stage of this research, Dr. Gerretsen will be tracking older adults with mild cognitive impairment who are receiving an intervention to prevent Alzheimer's dementia. This ongoing study, the PACt-MD study, combines brain training exercises and brain stimulation, using a mild electrical current to stimulate brain cells and improve learning and memory. While the main study is focused on dementia prevention, Dr. Gerretsen will be looking at whether the intervention improves illness awareness in conjunction with preventing progression to dementia.
 
 

Monday, October 30, 2017

Black Licorice: Trick or Treat?

As it turns out, you really can overdose on candy—or, more precisely, black licorice.

Days before the biggest candy-eating holiday of the year, the Food and Drug Administration (FDA) encourages moderation if you enjoy snacking on this old-fashioned favorite.

So, if you’re getting your stash ready for Halloween, here’s some advice from FDA:

If you’re 40 or older, eating 2 ounces of black licorice a day for at least two weeks could land you in the hospital with an irregular heart rhythm or arrhythmia.

FDA experts say black licorice contains glycyrrhizin, a sweetening compound derived from licorice root. Glycyrrhizin can cause potassium levels in the body to fall. When that happens, some people experience abnormal heart rhythms, as well as high blood pressure, edema (swelling), lethargy, and congestive heart failure.

FDA’s Linda Katz, M.D., says last year the agency received a report of a black licorice aficionado who had a problem after eating the candy. And several medical journals have linked black licorice to health problems in people over 40, some of whom had a history of heart disease and/or high blood pressure.

Katz says potassium levels are usually restored with no permanent health problems when consumption of black licorice stops.

Licorice, or liquorice, is a low-growing shrub mostly grown for commercial use in Greece, Turkey, and Asia. The National Institutes of Health (NIH) says the plant’s root has a long history of use as a folk or traditional remedy in both Eastern and Western medicine. It has been used as a treatment for heartburn, stomach ulcers, bronchitis, sore throat, cough and some infections caused by viruses, such as hepatitis; however, NIH says there are insufficient data available to determine if licorice is effective in treating any medical condition.

Licorice is also used as a flavoring in food. Many “licorice” or “licorice flavor” products manufactured in the United States do not contain any licorice. Instead, they contain anise oil, which has the same smell and taste. Licorice root that is sold as a dietary supplement can be found with the glycyrrhizin removed, resulting in a product known as deglycyrrhizinated licorice, or DGL, NIH says.

If you have a fondness for black licorice, FDA is offering this advice:
  • No matter what your age, don’t eat large amounts of black licorice at one time.
  • If you have been eating a lot of black licorice and have an irregular heart rhythm or muscle weakness, stop eating it immediately and contact your healthcare provider.
  • Black licorice can interact with some medications, herbs and dietary supplements. Consult a health care professional if you have questions about possible interactions with a drug or supplement you take.

Less than half of patients prescribed new cholesterol drug receive insurance approval



Less than half of patients received their insurer's approval for prescriptions of PCSK9 inhibitors, according to new research in the American Heart Association's journal Circulation.

PCSK9 inhibitors, like Repatha (evolocumab) and Praluent (alirocumab), work by increasing the removal of low-density lipoprotein (LDL) or "bad" cholesterol from the blood. They have been shown to reduce LDL by 60 percent and decrease major cardiac events but cost much more than other cholesterol-lowering drugs with an average cost of $14,300 per year. Prescriptions require prior authorization by health insurance companies.

In a nationwide review of pharmacy claims combined with electronic medical record (EMR) laboratory results for 9,357 patients prescribed the drugs between July 2015 and August 2016, 4,397 (47 percent) were approved for PCSK9 inhibitor therapy and 4,960 (53 percent) were rejected. Sixty percent of those patients had a history of atherosclerotic cardiovascular disease (plaque buildup in the arteries), while 40 percent did not.

"With the controversy surrounding whether or not these drugs were cost effective, we were anticipating that there might be some reluctance by insurance companies to cover these medications," said senior author Robert Yeh, M.D., director of the Smith Center for Outcomes Research in Cardiology at Beth Israel Deaconess Medical Center in Boston.

"However, we were surprised by the very high rate of rejection, even when prescribed to patients with known atherosclerotic cardiovascular disease, very high LDL levels and those who were intolerant of statins, for example," he said.

Researchers also found that the most significant factor associated with approval rates was insurance type, with the lowest approval rates for private insurance and the highest approval for Medicare.

"Whether or not we can agree on the cost-effectiveness of these drugs, I believe most would agree that one's access to medications should be driven primarily by the strength of the indications for the prescription as opposed to what drug plan you happen to carry," said Yeh, who is also an Associate Professor of Medicine at Harvard Medical School.

"Approximately 1 out of 3 patients who had their prescription approved did not purchase or receive the medication. Those patients who didn't purchase their medication had an out-of-pocket cost that was more than twice as high as those who did purchase it," said Gregory Hess, M.D., who is first author of the study and a Senior Fellow of Health Economics at the University of Pennsylvania and Chief Medical Officer at Symphony Health.

The findings are based on Symphony Health's HIPAA-compliant patient-level data from all fifty states and all payer types.

"Better education for providers prescribing these medications and more uniform guidelines by insurers about what will and will not be covered are necessary to reduce the amount of administrative waste that is created to reject prescriptions for new medications," Yeh said.

The study was a retrospective analysis and could not determine whether patients suffered any harm from the rejection of these prescriptions.

Group exercise improves quality of life, reduces stress far more than individual workouts



Researchers found working out in a group lowers stress by 26 percent and significantly improves quality of life, while those who exercised individually put in more effort but experienced no significant changes in their stress levels and only a limited improvement in quality of life, according to a study published in The Journal of the American Osteopathic Association.

"The communal benefits of coming together with friends and colleagues, and doing something difficult, while encouraging one another, pays dividends beyond exercising alone," said Dayna Yorks, DO, lead researcher on this study. "The findings support the concept of a mental, physical and emotional approach to health that is necessary for student doctors and physicians."

Dr. Yorks and her fellow researchers at the University of New England College of Osteopathic Medicine recruited 69 medical students--a group known for high levels of stress and self-reported low quality of life--and allowed them to self-select into a twelve-week exercise program, either within a group setting or as individuals. A control group abstained from exercise other than walking or biking as a means of transportation.

Every four weeks, participants completed a survey asking them to rate their levels of perceived stress and quality of life in three categories: mental, physical and emotional.

Those participating in group exercise spent 30 minutes at least once a week in CXWORX, a core strengthening and functional fitness training program. At the end of the twelve weeks, their mean monthly survey scores showed significant improvements in all three quality of life measures: mental (12.6 percent), physical (24.8 percent) and emotional (26 percent). They also reported a 26.2 percent reduction in perceived stress levels.

By comparison, individual fitness participants were allowed to maintain any exercise regimen they preferred, which could include activities like running and weight lifting, but they had to work out alone or with no more than two partners. On average the solitary exercisers worked out twice as long, and saw no significant changes in any measure, except in mental quality of life (11 percent increase). Similarly, the control group saw no significant changes in quality of life or perceived stress.

"Medical schools understand their programs are demanding and stressful. Given this data on the positive impact group fitness can have, schools should consider offering group fitness opportunities," said Dr. Yorks. "Giving students an outlet to help them manage stress and feel better mentally and physically can potentially alleviate some of the burnout and anxiety in the profession."

Intake of pesticide residue from fruits, vegetables and infertility treatment outcomes



Bottom Line: Eating more fruits and vegetables with high-pesticide residue was associated with a lower probability of pregnancy and live birth following infertility treatment for women using assisted reproductive technologies.
The Research Question: Is preconception intake of fruits and vegetables with pesticide residues associated with outcomes of assisted reproductive technologies?
Why The Question is Interesting: Animal studies suggest ingestion of pesticide mixtures in early pregnancy may be associated with fewer live-born offspring, leading to concerns that levels of pesticide residues permitted in food by the U.S. Environmental Protection Agency may still be too high for pregnant women and infants.
Who: 325 women who completed a diet questionnaire and subsequently underwent cycles of assisted reproductive technologies as part of the Environment and Reproductive Health (EARTH) study at a fertility center at a teaching hospital in Boston.
When: Between 2007 and 2016
Study Measures: Researchers categorized fruits and vegetables as having high or low pesticide residues using a method based on surveillance data from the U.S. Department of Agriculture. They counted the number of confirmed pregnancies and live births per cycle of fertility treatment.
Design: This is an observational study. In observational studies, researchers observe exposures and outcomes for patients as they occur naturally in clinical care or real life. Because researchers are not intervening for purposes of the study they cannot control natural differences that could explain study findings so they cannot prove a cause-and-effect relationship.
Authors: Jorge E. Chavarro, M.D., Sc.D., of the Harvard T. H. Chan School of Public Health, Boston, and colleagues
Results: Eating more high-pesticide residue fruits and vegetables (for example, strawberries and raw spinach) was associated with a lower probability of pregnancy and live birth following infertility treatment. Eating more low-pesticide residue fruits and vegetables was not associated with worse pregnancy and live birth outcomes.
Study Limitations: The study estimated exposure to pesticides based on women's self-reported intake combined with pesticide residue surveillance data rather than through direct measurement. The study also cannot link specific pesticides to adverse effects.
Study Conclusions: "In conclusion, intake of high-residue FVs [fruits and vegetables] was associated with lower probabilities of clinical pregnancy and live birth among women undergoing infertility treatment."

High usage of botanical and herbal remedies


A new study comparing use of herbal remedies among Hispanic women and non-Hispanic white women showed higher than expected use of herbal treatments by both groups, 89% and 81%, respectively. Notably, less than 1 in 6 Hispanic women and only a third of white women discussed the use of herbal treatments with their doctors, as reported in an article published in The Journal of Alternative and Complementary Medicine (JACM), a peer-reviewed publication from Mary Ann Liebert, Inc., publishers. The article is available free on the JACM website.

In "Prevalence of Complementary and Alternative Medicine and Herbal Remedy Use in Hispanic and Non-Hispanic White Women: Results from the Study of Women's Health Across the Nation," authors Robin Green, PsyD, et al., from Albert Einstein College of Medicine (Bronx, NY), University of Colorado School of Medicine and School of Public Health (Aurora), and University of Washington (Seattle), examined the use of integrative medicine approaches such as botanical and herbal remedies. The highest reported use of herbals was as teas. The researchers emphasized the need for physicians to ask patients about herbal treatments, both to identify potential interactions with conventional medications and to learn whether patients are using herbals as replacements for them.

Depression is on the rise in the US, especially among young teens




Depression is on the rise in the United States, according to researchers at Columbia University's Mailman School of Public Health and the CUNY Graduate School of Public Health and Health Policy. From 2005 to 2015, depression rose significantly among Americans age 12 and older with the most rapid increases seen in young people. The findings appear online in the journal Psychological Medicine.

This is the first study to identify trends in depression by gender, income, and education over the past decade.

"Depression appears to be increasing among Americans overall, and especially among youth," said Renee Goodwin, PhD, of the Department of Epidemiology, Mailman School of Public Health, who led the research. "Because depression impacts a significant percentage of the U.S. population and has serious individual and societal consequences, it is important to understand whether and how the prevalence of depression has changed over time so that trends can inform public health and outreach efforts."

The results show that depression increased significantly among persons in the U.S. from 2005 to 2015, from 6.6 percent to 7.3 percent. Notably, the rise was most rapid among those ages 12 to 17, increasing from 8.7 percent in 2005 to 12.7 percent in 2015.

Data were drawn from 607,520 respondents to the National Survey on Drug Use and Health, an annual U.S. study of persons ages 12 and over. The researchers examined the prevalence of past-year depression annually among respondents based on DSM-IV criteria.

The increase in rates of depression was most rapid among the youngest and oldest age groups, whites, the lowest income and highest income groups, and those with the highest education levels. These results are in line with recent findings on increases in drug use, deaths due to drug overdose, and suicide.

"Depression is most common among those with least access to any health care, including mental health professionals. This includes young people and those with lower levels of income and education," noted Goodwin. "Despite this trend, recent data suggest that treatment for depression has not increased, and a growing number of Americans, especially socioeconomically vulnerable individuals and young persons, are suffering from untreated depression. Depression that goes untreated is the strongest risk factor for suicidal behavior, and recent studies show that suicide attempts have increased in recent years, especially among young women."

Depression frequently remains undiagnosed, yet it is among the most treatable mental disorders, noted the researchers. "Identifying subgroups that are experiencing significant increases in depression can help guide the allocation of resources toward avoiding or reducing the individual and societal costs associated with depression," said Goodwin.

Breastfeeding for two months halves risk of SIDS


Breastfeeding for at least two months cuts a baby's risk of Sudden Infant Death Syndrome almost in half, a sweeping new international study has found.

The study, published in the journal Pediatrics, determined that mothers do not need to breastfeed exclusively for their baby to get the benefit, potentially good news for moms who can't or choose not to rely solely on breastfeeding.

"These results are very powerful! Our study found that babies who are breastfed for at least two months have a significant reduction in their risk of dying from SIDS," said researcher Kawai Tanabe, MPH, of the University of Virginia School of Medicine. "Breastfeeding is beneficial for so many reasons, and this is really an important one."

Preventing SIDS
 
Previous studies have suggested that breastfeeding was associated with a decreased risk of SIDS, the leading cause of death of babies between one month and one year of age, but this study is the first to determine the duration necessary to provide that protection. The researchers found, after adjusting for variables that could distort their results, that breastfeeding for at least two months was associated with a significantly decreased risk. Breastfeeding for less than two months did not offer such a benefit.

"Breastfeeding for just two months reduces the risk of SIDS by almost half, and the longer babies are breastfed, the greater the protection," said researcher Fern Hauck, MD, of the UVA School of Medicine and the UVA Children's Hospital. "The other important finding from our study is that any amount of breastfeeding reduces the risk of SIDS -- in other words, both partial and exclusive breastfeeding appear to provide the same benefit."

To determine the effects of breastfeeding on SIDS risk, the researchers analyzed eight major international studies that examined 2,259 cases of SIDS and 6,894 control infants where death did not occur. This large collective sample demonstrated the consistency of findings despite differing cultural behaviors across countries, and it provides convincing evidence of the reliability of the findings.

Based on their results, the researchers are calling for "ongoing concerted efforts" to increase rates of breastfeeding around the world. Data from 2007 showed that a quarter of U.S. babies had never been breastfed, the researchers report. (The World Health Organization has established a goal of having more than half of infants worldwide being breastfed exclusively for at least six months by 2025.)

"It's great for mothers to know that breastfeeding for at least two months provides such a strong protective effect against SIDS," said researcher Rachel Moon, MD, of the UVA School of Medicine and the UVA Children's Hospital. "We strongly support international and national efforts to promote breastfeeding."

It remains unclear why breastfeeding protects against SIDS, though the researchers cite factors such as immune benefits and effects on infant sleeping patterns as possible mechanisms.

Fish oil or fish consumption? New recommendations for pregnant women trying to prevent childhood asthma


Pregnant women who consume fish rather than fish oil supplements are just as likely to protect their offspring from developing asthma.

Researchers at the University of South Florida in Tampa, Fla., recently published a scientific review of two studies concluding that children whose mothers consumed high-dose omega-3 fatty acids daily during the 3rd trimester were less likely to develop such breathing problems.

However, co-authors Richard Lockey, MD, and Chen Hsing Lin, MD, suggest pregnant women can receive the same benefit by following the Food and Drug Administration's and Environmental Protection Agency's recommendation to consume 8-12 ounces (2-3 servings) of low-mercury fish a week.

The review, published in the Journal of Allergy and Clinical Immunology: In Practice, examined two articles. The New England Journal of Medicine study included 346 pregnant women in their 3rd trimester who took omega-3 fatty acids daily and 349 who took a placebo. The investigators also divided the trial population into three groups based on their blood levels of omega-3 fatty acids. The population with the lowest blood levels benefited the most from fish oil supplementation.

The Journal of Allergy and Clinical Immunology study randomized pregnant women in their 3rd trimester into fish oil, placebo and "no oil" groups. The fish oil group took omega-3 fatty acids daily, as did the placebo (olive oil) group. The "no oil" group was informed of the trial proposal and therefore could consume fish oil or fish during the 3rd trimester if they chose to do so. Researchers found that offspring in the fish oil and "no oil" groups took less asthma medication through age 24, suggesting that both groups developed less asthma.

"Omega-3 fatty acids cannot be synthesized by humans and therefore are essential nutrients which are derived exclusively from marine sources," said Lin. "It may be premature to recommend daily high dose fish oil supplementation during the 3rd trimester."

"At an almost equal or slightly higher cost, consuming 8-12 ounces (2-3 servings) of fish a week may not only attain the same asthma protection but also strengthen the nutritional benefits to infant growth and development," said Lockey.

For older adults with diabetes, losing weight with diet, exercise can improve circulation



Type 2 diabetes affects blood circulation. The disease stiffens blood vessels and reduces the amount of oxygen that circulates throughout your body. This includes your brain. When blood flow in the brain is impaired, it can affect the way we think and make decisions.


People who have type 2 diabetes are often overweight or obese. These are conditions that may also be linked to cognitive problems (problems with thinking abilities). Lowering calorie intake and increasing physical activity are known to reduce the negative effects of type 2 diabetes on the body. However, the effects of these interventions on cognition and the brain are not clear.

Recently, researchers examined information from a 10-year-long study called Action for Health in Diabetes (Look AHEAD). In this study, participants learned how to adopt healthy, long-term behavior changes. In their new study, the researchers focused on whether participants with type 2 diabetes who lowered calories in their diet and increased physical activity had better blood flow to the brain. The researchers published their findings in the Journal of the American Geriatrics Society.

Researchers assigned participants to one of two groups. The first group was called the Intensive Lifestyle Intervention. In this group, participants were given a daily goal of eating between 1,200 and 1,800 calories, based on their initial weight, in order to lose weight. They also had a goal of 175 minutes of physical activity during the week, through activities such as brisk walking.

Participants were seen weekly for the first six months, and three times a month for the next six months. During years 2 through 4, they were seen at least once a month and were regularly contacted by phone or email. They were also encouraged to join group classes. At the study's end, participants were encouraged to continue individual monthly sessions and other activities.

The second group was called the "control group". The control group attended Diabetes Support and Education classes. The researchers compared the control group to the group that participated in the lifestyle intervention.

About ten years after enrollment, 321 participants completed an MRI brain scan. (An MRI scan is a non-invasive medical test that uses a powerful magnetic field, radio frequency pulses, and a computer to produce detailed pictures of the brain.) Of those MRIs, 97 percent met quality control standards set by the researchers for their study.

During the study, the participants had their mental functions tested, including their verbal learning, memory, decision-making ability, and other cognitive functions.

The researchers looked at the group of adults who were overweight or obese at the beginning of the study. They concluded that in that group, those who did the long-term behavioral intervention had greater blood flow in the brain. Furthermore, blood flow tended to be greatest among those who did not do as well on tests of mental functions. This may show how the brain may adapt in response to cognitive decline.

However, the researchers also found that for the heaviest individuals, the intervention may have worked differently. This suggests that the intervention may have been most effective in increasing or maintaining blood flow in the brain for individuals who were overweight but not obese.

Friday, October 27, 2017

A better way to wash pesticides off apples




Polishing an apple with your shirt might remove some dust and dirt, but getting rid of pesticide residues could take a little more work. Researchers now report in ACS' Journal of Agricultural and Food Chemistry that washing apples with a common household product -- baking soda -- could do the trick for residues on the surfaces of the fruit.

The use of pesticides can help increase crop yield, but concerns over their potential effects on human health have been raised over the years. Washing could be one effective strategy to clean pesticides off produce, and it is standard practice in the food industry. But some of the plant-protecting compounds that get absorbed by fruits and vegetables might not be easily removed using current cleaning methods. Lili He and colleagues wanted to find out which washing method can most effectively reduce pesticides.

The researchers applied two common pesticides -- the fungicide thiabendazole, which past research has shown can penetrate apple peels, and the insecticide phosmet -- to organic Gala apples. They then washed these apples with three different liquids: tap water, a 1 percent baking soda/water solution, and a U.S.-EPA-approved commercial bleach solution often used on produce. The baking soda solution was the most effective at reducing pesticides. After 12 and 15 minutes, 80 percent of the thiabendazole and 96 percent of the phosmet were removed, respectively. The different percentages are likely due to thiabendazole's greater absorption into the apple. Mapping images showed that thiabendazole had penetrated up to 80 micrometers deep into the apples; phosmet was detected at a depth of only 20 micrometers. Washing the produce with either plain tap water or the bleach solution for two minutes, per the industry standard, was far less effective.

Cataract surgery in older women associated with decreased risk of death



In older women with cataracts in the Women's Health Initiative, cataract surgery was associated with a lower risk for overall and cause-specific death, although whether this association is explained by the intervention of cataract surgery is unclear, according to a study published by JAMA Ophthalmology.
Previous studies have suggested an association between cataract surgery and decreased risk for all-cause mortality potentially through a mechanism of improved health status and functional independence, but the association between cataract surgery and cause-specific mortality has not been previously studied and is not well understood.

Anne L. Coleman, M.D., Ph.D., of the University of California, Los Angeles, and colleagues conducted a study that included nationwide data collected from the Women's Health Initiative (WHI) clinical trial and observational study linked with the Medicare claims database. Participants in the present study were 65 years or older with a diagnosis of cataracts in the linked Medicare claims database. The WHI data were collected from January 1993 through December 2015. The WHI is a study of U.S. postmenopausal women ages 50 to 79 years; the database contains information on total and cause-specific mortality.

The study included a total of 74,044 women with cataracts in the WHI, 41,735 of whom underwent cataract surgery; the average age was 71 years. The researchers found that cataract surgery was associated with a 60 percent reduced risk of death from all causes, and a 37 percent to 69 percent reduced risk of death due to pulmonary, accidental, infectious, neurologic and vascular diseases, and cancer.
The study notes some limitations, including that because the WHI cohort is all female, findings from this study may not be generalizable to male patients.

"Further study of the interplay of cataract surgery, systemic disease, and disease-related mortality would be informative for improved patient care," the authors write.

New treatments help those with mild, moderate and severe eczema


If you think only infants suffer from eczema, think again. The uncomfortable, itchy rash that most people relate to babies and young children occurs frequently in adults. Although many adults with atopic dermatitis (commonly known as eczema) develop the disease in childhood and carry it through life, a large number are first diagnosed in adulthood - a trend being discussed at the American College of Allergy, Asthma and Immunology (ACAAI) Annual Scientific Meeting.

"Atopic dermatitis (AD) is underdiagnosed in the United States," says allergist Luz Fonacier, MD, ACAAI board member and presenter at the meeting. "Many adults don't seek out medical care, preferring to self-treat instead, either with home remedies or over-the-counter drugs. Often, they aren't aware they have eczema, and they also don't know treatments have changed a lot in the last few years. There are new drugs and topical medications that can make a huge difference in their quality of life."

In addition to the itching and discomfort, people with eczema can experience problems with sleep and emotional distress, and it can affect their social life. Allergists work with patients to introduce therapies that treat uncomfortable and sometimes painful symptoms like dry skin, itchiness, and scaly rashes that can become infected. Easing the discomfort associated with painful symptoms can improve quality of life and make sleeping easier, as well as relieve emotional distress and embarrassment.

"In the last few years we've seen the introduction of targeted therapies, also known as precision medicine," says allergist Mark Boguniewicz, MD, ACAAI member and lead author on a soon-to-be-published Atopic Dermatitis Yardstick. "The Yardstick will have practical recommendations for physicians about the treatment of AD."

Two new medications have recently been approved for AD. The first, crisaborole, is an ointment that reduces itching, redness and swelling of the skin. It is the first anti-inflammatory medication to be approved for the treatment of mild to moderate AD in more than 15 years. It is approved for patients 2 years of age or older. Dupilumab, the second new medication, is a biologic therapy given by injection for patients 18 years or older with moderate to severe AD who haven't responded to, or can't use, topical medications.

"The takeaway message is that there are effective medications available that help relieve eczema symptoms and now can also target the underlying cause," says Dr. Boguniewicz. "People with eczema have been frustrated by the limitations of existing treatments. We're very excited by the new medications which were developed based on better understanding of atopic dermatitis. We expect additional therapies to be approved soon. An allergist has the right training and expertise to diagnose your eczema, and to help you find relief with the right treatments."

Almost half of food allergies in adults appear in adulthood


When people think of food allergies, it's mostly in relation to children. New late-breaking research being presented at the American College of Allergy, Asthma and Immunology (ACAAI) Annual Scientific Meeting shows that almost half of all food-allergic adults surveyed reported one or more adult-onset food allergies.

"Food allergies are often seen as a condition that begins in childhood, so the idea that 45 percent of adults with food allergies develop them in adulthood is surprising," says Ruchi Gupta, MD, MPH, ACAAI member and lead author of the study. "We also saw that, as with children, the incidence of food allergies in adults is rising across all ethnic groups."

The most common food allergy among adults is shellfish, affecting an estimated 3.6 percent of U.S. adults. This marks a 44 percent increase from the 2.5 percent prevalence rate published in an influential 2004 study. Similarly, these new data suggest that adult tree nut allergy prevalence has risen to 1.8 percent from a 2008 estimate of 0.5 percent, an increase of 260 percent.
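
For readers who want to check those figures, the relative increases follow directly from the quoted prevalence estimates. The short sketch below is illustrative arithmetic only, not code or data from the study itself:

    # Illustrative arithmetic: reproduces the relative increases quoted above.
    def relative_increase(old_pct, new_pct):
        """Percent change from an earlier prevalence estimate to the current one."""
        return (new_pct - old_pct) / old_pct * 100

    print(round(relative_increase(2.5, 3.6)))  # shellfish: 2.5% (2004) -> 3.6%, about a 44% increase
    print(round(relative_increase(0.5, 1.8)))  # tree nuts: 0.5% (2008) -> 1.8%, a 260% increase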

"Our research also found that, among black, Asian and Hispanic adults, the risk of developing a food allergy to certain foods is higher than for whites, specifically for shellfish and peanuts," says food allergy researcher Christopher Warren, PhD candidate and study co-author. "For example, Asian adults were 2.1 times more likely to report a shellfish allergy than white adults, and Hispanic adults reported a peanut allergy at 2.3 times the frequency of white adults. Because many adults believe food allergies mostly affect children, they may not think to get tested. It is important to see an allergist for testing and diagnosis if you are having a reaction to a food and suspect a food allergy."

People may not recognize they have a food allergy, and believe their reaction is a food intolerance. They might not seek the help of an allergist for diagnosis, but allergists are specially trained to administer allergy testing and diagnose the results. Allergists can tailor a plan specific to your allergies. To find an allergist near you, use the ACAAI allergist locator.

21 percent increase in childhood peanut allergy since 2010


Parents often worry about peanut allergies because the reaction to peanuts can be very severe. New late-breaking research being presented at the American College of Allergy, Asthma and Immunology (ACAAI) Annual Scientific Meeting suggests that peanut allergy in children has increased 21 percent since 2010, and that nearly 2.5 percent of U.S. children may have an allergy to peanuts.

"Peanut allergies, along with other food allergies, are very challenging for children and families," says Ruchi Gupta, MD, MPH, ACAAI member and lead author of the study. "While 21 percent represents a large increase in the number of kids with a likely peanut allergy, the good news is that parents now have a way to potentially prevent peanut allergy by introducing peanut products to infants early after assessing risk with their pediatrician and allergist."

New guidelines introduced in January walk parents through the process of introducing peanut-containing foods to infants who are at high, medium, or low risk of developing peanut allergies. The guidelines are based on groundbreaking research showing that high-risk infants (infants with severe eczema and/or a history of egg allergy) who are introduced to peanut-containing food early are significantly less likely to develop a peanut allergy.

More than 53,000 U.S. households were surveyed between October 2015 and September 2016 for the study. The research suggests that rates of peanut, tree nut, shellfish, fin fish, and sesame allergies are increasing. Allergy to tree nuts, for example, increased 18 percent from 2010 when data were last collected, and allergy to shellfish increased 7 percent. Also evident was an increase in occurrence in black children compared to white children.

"According to our data, the risk of peanut allergy was nearly double among black children relative to white children," says food allergy researcher Christopher Warren, PhD candidate and study co-author. "Black children were also significantly more likely to have a tree nut allergy relative to white children. These findings are consistent with previous work by our group suggesting that black children in the U.S. may be at elevated food allergy risk. It's important that anyone with a food allergy work with their allergist to understand their allergy and how best to avoid the foods that cause their allergic reaction."

Diagnosing food allergy is not always simple, but the need to make a proper diagnosis is very important. Allergists are specially trained to administer allergy testing and conduct food challenges to determine true food allergy. They can then tailor a plan specific to your allergies. To find an allergist near you, use the ACAAI allergist locator. ACAAI has more information on how to introduce peanut-containing products to infants in this video.

Dry mouth symptoms can be side effect of certain medications for older adults


For older adults, dry mouth can be a common side effect of prescribed medications. Having dry mouth means you don't have enough saliva, or spit, to keep your mouth wet. The condition can lead to problems chewing, eating, swallowing, and even talking. What's more, dry mouth puts you at higher risk for tooth decay and oral infections.

However, there's much we don't understand about the connection between medications and dry mouth in older adults. Recently, researchers examined 52 related studies to learn more. Their research was published in the Journal of the American Geriatrics Society.

The researchers reported that a number of medications are linked to dry mouth. These include medications used to treat urinary incontinence, depression, insomnia, and anxiety, as well as diuretics used to treat high blood pressure. In fact, medications used to treat urinary incontinence were nearly six times more likely to cause dry mouth than a placebo. (A placebo is a "sugar pill" or "dummy" treatment given in research studies to compare the effects of an actual treatment with no treatment at all.)

The researchers suggested that healthcare providers should regularly monitor and review all medications to identify potential side effects and to adjust doses or change medications when necessary.
###
This summary is from "Medications that Cause Dry Mouth in Older People". It appears online ahead of print in the Journal of the American Geriatrics Society.

Wednesday, October 25, 2017

Better sleep = less fear


Higher quality sleep patterns are associated with reduced activity in brain regions involved in fear learning, according to a study of young adults published in JNeurosci. The results suggest that baseline sleep quality may be a useful predictor of susceptibility to post-traumatic stress disorder (PTSD).

Sleep disturbances are a common feature of PTSD. While previous research has focused on understanding how single nights of sleep influence the maintenance of already-established fear memories, few studies have investigated whether an individual's regular sleeping habits prior to trauma contribute to the acquisition of these fear memories.

Itamar Lerner, Shira Lupkin and their colleagues at Rutgers University had students monitor their sleep at home for one week using unobtrusive sleep monitoring tools, including a headband that measures brain waves, a bracelet that measures arm movements, and a sleep log. The students then participated in a neuroimaging experiment during which they learned to associate a neutral image with a mild electric shock. Students who spent more time in rapid eye movement (REM) sleep -- the phase when dreaming occurs -- exhibited weaker modulation of activity in, and connectivity between, their amygdala, hippocampus and ventromedial prefrontal cortex during fear learning.

The authors replicated these results in a second study using traditional polysomnographic monitoring of sleep during the night just prior to fear learning. Taken together, the findings are consistent with the idea that REM sleep reduces levels of norepinephrine in the brain, which may dampen an individual's sensitivity to fearful stimuli.

Tuesday, October 24, 2017

Physical inactivity and restless sleep exacerbate genetic risk of obesity



Low levels of physical activity and inefficient sleep patterns intensify the effects of genetic risk factors for obesity, according to results of a large-scale study presented at the American Society of Human Genetics (ASHG) 2017 Annual Meeting in Orlando, Fla. These results confirm and strengthen previous findings based on self-reported activity.

Andrew Wood, PhD, postdoctoral researcher, who presented the work; Timothy Frayling, PhD, Professor; and their colleagues at the University of Exeter Medical School study the genetics of body mass index (BMI) and Type 2 Diabetes. In the past, Dr. Frayling explained, it has been difficult to measure interactions between genetic risk factors and aspects of environment and lifestyle in a systematic way.

"Until recently, physical activity and sleep patterns could not be measured with as much precision as genetic variants, and we relied on diaries or self-report, which can be very subjective," Dr. Frayling said. In contrast, the new study made use of wrist accelerometer data, which is more objective and quantifiable, and a large genetic dataset from about 85,000 UK Biobank participants aged 40 to 70.

"We wanted to find out if obesity-related genes and activity level have an interactive effect on obesity risk -- if there is a 'double whammy' effect of being both at genetic risk and physically inactive, beyond the additive effect of these factors," said Dr. Wood. The researchers computed a genetic risk score for each participant based on 76 common variants known to be associated with elevated risk of obesity, and analyzed this score in the context of accelerometer data and participants' BMIs.

They found the strongest evidence to date of a modest gene-activity interaction. For example, for a person of average height with 10 genetic variants associated with obesity, that genetic risk accounted for a 3.6 kilogram increase in weight among those who were less physically active but just 2.8 kilograms among those who were more active. Results were similar in analyses of sleep patterns; among participants with some genetic risk of obesity, those who woke up frequently or slept more restlessly had higher BMIs than those who slept more efficiently.

The researchers are currently examining whether this interaction between genetics and physical activity differs between men and women. They are also studying the effects of patterns of activity -- for example, whether a consistent level of moderate activity has different effects from overall low levels punctuated by periods of vigorous activity.

"We hope these findings will inform clinicians who help people lose or maintain their weight, and contribute to the understanding that obesity is complex and its prevention may look different for different people," said Dr. Frayling. "Ultimately, with further research, we may have the scope to personalize obesity interventions," he said.

Aspirin use = reduced risk of liver cancer


A new study presented at a meeting of the American Association for the Study of Liver Diseases found that daily aspirin therapy was significantly associated with a reduced risk of hepatitis B virus-related liver cancer.

Hepatitis B is a viral infection that attacks the liver. The hepatitis B virus (HBV) can be contracted through contact with an infected person's blood or other bodily fluids, and the infection can be either acute or chronic. According to AASLD's Guidelines for Treatment of Chronic Hepatitis B, an estimated 240 million people worldwide have chronic HBV, and the highest prevalence of the virus is in Africa and Asia. Death from HBV is commonly due to the development of cirrhosis (scarring of healthy liver tissue) or hepatocellular carcinoma (liver cancer).

Past research suggests that daily aspirin therapy -- which is often prescribed to prevent cardiovascular disease -- may also prevent the development of cancer. However, clinical evidence is lacking for the effectiveness of aspirin therapy in preventing HBV‐related liver cancer.

Researchers at Taichung Veterans General Hospital in Taichung, Taiwan; E‐Da Hospital in Kaohsiung, Taiwan; Fu Jen Catholic University in New Taipei City, Taiwan; and National Taiwan University Hospital in Taipei conducted a nationwide cohort study to determine if aspirin therapy could, indeed, reduce liver cancer risk.

"Liver cancer is the second leading cause of cancer death worldwide, and HBV is the most prevalent risk factor in our region," says Teng-Yu Lee, MD, PhD, a researcher in the Department of Gastroenterology at Taichung Veterans General Hospital and lead investigator in the study. "HBV-related liver cancer is therefore a major public health issue with a severe socioeconomic impact."

Although current antiviral medicines such as nucleos(t)ide analogue therapy could significantly reduce liver cancer risk, Dr. Lee notes these therapies do not completely eliminate the risk. Additionally, Dr. Lee says most HBV carriers are not indicated for antiviral therapy, so another effective way of reducing liver cancer risk needs to be developed.

"Aspirin has been investigated to explore its chemopreventive effect in cancers that are related to chronic inflammation, particularly in the prevention of colorectal cancer. However, clinical evidence supporting the chemopreventive effect of aspirin therapy on liver cancer remains limited. Therefore, we conducted a large‐scale cohort study to evaluate the association of aspirin therapy with HBV‐related liver cancer."

The researchers retrieved medical records from the National Health Insurance Research Database between 1998 and 2012 for their study. They screened records of 204,507 patients with chronic hepatitis B and excluded patients with other forms of infectious hepatitis. After excluding patients with liver cancer before the follow-up index dates, 1,553 patients who had continuously received daily aspirin for at least 90 days were randomly matched 1:4 with 6,212 patients who had never received antiplatelet therapy, by means of propensity scores consisting of baseline characteristics, the index date, and nucleos(t)ide analogue (NA) use during follow-up. The researchers analyzed both the cumulative incidence of and hazard ratios for hepatocellular carcinoma (HCC) development after adjusting for competing mortality.

The cumulative incidence of liver cancer over five years was significantly lower in the group treated with aspirin therapy than in the untreated group. In their multivariate regression analysis, the researchers found aspirin therapy was independently associated with reduced liver cancer risk, and sensitivity subgroup analyses verified this association. Older age, male gender, cirrhosis and diabetes also were independently associated with an increased risk, but nucleos(t)ide analogue or statin use was associated with a decreased risk.

"For effectively preventing HBV‐related liver cancer, the findings of this study may help hepatologists treat patients with chronic HBV infection in the future, particularly for those who are not indicated for antiviral therapy. We are pursuing prospective investigations for further confirming the findings," says Dr. Lee.

Monday, October 23, 2017

Even walking below minimum recommended levels = lower mortality risk


A new study concludes that walking has the potential to significantly improve the public's health. It finds regular walking, even if not meeting the minimum recommended levels, is associated with lower mortality compared to inactivity. The study appears early online in the American Journal of Preventive Medicine.

Public health guidelines recommend adults engage in at least 150 minutes of moderate or 75 minutes of vigorous-intensity physical activity per week. But surveys show only half of U.S. adults meet this recommendation. Older adults are even less likely to meet minimum recommendations (42% ages 65-74 years and 28% ages 75 years and older).

Walking is the most common type of physical activity, and has been associated with lower risk of heart disease, diabetes, and breast and colon cancers. While several studies have linked overall moderate-vigorous physical activity to a reduced risk of death, relatively few have examined associations with walking specifically.

To learn more, investigators led by Alpa Patel, Ph.D., looked at data from nearly 140,000 participants in the Cancer Prevention Study II Nutrition Cohort. A small percentage (6-7%) of participants reported no moderate- to vigorous-intensity physical activity at baseline. Among the rest, about 95% reported some walking, and nearly half walked as their only form of moderate-vigorous physical activity.

After correcting for other risk factors, including smoking, obesity, and chronic conditions, the study found walking-only for less than 2 hours per week was associated with lower all-cause mortality compared to no activity. Meeting 1 to 2 times the minimum recommendation (2.5-5 hours/week) through walking-only was associated with 20% lower mortality risk. Results for those exceeding recommendations through walking-only were similar to those who met recommendations.

Walking-only was most strongly associated with respiratory disease mortality, with approximately 35% lower risk comparing more than 6 hours/week of walking to the least active group. Walking-only was also associated with about 20% less risk of cardiovascular disease mortality and with about 9 percent less risk of cancer mortality.

"Walking has been described as the 'perfect exercise' because it is simple, free, convenient, doesn't require any special equipment or training, and can be done at any age," said Dr. Patel. "With the near doubling of adults aged 65 and older expected by 2030, clinicians should encourage patients to walk even if less than the recommended amount, especially as they age, for health and longevity."

Yoga and aerobic exercise together may improve heart disease risk factors


Heart disease patients who practice yoga in addition to aerobic exercise saw twice the reduction in blood pressure, body mass index and cholesterol levels when compared to patients who practiced either Indian yoga or aerobic exercise alone, according to research to be presented at the 8th Emirates Cardiac Society Congress in collaboration with the American College of Cardiology Middle East Conference October 19-21, 2017 in Dubai.

Lifestyle intervention has been shown to aid in reducing the risk of death and heart disease comorbidities when used alongside medical management. Indian yoga is a combination of whole exercise of body, mind and soul, and a common practice throughout India. Researchers in this study looked specifically at Indian yoga and aerobic training's effect on the coronary risk factors of obese heart disease patients with type 2 diabetes.

The study looked at 750 patients who had previously been diagnosed with coronary heart disease. One group of 225 patients participated in aerobic exercise, another group of 240 patients participated in Indian yoga, and a third group of 285 participated in both yoga and aerobic exercise. Each group did three six-month sessions of yoga and/or aerobic exercise.

The aerobic exercise only and yoga only groups showed similar reductions in blood pressure, total cholesterol, triglycerides, LDL, weight and waist circumference. However, the combined yoga and aerobic exercise group showed a two times greater reduction compared to the other groups. They also showed significant improvement in left ventricular ejection fraction, diastolic function and exercise capacity.

"Combined Indian yoga and aerobic exercise reduce mental, physical and vascular stress and can lead to decreased cardiovascular mortality and morbidity," said Sonal Tanwar, PhD, a scholar in preventative cardiology, and Naresh Sen, DM, PhD, a consultant cardiologist, both at HG SMS Hospital, Jaipur, India. "Heart disease patients could benefit from learning Indian yoga and making it a routine part of daily life."

Maternal diet may program child for disease risk, but better nutrition later can change that


Research has shown that a mother's diet during pregnancy, particularly one that is high-fat, may program her baby for future risk of certain diseases such as diabetes. A new study from nutrition researchers at the University of Illinois shows that switching the offspring to a new diet--a low-fat diet, in this case--can reverse that programming. 

Yuan-Xiang Pan, a professor in the Department of Food Science and Human Nutrition at U of I, along with Laura Moody, a doctoral student in the Division of Nutritional Sciences at U of I, studies how early-life nutrition affects later generations and offspring health. In a new study published in the journal Epigenomics, the researchers focused on whether a post-weaning diet, or a diet later in life, could control the epigenome and affect metabolism in the body.

Epigenetic changes do not alter the DNA sequence; rather, they modify gene expression. A person's epigenome is inherited, but it is also reversible based on what you eat, whether you exercise, and even where you live, for example.

"Traditional genetics says that you inherit a sequence from your parents. Epigenetics says you can inherit these other changes to the DNA, as well," Moody explains. "This is where the whole maternal programming of metabolism--the epigenome--comes into play. We wanted to show these changes are easily altered, even after this critical period. You can still change that epigenome later in life.

"The message is not that the high-fat diet is itself bad, but rather you always have the opportunity to change it later. It's not like you are doomed by what your mom or dad did early in life," she adds.

For the study, the researchers looked at rats that were exposed to a high-fat diet (45 percent fat) during gestation and lactation. At weaning, some of the rats stayed on a high-fat diet and some were put on a low-fat diet (16 percent fat).

The researchers then did whole-genome sequencing, focusing on differences in gene expression between the livers of the two sets of rats. In particular, they wanted to see if DNA methylation in the liver adapted to the new, low-fat diet. DNA methylation is a mechanism cells use to control gene expression at the epigenetic level: the addition of a methyl group to DNA changes the way genes are transcribed and thus affects gene expression.

Scans showed remodeled DNA methylation patterns in the low-fat group, which changed the expression of genes associated with fat metabolism and inflammation; the rats' livers showed less fat accumulation and inflammation. This shows that DNA methylation is responsive to dietary changes later in life.
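As a rough, purely illustrative sketch of what testing for such a methylation difference can look like, the Python snippet below compares the fraction of methylated reads at a single hypothetical site between a high-fat and a low-fat group using Fisher's exact test. The counts are invented and this is not the analysis pipeline used in the study.

from scipy.stats import fisher_exact

# Hypothetical read counts at one CpG site in liver tissue:
# each row is a diet group, columns are (methylated reads, unmethylated reads).
high_fat = (80, 20)   # 80% of reads methylated
low_fat = (45, 55)    # 45% of reads methylated

# Fisher's exact test asks whether the methylation fraction differs between groups.
odds_ratio, p_value = fisher_exact([high_fat, low_fat])
print(f"Odds ratio: {odds_ratio:.2f}, p-value: {p_value:.4g}")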

While there were physiological changes in the rats on low-fat diets, including lower body weight, Moody says they were most encouraged to see the changes in specific metabolic pathways related to type 2 diabetes, suggesting changes in rats' risk for the disease.

"There were definitely physiological outcomes, but we focused on the epigenetic outcomes," Moody says. "We did a whole genome scan, so we weren't just looking in one particular area. So I think it's even more impressive that it was these specific pathways--the type 2 diabetes mellitus pathways--that were metabolically related that were the most changed."

Because research--some from Pan's own lab--has shown that the early-life environment, including nutrition, can program certain diseases such as obesity, diabetes, and even some cancers, the study may offer good news for health throughout the lifespan.

"The early-life environment will mark your epigenome in a certain way so that you may develop certain phenotypes or disease states. Our study shows that after that early programming state, after weaning, and after the lactation period, when we introduced a new type of diet it changed the epigenome in a way that actually affects metabolism and potentially will reduce some of the damage caused by an early-life high-fat exposure," Pan says.

Pan adds that their study shows that the "reprogramming" is possible at least from the point of post-weaning. "Whether we can start from adolescence, or even later in life, we don't know that yet. But hopefully our study shows that by simply changing nutrition you can reverse some of the potential consequences."

Pan's goal is also to continue identifying potential molecular mechanisms involved in this early programming. "If we identify mechanisms, then we can do more detection of disease risk. Even if we don't know what happened during a person's early-life environment, but we do know that they have the potential to develop these kinds of diseases, we can tell them to pay attention to their diet, environment, stress, etc. to minimize the risk of eventually developing these diseases.

"Goal two is to find intervention strategies, including this case, where we show that if you switch to a different diet you actually can specifically remodel the epigenome in the liver related to certain metabolic pathways," he says.

Moody says she will continue to take a more whole-body systemic approach to understand how dietary patterns can affect the epigenome in different tissues in the body and how that can reduce disease risk.

Delayed word processing could predict potential to develop Alzheimer's disease


A delayed neurological response to processing the written word could be an indicator that a patient with mild memory problems is at an increased risk of developing Alzheimer's disease, research led by the University of Birmingham has discovered.

Using an electroencephalogram (EEG) - a test that detects electrical activity in a person's brain via electrodes attached to their scalp - researchers studied the brain activity of a group of 25 patients to establish how quickly they processed words shown to them on a computer screen.

The study, published in Neuroimage Clinical, was led by the University of Birmingham's School of Psychology and Centre for Human Brain Health and was carried out in collaboration with the Universities of Kent and California.

The patients who took part were a mix of healthy elderly people, patients with mild cognitive impairment (MCI), and patients with MCI who had developed Alzheimer's within three years of diagnosis of MCI.

MCI is a condition in which someone has minor problems with mental abilities, such as memory, beyond what would normally be expected for a healthy person of their age, and is estimated to affect up to 20 per cent of people aged over 65. It is not a type of dementia, but a person with MCI is more likely to go on to develop dementia.

Dr Ali Mazaheri, of the University of Birmingham, said: "A prominent feature of Alzheimer's is a progressive decline in language. However, the ability to process language in the period between the appearance of initial symptoms of Alzheimer's and its full development has scarcely been investigated.

"We wanted to investigate if there were anomalies in brain activity during language processing in MCI patients which could provide insight into their likelihood of developing Alzheimer's.
"We focused on language functioning, since it is a crucial aspect of cognition and particularly impacted during the progressive stages of Alzheimer's."

Previous research has found that when a person is shown a written word, it takes 250 milliseconds for the brain to process it - activity which can be picked up on an EEG.
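For readers curious what detecting that activity involves, here is a minimal, hypothetical sketch in Python: it simulates EEG trials time-locked to word onset, averages them into an event-related potential, and reports the response in a window around 250 milliseconds. The numbers and sampling setup are invented for illustration and are not the study's actual recording or analysis parameters.

import numpy as np

rng = np.random.default_rng(0)
times = np.arange(-100, 500)   # time in ms relative to word onset (one sample per ms)
n_trials = 100

# Simulate single-trial EEG: noise plus a response peaking near 250 ms after word onset.
signal = 5e-6 * np.exp(-((times - 250) ** 2) / (2 * 40 ** 2))
trials = signal + rng.normal(0, 10e-6, size=(n_trials, times.size))

# Average across trials to obtain the event-related potential (ERP).
erp = trials.mean(axis=0)

# Mean amplitude in a 200-300 ms window, the period of interest here.
window = (times >= 200) & (times <= 300)
print(f"Mean amplitude 200-300 ms: {erp[window].mean() * 1e6:.2f} microvolts")
print(f"Peak latency: {times[np.abs(erp).argmax()]} ms after word onset")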

Dr Katrien Segaert, of the University of Birmingham, adds: "Crucially, what we found in our study is that this brain response is aberrant in individuals who will go on in the future to develop Alzheimer's disease, but intact in patients who remained stable.

"Our findings were unexpected as language is usually affected by Alzheimer's disease in much later stages of the onset of the disease.

"It is possible that this breakdown of the brain network associated with language comprehension in MCI patients could be a crucial biomarker used to identify patients likely to develop Alzheimer's disease.

"We hope to now test the validity of this biomarker in large population of patients in the UK to see if it's a specific predictor of Alzheimer's disease, or a general marker for dementia involving the temporal lobe.

"The verification of this biomarker could lead the way to early pharmacological intervention and the development of a new low cost and non-invasive test using EEG as part of a routine medical evaluation when a patient first presents to their GP with concern over memory issues."

Depression strongly linked to higher long-term risk of early death


Despite increased awareness about mental illness, depression remains strongly linked to a higher risk of early death -- and this risk has increased for women in recent years -- according to results from the 60-year Stirling County Study published in CMAJ (Canadian Medical Association Journal).

"There is less stigma associated with depression, better treatments are available, but depression's link to mortality still persists," said Dr. Stephen Gilman of the Eunice Kennedy Shriver National Institute of Child Health and Human Development, part of the National Institutes of Health in Bethesda, Maryland. "At first, the association was limited to men, but in later years it was seen for women as well."

The Stirling County Study, begun in 1952 in Atlantic Canada, is well-known internationally as one of the first community-based studies on mental illness. A researcher from the original study, Dr. Jane Murphy with Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts, is a coauthor on this latest research study.

An international team of researchers looked at 60 years of mental health data on 3410 adults during 3 periods (1952-1967, 1968-1990 and 1991-2011) from a region in Atlantic Canada and linked the data to deaths in the Canadian Mortality Database. They found that the link between depression and an increased risk of death was observed in all decades of the study among men, whereas it emerged among women beginning in the 1990s. The risk of death associated with depression appeared strongest in the years following a depressive episode, leading the authors to speculate that this risk could be reversed by achieving remission of depression.

The mean age of participants at enrolment in the study was about 49 years. "The lifespan for young adults with depression at age 25 was markedly shorter over the 60-year period, ranging from 10 to 12 fewer years of life in the first group, 4 to 7 years in the second group and 7 to 18 fewer years of life in the 1992 group," says Dr. Ian Colman, Canada Research Chair in the School of Epidemiology, University of Ottawa, Ottawa, Ontario. "Most disturbing is the 50% increase in the risk of death for women with depression between 1992 and 2011."

Though depression has also been linked with poorer diet, lack of exercise, smoking and alcohol consumption -- all factors that can result in chronic health conditions -- these did not explain the increased risk of death associated with depression in this study.

Societal change may help explain the emergent risk of death for women with depression.
"During the last 20 years of the study in which women's risk of death increased significantly, roles have changed dramatically both at home and in the workplace, and many women shoulder multiple responsibilities and expectations," says Dr. Colman.

Most elderly hip fractures occur in warm months and indoors


Think the shorter winter days, ice and snow put your older loved one at greater risk for a fall and broken hip? Think again. A preliminary study presented at the ANESTHESIOLOGY® 2017 annual meeting shows that the majority of falls occur during warm months, and a greater number of the falls happen indoors rather than out.

"Falls are one of the most common health concerns facing the elderly today," said Jason Guercio, M.D., M.B.A., study author, North American Partners in Anesthesiology at The Hospital of Central Connecticut in New Britain. "And this population is the fastest growing segment of the U.S. People 65 and older are predicted to more than double in number by 2050, increasing from 39 million to 89 million. Falls leading to fracture can result in disability and even death. Understanding the risk factors for fractures can help to focus efforts on decreasing them, and guide resources and appropriate interventions to prevent them."

In the retrospective observational study, 544 patients treated at The Hospital of Central Connecticut for hip fracture from 2013 to 2016 were analyzed for the time of year the fracture occurred and whether it happened in- or outdoors. The authors defined "cold" months as November 1 through April 30, and "warm" months from May 1 through October 31.

The study found more than 55 percent of hip fractures occurred during warm months, with the highest proportion of fractures occurring in May (10.5 percent), September (10.3 percent) and October (9.7 percent). And while the fractures were spread fairly evenly throughout the year, the authors found the majority (76.3 percent) of hip fractures occurred indoors, with only 23.6 percent happening outside. Of the outdoor fractures, more than 60 percent of them happened during warm months. For fractures that occurred indoors, more than 56 percent happened during warm months.
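As a minimal sketch of how such a tally could be computed, the snippet below classifies hypothetical fracture records by the study's warm/cold month definition and by indoor versus outdoor location, then prints the proportions. The records are invented and this is not the authors' actual dataset or code.

from collections import Counter
from datetime import date

# Hypothetical fracture records: (date of fracture, location).
records = [
    (date(2015, 5, 12), "indoor"),
    (date(2015, 1, 3), "outdoor"),
    (date(2016, 9, 22), "indoor"),
    (date(2016, 10, 8), "outdoor"),
    (date(2014, 12, 30), "indoor"),
]

def season(d):
    # Warm months run May 1 through October 31; the rest are cold.
    return "warm" if 5 <= d.month <= 10 else "cold"

# Count fractures in each season/location combination and report proportions.
counts = Counter((season(d), loc) for d, loc in records)
total = sum(counts.values())

for (s, loc), n in sorted(counts.items()):
    print(f"{s:4s} / {loc:7s}: {n} ({100 * n / total:.1f}%)")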

The most common reason for both indoor and outdoor hip fracture was tripping over an obstacle (43.3 percent for indoor and 57.1 percent for outdoor). Inside homes, the most common obstacle appeared to be throw rugs. For indoor fractures, the second leading cause was falling out of bed. For outdoor fractures, the second and third leading causes were being struck by or falling from a vehicle or falling on or down stairs.

"It is counterintuitive that the risk for hip fracture would be higher in warm months, as ice and snow would appear to be significant fall risks," said Dr. Guercio. "Given the results of this study, it appears that efforts to decrease fall risk among the elderly living in cold climates should not be preferentially aimed at preventing outdoor fractures in winter, but should focus on conditions present throughout the year, and most importantly on mitigating indoor risk."

Wealth-associated disparities in death, disability in older adults


Low wealth was associated with death and disability among older adults in both the United States and England, two countries with very different health care and safety-net systems, according to a new article published by JAMA Internal Medicine.

Most research examining the effect of socioeconomic status on health outcomes has used income as the main measure of financial resources. The current study by Lena K. Makaroun, M.D., of the University of Washington and the VA Puget Sound Health Care System, Seattle, and coauthors used wealth as the primary marker because it better reflects the financial resources of older adults in retirement.

The study included nearly 20,000 adults from two nationally representative groups of older adults in the United States and England. The adults were separated by age into two groups, 54 to 64 and 66 to 76, because social safety-net programs begin for many around the age of 65 (Medicare and Social Security in the United States, and the State Pension in England, which also provides health care from birth through the National Health Service).

Researchers examined the association between wealth and death and disability, the latter defined as any difficulty in performing activities of daily living, such as dressing, eating and bathing.

Adults in both countries with low wealth had a higher risk of death and disability, according to the results. The results suggest that small increases in wealth for those with the least wealth could be associated with gains in life expectancy and function.

Limitations of the study include differences between the U.S. and English comparison groups.
"Policies geared toward decreasing wealth-related disparities in death and disability in older adults should target determinants of health outside of access to health care," the article concludes.

So my brain amyloid level is 'elevated' -- What does that mean?


Testing drugs to prevent or delay the onset of Alzheimer's dementia and using them in the clinic will mean identifying and informing adults who have a higher risk of Alzheimer's but are still cognitively normal. A new study from the Perelman School of Medicine at the University of Pennsylvania has shed light on how seniors cope with such information.

The study examined cognitively normal adults 65 years and older who had been accepted into a large Alzheimer's prevention trial based on brain scans showing an "elevated" level of beta amyloid protein plaques. Beta amyloid plaques are one of the biomarkers of Alzheimer's disease. The Penn Medicine researchers found that for many of these seniors, being told that their amyloid levels were "elevated" on brain scans led to frustration and a desire for more detailed information.

"Clinicians who give these results to people should be prepared to explain how and why measurements of amyloid are termed 'elevated' and what that means in terms of Alzheimer's dementia risk," said Jason Karlawish, MD, a professor of Medicine, Medical Ethics and Health Policy, and Neurology, and co-director of the Penn Memory Center.

The study, published on October 23, 2017 in JAMA Neurology, comes as Alzheimer's researchers and the pharmaceutical industry have begun to think more in terms of preventing dementia than of treating it after it has been diagnosed. To date, every candidate drug tested in large-scale clinical trials in patients with Alzheimer's dementia has failed to show a significant effect in slowing the usual 5-10 year course of this fatal illness.

Developing a preventive therapy is challenging for a number of reasons, not least because it entails the ethically challenging task of testing potentially risky drugs on people who are cognitively normal. Research over the past two decades has found, however, that certain types of brain scan as well as blood and spinal fluid tests can sort people into categories of higher or lower risk of developing Alzheimer's dementia.

For example, positron emission tomography (PET) using a radiotracer that sticks specifically to Alzheimer's-associated amyloid plaques can measure the extent of amyloid plaques in the brain.

Having no plaques means having essentially no near-term risk of Alzheimer's dementia. Most elderly people will have some amyloid plaque burden, and although that doesn't make Alzheimer's dementia a certainty in a normal lifespan, plaque loads beyond a certain threshold have been linked to a higher risk of this illness.

The most prominent Alzheimer's prevention trial now underway, the NIH-sponsored A4 trial, has enrolled seniors based on PET findings of elevated amyloid. Karlawish and colleagues sought to determine how these seemingly healthy seniors handled the information that they had elevated brain amyloid.

The researchers interviewed 50 seniors (ages 65-85) who had enrolled into the A4 trial. They found that about half had expected their amyloid PET scan result, based on a family history of Alzheimer's or a recent experience with memory problems. Most understood the basic facts provided by the A4 trial clinicians, namely that their brain amyloid levels were elevated, indicating a higher but not certain risk of developing Alzheimer's dementia. A smaller percentage appeared to believe mistakenly that they either had no increased risk of dementia or had 100 percent risk--even "early Alzheimer's."

A large minority of the subjects (20 of the 50) were dissatisfied with the ambiguity of the message that their brain amyloid level was "elevated." One 71-year-old woman commented, accurately enough: "I don't know how elevated the risk is. It could be like right over the edge, and other people are right under the edge." Similarly, a 75-year-old man complained that he found the uncertainty frustrating: "my background is in a technical area, and I'm used to having facts and data."

"What this is telling us is that, in the future, Alzheimer's biomarkers will have to get more predictive, or we'll simply have to educate people to cope with the uncertainty," Karlawish said.

He emphasized that for now, disclosing an amyloid PET result to cognitively normal adults is something that occurs only in experimental contexts such as the A4 trial. Amyloid PET scans are available for people who already have cognitive problems, to help distinguish Alzheimer's from other forms of dementia.

Alzheimer's researchers hope, however, that trials such as the A4 trial, which is testing an anti-amyloid drug, will eventually lead to preventive therapies for cognitively normal adults, particularly those considered to have high Alzheimer's risk based on PET amyloid levels and other biomarkers.

"In the future, learning this kind of information will be a normal part of going to the doctor, like finding out you have a high cholesterol level," Karlawish said. "The challenge is to anticipate what it will be like for seniors to learn this, and to develop effective strategies to help them cope with problems that may result, such as being stigmatized socially or losing their usual sense of well-being."

Mindfulness meditation mobile app works


For the millions of people who use mindfulness meditation mobile apps, there is good news: New research shows that these apps can reduce the body's response to biological stress.

A Carnegie Mellon University-led study found that one component of mindfulness interventions is particularly important for impacting stress biology. Acceptance, or learning how to be open and accepting of the way things are in each moment, is critical for the training's stress reduction effects. Published in Psychoneuroendocrinology, the researchers offer the first scientific evidence that a brief mindfulness meditation mobile app that incorporates acceptance training reduces cortisol and systolic blood pressure in response to stress.

"We have known that mindfulness training programs can buffer stress, but we haven't figured out how they work," said David Creswell, associate professor of psychology in CMU's Dietrich College of Humanities and Social Sciences. "This study, led by Emily Lindsay in my lab, provides initial evidence that the acceptance training component is critical for driving the stress reduction benefits of mindfulness training programs."

For the study, 144 stressed adults participated in one of three randomly assigned smartphone-based interventions: training in monitoring the present moment with acceptance, training in monitoring the present moment only or active control training.

Each participant completed one 20-minute daily lesson for 14 days. Then, they were placed in a stressful situation while their cortisol levels and blood pressure were measured.

The results showed that the participants in the combined monitoring and acceptance program had reduced cortisol and systolic blood pressure reactivity. Their blood pressure responses were approximately 20 percent lower than those in the two interventions that did not include acceptance training. Their cortisol responses were also more than 50 percent lower.

"Not only were we able to show that acceptance is a critical part of mindfulness training, but we've demonstrated for the first time that a short, systematic smartphone mindfulness program helps to reduce the impact of stress on the body," said Lindsay (DC'17), who received her Ph.D. in psychology and is now a postdoctoral research fellow at the University of Pittsburgh. "We all experience stress in our lives, but this study shows that it's possible to learn skills that improve the way our bodies respond to stress with as little as two weeks of dedicated practice. Rather than fighting to get rid of unpleasant feelings, welcoming and accepting these feelings during stressful moments is key."

Building A Better Sandwich


Although sandwiches first appeared in American cookbooks in 1916, the role they play in the U.S. diet has just been illuminated, ironically, at the centennial celebration for The Academy of Nutrition and Dietetics - the world's largest organization of food and nutrition professionals. The results of the Building-A-Better-Sandwich Study are being presented at the Academy's 2017 Food & Nutrition Conference & Expo (FNCE) in Chicago.

The study, a modeling analysis, was conducted to assess the energy and nutrients contributed from all sandwiches in the U.S. diets of children and adolescents. It used government data from the National Health and Nutrition Examination Survey (NHANES) as well as USDA Typical Food Patterns to assess how Americans currently eat. The striking conclusion is that the ingredients inside the sandwich - not the bread itself - are the most significant drivers of calories, fat and sodium.

"Americans can pointedly and positively impact their consumption of calories, fat and sodium by making more deliberate decisions about sandwich ingredients," said study author Yanni Papanikolaou from Nutrition Strategies Inc. "Many health professionals mistakenly encourage consumers to skip the bread when trying to improve diets. However, this study demonstrates that by building a better sandwich on either whole grain or enriched grain bread, American children and adolescents can take in fewer calories, fat and sodium than they typically consume in sandwiches now."

"Moreover," Papanikolaou continued, "Americans need to think twice before cutting bread from their diets. They pack more of a nutrient punch than a caloric one in adult diets." Specifically, he was referring to another study published just last month in the journal, Nutrients, which shows that all grain foods contributed less than 15 percent of all calories in the total diet, while delivering greater than 20 percent of three shortfall nutrients - dietary fiber, folate, and iron - and greater than 10 percent of calcium, magnesium, and vitamin A. "These data show that grain foods are the foods we love that love us back - finally, we can enjoy bread again," Papanikolaou concluded.

Enough vitamin D when young associated with lower risk of diabetes-related autoimmunity


Getting enough vitamin D during infancy and childhood is associated with a reduced risk of islet autoimmunity among children at increased genetic risk for type 1 diabetes, according to a study published this week in the journal Diabetes.

The study's lead author, Jill Norris, MPH, PhD, of the Colorado School of Public Health at CU Anschutz, and her co-authors examined the association between vitamin D levels in the blood and islet autoimmunity.

Islet autoimmunity, detected by antibodies that appear when the immune system attacks the islet cells in the pancreas that produce insulin, is a precursor to type 1 diabetes.

"For several years there has been controversy among scientists about whether vitamin D lowers the risk of developing of islet autoimmunity and type 1 diabetes," said Dr. Norris.

Type 1 diabetes is a chronic autoimmune disease that is increasing by 3-5 percent annually worldwide. The disease is now the most common metabolic disorder in children under age 10. In younger children, the number of new cases is particularly high. And the risks seem to be greater at higher latitudes, further north from the equator.

Vitamin D represents a candidate protective factor for type 1 diabetes as it regulates the immune system and autoimmunity. Moreover, vitamin D status varies by latitude. But associations between vitamin D levels and islet autoimmunity have been inconsistent. This may be due to different study designs, population variation in vitamin D levels, or a failure to account for the combined effect of exposure and underlying genetic variation in the vitamin D pathway.

The findings are part of The Environmental Determinants of Diabetes in the Young (TEDDY) study, a large, multi-national study funded by the National Institutes of Health's National Institute of Diabetes and Digestive and Kidney Diseases.

TEDDY's effort began in 2004 with children from six clinical centers: three in the U.S. (Barbara Davis Center for Childhood Diabetes at CU Anschutz, the Pacific Northwest Research Institute in Seattle, and Augusta University in Georgia) and three in Europe (Universities of Turku, Oulu, and Tampere in Finland, Helmholtz Zentrum München in Germany, and Lund University in Sweden). The aim of the study is to search for triggers and protective factors for type 1 diabetes in 8,676 children with elevated type 1 diabetes risk.

The TEDDY children were followed with blood samples drawn every three to six months from infancy to determine the presence of islet autoimmunity, as well as levels of vitamin D.

The authors compared 376 children who developed islet autoimmunity with 1,041 children who did not. They found that among children with a genetic variant in the vitamin D receptor gene, vitamin D levels in infancy and childhood were lower in those who went on to develop islet autoimmunity than in those who did not.

This study is the first to show that higher childhood vitamin D levels are significantly associated with a decreased risk of islet autoimmunity.

"Since this association does not prove cause-and-effect, we look to future prospective studies to confirm whether a vitamin D intervention can help prevent type 1 diabetes," Dr. Norris said.

Wednesday, October 18, 2017

Anxiety and depression linked to migraines


In a study of 588 patients who attended an outpatient headache clinic, participants with symptoms of anxiety and depression experienced more frequent migraines. In the study, published in the journal Headache, poor sleep quality was also found to be an independent predictor of more severe depression and anxiety symptoms.

The study's investigators noted that factors such as emotional distress and frequency of headache may influence each other through a common pathophysiological mechanism. For example, emotional responses have the potential to alter pain perception and modulation through certain signaling pathways.

"These findings potentially suggest that adequate medical treatment to decrease headache frequency may reduce the risk of depression and anxiety in migraine patients," said Dr. Fu-Chi Yang, corresponding author of the study and an investigator in the Department of Neurology, Tri-Service General Hospital, National Defense Medical Center, Taiwan.


Some older adults don't get to the hospital soon enough when experiencing a heart attack


For individuals experiencing a heart attack, delays in getting to the hospital can have life-threatening consequences. A new study in the Journal of the American Geriatrics Society found that certain factors--non-white race, atypical symptoms, and heart failure--are linked with such delays in older individuals.

The study included 2500 patients aged 75 or older hospitalized for heart attack. Pre-hospital delay (six or more hours before getting to the hospital) was much more common (42%) than in studies of younger heart attack populations, in whom the reported prevalence ranges from 20% to 25%.

"Delays in presentation can have huge consequences for older adults with heart attacks. Based on the results of our study, we need to develop better clinical and public health strategies to ensure timely presentation, especially among non-white communities, those with atypical symptoms, and those with heart failure," said lead author Dr. Gregory Ouellet, of Yale University.

Life in the city: Living nearer nature keeps your amygdala healthier



A study conducted at the Max Planck Institute for Human Development has investigated the relationship between the availability of nature near city dwellers' homes and their brain health. Its findings are relevant for urban planners, among others.

Noise, pollution, and many people in a confined space: Life in a city can cause chronic stress. City dwellers are at a higher risk of psychiatric illnesses such as depression, anxiety disorders, and schizophrenia than country dwellers. Comparisons show higher activity levels in the amygdala -- a central nucleus in the brain that plays an important role in stress processing and reactions to danger -- in city dwellers than in country dwellers.

Which factors can have a protective influence? A research team led by psychologist Simone Kühn has examined what effects nature near people's homes -- such as forest, urban green space, or wasteland -- has on stress-processing brain regions such as the amygdala.

"Research on brain plasticity supports the assumption that the environment can shape brain structure and function. That is why we are interested in the environmental conditions that may have positive effects on brain development. Studies of people in the countryside have already shown that living close to nature is good for their mental health and well-being. We therefore decided to examine city dwellers," explains first author Simone Kühn, who led the study at the Max Planck Institute for Human Development and now works at the University Medical Center Hamburg-Eppendorf (UKE).

Indeed, the researchers found a relationship between place of residence and brain health: city dwellers living close to a forest were more likely to show indications of a physiologically healthy amygdala structure and were therefore presumably better able to cope with stress. This effect remained stable when differences in educational qualifications and income levels were controlled for.

However, it was not possible to find an association between the examined brain regions and urban green, water, or wasteland. With these data, it is not possible to distinguish whether living close to a forest really has positive effects on the amygdala or whether people with a healthier amygdala might be more likely to select residential areas close to a forest. Based on present knowledge, however, the researchers regard the first explanation as more probable. Further longitudinal studies are necessary to accumulate evidence.

The participants in the present study are from the Berlin Aging Study II (BASE-II) - a larger longitudinal study examining the physical, psychological, and social conditions for healthy aging. In total, 341 adults aged 61 to 82 years took part in the present study. In addition to memory and reasoning tests, the structure of stress-processing brain regions, especially the amygdala, was assessed using magnetic resonance imaging (MRI). In order to examine the influence of nature close to people's homes on these brain regions, the researchers combined the MRI data with geoinformation about the participants' places of residence. This information stemmed from the European Environment Agency's Urban Atlas, which provides an overview of urban land use in Europe.

"Our study investigates the connection between urban planning features and brain health for the first time," says co-author Ulman Lindenberger, Director of the Center for Lifespan Psychology at the Max Planck Institute for Human Development. By 2050, almost 70 percent of the world population is expected to be living in cities. These results could therefore be very important for urban planning. In the near future, however, the observed association between the brain and closeness to forests would need to be confirmed in further studies and other cities, stated Ulman Lindenberger.