Friday, August 18, 2017

'Fat but fit' people are at increased risk of heart disease


Carrying extra weight could raise your risk of heart attack by more than a quarter, even if you are otherwise healthy.

Researchers have found that being overweight or obese increases a person's risk of coronary heart disease (CHD) by up to 28 per cent compared to those with a healthy bodyweight, even if they have healthy blood pressure, blood sugar and cholesterol levels.

The findings add to a growing body of evidence that suggests being 'fat but fit' is a myth, and that people should aim to maintain a body weight within a healthy range.

Storing too much fat in the body is associated with a number of metabolic changes, including increased blood pressure, high blood sugar and altered cholesterol levels, which can lead to disease and poor health. However, previous studies have revealed a subset of overweight people who appear to lack the adverse health effects of excess weight, leading to them being classified as 'metabolically healthy obese' in the medical literature, and 'fat but fit' in the media.

Now, a group led by researchers at Imperial College London and the University of Cambridge has shown that despite an apparent clean bill of health, this overweight group is still at increased risk compared to those with a healthy weight. In the largest study of its kind to date, scientists used data from more than half a million people in 10 European countries -- taken from the European Prospective Investigation into Cancer and Nutrition (EPIC) -- to show that excess weight is linked with an increased risk of heart disease, even when people have a healthy metabolic profile.

"Our findings suggest that if a patient is overweight or obese, all efforts should be made to help them get back to a healthy weight, regardless of other factors. Even if their blood pressure, blood sugar and cholesterol appear within the normal range, excess weight is still a risk factor," said lead author Dr Camille Lassale, from Imperial's School of Public Health and now based at University College London.

In the study, published in the European Heart Journal, researchers looked at the link between excess weight and risk of CHD, a condition where not enough blood gets through to the heart due to clogged arteries, leading to heart attacks.

After a follow-up period of more than 12 years, a total of 7,637 people in the EPIC cohort experienced CHD events, such as death from heart attack. Researchers then selected a representative group of more than 10,000 individuals as controls, for analysis.

Body weight was classified according to definitions from the World Health Organization. Those with a body mass index (BMI) over 30 were classed as obese, while those with a BMI of 25-30 were classed as overweight, and 18.5-25 as normal weight. More than half of the control group (63 per cent) were female, with an average age of 53.6 and an average BMI of 26.1.
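These WHO cut-offs amount to a simple calculation: BMI is weight in kilograms divided by height in metres squared, then bucketed by threshold. A minimal sketch (the function names are illustrative, not from the study):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value):
    """WHO adult weight categories as used in the study."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    if bmi_value >= 18.5:
        return "normal weight"
    return "underweight"

print(who_category(bmi(75, 1.70)))  # 75 kg at 1.70 m -> BMI ~26.0, prints "overweight"
```

The control group's average BMI of 26.1 thus falls just inside the overweight band.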

Participants were categorised as 'unhealthy' if they had three or more of a number of metabolic markers, including high blood pressure, blood glucose, or triglyceride levels, low levels of HDL cholesterol, or a waist size of more than 37" (94 cm) for men and 31" (80 cm) for women.

After adjusting for lifestyle factors such as smoking, diet, exercise and socioeconomic status, the researchers found that compared to the healthy normal weight group, those classed as unhealthy had more than double the risk of CHD, whether they were normal weight, overweight or obese.

However, the analysis also revealed that within the apparently healthy group there was a significant difference in outcomes depending on weight. Compared to those at normal weight, people who were classified as healthy but overweight had a relative risk of CHD of 1.26 (a 26 per cent increase), while those who were healthy but obese had a relative risk of 1.28 (a 28 per cent increase).

Dr Ioanna Tzoulaki, from Imperial's School of Public Health, said: "I think there is no longer this concept of healthy obese. If anything, our study shows that people with excess weight who might be classed as 'healthy' haven't yet developed an unhealthy metabolic profile. That comes later in the timeline, and then they have an event, such as a heart attack."

According to the researchers, the excess weight itself may not be increasing the risk of heart disease directly, but rather indirectly through mechanisms such as increased blood pressure and high glucose. They add that because no follow-up measurements were taken, they cannot show how the group's health status changed over time. However, what is clear from the study is that population-wide prevention and treatment of obesity are needed to protect public health.

Dr Lassale added: "Overall, our findings challenge the concept of the 'healthy obese'. The research shows that those overweight individuals who appear to be otherwise healthy are still at increased risk of heart disease."

Energy dense foods may increase cancer risk



Diet is believed to play a role in cancer risk. Current research shows that an estimated 30% of cancers could be prevented through nutritional modifications. While there is a proven link between obesity and certain types of cancer, less is known about how the ratio of energy to food weight, otherwise known as dietary energy density (DED), contributes to cancer risk. To find out, researchers looked at DED in the diets of post-menopausal women and discovered that consuming high DED foods was tied to a 10% increase in obesity-related cancer among normal weight women. Their findings are published in the Journal of the Academy of Nutrition and Dietetics.

DED is a measure of food quality and the relationship of calories to nutrients. The more calories per gram of weight a food has, the higher its DED.

Whole foods, including vegetables, fruits, lean protein, and beans are considered low-DED foods because they provide a lot of nutrients using very few calories. Processed foods, like hamburgers and pizza, are considered high-DED foods because you need a larger amount to get necessary nutrients. Previous studies have shown that regular consumption of foods high in DED contributes to weight gain in adults.
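DED itself is just the ratio of calories to food weight. A minimal sketch of the calculation, using illustrative approximate values rather than figures from the study:

```python
def energy_density(kcal, grams):
    """Dietary energy density (DED): calories per gram of food."""
    return kcal / grams

# Illustrative, approximate values for comparison (not from the study):
print(energy_density(285, 100))  # a pizza slice: ~2.85 kcal/g, high DED
print(energy_density(34, 100))   # broccoli: ~0.34 kcal/g, low DED
```

The higher the calories packed into each gram, the higher the DED score.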

In order to gain a better understanding of how DED alone relates to cancer risk, researchers used data on 90,000 postmenopausal women from the Women's Health Initiative including their diet and any diagnosis of cancer. The team found that women who consumed a diet higher in DED were 10% more likely to develop an obesity-related cancer, independent of body mass index. In fact, the study revealed that the increased risk appeared limited to women who were of a normal weight at enrollment in the program.

"The demonstrated effect in normal-weight women in relation to risk for obesity-related cancers is novel and contrary to our hypothesis," explained lead investigator Cynthia A. Thomson, PhD, RD, Professor of Health Promotion Sciences at the University of Arizona Mel and Enid Zuckerman College of Public Health in Tucson, AZ. "This finding suggests that weight management alone may not protect against obesity-related cancers should women favor a diet pattern indicative of high energy density."

Although restricting energy-dense foods may play a role in weight management, the investigators found that weight gain was not solely responsible for the rise in cancer risk among normal-weight women in the study. They hypothesize that a higher-DED diet in normal-weight women may cause metabolic dysregulation that is independent of body weight, and metabolic dysregulation is known to increase cancer risk.

While further study is needed to understand how DED may play a role in cancer risk for other populations, such as young people and men, this information may help persuade postmenopausal women to choose low-DED foods, even if they are already at a healthy body mass index.

"Among normal-weight women, higher DED may be a contributing factor for obesity-related cancers," concluded Dr. Thomson. "Importantly, DED is a modifiable risk factor. Nutrition interventions targeting energy density as well as other diet-related cancer preventive approaches are warranted to reduce cancer burden among postmenopausal women."

Outdoor light at night linked with increased breast cancer risk


Women who live in areas with higher levels of outdoor light at night may be at higher risk for breast cancer than those living in areas with lower levels, according to a large long-term study from Harvard T.H. Chan School of Public Health. The link was stronger among women who worked night shifts.

The study will be published online August 17, 2017 in Environmental Health Perspectives.

"In our modern industrialized society, artificial lighting is nearly ubiquitous. Our results suggest that this widespread exposure to outdoor lights during nighttime hours could represent a novel risk factor for breast cancer," said lead author Peter James, assistant professor at Harvard Medical School's Department of Population Medicine at Harvard Pilgrim Health Care Institute, who did the work while a research fellow in the Departments of Epidemiology and Environmental Health at Harvard Chan School.

Previous studies have suggested that exposure to light at night may lead to decreased levels of the hormone melatonin, which can disrupt circadian rhythms--our internal "clocks" that govern sleepiness and alertness--and, in turn, lead to increased breast cancer risk.

The new study, the most comprehensive to date to examine possible links between outdoor light at night and breast cancer, looked at data from nearly 110,000 women enrolled in the Nurses' Health Study II from 1989-2013. The researchers linked data from satellite images of Earth taken at nighttime to residential addresses for each study participant, and also considered the influence of night shift work. The study also factored in detailed information on a variety of health and socioeconomic factors among participants.

Women exposed to the highest levels of outdoor light at night--those in the top fifth--had an estimated 14% increased risk of breast cancer during the study period, as compared with women in the bottom fifth of exposure, the researchers found. As levels of outdoor light at night increased, so did breast cancer rates.

The association between outdoor light at night and breast cancer was found only among women who were premenopausal and those who were current or past smokers. In addition, the link was stronger among women who worked night shifts, suggesting that exposure to light at night and night shift work contribute jointly to breast cancer risk, possibly through mechanisms involving circadian disruption. The authors acknowledged that further work is required to confirm the study findings and clarify potential mechanisms.

Smoking linked to frailty in older adults


A recent paper published in Age & Ageing, the scientific journal of the British Geriatrics Society, finds that current smoking in older people increases the risk of developing frailty, though former smokers did not appear to be at higher risk.

Smoking increases the risk of developing a number of diseases, such as chronic obstructive pulmonary disease (COPD), coronary heart disease, stroke and peripheral vascular disease, all of which can potentially have negative effects on people's physical, psychological and social health.

Frailty is considered a precursor to, but a distinct state from, disability. Frailty is a condition associated with decreased physiological reserve and increased vulnerability to adverse health outcomes. The outcomes include falls, fractures, disability, hospitalisation and institutionalisation. Frailty has also been shown to be linked to worse psychological or cognitive outcomes, such as poor quality of life and dementia.

Due to the potential for reversibility of frailty, identifying potentially modifiable risk factors of frailty may help to develop strategies to prevent or slow progression of adverse health outcomes associated with both frailty and smoking.

Researchers here aimed to examine the association of smoking with the risk of developing frailty, controlling for important confounding variables and using data from a nationally representative sample of older men and women living in England.

Researchers defined frailty using a combination of five physical components: unintentional weight loss, self-reported exhaustion, weakness, slow walking speed, and low physical activity. Participants meeting three or more of the five criteria were classified as frail.
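The frailty classification (three or more of the five physical components present) can be sketched as a simple count; the criterion names here are illustrative labels, not the study's variable names:

```python
# The five physical frailty components used in the study
FRAILTY_CRITERIA = {
    "unintentional_weight_loss",
    "exhaustion",
    "weakness",
    "slow_walking_speed",
    "low_physical_activity",
}

def is_frail(present_criteria):
    """Frail if three or more of the five components are present."""
    return len(set(present_criteria) & FRAILTY_CRITERIA) >= 3

print(is_frail(["exhaustion", "weakness", "slow_walking_speed"]))  # True
print(is_frail(["weakness"]))  # False
```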

The current study used data from participants who were aged 60 years or older. The final sample for this study was 2,542 participants, divided into two groups: current smokers and non-smokers. The non-smokers were further divided into two groups: past smokers and never smokers. The past smokers were in turn divided into two groups: those who quit within the last 10 years and those who quit more than 10 years ago. The analysis revealed that current smoking was associated with an approximately 60% increased risk of developing frailty.

There was, however, no significant association between past smoking and incident frailty in any of the models. Among the 1,113 past smokers, 157 had quit smoking within the last 10 years and 956 had quit more than 10 years ago. The incident frailty risks of these two groups were not significantly different from that of never smokers in any of the models.

When COPD was added to the model, current smoking was no longer a significant predictor of incident frailty. In this model, COPD was strongly associated with incident frailty. These findings suggest that current smokers are more likely to develop frailty due to COPD, rather than smoking itself.

Given that smoking is a modifiable lifestyle factor, and smokers who quit did not appear to be at high risk for frailty, this research suggests that smoking cessation may potentially prevent or delay developing frailty, even in old age.

"Our study showed that current smoking is a risk factor for developing frailty. Additional analyses revealed that COPD seems to be a main factor on the causal pathway from smoking towards frailty," said the study's author, Gotaro Kojima, "but those who quit smoking did not carry over the risk of frailty."

Thursday, August 17, 2017

Children at Higher Risk of Type 2 Diabetes Sleep an Hour Less Each Night


A study in the September 2017 Pediatrics found that children who slept on average one hour less a night had higher risk factors for Type 2 diabetes, including higher levels of blood glucose and insulin resistance. The study, "Sleep Duration and the Risk of Type 2 Diabetes" (published online Aug. 15), also confirmed prior research showing an association between shorter sleep duration and higher levels of body fat. Researchers analyzed the body measurements, blood sample results and questionnaire data from 4,525 children of multi-ethnic descent, aged 9-10 years, in England. Children who slept longer had lower body weight and lower levels of fat mass. Sleep duration was also inversely related to insulin, insulin resistance and blood glucose.

On average, children slept 10.5 hours per night (95% range 8.0–12.0 hours). There were strong inverse graded relationships between sleep duration, adiposity, and diabetes risk markers. In adjusted models, a 1-hour-longer sleep duration was associated with a 0.19 kg/m² lower BMI (95% confidence interval [CI] 0.09 to 0.28), a 0.03 kg/m⁵ lower fat mass index (95% CI 0.00 to 0.05 kg/m⁵), 2.9% lower homeostasis model assessment insulin resistance (95% CI 1.2% to 4.4%), and 0.24% lower fasting glucose (95% CI 0.03% to 0.44%); there was no association with HbA1c or cardiovascular risk. The associations with insulin and glucose remained after an additional adjustment for adiposity markers.


Wednesday, August 16, 2017

Could olfactory loss point to Alzheimer's disease?


Odor identification tests may help scientists track the evolution of the disease in persons at risk
By the time you start losing your memory, it's almost too late. That's because the damage to your brain associated with Alzheimer's disease (AD) may already have been going on for as long as twenty years. This is why there is so much scientific interest in finding ways to detect the presence of the disease early on. Scientists now believe that simple odor identification tests may help track the progression of the disease before symptoms actually appear, particularly among those at risk.

"Despite all the research in the area, no effective treatment has yet been found for AD," says Dr. John Breitner, the director of the Centre for Studies on Prevention of Alzheimer's Disease at the Douglas Mental Health Research Centre of McGill University. He is one of the authors of the study on the subject that was recently published in the journal Neurology. "But, if we can delay the onset of symptoms by just five years, we should be able to reduce the prevalence and severity of these symptoms by more than 50%."

Bubble gum or gasoline?

Close to 300 people with an average age of 63, who are at risk of developing AD because a parent had suffered from the disease, were asked to take multiple-choice scratch-and-sniff tests to identify scents as varied as bubble gum, gasoline and lemon. One hundred of them also volunteered to undergo regular lumbar punctures to measure the quantities of various AD-related proteins in their cerebrospinal fluid (CSF).

The researchers found that those with the most difficulty in identifying odors were those in whom other, purely biological indicators of AD, were most evident.

"This is the first time that anyone has been able to show clearly that the loss of the ability to identify smells is correlated with biological markers indicating the advance of the disease," says Marie-Elyse Lafaille-Magnan, a doctoral student at McGill and the first author on the study. "For more than 30 years, scientists have been exploring the connection between memory loss and the difficulty that patients may have in identifying different odours. This makes sense because it's known that the olfactory bulb (involved with the sense of smell) and the entorhinal cortex (involved with memory and naming of odours) are among the first brain structures to be affected by the disease."

A cheaper way to track progression of Alzheimer's disease

"This means that a simple smell test may potentially be able to give us information about the progression of the disease that is similar to the much more invasive and expensive tests of the cerebrospinal fluid that are currently being used," said the director of the research program on Aging, Cognition and Alzheimer's Disease at the Douglas Institute, who is one of the authors on the study. "However, problems identifying smells may be indicative of other medical conditions apart from AD and so should not be substituted for the current tests."

The researchers caution that far more work needs to be done to see how changes in a person's ability to identify smells over time relate to the progression of the disease itself. For the time being, smell tests are simply one more avenue to explore as researchers look for ways to identify the disease before the symptoms actually begin to appear.

Walnuts activate brain region involved in appetite control


Packed with nutrients linked to better health, walnuts are also thought to discourage overeating by promoting feelings of fullness. Now, in a new brain imaging study, researchers at Beth Israel Deaconess Medical Center (BIDMC) have demonstrated that consuming walnuts activates an area in the brain associated with regulating hunger and cravings. The findings, published online in the journal Diabetes, Obesity and Metabolism, reveal for the first time the neurocognitive impact these nuts have on the brain.

"We don't often think about how what we eat impacts the activity in our brain," said the study's first author Olivia M Farr, PhD, an instructor in medicine in the Division of Endocrinology, Diabetes and Metabolism at BIDMC. "We know people report feeling fuller after eating walnuts, but it was pretty surprising to see evidence of activity changing in the brain related to food cues, and by extension what people were eating and how hungry they feel."

To determine exactly how walnuts quell cravings, Farr and colleagues, in a study led by Christos Mantzoros, MD, DSc, PhD hc mult, director of the Human Nutrition Unit at Beth Israel Deaconess Medical Center and professor of medicine at Harvard Medical School, used functional magnetic resonance imaging (fMRI) to observe how consuming walnuts changes activity in the brain.

The scientists recruited 10 volunteers with obesity to live in BIDMC's Clinical Research Center (CRC) for two five-day sessions. The controlled environment of the CRC allowed the researchers to keep tabs on the volunteers' exact nutritional intake, rather than depend on volunteers' often unreliable food records - a drawback to many observational nutrition studies.

During one five-day session, volunteers consumed daily smoothies containing 48 grams of walnuts - the serving recommended by the American Diabetes Association (ADA) dietary guidelines. During their other stay in the CRC, they received a walnut-free but nutritionally comparable placebo smoothie, flavored to taste exactly the same as the walnut-containing smoothie. The order of the two sessions was random, meaning some participants would consume the walnuts first and others would consume the placebo first. Neither the volunteers nor the researchers knew during which session they consumed the nutty smoothie.

As in previous observational studies, participants reported feeling less hungry during the week they consumed walnut-containing smoothies than during the week they were given the placebo smoothies. fMRI tests administered on the fifth day of the experiment gave Farr, Mantzoros and the team a clear picture as to why.

While in the machine, study participants were shown images of desirable foods like hamburgers and desserts, neutral objects like flowers and rocks, and less desirable foods like vegetables.

When participants were shown pictures of highly desirable foods, fMRI imaging revealed increased activity in a part of the brain called the right insula after participants had consumed the five-day walnut-rich diet compared to when they had not.

"This is a powerful measure," said Mantzoros. "We know there's no ambiguity in terms of study results. When participants eat walnuts, this part of their brain lights up, and we know that's connected with what they are telling us about feeling less hungry or more full."

This area of the insula is likely involved in cognitive control and salience, meaning that participants were paying more attention to food choices and selecting the less desirable or healthier options over the highly desirable or less healthy options. Farr and Mantzoros next plan to test different amounts, or dosages, of walnuts to see whether more nuts will lead to more brain activation or if the effect plateaus after a certain amount. This experiment will also allow researchers to test other compounds for their effect on this system.

Similar studies could reveal how other foods and compounds, such as naturally-occurring hormones, impact the appetite-control centers in the brain. Future research could eventually lead to new treatments for obesity.

"From a strategic point of view, we now have a good tool to look into people's brains - and we have a biological readout," said Mantzoros. "We plan to use it to understand why people respond differently to food in the environment and, ultimately, to develop new medications to make it easier for people to keep their weight down."

Tuesday, August 15, 2017

Why expensive wine appears to taste better: It's the price tag



Previous research has shown that a higher price, for instance for chocolate or wine, increased the expectation that the product will also taste better and in turn affects taste processing regions in the brain. 

"However, it has so far been unclear how the price information ultimately causes more expensive wine to also be perceived as having a better taste in the brain," says Prof. Bernd Weber, Acting Director of the Center for Economics and Neuroscience (CENs) at the University of Bonn. The phenomenon that identical products are perceived differently due to differences in price is called the "marketing placebo effect." As with placebo medications, it has an effect solely due to ascribed properties: "Quality has its price!"

The researchers assessed how different prices are translated into corresponding taste experiences in the brain, even when the wine tasted does not differ. Thirty participants took part in the study, 15 women and 15 men, with an average age of around 30 years.

Wine tasting while lying down

The wine tasting took place lying down in an MRI scanner, allowing brain activity to be recorded "online" while participants were tasting the wines. Each time, the price of the wine was shown first. Only then was around a milliliter of the respective wine administered to the participant via a tube in their mouth. The participants were then asked to rate, via a button on a nine-point scale, how good the wine tasted to them. Their mouths were then rinsed with a neutral liquid and the next identical wine sample was given for tasting. All of the experiments were performed in the brain scanner at the Life & Brain Center at the University of Bonn.

"The marketing placebo effect has its limits: If, for example, a very low-quality wine is offered for 100 euros, the effect would predictably be absent," says Prof. Weber. This is why the researchers conducted the tests using an average-to-good-quality red wine with a retail bottle price of €12. In the MRI scanner, the price of this wine was shown randomly as €3, €6 and €18. In order to make the study as realistic as possible, the participants were given 45 euros of initial credit, and in some of the trials the displayed sum was deducted from this account.

"As expected, the subjects stated that the wine with the higher price tasted better than an apparently cheaper one," reports Professor Hilke Plassmann from the INSEAD Business School, with campuses in Fontainebleau (France), Singapore and Abu Dhabi. "However, it was not important whether the participants also had to pay for the wine or whether they were given it for free." Identical wine leads to a better taste experience when a greater quality expectation is associated with the wine due to its price.

The measurements of brain activity in the MRI scanner confirmed this. The research team discovered that, above all, parts of the medial prefrontal cortex and the ventral striatum were activated more when prices were higher. While the medial prefrontal cortex particularly appears to be involved in integrating the price comparison, and thus the expectation, into the evaluation of the wine, the ventral striatum forms part of the brain's reward and motivation system. "The reward and motivation system is activated more significantly with higher prices and apparently increases the taste experience in this way," says Prof. Weber.

How can placebo effects be inhibited?

"Ultimately, the reward and motivation system plays a trick on us," explains INSEAD post-doctoral fellow Liane Schmidt. When prices are higher, it leads us to believe that a taste is present that is not only driven by the wine itself, because the products were objectively identical in all of the tastings. "The exciting question is now whether it is possible to train the reward system to make it less receptive to such placebo marketing effects," says Prof. Weber. This may be possible by training one's own physical perception -- such as taste -- to a greater extent.

Monday, August 14, 2017

Automatic blood pressure devices are prone to significant errors



An estimated 1 in 3 U.S. adults have high blood pressure. Blood pressure levels are often assessed by using automatic blood pressure devices. But these automatic devices are prone to significant errors, sometimes leading to the prescription of blood pressure-lowering medications to patients who don't actually need them. Now researchers at the Jerusalem College of Technology and the Shaare Zedek Medical Center in Israel have developed a method to more accurately measure systolic blood pressure. They present their research findings today at the Cardiovascular Aging: New Frontiers and Old Friends conference in Westminster, Colo.

A systolic blood pressure measurement of 140 mmHg or higher and a diastolic measurement of 90 mmHg or higher (140/90 mmHg) is considered high. Blood pressure is usually assessed using either a manual (auscultatory) or automatic (oscillometric) meter in a doctor's office or hospital. However, these measurements can be affected by "white coat syndrome" -- a patient's fear or anxiety in a doctor's office causes their blood pressure to measure above normal levels. To avoid the white coat effect, at-home automatic measurements taken by the patient may be required, but available oscillometry-based automatic meters offer a low level of accuracy.

"The automatic oscillometric technique is less accurate than the manual auscultatory technique, when both are used in the clinician's office," Meir Nitzan, PhD, the new study's first author, said. Currently available automatic blood pressure measurement devices are commonly off by 10 to 15 mmHg. This is mainly due to indirect determination of the blood pressure from the oscillometric air-pressure wave measurements taken by automatic devices.

A patient with an incorrect high blood pressure diagnosis may be prescribed blood pressure-lowering medication unnecessarily. These medications can cause patients' blood pressure to dip too low (hypotension); elderly patients are especially at risk. Side effects of hypotension can include short-term symptoms such as dizziness and fainting and long-term problems such as insufficient blood supply to vital organs, which can lead to acute kidney injury and cognitive impairment.

The research team has developed a device -- using a technique called photoplethysmography -- to more accurately measure systolic blood pressure. The device uses a pressure cuff wrapped around the arm and an electro-optic probe on the finger. "The finger probe is similar to that of a pulse oximeter: It includes a light source emitting light into the finger and a detector, which measures the light transmitted through the finger," Nitzan explained. "The transmitted light exhibits pulses at the heart rate, due to cardiac-induced blood volume changes in the finger tissue. When the cuff pressure increases to above systolic blood pressure these pulses disappear, and when the cuff pressure decreases to below systolic blood pressure they reappear. This effect enables the determination of systolic blood pressure."
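The detection principle Nitzan describes (deflate the cuff from above systolic pressure and record the cuff pressure at which finger pulses reappear in the light signal) can be sketched as follows. This is a simplified illustration on idealised, assumed inputs, not the actual device's algorithm:

```python
def systolic_from_ppg(cuff_pressures, pulse_amplitudes, threshold=0.05):
    """Estimate systolic blood pressure from a cuff-deflation sweep.

    cuff_pressures: descending cuff pressures (mmHg), one per heartbeat window
    pulse_amplitudes: normalised PPG pulse amplitude in each window
    threshold: amplitude below which the pulse is considered absent
    Returns the cuff pressure at which pulses first reappear, or None.
    """
    for pressure, amplitude in zip(cuff_pressures, pulse_amplitudes):
        if amplitude > threshold:  # first window where pulses reappear
            return pressure
    return None

# Idealised example: pulses are absent above ~140 mmHg and reappear below it
pressures = [180, 170, 160, 150, 140, 130, 120]
amplitudes = [0.00, 0.01, 0.02, 0.04, 0.30, 0.55, 0.60]
print(systolic_from_ppg(pressures, amplitudes))  # 140
```

A real device would additionally need robust pulse detection in the raw light signal and interpolation between deflation steps.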

Knee arthritis on the rise


The average American today is twice as likely to be diagnosed with knee osteoarthritis as in the years before World War II, Harvard scientists say, but that increase can't be blamed on the reasons most might think.

Based on the examination of more than 2,000 skeletons from cadaveric and archaeological collections across the U.S., the Harvard study is the first to definitively show that knee osteoarthritis prevalence has dramatically increased in recent decades. The research also upends the popular belief that knee osteoarthritis is a wear-and-tear disease that is widespread today simply because more people are living longer and are more commonly obese. The study is described in a paper published this week in the Proceedings of the National Academy of Sciences.

"Before this study, it was assumed without having been tested that the prevalence of knee osteoarthritis has changed over time," said Ian Wallace, the study's first author and a post-doctoral fellow in the lab of Daniel Lieberman, the Edwin M. Lerner II Professor of Biological Sciences and senior author of the study. "We were able to show, for the first time, that this pervasive cause of pain is actually twice as common today as it was even in the recent past. But the even bigger surprise is that it's not just because people are living longer or getting fatter, but for other reasons likely related to our modern environments."

Understanding the disease, Wallace and Lieberman said, is important not only because it is extremely prevalent today, affecting an estimated one-third of Americans over age 60, but also because it is responsible for more disability than almost any other musculoskeletal disorder.

"Understanding the origins of knee osteoarthritis is an urgent challenge because the disease is almost entirely untreatable apart from joint replacement, and once someone has knee osteoarthritis, it creates a vicious circle," Lieberman said. "People become less active, which can lead to a host of other problems, and their health ends up declining at a more rapid rate."

Wallace and Lieberman think that this study has the potential to shift the popular perception of knee osteoarthritis as an inevitable consequence of aging, and instead focus on efforts to prevent the disease - much like we now do with heart disease.

"There are a lot of well-understood risk factors for heart disease, so doctors can advise their patients to do certain things to decrease their chances of getting it," Lieberman said. "We think knee osteoarthritis belongs in the same category because it's evidently more preventable than commonly assumed. But to prevent the disease more work needs to be done to figure out its causes."

To do that, Wallace and Lieberman are currently addressing the question of the etiology of knee osteoarthritis from a variety of methodological approaches including studies of living human populations and animal models, but their first goal was to figure out how ancient the disease actually is, and whether it really is on the rise.

"There are famous examples in the fossil record of individuals, even Neanderthals, with osteoarthritis," Lieberman said. "But we thought, let's look at the data, because nobody had really done that in a comprehensive way before."

To find those data, Wallace undertook the daunting task of crisscrossing the country to examine thousands of skeletons spanning more than 6,000 years of human history to search for evidence of eburnation - a tell-tale sign of osteoarthritis.

"When your cartilage erodes away, and two bones that comprise a joint come into direct contact, they rub against each other causing a glass-like polish to develop," Wallace said. "That polish, called eburnation, is so clear and obvious that we can use it to very accurately diagnose osteoarthritis in skeletal remains."

The data Wallace collected was combined with analyses from other contributors to the study, making this the largest sample ever studied of older-aged individuals from three broad time periods - prehistoric times, early industrial times (mainly the 1800s), and the modern post-industrial era.

"The most important comparison is between the early industrial and modern samples," Lieberman said. "Because we had data on each individual's age, sex, body weight, ethnicity, and in many cases, their occupation and cause of death, we were able to correct for a number of factors that we considered important covariates. So using careful statistical methods, we are able to say that if you were born after World War II you have approximately twice the likelihood of getting knee osteoarthritis at a given age or BMI than if you were born earlier."

Wallace and Lieberman are now working to identify what factors may be behind the increase, and said the evolutionary approach to the study is a critical part of that ongoing work.

"Epidemiology typically looks at large cohorts of individuals living today to search for associations between a disease and risk factors," Lieberman said. "That's a powerful and valuable method, but it has one critical limitation, which is that the world today is different in many ways from the world in the past, hiding important risk factors that are either no longer prevalent or have become ubiquitous. An evolutionary perspective opens new opportunities to test for associations we might not be able to study in populations like modern-day America."

That perspective, Wallace and Lieberman said, allows researchers to zero in on specific things that changed pre- to post-World War II, and understand how they might contribute to the rise in knee osteoarthritis prevalence.

"This is an example of how evolutionary thinking can contribute to our understanding of what causes certain diseases," Wallace said. "We identified the post-war period as a critical time...and it's only with an evolutionary perspective that we gain that insight."

Ultimately, Wallace and Lieberman hope their study inspires new research to prevent knee osteoarthritis.

"Knee osteoarthritis is not a necessary consequence of old age. We should think of this as a partly preventable disease," Lieberman said. "Wouldn't it be great if people could live to be 60, 70 or 80 and never get knee osteoarthritis in the first place? Right now, our society is barely focusing on prevention in any way, shape or form, so we need to redirect more interest toward preventing this and other so-called diseases of aging."

Light-to-moderate alcohol consumption may have protective health effects


Light-to-moderate drinking can lower risk of mortality from all-causes and cardiovascular disease, while heavy drinking can significantly increase risk of mortality from all-causes and cancer, according to a new study published today in the Journal of the American College of Cardiology.

High alcohol consumption has been linked to a host of health issues, including cardiovascular disease, but alcohol in moderation is widely recommended. However, despite these recommendations, studies on the risk of mortality among light-to-moderate drinkers are inconsistent. Researchers in this study examined the association between alcohol consumption and risk of mortality from all causes, cancer and cardiovascular disease in the U.S.

The researchers looked at data from 333,247 participants obtained through the National Health Interview Surveys from 1997 to 2009. Study participants were surveyed regarding their alcohol consumption status and patterns of use. Alcohol consumption patterns were divided into six categories: lifetime abstainers, lifetime infrequent drinkers, former drinkers, and current light (fewer than three drinks per week), moderate (more than three but fewer than 14 drinks per week for men, or fewer than seven for women) or heavy drinkers (14 or more drinks per week for men, or seven or more for women).
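For current drinkers, the study's categories reduce to a small set of cut-offs. A minimal sketch in Python (the function name and the handling of boundary values are assumptions here; the abstainer, infrequent-drinker and former-drinker categories require drinking-history data and are omitted):

```python
def drinking_category(drinks_per_week, sex):
    """Classify a current drinker using the study's reported cut-offs.

    sex is "M" or "F". The article phrases the light/moderate boundary as
    "less than three" vs. "more than three" drinks per week, so the
    treatment of exactly 3 drinks here is an assumption.
    """
    heavy_cutoff = 14 if sex == "M" else 7  # drinks per week
    if drinks_per_week < 3:
        return "light"
    elif drinks_per_week < heavy_cutoff:
        return "moderate"
    else:
        return "heavy"

print(drinking_category(2, "M"))   # → light
print(drinking_category(10, "M"))  # → moderate
print(drinking_category(10, "F"))  # → heavy
```

Note that the same weekly intake, 10 drinks, lands in different categories for men and women, which is why the mortality results below are reported separately by sex.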

"Our research shows that light-to-moderate drinking might have some protective effects against cardiovascular disease, while heavy drinking can lead to death. A delicate balance exists between the beneficial and detrimental effects of alcohol consumption, which should be stressed to consumers and patients," said Bo Xi, MD, associate professor at the Shandong University School of Public Health in China and the study's lead author.

Throughout the length of the study, 34,754 participants died from all-causes. Of these, 8,947 mortalities were cardiovascular disease-specific (6,944 heart disease-related and 2,003 cerebrovascular-related deaths) and 8,427 mortalities were cancer-specific.

Researchers found that male heavy drinkers had a 25 percent increased risk of mortality due to all causes and a 67 percent increased risk of mortality from cancer. These increases were not statistically significant in women. No association was found between heavy drinking and cardiovascular disease mortality. Moderate drinking was associated with a 13 percent and 25 percent decreased risk of all-cause mortality, and a 21 percent and 34 percent decreased risk of cardiovascular disease mortality, in men and women, respectively. Similar findings were observed for light drinking in both genders.

"We have taken rigorous statistical approaches to address issues reported in earlier studies such as abstainer bias, sick quitter phenomenon and limited confounding adjustment in our study. A J-shaped relationship exists between alcohol consumption and mortality, and drinkers should drink with consciousness," said one of the study's authors, Sreenivas Veeranki, MD, DrPH, assistant professor in preventive medicine and community health at University of Texas Medical Branch.

Limitations to the study include obtaining alcohol consumption status through survey responses that may be subject to recall bias, as well as using self-reported responses at baseline.

In an accompanying editorial, Giovanni de Gaetano, MD, PhD, director of the Department of Epidemiology and Prevention at IRCCS Istituto Neurologico Mediterraneo Neuromed said that while younger adults should not expect considerable benefit from moderate drinking, "for most older persons, the overall benefits of light drinking, especially the reduced cardiovascular disease risk, clearly outweigh possible cancer risk."

Secret to happiness may include more unpleasant emotions


People may be happier when they feel the emotions they desire, even if those emotions are unpleasant, such as anger or hatred, according to research published by the American Psychological Association.

"Happiness is more than simply feeling pleasure and avoiding pain. Happiness is about having experiences that are meaningful and valuable, including emotions that you think are the right ones to have," said lead researcher Maya Tamir, PhD, a psychology professor at The Hebrew University of Jerusalem. "All emotions can be positive in some contexts and negative in others, regardless of whether they are pleasant or unpleasant."

The cross-cultural study included 2,324 university students in eight countries: the United States, Brazil, China, Germany, Ghana, Israel, Poland and Singapore. The research, which was published online in the Journal of Experimental Psychology: General, is the first study to find this relationship between happiness and experiencing desired emotions, even when those emotions are unpleasant, Tamir said.
 
Participants generally wanted to experience more pleasant emotions and fewer unpleasant emotions than they felt in their lives, but that wasn't always the case. Interestingly, 11 percent of the participants wanted to feel fewer transcendent emotions, such as love and empathy, than they experienced in daily life, and 10 percent wanted to feel more unpleasant emotions, such as anger or hatred. There was only a small overlap between those groups.

For example, someone who feels no anger when reading about child abuse might think she should be angrier about the plight of abused children, so she wants to feel more anger than she actually does in that moment, Tamir said. A woman who wants to leave an abusive partner but isn't willing to do so may be happier if she loved him less, Tamir said.

Participants were surveyed about the emotions they desired and the emotions they actually felt in their lives. They also rated their life satisfaction and depressive symptoms. Across cultures in the study, participants who experienced more of the emotions that they desired reported greater life satisfaction and fewer depressive symptoms, regardless of whether those desired emotions were pleasant or unpleasant. Further research is needed, however, to test whether feeling desired emotions truly influences happiness or is merely associated with it, Tamir said.

The study assessed only one category of unpleasant emotions known as negative self-enhancing emotions, which includes hatred, hostility, anger and contempt. Future research could test other unpleasant emotions, such as fear, guilt, sadness or shame, Tamir said. Pleasant emotions that were examined in the study included empathy, love, trust, passion, contentment and excitement. Prior research has shown that the emotions that people desire are linked to their values and cultural norms, but those links weren't directly examined in this research.

The study may shed some light on the unrealistic expectations that many people have about their own feelings, Tamir said.

"People want to feel very good all the time in Western cultures, especially in the United States," Tamir said."Even if they feel good most of the time, they may still think that they should feel even better, which might make them less happy overall."

Lower brain serotonin levels are linked to dementia


In a study looking at brain scans of people with mild loss of thought and memory ability, Johns Hopkins researchers report evidence of lower levels of the serotonin transporter -- a natural brain chemical that regulates mood, sleep and appetite.

Previous studies from Johns Hopkins and other centers have shown that people with Alzheimer's disease and severe cognitive decline have severe loss of serotonin neurons, but the studies did not show whether those reductions were a cause or effect of the disease. Results of the new study of people with very early signs of memory decline, the researchers say, suggest that lower serotonin transporters may be drivers of the disease rather than a byproduct.

A report on the study, published in the September issue of Neurobiology of Disease, also suggests that finding ways to prevent the loss of serotonin or introducing a substitute neurotransmitter could slow or stop the progression of Alzheimer's disease and perhaps other dementias.

"Now that we have more evidence that serotonin is a chemical that appears affected early in cognitive decline, we suspect that increasing serotonin function in the brain could prevent memory loss from getting worse and slow disease progression," says Gwenn Smith, Ph.D., professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine and director of geriatric psychiatry and neuropsychiatry there.

Serotonin levels that are lower and out of balance with other brain chemicals such as dopamine are well known to significantly impact mood, particularly depression, and drugs that block the brain's "reuptake" of serotonin (known as SSRIs) are specific treatments for some major forms of depression and anxiety.

Smith notes that researchers have tried with limited success to treat Alzheimer's disease and cognitive impairment with antidepressants such as SSRIs, which bind to the serotonin transporters. But, since these transporters are at much lower levels in people with Alzheimer's, she speculates that the drugs can't serve their purpose without their target.

The idea for Smith's study was inspired by the work of co-author Alena Savonenko, M.D., Ph.D., associate professor of pathology, and her colleagues who showed that loss of serotonin neurons was associated with more protein clumps, or amyloid, in mouse brain.

To further study serotonin's role in cognition and neurodegenerative disease, the Johns Hopkins research team used brain positron emission tomography (PET) scans to look at levels of serotonin in the brains of people with mild cognitive problems, which may be early harbingers of Alzheimer's disease or other dementias.

For the study, the researchers recruited participants with community newspaper ads and flyers, as well as from the Johns Hopkins Memory and Alzheimer's Treatment Center. They matched 28 participants with mild cognitive impairment to 28 healthy controls. Participants were an average age of 66 and about 45 percent were women. People with mild cognitive impairment were defined as those who have a slight decline in cognition, mainly in memory in terms of remembering sequences or organization, and who score lower on tests such as the California Verbal Learning Test, which requires participants to recall a list of related words, such as a shopping list. According to Smith, the inability to do this test accurately reflects changes in memory and cognitive impairment indicative of Alzheimer's disease.

Each participant underwent an MRI and PET scan to measure brain structures and levels of the serotonin transporter. During the PET scans, participants were given a chemical -- similar in structure to an antidepressant but not at a high enough dose to have a pharmacological effect -- labeled with a radioactive carbon. The chemical binds to the serotonin transporter and the PET scanner detects the radioactivity. When a neuron sends a message it releases the neurotransmitter serotonin, which is detected by the next neuron receiving the message. After this nerve impulse transmission is complete, the serotonin transporter SERT grabs up the serotonin and carts it back into the message-sending cell, a metabolic process marked by the ebb and flow of the chemical.

Normally, as people age, the serotonin neurons are especially vulnerable to neurodegeneration, so the transporters are lost when these neurons die and serotonin levels go down. The older a person is, the more likely they are to have lower serotonin levels. That being said, the researchers found that people with mild cognitive impairment had up to 38 percent less SERT detected in their brains compared to each of their age-matched healthy controls. And not a single person with mild cognitive impairment had higher levels of SERT compared to their healthy control.

Each participant also underwent learning and memory tests. In the California Verbal Learning Test, on a scale of 0 to 80, with 80 reflecting the best memory, the healthy participants had an average score of 55.8, whereas those with mild cognitive impairment scored an average of 40.5.

With the Brief Visuospatial Memory Test, participants were shown a series of shapes to remember and draw later. From a scale of 0 to 36, with 36 being the top score, healthy people scored an average of 20.0 and those with mild cognitive problems scored an average of 12.6.

The researchers then compared the results from the brain imaging tests for the serotonin transporter to those two memory tests, and found that the lower serotonin transporters correlated with lower scores. For example, those people with mild cognitive impairment had 37 percent lower verbal memory scores and 18 percent lower levels of SERT in the brain's hippocampus compared to healthy controls.

Smith says her group is investigating whether PET imaging of serotonin could be a marker to detect progression of disease, whether alone or in conjunction with scans that detect the clumping protein known as amyloid that accumulates in the brains of those with Alzheimer's disease.

When it comes to targeting the disease, because of reduced levels of the serotonin transporters, Smith says, the receptors that detect serotonin on message-receiving cells might be a better option. There are 14 types of serotonin receptors that could be used as possible targets. She says a number of experimental drugs now in clinical trials are designed to target serotonin in other ways in the brain, and may have better success than the SSRIs.

About 5.5 million people in the U.S. have Alzheimer's disease-caused dementia, and that number is expected to rise with the increasingly aging population.

High sugar consumption gives rise to dental treatment costs in the billions



Worldwide, people are eating far too much sugar. This has negative consequences for their teeth and for their wallets: at the global level, the costs of dental treatment are currently running at around 172 billion US dollars (128 billion euros). In Germany alone, these amount to 17.2 billion euros (23 billion US dollars) a year. These are the results of a joint study conducted by the Martin Luther University Halle-Wittenberg (MLU) and the Biotechnology Research and Information Network AG (BRAIN AG) published in the International Journal of Dental Research. The work was carried out within the strategic alliance NatLifE 2020 and was co-financed by the German Federal Ministry of Education and Research (BMBF).

For their work the researchers evaluated representative data on the prevalence of caries, inflammation of the gums (periodontitis) and tooth loss, corresponding costs of treatment and the disease burden, as well as data on sugar consumption, in 168 countries for the year 2010. On the basis of this data they calculated the share of total costs attributable to excessive consumption of sugar. In addition to white household sugar, the researchers also focused their attention in the analysis on "hidden" sugar that is contained in many processed products, such as soft drinks, ketchup, ice cream and frozen foods, as well as breads, cakes and pastries.

"The data shows a clear correlation between the consumption of sugar and the incidence of caries, periodontitis and, as a result, tooth loss," said the lead author of the study, Dr Toni Meier from the Institute of Agricultural and Nutritional Sciences at the MLU. "For every additional 25 grams of sugar consumed per person and day - which amounts to roughly eight sugar-cubes or a glass of sweetened lemonade - the costs of dental treatment in high-income countries increase on average by 100 US dollars (75 euros) per person and year."

In Germany, the average daily sugar consumption lies between 90 and 110 grams per person. The costs of treatment amount to 281 US dollars (210 euros) per person and year. This puts Germany in the group of countries with the highest costs of treatment per person and year. Other countries "in the group" are Switzerland (402 US dollars, 300 euros), Denmark (238 US dollars, 178 euros) and the USA (185 US dollars, 138 euros). "If the target of 50 grams of sugar per person and day set by the World Health Organization could be reached, this would result in savings in the costs of treatment within Germany of 150 euros (201 US dollars) per person and year. Extrapolating this figure to the federal level shows annual potential savings of approximately 12 billion euros, or 16 billion US dollars," added Meier. Maintaining a low-sugar diet is becoming increasingly difficult, however, since almost all processed products in the supermarket contain large quantities of added sugars.
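The savings Meier quotes follow from straightforward arithmetic. A back-of-the-envelope check in Python, assuming the midpoint of the reported consumption range and a German population of roughly 80 million (the population figure is an assumption introduced here, not taken from the study):

```python
# Inputs as reported in the article, plus an assumed population figure.
consumption_g = 100      # midpoint of the 90-110 g/day range for Germany
who_target_g = 50        # WHO recommended maximum, grams per person and day
cost_per_25g_eur = 75    # ~100 US dollars (75 euros) per extra 25 g/day

# Per-person savings if consumption dropped to the WHO target.
savings_per_person = (consumption_g - who_target_g) / 25 * cost_per_25g_eur
print(savings_per_person)  # → 150.0 euros per person and year

# Scaled to a population of ~80 million (assumption).
total_savings = savings_per_person * 80_000_000
print(f"{total_savings / 1e9:.0f} billion euros")  # → 12 billion euros
```

Both figures match the ones quoted in the article, which suggests the 12-billion-euro estimate is a simple per-person extrapolation rather than a separate model.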

The highest levels of sugar-related dental illness were observed by the researchers in Guatemala, Mauritania and Mexico. "Newly industrialised countries such as India, Brazil and Mexico, but also Pakistan and Egypt, could avoid an excessive burden of illness and of health care costs by anchoring the topic in their health and nutritional policies at an early stage," said the co-author of the study and nutrition scientist, Professor Gabriele Stangl of the MLU. This objective could be achieved by way of educational campaigns or by special taxation on high-calorie food. Such a sugar tax was introduced in Mexico in 2014 and already after one year was proving to be effective: the consumption of sugar-sweetened beverages had decreased by five percent. In the second year this decrease even doubled to ten percent.

"To be able to reduce the burden of nutrition-related illnesses, a balanced mix of educational work and food-policy initiatives, along with innovative technological solutions, is needed," said the co-author of the study, Dr Katja Riedel, joint coordinator of the NatLifE 2020 innovation alliance and program manager of system-products nutrition at BRAIN AG. The alliance, co-financed by the German Federal Ministry of Education and Research, aims, with the help of biotechnology and the understanding of biological systems, to develop a new generation of sustainably produced and biologically active substances for foods and cosmetics, and thereby to make a contribution towards improving human nutrition, health and well-being.


Friday, August 11, 2017

Higher income individuals more physically active, yet more sedentary


New research led by American Cancer Society researchers in collaboration with the University of Texas Health Science Center at Houston and Georgia State University used activity monitors to find that higher income individuals are more likely to be "weekend warriors," getting most of their activity on only a few days a week, and also spend more time in sedentary pursuits. The study appears in Preventive Medicine.

Previous research has shown that higher income individuals are more likely to be physically active at a higher intensity. However, this research has historically relied on self-reporting, which may exaggerate actual activity levels. Information collected via activity monitors has shown that fewer than 5% of U.S. adults meet physical activity guidelines. This is despite ample evidence supporting the link between physical activity and reduced risk for premature death and many diseases, including some cancers.

At the same time, evidence has accumulated showing the harmful effects of prolonged sedentary behavior on health; health effects that remain even when physical activity levels are considered. Income has been shown to be a prominent barrier to engaging in physical activity. Individuals with low incomes face time constraints as well as other barriers, including lack of exercise facilities, parks and open space, as well as an inflexible work environment, and have been shown to be less likely to meet physical activity guidelines. Meanwhile, higher income individuals, who often also have limited time, have more resources and places to exercise, which could facilitate their ability to meet activity guidelines. However, they also are more likely to hold sedentary jobs, like office work.

For the new study, investigators Kerem Shuval and Qing Li (American Cancer Society), Kelley Pettee Gabriel (University of Texas Health Science Center at Houston) and Rusty Tchernis (Georgia State University), used accelerometer data to analyze physical activity and sedentary behavior in relation to income levels among 5,206 U.S. adults enrolled in The National Health and Examination Survey from 2003-06, a nationally representative survey.

The study found that compared to those making less than $20,000 per year, those with an annual income of $75,000 or more engaged in 4.6 more daily minutes of moderate to vigorous intensity physical activity as measured by activity monitors. High income individuals also engaged in 9.3 fewer minutes of light intensity activity, spent 11.8 more minutes daily sedentary, were 1.6 times more likely to meet guidelines for a brief 2-day period ('weekend warrior'), and were 1.9 times more likely to meet guidelines during a 7-day period.

"Our findings pertaining to income and the 'weekend warrior' effect underscore the importance of tailoring the physical activity message to reflect the constraints of both low and high income individuals," said Dr. Shuval. "To meet guidelines one can engage in 150 minutes of weekly moderate intensity activity over a 2 or 3-day period rather than 7 days, for example. This can be achieved over a long weekend, a message we may want to convey to those pressed for time. It is important to remember, however, that we should increase the duration and intensity of activity gradually to avoid injury. Also, if inactive consult with a physician before embarking on an exercise program."

Almonds may help boost cholesterol clean-up crew


Eating almonds on a regular basis may help boost levels of HDL cholesterol while simultaneously improving the way it removes cholesterol from the body, according to researchers.

In a study, researchers compared the levels and function of high-density lipoprotein (HDL cholesterol) in people who ate almonds every day, to the HDL levels and function of the same group of people when they ate a muffin instead. The researchers found that while participants were on the almond diet, their HDL levels and functionality improved.

Penny Kris-Etherton, distinguished professor of nutrition at Penn State, said the study, published in the Journal of Nutrition, builds on previous research on the effects of almonds on cholesterol-lowering diets.

"There's a lot of research out there that shows a diet that includes almonds lowers low-density lipoprotein, or LDL cholesterol, which is a major risk factor for heart disease," Kris-Etherton said. "But not as much was known about how almonds affect HDL cholesterol, which is considered good cholesterol and helps lower your risk of heart disease."

The researchers wanted to see if almonds could not just increase the levels but also improve the function of HDL cholesterol, which works by gathering cholesterol from tissues, like the arteries, and helping to transport it out of the body.

"HDL is very small when it gets released into circulation," Kris-Etherton said. "It's like a garbage bag that slowly gets bigger and more spherical as it gathers cholesterol from cells and tissues before depositing it in the liver to be broken down."

Depending on how much cholesterol it has collected, HDL cholesterol is categorized into five "subpopulations," which range from the very small preβ-1 to the larger, more mature α-1. The researchers hoped that eating almonds would result in more α-1 particles, which would signal improved HDL function.

In the controlled-feeding study, 48 men and women with elevated LDL cholesterol participated in two six-week diet periods. In both, their diets were identical except for the daily snack. On the almond diet, participants received 43 grams -- about a handful -- of almonds a day. During the control period, they received a banana muffin instead.

At the end of each diet period, the researchers measured the levels and function of each participant's HDL cholesterol. The researchers then compared the results to the participants' baseline measurements taken at the beginning of the study.

The researchers found that compared to the control diet, the almond diet increased α-1 HDL -- when the particles are at their largest size and most mature stage -- by 19 percent. Additionally, the almond diet improved HDL function by 6.4 percent in participants of normal weight.

"We were able to show that there were more larger particles in response to consuming the almonds compared to not consuming almonds," Kris-Etherton said. "That would translate to the smaller particles doing what they're supposed to be doing. They're going to tissues and pulling out cholesterol, getting bigger, and taking that cholesterol to the liver for removal from the body."

An increase in this particular HDL subpopulation is meaningful, Kris-Etherton explained, because the particles have been shown to decrease overall risk of cardiovascular disease.

Kris-Etherton said that while almonds will not eliminate the risk of heart disease, they may be a smart choice for a healthy snack. She added that in addition to their heart-healthy benefits, almonds also provide a dose of good fats, vitamin E and fiber.

"If people incorporate almonds into their diet, they should expect multiple benefits, including ones that can improve heart health," Kris-Etherton said. "They're not a cure-all, but when eaten in moderation -- and especially when eaten instead of a food of lower nutritional value -- they're a great addition to an already healthy diet."

Thursday, August 10, 2017

How dietary fiber helps the intestines maintain health


University of California - Davis Health researchers have discovered how by-products of gut microbes' digestion of dietary fiber serve as the right fuel for intestinal cells to maintain gut health.

The research, published August 11 in the journal Science, is important because it identifies a potential therapeutic target for rebalancing gut microbiota and adds to a growing body of knowledge on the complex interplay between gut microbiota and dietary fiber.

An accompanying Insights / Perspectives article in the same issue of the journal describes gut microbes as "partners" in the body's defense against potential infectious agents, such as Salmonella.

"Our research suggests that one of the best approaches to maintaining gut health might be to feed the beneficial microbes in our intestines dietary fiber, their preferred source of sustenance," said Andreas Bäumler, professor of medical microbiology and immunology at UC Davis Health and senior author of the study.

"While it is known that the gut is the site of constant turf wars between microbes, our research suggests that signals generated by beneficial microbes drive the intestinal tract to limit resources that could lead to an expansion of potentially harmful microbes," he said.

Resident gut microbes metabolize indigestible dietary fiber to produce short-chain fatty acids, which signal cells lining the large bowel to maximize oxygen consumption, thereby limiting the amount of oxygen diffusing into the gut lumen (the open space within the intestine that comes into direct contact with digested food).

"Interestingly, the beneficial gut bacteria that are able to break down fiber don't survive in an environment rich in oxygen, which means that our microbiota and intestinal cells work together to promote a virtuous cycle that maintains gut health," said Mariana X. Byndloss, assistant project scientist and first author on the study.

The new research identified the host receptor peroxisome proliferator-activated receptor gamma (PPARγ) as the regulator responsible for maintaining this cycle of protection.

"When this host signaling pathway malfunctions, it leads to increased oxygen levels in the gut lumen," Bäumler said. "These higher oxygen levels make us more susceptible to aerobic enteric pathogens such as Salmonella or Escherichia coli, which use oxygen to edge out competing beneficial microbes."

Serious side effects associated with fluoroquinolone antibacterial drugs


The U.S. Food and Drug Administration is advising that the serious side effects associated with fluoroquinolone antibacterial drugs generally outweigh the benefits for patients with acute sinusitis, acute bronchitis, and uncomplicated urinary tract infections who have other treatment options. For patients with these conditions, fluoroquinolones should be reserved for those who do not have alternative treatment options.

An FDA safety review has shown that fluoroquinolones, when used systemically (i.e., tablets, capsules, and injectables), are associated with disabling and potentially permanent serious side effects. These side effects can involve the tendons, muscles, joints, nerves, and central nervous system.

Patients should contact their health care professional immediately if they experience any serious side effects while taking a fluoroquinolone medicine. Some signs and symptoms of serious side effects include tendon, joint and muscle pain, a “pins and needles” tingling or pricking sensation, confusion, and hallucinations. Patients should talk with their health care professional if they have any questions or concerns.

Fluoroquinolone Antimicrobial Drugs [ciprofloxacin (marketed as Cipro and generic ciprofloxacin), ciprofloxacin extended-release (marketed as Cipro XR and Proquin XR), gemifloxacin (marketed as Factive), levofloxacin (marketed as Levaquin), moxifloxacin (marketed as Avelox), norfloxacin (marketed as Noroxin), and ofloxacin (marketed as Floxin)]

Wednesday, August 9, 2017

Increases in alcohol use, especially among women, other groups


Alcohol use, high-risk drinking and alcohol use disorders increased in the U.S. population and across almost all sociodemographic groups, especially women, older adults, racial/ethnic minorities and individuals with lower educational levels and family income, according to a new study published by JAMA Psychiatry.

Regular and detailed monitoring of trends in drinking and alcohol use disorders is important for the health of the nation. Monitoring alcohol consumption patterns and alcohol use disorders over time also is important for the planning and targeting of prevention and intervention programs.

Bridget F. Grant, Ph.D., of the National Institute on Alcohol Abuse and Alcoholism, Rockville, Md., and coauthors present data for 2001-2002 and 2012-2013 on changes in the prevalences (the proportion of people affected) for alcohol use, high-risk drinking and DSM-IV alcohol use disorder (AUD).

High-risk drinking was four or more standard drinks on any day for women, five or more standard drinks on any day for men and, in this study, exceeding those daily drinking limits at least weekly during the past 12 months. An individual was considered to have a DSM-IV diagnosis of AUD if the person met the criteria for alcohol dependence or abuse in the past 12 months, according to the study background.
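The definition above amounts to a simple threshold rule. A minimal sketch in Python (the variable names and the reading of "at least weekly" as 52 or more exceedance days are my assumptions; the survey's exact coding may differ):

```python
def is_high_risk_drinker(sex: str, daily_drinks: list[int]) -> bool:
    """Sketch of the study's high-risk drinking definition.

    daily_drinks: standard drinks consumed on each day of the past year.
    """
    limit = 4 if sex == "female" else 5   # study's daily thresholds
    exceed_days = sum(1 for d in daily_drinks if d >= limit)
    return exceed_days >= 52              # "at least weekly" over 12 months

print(is_high_risk_drinker("female", [5] * 60 + [0] * 305))  # True
print(is_high_risk_drinker("male",   [4] * 60 + [0] * 305))  # False: below the male limit
```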

The authors report:
  • Alcohol use in the United States increased from 65.4 percent in 2001-2002 to 72.7 percent in 2012-2013, a relative increase of 11.2 percent.
  • High-risk drinking increased between 2001-2002 and 2012-2013 from 9.7 percent to 12.6 percent, representing 20.2 million and 29.6 million Americans, respectively, for a change of 29.9 percent.
  • DSM-IV diagnosis of AUD increased from 8.5 to 12.7 percent in the total population, representing 17.6 million and 29.9 million Americans, respectively, a change of 49.4 percent.
  • With few exceptions, increases in all the outcomes were the greatest among women, older adults, racial/ethnic minorities and those with lower educational levels and family income.
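The percentage changes reported above are relative to the baseline prevalence, not differences in percentage points, as a quick check confirms:

```python
def relative_change(before: float, after: float) -> float:
    """Percent change relative to the baseline prevalence."""
    return (after - before) / before * 100

print(round(relative_change(65.4, 72.7), 1))  # 11.2 -- any alcohol use
print(round(relative_change(9.7, 12.6), 1))   # 29.9 -- high-risk drinking
print(round(relative_change(8.5, 12.7), 1))   # 49.4 -- DSM-IV AUD
```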
The study notes some limitations, including that the surveys lacked biological testing for substance use.

"These increases constitute a public health crisis that may have been overshadowed by increases in much less prevalent substance use (marijuana, opiates and heroin) during the same period. ... Most important, the findings herein highlight the urgency of educating the public, policymakers and health care professionals about high-risk drinking and AUD, destigmatizing these conditions and encouraging those who cannot reduce their alcohol consumption on their own, despite substantial harm to themselves and others, to seek treatment," the article concludes.

Intraindividual reaction time variability independently predicts mortality


Inconsistent performance in responding to a stimulus, rather than the speed with which one responds, is a marker of accelerated ageing and predicts mortality in older people, according to research published by the Centre for Healthy Brain Ageing (CHeBA), UNSW Sydney.


The researchers measured the variability of response on computerised reaction time tests (intraindividual variability of reaction time) in older adults and found that it predicted survival time after accounting for any signs of decline in cognitive functioning that may herald dementia. The findings were published today in the journal PLOS ONE.

Lead author and head of CHeBA's Neuropsychology group, Dr Nicole Kochan, said the study was the first to comprehensively account for effects of overall cognitive level and dementia on the relationship between intraindividual variability of reaction time and mortality. "Our findings suggest that greater intraindividual reaction time variability is a behavioural marker that uniquely predicts shorter time to death," said Dr Nicole Kochan.

"Importantly, the predictive strength of intraindividual reaction time variability was virtually unchanged when we removed participants who developed dementia over the subsequent eight years. This suggests that variability of reaction time is an independent risk factor and not simply a corollary of general cognitive decline or neuropathological disturbances associated with dementia."

The study examined 861 community-dwelling participants aged 70-90 years from CHeBA's Sydney Memory and Ageing Study (MAS) over eight years. Participants completed two computerised reaction time tests at baseline and as part of comprehensive medical and neuropsychological assessments every two years.

Participants are presented with coloured squares as stimuli on a computer screen. In the simple task, they are required to touch each square as quickly as possible; in the more complex task, they must choose which of two squares to touch according to a pre-specified rule.

Greater intraindividual reaction time variability, but not average speed of response, significantly predicted survival time after adjusting for known mortality risk factors, including age, sex, global cognition score, cardiovascular risk and apolipoprotein ε4 status. The findings add to CHeBA's previous research showing that measures of reaction time variability are sensitive to other age-related neuropathological states, including preclinical dementia and falls.
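Intraindividual variability is typically quantified as the within-person spread of trial-level reaction times, often their standard deviation or coefficient of variation. A minimal sketch with illustrative values (the study's exact operationalisation, which also adjusts for covariates, may differ):

```python
from statistics import mean, stdev

def reaction_time_stats(trial_rts_ms: list[float]) -> dict:
    """Per-person summary of a block of reaction-time trials (milliseconds)."""
    m = mean(trial_rts_ms)
    sd = stdev(trial_rts_ms)  # raw intraindividual variability
    return {"mean_rt": m, "iiv_sd": sd, "cov": sd / m}  # cov: speed-adjusted

# Two people with the same average speed but very different consistency:
steady  = [500, 510, 495, 505, 500, 490]
erratic = [400, 650, 380, 700, 360, 510]
print(reaction_time_stats(steady)["iiv_sd"] < reaction_time_stats(erratic)["iiv_sd"])  # True
```

The point of the toy data is that both people respond in 500 ms on average, yet only the second shows the erratic pattern the variability measure is designed to capture.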

Dr Kochan explained that erratic responding may tap into the efficiency of brain processing. With age, the efficiency of brain processing decreases and some neurochemicals decline, leading to the erratic responding that variability measures may be capturing. Potentially, the variability in response time becomes more exaggerated not only with advancing age but also as a person approaches death.

CHeBA Co-Director and co-author, Professor Perminder Sachdev, said the findings are an important contribution to a small but growing field investigating reaction time variability as a behavioural marker of neurobiological integrity.

"Further research exploring the mechanisms involved is needed, including possible links between intraindividual reaction time variability, cognitive decline and structural and functional brain changes," said Professor Sachdev.

Caution re recent US advice on aggressively lowering blood pressure

 

Medical researchers at Trinity College Dublin, Ireland, are advising caution when treating blood pressure in some older people -- after results from a study contrasted with recent advice from the US to attempt to aggressively lower blood pressure in all adults to targets of 120mmHg.

Researchers from the Irish Longitudinal Study on Ageing (TILDA) at Trinity College Dublin, in collaboration with Beaumont Hospital, Dublin, have recently published the findings in the prestigious Journal of the American Medical Association (JAMA Internal Medicine). The full article can be read from: https://www.dropbox.com/sh/1id9axnv31fua3h/AAARsAoe_9j6ST57tMAmOr_1a?dl=0

A large randomised blood pressure trial led by US investigators -- the Systolic Blood Pressure Intervention Trial (SPRINT) -- demonstrated that lowering systolic blood pressure to levels of 120mmHg or less compared with 140mmHg or less in adults (over 50 years with cardiovascular risk) significantly reduced death (from all causes and from heart failure and heart attacks). The study also reported that common side effects of low blood pressure such as falls, injuries, blackouts, and drops in blood pressure after standing were not increased by aggressive treatment -- even in people over 75 years old.

Because the latter findings were clinically counter intuitive, the TILDA team tested whether they held true outside of a trial setting. Focusing on people in Ireland over 75 years, they examined rates of falls, injuries, blackouts and excessive drops in standing blood pressure in those who met the criteria for the treatment proposed in SPRINT and were followed up with for 3½ years -- the same time period as SPRINT.

The researchers reported starkly contrasting results -- falls and blackouts were up to five times higher than reported in SPRINT, and drops in blood pressure on standing were almost double those reported in SPRINT. Therefore, in people over 75 years, intensive lowering of blood pressure to 120mmHg could result in harm, and the TILDA researchers recommend that a better understanding of who, over 75 years, will or will not benefit is needed before widespread adoption of the SPRINT results.

The TILDA team is now assessing how best to determine which people may benefit from SPRINT, and which people are more at risk from aggressive blood pressure lowering.

First author of the journal article, Research Fellow at TILDA, Dr Donal Sexton, said: "SPRINT was a landmark study of hypertension treatment. While the benefits of lowering blood pressure seen in this study are not in dispute, we are highlighting to physicians that we need to be cognisant of the fact that the trial was not powered for adverse events such as falls causing injury. Physicians ought not to expect a similarly low rate of adverse events in clinical practice as was observed in the trial when lowering blood pressure in older people. Overall what we are saying is that the risks and benefits of lowering blood pressure should be individualised for each patient."

Professor Rose Anne Kenny, founding Principal Investigator with TILDA and lead author of the journal article commented: "Our work and that of other groups has shown that low blood pressure and particularly drops in standing blood pressure are linked not only to falls, fractures and fall- and blackout-related injuries, but also to depression and possibly other brain health disorders."

"These outcomes can seriously impact on independence and quality of life and we advise caution in applying the SPRINT recommendations to everyone over 75 years without detailed assessment of an individual's risk versus possible benefit until such a time as we can provide more clarity re treatment."


Preparing for longevity -- we don't need to become frail as we age


Age-related frailty may be a treatable and preventable health problem, just like obesity, diabetes, and cardiovascular disease, highlights a review in Frontiers in Physiology.

"Societies are not aware of frailty as an avoidable health problem and most people usually resign themselves to this condition," says Jerzy Sacha, Head of the Catheterization Laboratory at the University Hospital in Opole, Poland. "Fortunately, by proper lifestyle and adequate physical, mental, and social activities, one may prevent or delay the frailty state."

In their recent article, Sacha and his colleagues at the University of Opole and the Opole University of Technology reviewed over one hundred publications on recognizing, treating, and preventing frailty, with the aim of raising awareness of this growing health problem.

Frailty encompasses a range of symptoms that many people assume are just an inescapable part of aging. These include fatigue, muscle weakness, slower movements, and unintentional weight loss. Frailty also manifests as psychological and cognitive symptoms such as isolation, depression, and trouble thinking as quickly and clearly as patients could in their younger years.

These symptoms decrease patients' self-sufficiency and frail patients are more likely to suffer falls, disability, infections, and hospitalization, all of which can contribute to an earlier death. But, as Sacha's review highlights, early detection and treatment of frailty, and pre-frailty, may help many of the elderly live healthier lives.

Sacha's review shows ample evidence that the prevalence and impact of frailty can be reduced, at least in part, with a few straightforward measures. Unsurprisingly, age-appropriate exercise has been shown to be one of the most effective interventions for helping the elderly stay fit. Careful monitoring of body weight and diet are also key to ensuring that older patients are not suffering from malnutrition, which often contributes to frailty.

Socialization is another critical aspect of avoiding the cognitive and psychological symptoms of frailty. Loneliness and loss of purpose can leave the elderly unmotivated and disengaged, and current social programs could improve by more thoroughly addressing intellectual and social needs, as well as physical ones.

It's not clear yet just how much such interventions can benefit the aging world population, but Sacha's review suggests that raising public awareness is a critical first step. Improved recognition of frailty as a preventable condition by both physicians and patients could contribute significantly to avoiding or delaying frailty.

"Social campaigns should inform societies about age-related frailty and suggest proper lifestyles to avoid or delay these conditions," says Sacha. "People should realize that they may change their unfavorable trajectories to senility, and this change in mentality is critical to preparing communities for greater longevity."


Tuesday, August 8, 2017

Calcium in arteries influences heart attack risk


Patients without calcium buildup in the coronary arteries had significantly lower risk of future heart attack or stroke despite other high risk factors such as diabetes, high blood pressure, or bad cholesterol levels, new research from UT Southwestern cardiologists shows.

These individuals had less than a 3 percent chance of a cardiovascular event over the next decade - even though many had well-known risk factors - well below the 7.5 percent level set by the American College of Cardiology and American Heart Association as a guideline to begin statin treatment.

"The event rates when coronary calcium is absent are low," said preventive cardiologist Dr. Parag Joshi, Assistant Professor of Internal Medicine at UT Southwestern. "Our findings suggest that individuals with no calcium buildup in their blood vessels may not have to take statins despite the presence of other risk factors that cause coronary disease."

There may still be other reasons statins are a good therapy, so Dr. Joshi said the new findings suggest that adding a CT scan for calcium may be worthwhile as doctors and patients discuss treatment options.

"A CT scan is a test that is easily done, costs about 100 bucks in most major cities, and can give a lot more information about the patient's 10-year risk," said Dr. Joshi, a Fellow of the American College of Cardiology.

Calcium accumulates in the arteries of the heart after plaque builds up and calcifies over time.

The UT Southwestern researchers looked at CT scans of the chest and heart of 6,184 people aged 45 to 84, who had never had a heart attack or stroke, and were participants in a large, multi-site, multi-year study known as MESA (Multi-Ethnic Study of Atherosclerosis).

About half of the participants showed no calcium deposits in their heart arteries, meaning they had a coronary artery calcium (CAC) score of zero.

However, a zero CAC score doesn't mean that no plaque is building up inside the heart's arteries, or that the patient has zero risk - rather it means that the patient's risk of a heart attack is lower than the threshold where doctors typically recommend treatment with a statin, said Dr. Joshi. A 5 percent risk, based on a calculation used by doctors that factors in age, sex, ethnicity, smoking, diabetes, high blood pressure, and cholesterol levels, is considered the low end for recommending statin use.
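As a rough illustration of how a CAC score might slot into the thresholds described here, the logic can be sketched as follows (a sketch only, not clinical guidance; the function name, return strings, and decision structure are invented for illustration):

```python
def statin_discussion(ten_year_risk_pct: float, cac_score: int) -> str:
    """Illustrative sketch combining the risk thresholds and the
    zero-CAC finding described in the article. Not clinical guidance."""
    if ten_year_risk_pct < 5.0:
        # Below the low end for recommending statin use
        return "statin generally not recommended"
    if cac_score == 0:
        # Zero CAC: observed event rate under ~3% per decade in MESA
        return "consider deferring statin; discuss with patient"
    if ten_year_risk_pct >= 7.5:
        # At or above the ACC/AHA guideline threshold
        return "statin treatment generally recommended"
    return "intermediate risk: CAC score informs shared decision"

print(statin_discussion(10.0, 0))    # consider deferring statin; discuss with patient
print(statin_discussion(10.0, 100))  # statin treatment generally recommended
```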

In addition, there may be an argument for starting statin treatment before there is evidence of calcium buildup if there are concerns the patient will develop a problem later because of, say, family history, said Dr. Joshi, who is board certified in the use of coronary computed tomography (CT) angiography, a state-of-the-art technology that provides a very detailed look at the heart and related arteries. It also is important to note that statins carry little risk and are inexpensive, he said.

"A CAC score can really add to the clinician-patient discussion over whether or not to start a statin for primary prevention of heart attacks and strokes," Dr. Joshi said.

The new findings appear online in the Journal of the American College of Cardiology: Cardiovascular Imaging.