Thursday, September 19, 2019

Research suggests the happiest introverts may be extraverts


Advice for introverts: fake it, and you'll be happier
University of California - Riverside
If you are an introvert, force yourself to be an extravert. You'll be happier.
That's the suggestion of the first-ever study asking people to act like extraverts for a prolonged period. For one week, the 123 participants were asked to act as extraverts, in some cases pushing the boundaries of their willingness to engage. For another week, the same group was asked to act like introverts.
The benefits of extraversion have been reported before, including those of "forced extraversion," but usually only for brief intervals. In one study, train-riders were asked to talk to strangers; a control group was directed to remain silent. The talkers reported a more positive experience.
UC Riverside researcher Sonja Lyubomirsky wanted to extend the faux extraversion to see if it would result in better well-being.
"The findings suggest that changing one's social behavior is a realizable goal for many people, and that behaving in an extraverted way improves well-being," said Lyubomirsky, a UCR psychologist and co-author of the study, published in the Journal of Experimental Psychology: General. Psychologists favor "extravert" to the more commonly used "extrovert," due to its historic use in academia, and the Latin origins of "extra," meaning "outside."
An initial challenge for this study was the presumption that extraversion--as a trait rewarded in U.S. culture--is best. Many of the adjectives associated with extraversion are more flattering than those tied to introversion. Most people would rather be associated with words like "dynamic" than with words like "withdrawn."
So Lyubomirsky's team went for words agreed upon as most neutral. The adjectives for extraversion were "talkative," "assertive," and "spontaneous"; for introversion, "deliberate," "quiet," and "reserved."
Researchers next told participants -- both the Act Introvert group and the Act Extravert group -- that previous research had found each set of behaviors to be beneficial for college students.
Finally, the participants were told to go forth and be as talkative, assertive, and spontaneous as they could stand. For the other week, the same participants were told to be deliberate, quiet, and reserved, or vice versa, depending on which condition they had started with. Three times a week, participants were reminded of the behavioral change via email.
On all measures of well-being, participants reported gains after the extraversion week and declines after the introversion week. Interestingly, the faux extraverts reported no discomfort or ill effects.
"It showed that a manipulation to increase extraverted behavior substantially improved well-being," Lyubomirsky said. "Manipulating personality-relevant behavior over as long as a week may be easier than previously thought, and the effects can be surprisingly powerful."
The researchers suggest that future experiments addressing this question may switch up some variables. The participants were college students, a group generally more open to changing habits. Also, Lyubomirsky said, effects of "faking" extraversion could surface after a longer study period.

Exercise could slow withering effects of Alzheimer's


Imaging shows less brain deterioration in physically active people at high risk for dementia
UT Southwestern Medical Center
DALLAS - Sept. 17, 2019 - Exercising several times a week may delay brain deterioration in people at high risk for Alzheimer's disease, according to a study that scientists say merits further research to establish whether fitness can affect the progression of dementia.
Research from UT Southwestern found that people who had accumulation of amyloid beta in the brain - a hallmark of Alzheimer's disease - experienced slower degeneration in a region of the brain crucial for memory if they exercised regularly for one year.
Although exercise did not prevent the eventual spread of toxic amyloid plaques blamed for killing neurons in the brains of dementia patients, the findings suggest an intriguing possibility that aerobic workouts can at least slow down the effects of the disease if intervention occurs in the early stages.
"What are you supposed to do if you have amyloid clumping together in the brain? Right now doctors can't prescribe anything," said Dr. Rong Zhang, who led the clinical trial that included 70 participants ages 55 and older. "If these findings can be replicated in a larger trial, then maybe one day doctors will be telling high-risk patients to start an exercise plan. In fact, there's no harm in doing so now."
Reduced brain atrophy
The study published in the Journal of Alzheimer's Disease compared cognitive function and brain volume between two groups of sedentary older adults with memory issues: One group did aerobic exercise (at least a half-hour workout four to five times weekly), and another group did only flexibility training.
Both groups maintained similar cognitive abilities during the trial in areas such as memory and problem solving. But brain imaging showed that people from the exercise group who had amyloid buildup experienced slightly less volume reduction in their hippocampus - a memory-related brain region that progressively deteriorates as dementia takes hold.
"It's interesting that the brains of participants with amyloid responded more to the aerobic exercise than the others," said Dr. Zhang, who conducted the trial at the Institute for Exercise and Environmental Medicine. "Although the interventions didn't stop the hippocampus from getting smaller, even slowing down the rate of atrophy through exercise could be an exciting revelation."
However, Dr. Zhang notes that more research is needed to determine how or if the reduced atrophy rate benefits cognition.
Elusive answers
The search for dementia therapies is becoming increasingly pressing: More than 5 million Americans have Alzheimer's disease, and the number is expected to triple by 2050.
Recent research has helped scientists gain a greater understanding of the molecular genesis of the disease, including a UT Southwestern discovery published last year that is guiding efforts to detect the condition before symptoms arise. Yet the billions of dollars spent on trying to prevent or slow dementia have yielded no proven treatments that would make an early diagnosis actionable for patients.
Fitness and brain health
Dr. Zhang is among a group of scientists across the world trying to determine if exercise may be the first such therapy.
His latest research builds upon numerous studies suggesting links between fitness and brain health. For example, a 2018 study showed that people with lower fitness levels experienced faster deterioration of vital nerve fibers in the brain called white matter. Research in mice has similarly shown exercise correlated with slower deterioration of the hippocampus - findings that prompted Dr. Zhang to investigate whether the same effects could be found in people.
"I'm excited about the results, but only to a certain degree," Dr. Zhang said. "This is a proof-of-concept study, and we can't yet draw definitive conclusions."
Expanded research
Dr. Zhang is leading a five-year national clinical trial that aims to dig deeper into potential correlations between exercise and dementia.
The trial, which includes six medical centers across the country, involves more than 600 older adults (ages 60-85) at high risk of developing Alzheimer's disease. The study will measure whether aerobic exercise and taking specific medications to reduce high blood pressure and cholesterol can help preserve brain volume and cognitive abilities.
"Understanding the molecular basis for Alzheimer's disease is important," Dr. Zhang said. "But the burning question in my field is, 'Can we translate our growing knowledge of molecular biology into an effective treatment?' We need to keep looking for answers."

The importance of staying physically active and the negative effects of even short-term inactivity


A new study presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 September) highlights the negative health effects of even short periods of physical inactivity and stresses the importance of staying physically active.
The research was conducted by Dr Kelly Bowden Davies of Newcastle University and the University of Liverpool, UK, and colleagues, and analysed the effects of a short-term reduction in physical activity on metabolic profiles, body composition and cardiovascular (endothelial) function.
Low levels of physical activity and sedentary lifestyles are known to confer a significantly increased risk of metabolic problems including obesity, insulin resistance, type 2 diabetes, and cardiovascular disease. The goal of the study was to determine whether adverse effects linked to these conditions would begin to appear in previously active individuals after a period of just 14 days of reduced physical activity.
The team recruited a study group of habitually active (>10,000 steps/day) individuals (18 female, 10 male) with a mean age of 32 and an average BMI of 24.3 kg/m2 (within the 'healthy' range). Assessments were performed at baseline, 14 days after adopting a more sedentary lifestyle, and 14 days after resuming their previous activity level. Participants' cardiorespiratory fitness (VO2 peak), body composition (dual-energy x-ray absorptiometry/magnetic resonance spectroscopy) and cardiovascular function (flow-mediated dilation; FMD) were determined at each time point, and their physical activity (SenseWear armband) was monitored throughout.
Study participants reduced their step count by an average of around 10,000 steps/day relative to their individual baseline activity, increasing their waking sedentary time by an average of 103 minutes/day.
Cardiovascular function as measured by FMD decreased by 1.8% following 14 days of relative inactivity, but returned to comparable baseline levels 14 days after the resumption of normal activity. The researchers also found that: "In parallel, total body fat, waist circumference, liver fat, insulin sensitivity and cardiorespiratory fitness were all adversely affected by 14 days step-reduction, but returned to comparable baseline levels following resumption of habitual activity."
The authors conclude: "In young non-obese adults, short-term physical inactivity and increased sedentary behaviour led to decreased cardiorespiratory fitness and increasing waist circumference, liver fat deposition and insulin resistance, and led to a significant decline in endothelial function, a sign of developing cardiovascular disease."
They add: "Public health messages need to emphasise the harmful effect of even short-term physical inactivity, and that habitual activity appears to offset these negative consequences. Even small alterations in physical activity in daily living can have an impact on health - positively, or negatively. People should be encouraged to increase their physical activity levels, in any way possible. Often, we hear of 'barriers to exercise', such as energy or lack of enjoyment, but simply increasing daily physical activity can have benefit, as shown here by only changes daily steps."

Wednesday, September 18, 2019

Walking slower and pausing for rest may enable older adults to maintain outdoor mobility


IMAGE: Pausing for rest can facilitate older adults' outdoor mobility. Credit: Petteri Kivimäki / University of Jyväskylä
When functional ability declines, changing the way of walking by, for instance, walking slower, pausing for rest or using walking aids, can facilitate older adults' outdoor mobility. These were the findings of a study conducted at the Faculty of Sport and Health Sciences, University of Jyväskylä.
As functional ability declines, older people may start to have difficulties in walking long distances. At this point, older people might change their way of walking consciously or unconsciously.
"Changes may be seen, for instance, in lowering walking speed, pausing walking for rest or even in avoiding long walking distances altogether. These early changes in walking are called walking modifications," doctoral student Heidi Skantz explains.
Previous research on walking modifications has implicitly treated them as an early sign of functional decline, and such modifications have been shown to predict walking difficulties in the future. That research, however, has emphasised mainly the negative side of walking modifications. The researchers argue that the potential positive, enabling effects of walking modifications should also be considered.
"We wanted to find out if some of these changes in walking would be beneficial in maintaining outdoor mobility," Skantz says.
Using walking aids, walking more slowly and pausing for rest were categorized as adaptive walking modifications, since they were considered to reduce the demands of the walking task, whereas walking less frequently and avoiding long walking distances were categorized as maladaptive modifications. This categorization was shown to be meaningful.
"Those older people who used maladaptive walking modifications had smaller life-space mobility and they perceived that they lacked possibilities for outdoor mobility," Skantz says. "As for those older people who had chosen to utilise adaptive walking modifications, they were able to maintain wider life-space mobility and they were also satisfied with their outdoor mobility opportunities."
As functional ability declines, walking long distances may become a harder and more daunting task than before. Even so, it remains important to continue covering long distances by walking - with walking aids or by pausing for rest if necessary - in order to maintain outdoor mobility.
"Encouraging older people to opt for adaptive walking modifications might be possible by designing age-friendly environments, for instance by providing opportunities to rest when walking outdoors. However this warrants further studies," says Skantz.
The study participants were older people between the ages of 75 and 90 living in the Jyväskylä and Muurame regions of central Finland. The study was conducted at the Gerontology Research Center and the Faculty of Sport and Health Sciences, University of Jyväskylä, and was supported by the European Research Council, the Academy of Finland, the Ministry of Education and Culture, and the University of Jyväskylä.

Tuesday, September 17, 2019

Short stature is associated with a higher risk of type 2 diabetes


Short stature is associated with a higher risk of type 2 diabetes, according to a new study in Diabetologia (the journal of the European Association for the Study of Diabetes). Tall stature is associated with a lower risk, with each 10cm difference in height associated with a 41% decreased risk of diabetes in men and a 33% decreased risk in women.
The increased risk in shorter individuals may be due to higher liver fat content and a less favourable profile of cardiometabolic risk factors, say the authors, who include Dr Clemens Wittenbecher and Professor Matthias Schulze of the German Institute of Human Nutrition Potsdam-Rehbruecke, Germany, and colleagues.
Short stature has been linked to higher risk of diabetes in several studies, suggesting that height could be used to predict the risk for the condition. It has been reported that insulin sensitivity and beta cell function are better in taller people. Short stature is related to higher cardiovascular risk, a risk that might in part be mediated by cardiometabolic risk factors relevant to type 2 diabetes -- for example blood pressure, blood fats and inflammation.
This new study used data from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam study, which included 27,548 participants -- 16,644 women aged between 35 and 65 years and 10,904 men aged between 40 and 65 years -- recruited from the general population of Potsdam, Germany, between 1994 and 1998.
A variety of physical data were collected from participants, including body weight, total body height and sitting height (with leg length calculated as the difference between the two), waist circumference and blood pressure. For this study, a sub-cohort of 2,500 participants (approximately 10%) was randomly selected to be representative of the full cohort. Those who already had diabetes or were lost to follow-up were excluded, leaving 2,307 for analysis, alongside 797 participants from the full cohort who went on to develop type 2 diabetes. Among these, an investigation of potential mediating factors was carried out for 2,662 participants (2,029 sub-cohort members and 698 diabetes cases).
The study found that the risk of future type 2 diabetes was lower by 41% for men and 33% for women for each 10cm larger height, when adjusted for age, potential lifestyle confounders, education and waist circumference.
The association of height with diabetes risk appeared to be stronger among normal-weight individuals, with an 86% lower risk per 10cm larger height in men, and 67% lower risk per 10cm larger height in women. In overweight/obese individuals, each 10cm larger height was associated with diabetes risk being 36% lower for men and 30% lower for women. The authors say: "This may indicate that a higher diabetes risk with larger waist circumference counteracts beneficial effects related to height, irrespective of whether larger waist circumference is due to growth or due to consuming too many calories."
Larger leg length was associated with a lower risk of diabetes. A slight sex difference was noted -- for men a larger sitting height at the cost of leg length related to increased risk, whilst amongst women both leg length and sitting height contributed to lower risk. The authors suggest that, among boys, growth before puberty, which relates more strongly to leg length, will have a more favourable impact on later diabetes risk than growth during puberty (assuming that truncal bones are the last to stop growing). For girls both growth periods seem to be important.
The authors also calculated to what extent the inverse associations of height and height components with type 2 diabetes risk are explainable by liver fat (measured as Fatty Liver index) and other cardiometabolic risk factors. When the results were adjusted for liver fat content, the men's reduced risk of diabetes per 10cm larger height was 34% (compared with 40% in the overall results), and the women's reduced risk was just 13% compared with 33% in the overall results.
Other biomarkers also affected the results: in men adjustment for glycated haemoglobin (a measure of blood sugar) and blood fats each reduced the risk difference by about 10%. In contrast, among women adjustment for adiponectin (a hormone involved in blood sugar control) (-30%) and C-reactive protein (a marker of inflammation) (-13%) reduced the associations of height with diabetes, in addition to the reductions observed by glycated haemoglobin and blood fats. Taken together, the authors say that a large proportion of the reduced risk attributable to increased height is related to taller people having lower liver fat and a 'healthier' cardiometabolic profile.
The authors say: "Our findings suggest that short people might present with higher cardiometabolic risk factor levels and have higher diabetes risk compared with tall people… These observations corroborate that height is a useful predictive marker for diabetes risk and suggest that monitoring of cardiometabolic risk factors may be more frequently indicated among shorter persons, independent of their body size and composition. Specifically, liver fat contributes to the higher risk among shorter individuals and, because height appears to be largely unmodifiable during adulthood, interventions to reduce liver fat may provide alternative approaches to reduce risk associated with shorter height."
However they add: "Our study also suggests that early interventions to reduce height-related metabolic risk throughout life likely need to focus on determinants of growth in sensitive periods during pregnancy, early childhood, puberty and early adulthood, and should take potential sex-differences into account."
They conclude: "We found an inverse association between height and risk of type 2 diabetes among men and women, which was largely related to leg length among men. Part of this inverse association may be driven by the associations of greater height with lower liver fat content and a more favourable profile of cardiometabolic risk factors, specifically blood fats, adiponectin and C-reactive protein."

Monday, September 16, 2019

Those who consume tea at least four times a week have better brain efficiency


A recent study led by researchers from the National University of Singapore (NUS) revealed that regular tea drinkers have better organised brain regions - and this is associated with healthy cognitive function - compared to non-tea drinkers. The research team made this discovery after examining neuroimaging data of 36 older adults.
"Our results offer the first evidence of positive contribution of tea drinking to brain structure, and suggest that drinking tea regularly has a protective effect against age-related decline in brain organisation," explained team leader Assistant Professor Feng Lei, who is from the Department of Psychological Medicine at the NUS Yong Loo Lin School of Medicine.
The research was carried out together with collaborators from the University of Essex and University of Cambridge, and the findings were published in scientific journal Aging on 14 June 2019.
Benefits of regular intake of tea
Past studies have demonstrated that tea intake is beneficial to human health, and the positive effects include mood improvement and cardiovascular disease prevention. In fact, results of a longitudinal study led by Asst Prof Feng, published in 2017, showed that daily consumption of tea can reduce the risk of cognitive decline in older persons by 50 per cent.
Following this discovery, Asst Prof Feng and his team further explored the direct effect of tea on brain networks.
The research team recruited 36 adults aged 60 and above, and gathered data about their health, lifestyle, and psychological well-being. The elderly participants also had to undergo neuropsychological tests and magnetic resonance imaging (MRI). The study was carried out from 2015 to 2018.
Upon analysing the participants' cognitive performance and imaging results, the research team found that individuals who consumed either green tea, oolong tea, or black tea at least four times a week for about 25 years had brain regions that were interconnected in a more efficient way.
"Take the analogy of road traffic as an example - consider brain regions as destinations, while the connections between brain regions are roads. When a road system is better organised, the movement of vehicles and passengers is more efficient and uses less resources. Similarly, when the connections between brain regions are more structured, information processing can be performed more efficiently," explained Asst Prof Feng.
He added, "We have shown in our previous studies that tea drinkers had better cognitive function as compared to non-tea drinkers. Our current results relating to brain network indirectly support our previous findings by showing that the positive effects of regular tea drinking are the result of improved brain organisation brought about by preventing disruption to interregional connections."
Next step in research
As cognitive performance and brain organisation are intricately related, more research is needed to better understand how functions like memory emerge from brain circuits, and which interventions might best preserve cognition during the ageing process. Asst Prof Feng and his team plan to examine the effects that tea, as well as the bioactive compounds in tea, can have on cognitive decline.

Physical activity may attenuate menopause-associated atherogenic changes

A new study on menopausal women shows that leisure-time physical activity is associated with a healthier blood lipid profile. However, results suggest that leisure-time physical activity does not seem to entirely offset the unfavorable lipid profile changes associated with the menopausal transition.
Women experience a rapid increase in cardiovascular disease (CVD) risk after the onset of menopause. This observation suggests the presence of factors in middle-aged women that accelerate the progression of CVD independent of chronological aging.
"It is well known that physical activity has health benefits, yet it is less clear to what extent physical activity can prevent the negative changes seen in blood lipid profiles during the menopausal transition," says Matthew Jergenson, MD, from the University of Minnesota Medical School, Minneapolis, Minnesota. "The present study examined menopausal women in the city of Jyväskylä, Finland, to explore the role of leisure-time physical activity on CVD risk factors."
ERMA study examines the effects of menopause
The present study is part of the Estrogenic Regulation of Muscle Apoptosis (ERMA) study, which examines the role of menopause on body composition, leisure-time physical activity and the risk of metabolic diseases.
"Based on our findings, leisure-time physical activity was associated with a healthier blood lipid profile," explains postdoctoral researcher Sira Karvinen from the Gerontology Research Center, Faculty of Sport and Health Sciences, University of Jyväskylä, Finland. "Yet advancing menopausal status predicted a less healthy lipid profile, suggesting that leisure-time physical activity does not entirely offset the unfavorable lipid profile changes associated with the menopausal transition."
More specifically, higher leisure-time physical activity was associated with lower total cholesterol, LDL, triglyceride and fasting blood glucose levels as well as higher HDL levels. Advancing menopausal status, in turn, was associated with higher total cholesterol, triglyceride and LDL levels.
"However, leisure-time physical activity may attenuate the unfavorable atherogenic changes in the serum CV risk factors of healthy middle-aged women," Jergenson and Karvinen state. "Hence one should not forget sport-related hobbies at middle age."
The ERMA study is a population-based cohort study (n = 886) of middle-aged Caucasian women between 47 and 55 years of age in the Jyväskylä area. In addition, 193 women composed a longitudinal study population that was followed over the menopausal transition. Physical activity was assessed both by self-reported questionnaires and by accelerometer monitoring. Serum lipid profiles (total cholesterol, LDL, HDL, triglycerides, fasting blood glucose) were analyzed to quantify cardiovascular risk factors.

Daily aspirin may benefit many patients without existing cardiovascular disease



The benefits of aspirin may outweigh the risks for many patients without known cardiovascular disease (CVD). Such patients could be identified by using a personalized benefit-harm analysis, which could inform discussions between doctors and patients. The findings are published in Annals of Internal Medicine.
Aspirin reduces the risk for CVD in at-risk patients, but also increases the risk for bleeding. It is not clear if the benefits of aspirin outweigh the risks for patients without known CVD.
Researchers from the University of Auckland, New Zealand studied 245,028 persons (43.6 percent women) aged 30 to 79 years without established CVD to identify persons for whom aspirin would probably result in a net benefit. The net effect of aspirin was calculated for each participant by subtracting the number of CVD events likely to be prevented from the number of major bleeds likely to be caused over 5 years. The data were derived from PREDICT, a well-characterized web-based decision support program integrated with electronic primary care practice management systems in New Zealand. The researchers found that 2.5 percent of women and 12.1 percent of men without established CVD were likely to derive net benefit from aspirin treatment for 5 years if a hospitalization or death due to an acute CVD event was considered equivalent to a hospitalization or death due to an acute major bleed. These percentages increased to 21 percent of women and 41 percent of men when one CVD event was assumed to be equivalent to two major bleeds.
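To make the benefit-harm bookkeeping concrete, here is a minimal Python sketch of a per-person calculation of the kind described above. The baseline risks, the relative effects of aspirin, and the event weighting are illustrative assumptions, not the values used in the PREDICT analysis.

```python
# Minimal sketch of a per-person 5-year benefit-harm calculation.
# All numbers below are illustrative assumptions, not the PREDICT study's values.

def aspirin_net_effect(cvd_risk_5yr: float,
                       bleed_risk_5yr: float,
                       rr_cvd: float = 0.90,        # assumed relative risk of CVD events on aspirin
                       rr_bleed: float = 1.40,      # assumed relative risk of major bleeds on aspirin
                       bleeds_per_cvd_event: float = 1.0) -> float:
    """Expected CVD events prevented minus (weighted) major bleeds caused,
    per person over 5 years. Positive values suggest net benefit."""
    events_prevented = cvd_risk_5yr * (1 - rr_cvd)       # absolute risk reduction
    bleeds_caused = bleed_risk_5yr * (rr_bleed - 1)      # absolute risk increase
    return events_prevented - bleeds_caused / bleeds_per_cvd_event

# Example person: 10% predicted 5-year CVD risk, 2% predicted 5-year major-bleed risk.
print(aspirin_net_effect(0.10, 0.02))                            # bleeds weighted 1:1 with CVD events
print(aspirin_net_effect(0.10, 0.02, bleeds_per_cvd_event=2.0))  # one CVD event treated as two bleeds
```

Applied across a cohort with individualised risk predictions, a calculation of this shape yields the proportions of people for whom the expected benefit exceeds the expected harm.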
The author of an editorial from the Cardiovascular Institute at Rutgers Robert Wood Johnson Medical School cautions that the study findings may not apply to populations outside New Zealand, and notes that participants over the age of 79 were not included in the analysis. The author points to the diversity of findings across several aspirin studies to conclude that making firm, evidence-based recommendations for aspirin use in primary prevention is difficult.

HPV causes oral, anal, and penile cancers


More than 70% of U.S. adults are unaware that human papillomavirus (HPV) causes anal, penile, and oral cancers, according to an analysis led by researchers at The University of Texas Health Science Center at Houston (UTHealth) School of Public Health and published in the current issue of JAMA Pediatrics.
Men are also less likely than women to know that the virus carries a risk of cancer, said Ashish A. Deshmukh, PhD, MPH, assistant professor at UTHealth School of Public Health, who led the study that included 2,564 men and 3,697 women who took part in the Health Information National Trends Survey. Two-thirds of men and one-third of women ages 18-26 did not know that HPV causes cervical cancer. More than 80% of men and 75% of women in the same age group - and 70% of all American adults of any age - did not know that HPV can cause oral, anal, and penile cancers.
Human papillomavirus is the most common sexually transmitted infection. There are many types of HPV, but some are more likely to cause cancers and genital warts. The HPV vaccine can protect against cancers caused by the virus.
The analysis by Deshmukh and colleagues also showed that, of people who were vaccine-eligible or had vaccine-eligible family members, only 19% of men and 31.5% of women received recommendations for the vaccine from a health care provider.
According to the U.S. Centers for Disease Control and Prevention (CDC), boys and girls ages 9-14 should receive the two-dose immunization. A three-dose schedule is recommended if the first dose was given on or after the 15th birthday. Recently, CDC also recommended that adults ages 27-45 may decide to get the HPV vaccine based on discussion with their clinician. A 2018 report by the CDC suggested only 51% of those in the recommended age groups were vaccinated.
"The lack of knowledge may have contributed to low HPV vaccination rates in the United States," said Deshmukh.
"Low levels of HPV knowledge in these older age groups is particularly concerning, given that these individuals are (or will likely be) parents responsible for making HPV vaccination decisions for their children," said Kalyani Sonawane, PhD, assistant professor at UTHealth School of Public Health, the study's co-lead author.
"HPV vaccination campaigns have focused heavily on cervical cancer prevention in women. Our findings demonstrate a need to educate both sexes regarding HPV and HPV vaccination," Deshmukh said. "Rates of cervical cancer have declined in the last 15 to 20 years because of screening. On the other hand, there was a greater than 200% increase in oropharyngeal cancer rates in men and a nearly 150% rise in anal cancer rates in women."
Improving HPV vaccination rates is important to reverse rising rates of these cancers, Deshmukh added.

Later puberty and later menopause associated with lower risk of type 2 diabetes in women, while use of contraceptive pill and longer time between periods associated with higher risk


New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) shows that use of the contraceptive pill and longer menstrual cycles are associated with a higher risk of developing type 2 diabetes (T2D), while later puberty and later menopause are associated with lower risk.
The study, by Dr Sopio Tatulashvili, Avicenne Hospital, Bobigny, France, and colleagues, suggests that in general longer exposure to sex hormones, but later in life, could reduce the risk of diabetes, and that women at high-risk of T2D taking the contraceptive pill may require personalised advice.
Early screening to detect poor blood sugar control (that may lead to T2D) could lower the risk of further complications. For this reason, it is important to identify the risk factors of T2D. The aim of this study was to determine the association between various hormonal factors and the risk of developing T2D in the large prospective female E3N cohort study.*
The study included 83,799 French women from the E3N prospective cohort followed between 1992 and 2014. Statistical models adjusted for the main T2D risk factors were used to estimate the strength and statistical significance of associations between various hormonal factors and T2D risk. The risk factors adjusted for included body mass index, smoking, age, physical activity, socioeconomic status, education level, family history of T2D, and blood pressure.
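The press release does not name the statistical method, so purely as an illustration, the sketch below fits a Cox proportional hazards model -- a common choice for adjusted time-to-event analyses of this kind -- to made-up data with hypothetical column names, using the lifelines library.

```python
# Illustrative sketch only: the E3N analysis method is not specified in this summary,
# and all data and column names below are invented for demonstration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "follow_up_years": rng.uniform(1, 22, n),      # time under observation
    "t2d": rng.integers(0, 2, n),                  # 1 = developed type 2 diabetes
    "age_at_menopause": rng.normal(50, 3, n),      # exposure of interest
    "bmi": rng.normal(25, 4, n),                   # adjustment covariates
    "smoker": rng.integers(0, 2, n),
    "family_history_t2d": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="t2d")
cph.print_summary()   # the exp(coef) column gives covariate-adjusted hazard ratios
```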
The authors observed that higher age at puberty (over 14 years versus under 12 years) was associated with a 12% lower T2D risk, and higher age at menopause (52 years and over compared with under 47 years) with a 30% lower risk. Breastfeeding (ever versus never) was also associated with a 10% reduced risk of developing T2D.
Furthermore, an increased total lifetime number of menstrual cycles (over 470 in a woman's lifetime versus under 390) was associated with a 25% reduced risk of developing T2D, and longer duration of exposure to sex hormones (meaning the time between puberty and menopause) (over 38 years compared with under 31 years) was associated with a 34% decreased risk of developing T2D.
By contrast, the use of contraceptive pills (at least once during a woman's lifetime compared with no use at all) was associated with a 33% increased risk of developing T2D, and longer time between periods (menstrual cycle length) (32 days and over versus 24 days and under) was associated with a 23% increased risk.
The authors say: "It seems that longer exposure to sex hormones but later in life could reduce the risk of later developing type 2 diabetes, independent of well-established risk factors. Risk induced by oral contraceptives could lead to personalised advice for young women at risk of developing T2D, such as those with a family history of diabetes, those who are overweight or obese, or those with polycystic ovary syndrome."

Alcohol consumption in people with type 2 diabetes may have some positive effects


A meta-analysis of studies presented at this year's Annual Meeting of the European Association for the Study of Diabetes in Barcelona, Spain (16-20 September) shows that recommendations to moderate alcohol consumption for people with type 2 diabetes (T2D) may need to be reviewed, since low-to-moderate consumption could have a positive effect on blood glucose and fat metabolism.

The study is by Yuling Chen, Southeast University, Nanjing, China, and Dr Li Ling, Director of the Department of Endocrinology, Zhongda Hospital and School of Medicine, Southeast University, Nanjing, China and colleagues.

However, regardless of the effects on metabolism shown by this analysis, advice from various diabetes organisations including Diabetes UK* remains that people with T1D or T2D need to be careful with alcohol consumption, since drinking makes a hypoglycaemic episode (known as a hypo) more likely because alcohol makes blood sugars drop. It can also cause weight gain and other health issues.

The authors searched the PubMed, Embase, and Cochrane databases up to March 2019 for randomised controlled trials (RCTs) that assessed the relationship between alcohol consumption and glucose and lipid metabolism among adults with T2D. Data extracted from the RCTs were pooled using meta-analytic modelling.

The authors found ten relevant RCTs involving 575 participants that were included in this review. Meta-analysis showed that alcohol consumption was associated with reduced triglyceride levels and insulin levels, but had no statistically significant effect on fasting blood glucose levels, glycated haemoglobin (HbA1c, a measure of blood glucose control), or total cholesterol, low density lipoprotein (bad) cholesterol, and high density lipoprotein (good) cholesterol.
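As an illustration of how such pooled estimates are typically produced, the sketch below runs a DerSimonian-Laird random-effects meta-analysis on invented per-trial mean differences in triglycerides; the numbers are placeholders, not data from the ten trials in this review.

```python
import numpy as np

# Hypothetical per-trial mean differences in triglycerides (alcohol vs control)
# and their standard errors -- illustrative numbers only, not the review's data.
effects = np.array([-0.20, -0.05, -0.30, -0.10, -0.15])
se = np.array([0.10, 0.08, 0.15, 0.12, 0.09])

w_fixed = 1 / se**2                                    # inverse-variance weights
mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# DerSimonian-Laird estimate of between-trial variance (tau^2)
q = np.sum(w_fixed * (effects - mu_fixed) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_rand = 1 / (se**2 + tau2)                            # random-effects weights
mu_rand = np.sum(w_rand * effects) / np.sum(w_rand)
se_rand = np.sqrt(1 / np.sum(w_rand))

print(f"Pooled mean difference: {mu_rand:.3f} "
      f"(95% CI {mu_rand - 1.96*se_rand:.3f} to {mu_rand + 1.96*se_rand:.3f})")
```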

Subgroup analysis indicated that drinking light to moderate amounts of alcohol decreased the levels of triglycerides (blood fats) and insulin in people with T2D. Light to moderate drinking was defined by the authors as 20g or less of alcohol per day. This translates to approximately 1.5 cans of beer (330ml, 5% alcohol), a large (200ml) glass of wine (12% alcohol) or a 50ml serving of a 40% alcohol spirit (for example vodka or gin).

The authors conclude: "Findings of this meta-analysis show a positive effect of alcohol on glucose and fat metabolism in people with type 2 diabetes. Larger studies are needed to further evaluate the effects of alcohol consumption on blood sugar management, especially in patients with type 2 diabetes."

Vegan diet can boost gut microbes related to body weight, body composition and blood sugar control


New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) suggests that a 16-week vegan diet can boost the gut microbes that are related to improvements in body weight, body composition and blood sugar control. The study is by Dr Hana Kahleova, Physicians Committee for Responsible Medicine (PCRM), Washington, DC, USA, and colleagues.
Gut microbiota play an important role in weight regulation, the development of metabolic syndrome, and type 2 diabetes. The aim of this study was to test the effect of a 16-week plant-based diet on gut microbiota composition, body weight, body composition, and insulin resistance in overweight adults with no history of diabetes.
The study included 147 participants (86% women and 14% men; mean age was 55.6±11.3 years), who were randomised to follow a low-fat vegan diet (n=73) or to make no changes to their diet (n=74) for 16 weeks. At baseline and 16 weeks, gut microbiota composition was assessed, using uBiome kits. Dual energy X-ray absorptiometry was used to measure body composition. A standard method called the PREDIM index was used to assess insulin sensitivity.
Following the 16-week study, body weight was reduced significantly in the vegan group (treatment effect average -5.8 kg), particularly due to a reduction in fat mass (average -3.9 kg) and in visceral fat. Insulin sensitivity also increased significantly in the vegan group.
The relative abundance of Faecalibacterium prausnitzii increased in the vegan group (treatment effect +4.8%). Relative changes in Faecalibacterium prausnitzii were associated with decreases in body weight, fat mass and visceral fat. The relative abundance of Bacteroides fragilis also increased in the vegan group (treatment effect +19.5%). Relative changes in Bacteroides fragilis were associated with decreases in body weight, fat mass and visceral fat, and increases in insulin sensitivity.
The authors conclude: "A 16-week low-fat vegan dietary intervention induced changes in gut microbiota that were related to changes in weight, body composition and insulin sensitivity in overweight adults."
However, the authors acknowledge that further work is needed to separate out the effects of the vegan diet itself from that of the reduced calories. They say: "A plant-based diet has been shown to be effective in weight management, and in diabetes prevention and treatment. This study has explored the link between changes in the gut microbiome, and changes in body weight, body composition, and insulin sensitivity. We have demonstrated that a plant-based diet elicited changes in gut microbiome that were associated with weight loss, reduction in fat mass and visceral fat volume, and increase in insulin sensitivity."
They add: "The main shift in the gut microbiome composition was due to an increased relative content of short-chain fatty acid producing bacteria that feed on fibre. Therefore, high dietary fibre content seems to be essential for the changes observed in our study. We plan to compare the effects of a vegan and a standard portion-controlled diet on gut microbiome in people with type 2 diabetes, in order to separate out the positive effects of the reduced calories in the diet from those caused by the vegan composition of the diet."
They continue: "This is a fascinating area of research and we have been collecting data from more study participants. We hope we will be able to present them at the next year's 2020 EASD meeting."
The authors say that fibre is the most important component of plant foods for promoting a healthy gut microbiome. Faecalibacterium prausnitzii is one of the short-chain fatty acid-producing bacteria, which degrade complex plant sugars and starch to produce health-promoting butyrate and/or other short-chain fatty acids that have been found to have a beneficial effect on body weight, body composition, and insulin sensitivity. The authors say: "Eating more fibre is the number one dietary recommendation for a healthy gut microbiome."

Don't make major decisions on an empty stomach, research suggests


We all know that food shopping when hungry is a bad idea but new research from the University of Dundee suggests that people might want to avoid making any important decisions about the future on an empty stomach.
The study, carried out by Dr Benjamin Vincent from the University's Psychology department, found that hunger significantly altered people's decision-making, making them impatient and more likely to settle for a small reward that arrives sooner than a larger one promised at a later date.
Participants in an experiment designed by Dr Vincent were asked questions relating to food, money and other rewards when satiated and again when they had skipped a meal.
While it was perhaps unsurprising that hungry people were more likely to settle for smaller food incentives that arrived sooner, the researchers found that being hungry actually changes preferences for rewards entirely unrelated to food.
This indicates that a reluctance to defer gratification may carry over into other kinds of decisions, such as financial and interpersonal ones. Dr Vincent believes it is important that people know that hunger might affect their preferences in ways they don't necessarily predict.
There is also a danger that people experiencing hunger due to poverty may make decisions that entrench their situation.
"We found there was a large effect, people's preferences shifted dramatically from the long to short term when hungry," he said. "This is an aspect of human behaviour which could potentially be exploited by marketers so people need to know their preferences may change when hungry.
"People generally know that when they are hungry they shouldn't really go food shopping because they are more likely to make choices that are either unhealthy or indulgent. Our research suggests this could have an impact on other kinds of decisions as well. Say you were going to speak with a pensions or mortgage advisor - doing so while hungry might make you care a bit more about immediate gratification at the expense of a potentially more rosy future.
"This work fits into a larger effort in psychology and behavioural economics to map the factors that influence our decision making. This potentially empowers people as they may forsee and mitigate the effects of hunger, for example, that might bias their decision making away from their long term goals."
Dr Vincent and his co-author and former student Jordan Skrynka tested 50 participants twice - once when they had eaten normally and once having not eaten anything that day.
For three different types of rewards, when hungry, people expressed a stronger preference for smaller hypothetical rewards to be given immediately rather than larger ones that would arrive later.
The researchers noted that when offered a reward now or double that reward in the future, participants were normally willing to wait up to 35 days for the doubled reward, but when hungry this plummeted to only 3 days.
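As a rough illustration of what that shift implies (the press release does not state which discounting model the researchers fitted), a standard hyperbolic discounting model converts those maximum tolerable delays into implied discount rates:

```python
# Illustrative only: the study's actual model isn't given in this summary.
# Under hyperbolic discounting, the present value of an amount A delayed by D days
# is V = A / (1 + k*D). Indifference between A now and 2A at delay D implies
# A = 2A / (1 + k*D)  =>  k = 1/D, so the longest tolerated delay pins down k.
def implied_discount_rate(max_delay_days: float) -> float:
    return 1.0 / max_delay_days

k_sated = implied_discount_rate(35)   # about 0.029 per day
k_hungry = implied_discount_rate(3)   # about 0.333 per day
print(f"Sated k = {k_sated:.3f}/day, hungry k = {k_hungry:.3f}/day "
      f"({k_hungry / k_sated:.0f}x steeper discounting when hungry)")
```

Under that assumption, hunger corresponds to roughly an order-of-magnitude steepening of the discount rate.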
The work builds on a well-known psychological study where children were offered one marshmallow immediately or two if they were willing to wait 15 minutes.
Those children who accepted the initial offering were classed as more impulsive than those who could delay gratification and wait for the larger reward. In the context of the Dundee study, this indicates that hunger makes people more impulsive even when the decisions they are asked to make will do nothing to relieve their hunger.
"We wanted to know whether being in a state of hunger had a specific effect on how you make decisions only relating to food or if it had broader effects, and this research suggests decision-making gets more present-focused when people are hungry," said Dr Vincent.
"You would predict that hunger would impact people's preferences relating to food, but it is not yet clear why people get more present-focused for completely unrelated rewards.
"We hear of children going to school without having had breakfast, many people are on calorie restriction diets, and lots of people fast for religious reasons. Hunger is so common that it is important to understand the non-obvious ways in which our preferences and decisions may be affected by it."

Eating cheese may offset blood vessel damage from salt



Cheese lovers, rejoice. Antioxidants naturally found in cheese may help protect blood vessels from damage from high levels of salt in the diet, according to a new Penn State study.
In a randomized, crossover design study, the researchers found that when adults consumed a high sodium diet, they also experienced blood vessel dysfunction. But, when the same adults consumed four servings of cheese a day alongside the same high sodium diet, they did not experience this effect.
Billie Alba, who led the study while finishing her PhD at Penn State, said the findings may help people balance food that tastes good with minimizing the risks that come with eating too much salt.
"While there's a big push to reduce dietary sodium, for a lot of people it's difficult," Alba said. "Possibly being able to incorporate more dairy products, like cheese, could be an alternative strategy to reduce cardiovascular risk and improve vessel health without necessarily reducing total sodium."
While sodium is a mineral that is vital to the human body in small doses, the researchers said too much dietary sodium is associated with cardiovascular risk factors like high blood pressure. The American Heart Association recommends no more than 2,300 milligrams (mg) of sodium a day, with the ideal amount being closer to 1,500 mg for most adults.
According to Lacy Alexander, professor of kinesiology at Penn State and another researcher on the study, previous research has shown a connection between dairy products -- even cheeses high in sodium -- and improved heart health measures.
"Studies have shown that people who consume the recommended number of dairy servings each day typically have lower blood pressure and better cardiovascular health in general," Alexander said. "We wanted to look at those connections more closely as well as explore some of the precise mechanisms by which cheese, a dairy product, may affect heart health."
The researchers recruited 11 adults without salt-sensitive blood pressure for the study. They each followed four separate diets for eight days at a time: a low-sodium, no-dairy diet; a low-sodium, high-cheese diet; a high-sodium, no-dairy diet; and a high-sodium, high-cheese diet.
The low-sodium diets had participants consume 1,500 mg of sodium a day, while the high-sodium diets included 5,500 mg of sodium per day. The cheese diets included 170 grams, or about four servings, of several different types of cheese a day.
At the end of each week-long diet, the participants returned to the lab for testing. The researchers inserted tiny fibers under the participants' skin and applied a small amount of the drug acetylcholine, a compound that signals blood vessels to relax. By examining how each participants' blood vessels reacted to the drug, the researchers were able to measure blood vessel function.
The participants also underwent blood pressure monitoring and provided a urine sample to ensure they had been consuming the correct amount of salt throughout the week.
The researchers found that after a week on the high-sodium, no-cheese diet, the participants' blood vessels did not respond as well to the acetylcholine -- which acts on specialized cells lining the blood vessels -- and had a more difficult time relaxing. But this was not seen after the high-sodium, high-cheese diet.
"While the participants were on the high-sodium diet without any cheese, we saw their blood vessel function dip to what you would typically see in someone with pretty advanced cardiovascular risk factors," Alexander said. "But when they consumed the same amount of salt, and ate cheese as a source of that salt, those effects were completely avoided."
Alba said that while the researchers cannot be sure that the effects are caused by any one specific nutrient in cheese, the data suggests that antioxidants in cheese may be a contributing factor.
"Consuming high amounts of sodium causes an increase in molecules that are harmful to blood vessel health and overall heart health," Alba said. "There is scientific evidence that dairy-based nutrients, specifically peptides generated during the digestion of dairy proteins, have beneficial antioxidant properties, meaning that they have the ability to scavenge these oxidant molecules and thereby protect against their damaging physiological effects."
Alba said that in the future, it will be important to study these effects in larger studies, as well as further research possible mechanisms by which dairy foods may preserve vascular health.

Meatballs might wreck the anti-cancer perks of tomato sauce



Eating your tomato sauce with meatballs piled on top could have a surprising downside, new research suggests.
Some of the anti-cancer benefits of tomatoes, specifically those from a compound called lycopene, could disappear when they're eaten with iron-rich foods, according to a new study from The Ohio State University.
Researchers analyzed the blood and digestive fluid of a small group of medical students after they consumed either a tomato extract-based shake with iron or one without iron. Lycopene levels in digestive fluid and in the blood were significantly lower when the study subjects drank the liquid meal mixed with an iron supplement, meaning there was less for the body to use in potentially beneficial ways.
"When people had iron with their meal, we saw almost a twofold drop in lycopene uptake over time," said the study's lead author, Rachel Kopec, an assistant professor of human nutrition at Ohio State.
"This could have potential implications every time a person is consuming something rich in lycopene and iron - say a Bolognese sauce, or an iron-fortified cereal with a side of tomato juice. You're probably only getting half as much lycopene from this as you would without the iron."
Iron is essential in the diet, performing such critical functions as allowing our bodies to produce energy and get rid of waste. But it's also a nutrient that is known to monkey with other cellular-level processes.
"We know that if you mix iron with certain compounds it will destroy them, but we didn't know if it would impair potentially beneficial carotenoids, like lycopene, found in fruits and vegetables," Kopec said.
Carotenoids are plant pigments with antioxidant properties responsible for many bright red, yellow and orange pigments found in the produce aisle. These include lycopene, which is found in abundance in tomatoes and also colors watermelon and pink grapefruit. Scientists have identified several potential anti-cancer benefits of lycopene, including in prostate, lung and skin cancers.
The small study, which included seven French medical students who had repeated blood draws and digestive samples taken from tubes placed in their stomachs and small intestines, took this research out of the test tube and into the human body, allowing for a better examination of human metabolism in action, Kopec said.
It's unclear precisely what is happening that is changing the uptake of lycopene, but it could be that the meal with iron oxidizes the lycopene, creating different products of metabolism than those followed in the study.
"It's also possible that iron interrupts the nice emulsified mix of tomato and fats that is critical for cells to absorb the lycopene. It could turn it into a substance like separated salad dressing - oil on top and vinegar on the bottom - that won't ever mix properly," Kopec said.
Researchers continue to work to better understand lycopene's role in fighting cancer, and the importance of its interplay with other compounds and nutrients.
"Nutrition can play an important role in disease prevention, but it's important for us to gather the details about precisely how what we eat is contributing to our health so that we can give people reliable, science-based recommendations," Kopec said.

Tuesday, September 10, 2019

Skin cancer risk: The dangers of ultraviolet radiation


The dangers of ultraviolet radiation exposure, which most often comes from the sun, are well-known. Speaking at The Physiological Society's Extreme Environmental Physiology conference next week, W. Larry Kenney, Penn State University, will discuss how broad its effects can be, from premature aging to cancer, and how this can be influenced by different skin tones and the use of sunscreen.
Athletes ranging from hikers to tennis players and runners exceed the recommended ultraviolet exposure limit by up to eight-fold during the summer and autumn months. While regular physical activity is associated with a reduced risk of most cancers, skin cancer is an exception: for malignant skin cancer, those in the 90th percentile for physical activity have a higher risk than those in the 10th percentile. Sun protection in these groups is especially important, as multiple studies demonstrate an elevated risk of skin cancer for those who regularly participate in outdoor sports or exercise.
The ultraviolet radiation spectrum is categorized by wavelength as UV-A (320-400 nm), UV-B (290-320 nm), and UV-C (200-290 nm), and the biological effects vary by type. UV-A constitutes around 95% of the ultraviolet radiation that reaches the earth's surface, with the remainder being UV-B. In the skin, UV-A can reach the blood circulation, whereas most UV-B is absorbed in the outer layers of the skin (the epidermis and upper dermis) because of its shorter wavelengths.
Skin pigmentation is another factor that affects our response to sun exposure. UV radiation affects levels of two important substances, vitamin D and folate, which contribute to both a healthy pregnancy and early childhood development: it promotes vitamin D synthesis, whereas it causes folate to break down.
There is a theory that suggests that early human populations, living in equatorial Africa, evolved skin pigmentation to protect themselves from folate degradation. This theory also says that depigmentation then occurred as humans moved away from the equator to allow for higher levels of vitamin D synthesis.
Commenting on his talk, Professor Kenney said:
"Sun protection in athletes is especially important as multiple studies demonstrate an elevated risk of skin cancer for those who regularly participate in outdoor sports or exercise. Surprisingly, fewer than 25% of surveyed athletes reported regular use of sunscreen, so there is clearly more awareness-raising that needs to be done."

Once or twice weekly daytime nap linked to lower heart attack/stroke risk


But no such association found for greater frequency or duration of naps



A daytime nap taken once or twice a week may lower the risk of having a heart attack/stroke, finds research published online in the journal Heart. But no such association emerged for either greater frequency or duration of naps.
The impact of napping on heart health has been hotly contested. Many of the published studies on the topic have failed to consider napping frequency, or focused purely on cardiovascular disease deaths, or compared regular nappers with those not opting for a mini siesta, say the researchers.
In a bid to address these issues, they looked at the association between napping frequency, average nap duration, and the risk of fatal and non-fatal cardiovascular disease 'events,' such as heart attack, stroke, or heart failure, among 3,462 randomly selected residents of Lausanne, Switzerland.
Participants were aged between 35 and 75 when recruited, between 2003 and 2006, to the CoLaus study, which has been examining the factors behind the development of cardiovascular disease.
Participants' first check-up took place between 2009 and 2012, when information on their sleep and nap patterns in the previous week was collected; their health was then monitored for an average of 5 years.
Over half (58%, 2014) of the participants said they didn't nap during the previous week; around one in five (19%, 667) said they took one to two naps; around one in 10 (12%, 411) said they took three to five; while a similar proportion (11%, 370) said they took six to seven.
Frequent nappers (3-7 naps a week) tended to be older, male, smokers, weigh more, and to sleep for longer at night than those who said they didn't nap during the day.
And they reported more daytime sleepiness and more severe obstructive sleep apnea -- a condition in which the walls of the throat relax and narrow during sleep, interrupting normal breathing.
During the monitoring period, there were 155 fatal and non-fatal cardiovascular disease 'events'.
Occasional napping, once to twice weekly, was associated with an almost halving of heart attack/stroke/heart failure risk (a 48% reduction) compared with not napping at all.
This association held true after taking account of potentially influential factors, such as age, and nighttime sleep duration, as well as other cardiovascular disease risks, such as high blood pressure/cholesterol.
And it didn't change after factoring in excessive daytime sleepiness, depression, and regularly sleeping for at least 6 hours a night. Only older age (65+) and severe sleep apnea affected it.
But the 67% heightened cardiovascular risk initially observed for frequent nappers virtually disappeared after taking account of potentially influential factors. And no associations with cardiovascular disease 'events' were found for nap length (from 5 minutes to 1 hour plus).
This is an observational study and, as such, can't establish cause; moreover, the information on nap and sleep patterns relied on personal recall. But nap frequency may help to explain the differing conclusions reached by researchers about the impact of napping on heart health, the study authors suggest.
In a linked editorial, Drs Yue Leng and Kristine Yaffe, of the University of California at San Francisco, USA, point out that research in this area is hampered by the absence of a gold standard for defining and measuring naps, making it "premature to conclude on the appropriateness of napping for maintaining optimal heart health."
But they add: "While the exact physiological pathways linking daytime napping to [cardiovascular disease] risk is not clear, [this research] contributes to the ongoing debate on the health implications of napping, and suggests that it might not only be the duration, but also the frequency that matters."
And they conclude: "The study of napping is a challenging but also a promising field with potentially significant public health implications. While there remain more questions than answers, it is time to start unveiling the power of naps for a supercharged heart."

Monday, September 9, 2019

Fatty foods necessary for vitamin E absorption, but not right away



A fresh look at how to best determine dietary guidelines for vitamin E has produced a surprising new finding: Though the vitamin is fat soluble, you don't have to consume fat along with it for the body to absorb it.
"I think that's remarkable," said the study's corresponding author, Maret Traber of Oregon State University, a leading authority on vitamin E who's been researching the micronutrient for three decades. "We used to think you had to eat vitamin E and fat simultaneously. What our study shows is that you can wait 12 hours without eating anything, then eat a fat-containing meal and vitamin E gets absorbed."
The study was published today in The American Journal of Clinical Nutrition.
Vitamin E, known scientifically as alpha-tocopherol, has many biologic roles, one of which is to serve as an antioxidant, said Traber, a professor in the OSU College of Public Health and Human Sciences, and Ava Helen Pauling Professor at Oregon State's Linus Pauling Institute.
Federal dietary guidelines call for 15 milligrams of vitamin E daily (by comparison, 65-90 milligrams of vitamin C are recommended). The new research could play a role in future vitamin E guidelines.
Vitamin E in human diets is most often provided by oils, such as olive oil. Many of the highest levels are in foods not routinely considered dietary staples, such as almonds, sunflower seeds and avocados.
"There's increasingly clear evidence that vitamin E is associated with brain protection, and now we're starting to better understand some of the underlying mechanisms," Traber said.
In this latest study, Traber and collaborators used a novel technique involving deuterium-labeled vitamin E, administered both orally and intravenously, to study fractional vitamin E absorption in a group of non-obese, non-diabetic women ages 18-40 with normal blood pressure.
Fractional absorption means just what you would think - the fraction of the dose absorbed by the body rather than metabolized and excreted. Fractional absorption dictates how much of something, in this case vitamin E, a person needs to take to maintain the correct level in his or her body.
Deuterium, the vitamin E marker in this study, is an isotope of hydrogen with double the atomic mass of the regular version; deuterium has both a proton and a neutron, compared to just a proton for normal hydrogen, and is a common tracer in investigations of biochemical reactions.
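With paired oral and intravenous tracer doses, fractional absorption is typically estimated by comparing the dose-normalized plasma exposure of the oral label with that of the IV label. The expression below is a standard illustration of that calculation, not necessarily the exact model used in this paper:

    f = \frac{\mathrm{AUC}_{\text{oral}} / D_{\text{oral}}}{\mathrm{AUC}_{\text{IV}} / D_{\text{IV}}}

Here AUC is the area under the plasma concentration-time curve for each labeled dose and D is the administered dose; an f of 0.5, for example, would mean that half of the swallowed dose reached the circulation.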
Study subjects at the National Institutes of Health Clinical Center were given both oral and IV vitamin E and drank a liquid meal containing either 40% fat or no fat. Researchers then used a combination of tightly controlled dietary intakes to determine the roles fat and fasting played in vitamin E absorption.
"What this study says is, vitamin E gets taken up into the intestinal cell and sits there and waits for the next meal to come along," Traber said. "It's in a fat droplet, sitting there, waiting to be picked up, like a cargo container, and loaded onto a chylomicron truck."
Chylomicrons are lipoprotein particles that transport dietary lipids - fats - around the body through the blood plasma.
The IV portion of the study, used in conjunction with the oral dosing to calculate fractional absorption, also yielded remarkable findings, Traber said.
"We injected the vitamin E in a lipid emulsion and expected it would take some time to disappear from the plasma and them come slowly back into circulation, but it was gone within 10 minutes," Traber said. "High-density lipoproteins quickly acquired the vitamin E, and the chylomicrons quickly disappeared from circulation into the liver.
"The IV vitamin E we put into the body over three days, almost none of it came out again, like 2% of the dose," she added. "No one had ever seen that before - normally you absorb about half of what you consume. That vitamin E that's staying in the body, we don't know where it goes, and finding that out is important for studying how much vitamin E you need to eat every day."
Vitamin E is a group of eight compounds - four tocopherols and four tocotrienols - distinguished by their chemical structure. Alpha-tocopherol is the form usually meant by "vitamin E" and the one found in supplements and in the European diet; gamma-tocopherol is the type most commonly found in the American diet.
"Plants make eight different forms of vitamin E and you absorb them all, but the liver only puts alpha-tocopherol back into the bloodstream," Traber said. "All of the other forms are metabolized and excreted. That tells us the body is working very hard to get all the nutrients it can and will sort out what the toxins are later. That's really exciting, because it explains why the liver needs an alpha-tocopherol transfer protein but the intestine does not."

Use of antibiotics in preemies has lasting, potentially harmful effects


Nearly all premature babies receive antibiotics in their first weeks of life to ward off or treat potentially deadly bacterial infections. Such drugs are lifesavers, but they also cause long-lasting collateral damage to the developing microbial communities in the babies' intestinal tracts, according to research from Washington University School of Medicine in St. Louis.
A year and a half after babies leave the neonatal intensive care unit (NICU), the consequences of early antibiotic exposure remain, the study showed. Compared with the microbiomes of healthy full-term babies in the study who had not received antibiotics, preemies' microbiomes contained more bacteria associated with disease, fewer species linked to good health, and more bacteria able to withstand antibiotics.
The findings, published Sept. 9 in Nature Microbiology, suggest that antibiotic use in preemies should be carefully tailored to minimize disruptions to the gut microbiome - and that doing so might reduce the risk of health problems later in life.
"The type of microbes most likely to survive antibiotic treatment are not the ones we typically associate with a healthy gut," said senior author Gautam Dantas, PhD, a professor of pathology and immunology, of molecular microbiology, and of biomedical engineering. "The makeup of your gut microbiome is pretty much set by age 3, and then it stays pretty stable. So if unhealthy microbes get a foothold early in life, they could stick around for a very long time. One or two rounds of antibiotics in the first couple weeks of life might still matter when you're 40."
Healthy gut microbiomes have been linked to reduced risk of a variety of immune and metabolic disorders, including inflammatory bowel disease, allergies, obesity and diabetes. Researchers already knew that antibiotics disrupt the intestinal microbial community in children and adults in ways that can be harmful. What they didn't know was how long the disruptions last.
To find out whether preemies' microbiomes recover over time, Dantas and colleagues - including first author Andrew Gasparrini, PhD, who was a graduate student at the time the study was conducted, and co-authors Phillip I. Tarr, MD, the Melvin E. Carnahan Professor of Pediatrics, and Barbara Warner, MD, director of the Division of Newborn Medicine - analyzed 437 fecal samples collected from 58 infants, ages birth to 21 months. Forty-one of the infants were born around 2 ½ months premature, and the remainder were born at full term.
All of the preemies had been treated with antibiotics in the NICU. Nine had received just one course, and the other 32 each had been given an average of eight courses and spent about half their time in the NICU on antibiotics. None of the full-term babies had received antibiotics.
The researchers discovered that preemies who had been heavily treated with antibiotics carried significantly more drug-resistant bacteria in their gut microbiomes at 21 months of age than preemies who had received just one course of antibiotics, or full-term infants who had not received antibiotics. The presence of drug-resistant bacteria did not necessarily cause any immediate problems for the babies because most gut bacteria are harmless - as long as they stay in the gut. But gut microbes sometimes escape the intestine and travel to the bloodstream, urinary tract or other parts of the body. When they do, drug resistance can make the resulting infections very difficult to treat.
Moreover, by culturing bacteria from fecal samples taken eight to 10 months apart, the researchers discovered that the drug-resistant strains present in older babies were the same ones that had established themselves early on.
"They weren't just similar bugs, they were the same bugs, as best we could tell," Dantas said. "We had cleared an opening for these early invaders with antibiotics, and once they got in, they were not going to let anybody push them out. And while we didn't show that these specific bugs had caused disease in our kids, these are exactly the kind of bacteria that cause urinary tract and bloodstream infections and other problems. So you have a situation where potentially pathogenic microbes are getting established early in life and sticking around."
Further studies showed that all of the babies developed diverse microbiomes by 21 months of age - a good sign since lack of microbial diversity is associated with immune and metabolic disorders in children and adults. But heavily treated preemies developed diverse microbiomes more slowly than lightly treated preemies and full-term infants. Further, the makeup of the gut microbial communities differed, with heavily treated premature infants having fewer healthy groups of bacteria such as Bifidobacteriaceae and more unhealthy kinds such as Proteobacteria.
The findings already have led Warner, who takes care of premature infants in the NICU at St. Louis Children's Hospital, and her fellow neonatologists to scale back their use of antibiotics.
"We're no longer saying, 'Let's just start them on antibiotics because it's better to be safe than sorry,'" Warner said. "Now we know there's a risk of selecting for organisms that can persist and create health risks later in childhood and in life. So we're being much more judicious about initiating antibiotic use, and when we do start babies on antibiotics, we take them off as soon as the bacteria are cleared. We still have to use antibiotics - there's no question that they save lives - but we've been able to reduce antibiotic use significantly with no increase in adverse outcomes for the children."

Women's deep belly fat more strongly linked to diabetes and cardiovascular diseases


A comprehensive study from Uppsala University, with over 325,000 participants, shows that deep belly fat is a major contributing risk factor for developing diabetes and cardiovascular disease. The study also shows that deep belly fat is a larger risk factor in women compared to men. Moreover, the scientists investigated how our genes affect the accumulation of fat and present a new, simpler method to estimate the amount of deep belly fat.
Visceral fat - fat stored around the organs in the belly and around the intestines - is known to be associated with a higher risk of developing diabetes and cardiovascular disease. In the new study, published in Nature Medicine, the scientists took it one step further and showed, using genetic data, that there is an actual causal relationship between visceral fat and increased risk of diabetes, heart attack, hypertension and hyperlipidemia.
The scientists developed a method to more easily estimate visceral fat content. The method is not only useful for research purposes, but may also be useful in health care.
"To measure the amount of visceral fat, advanced and costly diagnostic imaging techniques are required. We have developed a simple method which instead estimates an individual's amount of deep belly fat from other parameters, more easily measured than the visceral fat itself, and the method can therefore be used in most clinics," says Dr. Torgny Karlsson, statistician at the Department of Immunology, Genetics and Pathology, Uppsala University, and one of the leading researchers of the study.
The method also enabled the researchers to study the effects of visceral fat on a much larger scale than before.
"We were surprised that visceral fat was more strongly linked to risk of disease in women compared to men," says one of the co-authors, Dr. Åsa Johansson, associate professor of molecular epidemiology at the Department of Immunology, Genetics and Pathology, Science for Life Laboratory, Uppsala University.
"Adding an extra kilogram of visceral fat can increase the risk of type 2 diabetes more than seven times in women, while the same amount of fat accumulation only increases the risk a little more than two times in men," says Dr. Johansson.
The scientists also found that the risk of disease increases most rapidly in people with small or moderate amounts of deep belly fat, but that it does not increase nearly as much if a person with large amounts of fat in the abdomen puts on additional fat.
"Nonlinear effects like this are very interesting to study and may help us to understand the biology behind the link between visceral fat and disease," says Dr. Karlsson.
The scientists also examined millions of positions in the genome to identify genes that affect the amount of visceral fat, and found more than two hundred different genes. Among these, there was a large proportion of genes that are linked to our behaviour, which suggests that the main contributor to abdominal obesity is, after all, that we eat too much and exercise too little. However, there are individual differences in how the fat is distributed in the body, and a person who appears not to be overweight may still have accumulated a harmful amount of visceral fat.
"The findings of this study may enable us to simplify measurements of visceral fat, and thus more easily identify people at high risk of developing diabetes and cardiovascular disease," says Dr. Karlsson.

World's largest evidence review: Nutritional supplements for mental health



We've all heard that 'food is good for your mood'. Now a new study into mental health and nutrient supplementation has taken a leap forward by establishing the gold standard for which nutrients are proven to assist in the management of a range of mental health disorders.
As well as an established relationship between poor diet and mental illness, there is now a vast body of research examining the benefit of nutrient supplementation in people with mental disorders.
To unpack this research, an international team of scientists led by Sydney's NICM Health Research Institute at Western Sydney University examined the 'best of the best' available evidence. The aim was to provide a clear overview of the benefit of specific nutrient supplements - including dosage, target symptoms, safety and tolerability - across different mental disorders.
The world's largest review (a meta-synthesis) of top-tier evidence, published online today in World Psychiatry, examined 33 meta-analyses of randomised control trials (RCTs) and data from 10,951 people with mental health disorders including depression, stress and anxiety disorders, bipolar disorder, personality disorders, schizophrenia and attention-deficit/hyperactivity disorder (ADHD).
Although the majority of nutritional supplements assessed did not significantly improve mental health, the researchers found strong evidence that certain supplements are an effective additional treatment for some mental disorders, supportive of conventional treatment.
All nutrient supplements were found to be safe when recommended dosages and prescriptive instructions were adhered to and there was no evidence of serious adverse effects or contraindications with psychiatric medications.
Summary of results:
  • The strongest evidence was found for omega-3 supplements (a polyunsaturated fatty acid) as an add-on treatment for major depression - reducing symptoms of depression beyond the effects of antidepressants alone.
  • There was some evidence to suggest that omega-3 supplements may also have small benefits for ADHD.
  • There was emerging evidence for the amino acid N-acetylcysteine as a useful adjunctive treatment in mood disorders and schizophrenia.
  • Special types of folate supplements may be effective as add-on treatments for major depression and schizophrenia; folic acid, however, was ineffective.
  • There was no strong evidence for omega-3 for schizophrenia or other mental health conditions.
  • There is currently a lack of compelling evidence supporting the use of vitamins (such as E, C, or D) and minerals (zinc and magnesium) for any mental disorder.
Lead author of the study, Dr Joseph Firth, Senior Research Fellow at NICM Health Research Institute, Western Sydney University and Honorary Research Fellow at The University of Manchester said the findings should be used to produce more evidence-based guidance on the usage of nutrient-based treatments for various mental health conditions.
"While there has been a longstanding interest in the use of nutrient supplements in the treatment of mental illness, the topic is often quite polarising, and surrounded by either over-hyped claims or undue cynicism," Dr Firth said.
"In this most recent research, we have brought together the data from dozens and dozens of clinical trials conducted all over the world, in over 10,000 individuals treated for mental illness.
"This mass of data has allowed us to investigate the benefits and safety of various different nutrients for mental health conditions - on a larger scale than what has ever been possible before."
Senior author on the study, NICM Health Research Institute's Professor Jerome Sarris said as the role of nutrition in mental health is becoming increasingly acknowledged, it was vital that an evidence-based approach be adopted.
"Future research should aim to determine which individuals might benefit most from evidence-based supplements and to better understand the underlying mechanisms so we can adopt a targeted approach to supplement use in mental health treatment." Professor Sarris said.
"The role of the gut microbiome in mental health is a rapidly emerging field of research, however more research is needed into the role of 'psychobiotics' in mental health treatment."

Friday, September 6, 2019

More time spent standing helps combat effects of sedentary lifestyle


A study conducted by researchers from the University of Granada (UGR) recommends spending more time standing to increase energy expenditure and combat the negative health effects of a sedentary lifestyle. The research has also quantified exactly how many extra calories we burn when we remain standing: 45 kilocalories more, per six-hour period, than when lying or sitting.
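Scaled from the study's own figure, and assuming the per-hour difference stays roughly constant, the extra expenditure works out to:

    \frac{45\ \text{kcal}}{6\ \text{h}} = 7.5\ \text{kcal/h}, \qquad 7.5\ \text{kcal/h} \times 8\ \text{h} = 60\ \text{kcal per workday} \approx 300\ \text{kcal per five-day week}

a modest figure on its own, in keeping with the authors' framing of standing as one small lifestyle change among several.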
One of the applications of the study, published in the journal PLOS ONE, could be the use of adjustable-height tables that enable people to work standing up--already a very common practice in the Nordic countries--and thus combat the negative effects of a sedentary working environment. These tables can be fully adjusted to suit the height of the user, depending on whether they want to sit or stand while working.
Francisco J. Amaro-Gahete, a PhD student in Biomedicine at the UGR's International School for Postgraduate Studies and a member of the Department of Physiology, is the main author of the article. He notes: "We Spaniards spend between 8 and 10 hours sitting or lying down each day, not counting the hours we are asleep. Therefore, if we take steps to combat a sedentary lifestyle by making small lifestyle changes, such as spending more time standing, this could reduce the risk of developing diseases such as obesity or Type 2 diabetes."
In the published article, the researchers describe how one effective way to address the effects of a sedentary lifestyle is to reduce the time we spend sitting or lying down and to stand more often. The article also presents the study's findings on the energy expenditure of each of these three positions.
Two kinds of people: Energy savers and energy spenders
The scientists used a sample of 53 young adults, who were classified into two types, energy "savers" and energy "spenders," depending on how much additional energy they expended when switching from sitting or lying down to standing.
"Savers consume very little energy in their activities and, therefore, the difference between sitting/lying or standing is practically nil for them. But energy spenders burn approximately 10% more energy when they switch from sitting or lying to standing," explains Amaro.
So what makes a person spend more or less energy? This is a question that researchers are still trying to answer, as it is related, for example, to the issue of why some people lose weight so easily and others find it so difficult.
The factor that appears to have the greatest effect is muscle mass. "People with more muscle mass expend more energy than people with less muscle mass," observes the UGR researcher.
In light of the results, the authors recommend spending more time standing in the office as a good strategy to use up more energy and thus avoid storing it as fat.
"It is really important to change your position," comments Jonatan Ruiz, another of the authors of the article. "If a person were to get up, take 10 steps, and sit down again, it appears that the effects of a sedentary lifestyle would be greatly reduced. Therefore, we must educate our school-age children and young people, as well as teachers, about the importance of avoiding spending long periods of time sitting down to considerably reduce the negative consequences of a sedentary lifestyle such as excess weight and obesity, or the risk of developing cardiovascular disease."

Thursday, September 5, 2019

Hot yoga classes lowered blood pressure


Taking hot yoga classes lowered blood pressure in a small study of adults with elevated or stage 1 hypertension, according to preliminary research presented at the American Heart Association's Hypertension 2019 Scientific Sessions.
While there is evidence of regular, room-temperature yoga's positive effect on blood pressure, little is known about hot yoga's potential impact on blood pressure, according to the study researchers.
"The findings are very preliminary at this point, yet they're somewhat promising in terms of unveiling another unique way to lower blood pressure in adults without the use of medications," said Stacy Hunter, Ph.D., study author and assistant professor and lab director of the cardiovascular physiology lab at Texas State University in San Marcos, Texas. "Hot yoga is gaining popularity, and we're even seeing other styles of yoga, like Vinyasa and power yoga, being offered in heated studios."
Hot yoga is a modern practice, typically offered in a hot, humid atmosphere, with room temperatures around 105 degrees Fahrenheit. Some believe the practice of hot yoga replicates the heat and humidity of India, where yoga originated, while others look at the excessive sweating as a way to rid the body of impurities.
Hunter and colleagues recruited 10 men and women between the ages of 20 and 65. Participants had either elevated blood pressure (systolic blood pressure between 120 mmHg and 129 mmHg and diastolic pressure less than 80 mmHg) or stage 1 hypertension (130 mmHg to 139 mmHg systolic and 80 mmHg to 89 mmHg diastolic pressure). These adults were not taking any type of blood pressure medication and had been sedentary -- meaning they had not engaged in a regular physical fitness routine -- for at least six months before the study.
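Expressed as a simple screening check, and following the thresholds exactly as quoted above (the full AHA/ACC classification includes further categories that were not used for enrollment here), the eligibility criteria could be sketched as:

    # Illustrative sketch of the enrollment screen, using the blood pressure
    # thresholds as quoted in the study description above.
    def bp_screen(systolic: int, diastolic: int) -> str:
        if 130 <= systolic <= 139 and 80 <= diastolic <= 89:
            return "stage 1 hypertension"
        if 120 <= systolic <= 129 and diastolic < 80:
            return "elevated blood pressure"
        return "outside the study's screening range"

    print(bp_screen(134, 85))  # stage 1 hypertension
    print(bp_screen(124, 76))  # elevated blood pressure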
Researchers randomly assigned five participants to 12 weeks of hour-long hot yoga classes three times a week, and the other five to a control group that took no yoga classes. After the 12 weeks, they compared the two groups' average 24-hour blood pressure readings, as well as their perceived stress and vascular function.
At 12 weeks, they found:
  • Systolic blood pressure dropped from an average of 126 mmHg at the study's start to 121 mmHg after 12 weeks of hot yoga, and average diastolic pressure decreased from 82 mmHg to 79 mmHg in the hot yoga group.
  • Average blood pressure did not change among the five adults in the control group, who did not take hot yoga classes.
  • Perceived stress levels fell among those in the hot yoga group but not in the non-yoga group.
  • While waking systolic and diastolic pressures fell in the hot yoga group, blood pressure readings taken during sleep did not change.
  • There were no changes in vascular function in either group.
"The results of our study start the conversation that hot yoga could be feasible and effective in terms of reducing blood pressure without medication," Hunter said. "However, larger studies need to be done before we can say with confidence that hot yoga has a positive impact on blood pressure."
Taking safety precautions is important, according to Hunter. Adults taking hot yoga classes should arrive hydrated, drink water throughout the class, dress appropriately, not overdo it, and be aware of the signs and symptoms of heat illness. Always talk to your doctor before starting any new exercise regimen to determine what is best for you.

High blood pressure accelerates cognitive decline


High blood pressure appears to accelerate cognitive decline among middle-aged and older adults, and treating high blood pressure may slow the process, according to preliminary research presented at the American Heart Association's Hypertension 2019 Scientific Sessions.
The findings are important because high blood pressure and cognitive decline are two of the most common conditions associated with aging, and more people are living longer worldwide.
According to the American Heart Association's 2017 Hypertension Guidelines, high blood pressure is a global health threat, affecting approximately 80 million U.S. adults and one billion people worldwide. Moreover, the relationship between brain health and high blood pressure is of growing interest as researchers examine how elevated blood pressure affects the brain's blood vessels, which in turn may impact memory, language and thinking skills.
In this observational study, researchers from Columbia University analyzed data collected on nearly 11,000 adults from the China Health and Retirement Longitudinal Study (CHARLS) between 2011 and 2015 to assess how high blood pressure and its treatment may influence cognitive decline. High blood pressure was defined as having a systolic blood pressure of 140 mmHg or higher and a diastolic blood pressure of 90 mmHg or higher, and/or taking antihypertensive medications. (Note: The American Heart Association guidelines define high blood pressure as a systolic reading of 130 mmHg or higher or a diastolic reading of 80 mmHg or higher.)
Researchers in China interviewed study participants at home about their high blood pressure treatment and education level, and noted whether they lived in a rural or urban environment. Participants were also asked to perform cognitive tests, such as immediately recalling words as part of a memory quiz.
Among the study's findings:
  • Overall cognition scores declined over the four-year study;
  • Participants ages 55 and older who had high blood pressure showed a more rapid rate of cognitive decline compared with participants who were being treated for high blood pressure and those who did not have high blood pressure; and
  • The rate of cognitive decline was similar between those receiving high blood pressure treatment and those who did not have high blood pressure.
The study did not evaluate why or how high blood pressure treatments may have contributed to slower cognitive decline or if some treatments were more effective than others.
"We think efforts should be made to expand high blood pressure screenings, especially for at-risk populations, because so many people are not aware that they have high blood pressure that should be treated," said presenting study author Shumin Rui, a biostatistician at the Mailman School of Public Health, Columbia University in New York. "This study focused on middle-aged and older adults in China, however, we believe our results could apply to populations elsewhere as well. We need to better understand how high blood pressure treatments may protect against cognitive decline and look at how high blood pressure and cognitive decline are occurring together."

Eating mushrooms may help lower prostate cancer risk


A new study published in the International Journal of Cancer found an inverse relationship between mushroom consumption and the development of prostate cancer among middle-aged and elderly Japanese men, suggesting that regular mushroom intake might help to prevent prostate cancer.

A total of 36,499 men, aged 40 to 79 years who participated in the Miyagi Cohort Study in 1990 and in the Ohsaki Cohort Study in 1994 were followed for a median of 13.2 years. During follow-up, 3.3% of participants developed prostate cancer. Compared with mushroom consumption of less than once per week, consumption once or twice a week was associated with an 8% lower risk of prostate cancer and consumption three or more times per week was associated with a 17% lower risk.

"Since information on mushroom species was not collected, it is difficult to know which specific mushroom(s) contributed to our findings. Also, the mechanism of the beneficial effects of mushrooms on prostate cancer remains uncertain," said lead author Shu Zhang, PhD, of the Tohoku University School of Public Health, in Japan.