A large analysis of data from nearly 400,000 healthy U.S. adults followed for more than 20 years has found no association between regular multivitamin use and lower risk of death. The study, led by researchers at the National Institutes of Health’s National Cancer Institute, was published June 26, 2024, in JAMA Network Open.
Many adults in the United States take multivitamins with the hope of improving their health. However, the benefits and harms of regular multivitamin use remain unclear. Previous studies of multivitamin use and mortality have yielded mixed results and been limited by short follow-up times.
To more deeply explore the relationship between long-term regular multivitamin use and overall mortality and death from cardiovascular disease and cancer, the researchers analyzed data from three large, geographically diverse prospective studies involving a total of 390,124 U.S. adults who were followed for more than 20 years. The participants included in this analysis were generally healthy, with no history of cancer or other chronic diseases.
Because the study population was so large and included lengthy follow-up and extensive information on demographics and lifestyle factors, the researchers were able to mitigate the effects of possible biases that may have influenced the findings of other studies. For example, people who use multivitamins may have healthier lifestyles in general, and sicker patients may be more likely to increase their use of multivitamins.
The analysis showed that people who took daily multivitamins did not have a lower risk of death from any cause than people who took no multivitamins. There were also no differences in mortality from cancer, heart disease, or cerebrovascular diseases. The results were adjusted for factors such as race and ethnicity, education, and diet quality.
The researchers noted that it is important to evaluate multivitamin use and risk of death among different kinds of populations, such as those with documented nutritional deficiencies, as well as the potential impact of regular multivitamin use on other health conditions associated with aging.
Moderate levels of physical activity and fitness may be linked to a reduced risk of amyotrophic lateral sclerosis (ALS) later in life, according to a new study published in the June 26, 2024, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study found an association between physical activity and risk of ALS only in male participants, not female participants.
ALS is a rare, progressive neurodegenerative disease that affects nerve cells in the brain and the spinal cord. People with ALS lose the ability to initiate and control muscle movement, which often leads to total paralysis and death. The average life span after diagnosis is two to five years.
“The diagnosis of prominent athletes with ALS at young ages has sparked the uncomfortable idea that higher physical activity could be tied to developing ALS,” said study author Anders Myhre Vaage, MD, of Akershus University Hospital in Norway. “There have been conflicting findings on levels of physical activity, fitness and ALS risk. Our study found that for men, living a more active lifestyle could be linked to a reduced risk of ALS more than 30 years later.”
For the study, researchers looked at 373,696 people in Norway with an average age of 41. They were followed for an average of 27 years.
Of the total participants, 504 people developed ALS. Of those who developed ALS, 59% were male participants.
Participants recorded their level of physical activity for the past year into one of four categories: sedentary; a minimum of four hours per week of walking or cycling; a minimum of four hours per week of recreational sports or heavy gardening; or participation in hard training or sports competitions regularly, several times a week. Due to few participants with the highest level of physical activity, researchers combined the third and fourth categories into one high activity group.
Researchers found that of the 41,898 male participants who had the highest level of physical activity, 63 developed ALS; of the 76,769 male participants with the intermediate level of physical activity, 131 developed ALS; and of the 29,468 male participants with the lowest level of physical activity, 68 developed ALS.
After adjusting for other factors that could affect the risk of ALS, such as smoking and body mass index, researchers found that for male participants, when compared to those with the lowest level of physical activity, those with moderate levels of physical activity had a 29% lower risk of ALS and those with high levels of physical activity had a 41% lower risk of ALS.
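As a rough cross-check, the raw counts reported above can be turned into crude incidence proportions. This is only an unadjusted sketch: the study's 29% and 41% figures come from models that control for smoking and body mass index, so they differ from these crude rates, though the direction is the same.

```python
# Crude (unadjusted) ALS cases per 100,000 male participants, computed
# from the counts reported in the article. These are not the study's
# adjusted estimates, which control for smoking and body mass index.
counts = {
    "low activity": (68, 29_468),
    "intermediate": (131, 76_769),
    "high activity": (63, 41_898),
}
for group, (cases, n) in counts.items():
    rate = cases / n * 100_000
    print(f"{group}: {rate:.0f} cases per 100,000")
```

Even without adjustment, the crude rate falls as activity level rises, matching the adjusted result's direction.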
Researchers also looked at resting heart rate. Men in the lowest of four categories of resting heart rate, which indicates good physical fitness, had a 32% reduced risk of ALS compared to those with higher resting heart rates.
“Our findings show that, for men, not only do moderate to high levels of physical activity and fitness not increase the risk of ALS, but that it may be protective against the disease,” Myhre Vaage said. “Future studies of the connection between ALS and exercise are needed to consider sex differences and higher or professional athlete physical activity levels.”
A limitation of the study was that the physical activity questionnaire was completed only at one specific time during the study, so it may not have captured the participants’ exercise levels over the nearly 30-year span of the study.
Adult dietary patterns with increased bean consumption are associated with greater intakes of shortfall nutrients, lower added sugar intake, improved weight-related outcomes and better diet quality
Limited evidence is available that focuses on beans within American dietary patterns and health. The purpose of this study was to identify commonly consumed adult dietary patterns that included beans and compare shortfall nutrient intakes and diet quality, relative to adults whose typical dietary pattern did not include beans.
Bean consumption was defined as eating kidney beans, black beans, chickpeas, and/or pinto beans.
Dietary patterns that are rich in canned and dry beans were associated with significantly higher diet quality scores and greater intake of shortfall nutrients, including nutrients of public health concern. Bean dietary patterns were also associated with improved weight-related outcomes. Dietary guidance should consider the nutrient and health benefits associated with the promotion of increased canned and dry bean consumption in American dietary patterns.
A yearlong randomized controlled trial found that daily prune consumption slowed bone loss connected to osteoporosis
Dairy isn’t the only food that’s good for bone health. Prunes may also protect bone structure and strength in postmenopausal women, according to a new study led by Penn State researchers. The findings, published in Osteoporosis International, suggest that daily prune consumption slows the progression of age-related bone loss and reduces the risk of fracture.
“This is the first randomized controlled trial to look at three-dimensional bone outcomes with respect to bone structure, geometry and estimated strength,” said Mary Jane De Souza, distinguished professor of kinesiology and physiology at Penn State. “In our study we saw that daily prune consumption impacted factors related to fracture risk. That’s clinically invaluable.”
Bones are made of dynamic tissues that are constantly in a process of remodeling. Specialized bone cells remove old bone and replace it with new bone. With age, the scales start to tip, and the body breaks down bone faster than it can build it.
The accelerated loss of bone can lead to osteoporosis, a disease where bone becomes less dense and bone structure changes, making it weaker and at greater risk of fracture. Over 10 million Americans have the condition, according to the Centers for Disease Control and Prevention, and it’s more common in women compared to men and in older adults.
It’s of particular concern among postmenopausal women, the researchers said. Estrogen, a hormone critical for bone health, declines during this life phase and lower levels of estrogen hastens the loss of bone density. While there is medication available to treat osteoporosis, the researchers said that many women who should be taking it aren’t.
Prunes offer a promising alternative, according to De Souza. They contain bioactive compounds like polyphenols that may blunt the inflammatory pathways that lead to bone loss.
Prior studies primarily used dual energy X-ray absorptiometry (DXA) to evaluate 2D bone mass density and to diagnose osteoporosis. But DXA scans can’t distinguish between different types of bone tissue or measure the structural properties of bone, which can serve as a proxy for bone strength and quality, the researchers explained.
“When we look at bone mineral density, we’re looking at how much bone there is, but we also want to know about the quality of the bone. When we look at a three-dimensional picture, we can look at bone structure, geometry and micro-architecture. In other words, it tells us how good the bone is,” De Souza said.
To see whether daily prune consumption influenced bone quality, the research team conducted a 12-month randomized controlled trial with 235 postmenopausal women. Participants were assigned to one of three groups: no prunes; 50 grams (about four to six prunes) daily; or 100 grams (about 10 to 12 prunes) daily. Every six months, they were assessed using a peripheral quantitative computed tomography, or pQCT, scan, which allows for cross-sectional imaging to measure 3D bone mass density, bone geometry and bone strength.
Over the course of one year, the researchers found that measures of bone mass density and bone strength at the tibia, or shin bone, all decreased in women in the control group. In contrast, those who ate at least four to six prunes every day maintained bone density and bone strength and preserved bone structure, particularly in cortical bone. While women in both prune groups saw benefit, four to six prunes a day may be the more feasible dose. Women in the 100-gram group dropped out of the study at a higher rate because they got bored of incorporating so many prunes into their daily diet.
“It’s pretty exciting data for a 12-month study,” De Souza said. “We were able to maintain and preserve bone at the weight-bearing, cortical bone of the tibia and the maintenance of cortical bone and bone strength is key to avoiding fracture.”
Prune consumption could also potentially reduce the risk of osteoporosis, De Souza explained, but more research is needed.
With this paper, the research team has built on a suite of studies that investigate the relationship between prunes and bone health. In a prior study with the same cohort of women, the research team demonstrated that daily prune consumption for a year also preserved total bone mass density at the hip. They’ve also investigated potential mechanisms behind prunes’ bone protective effects, including how prunes influence bacteria in the gut microbiome. De Souza said they hope to continue to expand on these findings in future studies.
Peek in medicine cabinets across the U.S. and you’ll find stacks of leftover COVID-19 tests.
When symptoms arise, so do questions: When should I test? How accurate is it really? And what should I do if I test positive?
In a paper published June 14 in the journal Science Advances, CU Boulder researchers unveil a new mathematical model to quickly answer such questions, not only for COVID but also for emerging rapid tests for respiratory syncytial virus (RSV), the flu and other infectious diseases.
One key takeaway: Advice can differ widely depending on the bug.
“For COVID, we found that if you only have one test, it’s best to wait two days after symptoms arise to use it, because the virus is unlikely to be detectable until then,” said first author Casey Middleton, a doctoral student in the department of Computer Science and the IQ Bio program. “For flu and RSV, you’re best off to take that rapid test when you first feel symptoms.”
A new generation of all-in-one tests
Middleton and senior author Daniel Larremore, a professor of computer science at the BioFrontiers Institute, developed the model to address several challenges that have emerged with the post-pandemic proliferation of rapid tests.
In recent years, companies have rolled out “all-in-one” tests that check for SARS-CoV-2 (the virus that causes COVID-19), influenza A and B, and RSV simultaneously, and some doctors’ offices and pharmacies offer a combo, while-you-wait option.
Meanwhile, at-home COVID testing has become the norm, with people routinely self-collecting nasal swabs to protect friends and family.
“If you’re trying to make a decision about whether to go to book club or go to Bingo night with the grandparents, testing is a really good idea,” said Larremore, whose lab combines computer science, math, epidemiology and biology to address public health challenges. “But COVID has changed, each variant behaves differently and that means the way that they interact with tests may be different.”
When he and Middleton plugged information about Omicron variants, patient behavior and other factors into their new computational model, it revealed that if a person with COVID tests immediately with a rapid test when symptoms emerge, they receive a false negative as much as 92% of the time. Waiting two days after symptoms brings that rate down to 70%. For those who can afford to take a second test on day 3, the false negative rate dips lower, with the tests catching about a third of infections.
That’s because, with most people already previously exposed, their immune systems are primed to react upon seeing COVID again, and that immune response itself causes symptoms. In addition, new variants in folks with some immunity grow slightly more slowly than the original strain.
“Our symptoms are happening sooner, but it takes longer to reach enough virus in your body for it to be detectable,” said Middleton.
With RSV and flu, on the other hand, the virus multiplies so quickly that once symptoms set in, there’s already plenty to make a test show up positive.
“This is the conundrum,” said Larremore. “If you go in right away and test for all three, you can learn a lot from the flu and RSV tests, but you may have swung too early for COVID. If you wait a few days, the timing might be right to catch COVID but you are too late for flu and RSV.”
While a 66% false negative rate may seem high for a COVID test, Larremore notes that the tests are designed to identify folks who have a high viral load and are, thus, most likely to infect others.
“Diagnosing only one third of infections can still cut transmission substantially if we’ve diagnosed the most infectious third,” he said.
Rethinking isolation
Assuming that enough at-home tests are available, their study also suggests that a “test to exit” strategy—in which people test again before determining whether to return to work and socialize—can prevent more COVID infections with less inconvenience than the five-day isolation policy that was standard Centers for Disease Control advice until March.
“The five-day isolation policy made people isolate for too long in most cases,” said Middleton. “Test-to-exit does a good job releasing people early who aren’t going to transmit but holding those who still have high amounts of virus.”
Larremore’s previous research was instrumental in informing how COVID-19 vaccines were distributed early in the pandemic and for helping to convince policymakers to prioritize rapid testing.
He and Middleton hope that their new model can help companies develop better tests, help clinicians give better advice and, should another pandemic arise, enable policy-makers to offer swift, data-driven guidance on testing.
Researchers from Johns Hopkins Medicine and the National Institutes of Health’s National Institute on Aging say their study of 40 older adults with obesity and insulin resistance who were randomly assigned to either an intermittent fasting diet or a standard healthy diet approved by the U.S. Department of Agriculture (USDA) offers important clues about the potential benefits of both eating plans on brain health.
The results revealed that both types of diet plans decreased insulin resistance and improved cognition, with gains in memory and executive function on both diets but stronger gains with the intermittent fasting diet, according to Mark Mattson, Ph.D., adjunct professor of neuroscience at the Johns Hopkins University School of Medicine and former chief of the laboratory of neurosciences at the National Institute on Aging in Baltimore. “Other scientists may want to incorporate the (brain) markers (we used) into additional, larger studies of diet and brain health,” Mattson says.
In a new article, researchers at the University of Illinois Chicago debunk four common myths about the safety of intermittent fasting.
Intermittent fasting has become an increasingly popular way to lose weight without counting calories. And a large body of research has shown it’s safe. Still, several myths about fasting have gained traction among clinicians, journalists and the general public: that fasting can lead to a poor diet or loss of lean muscle mass, cause eating disorders, or decrease sex hormones.
In a new commentary in Nature Reviews Endocrinology, UIC researchers debunk each of these. They base their conclusions on clinical studies, some of which they conducted and some done by others.
“I’ve been studying intermittent fasting for 20 years, and I’m constantly asked if the diets are safe,” said lead author Krista Varady, professor of kinesiology and nutrition at UIC. “There is a lot of misinformation out there. However, those ideas are not based on science; they’re just based on personal opinion.”
There are two main types of intermittent fasting. With alternate-day eating, people alternate between days of eating a very small number of calories and days of eating what they want. With time-restricted eating, people eat what they want during a four- to 10-hour window each day, then don’t eat during the rest of the day. The researchers conclude both types are safe despite the popular myths.
Here’s a look at their conclusions:
Intermittent fasting does not lead to a poor diet: The researchers point to studies showing that intake of sugar, saturated fat, cholesterol, fiber, sodium and caffeine does not change during fasting compared with before a fast. The percentage of energy consumed as carbohydrates, protein and fat doesn’t change, either.
Intermittent fasting does not cause eating disorders: None of the studies show that fasting caused participants to develop an eating disorder. However, all the studies screened out participants who had a history of eating disorders, and the researchers say that those with a history of eating disorders should not try intermittent fasting. They also urge pediatricians to be cautious when monitoring obese adolescents if they start fasting, because this group has a high risk of developing eating disorders.
Intermittent fasting does not cause excessive loss of lean muscle mass: The studies show that people lose the same amount of lean muscle mass whether they’re losing weight by fasting or with a different diet. In both cases, resistance training and increased protein intake can counteract the loss of lean muscle.
Intermittent fasting does not affect sex hormones: Despite concerns about fertility and libido, neither estrogen, testosterone nor other related hormones are affected by fasting, the researchers said.
Mindfulness – focusing on the present moment – can improve sleep, reduce stress and improve overall health. A new University of South Florida-led study helps explain why.
Researchers studied 144 nurses over two weeks to see how well they could stay focused on the present and how often they fixated on negative thoughts. The nurses completed surveys three times a day and reported their sleep quality the following morning.
The findings shed light on how mindfulness relates to emotion regulation, the way people handle stressful situations, such as a setback at work.
“Mindfulness is often seen as a magical cure-all for employee stress,” Smith said. “The way it’s often spoken about makes it seem as if staying grounded in and accepting of the present moment means you will never be stressed. To me, it’s crucial to add more nuance.”
That’s where the study comes in by providing insight into how the connection between mindfulness and emotion regulation affects sleep quality.
“We know that good sleep restores us physically and psychologically, and it keeps us happier, safer and even more ethical at work,” Smith said. “We wanted to explore which aspects of sleep are influenced by mindfulness and why.”
Smith’s team included three USF colleagues and two Penn State researchers. It was published recently in the journal Health Psychology.
The researchers focused on nurses due to their long, irregular hours and high-stress work environment, which often leads to sleep problems that can affect not only their health, but patient safety.
The study found that mindfulness helped the nurses experience fewer negative emotions and less rumination — repetitive negative thinking.
“For instance, if you got a negative performance review at work, you might choose to shift your focus from negative thoughts of how you have failed and are incompetent to positive thoughts of what you did right and how you can grow,” Smith said.
Smith and her co-authors believe the findings could help employers make better decisions about implementing strategies to boost their workers' health. Popular employer interventions include mindfulness-based stress reduction programs, along with yoga, meditation, tai chi and therapy. These programs have been shown to help employees manage stress and improve their overall well-being.
"Mindfulness is a hot topic, but we need to understand why it works," Smith said. "Our research is about going back to the drawing board to understand the reasons behind the benefits of mindfulness at work.”
The authors acknowledge the need for further studies to explore the best methods for reducing work-related stress and how they apply across different occupations, including more traditional office settings outside of health care.
“We hope future research on mindfulness looks at not just big-picture results like better sleep or productivity but also how it affects things like handling emotions,” Smith said. “When an intervention doesn't work, it helps us understand where the problem is stemming from. When it does work, it tells us why.”
A survey representing about 150 million adults annually suggests that aspirin use for primary prevention of cardiovascular disease (CVD) remains prevalent among older adults, contrary to recommendations from the American College of Cardiology and the American Heart Association. According to the study authors, these findings highlight the urgent need for physicians to inquire about aspirin use and discuss the benefits and risks with older patients. The findings are published in Annals of Internal Medicine.
Researchers from Cleveland Clinic studied data from the National Health Interview Survey Sample Adult component (2012–2019 and 2021) to characterize trends in prevalence of aspirin use for CVD prevention. Participants aged 40 years or older were asked to report aspirin use and were stratified by age group and CVD status based on self-reported history of stroke, myocardial infarction, coronary artery disease, or angina. The data showed that aspirin use declined from 2018 to 2019 after new evidence prompted the American College of Cardiology and the American Heart Association to recommend against aspirin therapy for primary prevention in older adults. Still, even after this decline, nearly a third of adults aged 60 or older without CVD were still using aspirin in 2021, and nearly 1 in 20 were using it without medical advice. Overall, 25.6 million adults in the U.S. reported aspirin use, with 18.5 million adults aged 60 years or older using aspirin in 2021. The findings suggest a need to reduce inappropriate use of aspirin among older adults.
In a study of loneliness and stroke risk over time among adults ages 50+, those who experienced chronic loneliness had a 56% higher risk of stroke than those who consistently reported not being lonely.
Those who experienced situational loneliness did not have an elevated risk of stroke—suggesting that the impact of loneliness on stroke risk occurs over the longer term.
Boston, MA—Chronic loneliness may significantly raise older adults’ risk of stroke, according to a new study led by Harvard T.H. Chan School of Public Health.
“Loneliness is increasingly considered a major public health issue. Our findings further highlight why that is,” said lead author Yenee Soh, research associate in the Department of Social and Behavioral Sciences. “Especially when experienced chronically, our study suggests loneliness may play an important role in stroke incidence, which is already one of the leading causes of long-term disability and mortality worldwide.”
The study will be published June 24 in eClinicalMedicine.
While previous research has linked loneliness to higher risk of developing cardiovascular diseases, few have examined the impact on stroke risk specifically. This study is one of the first to examine the association between loneliness changes and stroke risk over time.
Using 2006-2018 data from the Health and Retirement Study (HRS), the researchers assessed the association between changes in loneliness and stroke incidence over time. During 2006-2008, 12,161 participants—all adults ages 50 and above who never had a stroke—responded to questions on the Revised UCLA Loneliness Scale, from which the researchers created summary loneliness scores. Four years later (2010-2012), 8,936 participants who remained in the study responded to the same questions again. The researchers then placed the participants into one of four groups according to their loneliness scores across the two time points: “consistently low” (those who scored low on the loneliness scale at both baseline and follow-up); “remitting” (those who scored high at baseline and low at follow-up); “recent onset” (those who scored low at baseline and high at follow-up); and “consistently high” (those who scored high at both baseline and follow-up).
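The four-group assignment described above can be sketched as a simple rule over the two scores. The cutoff below is a placeholder of our own; the study derived high and low scores from the Revised UCLA Loneliness Scale, not from this threshold.

```python
HIGH_CUTOFF = 6  # hypothetical threshold on the summary loneliness score

def loneliness_group(baseline: int, followup: int) -> str:
    """Assign a loneliness trajectory group from scores at two time points."""
    high_t1 = baseline >= HIGH_CUTOFF
    high_t2 = followup >= HIGH_CUTOFF
    if high_t1 and high_t2:
        return "consistently high"
    if high_t1:
        return "remitting"      # high at baseline, low at follow-up
    if high_t2:
        return "recent onset"   # low at baseline, high at follow-up
    return "consistently low"

print(loneliness_group(2, 3))  # low at both time points
print(loneliness_group(7, 8))  # high at both time points
```

In the study's findings, only the group this rule labels "consistently high" showed the 56% elevated stroke risk.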
Among the participants whose loneliness was measured at baseline only, 1,237 strokes occurred during the follow-up period (2006-2018). Among the participants who provided two assessments of loneliness over time, 601 strokes occurred during the follow-up period (2010-2018). The researchers analyzed each group’s risk of stroke over the follow-up period in the context of their experiences with loneliness, controlling for other health and behavioral risk factors. These included social isolation and depressive symptoms, which are closely related but distinct from loneliness.
The findings showed a link between loneliness and higher risk of stroke and found that chronic loneliness heightened risk the most. When loneliness was assessed at baseline only, the participants considered lonely had a 25% higher risk of stroke than those not considered lonely. Among the participants who reported loneliness at two time points, those in the “consistently high” group had a 56% higher risk of stroke than those in the “consistently low” group, even after accounting for a broad range of other known risk factors. While the baseline analyses suggest loneliness at one time point was associated with higher risk, those who experienced remitting or recent onset loneliness did not show a clear pattern of increased risk of stroke—suggesting that loneliness’ impact on stroke risk occurs over the longer term.
“Repeat assessments of loneliness may help identify those who are chronically lonely and are therefore at a higher risk for stroke. If we fail to address their feelings of loneliness, on a micro and macro scale, there could be profound health consequences,” said Soh. “Importantly, these interventions must specifically target loneliness, which is a subjective perception and should not be conflated with social isolation.”
The authors noted that further research examining both nuanced changes in loneliness over the short-term, as well as loneliness patterns over a longer period of time, may help shed additional light on the loneliness-stroke association. They also noted that more research is needed to understand the potential underlying mechanisms, and that the study findings were limited to middle-aged and older adults and may not be generalizable to younger individuals.
New research from Edith Cowan University (ECU) has found that nitrate from plant sources is associated with a lower risk of mortality, while nitrate from other sources, such as animal-based foods, processed meat and tap water, is linked to a higher risk of mortality.
Nitrate, a compound found in vegetables, meat, and drinking water, has been the subject of debate due to its potential impact on health. Emerging evidence suggests that dietary nitrate may play a role in preventing cardiovascular disease (CVD), dementia, and diabetes. However, concerns about a potential link between nitrate ingestion and cancer have led to uncertainties surrounding the consumption of high-nitrate leafy green vegetables.
ECU’s Dr Nicola Bondonno led the project which has found that among 52,247 participants of the Danish Diet Cancer and Health Study, moderate to high intakes of plant and vegetable sourced nitrate were associated with a 14% to 24% lower risk of all-cause, CVD-related, and cancer-related mortality.
While the research could not establish plant-based nitrate as the sole contributor to these benefits, given that plants and vegetables contain a range of other protective compounds that are themselves associated with a lower risk of CVD, cancer and mortality, it underscored the value of higher intakes of nitrate-rich vegetables in mitigating mortality risk.
The research also added to the growing evidence that there was no cause for concern regarding cancer risks from the consumption of nitrate-rich vegetables such as leafy green vegetables and beetroot.
Conversely, higher intakes of naturally occurring animal-sourced nitrate were associated with a 9% and 12% higher risk of all-cause, and CVD-related mortality, respectively. Higher intakes of naturally occurring animal-sourced nitrite, a compound formed from nitrate, were associated with a 25%, 29% and 18% higher risk of all-cause, CVD-related, and cancer-related mortality, respectively.
Meanwhile, higher intakes of nitrate and nitrite from processed meat sources were associated with a 12% to 22% higher risk of all-cause and cancer-related mortality, while only additive-permitted meat-sourced nitrite was positively associated with CVD-related mortality.
Participants with a higher intake of tap water-sourced nitrate had a higher risk of all-cause and CVD-related mortality but not cancer-related mortality.
Dr Bondonno, who is currently based at the Danish Cancer Institute, said that the source of the nitrate determined the body’s reaction to the nitrate.
“In simplistic terms, nitrate can go down two different pathways when introduced into the body. One is to form a compound called nitric oxide, which has been shown to improve blood flow, lower blood pressure, and support overall cardiovascular health.
“But nitrate may also go down a second pathway, forming a group of compounds called nitrosamines, which are considered to be carcinogenic and are linked to cancer. It is thought that the antioxidant compounds in vegetables push nitrate towards the first pathway.”
The advice resulting from the most recent research fits with what is commonly known about the optimal human diet: eat more plants, eat fewer animal products, and limit processed meats.
“The majority of fears around nitrate consumption have generally stemmed from concerns around cancer, but one of the most interesting findings from this research is that nitrate found in drinking water was more strongly linked to deaths from heart disease.
“Nitrate sourced from plants and vegetables is protective against the different kinds of mortality. But when nitrate comes from animal sources or tap water, it increases your risks, mainly of heart disease, but also of certain cancers.”
Avoiding bright light at night could be a simple way to reduce your risk of diabetes, a Flinders University study shows.
The study, published in the prestigious journal The Lancet Regional Health - Europe, reveals a compelling relationship between exposure to light and the risk of developing type 2 diabetes.
Type 2 (acquired) diabetes is a chronic condition that affects how the body uses insulin. It develops over many years, is difficult to treat and is usually related to lifestyle factors such as inactivity and obesity.
“We found that exposure to brighter light at night was associated with a higher risk of developing type 2 diabetes,” says senior author Associate Professor Andrew Phillips from the College of Medicine and Public Health.
In the large modelling study, the research team investigated whether personal light exposure patterns predicted the risk of diabetes using data from approximately 85,000 people and around 13 million hours of light sensor data.
The participants – who did not have type 2 diabetes – wore devices on their wrist for one week to track their light levels throughout the day and night.
They were then tracked over the following nine years to observe whether they went on to develop type 2 diabetes.
“Light exposure at night can disrupt our circadian rhythms, leading to changes in insulin secretion and glucose metabolism,” he says.
“Changes in insulin secretion and glucose metabolism caused by disrupted circadian rhythms affect the body's ability to regulate blood sugar levels, which can ultimately lead to the development of type 2 diabetes.”
Having more exposure to light at night (between 12:30am and 6:00am) was linked to a higher risk of developing type 2 diabetes, and this was true regardless of how much light people were exposed to during the day.
The research accounted for other factors associated with type 2 diabetes, such as lifestyle habits, sleep patterns, shift work, diet, and mental health.
Even after taking these factors into account, the findings showed that getting more light at night was still a strong predictor of developing diabetes.
“The results showed that exposure to brighter light at night is associated with a higher risk of developing diabetes, with a dose-dependent relationship between light exposure and risk,” says Associate Professor Phillips.
“Our findings suggest that reducing your light exposure at night and maintaining a dark environment may be an easy and cheap way to prevent or delay the development of diabetes,” he adds.
A new study delves into the potential health risks posed by the release of volatile organic compounds (VOCs) from plastic water bottles when exposed to sunlight. The research, which systematically examined the composition and toxicity of VOCs emitted under ultraviolet-A (UV-A) and solar irradiation, underscores the need for safer storage practices to ensure drinking water safety.
Plastic water bottles are ubiquitous due to their convenience, yet they harbor potential risks. Sunlight exposure can lead these containers to degrade and emit volatile organic compounds (VOCs), which are potentially detrimental to human health. The booming bottled water market underscores the urgency for safer alternatives. In response to these concerns, there is a pressing need for in-depth research into more secure materials and production methods for water containers.
New research (DOI: 10.1016/j.eehl.2024.01.005) by the Guangdong Key Laboratory of Environmental Pollution and Health, Jinan University, published in Eco-Environment & Health on 8 February 2024, provides fresh insights into how sunlight can transform plastic water bottles into sources of air pollution.
The research analyzed the VOCs released from six types of plastic water bottles subjected to UV-A and sunlight. Results showed that all tested bottles emitted a complex mixture of alkanes, alkenes, alcohols, aldehydes, and acids, with significant variations in VOC composition and concentration among the bottles. Notably, highly toxic VOCs, including carcinogens like n-hexadecane, were identified, highlighting serious health risks. Prolonged exposure scenarios indicated an increased concentration of VOCs, pointing to a growing cumulative risk.
Dr. Huase Ou, the lead researcher, remarked, "Our findings provide compelling evidence that plastic bottles, when exposed to sunlight, can release toxic compounds that pose health risks. Consumers need to be aware of these risks, especially in environments where bottled water is exposed to sunlight for prolonged periods."
This study not only casts light on the chemical stability of polyethylene terephthalate (PET) bottles but also carries significant implications for public health and safety regulations. Understanding the conditions under which these VOCs are released can guide the improvement of manufacturing practices and material selection for bottled water containers. Furthermore, it underscores the need for enhanced consumer awareness and stricter industry regulations to reduce exposure to these potentially harmful compounds.
A healthy diet that adheres to nutrition recommendations is associated with better blood glucose levels and a lower risk of prediabetes and type 2 diabetes, a new study from the University of Eastern Finland shows. This association was observed also in individuals with a high genetic predisposition to type 2 diabetes.
Type 2 diabetes is a strongly genetic disease that can be prevented or delayed with a healthy lifestyle, including diet and exercise.
“However, we haven’t really known whether a healthy diet is equally beneficial to all, i.e., to those with a low genetic risk and to those with a high genetic risk,” Doctoral Researcher Ulla Tolonen of the University of Eastern Finland says.
The cross-sectional study examined food consumption and blood glucose levels in more than 1,500 middle-aged and elderly men participating in the broader Metabolic Syndrome in Men Study, METSIM. Food consumption was measured using a food frequency questionnaire, and blood glucose levels were measured using a two-hour glucose tolerance test. In addition, study participants’ genetic risk of type 2 diabetes was scored based on 76 genetic variants associated with type 2 diabetes risk.
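The genetic risk score described above is, in general terms, a weighted sum of risk-allele counts across the scored variants. A minimal sketch of that idea follows; the variant names, weights, and genotypes below are invented for illustration and are not from the METSIM study.

```python
# Illustrative polygenic risk score: weighted sum of risk-allele counts.
# All variant IDs, effect sizes, and genotypes here are hypothetical.

def polygenic_risk_score(genotypes, weights):
    """genotypes: dict variant -> risk-allele count (0, 1, or 2);
    weights: dict variant -> effect size (e.g., log odds ratio).
    Missing genotypes are counted as 0 risk alleles."""
    return sum(w * genotypes.get(variant, 0) for variant, w in weights.items())

weights = {"rs0001": 0.12, "rs0002": 0.08, "rs0003": 0.30}   # invented effect sizes
genotypes = {"rs0001": 2, "rs0002": 1, "rs0003": 0}          # invented genotypes

print(round(polygenic_risk_score(genotypes, weights), 2))    # 0.32
```

In a real analysis the 76 variant weights would come from published genome-wide association results, and scores are typically standardised before stratifying participants into risk groups.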
The researchers identified two dietary patterns based on food consumption. A dietary pattern termed as “healthy” included, among other things, vegetables, berries, fruits, vegetable oils, fish, poultry, potatoes, unsweetened and low-fat yogurt, low-fat cheese and whole grain products, such as porridge, pasta and rice. This diet was associated with, e.g., lower blood glucose levels and a lower risk of prediabetes and type 2 diabetes.
The study also explored the effect of the genetic risk of type 2 diabetes on the associations with diet and glucose metabolism. The associations of a healthy diet with better glucose metabolism seemed to hold true for individuals with both a low and a high genetic risk of diabetes.
"Our findings suggest that a healthy diet seems to benefit everyone, regardless of their genetic risk," Tolonen concludes.
The findings were published in the European Journal of Nutrition.
Adults with a history of low back pain went nearly twice as long without a recurrence of their back pain if they walked regularly, a world-first study has found.
About 800 million people worldwide have low back pain, and it is a leading cause of disability and reduced quality of life.
Repeated episodes of low back pain are also very common, with seven in 10 people who recover from an episode going on to have a recurrence within a year.
Current best practice for back pain management and prevention suggests the combination of exercise and education. However, some forms of exercise are not accessible or affordable to many people due to their high cost, complexity, and need for supervision.
A clinical trial by Macquarie University’s Spinal Pain Research Group has looked at whether walking could be an effective, cost-effective and accessible intervention.
The trial followed 701 adults who had recently recovered from an episode of low back pain, randomly allocating participants to either an individualised walking program and six physiotherapist-guided education sessions over six months, or to a control group.
Researchers followed the participants for between one and three years, depending on when they joined, and the results have now been published in the latest edition of The Lancet.
The paper’s senior author, Macquarie University Professor of Physiotherapy, Mark Hancock, says the findings could have a profound impact on how low back pain is managed.
“The intervention group had fewer occurrences of activity limiting pain compared to the control group, and a longer average period before they had a recurrence, with a median of 208 days compared to 112 days,” Professor Hancock says.
“Walking is a low-cost, widely accessible and simple exercise that almost anyone can engage in, regardless of geographic location, age or socio-economic status.
“We don’t know exactly why walking is so good for preventing back pain, but it is likely to include the combination of the gentle oscillatory movements, loading and strengthening the spinal structures and muscles, relaxation and stress relief, and release of ‘feel-good’ endorphins.
"And of course, we also know that walking comes with many other health benefits, including cardiovascular health, bone density, healthy weight, and improved mental health.”
Lead author Dr Natasha Pocovi says in addition to providing participants with longer pain-free periods, the program was very cost-effective.
“It not only improved people’s quality of life, but it roughly halved both their need to seek healthcare support and the amount of time they took off work,” she says.
“The exercise-based interventions to prevent back pain that have been explored previously are typically group-based and need close clinical supervision and expensive equipment, so they are much less accessible to the majority of patients.
“Our study has shown that this effective and accessible means of exercise has the potential to be successfully implemented at a much larger scale than other forms of exercise.”
To build on these findings, the team now hopes to explore how they can integrate the preventive approach into the routine care of patients who experience recurrent low back pain.
Depletion of leg muscle strength is a strong predictor of death in older people, say researchers
Twelve months of heavy resistance training—exercise that makes muscles work against a force—around retirement preserves vital leg strength years later, show the follow-up results of a clinical trial, published online in the open access journal BMJ Open Sport & Exercise Medicine.
Depletion of leg muscle strength is regarded as a strong predictor of death in older people, so is important to maintain, say the researchers.
Skeletal muscle mass and function naturally decline with advancing age, ultimately often interfering with mobility and autonomy in older people, note the researchers.
Resistance training, which can involve weights, body weight, or resistance bands, can help to counteract this loss, but most of the published research has involved relatively short periods of time (6-9 months) to monitor its effects.
The researchers therefore wanted to know whether a year of supervised resistance training with heavy loads would make any difference over the longer term.
They followed up participants of the LIve active Successful Ageing (LISA) study, a large randomised controlled trial, the results of which showed that strength can be maintained over 12 months after 1 year of heavy resistance training.
At the time, participants who had recently retired and were healthy and active were stratified by sex, weight (BMI), and the ability to get up from a chair without assistance.
They were randomly assigned to 1 year of lifting heavy weights 3 times a week (149 participants); to moderate intensity training 3 times a week (154), involving circuits that incorporated body weight exercises and resistance bands; or to a comparison group (148). All were encouraged to maintain their usual levels of physical activity.
Bone and muscle strength and levels of body fat were measured in all the participants at the start of the trial, and then again after 1, 2, and 4 years.
After 4 years, 369 participants were available for assessment: 128/149 of those who had done the heavy weights resistance training; 126/154 of those completing moderate intensity training; and 115/148 of those in the comparison group. Eighty-two people had dropped out, primarily due to lack of motivation or illness.
On average, participants were aged 71 (range 64–75) at year 4; 61% were women; and they were still active based on their daily physical activity, which averaged nearly 10,000 steps, as recorded by activity tracker.
After 4 years, there was no difference among the three groups in leg extensor power (the ability to kick a pedal as hard and as fast as possible), handgrip strength (a measure of overall strength), or lean leg mass (leg mass excluding fat), with decreases in all three indicators across the board.
Leg strength, however, was still preserved at the same level in the heavy weights resistance training group, but fell in the moderate intensity training and comparison groups, possibly because of nervous system changes in response to resistance training, suggest the researchers. And this difference was statistically significant.
As to visceral fat—the fat that is stored internally around the organs—levels of this remained the same in the heavy weights resistance training and moderate intensity exercise groups, but increased in the comparison group.
This implies that some parameters may not depend on weight load or exercise intensity in the long term, suggest the researchers.
They acknowledge that the study participants were healthier and more active than average, despite 80% of them having at least one long-term condition, so aren’t necessarily representative of the population as a whole.
But they conclude: “This study provides evidence that resistance training with heavy loads at retirement age can have long-term effects over several years. The results, therefore, provide means for practitioners and policy-makers to encourage older individuals to engage in heavy resistance training.”
The study found that in animals, a high-fat diet disrupts resident gut bacteria, alters behavior and, through a complex pathway connecting the gut to the brain, influences brain chemicals in ways that fuel anxiety.
“Everyone knows that these are not healthy foods, but we tend to think about them strictly in terms of a little weight gain,” said lead author Christopher Lowry, a professor of integrative physiology at CU Boulder. “If you understand that they also impact your brain in a way that can promote anxiety, that makes the stakes even higher.”
Lowry’s team divided adolescent rats into two groups: Half got a standard diet of about 11% fat for nine weeks; the others got a high-fat diet of 45% fat, consisting mostly of saturated fat from animal products.
The typical American diet is about 36% fat, according to the Centers for Disease Control and Prevention.
Throughout the study, the researchers collected fecal samples and assessed the animals’ microbiome, or gut bacteria. After nine weeks, the animals underwent behavioral tests.
When compared to the control group, the group eating a high-fat diet, not surprisingly, gained weight. But the animals also showed significantly less diversity of gut bacteria. Generally speaking, more bacterial diversity is associated with better health, Lowry explained. They also hosted far more of a category of bacteria called Firmicutes and less of a category called Bacteroidetes. A higher Firmicutes to Bacteroidetes ratio has been associated with the typical industrialized diet and with obesity.
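Two of the summary statistics mentioned above can be made concrete: microbiome diversity is often quantified with the Shannon index over taxon proportions, and the Firmicutes-to-Bacteroidetes ratio is a simple quotient of abundances. The sketch below uses invented counts, not data from this study.

```python
import math

# Illustrative microbiome summary statistics; the counts are hypothetical.

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total)
                for c in counts.values() if c > 0)

counts = {"Firmicutes": 600, "Bacteroidetes": 200,
          "Actinobacteria": 100, "Proteobacteria": 100}

fb_ratio = counts["Firmicutes"] / counts["Bacteroidetes"]  # 3.0

print(round(shannon_diversity(counts), 3), fb_ratio)
```

A higher Shannon index indicates a more diverse community; in the study's terms, the high-fat group would show a lower index and a higher F/B ratio than controls.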
The high-fat diet group also showed higher expression of three genes (tph2, htr1a, and slc6a4) involved in production and signaling of the neurotransmitter serotonin—particularly in a region of the brainstem known as the caudal dorsal raphe nucleus (cDRD), which is associated with stress and anxiety.
While serotonin is often billed as a “feel-good brain chemical,” Lowry notes that certain subsets of serotonin neurons can, when activated, prompt anxiety-like responses in animals. Notably, heightened expression of tph2, or tryptophan hydroxylase, in the cDRD has been associated with mood disorders and suicide risk in humans.
“To think that just a high-fat diet could alter expression of these genes in the brain is extraordinary,” said Lowry. “The high-fat group essentially had the molecular signature of a high anxiety state in their brain.”
Lowry suspects that an unhealthy microbiome compromises the gut lining, enabling bacteria to slip into the body’s circulation and communicate with the brain via the vagus nerve, a pathway from the gastrointestinal tract to the brain.
“If you think about human evolution, it makes sense,” Lowry said. “We are hard-wired to really notice things that make us sick so we can avoid those things in the future.”
Lowry stresses that not all fats are bad, and that healthy fats like those found in fish, olive oil, nuts and seeds can be anti-inflammatory and good for the brain.
His advice: Eat as many different kinds of fruits and vegetables as possible, add fermented foods to your diet to support a healthy microbiome and lay off the pizza and fries. Also, if you do have a hamburger, add a slice of avocado. Some research shows that good fat can counteract some of the bad.