Wednesday, October 30, 2024

A positive outlook on aging = better cognitive performance


People with more positive aging expectations rate their cognitive function better and report less perceived cognitive decline, according to a new study

Peer-Reviewed Publication

Penn State

 Getting older brings certain expectations, from gray hair and wrinkles to more bouts of forgetfulness. While these beliefs may seem harmless, whether a person views these changes in a positive or negative light may influence how they perceive their cognitive abilities, according to a new study from researchers in the Penn State College of Nursing.

The team found that people who had more positive expectations of aging tended to report less frequent cognitive problems, such as difficulty concentrating or keeping track of what they were doing. They were also less likely to report that their cognitive performance had declined over time.

The findings were published in the journal Aging & Mental Health.

“Aging expectations are malleable and influence an individual’s perceptions of their cognitive functioning,” said Nikki Hill, associate professor in the Ross and Carol Nese College of Nursing at Penn State, who is first author on the paper. “Modifying older adults’ aging expectations could support healthier cognitive aging through increased awareness and accurate assumptions about the aging process.”

Previous research has found that expectations about aging, such as whether a person expects to maintain high levels of activity or if they expect everything to go downhill, are associated with health. Those with more negative aging expectations tend to experience worse outcomes, such as more rapid physical and cognitive decline, while positive perceptions of aging are linked to behaviors that promote health and wellbeing like exercise.

Hill is interested in understanding how older adults experience cognitive changes and how that influences outcomes related to aging. In her work, she said she’s noticed that when people describe their experiences, they often include stereotypical and stigmatized beliefs about aging and cognitive decline. It led Hill to wonder how people’s expectations about the aging process may influence how they interpret cognitive changes they may experience — a relationship that few studies have examined.

“Do people's perceptions of what they expect aging to be in the future, in terms of physical health, mental health, cognitive health, affect the way that they perceive their cognitive performance?” Hill said. “If it does, then that gives us more clues about how to interpret people's reports of cognitive changes and, potentially, how to intervene earlier to support people to maximize their aging outcomes.”

For example, people who are worried about perceived declines in their cognitive function — even if their cognitive health is normal — are at higher risk for developing a cognitive impairment in the future, Hill explained. She said that with conditions like Alzheimer’s disease and related dementias, there’s a slow, gradual decline in cognitive function over decades and people often experience subtle symptoms before clinical tests identify an impairment in cognition.

The research team conducted an online survey among individuals aged 65 and older in the United States who lived independently and didn’t report any diagnosis of dementia or other cognitive impairment. A total of 581 people completed the survey; 51% of the respondents were women and 74% were non-Hispanic white.

The survey asked about their expectations about physical health, mental health and cognitive function in relation to aging. They were asked to rate statements — for example, “every year that people age, their energy levels go down a little more” — on a four-point scale from “definitely true” to “definitely false.” To assess their perceptions of their own cognition, participants were asked about their cognitive abilities over the last seven days. They were also asked about their ability to perform certain tasks today compared to 10 years ago to assess whether they believed their cognitive abilities had declined.

The team found that people who had more positive expectations of aging tended to rate their cognitive function better and report less perceived decline in their cognitive abilities, both over the last week and over the last 10 years. On the other hand, more negative expectations of aging were linked to more negative perceptions of current cognitive performance and to greater perceived cognitive decline.

The researchers also found that the relationship held regardless of whether participants’ expectations concerned their physical, mental or cognitive health: people with positive aging expectations in any of the three domains were more likely to rate their cognition higher, while people with negative expectations rated their cognition lower.

“If we can intervene in a way to ground aging expectations more in what is true and less stigmatized, then maybe we can help people clarify what they're experiencing in terms of cognitive changes, which will support our ability to respond to individual needs for maximizing cognitive health,” Hill said.

Hill said that the team plans to conduct more research to understand this complex relationship, such as how beliefs about aging influence whether older adults report the cognitive changes they’re experiencing, and how healthcare providers engage patients in conversations about cognitive health.

 

‘Weekend warrior’ exercise pattern may equal more frequent sessions for lowering cognitive decline risk


Just one or two sessions of physical activity at the weekend, a pattern of exercise dubbed ‘weekend warrior’, may be just as likely to lower the risk of cognitive decline, which can often precede dementia, as more frequent sessions, concludes research published online in the British Journal of Sports Medicine.

And it may be more convenient and achievable for busy people as well, suggest the researchers.

It’s important to identify potentially modifiable risk factors for dementia because a 5-year delay in onset might halve its prevalence, they say, adding that nearly all the evidence to date comes from studies in high-income countries.

They therefore drew on two sets of survey data from the Mexico City Prospective Study, the first of which took place between 1998 and 2004, and the second of which took place between 2015 and 2019. 

Some 10,033 people (average age 51) completed both surveys and their responses were included in the analysis.

For the first survey, respondents were asked whether they exercised or played sports, and if so, how many times a week, and for how long (in minutes). 

Four groups were derived from the responses: the no exercisers; the ‘weekend warriors’ who exercised/played sports once or twice a week; the regularly active who did so three or more times a week; and a combined group comprising weekend warriors and the regularly active.

The Mini Mental State Exam (MMSE) was used to assess cognitive function at the time of the second survey. A score of 22 or less out of 30 was used to define mild cognitive impairment (MCI).

In all, 7,945 respondents said they didn’t exercise at all; 726 fulfilled the definition of a weekend warrior; 1,362 said they exercised several times a week; and 2,088 made up the combined group.

During an average monitoring period of 16 years, 2,400 cases of MCI were identified. MCI prevalence was 26% among the no exercisers; 14% among the weekend warriors; and 18.5% among the regularly active. 

After taking account of potentially influential factors including age, educational attainment, smoking, nightly sleep, diet and alcohol intake, weekend warriors were 25% less likely to develop MCI than the no exercisers, while the regularly active were 11% less likely to do so. Those in the combined group were 16% less likely to do so.

When MCI was defined as an MMSE score of 23 or below, 2,856 cases were identified. And MCI prevalence rose to 30% among the no exercisers, 20% among the weekend warriors, and 22% among the regularly active.

Compared with the no exercisers, weekend warriors were 13% less likely to develop MCI, while the regularly active and those in the combined group were 12% less likely to do so. The results were similar for both men and women.

The researchers estimated that, in theory, 13% of cases might be avoided if all middle-aged adults exercised at least once or twice a week. 
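For readers curious how such a figure can arise, here is a minimal sketch of a population attributable fraction (PAF) calculation in Python. The mapping of the press-release numbers onto Levin’s formula is our illustrative assumption, not necessarily the method the study authors used:

```python
# Illustrative sketch: population attributable fraction (PAF) for MCI,
# treating "no exercise" as the exposure. Numbers come from this press
# release; the paper's own method (e.g., adjusted hazard ratios) may differ.

non_exercisers = 7945       # respondents who said they didn't exercise
total = 10033               # all respondents in the analysis

p = non_exercisers / total  # prevalence of the exposure (~0.79)

# The combined active group was 16% less likely to develop MCI, so the
# relative risk of not exercising vs. exercising is roughly 1 / (1 - 0.16).
rr = 1 / (1 - 0.16)

# Levin's formula for the share of cases attributable to the exposure.
paf = p * (rr - 1) / (1 + p * (rr - 1))
print(f"PAF ≈ {paf:.0%}")   # prints "PAF ≈ 13%", matching the estimate above
```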

This is an observational study, so no firm conclusions can be drawn about causal factors. And the researchers acknowledge various limitations to their findings. For example, the survey respondents might not have been truly representative of middle-aged adults, and there were no objective measures of physical activity.

But there are several possible explanations for the seemingly protective effect of exercise on brain health, they explain.

“For example, exercise may increase brain-derived neurotrophic factor concentrations [molecules that support the growth and survival of neurons] and brain plasticity. Physical activity is also associated with greater brain volume, greater executive function, and greater memory,” they write.

“To the best of our knowledge, the present study is the first prospective cohort study to show that the weekend warrior physical activity pattern and the regularly active physical activity pattern are associated with similar reductions in the risk of mild dementia,” they continue.

And they go on to suggest that the findings “have important implications for policy and practice because the weekend warrior physical activity pattern may be a more convenient option for busy people in Latin America and elsewhere.” 

Tuesday, October 29, 2024

RSV vaccines effective, but more people need to get them


Peer-Reviewed Publication


Since their introduction last year, researchers have been monitoring the real-world impact of the new respiratory syncytial virus (RSV) vaccines. In a recent commentary in The Lancet, Angela Branche, MD, an infectious diseases researcher at the University of Rochester Medical Center (URMC), details what has been learned during the vaccine’s first season.

“The evidence is clear; individuals should get vaccinated if they have conditions that place them at risk for severe disease. For older adults and those with chronic conditions, RSV should be considered as serious as the flu, and they should get vaccinated,” said Branche.

RSV is a significant cause of severe respiratory illness among older adults, especially those with underlying health conditions. Worldwide, RSV causes millions of infections, hundreds of thousands of hospitalizations, and tens of thousands of deaths annually in adults aged 60 and older. In the US, adults over 65 experience high rates of RSV-related hospital visits, intensive care unit admissions, and deaths. Older people with RSV are at higher risk of severe illness compared to those with influenza or COVID.

Vaccines Protect Against Severe Symptoms and Keep People Out of the Hospital

In 2023, the FDA approved three RSV vaccines for older adults. Studies have shown these vaccines to be effective, with the Pfizer, GSK, and Moderna vaccines preventing RSV pneumonia and bronchitis in more than 80 percent of participants.

A recent study published in The Lancet assessed the effectiveness of RSV vaccines using data from a large electronic health record network involving the Centers for Disease Control and Prevention (CDC) and multiple US healthcare systems. The study found that RSV vaccines were 80 percent effective in preventing hospitalization, ICU admission, and death among adults aged 60 and older. Vaccine effectiveness was consistent across age groups, including those 75 and older, and among immunocompromised individuals. The study did not find evidence of waning vaccine protection within the season.

However, the uptake of the RSV vaccine in the 2023-2024 winter season was low. An estimated 24 percent of US adults aged 60 years and older received the vaccine, compared to influenza vaccination rates, which approach 50 percent each year for the same group. “Providers were not sure how to apply the shared clinical decision-making recommendations in the first season, and there remains a general lack of knowledge among the medical community and the public on what constitutes a risk for severe disease and who needs to be protected,” said Branche.

Boosting Rates and Better Vaccines

Based on these findings, the US Advisory Committee on Immunization Practices (ACIP), a group of medical and public health experts that advises the CDC, updated its guidelines in June 2024 to recommend RSV vaccination for all adults aged 75 and older, as well as for those 60 and older who live in long-term care facilities or have chronic, high-risk health conditions.

“This new data enabled the ACIP to make more definitive recommendations, which will build public confidence in the effectiveness of these vaccines and make implementation a lot easier for providers and pharmacies,” said Branche.

New research shows that vaccines that target multiple strains of RSV, called bivalent vaccines, may provide longer protection. URMC infectious disease experts Edward Walsh, MD, and Ann Falsey, MD, helped lead an international study of a bivalent RSV vaccine developed by Pfizer, the results of which were recently detailed in the New England Journal of Medicine. The vaccine effectively prevented severe RSV-related lower respiratory tract illnesses over two RSV seasons, with an overall efficacy of more than 80 percent. The experimental vaccine was particularly effective in individuals aged 60-79.

Don’t skip colonoscopy for new blood-based colon cancer screening


Newly available blood tests to screen for colorectal cancer sound far more appealing than a standard colonoscopy. Instead of clearing your bowels and undergoing an invasive procedure, the tests require only a simple blood draw. But are the tests effective?

A study led by researchers at Stanford Medicine concluded that the new tests are ideal for people who shy away from other colorectal cancer screening. However, if too many people who would have undergone colonoscopies or stool-based tests switch to the blood tests, colorectal cancer death rates will rise. Because the more established colonoscopies and stool tests are more effective at detecting early cancers and precancerous polyps than the emerging blood tests, their long-term impact is projected to be substantially greater than that of blood tests, the researchers found.

“The first generation of blood tests are a really exciting development in the colorectal cancer screening paradigm,” said Uri Ladabaum, MD, a professor of gastroenterology and the first author of the paper, to be published Oct. 28 in Annals of Internal Medicine. “But for now, if you’re willing and able to do a colonoscopy or stool-based test, don’t switch to a blood test.”

Ladabaum also pointed out that, at a population level, the blood tests will be effective at reducing colorectal cancer deaths only if people who reliably take the test every three years subsequently agree to receive a follow-up colonoscopy if the blood test returns a positive result.

Weighing the options

With the current screening rates in the population, about 4% of all American adults will be diagnosed with colorectal cancer at some point in their lifetimes. Regular screening can help identify early cancers and precancerous polyps and reduce a person’s risk of developing, and dying from, colorectal cancer. The U.S. Preventive Services Task Force recommends that all adults between the ages of 45 and 75 be screened for colorectal cancer.

For decades, screening has required either a once-a-decade colonoscopy, in which a thin flexible tube with a camera is used to look inside a person’s large intestine, or a stool test every one to three years. During a colonoscopy, clinicians can not only detect colorectal cancers but also remove precancerous polyps, which can develop into cancers.

“This makes colonoscopy a unique cancer screening method because you also have the possibility of cancer prevention,” Ladabaum said. “Despite that, there are many people who are not getting screened at all, or who are not getting screened as often as they should.”

Data show that about 1 in 3 American adults in the recommended age range have never been screened for colorectal cancer, so clinicians are hoping that new methods could encourage them to undergo screening.

In 2014, the U.S. Food and Drug Administration approved the first multi-target stool-based colorectal screening test, in which stool collected at home by a patient every one to three years is analyzed for the presence of small amounts of blood or cancer DNA. This summer, the FDA approved a new method that looks for bits of cancer DNA circulating in a person’s bloodstream. These first-generation blood-based tests do not diagnose precancerous polyps well.

“This is a time of intense interest in the colorectal cancer screening field. The paradigm in screening could be changing,” Ladabaum said. “But conducting a randomized controlled trial directly comparing these emerging screening tests over the long term is unfeasible, which leaves patients in a difficult place when they’re weighing their options.”

Comparing effectiveness

Ladabaum and his collaborators collected previously published data on six commercially available or in-development blood- and stool-based screening tests as well as the gold-standard colonoscopy. Using this data, they modeled the relative rate of colorectal cancer and deaths among 100,000 average-risk people who used each screening approach.

Among 100,000 people who receive a colonoscopy every 10 years, 1,543 would develop colorectal cancer and 672 would die from the disease, they determined. For stool-based tests every one to three years (depending on test) the incidence of colorectal cancer ranged from 2,181 to 2,498 cases per 100,000 people, and deaths ranged from 904 to 1,025. For the new blood tests, recommended to be conducted every three years, the cases ranged from 4,310 to 4,365, and deaths ranged from 1,604 to 1,679 — about two and a half times as many deaths as in the colonoscopy group.

Among those who receive no screening, 7,470 would develop the cancer, and 3,624 would die from it.
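As a quick arithmetic check on the figures above, a few lines of Python reproduce the ‘about two and a half times’ comparison; the per-100,000 numbers are copied from this article, and nothing else is assumed:

```python
# Modeled colorectal cancer deaths per 100,000 people for each screening
# strategy, as reported above (low and high ends of each range).
deaths = {
    "colonoscopy every 10 years": (672, 672),
    "stool tests every 1-3 years": (904, 1025),
    "blood tests every 3 years": (1604, 1679),
    "no screening": (3624, 3624),
}

colonoscopy_deaths = deaths["colonoscopy every 10 years"][0]
for strategy, (low, high) in deaths.items():
    print(f"{strategy:28s} {low / colonoscopy_deaths:.2f}-"
          f"{high / colonoscopy_deaths:.2f}x the deaths with colonoscopy")
# blood tests come out at roughly 2.39-2.50x, i.e. about two and a half times
```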

Moreover, when the group looked at the costs associated with each test, they found that colonoscopies and stool-based tests were more cost-effective than the blood-based tests.

“The blood tests are certainly much better than nothing, but you’ll worsen the population outcomes and raise health care costs if you see people switching from colonoscopies to first-generation blood tests,” Ladabaum said.

Modeling patient choices

When Ladabaum’s group modeled the effect of patient choices on population-wide colorectal cancer rates, the best-case scenario was for most people to continue screening with colonoscopy or stool-based tests, with blood tests used only by people who would not otherwise be screened.

The research team said they need real-world data on patient choices about colorectal cancer screening to better refine their model on how the blood tests will affect cancer rates.

“It remains to be seen who will really use the blood tests,” Ladabaum said. “Will it be people who have never been screened using any other method? And will they be willing to get a follow-up colonoscopy if indicated?”

He also said blood tests could improve, and the current results would then not hold true for future generations of the tests.

For now, the researchers hope that patients — and clinicians — stick with the most effective screening methods currently available. 

“Ideally, we want as many people as possible to get screened for colorectal cancer, and that’s likely going to mean a combination of different tests being used across the population,” Ladabaum said.

Sunday, October 27, 2024

Skeletal muscle health amid growing use of weight loss medications


Editorial in The Lancet highlights the critical importance of muscle mass with weight loss medications

Peer-Reviewed Publication

Pennington Biomedical Research Center

A recent commentary published in The Lancet Diabetes & Endocrinology highlights the critical importance of skeletal muscle mass in the context of medically induced weight loss, particularly with the widespread use of GLP-1 receptor agonists. These medications, celebrated for their effectiveness in treating obesity, have raised concerns regarding the potential for substantial muscle loss as part of the weight loss process.

Dr. Steven Heymsfield, professor of metabolism and body composition, and Dr. M. Cristina Gonzalez, adjunct and visiting professor in metabolism-body composition, both of Pennington Biomedical Research Center, joined colleagues Dr. Carla Prado of the University of Alberta, and Dr. Stuart Phillips of McMaster University on authoring The Lancet commentary, titled “Muscle Matters: The Effects of Medically Induced Weight Loss on Skeletal Muscle.”

The authors emphasize that muscle loss, as measured by decreases in fat-free mass, can account for 25 to 39 percent of total weight lost over a period of 36 to 72 weeks. This rate of muscle decline is significantly higher than what is typically observed with non-pharmacological caloric restriction or normal aging and could lead to unintended negative health consequences.
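To make that range concrete, here is a small worked example; the 15 kg of weight loss is a hypothetical figure chosen for illustration, not a value from the commentary:

```python
# Hypothetical example: fat-free (largely muscle) mass lost when 25-39%
# of total weight loss comes from fat-free mass, per the commentary.
weight_lost_kg = 15.0             # illustrative amount, not from the paper
low_frac, high_frac = 0.25, 0.39  # reported range of fat-free mass loss

print(f"Of {weight_lost_kg:.0f} kg lost, roughly "
      f"{weight_lost_kg * low_frac:.1f}-{weight_lost_kg * high_frac:.1f} kg "
      f"may be fat-free mass")    # roughly 3.8-5.9 kg
```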

Despite the promising metabolic benefits associated with GLP-1 receptor agonists, including improvements in fat-to-fat-free tissue ratios, the potential adverse effects of muscle loss are gaining attention. Skeletal muscle plays critical roles not only in physical strength and function but also in metabolic health and immune system regulation.

A decline in muscle mass has been linked to decreased immunity, increased risk of infections, poor glucose regulation, and other health risks. The authors suggest that muscle loss due to weight reduction may exacerbate conditions like sarcopenic obesity, which is prevalent among individuals with obesity and contributes to poorer health outcomes, including cardiovascular disease and higher mortality rates.

While the short-term effects of muscle loss on physical strength and function remain unclear, the commentary calls for future research to explore how reductions in muscle mass might improve muscle composition and quality. The authors stress the need for a multimodal approach to weight loss treatment, combining GLP-1 receptor agonists with exercise and nutritional interventions to preserve muscle mass.

“We have to be mindful of the side effects that we are seeing with the new weight loss medications, such as a person eating less while on the medications and not getting the appropriate amount of dietary vitamins and minerals,” Dr. Heymsfield said. “Also, when a person loses weight, they are not only losing fat, they also lose muscle. We are looking at how that muscle loss can be better managed with consumption of an adequate amount of protein along with an optimum amount of exercise.”

This evolving conversation underscores the importance of ensuring that weight loss interventions promote overall health, including muscle preservation, as part of a comprehensive strategy for treating obesity.

For more information, please refer to the full commentary in The Lancet at https://www.thelancet.com/journals/landia/article/PIIS2213-8587(24)00272-9/fulltext. 

Friday, October 25, 2024

Pistachios may help improve eye health

 

Peer-Reviewed Publication


Infographic: Pistachios: More Than Meets The Eye. Credit: American Pistachio Growers

A new study from researchers at the Friedman School of Nutrition Science and Policy at Tufts University has found that consuming pistachios daily may significantly improve eye health by increasing macular pigment optical density (MPOD), due to the plant pigment lutein, a key factor in protecting the eyes from blue (visible) light and age-related damage.

The randomized controlled trial showed that compared to eating a usual diet alone, eating 2 ounces (57 grams) of pistachios per day for 12 weeks as part of a usual diet resulted in a significant increase in MPOD in otherwise healthy middle-aged to older adults. MPOD is an important indicator of eye health, as it protects the retina and is linked to a reduced risk of age-related macular degeneration (AMD), a leading cause of blindness in older adults.

Findings from this research are timely: according to a national poll by the American Foundation for the Blind, Americans fear vision loss more than they fear other serious health problems.

Key Findings

  • Increased MPOD: Participants who consumed pistachios daily saw a significant rise in MPOD after just 6 weeks, with the effect sustained throughout the 12-week study.
  • Natural Lutein Source: Pistachios are the only nut that provides a measurable source of lutein, a powerful antioxidant that helps protect the eyes.
  • AMD Prevention Potential: The study suggests that regular pistachio consumption could offer a natural dietary approach to reducing the risk of AMD.

“Our findings indicate that pistachios are not only a nutritious snack, but they may also provide significant benefits for eye health,” said Dr. Tammy Scott, a research and clinical neuropsychologist and lead author of the study. “This is especially important as people age and face higher risks of vision impairment.”

Unique Role of Lutein from Pistachios in Eye Health

Lutein, found in pistachios, plays a critical role in maintaining eye health by filtering blue light and acting as an antioxidant in the eye. The study found that pistachio consumption nearly doubled participants’ daily intake of lutein, which is typically very low in most American diets, and significantly raised plasma levels of lutein.

Dr. Scott explains that participants were selected to have low habitual lutein intakes at baseline, and that just 2 ounces of pistachios per day rapidly increased blood lutein levels in only 6 weeks. “By simply incorporating a handful of pistachios into your diet, you can improve your intake of lutein, which is crucial for protecting your eyes,” notes Dr. Scott. She adds that pistachios provide a source of healthy fat, which may help the lutein they contain be better absorbed by the body.

In the study, pistachios provided about 1.6 mg of lutein per day, enough to double the average daily lutein consumption of U.S. adults. Lutein belongs to a class of plant pigments known as xanthophylls.

Broader Health Benefits of Lutein

Beyond supporting eye health, the lutein found in pistachios may also benefit brain function. “Lutein crosses the blood-brain barrier, where it may help reduce oxidative stress and inflammation,” notes Dr. Elizabeth Johnson, a co-investigator on the study.

As with the eye, lutein selectively accumulates in the brain and may play a role in reducing cognitive decline. Studies suggest higher lutein levels are associated with better cognitive performance, including memory and processing speed, making pistachios a valuable addition to a diet aimed at supporting overall healthy aging.

Intense exercise may suppress appetite in healthy humans

 

A vigorous workout does more to suppress hunger levels in healthy adults than does moderate exercise, and females may be especially susceptible to this response, according to a small study published in the Journal of the Endocrine Society.


The study examined the effects of exercise intensity on ghrelin levels and appetite in men and women. Ghrelin is known as the “hunger hormone” and is associated with perceptions of hunger.

“We found that high intensity exercise suppressed ghrelin levels more than moderate intensity exercise,” said lead author Kara Anderson, Ph.D., of the University of Virginia and the University of Virginia Health System in Charlottesville, Va. “In addition, we found that individuals felt ‘less hungry’ after high intensity exercise compared to moderate intensity exercise.”

Ghrelin circulates in acylated (AG) and deacylated (DAG) forms, both of which are known to affect appetite. Data on the impact of exercise intensity on AG and DAG levels, and on their effects on appetite, are sparse and primarily limited to males, the study noted.

To address this shortfall, the study examined eight males and six females. Participants fasted overnight and then completed exercises of varying intensity levels, determined by measurements of blood lactate, followed by self-reported measurements of appetite.

Females had higher levels of total ghrelin at baseline compared with males, the study noted. But only females demonstrated “significantly reduced AG” following the intense exercise, according to the findings.

“We found that moderate intensity either did not change ghrelin levels or led to a net increase,” the study noted. These findings suggest that exercise above the lactate threshold “may be necessary to elicit a suppression in ghrelin.”

Researchers also acknowledged that more work is needed to determine the extent to which the effects of exercise differ by sex.

Ghrelin has been shown to have wide-ranging biological effects in areas including energy balance, appetite, glucose homeostasis, immune function, sleep, and memory.

“Exercise should be thought of as a ‘drug,’ where the ‘dose’ should be customized based on an individual’s personal goals,” Anderson said. “Our research suggests that high-intensity exercise may be important for appetite suppression, which can be particularly useful as part of a weight loss program.”

Other study authors include Tana Mardian, Benjamin Stephenson, Emily Grammer, Macy Stahl, Nathan Weeldreyer, and Sibylle Kranz of the University of Virginia; Zhenqi Liu and Kaitlin Love of the University of Virginia Health System; and Jason Allen and Arthur Weltman of the University of Virginia and the University of Virginia Health System.

This research received financial support from the National Institute of Diabetes and Digestive and Kidney Diseases and the University of Virginia’s School of Education and Human Development.

The manuscript, “The Impact of Exercise Intensity and Sex on Endogenous Ghrelin Levels and Appetite in Healthy Humans,” was published online.

Reminders can eliminate age-related decline in memory


A to-do list is one way to easily reduce memory declines

Peer-Reviewed Publication

University of Texas at Arlington

Image: Hunter Ball (right), associate professor of psychology at UTA. Credit: Courtesy UT Arlington

A new study from UT Arlington reveals that setting reminders can eliminate some age-related declines in memory. The findings offer a significant breakthrough in addressing the cognitive challenges faced by older adults, particularly in the context of prospective memory, which is the ability to remember to perform an intended action at the right moment, like taking medication or attending appointments.

“Prospective memory is essential for daily living and maintaining independence, especially as people age,” said Hunter Ball, associate professor of psychology at UTA and lead author of the study. “Failing to remember these forward-looking tasks can lead to serious consequences, and previous research has shown that prospective memory tends to decline with age.”

Conducted with psychologists at UTA and Arizona State University, the study involved two experiments that tested prospective memory performance in younger and older adults under varying conditions with or without the aid of reminders. Participants were asked to remember specific tasks while completing ongoing activities, and their performance was measured in both high-load (more items to remember) and low-load conditions (fewer items to remember).

In the first experiment, participants were given specific tasks to remember, such as responding to certain words, and some were provided with reminders displayed on-screen. The results showed no significant age-related decline in prospective memory without reminders under low load, but under high load, both younger and older adults benefitted equally from using reminders. This suggests that reminders can help reduce cognitive strain by making memory retrieval less reliant on internal memory processes.

The second experiment introduced more complex, nonspecific tasks that required participants to recognize categories, such as animals or fruits, rather than specific words. Older adults experienced more difficulties in remembering these nonspecific tasks under high memory load without reminders, but these age-related performance gaps were eliminated entirely when reminders were available. This finding was critical as it highlighted the potential for reminders to counteract the deficits associated with more cognitively demanding tasks that typically strain older adults' memories.

Ball and his colleagues suggest that the effectiveness of reminders for older adults stems from their increased tendency to check reminders more frequently when faced with high cognitive demands. This compensatory behavior likely helps older adults manage tasks that would otherwise be too taxing on their internal memory resources.

The study’s implications extend beyond the laboratory, as prospective memory is a crucial cognitive function in real-life settings. As the population ages, finding practical solutions to memory problems is becoming increasingly important. The authors highlight that digital tools like smartphone apps, personal assistants like Amazon Alexa, or even simple reminder notes can serve as valuable aids for older adults in managing their daily tasks effectively and maintaining their independence.

“This new study demonstrates that cognitive offloading, specifically using reminders like cell phone calendars, can effectively mitigate these declines,” said Ball. “While our study was conducted in a controlled setting, these findings can easily be applied in real-world environments as an easy and effective way to alleviate the burden of prospective memory challenges in older adults.”

Thursday, October 24, 2024

Mayo Clinic study: What standing on one leg can tell you

 How long a person can stand — on one leg — is a more telltale measure of aging than changes in strength or gait, according to new Mayo Clinic research. The study appears today in the journal PLOS ONE.

Good balance, muscle strength and an efficient gait contribute to people's independence and well-being as they age. How these factors change, and at what rate, can help clinicians develop programs to ensure healthy aging. Individually, people can train their balance without special equipment and work on maintaining it over time.

In this study, 40 healthy, independent people over 50 underwent walking, balance, grip strength and knee strength tests. Half of the participants were under 65; the other half were 65 and older.

In the balance tests, participants stood on force plates in different situations: on both feet with eyes open, on both feet with eyes closed, on the non-dominant leg with eyes open, and on the dominant leg with eyes open. In the one-legged tests, participants could hold the leg they weren't standing on where they wanted. The tests were 30 seconds each.

Standing on one leg — specifically the nondominant leg — showed the highest rate of decline with age.

"Balance is an important measure because, in addition to muscle strength, it requires input from vision, the vestibular system and the somatosensory systems," says Kenton Kaufman, Ph.D., senior author of the study and director of the Motion Analysis Laboratory at Mayo Clinic. "Changes in balance are noteworthy. If you have poor balance, you're at risk of falling, whether or not you're moving. Falls are a severe health risk with serious consequences."

Unintentional falls are the leading cause of injuries among adults who are 65 and older. Most falls among older adults result from a loss of balance.

In the other tests:

  • Researchers used a custom-made device to measure participants' grip. For the knee strength test, participants were in a seated position and instructed to extend their knee as forcefully as possible. Both the grip and knee strength tests were on the dominant side. Grip and knee strength showed significant declines by decade but not as much as balance. Grip strength decreased at a faster rate than knee strength, making it better at predicting aging than other strength measures.
  • For the gait test, participants walked back and forth on an 8-meter, level walkway at their own pace and speed. Gait parameters didn't change with age. This was not a surprising result since participants were walking at their normal pace, not their maximum pace, Dr. Kaufman says.
  • The age-related declines in the strength tests did not differ by sex, indicating that male and female participants’ grip and knee strength declined at similar rates. Researchers also did not identify sex differences in the gait and balance tests, which suggests that male and female subjects were equally affected by age.

Dr. Kaufman says people can take steps to train their balance. For example, by standing on one leg, you can train yourself to coordinate your muscle and vestibular responses to maintain correct balance. If you can stand on one leg for 30 seconds, you are doing well, he says.

"If you don't use it, you lose it. If you use it, you maintain it," Dr. Kaufman says. "It's easy to do. It doesn't require special equipment, and you can do it every day."

Live well, think well: Research shows healthy habits tied to brain health

 

In middle-aged people, having poorly controlled risk factors like blood pressure, blood sugar and cholesterol, combined with not following certain healthy habits including exercise, diet and sleep, is linked to a higher risk of stroke, dementia or depression later in life, according to a study published in the October 23, 2024, online issue of Neurology®, the medical journal of the American Academy of Neurology. These results do not prove that not having healthy habits increases the risk of these conditions; they only show an association.


The eight cardiovascular and brain health factors, known as the American Heart Association’s Life’s Essential 8, are: being active; eating better; maintaining a healthy weight; not smoking; maintaining a healthy blood pressure; getting enough sleep; and controlling cholesterol and blood sugar levels.

“Brain health is paramount for the optimal well-being of every person, enabling us to function at our highest level and constantly adapt in the world,” said study author Santiago Clocchiatti-Tuozzo, MD, MHS, of Yale University in New Haven, Connecticut, and member of the American Academy of Neurology. “Our study found that making these healthy lifestyle choices in middle age can have meaningful impacts on brain health later in life.”

For the study, researchers evaluated data from 316,127 people with an average age of 56. They were followed over five years.

Researchers looked at participants’ scores across the eight essential cardiovascular health factors and organized them into three categories: optimal, intermediate and poor.

Of the total group, 64,474 people had optimal scores, 190,919 people had intermediate scores and 60,734 people had poor scores.

Researchers then evaluated health records to identify who developed any of the following neurological conditions: stroke, dementia or late-life depression. Poor brain health was defined as the development of any of these conditions during the follow up years.

A total of 1.2% of participants met the definition for poor brain health, with a total of 3,753 conditions. Of those with optimal Life’s Essential 8 scores, 0.7% met the definition for poor brain health, compared to 1.2% of those with intermediate scores and 1.8% of those with poor scores.

After adjusting for factors that could affect the risk of these three neurological conditions, such as age, sex, race and ethnicity, researchers found that people with poor scores on the healthy lifestyle factors were more than twice as likely to develop any of the three neurological conditions compared to those people with optimal scores. Researchers also found that people who had an intermediate score had a 37% higher risk of having one of the three neurological conditions than those who had an optimal score.
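A back-of-envelope check of the unadjusted numbers is possible from the group sizes and percentages reported above. Note that these crude ratios differ from the study’s published estimates, which are adjusted for age, sex, race and ethnicity, and the rounded percentages introduce small discrepancies:

```python
# Crude (unadjusted) comparison from the reported group sizes and the
# rounded share of each group meeting the poor-brain-health definition.
# The "more than twice" and "37% higher" figures in the text are adjusted
# estimates, so they are expected to differ from these raw ratios.
groups = {
    "optimal": (64_474, 0.007),
    "intermediate": (190_919, 0.012),
    "poor": (60_734, 0.018),
}

optimal_rate = groups["optimal"][1]
for name, (size, rate) in groups.items():
    print(f"{name:12s} ~{size * rate:,.0f} events, "
          f"crude risk ratio {rate / optimal_rate:.2f} vs optimal")
```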

“Because the risk factors we looked at are all ones that people can work to improve, our findings highlight the potential brain health benefits of using these eight cardiovascular and brain health factors to guide healthy lifestyle choices,” Clocchiatti-Tuozzo said. “More research is needed to understand this link between lifestyle habits and brain health, as well as how social factors like race and ethnicity can influence this connection.”

To confirm their findings, researchers repeated the study in a group of 68,407 participants followed for a total of five years and found similar results.

A limitation of the study was that the participants’ scores were measured only once at the start of the study, so it does not account for potential lifestyle changes during the five-year study.


People with type 2 diabetes who eat low-carb may be able to discontinue medication

Adults with type 2 diabetes on a low-carbohydrate diet may see benefits to their beta-cell function, allowing them to better manage their disease and possibly discontinue medication, according to new research published in the Endocrine Society’s Journal of Clinical Endocrinology & Metabolism.


Beta-cells are endocrine cells in the pancreas that produce and release insulin, the hormone that controls blood sugar levels.

More than 38 million Americans have diabetes, and over 90% of them have type 2 diabetes. Type 2 diabetes most often develops in people 45 or older, but more and more children, teens and young adults are also developing the disease.

People with type 2 diabetes have a compromised beta-cell response to blood sugar, possibly due in part to eating too many carbohydrates. Beta-cell failure or insufficiency on top of insulin resistance is responsible for the development and progression of type 2 diabetes.

“This study shows people with type 2 diabetes on a low-carbohydrate diet can recover their beta-cells, an outcome that cannot be achieved with medication,” said lead study author Barbara Gower, Ph.D., of the University of Alabama at Birmingham in Birmingham, Ala. “People with mild type 2 diabetes who reduce their carbohydrate intake may be able to discontinue medication and enjoy eating meals and snacks that are higher in protein and meet their energy needs.”

The researchers gathered data from 57 White and Black adults with type 2 diabetes, half on a low-carbohydrate diet and the other half on a high-carbohydrate diet, and examined their beta-cell function and insulin secretion at baseline and after 12 weeks.

All of the participants’ meals were provided. People on the carbohydrate-restricted diet ate 9% carbohydrates and 65% fat, and participants on the high-carbohydrate diet ate 55% carbohydrates and 20% fat.
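To translate those percentages into everyday amounts, the sketch below converts them to grams per day. The 2,000 kcal daily intake is an illustrative assumption; the study actually provided meals matched to each participant’s energy needs:

```python
# Convert each diet's macronutrient percentages (of total energy) into
# grams per day, assuming an illustrative 2,000 kcal/day intake.
KCAL_PER_GRAM = {"carbohydrate": 4, "fat": 9}
diets = {
    "low-carbohydrate": {"carbohydrate": 0.09, "fat": 0.65},
    "high-carbohydrate": {"carbohydrate": 0.55, "fat": 0.20},
}

total_kcal = 2000
for name, macros in diets.items():
    grams = {m: total_kcal * share / KCAL_PER_GRAM[m]
             for m, share in macros.items()}
    print(name, {m: f"{g:.0f} g" for m, g in grams.items()})
# low-carbohydrate:  ~45 g carbohydrate, ~144 g fat per day
# high-carbohydrate: ~275 g carbohydrate, ~44 g fat per day
```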

The researchers found those on a low-carbohydrate versus a high-carbohydrate diet saw improvements in the acute and maximal beta-cell responses that were 2-fold and 22% greater, respectively. Within each race group, Black adults on a low-carbohydrate diet saw 110% greater improvements in the acute beta-cell response and White adults had improvements in the maximal beta-cell response that were 48% greater than their respective counterparts on the high-carbohydrate diet.

“Further research is needed to determine if a low-carbohydrate diet can restore beta-cell function and lead to remission in people with type 2 diabetes,” Gower said.

Other study authors include Amy Goss, Marian Yurchishin and William Garvey of the University of Alabama at Birmingham; Sarah Deemer of the University of North Texas in Denton, Texas; and Bhuvana Sunil of the University of Washington and Mary Bridge Children’s Hospital in Tacoma, Wash.

This research received financial support from the National Institute of Diabetes and Digestive and Kidney Diseases, the Nutrition Obesity Research Center of the University of Alabama at Birmingham, the Diabetes Research Center, and the National Heart, Lung, and Blood Institute.

The manuscript, “Effects of a Carbohydrate Restricted Diet on Beta-cell Response in Adults with Type 2 Diabetes,” was published online, ahead of print.

Wednesday, October 23, 2024

Scurvy may be re-emerging amid cost of living crisis and rise of weight loss surgery

 

Condition caused by vitamin C deficiency first linked to sailors during Renaissance era

The scourge of scurvy, which is caused by vitamin C deficiency, may be re-emerging amid the cost of living crisis and the rise in weight loss (bariatric) surgery, suggest doctors in the journal BMJ Case Reports after treating a middle-aged man with the condition.

Scurvy is eminently treatable, but because it’s a disease of the past, first associated with sailors during the Renaissance era, it may be mistaken for other conditions, especially inflamed blood vessels (vasculitis), potentially risking fatal bleeding if left untreated, highlight the authors. 

Signs can appear as early as a month after a daily intake of less than 10 mg of vitamin C.

The authors treated a middle-aged man whose legs were covered with tiny painful red-brown pinpoints, resembling a rash. Blood was also present in his urine and he was anaemic. 

He tested negative for inflammatory, autoimmune, and blood disorders, and scans revealed no evidence of internal bleeding. Similarly, a skin biopsy returned no diagnostic clues. 

His rash continued to spread while he was in hospital. And further questioning revealed that he was short of cash and neglected his diet, eating little in the way of fruit and vegetables. He said that he sometimes skipped meals altogether. 

He had also stopped taking the nutritional supplements prescribed for him after previous weight loss surgery, because he said he couldn’t afford them. 

Blood tests to assess his general nutritional status indicated no detectable levels of vitamin C and very low levels of other key nutrients. He was diagnosed with scurvy and treated with daily vitamin C (1000 mg), vitamin D3, folic acid and multivitamin supplements, after which his painful rash and other symptoms cleared up. 

This is just one case report, and while it’s not clear what the current prevalence of scurvy is, it’s still relatively rare. 

But the authors point out: “Scurvy is still seen as a disease of the past, especially in developed countries.” The rising cost of living also makes it harder for families to afford good quality nutritious foods, while there have been numerous reports of scurvy arising from complications following bariatric surgery, they add.

Other risk factors for scurvy include alcoholism, smoking, eating disorders, low household income, obesity, kidney dialysis and drugs that interfere with vitamin C absorption, such as steroids and those that curb stomach acid production (proton pump inhibitors), they highlight.

“Our patient had multiple risk factors, namely, poor dietary habits, obesity, previous bariatric surgery, use of proton pump inhibitors and low-income status. His history of iron, vitamin D and folate deficiencies were also clues to his underlying nutritional deficiency,” they conclude.

Tuesday, October 22, 2024

National poll: Many teens use protein supplements for muscle growth, sports performance

 


Teen boys more likely to use supplements like protein powder, shakes or bars every day or most days, according to parents

Reports and Proceedings

Michigan Medicine - University of Michigan

Image: Most parents say their teen uses protein supplements to boost muscle growth and sports performance. Credit: Sara Schultz, University of Michigan Health C.S. Mott Children’s Hospital National Poll on Children’s Health

ANN ARBOR, Mich. – Protein bars, shakes and powders are increasingly popular among adults – but many teens may be jumping on the bandwagon too.

Two in five parents say their teen consumed protein supplements in the past year, according to the University of Michigan Health C.S. Mott Children’s Hospital National Poll on Children’s Health. The trend was more common among teen boys who were also more likely to take protein supplements every day or most days, parents reported.

“Protein is part of a healthy diet but it can be hard for parents to tell if their child is consuming the right amount,” said Mott Poll co-director Sarah Clark, M.P.H.

“Our poll highlights that many teens are using protein supplements, particularly protein powders, to improve their athletic performance and build muscle.”

Using protein supplements for muscle growth, sports

Parents of boys were more likely to say their teen consumed protein supplements to boost muscle growth and for athletic training, while girls appeared to use them more often to replace a meal when they were busy or to help with a balanced diet.

About one in 10 parents also indicated that their teen used protein supplements to help with weight loss, more commonly reported among parents of teen girls.

Before turning to protein supplements, it’s helpful for parents and teens to think about what they want to accomplish, Clark notes. In many cases, teens can get adequate protein by eating a balanced diet. Consultation with the teen’s primary care provider or a nutritionist can provide insight on whether protein supplements would be helpful and, if so, guidance on what products would best fit with the teen’s goals.

“Despite what some teens - and their parents or coaches - think, eating more protein than what your body needs will not result in larger or faster muscle gains,” she said. “Instead, it’s helpful to consume the recommended amount of protein spread throughout the day, at each meal and snack.”

Choosing wisely

When busy teens have little time to eat, well-meaning parents may replace a meal with what they believe is a healthy alternative. However, parents should not assume that products labelled as high in protein are healthy options.

“Many protein shakes and bars have excessive amounts of added sugar and caffeine that are unhealthy for teens,” Clark said. “Parents should help teens read labels of protein supplements and choose healthy options, such as those that contain fiber, with little or no added sugar.”

“Relying on protein shakes and bars might not provide the necessary vitamins, minerals, and fiber teens need; they aren’t meant to replace balanced meals.”

Monitoring whether teens get enough protein

Some parents think their teen’s protein intake is lacking: nearly one in five say their teen does not get enough, suggests the nationally representative report, which includes responses from 989 parents of teens ages 13-17 surveyed in August.

“Protein is an essential part of our diets, as it helps to build muscle, manage hormones and support immune health,” Clark said.

The optimal amount of protein for each individual will vary by age, sex, weight, and level of physical activity, Clark says, and it can be challenging for parents to assess whether their teen is getting the right amount.

Parents should consider a strategy of providing at least one source of protein at each meal and encouraging teens to try a variety of protein-rich foods, including eggs, nuts, fish, lean meats, lentils, and dairy products, she says.

“Teens can generally get enough protein through a well-balanced diet,” Clark said. “There may be some situations when teens aren’t eating a lot of food with protein. In these cases, parents may sometimes consider protein shakes or protein bars as part of a plan to increase their teen’s protein intake.”

Modeling a balanced diet

Adults’ views and behaviors toward protein supplements may also influence kids.

High-protein and low-carb diets are popular with many adults, and over half of parents also think a high-protein diet is healthy for their teen. However, high-protein diets are not generally recommended for teens, since there’s a risk that they may miss other essential nutrients, including carbohydrates.

One in three parents also said they use protein supplements themselves – these parents were more likely to report their teen used them, too.

Parents should apply the same approach for themselves as for their teen, Clark says. Generally, it’s better to get enough protein through a well-balanced diet, and if protein supplements are being considered, they should choose products that also contain fiber and other nutrients, without added sugar or caffeine.