Friday, March 15, 2024

Small amounts of liquorice raise blood pressure

 

It is known that large amounts of liquorice cause high blood pressure. A study by researchers at Linköping University, Sweden, now shows that even small amounts of liquorice raise blood pressure. The individuals who react most strongly also show signs of strain on the heart.

Liquorice is produced from the root of plants of the Glycyrrhiza species and has long been used as a herbal remedy and flavouring. However, it is known that eating liquorice can also raise blood pressure. This is mainly due to a substance called glycyrrhizic acid that affects the body’s fluid balance through effects on an enzyme in the kidney. High blood pressure, in turn, increases the risk of cardiovascular disease.

Both the European Union and the World Health Organization have concluded that 100 mg of glycyrrhizic acid per day is probably safe to eat for most individuals. But some people eat more liquorice than that. The Swedish Food Agency has estimated that 5 per cent of Swedes have an intake higher than this level.

In the current study, published in The American Journal of Clinical Nutrition, the Linköping University researchers wanted to test whether this limit, stated as likely safe, actually is so.

It is not easy to know how much glycyrrhizic acid is in the liquorice you eat, as its concentration in different liquorice products varies greatly. This variation may depend on factors such as origin, storage conditions and liquorice root species. In addition, the amount of glycyrrhizic acid is not indicated on many products. The Linköping University study is the first to have carefully measured the amount of glycyrrhizic acid in the liquorice that was tested, while being randomised and having a control group.

In the study, 28 women and men aged 18–30 were instructed to eat liquorice, or a control product that did not contain any liquorice, over two periods of time. The control product instead contained salmiak, which gives salty liquorice its flavour. The liquorice weighed 3.3 grammes and contained 100 mg of glycyrrhizic acid, that is, the amount indicated as likely safe for most people to eat daily. Participants were randomly assigned to eat either liquorice or the control product for two weeks, take a break for two weeks, and then eat the other variety for two weeks. This enabled the researchers to compare the effect of both varieties in the same person. The study participants were asked to measure their blood pressure at home every day. At the end of each intake period, the researchers measured levels of various hormones, salt balance, and heart workload.
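The crossover logic described above — each person contributing both a liquorice period and a control period — can be sketched with a toy simulation. The numbers below are made up for illustration and are not the study’s data:

```python
import random

random.seed(0)

# Hypothetical crossover data: each of 28 participants has a systolic
# blood pressure at the end of the control period and the end of the
# liquorice period. Values are simulated, not the study's.
n = 28
control_bp = [random.gauss(115, 8) for _ in range(n)]
# Assume a ~3 mmHg within-person rise during the liquorice period.
liquorice_bp = [bp + random.gauss(3.1, 2) for bp in control_bp]

# The crossover design lets us analyse within-person (paired)
# differences, cancelling out stable between-person variation.
diffs = [l - c for l, c in zip(liquorice_bp, control_bp)]
mean_diff = sum(diffs) / n
print(f"Mean within-person change: {mean_diff:.1f} mmHg")
```

Because each participant serves as their own control, the paired differences are far less noisy than a comparison between two separate groups would be.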

“In the study, we found that a daily intake of liquorice containing 100 mg glycyrrhizic acid raised blood pressure in young healthy people. This hasn’t previously been shown for such small amounts of liquorice,” says Peder af Geijerstam, doctoral student at the Department of Health, Medicine and Caring Sciences at Linköping University, general practitioner, and lead author of the study.

When the participants ate liquorice, their blood pressure increased by an average of 3.1 mmHg. The researchers also measured two hormones that are affected by liquorice and that regulate fluid balance: renin and aldosterone. Levels of both decreased when participants ate liquorice. The quarter of participants who were most sensitive – those whose renin and aldosterone levels dropped the most after eating liquorice – also gained slightly in weight, most likely due to an increased amount of fluid in the body. This group also had elevated levels of N-terminal pro-brain natriuretic peptide (NT-proBNP), a protein the heart secretes in greater amounts when it needs to work harder to pump blood around the body. This suggests increased fluid volume and heart workload in the individuals most sensitive to the effects of liquorice.

“Our results give reason to be more cautious when it comes to recommendations and labelling for food containing liquorice,” says Fredrik Nyström, professor at the same department, who was responsible for the study.

The study was funded with support from, among others, The Strategic Research Network in Circulation and Metabolism (LiU-CircM) at Linköping University, The National Research School in General Practice at Umeå University, King Gustaf V and Queen Victoria Freemason Foundation and Region Östergötland.

Article: A low dose of daily licorice intake affects renin, aldosterone, and home blood pressure in a randomized crossover trial, Peder af Geijerstam, Annelie Joelsson, Karin Rådholm and Fredrik Nyström (2024). American Journal of Clinical Nutrition, Vol. 119, No. 3, pp. 682–692. Published online 20 January 2024, doi: 10.1016/j.ajcnut.2024.01.011

A healthy diet is linked with a slower pace of aging, reduced dementia risk

 

A healthier diet is associated with a reduced dementia risk and slower pace of aging, according to a new study at Columbia University Mailman School of Public Health and The Robert Butler Columbia Aging Center. The findings show that the diet-dementia association was at least partially mediated by multi-system processes of aging.

While literature had suggested that people who followed a healthy diet experienced a slowdown in the processes of biological aging and were less likely to develop dementia, until now the biological mechanism of this protection was not well understood. The findings are published in the Annals of Neurology.


“Much attention to nutrition in dementia research focuses on the way specific nutrients affect the brain,” said Daniel Belsky, PhD, associate professor of Epidemiology at Columbia School of Public Health and the Columbia Aging Center, and a senior author of the study. “We tested the hypothesis that healthy diet protects against dementia by slowing down the body’s overall pace of biological aging.”

The researchers used data from the second generation of the Framingham Heart Study, the Offspring Cohort, which originated in 1971. Participants included in the analysis were 60 years of age or older, free of dementia, and had available dietary, epigenetic, and follow-up data. The Offspring Cohort was followed up at nine examinations, approximately every 4 to 7 years. At each follow-up visit, data collection included a physical examination, lifestyle-related questionnaires, blood sampling and, starting in 1991, neurocognitive testing.

Of 1,644 participants included in the analyses, 140 developed dementia. To measure the pace of aging, the researchers used an epigenetic clock called DunedinPACE, developed by Belsky and colleagues at Duke University and the University of Otago. The clock measures how fast a person’s body is deteriorating as they grow older, “like a speedometer for the biological processes of aging”, explained Belsky.

“We have some strong evidence that a healthy diet can protect against dementia,” said Yian Gu, PhD, associate professor of Neurological Sciences at Columbia University Irving Medical Center and the other senior author of the study, “But the mechanism of this protection is not well understood.” Past research linked both diet and dementia risk to an accelerated pace of biological aging. 

“Testing the hypothesis that multi-system biological aging is a mechanism underlying diet-dementia associations was the logical next step,” explained Belsky. The research determined that higher adherence to the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diet slowed the pace of aging as measured by DunedinPACE and reduced risks for dementia and mortality. Furthermore, slower DunedinPACE accounted for 27 percent of the diet-dementia association and 57 percent of the diet-mortality association.
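The “accounted for 27 percent” figure is a mediation-analysis quantity: the total effect of diet on dementia risk decomposes into an indirect part flowing through the mediator (pace of aging) and a direct remainder. A minimal arithmetic sketch — the effect sizes below are made up, chosen only so the proportion comes out at 27%, and are not the study’s estimates:

```python
# Illustrative mediation arithmetic (hypothetical effect sizes).
# The total effect of diet on dementia risk decomposes as
#   total = direct + indirect,
# where the indirect effect flows through the mediator (pace of aging).
total_effect = -0.300     # e.g. change in log-hazard per unit healthier diet
indirect_effect = -0.081  # part transmitted via slower DunedinPACE
direct_effect = total_effect - indirect_effect

# Proportion mediated = indirect / total.
proportion_mediated = indirect_effect / total_effect
print(f"Proportion mediated: {proportion_mediated:.0%}")
```

The same decomposition with a larger indirect share would correspond to the 57 percent figure reported for the diet-mortality association.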

“Our findings suggest that slower pace of aging mediates part of the relationship of healthy diet with reduced dementia risk, and therefore, monitoring pace of aging may inform dementia prevention,” said first author Aline Thomas, PhD, a Postdoc at the Columbia Department of Neurology and Taub Institute for Research on Alzheimer's Disease and the Aging Brain. “However, a portion of the diet-dementia association remains unexplained, therefore we believe that continued investigation of brain-specific mechanisms in well-designed mediation studies is warranted.”

“We suggest that additional observational studies be conducted to investigate direct associations of nutrients with brain aging, and if our observations are also confirmed in more diverse populations, monitoring biological aging may indeed inform dementia prevention,” noted Belsky.

Highest level of red, processed meat: 30–40% increased risk for colorectal cancer

 


In one of the largest gene-environment interaction studies of red meat and colorectal cancer ever conducted, which explored how red meat consumption affects a person’s cancer risk depending on their genotype, researchers have identified two genetic markers that may help explain the association and why some people face a higher cancer risk.

Past studies show that frequently consuming red and processed meat increases the risk of developing colorectal cancer, but the predominant biological mechanism is not yet established. Understanding the disease process and what genes underlie it can help scientists develop better prevention strategies.

A new study supported by the National Institutes of Health and led by the USC Norris Comprehensive Cancer Center, part of the Keck School of Medicine of USC, analyzed data on red and processed meat intake from 29,842 people with colorectal cancer and 39,635 people without cancer. It found that those who consumed more red or processed meat faced, respectively, a 30 or 40% increased risk for colorectal cancer. Using genome-wide data, the researchers also identified two genes, HAS2 and SMAD7, that altered cancer risk levels based on red or processed meat consumption levels. The results were just published in the journal Cancer Epidemiology, Biomarkers & Prevention.

“These findings suggest that there's a subset of the population that faces an even higher risk of colorectal cancer if they eat red or processed meat,” said lead author Mariana C. Stern, PhD, a professor of population and public health sciences and urology, the Ira Goodman Chair in Cancer Research and the associate director for population science at the USC Norris Comprehensive Cancer Center. “It also allows us to get a peek at the potential mechanism behind that risk, which we can then follow up with experimental studies.”

The researchers used a combination of standard methods to pinpoint gene-environment interactions, as well as a new statistical approach developed in the Keck School of Medicine’s division of biostatistics by coauthors William James Gauderman, PhD, a professor of population and public health sciences, Juan Pablo Lewinger, PhD, and Eric Kawaguchi, PhD, both assistant professors of population and public health sciences, and their colleagues.

“These state-of-the-art statistical methods and software allowed us to maximize efficiency as we tested for gene-meat interactions across seven million genetic variants,” Gauderman said.

The risk of red and processed meat

The analysis included data from 27 studies of colorectal cancer risk in people of European origin. Gauderman and Ulrike Peters, PhD, MPH, a professor and the associate director of the public health sciences division at the Fred Hutchinson Cancer Center in Seattle, compiled data from the Genetics and Epidemiology of Colorectal Cancer Consortium, the Colorectal Cancer Transdisciplinary Study and the Colon Cancer Family Registry.

First, the research team harmonized data from the various studies to create standard measures for the consumption of red meat (beef, pork and lamb) and processed meat (bacon, sausages, luncheon/deli meats and hot dogs). For each category, they calculated servings per day, adjusted for body mass index, and divided participants into four groups based on levels of red or processed meat intake.

People with the highest level of red meat intake had a 30% increased risk for colorectal cancer; those with the highest level of processed meat intake had a 40% increased risk. These findings do not account for genetic variability that may put some people in the population at higher risk than others.

Genetic markers of cancer risk

Next, based on DNA samples, the researchers compiled data for over seven million gene variants spanning the genome for each study participant. They then conducted a genome-wide gene-environment interaction analysis of the link between red meat intake and cancer risk. Looking at each position in the genome—known as a single nucleotide polymorphism (SNP)—they asked whether having a certain gene variant altered the risk of getting colorectal cancer for people who ate more red meat.

At almost every SNP on the genome, the answer was no. Regardless of what gene variant a person had, their cancer risk based on red meat consumption stayed the same. However, at two specific SNPs, the association changed.
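A gene-environment interaction test of this kind asks, at each SNP, whether the meat-cancer odds ratio differs across genotypes. A stripped-down sketch with hypothetical counts is below; the study itself used regression-based methods across roughly seven million variants, so this only illustrates the underlying idea:

```python
def odds_ratio(cases_exposed, controls_exposed, cases_unexposed, controls_unexposed):
    """Odds ratio for disease, comparing high vs low meat intake."""
    return (cases_exposed / controls_exposed) / (cases_unexposed / controls_unexposed)

# Hypothetical case/control counts stratified by genotype at one SNP.
# Carriers of the common variant: meat appears associated with risk.
or_common = odds_ratio(300, 700, 220, 780)
# Carriers of the rarer variant: little association with meat.
or_rare = odds_ratio(110, 390, 105, 395)

# The interaction is the ratio of the stratum-specific odds ratios;
# a value far from 1 suggests genotype modifies the meat effect.
interaction = or_common / or_rare
print(f"OR (common variant): {or_common:.2f}")
print(f"OR (rare variant):   {or_rare:.2f}")
print(f"Interaction ratio:   {interaction:.2f}")
```

For most SNPs the two stratum-specific odds ratios are essentially equal (interaction ratio near 1), which matches the study’s finding that the answer was “no” almost everywhere on the genome.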

Using a standard statistical analysis approach, the researchers flagged the SNP rs4871179 on chromosome 8, near the HAS2 gene. The gene, which is part of a pathway that codes for protein modification inside cells, has been linked to colorectal cancer in some previous studies but never to red meat consumption.

The analysis showed that people with a common variant of the HAS2 gene found in 66% of the population faced a 38% higher risk of colorectal cancer if they consumed the highest level of meat. In contrast, people with another, rarer variant of the same gene had no increased risk of cancer when they ate more red meat. 

“We then used our novel, two-step machine learning approach to first identify patterns among SNPs, red meat consumption and cancer, then focus on the most promising combinations in our gene-environment interaction tests,” Gauderman said.

This method flagged the SNP rs35352860 on chromosome 18, part of the SMAD7 gene. SMAD7 regulates hepcidin, a protein linked to iron metabolism. Because red and processed meats contain high levels of heme iron, the researchers hypothesize that different variants of SMAD7 may increase cancer risk by changing the way the body processes iron.

“When hepcidin is dysregulated, that can lead to increased iron absorption and even iron overload inside cells,” Stern said.

People with two copies of the most common variant of the SMAD7 gene, present in about 74% of the population, faced an 18% greater risk of colorectal cancer if they ate high levels of red meat. Individuals with only one copy of the most common variant, or two copies of a less common variant, had substantially higher cancer risk: 35% and 46%, respectively.

“These findings suggest that different genetic variants may confer a differing risk of colorectal cancer in individuals who consume red meat, and highlight possible explanations for how the disease develops,” said Joel Sanchez Mendez, a doctoral student in the Keck School of Medicine’s department of population and public health sciences and a co-author of the study.

More evidence needed

The findings reveal promising new details about the link between meat consumption and colorectal cancer, but Stern stresses that they do not yet prove a causal link for these genetic variants.

“This gives us some important food for thought,” she said. “We do these gene-environment interaction studies when we know there’s a clear association between an environmental exposure and a disease, but what happens in between is still a black box.”

Next, she and her colleagues hope to follow up with experimental studies that could provide stronger evidence for the role of dysregulated iron metabolism in the development of colorectal cancer.

Thursday, March 14, 2024

Tryptophan in diet, gut bacteria protect against E. coli infection

 

Gut bacteria and a diet rich in the amino acid tryptophan can play a protective role against pathogenic E. coli, which can cause severe stomach upset, cramps, fever, intestinal bleeding and renal failure, according to a study published March 13 in Nature.

The research reveals how dietary tryptophan – an amino acid found mostly in animal products, nuts, seeds, whole grains and legumes – can be broken down by gut bacteria into small molecules called metabolites. It turns out a few of these metabolites can bind to a receptor on gut epithelial (surface) cells, triggering a pathway that ultimately reduces the production of proteins that E. coli use to attach to the gut lining where they cause infection. When E. coli fail to attach and colonize the gut, the pathogen benignly moves through and passes out of the body.

The research describes a previously unknown role in the gut for a receptor, DRD2. DRD2 has otherwise been known as a dopamine (neurotransmitter) receptor in the central and peripheral nervous systems.

“It’s actually two completely different areas that this receptor could play a role in, which was not appreciated prior to our findings,” said Pamela Chang, associate professor of immunology in the College of Veterinary Medicine and of chemical biology in the College of Arts and Sciences. “We essentially think that DRD2 is moonlighting in the gut as a microbial metabolite sensor, and then its downstream effect is to help protect against infection.”

Samantha Scott, a postdoctoral researcher in Chang’s lab, is first author of the study, “Dopamine Receptor D2 Confers Colonization Resistance via Microbial Metabolites.”

Now that Chang, Scott and colleagues have identified a specific pathway to help prevent E. coli infection, they may now begin studying the DRD2 receptor and components of its downstream pathway for therapeutic targets.

In the study, the researchers used mice infected with Citrobacter rodentium, a bacterium that closely resembles E. coli, since certain pathogenic E. coli don’t infect mice. Through experiments, the researchers identified that there was less pathogen and inflammation (a sign of an active immune system and infection) after mice were fed a tryptophan-supplemented diet. Then, to show that gut bacteria were having an effect, they gave the mice antibiotics to deplete microbes in the gut, and found that the mice were infected by C. rodentium in spite of eating a tryptophan diet, confirming that protection from tryptophan was dependent on the gut bacteria.

Then, using mass spectrometry, they ran a screen to find the chemical identities of tryptophan metabolites in a gut sample, and identified three such metabolites that were significantly increased when given a tryptophan diet. Again, based on pathogen levels and inflammation, when these three metabolites alone were fed to the mice, they had the same protective effect as giving the mice a full tryptophan diet.

Switching gears, the researchers used bioinformatics to find which proteins (and receptors) might bind to the tryptophan metabolites, and from a long list they identified three related receptors within the same family of dopamine receptors. Using a human intestinal cell line in the lab, they were able to isolate receptor DRD2 as the one that had the protective effect against infection in the presence of tryptophan metabolites.

Having identified the metabolites and the receptor, they analyzed the downstream pathway of DRD2 in human gut epithelial cells. Ultimately, they found that when the DRD2 pathway was activated, the host’s ability to produce an actin regulatory protein was compromised. C. rodentium (and E. coli) require actin to attach themselves to gut epithelial cells, where they colonize and inject virulence factors and toxins into the cells that cause symptoms. But without actin polymerization they can’t attach and the pathogen passes through and clears.

The experiments revealed a new role for the dopamine receptor DRD2 in the gut, where it controls actin regulatory proteins through a previously unknown pathway that prevents a pathogenic bacterium from colonizing the gut.

Poor sleep linked to migraine attacks


A new study by researchers at the University of Arizona Health Sciences identified a link between poor sleep and migraine attacks that suggests improving sleep health may diminish migraine attacks in people with migraine.

Many people with migraine report having sleeping disorders, including insomnia, trouble falling or staying asleep, poor sleep quality, excessive daytime sleepiness, waking up from sleep and being forced to sleep because of a migraine headache. Until now, it was unknown whether migraine causes poor sleep or vice versa.

“It has been recognized for quite a long time that there is a relationship between sleep and migraine,” said principal investigator Frank Porreca, PhD, research director for the Comprehensive Center for Pain & Addiction and professor of pharmacology at the UArizona College of Medicine – Tucson. “The way it has been investigated in the past has been through patient-reported information, which is subjective. We quantitatively measured sleep in preclinical models and found that migraine-like pain does not influence sleep, but if you have disrupted sleep, your chances of having a migraine attack if you're a migraine patient are much higher.”

Porreca led a research team that used preclinical mouse models to evaluate sleep disruption, as the sleep architecture of mice closely matches that of people, including cycles of deep sleep, REM sleep and light sleep. Sleep was assessed using electroencephalogram recordings and visual observations.

Researchers found that when mice were sleep deprived, they were more likely to experience migraine-like pain, but migraine-like pain did not disrupt normal sleep.

Porreca noted that sleep deprivation can happen for many reasons, including stress. For this study, the research team ensured they were studying the effect of sleep, and not stress, on migraine by giving mice novel objects to explore to keep them awake.

“Mice are compelled to explore novel objects. They just have to go and look,” Porreca said. “It reminds me of how teenagers are often sleep deprived because they’re on their phones. Anybody who studies sleep will tell you that from a sleep hygiene point of view, you don’t want any devices in your bedroom where you’re trying to sleep.”

For people with migraine, limiting the use of electronic devices before bedtime and following other sleep health tips could be an easy way to limit the likelihood of migraine attacks.

“Early morning is one of the most common times people experience migraine attacks,” Porreca said. “Migraine is highly female prevalent – it’s 3 to 1, women to men – and almost all the women are of childbearing age. Many people with migraine probably have children. They wake up with a migraine attack and are immediately stressed. They don’t have time to take care of themselves, they have to get the kids ready for school and they have to get ready for work. That migraine attack is happening in the worst time of the day for function.  Improved sleep is critically important and probably would diminish the frequency of migraine attacks.”

The American Migraine Foundation estimates more than 39 million people in the U.S. live with migraine, though that number is probably higher due to the number of people who do not get a diagnosis or treatment.

The paper, “Unraveling the directional relationship of sleep and migraine-like pain,” was published in Brain Communications.

Friday, March 8, 2024

Poll of older adults suggests some who take aspirin may be following outdated advice

One in four older adults take aspirin at least three times a week, mostly in hopes of preventing heart attacks and strokes, a new poll shows.

But many people aged 50 to 80 who said they take aspirin may not need to, the findings from the University of Michigan National Poll on Healthy Aging suggest.

In all, 57% of people aged 50 to 80 who say they take aspirin regularly also said they don’t have a history of cardiovascular disease. Such people should have a conversation with their health care provider about what’s best for them before stopping or starting aspirin use.

National guidelines have changed in recent years for using aspirin for prevention, because of new knowledge about who actually gets the most benefit from its ability to reduce the risk of blood clots, and who faces a risk of bleeding.

Now, guidelines mostly focus on aspirin use in those who already have cardiovascular disease – including those who have survived a heart attack or stroke – and those who face a high risk of it because of their personal health and family history.

The poll shows 14% of all adults age 50 to 80 are taking aspirin even though they have no history of cardiovascular issues.

Whether or not someone has a cardiovascular history, aspirin does pose a bleeding risk that increases with age. That has led to guidelines that advise against routine aspirin use after age 70, or suggest that it may be reasonable to consider stopping around age 75, in those without cardiovascular disease.

The poll finds 42% of all adults age 75 to 80 are taking aspirin. Meanwhile, 31% of all older adults age 50 to 80 who take aspirin don’t appear to know about the bleeding risk associated with it.

The poll is based at the U-M Institute for Healthcare Policy and Innovation and supported by AARP and Michigan Medicine, U-M’s academic medical center. The poll team asked a national sample of adults aged 50 to 80 about their health history and use of aspirin; those who take it were also asked about why.

“Aspirin is no longer a one-size-fits-all preventive tool for older adults, which for decades it was touted as,” says Jordan Schaefer, M.D., M.Sc., a hematologist at Michigan Medicine who worked with the poll team. “This poll shows we have a long way to go to make sure aspirin use is consistent with current knowledge.”

Adds Geoffrey Barnes, M.D., M.Sc., a Michigan Medicine cardiologist who also worked on the poll, “As guidelines change, it’s important for everyone over 40 to talk with their health care provider about their individual cardiovascular risk based on their family history, past health issues, current medications, recent test results like blood pressure, cholesterol and blood sugar, and lifestyle factors like smoking, physical activity and eating habits. Preventive aspirin use should be based on age plus these factors.”

Updated knowledge and guidance

In all, the poll finds 71% of older adults who take aspirin started four or more years ago, which could mean that they and their health care provider may be basing their use on old advice.

Schaefer and Barnes note that because of continuing research on aspirin, two major guidelines changed in recent years for older adults who don’t have a history of cardiovascular disease. In such people, taking aspirin is called primary prevention.

The American College of Cardiology and American Heart Association together say that daily low-dose aspirin use might be considered for the prevention of cardiovascular disease in select adults 40 to 70 years of age who are at increased risk of cardiovascular disease but not of bleeding, based on a guideline updated in 2019. The U.S. Preventive Services Task Force, which advises the federal government, updated its guideline in 2022 and recommends against initiating aspirin for the prevention of cardiovascular disease in adults 60 years or older.

The AHA and ACC offer online calculators to help clinicians estimate a person’s 10-year risk of cardiovascular disease if they don’t already have it. Adults age 40-70 at higher cardiovascular disease risk may be good candidates for aspirin as primary prevention but should always talk with a health care provider before starting to take it.

Meanwhile, for people who have already had a heart attack, some types of stroke or other cardiovascular diagnoses, the use of aspirin is still generally recommended unless the person is unable to tolerate it or has an unacceptable bleeding risk. This is called secondary prevention and should be done only under the supervision of a health care provider.

More dialogue needed

The poll shows the importance of open communication between health care providers and their older patients about all types of medication and supplements, including those like aspirin that are available ‘over the counter’ without a prescription.

The poll finds that 96% of those who take aspirin and have a cardiovascular history said their health care provider had recommended it. But 77% of those who take aspirin and have no cardiovascular history said the same – suggesting a need for a discussion about updated guidelines. Also among those who take aspirin but have no cardiovascular disease history, 20% said they started doing it on their own and 5% said friends and family had advised them.

“Thanks to updated knowledge, and reductions in other major risk factors such as smoking, we can use aspirin more precisely, focusing on those who need this inexpensive and easy-to-obtain preventive medication most and avoiding unnecessary risks for others,” said poll director Jeffrey Kullgren, M.D., M.P.H., M.S. “These poll findings should spur more conversations between health care providers and patients about what’s right for them.” Kullgren is a primary care physician at the VA Ann Arbor Healthcare System and associate professor of internal medicine at U-M.

The poll report is based on findings from a nationally representative survey conducted by NORC at the University of Chicago for IHPI and administered online and via phone in July and August 2023 among 2,657 adults aged 50 to 80, with an oversample of non-Hispanic Black and Hispanic populations. The sample was subsequently weighted to reflect the U.S. population.

 

Thursday, March 7, 2024

How does wearing makeup affect skin during exercise?

 

New research published in the Journal of Cosmetic Dermatology reveals the effects of wearing cosmetic foundation during aerobic exercise on the skin and its pores.

The study included 43 healthy college students (20 males and 23 females). Foundation cream was applied to half of each participant’s face in two areas (forehead and upper cheek); the other half of the face served as control.

Moisture increased after exercise in both the non-makeup and makeup zones; however, there was a greater increase in moisture in the makeup zones. This may be a result of makeup preventing moisture from evaporating from the skin. Elasticity of the skin increased after exercise, but to a greater extent in the makeup zones than in non-makeup zones.

The size of pores increased in skin without makeup after exercise, but not significantly in skin with makeup. This may indicate that wearing makeup may block pores. Oil level increased in the non-makeup zones and decreased in the makeup zones, suggesting that it may be difficult to maintain proper oil levels on the skin when wearing makeup.

“For skin health, it’s best to exercise with your makeup removed,” said corresponding author Dongsun Park, PhD, of the Korea National University of Education.

URL upon publication: https://onlinelibrary.wiley.com/doi/10.1111/jocd.16205