Thursday, August 15, 2024

Origin of agriculture in the farming-pastoral zone of northern China

The beginning of agriculture is one of the most significant events in human history. The origin and spread of agriculture accelerated the development of human society and economy and fundamentally altered humans’ role in the Earth’s ecosystem, allowing humans to transform nature while increasing food production and its stability, and laying the groundwork for population growth and the development of civilization. China is one of the world’s three major centers of agricultural origin. Our ancestors domesticated dryland crops, such as millet, in northern China as early as 10,000 years ago.

Archaeologists have proposed a variety of hypotheses about the origins of agriculture in northern China over the last several decades. Three are most commonly invoked: stress caused by climatic instability, socioeconomic competition, and human–environment coevolution. The debates over what drove the origin of millet cultivation in northern China highlight the importance of locating archaeological sites where human use of plant resources can be investigated, reconstructing the climate and vegetation evolution before and during human occupation, and further examining why humans began to practice agriculture in northern China around the middle Holocene.

Archaeological work in northern China’s farming-pastoral zone since 2015 has led to the recognition of the Yumin Culture (~8 ka BP) as the start of Neolithic culture in Inner Mongolia. Several archaeological sites have been excavated, including Yumin, Simagou, Xinglong, and Sitai. These sites have yielded large quantities of pottery and agricultural stone tools, as well as some animal bones and plant remains, providing evidence for the origins of agriculture in northern China’s farming-pastoral zone. The Yumin site offers a new perspective on human-environmental interactions during the early Neolithic period, and studying the origins of agriculture in this area is critical to understanding the formation of northern China’s traditional dryland farming system.

To better explain the relationship between the formation of the traditional dryland agricultural system and changes in the geographical environment of northern China, Xin Jia and Zhiping Zhang of Nanjing Normal University and Yonggang Sun of Chifeng University led a research team drawn from Nanjing University, Lanzhou University, the Chinese University of Hong Kong, the Nanjing Institute of Geology and Palaeontology (CAS), the Institute of Archaeology (CASS), the Institute of Cultural Relics and Archaeology of Inner Mongolia Autonomous Region, and the Ulanqab Museum. The team examined the relationship between the origins of agriculture and climate change by combining high-resolution palaeoenvironmental records from the Yumin Cultural Circle in northern China. They conducted quartz optically stimulated luminescence (OSL) dating on sedimentary profiles at the Yumin and Banan sites of the Yumin culture, obtained soil samples via flotation, and collected and identified carbonized plant seeds during the excavation of the Yumin site. They also analyzed the profiles using multiple proxies, including fossil pollen, magnetic susceptibility, grain size, and chemical elements, to obtain information about agricultural activities and climate change at the Yumin site. Their findings were recently published in Science China Earth Sciences.

Their research shows that agriculture had already begun in northern China’s farming-pastoral zone during the Yumin culture period (around 8,000 years ago), as evidenced by carbonized millet recovered from house F6 at the Yumin site, together with the agricultural production and processing tools and 16 representative residences discovered during excavation. The origin of agriculture at the Yumin site occurred later than the significant increase in precipitation during the early Holocene but coincided with a substantial expansion of vegetation around 8.4 ka. Their findings indicate that the gradual improvement of hydrothermal conditions since the beginning of the Holocene progressively converted the land surface from infertile sand to organic-rich soil, providing a suitable environmental foundation for the origin of dryland farming in northern China around 8.4 ka. These “accumulative environmental effects” during the early Holocene played an essential role in the origin of agriculture in northern China and provide a reference for agricultural management in the face of future climate change.

Significant link found between heme iron, found in red meat and other animal products, and type 2 diabetes risk

 



Peer-Reviewed Publication

Harvard T.H. Chan School of Public Health

Key points:

  • Researchers identified a significant link between heme iron—iron found in red meat and other animal products—and risk of type 2 diabetes (T2D), as well as the metabolic pathways underlying the link.
  • Non-heme iron—iron found in plant-based foods—was not associated with risk of T2D.
  • The study suggests that cutting down on heme iron from red meat and adopting a plant-rich diet can help lower diabetes risk. And it raises concerns about the addition of heme to increasingly popular plant-based meat alternatives.

Higher intake of heme iron, the type found in red meat and other animal products—as opposed to non-heme iron, found mostly in plant-based foods—was associated with a higher risk of developing type 2 diabetes (T2D) in a new study led by researchers at Harvard T.H. Chan School of Public Health. While the link between heme iron and T2D has been reported previously, the study’s findings more clearly establish and explain the link.

“Compared to prior studies that relied solely on epidemiological data, we integrated multiple layers of information, including epidemiological data, conventional metabolic biomarkers, and cutting-edge metabolomics,” said lead author Fenglei Wang, research associate in the Department of Nutrition. “This allowed us to achieve a more comprehensive understanding of the association between iron intake and T2D risk, as well as potential metabolic pathways underlying this association.”

The study will be published August 13 in Nature Metabolism.

The researchers assessed the link between iron and T2D using 36 years of dietary reports from 206,615 adults enrolled in the Nurses’ Health Studies I and II and the Health Professionals Follow-up Study. They examined participants’ intake of various forms of iron—total, heme, non-heme, dietary (from foods), and supplemental (from supplements)—and their T2D status, controlling for other health and lifestyle factors.

The researchers also analyzed the biological mechanisms underpinning heme iron’s relationship to T2D among smaller subsets of the participants. They looked at 37,544 participants’ plasma metabolic biomarkers, including those related to insulin levels, blood sugar, blood lipids, inflammation, and two biomarkers of iron metabolism. They then looked at 9,024 participants’ metabolomic profiles—plasma levels of small-molecule metabolites, which are substances derived from bodily processes such as breaking down food or chemicals.

The study found a significant association between higher heme iron intake and T2D risk. Participants in the highest intake group had a 26% higher risk of developing T2D than those in the lowest intake group. In addition, the researchers found that heme iron accounted for more than half of the T2D risk associated with unprocessed red meat and a moderate proportion of the risk for several T2D-related dietary patterns. In line with previous studies, the researchers found no significant associations between intakes of non-heme iron from diet or supplements and risk of T2D.

The study also found that higher heme iron intake was associated with blood metabolic biomarkers associated with T2D. A higher heme iron intake was associated with higher levels of biomarkers such as C-peptide, triglycerides, C-reactive protein, leptin, and markers of iron overload, as well as lower levels of beneficial biomarkers like HDL cholesterol and adiponectin. 

The researchers also identified a dozen blood metabolites—including L-valine, L-lysine, uric acid, and several lipid metabolites—that may play a role in the link between heme iron intake and T2D risk. These metabolites have been previously associated with risk of T2D.

On a population level, the study findings carry important implications for dietary guidelines and public health strategies to reduce rates of diabetes, according to the researchers. In particular, the findings raise concerns about the addition of heme to plant-based meat alternatives to enhance their meaty flavor and appearance. These products are gaining in popularity, but their health effects warrant further investigation.

“This study underscores the importance of healthy dietary choices in diabetes prevention,” said corresponding author Frank Hu, Fredrick J. Stare Professor of Nutrition and Epidemiology. “Reducing heme iron intake, particularly from red meat, and adopting a more plant-based diet can be effective strategies in lowering diabetes risk.”

The researchers noted that the study had several limitations, including the potential for incomplete accounting for confounders and measurement errors in the epidemiological data. In addition, the findings—based on a study population that was mostly white—need to be replicated in other racial and ethnic groups.


Emergency department visits by children associated with water beads more than doubled

 

 

Water beads are made from superabsorbent material that can swell to hundreds of times their original size when exposed to fluids. They are commonly sold as child sensory products, gel projectiles for toy “gel blaster” guns, and decorations. If swallowed, they can expand in the gastrointestinal tract and cause intestinal blockage and even death. They can also cause injury if placed in the ear canal or nose. 

Researchers from the Center for Injury Research and Policy and the Central Ohio Poison Center at Nationwide Children’s Hospital have found that there were an estimated 8,000-plus visits to U.S. emergency departments (EDs) associated with water beads from 2007 through 2022, and that the number of these visits increased rapidly, rising by more than 130% from 2021 to 2022.

In a study published in the American Journal of Emergency Medicine, researchers analyzed 16 years of data and called for a more comprehensive regulatory approach to prevent water bead-associated injuries. The increase in ED visits occurred despite product recalls and the current ASTM F963-23 voluntary toy safety standard, indicating that current prevention strategies are not sufficient.


According to the study, there were an estimated 8,159 visits to U.S. emergency departments from 2007 through 2022 involving water beads among people younger than 20 years. More than half (55%) of cases involved children younger than 5 years. Most emergency department visits in this study involved children swallowing water beads (46%), followed by putting water beads in the ear (33%) or nose (12%). Eye injuries made up 9% of cases in this study. Most patients were treated and released (92%). The proportion of cases admitted was highest among children younger than 5 years (10%), and this age group accounted for most (90%) of admissions in this study. All admissions among children younger than 5 years involved swallowing water beads.

“The number of pediatric water bead-related emergency department visits is increasing rapidly,” said Gary Smith, MD, DrPH, senior author of the study and director of the Center for Injury Research and Policy at Nationwide Children’s. “Although swallowing objects and putting them into an ear or the nose are common among children, water beads pose a unique increased risk of harm because of their expanding properties, and they’re hard to detect with X-rays.”

Water beads in dehydrated form are often sold in sets of tens of thousands, which makes it more likely that misplaced water beads in the home will not be noticed until found by a young child, a group known for exploring their environment by placing objects in their mouth – especially objects like water beads that look like candy.

Water bead toy safety is covered in the ASTM toy safety standard, ASTM F963. The standard addresses bowel obstruction by limiting the size of water beads to the narrowest part of the gastrointestinal tract of a small 18-month-old child. “The current safety standard is inadequate,” said Dr. Smith. “Serious outcomes have occurred in children younger than 18 months, and one-fifth of the water beads swallowed in this study were swallowed by children younger than 18 months, with the youngest child being 7 months old. Therefore, using intestinal measurements for 18-month-olds is not adequate.” The ASTM F963 toy safety standard also does not address water beads marketed to individuals 14 years or older as gel blasters or used for home decoration or other purposes.

“Regardless of the intended user or marketing strategy used, a water bead that becomes accessible to a child has the same high-risk characteristics and potential harms. This underscores the need for a more comprehensive regulatory approach,” said Dr. Smith. “To be successful, revisions of the ASTM F963 standard and other policy efforts should focus on the primary characteristic of water beads that makes them hazardous, which is their expanding nature.”

Legislation introduced in the U.S. Senate (S.4298, Esther’s Law) in May 2024 would ban water beads that expand by 50% or more with hydration or expand to a size of 3 millimeters or larger. This legislation followed a similar bill introduced in the U.S. House of Representatives (H.R.6468) in November 2023, titled the “Ban Water Beads Act,” and applies to water beads marketed not only as toys, but as educational materials, art materials or art material products, or sensory stimulation materials or sensory tools. The U.S. Consumer Product Safety Commission is also considering regulation of water bead safety. Major U.S. retailers have stopped selling water bead toys in stores and online.

“Many parents are not aware that water beads can be harmful to children,” said Marcel Casavant, MD, co-author of this study and physician at Nationwide Children’s Hospital. “If children younger than six years or with developmental delays live in or visit your home, keep water beads out of your home and talk with your childcare directors, preschool teachers, therapists, and others who may be using water beads with young children.”

Data for this study were obtained from the National Electronic Injury Surveillance System (NEISS) database, which is maintained by the U.S. Consumer Product Safety Commission. The NEISS database provides information on consumer product-related and sports- and recreation-related injuries treated in hospital emergency departments across the country.

FDA Announces Milestone in Sodium Reduction Efforts

 Issues Draft Guidance with Lower Target Levels for Certain Foods

Today, the U.S. Food and Drug Administration marked a milestone in its data-driven, stepwise approach to sodium reduction across the food supply: building on Phase I of its voluntary sodium reduction targets, the agency issued draft guidance for Phase II. Prior to 2021, consumer intake averaged approximately 3,400 milligrams per day, far higher than the limit of 2,300 milligrams per day recommended by the Dietary Guidelines for Americans for those 14 years and older. If finalized, the new set of voluntary targets would support reducing average individual sodium intake to about 2,750 milligrams per day, approximately 20% lower than intake levels prior to 2021.
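As a quick arithmetic sketch using only the figures quoted in the announcement, the "approximately 20%" claim can be checked directly:

```python
# Average daily sodium intake figures cited in the announcement (mg/day).
pre_2021_intake = 3400     # approximate average consumer intake prior to 2021
phase2_target = 2750       # average intake the Phase II targets would support
recommended_limit = 2300   # Dietary Guidelines limit for ages 14 and older

# Relative reduction from pre-2021 intake to the Phase II target.
reduction = (pre_2021_intake - phase2_target) / pre_2021_intake
print(f"Reduction: {reduction:.0%}")  # prints "Reduction: 19%", i.e. roughly 20% lower

# Even at the Phase II target, average intake would remain above the guideline.
print(f"Gap to guideline: {phase2_target - recommended_limit} mg/day")  # 450 mg/day
```

The sketch also makes clear that the Phase II target is an intermediate step: it still sits 450 mg/day above the Dietary Guidelines limit, consistent with the agency's stepwise approach.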

The Phase II voluntary sodium reduction targets follow an initial set of targets issued in October 2021, which encouraged the food industry to reduce sodium levels in a wide variety of processed, packaged, and prepared foods. Preliminary data from 2022 show that about 40% of the initial Phase I targets are very close to or have already been reached, indicating early success of this effort.

“Reducing sodium in the food supply has the potential to be one of the most important public health initiatives in a generation. The early successes we’re seeing with sodium level reduction in certain foods is encouraging and indicative of the impact we believe our overall nutrition approach can have on the wellbeing of society,” said FDA Deputy Commissioner for Human Foods Jim Jones. “In addition to our sodium reduction efforts, the FDA is also actively working on a forthcoming final rule updating the definition of the claim ‘healthy,’ a proposed rule for front-of-package nutrition labeling and exploring ways to reduce added sugars consumption. The FDA’s sodium reduction and other nutrition initiatives are central to a broader, whole-of-government approach to help reduce the burden of diet-related chronic diseases and advance health equity.”

The Phase II targets will continue to focus on commercially processed, packaged, and prepared foods in the marketplace. This guidance is particularly relevant as more than 70% of sodium intake in the U.S. population comes from sodium added during food manufacturing and commercial food preparation. The preliminary Phase I data released today, along with public and external feedback, informed the draft Phase II targets. 

The U.S. faces an ever-growing epidemic of diet-related chronic diseases. Too much sodium can raise blood pressure, a major risk factor for heart disease and stroke. Strong scientific evidence supports lowering sodium intake from current levels. Reducing sodium intake has the potential to prevent hundreds of thousands of premature deaths and illnesses in the coming years by helping to reduce risk for heart disease and stroke. Because underserved communities, including racial and ethnic minority groups, experience high blood pressure at increased rates compared to the overall average, reducing sodium in the food supply also could help advance health equity for these populations.

The agency’s sodium reduction initiative is part of the White House National Strategy on Hunger, Nutrition and Health to reduce diet-related diseases by 2030. The FDA’s Phase II voluntary sodium reduction targets reflect what is known about achievable reductions in different food categories, consumer acceptance, and food safety, and align with the Healthy People 2030 goal of reducing average individual sodium intake to approximately 2,750 milligrams per day in the U.S. The Phase II voluntary sodium reduction targets also work in concert with the U.S. Department of Agriculture’s school meals sodium limits, so children have access to healthy choices inside and outside of school.   

Additional sodium-related actions the FDA has taken include: the issuance of a proposed rule to amend the standards of identity to permit the use of salt substitutes in foods for which salt is a required or optional ingredient, and guidance on use of the term “potassium salt” instead of “potassium chloride” to signal consumers that the ingredient is a salt substitute. 

The FDA will continue its stepwise approach to sodium reduction. The agency will also issue a complete evaluation of industry’s progress against the Phase I targets when the data from 2024 become available and are analyzed. The FDA expects to issue regular evaluations of sodium levels in foods about every three years to support its science-driven, transparent, and stepwise process. Future phases of sodium reduction targets will be considered as part of the agency’s evaluation and monitoring of sodium reduction progress in the marketplace, as well as monitoring of sodium intake in the population. 

Wednesday, August 14, 2024

Lack of purpose and personal growth may precede mild cognitive impairment

 

Feeling that your life lacks purpose and that there are few opportunities for personal growth in older age may precede the development of mild cognitive impairment (MCI), a frequent precursor of dementia, suggests research published online in the Journal of Neurology Neurosurgery & Psychiatry.

These aspects of psychological wellbeing noticeably decline 2 to 6 years before MCI is diagnosed, even in the absence of evident signs, and irrespective of whether those affected go on to develop dementia, the findings indicate.

Mounting evidence links psychological wellbeing to brain ageing, including the development of dementia. But much of the published research focuses on a sense of purpose, excluding the other aspects of wellbeing, explain the researchers.

These include self-acceptance, autonomy, feeling capable of managing one’s immediate environment, having meaningful connections with others, and personal growth.

To strengthen the evidence base, the researchers explored changes over time in psychological wellbeing before and after diagnoses of MCI and dementia among 910 cognitively intact older adults (average age 79) participating in the Rush Memory and Aging Project.

This Project is an ongoing long-term study that began in 1997. It includes older adults from senior and subsidised housing, continuing care retirement communities, social service agencies, church groups, and individual homes in northeastern Illinois, USA.

Study participants have annual check-ups that include neurological examinations, cognitive tests, medical history, and assessment of psychological wellbeing, which from 2008 onwards included all 6 components.

During an average monitoring period of 14 years, 265 participants (29% of the cohort) developed MCI, and 89 of these (34% of MCI cases) went on to develop dementia. The final analysis is based on 229 participants with complete before-and-after data, including 73 who developed dementia.
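The two percentages use different denominators (the full cohort versus the MCI group); a small sketch with the reported counts makes the bases explicit:

```python
# Counts reported for the Rush Memory and Aging Project participants.
cohort = 910         # cognitively intact older adults at baseline
mci_cases = 265      # participants who developed MCI during follow-up
dementia_cases = 89  # MCI cases who went on to develop dementia

print(f"MCI: {mci_cases / cohort:.0%} of the full cohort")         # prints "MCI: 29% of the full cohort"
print(f"Dementia: {dementia_cases / mci_cases:.0%} of MCI cases")  # prints "Dementia: 34% of MCI cases"
```

So the 34% dementia figure applies to those who first developed MCI, not to the study population as a whole (89/910 would be about 10%).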

Compared with participants who remained cognitively intact, those who developed MCI were more likely to be older, weigh less, and have lower levels of depressive symptoms and psychological wellbeing. 

Similarly, compared with those who didn’t develop dementia, those who did were more likely to be older, female, to carry the gene linked to dementia (APOE ε4), and to have a lower level of psychological wellbeing.

After accounting for potentially influential factors, such as age, vascular disease and its risk factors, lifestyle, social activities, and feelings of loneliness, those who developed MCI experienced a faster decline in psychological wellbeing than those who remained cognitively intact, reaching a lower level of wellbeing 2 years before diagnosis.

In particular, these people had lower levels of purpose in life and personal growth, beginning 3 and 6 years, respectively, before their diagnosis. 

The speed of psychological wellbeing decline was similar before and after their diagnosis for each component except for meaningful connections with others, which declined faster afterwards.

Wellbeing trajectories were similar for all participants with MCI regardless of whether they subsequently developed dementia, prompting the researchers to suggest that their findings “indicate that reduced psychological wellbeing even without apparent cognitive impairment may be a predictor of subsequent dementing disorders.”

This is an observational study, and as such, no firm conclusions can be drawn about cause and effect. The study participants were well educated, which may introduce selection bias because of the ‘healthy volunteer’ effect, and most of them were White and female, which may limit the generalisability of the findings, acknowledge the researchers.

And the mechanisms underlying the association between wellbeing and cognitive function aren’t well understood, they add. 

The two might be bi-directional: in other words, poorer cognition might influence psychological wellbeing as well as the other way round; greater wellbeing and better cognitive function may also share certain protective factors, they suggest.

And the discrepancies across the various wellbeing components may lie in differences in the level of cognitive processing required, they say. 

“Our findings indicate that personal growth and purpose in life may be more cognitively demanding than other components of wellbeing, and therefore may serve as more sensitive indicators of cognitive ageing,” they write.

“Moreover, we found that positive relations with others declined rapidly after MCI diagnosis. People with impaired cognitive function may be less likely to engage in social and leisure activities than they were previously, which can cause further deterioration in their relationships with friends or others,” they add.  

Psychological support should be planned for people diagnosed with dementing disorders, they advocate.

Shingles associated with increased risk of subsequent cognitive decline

 

A new study led by investigators from Brigham and Women’s Hospital, a founding member of the Mass General Brigham healthcare system, found that an episode of shingles is associated with about a 20 percent higher long-term risk of subjective cognitive decline. The study’s findings provide additional support for getting the shingles vaccine to decrease risk of developing shingles, according to the researchers. Their results are published in Alzheimer's Research & Therapy.

"Our findings show long-term implications of shingles and highlight the importance of public health efforts to prevent and promote uptake of the shingles vaccine," said corresponding author Sharon Curhan, MD, of the Channing Division for Network Medicine at Brigham and Women's Hospital. "Given the growing number of Americans at risk for this painful and often disabling disease and the availability of a very effective vaccine, shingles vaccination could provide a valuable opportunity to reduce the burden of shingles and possibly reduce the burden of subsequent cognitive decline."

Shingles, medically known as “herpes zoster,” is a viral infection that often causes a painful rash. Shingles is caused by the varicella zoster virus (VZV), the same virus that causes chickenpox. After a person has chickenpox, the virus stays in their body for the rest of their life. Most of the time, our immune system keeps the virus at bay. Years and even decades later, the virus may reactivate as shingles.

Almost all individuals in the US age 50 years and older have been infected with VZV and are therefore at risk for shingles. There's a growing body of evidence that herpes viruses, including VZV, can influence cognitive decline. Subjective cognitive decline is an individual’s self-perceived experience of worsening or more frequent confusion or memory loss. It is a form of cognitive impairment and is one of the earliest noticeable symptoms of Alzheimer’s disease and related dementias.

Previous studies of shingles and dementia have been conflicting: some research indicates that shingles increases the risk of dementia, while other studies find no association or a negative association. In recent studies, the shingles vaccine was associated with a reduced risk of dementia.

To learn more about the link between shingles and cognitive decline, Curhan and her team used data from three large, well-characterized studies of men and women over long periods: The Nurses’ Health Study, the Nurses’ Health Study 2, and the Health Professionals Follow-Up Study. The study included 149,327 participants who completed health status surveys every two years, including questions about shingles episodes and cognitive decline. They compared those who had shingles with those who didn't.

Curhan designed the study with first author Tian-Shin Yeh, formerly of the Harvard TH Chan School of Public Health. The researchers found that a history of shingles was significantly and independently associated with a higher risk—approximately 20 percent higher—of subjective cognitive decline in both women and men. That risk was higher among men who were carriers of the gene APOE4, which is linked to cognitive impairment and dementia. That same association wasn't present in the women.

Researchers don't know the mechanisms that link the virus to cognitive health, but there are several possible ways it may contribute to cognitive decline. There is growing evidence linking VZV to vascular disease, called VZV vasculopathy, in which the virus causes damage to blood vessels in the brain or body. Curhan’s group previously found that shingles was associated with higher long-term risk of stroke or heart disease.

Other mechanisms that may explain how the virus may lead to cognitive decline include causing inflammation in the brain, directly damaging the nerve and brain cells, and the activation of other herpesviruses.

The limitations of this research include that it was an observational study, that information was based on self-report, and that the study population was mostly white and highly educated. In future studies, the researchers hope to learn more about preventing shingles and its complications.

“We’re evaluating to see if we can identify risk factors that could be modified to help reduce people’s risk of developing shingles,” Curhan said. “We also want to study whether the shingles vaccine can help reduce the risk of adverse health outcomes from shingles, such as cardiovascular disease and cognitive decline.” 

Saturday, August 10, 2024

Vegan diet better than Mediterranean diet for weight loss and reducing inflammation

 Eating a low-fat vegan diet reduces harmful inflammatory dietary compounds called advanced glycation end-products (AGEs) by 73%, compared to no reduction on a Mediterranean diet, according to new research by the Physicians Committee for Responsible Medicine published in Frontiers in Nutrition. The decrease in AGEs on the vegan diet was associated with an average weight loss of 13 pounds, compared with no change on the Mediterranean diet.

The reduction of dietary AGEs on the low-fat vegan diet came mainly from excluding the consumption of meat (41%), minimizing the consumption of added fats (27%), and avoiding dairy products (14%).

“The study helps bust the myth that a Mediterranean diet is best for weight loss,” says lead study author Hana Kahleova, MD, PhD, director of clinical research at the Physicians Committee for Responsible Medicine. “Choosing a low-fat vegan diet that avoids the dairy and oil so common in the Mediterranean diet helps reduce intake of harmful advanced glycation end-products, leading to significant weight loss.”

AGEs may be ingested through the diet, and animal products are generally higher in AGEs than plant foods. Cooking with high heat under dry conditions, such as grilling, leads to significant formation of AGEs, especially in animal-derived foods, which are also rich in fats. High amounts of AGEs circulating in the body can contribute to insulin resistance, which can lead to weight gain. AGEs are also linked to inflammation and oxidative stress, which contribute to chronic diseases like heart disease and type 2 diabetes.

The new research is a secondary analysis of a previous Physicians Committee study comparing a low-fat vegan diet to a Mediterranean diet. The study randomly assigned participants to either a low-fat vegan diet, which consisted of fruits, vegetables, grains, and beans, or a Mediterranean diet, which focused on fruits, vegetables, legumes, fish, low-fat dairy, and extra virgin olive oil, for 16 weeks. Neither group had a calorie limit. Participants then went back to their baseline diets for a four-week washout period before switching to the opposite diet for an additional 16 weeks. Dietary AGEs were calculated based on self-reported dietary intake records. AGE scores were assigned to each food item using a published database of AGE content.

“Our research shows that you can use the power of your plate to lose weight with a low-fat vegan diet that’s rich in fruits, vegetables, grains, and beans and low in AGEs,” adds Dr. Kahleova. “It’s a simple and delicious way to maintain a healthy weight and fight chronic disease.”