New credit card-sized TB test could close the diagnostic gap in HIV hotspots: Study

Current tuberculosis infection tests struggle to detect the disease in those with HIV. A common co-infection, HIV can hide TB from traditional tests by eliminating the immune cells relied upon to sound the alarm.

While more than 90% of the 2 billion TB cases worldwide are latent, meaning symptom-free and not contagious, the weakening of the immune system in those with HIV can allow latent TB to turn active, increasing the potential for new infections to spread and often resulting in fatal outcomes. Tuberculosis is the leading cause of death among people with HIV worldwide.

Now, Tulane University researchers have developed a new handheld TB test that significantly improves detection in people with HIV, according to a new study in Nature Biomedical Engineering. Powered by a beetle-inspired chemical reaction, the device requires no electricity and addresses a critical gap in TB infection detection that has long hobbled efforts to eliminate the world’s deadliest infectious disease.

Dubbed the ASTRA (Antigen-Specific T-cell Response Assay), the credit card-sized device requires only a drop of blood to provide same-day diagnoses without the need for a laboratory or trained staff. When tested against the traditional IGRA blood test (Interferon-Gamma Release Assay), the ASTRA detected TB in HIV-infected individuals with 87% specificity, compared to IGRA’s 60%, while also outperforming it in detecting TB without HIV co-infection.

“The goal was to develop a TB test that could be taken anywhere and provide quicker, more accurate results for anybody,” said senior author Tony Hu, PhD, Weatherhead Presidential Chair in Biotechnology Innovation at Tulane University and director of the Tulane Center for Cellular & Molecular Diagnostics. “Current tests such as the IGRA are cost-prohibitive or require access to facilities that resource-limited communities don’t have. If we are going to eliminate TB, we have to diagnose and treat as many infection cases as possible.”

Added Bo Ning, lead author and assistant professor of biochemistry at Tulane University School of Medicine: “If your community has an immunocompromised population, someone may have latent TB. This can help block the spread of TB and ensure that no one slips through the cracks.”

To create a test that would not be stymied by HIV, the researchers identified two new biomarkers that could detect TB without relying on the immune cells susceptible to the virus.

After a drop of blood is added, the device must incubate for four hours to allow a preloaded reagent to stimulate a response from the immune cells. The reagent acts as a “wanted poster,” asking the immune cells whether they have seen tuberculosis bacteria before.

To avoid the use of electricity, the researchers looked to an unlikely source for inspiration: the bombardier beetle. When threatened, these large insects combine two chemicals, and the resulting reaction produces a forceful spray. Similarly, two chemicals in the ASTRA are combined to propel the sample across a chip for final analysis and diagnosis.

The new device delivers results in about 4 hours, compared to the IGRA, which takes 24 hours, and a common TB skin test, which can take between two and three days for a diagnosis.

The ASTRA’s performance was validated using samples collected from a cohort in Eswatini, a country with high TB incidence and the highest reported HIV prevalence (27.3%) worldwide.

Increasing testing accuracy, access and speed is even more vital as TB resistance to drugs grows more robust, Hu said.

“The sooner you have a diagnosis, the sooner you can begin the process of determining proper treatment,” Hu said. “TB is the No. 1 pathogen HIV patients worry about globally. If treatment is available, we should be working to kill these bacteria, latent or not.”

Reference:

Ning, B., Chandra, S., Pan, Y. et al. Self-powered rapid antigen-specific T-cell response assay for Mycobacterium tuberculosis infections. Nat. Biomed. Eng. (2025). https://doi.org/10.1038/s41551-025-01441-5.

Powered by WPeMatico

Benzodiazepine Breakthrough: Enhancing Anesthesia for TAVI with Remimazolam, finds study

A recent study investigated the perioperative management of patients undergoing transcatheter aortic valve implantation (TAVI) using remimazolam, a short-acting benzodiazepine, in combination with flumazenil, contrasting it with sevoflurane. The hypothesis was that this combination could significantly reduce emergence time compared with traditional sevoflurane anesthesia.

Methodology

A prospective, randomized, open-label trial was conducted, enrolling 60 patients aged 18 or older with severe aortic stenosis. Participants were divided into two groups: one receiving remimazolam with flumazenil and the other receiving sevoflurane. The primary outcome was the time to extubation after anesthesia discontinuation; secondary outcomes included hemodynamic variables, vasopressor usage, complication rates, and muscle recovery metrics.

Results indicated that the median time to extubation was significantly shorter in the remimazolam group (6.5 minutes) than in the sevoflurane group (14.2 minutes), a median difference of −6.9 minutes (95% CI: −8.7 to −5.0; P < 0.001). The remimazolam group also required fewer boluses of ephedrine for hemodynamic support, suggesting improved cardiovascular stability; however, mean arterial pressure (MAP), heart rate (HR), and overall vasopressor requirements did not differ significantly between groups.

Intraoperative Findings

Intraoperatively, parameters such as perfusion index (PI) and regional cerebral oxygen saturation (rSO2) showed statistically significant differences; the remimazolam group had lower values than those receiving sevoflurane, hinting at variations in hemodynamic responses although the clinical significance remains debatable. No significant intraoperative complications were reported in either group.

Flumazenil Considerations

The use of flumazenil to reverse the sedative effects of remimazolam is acknowledged, although potential risks including resedation are noted. Statistical analysis was conducted utilizing mixed models to assess intraoperative variables, reinforcing the findings of reduced emergence time without compromising intraoperative hemodynamics.

Conclusions and Implications

Despite limitations, including a lack of blinding and a single-center design, the data generally support the conclusion that combining remimazolam with flumazenil yields more efficient emergence from anesthesia than sevoflurane in high-risk patients undergoing TAVI. These findings suggest that remimazolam could be a viable anesthetic option for these procedures, balancing safety and recovery time within the perioperative context.

Key Points

– A prospective, randomized, open-label trial with 60 patients assessed the efficacy of remimazolam combined with flumazenil against sevoflurane for perioperative management in transcatheter aortic valve implantation (TAVI), focusing on emergence time as the primary endpoint.

– The median extubation time post-anesthesia was significantly shorter in the remimazolam group (6.5 minutes) compared to the sevoflurane group (14.2 minutes), indicating a notable reduction of 6.9 minutes (P < 0.001).

– Hemodynamic support requirements were lower in the remimazolam group, with fewer doses of ephedrine administered, signifying greater cardiovascular stability; however, other hemodynamic metrics such as mean arterial pressure and heart rate did not differ notably between the groups.

– Intraoperative measurements, including perfusion index and regional cerebral oxygen saturation, presented statistically lower values in the remimazolam cohort, suggesting altered hemodynamic responses, although the clinical relevance of these differences is questionable.

– The administration of flumazenil to counteract remimazolam’s sedative effects was acknowledged, alongside the consideration of risks such as potential resedation; mixed models were employed for statistical analysis of intraoperative variables.

– The study concludes that remimazolam and flumazenil offer a more efficient emergence from anesthesia compared to sevoflurane for high-risk TAVI patients, with implications for enhanced safety and reduced recovery times, despite inherent limitations including a lack of blinding and a single-center study design.

Reference –

Harimochi, S., Godai, K., Nakahara, M. et al. Comparison of remimazolam and sevoflurane for general anesthesia during transcatheter aortic valve implantation: a randomized trial. Can J Anesth/J Can Anesth 72, 397–408 (2025). https://doi.org/10.1007/s12630-024-02900-4


Low Fluid Removal Rate in AKI Patients Linked to Higher Mortality: Study

Researchers have found that a low rate of fluid removal in patients with severe acute kidney injury (AKI) receiving continuous dialysis is associated with a markedly increased risk of death within 90 days. In a large retrospective study at the University of Chicago, patients with the least aggressive fluid removal during the first three days of continuous veno-venous hemodialysis (CVVHD) had the highest mortality, compared with those with moderate or high fluid removal. These results underscore the need to optimize fluid removal rates in order to improve survival in critically ill patients with AKI. The study was published in the Journal of Critical Care by Samantha Gunning and colleagues.

Fluid overload is prevalent in patients with severe AKI and has long been linked with adverse outcomes, such as longer-term dialysis dependence and death. However, little is known regarding how much fluid to remove, and how rapidly, to maximize outcomes during continuous dialysis. The study sought to investigate whether the rate of fluid removal via CVVHD contributed to mortality among ICU patients with AKI.

From April 1, 2016, to March 31, 2020, 1,242 adult ICU patients receiving CVVHD for AKI were included in this single-center retrospective cohort study. Scientists gathered data on daily fluid balance, pre-existing health status, and outcomes in patients to explore the relationship between fluid removal rate and survival.

The patients were stratified according to fluid removal rate in the first three days of CVVHD based on body-weight calculation. Low fluid removal rate was ≤1.01 mL/kg/h, and high fluid removal rate was >1.75 mL/kg/h. Variables at baseline like weight, pre-existing heart failure, and initial fluid overload were examined to ascertain their impact on patient outcomes. A multivariable adjusted hazard model was employed to determine the relationship between fluid removal rate and 90-day mortality.
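As an illustration of the stratification described above, here is a minimal Python sketch (not the authors' code): it classifies a patient by the reported cutoffs, and the "moderate" band between the two thresholds is an assumption for illustration.

```python
# Illustrative sketch of the study's stratification by body-weight-normalized
# fluid removal rate. The paper defines low as <=1.01 and high as >1.75 mL/kg/h;
# treating the band between them as "moderate" is an assumption.

def fluid_removal_group(rate_ml_kg_h: float) -> str:
    """Classify a net fluid removal rate in mL/kg/h."""
    if rate_ml_kg_h <= 1.01:
        return "low"
    elif rate_ml_kg_h > 1.75:
        return "high"
    else:
        return "moderate"

# Example: a 70 kg patient with 1.4 L net removal over 24 hours
rate = 1400 / 70 / 24  # ≈ 0.83 mL/kg/h
print(fluid_removal_group(rate))  # low
```

A rate is computed per kilogram of body weight per hour, so the same absolute fluid removal can fall into different groups for patients of different weights.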

Results

In the 1,242 patients analyzed:

  • The highest 90-day mortality (74%) occurred among patients with a low fluid removal rate.

  • In an analysis adjusted for confounders, low fluid removal was independently associated with increased 90-day mortality, with an adjusted hazard ratio (AHR) of 2.74 (95% CI: 1.58–4.76).

  • Conversely, high fluid removal was not associated with elevated mortality and may even confer a survival advantage (AHR: 0.50; 95% CI: 0.30–1.03), although this did not reach statistical significance.

  • Baseline differences between groups included higher rates of congestive heart failure and variability in pre-CVVHD cumulative fluid balance and body weight.

This research concludes that a low fluid removal rate during continuous dialysis in AKI patients is a significant and independent predictor of 90-day mortality. Although higher fluid removal rates were not harmful, they may be beneficial and warrant further investigation in clinical trials. These findings highlight the importance of close monitoring of fluid management strategies in the ICU, particularly for critically ill AKI patients on CVVHD.

Reference:

Gunning, S., Mire, M., Gulotta, G., & Koyner, J. (2025). Impact of fluid removal rate on patients receiving continuous renal replacement therapy for acute kidney injury. Journal of Critical Care, 89, 155161. https://doi.org/10.1016/j.jcrc.2025.155161


MetS Increases Mortality Risk in Stroke Patients After Intravenous Thrombolysis: Study

A new study from Shanxi Bethune Hospital revealed that metabolic syndrome (MetS) significantly worsens outcomes in patients with acute ischemic stroke (AIS) who receive intravenous thrombolysis (IVT). The findings were published in the recent issue of Frontiers in Neurology.

This prospective cohort study, conducted from January 2022 to December 2023, examined 292 AIS patients who underwent IVT. Participants were divided into two groups, those with MetS and those without, and propensity score matching was used to control for baseline differences such as age, sex, and stroke severity.

Within three months of treatment, the all-cause mortality rate in patients with MetS was 24%, more than double the 11.6% mortality observed in patients without MetS. Even after adjusting for potential confounding factors, MetS remained an independent predictor of death, with an adjusted hazard ratio of 2.50 (95% CI: 1.35–4.60, p < 0.01). This suggests that the patients with MetS were 2.5 times more likely to die within 3 months of stroke, despite receiving timely thrombolytic therapy.

Beyond mortality, MetS also predicted worse recovery and higher complications. Patients with MetS were significantly less likely to achieve a good functional outcome (defined as a score of 0 to 2 on the modified Rankin Scale), which indicated slight or no disability. The adjusted odds ratio for good recovery was just 0.47 (95% CI: 0.28–0.77, p < 0.01), reflecting nearly a 53% lower likelihood of favorable neurological recovery when compared to non-MetS counterparts.
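The plain-language interpretations above follow directly from the reported effect sizes; a brief illustrative calculation (assumed arithmetic, not code from the study) makes the mapping explicit.

```python
# How the reported figures translate into the article's plain-language claims.

mets_mortality = 0.24        # 90-day mortality with MetS
non_mets_mortality = 0.116   # 90-day mortality without MetS
mortality_ratio = mets_mortality / non_mets_mortality  # ≈ 2.07, "more than double"

adjusted_or_good_outcome = 0.47  # adjusted OR for good functional recovery
percent_lower_odds = (1 - adjusted_or_good_outcome) * 100  # ≈ 53% lower odds

print(round(mortality_ratio, 2), round(percent_lower_odds))
```

Note that an odds ratio of 0.47 means 53% lower odds of a good outcome, which is close to, but not identical to, a 53% lower probability.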

Additionally, the risk of symptomatic intracranial hemorrhage (SICH) was notably higher in the MetS group. These patients had more than twice the odds of developing SICH (adjusted OR = 2.40, 95% CI: 1.17–4.92, p = 0.02), which can severely worsen outcomes or prove fatal.

Importantly, as the number of MetS components increased (including obesity, high blood pressure, high blood sugar, high triglycerides, and low HDL cholesterol), so did the risk of mortality. This suggests that the cumulative metabolic burden directly impacts stroke prognosis. These findings indicate that even before a stroke occurs, controlling metabolic health could be crucial not only for prevention but also for improving survival and recovery afterward.

Reference:

Chen, W., Liu, D., Li, Z., & Zhang, X. (2025). Metabolic syndrome is associated with prognosis in patients with acute ischemic stroke after intravenous thrombolysis: a prospective cohort study. Frontiers in Neurology, 16. https://doi.org/10.3389/fneur.2025.1598434


Socioeconomic Factors May Overshadow Racial Emphasis in Frontal Fibrosing Alopecia: Study Shows

USA: A recent study has revealed that patients with frontal fibrosing alopecia (FFA) are more likely to live in affluent areas, as determined by their Social Vulnerability Index (SVI), compared to those with alopecia areata (AA). The researchers argue that the traditionally held belief that FFA predominantly affects certain racial groups may reflect underlying socioeconomic disparities rather than true racial predisposition.

The study, published in the Journal of the American Academy of Dermatology and led by Jiana Wyche and colleagues from Meharry Medical College, examined the association between socioeconomic status and the prevalence of FFA. FFA is a type of scarring alopecia that has seen a steady rise in cases over recent years. Historically, it has been reported more frequently in White, postmenopausal women. However, the growing number of cases has prompted investigations into potential environmental and lifestyle factors—including the use of facial cosmetics and sunscreens—which tend to be more common in individuals from higher socioeconomic backgrounds.

To better understand the role of socioeconomic status in FFA, researchers conducted a retrospective cohort analysis of patients from Johns Hopkins Hospital, Baltimore, over nine years (2015–2024). The study included 147 individuals diagnosed with FFA and 429 with AA, a condition chosen for comparison due to its similar immune-mediated mechanism. Data were collected from patients across 15 U.S. states, with a large majority (over 84%) from Maryland.

Using the CDC’s Social Vulnerability Index—an indicator that incorporates data on income, education, housing, and other demographic factors—the team assessed the socioeconomic background of each patient based on their zip code. The SVI categorizes areas into four levels of vulnerability, with “low vulnerability” (SVI < 0.25) representing the most affluent areas.
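The zip-code grouping described above can be sketched as follows. Only the low-vulnerability boundary (SVI < 0.25) is stated in the text; the remaining cutoffs at 0.50 and 0.75 are assumed here from the CDC's standard quartile ranking.

```python
# Sketch of the SVI-based grouping used in the study. The "low vulnerability"
# boundary (< 0.25) is stated in the article; the 0.50 and 0.75 cutoffs are
# assumed from the CDC's standard quartiles.

def svi_category(svi: float) -> str:
    """Map a Social Vulnerability Index percentile rank (0-1) to a category."""
    if svi < 0.25:
        return "low vulnerability"        # most affluent areas
    elif svi < 0.50:
        return "low-moderate vulnerability"
    elif svi < 0.75:
        return "moderate-high vulnerability"
    return "high vulnerability"

print(svi_category(0.18))  # low vulnerability
```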

The key findings of the study were as follows:

  • Patients with frontal fibrosing alopecia (FFA) were significantly more likely to live in low-vulnerability (more affluent) zip codes compared to those with alopecia areata (AA), with rates of 50.3% vs. 34.03%.
  • The likelihood of an FFA patient residing in an affluent area was notably higher, with an odds ratio of 1.786.
  • When researchers adjusted for age and socioeconomic status, race was no longer found to be an independent predictor of FFA.
  • Although unadjusted data showed FFA was more common in White individuals and AA in Black individuals, the adjusted analysis suggests these racial patterns may actually reflect underlying socioeconomic differences.

“Our study suggests that FFA patients are more likely to be from affluent zip codes as determined by their SVI when compared to AA patients,” the researchers wrote. “We believe, therefore, that historical emphasis on race in FFA may have been overstated due to the impact of race on socioeconomic status.”

They concluded, “These findings offer new insights into the factors contributing to FFA and highlight the importance of considering social and economic contexts in dermatologic research and diagnosis.”

Reference:

Wyche J, Tsang DA, Aguh C. Odds of developing frontal fibrosing alopecia more closely tied to affluence than race – A retrospective cohort study. J Am Acad Dermatol. 2025 May 16:S0190-9622(25)02133-4. doi: 10.1016/j.jaad.2025.05.1394. Epub ahead of print. PMID: 40383279.


Vitamin D3 nanoemulsion significantly improves core symptoms in children with autism: A clinical trial

This study investigates the effectiveness of a vitamin D3-loaded nanoemulsion in improving the core symptoms of autism spectrum disorder (ASD) in children. Children with ASD often have low vitamin D3 levels, which are linked to delays in language development, adaptive behavior, and fine motor skills. While traditional vitamin D3 supplementation has shown mixed results in past studies, this research evaluates whether a nanoemulsion form, engineered to enhance absorption and bioavailability, might produce better outcomes.

Eighty children between the ages of 3 and 6 with diagnosed ASD were randomly assigned into two groups: one receiving the vitamin D3 nanoemulsion, and the other receiving a standard marketed vitamin D3 product, both for a duration of 6 months. Their vitamin D3 levels, adaptive behaviors, and language abilities were assessed before and after supplementation using standardized tools such as the Childhood Autism Rating Scale (CARS), Vineland Adaptive Behavior Scale, and Preschool Language Scale. Only the nanoemulsion group showed statistically significant improvements in vitamin D3 levels, autism severity, social IQ, and both receptive and expressive language performance. The conventional supplement, despite raising blood vitamin D3 levels, did not lead to meaningful improvements in behavioral outcomes.

The study concludes that the nanoemulsion form of vitamin D3 is superior to the conventional oral form in terms of increasing vitamin bioavailability and producing clinically relevant improvements in children with ASD. The authors suggest that nanoemulsion technology could offer a valuable strategy for enhancing the effectiveness of nutritional interventions in neurodevelopmental disorders. However, they acknowledge that further studies with larger sample sizes and long-term follow-up are needed to confirm these findings and explore potential gender-related differences in response.

Reference:

Nagwa A. Meguid, Maha Hemimi, Gina Hussein, Ahmed Elnahry, Marwa Hasanein Asfour, Sameh Hosam Abd El-Alim, Ahmed Alaa Kassem, Abeer Salama, Amr Sobhi Gouda, Walaa Samy Nazim, Radwa Ibrahim Ali Hassan, Neveen Hassan Nashaat, Improved core manifestations of autism following supplementation with vitamin D3-loaded nanoemulsion, LabMed Discovery, https://doi.org/10.1016/j.lmd.2025.100071.


Calcium-Based Additives Reduce Soft Drink Erosion, Protect Enamel and Dentin, suggests study

According to a new study, solutions containing calcium lactate pentahydrate (CLP) significantly reduced the erosive potential of soft drinks, especially when combined with linear sodium polyphosphate (LPP) or sodium trimetaphosphate (TMP), offering enhanced protection for dentin. Notably, LPP alone was effective in minimizing erosion of both enamel and dentin. These findings suggest clinical benefits of this strategy for individuals at high risk of erosive tooth wear, particularly those less compliant with preventive measures.

A study was done to evaluate the erosive potential of a soft drink modified with film-forming polymers and calcium on bovine enamel and dentin. Sprite Zero Sugar was modified with linear sodium polyphosphate (LPP, 10 g/L) and sodium trimetaphosphate (TMP, 10 g/L), individually or combined with calcium lactate pentahydrate (CLP, 4.35 g/L). Enamel and dentin specimens were randomly assigned to six groups (n = 10 per substrate):

  • C− (negative control, no modification)
  • LPP
  • TMP
  • LPP+CLP
  • TMP+CLP
  • C+ (positive control, CLP)

The specimens underwent an erosion-remineralization cycling. Surface loss (SL, in μm) was measured with an optical profilometer, and the color and viscosity of the drinks were analyzed. Data were statistically analyzed (α = 0.05).

Results: For both enamel and dentin, LPP significantly reduced the erosive effect of the drink compared with C− (p < 0.001 for both), with reductions of approximately 53% and 41%, respectively. TMP showed no significant difference from C− for either substrate. C+ reduced SL by 87% in enamel and 38% in dentin compared with C− (p < 0.001). When CLP was combined with the polymers, a 97% reduction in SL was observed in enamel for both LPP+CLP and TMP+CLP; in dentin, reductions of 56% and 48% were observed for LPP+CLP and TMP+CLP, respectively. No significant differences were observed between the groups and C− regarding color and viscosity (p > 0.05).

All solutions containing calcium lactate (CLP) were effective in reducing the erosive potential of the original soft drink. The combinations of CLP with LPP or TMP significantly enhanced protection, especially for dentin. Notably, LPP alone was effective in minimizing erosion of both enamel and dentin. Reducing the erosive potential of soft drinks may benefit non-compliant individuals at high risk for erosive tooth wear.

Reference:

Cláudia Allegrini Kairalla, Milena Rodrigues Muniz, Letícia Oba Sakae, Fernando Neves Nogueira, Idalina Vieira Aoki, Juliano Pelim Pessan, Alessandra Bühler Borges, Taís Scaramucci. Reduction of the erosive potential of a soft drink with polymers and calcium. Journal of Dentistry, Volume 161, 2025, 105935, ISSN 0300-5712, https://doi.org/10.1016/j.jdent.2025.105935. (https://www.sciencedirect.com/science/article/pii/S0300571225003793)



Adults who survived childhood cancer are at increased risk of severe COVID-19: Study

People who have survived cancer as children are at higher risk of developing severe COVID-19, even decades after their diagnosis. This is shown by a new study from Karolinska Institutet published in the journal The Lancet Regional Health – Europe.

Thanks to medical advances, more and more children are surviving cancer. However, even long after treatment has ended, health risks may remain. In a new registry study, researchers investigated how adult childhood cancer survivors in Sweden and Denmark were affected by the COVID-19 pandemic.

The study included over 13,000 people who had been diagnosed with cancer before the age of 20 and who were at least 20 years old when the pandemic began. They were compared with both siblings and randomly selected individuals from the population of the same gender and year of birth.

The results show that childhood cancer survivors had a lower risk of contracting COVID-19, but were 58 per cent more likely to develop severe disease if they did become infected. Severe COVID-19 was defined as the patient receiving hospital care, intensive care or death related to the infection.

“It is important to understand that even though these individuals were not infected more often, the consequences were more serious when they did become ill,” says Javier Louro, postdoctoral researcher at the Institute of Environmental Medicine at Karolinska Institutet and first author of the study.

The differences in risk were particularly clear during periods of high transmission, such as when the Alpha and Omicron variants were spreading rapidly. In Sweden, where pandemic management relied more on recommendations than restrictions, the increase in risk was greater than in Denmark, which introduced early and strict measures.

“Our results suggest that childhood cancer survivors should be considered a risk group in future pandemics or other health crises. This could involve prioritising them for vaccination or offering special protection during periods of high transmission,” says Javier Louro.

Reference:

Louro, Javier et al., COVID-19 infection and severity among childhood cancer survivors in Denmark and Sweden: a register-based cohort study with matched population and sibling comparisons, The Lancet Regional Health – Europe, DOI:10.1016/j.lanepe.2025.101363. 


Gut bacteria and amino acid imbalance linked to higher miscarriage risk in women with PCOS: Study

A new study presented today at the 41st Annual Meeting of the European Society of Human Reproduction and Embryology (ESHRE) reveals that women with polycystic ovary syndrome (PCOS) have distinct gut microbiota and metabolic signatures linked to premature endometrial ageing and a higher risk of adverse pregnancy outcomes.

The research highlights a sharp reduction in the beneficial gut bacterium Parabacteroides merdae (P. merdae), alongside elevated levels of branched-chain amino acids (BCAAs), particularly isoleucine, an essential amino acid involved in protein production and energy metabolism. Together, these changes may act as potential drivers of poor endometrial function and reproductive complications in women with PCOS.

Affecting up to one in five women of reproductive age globally, PCOS is a major cause of infertility. Although fertility treatments often succeed in helping women with PCOS conceive, they remain at higher risk of complications such as miscarriage, preterm birth, and gestational diabetes. Until now, the mechanisms behind this elevated risk have remained unclear.

“In clinical practice, we noticed that even younger women with PCOS who achieved pregnancy still faced unexpectedly high rates of miscarriage and other complications”, said Dr. Aixia Liu, the lead author of the study. “Many of these women also had metabolic imbalances and digestive issues, which led us to explore the possible interplay between the gut microbiota, circulating metabolites, and the uterus.”

The prospective study followed 220 women under the age of 35 across 44 cities in China, including 110 PCOS patients and 110 matched controls. Researchers used a combination of gut microbiome sequencing and metabolomics to profile differences between the groups and conducted laboratory studies on endometrial stromal cells (ESCs) to assess ageing and decidualisation, a process critical for embryo implantation.

Results showed a significant reduction in microbial diversity among PCOS patients, particularly a decrease in P. merdae, a species linked to metabolic health. Serum metabolomics revealed elevated levels of BCAAs, especially isoleucine, and reduced levels of short-chain fatty acids in the PCOS group.

Despite similar pregnancy rates, women with PCOS were nearly twice as likely (1.95 times) to experience at least one adverse pregnancy outcome, including miscarriage, preterm birth, macrosomia, low birth weight, gestational diabetes, hypertensive disorders, and perinatal death.

Further investigation revealed that isoleucine levels were also elevated in endometrial tissue. When researchers exposed ESCs to isoleucine in the lab, they observed increased markers of cellular senescence and reduced capacity for decidualisation. “These findings indicate ageing-like changes in the uterus, occurring much earlier than expected”, said Dr. Liu. “Our data suggest that high isoleucine levels and the loss of P. merdae may impair endometrial health, even in women under 35.”

The researchers propose that P. merdae and BCAAs could serve as biomarkers for identifying high-risk PCOS patients and may guide personalised treatment approaches in the future. “The next step is to explore whether dietary interventions, probiotics, or BCAA-restricted diets can reverse these effects and improve pregnancy outcomes”, concluded Dr. Liu.

Professor Dr. Anis Feki, Chair-Elect of ESHRE, added, “The study provides compelling evidence that metabolic and microbial imbalances in PCOS are not only systemic but may directly impair endometrial receptivity, even in younger women. These findings mark a critical step toward personalised reproductive care in PCOS.”

Reference:

Gut bacteria and amino acid imbalance linked to higher miscarriage risk in women with PCOS, European Society of Human Reproduction and Embryology, Meeting: ESHRE 41st Annual Meeting.


Food as Medicine: Dietary and Supplement Solutions for Endometriosis Pain, study suggests

A recent study on dietary modifications and supplement use for endometriosis pain investigated the effectiveness of various strategies for managing symptoms among individuals with endometriosis. The survey was distributed through an online platform, Qualtrics, to 2,858 participants, of whom 2,599 completed over 80% of the survey questions. The results indicated that the majority of respondents experienced pelvic pain (96.9%) and frequent abdominal bloating (91.2%). Notably, 83.8% of participants had tried one or more diets, while 58.8% had used supplements to alleviate their symptoms. A substantial proportion of individuals reported that these dietary modifications and supplements had a positive impact on their pain levels, with 66.9% attributing pain improvement to diet changes and 43.4% to supplement use.

Pain Improvement and Dietary Modifications

The study found a statistically significant difference in pain scores between participants who reported improvement from dietary modifications and those who did not perceive any benefit. Among the popular dietary modifications attempted, reducing alcohol, gluten, dairy, and caffeine was associated with pain improvement for a considerable number of respondents. However, the low-FODMAP diet was less commonly tried despite its potential benefits. Additionally, approximately 32.3% of individuals who used magnesium reported experiencing benefits. The introduction discussed the complexities of endometriosis, emphasizing chronic pain as a predominant symptom alongside infertility, fatigue, and gastrointestinal issues, and the study aimed to explore the role of the gut microbiome in pain regulation and investigate how dietary modifications could potentially alleviate symptoms in individuals with endometriosis.

Conclusion

In conclusion, this international survey provided insights into the diverse dietary modifications and supplement use practices among individuals with endometriosis. The findings suggested that tailored dietary changes and specific supplements could play a significant role in managing pain associated with endometriosis. The study underscored the importance of individualized approaches in symptom management and highlighted the potential benefits of certain dietary modifications in alleviating pain among individuals with endometriosis.

Key Points

– A study on dietary modifications and supplement use for endometriosis pain involved 2858 participants, revealing that the majority experienced pelvic pain and abdominal bloating.

– 83.8% of participants had tried different diets, while 58.8% used supplements to alleviate symptoms, with 66.9% attributing pain improvement to dietary changes and 43.4% to supplements.

– Dietary modifications such as reducing alcohol, gluten, dairy, and caffeine were linked to pain improvement, while the low-FODMAP diet and magnesium use showed potential benefits.

– The study emphasized the importance of tailored dietary changes and specific supplements in managing pain in individuals with endometriosis.

– The research highlighted the complexities of endometriosis, focusing on chronic pain as a predominant symptom and aiming to explore the role of the gut microbiome in pain regulation through dietary modifications.

– Overall, the survey offers insights into effective strategies for alleviating endometriosis-related pain across a large international sample.

Reference –

Francesca Hearn-Yeates et al. (2025). Dietary Modification and Supplement Use for Endometriosis Pain. JAMA Network Open, 8. https://doi.org/10.1001/jamanetworkopen.2025.3152.
