Menopause Age Key to Unlocking Type 2 Diabetes Risk, Suggests Study

Recent research indicates a potential association between age at menopause and the incidence of type 2 diabetes (T2D), yet comprehensive evidence remains limited, especially among Asian populations. A cohort study involving over 1.1 million postmenopausal Korean women aimed to clarify whether age at menopause and instances of premature menopause are linked to the onset of T2D.

Significance of Type 2 Diabetes

Type 2 diabetes represents a chronic health issue significantly contributing to cardiovascular diseases, microvascular complications, and elevated mortality rates. Despite advances in medical treatments, T2D prevalence continues to rise, emphasizing the need for effective preventative measures and the identification of at-risk populations. While men are generally at a higher risk at younger ages, the risk for women escalates post-menopause. Female-specific risk factors and their implications for health, particularly links to cardiovascular disease and T2D, remain under-explored.

Data Collection Methodology

The study utilized data from the Korean National Health Insurance Service, which covers nearly the entire South Korean population, providing robust health examination records including self-reported data and clinical health assessments. A total of 3,181,150 women aged 30 and above were initially screened, with exclusions for pre-existing T2D, premenopausal status, and data inconsistencies, resulting in a final cohort of 1,125,378 participants. Age at menopause was self-reported and categorized into four groups: <40 years, 40-44 years, 45-49 years, and ≥50 years, with premature menopause defined as menopause before age 40. Incident T2D cases were defined as fasting blood glucose of 126 mg/dL or higher or a claims record for antidiabetic medication.
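As a minimal sketch (not the study's code), the exposure categories and case definition above might be encoded as follows, assuming a pandas DataFrame with hypothetical column names, since the claims data themselves are not public:

```python
# Hypothetical encoding of the study's definitions; column names are invented.
import pandas as pd

df = pd.DataFrame({
    "age_at_menopause": [38, 42, 47, 53],
    "fasting_glucose_mg_dl": [118, 131, 99, 127],
    "antidiabetic_claim": [False, False, False, True],
})

# Four exposure categories; premature menopause = onset before age 40.
df["menopause_group"] = pd.cut(
    df["age_at_menopause"],
    bins=[0, 39, 44, 49, 120],
    labels=["<40", "40-44", "45-49", ">=50"],
)

# Incident T2D: fasting glucose >= 126 mg/dL or an antidiabetic medication claim.
df["incident_t2d"] = (df["fasting_glucose_mg_dl"] >= 126) | df["antidiabetic_claim"]
print(df[["menopause_group", "incident_t2d"]])
```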

Follow-Up Results

Over a median follow-up of 8.4 years, 113,864 new T2D cases emerged, a cumulative incidence of approximately 10.1%. Women with premature menopause had a higher incidence of T2D than their peers, with a hazard ratio (HR) of 1.13 after adjustment for confounders. Women experiencing menopause at an earlier age (particularly <40 years) faced a significantly increased risk of developing T2D, consistent with the hypothesis that lower cumulative estrogen exposure contributes to insulin resistance and metabolic dysregulation.
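The cumulative incidence follows directly from the reported counts, and adjusted hazard ratios of this kind come from Cox proportional-hazards models. Below is a brief sketch using the lifelines library, with its bundled demo dataset standing in for the cohort, since the NHIS data are not public:

```python
# Sketch only: verify the cumulative-incidence arithmetic, then show the shape
# of a Cox fit like the one behind the adjusted HR of 1.13.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi  # demo data standing in for the cohort

print(f"{113_864 / 1_125_378:.1%}")  # -> 10.1% cumulative incidence

demo = load_rossi()  # duration column 'week', event column 'arrest', covariates
cph = CoxPHFitter()
cph.fit(demo, duration_col="week", event_col="arrest")
cph.print_summary()  # the exp(coef) column is the hazard ratio per covariate
```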

Subgroup Analysis Insights

Subgroup analyses revealed nuances in this relationship, with factors such as body mass index (BMI) and depressive disorders modifying the risk association. Notably, the risk was greater among individuals without obesity and among those with depressive disorders, suggesting a potential interaction between mental health and metabolic outcomes.

Implications of Findings

The findings support the hypothesis that both premature and early menopause can act as significant risk factors for the development of T2D. Given these associations, there is a strong argument for including menopause history in T2D screening protocols. Recommendations suggest recognizing premature menopause as a noteworthy risk factor within diabetes management guidelines, emphasizing preventive care strategies and early detection initiatives.

Conclusions and Future Directions

Overall, the study elucidates important links between menopausal age and T2D incidence, advocating for greater awareness and proactive measures in managing women’s health, particularly concerning metabolic disorders in postmenopausal populations. Further research is essential to explore these relationships across diverse demographic groups and establish effective interventions to mitigate T2D risk.

Key Points

– A cohort study involving over 1.1 million postmenopausal Korean women was conducted to investigate the association between age at menopause, instances of premature menopause, and the incidence of type 2 diabetes (T2D), particularly as evidence is lacking in Asian populations.

– Type 2 diabetes is a chronic health concern linked to cardiovascular diseases and increased mortality rates, with rising prevalence emphasizing the necessity for effective preventive strategies. The risk for women increases significantly post-menopause, making female-specific health factors critical for understanding T2D risk.

– The study analyzed comprehensive health data from the Korean National Health Insurance Service, screening 3,181,150 women aged 30 and above and narrowing the final cohort to 1,125,378. Age at menopause was categorized, and T2D was defined through specific blood glucose levels and medication records.

– Over a median follow-up of 8.4 years, 113,864 new T2D cases emerged, indicating a cumulative incidence of approximately 10.1%. Women with premature menopause (before age 40) exhibited a higher incidence of T2D (hazard ratio of 1.13), consistent with metabolic dysregulation linked to shorter lifetime estrogen exposure.

– Subgroup analyses indicated that body mass index (BMI) and depressive disorders significantly influenced the risk association. Increased T2D risk was noted in individuals without obesity and those with depressive disorders, suggesting a complex interplay between mental health and metabolic conditions.

– The study underscores the necessity of incorporating menopause history into T2D screening protocols, advocating for recognizing premature menopause as a critical risk factor in diabetes management. Recommendations emphasize the importance of preventive strategies and early detection among postmenopausal women to mitigate T2D risk.

Reference:

Ko B, et al. (2025). Age at menopause and development of type 2 diabetes in Korea. JAMA Network Open, 8. https://doi.org/10.1001/jamanetworkopen.2024.55388


Weekly Home BP Monitoring Enhances Hypertension Detection in Hemodialysis Patients: Study Finds

Greece: Accurate blood pressure (BP) monitoring is essential for managing hypertension in patients undergoing hemodialysis. A recent study found that in hemodialysis patients, home blood pressure monitoring (HBPM) over a week was more accurate in detecting hypertension than routine BP measurements taken at dialysis centers.

“HBPM demonstrated greater diagnostic accuracy than 44-hour ambulatory BP monitoring (AUC: 0.934). With a threshold of 141.0 mmHg, HBPM achieved optimal sensitivity (85.7%) and specificity (92.9%), reinforcing its value as a reliable screening tool for this patient population,” the researchers reported in the Journal of Human Hypertension.

The best approach for diagnosing hypertension in hemodialysis patients remains a subject of debate. To address this, Panagiotis I. Georgianos, 2nd Department of Nephrology, AHEPA Hospital, School of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece, and colleagues evaluated the accuracy of home blood pressure monitoring and routine dialysis-unit BP recordings, using 44-hour ambulatory BP monitoring (ABPM) as the reference standard.

For this purpose, the researchers assessed hypertension using three methods: (i) routine predialysis and postdialysis BP recordings averaged over six consecutive dialysis sessions (two weeks), (ii) home BP monitoring (HBPM) for seven days with duplicate morning and evening measurements (Microlife WatchBP Home N), and (iii) 44-hour ambulatory BP monitoring (ABPM) with 20-minute intervals over an entire interdialytic period (Microlife WatchBP O3). The study included 70 patients (mean age 65.3 ± 13.2 years), of whom 87.1% were receiving antihypertensive treatment; the average 44-hour ambulatory systolic/diastolic BP was 120.6 ± 15.2/66.3 ± 10.1 mmHg.

Based on the study, the researchers reported the following findings:

  • The mean difference between ambulatory daytime systolic BP (SBP) and each of the other measurement methods was:
    • Predialysis SBP: -11.4 ± 13.4 mmHg
    • Postdialysis SBP: -4.0 ± 15.1 mmHg
    • Home SBP: -8.6 ± 10.7 mmHg
  • Home BP monitoring (HBPM) showed superior diagnostic performance for detecting ambulatory daytime SBP ≥135 mmHg:
    • Home SBP: AUC 0.934
    • Predialysis SBP: AUC 0.778
    • Postdialysis SBP: AUC 0.766
    • HBPM was significantly more accurate than both predialysis and postdialysis SBP (P = 0.02).
  • At a cut-off of 141.0 mmHg, home SBP provided the best balance of sensitivity (85.7%) and specificity (92.9%) for diagnosing hypertension.
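For orientation, this is how an AUC and the sensitivity/specificity at an optimal cut-off such as 141.0 mmHg are typically derived; the sketch below uses synthetic data, not the study's measurements:

```python
# Synthetic illustration of ROC analysis with a Youden-optimal threshold.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
ambulatory_htn = rng.integers(0, 2, 200)   # 1 = daytime ambulatory SBP >= 135 mmHg
home_sbp = 130 + 12 * ambulatory_htn + rng.normal(0, 8, 200)  # fabricated readings

print("AUC:", round(roc_auc_score(ambulatory_htn, home_sbp), 3))

fpr, tpr, thresholds = roc_curve(ambulatory_htn, home_sbp)
best = np.argmax(tpr - fpr)                # Youden's J = sensitivity + specificity - 1
print(f"cut-off {thresholds[best]:.1f} mmHg: "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}")
```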

The study findings highlight that in hemodialysis patients, home blood pressure monitoring conducted over a week is more reliable than routine dialysis-unit BP recordings averaged over two weeks in detecting ambulatory hypertension. HBPM demonstrated greater accuracy in identifying elevated blood pressure levels, making it a valuable tool for improving hypertension diagnosis in this population.

“These results suggest that integrating HBPM into routine clinical practice could enhance blood pressure management, leading to better cardiovascular outcomes for hemodialysis patients,” the authors concluded.

Reference:

Leonidou, K., Georgianos, P. I., Kollias, A., Kontogiorgos, I., Vaios, V., Leivaditis, K., Karligkiotis, A., Stamellou, E., Balaskas, E. V., Stergiou, G. S., & Liakopoulos, V. (2025). Home versus routine dialysis-unit blood pressure recordings among patients on hemodialysis. Journal of Human Hypertension, 1-7. https://doi.org/10.1038/s41371-025-01007-7


Statins Reduce Liver Cancer and Hepatic Decompensation Risk Among Patients With Chronic Liver Disease: JAMA

Researchers have found in a new cohort study that statin use among patients with chronic liver disease was associated with a lower risk of liver cancer and hepatic decompensation, with lipophilic statins and longer treatment duration providing even greater protective effects. Statins may help prevent hepatocellular carcinoma by slowing the progression of liver fibrosis, a key determinant of liver disease severity.

The study examined the relationship between statin use and the risk of hepatocellular carcinoma and hepatic decompensation, focusing on how statins influence the progression of liver fibrosis. Researchers analyzed patient data from 2000 to 2023, selecting adults aged 40 years or older with chronic liver disease and an elevated baseline Fibrosis-4 (FIB-4) score, a measure of liver fibrosis. Participants were categorized as statin users or non-users, and their health outcomes were tracked over a ten-year period.

Patients who used statins had a significantly lower incidence of hepatocellular carcinoma and hepatic decompensation than non-users, with the protective effects particularly pronounced among those using lipophilic statins and those on prolonged therapy. Statin users also showed slower progression of liver fibrosis and were more likely to improve their fibrosis risk category over time. Among patients with intermediate or high FIB-4 scores at baseline, statin users were more likely to experience regression in fibrosis severity than non-users, suggesting that statins may not only help prevent severe liver complications but also contribute to improved liver health over time.

Overall, the study supports a potential role for statins in reducing the risk of hepatocellular carcinoma and slowing the progression of liver disease, and highlights the need for further research on incorporating statins into treatment strategies for patients with chronic liver disease.
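The Fibrosis-4 score used for eligibility is a standard index; the formula below is the published one, while the study's own eligibility threshold is not restated here, so the risk cut-offs shown (1.3 and 2.67, the commonly used values) are illustrative:

```python
# FIB-4 = (age [years] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))
import math

def fib4(age_years: float, ast_u_l: float, alt_u_l: float,
         platelets_10e9_l: float) -> float:
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

score = fib4(age_years=58, ast_u_l=64, alt_u_l=49, platelets_10e9_l=180)
category = "low" if score < 1.3 else "high" if score > 2.67 else "intermediate"
print(f"FIB-4 = {score:.2f} ({category} risk)")  # FIB-4 = 2.95 (high risk)
```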

Reference:

Choi J, Nguyen VH, Przybyszewski E, et al. Statin Use and Risk of Hepatocellular Carcinoma and Liver Fibrosis in Chronic Liver Disease. JAMA Intern Med. Published online March 17, 2025. doi:10.1001/jamainternmed.2025.0115



Long-Term Use of Inhaled Corticosteroids Linked to Increased Diabetes and Other Health Risks in COPD Patients: Study

A study published in the Annals of Family Medicine reports that using inhaled corticosteroids for more than 24 months significantly increases the risks of diabetes, pneumonia, osteoporosis, cataracts, and fractures in adults with chronic obstructive pulmonary disease (COPD), compared with shorter use. The researchers also observed that these medications are frequently prescribed beyond the recommended guidelines for COPD management.

The researchers aimed to assess the long-term risks of inhaled corticosteroid (ICS) use in chronic obstructive pulmonary disease (COPD) management. They extracted electronic health record data for individuals aged over 45 years with COPD from a data repository: the prevalent cohort required a diagnosis of COPD at any time during the observation period, while the inception cohort required a diagnosis made after entry into the database. A composite outcome (any new diagnosis of type 2 diabetes, cataracts, pneumonia, osteoporosis, or nontraumatic fracture) and recurrent-event outcomes (repeated pneumonia or nontraumatic fracture) were compared for long-term (>24 months) versus short-term (<4 months) ICS exposure.

Outcomes were assessed for 318,385 and 209,062 individuals in the prevalent and inception cohorts, respectively. The composite outcome was significantly more frequent with long-term than short-term ICS use in both the prevalent (hazard ratio [HR] = 2.65; 95% CI, 2.62-2.68; P < .001) and inception (HR = 2.60; 95% CI, 2.56-2.64; P < .001) cohorts. For the inception cohort, the absolute risk difference for the composite outcome was 20.26% (29.41% minus 9.15%), giving a number needed to harm of 5. Hazard ratios were also significantly increased in both cohorts for recurrent pneumonia (HR = 2.88; 95% CI, 2.62-3.16 and HR = 2.85; 95% CI, 2.53-3.22; both P < .001) and recurrent fracture (HR = 1.77; 95% CI, 1.42-2.21 and HR = 1.57; 95% CI, 1.20-2.06; both P < .001).

In summary, long-term ICS use for COPD is associated with significantly higher rates of the composite outcome of type 2 diabetes, cataracts, pneumonia, osteoporosis, and nontraumatic fracture, as well as recurrent pneumonia and recurrent fracture.
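The number needed to harm follows directly from the reported risks; a short check of the arithmetic:

```python
# Absolute risk difference (ARD) and number needed to harm (NNH), inception cohort.
risk_long, risk_short = 0.2941, 0.0915   # 29.41% vs 9.15% composite-outcome risk
ard = risk_long - risk_short             # 0.2026 -> 20.26%
nnh = 1 / ard                            # ~4.94, i.e. about 5 patients
print(f"ARD = {ard:.2%}, NNH = {nnh:.2f} (rounded: {round(nnh)})")
```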

Reference:

Pace WD, Callen E, Gaona-Villarreal G, Shaikh A, Yawn BP. Adverse Outcomes Associated With Inhaled Corticosteroid Use in Individuals With Chronic Obstructive Pulmonary Disease. The Annals of Family Medicine. 2025;23(2):127-135. DOI: 10.1370/afm.240030



Ropivacaine safe and effective alternative to lidocaine for local anesthesia in orthognathic procedures: Study

Ropivacaine is a safe and effective alternative to lidocaine for local anesthesia in orthognathic procedures, suggests a new study published in Inflammopharmacology.

This triple-blind, controlled clinical trial assessed the effects of ropivacaine and lidocaine on hemodynamic factors, blood loss, opioid consumption, and postoperative pain in patients undergoing orthognathic surgery. Thirty-two patients with Class III malocclusion scheduled for orthognathic surgery were randomly assigned to receive 0.5% ropivacaine or 2% lidocaine with 1:80,000 epinephrine for local anesthesia (n = 16 per group). Hemodynamic parameters were recorded at various time intervals, including heart rate (HR), systolic blood pressure, diastolic blood pressure, mean arterial pressure, and oxygen saturation (SpO2), along with intraoperative bleeding, opioid consumption, and postoperative pain intensity.

The participants' mean age was 23.67 ± 4.56 years, and 75% were female. The groups were comparable in most measured outcomes. HR was significantly higher in the ropivacaine group at 30 and 60 minutes post-injection (P < 0.05). SpO2 percentages were comparable between the groups, except at 15 minutes post-anesthesia, when the lidocaine group demonstrated significantly higher SpO2 (P = 0.029). Blood pressure, postoperative opioid consumption, intraoperative bleeding, and postoperative pain levels showed no statistically significant differences between the two groups.

Within the limitations of this study, both 0.5% ropivacaine and 2% lidocaine with epinephrine demonstrated comparable effects on hemodynamic stability, intraoperative blood loss, postoperative pain, and opioid consumption in patients undergoing orthognathic surgery. These findings suggest that ropivacaine may serve as a safe and effective alternative to lidocaine for local anesthesia in orthognathic procedures.

Reference:

Hosseini-Abrishami, Majid, et al. “Ropivacaine Effect on Hemostasis and Pain Level in Patients Undergoing Orthognathic Surgery: A Triple-Blinded, Randomized, Clinical Trial.” Inflammopharmacology, 2025.



Elevated TyG-BMI Linked to Higher Short-Term Mortality in Critically Ill Ischemic Stroke Patients: Study Finds

China: A recent study published in Cardiovascular Diabetology has highlighted a significant association between elevated triglyceride glucose-body mass index (TyG-BMI) and an increased risk of short-term mortality in critically ill patients with ischemic stroke (IS). The findings suggest that TyG-BMI could serve as a simple yet effective biomarker for identifying high-risk patients, aiding in early intervention and improved clinical outcomes.

Ischemic stroke, a leading cause of disability and death worldwide, requires timely risk assessment to enhance patient management in critical care settings. The triglyceride glucose-body mass index is a widely recognized marker for evaluating insulin resistance (IR) and has been strongly linked to stroke. However, the researchers note that research in this area remains limited, and existing studies have reported inconsistent findings.
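The article does not restate the index itself; for orientation, the standard published formula is shown below (fasting values in conventional units):

```python
# TyG = ln(triglycerides [mg/dL] x fasting glucose [mg/dL] / 2); TyG-BMI = TyG x BMI.
import math

def tyg_bmi(triglycerides_mg_dl: float, glucose_mg_dl: float,
            bmi_kg_m2: float) -> float:
    tyg = math.log(triglycerides_mg_dl * glucose_mg_dl / 2)
    return tyg * bmi_kg_m2

print(round(tyg_bmi(triglycerides_mg_dl=150, glucose_mg_dl=110, bmi_kg_m2=27), 1))
# ln(150 * 110 / 2) ~ 9.02, times BMI 27 -> ~243.5
```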

To fill this knowledge gap, Ming Yu, Department of Neurology, Suining Central Hospital, Suining, China, and colleagues explored the association between TyG-BMI and 28-day mortality in critically ill ischemic stroke patients using data from the eICU database, seeking to bridge existing research gaps and provide more precise biomarker references for clinical use.

For this purpose, the researchers used multivariate Cox regression models to assess the impact of TyG-BMI on 28-day hospital and ICU mortality. Restricted cubic splines (RCS) were applied to examine potential nonlinear relationships, while Kaplan-Meier (K-M) curves were used to compare outcomes across TyG-BMI groups. Subgroup analyses were also conducted to assess the reliability of the results and potential interactions.

The following were the key findings:

  • The study included 1,362 critically ill ischemic stroke patients with a mean age of 68.41 ± 14.16 years, of whom 47.50% were male.
  • Multivariate Cox regression analysis showed that patients in the high TyG-BMI group had significantly higher 28-day hospital mortality (HR = 1.734) and ICU mortality (HR = 2.337).
  • Restricted cubic spline (RCS) analysis revealed a nonlinear positive correlation between TyG-BMI and 28-day hospital mortality.
  • Below the inflection point of TyG-BMI = 380.37, each 1-SD (≈ 25.5 units) increase in TyG-BMI was linked to a 37.3% rise in 28-day hospital mortality (HR = 1.373).
  • Above 380.37, each 1-SD increase in TyG-BMI was associated with an 87.9% decrease in 28-day hospital mortality (HR = 0.121).
  • The log-likelihood ratio test yielded a P-value of 0.004.
  • For 28-day ICU mortality, RCS analysis showed a significant positive linear correlation with TyG-BMI.

The researchers demonstrated that elevated TyG-BMI is significantly associated with a higher risk of short-term all-cause mortality in critically ill ischemic stroke patients in the United States. Their findings provide strong evidence to address existing uncertainties in this field, highlighting TyG-BMI as a simple and effective biomarker for identifying high-risk patients. They further emphasized that regular monitoring and managing triglycerides, blood glucose, and body weight may help reduce short-term mortality in acute IS patients or those at risk.

Reference:

Ouyang, Q., Xu, L. & Yu, M. Associations of triglyceride glucose-body mass index with short-term mortality in critically ill patients with ischemic stroke. Cardiovasc Diabetol 24, 91 (2025). https://doi.org/10.1186/s12933-025-02583-1


RADPAD Protective Drape Reduces Radiation Exposure in Cardiac Cath Labs, Study Finds

USA: A recent systematic review and meta-analysis published in Cureus highlights the effectiveness of the RADPAD protection drape in reducing radiation exposure among interventional cardiologists in cardiac catheterization laboratories. Given the occupational hazards associated with ionizing radiation, these findings reinforce the importance of implementing protective measures to minimize health risks for healthcare professionals performing fluoroscopy-guided procedures.

M. Chadi Alraies, Cardiology, Wayne State University Detroit Medical Center, Detroit, USA, and colleagues evaluated data from six independent studies involving 892 patients, analyzing radiation exposure levels among operators using the RADPAD protection drape compared to those without it. Ionizing radiation, a known occupational hazard in interventional cardiology, has been linked to various adverse health effects, including cataracts, skin damage, and an increased risk of malignancies. The RADPAD, a sterile, lead-free, disposable radiation shield, is designed to deflect scatter radiation away from operators, reducing their overall exposure during procedures.

Key Findings

Reduction in Radiation Exposure

  • The use of the RADPAD drape significantly lowered radiation exposure for primary operators.
  • Operators using RADPAD experienced a notably lower exposure dose (effect size: -0.9).

Comparable Dose Area Product (DAP) and Screening Time

  • There was no significant difference in the dose area product (DAP) between the RADPAD and no-RADPAD groups (effect size: 0.008).
  • Screening time remained similar between both groups (effect size: 0.13).

Relative Exposure Consistency

  • The relative exposure (E/DAP) showed no significant variation between the groups (effect size: -0.47).

Protective Benefits of RADPAD

  • Despite no significant changes in DAP, screening time, or relative exposure, the substantial reduction in direct operator exposure underscores the protective advantage of the RADPAD drape.
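For context on how pooled estimates like these are produced, the sketch below implements generic inverse-variance random-effects pooling (DerSimonian-Laird) on hypothetical per-study values; it is not the review's actual data or code:

```python
# Generic DerSimonian-Laird random-effects meta-analysis on invented inputs.
import numpy as np

effects = np.array([-1.1, -0.7, -0.9, -1.3, -0.6, -0.8])  # hypothetical study effects
se = np.array([0.30, 0.25, 0.40, 0.35, 0.20, 0.30])       # hypothetical standard errors

w = 1 / se**2                                   # fixed-effect (inverse-variance) weights
q = np.sum(w * (effects - np.average(effects, weights=w))**2)  # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance

w_re = 1 / (se**2 + tau2)                       # random-effects weights
pooled = np.average(effects, weights=w_re)
pooled_se = np.sqrt(1 / np.sum(w_re))
print(f"pooled effect {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")
```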

Despite these promising results, the study acknowledges certain limitations, including small sample sizes in four of the six studies analyzed. Additionally, variability in case complexity and fluoroscopy usage could have influenced the results. Another potential factor is increased operator awareness of radiation safety when using the RADPAD, which may have contributed to reduced exposure.

The study concludes that incorporating the RADPAD protection drape into catheterization laboratories can significantly lower scatter radiation exposure for both primary and secondary operators, regardless of procedure complexity. By reducing the risk of long-term radiation-related complications, such as cancer and cataracts, the RADPAD drape presents a practical and effective solution for enhancing radiation safety in interventional cardiology.

“Given the strong evidence supporting its efficacy, researchers recommend the routine use of RADPAD protective drapes in all catheterization labs. Future research with larger sample sizes and standardized protocols may further validate these findings and refine radiation protection strategies for healthcare professionals,” the authors concluded.

Reference:

Bahar A, Khanal R, Hamza M, et al. (April 28, 2024) Assessing the Efficacy of RADPAD Protection Drape in Reducing Radiation Exposure to Operators in the Cardiac Catheterization Laboratory: A Systematic Review and Meta-Analysis. Cureus 16(4): e59215. doi:10.7759/cureus.59215


Laparoscopic “Tunnel” Approach Enhances Surgical Outcomes in Hiatus Hernia With GERD: Study

A new study published in BMC Surgery showed that the laparoscopic “tunnel” approach significantly reduces the risk of vagus nerve injury and helps preserve perigastric vessels during surgery for hiatus hernia with gastroesophageal reflux disease (GERD). These advantages contribute to better postoperative results and improved quality of life for patients. The technique also holds promise for broader use in the treatment of hiatal hernia (HH) and GERD.

The goal of treatment for HH and GERD is to achieve the best possible long-term control of reflux symptoms and signs with few or no adverse effects. The traditional bilateral surgical approach (TBSA) is the primary technique for laparoscopic HH repair combined with fundoplication. One drawback of TBSA is that, while it can protect the vagus nerve locally during surgery, it cannot evaluate the nerve's integrity, which may leave nerve injury undetected. To preserve the perigastric vessels and reduce vagus nerve injury, this study therefore presents a novel laparoscopic “tunnel” approach.

Clinical data were gathered consecutively from patients treated for hiatal hernia and gastroesophageal reflux disease with the laparoscopic “tunnel” technique at the First Affiliated Hospital of Ningbo University between June 2023 and June 2024. Collected information included age, BMI, gender, DeMeester score, length of surgery, and postoperative symptoms. Follow-ups were performed at one, three, and six months after surgery.

The average age was 54 ± 9 years, BMI was 25.56 ± 4.32 kg/m², DeMeester score was 118.05 ± 17.71, and GERD-Q score was 13 ± 2. The average operative time was 115 ± 15 minutes. Symptoms decreased dramatically after surgery, and at six months the average GERD-Q score was 5 ± 1. At one month, 14 patients had dysphagia, 19 had belching, 5 had abdominal distension, 16 had nausea, and 8 had diarrhea.

Only two patients still reported belching at six months, and no other symptoms persisted; there were no reports of gallstones or vomiting. Overall, the laparoscopic “tunnel” technique may carry a lower risk of vagus nerve damage than the conventional surgical approach. In the surgical treatment of functional disorders, preserving function is as essential as reconstructing it.

Source:

Feng, Z., Zhang, Z., Yan, Z., Gao, F., & Chen, Q. (2025). Innovative laparoscopic “Tunnel” approach in managing hiatal hernia with gastroesophageal reflux disease: a retrospective study. BMC Surgery, 25(1), 154. https://doi.org/10.1186/s12893-025-02900-1


PTerm Classifier a Promising Tool for Accurately Predicting Preterm Birth: Study Finds

China: In a significant advancement for prenatal care, researchers have developed a blood-based classifier called PTerm that can accurately predict the risk of preterm birth using genome-wide patterns in cell-free DNA (cfDNA). This innovative tool, which leverages existing non-invasive prenatal testing (NIPT) data, offers a highly accurate and cost-effective method for early detection of at-risk pregnancies.

The findings were published online in PLOS Medicine on April 15, 2025.

Preterm birth (PTB)—delivery before 37 weeks of gestation—remains a global challenge, affecting around 11% of pregnancies and contributing significantly to neonatal complications and maternal health risks. PTerm harnesses cfDNA, which circulates in the maternal bloodstream and reflects genetic material from the placenta and other maternal tissues. Because these DNA fragments dynamically respond to biological and pathological changes during pregnancy, they serve as a valuable biomarker for anticipating complications such as PTB.

To establish the model, Zhiwei Guo, Southern Medical University, Guangzhou, China, and colleagues conducted a comprehensive, multi-center study involving 2,590 pregnant women—518 with spontaneous preterm births and 2,072 with full-term deliveries—recruited from three independent hospitals. Whole-genome sequencing of plasma cfDNA was performed, focusing on promoter regions that regulate gene expression. Advanced machine learning techniques, including support vector machines and feature selection algorithms, were used to build the predictive model.
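As an illustration of the modeling strategy described (a support vector machine scored by leave-one-out cross-validation), here is a minimal sketch on synthetic stand-in features; it is not the authors' pipeline, and real cfDNA promoter profiles would replace the random matrix:

```python
# SVM + leave-one-out cross-validated AUC on synthetic "promoter" features.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))     # rows: pregnancies; cols: promoter-region features
y = rng.integers(0, 2, 100)        # 1 = spontaneous preterm birth
X[y == 1, :5] += 0.8               # plant a weak signal so the example is non-trivial

model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_predict(model, X, y, cv=LeaveOneOut(), method="decision_function")
print("LOOCV AUC:", round(roc_auc_score(y, scores), 3))
```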

Key Findings:

  • PTerm, the Promoter profiling classifier for preterm prediction, achieved the highest accuracy among all tested models with an AUC of 0.878 based on leave-one-out cross-validation.
  • The classifier maintained strong predictive performance across three independent validation cohorts, with a consistent AUC of 0.849, highlighting its reliability across varied populations.
  • A major benefit of PTerm is its compatibility with current non-invasive prenatal testing (NIPT) workflows, requiring no changes in procedure or additional cost.
  • Its integration into routine prenatal screening could help identify high-risk pregnancies early, enabling timely and targeted medical interventions.

The researchers emphasized that incorporating tools like PTerm into clinical practice could enhance early risk assessment and help lower the global burden of preterm births. They hope future studies will further validate its clinical utility and encourage widespread adoption of cfDNA-based risk prediction models in obstetric care.

“PTerm showed strong predictive accuracy for identifying preterm birth risk. Moreover, it can be applied directly to existing non-invasive prenatal testing data without altering current procedures or incurring additional costs, making it a practical and scalable option for early screening in clinical settings,” the authors concluded.

Reference:

Guo Z, Wang K, Huang X, Li K, Ouyang G, Yang X, et al. (2025) Genome-wide nucleosome footprints of plasma cfDNA predict preterm birth: A case-control study. PLoS Med 22(4): e1004571. https://doi.org/10.1371/journal.pmed.1004571


Rilzabrutinib could be an effective treatment option in individuals with moderate to severe chronic spontaneous urticaria: JAMA

A new study published in JAMA Dermatology showed that rilzabrutinib decreased itching and hives while maintaining a favorable risk-benefit profile, indicating that it might be a useful therapy for individuals with moderate to severe chronic spontaneous urticaria (CSU) that is resistant to antihistamines.

The primary driver of chronic spontaneous urticaria, a skin condition, is activation of cutaneous mast cells via a variety of pathways. Bruton tyrosine kinase (BTK), a protein found in B cells and mast cells, is essential to several immune-mediated disease processes. Ana Giménez-Arnau and colleagues conducted this study to determine the efficacy and risk profile of rilzabrutinib, an oral, covalent, reversible, next-generation BTK inhibitor, in patients with CSU.

The Rilzabrutinib Efficacy and Safety in CSU (RILECSU) randomized clinical trial was a 52-week phase 2 study consisting of a 12-week dose-ranging, double-blind, placebo-controlled period followed by a 40-week open-label extension. The trial ran from November 24, 2021, to April 23, 2024, recruiting and randomly assigning participants at 51 centers in 12 countries across Asia, North America, Europe, and South America.

Adults aged 18 to 80 years with moderate to severe CSU not adequately controlled by H1-antihistamine therapy were enrolled. Patients were randomized 1:1:1:1 to rilzabrutinib 400 mg once daily in the evening, 800 mg daily in two divided doses, 1200 mg daily in three divided doses, or matched placebo. The primary endpoint was the change from baseline at week 12 in the weekly Urticaria Activity Score (UAS7; for non-US reference countries) or the weekly Itch Severity Score (ISS7; for the US and US reference countries).
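For context, UAS7 and ISS7 follow standard urticaria scoring, which the article does not detail: each day, itch severity (0-3) and hive count (0-3) are scored; ISS7 is the weekly itch total (0-21) and UAS7 the weekly combined total (0-42). A brief sketch with a hypothetical patient diary:

```python
# Standard UAS7/ISS7 scoring on an invented one-week symptom diary.
daily_itch = [2, 3, 2, 2, 1, 2, 3]    # itch severity, 0-3 per day
daily_hives = [1, 2, 2, 1, 1, 2, 2]   # hive count score, 0-3 per day

iss7 = sum(daily_itch)                                      # range 0-21
uas7 = sum(i + h for i, h in zip(daily_itch, daily_hives))  # range 0-42
print(f"ISS7 = {iss7}, UAS7 = {uas7}")
```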

A total of 160 patients, either omalizumab-naive or incomplete responders to omalizumab, were randomly assigned. The primary analysis population comprised the 143 omalizumab-naive patients. At week 12, rilzabrutinib 1200 mg/d produced significantly greater reductions from baseline in ISS7 and UAS7 than placebo. Improvements were also seen in the weekly Angioedema Activity Score (AAS7) and weekly Hives Severity Score (HSS7).

Improvements in ISS7, HSS7, UAS7, and AAS7 were evident as early as week 1. At week 12, CSU-related biomarkers, including immunoglobulin G (IgG) anti-thyroid peroxidase, soluble Mas-related G protein-coupled receptor X2, IgG anti-Fc-ε receptor 1, and interleukin-31, were lower with rilzabrutinib than with placebo. Rilzabrutinib had a favorable risk-benefit profile; headache, nausea, and diarrhea were more common with rilzabrutinib than with placebo.

Overall, the findings of this clinical trial support rilzabrutinib as a potentially useful BTK inhibitor with a favorable risk-benefit profile for treating patients with H1-antihistamine-refractory CSU.

Reference:

Giménez-Arnau, A., Ferrucci, S., Ben-Shoshan, M., Mikol, V., Lucats, L., Sun, I., Mannent, L., & Gereige, J. (2025). Rilzabrutinib in antihistamine-refractory chronic spontaneous urticaria: The RILECSU phase 2 randomized clinical trial. JAMA Dermatology. https://doi.org/10.1001/jamadermatol.2025.0733
