Radiation therapy after surgery safely reduces pelvic relapse risk from locally advanced, muscle-invasive bladder cancer: Study

Radiation therapy could be an underused tool to reduce pelvic relapse risk for patients with locally advanced, muscle-invasive bladder cancer, according to results of a new phase III randomized trial. In the study, moderate doses of radiation therapy after bladder removal surgery sharply cut the rates of cancer returning in the pelvis without adding serious side effects. Findings of the Bladder Adjuvant RadioTherapy (BART) trial conducted at centers across India will be presented today at the American Society for Radiation Oncology (ASTRO) Annual Meeting.

“This is one of the first studies and the largest randomized trial to show that post-operative radiation therapy can meaningfully reduce pelvic relapses in bladder cancer,” said Vedang Murthy, MD, principal investigator of the trial and a professor and radiation oncologist at Tata Memorial Hospital in Mumbai. “Pelvic relapse can be devastating for patients – extremely painful and almost impossible to treat. Our research shows that modern radiation therapy offers a safe way to prevent many of these recurrences and improve patients’ quality of life.”

Locally advanced, muscle-invasive bladder cancer occurs when a tumor grows beyond the inner urothelial lining of the bladder into its muscle wall. Each year, roughly 20,000 to 25,000 people in the U.S. and more than half a million worldwide are diagnosed with muscle-invasive disease. Standard treatment for these patients typically involves radical cystectomy – surgical removal of the bladder – and chemotherapy, but up to one-third develop new pelvic tumors within two to three years.

Dr. Murthy and colleagues previously reported findings showing that adding intensity-modulated radiation therapy (IMRT) after cystectomy for patients with high-risk, muscle-invasive bladder cancer was safe and caused minimal side effects. The new analysis evaluated whether post-operative radiation therapy could also reduce cancer recurrence in the surgical bed and surrounding pelvic region.

The BART trial enrolled 153 patients with locally advanced, urothelial muscle-invasive bladder cancer from 2016 to 2024. Participants were randomly assigned to receive either post-operative/adjuvant radiation therapy (50.4 Gy in 28 fractions, n=77) or observation alone (n=76). All patients underwent radical cystectomy, and nearly all also received chemotherapy before (71%) or after (20%) surgery.

The people who enrolled in the trial were at high risk for recurrence: 62% had tumors that extended outside the bladder wall (pT3–T4), 41% had lymph node involvement (pN+), and 28% displayed variant tumor subtypes.

Patients who received radiation therapy after surgery experienced significantly fewer pelvic recurrences. Over a median follow-up of 47 months, 8% of patients in the radiation group experienced a locoregional recurrence, compared to 26% of those in the observation group (p=0.006). Two-year locoregional recurrence-free survival, the study’s primary endpoint, was 91.2% with radiation therapy versus 76.4% without (p=0.004).
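
For readers unfamiliar with these endpoints, the short sketch below shows how a two-year locoregional recurrence-free survival estimate and a between-arm p-value can be obtained from time-to-event data with a Kaplan-Meier estimator and a log-rank test. The data and event rates are simulated for illustration only; this is not the BART analysis code, and the trial's actual statistical methods may differ.

```python
# Minimal sketch: estimating 2-year locoregional recurrence-free survival (LRFS)
# in each arm and comparing arms with a log-rank test. Times and events are
# hypothetical, not BART trial data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Hypothetical time-to-event data (months); event = locoregional recurrence or death
t_rt = rng.exponential(120, 77)      # radiation arm (n=77)
e_rt = rng.random(77) < 0.15
t_obs = rng.exponential(60, 76)      # observation arm (n=76)
e_obs = rng.random(76) < 0.35

kmf_rt = KaplanMeierFitter().fit(t_rt, event_observed=e_rt, label="adjuvant RT")
kmf_obs = KaplanMeierFitter().fit(t_obs, event_observed=e_obs, label="observation")

# Kaplan-Meier estimate of LRFS at 24 months in each arm
print(kmf_rt.survival_function_at_times(24))
print(kmf_obs.survival_function_at_times(24))

# Log-rank test for the between-arm difference
res = logrank_test(t_rt, t_obs, event_observed_A=e_rt, event_observed_B=e_obs)
print(res.p_value)
```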

“Bladder cancer is aggressive, and surgery and chemotherapy alone are not enough to prevent pelvic recurrence,” said Dr. Murthy. “But in our trial, very few people who received radiation had a locoregional relapse within two years.”

Disease-free survival (DFS), which measures time to recurrence anywhere in the body, also favored the radiation arm (77.6% vs. 64.4%, p=0.07). But rates of distant metastases were similar in both groups, affecting nearly one third of patients and reflecting the systemic nature of this cancer. Most people diagnosed with muscle-invasive bladder cancer ultimately die from distant metastases, explained Dr. Murthy, “but it does not matter where the cancer returns – whatever relapse we can reduce, we must reduce.”

Two-year overall survival was higher in the radiation arm (68% vs. 57%), though the difference was not statistically significant (p=0.4), which Dr. Murthy attributed to the small sample size. He said the next step for his team is a prospective meta-analysis combining BART data with large, randomized trials from France and Egypt to further assess safety and benefit in survival outcomes.

Side effect rates were low and similar between the groups. Severe late side effects were experienced by 8.5% of patients in the radiation arm and 10.5% in the observation arm (p=0.6). Subgroup analyses also suggested an additional benefit of radiation for patients with larger tumors (T3-4) and node-positive disease, pointing to potential directions for personalized bladder cancer treatment.

Dr. Murthy said he hopes these results will spur greater use of radiation therapy for bladder cancer. “BART shows that modern radiation techniques allow us to deliver highly targeted treatment with fewer complications than in the past. Radiation therapy is already used safely after surgery for gynecologic cancers in the same anatomically complex region, suggesting it could also become a standard option for high-risk bladder cancer following cystectomy,” he said.

He noted that a limitation of the study is that no patients received immunotherapy, which is becoming standard in bladder cancer treatment to improve survival. Recent advances in immunotherapy highlight “a clear need” to study its use alongside post-operative radiation for patients with high-risk disease, said Dr. Murthy. “The two treatments act differently, with distinct functions and side effect profiles, and there’s no reason we shouldn’t be combining them,” he said.

Reference:

Radiation therapy after surgery safely reduces pelvic relapse risk from locally advanced, muscle-invasive bladder cancer. American Society for Radiation Oncology (ASTRO) 2025 Annual Meeting.

Billions lack access to healthy diets, but solutions are within reach, says new report

Food systems are key drivers of the world’s most urgent challenges, from chronic diseases and rising inequality to accelerating climate change and biodiversity loss, according to the 2025 EAT-Lancet Commission on Healthy, Sustainable, and Just Food Systems.

Early neuroinflammation in people with Down syndrome may explain high prevalence of Alzheimer’s disease

Down syndrome is associated with accelerated aging. It is estimated that up to 90% of individuals with the condition develop Alzheimer’s disease before the age of 70. A study by researchers at the University of São Paulo (USP) in Brazil identified high levels of neuroinflammation in young individuals with Down syndrome, an additional factor explaining the high prevalence of Alzheimer’s disease in older people with the condition. The discovery paves the way for strategies to prevent and monitor the disease.

Yale researchers develop novel test for leptospirosis

 In a new study, Yale School of Medicine (YSM) researchers unveiled a novel diagnostic method for detecting leptospiral virulence-modifying (VM) proteins in the blood and urine of hamsters, an advance that could pave the way for early diagnosis of the tropical disease leptospirosis in humans and improved treatment options. The findings were published in the journal Microbiology Spectrum.

Found around the world, leptospirosis affects approximately 1 million people annually, with nearly 60,000 fatalities. The disease is caused by the bacterium Leptospira and is spread through the urine of infected animals. Despite the potential of the disease to cause severe illness when left untreated, early diagnosis has been a significant challenge due to the lack of sensitive and specific diagnostic methods.

The research, led by Yale’s Dr. Joseph M. Vinetz and his team, in collaboration with Luna Bioscience, a company founded by Vinetz to develop vaccines for emerging global infectious diseases, has led to the development of a monoclonal antibody (mAb)-based capture immunoassay. This assay detects VM proteins, a recently identified family of leptospiral proteins crucial for disease pathogenesis.

“We have long known that leptospirosis severely impacts multiple organ systems, leading to conditions like jaundice, acute kidney injury, and pulmonary hemorrhage,” said Vinetz, a professor of medicine (infectious diseases) at YSM. “Our discovery of these VM proteins as circulating exotoxins gives us a specific target for both diagnostics and potential therapeutic interventions.”

Leptospirosis is the first toxin-mediated systemic bacterial disease (a category that includes tetanus, botulism, and diphtheria) with the potential for rapid antigen detection by a novel test, he added.

The research lays the groundwork for developing rapid, inexpensive diagnostics that can be used in resource-limited settings, where leptospirosis is most prevalent, according to Vinetz, who is also a professor of epidemiology (microbial diseases) at the Yale School of Public Health. The novel diagnostic method holds promise for transforming leptospirosis management globally, he said.

“By enabling early detection, health care providers can initiate timely treatments, potentially saving lives and mitigating disease severity,” Vinetz said. “Furthermore, understanding the role of VM proteins in disease pathogenesis could lead to new therapeutic targets and vaccine development opportunities.”

Reference:

Chaurasia R, Jacobs A, Tang J, Dong S, Vinetz JM. Development of leptospiral virulence-modifying protein detection assay: implications for pathogenesis and diagnostic test development. Microbiol Spectr. 2025:e00018-25. https://doi.org/10.1128/spectrum.00018-25

CBT-I acceptable and efficacious intervention for managing insomnia in chronic disease populations: JAMA

A new study published in JAMA Internal Medicine revealed that cognitive behavioral therapy for insomnia (CBT-I) is safe and highly effective in improving sleep among individuals living with chronic diseases such as cancer, cardiovascular conditions, chronic pain, and stroke.

Insomnia often compounds physical symptoms and reduces quality of life. Standard treatment guidelines recommend CBT-I as the first-line intervention, but concerns have lingered about its suitability for patients already managing heavy disease burdens.

This review pooled data from 67 randomized clinical trials (RCTs) involving 5,232 participants to evaluate the efficacy, safety, and patient acceptability of CBT-I in chronic disease populations. The studies included patients with a wide spectrum of conditions, from irritable bowel syndrome and chronic pain to cancer survivors and stroke patients.

Insomnia severity decreased significantly, with a large effect size (g = 0.98). This suggests patients not only reported sleeping better but also experienced meaningful reductions in their insomnia symptoms. Sleep efficiency improved with a moderate effect size (g = 0.77).

Sleep onset latency, or how long it takes to fall asleep, was shortened with a moderate effect size (g = 0.64). The patients with chronic disease who completed CBT-I not only fell asleep faster and stayed asleep longer but also reported markedly better sleep quality overall.
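
The effect sizes quoted above are Hedges' g values, a bias-corrected standardized mean difference. The sketch below shows a minimal, self-contained computation of Hedges' g on hypothetical score data; the review itself pooled published trial statistics rather than raw participant scores.

```python
# Illustrative computation of Hedges' g, the standardized mean-difference effect
# size reported above (e.g., g = 0.98 for insomnia severity). Group values are
# hypothetical.
import numpy as np

def hedges_g(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    # Pooled standard deviation
    sp = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
    d = (x.mean() - y.mean()) / sp            # Cohen's d
    j = 1 - 3 / (4 * (nx + ny) - 9)           # small-sample bias correction
    return j * d

# Hypothetical insomnia severity scores: control vs. CBT-I group
control = [18, 20, 17, 22, 19, 21, 18, 20]
cbt_i   = [12, 14, 11, 15, 13, 12, 14, 10]
print(round(hedges_g(control, cbt_i), 2))
```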

The review also assessed whether outcomes varied depending on delivery methods or patient characteristics. While results were consistently positive across disease groups, longer treatment durations produced better improvements in both sleep efficiency and time to fall asleep. This suggests that maintaining therapy over an extended period could maximize benefits.

The dropout rates averaged just 13.3%, which highlighted that most participants found the therapy manageable and worthwhile. Moreover, adverse effects linked directly to CBT-I were rare, reassuring clinicians that the therapy poses minimal risk. Overall, these results show that CBT-I is just as effective for people with chronic diseases as it is for the general population struggling with insomnia. This is an important step forward in integrated care, particularly since poor sleep often worsens chronic disease outcomes.

Source:

Scott, A. J., Correa, A. B., Bisby, M. A., Chandra, S. S., Rahimi, M., Christina, S., Heriseanu, A. I., & Dear, B. F. (2025). Cognitive behavioral therapy for insomnia in people with chronic disease: A systematic review and meta-analysis. JAMA Internal Medicine. https://doi.org/10.1001/jamainternmed.2025.4610

Rheumatoid arthritis-associated lung disease significantly increases risk of serious infection: Study

A new study published in the journal Arthritis & Rheumatology revealed that rheumatoid arthritis-associated lung disease (RA-LD), especially RA-associated interstitial lung disease (RA-ILD), is linked to a considerably elevated risk of serious infection across anatomic locations and a variety of pathogen types.

An estimated 1% of people in the US and northern European nations live with rheumatoid arthritis (RA), a systemic inflammatory disease. Clinically, RA may affect virtually any lung compartment: the pleura, which can cause pleural inflammation and/or effusions; the small and large airways; the pulmonary vasculature; and the parenchyma, which can manifest as rheumatoid nodules or ILD. This study therefore examined the relationship between RA-LD and the risk of serious infection.

Using the MGB Biobank (Boston, Massachusetts), researchers performed a retrospective cohort study that matched RA-LD patients to RA patients without lung disease (RA-no LD) by age, sex, and RA duration. RA-LD cases were confirmed by medical record review and chest imaging showing clinically evident RA-associated bronchiectasis (RA-BR) and/or RA-ILD.

Serious infection was the primary outcome. To account for the competing risk of mortality, incidence rates and propensity score-adjusted subdistribution hazard ratios (sdHRs) were computed using Fine-Gray models.

In comparison to 980 RA-no LD comparators, 221 RA-LD patients (151 RA-ILD and 70 RA-BR) had a substantially increased risk of serious infection (55.8 vs. 25.8 per 1,000 person-years; sdHR 1.60, 95% CI 1.20-2.12). The elevated risk persisted for RA-ILD patients (sdHR 1.79, 95% CI 1.33-2.41) but not for RA-BR cases (sdHR 1.19, 95% CI 0.72-1.97).
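
For orientation, the sketch below shows the simple arithmetic behind incidence rates expressed per 1,000 person-years, using hypothetical event counts and follow-up chosen to roughly match the quoted rates. The published sdHRs come from propensity score-adjusted Fine-Gray competing-risk models, which this crude calculation does not reproduce.

```python
# Back-of-the-envelope check of incidence rates quoted above (events per 1,000
# person-years) and a crude rate ratio. Event counts and person-years are
# hypothetical, chosen only to approximate the published rates.
def rate_per_1000_py(events: int, person_years: float) -> float:
    return 1000 * events / person_years

ra_ld_rate = rate_per_1000_py(events=50, person_years=896)        # ~55.8 per 1,000 PY
ra_no_ld_rate = rate_per_1000_py(events=120, person_years=4651)   # ~25.8 per 1,000 PY

# Crude rate ratio; the study's adjusted sdHR (1.60) accounts for confounding
# and competing mortality, which this ratio does not.
print(round(ra_ld_rate, 1), round(ra_no_ld_rate, 1), round(ra_ld_rate / ra_no_ld_rate, 2))
```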

RA-LD was linked to a number of pathogen types, including bacteria, fungi, viruses, and mycobacteria. The most frequent anatomic sites of infection in RA-LD were the lungs, the skin and soft tissues, and the ears, nose, and throat. Certain infections, such as influenza virus, Staphylococcus, Pseudomonas, respiratory syncytial virus, and nontuberculous mycobacteria, were more common in RA-LD patients, particularly among those with RA-BR.

Overall, serious infections involving multiple anatomic sites and a broad spectrum of microorganisms are significantly more likely in patients with rheumatoid arthritis-associated lung disease, particularly interstitial lung disease. RA-associated bronchiectasis (RA-BR), in particular, was associated with an increased risk of lung infections.

Reference:

Zhang, Q., Qi, Y., Wang, X., McDermott, G. C., Chang, S. H., Chaballa, M., Khaychuk, V., Paudel, M. L., & Sparks, J. A. (2025). Risk of serious infection in patients with rheumatoid arthritis-associated interstitial lung disease or bronchiectasis: A comparative cohort study. Arthritis & Rheumatology. https://doi.org/10.1002/art.43338

Severe Emphysema on HRCT Signals Higher Heart Disease Risk in COPD Patients: Study

China: Severe emphysema, quantitatively assessed using high-resolution computed tomography (HRCT), is a strong independent predictor of coronary artery disease (CAD) in patients with chronic obstructive pulmonary disease (COPD), a new retrospective study has found. The study was published online in the International Journal of Chronic Obstructive Pulmonary Disease.

Researchers found that CAD risk more than doubled when the low attenuation area (LAA%) exceeded 16.95%. Furthermore, patients with more severe emphysema exhibited more complex coronary lesions and were more likely to require percutaneous coronary intervention (PCI), highlighting the critical interplay between structural lung changes and cardiovascular risk.
The study was led by Dr. Luoman Su and colleagues from the Department of Pulmonary and Critical Care Medicine, Key Laboratory of Interventional Pulmonology of Zhejiang Province, The First Affiliated Hospital of Wenzhou Medical University, Wenzhou, China. The research team aimed to clarify the role of emphysema—a key structural subtype of COPD—in the development of CAD, a relationship that had remained poorly understood. Using quantitative HRCT, the investigators sought to determine whether emphysema severity could independently stratify cardiovascular risk beyond traditional factors.
The study retrospectively analyzed 392 COPD patients without prior CAD, who underwent HRCT between 2015 and 2020. Emphysema extent was measured as the percentage of low attenuation areas below −950 Hounsfield units, with severe emphysema defined as LAA% above 16.95%.
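
As a rough illustration of this metric, the sketch below computes LAA% as the share of lung-mask voxels below -950 HU; the array names and toy volume are hypothetical, and the study's actual quantification pipeline is not described here.

```python
# Minimal sketch of the LAA% metric described above: the percentage of lung
# voxels below -950 Hounsfield units on HRCT. Assumes a CT volume in HU and a
# binary lung mask are already available (names here are hypothetical).
import numpy as np

def laa_percent(ct_hu: np.ndarray, lung_mask: np.ndarray, threshold: float = -950.0) -> float:
    """Percentage of lung-mask voxels with attenuation below `threshold` HU."""
    lung_voxels = ct_hu[lung_mask.astype(bool)]
    return 100.0 * np.mean(lung_voxels < threshold)

# Toy example: a synthetic "volume" in HU with an all-lung mask
ct = np.random.default_rng(0).normal(loc=-870, scale=60, size=(64, 64, 64))
mask = np.ones_like(ct, dtype=bool)
laa = laa_percent(ct, mask)
severe = laa > 16.95   # study-defined cutoff for severe emphysema
print(round(laa, 2), severe)
```
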
Logistic regression and restricted cubic spline analyses revealed the following:
  • Severe emphysema was independently associated with a higher risk of coronary artery disease (adjusted OR 2.28).
  • The predictive model showed strong performance, with an area under the ROC curve of 0.81.
  • Patients with severe emphysema had higher SYNTAX scores, indicating more complex coronary lesions (median 16.29 vs. 10.0 in mild emphysema).
  • Rates of percutaneous coronary intervention (PCI) were significantly higher in patients with severe emphysema (68.2% vs. 33.3%).
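
For readers who want to see how an adjusted odds ratio and an area under the ROC curve of this kind are typically produced, the sketch below fits a logistic regression on simulated data and evaluates its discrimination. Variable names, data, and coefficients are hypothetical, and the restricted cubic spline modelling used in the study is not reproduced.

```python
# Illustrative logistic regression: adjusted odds ratio for severe emphysema and
# ROC AUC, on simulated data (not the study cohort). Splines are omitted.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 392
laa = rng.gamma(shape=2.0, scale=8.0, size=n)            # hypothetical LAA% values
severe = (laa > 16.95).astype(int)
age = rng.normal(68, 8, n)

# Hypothetical outcome generated so that severe emphysema raises CAD risk
logit = -2.0 + 0.8 * severe + 0.03 * (age - 68)
cad = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([severe, age])
model = LogisticRegression().fit(X, cad)
or_severe = np.exp(model.coef_[0][0])                    # adjusted OR for severe emphysema
auc = roc_auc_score(cad, model.predict_proba(X)[:, 1])   # discrimination of the model
print(round(or_severe, 2), round(auc, 2))
```
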
“These findings highlight the clinical relevance of emphysema quantification in COPD patients,” Dr. Su and colleagues noted. “Incorporating HRCT-based emphysema severity into cardiovascular risk assessment may enable earlier identification of high-risk individuals, prompting timely evaluation and intervention.”
Despite its insights, the study had several limitations. The retrospective, single-center design limits causal inference and generalizability, while residual confounding from unmeasured factors such as lifestyle and smoking history may persist. Variations in CT imaging protocols and the absence of long-term cardiac event data further constrain the findings. Nonetheless, the observed associations are biologically plausible and complement existing evidence linking emphysema with heightened cardiovascular risk.
“The study demonstrates that quantitatively assessed emphysema on HRCT is an independent predictor of CAD in COPD patients and correlates with more complex coronary lesions. The identified threshold of 16.95% LAA-950 offers a potential imaging biomarker for cardiovascular risk stratification in this population, though prospective multicenter validation is needed before clinical adoption,” the authors wrote.
“These findings highlight the importance of integrating lung structural assessment into comprehensive cardiovascular management strategies for patients with COPD,” they concluded.
Reference:
Su L, Qian C, Yu C, Weng Z, Zhao H, Chen C. Quantitatively Assessed Emphysema Severity on HRCT Independently Predicts Coronary Artery Disease in COPD: A Retrospective Cohort Study. Int J Chron Obstruct Pulmon Dis. 2025;20:3147-3161. https://doi.org/10.2147/COPD.S540503

Fiber-Reinforced Restorations Improve Fracture Resistance in Endodontically Treated Teeth: Study

A recent study published in Clinical Oral Investigations by Lena Bal, Cangül Keskin, Aybuke Karaca Sakallı, Bilge Ozcan, and İsen Guleç Kocyigit investigated the effect of different reinforcement materials on the fracture resistance of mesio-occluso-distal (MOD) cavity restorations in endodontically treated teeth.

The researchers aimed to determine how applying fiber-reinforced composites to the cervical and coronal segments of teeth could impact their durability. This study is particularly relevant because vertical fractures in MOD restorations are among the most common failures observed in endodontically treated teeth, and improving restoration strength is essential for long-term success and preservation of dental function.

The study involved eighty-four freshly extracted human mandibular molars, prepared with standardized MOD cavities and endodontically treated. Teeth were divided into groups based on the combinations of reinforcement materials applied to the cervical and coronal segments, including flowable composite, posterior composite, EverX flow, EverX posterior, and Ribbond.

After thermocycling, fracture resistance was evaluated using a universal testing machine, and failure patterns were examined under a stereomicroscope. The authors reported that fiber-reinforced structures provided superior fracture resistance compared to conventional composites, particularly when applied to the cervical segment. This suggests that strategic placement of reinforcement materials in restorative procedures can enhance structural integrity, helping to reduce the risk of fractures in teeth subjected to functional load.

According to Bal et al., the findings underscore the clinical importance of material selection and placement in endodontic restorations. Fiber-reinforced composites were not only stronger than Ribbond but also packable and suitable for mesio-occluso-distal cavity restorations, making them practical for routine dental practice. The authors conclude that combining different resin-reinforced composites in cervical and coronal segments improves fracture resistance and provides a reliable method to enhance tooth durability after endodontic therapy. This study offers valuable guidance for dental professionals seeking evidence-based strategies to strengthen restorations and extend the longevity of endodontically treated teeth.

Reference:
Bal L, Keskin C, Karaca Sakallı A, Özcan B, Koçyiğit İG. Effect of combined use of reinforcement materials on the fracture resistance of MOD cavity restorations in endodontically treated teeth. Clinical Oral Investigations. 2025;29:481. doi:10.1007/s00784-025-06560-6

Keywords: Fiber-reinforced composites, fracture resistance, MOD cavity, endodontically treated teeth, cervical segment, Lena Bal, Clinical Oral Investigations

Delayed Diagnosis of Venous Thromboembolism Linked to Higher 30-Day Mortality, Study Finds

USA: A new study published in JAMA Network Open has revealed that delayed diagnosis of venous thromboembolism (VTE) is linked to a higher risk of death.

The investigation revealed that most VTE cases were diagnosed after more than 24 hours, and in many instances, delays extended beyond 72 hours. These diagnostic lapses were strongly tied to higher 30-day mortality, particularly when pulmonary embolism was missed. The findings highlight the urgent need for improved detection strategies to enhance patient safety.
The research was led by Min-Jeoung Kang from the Department of Medicine, Division of General Internal Medicine, Brigham and Women’s Hospital, Boston, Massachusetts, along with colleagues from Penn State Health. The team analyzed data from 3,525 patients across two major U.S. healthcare systems: Mass General Brigham (MGB) and Penn State Health (PSH). The study evaluated the use of the Delayed Diagnosis of VTE electronic clinical quality measure (DOVE eCQM), an automated tool designed to quantify diagnostic delays and assess their impact on patient outcomes.
For this purpose, retrospective data from electronic health records (EHRs) were assessed, covering 2016–2021 at MGB and 2019–2022 at PSH. The researchers categorized delays using thresholds of more than 24 hours and more than 72 hours. They also investigated the causes of missed opportunities, classifying them as practitioner-, system-, patient-, or other-related factors. Mortality risks were then compared between patients diagnosed promptly (within 24 hours) and those diagnosed late.
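
As a simplified illustration of the delay thresholds used, the sketch below assigns hypothetical cases to the <=24-hour, 24-72-hour, and >72-hour categories from paired timestamps. Column names and data are invented; the NLP-based DOVE eCQM logic itself is not shown.

```python
# Illustrative categorization of diagnostic delay into the study's thresholds,
# assuming timestamps for presentation and confirmed VTE diagnosis are available.
# This is a simplification, not the DOVE eCQM implementation.
import pandas as pd

records = pd.DataFrame({
    "presentation": pd.to_datetime(["2021-03-01 08:00", "2021-03-02 10:00", "2021-03-03 09:00"]),
    "vte_diagnosis": pd.to_datetime(["2021-03-01 20:00", "2021-03-04 09:00", "2021-03-07 12:00"]),
})

delay_h = (records["vte_diagnosis"] - records["presentation"]).dt.total_seconds() / 3600
records["delay_category"] = pd.cut(
    delay_h,
    bins=[0, 24, 72, float("inf")],
    labels=["<=24 h", "24-72 h", ">72 h"],
)
print(records[["delay_category"]])
```
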
The study revealed the following notable findings:
  • Diagnostic delays were common, with 79.4% of patients at MGB and 82.4% at PSH receiving a VTE diagnosis after 24 hours.
  • Delays exceeding 72 hours occurred in approximately 70% of cases at both centers.
  • Practitioner-related factors were responsible for most missed diagnoses.
  • At MGB, 30-day all-cause mortality rose from 2.5% for timely diagnoses to 8.3% for delayed diagnoses (RR 3.31).
  • At PSH, 30-day mortality increased from 4.6% to 5.9% with delayed diagnosis (RR 1.28).
  • Many deaths occurring within the first 24 hours were associated with missed pulmonary embolism.
The authors emphasized that the nonspecific symptoms of VTE often hinder timely recognition, which underscores the importance of using systematic tools like the DOVE eCQM. This platform, validated across two different EHR systems, proved effective in quantifying delays and identifying their consequences. By enabling continuous monitoring, it could guide quality improvement initiatives at institutional, regional, and even national levels.
While the study demonstrated the potential of DOVE eCQM, the authors noted that some healthcare systems may face challenges in adopting natural language processing–based platforms. Even so, they argued that such digital solutions are critical to reducing diagnostic delays and improving outcomes in outpatient and primary care settings. Future work, they added, will focus on expanding DOVE eCQM to urgent care and emergency departments, as well as developing clinical decision support systems to help clinicians recognize VTE earlier.
“Overall, the study highlights that delayed recognition of VTE is common and deadly. By leveraging electronic tools such as DOVE eCQM, healthcare systems may be better equipped to reduce missed diagnoses, improve the timeliness of care, and ultimately save lives,” the authors concluded.
Reference:
Kang M, Schreiber R, Baris VK, et al. Delayed Venous Thromboembolism Diagnosis and Mortality Risk. JAMA Netw Open. 2025;8(9):e2533928. doi:10.1001/jamanetworkopen.2025.33928

AI-Driven Model Predicts Which Preterm Infants Benefit From Platelet Transfusions

Netherlands: A multicenter study has described the development of a dynamic prediction tool that helps tailor platelet transfusion decisions for preterm infants with severe thrombocytopenia, showing that the potential benefit or harm of prophylactic transfusion can vary widely based on the infant’s real-time clinical status.

Published in JAMA, the research highlights that using this individualized model could guide clinicians in balancing the risk of major bleeding against the possibility of unnecessary transfusions.

Led by Hilde van der Staaij of the Department of Clinical Epidemiology at Leiden University Medical Center, the Netherlands, the investigators addressed a longstanding challenge in neonatal care: identifying which critically ill preterm infants genuinely benefit from early platelet transfusions. Severe thrombocytopenia—defined as a platelet count below 50 × 10⁹/L—is common in extremely premature babies, but routine prophylactic transfusions have uncertain advantages and may introduce new complications.
To create the prediction model, the team analyzed data from an international cohort of 1,042 infants admitted to 14 neonatal intensive care units across the Netherlands, Sweden, and Germany between 2017 and 2021. All infants were born before 34 weeks of gestation and experienced severe thrombocytopenia. The researchers compared two strategies at repeated two-hour intervals during the first week after onset of thrombocytopenia: administering a platelet transfusion within six hours (prophylaxis) versus withholding transfusion for three days (no prophylaxis). The main outcome was the three-day risk of major bleeding or death.
The model incorporated a broad set of predictors, including gestational and postnatal age, growth restriction, presence of necrotizing enterocolitis or sepsis, need for mechanical ventilation or vasoactive medications, platelet count trends, and prior transfusions. This “landmarking” approach, combined with a clone-censor-weight method, allowed for dynamic updates of each infant’s risk profile as their condition evolved.
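
As a simplified illustration of the landmarking idea, the sketch below builds, at each two-hour landmark, a prediction row containing an infant's most recent covariate values and an indicator for major bleeding or death within the following 72 hours. Data and column names are hypothetical, and the clone-censor-weight analysis used in the study is not reproduced.

```python
# Simplified illustration of "landmarking": at each 2-hour landmark after onset of
# severe thrombocytopenia, take the most recent covariates and ask whether major
# bleeding/death occurs within the next 72 hours. All data here are hypothetical.
import numpy as np
import pandas as pd

# Long-format, hypothetical time series: one row per infant per measurement time (hours)
obs = pd.DataFrame({
    "infant_id":  [1, 1, 1, 2, 2],
    "time_h":     [0, 6, 20, 0, 10],
    "platelets":  [45, 38, 30, 48, 52],
    "ventilated": [0, 1, 1, 0, 0],
})
# Hypothetical outcome times (hours to major bleeding/death; NaN = none observed)
outcome = {1: 30.0, 2: np.nan}

landmarks = np.arange(0, 24, 2)   # first day shown; the study used the first week
rows = []
for lm in landmarks:
    for infant, grp in obs.groupby("infant_id"):
        event_time = outcome[infant]
        # Skip landmarks at or after the event: the infant is no longer at risk
        if not np.isnan(event_time) and event_time <= lm:
            continue
        seen = grp[grp["time_h"] <= lm]
        if seen.empty:
            continue
        latest = seen.sort_values("time_h").iloc[-1]
        rows.append({
            "infant_id": infant,
            "landmark_h": lm,
            "platelets": latest["platelets"],
            "ventilated": latest["ventilated"],
            # Outcome: major bleeding/death within 72 h of this landmark
            "event_72h": bool(not np.isnan(event_time) and lm < event_time <= lm + 72),
        })

landmark_df = pd.DataFrame(rows)
print(landmark_df.head())
```
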
Key Findings:
  • Validation used a separate national cohort of 637 Dutch infants treated between 2010 and 2014.
  • The median gestational age in this group was 28 weeks.
  • The median birth weight was 900 g.
  • Major bleeding or death occurred in about one in five infants in both the validation and development cohorts.
  • Model performance was strong, with a time-dependent area under the receiver operating characteristic curve of 0.69 for the prophylactic transfusion strategy.
  • The time-dependent area under the curve was 0.85 for the no-prophylaxis strategy, indicating good discriminatory ability and calibration.
Crucially, the predicted risks varied substantially depending on the infant’s immediate clinical state. Some babies were projected to gain clear protection from early transfusion, while others faced higher odds of harm or no measurable benefit. This heterogeneity underscores that a single platelet threshold is inadequate for guiding transfusion decisions in this vulnerable population.
The authors conclude that their individualized risk algorithm offers a promising step toward more precise, evidence-based management of severe thrombocytopenia in preterm infants. While prospective trials are needed to confirm clinical impact, the tool could soon help neonatologists move away from routine prophylactic transfusions and toward a personalized strategy that optimizes outcomes and minimizes unnecessary exposure to blood products.
Reference:
van der Staaij H, Prosepe I, Caram-Deelder C, et al. Individualized Prediction of Platelet Transfusion Outcomes in Preterm Infants With Severe Thrombocytopenia. JAMA. Published online September 15, 2025. doi:10.1001/jama.2025.14194
