Tryptase has limited sensitivity as anaphylaxis biomarker, claims study

Tryptase is the most commonly used biomarker for diagnosing anaphylaxis; however, it has limited sensitivity. A recent systematic review found its sensitivity to be only 49%, raising concerns about potential misclassification of anaphylaxis cases despite its status as the most reliable current marker.

Anaphylaxis is a life-threatening allergic reaction commonly triggered by food, venom, or drugs. Clinical criteria are central to its diagnosis; however, laboratory biomarkers could provide valuable confirmation when clinical diagnosis is challenging. The researchers therefore aimed to evaluate key biomarkers, including tryptase, histamine, platelet-activating factor (PAF), PAF-acetylhydrolase (PAF-AH), and urinary prostaglandin D2 (PGD2), for their diagnostic utility in anaphylaxis.

A systematic review was conducted following PRISMA-DTA guidelines. Studies published between 2004 and 2024 and indexed in Embase or MEDLINE were included if they evaluated the diagnostic test accuracy of tryptase, histamine, PAF, PAF-AH, or urinary PGD2 in confirmed anaphylaxis cases. Pooled sensitivity and specificity estimates were calculated using the diagmeta package in R.

Twenty-eight studies with 18,749 patients were included, of whom 3,329 had anaphylaxis. Tryptase was the most frequently studied biomarker (24 studies), with a pooled sensitivity of 0.49 and specificity of 0.82. Histamine had a pooled sensitivity of 0.76 and specificity of 0.69. Limited data were available for PAF, PAF-AH, and urinary PGD2.

The review suggests that tryptase remains the most widely used and accessible biomarker for diagnosing anaphylaxis, applied mainly through the “Rule of Twos” interpretation strategy. Histamine and urinary PGD2 show potential, though their application is limited by practical challenges. Further research is needed to establish the diagnostic roles of PAF and PAF-AH, particularly in non-IgE-mediated anaphylaxis pathways.
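
For readers unfamiliar with how such pooled accuracy figures are built, the Python sketch below shows how per-study sensitivity and specificity come from simple 2x2 counts and how a crude weighted pooling might look. It is illustrative only: the counts are hypothetical, and the review itself used a random-effects approach via the diagmeta package in R rather than this simple weighting.

# Illustrative only: per-study 2x2 counts (hypothetical numbers, not study data)
# Each tuple: (true positives, false negatives, true negatives, false positives)
studies = [
    (30, 40, 160, 30),   # hypothetical study 1
    (55, 50, 210, 45),   # hypothetical study 2
    (20, 25, 90, 20),    # hypothetical study 3
]

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

per_study = [sens_spec(*s) for s in studies]

# Crude sample-size-weighted pooling; the published review used a more
# sophisticated random-effects model, so treat this only as a conceptual sketch.
weights = [sum(s) for s in studies]
pooled_sens = sum(w * se for w, (se, _) in zip(weights, per_study)) / sum(weights)
pooled_spec = sum(w * sp for w, (_, sp) in zip(weights, per_study)) / sum(weights)
print(f"Pooled sensitivity ~ {pooled_sens:.2f}, pooled specificity ~ {pooled_spec:.2f}")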

Reference:

Khalaf R, Prosty C, Davalan W, Abrams E, Kaouache M, Ben-Shoshan M. Diagnostic Utility of Biomarkers in Anaphylaxis: A Systematic Review and Meta-Analysis. J Allergy Clin Immunol Pract. 2025 Apr 14:S2213-2198(25)00362-9. doi: 10.1016/j.jaip.2025.04.008. Epub ahead of print. PMID: 40239922.

Keywords:

Tryptase; anaphylaxis; biomarkers; diagnosis; histamine; PAF; PAF-AH


Single dose of baloxavir lowers the risk of influenza virus transmission to close contacts: NEJM

A new study published in The New England Journal of Medicine showed that a single dose of the antiviral baloxavir marboxil (Xofluza) reduced the rate of influenza virus transmission to household contacts.

Because baloxavir marboxil (baloxavir) rapidly decreases influenza virus shedding, it may also reduce transmission. Studies of neuraminidase inhibitor therapy have not provided sufficient evidence that treatment prevents transmission to contacts. In addition, the use of baloxavir in practical public health strategies is limited by the lack of data regarding its efficacy, effectiveness, appropriate dose, or duration of use for the treatment or post-exposure prophylaxis of novel influenza A viruses of pandemic potential, including the highly pathogenic avian influenza A (H5N1) virus. To fill this gap in the literature, Arnold Monto and team carried out this investigation.

Influenza-positive index patients aged 5 to 64 years were randomized 1:1, within 48 hours of the initial symptoms, to receive either baloxavir or placebo. The primary outcome was transmission of influenza virus from an index patient to a household contact by day 5. The first secondary endpoint was transmission of influenza virus resulting in symptoms by day 5.

In total, 1,457 index patients and 2,681 household contacts were enrolled across the 2019–2024 influenza seasons; 726 index patients were randomized to the baloxavir group and 731 to the placebo group. Baloxavir significantly reduced the transmission of laboratory-confirmed influenza by day 5 compared with placebo, with an adjusted relative risk reduction of 29%.

The adjusted incidence of symptomatic influenza virus transmission by day 5 was 5.8% with baloxavir and 7.6% with placebo, although this difference was not statistically significant. During follow-up, drug-resistant viruses emerged in 7.2% of index patients in the baloxavir group; no resistant viruses were detected in household contacts. No new safety signals were identified.

Overall, treatment with a single oral dose of baloxavir resulted in a lower rate of influenza virus transmission to close contacts than placebo. The difference in transmission incidence favoring baloxavir was consistent across age categories, seasons, influenza types (A[H1N1pdm09], A[H3N2], and B), intervals between symptom onset and administration of baloxavir or placebo, and geographical locations.

Source:

Monto, A. S., Kuhlbusch, K., Bernasconi, C., Cao, B., Cohen, H. A., Graham, E., Hurt, A. C., Katugampola, L., Kamezawa, T., Lauring, A. S., McLean, B., Takazono, T., Widmer, A., Wildum, S., & Cowling, B. J. (2025). Efficacy of baloxavir treatment in preventing transmission of influenza. The New England Journal of Medicine, 392(16), 1582–1593. https://doi.org/10.1056/nejmoa2413156


AI-Powered ECG Model Shows Promise in Heart Failure Risk Prediction: JAMA

A new study published in JAMA Cardiology showed that a noise-adapted AI-ECG model successfully estimated heart failure risk using only lead I ECGs across diverse multinational cohorts. This suggests a potential strategy for heart failure (HF) risk stratification that could be applied using wearable and portable ECG devices, warranting further prospective studies.

Single-lead electrocardiograms (ECGs) may be recorded using portable equipment, which might allow for extensive community-based risk assessment. Thus, to determine if an artificial intelligence (AI) system can predict heart failure risk from noisy single-lead electrocardiograms, Lovedeep Dhingra and colleagues carried out this investigation.

A retrospective cohort analysis assessed persons without heart failure at baseline from the UK Biobank (UKB), the Yale New Haven Health System (YNHHS), and the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil), utilizing outpatient ECG data. The data were evaluated between September 2023 and February 2025, with the major exposure being the AI-ECG-predicted risk of left ventricular systolic dysfunction (LVSD).

Lead I ECGs were isolated to mimic wearable device signals, and a noise-adapted AI-ECG model trained to identify LVSD was applied. The model’s association with new-onset heart failure (first HF hospitalization) was investigated, and its predictive ability was compared with the PREVENT and PCP-HF risk scores using the Harrell C statistic, integrated discrimination improvement, and net reclassification improvement.

Baseline ECGs were obtained from 192,667 YNHHS patients, 42,141 UKB participants, and 13,454 ELSA-Brasil participants. Heart failure developed in 31 participants (0.2%) in ELSA-Brasil, 46 (0.1%) in UKB, and 3,697 (1.9%) in YNHHS, over median (IQR) follow-ups of 4.2 (3.7–4.5), 3.1 (2.1–4.5), and 4.6 (2.8–6.6) years, respectively.

Regardless of age, sex, comorbidities, or competing risk of mortality, a positive AI-ECG screening result for LVSD was linked to a 3- to 7-fold increased risk for HF, and every 0.1 increase in the model likelihood was linked to a 27% to 65% greater hazard across cohorts. The discrimination of AI-ECG for new-onset HF was 0.828 in ELSA-Brasil, 0.723 in YNHHS, and 0.736 in UKB.

Across cohorts, integrating AI-ECG predictions alongside the PCP-HF and PREVENT equations resulted in a higher Harrell C statistic. AI-ECG improved integrated discrimination by 0.091 to 0.205 vs PCP-HF and 0.068 to 0.192 vs PREVENT, as well as net reclassification by 18.2% to 47.2% vs PCP-HF and 11.8% to 47.5% vs PREVENT.

Overall, a noise-adapted AI-ECG model predicted HF risk using lead I ECGs across global cohorts, indicating a viable HF risk-stratification technique that needs to be studied prospectively employing wearable and portable ECG devices.

Source:

Dhingra, L. S., Aminorroaya, A., Pedroso, A. F., Khunte, A., Sangha, V., McIntyre, D., Chow, C. K., Asselbergs, F. W., Brant, L. C. C., Barreto, S. M., Ribeiro, A. L. P., Krumholz, H. M., Oikonomou, E. K., & Khera, R. (2025). Artificial intelligence-enabled prediction of heart failure risk from single-lead electrocardiograms. JAMA Cardiology. https://doi.org/10.1001/jamacardio.2025.0492


FAGR Outperforms Fibrinogen in Predicting Mortality in STEMI Patients: Study Finds

China: A recent study has highlighted the potential role of the fibrinogen-to-albumin-to-globulin ratio (FAGR) as a prognostic marker in patients with ST-elevation myocardial infarction (STEMI) undergoing emergency percutaneous coronary intervention (PCI).

The findings, published in Scientific Reports, indicate that FAGR was a significant predictor of all-cause and cardiovascular mortality in patients with acute STEMI undergoing emergency PCI. A higher FAGR was associated with increased mortality rates, and FAGR outperformed fibrinogen in prognostic accuracy (area under the ROC curve of 0.720 for all-cause mortality and 0.726 for cardiovascular mortality), highlighting its potential as a valuable prognostic marker in STEMI patients.

STEMI is a life-threatening condition that requires immediate medical intervention to restore blood flow to the heart. Despite advancements in PCI, patient outcomes vary significantly, necessitating the identification of reliable prognostic indicators. The researchers note that while the fibrinogen-to-albumin-to-globulin ratio has been recognized for its association with coronary artery disease (CAD), its role in acute STEMI remains insufficiently explored. To bridge this gap, Lixing Chen, Department of Cardiology, Kunming Medical University First Affiliated Hospital, Kunming, Yunnan Province, China, and colleagues aimed to assess the prognostic potential of FAGR in STEMI patients.

For this purpose, the researchers enrolled 1,042 STEMI patients who underwent emergency PCI at the First Affiliated Hospital of Kunming Medical University between June 2018 and January 2023. Based on the median FAGR (2.44), patients were categorized into low and high FAGR groups. The predictive value of FAGR for all-cause and cardiovascular mortality was assessed using Kaplan–Meier plots, restricted cubic spline regression, Cox survival analyses, and time-dependent ROC analyses.
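
The survival workflow described above (median split, Kaplan–Meier curves, and multivariable Cox modelling) can be outlined in Python with the lifelines package. This is not the authors' code: the data below are synthetic and the column names are hypothetical, so treat it purely as a sketch of the analysis steps.

# Sketch of the survival analysis workflow (synthetic data, hypothetical columns)
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "fagr": rng.normal(2.44, 0.6, n),              # FAGR values centered near the study median
    "followup_years": rng.exponential(3.0, n),     # follow-up time
    "death": rng.integers(0, 2, n),                # 1 = died, 0 = censored
})

# Median split into low and high FAGR groups (the study's cut-off was 2.44)
df["high_fagr"] = (df["fagr"] >= df["fagr"].median()).astype(int)

# Kaplan-Meier curves by FAGR group
kmf = KaplanMeierFitter()
for label, grp in df.groupby("high_fagr"):
    kmf.fit(grp["followup_years"], grp["death"], label=f"high FAGR = {bool(label)}")
    kmf.plot_survival_function()

# Multivariable Cox model (the study adjusted for additional covariates)
cph = CoxPHFitter()
cph.fit(df[["fagr", "followup_years", "death"]],
        duration_col="followup_years", event_col="death")
cph.print_summary()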

The key findings of the study were as follows:

  • Kaplan–Meier analysis showed a higher cumulative incidence of all-cause and cardiovascular mortality in the high FAGR group.
  • Multivariate Cox proportional hazard analysis identified FAGR as an independent predictor of all-cause and cardiovascular death.
  • For all-cause mortality, FAGR demonstrated a stronger predictive value (AUC = 0.720) compared to fibrinogen (AUC = 0.687).
  • FAGR outperformed fibrinogen for cardiovascular mortality in prediction accuracy (AUC = 0.726 vs. 0.698).

The findings suggest that FAGR is a valuable prognostic indicator in STEMI patients undergoing emergency PCI. According to the authors, the study, the first to report FAGR’s prognostic significance in STEMI, demonstrated that patients with a higher FAGR had greater all-cause and cardiovascular mortality than those with a lower FAGR. Moreover, FAGR emerged as an independent predictor of mortality, showing superior predictive accuracy compared to fibrinogen.

“However, as a single-center retrospective study, it has limitations, including potential data bias despite adjustments for confounding factors. Further prospective research is needed to validate these findings and establish FAGR’s role in clinical practice,” the authors concluded.

Reference:

Yang, S., Zhou, Y., Xu, D., Dong, Y., Tang, H., Jing, P., Lu, Y., Yuan, M., Zhao, Z., & Chen, L. (2025). The associations between the FAGR and all-cause and cardiovascular mortality in patients with STEMI. Scientific Reports, 15(1), 1-9. https://doi.org/10.1038/s41598-025-93951-0


Hypercortisolism Affects 1 in 4 Patients with Difficult-to-Control Type 2 Diabetes: Study Finds

USA: A recent study published in Diabetes Care has highlighted the significant role of hypercortisolism in patients with difficult-to-control type 2 diabetes. Researchers found that approximately one-quarter of individuals with inadequately managed type 2 diabetes, despite being on multiple medications, exhibited hypercortisolism, a condition characterized by excessive cortisol levels in the body.

Despite the use of various glucose-lowering medications, many individuals with type 2 diabetes still fail to achieve their glycemic targets. Cortisol, known as the “stress hormone,” plays a key role in regulating metabolism, blood pressure, and the body’s response to stress. However, chronic elevation of cortisol levels can have detrimental effects, particularly in individuals with type 2 diabetes, where it can worsen hyperglycemia.

In the prospective, observational study, John B. Buse, University of North Carolina School of Medicine, Chapel Hill, NC, and colleagues evaluated the prevalence of hypercortisolism as a potential factor contributing to poor glucose control.

For this purpose, the researchers screened individuals with type 2 diabetes and HbA1c levels ranging from 7.5% to 11.5% (58–102 mmol/mol) who were on two or more glucose-lowering medications, with or without micro-/macrovascular complications, or those taking multiple blood pressure–lowering medications. A 1-mg dexamethasone suppression test (DST) was conducted, excluding common causes of false-positive results.

The primary endpoint was the prevalence of hypercortisolism, defined as post-DST cortisol levels exceeding 1.8 μg/dL (50 nmol/L). The researchers also used multiple logistic regression to assess the characteristics associated with hypercortisolism and evaluated the percentage of participants with hypercortisolism and adrenal imaging abnormalities.
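
As a worked illustration of the case definition, the short Python snippet below applies the post-DST cortisol cutoff of 1.8 μg/dL (about 50 nmol/L, using the standard cortisol unit conversion of roughly 27.59 nmol/L per μg/dL) and computes a prevalence the same way the primary endpoint was derived. The individual cortisol values are hypothetical; only the 252 of 1,057 figure comes from the study.

# Worked illustration of the study's case definition (hypothetical values)
UG_DL_TO_NMOL_L = 27.59          # unit conversion factor for cortisol
CUTOFF_UG_DL = 1.8               # post-DST cortisol threshold
print(round(CUTOFF_UG_DL * UG_DL_TO_NMOL_L))   # ~50 nmol/L

post_dst_cortisol_ug_dl = [0.9, 2.4, 1.1, 3.0, 1.7]   # hypothetical participants
flagged = [c > CUTOFF_UG_DL for c in post_dst_cortisol_ug_dl]
print(f"Hypercortisolism prevalence (toy data): {sum(flagged) / len(flagged):.1%}")

# In the study itself: 252 of 1,057 participants were unsuppressed
print(f"Study prevalence: {252 / 1057:.1%}")   # ~23.8%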

Key Findings:

  • Post-DST cortisol was unsuppressed in 252 of 1,057 participants, with a prevalence of 23.8%.
  • The prevalence of hypercortisolism was 33.3% among participants with cardiac disorders.
  • Among those taking three or more blood pressure–lowering medications, the prevalence of hypercortisolism was 36.6%.
  • 34.7% of participants with hypercortisolism had adrenal imaging abnormalities.
  • Factors associated with a higher prevalence of hypercortisolism included:
    • Use of sodium–glucose cotransporter 2 inhibitors (odds ratio 1.558).
    • Use of maximum-dose glucagon-like peptide 1 receptor agonists (odds ratio 1.544).
    • Use of tirzepatide (odds ratio 1.981).
    • A higher number of blood pressure–lowering medications (odds ratio 1.390).
    • Older age (odds ratio 1.316).
    • BMI <30 kg/m2 (odds ratio 1.639).
    • Non-Latino/Hispanic ethnicity (odds ratio 3.718).
    • Use of fibrates (odds ratio 2.676) or analgesics (odds ratio 1.457).
  • All associations were statistically significant.

The authors revealed that hypercortisolism was identified in approximately 23.8% of individuals with difficult-to-control type 2 diabetes, despite the use of multiple medications. They discovered that this condition was associated with several factors, including older age, lower body mass index (BMI), non-Latino/Hispanic ethnicity, a higher burden of hypertension medications, and the use of fibrates, analgesics, or newer glucose-lowering treatments. These findings emphasize the role of hypercortisolism in exacerbating hyperglycemia in individuals with inadequately controlled type 2 diabetes.

The authors also suggest that screening for hypercortisolism could be beneficial for patients who are not meeting glycemic targets despite treatment with multiple medications, providing valuable insights into the challenges of managing type 2 diabetes.

Reference:

John B. Buse, Steven E. Kahn, Vanita R. Aroda, Richard J. Auchus, Timothy Bailey, Irina Bancos, Robert S. Busch, Elena A. Christofides, Ralph A. DeFronzo, Bradley Eilerman, James W. Findling, Vivian Fonseca, Oksana Hamidi, Yehuda Handelsman, Harold J. Miller, Jonathan G. Ownby, John C. Parker, Athena Philis-Tsimikas, Richard Pratley, Julio Rosenstock, Michael H. Shanik, Lance L. Sloan, Guillermo Umpierrez, Iulia Cristina Tudor, Tina K. Schlafly, Daniel Einhorn, CATALYST Investigators; Prevalence of Hypercortisolism in Difficult-to-Control Type 2 Diabetes. Diabetes Care 2025; dc242841. https://doi.org/10.2337/dc24-2841


Maternal childhood trauma may lead to early metabolic changes in male children: Study

Adverse situations experienced by the mother during childhood – such as neglect or physical, psychological or sexual violence – can trigger excessive weight gain in male children as early as the first two months of life. This was shown in a study that followed 352 pairs of newborns and their mothers in the cities of Guarulhos and São Paulo, Brazil. The results were published in the journal Scientific Reports.

The analyses indicated the occurrence of very early metabolic alterations in babies that not only led to weight gain above that expected for their age but also may increase the future risk of developing obesity and diabetes.

This is the first article resulting from a Thematic Project supported by FAPESP and the National Institutes of Health (NIH) in the United States. Using a database of 580 vulnerable pregnant women, the group is studying intergenerational trauma, i.e., negative effects that can be passed on to future generations, even if the offspring have not lived through such experiences.

Conducted by researchers from Columbia and Duke Universities, both in the United States, and the School of Medicine of the Federal University of São Paulo (EPM-UNIFESP) in Brazil, the study focuses on issues related to mother-baby interaction, development, and mental and physical health.

“We observed that although the babies were born weighing within the expected parameters, in the first few days of life they showed altered weight gain, far above what’s recommended as ideal by the World Health Organization [WHO],” says Andrea Parolin Jackowski, professor at UNIFESP and coordinator of the project in Brazil.

According to the WHO, the ideal weight gain in the first stage of life is up to 30 grams per day. However, the male babies in the study had an average weight gain of 35 grams per day – with some gaining up to 78 grams per day.

“The babies who took part in the study were born full-term, healthy and within the ideal weight range. All of the pregnancies we followed were low-risk, but our data showed that every adversity the mother experienced during childhood increased the babies’ weight gain by 1.8 grams per day. And this was limited to males,” the researcher reports.
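
To put the reported coefficient in perspective, the small calculation below (illustrative arithmetic only, not study code; the 1.8 g/day figure is the study's estimate for male infants, everything else is hypothetical) shows how the expected extra daily weight gain scales with the number of maternal childhood adversities, relative to the roughly 30 g/day WHO reference cited above.

# Back-of-the-envelope illustration of the reported effect size (not study code)
WHO_UPPER_IDEAL_G_PER_DAY = 30         # upper end of ideal early weight gain cited above
EXTRA_PER_ADVERSITY_G_PER_DAY = 1.8    # study estimate for male infants

for adversities in range(0, 5):
    extra = adversities * EXTRA_PER_ADVERSITY_G_PER_DAY
    share = extra / WHO_UPPER_IDEAL_G_PER_DAY
    print(f"{adversities} adversities -> about +{extra:.1f} g/day "
          f"(~{share:.0%} of the 30 g/day reference)")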

According to Jackowski, there are many factors that can influence a baby’s weight in early life, and maternal childhood trauma appears to be one of them. For this reason, the analysis took care to control for so-called confounders – variables related to the mothers’ stress levels that could influence the results. Some examples include lifetime trauma experiences (the effects of which are cumulative) and current trauma, as well as education level and socioeconomic status.

“It’s also important to note that 70% of the babies who took part in the study were exclusively breastfed. The other 30% were on mixed feeding [a combination of breast milk and formula]. This means that they weren’t eating filled cookies or other foods that could actually change their weight. Therefore, the results suggest the occurrence of an early metabolic alteration in these babies,” she says.

Why only boys?

According to the researcher, maternal trauma during childhood only had an impact on the weight of male babies because of physiological variations in the placenta associated with the sex of the fetus.

The placenta is a temporary organ composed of maternal and fetal tissue that shows structural differences and differences in the regulation and expression of steroids and proteins depending on the sex of the baby. “Male fetuses develop strategies to maintain constant growth in the face of an adverse intrauterine environment, leading to a greater risk of prematurity and fetal death,” explains the researcher.

In addition, she adds, childhood adversity is known to increase the risk of depression and anxiety during pregnancy, which can lead to increased levels of pro-inflammatory cytokines and cortisol in the intrauterine environment. “It appears that the placenta of female fetuses adapts to protect them, slowing down the growth rate without restricting intrauterine growth [i.e., the size of the baby is within the expected range at the end of pregnancy] and allowing for a higher survival rate,” she explains.

Another important issue is that the placenta of male fetuses tends to be more susceptible to fluctuations in substances and metabolites present in the maternal bloodstream compared to female placentas. “As a result, in these cases of trauma, it can become more permeable, causing the male fetus to be more exposed to inflammatory factors resulting from high levels of stress, such as cortisol and interleukins, for example.”

The work now published is the first to identify intergenerational trauma as a trigger for physical changes at such an early age. “It’s already known that adverse events in the mother’s childhood can trigger psychological and developmental problems, but our study is pioneering in showing that they can affect physical problems, such as weight gain, as early as the first two months of life,” says Jackowski.

Now, the research team, which includes Vinicius O. Santana and FAPESP postdoctoral fellow Aline C. Ramos, will follow the weight development of the children of mothers who suffered adversity in childhood until they are 24 months old. “We’re going to follow them for longer because we want to investigate the impact of the introduction of food, which usually occurs at 6 months of age,” she says.

As the researchers explain, the research suggests that metabolic changes can be modified. “It’s not a matter of determinism. We need to monitor how the metabolism and inflammatory factors behave in these babies over a longer period of time to understand how to modulate this process. It’s important to know that all of this is modifiable, and we’re now going to look at how we can intervene,” she says.

Reference:

Santana, V.O., Ramos, A.C., Cogo-Moreira, H. et al. Sex-specific association between maternal childhood adversities and offspring’s weight gain in a Brazilian cohort. Sci Rep 15, 2960 (2025). https://doi.org/10.1038/s41598-025-87078-5.


Machine learning may predict treatment outcomes in early childhood caries, suggests study

Researchers have found in a new study that machine learning shows promise in predicting treatment outcomes in early childhood caries. Furthermore, the key predictors identified can help guide targeted management strategies.

Early childhood caries (ECC) is a major oral health problem among preschool children that can significantly affect their quality of life. Machine learning can accurately predict treatment outcomes, but its use in ECC management has been limited. The aim of this study was to explore the application of machine learning in predicting the treatment outcome of ECC.

This study was a secondary analysis of a recently published clinical trial that recruited 1,070 children aged 3 to 4 years with ECC. Machine learning algorithms, including naive Bayes, logistic regression, decision tree, random forest, support vector machine, and extreme gradient boosting, were used to predict the caries-arresting outcome of ECC at the 30-month follow-up after fluoride and silver therapy. Candidate predictors included clinical parameters (caries experience and oral hygiene status), oral health-related behaviours (toothbrushing habits, feeding history, and snacking preference), and the socioeconomic backgrounds of the children. Model performance was evaluated using discrimination and calibration metrics, including accuracy, recall, precision, F1 score, area under the receiver operating characteristic curve (AUROC), and Brier score. Shapley additive explanations (SHAP) were used to identify the important predictors.

All machine learning models showed good performance in predicting the treatment outcome of ECC. The accuracy, recall, precision, F1 score, AUROC, and Brier score of the six models ranged from 0.674 to 0.740, 0.731 to 0.809, 0.762 to 0.802, 0.741 to 0.804, 0.771 to 0.859, and 0.134 to 0.227, respectively. The important predictors of the caries-arresting outcome were the surface and tooth location of the carious lesions, newly developed caries during follow-up, baseline caries experience, whether the children had assisted toothbrushing, and oral hygiene status.
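
The discrimination and calibration metrics quoted above are standard and straightforward to reproduce. The following Python sketch uses scikit-learn with hypothetical labels and predicted probabilities (not the study's data or models) to show how accuracy, recall, precision, F1 score, AUROC, and Brier score are computed for a binary caries-arrest prediction.

# How the reported metrics are computed (hypothetical labels/probabilities)
from sklearn.metrics import (accuracy_score, recall_score, precision_score,
                             f1_score, roc_auc_score, brier_score_loss)

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]            # 1 = caries arrested at 30 months
y_prob = [0.8, 0.3, 0.6, 0.9, 0.4, 0.7, 0.2, 0.5, 0.65, 0.55]  # model probabilities
y_pred = [int(p >= 0.5) for p in y_prob]           # threshold at 0.5 for class labels

print("accuracy ", accuracy_score(y_true, y_pred))
print("recall   ", recall_score(y_true, y_pred))
print("precision", precision_score(y_true, y_pred))
print("F1       ", f1_score(y_true, y_pred))
print("AUROC    ", roc_auc_score(y_true, y_prob))
print("Brier    ", brier_score_loss(y_true, y_prob))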

Machine learning can provide promising predictions of the treatment outcome of ECC. The identified key predictors would be particularly informative for targeted management of ECC.

Reference:

Wu, Y., Jia, M., Fang, Y. et al. Use machine learning to predict treatment outcome of early childhood caries. BMC Oral Health 25, 389 (2025). https://doi.org/10.1186/s12903-025-05768-y

Keywords:

Machine learning; early childhood caries; treatment outcome; predictors; support vector machine; extreme gradient boosting; SHAP


BMI tied to CV risk in ACPA-positive but not ACPA-negative RA patients: Study

Researchers have identified in a new study that body mass index (BMI) affects cardiovascular risk in rheumatoid arthritis (RA) patients differently based on their ACPA (anti-citrullinated protein antibody) status and biologic therapy use. The results indicate that greater BMI is associated with decreased cardiovascular risk in ACPA-negative patients receiving biologics but with increased cardiovascular risk in ACPA-positive patients, independent of biologic therapy. These findings underscore the intricate interplay between immune activity, body composition, and cardiovascular events in patients with RA. The study, conducted by George A. Karpouzas and colleagues, was published in RMD Open.

Rheumatoid arthritis is a chronic autoimmune disorder that not only affects the joints but also carries a greater risk of cardiovascular events such as strokes and heart attacks. Obesity is usually regarded as a risk factor for heart disease, yet the contribution of BMI to cardiovascular events in RA is less well established, particularly in light of the effects of inflammation, immune factors such as ACPA, and biologic drug therapy.

This global observational study included data from 3,982 RA patients. The authors used two categories to assess cardiovascular outcomes:

  • Major Adverse Cardiovascular Events (MACE): such as myocardial infarction, stroke, or cardiovascular mortality

  • All cardiovascular events: including MACE, in addition to angina, revascularization procedures, transient ischemic attack (TIA), peripheral arterial disease, and heart failure

Multivariable Cox regression models stratified by center risk were used to assess how BMI, ACPA status, and biologic therapy individually and collectively impacted the risk of cardiovascular events.
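
A stratified Cox model with a three-way BMI x ACPA x biologic interaction, as described above, can be outlined in Python with the lifelines package. The sketch below uses synthetic data and hypothetical column names, so it illustrates the model specification rather than the study's actual analysis.

# Conceptual sketch of a stratified Cox model with a 3-way interaction
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "bmi": rng.normal(27, 5, n),
    "acpa_positive": rng.integers(0, 2, n),
    "biologic_use": rng.integers(0, 2, n),
    "center_risk_stratum": rng.integers(0, 3, n),   # stratification variable
    "followup_years": rng.exponential(5.0, n),
    "mace": rng.integers(0, 2, n),                  # 1 = MACE, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="mace",
        strata=["center_risk_stratum"],
        formula="bmi * acpa_positive * biologic_use")  # main effects + all interactions
cph.print_summary()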

Key Findings

  • Participants: 3,982 RA patients

  • MACE events: 192

  • Total cardiovascular events: 319

In ACPA-negative biologic users:

  • MACE risk decreased by 62% with increased BMI (HR = 0.38)

  • All cardiovascular event risk decreased by 33% (HR = 0.67)

In ACPA-positive patients (any biologic use):

  • MACE risk increased by 4% per BMI unit (HR = 1.04)

  • All cardiovascular event risk increased by 3% per BMI unit (HR = 1.03)

Three-way interaction significant:

  • MACE (p < 0.001)

  • All cardiovascular events (p = 0.028)

This research shows that the effect of BMI on cardiovascular risk in RA is not uniform. Increased BMI was associated with decreased cardiovascular risk in ACPA-negative patients on biologics, whereas it was associated with increased cardiovascular risk in ACPA-positive patients regardless of biologic therapy. These results emphasize the need to personalize RA treatment and cardiovascular disease prevention according to the patient's immune status and treatment approach.

Reference:

Karpouzas, G. A., Gonzalez-Gay, M. A., Corrales, A., Myasoedova, E., Rantapää-Dahlqvist, S., Sfikakis, P. P., Dessein, P., Hitchon, C., Pascual-Ramos, V., Contreras-Yáñez, I., Colunga-Pedraza, I. J., Galarza-Delgado, D. A., Azpiri-Lopez, J. R., Semb, A. G., van Riel, P. L. C. M., Misra, D. P., Patrick, D., Bridal Logstrup, B., Hauge, E.-M., … for An inTernationAl Cardiovascular Consortium for Rheumatoid Arthritis (ATACC-RA). (2025). Influence of body mass index on cardiovascular risk in rheumatoid arthritis varies across anti-citrullinated protein antibody status and biologic use. RMD Open, 11(2), e005464. https://doi.org/10.1136/rmdopen-2025-005464


Antithrombotic Therapy Significantly Increases Risk of Reoperation for Hematoma After Thyroidectomy: Study

A recent study has identified that patients who undergo thyroidectomy while on antithrombotic therapy are at a markedly increased risk of requiring reoperation for postoperative hematoma (POH). The study, conducted by Bassam Abboud and colleagues, was published in the American Journal of Otolaryngology. The findings suggest a fivefold increased risk of POH in patients treated with anticoagulants, supporting the importance of personalized treatment planning in this group.

Postoperative hematoma is a rare yet life-threatening complication of thyroid surgery that often requires urgent treatment. To analyze the effect of antithrombotic treatment on the occurrence and severity of POH, the researchers divided patients into three main groups according to perioperative drug exposure: Group 1 underwent thyroidectomy with no antithrombotic medication, Group 2 received antiplatelet drugs, and Group 3 received anticoagulant drugs. Patients were also divided into Group A (those who developed hematoma) and Group B (those who did not). The primary aim was to measure the impact of antithrombotic therapy on hematoma risk and on the subsequent need for reoperation.

The researchers compared thyroidectomy patients across the three treatment-based groups. The incidence of postoperative hematoma was the primary outcome, with cases further classified according to whether the hematoma required conservative treatment or surgical reoperation. Risk factors including hyperthyroidism, substernal goiter, and hypertension were also recorded to assess their effect on bleeding risk. Reoperation rates were calculated for all three groups to determine the effect of drug type on surgical outcomes.

Key findings

  • The overall incidence of postoperative hematoma was 6%; only 0.1% of patients required reoperation, while the remaining 5.9% were managed conservatively.

  • 83% of hematoma reoperations occurred within the first 24 hours after surgery, highlighting the importance of early monitoring.

  • Antithrombotic medication significantly increased the risk of POH: 3.4-fold for patients on antiplatelet agents and 5.2-fold for those on anticoagulants.

  • Among those who had developed hematoma (Group A), hyperthyroidism was seen in 32% of cases versus 7% in non-hematoma cases (Group B).

  • Substernal goiter and hypertension were seen in 33% and 52% of Group A patients respectively, versus 11% and 27% in Group B.

  • Antithrombotic drug therapy was seen in 30% of hematoma cases and just 9% of non-hematoma cases.

  • Reoperation rates also significantly differed by drug category.

  • In patients on no blood thinners (Group 1), reoperation was required in only 0.08%.

  • The rate increased to 0.23% among patients on antiplatelet therapy (Group 2) and increased further to 1% among those on anticoagulant therapy (Group 3).

  • These data indicate a clear association between the use of antithrombotic drugs and the risk of severe bleeding complications following thyroidectomy.

Patients who undergo thyroidectomy while on antithrombotic therapy have a substantially higher risk of postoperative hematoma requiring reoperation. This observation emphasizes the importance of careful preoperative assessment and, where appropriate, adjustment of anticoagulant or antiplatelet treatment before surgery. Close monitoring during the first 24 hours after surgery is also essential, as this is the period of greatest risk for severe bleeding.

Reference:

Abboud, B., Abboud, C., & Meouche, M. (2025). Reoperation for hematoma in patients on perioperative antithrombotic drugs underwent thyroidectomy. American Journal of Otolaryngology, 46(4), 104636. https://doi.org/10.1016/j.amjoto.2025.104636


FIB-4 Score Positively Linked to Gallstone Risk in US Adults, confirms study

Researchers have discovered in a new study that a greater fibrosis-4 index (FIB-4) score, a noninvasive marker traditionally used to assess liver fibrosis, is strongly associated with an increased risk of gallstones among adults. Gallstones are among the most prevalent gastrointestinal diseases, but their association with markers of fibrosis in the liver such as FIB-4 is poorly understood. The new analysis offers valuable information regarding how biomarkers of the liver may be associated with risk for gallbladder disease in the general population. The study was published in BMC Gastroenterology by Huqiang Dong and colleagues.

The research used 2017–2020 National Health and Nutrition Examination Survey (NHANES) data and involved 7,771 participants, making it one of the largest nationally representative analyses to examine this association in US adults. The investigators examined the association of FIB-4 levels with the presence of gallstones; the FIB-4 index was derived from age, aspartate aminotransferase (AST), alanine aminotransferase (ALT), and platelet count.
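
The FIB-4 index itself is a simple published formula: age (years) × AST (U/L) divided by platelet count (10^9/L) × the square root of ALT (U/L). The Python helper below encodes it with hypothetical example values; only the formula, not the inputs, reflects the study.

# FIB-4 = (age x AST) / (platelets [10^9/L] x sqrt(ALT)); example values are hypothetical
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """Compute the fibrosis-4 index from routine laboratory values."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

print(round(fib4(age_years=55, ast_u_l=30, alt_u_l=25, platelets_10e9_l=220), 2))  # 1.5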

Participants were categorized into quartiles according to their FIB-4 levels, and multivariate logistic regression models were used to examine the association with risk of gallstones. Restricted cubic spline (RCS) analysis was employed to check for nonlinear trends, and threshold effects analysis was utilized to identify inflection points at which risk levels altered. Subgroup analyses were conducted to test consistency within various demographic and health subgroups, and Bonferroni correction was implemented to preserve statistical precision.
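
For the modelling step, a minimal Python sketch with statsmodels is shown below: it groups a FIB-4-like variable into quartiles and fits a logistic regression, in the spirit of the analysis described above. The data are synthetic and the covariate list is deliberately short, so it illustrates the method rather than reproducing the study.

# Quartile-based logistic regression (synthetic data, hypothetical column names)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "fib4": rng.lognormal(mean=0.2, sigma=0.5, size=n),
    "age": rng.integers(20, 80, n),
    "gallstones": rng.integers(0, 2, n),   # 1 = gallstones present
})

# Quartile groups (Q1-Q4), mirroring the grouping described above
df["fib4_quartile"] = pd.qcut(df["fib4"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Logistic regression of gallstone status on FIB-4 quartile, adjusted for age;
# the published model adjusted for many more covariates.
model = smf.logit("gallstones ~ C(fib4_quartile) + age", data=df).fit()
print(model.summary())
print(np.exp(model.params))   # odds ratios relative to Q1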

Results

The weighted prevalence of gallstones in the study population was 11%. There was a clear and statistically significant positive association between FIB-4 levels and risk of gallstones:

  • For every unit increase in FIB-4, there was a 19% increase in the odds of gallstones (Odds Ratio [OR] = 1.19; 95% Confidence Interval [CI]: 1.10 to 1.29).

  • When analyzed by quartile, individuals in the highest FIB-4 quartile (Q4) had a 60% higher risk of gallstones (OR = 1.60; 95% CI: 1.25 to 2.03) than those in the lowest quartile (Q1).

  • The RCS analysis found a nonlinear positive relationship between FIB-4 and risk for gallstones with a statistically significant nonlinear trend (p = 0.015). Risk increased sharply beyond an inflection point at a FIB-4 score of 2.43 (p = 0.001).

  • Subgroup analysis confirmed that the positive association between FIB-4 and gallstones remained robust among non-Hispanic whites, participants without heart failure or coronary artery disease, and alcohol users and smokers. After Bonferroni correction, these associations remained statistically significant (p < 0.00217).

This research establishes a robust, nonlinear association between elevated FIB-4 and increased gallstone risk in US adults. The findings point to the potential of FIB-4 as a rapid, inexpensive, and noninvasive marker of gallstone risk. Because gallstones are typically asymptomatic until complications develop, using FIB-4 as an adjunct screening tool could support earlier diagnosis and prevention efforts, particularly in high-risk patients.

Reference:

Dong, H., Zhang, Z., Fu, C. et al. Association between fibrosis-4 index (FIB-4) and gallstones: an analysis of the NHANES 2017–2020 cross-sectional study. BMC Gastroenterol 25, 229 (2025). https://doi.org/10.1186/s12876-025-03809-y
