Study Reveals High Risk of Cognitive Complaints Among Audiology Patients

Netherlands: A recent descriptive cohort study highlights a concerning trend in audiology clinics, revealing a significant prevalence of patient-reported cognitive complaints alongside several risk factors for dementia.

The study, published in the International Journal of Audiology, suggested a considerable risk of cognitive issues among the audiology clinic population, as evidenced by the elevated prevalence of self-reported cognitive complaints (SCC) and several dementia risk factors. The analysis of 1,100 patients attending audiology clinics revealed that more than 50% experienced challenges with memory and concentration.

“The findings highlighted a significant presence of dementia risk factors, with 68% of participants reporting sleep disturbances and over 50% exhibiting symptoms of sadness, anxiety, or depression. Additionally, self-reported hearing difficulties were closely linked to these cognitive issues, as well as to feelings of loneliness and vision problems,” the researchers wrote.

Research on the relationship between hearing and dementia primarily seeks to establish the causal direction of this connection. However, there is limited understanding of the prevalence of cognitive issues within a representative audiology patient population. To address this gap, Paul Merkus, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands, and colleagues set out to investigate the frequency of SCC and dementia risk factors (RF) among patients in audiology clinics.

For this purpose, the researchers enrolled 1,100 patients visiting audiology clinics (51% female, average age 61 years) and administered an online intake tool based on the International Classification of Functioning, Disability, and Health.

The analysis focused on various domains, including memory and concentration (self-reported cognitive complaints), as well as loneliness, depression, sleep issues, and vision problems (dementia risk factors), alongside self-reported hearing difficulties. The study examined prevalence rates and explored associations with demographic variables and self-reported hearing problems.

The following were the key findings of the study:

  • SCC was highly prevalent, with over half of the patients reporting memory or concentration problems.
  • Regarding dementia RF, 68% reported sleeping problems, and > 50% reported sadness, anxiety, or depressed mood.
  • Self-reported hearing problems (SHP) correlated significantly with self-reported memory problems, loneliness, and vision problems.

To conclude, the authors suggest that the audiology clinic population may face a significant risk for cognitive issues, as indicated by the elevated prevalence of self-reported cognitive complaints and dementia risk factors among participants.

Their findings underscore the importance of fostering closer collaboration between audiology and neurology care pathways, advocating for a more holistic and patient-centered approach. Furthermore, when cognitive issues are identified during the intake process at audiology clinics, assessments and communication can be tailored accordingly.

The authors also emphasize the need for future studies incorporating control groups, utilizing more detailed questionnaire items, and applying objective cognitive and hearing status measures to draw stronger conclusions regarding cognitive risks within the hard-of-hearing population.

Reference:

Poelarends, D., Kramer, S. E., Smits, C., & Merkus, P. (2024). The prevalence of patient-reported cognitive complaints and dementia risk factors in the audiology clinic. International Journal of Audiology, 1–8. https://doi.org/10.1080/14992027.2024.2406882

Powered by WPeMatico

Most accurate ultrasound test could detect 96% of women with ovarian cancer, research reveals

An ultrasound test that detected 96% of ovarian cancers in postmenopausal women should replace the current standard-of-care test in the UK, according to a new study.

In a paper published in Lancet Oncology today (Monday 30 September), research funded by the National Institute for Health and Care Research (NIHR) and led by Professor Sudha Sundar from the University of Birmingham compared all currently available tests to diagnose ovarian cancer in postmenopausal women head-to-head in a high-quality diagnostic test accuracy study.

Of the six diagnostic tests investigated, the IOTA ADNEX model, which looks at ultrasound features (how the lump looks on ultrasound), had the best accuracy of all and could detect up to 96% of women with ovarian cancer.

The ultrasound test significantly outperforms the current standard of care in the UK, and so the researchers recommend that the IOTA ADNEX ultrasound model replace the current standard-of-care test, the risk of malignancy index (RMI1), which identifies 83% of ovarian cancers.
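To put those detection rates in concrete terms: per 1,000 women who actually have ovarian cancer, the gap between 96% and 83% sensitivity is roughly 130 additional cancers detected. A back-of-envelope check (illustrative arithmetic only, using the two figures reported above):

```python
# Cancers detected per 1,000 women with ovarian cancer, comparing
# the reported sensitivities of the two tests.
ADNEX_SENSITIVITY = 0.96  # IOTA ADNEX model (reported in the study)
RMI1_SENSITIVITY = 0.83   # current UK standard of care (reported)

women_with_cancer = 1000
detected_adnex = round(women_with_cancer * ADNEX_SENSITIVITY)  # 960
detected_rmi1 = round(women_with_cancer * RMI1_SENSITIVITY)    # 830

print(detected_adnex - detected_rmi1)  # 130 additional cancers detected
```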

Sudha Sundar, Professor of Gynaecological Cancer at the University of Birmingham and consultant in gynaecological cancer surgery at Sandwell and West Birmingham NHS Trust said:

“This is the first time that a head-to-head study of all available ovarian cancer tests has been done in the same population. Here we studied their use with symptomatic, postmenopausal women, who are most at risk of this cancer. Our trial found that the IOTA ADNEX ultrasound protocol had the highest sensitivity for detecting ovarian cancer compared to the standard of care and other tests.

“The ultrasound test also performs well when delivered by trained sonographers who have received specific training, certification and quality assurance. As the vast majority of ultrasound scans are performed by sonographers, it is important that a new standard can be delivered by as many clinical professionals as possible.

“We found that the higher sensitivity of the IOTA ADNEX model is likely to lead to some women who don’t have cancer also being flagged up as having a higher risk of cancer. However, we discussed this extensively with patients, the cancer charity Target Ovarian Cancer and NHS experts, who all agreed that in postmenopausal women, who are at higher risk of ovarian cancer, picking up more women with cancer would benefit women overall.”
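The trade-off described above, more cancers caught at the cost of more false alarms, follows directly from Bayes' rule. The sketch below uses the study's reported 96% sensitivity together with a hypothetical specificity and prevalence (those two values are illustrative assumptions, not figures from the paper):

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# 96% sensitivity is from the study; the specificity and prevalence
# below are hypothetical values chosen purely for illustration.
value = ppv(sensitivity=0.96, specificity=0.90, prevalence=0.10)
print(f"{value:.2f}")  # ~0.52: about half of positive tests are true cancers
```

Even a highly sensitive test produces a meaningful share of false positives at realistic prevalence levels, which is why the researchers weighed this against the benefit of detecting more cancers.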

Annwen Jones OBE, Chief Executive at Target Ovarian Cancer said:

“Early diagnosis of ovarian cancer is vital, and we are pleased to see this research demonstrate that there are more accurate ways of using ultrasound. The faster and earlier ovarian cancer is diagnosed, the easier it is to treat and the more successful the outcomes. Alongside this innovative research we need to see greater awareness of the symptoms of ovarian cancer, so that women know to come forward to their GP for testing and receive the best possible treatment as quickly as possible. It is crucial that new ways of working like this are rolled out as quickly as possible.”

The research team note that the IOTA ADNEX model achieved 96% accuracy when delivered by NHS sonographers who were appropriately trained and received quality assurance. As most scans worldwide are carried out by sonographers rather than gynaecologists, introductory free online resources have been created by the researchers for NHS staff to undergo the specialist ultrasound training and get certification and quality assurance.

Reference:

Sundar, S., et al. (2024). Risk-prediction models in postmenopausal patients with symptoms of suspected ovarian cancer in the UK (ROCkeTS): a multicentre, prospective diagnostic accuracy study. The Lancet Oncology. https://doi.org/10.1016/S1470-2045(24)00406-6

Weekly Vitamin D Supplementation Boosts Levels in Children with Atopic Dermatitis but Fails to Alleviate Severity: Study

Chile: A recent randomized controlled trial revealed that in children with atopic dermatitis (AD), weekly vitamin D supplementation effectively increased vitamin D levels but did not significantly change the severity of the condition or type 2 immunity biomarkers compared to the placebo group.

The findings were published online in the Journal of the European Academy of Dermatology and Venereology on 14 March 2024.

Vitamin D (VD) deficiency is prevalent among patients with atopic dermatitis (AD) and is frequently linked to the severity of the condition. However, randomized trials assessing the impact of VD supplementation on AD have yielded mixed results, and there is limited data on how VD supplementation affects type 2 immunity in individuals with AD. To fill this knowledge gap, Arturo Borzutzky, Universidad Católica de Chile, Santiago, Chile, and colleagues aimed to examine the effectiveness of vitamin D supplementation in reducing the severity of atopic dermatitis and influencing type 2 immunity biomarkers.

For this purpose, the researchers conducted a randomized, double-blind, placebo-controlled trial. They randomly assigned 101 children with atopic dermatitis to receive either weekly oral vitamin D3 (VD3) or a placebo for six weeks. The primary outcome measured was the change in the Severity Scoring of Atopic Dermatitis (SCORAD).

The study led to the following findings:

  • The mean age of participants was 6.3 ± 4.0 years, with a baseline SCORAD score of 32 ± 29. At the study’s outset, 57% of the children were vitamin D deficient, with no significant differences between the groups.
  • The increase in 25(OH)D levels was significantly greater in the vitamin D3 group compared to the placebo group (+43.4 ± 34.5 nmol/L versus +2.3 ± 21.2 nmol/L).
  • There was no significant difference in SCORAD change at the six-week mark between the two groups (−5.3 ± 11.6 for vitamin D versus −5.5 ± 9.9 for placebo).
  • There were no notable differences between the groups regarding changes in eosinophil counts, total IgE levels, specific IgE for Staphylococcal enterotoxin, or the cytokines CCL17, CCL22, CCL27, and LL-37.
  • Staphylococcus aureus colonization in lesional skin showed no significant variation.
  • Single nucleotide polymorphisms in the vitamin D receptor (VDR) gene, specifically FokI, ApaI, and TaqI, did not influence the subjects’ response to vitamin D supplementation.

“The findings showed that while weekly vitamin D supplementation successfully improved vitamin D levels among children with atopic dermatitis, it did not lead to significant improvements in disease severity or type 2 immunity biomarkers, pointing to the need for multifaceted treatment strategies,” the researchers concluded.

Reference:

Borzutzky, A., Iturriaga, C., Pérez-Mateluna, G., Cristi, F., Cifuentes, L., Silva-Valenzuela, S., Vera-Kellet, C., Cabalín, C., Hoyos-Bachiloglu, R., Navarrete-Dechent, C., Cossio, M. L., Roy, C. L., & Camargo, C. A. (2024). Effect of weekly vitamin D supplementation on the severity of atopic dermatitis and type 2 immunity biomarkers in children: A randomized controlled trial. Journal of the European Academy of Dermatology and Venereology, 38(9), 1760-1768. https://doi.org/10.1111/jdv.19959

Cupping therapy effective in reducing pain intensity of migraines: Study

A new study published in the Journal of Pharmacopuncture showed that although cupping therapy works well to relieve migraine pain, it has no positive effects on quality of life. Migraines rank as the 6th most prevalent illness in the world. According to a recent study, 4.9% of all impairment worldwide is thought to be caused by migraine headaches, which afflict 14% to 15% of the population worldwide. Recurrent episodes of unilateral throbbing headaches lasting four to seventy-two hours, photophobia, nausea, phonophobia, vomiting, and cutaneous allodynia are the core symptoms of migraines.

Cupping therapy is a popular traditional Chinese medicine method that has been used for centuries to treat respiratory conditions, pain, inflammation, and stress, and to improve blood circulation. It is particularly popular in Southeast Asian, East Asian, and Middle Eastern regions. The primary objective of this review was to carefully evaluate and assemble the evidence on the therapeutic efficacy of cupping therapy for the treatment of migraine headaches.

ClinicalTrials.gov, PubMed/MEDLINE, Cochrane CENTRAL, ProQuest, ScienceDirect, SinoMed, and the National Science and Technology Library were the seven databases thoroughly examined. Reduction of pain severity and treatment success were the primary outcomes. The risk of adverse events (AEs) and improvement in quality of life (QoL), measured by the Migraine Disability Assessment Scale (MIDAS), were the secondary outcomes. Subgroup analyses were carried out based on the cupping method (wet and dry cupping) and supplementary adjunctive therapies (such as acupuncture and/or collateral pricking).

Of 348 records screened, 18 trials with a total of 1,446 participants were included. Those who received cupping therapy had far greater treatment success rates. Wet cupping was the only modality that showed statistically significant improvement. When compared to cupping therapy alone, supplementary adjunctive therapy did not result in a greater magnitude of therapeutic success. Also, compared to baseline, cupping treatment demonstrated a substantial decrease in pain with a low probability of adverse events. However, cupping did not enhance overall quality of life. Overall, this meta-analysis indicates that cupping treatment is a safe and effective way to treat migraine headaches, with a minimal risk of side effects, decreased pain intensity, and increased treatment effectiveness.
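A meta-analysis like this one pools each trial's effect estimate, weighting more precise trials more heavily. A minimal fixed-effect (inverse-variance) sketch with made-up trial numbers, shown only to illustrate the pooling step, not to reproduce the paper's results:

```python
import math

def pool_fixed_effect(log_effects, standard_errors):
    """Inverse-variance (fixed-effect) pooled estimate on the log scale."""
    weights = [1 / se**2 for se in standard_errors]
    pooled = sum(w * y for w, y in zip(weights, log_effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Hypothetical trials: log risk ratios for treatment success and their
# standard errors (illustrative numbers, not from the review).
log_rr = [math.log(1.4), math.log(1.2), math.log(1.6)]
se = [0.20, 0.15, 0.30]

pooled, pooled_se = pool_fixed_effect(log_rr, se)
print(f"pooled RR = {math.exp(pooled):.2f}")
```

The second trial has the smallest standard error, so it pulls the pooled estimate toward its own effect; real meta-analyses typically also report heterogeneity and may use a random-effects model instead.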

Reference:

Mohandes, B., Bayoumi, F. E. A., AllahDiwaya, A. A., Falah, M. S., Alhamd, L. H., Alsawadi, R. A., Sun, Y., Ma, A., Sula, I., & Jihwaprani, M. C. (2024). Cupping Therapy for the Treatment of Migraine Headache: a systematic review and meta-analysis of clinical trials. In Journal of Pharmacopuncture (Vol. 27, Issue 3, pp. 177–189). Korean Pharmacopuncture Institute. https://doi.org/10.3831/kpi.2024.27.3.177

Study evaluates Variability in obstetric anaesthesia practice and care

It is crucial to provide excellent hospital care during labor and delivery to decrease avoidable maternal morbidity and mortality. The latest MBRRACE-UK report found that while deaths directly linked to anesthesia are uncommon, enhancing the overall peripartum care provided to 38% of those who died between 2018 and 2020 could have impacted the outcome, as only 22% received high-quality care. A recent study examined the variability in obstetric anesthesia practice and care within the UK’s National Health Service. The researchers surveyed 106 of 107 participating hospitals, representing 69% of all UK obstetric units. The key findings were:

Staffing and Training:

  • 94% of hospitals had a dedicated consultant obstetric anesthetist during working hours, but 25% of out-of-hours duty anesthetists had other clinical commitments outside of obstetrics.
  • 98% of hospitals offered multidisciplinary team training, mostly using simulation-based methods.

Facilities and Resources:

  • 47% of hospitals had dedicated high-risk antenatal clinics, and 77% provided written patient information on anesthesia options in multiple languages.
  • 69% used point-of-care testing to estimate hemoglobin concentration.

Clinical Practices:

  • 76% used patient-controlled epidural analgesia during labor, and 26% used programmed intermittent epidural boluses.
  • 93% used intrathecal diamorphine for elective cesarean deliveries, with a common dose of 300 mcg.
  • 74.5% routinely used intraoperative patient warming measures.

Outcomes and Quality Indicators:

  • 76% tracked the incidence of post-dural puncture headache, with a median incidence of 0.96%.
  • 75% and 67% recorded elective and emergency cesarean delivery rates, respectively, but only 13% measured achievement of adequate pain relief 45 minutes after epidural placement.

The study demonstrates significant variability in staffing, facilities, clinical practices, and outcome measurements related to obstetric anesthesia across the UK. The authors recommend standardizing anesthetic peripartum care based on national guidelines and systematically measuring quality indicators to ensure safe and equitable care.

Reference:

O’Carroll J, Zucco L, Warwick E, et al. (October 04, 2024) A Survey of Obstetric Anaesthesia Services and Practices in the United Kingdom. Cureus 16(10): e70851. https://doi.org/10.7759/cureus.70851

Study Reveals Link Between RBC Transfusion and Reduced Mortality in Septic Patients with CKD

China: A recent study published in Scientific Reports has highlighted a significant association between red blood cell (RBC) transfusion and reduced 28-day mortality rates among septic patients suffering from concomitant chronic kidney disease (CKD).

The findings indicate that a patient’s base excess (BE) value, Sequential Organ Failure Assessment (SOFA) score, and estimated glomerular filtration rate (eGFR) play vital roles in determining treatment outcomes and should be factored into decisions regarding RBC transfusion.

Sepsis is a life-threatening response to infection that often complicates the management of patients with chronic kidney disease. CKD patients frequently experience compromised immune function, increasing their susceptibility to infections that can trigger sepsis. The presence of both conditions can lead to lower hemoglobin levels and is linked to a higher mortality rate.

Against the above background, Xingxing Hu, Nanjing University of Chinese Medicine, Nanjing, Jiangsu, China, and colleagues aimed to examine whether RBC transfusions enhance the outcomes of septic patients with concurrent CKD and to assess the criteria for administering RBC transfusions.

For this purpose, the researchers conducted a retrospective cohort study using data from the MIMIC-IV (v2.0) database. The study included 6,604 patients with sepsis and concurrent chronic kidney disease who were admitted to the Intensive Care Unit (ICU). Propensity score matching (PSM) was employed to account for confounding variables.
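The 1:1 propensity score matching step can be illustrated with a greedy nearest-neighbour matcher on the score. A minimal sketch with toy scores (the patient IDs and numbers are hypothetical; real PSM would first fit, e.g., a logistic regression to estimate each patient's propensity score from covariates):

```python
def match_one_to_one(treated, controls, caliper=0.05):
    """Greedily pair each treated unit with the nearest unmatched control
    by propensity score, accepting only pairs within an absolute caliper."""
    pairs = []
    available = dict(controls)  # id -> propensity score, copied so we can pop
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        best = min(available.items(),
                   key=lambda kv: abs(kv[1] - t_score),
                   default=None)
        if best and abs(best[1] - t_score) <= caliper:
            pairs.append((t_id, best[0]))
            del available[best[0]]  # each control is used at most once
    return pairs

# Toy propensity scores for hypothetical transfused (treated) and
# non-transfused (control) patients.
treated = {"T1": 0.31, "T2": 0.62, "T3": 0.90}
controls = {"C1": 0.30, "C2": 0.60, "C3": 0.55, "C4": 0.10}

print(match_one_to_one(treated, controls))
# T1 pairs with C1 and T2 with C2; T3 has no control within the caliper
# and is dropped, which is how PSM trades sample size for comparability.
```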

The study led to the following findings:

  • Multivariate Cox regression analysis revealed an association between RBC transfusion and a decreased risk of 28-day mortality (HR: 0.61).
  • Following a meticulous 1:1 propensity score matching analysis between the two cohorts, the matched population revealed a notable decrease in 28-day mortality within the RBC transfusion group (HR: 0.60).
  • The researchers observed that a SOFA score ≥ 5, a Base Excess value < 3, and an eGFR < 30 may be considered when evaluating the potential need for RBC transfusion.
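The cut-offs reported above can be read together as a simple screen. A hypothetical helper (the function and its combination of criteria are illustrative only; the study reports these thresholds as factors to consider, not a validated clinical decision rule):

```python
def transfusion_factors_present(sofa: int, base_excess: float, egfr: float) -> int:
    """Count how many of the study's reported cut-offs are met:
    SOFA score >= 5, base excess < 3, eGFR < 30.
    Illustrative sketch only; not a validated decision rule."""
    return sum([sofa >= 5, base_excess < 3, egfr < 30])

# Hypothetical patients.
print(transfusion_factors_present(sofa=6, base_excess=-2.0, egfr=25))  # 3
print(transfusion_factors_present(sofa=3, base_excess=4.0, egfr=80))   # 0
```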

The findings revealed that RBC transfusion may enhance the 28-day survival rate in septic patients with concurrent CKD. In particular subgroups, factors like base excess value, SOFA score, and estimated glomerular filtration rate significantly influence treatment outcomes. Consequently, these variables should be considered when determining the need for initial RBC transfusion. The researchers suggest the necessity for additional research to validate these findings through randomized clinical trials.

“Our study found a significant link between RBC transfusion and reduced 28-day mortality in septic patients with CKD. However, its generalizability is limited due to reliance on data from a single U.S. academic center, which may not reflect broader practices or patient demographics, along with inherent retrospective limitations,” the researchers concluded.

Reference:

Chen, L., Lu, H., Lv, C., Ni, H., Yu, R., Zhang, B., & Hu, X. (2024). Association between red blood cells transfusion and 28-day mortality rate in septic patients with concomitant chronic kidney disease. Scientific Reports, 14(1), 1-10. https://doi.org/10.1038/s41598-024-75643-3

Study Finds Immediate GTR Superior for d-M2 Infrabony Defects Post Third Molar Extraction

China: In a recent retrospective study, the researchers delved into the critical timing of guided tissue regeneration (GTR) following third molar extraction.

The study, published in BMC Oral Health, found that immediate and delayed guided tissue regeneration treatments demonstrated efficacy for managing infrabony defects at the distal aspect of the second molar (d-M2) following third molar extraction.

Immediate GTR, performed concurrently with third molar extraction, and delayed GTR, administered three months later, both led to a significant reduction in defect depth, the researchers reported. Immediate GTR potentially provides an additional advantage by eliminating the need for a second surgery.

Infrabony defects commonly affect the distal aspect of the second molar (d-M2) following extraction of the adjacent third molar. Guided tissue regeneration (GTR) is a recognized treatment option for addressing these defects post-third molar removal. However, the optimal timing of GTR administration after third molar extraction remains a pivotal consideration in clinical decision-making.

Against the above background, Ying Xuan, Department of Periodontology, Hangzhou Stomatology Hospital, Hangzhou, Zhejiang Province, China, and colleagues aimed to compare delayed and immediate GTR treatments to assist clinical decision-making.

For this purpose, the research team collected d-M2 infrabony defects with a minimum 1-year follow-up. The participants were divided into three groups: the Immediate GTR group, where GTR was performed concurrently with third molar extraction; the Delayed GTR group, where GTR was delayed by at least three months after third molar extraction; and the Control group, which received only scaling and root planing during third molar extraction.

They also evaluated radiographic and clinical parameters related to the infrabony defect before GTR and post-surgery using the Kruskal-Wallis test or one-way ANOVA, followed by post-hoc Dunn’s test or the Bonferroni test for pairwise comparisons.

The following were the key findings of the study:

  • A total of 109 d-M2 infrabony defects were assessed.
  • No significant differences were found between the two GTR groups, although both showed significant reductions in infrabony defect depth compared to the control group: the immediate GTR group (2.77 ± 1.97 mm versus 0.68 ± 1.03 mm) and the delayed GTR group (2.98 ± 1.08 mm versus 0.68 ± 1.03 mm).

“Guided tissue regeneration has shown effective improvement in infrabony defects at the distal aspect of the second molar (d-M2) following third molar removal, whether performed simultaneously or delayed. Immediate GTR treatment may offer patients reduced discomfort since it requires only one surgical procedure,” the researchers wrote.

“The outcomes of GTR in infrabony defects at the distal aspect of the second molar may be affected by factors such as the morphology of the defects, the specific location of the second molar (M2), and the initial depth of the defects,” they concluded.

Reference:

Tang, SM., Liu, DX., Xiong, ZY. et al. Comparison of immediate vs. delayed guided tissue regeneration in Infrabony defect of second molars after adjacent third molar extraction: a retrospective study. BMC Oral Health 24, 830 (2024). https://doi.org/10.1186/s12903-024-04591-1

New study implicates six proteins in early-onset preeclampsia

Preeclampsia is a life-threatening pregnancy complication marked by persistent high blood pressure that is even more serious when it occurs early in pregnancy. The exact cause of early-onset preeclampsia is unknown, and it is difficult to predict, prevent and diagnose. Now, in ACS’ Journal of Proteome Research, researchers report on six proteins that could be used as targets to diagnose and treat the condition.

Preeclampsia’s key symptom is high maternal blood pressure, and serious cases can lead to maternal organ failure, low infant birth weight, or maternal or fetal death. Preeclampsia before 34 weeks of pregnancy has a higher risk of severe outcomes, especially for the fetus. But it’s difficult for health care providers to detect this condition before harmful symptoms appear, because little is known about what causes it. So, Jing Li and colleagues set out to characterize proteins in placenta tissue that may offer clues about the cause of early-onset preeclampsia and serve as targets for early detection or treatment.

The researchers collected placenta tissue from 30 pregnant people, half with early-onset preeclampsia and half with healthy pregnancies. Li and colleagues used mass spectrometry to screen molecular fragments in each sample, followed by a software program to match the fragments to their associated proteins. This process pinpointed 59 proteins that were present in different amounts (either higher or lower) for preeclamptic placenta tissue samples versus healthy placenta tissue samples. The researchers chose 16 of these proteins to target with a different, more sensitive mass spectrometry method, which more precisely measured the amounts of each protein. Of these 16 proteins, six were present in statistically different amounts across tissue sample groups:

  • Preeclamptic placenta tissue had higher levels of monocarboxylate transporter 4, ERO1-like protein alpha and pappalysin-2. These proteins are involved in synthesizing proteins and regulating growth hormones.
  • Preeclamptic placenta tissue had lower levels of desmin, caldesmon and keratin 18. These proteins play key roles in cardiovascular complications, like an enlarged heart; blood flow in placental muscle cells; and estrogen signaling and cell health in the uterus lining.

Altogether, the results suggest that cardiovascular complications or the estrogen cycle could be linked to the development of early-onset preeclampsia. The team says that more research is needed, but identifying these six proteins serves as a promising first step toward improved detection and treatment of this life-threatening condition.

Reference:

Zhou, J., et al. (2024). Proteomic Analysis Reveals Differential Protein Expression in Placental Tissues of Early-Onset Preeclampsia Patients. Journal of Proteome Research. doi.org/10.1021/acs.jproteome.4c00404.

Ultraprocessed Foods Linked to Increased Risk of Developing Lupus, reveals research

New research suggests that higher daily intakes of ultraprocessed foods (UPFs) are associated with a higher risk of developing systemic lupus erythematosus (SLE), an autoimmune disease. The recently published paper indicates that diets high in UPFs, including foods like chips, sodas, and ready-to-eat processed foods, may confer an increased risk for SLE. The findings were published in the journal Arthritis Care & Research by Rossato S. and colleagues.

The researchers included data from the Nurses’ Health Study I (NHSI) and Nurses’ Health Study II (NHSII), two large prospective cohort studies in the U.S. that enrolled more than 121,000 female registered nurses. Participants were between 20 and 50 years of age at enrollment in 1976 (NHSI) and 1989 (NHSII) and completed comprehensive dietary questionnaires every four years using a validated semiquantitative food frequency questionnaire (SQFFQ). The questionnaire captured consumption of ultraprocessed foods, reported as number of servings per day, grams, or milliliters consumed.

The SLE diagnoses of participants were confirmed by rheumatologists. Tertiles of cumulative average UPF intake were determined, ranked from highest to lowest. Time-varying Cox regression models were used to estimate hazard ratios with 95% confidence intervals for the development of SLE, adjusting for important covariates including age, race, BMI, menopausal status, and other lifestyle factors. In addition, risk was stratified by BMI, and SLE subtypes were examined further in the context of the presence or absence of anti-dsDNA antibodies.

  • This study included 99,000 female participants from NHSI and 106,000 from NHSII. The mean age of participants was 50 years in NHSI and 36 years in NHSII. More than 90% of participants self-reported as White.
  • The highest tertile of UPF intake was associated with a 56% higher risk of SLE compared with the lowest tertile (p = 0.03).
  • The association was stronger for anti-dsDNA-positive SLE cases (p = 0.01), suggesting that higher UPF intake is associated with more severe disease.
  • Among UPF categories, artificially sweetened and sugar-sweetened beverages were most strongly associated with a higher risk of SLE.
  • The associations did not change after adjustment for many confounders, including BMI, smoking, alcohol consumption, and menopausal status.

The results of this study provide novel evidence that higher UPF intake is associated with an increased risk of developing SLE. Since UPFs account for more than 50% of the total calories American adults consume daily, it is important to educate consumers about the health risks associated with such foods. Limiting UPF exposure may reduce the incidence of lupus, and could lower broader health risks as well.

Reference:

Rossato S, Oakes EG, Barbhaiya M, et al. Ultraprocessed Food Intake and Risk of Systemic Lupus Erythematosus Among Women Observed in the Nurses’ Health Study Cohorts. Arthritis Care & Research, June 2024. https://doi.org/10.1002/acr.25395

Naloxone successfully reverses signs of opioid overdose in over half of users in prehospital setting: Study

The opioid epidemic is a leading cause of morbidity and mortality in the United States, and it is increasingly impacting children and teenagers. Naloxone can reverse the effects of opioid overdose and is being used in hospitals and by emergency responders outside the hospital to save the lives of young people poisoned by opioids.

Research titled, “Naloxone Administration to Pediatric Patients During Emergency Medical Service Events,” finds that among pediatric emergency medical service responses where naloxone was administered, the first dose was successful in improving clinical status in 54.1% of cases. Approximately one-third (32.7%) of pediatric patients received two or more naloxone doses, according to an abstract presented during the American Academy of Pediatrics 2024 National Conference & Exhibition at the Orange County Convention Center from Sept. 27-Oct..

“Emergency medical services clinicians rarely reported that naloxone worsened clinical status, and naloxone improved a patient’s clinical condition in over half of emergency responses in our study,” said lead study author Christopher Gaw, MD, MPH, MBE, FAAP, emergency medicine physician at Nationwide Children’s Hospital. “This finding underscores how naloxone can be a safe and effective antidote when used for suspected opioid poisonings in children and adolescents.”

Researchers examined data from the National Emergency Medicine Service Information System (NEMSIS) on emergency medical service activations for pediatric patients ages 17 and younger in 2022, and found naloxone was administered to teenagers and children at least 6,215 times that year. The study also found that one in five adolescents 13-17 years old were documented as receiving naloxone prior to emergency medical service arrival (20.7%), meaning that somebody on the scene administered naloxone. Naloxone administrations occurred most often in the home or residential setting (61.4%). The research also found that most overdoses occur in adolescents, ages 13-17 (79.4%), followed by children ages 1-5 (10.2%), and boys were also slightly more likely to receive naloxone (55.3%).

The calls for assistance suggest there may be initial confusion in the early emergency medical service response period before a diagnosis is made. In the study, the initial emergency call was for an overdose, poisoning, or ingestion in about a third of the calls for adolescents 13-17 (31.5%) and just 12.8% of calls for infants.

“Our study highlights how EMS clinicians are reporting naloxone use after responding to different types of emergency dispatch calls, such as poisonings, unconsciousness, and problems breathing,” Dr. Gaw said. “These signs and symptoms could represent a possible opioid poisoning, which may explain why naloxone was administered in those situations.”

Reference:

Naloxone successful in over half of users in reversing signs of opioid overdose in the prehospital setting, American Academy of Pediatrics, Meeting: American Academy of Pediatrics 2024 National Conference & Exhibition.
