Atrial Fibrillation Linked to Fourfold Rise in Cardiac Arrest Risk Among HFpEF Patients: Study Finds

USA: Atrial fibrillation (AF) substantially heightens the risk of life-threatening arrhythmias and cardiac arrest in individuals with heart failure and preserved ejection fraction (HFpEF), according to a recent study published in Heart Rhythm. The study, led by Dr. Benjamin A. Rosen and his team from Deborah Heart and Lung Center, New Jersey, highlights a pressing need to refine risk assessment strategies for this growing patient population.

Despite accounting for a considerable number of deaths among HFpEF patients, sudden cardiac death (SCD) remains poorly understood in this group, with few reliable tools for identifying high-risk individuals. The current study aimed to bridge this gap by identifying potential clinical predictors of serious ventricular arrhythmias or cardiac arrest in HFpEF patients.

Researchers conducted a retrospective cohort analysis involving 951 individuals diagnosed with HFpEF between 2015 and 2022 at a single specialized cardiac care center. Patient data—including demographics and a comprehensive set of 18 clinical variables—were extracted from electronic health records using diagnostic codes. Key variables included the presence of comorbid conditions such as coronary artery disease, hypertension, diabetes, chronic obstructive pulmonary disease, sleep apnea, anemia, chronic kidney disease, and atrial fibrillation.

The key findings were as follows:

  • The average age of the study participants was 73.5 years. Women made up 52% of the total study population.
  • Coronary artery disease was observed in 75% of the patients.
  • Atrial fibrillation was present in 49% of the participants.
  • During the study period, 5% of patients (46 individuals) experienced ventricular tachyarrhythmias or cardiac arrest.
  • These events are strongly linked to a high risk of sudden cardiac death.
  • Stepwise logistic regression analysis identified atrial fibrillation as a significant independent risk factor.
  • Patients with atrial fibrillation had over four times higher odds of developing ventricular tachyarrhythmias or cardiac arrest (odds ratio 4.12).
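For readers unfamiliar with the statistic, an odds ratio compares the odds of an event between two groups. A minimal sketch in Python, using invented counts rather than the study's data (and unadjusted, whereas the reported OR of 4.12 comes from stepwise logistic regression controlling for other variables):

```python
def odds_ratio(exposed_events, exposed_no_events, unexposed_events, unexposed_no_events):
    """Unadjusted odds ratio from a 2x2 table of events by exposure."""
    odds_exposed = exposed_events / exposed_no_events
    odds_unexposed = unexposed_events / unexposed_no_events
    return odds_exposed / odds_unexposed

# Hypothetical counts (NOT from the study): 36 events among 466 AF patients,
# 10 events among 485 patients without AF.
or_af = odds_ratio(36, 430, 10, 475)  # -> about 3.98
```

An OR above 1 indicates higher odds of the outcome in the exposed group; adjusted ORs from a regression model additionally account for confounders such as age and coronary artery disease.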

These findings emphasize the crucial role that AF plays in the pathophysiology of sudden cardiac events in HFpEF patients, suggesting that AF may serve as a valuable marker in future risk stratification models.

“While HFpEF has often been considered lower risk in terms of sudden cardiac death, our data suggest that coexisting atrial fibrillation significantly alters that risk profile,” the authors noted.

Given the increasing prevalence of HFpEF and the limitations in current predictive models, the authors call for further prospective research to validate these findings and develop targeted strategies for early intervention in high-risk patients.

The study contributes to the growing recognition that HFpEF, particularly when complicated by atrial fibrillation, is far from benign and demands closer clinical attention to prevent catastrophic cardiac events.

Reference:

Rosen, B. A., & Kazemian, P. (2025). Atrial fibrillation is associated with ventricular tachyarrhythmias or cardiac arrest in heart failure with preserved ejection fraction. Heart Rhythm. https://doi.org/10.1016/j.hrthm.2025.05.050

Powered by WPeMatico

Preserving good cardiovascular health lowers the risk of mortality and MAFLD: Study

A new study published in Scientific Reports showed that high Life’s Essential 8 (LE8) scores were related to a 90% reduction in cardiovascular mortality, a 58% reduction in all-cause mortality, and an 89% reduction in the risk of metabolic dysfunction-associated fatty liver disease (MAFLD) in overweight and obese people.

About 70% of those who are overweight or obese have MAFLD, compared to 25% of the overall population. MAFLD is the most prevalent liver disease in obese people, and it can lead to more serious liver diseases like cirrhosis, liver fibrosis, and non-alcoholic steatohepatitis (NASH). The American Heart Association unveiled Life’s Essential 8 (LE8), a new cardiovascular health index, in 2022.

Its precursor, Life’s Simple 7 (LS7), has shown preventive benefits against cardiovascular and other chronic illnesses, making it a valuable tool for primary prevention in healthcare systems. This study examined the mediating roles of inflammation and insulin resistance in the association between LE8 and MAFLD, all-cause, and cardiovascular mortality in various groups.

Data from the National Health and Nutrition Examination Survey (NHANES, 2007–2018), which included 6,885 overweight and obese people, were utilized in this retrospective investigation. There were three categories for LE8 scores: low, medium, and high. The associations between LE8, MAFLD, and mortality were evaluated using Cox proportional hazards models and weighted logistic regression. 

The highest LE8 group in Model 3 had an 89% lower prevalence of MAFLD than the lowest group (OR = 0.11). 72% of this impact was mediated by HOMA-IR, whereas 3–12% was mediated by inflammatory indicators such as CRP, hs-CRP, SII, and SIRI. In Kaplan-Meier analysis, higher LE8 scores were associated with considerably improved survival and 58% lower all-cause and 90% lower cardiovascular mortality.

Inflammatory indicators mediated 5–17% of the mortality effects. Restricted cubic spline (RCS) curves showed that the LE8–MAFLD relationship was non-linear, and in subgroup analyses LE8 interacted with age. SIRI predicted mortality, while HOMA-IR best predicted MAFLD. Higher LE8 scores thus decreased the risk of MAFLD and mortality in overweight/obese persons.

Overall, these results emphasize the value of primary preventive measures meant to preserve good cardiovascular health, which can dramatically lower the risk of death and the prevalence of MAFLD, especially in those who are overweight or obese.

Source:

Zhang, W., Zou, M., Liang, J., Zhang, D., Zhou, M., Feng, H., Tang, C., Xiao, J., Zhou, Q., Yang, W., Tan, X., & Xu, Y. (2025). Association of cardiovascular health with MAFLD and mortality in overweight and obese adults and mediation by inflammation and insulin resistance. Scientific Reports, 15(1), 18791. https://doi.org/10.1038/s41598-025-03820-z


Daily Walking Volume Linked to Reduced Risk of Chronic Low Back Pain: JAMA

Researchers have found in a new cohort study that both walking volume and intensity were associated with a lower risk of chronic low back pain, with walking volume showing a stronger protective effect than intensity. The study, conducted by Rayane Haddadj and fellow researchers, was published in JAMA Network Open.

Chronic low back pain (LBP) is a common condition that poses a significant burden on individuals and healthcare systems around the world. Physical activity has been suggested as a way to manage and prevent LBP, but few studies have objectively quantified walking habits in daily life to examine their actual preventive effects. Walking is an easy and commonly performed type of exercise, but the relationship between its volume and intensity and the risk of LBP has been unclear. This research bridges that gap by employing accelerometer-based data to measure the influence of walking on the subsequent development of chronic LBP.

This large-scale, prospective, population-based cohort study was performed within the Trøndelag Health (HUNT) Study in Norway. The authors recruited 11,194 participants aged at least 20 years (mean age: 55.3 years, SD: 15.1), of whom 58.6% were women (n=6,564). All participants were free of chronic LBP at baseline (2017–2019) and had ≥1 valid day of accelerometer-measured walking data. They were followed up in 2021–2023, with a median follow-up time of 4.2 years.

Walking volume was measured in minutes per day, and walking intensity in metabolic equivalents of task (MET) per minute. Self-reported chronic LBP was the main outcome, defined as back pain lasting three or more months within the past year. Poisson regression models were used to estimate adjusted risk ratios (RRs) and 95% confidence intervals (CIs).
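As a rough sketch of the quantity being modeled: an unadjusted risk ratio is simply the ratio of cumulative incidences between two exposure groups (the study's Poisson models additionally adjust for confounders). The counts below are invented for illustration:

```python
def risk_ratio(events_exposed, n_exposed, events_ref, n_ref):
    """Unadjusted risk ratio: cumulative incidence in the exposed group
    divided by cumulative incidence in the reference group."""
    return (events_exposed / n_exposed) / (events_ref / n_ref)

# Hypothetical counts (not from the study): chronic LBP in 120 of 1,000
# high-volume walkers vs. 150 of 1,000 low-volume walkers.
# Incidences of 12% vs. 15% work out to an RR of 0.80, i.e., 20% lower risk.
rr = risk_ratio(120, 1000, 150, 1000)
```

An RR below 1 indicates a protective association, as with the walking-volume categories reported below.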

Key Findings

• At follow-up, 14.8% (n=1659) of the participants reported chronic LBP.

Restricted cubic spline analyses revealed that higher walking volume and intensity were inversely related to the risk of chronic LBP:

• Walking Volume and LBP Risk

Relative to those walking fewer than 78 minutes per day:

• 78–100 minutes/day: RR = 0.87 (95% CI: 0.77–0.98)

• 101–124 minutes/day: RR = 0.77 (95% CI: 0.68–0.87)

• 125 minutes or more/day: RR = 0.76 (95% CI: 0.67–0.87)

• Walking Intensity and LBP Risk

Relative to walking intensity of <3.00 MET/min:

• 3.00–3.11 MET/min: RR = 0.85 (95% CI: 0.75–0.96)

• 3.12–3.26 MET/min: RR = 0.82 (95% CI: 0.72–0.93)

• ≥3.27 MET/min: RR = 0.82 (95% CI: 0.72–0.93)

This cohort study concludes that higher daily walking volume is linked with a significantly reduced risk of developing chronic low back pain, and that walking intensity also has a secondary protective effect. The use of accelerometer-derived data offers strong, objective evidence supporting the encouragement of regular walking as part of public health interventions. Walking more each day, particularly in excess of 100 minutes, may be an easy and effective means of preventing chronic back pain in the general population.

Reference:

Haddadj R, Nordstoga AL, Nilsen TIL, et al. Volume and Intensity of Walking and Risk of Chronic Low Back Pain. JAMA Netw Open. 2025;8(6):e2515592. doi:10.1001/jamanetworkopen.2025.15592


Increasing eGDR levels reduces risk of liver fibrosis and MASLD: BMC Study

A new study published in BMC Endocrine Disorders showed that increasing estimated glucose disposal rate (eGDR) levels reduced the risk of liver fibrosis by up to 95% and of metabolic dysfunction-associated steatotic liver disease (MASLD) by up to 87% when compared with the lowest eGDR levels.

MASLD is now the most common cause of chronic liver disease, with a global incidence of about 30% in adults. Type 2 diabetes, hypertension, obesity, dyslipidemia, and cardiovascular disease are all significantly correlated with MASLD, which has been identified as a metabolic disorder. There is proof that the development of MASLD is significantly influenced by insulin resistance (IR), which is also connected to the advancement of liver fibrosis.

The triglyceride-glucose index (TyG index), estimated glucose disposal rate, and homeostasis model assessment of insulin resistance (HOMA-IR) are some of the noninvasive measures that have been proposed to quantify IR. Thus, this study aimed to evaluate the relationship between eGDR and both MASLD and liver fibrosis.
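The article does not reproduce the eGDR equation itself; one widely cited version is based on waist circumference, hypertension status, and HbA1c, and can be sketched as follows (the coefficients are an assumption about which published formula the authors used):

```python
def egdr(waist_cm, hypertensive, hba1c_pct):
    """Estimated glucose disposal rate (mg/kg/min).

    Uses one widely cited formula based on waist circumference (cm),
    hypertension status (True/False), and HbA1c (%); the exact version
    used by the study's authors is an assumption here. Lower values
    indicate greater insulin resistance."""
    return (21.158
            - 0.09 * waist_cm
            - 3.407 * (1 if hypertensive else 0)
            - 0.551 * hba1c_pct)

# Illustrative: a hypertensive person with a 100 cm waist and HbA1c of 7%
# gets an eGDR of about 4.9 mg/kg/min, i.e., substantial insulin resistance.
```

The appeal noted in the article follows from the inputs: all three are available from routine clinical and laboratory assessment, with no insulin assay required.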

Nearly 3,100 individuals from the 2017–2018 National Health and Nutrition Examination Survey (NHANES) were included in this study. Binary logistic regression analysis was used to investigate the association of eGDR with MASLD and liver fibrosis. The capacity of eGDR to detect MASLD was estimated using receiver operating characteristic (ROC) curve analysis.

The participants’ average age was 54.59 (SD 17.29) years, and 49.26% were female. Liver fibrosis was seen in 11.15% of cases and MASLD in 62.19%. In the fully adjusted models, eGDR was negatively correlated with the controlled attenuation parameter (CAP) and liver stiffness measurement (LSM), with βs of -15.18 and -0.74, respectively (all p < 0.01).

eGDR was negatively associated with MASLD and liver fibrosis, with odds ratios (ORs) of 0.53 (95% CI: 0.48–0.74) and 0.40 (95% CI: 0.28–0.57), respectively (all p < 0.01). The area under the curve (AUC) of eGDR for detecting liver fibrosis and MASLD was 0.75 and 0.74, respectively.
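The AUC reported here has a simple probabilistic reading: the chance that a randomly chosen diseased participant ranks higher on the score than a randomly chosen non-diseased one. Because lower eGDR signals higher risk, the negated eGDR serves as the score. A small sketch with invented values (not the study's data):

```python
def roc_auc(scores_cases, scores_controls):
    """AUC via the Mann-Whitney identity: the probability that a random
    case scores higher than a random control, with ties counting half."""
    wins = 0.0
    for c in scores_cases:
        for k in scores_controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(scores_cases) * len(scores_controls))

# Invented eGDR values (mg/kg/min); cases tend to have lower eGDR,
# so values are negated before scoring.
egdr_cases = [3.2, 4.1, 5.0]
egdr_controls = [6.5, 4.5, 7.8, 4.8]
auc = roc_auc([-x for x in egdr_cases], [-x for x in egdr_controls])
```

An AUC of 0.5 means no discrimination and 1.0 perfect discrimination, so values around 0.74–0.75 indicate moderate diagnostic ability.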

Overall, easy, accurate, and affordable ways to identify MASLD are needed in clinical practice and epidemiological research. eGDR can be derived from routine clinical and laboratory measurements, making it simple to monitor. Consequently, eGDR could be a useful tool for the non-invasive identification of patients with MASLD.

Source:

Liu, W., Li, X., Chen, L., & Luo, X. (2025). The association between estimated glucose disposal rate and metabolic dysfunction-associated steatotic liver disease and liver fibrosis in US adults. BMC Endocrine Disorders, 25(1), 67. https://doi.org/10.1186/s12902-025-01891-7


Can a psychedelic compound from mushrooms benefit people with cancer and major depression?

New results from a clinical trial reveal that a single dose of psilocybin, a naturally occurring psychedelic compound found in mushrooms, can provide sustained reductions in depression and anxiety in individuals with cancer suffering from major depressive disorder. The findings are published online by Wiley in CANCER, a peer-reviewed journal of the American Cancer Society.

People with cancer often struggle with depression. In this phase 2 trial, 28 patients with cancer and major depressive disorder received psychological support from a therapist prior to, during, and following a single 25-mg dose of psilocybin.

During clinical interviews conducted 2 years later, 15 (53.6%) patients demonstrated a significant reduction in depression, and 14 (50%) had sustained depression reduction as well as remission. Similarly, psilocybin reduced anxiety for 12 (42.9%) patients at 2 years.

An ongoing randomized, double-blind trial is currently evaluating up to two 25-mg doses of psilocybin versus placebo as treatment for depression and anxiety in patients with cancer. This study builds on the single-dose study in an effort to bring a larger proportion of patients into remission of depression and anxiety.

“One dose of psilocybin with psychological support to treat depression has a long-term positive impact on relieving depression for as much as 2 years for a substantial portion of patients with cancer, and we’re exploring whether repeating the treatment resolves depression for more than half of the patients,” said lead author Manish Agrawal, MD, of Sunstone Therapies. “If randomized testing shows similar results, this could lead to greater use of psilocybin to treat depression in patients with cancer.”

Reference:

Agrawal, M., Roddy, K., Jenkins, B., Leeks, C., & Emanuel, E. Long-term benefits of single-dose psilocybin in depressed patients with cancer. Cancer. https://doi.org/10.1002/cncr.35889


LVAD Good for Heart Failure but May Increase Stroke Risk, Reveals Research

Researchers have found in a new study that people with Left Ventricular Assist Devices (LVADs) have an 11% to 47% higher risk of developing blood clots that can lead to stroke compared to the general population.

For people with advanced heart failure, left ventricular assist devices, or LVADs, can be a literal lifesaver.

The implantable devices, which improve blood flow throughout the body, are often the last treatment option for patients with advanced heart failure. More than 14,000 people have one, and with heart failure impacting 26 million people globally, their use is likely to grow.

Yet they come with risks: Compared to the general population, people with LVADs face an 11% to 47% higher risk of developing blood clots that can travel to the brain and cause a stroke.

It’s not clear why some LVAD patients have strokes while others don’t. But a new study led by engineers at CU Boulder, CU Anschutz and the University of Washington suggests the answer could lie in hemodynamics, the patterns of blood flow within the body.

The researchers created “digital twins” of real patients with LVADs to map their blood flow. Their findings revealed new insights into how strokes might emerge.

“We are in an age where there is quite a bit of data that we have access to, and we know a lot about how fluid moves through the arteries and veins,” said Debanjan Mukherjee, senior author of the study and assistant professor in the Paul M. Rady Department of Mechanical Engineering at CU Boulder. “We are looking at blood flow patterns as information that currently is not incorporated in clinical practice.”

Engineering concepts like fluid dynamics can offer a unique lens for looking at complex medical issues and provide information that other diagnostic tools might miss, the authors said.

“Knowledge gained from this study can help us develop patient-specific implant techniques to reduce the likelihood of stroke in patients with durable LVADs,” said Jay Pal, professor and chief of cardiac surgery at the University of Washington, and a co-author of the study.

Heart failure, LVADs and the risk of stroke

The body relies on a constant supply of fresh blood and oxygen to function. Heart failure occurs when the heart can no longer pump the amount of blood the rest of the body needs.

During a healthy heartbeat, the left ventricle of the heart constricts and pushes blood into the arteries, where it travels to the body’s organs, muscles and bones.

But in people with heart failure, the left ventricle can become weak and ineffective. An LVAD attaches directly to the heart, bypasses the left ventricle and pumps blood straight into the aorta, the biggest artery in the body.

LVADs can help patients live longer, healthier lives, but they can also raise the risk of blood clots. When blood stagnates in areas like the left ventricle, clots can easily form there and enter major blood vessels.

These clots can travel through the body and land in a variety of places, but the arteries supplying blood to the brain are an especially dangerous spot. Clots that get stuck there can restrict or cut off blood flow to parts of the brain and cause a stroke.

In the current study, supported by the National Institutes of Health and CU’s AB Nexus initiative, Mukherjee and his colleagues explored whether different blood flow patterns in people with LVADs could explain who does and doesn’t get strokes.

To answer this question, the research team, led by former graduate student Akshita Sahni of CU Boulder, collected data from 12 people with LVADs. Six had developed strokes after their LVAD implantation, and six had not.

The group created 3D digital twins of each LVAD patient using detailed imaging of the aorta, nearby blood vessels and the part of the LVAD that attaches to it. The researchers also integrated each individual’s clinical information, such as blood pressure and heart rate, into the models.

“We are basically digitally recreating something that’s going on inside the body,” Mukherjee said.

Using the twins, the group was able to estimate the patterns of blood flow through each person’s aorta. They also simulated how blood might flow through the same people before they got their LVADs.

The researchers found differences in blood flow patterns between the patients who had strokes and those who did not, both before and after they had LVADs implanted.

The team also found the LVADs changed the blood flow patterns in each patient, creating a “jet” that pushed blood into the aorta at a different angle than normal blood flow from the heart.

What this means for the future

Such differences in blood flow could help shed light on LVAD patients’ risk level for stroke. For example, varied blood flow patterns might make some people more prone to areas of stagnation, where blood platelets may more easily stick onto gel-like networks of proteins in the blood and form clots.

The findings could help improve treatments and outcomes for people with heart failure. With this information, health care providers can personalize how they surgically implant and monitor LVADs in their patients. They might also be able to anticipate their patients’ level of risk and provide more customized treatments for each person.

Mukherjee and his collaborators are planning additional research on the topic, but he emphasized that some of this work will only be possible with federal support and funding.

“In these times, it is important to remember how much federal agency support means to getting studies like these completed and developed further,” he said.

Reference:

Sahni, A., Majee, S., Pal, J. D., McIntyre, E. E., Cao, K., & Mukherjee, D. Hemodynamics indicates differences between patients with and without a stroke outcome after left ventricular assist device implantation. Computers in Biology and Medicine. https://doi.org/10.1016/j.compbiomed.2025.109877


FDA Approves Keytruda as First Immune Checkpoint Inhibitor for Resectable Locally Advanced Head and Neck Cancer

Pembrolizumab, an immune checkpoint inhibitor, has been approved by the U.S. Food and Drug Administration (FDA) for the treatment of patients with resectable locally advanced head and neck squamous cell carcinoma whose tumors express PD-L1 [Combined Positive Score (CPS) ≥1] as determined by an FDA-approved test.

The FDA approval is based on data from the pivotal KEYNOTE-689 study, a randomized, open-label phase 3 clinical trial in which patients who received pembrolizumab before, during and after standard-of-care surgery had longer event-free survival without the cancer coming back and higher rates of substantial tumor shrinkage prior to surgery. The study was led by investigators from Dana-Farber Brigham Cancer Center and Washington University School of Medicine in St. Louis.

This new regimen represents a substantial change in workflow for head and neck cancer care, offering appropriate patients the option of receiving pembrolizumab before surgery for resectable locally advanced head and neck cancer.

“These findings represent a truly exciting time for our patients, as it is the first advance in this field in over two decades,” said Dr. Ravindra Uppaluri, the study’s overall principal investigator, director of Head and Neck Surgical Oncology at Dana-Farber and Brigham and Women’s Hospital, and Brigham and Women’s Hospital Endowed Chair in Otolaryngology.

“This is the first approval of a checkpoint inhibitor in the curative, perioperative setting and it represents a massive paradigm shift in how we manage surgically treated head and neck cancer going forward,” said Dr. Robert Haddad, chief of the Division of Head and Neck Oncology and the McGraw Chair in Head and Neck Oncology at Dana-Farber, professor of medicine at Harvard Medical School and the Dana-Farber Brigham Cancer Center principal investigator and member of the KEYNOTE-689 steering committee.

The KEYNOTE-689 trial randomized 714 patients with newly diagnosed stage 3 or stage 4A head and neck squamous cell cancer to receive either pembrolizumab before (called neoadjuvant), during and after (called adjuvant) standard of care or standard of care alone. The investigators also measured the presence of the target of pembrolizumab, PD-L1, in tumors to determine if higher scores of PD-L1 in tumors would affect response to treatment.

The study met its primary endpoint, showing that patients who received pembrolizumab had longer event-free survival. Median event-free survival was 51.8 months with pembrolizumab versus 30.4 months without, after a median follow-up of 38.3 months. The team also observed significantly higher rates of major pathologic response, a substantial immune-mediated tumor destruction seen in surgical resection specimens.

The treatment was found to be safe, with no new side effects observed. Further, patients taking pembrolizumab underwent surgery in a timely manner, without delays from immunotherapy-related side effects.


Non-Surgical Root Canal Resolves Rare Mandibular Molar Malformation in Teenager: Case Report

Spain: In a recently published case report in the Journal of Endodontics, researchers from Spain have detailed the successful non-surgical management of a rare and complex dental anomaly—Dens Invaginatus (DI)—in a mandibular first molar of a teenage boy, emphasizing the value of conservative treatment paired with advanced diagnostic imaging.

The case involved a 14-year-old boy who presented with pain and swelling in the lower right molar region, specifically tooth #30. Upon examination, the tooth exhibited an unusual crown shape but no visible signs of decay. Despite these atypical features, the tooth remained firm with normal periodontal probing depths. Further investigation through intraoral radiographs and cone beam computed tomography (CBCT) revealed the presence of Oehlers’ Type II Dens Invaginatus—an uncommon developmental anomaly where enamel and dentin fold into the pulp cavity—and a large radiolucent area around the root, indicative of periradicular pathology.

The patient’s medical history was also noteworthy. Born prematurely at 32 weeks, he had undergone intrauterine feto-fetal blood transfusion and had experienced both stroke and respiratory arrest at birth. Despite these early health complications, he was in good general health at the time of evaluation.

Given the complexity of the case, a conservative, non-surgical approach was selected. Root canal therapy was carried out with precision to thoroughly clean and disinfect the canal system, which is often complicated in teeth affected by DI due to their highly variable internal anatomy. The root canal system was subsequently sealed with a biocompatible filling material, ensuring long-term integrity.

The patient was monitored for 46 months. Throughout this follow-up, he remained asymptomatic, with no pain or sensitivity, and the treated tooth remained stable and functional. Most significantly, radiographic examinations confirmed progressive and ultimately complete healing of the periapical tissues.

The authors emphasized that, to their knowledge, very few cases of DI in mandibular molars have been conservatively managed with guidance from CBCT imaging. This report not only showcases a successful treatment outcome but also underscores the critical role of advanced imaging tools in diagnosing and planning treatment for complex dental anomalies. CBCT allowed for precise visualization of the abnormal anatomy, which contributed to the tailored, effective treatment strategy.

The authors conclude, “The case contributes to the limited body of literature on non-surgical treatment of Dens Invaginatus in lower molars, demonstrating that even challenging anatomical cases can be managed effectively with modern endodontic techniques and technology.”

Reference:

Cantarini, J. M., Sans, F. A., Abbott, P. V., & Garcia-Font, M. (2025). Non-surgical endodontic treatment of an atypical mandibular first molar with a Dens Invaginatus and a large periradicular radiolucency – A case Report. Journal of Endodontics. https://doi.org/10.1016/j.joen.2025.05.023


Vascular Access Status Strongly Linked With Demoralization Syndrome in Elderly Maintenance Hemodialysis Patients: Study

Researchers have found in a new study a strong correlation between vascular access status (VAS) and demoralization syndrome (DS) in elderly maintenance hemodialysis (MHD) patients. Enhancing social support, managing anemia, and regulating mineral metabolism may improve vascular access outcomes, reduce demoralization, and enhance overall quality of life. The study, conducted by Xiangying Lv and fellow researchers, was published in BMC Nephrology.

Vascular access (VA) is the lifeblood of hemodialysis patients. Yet VA complications, pain, and functional impairment can affect not only physical well-being but emotional resilience as well. Demoralization syndrome, a psychological condition characterized by helplessness, hopelessness, and loss of purpose, is particularly common in older MHD patients, yet its relationship to VA satisfaction has not been well researched.

This research, carried out from April 2024 to October 2024, sought to fill this gap by assessing the relationship between vascular access satisfaction and demoralization among a sample of 350 older MHD patients. Using two trusted instruments, the Short Form Vascular Access Questionnaire (VAQ) and the Chinese version of the Demoralization Syndrome Scale, the research provides new information on how bodily treatment experiences affect mental health outcomes.

Participants were recruited from three tertiary hospitals and consisted of patients 60 years and older who were on long-term hemodialysis. Patients were classified into two groups according to their vascular access satisfaction scores:

VA dissatisfaction group (n = 220)

VA satisfaction group (n = 130)

Data gathered consisted of sociodemographics, laboratory values (hemoglobin and parathyroid hormone [PTH] levels), and the duration of dialysis. Binary logistic regression models were employed to determine independent predictors of dissatisfaction with vascular access, while independent t-tests were used to compare demoralization syndrome scores between the two groups.

Results

  • The comparison revealed a substantial difference in DS scores between the low and high VA satisfaction patients.

  • The average DS score for the VA dissatisfaction group was 73.6 ± 8.7, whereas it was 51.2 ± 6.9 in the VA satisfaction group (p < 0.01), strongly reflecting increased psychological distress among those dissatisfied with their vascular access.

Multivariate logistic regression identified several independent risk factors for VA dissatisfaction:

  • Solitary living made dissatisfaction more likely (OR = 2.1; 95% CI, 1.4–3.2).

  • Increased dialysis time was also an independent predictor (OR = 1.8; 95% CI, 1.2–2.7).

  • Higher PTH levels were associated with more dissatisfaction (OR = 1.5; 95% CI, 1.1–2.0).

  • Higher hemoglobin levels, on the other hand, were a protective factor (OR = 0.6; 95% CI, 0.4–0.9), indicating that adequately treated anemia may contribute to better patient satisfaction with vascular access.

This study finds a strong and clinically relevant correlation between satisfaction with vascular access and demoralization syndrome among elderly hemodialysis patients. Interventions targeting anemia, PTH control, and social isolation can increase satisfaction and reduce psychological distress. These modifiable risk factors offer multidisciplinary teams the opportunity to include vascular access assessment and emotional well-being screening in routine MHD care to improve patient quality of life.

Reference:

Lv, X., Zhang, H., Yang, L. et al. Correlation between vascular access satisfaction and demoralization syndrome in elderly patients with maintenance hemodialysis: a multi-center study. BMC Nephrol 26, 265 (2025). https://doi.org/10.1186/s12882-025-04191-3


Latent Iron Deficiency at Birth Doubles Risk in Breastfed Infants by 6 Months: Study Finds

India: A recent study published in the Journal of Pediatric Gastroenterology and Nutrition highlights a critical concern regarding iron levels in infants, particularly those who are predominantly breastfed. Conducted by Dr. Puneeth Amaresh Babu and colleagues from the Department of Paediatrics, Command Hospital Air Force Bangalore, Bengaluru, India, the research reveals that infants born with latent iron deficiency (LID) face a significantly higher risk of developing iron deficiency by six months of age, compared to those born with normal iron status (NIS).

The prospective cohort study was conducted at a tertiary care center in southern India and included neonates born at more than 34 weeks of gestation. Based on cord serum ferritin levels at birth, infants were classified into two groups: those with NIS (ferritin greater than 75 ng/mL) and those with LID (ferritin 11–75 ng/mL).
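The study's grouping is a simple threshold rule on cord serum ferritin. As a sketch (the function name is hypothetical; the cut-offs are those stated by the study):

```python
def classify_iron_status(cord_ferritin_ng_ml):
    """Classify newborn iron status from cord serum ferritin (ng/mL),
    using the study's cut-offs: >75 normal iron status (NIS),
    11-75 latent iron deficiency (LID)."""
    if cord_ferritin_ng_ml > 75:
        return "NIS"
    if cord_ferritin_ng_ml >= 11:
        return "LID"
    # Ferritin below 11 ng/mL falls outside the two study groups.
    return "below study range"
```

For example, a cord ferritin of 50 ng/mL would place an infant in the LID group, flagging a higher risk of iron deficiency by six months.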

Out of 559 neonates enrolled in the study, 45 were categorized as having LID and 514 as having NIS. At the six-month follow-up, iron status was reassessed in 272 infants—33 from the LID group and 239 from the NIS group—by measuring hemoglobin and serum ferritin levels.

The study revealed the following findings:

  • 48.5% of infants with latent iron deficiency (LID) at birth developed iron deficiency by six months.
  • 24.7% of infants with normal iron status (NIS) at birth developed iron deficiency by six months.
  • Infants with LID had a relative risk of 2.3, indicating more than double the risk compared to those with NIS.
  • Average hemoglobin levels at six months were similar between groups: 10.21 g/dL in the LID group and 10.48 g/dL in the NIS group.
  • These comparable hemoglobin levels suggest that hemoglobin alone may not reliably detect emerging iron deficiency.
  • The overall incidence of iron deficiency among the study cohort was 27.6%.
  • Despite the advantages of exclusive breastfeeding, the findings highlight the need to reassess iron supplementation strategies in early infancy.
  • Special attention is warranted for infants predisposed to low iron stores at birth to prevent early iron deficiency.

The authors stress the importance of early screening for latent iron deficiency and recommend timely intervention through iron supplementation for at-risk infants. This approach may help reduce the burden of iron deficiency during the critical window of early childhood development, a period essential for cognitive and physical growth.

The authors concluded, “With iron deficiency remaining a leading cause of anemia in infancy, the study adds valuable insight into the early identification and prevention of the condition, especially in resource-limited settings where routine screening may not be standard practice.”

Reference:

Babu, P. A., Garg, A. K., Patnaik, S. K., & John, B. M. Risk of iron deficiency at 6 months in predominantly breastfed infants with normal iron status and latent iron deficiency at birth. Journal of Pediatric Gastroenterology and Nutrition. https://doi.org/10.1002/jpn3.70116
