Cancer care centre to come up in VIMSAR at cost of Rs 34 crore

Sambalpur: A cancer care centre will come up in Western Odisha’s Veer Surendra Sai Institute of Medical Sciences and Research, Burla by the end of 2024, an official said.

Construction of the cancer care centre building on the institution’s premises is in full swing, and a target has been set to complete the project by July 2024.

The public works department is carrying out the construction, the cost of which has been estimated at Rs 34 crore.


“The ongoing construction of a ground-plus-one-storey building within the institution’s premises is progressing rapidly now. Presently, the foundation part of the building is nearing completion. A bunker will also be constructed as a part of the facility,” said assistant executive engineer of PWD, Santanu Sahu.

To ensure compliance with safety regulations, eLORA registration was also done for the project, said an official of VIMSAR.

The official said two linear accelerator machines will be installed at the cancer care centre. A linear accelerator aims radiation at cancer tumours with pinpoint accuracy, sparing nearby healthy tissue, and is used to deliver several types of external beam radiation therapy, including image-guided radiation therapy and intensity-modulated radiation therapy.

This apart, the cancer care centre will provide out-patient department services to the patients dependent on the hospital, he stated.

Thousands of patients from across western Odisha, as well as from the neighbouring states of Chhattisgarh and Jharkhand, depend on VIMSAR for their healthcare needs.

Around 1,000 new cancer cases are reported at VIMSAR annually. The establishment of the cancer care centre will significantly benefit the patients, offering advanced treatment options and improving overall healthcare accessibility in the region, said a doctor of VIMSAR.

Powered by WPeMatico

Chronic low exposure to air pollution in early life linked to allergic lung diseases in children

Australia: A recent study has revealed a stronger association between chronic but low exposure to PM2.5 in early life and allergic sensitisation in childhood than time-limited high exposure levels, such as that experienced during landscape fires. The findings were published online in BMC Pulmonary Medicine on December 21, 2023.

According to the researchers, this is the first study to investigate allergic sensitisation concerning exposure to air pollution from a landscape fire in a non-occupational setting. They did not observe a relationship between allergic sensitisation and exposure to PM2.5 (particulate matter with an aerodynamic diameter < 2.5 μm) emitted by coal smoke, either in utero or during the first two years after birth, but increased background PM2.5 concentrations were tied to a higher prevalence of sensitisation to dust.

In terms of sensitisation to specific allergens, amongst the tested allergens, the only positive association was found with dust.

There is inconsistent evidence on the relationship between air pollution and allergic sensitisation in childhood, and this relationship has not been investigated in the context of smoke events predicted to increase with climate change. Thus, Myriam Ziou, University of Tasmania, Hobart, Tasmania, Australia, and colleagues aimed to evaluate associations between exposure in two early life periods to severe levels of PM2.5 from a mine fire, background PM2.5, and allergic sensitisation later in childhood.

For this purpose, they measured specific immunoglobulin E (IgE) levels for seven aeroallergens and total IgE levels in a cohort of children exposed to the Hazelwood coal mine fire, either in utero or during the first two years, in a region in Australia where ambient levels of PM2.5 are generally low.

They estimated personal exposure to fire-specific PM2.5 emissions based on a high-resolution meteorological and pollutant dispersion model and detailed reported movements of pregnant mothers and young children during the fire. The usual exposure to PM2.5 at the residential address at birth was also estimated using a national satellite-based land-use regression model.

Associations between PM2.5 sources and sensitization to cat, dust, grass, and fungi were estimated seven years after the fire with logistic regression, while associations with total IgE levels were estimated with linear regression.

One hundred three children presenting to the 2021 clinical follow-up agreed to provide a blood sample. Those born overseas (n=2) were excluded as their background exposure could not be estimated accurately, leaving 101 children for inclusion. Of these, 50 were born before the start of the fire, four were born during the fire, 29 were in utero during the whole fire period, and 18 were conceived after the fire.

The study led to the following findings:

  • Sensitisation to D. pteronyssinus had the highest prevalence (35.4%), while Cl. herbarum had the lowest (2.0%).
  • There were various levels of correlation between sensitisation to allergens from different categories, the strongest being between perennial rye grass pollen and D. pteronyssinus.
  • No association was found between the levels of fire-related PM2.5, for either peak or cumulative exposure, and the odds of sensitisation to any of the distinct allergen categories.
  • Early life background exposure to PM2.5 was positively associated with the odds of being sensitised to dust (adjusted OR = 1.90), but not with cat, grass, fungi, or overall sensitization.
  • No evidence was observed of a relationship between exposure to fire-related or background PM2.5 and overall total IgE in the blood.
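The adjusted odds ratio of 1.90 reported above for dust sensitisation can be made concrete with a short sketch of how an odds ratio rescales a baseline probability. The 20% baseline prevalence used below is purely a hypothetical illustration, not a figure from the study.

```python
# Illustrative only: how an odds ratio (here the study's adjusted OR of 1.90
# for dust sensitisation) converts a baseline probability into a new one.
# The 20% baseline prevalence is a hypothetical value, not from the study.

def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Scale the odds of an event by an odds ratio; return the new probability."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

p = apply_odds_ratio(baseline_prob=0.20, odds_ratio=1.90)
print(round(p, 3))  # ~0.322: a 20% baseline risk rises to about 32%
```

Note that an odds ratio is not a simple risk multiplier: a 1.90 OR raises a 20% risk to about 32%, not 38%.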

“Our study indicates chronic exposure to relatively low air pollution levels in early life could have a stronger association with atopy development than time-limited high levels, which may be part of the mechanism relating air pollution to allergic diseases, such as atopic dermatitis and allergic rhinitis,” the researchers wrote.

“Larger studies are needed to examine periods of increased risk during pregnancy and infancy based on knowledge of the maturation of the immune system,” they concluded.

Reference:

Ziou, M., Gao, C.X., Wheeler, A.J. et al. Exposure to air pollution concentrations of various intensities in early life and allergic sensitisation later in childhood. BMC Pulm Med 23, 516 (2023). https://doi.org/10.1186/s12890-023-02815-8


Testosterone replacement therapy may not increase risk of prostate cancer among men with hypogonadism

The impact of testosterone replacement therapy (TRT) on prostate cancer and other adverse prostate events remains unclear. Does TRT in men with hypogonadism elevate the risk of high-grade or any prostate cancer or other adverse prostate events? This needs to be further investigated.

Men with hypogonadism and no high risk for prostate cancer who received TRT showed low and similar incidences of high-grade or any prostate cancer, acute urinary retention, and invasive surgical procedures for BPH compared to a placebo. TRT did not worsen lower urinary tract symptoms, according to a recent study published in JAMA Network Open.

The study, conducted across 316 clinical trial sites in the US, aimed to compare the effect of TRT versus placebo. The primary outcome measured was the incidence of adjudicated high-grade prostate cancer, and secondary endpoints included the incidence of any adjudicated prostate cancer, acute urinary retention, invasive prostate surgical procedure, prostate biopsy, and new pharmacologic treatment. The intervention effect was analyzed using a discrete-time proportional hazards model.

Key points from the study are:

  • The study included 5198 men (mean age 63 years) with hypogonadism and increased risk of, or preexisting, cardiovascular disease.
  • 2596 men received TRT.
  • 1.62 % testosterone transdermal gel was applied daily during the study period.
  • 2602 patients in the placebo group were given matching placebo gel to be applied daily.
  • At baseline, the mean PSA concentration and IPSS were 0.92 ng/mL and 7.1, respectively.
  • The mean treatment duration in the TRT group and placebo was 21.8 months and 21.6 months, respectively.
  • During 14,304 person-years of follow-up, the incidence of high-grade prostate cancer did not differ significantly between groups (0.19% for TRT and 0.12% for placebo, with a hazard ratio of 1.62).
  • There was a greater increase in PSA concentrations in testosterone-treated men.

The study’s findings will help clinicians and patients make informed decisions about the potential risks of TRT, they said.

Reference:

Bhasin S et al. Prostate Safety Events During Testosterone Replacement Therapy in Men With Hypogonadism: A Randomized Clinical Trial. JAMA Netw Open. 2023;6(12):e2348692. doi:10.1001/jamanetworkopen.2023.48692


Liso-cel cost effective second-line treatment for common form of lymphoma

Lisocabtagene maraleucel (liso-cel), a CAR T-cell therapy, is a cost-effective second-line treatment for relapsed and refractory (hard-to-treat) diffuse large B-cell lymphoma (r/r DLBCL), according to a study published today in Blood Advances. The study is the first of its kind to incorporate healthcare expenses, societal productivity losses, and patient quality of life in assessing the drug’s cost-effectiveness.

“In our study, we incorporated the often-overlooked societal costs associated with cancer treatment, which are typically neglected in cost-effectiveness analyses that focus solely on the healthcare sector related-expenses,” explained Mohamed Abou-el-Enein, MD, PhD, MSPH, an associate professor of medicine at the Keck School of Medicine at the University of Southern California and the senior author of the study. “Cancer treatments can diminish quality of life, causing work absences and challenges in managing everyday activities, especially among the elderly. Treatments that improve quality of life not only benefit the patient but also reduce these broader societal costs, which is an important aspect of our cost-effectiveness evaluation.”

DLBCL, the most prevalent form of non-Hodgkin lymphoma, is a cancer that affects lymphocytes, a type of white blood cell. In cases where DLBCL does not respond to initial treatment or recurs within 12 months after treatment completion, the standard care protocol typically includes platinum-based chemotherapy, followed by high-dose chemotherapy and autologous stem cell transplantation. In 2022, the U.S. Food and Drug Administration (FDA) granted approval to liso-cel as a second-line treatment for DLBCL. However, the cost-effectiveness of liso-cel is currently a topic of debate among oncologists, especially considering the drug’s steep price tag which increased from $410,300 to $447,227 between 2022 and 2023.

In this study, researchers performed a cost-effectiveness analysis of liso-cel for treating r/r DLBCL using a partitioned survival model, a common economic tool for assessing medical treatments across various stages of disease progression. They found that patients treated with liso-cel had an average life expectancy of 5.34 years and gained 3.64 quality-adjusted life years (QALYs), compared to the 2.47 years and 1.62 QALYs with standard care (SC). The cost-effectiveness of liso-cel, measured by the incremental cost-effectiveness ratio (ICER), was $99,669 per QALY from a healthcare sector perspective and $68,212 per QALY from a societal perspective. The ICER evaluates the additional cost required for each QALY gained, assuming a societal willingness to pay up to $100,000 per QALY. These figures indicate that liso-cel is a cost-effective treatment, staying below the $100,000 per QALY threshold.
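The ICER arithmetic described above is simple to reproduce. In the sketch below, the QALY figures (3.64 vs. 1.62) come from the article, while the absolute cost inputs are hypothetical values chosen only so the ratio lands near the reported healthcare-sector ICER of $99,669 per QALY.

```python
# Sketch of the incremental cost-effectiveness ratio (ICER) calculation.
# QALY values (3.64 for liso-cel, 1.62 for standard care) are from the
# article; the absolute cost figures are hypothetical placeholders.

def icer(cost_new: float, cost_std: float, qaly_new: float, qaly_std: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

ratio = icer(cost_new=400_000, cost_std=198_670, qaly_new=3.64, qaly_std=1.62)
print(round(ratio))     # ~99,668 dollars per extra QALY
print(ratio < 100_000)  # below the $100,000/QALY willingness-to-pay threshold
```

The same two-line calculation underlies both reported perspectives; the societal ICER is lower because productivity losses avoided by liso-cel offset part of its incremental cost.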

An important aspect of the study is its modeling based on outcomes from the TRANSFORM trial, the pivotal study of liso-cel as a second-line treatment. This approach closely mirrors the patient population from the TRANSFORM trial, thereby reducing bias in the results. The modeled population typically involved patients aged 60, with a majority (57%) being male, all having r/r DLBCL with a relapse occurring within 12 months of initial treatment. The treatment regimens were consistent with those used in the TRANSFORM trial.

Direct medical costs included in the analysis were comprehensive, covering CAR T-cell therapy procedures, chemotherapy, stem cell transplant, hospital admissions, ongoing monitoring, management of disease progression, and end-of-life care. This also encompassed the costs associated with treating adverse events. These costs were derived from published literature and public datasets. Societal costs considered lost productivity, including lost labor market earnings, the value of unpaid productivity and household activities, and out-of-pocket travel expenses. Notably, in terms of mortality-related lost labor earnings, liso-cel was associated with a loss of $110,608, which is lower than the $156,362 for SC. Additionally, both unpaid productivity loss and uncompensated household production costs were comparatively lower for liso-cel. These findings indicate that liso-cel offers economic benefits by reducing losses in both paid and unpaid labor sectors, compared to SC.

The authors’ strict adherence to data from the TRANSFORM trial introduces a limitation: long-term follow-up data could not be incorporated into the cost-effectiveness analysis. Moreover, the scenario analyses conducted, which looked at various time horizons, utility values, and list prices, reveal that keeping liso-cel cost-effective at the $100,000 per QALY threshold might pose a challenge. This underlines the critical need to manage liso-cel’s costs effectively to ensure its affordability and economic sustainability.

Looking ahead, Dr. Abou-el-Enein explains that their objective is to continue evaluating the cost-effectiveness of various CAR T-cell therapies. This information is crucial for making informed decisions about reimbursements and their integration into the healthcare system. He also stresses the significance of ongoing dialogue among clinicians, researchers, pharmaceutical manufacturers, and patients concerning the escalating costs of CAR-T cell therapy.

“This study demonstrates that CAR T-cell therapy is worth considering as a second line treatment for r/r DLBCL,” Dr. Abou-el-Enein said. “When deciding on what therapies to use in clinical practice, I think it is important that we do not shy away from these therapies solely because of their list price. We must keep our patients front and center in these decisions and continue to have conversations about how we can lower the price of CAR T-cell therapy and increase patient access to this life-saving treatment.”

Reference:

Jee H. Choe, Tianzhou Yu, Jeremy S Abramson, Mohamed Abou-el-Enein, Cost-Effectiveness of second-line lisocabtagene maraleucel in relapsed or refractory diffuse large B-cell lymphoma, Blood Advances, https://doi.org/10.1182/bloodadvances.2023011793.


Socket-shield therapy promising for preserving inter-implant papilla between adjacent central-lateral incisor implants

Socket-shield therapy is promising for preserving the inter-implant papilla between adjacent central-lateral incisor implants, suggests a new study published in the Journal of Esthetic and Restorative Dentistry.

Despite significant progress within implant prosthetic therapy, preserving the papilla between two adjacent implants in the esthetic zone, particularly between central and lateral incisors, remains challenging. This case series aims to report a papilla preservation approach between adjacent upper central-lateral incisor implants using the socket-shield technique.

Six patients with natural dentition received unilateral adjacent central-lateral incisor implants with different socket shield configurations. The esthetic outcomes were clinically assessed after 3–5 years of follow-up. Post-operative papilla fill was evaluated on intraoral images compared to baseline characteristics and the contralateral papilla. Papilla height was preserved in all cases, with minimal alterations observed.

Within the limitations of the present case series, the socket-shield technique demonstrated favorable outcomes in preserving the papilla between adjacent upper central-lateral incisor implants in the midterm follow-up. Clinical studies are warranted to validate these results.

The socket-shield technique seems promising in preserving the inter-implant papilla between adjacent central-lateral incisor implants.

Reference:

Pohl, S. Effects of socket-shield therapy on inter-implant papilla preservation between upper central and lateral incisors: A case series with 3–5 year follow-up. J Esthet Restor Dent. 2023; 1-9. doi:10.1111/jerd.13152



Statins flop in reducing risk of conduction disturbances and arrhythmias after TAVR

USA: A small retrospective study published in Cardiovascular Revascularization Medicine has shed light on the effect of antecedent statin use on arrhythmias and conduction disturbances after transcatheter aortic valve replacement (TAVR).

Alexandra J. Lansky, Yale School of Medicine, New Haven, CT, United States of America, and colleagues revealed that statins do not appear to reduce the risk of post-TAVR atrial fibrillation or conduction abnormalities.

Post-TAVR conduction disturbances and atrial fibrillation are reported to be associated with markedly worse short- and long-term prognoses. Statins have multiple pleiotropic effects that may be beneficial in the periprocedural periods of cardiac interventions. They have anti-inflammatory, antioxidant, and antithrombotic effects, can reduce sympathetic activity, and may down-regulate the renin-angiotensin system. It has been hypothesized that statins may protect against incident AF or conduction disturbance.

To determine the effect of antecedent statin usage on arrhythmias and conduction disturbances after TAVR, the research team retrospectively collected data on consecutive patients in the Yale New Haven Health TAVR Registry. It included patients who did not have a prior pacemaker, had at least 1 pre- and post-TAVR electrocardiogram, and did not have a change to their statin regimen during the index hospitalization.

The primary endpoint of the study was the composite of new pacemaker placement, other new conduction disturbances, and new AF evaluated at seven days post-TAVR.

The study revealed the following findings:

  • 612 patients met the inclusion criteria between 2012 and 2019. Of these, 162 patients were not on antecedent statins, and 450 were (28 low-intensity, 225 moderate-intensity, and 197 high-intensity).
  • After 1:1 propensity matching, 99 patients on moderate-/high-intensity statins were matched to 99 patients not on antecedent statins.
  • At 7 days, there was no significant difference in the occurrence of the primary endpoint (57 % statin users versus 46 % non-statin users).
  • There was a trend toward increased conduction disturbances 7 days after TAVR in statin users (56 % versus 42 %), but rates of AF (5 % versus 8 %) and pacemaker placement (9 % versus 15 %) were numerically lower in statin users.
  • There was no significant difference in persistent conduction disturbances (21 % versus 18 %).

“The study did not find a significant benefit of statin use on post-TAVR conduction disturbances, incident AF, or permanent pacemaker (PPM) placement; however, its power to detect differences in individual endpoints was limited,” the researchers wrote.

“Further studies are needed to identify if statins or other medications change the incidence of these complications,” they concluded.

Reference:

Shah, T., Maarek, R., See, C., Huang, H., Wang, Y., Parise, H., Forrest, J. K., & Lansky, A. J. (2024). Effect of antecedent statin usage on conduction disturbances and arrhythmias after transcatheter aortic valve replacement. Cardiovascular Revascularization Medicine, 59, 3-8. https://doi.org/10.1016/j.carrev.2023.07.022


CKD predicts incomplete revascularization and subsequent MACE in CCS patients

Egypt: Chronic kidney disease predicts incomplete revascularization and subsequent major adverse cardiovascular events (MACE) in patients with chronic coronary syndrome (CCS), a recent study published in the Indian Heart Journal has revealed.

The study showed that among patients with CCS, chronic kidney disease (CKD) is associated with a higher syntax score (SS) and incomplete revascularization prevalence. Additionally, an association was observed between incomplete revascularization and an increased risk of MACE.

Atherosclerotic plaque formation in the epicardial coronary arteries is the pathological hallmark of coronary artery disease (CAD). Several clinical manifestations can be classified as either chronic coronary syndrome or acute coronary syndrome due to the dynamic nature of the CAD process.

Chronic kidney disease and coronary artery disease constitute a high-risk combination. Despite its high prevalence, few studies have been conducted on CAD, specifically in CKD patients, frequently excluded from most trials. Therefore, there is a lack of evidence for CAD management, which might lead to the inadequate treatment of CKD patients.

To fill this knowledge gap, Shereen Ibrahim Farag, Benha University, Faculty of Medicine, Cardiology Department, Benha, Egypt, and colleagues aimed to determine the impact of CKD on the completeness of revascularization and MACE in patients with chronic coronary syndrome.

The study enrolled 400 patients with CCS who underwent revascularization by PCI. They were categorized into two groups according to their estimated glomerular filtration rate (eGFR) levels: the control group, 200 patients with eGFR ≥60 mL/min/1.73m2, and the CKD group, 200 patients with eGFR <60 mL/min/1.73m2.

Reclassification of the patients was done according to revascularization into complete and incomplete revascularization groups with a one-year follow-up to assess the MACE.

Based on the study, the researchers reported the following findings:

  • CKD patients were significantly older (65.78 ± 6.41 versus 56.70 ± 9.20 years). They had higher contrast-induced nephropathy, syntax scores, all-cause mortality, heart failure, and MACE.
  • After reclassification according to revascularization, GFR was significantly reduced among patients with incomplete revascularization (51.08 ± 28.15 versus 65.67 ± 26.62, respectively).
  • Repeated revascularization, stent thrombosis, STEMI, MACE, stroke, and all-cause mortality were more prevalent among patients with incomplete revascularization.
  • Multivariate regression analysis revealed eGFR and SS as independent predictors of incomplete revascularization.
  • The optimal eGFR cutoff value for predicting incomplete revascularization is 49.50 mL/min/1.73m2, with 58.8% sensitivity and 69.3% specificity.
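Sensitivity and specificity for a single eGFR cutoff, as quoted in the last finding, are straightforward to compute from a labeled cohort. In this sketch only the 49.50 mL/min/1.73m2 cutoff comes from the study; the toy patient data are hypothetical.

```python
# Sketch: sensitivity and specificity of an eGFR cutoff for predicting
# incomplete revascularization. Only the 49.50 cutoff is from the study;
# the toy cohort below is hypothetical.

CUTOFF = 49.50  # eGFR (mL/min/1.73 m^2) below which incomplete revascularization is predicted

def sensitivity_specificity(egfr_values, incomplete_flags, cutoff=CUTOFF):
    tp = fn = tn = fp = 0
    for egfr, incomplete in zip(egfr_values, incomplete_flags):
        predicted = egfr < cutoff
        if incomplete and predicted:
            tp += 1  # correctly flagged as incomplete
        elif incomplete:
            fn += 1  # missed case
        elif predicted:
            fp += 1  # false alarm
        else:
            tn += 1  # correctly cleared
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical toy cohort: eGFR values and whether revascularization was incomplete.
egfrs = [40, 45, 55, 70, 48, 62, 35, 80]
flags = [True, True, True, False, False, False, True, False]
sens, spec = sensitivity_specificity(egfrs, flags)
print(round(sens, 2), round(spec, 2))  # 0.75 0.75 on this toy data
```

The study's reported 58.8%/69.3% pair simply reflects these same ratios computed over its 400-patient cohort at the 49.50 cutoff.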

“The findings showed that chronic kidney disease is linked with a higher syntax score and a higher prevalence of incomplete revascularization in CCS patients,” the researchers wrote.

“Incomplete revascularization is also related to a higher prevalence of MACE. As a result, CKD predicts incomplete revascularization and subsequent MACE in CCS patients,” they concluded.

Reference:

Farag, S. I., Mostafa, S. A., Kabil, H., & Elfaramawy, M. R. (2023). Chronic kidney disease’s impact on revascularization and subsequent major adverse cardiovascular events in patients with chronic coronary syndrome. Indian Heart Journal. https://doi.org/10.1016/j.ihj.2023.11.006


IRF cyst localization within different retinal layers may impact outcome of AMD-related macular neovascularization

A recent retrospective case series study delved into the relationship between intraretinal fluid (IRF) localization within retinal layers and the 2-year prognosis for patients with neovascular age-related macular degeneration (AMD). The findings unveiled critical insights into the predictive value of IRF localization for the visual and anatomical outcomes in this cohort.

This study was published in the journal Ophthalmology Retina by Alessandro Arrigo and colleagues. The study involved 243 eyes of AMD patients affected by type 1 and type 2 macular neovascularization (MNV). Optical coherence tomography (OCT) imaging was utilized to classify MNV types, identify various fluid types, and determine IRF localization within retinal layers. A subset of eyes was further analyzed using OCT angiography. The study assessed the association between IRF cyst localization and visual outcomes, particularly the onset of outer retinal atrophy, over a 2-year period.

  • Neovascularizations Breakdown: Type 1 MNV constituted 69%, while type 2 MNV accounted for 31% of cases.

  • Treatment Metrics: Patients received a mean of 7 ± 2 intravitreal injections at the 1-year mark and 5 ± 2 injections at the 2-year follow-up.

  • Visual Acuity: Baseline best-corrected visual acuity improved significantly from 0.4 ± 0.3 to 0.3 ± 0.4 logarithm of the minimum angle of resolution at the 2-year follow-up (P < 0.01).

  • Outer Retinal Atrophy: Occurrence rates were 24% at 1 year and increased to 39% at the 2-year follow-up.

  • IRF Localization Impact: IRF localized at the IPL–INL and OPL–ONL retinal layers at baseline correlated with the poorest functional and anatomical outcomes.

  • Vascular Network Impact: Presence of IRF at baseline was linked to greater impairment of the intraretinal vascular network.
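The visual-acuity figures in the bullets above are in logMAR units, where a lower score means better vision. A quick sketch converting logMAR to an approximate US Snellen denominator (denominator = 20 × 10^logMAR, a standard conversion) makes the reported gain concrete.

```python
# Convert logMAR visual acuity to an approximate US Snellen denominator.
# Standard relation: Snellen denominator ~= 20 * 10**logMAR.

def logmar_to_snellen_denominator(logmar: float) -> int:
    return round(20 * 10 ** logmar)

baseline = logmar_to_snellen_denominator(0.4)  # baseline mean BCVA of 0.4 logMAR
final = logmar_to_snellen_denominator(0.3)     # 2-year mean BCVA of 0.3 logMAR
print(f"20/{baseline} -> 20/{final}")  # roughly 20/50 -> 20/40
```

So the cohort's mean improvement from 0.4 to 0.3 logMAR corresponds to roughly one Snellen line, from about 20/50 to about 20/40.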

The study concluded that IRF localization at specific retinal layers, particularly IPL–INL and OPL–ONL, serves as a critical prognostic marker for the long-term morphologic and functional outcomes in neovascular AMD patients. This identification offers clinicians a valuable predictive tool for better patient management and treatment planning.

Reference:

Arrigo, A., Aragona, E., Bianco, L., Antropoli, A., Berni, A., Saladino, A., Cosi, V., Bandello, F., & Battaglia Parodi, M. The localization of intraretinal cysts has a clinical role on the 2-year outcome of neovascular age-related macular degeneration. Ophthalmology Retina,2023;7(12):1069–1079. https://doi.org/10.1016/j.oret.2023.07.025


Iron deficiency may increase risk of severe HF and adverse events among children with dilated cardiomyopathy

Australia: A recent study published in The Journal of Heart and Lung Transplantation has revealed a high prevalence of iron deficiency (ID) in children with dilated cardiomyopathy (DCM).

The study showed that in clinical practice, iron studies are under-measured, but iron deficiency is associated with severe heart failure and an increased risk of composite adverse events (CAE). The researchers suggested considering the need for iron replacement therapy in children who present in heart failure with DCM.

“The study is the largest to date to evaluate iron status in pediatric DCM,” the researchers reported. “Almost two-thirds of patients who had iron studies measured were iron deficient, and this conferred an increased risk of the composite occurrence of mechanical circulatory support (MCS), death, or transplantation over time.”

They added, “Those who were iron deficient had a longer hospital stay and were more likely to be microcytic, anaemic, and hypochromic, with a higher NT-proBNP.”

Jack C. Luxford from Children’s Hospital at Westmead, Sydney, Australia, and colleagues aimed to determine the prevalence and impact of iron deficiency in children with dilated cardiomyopathy by conducting a retrospective single-centre cohort study.

For this purpose, the researchers conducted a retrospective single-centre review of all children between 2010 and 2020 with a DCM diagnosis and complete iron studies. Iron deficiency was defined as ≥2 of the following: ferritin <20 μg/litre, transferrin >3 g/litre, iron <9 μmol/litre, or transferrin saturation (TSat) <15%. Laboratory and clinical characteristics, as well as freedom from a composite adverse event of MCS, death, or transplant, were compared between children with and without ID.
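The study's iron-deficiency definition (at least two of four laboratory thresholds) maps directly onto a small rule-based check. The thresholds below are taken from the definition above; the example values are hypothetical.

```python
# The study's iron-deficiency rule: a child is classified iron deficient
# when at least two of the four laboratory criteria hold. Thresholds are
# from the study's definition; example inputs are hypothetical.

def is_iron_deficient(ferritin_ug_l, transferrin_g_l, iron_umol_l, tsat_pct):
    criteria_met = sum([
        ferritin_ug_l < 20,   # ferritin <20 ug/litre
        transferrin_g_l > 3,  # transferrin >3 g/litre
        iron_umol_l < 9,      # iron <9 umol/litre
        tsat_pct < 15,        # transferrin saturation <15%
    ])
    return criteria_met >= 2

print(is_iron_deficient(15, 3.2, 10, 20))  # True: ferritin and transferrin criteria met
print(is_iron_deficient(25, 2.5, 10, 14))  # False: only the TSat criterion is met
```

Requiring two concordant abnormalities rather than any single one reduces misclassification, since individual iron indices (ferritin especially) can be skewed by inflammation.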

The study led to the following findings:

  • Of 138 patients with DCM, 47 had available iron studies; 62% of these patients were iron deficient.
  • Children with ID were more likely to be receiving inotropes (17, 59%) or invasive/noninvasive ventilation (13, 45%) than those who were iron-replete.
  • They had a higher incidence of anaemia (22, 76%) and higher NT-proBNP (1,590 pmol/litre).
  • Children with ID had significantly less freedom from the CAE at 1 year (54% ± 10%), 2 years (45% ± 10%), and 5 years (37% ± 11%) than those without.
  • Iron deficiency and anaemia were the only significant predictors of the CAE on univariate Cox regression.

The findings revealed a high prevalence of ID in children with DCM.

“Iron studies are undermeasured in clinical practice, but ID is associated with severe heart failure and an increased CAE risk,” the researchers concluded. “The need for iron replacement therapy should be considered in children who present in HF with DCM.”

Reference:

Luxford, J. C., Casey, C. E., Roberts, P. A., & Irving, C. A. (2023). Iron deficiency and anemia in pediatric dilated cardiomyopathy are associated with clinical, biochemical, and hematological markers of severe disease and adverse outcomes. The Journal of Heart and Lung Transplantation. https://doi.org/10.1016/j.healun.2023.11.014


Lowest and highest levels of fasting stress hyperglycemia may increase contrast-induced AKI among patients undergoing PCI

China: A recent study has revealed a significant association between both the highest and lowest levels of fasting stress hyperglycemia ratio (SHR) and an increased occurrence of contrast-induced acute kidney injury (CI-AKI) in patients undergoing coronary angiography (CAG) or percutaneous coronary intervention (PCI).

The findings published in Frontiers in Endocrinology showed that the correlation was observed regardless of whether the patients had diabetes or HbA1c > 6%.

Coronary artery disease (CAD) imposes a significant global disease burden and is a prominent contributor to human mortality on a global scale. Recently, there has been a significant improvement in the clinical prognosis of CAD patients with the widespread application of intervention techniques such as percutaneous coronary intervention and coronary angiography in the diagnosis and treatment of CAD. However, the use of these interventions is tied to a series of related complications that cannot be overlooked due to their effect on patient health.

Stress hyperglycemia ratio is an emerging indicator of critical illness, and exhibits a significant association with adverse cardiovascular outcomes. Yu Shan, Zhejiang University, Hangzhou, Zhejiang, China, and colleagues aimed to evaluate the association between fasting SHR and contrast-induced AKI in a cross-sectional study comprising 3,137 patients who underwent CAG or PCI.

Fasting SHR was calculated by dividing the fasting blood glucose at admission by the estimated mean glucose obtained from glycosylated haemoglobin. Contrast-induced acute kidney injury was evaluated based on elevated serum creatinine (SCr) levels.
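The SHR calculation can be sketched briefly. The article does not quote the exact HbA1c-to-mean-glucose conversion used; the widely cited ADAG equation (eAG in mg/dL = 28.7 × HbA1c − 46.7) is assumed here, and the middle band between 0.9 and 1.3 is likewise an assumption, since the article only reports the extreme bands against the 0.7–0.9 reference.

```python
# Sketch of the fasting stress hyperglycemia ratio (SHR). The exact
# HbA1c-to-mean-glucose conversion used by the study is not quoted; the
# ADAG equation is assumed here: eAG (mg/dL) = 28.7 * HbA1c(%) - 46.7.

def stress_hyperglycemia_ratio(fasting_glucose_mg_dl: float, hba1c_pct: float) -> float:
    estimated_mean_glucose = 28.7 * hba1c_pct - 46.7
    return fasting_glucose_mg_dl / estimated_mean_glucose

def shr_band(shr: float) -> str:
    """Bands based on the study's reported comparisons (0.7-0.9 as reference);
    the 0.9-1.3 intermediate band is an assumption."""
    if shr < 0.7:
        return "low (<0.7)"
    if shr < 0.9:
        return "reference (0.7 <= SHR < 0.9)"
    if shr < 1.3:
        return "intermediate (0.9 <= SHR < 1.3)"
    return "high (>=1.3)"

shr = stress_hyperglycemia_ratio(fasting_glucose_mg_dl=180, hba1c_pct=6.0)
print(round(shr, 2), shr_band(shr))  # ~1.43, falling in the high band
```

Dividing by the HbA1c-derived mean glucose is what makes SHR a measure of *relative* stress hyperglycemia: the same admission glucose yields a lower ratio in a patient whose chronic glycemia is already high.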

The relationship between fasting SHR and the proportion of serum creatinine elevation was investigated using piecewise linear regression analysis. The correlation between fasting SHR and CI-AKI was evaluated through Modified Poisson’s regression analysis. Sensitivity analysis and subgroup analysis were conducted to explore result stability.

The researchers reported the following findings:

  • Among the total population, 15.4% of patients experienced CI-AKI.
  • Piecewise linear regression analysis revealed significant associations between the proportion of SCr elevation and fasting SHR on both sides (≤ 0.8 and > 0.8) [β = -12.651, β = 8.274].
  • The Modified Poisson’s regression analysis demonstrated a statistically significant correlation between both the lowest and highest levels of fasting SHR and an increased incidence of CI-AKI [(SHR < 0.7 versus 0.7 ≤ SHR < 0.9) β = 1.828 (SHR ≥ 1.3 versus 0.7 ≤ SHR < 0.9) β = 2.896], which was validated further through subgroup and sensitivity analyses.

“In populations undergoing PCI or CAG, both the lowest and highest levels of fasting SHR were significantly associated with an increased occurrence of CI-AKI,” the researchers concluded.

The researchers emphasized that this study solely encompassed the Chinese population, thus requiring further investigation to ascertain the generalizability of these findings to other populations in different countries.

Reference:

Shan, Y., Lin, M., Gu, F., Ying, S., Bao, X., Zhu, Q., Tao, Y., Chen, Z., Li, D., Zhang, W., Fu, G., & Wang, M. (2023). Association between fasting stress hyperglycemia ratio and contrast-induced acute kidney injury in coronary angiography patients: A cross-sectional study. Frontiers in Endocrinology, 14, 1300373. https://doi.org/10.3389/fendo.2023.1300373
