What are the Effects of Local Anesthetics on the Cardiac Conduction System in Lower Limb Orthopaedic Surgeries?

A recent randomized study compared the effects of three local anesthetic agents – bupivacaine, levobupivacaine, and ropivacaine – on cardiac conduction parameters in patients undergoing lower limb orthopedic surgeries under epidural anesthesia. The primary outcomes were the corrected QT interval (QTc) and P-wave dispersion (PWD), assessed at multiple time points from baseline up to 24 hours postoperatively. The secondary outcomes included time to onset of sensory and motor block, patient-controlled analgesia (PCA) use, and pain scores. The results showed a statistically significant increase in QTc and PWD from baseline in all three groups at the various time points. The mean increase in QTc and PWD was higher in the bupivacaine group than in the levobupivacaine and ropivacaine groups, although the differences between the groups were not statistically significant. All three local anesthetic agents demonstrated comparable effects on hemodynamic parameters, time to onset of sensory and motor blockade, and quality of postoperative analgesia as assessed by PCA use and visual analog pain scores.
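
The corrected QT interval normalizes the measured QT interval for heart rate so that values recorded at different time points are comparable. The summary above does not state which correction formula the investigators applied; Bazett's formula, shown here purely as a common convention, is

    \[
    \mathrm{QTc} = \frac{\mathrm{QT}}{\sqrt{\mathrm{RR}}}
    \]

where QT is the measured interval and RR is the time between successive R waves, both in seconds. P-wave dispersion, the other primary endpoint, is defined as the difference between the longest and shortest P-wave duration measured across the ECG leads.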

Conclusion

The authors concluded that bupivacaine has a greater tendency to prolong QTc and PWD compared to levobupivacaine and ropivacaine, though all three agents showed similar safety profiles in terms of cardiac conduction effects and analgesic efficacy. The study provides insights into the comparative cardiac effects of these commonly used local anesthetics when administered epidurally for lower limb orthopedic procedures.

Key Points

1. The study compared the effects of three local anesthetic agents – bupivacaine, levobupivacaine, and ropivacaine – on cardiac conduction parameters in patients undergoing lower limb orthopedic surgeries under epidural anesthesia.

2. The primary outcomes measured were the corrected QT interval (QTc) and P-wave dispersion (PWD), which were assessed at multiple time points from baseline up to 24 hours postoperatively. The secondary outcomes included time to onset of sensory and motor block, patient-controlled analgesia (PCA) use, and pain scores.

3. The results showed a statistically significant increase in QTc and PWD from baseline for all three groups at the various time points. However, the mean increase in QTc and PWD was higher in the bupivacaine group compared to the levobupivacaine and ropivacaine groups, though the differences between the groups were not statistically significant.

4. All three local anesthetic agents demonstrated comparable effects on hemodynamic parameters, time to onset of sensory and motor blockade, and quality of postoperative analgesia as assessed by PCA use and visual analog pain scores.

5. The authors concluded that bupivacaine has a greater tendency to prolong QTc and PWD compared to levobupivacaine and ropivacaine, though all three agents showed similar safety profiles in terms of cardiac conduction effects and analgesic efficacy.

6. The study provides insights into the comparative cardiac effects of these commonly used local anesthetics when administered epidurally for lower limb orthopedic procedures.

Reference –

Agarwal V, Das PK, Nath SS, Tripathi M, Tiwari B. Comparing the effects of three local anaesthetic agents on cardiac conduction system ‑ A randomised study. Indian J Anaesth 2024;68:889‑95

Incremental Peritoneal Dialysis May Safely Replace Standard Method for New Patients, Study Suggests

China: A recent systematic review and meta-analysis has shed light on the outcomes of incremental peritoneal dialysis (IPD) compared with standard full-dose peritoneal dialysis (SPD) in patients initiating dialysis treatment.

The study, published in BMC Nephrology, revealed that incremental peritoneal dialysis may be a viable alternative to SPD for new dialysis patients. The analysis indicates no significant differences in patient survival, rates of peritonitis, or technique survival between the two approaches. However, according to the authors, the effects of IPD on residual renal function (RRF) remain uncertain.

Incremental peritoneal dialysis involves administering less than the standard full-dose peritoneal dialysis for patients with end-stage renal disease. Although IPD is increasingly documented in the literature, its safety and efficacy compared to SPD remain uncertain. Therefore, Jing Cheng, Hospital HuZhou University, Huzhou, Zhejiang Province, China, and colleagues conducted a systematic review of studies that compared mortality, rates of peritonitis, technique survival, anuria-free survival, and RRF between IPD and SPD.

For this purpose, the researchers included all comparative studies published on PubMed, Embase, CENTRAL, Scopus, and Web of Science databases from their inception until September 5, 2023, that reported on the specified outcomes.

The following were the key findings of the study:

  • Ten studies were included. Definitions of IPD were heterogeneous, so a mostly qualitative synthesis was undertaken.
  • The majority of studies found no difference in patient survival between IPD and SPD.
  • Meta-analysis of crude mortality data also showed no significant difference.
  • Peritonitis and technique survival were also not significantly different between IPD and SPD in the majority of studies.
  • Data on RRF were conflicting: some studies showed that IPD was associated with preservation of RRF, while others found no such difference.

The findings revealed that incremental peritoneal dialysis may yield outcomes similar to standard peritoneal dialysis in new dialysis patients. There appear to be no significant differences in patient survival, rates of peritonitis, or technique survival between the two approaches. However, the effect of IPD on residual renal function remains uncertain, with existing evidence being heterogeneous and conflicting.

“Future studies should include comparable groups of patients, initiating observations at the start of peritoneal dialysis. These studies should comprehensively report all outcomes covered in this review, ensuring a sufficiently long follow-up period to yield robust evidence,” the researchers concluded.

The researchers, however, noted several limitations, including a limited number of studies and significant methodological differences that hindered comprehensive analysis. Many studies relied on observational data, which introduces bias, and lacked clarity on patient selection. Additionally, the included studies did not report outcomes for patients transitioning to hemodialysis or transplantation, nor important outcomes such as cardiovascular disease.

Reference:

Xu, S., Wu, W. & Cheng, J. Comparison of outcomes of incremental vs. standard peritoneal dialysis: a systematic review and meta-analysis. BMC Nephrol 25, 308 (2024). https://doi.org/10.1186/s12882-024-03669-w

Wider use of convalescent plasma might have saved thousands more lives during pandemic, claims research

A new study led by researchers at Johns Hopkins Bloomberg School of Public Health estimates that thousands of lives could have been saved during the first year of the COVID-19 pandemic if convalescent plasma had been used more broadly, particularly in outpatients at high risk for severe disease and in hospitalized patients during their first few days of admission.

Convalescent plasma from patients who had recovered from COVID was used starting in the early months of the pandemic at the urging of a group of physicians who cited the blood byproduct’s success as a therapy in earlier infectious disease emergencies, including the global influenza pandemic of 1918–1920, and the SARS epidemic of 2002–2004. Plasma from patients recently recovered from a pathogenic infection, such as COVID, typically contains antibodies that may block or reduce the severity of the infection in others.

Over 500,000 patients were treated with convalescent plasma in the U.S. in the first year of the pandemic.

In their new paper, published online October 1 in the Proceedings of the National Academy of Sciences, the authors estimate that treating hospitalized COVID patients with convalescent plasma saved between 16,476 and 66,296 lives in the United States between July 2020 and March 2021. For these estimates of actual lives saved, the researchers drew on weekly convalescent plasma use data, weekly national mortality data, and convalescent plasma mortality reduction estimates from meta-analyses of randomized controlled trials.
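
The paper's exact statistical model is not reproduced in this summary, but the basic arithmetic behind an "actual lives saved" figure can be illustrated. Below is a minimal sketch in Python using entirely hypothetical weekly numbers and a hypothetical relative mortality reduction; none of these values come from the study, and the authors' real model is more detailed.

    # Illustrative back-of-the-envelope estimate of lives saved by a treatment
    # that reduces mortality by a relative fraction r. All inputs are hypothetical
    # placeholders, not the study's data or its actual model.
    weekly_deaths = [6200, 5800, 7100]            # hospitalized COVID deaths per week (hypothetical)
    weekly_treated_fraction = [0.25, 0.30, 0.35]  # share of those patients given plasma (hypothetical)
    relative_mortality_reduction = 0.13           # relative risk reduction from trial meta-analyses (hypothetical)

    lives_saved = 0.0
    for deaths, treated in zip(weekly_deaths, weekly_treated_fraction):
        # Approximate deaths among treated patients, assuming deaths split in
        # proportion to the share of patients treated that week.
        treated_deaths = deaths * treated
        # If treatment cut mortality by r, deaths without treatment would have been
        # treated_deaths / (1 - r); the difference is the estimated lives saved.
        lives_saved += treated_deaths * relative_mortality_reduction / (1 - relative_mortality_reduction)

    print(f"Estimated lives saved over the period: {lives_saved:,.0f}")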

The researchers also estimated the number of potential lives that would have been saved had convalescent plasma been more widely used among patients being treated for COVID in hospitals, using the most optimistic assumptions possible. Had 100% of patients hospitalized with COVID been administered high-titer convalescent plasma within three days of admission between July 2020 and March 2021, the authors concluded that, depending on which mortality estimates they used for their analysis, between 37,467 and 149,318 lives (an approximately 125% increase) or between 53,943 and 215,614 lives (an approximately 225% increase) would have been saved in the first year of the pandemic.

A total of 647,795 units of plasma was given to inpatients with COVID between July 2020 and March 2021. The team used this as a measure of the number of patients treated.

“This is a therapy that can reduce mortality, be immediately available, and is relatively inexpensive; we should be prepared to use it much more in a future infectious disease emergency or pandemic,” says study senior author Arturo Casadevall, MD, PhD, Bloomberg Distinguished Professor of Molecular Microbiology and Immunology and Infectious Diseases at the Bloomberg School.

Casadevall was one of the earliest proponents of convalescent plasma at the start of the pandemic. The study’s first author is Quigly Dragotakes, PhD, a postdoctoral fellow in the Casadevall laboratory.

The authors also estimated the number of hospitalizations that might have been avoided between July 2020 and March 2021 using a range of assumptions:

  • If 15% of outpatients had received convalescent plasma, the authors estimate that between 85,268 and 227,377 hospitalizations would have been avoided.
  • If 75% of outpatients received convalescent plasma, between 426,331 and 1,136,880 hospitalizations would have been avoided.

During the first year of the pandemic, convalescent plasma was approved only for use in patients hospitalized with COVID.

Initial studies of the effectiveness of convalescent plasma in the U.S. and other countries had mixed results. Casadevall and colleagues note this was due in part to the challenges of ensuring that convalescent plasma contained sufficiently high anti-SARS-CoV-2 antibody concentrations. Another issue with many early studies, the researchers say, was that convalescent plasma was given to hospitalized COVID patients who were already too sick to benefit much from the therapy.

Later studies showed convalescent plasma could be effective, including a clinical trial led by Johns Hopkins researchers that found that early use of convalescent plasma among outpatients reduced the relative risk of hospitalization by 54%. (Those findings were announced in December 2021.)

The researchers note that use of convalescent plasma during the pandemic was safe and that its cost, averaging about $750 per unit in the U.S., is lower than that of newer, patented COVID treatments.

The authors recommend that public health preparedness planning for future infectious disease outbreaks, epidemics, and pandemics include readiness to collect and deliver convalescent plasma at scale.

The authors note that the study has several limitations. While the estimates of convalescent plasma units used in their analysis captured most of the convalescent plasma administered during the study period, the exact number of units used is not known, likely due in part to the national Blood Centers of America not capturing convalescent plasma treatments administered locally in the early stages of the pandemic. In addition, the mortality reduction estimates the authors used to calculate lives saved varied widely, and it is not known whether they mirror the use and efficacy of convalescent plasma in clinical settings throughout the U.S.

“We should be ready to set up outpatient centers to treat people early on with convalescent plasma during a future outbreak,” Casadevall says. “It would require designating spaces in hospitals for that purpose, but we wouldn’t need any new technology; this is well-established medical knowledge and practice.”

Reference:

Quigly Dragotakes, Patrick W. Johnson, Matthew R. Bura, and Arturo Casadevall, Estimates of actual and potential lives saved in the United States from the use of COVID-19 convalescent plasma, PNAS, https://doi.org/10.1073/pnas.2414957121

Phthalate exposure during second and third trimester of pregnancy may impact fetal growth and development: Study

Researchers have linked prenatal exposure to phthalates with significant changes in maternal thyroid hormone levels in a new prospective cohort study conducted from 2019 to 2022. The study was designed to assess the impact of phthalate exposure across all trimesters on maternal thyroid function, which is critical to fetal growth and development. The research was published in the International Journal of Hygiene and Environmental Health by Al-Saleh and colleagues.

Phthalate esters, or PAEs, are endocrine disruptors. Several epidemiological studies have suggested that phthalates might affect maternal thyroid hormones in utero, but results have been inconsistent. The aim of the study was to determine more clearly the relationship between phthalate exposure and thyroid hormone levels over the course of pregnancy.

In this study, 672 pregnant women were enrolled, with two urine and one blood sample collected from each participant in the course of three trimesters. Urine samples from 663, 335, and 294 women in the first, second, and third trimesters, respectively, were analyzed for seven phthalate metabolites. Blood samples from 596, 627, and 576 women in the first trimester; 292, 293, and 282 in the second trimester; and 250, 250, and 248 in the third trimester were examined for free thyroxine (FT4), thyroid-stimulating hormone (TSH), and total triiodothyronine (TT3).

The key findings of the study were:

  • Apart from monobenzyl phthalate (MBzP), which was detected in 25%-33% of samples, all other metabolites were found in over 86% of urine samples, indicating widespread exposure to DEP, DBP, and DEHP.

  • The phthalate exposure levels in this cohort were significantly higher than those reported in other countries.

  • The levels of phthalate metabolites changed across trimesters, suggesting changing exposures and metabolic changes during pregnancy.

  • There were significant linear trends for FT4, TSH, and to some extent TT3 across quartiles of specific phthalate metabolites. In the highest quartiles:

  • FT4 levels were 2%-3.7% higher in association with MEP, MECPP, MEHHP, and several summed metabolites.

  • TSH levels rose significantly with all phthalate metabolites except MEHHP.

  • TT3 increased by 2.2% with MEP and decreased by 2.7% with ΣDBP.

  • Higher TSH/FT4 ratios were found with higher quartiles of the following phthalate metabolites: MEP, MiBP, MnBP, ∑7PAE, ∑DBP, and ∑LMW.

These findings indicate that phthalate exposure during pregnancy is associated with changes in maternal thyroid hormone concentrations. The research underlines that phthalate exposure should be monitored during pregnancy because of its possible disturbance of thyroid hormone levels, which are critical for fetal development. The results also reiterate the need to confirm these findings and to explore the mechanisms through which phthalates may influence thyroid function.

Reference:

Al-Saleh, I., Elkhatib, R., Alghamdi, R., Alrushud, N., Alnuwaysir, H., Alnemer, M., Aldhalaan, H., & Shoukri, M. (2024). Phthalate exposure during pregnancy and its association with thyroid hormones: A prospective cohort study. International Journal of Hygiene and Environmental Health, 261(114421), 114421. https://doi.org/10.1016/j.ijheh.2024.114421

Study Evaluates Vaginal Dilation Therapy for Cervical Cancer Survivors

Cervical cancer ranks as the fourth most prevalent cancer in women worldwide. A recent study aimed to assess the impact of vaginal dilation therapy (VDT) on vaginal length, vaginal stenosis, vaginal elasticity, and sexual function in cervical cancer survivors who had not received timely dilation therapy. The study involved 139 patients who received 6 months of VDT. Vaginal conditions were assessed using customised vaginal molds, and sexual function was evaluated using the Female Sexual Function Index (FSFI).

The findings revealed that factors such as age, vaginal diameter, and sexual intercourse frequency before diagnosis were significantly associated with female sexual dysfunction after cancer treatment. After 6 months of VDT, patients experienced improvements in vaginal stenosis, vaginal length, and sexual function, irrespective of the treatment methods they had received, but vaginal elasticity did not improve significantly. Patients with a time interval of less than 24 months from their last treatment, or with moderate or good vaginal elasticity, benefitted more from VDT. The study also observed that the frequency of sexual intercourse in cervical cancer survivors, both before and after VDT, was significantly lower than before the diagnosis of the disease, and that a good relationship with the sexual partner can help decrease the incidence of female sexual dysfunction.

The research used a prospective, uncontrolled, monocentric interventional study design. It was approved by the Science and Technology Division of Beijing Obstetrics and Gynaecology Hospital, Capital Medical University, and written informed consent was obtained from all participants. The study had some limitations, including a relatively short follow-up period of only 6 months and the absence of a control group.

Overall, the study confirmed that cervical cancer survivors can experience vaginal condition deterioration and sexual dysfunction after treatment, and that VDT is effective in improving vaginal stenosis, length, and sexual function. The findings suggest that VDT should be performed promptly after cervical cancer treatment, with patients benefiting more when the therapy is provided earlier, and highlight the need, particularly in developing countries, to address the sexual issues of cervical cancer survivors in clinical practice.

Key Points

1. The study aimed to assess the impact of vaginal dilation therapy (VDT) on vaginal length, vaginal stenosis, vaginal elasticity, and sexual function of cervical cancer survivors. 139 patients who had not received timely dilation therapy participated in the study and received 6 months of VDT. The evaluation involved customised vaginal molds for assessing vaginal conditions and the Female Sexual Function Index (FSFI) for evaluating sexual function.

2. The findings indicated that age, vaginal diameter, and sexual intercourse frequency before diagnosis were significantly associated with female sexual dysfunction after cancer treatment. VDT was found to improve vaginal stenosis, length, and sexual function in all patients, but it did not significantly improve vaginal elasticity. Patients with a time interval from the last treatment of less than 24 months or those with moderate or good vaginal elasticity benefitted more from VDT.

3. The study confirmed that cervical cancer survivors can experience vaginal condition deterioration and sexual dysfunction after treatment. VDT was effective in improving vaginal stenosis, length, and sexual function, benefitting patients irrespective of the treatment methods they received.

4. The research used a prospective, uncontrolled, monocentric interventional study design and was approved by the Science and Technology Division of Beijing Obstetrics and Gynaecology Hospital, Capital Medical University. The study revealed that after 6 months of VDT, patients experienced improvements in vaginal stenosis, length, and sexual function, with VDT being more effective in patients with a time interval from the last treatment of less than 24 months or those with moderate or good vaginal elasticity.

5. The study observed that the frequency of sexual intercourse in cervical cancer survivors before and after VDT was significantly lower than that before the diagnosis of the disease. The quality of the relationship with sexual partners was found to help decrease the incidence of female sexual dysfunction.

6. The study emphasized the importance of VDT in improving the vaginal length, stenosis, and sexual function of cervical cancer survivors, highlighting the significance of providing timely VDT, especially in developing countries. The findings have important implications for clinical practice and underscore the need to address the sexual issues of cervical cancer survivors, particularly in developing countries. The study suggests that VDT should be performed promptly after cervical cancer treatment.

Reference –

Yu-Xuan Lin, Fei-Fei Zhao & Wei-Min Kong (2024) Effects of vaginal dilation therapy on vaginal length, vaginal stenosis, vaginal elasticity and sexual function of cervical cancer survivors, Journal of Obstetrics and Gynaecology, 44:1, 2317387, DOI: 10.1080/01443615.2024.2317387

Study Reveals Genotype-Dependent Benefits of Prenatal Fish Oil for Preventing Childhood Atopic Dermatitis

Denmark: A secondary analysis of a randomized clinical trial has uncovered significant insights into how prenatal fish oil supplementation impacts the risk of childhood atopic dermatitis (AD). The study reveals that the effectiveness of omega-3 long-chain polyunsaturated fatty acids (n-3 LCPUFAs) in preventing AD is influenced by the maternal COX1 genotype.

The findings, published in JAMA Dermatology, advocate for a personalized prevention approach to mitigate the risk of childhood atopic dermatitis by genotyping expectant mothers early in pregnancy and administering n-3 LCPUFA supplementation specifically to those with the COX1 TT genotype.

Eicosanoids play a role in the pathophysiology of atopic dermatitis. Yet, it remains unclear how prenatal supplementation with ω-3 long-chain polyunsaturated fatty acids (n-3 LCPUFAs, such as fish oil) and genetic variations in the cyclooxygenase-1 (COX1) pathway influence this process. Considering this, Liang Chen, University of Copenhagen, Copenhagen, Denmark, and colleagues aimed to investigate the relationship between prenatal n-3 LCPUFA supplementation and the risk of childhood atopic dermatitis, both overall and by maternal COX1 genotype.

The prespecified secondary analysis of a randomized clinical trial utilized data from the Danish Copenhagen Prospective Studies on Asthma in Childhood 2010 birth cohort, tracking mother-child pairs with follow-up until the children reached 10. The study, conducted from January 2019 to December 2021 with data analysis from January to September 2023, involved determining maternal and child COX1 genotypes and quantifying urinary eicosanoids when the children were one year old.

In the original trial, 736 pregnant women at 24 weeks of gestation were randomly assigned to receive either 2.4 grams of n-3 LCPUFA (fish oil) or a placebo (olive oil) daily until one week postpartum. The analysis focused on assessing the risk of childhood AD by the age of 10 and by maternal COX1 genotype.

The following were the key findings of the study:

  • At age ten years, 635 children (57% female) completed the clinical follow-up, and these mother-child pairs were included in this study; 321 were in the intervention group and 314 in the control group.
  • Pregnancy n-3 LCPUFA supplementation was associated with lower urinary thromboxane A2 metabolites at age one year (β, –0.46), which was also associated with COX1 rs1330344 genotype (β per C allele, 0.47).
  • Although neither n-3 LCPUFA supplementation (hazard ratio [HR], 1.00) nor maternal COX1 genotype (HR, 0.94) was associated with the risk of childhood AD until age ten years, there was evidence of an interaction between these variables.
  • Among mothers with the TT genotype, the risk of AD was reduced in the n-3 LCPUFA group compared with the placebo group (390 mother-child pairs; HR, 0.70); there was no association for mothers with the CT genotype (33%; HR, 1.29), and risk was increased among offspring of mothers with the CC genotype (6%; HR, 5.77).
  • There was a significant interaction between n-3 LCPUFA supplementation and child COX1 genotype and the development of AD.

The secondary analysis of a randomized clinical trial found that prenatal n-3 LCPUFA supplementation was linked to a reduced risk of childhood atopic dermatitis at age ten only among children of mothers with the COX1 rs1330344 TT genotype. There was no significant association overall or among children of mothers with the CT genotype, while children of mothers with the CC genotype exhibited an increased risk of atopic dermatitis.

“These findings could guide a personalized prevention strategy, recommending supplementation exclusively for pregnant individuals with the TT genotype,” the researchers concluded.

Reference:

Chen L, Brustad N, Luo Y, et al. Prenatal Fish Oil Supplementation, Maternal COX1 Genotype, and Childhood Atopic Dermatitis: A Secondary Analysis of a Randomized Clinical Trial. JAMA Dermatol. Published online August 28, 2024. doi:10.1001/jamadermatol.2024.2849

Air pollution exposure during early life can have lasting effects on the brain’s white matter: Study

Exposure to certain pollutants, like fine particles (PM2.5) and nitrogen oxides (NOx), during pregnancy and childhood is associated with differences in the microstructure of the brain's white matter, and some of these effects persist throughout adolescence. These are the main conclusions of a study led by the Barcelona Institute for Global Health (ISGlobal), a centre supported by “la Caixa” Foundation. The findings, published in Environmental Research, highlight the importance of addressing air pollution as a public health issue, particularly for pregnant women and children.

An increasing amount of evidence suggests that air pollution affects neurodevelopment in children. Recent studies using imaging techniques have looked at the impact of air pollutants on the brain’s white matter, which plays a crucial role in connecting different brain regions. However, these studies were limited in that they only looked at one timepoint and did not follow the participants throughout childhood.

“Following participants throughout childhood and including two neuroimaging assessments for each child would shed new light on whether the effects of air pollution on white matter persist, attenuate, or worsen,” says ISGlobal researcher Mònica Guxens. And that is what she and her team did.

The study involved over 4,000 participants who had been followed since birth as part of the Generation R Study in Rotterdam, the Netherlands. The research team estimated the amount of exposure to 14 different air pollutants during pregnancy and childhood, based on where the families lived. For 1,314 children, the researchers were able to use data from two brain scans – one performed around 10 years of age and another around 14 years of age – to examine changes in white matter microstructure.

Some effects persist, some diminish over time

The analysis found that exposure to certain pollutants, like fine particles (PM2.5) and nitrogen oxides (NOx), was linked to differences in the development of white matter in the brain. Specifically, higher exposure to PM2.5 during pregnancy, and higher exposure to PM2.5, PM10, PM2.5-10, and NOx during childhood, were associated with lower levels of fractional anisotropy, a measure of how directionally water molecules diffuse within the brain. In more mature brains, water flows more along one direction than equally in all directions, which gives higher values for this marker. This association persisted throughout adolescence (i.e. it was also observed in the second scan), suggesting a long-term impact of air pollution on brain development. Every increase in exposure level to air pollution corresponded to more than a 5-month delay in the development of fractional anisotropy.

“We think that the lower fractional anisotropy is likely the result of changes in myelin, the protective sheath that forms around the nerves, rather than in the structure or packaging of the nerve fibers,” says Michelle Kusters, ISGlobal researcher and first author of the study. How air pollutants affect myelin is not fully understood, but the effect could be linked to the entry of small particles directly into the brain or to inflammatory mediators produced by the body when the particles enter the lungs. Together, this would lead to neuroinflammation, oxidative stress, and eventually neuronal death, as documented in animal studies.

The study also found that some pollutants were linked to changes in another measure of white matter, called mean diffusivity, which reflects the integrity of white matter, and which tends to decrease as the brain matures. Higher exposure to pollutants like silicon in fine particles (PM2.5) during pregnancy was associated with initially higher mean diffusivity, which then decreased more rapidly as the children grew older. This indicates that some effects of air pollution may diminish over time.
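
Both fractional anisotropy and mean diffusivity are derived from diffusion tensor imaging, in which water diffusion at each voxel is summarized by three eigenvalues (λ1, λ2, λ3). The study's imaging pipeline is not detailed in this article, but the standard definitions of the two measures are

    \[
    \mathrm{MD} = \bar{\lambda} = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3},
    \qquad
    \mathrm{FA} = \sqrt{\tfrac{3}{2}}\,
    \frac{\sqrt{(\lambda_1-\bar{\lambda})^2 + (\lambda_2-\bar{\lambda})^2 + (\lambda_3-\bar{\lambda})^2}}
         {\sqrt{\lambda_1^2 + \lambda_2^2 + \lambda_3^2}}
    \]

FA runs from 0 (equal diffusion in all directions) to 1 (diffusion along a single direction), which is why more mature, more myelinated white matter tends to show higher FA and lower MD.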

Policy implications

Overall, the study suggests that air pollution exposure, both during pregnancy and early childhood, can have lasting effects on the brain’s white matter. “Even if the size of the effects were small, this can have a meaningful impact on a population scale,” says Guxens.

Importantly, these findings were present in children exposed to PM2.5 and PM10 concentrations above the maximum values currently recommended by the WHO but below those currently recommended by the European Union. “Our study provides support to the need for more stringent European guidelines on air pollution, which are expected to be approved soon by the European Parliament,” adds Guxens.

In a previous study, Guxens and her team showed that white matter microstructure can also be affected by early exposure to heat and cold, especially in children living in poorer neighbourhoods.

Reference:

Michelle S.W. Kusters, Mónica López-Vicente, Ryan L. Muetzel, Anne-Claire Binter, Sami Petricola, Henning Tiemeier, Mònica Guxens, Residential ambient air pollution exposure and the development of white matter microstructure throughout adolescence, Environmental Research, https://doi.org/10.1016/j.envres.2024.119828.

Thoracic endovascular aortic repair may decrease mortality rate of TBAD versus medical treatment: Study

New research found that thoracic endovascular aortic repair (TEVAR) was effective and reduced the in-hospital or 30-day mortality rate of patients with Type B aortic dissections (TBADs) compared with medical treatment. The study results were published in the journal BMC Surgery. Better treatment results were seen with TEVAR than with medical treatment in older patients and in those with uncomplicated TBADs.

Endovascular therapy techniques have advanced significantly, offering a promising alternative to traditional medical treatment for patients with Type B aortic dissections (TBADs). Type B aortic dissections occur when the inner layer of the aorta tears, allowing blood to flow between the layers of the aortic wall, which can lead to severe complications. The main treatment options are best medical therapy (BMT), aimed at managing blood pressure and other risk factors, and thoracic endovascular aortic repair (TEVAR), which involves inserting a stent to support the aorta and prevent further damage. Because previous research has shown both techniques to be effective, the researchers conducted a meta-analysis to compare mortality rates and overall complications between TEVAR and BMT in patients with TBADs. The goal was to determine whether TEVAR provides better outcomes than BMT alone, particularly in terms of reducing early mortality and complications.

The analysis included randomized controlled trials and prospective or retrospective cohort studies comparing the effectiveness of TEVAR and BMT in treating Type B aortic dissection. Researchers searched multiple electronic databases to gather relevant studies for comparison. In total, 32 cohort studies involving 150,836 patients were included in the final analysis.

Findings:

  • The results revealed that TEVAR was associated with a significantly lower 30-day mortality rate compared to BMT.
  • The relative risk (RR) of 30-day mortality for patients undergoing TEVAR was 0.79, with a confidence interval (CI) of 0.63 to 0.99 and a P-value of 0.04 (a worked example of how an RR and its CI follow from raw counts appears after this list). This reduction in mortality was particularly notable in patients aged 65 and older, where the relative risk was 0.78 (CI: 0.64–0.95, P = 0.01).
  • However, patients who underwent TEVAR had significantly longer hospital stays compared to those who received BMT. The mean difference (MD) in hospital stay was 3.42 days (CI: 1.69–5.13, P = 0.0001), and the mean ICU stay was 3.18 days longer (CI: 1.48–4.89, P = 0.0003).
  • On the other hand, BMT was associated with an increased risk of stroke, with an RR of 1.52 (CI: 1.29–1.79, P < 0.00001).
  • In terms of long-term outcomes, no significant differences were found between TEVAR and BMT groups for late mortality rates at 1, 3, and 5 years.
  • Similarly, there were no significant differences in the incidence of major complications such as acute renal failure, spinal cord ischemia, myocardial infarction, respiratory failure, or sepsis between the two groups.
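
For readers who want to see how a relative risk and its 95% confidence interval follow from raw counts, the sketch below works through the standard calculation on a single hypothetical 2x2 table. The numbers are made up and are not the meta-analysis data; the pooled RR reported above combines many such study-level estimates (for example, with a random-effects model), but the building block is the same.

    import math

    # Hypothetical 2x2 counts (NOT the meta-analysis data): events = 30-day deaths.
    events_tevar, n_tevar = 40, 1000   # deaths / patients in the TEVAR arm
    events_bmt, n_bmt = 50, 1000       # deaths / patients in the medical-therapy arm

    risk_tevar = events_tevar / n_tevar
    risk_bmt = events_bmt / n_bmt
    rr = risk_tevar / risk_bmt  # relative risk of death with TEVAR vs. BMT

    # Standard error of ln(RR) for a 2x2 table (Katz log method).
    se_log_rr = math.sqrt(1 / events_tevar - 1 / n_tevar + 1 / events_bmt - 1 / n_bmt)
    ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
    ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)

    print(f"RR = {rr:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")  # RR = 0.80, 95% CI 0.53 to 1.20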

Thus, the meta-analysis demonstrated that TEVAR is associated with a significantly lower mortality rate for patients with Type B aortic dissections compared to BMT, particularly in older patients aged 65 and above. Although TEVAR patients experience longer hospital and ICU stays, the procedure’s benefits in reducing early mortality make it a viable alternative to medical therapy alone. Despite these promising findings, further randomized controlled trials are necessary to confirm the advantages of TEVAR over BMT and to explore potential long-term benefits or risks.

Further reading: Motawea, K.R., Rouzan, S.S., Elhalag, R.H. et al. Efficacy of thoracic endovascular aortic repair versus medical therapy for treatment of type B aortic dissection. BMC Surg 24, 259 (2024). https://doi.org/10.1186/s12893-024-02555-4

Peribulbar corticosteroid administration clinically useful treatment for ocular myasthenia gravis: Study

Researchers have found that long-acting agents such as triamcinolone are effective for treating ocular myasthenia gravis (OMG), a condition characterized by weakness in the muscles controlling eye movements and the eyelids. According to a new study, peribulbar injections of dexamethasone and triamcinolone constitute a safer, better-localized alternative, with efficacy in maintaining prolonged resolution of symptoms. The study was published in the Journal of Neuro-Ophthalmology by Lasry R. and colleagues.

OMG weakens the extraocular muscles, often leading to paralysis of eye movements and ptosis (drooping of the eyelids). Oral corticosteroid treatment is well established but carries a risk of side effects in many other body tissues, so localized treatments are highly desirable. This article reports on the use of peribulbar corticosteroid injections in OMG patients as a more localized form of treatment.

A retrospective chart review was performed to determine the effectiveness of peribulbar corticosteroid injections in a small group of patients with ocular myasthenia gravis; five patients who received this treatment were identified for the case analysis. The study focused on the effects of peribulbar dexamethasone or triamcinolone (40-mg Triesence), a longer-acting corticosteroid, delivered to the area behind the eye rather than injected into the affected extraocular muscles themselves. The review monitored resolution of symptoms, time to treatment, and recurrence of symptoms after injection.

Of the five patients included:

  • Four patients had isolated ophthalmoparesis, which manifested as paralysis of the eye muscles.

  • One patient had isolated ptosis.

  • Four of the five patients had combinations of ptosis and ophthalmoparesis.

  • In three of the four cases with ptosis and ophthalmoparesis, symptoms subsided completely within a few weeks after a single peribulbar injection.

  • Symptoms remained improved for 5 to 6 months, and patients responded well to repeated injections in cases of symptom relapse.

In summary, peribulbar corticosteroids, mainly triamcinolone, hold considerable promise as a treatment for ocular myasthenia gravis, with the possibility of relief lasting many months after a single injection. The treatment carries minimal systemic adverse effects and is therefore a more patient-friendly option. Healthcare providers should consider the peribulbar triamcinolone approach as an effective alternative for the management of OMG, especially in patients who experience frequent relapse of symptoms.

Reference:

Lasry, R., Gotkine, M., & Kruger, J. M. (2024). Peribulbar corticosteroids for ocular myasthenia gravis. Journal of Neuro-Ophthalmology: The Official Journal of the North American Neuro-Ophthalmology Society, 44(3), 419–422. https://doi.org/10.1097/wno.0000000000002148

Vitamin D Levels Linked to Periodontitis and Tooth Loss in Older Irish Adults, Study Reveals

Ireland: Recent research published in the British Journal of Nutrition has highlighted a significant connection between vitamin D levels, periodontitis, and tooth loss among older adults in Ireland.

A cross-sectional study involving a diverse group of older men and women revealed that concentrations of 25-hydroxyvitamin D (25(OH)D) were associated with both periodontal disease and the likelihood of tooth loss, independent of other risk factors.

Periodontitis, a serious gum infection that damages soft tissue and can destroy the bone that supports teeth, is common among older adults. It is often exacerbated by a variety of factors, including poor oral hygiene, smoking, and systemic health issues. However, the study sheds light on the role of vitamin D, a nutrient crucial for bone health and immune function.

In the study, Lewis Winning, Dublin Dental University Hospital, Trinity College Dublin, Dublin, Republic of Ireland, and colleagues aimed to examine the relationship between 25(OH)D levels and periodontitis and tooth loss in older adults.

For this purpose, 2,346 adults participated in a comprehensive dental examination as part of the health assessment for The Irish Longitudinal Study of Ageing. Analysis of 25-hydroxyvitamin D (25(OH)D) levels was conducted on frozen, non-fasting total plasma samples using liquid chromatography-mass spectrometry (LC-MS). The study employed multiple logistic and multinomial logistic regression to explore the relationships between 25(OH)D concentrations, periodontitis, and tooth loss while adjusting for various potential confounding factors.
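
The exact model specification is not given in this summary, but the kind of adjusted analysis described, a quintile-based logistic model for periodontitis and a multinomial model for the three tooth-count categories, can be sketched. The Python/statsmodels example below uses simulated data and hypothetical variable names and confounders; it is a sketch of the approach, not the authors' code or covariate set.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in for the analysis dataset (all variables hypothetical).
    rng = np.random.default_rng(0)
    n = 2346
    df = pd.DataFrame({
        "vitd_quintile": rng.integers(1, 6, n),   # 1 = lowest, 5 = highest 25(OH)D quintile
        "age": rng.normal(65, 8, n),
        "sex": rng.integers(0, 2, n),
        "smoking": rng.integers(0, 2, n),
        "periodontitis": rng.integers(0, 2, n),   # binary outcome
        "tooth_group": rng.integers(0, 3, n),     # 0: >=20 teeth, 1: 1-19 teeth, 2: edentulous
    })

    # Logistic regression: odds of periodontitis by 25(OH)D quintile,
    # highest quintile as reference, adjusted for illustrative confounders.
    logit_fit = smf.logit(
        "periodontitis ~ C(vitd_quintile, Treatment(reference=5)) + age + C(sex) + C(smoking)",
        data=df,
    ).fit()
    print(np.exp(logit_fit.params))  # exponentiated coefficients = odds ratios

    # Multinomial logistic regression: tooth-count category by quintile;
    # exponentiated coefficients are relative risk ratios (RRRs) vs. the >=20-teeth group.
    mnl_fit = smf.mnlogit(
        "tooth_group ~ C(vitd_quintile, Treatment(reference=5)) + age + C(sex) + C(smoking)",
        data=df,
    ).fit()
    print(np.exp(mnl_fit.params))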

Based on the study, the researchers revealed the following findings:

  • The mean age of participants was 65.3 years, and 55.3% of the group were female.
  • Comparing quintiles of 25(OH)D concentration, participants in the lowest versus the highest quintile had an odds ratio (OR) of 1.57 for having periodontitis in the fully adjusted model.
  • For tooth loss, participants in the lowest versus the highest quintile of 25(OH)D had a relative risk ratio (RRR) of 1.55 for having 1–19 teeth and an RRR of 1.96 for being edentulous, relative to those with ≥20 teeth, in the fully adjusted models.

The cross-sectional study of older men and women in Ireland found an association between 25-hydroxyvitamin D levels and periodontitis and tooth loss.

“These results are particularly significant, considering the high rates of vitamin D deficiency among older adults in the country. Future longitudinal studies should focus on exploring the potential causality and mechanisms involved,” the researchers wrote.

“Furthermore, since vitamin D supplementation generally provides health benefits with a low risk of toxicity or side effects, clinical trials are needed to determine whether vitamin D supplementation can effectively prevent periodontitis and tooth loss in the elderly population in Ireland,” they concluded.

Reference:

Winning L, Scarlett S, Crowe M, O’Sullivan M, Kenny RA, O’Connell B. Vitamin D, periodontitis and tooth loss in older Irish adults. British Journal of Nutrition. Published online 2024:1-9. doi:10.1017/S000711452400148X
