Corticosteroid Phobia Negatively Impacts Treatment Adherence in Chronic Hand Eczema: Study

Researchers now report that corticosteroid phobia is a significant concern in patients with chronic hand eczema (CHE), as this fear severely impacts adherence to topical corticosteroid (TCS) treatment. This finding, which links fear of corticosteroids with low treatment adherence, underscores the need to address patient concerns directly as part of an effective approach to managing CHE. The study was published in the Journal of the American Academy of Dermatology by Christensen and colleagues.

The purpose of the study was to assess patient-reported attitudes toward TCS that influence treatment adherence in chronic hand eczema. Researchers used the TOPICOP scale, an established tool for measuring corticosteroid phobia, and the Medication Adherence Report Scale (MARS), on which patients rate their adherence to prescribed treatments.

The study recruited patients diagnosed with CHE from the Danish Skin Cohort. In total, 927 patients responded to a questionnaire containing the TOPICOP scale and MARS, a response rate of 69.2%. The TOPICOP scale assessed levels of corticosteroid fear, and MARS estimated the degree of medication adherence.

Levels of corticosteroid phobia were high among patients with CHE:

  • 75.5% completely or almost agreed that TCS would harm their skin.

  • 48.9% agreed that TCS would affect their health in the future.

  • 36.3% of patients reported feeling afraid of TCS even though they were not aware of any specific risks associated with its use.

The study also yielded useful insights into treatment adherence:

  • 77.9% of patients always or often stopped TCS treatment as soon as they could, indicating considerable reluctance to continue the therapy for the duration needed.

  • 54.8% reported that they always or often delayed initiating their TCS treatment as long as possible and therefore further delayed appropriate management.

  • Additionally, 38.8% of patients admitted to taking fewer doses than prescribed, and 54.0% reported having stopped their treatment at some point during the course of therapy.

Higher levels of corticosteroid phobia were significantly associated with lower treatment adherence (P = .004), highlighting that fear of TCS directly undermines effective management of CHE.

One limitation of the study is that the TOPICOP scale has not been validated specifically in patients with CHE; it has mainly been used in other dermatological conditions, so its accuracy in capturing corticosteroid phobia in this patient population is uncertain.

In conclusion, corticosteroid phobia is common among patients with CHE and is associated with significantly lower treatment adherence. Addressing these fears through tailored patient education and communication may improve adherence and, in turn, the management of CHE.

Reference:

Christensen, M. O., Sieborg, J., Nymand, L. K., Guttman-Yassky, E., Ezzedine, K., Schlapbach, C., Molin, S., Zhang, J., Zachariae, C., Thomsen, S. F., Thyssen, J. P., & Egeberg, A. (2024). Prevalence and clinical impact of topical corticosteroid phobia among patients with chronic hand eczema—Findings from the Danish Skin Cohort. Journal of the American Academy of Dermatology. https://doi.org/10.1016/j.jaad.2024.07.1503

Powered by WPeMatico

Machine learning-based pure-tone audiometry effective in diagnosing Meniere’s disease: Study

A new study published in the journal Otolaryngology–Head and Neck Surgery showed that a machine learning model utilizing pure-tone audiometry characteristics effectively diagnosed Meniere’s disease (MD) and could foresee the subtypes of endolymphatic hydrops.

Meniere’s disease, a complex vestibular dysfunction, is one of the most complicated disorders in otolaryngology and presents significant diagnostic hurdles. Its etiology remains poorly understood, with endolymphatic hydrops (EH) as a key pathohistological characteristic. Even though EH can now be visualized using gadolinium-enhanced MRI, clinical symptoms still play a major role in the diagnosis of MD.

When MD is first diagnosed, it is frequently divided into “definite MD” and “probable MD.” Thus, Xu Liu and team conducted this study to deploy machine learning models for the automated diagnosis of Meniere’s disease and the prediction of endolymphatic hydrops, based on the air conduction thresholds of pure-tone audiometry.

Pure-tone audiometry data and gadolinium-enhanced magnetic resonance imaging sequences were gathered. Basic and derived analytical features were then constructed from the air conduction thresholds of pure-tone audiometry and used to train 5 traditional machine learning models for MD diagnosis. The best-performing models were also chosen to forecast EH, and the models' diagnostic performance was evaluated against that of skilled otolaryngologists.

The winning light gradient boosting (LGB) model, trained on multiple features, performed remarkably well in diagnosing MD, with an accuracy of 87%, sensitivity of 83%, specificity of 90%, and a strong area under the receiver operating characteristic curve of 0.95.

The LGB model was also better than the other 3 machine learning models at EH prediction, with an accuracy of 78%. Further, a feature importance analysis highlighted the pure-tone audiometry parameters most relevant for EH prediction and MD diagnosis: the standard deviation and mean of whole-frequency hearing, hearing at low frequencies, and the audiogram peak at 250 Hz stood out. Overall, this study supports the use of pure-tone audiometry data in an AI-based method to identify MD.
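The reported accuracy, sensitivity, and specificity are standard functions of a confusion matrix. As a minimal sketch, the hypothetical counts below (not taken from the paper, and assuming made-up cohort sizes of 100 MD patients and 150 controls) roughly reproduce the headline figures:

```python
def classification_metrics(tp, fn, tn, fp):
    """Return accuracy, sensitivity, and specificity as fractions."""
    sensitivity = tp / (tp + fn)               # true-positive rate
    specificity = tn / (tn + fp)               # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp) # overall correct fraction
    return accuracy, sensitivity, specificity

# Hypothetical counts chosen only for illustration: 100 MD patients,
# 150 controls, with 83 and 135 correctly classified, respectively.
acc, sens, spec = classification_metrics(tp=83, fn=17, tn=135, fp=15)
print(f"accuracy={acc:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")
```

The AUC of 0.95 is a separate, threshold-free measure; the three metrics above all depend on the particular decision threshold the model used.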

Reference:

Liu, X., Guo, P., Wang, D., Hsieh, Y., Shi, S., Dai, Z., Wang, D., Li, H., & Wang, W. (2024). Applications of Machine Learning in Meniere’s Disease Assessment Based on Pure‐Tone Audiometry. In Otolaryngology–Head and Neck Surgery. Wiley. https://doi.org/10.1002/ohn.956


Standing more may not reduce CVD risk, could increase circulatory disease: Study

Standing has gained popularity among people looking to offset the harms of a sedentary lifestyle often caused by spending long days sitting in front of the computer, the television or the steering wheel. Standing desks have become a popular option among office workers, and in other industries like retail, workers may opt to stand instead of sit.

However, their efforts may not produce the intended result. New University of Sydney research has shown that over the long-term, standing more compared with sitting does not improve cardiovascular health (coronary heart disease, stroke and heart failure), and could increase the risk of circulatory issues related to standing, such as varicose veins and deep vein thrombosis.

The study, published in the International Journal of Epidemiology, also found that sitting for over 10 hours a day increased both cardiovascular disease and orthostatic circulatory disease risk, reinforcing the need for greater physical activity throughout the day. The research also notes that standing more was not associated with heightened cardiovascular disease risk.

Lead author from the Faculty of Medicine and Health and Deputy Director of the Charles Perkins Centre’s Mackenzie Wearables Research Hub, Dr Matthew Ahmadi, said there were other ways for those with a sedentary lifestyle to improve their cardiovascular health.

“The key takeaway is that standing for too long will not offset an otherwise sedentary lifestyle and could be risky for some people in terms of circulatory health. We found that standing more does not improve cardiovascular health over the long term and increases the risk of circulatory issues,” Dr Ahmadi said.

While the researchers found that there were no health benefits gained from standing more, they cautioned against sitting for extended periods, recommending that people who are regularly sedentary or find themselves standing for long periods schedule regular movement throughout the day.

“For people who sit for long periods on a regular basis, including plenty of incidental movement throughout the day and structured exercise may be a better way to reduce the risk of cardiovascular disease,” said Professor Emmanuel Stamatakis, Director of the Mackenzie Wearables Research Hub.

“Take regular breaks, walk around, go for a walking meeting, use the stairs, take regular breaks when driving long distances, or use that lunch hour to get away from the desk and do some movement. In Australia, we are now coming into the warmer months, so the weather is perfect for sun-safe exercise that helps you get moving,” he said.

Professor Stamatakis and Dr Ahmadi’s research published earlier this year found that about 6 minutes of vigorous exercise or 30 minutes of moderate-to-vigorous exercise per day could help lower the risk of heart disease even in people who were highly sedentary for more than 11 hours a day.

The study was conducted using incident heart condition and circulatory disease data taken over a period of seven to eight years from 83,013 UK adults who were free of heart disease at baseline, measured using research-grade wrist-worn wearables similar to a smartwatch.

The data used in the study was not explicitly collected on standing desk usage; instead, it measured the cardiovascular and circulatory impacts of increased standing. Standing desk use in this study likely contributes a very small fraction of total standing.

Reference:

Matthew N Ahmadi, Pieter Coenen, Leon Straker, Emmanuel Stamatakis, Device-measured stationary behaviour and cardiovascular and orthostatic circulatory disease incidence, International Journal of Epidemiology, Volume 53, Issue 6, December 2024, dyae136, https://doi.org/10.1093/ije/dyae136.


Add-on fentanyl to ropivacaine enhances Caudal Epidural Block Duration and Efficacy in spine surgeries: Study

A single injection of local anesthetic as a caudal epidural block relieves pain for 2-4 hours. This timeframe can be prolonged by incorporating additional agents such as opioids, ketamine, α2 agonists, and adrenaline. A recent research paper examined the use of adjuvants, specifically fentanyl, to extend the duration of pain relief provided by a single injection of the local anesthetic ropivacaine as a caudal epidural block. The study aimed to compare the quality and duration of pain relief using a combination of caudal epidural ropivacaine and fentanyl versus ropivacaine alone in patients undergoing lumbosacral spine surgeries.

The study included 56 ASA grade I and II patients who were randomly divided into two groups. Group R received 20 ml of 0.2% ropivacaine, while Group RF received 20 ml of 0.2% ropivacaine combined with 50 micrograms of fentanyl as a caudal epidural block. The results showed that the addition of fentanyl to the ropivacaine injection significantly prolonged the duration of analgesia. The mean time until the first rescue analgesia was required was 7.30 hours in the RF group compared to 6.68 hours in the R group. The visual analog scale (VAS) scores were also lower in the RF group throughout the 24-hour postoperative period, with a maximum VAS of 5.87 at 4 hours compared to 5.96 at 6 hours in the R group.

Comparison of Analgesia

The intraoperative fentanyl requirement was similar between the two groups. In the postoperative period, the total amount of rescue analgesia (fentanyl, diclofenac, and tramadol) administered was also comparable between the groups. No adverse effects or complications related to the caudal block or the administered drugs were observed in either group.

Study Conclusion

The study concludes that the addition of 50 micrograms of fentanyl to 20 ml of 0.2% ropivacaine for ultrasound-guided caudal epidural block in patients undergoing lumbosacral spine surgeries results in longer duration of analgesia and lower VAS scores over the postoperative 24 hours, without increasing the incidence of adverse effects.

Implications of the Findings

The findings suggest that the combination of ropivacaine and fentanyl as a caudal epidural block provides superior postoperative pain relief compared to ropivacaine alone in patients undergoing lumbosacral spine surgeries. This technique can be a valuable tool in managing postoperative pain and improving patient outcomes in this patient population.

Key Points

1. The study examined the use of the adjuvant fentanyl to extend the duration of pain relief provided by a single injection of the local anesthetic ropivacaine as a caudal epidural block in patients undergoing lumbosacral spine surgeries.

2. The study included 56 ASA grade I and II patients who were randomly divided into two groups – Group R received 20 ml of 0.2% ropivacaine, while Group RF received 20 ml of 0.2% ropivacaine combined with 50 micrograms of fentanyl as a caudal epidural block.

3. The results showed that the addition of fentanyl to the ropivacaine injection significantly prolonged the duration of analgesia, with the mean time until the first rescue analgesia being 7.30 hours in the RF group compared to 6.68 hours in the R group. The visual analog scale (VAS) scores were also lower in the RF group throughout the 24-hour postoperative period.

4. The intraoperative fentanyl requirement was similar between the two groups, and in the postoperative period, the total amount of rescue analgesia (fentanyl, diclofenac, and tramadol) administered was likewise comparable between the groups.

5. No adverse effects or complications related to the caudal block or the administered drugs were observed in either group.

6. The study concludes that the addition of 50 micrograms of fentanyl to 20 ml of 0.2% ropivacaine for ultrasound-guided caudal epidural block in patients undergoing lumbosacral spine surgeries results in longer duration of analgesia and lower VAS scores over the postoperative 24 hours, without increasing the incidence of adverse effects.

Reference –

Rajwade S, Dubey R, Khetarpal M, et al. (October 06, 2024) Comparative Analysis of the Postoperative Analgesic Effects of Caudal Epidural Injection of Ropivacaine Combined With Fentanyl Versus Ropivacaine Alone in Lumbosacral Spine Surgeries: A Randomized Double-Blinded Study. Cureus 16(10): e70963. DOI 10.7759/cureus.70963


Can daily egg intake reduce the risk of dementia development in elderly?

A recent study published in a recent issue of the journal Nutrients showed that eating one egg per day may reduce the risk of dementia, while eating more or fewer eggs may have the opposite effect. With the global population aging, dementia has emerged as a significant public health concern: with roughly 10 million new cases each year, it affects about 55 million people worldwide, and by 2050 there are projected to be 152 million dementia sufferers. Since there is currently no recognized treatment for dementia, public health prioritizes primary prevention as a way to lower the prevalence of dementia and slow its development.

The body of research on the relationship between dietary variables, such as fish consumption, and dementia incidence and prognosis is growing, while efforts to prevent dementia have focused on other modifiable risk factors, like smoking, depression, and exposure to air pollution. To ascertain the independent relationship between egg intake and dementia, Precious Igbinigie and colleagues carried out this population-based case-control study in China.

From community health care clinics and the dementia management system in Guangzhou, China, this study randomly selected 233 people with dementia and 233 people without dementia. Their food intake during the previous two years and other risk factors for chronic illnesses were examined. The frequency of egg intake was categorized as non-consuming, monthly, weekly, daily, or at least twice a day.

  • The study found that, compared with controls, participants with dementia were more likely to eat eggs on a monthly basis but less likely to do so daily. Compared with daily egg intake, the age-adjusted odds ratio (OR) for dementia was 1.76 in people who consumed eggs weekly and 4.34 in those who consumed them monthly.
  • However, no significant association was found for the non-consumption category. After additional adjustment for gender, family income, education, alcohol use, smoking, dietary intake, cardiovascular disease, and other comorbidities, the corresponding ORs were 2.10, 4.82, 0.73, and 4.16, respectively.
  • Among people who consumed eggs on a monthly, weekly, or daily basis, an inverse relationship between egg intake and dementia was found. The multiple adjusted OR of dementia was 0.48 for each average increase in egg consumption; compared with monthly consumption, the OR was 0.44 for weekly intake and 0.22 for daily intake.
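In a case-control design like this one, an unadjusted odds ratio comes from a 2x2 table of exposure by case status (the study's reported ORs are regression-adjusted, which this sketch does not attempt). The counts below are made up purely for illustration:

```python
def odds_ratio(cases_exposed, cases_unexposed,
               controls_exposed, controls_unexposed):
    """Unadjusted odds ratio from a 2x2 case-control table:
    odds of exposure among cases divided by odds among controls."""
    return (cases_exposed / cases_unexposed) / \
           (controls_exposed / controls_unexposed)

# Hypothetical counts (NOT the study's data): exposure = monthly egg
# intake, reference = daily intake, among 100 cases and 100 controls.
or_monthly = odds_ratio(cases_exposed=60, cases_unexposed=40,
                        controls_exposed=30, controls_unexposed=70)
print(f"OR = {or_monthly:.2f}")  # (60/40) / (30/70) = 3.50
```

An OR above 1 means the exposure is more common among cases; below 1, it is more common among controls, as with the inverse associations for weekly and daily intake.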

Overall, this study suggests that daily egg intake may lessen the risk of dementia. However, the links between dementia and non-consumption, monthly consumption, or twice-a-day consumption require additional research.

Source:

Igbinigie, P. O., Chen, R., Tang, J., Dregan, A., Yin, J., Acharya, D., Nadim, R., Chen, A., Bai, Z., & Amirabdollahian, F. (2024). Association between Egg Consumption and Dementia in Chinese Adults. In Nutrients (Vol. 16, Issue 19, p. 3340). MDPI AG. https://doi.org/10.3390/nu16193340


Increased intake of cider and beer associated with gout risk, finds Study

A new study by Jie-Qiong Lyu and team found that beer and cider increase the risk of gout in both men and women more than other forms of alcohol. The findings were published in JAMA Network Open. The sex-specific relationships between alcohol use and gout are not well established, since most previous research on the subject has either involved only males or combined both sexes. This study therefore assessed the relationship between incident gout in men and women and the consumption of both total and specific alcoholic beverages.

A total of 401,128 people in the UK Biobank, aged 37 to 73 years and gout-free at baseline (2006 to 2010), were included in this prospective cohort study. Data analysis took place between August 2023 and June 2024, and participants were monitored until December 31, 2021. Total alcohol consumption and consumption of specific alcoholic beverages were measured using a questionnaire. Incident gout, the outcome, was determined from medical records.

The primary analysis included 221,300 women and 179,828 men. Among men, current drinkers had a greater chance of developing gout than never drinkers; this was not the case among women. Among current drinkers, higher total alcohol intake was linked to an increased risk of gout in both sexes, with a stronger association in men than in women. Regarding specific beverages, beer or cider showed the most pronounced sex difference. For both sexes, drinking beer or cider, champagne or white wine, or spirits all increased the likelihood of developing gout, and the strongest association was seen with one pint of beer or cider per day.

After controlling for other alcoholic beverages and excluding participants who had cut back on alcohol for health reasons, self-reported being in poor health, had kidney failure, cardiovascular disease, or cancer at baseline, or developed gout during the first two years of follow-up, some inverse associations between light-to-moderate consumption of certain alcoholic beverages and gout were eliminated. Overall, higher intake of several specific alcoholic beverages was linked to an increased incidence of gout in both sexes in this cohort analysis.

Source:

Lyu, J.-Q., Miao, M.-Y., Wang, J.-M., Qian, Y.-W., Han, W.-W., Peng, X.-Z., Tao, H.-W., Yang, J., Chen, J.-S., Qin, L.-Q., Chen, W., & Chen, G.-C. (2024). Consumption of Total and Specific Alcoholic Beverages and Long-Term Risk of Gout Among Men and Women. In JAMA Network Open (Vol. 7, Issue 8, p. e2430700). American Medical Association (AMA). https://doi.org/10.1001/jamanetworkopen.2024.30700


Kidney transplantation between donors and recipients with HIV safe, suggests NEJM study

Kidney transplantation from deceased donors with HIV (HIV D+) to recipients with HIV (HIV R+) was safe and comparable to kidney transplantation from donors without HIV (HIV D-) in a multicenter observational study in the United States. The clinical outcomes observed were consistent with smaller pilot studies, but this National Institutes of Health (NIH)-funded clinical trial was the first statistically powered to demonstrate noninferiority, which means that an approach being studied is as good as standard clinical practice. The results were published today in the New England Journal of Medicine.

Kidney transplants offer a survival benefit to people with HIV and end-stage kidney disease, but an organ shortage limits access. In addition, people with HIV face a higher risk of death while on the organ waitlist and have lower access to transplants than people without HIV. To help address these disparities, the HIV Organ Policy Equity Act (HOPE) was implemented in 2015 and legalized transplants between donors and recipients with HIV. Currently, the HOPE Act limits this practice to research settings to carefully evaluate outcomes. These include post-transplant survival, post-transplant kidney function (also known as graft survival), and kidney rejection. Research studies also assess unique potential risks of this practice, such as acquiring a second, genetically distinct HIV strain from the donor that could affect the recipient’s HIV disease.

The present study enrolled 198 adults with HIV and end-stage kidney disease who received kidney transplants at 26 centers, comparing the outcomes of 99 study participants who had donors with HIV versus 99 whose donors did not have HIV. Transplants were completed between April 2018 and September 2021 and recipients were monitored subsequently for about three years.

The outcomes for overall survival, graft survival, and rejection events were similar between the two groups. At one year post-transplant, recipient survival was 94% in HIV D+/R+ and 95% in HIV D-/R+. At three years, recipient survival rates were 85% in HIV D+/R+ and 87% in HIV D-/R+. At one year post-transplant, graft survival was 93% in HIV D+/R+ and 90% in HIV D-/R+. At three years post-transplant, graft survival rates were 84% in HIV D+/R+ and 80% in HIV D-/R+. Finally, at one year post-transplant, rejection incidence was 13% in HIV D+/R+ and 21% in HIV D-/R+, and at three years, 13% in HIV D+/R+ versus 21% in HIV D-/R+. Rates of serious adverse events, surgical site infections, surgical/vascular complications, and cancer were also comparable between the two groups. One case of a recipient who may have acquired a second genetically distinct HIV strain from their donor was observed, but there were no notable clinical consequences.
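Noninferiority conclusions in trials like this rest on confidence intervals around between-arm differences. As a simplified sketch only (the trial's actual analysis is more sophisticated than a Wald interval), a risk difference and its 95% CI can be computed from the reported 3-year survival (85% vs 87%) and the arm sizes of 99 recipients each:

```python
import math

def risk_difference_ci(p1, n1, p2, n2, z=1.96):
    """Risk difference p1 - p2 with a Wald 95% confidence interval."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# 3-year survival: 85% (HIV D+/R+) vs 87% (HIV D-/R+), 99 per arm
diff, (lo, hi) = risk_difference_ci(0.85, 99, 0.87, 99)
print(f"difference = {diff:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Noninferiority is declared when the CI around the difference stays inside a prespecified margin; this sketch only illustrates the arithmetic, not the trial's chosen margin or method.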

Overall, the findings show kidney transplantation between donors and recipients with HIV was safe and noninferior to transplantation from donors without HIV. According to the authors, these findings offer evidence to support the expansion of the practice outside of research settings. 

Reference:

Christine M. Durand, Allan Massie, Sander Florman, Safety of Kidney Transplantation from Donors with HIV, New England Journal of Medicine, DOI: 10.1056/NEJMoa2403733.


Capsular Tension Rings Reduce IOL Decentration in Highly Myopic Eyes Over 30 mm, claims JAMA study

Researchers have demonstrated that capsular tension rings (CTRs) can significantly reduce intraocular lens (IOL) decentration and tilt in highly myopic eyes, including cases in which the axial length (AL) is 30 mm or greater. A recent study published in JAMA Ophthalmology was conducted by Lin and colleagues.

In this clinical trial, there were 186 patients with cataracts whose AL was 26 mm or more. All the patients were further divided into three strata according to AL: stratum 1 (26 mm≤AL<28 mm), stratum 2 (28 mm≤AL<30 mm), and stratum 3 (AL≥30 mm). Randomization was carried out by assigning half of the patients in each stratum to the CTR group with a combination of a C-loop IOL with a CTR and the remaining half to the control group with only a C-loop IOL. The principal outcome measure was IOL decentration, which was assessed at 3 months post-surgery by anterior segment optical coherence tomography.

Results

  • Of the 186 eyes included in this study, 93 (50%) were assigned to the CTR group and 93 (50%) to the control group. 87 eyes in the CTR group and 92 eyes in the control group completed follow-up at a minimum of 3 months after cataract surgery.

  • The authors reported that, at 3 months after cataract surgery, IOL decentration was smaller in the CTR group than in the control group (difference, −0.04 mm; 95% CI, −0.07 to −0.01 mm; P = .02).

  • This effect was greater in eyes with an AL of 30 mm or more. In that subgroup, IOL decentration was significantly smaller in the CTR group than in controls (0.20 mm vs 0.28 mm; difference, −0.08 mm; 95% CI, −0.14 to −0.02 mm; P = .01).

  • Additionally, rates of clinically significant IOL decentration (≥0.4 mm) and tilt (≥7°) at 3 months were lower in the CTR group than in the control group. In eyes with AL less than 30 mm, there was no difference in results between the CTR and control groups.

Capsular tension rings produced a reduction in IOL decentration, greater positional stability, and better visual quality in eyes with an AL of 30 mm or more. This study provides important guidance for ophthalmic surgeons in optimizing surgical outcomes for patients with high myopia. Further studies should examine the long-term effects of CTR use on quality of vision and patient satisfaction.

Reference:

Lin, H., Zhang, J., Zhang, Y., Jin, A., Zhang, Y., Jin, L., Xu, Y., Xie, X., Tan, X., Luo, L., & Liu, Y. (2024). Capsular tension ring implantation for intraocular lens decentration and tilt in highly myopic eyes: A randomized clinical trial. JAMA Ophthalmology, 142(8), 708. https://doi.org/10.1001/jamaophthalmol.2024.2215


Septal myectomy considered safe for older patients, reveals Study

USA: Septal myectomy is generally safe for elderly patients with obstructive hypertrophic cardiomyopathy (HCM); however, the presence of left ventricular wall asymmetry should be regarded as a potential risk factor that could complicate outcomes. This insight comes from a recent study published in the Journal of Thoracic and Cardiovascular Surgery.

“Septal myectomy is considered safe for older patients; however, the presence of left ventricular wall asymmetry is associated with a worse prognosis,” the researchers wrote.

Recent studies have highlighted the unique clinical characteristics and postoperative outcomes of elderly patients undergoing septal myectomy for obstructive hypertrophic cardiomyopathy (oHCM). Understanding how oHCM manifests in older adults is critical for improving treatment strategies and outcomes as the population ages.

Surgical septal reduction is often deferred in older adults due to concerns about elevated operative risks. Considering this, Hartzell V. Schaff, Department of Cardiovascular Surgery, Mayo Clinic, Rochester, Minnesota, USA, and colleagues aimed to compare the clinical and echocardiographic features of young and older patients undergoing septal myectomy for oHCM and to evaluate differences in early and late postoperative outcomes.

For this purpose, the researchers included 2,663 patients with obstructive hypertrophic cardiomyopathy who underwent transaortic septal myectomy between 2000 and 2021. These patients were categorized by age into three groups: 18-64 years, 65-74 years, and 75 years and older.

The following were the key findings of the study:

  • The median age at the time of surgery increased over the study period. Older patients had a higher prevalence of female sex, hypertension, and diabetes, while the extent of functional limitation, as measured by NYHA class, remained similar across age groups.
  • Elderly patients exhibited thinner septal and posterior walls and less pronounced asymmetry, and they were less likely to test positive for genetic markers.
  • Hospital mortality rates were 0.2%, 0.5%, and 1.3% for patients under 65, those aged 65-74, and those 75 and older, respectively.
  • Five-year survival rates for the three age groups were 97%, 93%, and 91%, respectively.
  • In patients over 65, the septal-to-posterior wall thickness ratio was significantly associated with increased mortality, whereas this correlation was not observed in younger patients.
  • Most patients reported an improvement in quality of life following myectomy.

The findings revealed that the clinical characteristics of obstructive hypertrophic cardiomyopathy (oHCM) in older patients differ from those in younger individuals. The researchers observed that older patients exhibit more symmetric but less extensive ventricular hypertrophy and lower rates of positive genetic testing. This indicates that HCM may have distinct clinical and morphological variants in the elderly.

“While septal myectomy is considered safe for older patients, the presence of left ventricular wall asymmetry is associated with a poorer prognosis,” they concluded.

Reference:

DOI: https://www.jtcvs.org/article/S0022-5223(24)00904-8/abstract


Falls in elderly associated with increased risk of future dementia: JAMA

A new study published in JAMA Network Open showed that dementia in older persons is more commonly diagnosed within a year following a fall-related injury. Individuals with mild cognitive impairment, a precursor of dementia and a risk factor for Alzheimer disease and related dementias (ADRD), are more likely to fall, as are those who already have dementia. It is uncertain, however, how likely an older adult is to develop dementia following a fall. This study therefore investigated the likelihood of a new ADRD diagnosis following a fall in older individuals.

This retrospective cohort analysis utilized Medicare Fee-for-Service data from 2014 to 2015, with follow-up available for at least a year following the index visit. Participants were persons 66 years of age and older, without a prior dementia diagnosis, who had suffered a traumatic injury that led to an ED or inpatient visit. Data were analyzed from August 2023 to July 2024. Falls were compared with other injury mechanisms as determined by International Classification of Diseases, Ninth Revision (ICD-9) and ICD-10 external cause of injury codes. A multivariable Cox competing risk model, which took into account the competing risk of mortality and adjusted for possible confounders, was used to evaluate the likelihood of receiving a new ADRD diagnosis within a year following a fall.
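The "competing risk of mortality" adjustment matters because patients who die cannot later be diagnosed with dementia; a naive Kaplan-Meier approach would overestimate dementia incidence. A minimal pure-Python sketch of the Aalen-Johansen cumulative incidence estimator on toy data (not the study's Medicare data) illustrates the idea:

```python
def cumulative_incidence(times, events):
    """Aalen-Johansen cumulative incidence of event type 1 (e.g. dementia),
    with event type 2 (e.g. death) as a competing risk; 0 = censored.
    Returns [(time, cumulative incidence)] at each event time."""
    data = sorted(zip(times, events))
    surv = 1.0          # overall event-free survival S(t-)
    cif = 0.0           # cumulative incidence of event type 1
    at_risk = len(data)
    result, i = [], 0
    while i < len(data):
        t = data[i][0]
        d1 = d2 = c = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            e = data[i][1]
            if e == 1: d1 += 1
            elif e == 2: d2 += 1
            else: c += 1
            i += 1
        if d1 or d2:
            cif += surv * d1 / at_risk            # only those still event-free
            surv *= 1 - (d1 + d2) / at_risk       # both event types end follow-up
            result.append((t, cif))
        at_risk -= d1 + d2 + c
    return result

# Toy cohort of 5: dementia at t=1 and t=3, death at t=2 and t=5, censored at t=4
print(cumulative_incidence([1, 2, 3, 4, 5], [1, 2, 1, 0, 2]))
```

Because the death at t=2 removes a subject who could never go on to be diagnosed, the final cumulative incidence of dementia is 0.4 rather than the 0.5 a naive estimate over the remaining at-risk subjects would suggest. The study's Cox model extends this logic to covariate adjustment.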

  • The study included 2,453,655 older adults who had suffered a traumatic injury; the mean age was 78.1 years, 1,522,656 were female, 124,396 were Black, and 2,232,102 were White.
  • In 1,228,847 cases, the mechanism of injury was a fall. ADRD was diagnosed more often within a year after a fall than after other injury mechanisms; the unadjusted hazard ratio (HR) for incident dementia diagnosis following a fall was 1.63.
  • After adjusting for medical comorbidities, patient demographics, and injury characteristics, as well as the competing risk of mortality, multivariable Cox competing risk analysis found that falling was independently linked to an elevated risk of dementia diagnosis among older persons.

The HR was 1.27 for the subgroup of older individuals who had not recently been admitted to a skilled nursing facility. Overall, 10.6% of older persons received a dementia diagnosis during the first year following a fall, indicating that new dementia diagnoses are frequent after falls.

Source:

Ordoobadi, A. J., Dhanani, H., Tulebaev, S. R., Salim, A., Cooper, Z., & Jarman, M. P. (2024). Risk of Dementia Diagnosis After Injurious Falls in Older Adults. In JAMA Network Open (Vol. 7, Issue 9, p. e2436606). American Medical Association (AMA). https://doi.org/10.1001/jamanetworkopen.2024.36606
