Screening for prostate cancer with first-line MRI less cost-effective than first-line PSA testing: Study

A cost-effectiveness analysis found that screening for prostate cancer using biparametric magnetic resonance imaging (bpMRI) as a first-line approach is less cost-effective than first-line prostate-specific antigen (PSA) testing with second-line multiparametric MRI (mpMRI). These findings persisted even under the assumption that bpMRI was performed free of charge, indicating that the cost savings alone were not enough to outweigh the limitations of the first-line MRI approach. The analysis is published in Annals of Internal Medicine.

Researchers from Fred Hutchinson Cancer Center, Beth Israel Deaconess Medical Center, and the Mayo Clinic developed a microsimulation model to evaluate the comparative effectiveness and cost-effectiveness of first-line bpMRI versus first-line PSA with reflex mpMRI for prostate cancer screening. The authors found that first-line MRI-based screening substantially increased rates of false-positive test results, prostate biopsy, and overdiagnosis without commensurate reductions in prostate cancer mortality. They note that even when assuming no cost for first-line bpMRI screening, first-line PSA testing with reflex mpMRI, followed by MRI-guided prostate biopsy with or without transrectal ultrasonography–guided biopsy, still resulted in lower costs and better quality of life for patients. These findings suggest that screening efforts should focus on strategies that reduce false-positive results and overdiagnoses to improve cost-effectiveness.
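For readers unfamiliar with how such comparisons are adjudicated, the sketch below illustrates the standard dominance and incremental cost-effectiveness ratio (ICER) logic used in cost-effectiveness analyses. It is a minimal illustration with hypothetical placeholder numbers, not the authors' microsimulation model.

```python
# Minimal sketch of cost-effectiveness comparison logic (hypothetical numbers,
# not outputs of the authors' microsimulation model).

def compare_strategies(cost_a, qaly_a, cost_b, qaly_b):
    """Compare strategy B against strategy A on lifetime cost and QALYs."""
    d_cost = cost_b - cost_a
    d_qaly = qaly_b - qaly_a
    if d_cost <= 0 and d_qaly >= 0:
        return "B dominates A (no more costly, at least as effective)"
    if d_cost >= 0 and d_qaly <= 0:
        return "A dominates B (B costs more and yields no more QALYs)"
    # Otherwise there is a trade-off, summarized by the ICER.
    return f"ICER of B vs A: ${d_cost / d_qaly:,.0f} per QALY gained"

# Hypothetical example: A = first-line PSA with reflex mpMRI, B = first-line bpMRI.
# A strategy that is both cheaper and better on quality of life, as the PSA-first
# strategy was reported to be, "dominates" regardless of the exact figures.
print(compare_strategies(cost_a=10_000, qaly_a=20.05, cost_b=12_500, qaly_b=20.02))
```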

The authors of an accompanying editorial from Vanderbilt University Medical Center suggest that high-quality cost-effectiveness analyses are crucial to understanding the effect of changes in clinical practice on the overall health care system. In addition, these types of analyses provide important economic context that could bolster guidance statements if used in their development, as acknowledged by the National Comprehensive Cancer Network. The authors suggest that other organizations should follow suit by incorporating cost-effectiveness data in their guidelines. 

Reference:

Roman Gulati, Boshen Jiao, Ra’ad Al-Faouri, Vidit Sharma, Sumedh Kaul, Kevin Wymer, Stephen A. Boorjian, Aria F. Olumi, Ruth Etzioni, and Boris Gershman, Lifetime Health and Economic Outcomes of Biparametric Magnetic Resonance Imaging as First-Line Screening for Prostate Cancer: A Decision Model Analysis, Annals of Internal Medicine, https://doi.org/10.7326/M23-1504.


Even very small amounts of elements in follicular fluid may impact IVF success rates: Study

Although exposure to trace elements (extremely small amounts) has been shown to affect ovarian function in experimental studies, there has been little research on the impact of trace levels of non-essential elements, such as lead (Pb) and mercury (Hg), on female reproduction. Studies have shown that high levels of these non-essential elements may decrease female fertility and reduce the likelihood of getting pregnant. Taken together, this evidence raises concern about the potential negative impact of exposure to trace levels of non-essential elements on human reproduction.

In a new study by George Mason University researchers, funded by the National Institute of Environmental Health Sciences (NIEHS), several trace elements measured in ovarian follicular fluid were associated with differences in ovarian response to in vitro fertilization (IVF), which in turn may affect the chance of getting pregnant. Trace elements are present at very low concentrations, measured in units such as parts per million or parts per billion.

“By studying the association between trace elements and ovarian response to IVF, we aim to better understand factors that may influence conception. Ultimately, the goal is to provide guidance for patients and clinicians in strategies to improve their chances of a successful pregnancy in IVF treatment,” said Michael S. Bloom, the principal investigator and professor in the Department of Global and Community Health. “Associations between follicular fluid trace elements and ovarian response during in vitro fertilization” was published in the July 2024 issue of Environmental Research.

“This study is unique because we used follicular fluid, which is in direct contact with the developing egg, to measure concentrations of trace elements. We found that elements may play a beneficial or detrimental role in reproductive health through ovarian function,” said Rooshna Mohsin, first author and a Ph.D. in Public Health student at George Mason.

Higher concentrations of cobalt (Co) and copper (Cu) in ovarian follicular fluid were associated with a stronger ovarian response, possibly increasing the likelihood of getting pregnant. In contrast, higher concentrations of lead (Pb) were associated with fewer ovarian follicles, which may result in a lower chance of achieving pregnancy.

Researchers also found that oxidative stress (low antioxidant levels) was associated with higher levels of some metal elements and may play an important role in the associations found between follicular fluid trace elements and ovarian function.

This was a clinical observational study of 56 couples using IVF to conceive a pregnancy. Researchers collected follicular fluid from participants’ ovaries after the egg was removed during the routine IVF procedure, measured levels of different metals and other elements in the fluid, and tested for statistical associations between the follicular fluid metals and measures of ovarian function.
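As a rough illustration of the kind of association testing described, the sketch below regresses a hypothetical ovarian response measure on a log-transformed element concentration. All variable names and values are invented for illustration; the study's actual statistical models were more involved.

```python
# Illustrative association test (hypothetical data and variable names only).
# Trace element concentrations are typically right-skewed, so they are often
# log-transformed before being regressed against an ovarian response measure.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "oocytes_retrieved": [8, 12, 5, 15, 9, 11, 7, 13],              # response measure
    "pb_ug_per_L":       [0.9, 0.4, 1.6, 0.3, 1.1, 0.6, 1.3, 0.5],  # follicular fluid lead
    "age":               [34, 29, 38, 31, 36, 33, 37, 30],
})
df["log_pb"] = np.log(df["pb_ug_per_L"])

# Adjusting for age; a negative log_pb coefficient would be consistent with
# higher lead concentrations accompanying a weaker ovarian response.
model = smf.ols("oocytes_retrieved ~ log_pb + age", data=df).fit()
print(model.params["log_pb"], model.pvalues["log_pb"])
```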

Reference:

Rooshna Mohsin, Victor Y. Fujimoto, Aubrey L. Galusha, Patrick J. Parsons, Jenna R. Krall, Celeste D. Butts-Jackson, Evelyn Mok-Lin, Michael S. Bloom, Associations between follicular fluid trace elements and ovarian response during in vitro fertilization, Environmental Research, https://doi.org/10.1016/j.envres.2024.118801.


New automated insulin delivery systems improve diabetes management: ADA

New research highlighting the significant benefits of advanced technological innovations in managing type 2 diabetes (T2D) was presented at the American Diabetes Association’s 84th Scientific Sessions in Orlando, Florida. Three studies showed the positive impacts of automated insulin delivery (AID) systems and continuous glucose monitoring (CGM) on glycemic control and overall diabetes management.

With nearly 40 million Americans affected by diabetes and over 90% of them having T2D, the global prevalence of the disease continues to rise, making effective management strategies more critical than ever. The studies presented at the ADA Scientific Sessions emphasize the transformative potential of integrating advanced technologies into diabetes care for under-resourced populations.

These studies represent a significant advancement in diabetes management technologies, showing substantial improvements in glycemic control and quality of life for people with type 2 diabetes, the ADA’s chief scientific and medical officer noted. Leveraging these technologies will help empower patients with more effective and manageable treatment options, ultimately transforming the landscape of diabetes care.

The SECURE-T2D pivotal trial, the first large-scale, multicenter study of the Omnipod® 5 AID System, demonstrated significant benefits for adults with T2D. The Omnipod 5 AID System is a tubeless insulin pump that automatically adjusts insulin delivery based on CGM data, improving glycemic control by responding to glucose levels in real time.

The trial included a total of 305 adults aged 18 to 75 with T2D who had been using various insulin regimens and had a baseline HbA1c of less than 12.0%. After a 14-day standard therapy phase to establish baseline glycemic control, participants transitioned to 13 weeks of using the Omnipod 5 AID System. The primary endpoint was the change in HbA1c from baseline to 13 weeks. The study population was notably diverse, with 24% Black and 22% Hispanic/Latino participants.

The results showed a significant reduction in HbA1c levels, from an average of 8.2% to 7.4% at the end of the study. The greatest improvements were seen in participants with the highest baseline HbA1c. These results underline the potential of the Omnipod 5 AID System to transform diabetes management for adults with T2D.

Another study, “Glycemic Outcomes with CGM Use in Patients with Type 2 Diabetes-Real-World Analysis,” highlighted the significant impact of CGM on patients with T2D. The study evaluated the impact of CGM on adults using non-insulin therapies, basal insulin, or prandial insulin. The 12-month retrospective analysis used data from a large claims database of over 7.1 million T2D patients and compared HbA1c levels before and after CGM use.

Among the 6,030 adults studied, significant HbA1c improvements were observed across all therapy groups after 12 months. These results suggest that CGM can play a crucial role in improving health outcomes for all diabetes patients, regardless of their treatment regimen. Findings from a related late-breaking abstract also revealed that CGM use in T2D was associated with more than a 50% reduction in all-cause hospitalizations and acute diabetes-related hospitalizations.

Another study presented during the general poster session demonstrated that CGM significantly improves glycemic control in adults with T2D who are not using insulin. The real-world study analyzed data from over 3,800 adults using Dexcom G6 and G7 sensors. Participants showed significant improvements after six months of CGM use, with further progress at one year. Key findings included a 0.5% reduction in the glucose management indicator and a 17% increase in Time in Range (TIR). The study also highlighted the Dexcom High Alert feature, which notifies users when glucose levels exceed their selected targets and may have contributed to the significant long-term improvements in glycemic control observed. Future research will continue to explore ongoing patterns of glycemic improvement and real-world behavior changes enabled by CGM, as well as the impact of other Dexcom system features on glycemic control.

Source:

Breakthrough studies on automated insulin delivery and CGM for type 2 diabetes unveiled at ADA scientific sessions. (n.d.). Diabetes.org. Retrieved June 26, 2024, from https://diabetes.org/newsroom/press-releases/breakthrough-studies-automated-insulin-delivery-and-cgm-type-2-diabetes


Study compares efficacy of lowering FIT positivity threshold to multitarget stool RNA testing for colorectal cancer screening

Germany: In colorectal cancer (CRC) screening, a pivotal study has emerged comparing the effectiveness of lowering the positivity threshold for the fecal immunochemical test (FIT) against multitarget stool RNA testing. The findings shed light on potentially transformative approaches to early detection, a critical factor in combating CRC, one of the leading causes of cancer-related mortality worldwide.

The study, published in the Journal of the American Medical Association (JAMA), found that sensitivity and specificity levels comparable to those reported for the multitarget stool RNA (mt-sRNA) test in the CRC-PREVENT study could be achieved by lowering the FIT positivity threshold, without additional mt-sRNA testing. The results mirror previous observations for multitarget stool DNA testing.

Fecal immunochemical tests for hemoglobin are the most widely used screening tests for colorectal cancer globally. However, they have limited sensitivity for detecting early-stage CRC and CRC precursors.

In 2023, the CRC-PREVENT study, a blinded, prospective, cross-sectional study that enrolled the target population for CRC screening, showed increased sensitivity for CRC of an mt-sRNA test compared with FIT alone (94% vs 78%). The mt-sRNA test combined a commercially available FIT (iFOBT OC-Auto; positivity threshold, 20 μg hemoglobin per gram of feces), participant-reported smoking status, and the concentrations of 8 RNA transcripts. However, this gain in sensitivity came at a substantial loss of specificity for the absence of lesions compared with FIT alone (88% vs 96%).

The analysis by Tobias Niedermaier, German Cancer Research Center, Heidelberg, Germany, and colleagues evaluated whether comparable specificity and sensitivity levels could be achieved by lowering the FIT positivity threshold, without additional stool RNA testing and smoking assessment.

The analyses were based on data from the ongoing German BLITZ study. Participants undergoing screening colonoscopy are recruited in gastroenterology practices in various southern German cities, complete a short questionnaire, and provide a fecal sample before bowel preparation for the evaluation of noninvasive CRC screening tests, including the FIT evaluated in this analysis. All participants provide written informed consent.

The main characteristics of the participants in the BLITZ and CRC-PREVENT studies were summarized by descriptive statistics. Sensitivities of the FIT for detecting CRC or advanced adenoma, and its specificity for the absence of advanced neoplasia, were determined in the BLITZ study after lowering the positivity threshold from the level recommended by the manufacturer (17 μg hemoglobin per gram of feces) to 8.8 μg/g, the level providing the same positivity rate (17%) as reported for the mt-sRNA test in the CRC-PREVENT study.
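The threshold-matching step can be pictured with a short simulation: choose the FIT cutoff as the quantile of measured hemoglobin values that yields the target positivity rate, then recompute sensitivity and specificity at that cutoff. The data below are randomly generated, so the outputs will not reproduce the study's figures; the sketch only shows the mechanics.

```python
# Sketch of matching a FIT cutoff to a target positivity rate (simulated data).
import numpy as np

rng = np.random.default_rng(0)
fit_hb = rng.lognormal(mean=1.0, sigma=1.2, size=7607)  # simulated µg Hb/g feces
has_crc = rng.random(7607) < 0.008                      # simulated disease labels

target_positivity = 0.17                                # rate reported for mt-sRNA
threshold = np.quantile(fit_hb, 1 - target_positivity)  # cutoff giving 17% positives

positive = fit_hb >= threshold
sensitivity = positive[has_crc].mean()                  # positives among cases
specificity = (~positive[~has_crc]).mean()              # negatives among non-cases
print(f"threshold={threshold:.1f} µg/g, sens={sensitivity:.1%}, spec={specificity:.1%}")
```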

The study led to the following findings:

  • Of 10,061 BLITZ participants recruited from 2008 to 2020, 2,454 were excluded, leaving 7,607 participants for analysis.
  • Compared with CRC-PREVENT, the study population in BLITZ was older (mean age, 61.5 vs 55.0 years) and included higher proportions of men (48.5% vs 40.2%) and participants with CRC (0.8% vs 0.4%) and advanced adenomas (10.5% vs 6.8%).
  • Lowering the FIT positivity threshold in BLITZ to achieve the same positivity rate as reported for the mt-sRNA test in CRC-PREVENT yielded similar sensitivities (94.9% vs 94.4% for CRC; 44.7% vs 45.9% for advanced adenoma) and specificities (86.9% vs 85.5%).

Limitations include the indirect nature of the comparison, which was based on study populations differing in age and sex composition and in the prevalence of CRC and advanced adenoma.

“There is a need for future comparisons of novel stool-based screening tests with FITs, which should incorporate comparisons of sensitivities at the same positivity rate or specificity,” the researchers concluded.

Reference:

Niedermaier T, Seum T, Hoffmeister M, Brenner H. Lowering Fecal Immunochemical Test Positivity Threshold vs Multitarget Stool RNA Testing for Colorectal Cancer Screening. JAMA. Published online June 01, 2024. doi:10.1001/jama.2024.9289


Brisk Walking Helps to Reduce Risk of Chronic Diseases in Hypertensive Patients, claims study

A recent study published in the journal Preventive Medicine revealed that walking at a brisk pace may significantly reduce the risk of developing major chronic diseases in individuals with hypertension. The research included a total of 160,470 participants from the UK Biobank and focused on the relationship between walking pace, low-grade inflammation, and the incidence of diseases such as cancer, cardiovascular disease (CVD), and type 2 diabetes mellitus (T2DM).

The study employed the Cox proportional hazards model to analyze the data and found that a faster walking pace correlated with a lower risk of overall cancer, specific cancers (including liver, lung, and endometrial cancers), various CVD events (angina, atrial fibrillation, heart failure, myocardial infarction, peripheral vascular disease, and stroke), and T2DM. Hazard ratios for these diseases ranged from 0.42 to 0.91, indicating a substantial decrease in risk for brisk walkers.
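For context, a Cox proportional hazards analysis of this kind can be sketched in a few lines; the columns, values, and model below are hypothetical stand-ins, not the study's data or code.

```python
# Minimal Cox proportional hazards sketch (hypothetical data; the study's
# models were fit on 160,470 UK Biobank participants with many covariates).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [8.2, 10.1, 4.5, 9.9, 7.3, 11.0, 6.8, 9.0],
    "event":          [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = disease occurred
    "brisk_walker":   [0, 1, 0, 1, 0, 1, 0, 1],   # self-reported walking pace
    "age":            [58, 61, 66, 54, 70, 59, 63, 57],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="event")
print(cph.hazard_ratios_)  # a hazard ratio below 1 for brisk_walker means lower risk
```

A hazard ratio of 0.42, for example, corresponds to a 58% lower instantaneous risk relative to the reference group.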

The study also explored the role of low-grade inflammation in these associations: higher levels of low-grade inflammation were linked to increased risks of the aforementioned diseases, with the exception of liver cancer and atrial fibrillation. Mediation analyses suggested that low-grade inflammation partially explained the relationship between walking pace and reduced risks of lung cancer, T2DM, and most CVD events (excluding atrial fibrillation). The proportion of risk reduction mediated by inflammation ranged from 2.0% to 9.8%.
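One standard way to express a “proportion mediated” on the hazard scale is the difference method shown below; this is a conventional formulation for such analyses, not necessarily the paper's exact estimator.

```latex
% Proportion of the walking-pace effect mediated by inflammation
% (difference method on the log hazard ratio scale):
\mathrm{PM} \;=\; \frac{\log \mathrm{HR}_{\text{total}} \;-\; \log \mathrm{HR}_{\text{direct}}}{\log \mathrm{HR}_{\text{total}}}
```

Here HR_total is the hazard ratio for walking pace without adjusting for the inflammation marker and HR_direct is the hazard ratio after adjustment; PM values of 0.02 to 0.098 correspond to the 2.0% to 9.8% range reported.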

Brisk walking significantly decreased the risk of overall cancer and of specific cancers such as liver, lung, and endometrial cancers in hypertensive individuals. A faster walking pace was associated with a lower incidence of all CVD events except atrial fibrillation, and the risk of T2DM was markedly reduced in individuals who walked briskly. Low-grade inflammation was observed to increase the risk of most chronic diseases, yet the benefits of brisk walking were partially mediated by reductions in this inflammation.

The study highlights the potential health benefits of increasing walking pace for people with hypertension. This suggests that engaging in brisk walking can be a simple, accessible way to reduce the risk of major chronic diseases, partly by reducing low-grade inflammation. These findings could inform public health recommendations and individual strategies for managing hypertension and reducing the burden of chronic illnesses.

Source:

Peng, Y., Liu, F., Wang, P., Wang, X., Si, C., Gong, J., Zhou, H., Zhang, M., & Song, F. (2024). Association between walking pace and risks of major chronic diseases in individuals with hypertension based on a prospective study in UK Biobank: Involvement of inflammation. In Preventive Medicine (Vol. 184, p. 107986). Elsevier BV. https://doi.org/10.1016/j.ypmed.2024.107986


Walking brings huge benefits for low back pain, study finds

Adults with a history of low back pain went nearly twice as long without a recurrence of their back pain if they walked regularly, a world-first study has found.

About 800 million people worldwide have low back pain, and it is a leading cause of disability and reduced quality of life.

Repeated episodes of low back pain are also very common, with seven in 10 people who recover from an episode going on to have a recurrence within a year.

Current best practice for back pain management and prevention suggests the combination of exercise and education. However, some forms of exercise are not accessible or affordable to many people due to their high cost, complexity, and need for supervision.

A clinical trial by Macquarie University’s Spinal Pain Research Group has looked at whether walking could be an effective, cost-effective and accessible intervention.

The trial followed 701 adults who had recently recovered from an episode of low back pain, randomly allocating participants to either an individualised walking program and six physiotherapist-guided education sessions over six months, or to a control group.

Researchers followed the participants for between one and three years, depending on when they joined, and the results have now been published in the latest edition of The Lancet.

The paper’s senior author, Macquarie University Professor of Physiotherapy, Mark Hancock, says the findings could have a profound impact on how low back pain is managed.

“The intervention group had fewer occurrences of activity limiting pain compared to the control group, and a longer average period before they had a recurrence, with a median of 208 days compared to 112 days,” Professor Hancock says.

“Walking is a low-cost, widely accessible and simple exercise that almost anyone can engage in, regardless of geographic location, age or socio-economic status.

“We don’t know exactly why walking is so good for preventing back pain, but it is likely to include the combination of the gentle oscillatory movements, loading and strengthening the spinal structures and muscles, relaxation and stress relief, and release of ‘feel-good’ endorphins.

“And of course, we also know that walking comes with many other health benefits, including cardiovascular health, bone density, healthy weight, and improved mental health.”

Lead author Dr Natasha Pocovi says in addition to providing participants with longer pain-free periods, the program was very cost-effective.

“It not only improved people’s quality of life, but it also reduced both their need to seek healthcare support and the amount of time taken off work by approximately half,” she says.

“The exercise-based interventions to prevent back pain that have been explored previously are typically group-based and need close clinical supervision and expensive equipment, so they are much less accessible to the majority of patients.

“Our study has shown that this effective and accessible means of exercise has the potential to be successfully implemented at a much larger scale than other forms of exercise.”

To build on these findings, the team now hopes to explore how they can integrate the preventive approach into the routine care of patients who experience recurrent low back pain.

Reference:

Natasha C Pocovi, Chung-Wei Christine Lin, Simon D French, Petra L Graham, Johanna M van Dongen, Jane Latimer, Effectiveness and cost-effectiveness of an individualised, progressive walking and education intervention for the prevention of low back pain recurrence in Australia (WalkBack): a randomised controlled trial, The Lancet, https://doi.org/10.1016/S0140-6736(24)00755-4.


High Dose Glucocorticoids Improve Renal Outcomes but Increase Infection and Mortality in Lupus Nephritis: Study

Researchers have found that higher doses of glucocorticoids during initial therapy for lupus nephritis (LN) lead to improved renal outcomes but are associated with an increased risk of infections and mortality. This conclusion emerges from a systematic review and meta-analysis of control arms from randomized clinical trials (RCTs), shedding light on the delicate balance between efficacy and safety in treating LN. The study was recently published in Arthritis & Rheumatology by Gabriel Figueroa-Parra and colleagues.

Lupus nephritis is a severe manifestation of systemic lupus erythematosus (SLE) characterized by inflammation of the kidneys. Effective management of LN is crucial to prevent kidney failure and improve patient outcomes. Glucocorticoids, combined with mycophenolic acid analogs or cyclophosphamide, are standard treatments for LN. However, the optimal dosing regimen to maximize renal response while minimizing adverse effects remains unclear.

The study involved a systematic review and meta-analysis of the control arms of RCTs involving biopsy-proven LN patients. These trials used standardized glucocorticoid regimens alongside either mycophenolic acid analogs or cyclophosphamide and reported outcomes such as complete response (CR), serious infections, and death. Data on glucocorticoid dosing, tapering schemes, and use of glucocorticoid pulses were collected. Meta-analyses of proportions, meta-regression, and subgroup analyses were conducted at six and twelve months for all outcomes.

The analysis included 50 RCT arms, encompassing 3,231 LN patients. Predicted rates for patients starting with oral prednisone 25 mg/day without pulses were:

  • Complete Response (CR): 19.5% (95% CI, 7.3–31.5)

  • Serious Infections: 3.2% (95% CI, 2.4–4.0)

  • Mortality: 0.2% (95% CI, 0.0–0.4)

In contrast, starting with prednisone 60 mg/day (without pulses) showed:

  • CR: 34.6% (95% CI, 16.9–52.3)

  • Serious Infections: 12.1% (95% CI, 9.3–14.9)

  • Mortality: 2.7% (95% CI, 0.0–5.3)

The addition of glucocorticoid pulses further increased the rates of CR and mortality but did not significantly affect the rate of serious infections. A dose-response relationship was observed between the initial glucocorticoid dose and all outcomes at six months, accounting for glucocorticoid pulses, underlying immunosuppressants, and baseline proteinuria.
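To see how such predicted rates are read off a dose-response meta-regression, the sketch below back-transforms a linear predictor on the logit scale into a proportion. The intercept and slope are hypothetical values chosen only to roughly echo the reported CR rates; the paper's model additionally adjusted for pulses, the underlying immunosuppressant, and baseline proteinuria.

```python
# Sketch: predicted outcome rate at a given starting prednisone dose from a
# meta-regression on the logit scale (hypothetical coefficients).
import math

def predicted_rate(dose_mg, intercept=-2.0, slope=0.025):
    """Back-transform logit(p) = intercept + slope * dose into a proportion."""
    logit_p = intercept + slope * dose_mg
    return 1 / (1 + math.exp(-logit_p))

for dose in (25, 60):
    print(f"prednisone {dose} mg/day -> predicted CR rate {predicted_rate(dose):.1%}")
```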

The study demonstrates that higher initial doses of glucocorticoids can enhance renal response in LN patients, but this benefit comes at the cost of higher rates of serious infections and mortality. This finding is crucial for clinicians who must weigh the benefits of improved renal outcomes against the risks of adverse effects when prescribing glucocorticoids for LN.

Higher exposure to glucocorticoids during the initial treatment phase of lupus nephritis is associated with better renal outcomes but also leads to increased risks of infections and mortality. These findings underscore the importance of individualized treatment plans and the need for close monitoring to balance efficacy and safety in managing LN.

Reference:

Figueroa-Parra, G., Cuéllar-Gutiérrez, M. C., González-Treviño, M., Sanchez-Rodriguez, A., Flores-Gouyonnet, J., Meade-Aguilar, J. A., Prokop, L. J., Murad, M. H., Dall’Era, M., Rovin, B. H., Houssiau, F., Tamirou, F., Fervenza, F. C., Crowson, C. S., Putman, M. S., & Duarte-García, A. (2024). Impact of glucocorticoid dose on complete response, serious infections, and mortality during the initial therapy of lupus nephritis: A systematic review and meta‐analysis of the control arms of randomized controlled trials. Arthritis & Rheumatology. https://doi.org/10.1002/art.42920


Cabergoline scores Over Pyridoxine for Lactation Inhibition in Mothers: AJOG

A recent randomized controlled trial compared the effectiveness of cabergoline and pyridoxine (vitamin B6) for lactation inhibition in postpartum women. The study, published in the American Journal of Obstetrics and Gynecology, provides clarity on which treatment is more effective for mothers seeking lactation inhibition for personal, social, or medical reasons.

The trial included a total of 88 postpartum patients who requested lactation inhibition. These women were randomly assigned to receive either cabergoline or pyridoxine. Cabergoline was administered in one of two regimens according to departmental protocol: either a single 1 mg dose on postpartum day 1 or 0.25 mg twice a day for two days. Pyridoxine was given at a dose of 200 mg three times a day for seven days. All participants were screened to exclude those with conditions contraindicating the use of cabergoline, such as hypertensive disorders and fibrotic, cardiac, or hepatic diseases.

Patients assessed their symptoms, including breast engorgement, breast pain, and milk leakage, on a scale from 0 (no symptoms) to 5 (severe symptoms) on days 0, 2, 7, and 14. The primary outcome was the success of lactation inhibition, defined as a score of 0 for both engorgement and pain by day 7. Secondary outcomes included the extent of milk leakage, adverse effects, instances of fever or mastitis, and any changes or discontinuations in treatment.

The results showed that cabergoline was significantly more effective than pyridoxine in inhibiting lactation by day 7, with a success rate of 78% compared with 35% for pyridoxine. Mild symptoms (scores of 0 to 2 for engorgement and pain) were also more common in the cabergoline group, reported by 89% of patients versus 67% in the pyridoxine group.
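As a rough illustration, a difference of this size is easily tested with a two-proportion z-test; the arm sizes below are a hypothetical split of the 88 participants, not the trial's reported allocation or analysis.

```python
# Two-proportion comparison sketch (hypothetical arm sizes summing to 88).
from statsmodels.stats.proportion import proportions_ztest

successes = [35, 15]   # ~78% of 45 and ~35% of 43
n_obs = [45, 43]
stat, pval = proportions_ztest(successes, n_obs)
print(f"z = {stat:.2f}, p = {pval:.4f}")
```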

Milk leakage was also markedly lower in the cabergoline group with only 9% experiencing leakage after 7 days when compared to 42% in the pyridoxine group. By day 14, these figures were 11% for cabergoline users and 31% for those on pyridoxine.

Despite its higher efficacy, cabergoline did present more adverse effects, which were reported by 31% of its users compared with 9% of pyridoxine users. However, all adverse effects were mild, and no major complications were reported in either group. The rates of mastitis and fever related to engorgement were similar between the two groups.

Notably, 21% of the patients in the pyridoxine group switched to or supplemented with cabergoline due to inadequate results, further highlighting the superior efficacy of cabergoline; the final success rates for pyridoxine decreased when adjusted for these changes. The study concluded that while cabergoline is more effective for lactation inhibition, pyridoxine remains a viable option for women who cannot use cabergoline due to its contraindications.

Source:

Dayan-Schwartz, A., Yefet, E., Massalha, M., Hosari-Mhameed, S., Remer-Gross, C., Pasand, E., & Nachum, Z. (2024). The efficiency of cabergoline vs pyridoxine for lactation inhibition—a randomized controlled trial. In American Journal of Obstetrics and Gynecology (Vol. 230, Issue 5, p. 561.e1-561.e8). Elsevier BV. https://doi.org/10.1016/j.ajog.2023.10.009


Study unveils Impact of Temperature on Cardiovascular Mortality: A Global Perspective

Australia: In a comprehensive analysis published in the Journal of the American College of Cardiology, researchers have unveiled the staggering impact of nonoptimal temperatures on cardiovascular mortality globally and regionally over time. This groundbreaking study sheds light on the intricate relationship between climate dynamics and public health outcomes, emphasizing the urgent need for proactive measures to mitigate the adverse effects of temperature extremes.

The study that tracked mortality over 20 years revealed that worldwide, nearly one in 10 cardiovascular deaths may be attributed to “nonoptimal” temperatures.

The findings suggest that globally, 1,801,513 cardiovascular deaths yearly were linked to nonoptimal temperatures over the two decades, comprising 8.86% of total global cardiovascular (CV) deaths, or 26 excess cardiovascular deaths per 100,000 people.

“Nonoptimal temperatures contribute substantially to CV mortality, with heterogeneous spatiotemporal patterns,” the researchers wrote. “There is a need for effective mitigation and adaptation strategies, especially given the increasing heat-related cardiovascular deaths amid climate change.”

The association between nonoptimal temperatures and cardiovascular mortality risk is recognized. However, there is a lack of comprehensive global assessment of this burden. Samuel Hundessa from Monash University in Melbourne, Victoria, Australia, and colleagues aimed to assess the global cardiovascular mortality burden attributable to nonoptimal temperatures and investigate spatiotemporal trends.

For this purpose, a three-stage analytical approach was applied using daily cardiovascular deaths and temperature data from 32 countries. First, the research team estimated location-specific temperature–mortality associations, accounting for nonlinearity and delayed effects. Second, they developed a multivariate meta-regression model relating location-specific effect estimates to five meta-predictors. Third, they estimated cardiovascular deaths associated with nonoptimal, hot, and cold temperatures for each global grid cell (55 km × 55 km resolution) and explored temporal trends from 2000 to 2019.
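The burden step in analyses of this kind typically converts fitted relative risks into attributable deaths. A standard formulation, shown here as a sketch of the general approach rather than the authors' exact estimator, is:

```latex
% Daily attributable fraction and number for temperature x_t, relative to the
% minimum-mortality temperature, using the fitted cumulative relative risk RR:
\mathrm{AF}_t = 1 - \frac{1}{\mathrm{RR}(x_t)}, \qquad
\mathrm{AN}_t = \mathrm{AF}_t \, D_t, \qquad
\mathrm{AF}_{\mathrm{total}} = \frac{\sum_t \mathrm{AN}_t}{\sum_t D_t}
```

where D_t is the daily count of cardiovascular deaths; summing attributable numbers over days and dividing by total deaths yields an overall attributable fraction such as the 8.86% reported.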

The study led to the following findings:

  • Globally, 1,801,513 annual cardiovascular deaths were associated with nonoptimal temperatures, constituting 8.86% of total cardiovascular mortality, corresponding to 26 deaths per 100,000 population.
  • Cold-related deaths accounted for 8.20%, whereas heat-related deaths accounted for 0.66%.
  • The mortality burden varied significantly across regions, with the highest excess mortality rates observed in Central Asia and Eastern Europe.
  • From 2000 to 2019, there was a decrease in cold-related excess death ratios, while heat-related ratios increased, resulting in an overall decline in temperature-related deaths.
  • Southeastern Asia, Sub-Saharan Africa, and Oceania observed the greatest reduction, while Southern Asia experienced an increase.
  • The Americas and several regions in Asia and Europe displayed fluctuating temporal patterns.

As the global community grapples with the escalating challenges posed by climate change, initiatives aimed at minimizing the health risks associated with temperature extremes are imperative. Through collaborative action and evidence-based interventions, policymakers and stakeholders can mitigate the adverse effects of nonoptimal temperatures on cardiovascular health, ensuring a healthier and more sustainable future for all.

Reference:

Hundessa S, Huang W, Zhao Q, et al. Global and regional cardiovascular mortality attributable to nonoptimal temperatures over time. J Am Coll Cardiol. 2024;83:2276-2287.


Expansion of warning statement may be considered after submission of first year PSUR data: CDSCO panel on AstraZeneca’s anticancer drug

New Delhi: Noting that Tremelimumab Concentrate for Solution for infusion 20 mg/ml has not yet been launched in the Indian market, the Subject Expert Committee (SEC) functional under the Central Drugs Standard Control Organisation (CDSCO) has opined that drug major AstraZeneca’s proposal to expand the warning statement from “To be sold by retail on the prescription of a registered oncologist only” to “To be sold by retail on the prescription of a registered oncologist or gastroenterologist only” may be considered after submission of first-year Periodic Safety Update Report (PSUR) data.

This came after AstraZeneca presented the proposal to expand the warning statement for the approved drug Tremelimumab Concentrate for Solution for infusion 20 mg/ml (25 mg/1.25 ml and 300 mg/15 ml presentations) in single-dose vials from “To be sold by retail on the prescription of a registered oncologist only” to “To be sold by retail on the prescription of a registered oncologist or gastroenterologist only.”

Tremelimumab is an anti-CTLA-4 antibody used to treat unresectable hepatocellular carcinoma in combination with durvalumab.

Tremelimumab is a fully human IgG2 monoclonal antibody directed against cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4). CTLA-4 is a cell surface receptor expressed on activated T cells to act as a negative regulator for T cells. By binding to CTLA-4, tremelimumab enhances T cell-mediated killing of tumours and reduces tumour growth. Because CTLA-4 is an immune checkpoint that plays a vital role in regulating T cell-mediated immune responses, tremelimumab is considered an immune checkpoint inhibitor, which is an emerging cancer immunotherapy drug class.

At the recent SEC meeting on the 5th and 6th of June 2024, the expert panel reviewed the proposal for expansion of the warning statement for the approved drug Tremelimumab Concentrate for Solution for infusion 20 mg/ml (25 mg/1.25 ml and 300 mg/15 ml presentations) in single-dose vials from “To be sold by retail on the prescription of a registered oncologist only” to “To be sold by retail on the prescription of a registered oncologist/gastroenterologist only”.

The committee noted that the drug has not yet been launched on the Indian market.

After detailed deliberation, the committee recommended,

“The proposal for expansion of the warning statement may be considered after submission of the first-year PSUR data and subsequent review of the same by the committee.”

