NEJM: Results from targeted therapy for ulcerative colitis study

An international placebo-controlled study led by Cedars-Sinai suggests that a targeted drug therapy developed by its researchers is safe and effective at helping people with moderate to severe ulcerative colitis reach clinical remission.

Results from the multicenter Phase II study, ARTEMIS-UC, were published in The New England Journal of Medicine.

Ulcerative colitis is a type of inflammatory bowel disease (IBD) that damages the digestive tract, causing stomach cramping, diarrhea, weight loss and rectal bleeding. It affects as many as 900,000 people in the U.S., and current treatments are often only minimally effective.

“Findings from this study are poised to have a remarkable impact on treatment for ulcerative colitis and IBD overall,” said study senior author and IBD research pioneer Stephan Targan, MD, the Feintech Family Chair in Inflammatory Bowel Disease and executive director of the F. Widjaja Inflammatory Bowel Disease Institute at Cedars-Sinai. “The investigational therapy was generated based on the concept of precision medicine; it shows promise as being both anti-inflammatory and anti-fibrotic; it represents a potential turning point in drug development and discovery; and it could change how this complex disease is treated in the future.”

The study evaluated a therapy developed by Cedars-Sinai clinician-scientists called tulisokibart (previously PRA023), a man-made monoclonal antibody that acts like endogenous antibodies. It is designed to target and block a protein called TL1A, which can contribute to the severity of ulcerative colitis. By blocking TL1A, the antibody reduces inflammation and targets fibrosis, which drives many of the disease’s complications and much of its severity.

“Unlike other IBD treatments that can exacerbate inflammation or suppress the body’s natural anti-inflammatory responses, our findings suggest that tulisokibart modulates inflammation and the body’s anti-inflammatory mechanisms,” Targan said. “This dual action could lead to more balanced and effective management of ulcerative colitis.”

Notably, the role of TL1A as a master regulator of inflammation was discovered by Targan and collaborators at Cedars-Sinai. In groundbreaking work spanning two decades, the researchers found that while TL1A protects against invading pathogens, at high levels it also contributes to inflammation and fibrosis in IBD.

ARTEMIS-UC was a 12-week study involving 178 adults from 14 countries. It also included a genetic-based companion diagnostic test to help predict response to the therapy.

A Phase III study will further examine safety and test effectiveness of tulisokibart in patients who take it longer than 12 weeks.

Clinician-scientist and geneticist Dermot McGovern, MD, PhD, director of Translational Research in the F. Widjaja Inflammatory Bowel Disease Institute at Cedars-Sinai and one of the study authors, has focused his career on identifying genetic variants associated with ulcerative colitis and other autoimmune diseases, exploring drug targets and working to revolutionize treatment through a precision medicine approach.

Nearly 20 years ago at Oxford University, McGovern and colleagues, in the first-ever genome-wide association study in IBD, identified that a variation in the TNF superfamily 15 (TNFSF15) gene was associated with developing both ulcerative colitis and Crohn’s disease. The protein TL1A, simultaneously being studied by Targan at Cedars-Sinai, is encoded by TNFSF15. McGovern left Oxford to collaborate with Targan and team at Cedars-Sinai in the effort to bring scientific breakthroughs to IBD.

“Findings from the ARTEMIS-UC study exemplify how combining genetics and biology can transform IBD care,” said McGovern, the Joshua L. and Lisa Z. Greer Chair in Inflammatory Bowel Disease Genetics and the director of Precision Health at Cedars-Sinai.

McGovern, who was recently awarded the prestigious Sherman Prize for his pioneering work in advancing understanding of the genetic architecture of IBD in diverse populations, says the uniqueness of this target and the way tulisokibart was designed to interact with that target represent significant advancements in how clinicians approach IBD treatment.

“Previously we have only been able to prescribe a medication to a patient that we think will work well, but going forward we could imagine telling the patient, ‘Actually, the genetic test suggests that you would be more likely to respond to this therapy,’” McGovern said.

Targan and McGovern also noted that ARTEMIS-UC involved multiple countries and diverse populations, reflecting the global nature of IBD. The F. Widjaja Inflammatory Bowel Disease Institute has invested significant resources in extending genetic research in IBD to diverse populations.

“It’s taken a village, supported by Cedars-Sinai’s integrated science culture, to reach this point,” said Targan, a 2017 recipient of the Sherman Prize. “We’ve devoted our careers to getting better treatments to IBD patients, and now we’re closer than ever to helping all patients with ulcerative colitis get their disease into remission so they can get back to enjoying life.”

Reference:

Bruce E. Sands, Brian G. Feagan, Laurent Peyrin-Biroulet, Silvio Danese, et al. Phase 2 Trial of Anti-TL1A Monoclonal Antibody Tulisokibart for Ulcerative Colitis. New England Journal of Medicine, 2024. DOI: 10.1056/NEJMoa2314076.


Early postoperative feeding in kids after GA associated with shorter duration of hospital stay, finds research

The Enhanced Recovery After Surgery (ERAS) guidelines for perioperative care in neonatal intestinal surgery recommend initiating enteral feeds in pediatric patients as soon as possible. However, the specific timing for when pediatric patients should resume oral intake after surgery remains unclear. A recent randomized controlled trial compared the effects of early oral feeding (EF) versus conventional feeding (CF) on postoperative outcomes in children undergoing daycare surgery under general anesthesia. The study included 300 children (150 in each group) and assessed the occurrence of postoperative nausea and vomiting (PONV), postoperative pain, duration of hospital stay, and parental satisfaction.

Comparison of PONV Incidence and Pain Scores

The results showed that the incidence of PONV was similar between the two groups, with 12% in the EF group and 18.7% in the CF group (p=0.109). However, the EF group had significantly lower Face, Legs, Activity, Cry, Consolability (FLACC) pain scores at 0 minutes, 30 minutes, and 1 hour postoperatively compared to the CF group.
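
As a quick plausibility check on the reported PONV comparison, the sketch below re-computes a two-sided two-proportion z-test from the published percentages, assuming 150 children per arm (roughly 18 EF and 28 CF events). This is an illustrative approximation in Python, not the authors’ actual analysis.

    import math

    def two_proportion_z_test(x1, n1, x2, n2):
        """Two-sided z-test for a difference between two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        # Two-sided p-value from the standard normal distribution
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # PONV: 12% of 150 (EF) vs 18.7% of 150 (CF) -> about 18 vs 28 events
    z, p = two_proportion_z_test(18, 150, 28, 150)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p comes out close to the reported 0.109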

Duration of Hospital Stay and Parental Satisfaction

Patients in the EF group had a significantly shorter duration of hospital stay, with a mean of 6.31 hours compared to 10.13 hours in the CF group (p<0.001). Parents of children in the EF group also had significantly better satisfaction scores compared to the CF group (p<0.001). The study concluded that early postoperative feeding in children undergoing lower abdominal, non-gastrointestinal surgery under general anesthesia does not increase the incidence of PONV. Furthermore, early feeding was associated with reduced postoperative pain, shorter hospital stay, and higher parental satisfaction. The authors suggest that early postoperative feeding can be safely implemented in this patient population without increasing the risk of adverse events.

Key Points

The six key points of the study are:

1. The study was a randomized controlled trial that compared the effects of early oral feeding (EF) versus conventional feeding (CF) on postoperative outcomes in children undergoing daycare surgery under general anesthesia.

2. The study assessed the occurrence of postoperative nausea and vomiting (PONV), postoperative pain, duration of hospital stay, and parental satisfaction.

3. The incidence of PONV was similar between the EF and CF groups, with 12% in the EF group and 18.7% in the CF group.

4. The EF group had significantly lower Face, Legs, Activity, Cry, Consolability (FLACC) pain scores at 0 minutes, 30 minutes, and 1 hour postoperatively compared to the CF group.

5. Patients in the EF group had a significantly shorter duration of hospital stay, with a mean of 6.31 hours compared to 10.13 hours in the CF group.

6. Parents of children in the EF group had significantly better satisfaction scores compared to the CF group.

Reference –

Singh R, Huligeri HS, Singh P. A randomized controlled trial to compare the occurrence of postoperative nausea and vomiting in early versus conventional feeding in children undergoing daycare surgery under general anaesthesia. Indian J Anaesth 2024;68:815‑20


Nasal mupirocin therapy post endoscopic sinus surgery does not reduce symptom recurrence: Study

A recent study published in the Iranian Journal of Otorhinolaryngology showed that nasal mupirocin therapy after endoscopic sinus surgery is not effective in reducing symptom recurrence in chronic rhinosinusitis with nasal polyps. In order to reduce the symptom burden and the need for future interventions in patients with chronic rhinosinusitis with nasal polyposis (CRSwNP), modern therapies often involve multi-modality treatment, which includes endoscopic sinus surgery (ESS) followed by ongoing appropriate medical therapy.

The pathophysiology of CRSwNP involves both immunologic processes and genetic predisposition. Asthma, allergy, and aspirin-exacerbated respiratory disease (AERD) appear to play a part in the development of CRSwNP. The most common medical treatments are intranasal and systemic corticosteroids, while functional endoscopic sinus surgery (FESS) is utilized for individuals who do not respond to conventional medications. Thus, Mohebbi and colleagues examined the potential benefits of a topical mupirocin ointment administered in the nasal vestibule for reducing the recurrence of symptoms and enhancing the efficacy of functional endoscopic sinus surgery.

The clinical trial enrolled patients with chronic rhinosinusitis, nasal polyps, and a positive nostril culture for Staphylococcus aureus. In each patient, the right nostril, treated with mupirocin ointment, served as the intervention side, while the left nostril, treated with vitamin A ointment, served as the control. Lund-Mackay radiological scores and Lund-Kennedy endoscopic scores were evaluated at the time of diagnosis and 6 months later.

Of the 60 patients with nasal polyps and chronic rhinosinusitis, 91% tested positive for Staphylococcus aureus in their nostrils. Both groups showed a substantial improvement following surgery when follow-up values were compared with the radiological and endoscopic scores at diagnosis (P=0.001 for both). However, the improvements in endoscopic and radiological scores did not differ significantly between the intervention and control groups (P-values of 0.56 and 0.74, respectively).

Overall, twice-daily use of mupirocin ointment in the nostrils of CRSwNP patients after endoscopic sinus surgery does not appear to have a substantial impact on the surgical outcome. Given the risk of promoting growth of gram-negative pathogenic bacteria, the administration of mupirocin in these individuals should be weighed carefully. Further research with a larger sample size is advised to examine the effectiveness of mupirocin and other topical antibiotics in preventing relapse in patients with CRSwNP.

Source:

Mohebbi, A., Mohsenian, M., Elahi, M., & Minaeian, S. (2024). Mupirocin Ointment Effect on Polyposis Recurrence After Sinus Surgery. Iranian Journal of Otorhinolaryngology, 36(5), 573-580. https://doi.org/10.22038/ijorl.2024.70685.3405


Higher Circulating Lymphocytes and Incidence of Pre-eclampsia and Eclampsia

Pre-eclampsia and eclampsia are two of the most serious acute multisystemic disorders during pregnancy and are significant determinants of maternal and neonatal mortality on a global scale. Pre-eclampsia is associated with an elevated susceptibility to adverse pregnancy outcomes, such as preterm birth and intrauterine growth restriction, thereby amplifying the risk of low birth weight. Furthermore, it is intricately linked to serious maternal and neonatal health complications, including chronic hypertension, maternal end-stage renal disease, and neonatal pulmonary dysplasia. The precise etiology of pre-eclampsia remains elusive, although our current understanding suggests that women afflicted with pre-eclampsia exhibit increased uterine artery resistance due to impaired immune regulation. This, in turn, contributes to the activation of the maternal endothelium and the onset of systemic chronic inflammation. Multiple studies have found that immune cells change significantly in women with pre-eclampsia.

Mendelian randomization (MR) presents a robust means to investigate the causal relationship between immune cells and pre-eclampsia using genetic variants (single nucleotide polymorphisms (SNPs)), and it is also less susceptible to the shortcomings of classical epidemiological studies, such as confounding bias, information bias, and selection bias. Recently, the application of MR has gained significant traction in elucidating the causal link between immune cells and various diseases such as hypertension, amyotrophic lateral sclerosis and multiple sclerosis. In this study, authors utilized MR and colocalization analysis to investigate the potential causal association between immune cells and pre-eclampsia.

For exposure, authors extracted genetic variants associated with immune cell-related traits, and for outcomes, they used summary genetic data of pre-eclampsia/eclampsia. A two-sample Mendelian randomization (MR) analysis was then performed to assess the causal relationship.
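
The article does not describe the estimator in detail, but two-sample MR analyses commonly combine per-SNP Wald ratios using inverse-variance weights. The Python sketch below is a minimal illustration of that inverse-variance-weighted (IVW) approach with made-up summary statistics; it is an assumption about the general method, not the authors’ code or data.

    import numpy as np

    def ivw_mr(beta_exposure, beta_outcome, se_outcome):
        """Fixed-effect inverse-variance-weighted two-sample MR estimate.

        Uses per-SNP Wald ratios (beta_outcome / beta_exposure); the standard IVW
        estimator ignores uncertainty in the SNP-exposure effects.
        """
        beta_exposure = np.asarray(beta_exposure, dtype=float)
        beta_outcome = np.asarray(beta_outcome, dtype=float)
        se_outcome = np.asarray(se_outcome, dtype=float)
        wald_ratios = beta_outcome / beta_exposure        # per-SNP causal estimates
        weights = (beta_exposure / se_outcome) ** 2       # inverse-variance weights
        beta_ivw = np.sum(weights * wald_ratios) / np.sum(weights)
        se_ivw = np.sqrt(1.0 / np.sum(weights))
        return beta_ivw, se_ivw

    # Hypothetical summary statistics for a handful of instrument SNPs
    beta_x = [0.10, 0.08, 0.12, 0.09]   # SNP effects on the immune-cell trait
    beta_y = [0.04, 0.03, 0.05, 0.04]   # SNP effects on pre-eclampsia (log-odds)
    se_y = [0.01, 0.01, 0.02, 0.01]

    beta, se = ivw_mr(beta_x, beta_y, se_y)
    print(f"IVW estimate (log-odds per unit of exposure): {beta:.2f} (SE {se:.2f})")
    print(f"Corresponding odds ratio: {np.exp(beta):.2f}")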

Study found that genetically proxied circulating lymphocyte absolute count was causally associated with total eclampsia (odds ratio (OR) = 1.53, 95% confidence interval (CI) 1.31-1.79, p = 1.15E-07) and pre-eclampsia (OR = 1.50, 95% CI 1.28-1.77, p = 9.18E-07); T cell absolute count was causally associated with total eclampsia (OR = 1.49, 95% CI 1.28-1.73, p = 2.73E-07) and pre-eclampsia (OR = 1.47, 95% CI 1.25-1.72, p = 1.76E-06); and CD28- CD25+ CD8+ T cell absolute count was causally associated with total eclampsia (OR = 1.83, 95% CI 1.44-2.32, p = 7.11E-07) and pre-eclampsia (OR = 1.77, 95% CI 1.38-2.26, p = 6.55E-06).

Study findings collectively demonstrate significant associations between genetically predicted lymphocyte and T cell count and the risk of pre-eclampsia, as well as the combined occurrence of pre-eclampsia and eclampsia. A complex interplay of acquired, genetic, and immune risk factors collectively contributes to the onset of early placental dysfunction, and many researchers believe that an abnormal maternal immune response to the fetus is the initiating factor in the development of eclampsia. Moreover, this process involves cells of the innate and adaptive immune systems, including neutrophils, monocytes, natural killer (NK) cells, and T lymphocytes. This dysfunction also triggers the release of antiangiogenic factors, ultimately culminating in subsequent multiorgan dysfunction.

This study demonstrated a causal relationship between lymphocyte and T cell count and pre-eclampsia and the combination of pre-eclampsia and eclampsia. Based on these findings, authors suggest that routine blood examinations should be incorporated into the clinical evaluation of pregnant women more frequently. In addition, lymphocyte and T cell counts should be monitored in patients with pre-eclampsia and eclampsia. However, additional investigations are imperative to corroborate and validate these findings, in order to evaluate their robustness and generalizability.

Source: Qiuping Zhao, Rongmei Liu, Hui Chen; Hindawi Journal of Pregnancy, Volume 2024, Article ID 8834312, 7 pages. https://doi.org/10.1155/2024/8834312


Low Serum Uric Acid May Help Predict Recurrence Among Acute Ischemic Stroke Patients: Study

A recent study highlighted the potential significance of renal function-normalized serum uric acid (SUA) levels in predicting stroke outcomes, particularly in patients with acute ischemic stroke (AIS). While the relationship between serum uric acid and stroke outcomes has been debated due to its dependence on renal clearance, this research introduces the ratio of SUA to serum creatinine (SUA/SCr) as a promising predictor of stroke recurrence over a one-year period.

This prospective, multicenter observational study explored the association between SUA/SCr levels and the outcomes of stroke patients over one year. The research followed 2,294 patients with AIS and monitored them for stroke recurrence, all-cause mortality and overall prognosis. The study assessed how SUA/SCr levels correlated with these critical outcomes using multivariable Cox regression analyses and restricted cubic splines.

The findings were particularly compelling in regard to stroke recurrence. The study revealed that for every one-unit increase in SUA/SCr, there was a corresponding 19% decrease in the risk of stroke recurrence within one year. This suggests that higher SUA/SCr levels could serve as a protective factor against future strokes in patients with AIS.
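
To make the per-unit effect concrete, the back-of-the-envelope sketch below treats the reported 19% relative reduction as an assumed hazard ratio of about 0.81 per one-unit increase in SUA/SCr and shows how that would compound over larger differences under a log-linear Cox model; this is illustrative arithmetic, not the authors’ fitted model.

    # Assumed hazard ratio per one-unit increase in SUA/SCr (1 - 0.19)
    HR_PER_UNIT = 0.81

    def relative_hazard_reduction(units):
        """Approximate relative reduction in recurrence hazard for a given SUA/SCr difference."""
        return 1.0 - HR_PER_UNIT ** units

    for diff in (1, 2, 3):
        print(f"{diff}-unit higher SUA/SCr -> ~{relative_hazard_reduction(diff):.0%} lower recurrence hazard")
    # Prints roughly 19%, 34%, and 47% for differences of 1, 2, and 3 units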

To better understand the impact of SUA/SCr levels, the study divided the patients into 4 quartiles (Q1-Q4) based on their SUA/SCr ratios. The results showed that patients in the higher quartiles (Q2, Q3, and Q4) had significantly lower risks of stroke recurrence when compared to the individuals in the lowest quartile (Q1). The trend test further confirmed a significant decrease in stroke recurrence from the lowest to the highest quartiles, highlighting the importance of maintaining a higher SUA/SCr ratio.

The study found no significant association between SUA/SCr levels and other outcomes, such as poor prognosis or all-cause mortality. This suggests that while SUA/SCr may play a critical role in reducing stroke recurrence, it does not necessarily impact the overall survival or recovery in the long term. The findings of this study suggest that low SUA/SCr could be an independent risk factor for stroke recurrence within a year following an initial AIS event. The negative but nonlinear association between SUA/SCr and stroke recurrence highlights the complexity of this relationship. While more research would be required to fully understand the underlying mechanisms, these results offer a new perspective on managing stroke patients and potentially preventing future strokes.

Reference:

Zhang, D., Liu, Z., Guo, W., Lu, Q., Lei, Z., Liu, P., Liu, T., Peng, L., Chang, Q., Zhang, M., Lin, X., Wang, F., & Wu, S. (2024). Association of serum uric acid to serum creatinine ratio with 1‐year stroke outcomes in patients with acute ischemic stroke: A multicenter observational cohort study. In European Journal of Neurology. Wiley. https://doi.org/10.1111/ene.16431


Over 80% of pregnant women iron deficient by third trimester, finds study

When a woman becomes pregnant, her iron requirements increase almost tenfold to support fetal development as well as her own increased iron needs. Her ability to meet these increased iron needs depends on her iron stores at the beginning of the pregnancy as well as the physiological adaptations that enhance iron absorption as pregnancy progresses. These physiological adaptations, however, are not always enough to support a pregnant woman’s iron needs, especially among the estimated 50% of women who begin pregnancy with depleted iron stores. While often thought of as a problem in low-resource settings, recent studies have documented iron deficiency rates of 33-42% among pregnant women in high-resource settings.

Iron deficiency can lead to anemia, a condition in which the body can’t produce sufficient hemoglobin, which, in turn, limits the red blood cells’ ability to carry oxygenated blood throughout the body. Anemia during pregnancy is associated with a higher risk of both adverse maternal outcomes and adverse infant outcomes, including postpartum depression, postpartum hemorrhage, preterm birth, low birth weight, and small-for-gestational age birth. Even without the presence of anemia, maternal iron deficiency can result in long-term neurodevelopmental challenges for the child.

At the moment, screening for iron deficiency during pregnancy is not universally routine. Moreover, there are no generally agreed-upon diagnostic criteria for iron deficiency during pregnancy. The most recent draft recommendation from the US Preventive Services Task Force, for example, states that “the current evidence is insufficient to assess the balance of benefits and harms of screening for iron deficiency anemia in pregnant women.” In contrast, the International Federation of Gynecology and Obstetrics and the European Hematology Society recommend that all pregnant women in their first trimester, irrespective of the presence or absence of anemia, be screened for iron deficiency. They also recommend that all women of reproductive age, irrespective of the presence or absence of anemia, be screened for iron deficiency.

Even when screening is conducted, it may be insufficient to detect iron deficiency. In clinical practice, for example, hemoglobin is frequently the only benchmark used to evaluate iron status among pregnant women. Hemoglobin, however, only provides an indication of anemia. As a result, iron deficiency that has not yet advanced to anemia, and the poor maternal and infant health outcomes it can cause, may go undetected.

Unfortunately, well-designed studies of the changes in iron status during the course of pregnancy are limited. In response, the authors of “Longitudinal Evaluation of Iron Status during Pregnancy: A Prospective Cohort Study in a High-Resource Setting” evaluated the changes in iron biomarkers throughout pregnancy, established the prevalence of iron deficiency, and proposed iron status benchmarks in early pregnancy that predict iron deficiency in the third trimester. The authors, Elaine K. McCarthy et al., also sought to determine how common risk factors for iron deficiency such as obesity and smoking affected iron status throughout pregnancy. The results of the study, one of the largest studies ever to document the changes in iron status during pregnancy, were published in The American Journal of Clinical Nutrition, a publication of the American Society for Nutrition.

To conduct their research, the authors worked with data collected from 641 women in Ireland who were pregnant for the first time and had a successful delivery, and who participated in the IMproved PRegnancy Outcomes via Early Detection (IMPROvED) consortium project. Samples were taken from the women at 15 weeks, 20 weeks and 33 weeks of pregnancy to determine iron status. Within 72 hours following delivery, information about the pregnancy, delivery, and the baby was obtained from the mother via an interview with a research midwife. Information pertaining to clinical outcomes and complications during pregnancy and delivery was confirmed by reviewing medical records.

“In this high-resource setting,” the authors found that “iron deficiency defined by a variety of biomarkers and thresholds, was very common during pregnancy, despite the cohort profile as generally healthy.” Interestingly, none of the study participants were anemic in the first trimester, yet more than 80% of the women were iron deficient by the third trimester. In particular, the authors noted that “our cohort had higher rates of deficiency in the third trimester than even some low-resource settings.”

In this study, almost three-quarters of the participants took an iron-containing supplement that contained the Irish/European recommended daily iron allowance of 15-17mg. The authors did note that “iron-containing supplements (mainly multivitamins) taken pre/early pregnancy were associated with a reduced risk of deficiency throughout pregnancy, including the third trimester.”

According to the authors, these findings draw attention to “the benefit of screening for iron deficiency with hemoglobin and ferritin in defined low-risk populations.” Moreover, based on their findings, the authors proposed a threshold for ferritin, a protein that stores iron, of 60µg per liter or less at 15 weeks of pregnancy that predicted the presence of iron deficiency at 33 weeks of pregnancy, defined as 15µg of ferritin per liter or less. The authors noted that “this has previously been identified as the inflection point at which fetal iron accretion is compromised, leading to poorer neurocognitive function and earlier onset of postnatal iron deficiency in the offspring.”
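
To make the proposed benchmark concrete, here is a small, hypothetical helper that applies the thresholds described above (early-pregnancy ferritin of 60 µg/L or less at 15 weeks flagging risk of third-trimester ferritin of 15 µg/L or less). The function names and structure are illustrative only; this is a sketch of the screening rule, not a clinical tool.

    EARLY_FERRITIN_CUTOFF_UG_L = 60.0        # proposed 15-week threshold from the study
    THIRD_TRIMESTER_DEFICIENCY_UG_L = 15.0   # iron-deficiency definition at 33 weeks

    def flag_iron_deficiency_risk(ferritin_15wk_ug_l: float) -> bool:
        """Return True if early-pregnancy ferritin predicts third-trimester iron deficiency."""
        return ferritin_15wk_ug_l <= EARLY_FERRITIN_CUTOFF_UG_L

    def is_iron_deficient_33wk(ferritin_33wk_ug_l: float) -> bool:
        """Third-trimester iron deficiency as defined in the study (ferritin of 15 µg/L or less)."""
        return ferritin_33wk_ug_l <= THIRD_TRIMESTER_DEFICIENCY_UG_L

    # Example: a ferritin of 42 µg/L at 15 weeks would be flagged for follow-up
    print(flag_iron_deficiency_risk(42.0))   # True
    print(flag_iron_deficiency_risk(85.0))   # False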

In an accompanying editorial to this study, “Finally, a Quality Prospective Study to Support a Proactive Paradigm in Anemia of Pregnancy,” also published in The American Journal of Clinical Nutrition, authors Michael Auerbach and Helain Landy bluntly labeled the medical community’s approach to women, including the lack of screening and treating iron deficiency and anemia among pregnant women, as “misogyny.” Given the study’s findings, the editorial calls upon the American College of Obstetricians and Gynecologists and the United States Preventive Services Taskforce to “change their approach to diagnosis to screen all pregnant women for iron deficiency, irrespective of the presence or absence of anemia, and recommend supplementation when present for the most frequent nutrient deficiency disorder that we encounter.”

Looking to the future, the authors of “Longitudinal Evaluation of Iron Status during Pregnancy: A Prospective Cohort Study in a High-Resource Setting” believe that “further good-quality, large-scale longitudinal studies of iron status, with concurrent inflammatory status, are needed to provide the evidence base to help establish much-needed consensus. Moreover, the use of early pregnancy iron biomarkers and thresholds should be instituted in better alignment with clinically meaningful health outcomes.”

Reference:

Elaine K McCarthy, David Schneck, Saonli Basu, Annette Xenopoulos-Oddsson, Fergus P McCarthy, Mairead E Kiely, Michael K Georgieff, Longitudinal evaluation of iron status during pregnancy: a prospective cohort study in a high-resource setting, The American Journal of Clinical Nutrition, 2024, https://doi.org/10.1016/j.ajcnut.2024.08.010.


‘Weekend warrior’ physical activity may help protect against more than 200 diseases, suggests study

Busy with work and other obligations, some people concentrate their moderate-to-vigorous exercise in one or two days of the week or weekend. A study led by investigators at Massachusetts General Hospital, a founding member of the Mass General Brigham healthcare system, has found that this “weekend warrior” pattern of exercise is associated with a lower risk of developing 264 future diseases and is just as effective at decreasing risk as more evenly distributed exercise activity. Results are published in Circulation.

“Physical activity is known to affect risk of many diseases,” said co-senior author Shaan Khurshid, MD, MPH, a faculty member in the Demoulas Center for Cardiac Arrhythmias at Massachusetts General Hospital. “Here, we show the potential benefits of weekend warrior activity for risk not only of cardiovascular diseases, as we’ve shown in the past, but also future diseases spanning the whole spectrum, ranging from conditions like chronic kidney disease to mood disorders and beyond.”

Guidelines recommend at least 150 minutes of moderate-to-vigorous physical activity per week for overall health. Among people who meet these recommendations, however, do those who exercise 20–30 minutes most days of the week experience benefits over those who go 5 or 6 days between longer exercise sessions?

Khurshid, along with co-senior author Patrick Ellinor, MD, PhD, the acting chief of Cardiology and the co-director of the Corrigan Minehan Heart Center at Massachusetts General Hospital, and their colleagues analyzed information on 89,573 individuals in the prospective UK Biobank study who wore wrist accelerometers that recorded their total physical activity and time spent at different exercise intensities over one week. Participants’ physical activity patterns were categorized as weekend warrior, regular, or inactive, using the guideline-based threshold of 150 minutes per week of moderate-to-vigorous physical activity.
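
The article does not spell out the exact classification rule, but a common operationalization of “weekend warrior” activity (and the one assumed in this rough Python sketch) is: below 150 minutes of weekly moderate-to-vigorous physical activity (MVPA) is inactive; at or above 150 minutes with at least half of that activity concentrated in one or two days is weekend warrior; otherwise regular. The concentration threshold here is an assumption for illustration, not the study’s published algorithm.

    from typing import List

    GUIDELINE_MINUTES_PER_WEEK = 150.0

    def classify_activity_pattern(daily_mvpa_minutes: List[float]) -> str:
        """Classify one week of accelerometer-derived MVPA minutes (7 daily values)."""
        total = sum(daily_mvpa_minutes)
        if total < GUIDELINE_MINUTES_PER_WEEK:
            return "inactive"
        # Share of the week's MVPA achieved on the two most active days
        top_two = sum(sorted(daily_mvpa_minutes, reverse=True)[:2])
        return "weekend warrior" if top_two / total >= 0.5 else "regular"

    # Example weeks (minutes of MVPA per day, Monday through Sunday)
    print(classify_activity_pattern([5, 5, 5, 5, 5, 90, 80]))        # weekend warrior
    print(classify_activity_pattern([30, 30, 25, 30, 25, 20, 20]))   # regular
    print(classify_activity_pattern([10, 10, 10, 10, 10, 10, 10]))   # inactive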

The team then looked for associations between physical activity patterns and incidence of 678 conditions across 16 types of diseases, including mental health, digestive, neurological, and other categories.

The investigators’ analyses revealed that weekend warrior and regular physical activity patterns were each associated with substantially lower risks of over 200 diseases compared with inactivity. Associations were strongest for cardiometabolic conditions such as hypertension (23% and 28% lower risks over a median of 6 years with weekend warrior and regular exercise, respectively) and diabetes (43% and 46% lower risks, respectively). However, associations also spanned all disease categories tested.

“Our findings were consistent across many different definitions of weekend warrior activity, as well as other thresholds used to categorize people as active,” said Khurshid.

The results suggest that physical activity is broadly beneficial for lowering the risk of future diseases, especially cardiometabolic conditions. “Because there appears to be similar benefits for weekend warrior versus regular activity, it may be the total volume of activity, rather than the pattern, that matters most,” said Khurshid. “Future interventions testing the effectiveness of concentrated activity to improve public health are warranted, and patients should be encouraged to engage in guideline-adherent physical activity using any pattern that may work best for them.”

Reference:

Shinwan Kany, Mostafa A. Al-Alusi, Joel T. Rämö, James P. Pirruccello, Timothy W. Churchill, Steven A. Lubitz, Associations of “Weekend Warrior” Physical Activity With Incident Disease and Cardiometabolic Health, Circulation, https://doi.org/10.1161/CIRCULATIONAHA.124.068669


Apremilast Significantly Improves Genital Psoriasis Symptoms and QoL, reveals study

According to researchers, the oral phosphodiesterase 4 inhibitor apremilast significantly improves symptoms of genital psoriasis and enhances quality of life in patients with moderate-to-severe disease. In a phase 3, randomized, placebo-controlled trial, the drug showed statistically and clinically meaningful benefits, making it a promising treatment for this challenging and stigmatizing condition. The research was published by Joseph F. Merola and colleagues in the Journal of the American Academy of Dermatology.

Genital psoriasis is a common and distressing presentation of psoriasis that affects patients’ quality of life and frequently carries significant social stigma. Treatment options are limited, especially among oral systemic agents. Apremilast is indicated for treating psoriasis, but this is the first study examining its effectiveness specifically for genital psoriasis.

The DISCREET trial was a phase 3, randomized, double-blind, placebo-controlled study (NCT03777436) designed to determine the efficacy and safety of apremilast 30 mg twice daily in patients with moderate-to-severe genital psoriasis. The study consisted of a 16-week treatment period followed by an extension phase. Patients were stratified by affected body surface area (<10% vs ≥10%) to ensure balanced groups.

A total of 289 patients were randomized to apremilast (n=143) or placebo (n=146). The primary endpoint was the proportion of patients achieving a modified static Physician Global Assessment of Genitalia (sPGA-G) score of 0 or 1 at Week 16 with at least a 2-point reduction from baseline. Secondary endpoints included improvements in genital signs and symptoms, overall skin involvement, and patient-reported quality-of-life outcomes.

Key Findings

  1. At week 16, 39.6% of patients treated with apremilast achieved the primary endpoint, as compared with 19.5% of patients receiving placebo, which corresponded to a difference of 20.1% between groups, statistically significant with P = .0003, thus confirming the efficacy of apremilast in the management of genital psoriasis.

  2. Treatment with apremilast was associated with marked improvements in genital signs and symptoms, with significant reductions in itching, discomfort, and other psoriasis-related symptoms. Beyond the genital area, patients also showed overall improvement of psoriasis at other body sites.

  3. Patients treated with apremilast also showed significant improvement in quality-of-life measures, consistent with better functioning in everyday life and a reduced emotional burden specific to genital psoriasis.

  4. The most frequent treatment-emergent adverse events in the apremilast group were diarrhea, headache, nausea, and nasopharyngitis. These adverse effects were of mild or moderate intensity, consistent with the drug’s expected safety profile.

These results bring to the forefront an oral treatment option with apremilast in patients with genital psoriasis, where therapeutic choices have been few in this population. The significant improvement in symptoms of genital psoriasis, in combination with overall improvements in skin condition and quality of life, underlines the clinical relevance of apremilast in the management of this condition.

In this first randomized controlled study of an oral systemic treatment specifically for genital psoriasis, apremilast showed significant clinical efficacy and quality-of-life improvements. At Week 16, almost 40% of patients achieved a meaningful decrease in the symptoms of genital psoriasis, underscoring apremilast as another valid treatment option for this frequently stigmatizing condition. The results confirm apremilast’s part as an important addition to the treatment landscape for genital psoriasis, offering patients a much-needed therapeutic alternative.

Reference:

Merola, J. F., Parish, L. C., Guenther, L., Lynde, C., Lacour, J.-P., Staubach, P., Cheng, S., Paris, M., Picard, H., Deignan, C., Jardon, S., Chen, M., & Papp, K. A. (2024). Efficacy and safety of apremilast in patients with moderate-to-severe genital psoriasis: Results from DISCREET, a phase 3 randomized, double-blind, placebo-controlled trial. Journal of the American Academy of Dermatology, 90(3), 485–493. https://doi.org/10.1016/j.jaad.2023.10.020


Children under continuous kidney replacement therapy have higher risk of vitamin D deficiency, finds research

A new study by Peace Dorothy Imani and team found that children who need continuous kidney replacement therapy (CKRT) are more likely to have osteopenia, fractures, and/or vitamin D deficiency. The findings of this study were published in the journal BMC Nephrology. A typical feature of chronic kidney disease (CKD) is altered calcium and phosphate balance, caused by dysregulation of phosphate metabolism, parathyroid hormone (PTH), fibroblast growth factor (FGF)-23, Klotho expression, and 1,25-dihydroxyvitamin D (1,25(OH)2D). The ensuing metabolic abnormalities are linked to both mineral bone disease (MBD) and an elevated risk of cardiovascular disease, a primary cause of morbidity and death in people with CKD.

In critically unwell children with acute kidney injury (AKI), continuous kidney replacement therapy is used to manage hemodynamics and gradually remove fluid while permitting nutritional support. Recent consensus from the Acute Disease Quality Initiative (ADQI) defines acute kidney disease (AKD) as AKI lasting more than 7 days but less than 90 days. This study aimed to characterize the bone and metabolic outcomes of pediatric AKD patients who needed more than 28 days of CKRT combined with regional citrate anticoagulation.

In this prospective, single-center observational study, 37 patients who needed CKRT with regional citrate anticoagulation for at least 28 days were included. The duration of CKRT was the exposure, and the outcomes included osteopenia and/or fractures as well as 25-hydroxyvitamin D levels. Vitamin D insufficiency and deficiency were present in 17.2% and 69.0% of patients, respectively. Radiographic evidence of osteopenia and/or fractures was present in 29.7% of the patients. Age and ethnicity did not appear to have any impact on vitamin D deficiency or insufficiency, and vitamin D levels were not predicted by duration on CKRT or intact PTH levels. After adjusting for age and length of CKRT, children with chronic liver disease had higher odds of osteopenia and/or fractures than children with other primary diagnoses.

Overall, vitamin D deficiency and insufficiency are common in pediatric AKD patients undergoing CKRT and may worsen despite conventional supplementation. Patients who require prolonged CKRT may need higher doses of vitamin D supplementation to maintain adequate levels and avoid MBD.

Source:

Imani, P. D., Vega, M., Pekkucuksen, N. T., Srivaths, P., & Arikan, A. A. (2024). Vitamin D and metabolic bone disease in prolonged continuous kidney replacement therapy: a prospective observational study. In BMC Nephrology (Vol. 25, Issue 1). Springer Science and Business Media LLC. https://doi.org/10.1186/s12882-024-03705-9


Maternal depression during early pregnancy associated with impaired child executive functioning at 4 to 5 years of age: AJOG

Maternal depression is a serious condition that affects up to 1 in 7 pregnancies. Despite evidence linking maternal depression to pregnancy complications and adverse fetal outcomes, there remain large gaps in its identification and treatment. More work is needed to define the specific timing and severity of depression that most urgently requires intervention, where feasible, to protect maternal health and the developing fetus.

A study by Levitan RD et al aimed to examine whether the timing and severity of maternal depression and/or anxiety during pregnancy affect child executive functioning at age 4.5 years. Executive functioning in the preschool years is a strong predictor of both school readiness and long-term quality of life.

This longitudinal observational pregnancy cohort study included a sample of 323 mother-child dyads taking part in the Ontario Birth Study, an open pregnancy cohort in Toronto, Ontario, Canada. Maternal symptoms of depression and anxiety were assessed at 12 to 16 and 28 to 32 weeks of gestation and at the time of child testing at age 4.5 years using the 4-item Patient Health Questionnaire. Child executive functioning was measured during a home visit using standardized computerized administration of the Flanker test (a measure of attention) and the Dimensional Change Card Sort (a measure of cognitive flexibility). Post hoc general linear models were used to assess whether maternal depression severity categories (no symptoms, mild symptoms, or probable major depressive disorder) were helpful in identifying children at risk.
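
As a rough sketch of how such a severity-category analysis might look in practice, the Python snippet below fits a general linear model with a categorical depression-severity predictor and one illustrative covariate. The column names, data, and covariate choice are hypothetical; this is not the authors’ code or covariate set.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical dataset: one row per mother-child dyad
    df = pd.DataFrame({
        "flanker_score": [95, 88, 102, 79, 91, 85, 99, 83],
        "dep_category": ["none", "mild", "none", "major", "mild", "major", "none", "mild"],
        "maternal_education_years": [16, 14, 18, 12, 15, 13, 17, 14],
    })

    # General linear model: depression severity category predicting Flanker score,
    # with "none" as the reference level and one illustrative covariate
    model = smf.ols(
        "flanker_score ~ C(dep_category, Treatment(reference='none')) + maternal_education_years",
        data=df,
    ).fit()
    print(model.summary())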

Across all children, after controlling for potential confounds, greater maternal depressive symptoms at 12 to 16 weeks of gestation predicted worse performance on both the Flanker test (ΔR2 = 0.058; P<.001) and the Dimensional Change Card Sort (P=.018). Post hoc general linear modeling further demonstrated that the children of mothers meeting the screening criteria for major depression in early pregnancy scored 11.3% lower on the Flanker test and 9.8% lower on the Dimensional Change Card Sort than the children of mothers without maternal depressive symptoms in early pregnancy. Mild depressive symptoms had no significant effect on executive function scores. There was no significant effect of anxiety symptoms, maternal antidepressant use in early pregnancy, pandemic conditions, or maternal symptoms in later pregnancy or at the time of child testing on either the Flanker or Dimensional Change Card Sort results.

This study demonstrated that fetal exposure to maternal major depression, but not milder forms of depression, at 12 to 16 weeks of gestation is associated with impaired executive functioning in the preschool years.

The current findings suggest that maternal major depression during early pregnancy may have a particularly deleterious effect on the fetal brain circuitry necessary for child executive functioning. This emphasizes an urgent need to improve the recognition and treatment of major depression, particularly in early pregnancy, to limit its negative effects on the patient and on child cognitive development. Possible treatments include both antidepressant medications and cognitive behavior therapy, which now has proven efficacy for antenatal depression per se.

