Smart device uses AI and bioelectronics to speed up wound healing process, reveals study

As a wound heals, it goes through several stages: clotting to stop bleeding, immune system response, scabbing, and scarring. 

A wearable device called “a-Heal,” designed by engineers at the University of California, Santa Cruz, aims to optimize each stage of the process. The system uses a tiny camera and AI to detect the stage of healing and deliver a treatment in the form of medication or an electric field. The system responds to the unique healing process of the patient, offering personalized treatment.

The portable, wireless device could make wound therapy more accessible to patients in remote areas or with limited mobility. Initial preclinical results, published in the journal npj Biomedical Innovations, show the device successfully speeds up the healing process.

Designing a-Heal

A team of UC Santa Cruz and UC Davis researchers, sponsored by the DARPA-BETR program and led by UC Santa Cruz Baskin Engineering Endowed Chair and Professor of Electrical and Computer Engineering (ECE) Marco Rolandi, designed a device that combines a camera, bioelectronics, and AI for faster wound healing. The integration of all three in one device makes it a “closed-loop system,” one of the first of its kind for wound healing, as far as the researchers are aware.

“Our system takes all the cues from the body, and with external interventions, it optimizes the healing progress,” Rolandi said.

The device uses an onboard camera, developed by fellow Associate Professor of ECE Mircea Teodorescu and described in a Communications Biology study, to take photos of the wound every two hours. The photos are fed into a machine learning (ML) model, developed by Associate Professor of Applied Mathematics Marcella Gomez, which the researchers call the “AI physician” running on a nearby computer.

“It’s essentially a microscope in a bandage,” Teodorescu said. “Individual images say little, but over time, continuous imaging lets the AI spot trends, identify wound healing stages, flag issues, and suggest treatments.”

The AI physician uses the image to diagnose the wound stage and compares that to where the wound should be along a timeline of optimal wound healing. If the image reveals a lag, the ML model applies a treatment: either medicine, delivered via bioelectronics; or an electric field, which can enhance cell migration toward wound closure.

The treatment topically delivered through the device is fluoxetine, a selective serotonin reuptake inhibitor that controls serotonin levels in the wound and improves healing by decreasing inflammation and increasing wound tissue closure. The dose, determined in preclinical studies by the Isseroff group at UC Davis to optimize healing, is administered by bioelectronic actuators on the device, developed by Rolandi. An electric field, optimized to improve healing in prior work by UC Davis’ Min Zhao and Roslyn Rivkah Isseroff, is also delivered through the device.

The AI physician determines the optimal dosage of medication to deliver and the magnitude of the applied electric field. After the therapy has been applied for a certain period of time, the camera takes another image, and the process starts again.

While in use, the device transmits images and data such as healing rate to a secure web interface, so a human physician can intervene manually and fine-tune treatment as needed. The device attaches directly to a commercially available bandage for convenient and secure use.

To assess the potential for clinical use, the UC Davis team tested the device in preclinical wound models. In these studies, wounds treated with a-Heal followed a healing trajectory about 25% faster than standard of care. These findings highlight the promise of the technology not only for accelerating closure of acute wounds, but also for jump-starting stalled healing in chronic wounds.

AI reinforcement

The AI model used for this system, developed under Gomez’s leadership, uses a reinforcement learning approach, described in a study in the journal Bioengineering, to mimic the diagnostic approach used by physicians.

Reinforcement learning is a technique in which a model is designed to fulfill a specific end goal, learning through trial and error how to best achieve that goal. In this context, the model is given a goal of minimizing time to wound closure, and is rewarded for making progress toward that goal. It continually learns from the patient and adapts its treatment approach.
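The trial-and-error loop described above can be sketched as a minimal epsilon-greedy bandit. Everything below is hypothetical (the toy wound model, the four discrete dose levels, the noisy reward); it illustrates the reward-driven update, not the actual a-Heal model.

```python
import random

# Hypothetical toy model: four discrete treatment intensities; in this toy,
# level 2 yields the fastest healing. (Illustrative only; not a-Heal's model.)
def healing_progress(dose_level):
    return [0.1, 0.3, 0.6, 0.4][dose_level]

def train(episodes=2000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = [0.0] * 4   # running estimate of healing reward for each dose level
    n = [0] * 4     # times each level has been tried
    for _ in range(episodes):
        if rng.random() < epsilon:
            a = rng.randrange(4)                       # explore a random dose
        else:
            a = max(range(4), key=lambda i: q[i])      # exploit current best
        reward = healing_progress(a) + rng.gauss(0, 0.05)  # noisy observed progress
        n[a] += 1
        q[a] += (reward - q[a]) / n[a]                 # incremental mean update
    return q

q = train()
best_dose = max(range(4), key=lambda i: q[i])
```

Over many noisy observations the estimated values converge, and the policy settles on the dose level that maximizes observed healing progress, which is the essence of rewarding "progress toward wound closure."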

The reinforcement learning model is guided by an algorithm that Gomez and her students created called Deep Mapper, described in a preprint study, which processes wound images to quantify the stage of healing relative to normal progression, mapping it along the healing trajectory. As time passes with the device on a wound, Deep Mapper learns a linear dynamic model of the healing so far and uses it to forecast how healing will continue to progress.
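Deep Mapper itself works on images, but the core idea of fitting a linear dynamic model to past healing and iterating it forward can be sketched with a 1-D stand-in. The healing scores below are invented for illustration.

```python
import numpy as np

# Hypothetical 1-D "healing score" (fraction of wound closed), one reading
# per imaging interval. A stand-in for the image-derived healing stage.
x = np.array([0.05, 0.12, 0.20, 0.27, 0.33, 0.40, 0.45])

# Fit a linear dynamic model x[t+1] = a*x[t] + b by least squares.
A = np.column_stack([x[:-1], np.ones(len(x) - 1)])
(a, b), *rest = np.linalg.lstsq(A, x[1:], rcond=None)

def forecast(x0, steps):
    """Iterate the fitted model forward to predict future healing."""
    out, cur = [], x0
    for _ in range(steps):
        cur = a * cur + b
        out.append(cur)
    return out

preds = forecast(x[-1], 3)
```

Comparing such forecasts against the trajectory actually observed is what lets a feedback controller judge whether a treatment is speeding healing up or letting it lag.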

“It’s not enough to just have the image, you need to process that and put it into context. Then, you can apply the feedback control,” Gomez said.

This technique makes it possible for the algorithm to learn in real-time the impact of the drug or electric field on healing, and guides the reinforcement learning model’s iterative decision making on how to adjust the drug concentration or electric-field strength.

Now, the research team is exploring the potential for this device to improve healing of chronic and infected wounds. 

Reference:

Li, H., Yang, Hy., Lu, F. et al. Towards adaptive bioelectronic wound therapy with integrated real-time diagnostics and machine learning–driven closed-loop control. npj Biomed. Innov. 2, 31 (2025). https://doi.org/10.1038/s44385-025-00038-6.

Powered by WPeMatico

New Lens Implant After Cataract Surgery Restores Clear Vision Across All Distances Without Glasses: Study

Patients who have a new type of lens implanted in their eyes during surgery for cataracts or to correct their eyesight have excellent or good vision over distances both near and far, and often no longer need spectacles for reading.

Research presented today (Sunday) at the 43rd Congress of the European Society of Cataract and Refractive Surgeons (ESCRS) evaluated outcomes for around 200 patients at 17 sites in Europe and Asia-Pacific who had surgery to implant the TECNIS PureSee™, a purely refractive extended depth of field (EDF) presbyopia correction intraocular lens (IOL). Presbyopia is the condition that affects all people as they age, making it harder to focus clearly on close objects and text.

The study looked at visual acuity after surgery: how well the patients could see over far, intermediate and near distances on the logMAR visual acuity charts used by ophthalmologists that consist of rows of letters that become smaller as they go down the chart. It also measured Manifest Refraction Spherical Equivalent (MRSE) – a way of quantifying the refractive errors of the eye. In addition, the study reported on how patients were managing after surgery in terms of whether or not they needed spectacles for reading, how satisfied they were with their outcomes and whether they would recommend the lens to others.
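For readers unfamiliar with the logMAR scale: it is the base-10 logarithm of the minimum angle of resolution, so 0.0 logMAR corresponds to 6/6 (20/20) vision and lower values are better. A small conversion helper, using the standard definition rather than anything from the study:

```python
# logMAR = log10(minimum angle of resolution, in arcminutes).
# 0.0 logMAR = 20/20 (6/6); each +0.1 step is one chart line worse.
def logmar_to_snellen_denominator(logmar, numerator=20):
    """Return D such that acuity is roughly numerator/D on a Snellen chart."""
    return numerator * 10 ** logmar

# e.g. 0.3 logMAR corresponds to about 20/40
```

So a postoperative acuity of 0.1 logMAR, for example, sits one chart line below 20/20.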

Data were available for 238 patients at the time of the Congress, making it one of the biggest studies to report on this type of lens so far. The findings showed that the EDF presbyopia correction IOL, on average, provided excellent distance, very good intermediate, and functional near vision without glasses, with little refractive error.

Nearly all patients (96%) reported needing glasses ‘none’ or ‘a little of the time’ for distance vision; 93% reported this for intermediate distances, 62% for near distances, and 85% for overall vision.

For satisfaction with the outcomes, 96% were ‘mostly’ or ‘completely’ satisfied with their distance vision, 94% with intermediate, 73% with near, and 95% with overall vision; 96% would recommend the lens to their family and friends.

Professor Oliver Findl, Chair of the ESCRS Education Committee, is a consultant eye surgeon and head of the ophthalmology department at Hanusch Hospital, Vienna. He presented the findings to the congress.

He said: “The PureSee EDF IOL gave patients excellent distance, very good intermediate and functional near vision, which resulted in high patient satisfaction with less need for spectacles. The data in this study came from several surgical centres throughout Europe and Asia in a ‘real world setting’ outside of the usual clinical trials.

“The category of EDF IOLs, such as the TECNIS PureSee, is a great alternative to multifocal lenses for patients who wish to be less dependent on spectacles after lens surgery and do not want to take the risk of unwanted optical side-effects.”

Currently, when a person requires cataract surgery, their cloudy lens is replaced with an artificial lens. To enable a patient to see objects both near and far, their surgeon may offer them a choice of lenses, such as:

• Monofocal lens, which enables patients to see clearly at one distance point (far, intermediate or near), but which means they need spectacles for the distances they have not chosen. If surgery is needed on both eyes, patients can choose to have a lens for close-up work in one eye, and another lens for distances in the other eye. This combination is known as monovision, and the brain then adjusts to the two distances so that the patient can see both near and far.

• Multifocal lens, which can provide good vision over all distances, without the need for glasses. The lens is split into different zones or concentric rings, with different prescription lengths for each section, which provide clear, complete vision when combined. However, these lenses may cause some unwanted optical side-effects, especially at night.

• Extended depth of field (EDF) lens, which provides good distance and intermediate (arm’s length) vision, but spectacles may still be needed for close work such as reading small print.

“The difference between some of these lenses and the EDF presbyopia correction IOL that we used in this study is that it is a fully ‘refractive’ IOL, meaning it uses variations in the lens curvature to focus light at a single distance. The surface of the lens is smooth and you don’t see bumps or rings,” said Prof. Findl. “This means you have better night vision and don’t see halos, starbursts, glare and other visual disturbances that can occur with other lenses.”

Dr Joaquín Fernández is ESCRS Secretary, CEO of Qvision and Medical Director of Andalusian Ophthalmology Institute at Vithas Hospitals. He was not involved in the research. He commented: “Eye surgery for cataracts or to correct vision is constantly evolving but, so far, the ‘holy grail’ of developing a lens that can give patients good vision over all distances without any visual disturbances has been elusive. These data from a ‘real world’ study are very encouraging and suggest that the available options are expanding to better meet the expectations of our patients. However, other options still need to be explored. We look forward to further results from the study.”

Reference:

Patients who had cataracts removed or their eyesight corrected with a new type of lens have good vision over all distances without spectacles, European Society of Cataract and Refractive Surgeons, Meeting: 43rd Congress of the European Society of Cataract and Refractive Surgeons.

Molecular breast imaging may benefit women with dense breasts, suggests research

Screening women with dense breasts with both molecular breast imaging (MBI) and digital breast tomosynthesis (DBT) increased overall invasive cancer detection while modestly increasing the recall rate compared with screening only with DBT, according to a new study published today in Radiology, a journal of the Radiological Society of North America (RSNA).

“To our knowledge, this is the first multicenter, prospective evaluation of MBI as a supplement to DBT in women with dense breasts,” said lead author Carrie B. Hruska, Ph.D., professor of medical physics at Mayo Clinic in Rochester, Minnesota.

An estimated 47% of women who undergo breast cancer screening have dense breasts, according to the Centers for Disease Control and Prevention. DBT is an advanced form of mammography that takes multiple X-ray images of the breast from different angles to create a 3D reconstruction of the breast, but it does not detect all breast cancers, especially in women with dense breasts.

MBI is one of several options available for supplemental breast screening for dense breasts, such as breast ultrasound, breast MRI and contrast-enhanced mammography. The Density MATTERS (Molecular Breast Imaging and Tomosynthesis to Eliminate the Reservoir) Trial was designed to assess the performance of screening MBI as a supplement to DBT in women with dense breasts.

For the trial, women with dense breasts from five sites were prospectively enrolled from 2017 to 2022 and underwent two annual screening rounds of DBT and MBI (prevalence screening at Year 1 and incidence screening at Year 2). One-year follow-up after each screening round was completed in September 2024.

Eligible participants included women aged 40–75 years who were asymptomatic and had dense breasts as visually assessed by a radiologist and reported on their last mammogram.

The study cohort included 2,978 participants (mean age 56.8 years). The women were mostly postmenopausal, and 82% had category C breast density. Approximately 80% of participants had no family history of breast cancer, and 98% had no personal history of breast cancer.

Across both screening rounds, 30 breast cancer lesions were detected in 29 participants by MBI only and not found with DBT. Most of these incremental breast cancers were invasive (22 of 30 lesions, or 73%). The median invasive lesion size was 0.9 cm. Among the participants with MBI-only detected breast cancer, 26 of 29 (90%) had node-negative cancers, and 6 of 29 (20%) had node-positive disease.

“MBI detected an additional 6.7 cancers per 1,000 screenings at Year 1 and an additional 3.5 cancers per 1,000 screenings at Year 2,” Dr. Hruska said. “Among the incremental cancers detected only by MBI, 70% were found to be invasive. Additionally, 20% of those incremental cancers were node positive, suggesting that MBI can reveal mammographically occult, clinically important disease.”
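The per-1,000 figures quoted above are straightforward rate arithmetic. A quick back-of-envelope check, where the raw counts of 20 and 9 are inferred by rounding from the reported rates and cohort sizes, not taken from the paper's tables:

```python
# Incremental cancers detected only by MBI, expressed per 1,000 screenings.
# The counts (20, 9) are back-calculated from the quoted rates; illustrative only.
def per_1000(cases, screened):
    return 1000 * cases / screened

year1_rate = per_1000(20, 2978)  # roughly 6.7 per 1,000 screenings
year2_rate = per_1000(9, 2590)   # roughly 3.5 per 1,000 screenings
```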

In the first screening round, 7 of 2,978 participants (2.4 per 1,000 screened) were diagnosed with node-positive cancers. DBT alone detected 4 of 7 (57%), and DBT plus MBI detected 7 of 7 (100%).

In Year 2, 6 of 2,590 participants (2.3 per 1,000 screened) had node-positive cancers. DBT alone detected 1 of 6 (17%), and DBT plus MBI detected 4 of 6 (67%). Neither modality detected 2 of the 6 node-positive cancers (33%).

“Someone who’s having their routine annual screen every year should not be diagnosed with advanced breast cancer,” Dr. Hruska said. “That’s just unacceptable. With a supplemental screening every few years, we hope to find cancers earlier and see the diagnosis of advanced cancer go way down.”

Dr. Hruska said one of the strengths of the Density MATTERS trial was the mix of academic medical centers and community hospitals involved. Participating centers also included MD Anderson Cancer Center, Henry Ford Health System, ProMedica Breast Care (regional practice in Ohio), and a Mayo Clinic Health System site in La Crosse, Wisconsin.

“The enrollment of 12% minority patients extends the generalizability of our findings,” she said.

She said the trial results provide important data for healthcare institutions assessing the best modality for supplemental breast screening.

“I don’t want to discourage anyone from getting a mammogram, because they absolutely should,” Dr. Hruska said. “However, DBT doesn’t find all cancers, and women need to understand its limitations and consider how supplemental screening can fill the gap.”

According to Dr. Hruska, MBI is considered safe for routine screening, is well-tolerated by patients and is relatively inexpensive.

“MBI uses a well-established radiotracer that’s been used in cardiac imaging for a really long time,” she said. “It has fewer risks than other modalities and no contrast reactions. If a woman has a choice of modalities, it’s important that she understands the benefits and risks of each and be involved in the decision-making.”

Reference:

Carrie B. Hruska, Katie N. Hunt, Nicholas B. Larson. Molecular Breast Imaging and Digital Breast Tomosynthesis for Dense Breast Screening: The Density MATTERS Trial. Radiology. https://doi.org/10.1148/radiol.243953

Meta-Analysis Reveals Higher Risk of Dry Eye in Patients with Inflammatory Bowel Disease

Canada: Dry eye syndrome (DES) emerges as a frequently overlooked extraintestinal manifestation of inflammatory bowel disease (IBD), a systematic review and meta-analysis published in Frontline Gastroenterology has revealed. 

The review, led by Bachviet Nguyen and colleagues from the University of British Columbia, Vancouver, reveals that patients with IBD are at significantly higher risk of developing DES, highlighting the need for clinicians to actively screen for ocular complications in this population.
The meta-analysis pooled data from eight cohort studies encompassing 55,211 patients with IBD and 54,870 controls without IBD. Researchers examined objective ocular parameters, including tear production (Schirmer I test), tear film stability (tear breakup time, TBUT), and symptom severity (Ocular Surface Disease Index, OSDI). The results demonstrate that IBD patients experience measurable impairment in both tear production and tear film stability, along with more severe dry eye symptoms, compared with non-IBD controls.
Key findings from the review include:
  • Increased risk of DES: IBD patients had more than double the odds of developing dry eye syndrome compared with controls (OR=2.54).
  • Higher symptom burden: OSDI scores were significantly elevated in the IBD group, with a weighted mean difference (WMD) of 4.57 points, indicating more pronounced dry eye symptoms.
  • Reduced tear production: The Schirmer I test showed that IBD patients produced less tear volume than controls (WMD −3.63 mm), suggesting impaired lacrimal function.
  • Tear film instability: Tear breakup time (TBUT) was shorter in IBD patients (WMD −3.33 s), reflecting compromised ocular surface stability.
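An odds ratio like the pooled OR = 2.54 compares the odds of DES between the two groups; from a 2×2 table it is (a/b)/(c/d). A minimal helper, with counts that are made up for illustration rather than drawn from the review:

```python
# Odds ratio from a 2x2 table:
#   a = IBD patients with DES,  b = IBD patients without DES
#   c = controls with DES,      d = controls without DES
def odds_ratio(a, b, c, d):
    return (a / b) / (c / d)

# Made-up counts, chosen only to show the calculation
example_or = odds_ratio(200, 800, 90, 910)
```

An OR above 1 means the outcome is more common in the exposed group; here the pooled estimate of 2.54 corresponds to IBD patients having more than double the odds of DES.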
This meta-analysis highlights that DES may be an underrecognized manifestation of IBD, alongside well-known ocular extraintestinal manifestations such as uveitis, scleritis, and episcleritis. While these conditions are often monitored during routine care, dry eye has not traditionally been prioritized, despite its substantial impact on patient comfort and quality of life.
According to the authors, early recognition and treatment of DES in IBD patients could significantly improve daily functioning and overall well-being. They recommend that gastroenterologists and primary care providers incorporate routine ocular assessments into evaluations of IBD patients, particularly for those reporting eye discomfort or visual disturbances. Interventions such as artificial tears, anti-inflammatory eye drops, and lifestyle modifications may alleviate symptoms and prevent long-term ocular damage.
As the prevalence of IBD continues to rise worldwide, this review serves as a reminder that management should extend beyond the gastrointestinal tract. By acknowledging and addressing underappreciated manifestations like dry eye syndrome, clinicians can deliver more holistic care, enhancing both physical comfort and quality of life for patients living with chronic inflammatory disease.
Reference:
Nguyen B, Quon S, Tao BK, et al. Dry eye syndrome: an underappreciated extraintestinal manifestation of inflammatory bowel disease? A systematic review and meta-analysis. Frontline Gastroenterology. Published Online First: 19 September 2025. doi: 10.1136/flgastro-2025-103356

Urinary NGAL useful Biomarker for differentiating SRNS vs SSNS, reports study

A new study published in BMC Nephrology showed that urinary neutrophil gelatinase-associated lipocalin (NGAL) holds great promise as a non-invasive biomarker for distinguishing steroid-resistant nephrotic syndrome (SRNS) from steroid-sensitive nephrotic syndrome (SSNS), supporting early diagnosis, risk assessment, and treatment.

Nephrotic syndrome is a clinical condition characterized by massive proteinuria (more than 40 mg/m² per hour), which causes hypoalbuminemia (less than 30 g/L) and leads to hyperlipidemia, edema, and other complications. Although patient response varies, corticosteroids are usually used as the first line of treatment.

Early in the course of the disease, it’s critical to distinguish between steroid-sensitive nephrotic syndrome (SSNS) and steroid-resistant nephrotic syndrome (SRNS), as the latter is linked to a higher risk of unfavorable long-term outcomes. One possible non-invasive indicator of renal impairment is neutrophil gelatinase-associated lipocalin (NGAL), a biomarker produced in response to tubular injury. Therefore, this systematic review and meta-analysis attempted to establish whether urinary NGAL values differ among patients with SRNS, patients with SSNS, and healthy controls.

Following PRISMA standards, this research performed a systematic review and meta-analysis of papers that reported NGAL levels in SSNS and SRNS. A thorough literature search was carried out using PubMed, Web of Science, Scopus, ScienceDirect, and the WHO Virtual Health Library Regional. A random-effects model was used for the statistical analysis to estimate the standardized mean difference (SMD) with a 95% confidence interval.
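Random-effects pooling of standardized mean differences can be sketched with the DerSimonian-Laird estimator, a common choice for this kind of analysis (whether the authors used this exact estimator is not stated). The per-study SMDs and variances below are invented for illustration, not the 16 included studies:

```python
import math

# Invented per-study SMDs and variances (illustrative; not the actual studies).
smds = [0.3, 0.9, 1.4, 0.7]
variances = [0.04, 0.05, 0.06, 0.05]

def dersimonian_laird(effects, variances):
    """Pool effect sizes with the DerSimonian-Laird random-effects estimator."""
    w = [1 / v for v in variances]                               # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                # between-study variance
    w_re = [1 / (v + tau2) for v in variances]                   # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

pooled_smd, ci = dersimonian_laird(smds, variances)
```

The between-study variance tau² widens the confidence interval when studies disagree, which is why a random-effects model is preferred over a fixed-effect model for heterogeneous cohorts.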

In all, 16 studies were included. Meta-analysis showed that both SSNS and SRNS patients had substantially higher urinary NGAL levels than healthy controls (SMD = 0.78; 95% CI: 0.434–1.128; P < .001 and SMD = 2.56; 95% CI: 1.152–3.971; P < .001, respectively).

Urinary NGAL levels were also significantly higher in SRNS patients than in SSNS patients (SMD = 1.889; 95% CI: 0.819–2.959; P < .001). According to ROC analyses from multiple studies, urinary NGAL has moderate to high discriminative power in differentiating between SRNS and SSNS.

Overall, urinary NGAL showed great promise as a non-invasive, early indicator of steroid resistance in nephrotic syndrome. Its incorporation into clinical evaluation is supported by its relevance across populations, biological plausibility, and diagnostic accuracy. With further research, NGAL may play a crucial role in achieving better outcomes for pediatric patients with nephrotic syndrome, reducing unnecessary steroid exposure, and customizing treatment plans.

Source:

Abdalla, A., Ali, A., Abufatima, I. O., Majzoub, S., Elbasheer, T. A. E., Ahmed, S. M. A. O., Osman, S., Khalid, Y. K. K., Elamir, A., Khalid, H. K. K., Abdelmutalib, N. M. A., & Mohamed, S. O. O. (2025). Use of urinary NGAL in steroid-resistant vs. steroid-sensitive nephrotic syndrome: a systematic review and meta-analysis. BMC Nephrology, 26(1). https://doi.org/10.1186/s12882-025-04420-9

Moderate Alcohol Intake Linked to Higher Risk of Upper Aerodigestive Tract Cancer, Meta-Analysis Finds

France: Even moderate alcohol consumption significantly increases the risk of squamous cell cancers of the upper aerodigestive tract (UADT), a large pooled analysis of 28 prospective cohorts published in the JNCI: Journal of the National Cancer Institute has shown.

The analysis revealed that consuming as little as 5–15 grams of alcohol per day—equivalent to roughly half a standard drink—was associated with a 12% higher risk of UADT cancer compared with very low alcohol intake. Risk estimates rose consistently with each 10-gram increment, reaching 16% in women and 12% in men, highlighting the public health implications of even modest drinking.
The study, conducted by Elmira Ebrahimi and colleagues from the Genomic Epidemiology Branch at the International Agency for Research on Cancer (IARC/WHO), Lyon, France, pooled individual-level data from 2,365,437 participants across multiple geographic regions. Over a median follow-up of 15.5 years, the analysis identified 6,903 cases of UADT squamous cell carcinoma. Researchers used advanced statistical models to account for potential confounders, including age, sex, and smoking status, and examined associations across different types of alcoholic beverages.
Key findings from the meta-analysis include:
  • Risk at low to moderate intake: Alcohol consumption of 5–<15 g/day was associated with a 12% higher risk of UADT cancers (HR 1.12) compared with 0.1–<5 g/day.
  • Dose-response relationship: Each 10-gram daily increment in alcohol intake increased UADT cancer risk by 16% in women and 12% in men.
  • Impact across smoking status: The elevated risk persisted among current smokers (HR 1.14), former smokers (HR 1.10), and never-smokers (HR 1.15).
  • Consistency across beverages: The association between alcohol and UADT cancer was observed regardless of the type of alcoholic drink consumed.
  • Geographic differences: Risk estimates per 10 g/day varied slightly by region, with Europe-Australia showing HR 1.15, Asia 1.13, and North America 1.11.
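Per-increment hazard ratios like these compound multiplicatively under the log-linear dose-response assumption implied by a single per-10 g estimate. A toy calculation (the helper function and intake level are illustrative, not the study's actual model):

```python
# Per-10 g/day hazard ratios reported above (women 1.16, men 1.12).
# Assumes a log-linear dose-response; purely illustrative arithmetic.
def hr_at_intake(hr_per_10g, grams_per_day):
    return hr_per_10g ** (grams_per_day / 10)

women_30g = hr_at_intake(1.16, 30)  # 1.16**3, roughly a 56% higher hazard
men_30g = hr_at_intake(1.12, 30)    # 1.12**3, roughly a 40% higher hazard
```

This is why even modest daily intakes matter at the population level: the relative hazard grows geometrically, not additively, with each increment.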
These findings reinforce that alcohol acts as an independent risk factor for UADT squamous cell carcinoma, regardless of smoking history. While baseline risks differ—with smokers at higher absolute risk—the consistent increase across populations underscores the broader implications of alcohol consumption on cancer burden.
According to the researchers, the study supports ongoing public health strategies aimed at reducing alcohol intake to lower UADT cancer incidence. “Even moderate alcohol consumption, often considered safe, can meaningfully elevate the risk of upper aerodigestive tract cancers,” Ebrahimi and colleagues noted. “Public health interventions should continue emphasizing alcohol reduction, particularly in populations with other risk factors such as tobacco use.”
The study adds to a growing body of evidence linking alcohol with cancer development beyond the liver, highlighting the need for global awareness and policy measures targeting alcohol consumption. By reducing alcohol intake, individuals can take an important step toward lowering their risk of these often-aggressive cancers.
Reference:
Ebrahimi, E., Naudin, S., Dimou, N., Mayén, A., Wang, M., Abnet, C. C., Åkesson, A., Barnett, M. J., Bellocco, R., Bonn, S. E., Chen, C., Christiani, D. C., Crane, T. E., Eliassen, A. H., Freudenheim, J. L., Gao, Y., Gierach, G., Giovannucci, E. L., Gram, I. T., . . . Ferrari, P. Alcohol consumption and upper aerodigestive tract squamous cell carcinoma: Evidence from 28 prospective cohorts. JNCI: Journal of the National Cancer Institute. https://doi.org/10.1093/jnci/djaf230

Statin Use in Early Pregnancy Not Linked to Congenital Malformations, New Study Shows

Norway: Exposure to statins during the first trimester of pregnancy does not appear to increase the risk of congenital malformations, according to a large nationwide study from Norway published in the European Heart Journal. The findings provide reassuring evidence for women who require lipid-lowering therapy and suggest that previous concerns based on animal studies may overestimate the risk in human pregnancies.         

The study, led by Jacob J. Christensen and colleagues from the University of Oslo, leveraged national registry data encompassing more than 800,000 pregnancies between 2005 and 2018. Researchers categorized pregnancies as statin-exposed if a prescription was filled during the first trimester, previously exposed if a prescription was filled in the year before pregnancy but not during the first trimester, or non-exposed if no prescription was filled during either period. 

Key results from the study include:

  • Overall congenital malformations: Rates were similar across exposure groups, with 4.3% in non-exposed pregnancies, 5.9% in statin-discontinuer pregnancies, and 6.7% in first-trimester exposed pregnancies.

  • Major malformations: No significant differences were observed, with adjusted odds ratios (ORs) showing no elevated risk among exposed pregnancies (OR 1.15).

  • Minor malformations: Again, exposure was not associated with increased risk (OR 1.47).

  • Heart malformations: Analyses of statin and other lipid-modifying agents (LMAs) revealed no significant association with cardiac defects (OR 1.22).

To strengthen these findings, the researchers incorporated the Norwegian data into a meta-analysis that included six previous studies. The combined analysis confirmed no increased risk of major congenital abnormalities (adjusted OR 1.06) or heart malformations (1.24), supporting the safety profile of statins during early pregnancy.

Traditionally, statins have been contraindicated during pregnancy due to concerns about teratogenicity observed in animal models. However, human data have been limited, leaving clinicians uncertain about the real-world risks. This study provides substantial evidence that first-trimester exposure does not confer a strong or independent risk of congenital malformations, offering reassurance to both patients and healthcare providers.

Christensen and colleagues emphasized that while the study’s large scale strengthens confidence in the findings, very rare or weak associations cannot be entirely ruled out. “Our results suggest that statin use in early pregnancy is not linked to major or heart malformations, but ongoing monitoring and cautious clinical judgment remain important,” the authors noted.

Overall, the study supports a nuanced approach to lipid-lowering therapy in women of childbearing age, highlighting that necessary treatment should not be automatically withheld due to unfounded teratogenic fears. By combining national registry data with meta-analytic evidence, the research clarifies long-standing questions about the safety of statins in pregnancy and can inform clinical guidelines and patient counseling.

Reference:

Christensen, J. J., Holven, K. B., Bogsrud, M. P., Retterstøl, K., E, J., Michelsen, T. M., Veierød, M. B., & Nordeng, H. Statin use in pregnancy and risk of congenital malformations: A Norwegian nationwide study. European Heart Journal. https://doi.org/10.1093/eurheartj/ehaf592

Hidradenitis Suppurativa Linked to Higher Risk of Peripheral Arterial Disease, Study Finds

Taiwan: Patients with hidradenitis suppurativa (HS) face a significantly higher risk of developing peripheral arterial occlusive disease (PAOD), according to a 15-year follow-up cohort review published in Clinical and Experimental Dermatology.

The analysis, led by Dr. Shiu-Jau Chen from the Department of Neurosurgery, Mackay Memorial Hospital, Taipei, Taiwan, found that HS patients had a 23% higher crude risk of PAOD compared with matched controls, which increased to 48% after adjusting for confounding factors. The risk was particularly pronounced among men and adults aged 65 years or older, highlighting the need for careful cardiovascular evaluation in this patient population.
The review utilized data from the TriNetX research network, matching patients with HS 1:1 to non-HS controls based on demographic and clinical characteristics. The primary aim was to determine the incidence of PAOD over 15 years. Incidence rates were analyzed using hazard ratios (HRs) and Kaplan-Meier survival curves, with sensitivity and stratification analyses conducted to verify the robustness of the findings.
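The Kaplan-Meier curves mentioned here estimate the probability of remaining PAOD-free over time from follow-up data in which some patients are censored (lost to follow-up or still event-free at study end). A minimal product-limit estimator over made-up observations, not TriNetX data:

```python
# Minimal Kaplan-Meier estimator. Each observation is (years, event), where
# event 1 = developed PAOD and 0 = censored. The data here are made up.
data = [(2, 1), (3, 0), (5, 1), (5, 1), (8, 0), (10, 1), (12, 0), (15, 0)]

def kaplan_meier(observations):
    event_times = sorted({t for t, e in observations if e == 1})
    surv, curve = 1.0, []
    for t in event_times:
        d = sum(1 for time, e in observations if time == t and e == 1)  # events at t
        n = sum(1 for time, _ in observations if time >= t)             # at risk at t
        surv *= 1 - d / n   # product-limit update
        curve.append((t, surv))
    return curve

curve = kaplan_meier(data)
```

Censored patients contribute to the at-risk count until they drop out, which is what lets the estimator use incomplete follow-up without biasing the survival probability downward.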
The key findings of the study were as follows:
  • HS patients showed consistently higher rates of PAOD compared to controls.
  • In the unadjusted model, the hazard ratio (HR) for developing PAOD in HS patients was 1.23.
  • After adjusting for potential confounders, the risk remained elevated with an adjusted HR of 1.48.
  • Kaplan-Meier analyses demonstrated a significantly higher cumulative probability of PAOD in HS patients over the follow-up period.
  • Stratified analyses indicated that men with HS had an HR of 1.79.
  • Adults aged 65 years and older with HS had an HR of 1.77.
  • Elevated risk persisted even after accounting for traditional cardiovascular risk factors, suggesting HS independently contributes to the development of arterial disease.
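The percent figures quoted above follow directly from the hazard ratios: an HR of 1.23 corresponds to a 23% higher hazard, 1.48 to 48%, and so on. A minimal illustrative sketch of that conversion (the function name is ours, not from the paper):

```python
# Convert a hazard ratio to the "percent higher risk" phrasing used
# in the article: HR 1.48 -> 48% higher hazard than controls.
def hr_to_percent_increase(hr):
    """Percent increase in hazard implied by a hazard ratio."""
    return (hr - 1.0) * 100.0

for label, hr in [("crude", 1.23), ("adjusted", 1.48),
                  ("men", 1.79), ("age 65+", 1.77)]:
    print(f"{label}: HR {hr} -> {hr_to_percent_increase(hr):.0f}% higher risk")
```

Note that this is a relative measure: it says nothing about absolute PAOD incidence, which the Kaplan-Meier curves in the study capture.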
The findings emphasize the importance of proactive cardiovascular monitoring and management in patients with HS. Clinicians are encouraged to assess for signs of peripheral arterial disease in this population, particularly in older patients and men, and to consider integrated care strategies that address both dermatologic and cardiovascular health.
Dr. Chen and colleagues emphasize the need for further investigations to clarify the biological mechanisms linking HS to PAOD and to explore whether effective management of HS could mitigate the associated vascular risk. This long-term study highlights the broader systemic implications of HS, reinforcing that the condition is not solely a dermatologic concern but also a marker for increased cardiovascular vulnerability.
“Hidradenitis suppurativa significantly raises the risk of developing peripheral arterial occlusive disease, independent of traditional risk factors, and calls for heightened awareness and preventive strategies in clinical practice,” the authors concluded.
Reference:
Chang, H., Lo, S., Li, Y., Chiu, T., Yeh, C., Jhang, Y., Chen, S., & Gau, S. Association between Hidradenitis Suppurativa and Peripheral Arterial Occlusion Disease: A propensity-score-matched cohort study. Clinical and Experimental Dermatology. https://doi.org/10.1093/ced/llaf423

Subthalamic DBS as an Effective Therapy for Parkinson’s Disease: JAMA

A recent trial published in JAMA Neurology found that subthalamic deep brain stimulation (DBS) is an effective therapy for people living with moderate to advanced Parkinson’s disease (PD).

The Implantable Neurostimulator for the Treatment of Parkinson’s Disease (INTREPID) trial tracked outcomes in 313 patients implanted with Boston Scientific’s Vercise DBS system between 2013 and 2022. This study followed participants for 5 years, making it one of the most comprehensive long-term DBS trials to date.

Of the 191 patients who received DBS, 137 completed the 5-year follow-up. At baseline, the participants had significant motor impairment, with average Unified Parkinson’s Disease Rating Scale (UPDRS-III) scores of 42.8 in the “off medication” state. By year one, DBS cut this score nearly in half, down to 21.1, a 51% improvement. While some decline occurred by year 5, the benefit remained strong, with a 36% overall improvement compared with pre-surgery levels.

Patients also began with an average score of 20.6 on a second measure and saw a 41% improvement in year one, which leveled off to a still-significant 22% gain by year five. Dyskinesia, a disabling side effect of long-term medication use, dropped from a score of 4.0 at baseline to 1.0 at year one and stabilized at 1.2 after 5 years, reflecting a 70–75% reduction.
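The percent improvements above are simple relative changes against baseline. A quick check using the scores quoted in the article (the function name is ours, for illustration only):

```python
# Percent improvement relative to baseline, as used for the UPDRS-III
# and dyskinesia figures in the INTREPID results.
def pct_improvement(baseline, follow_up):
    return (baseline - follow_up) / baseline * 100.0

print(f"{pct_improvement(42.8, 21.1):.0f}%")  # year-1 off-medication UPDRS-III
print(f"{pct_improvement(4.0, 1.0):.0f}%")    # year-1 dyskinesia reduction
print(f"{pct_improvement(4.0, 1.2):.0f}%")    # year-5 dyskinesia reduction
```

The 42.8-to-21.1 drop works out to roughly 51%, matching the figure reported above.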

The patients reduced their levodopa equivalent doses by 28% in the first year, with this reduction maintained through the 5-year mark. This highlights the potential of DBS not only to improve symptoms but also to lower dependence on medications that carry side effects. The most frequent serious adverse event was infection, reported in nine participants. Ten deaths occurred during the study, though none were related to DBS or the trial itself.

While Parkinson’s disease is progressive, meaning symptoms inevitably worsen over time, the INTREPID results demonstrate that DBS offers sustained and clinically meaningful relief well beyond the initial years of therapy. Overall, these findings reinforce DBS as a standard treatment option for patients with moderate to advanced PD, particularly those struggling with medication fluctuations and uncontrolled motor symptoms.

Source:

Starr, P. A., Shivacharan, R. S., Goldberg, E., Tröster, A. I., House, P. A., Giroux, M. L., Hebb, A. O., Whiting, D. M., Leichliter, T. A., Ostrem, J. L., Metman, L. V., Sani, S., Karl, J. A., Siddiqui, M. S., Tatter, S. B., Haq, I. ul, Machado, A. G., Gostkowski, M., Tagliati, M., … Nazzarro, J. M. (2025). Five-Year Outcomes from Deep Brain Stimulation of the Subthalamic Nucleus for Parkinson Disease. JAMA Neurology. https://doi.org/10.1001/jamaneurol.2025.3373

Choline Intake Boosts Bone Density in Postmenopausal Women, reveals research

A new study published in Scientific Reports has found that higher dietary choline consumption is linked to greater bone mineral density (BMD) in postmenopausal women, a group highly vulnerable to osteoporosis. Given the accelerated bone loss caused by estrogen deficiency during menopause, identifying modifiable dietary factors such as choline is important for establishing osteoporosis prevention strategies. The study was conducted by Bai and colleagues.

The study involved 4,160 postmenopausal women aged 50 years and above, with no exclusion based on income or ethnicity. Data were drawn from six cycles of NHANES, 2007 through 2018. Researchers used weighted linear regression models to test how total dietary choline intake was associated with lumbar spine bone mineral density. The models were fully adjusted for suspected confounders, such as age, race, income level, body mass index (BMI), and comorbidities. Stratified analyses were performed to determine whether socioeconomic and demographic characteristics altered the detected effects.

Key Findings

  • With every 1 g/day higher choline intake, there was a 0.082 g/cm² rise in lumbar spine BMD (β: 0.082, 95% CI: 0.025–0.139).

  • Participants in the highest quartile of choline consumption (Q4) also had a 0.025 g/cm² greater BMD than those in the lowest quartile (Q1) (β: 0.025, 95% CI: 0.007–0.042).

The association was significantly larger in some subgroups:

  • Obese women had a greater effect size of 0.146 g/cm² (95% CI: 0.067–0.220, P interaction = 0.015).

  • High-income women (PIR > 4) experienced a rise in BMD of 0.121 g/cm² (95% CI: 0.013–0.228, P interaction = 0.003).

  • Non-Hispanic White women had a BMD increase of 0.110 g/cm² (95% CI: 0.034–0.185, P interaction = 0.039).
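The headline coefficient is a linear regression slope: each additional 1 g/day of choline predicts a 0.082 g/cm² higher lumbar spine BMD. A minimal sketch of how that coefficient scales to other intake differences (the 0.3 g/day example is our own illustrative figure, not from the study):

```python
# Predicted difference in lumbar spine BMD (g/cm^2) for a given
# difference in daily choline intake, using the study's beta.
BETA = 0.082  # g/cm^2 per 1 g/day of choline

def predicted_bmd_change(delta_intake_g_per_day):
    return BETA * delta_intake_g_per_day

print(round(predicted_bmd_change(1.0), 3))   # per 1 g/day
print(round(predicted_bmd_change(0.3), 4))   # per 0.3 g/day (illustrative)
```

Since typical choline intakes are well under 1 g/day, the smaller illustrative difference is closer to what real-world dietary shifts might achieve; it lands near the 0.025 g/cm² Q4-versus-Q1 contrast reported above.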

This research provides the first comprehensive evidence that increased dietary choline is positively associated with higher lumbar spine BMD in postmenopausal women, especially in obese women, high-income populations, and non-Hispanic Whites. These results indicate the promise of choline-focused nutritional interventions for the prevention of osteoporosis in this at-risk population. The moderating roles of income and race also highlight the importance of equitable dietary interventions that can address bone health disparities among aging women.

Reference:

Bai, J., Lv, P., Li, L. et al. Association between total dietary choline intake and lumbar spine bone mineral density in postmenopausal women based on NHANES 2007–2018. Sci Rep 15, 23483 (2025). https://doi.org/10.1038/s41598-025-08891-6
