CBT-I an acceptable and efficacious intervention for managing insomnia in chronic disease populations: JAMA Internal Medicine

A new study published in JAMA Internal Medicine revealed that cognitive behavioral therapy for insomnia (CBT-I) is safe and highly effective in improving sleep among individuals living with chronic diseases such as cancer, cardiovascular conditions, chronic pain, and stroke.

Insomnia often compounds physical symptoms and reduces quality of life. Standard treatment guidelines recommend CBT-I as the first-line intervention, but concerns have lingered about its suitability for patients already managing heavy disease burdens.

The review pooled data from 67 randomized clinical trials (RCTs) involving 5,232 participants to evaluate the efficacy, safety, and patient acceptability of CBT-I in chronic disease populations. The included studies covered patients with a wide spectrum of conditions, from irritable bowel syndrome and chronic pain to cancer survivorship and stroke.

Insomnia severity decreased significantly, with a large effect size (g = 0.98). This suggests patients not only reported sleeping better but also experienced meaningful reductions in their insomnia symptoms. Sleep efficiency improved with a moderate effect size (g = 0.77).

Sleep onset latency, or how long it takes to fall asleep, was shortened with a moderate effect size (g = 0.64). The patients with chronic disease who completed CBT-I not only fell asleep faster and stayed asleep longer but also reported markedly better sleep quality overall.
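For context on the effect sizes quoted above, Hedges' g is a standardized mean difference: the between-group difference divided by the pooled standard deviation, with a small-sample correction. A minimal sketch with made-up numbers (not data from the review) shows how a "large" value near 1 arises:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g: standardized mean difference with small-sample correction."""
    # Pooled standard deviation of the treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp              # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)       # small-sample correction factor
    return d * j

# Hypothetical post-treatment insomnia severity scores (lower = less insomnia):
# the magnitude is roughly 1.0, i.e. a "large" effect by convention.
print(abs(hedges_g(mean_t=9.0, mean_c=15.0, sd_t=5.5, sd_c=6.0, n_t=60, n_c=60)))
```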

The review also assessed whether outcomes varied depending on delivery methods or patient characteristics. While results were consistently positive across disease groups, longer treatment durations produced greater improvements in both sleep efficiency and time to fall asleep, suggesting that maintaining therapy over an extended period could maximize benefits.

Dropout rates averaged just 13.3%, suggesting that most participants found the therapy manageable and worthwhile. Moreover, adverse effects linked directly to CBT-I were rare, reassuring clinicians that the therapy poses minimal risk. Overall, these results show that CBT-I is just as effective for people with chronic diseases as it is for the general population struggling with insomnia. This is an important step forward in integrated care, particularly since poor sleep often worsens chronic disease outcomes.

Source:

Scott, A. J., Correa, A. B., Bisby, M. A., Chandra, S. S., Rahimi, M., Christina, S., Heriseanu, A. I., & Dear, B. F. (2025). Cognitive behavioral therapy for insomnia in people with chronic disease: A systematic review and meta-analysis. JAMA Internal Medicine. https://doi.org/10.1001/jamainternmed.2025.4610


Rheumatoid arthritis-associated lung disease significantly increases risk of serious infection: Study

A new study published in the journal Arthritis & Rheumatology revealed that rheumatoid arthritis-associated lung disease (RA-LD), especially RA-associated interstitial lung disease (RA-ILD), is linked to a considerably elevated risk of serious infection across anatomic sites and a variety of pathogen types.

An estimated 1% of people in the US and northern European nations live with rheumatoid arthritis (RA), a systemic inflammatory disease. Clinically, RA may affect virtually any lung compartment: the pleura, where it can cause pleural inflammation and/or effusions; the small and large airways; the pulmonary vasculature; and the parenchyma, where it can present as rheumatoid nodules or ILD. This study therefore examined the relationship between RA-LD and the risk of serious infection.

Using the MGB Biobank (Boston, Massachusetts), researchers performed a retrospective cohort study that matched RA-LD patients to RA patients without lung disease (RA-no LD) by age, sex, and RA duration. RA-LD cases were confirmed by medical record review and chest imaging showing clinically evident RA-associated bronchiectasis (RA-BR) and/or RA-ILD.

Serious infection was the primary outcome. To account for the competing risk of mortality, incidence rates and propensity score-adjusted subdistribution hazard ratios (sdHR) were computed using Fine-Gray models.
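As background, the Fine-Gray approach models the subdistribution hazard of the event of interest, which keeps patients who die of other causes in the risk set rather than censoring them. In general notation (a standard formulation, not taken from this paper):

\[
\lambda^{\mathrm{sd}}(t) = \lim_{\Delta t \to 0} \frac{\Pr\{\, t \le T < t+\Delta t,\ \text{event = serious infection} \mid T \ge t \ \text{or}\ (T < t \ \text{and death occurred first}) \,\}}{\Delta t},
\qquad
\mathrm{sdHR} = \frac{\lambda^{\mathrm{sd}}(t \mid \text{RA-LD})}{\lambda^{\mathrm{sd}}(t \mid \text{RA-no LD})}.
\]

Under the proportional subdistribution hazards assumption, the sdHR is a single number summarizing how much RA-LD shifts the cumulative incidence of serious infection while mortality competes.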

In comparison to 980 RA-no LD comparators, the 221 RA-LD patients (151 RA-ILD and 70 RA-BR) had a substantially increased risk of serious infection (55.8 vs. 25.8 per 1,000 person-years; sdHR 1.60, 95% CI 1.20-2.12). The elevated risk persisted for RA-ILD patients (sdHR 1.79, 95% CI 1.33-2.41) but not for RA-BR cases (sdHR 1.19, 95% CI 0.72-1.97).

RA-LD was linked to a number of pathogen types, including bacteria, fungi, viruses, and mycobacteria. The most frequent anatomic sites of infection in RA-LD were the lungs, the skin and soft tissues, and the ears, nose, and throat. Certain infections, such as influenza virus, Staphylococcus, Pseudomonas, respiratory syncytial virus, and nontuberculous mycobacteria, were more common in RA-LD patients, especially those with RA-BR.

Overall, serious infections involving multiple body sites and a broad spectrum of microorganisms are significantly more likely to occur in patients with rheumatoid arthritis-associated lung disease, particularly interstitial lung disease. RA-associated bronchiectasis (RA-BR), in particular, was linked to an increased risk of lung infections.

Reference:

Zhang, Q., Qi, Y., Wang, X., McDermott, G. C., Chang, S. H., Chaballa, M., Khaychuk, V., Paudel, M. L., & Sparks, J. A. (2025). Risk of serious infection in patients with rheumatoid arthritis-associated interstitial lung disease or bronchiectasis: A comparative cohort study. Arthritis & Rheumatology. https://doi.org/10.1002/art.43338


Severe Emphysema on HRCT Signals Higher Heart Disease Risk in COPD Patients: Study

China: Severe emphysema, quantitatively assessed using high-resolution computed tomography (HRCT), is a strong independent predictor of coronary artery disease (CAD) in patients with chronic obstructive pulmonary disease (COPD), a new retrospective study has found. The study was published online in the International Journal of Chronic Obstructive Pulmonary Disease.

Researchers found that CAD risk more than doubled when the low attenuation area (LAA%) exceeded 16.95%. Furthermore, patients with more severe emphysema exhibited more complex coronary lesions and were more likely to require percutaneous coronary intervention (PCI), highlighting the critical interplay between structural lung changes and cardiovascular risk.
The study was led by Dr. Luoman Su and colleagues from the Department of Pulmonary and Critical Care Medicine, Key Laboratory of Interventional Pulmonology of Zhejiang Province, The First Affiliated Hospital of Wenzhou Medical University, Wenzhou, China. The research team aimed to clarify the role of emphysema—a key structural subtype of COPD—in the development of CAD, a relationship that had remained poorly understood. Using quantitative HRCT, the investigators sought to determine whether emphysema severity could independently stratify cardiovascular risk beyond traditional factors.
The study retrospectively analyzed 392 COPD patients without prior CAD who underwent HRCT between 2015 and 2020. Emphysema extent was measured as the percentage of lung with attenuation below −950 Hounsfield units (LAA%), with severe emphysema defined as LAA% above 16.95%.
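The LAA% metric itself is simple to compute once the lungs are segmented: it is the share of lung voxels with attenuation below −950 HU. A minimal sketch, assuming a CT volume already converted to Hounsfield units and a pre-computed lung mask (illustrative only, not the authors' processing pipeline):

```python
import numpy as np

def laa_percent(hu_volume: np.ndarray, lung_mask: np.ndarray,
                threshold_hu: float = -950.0) -> float:
    """Percentage of lung voxels below the emphysema threshold (LAA%).

    hu_volume : CT volume in Hounsfield units
    lung_mask : boolean array marking segmented lung voxels (segmentation not shown)
    """
    lung_voxels = hu_volume[lung_mask]
    return 100.0 * float(np.mean(lung_voxels < threshold_hu))

# A patient whose LAA% exceeds 16.95% would fall into the study's
# "severe emphysema" group.
```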
Logistic regression and restricted cubic spline analyses revealed the following:
  • Severe emphysema was independently associated with a higher risk of coronary artery disease (adjusted OR 2.28).
  • The predictive model showed strong performance, with an area under the ROC curve of 0.81.
  • Patients with severe emphysema had higher SYNTAX scores, indicating more complex coronary lesions (median 16.29 vs. 10.0 in mild emphysema).
  • Rates of percutaneous coronary intervention (PCI) were significantly higher in patients with severe emphysema (68.2% vs. 33.3%).
“These findings highlight the clinical relevance of emphysema quantification in COPD patients,” Dr. Su and colleagues noted. “Incorporating HRCT-based emphysema severity into cardiovascular risk assessment may enable earlier identification of high-risk individuals, prompting timely evaluation and intervention.”
Despite its insights, the study had several limitations. The retrospective, single-center design limits causal inference and generalizability, while residual confounding from unmeasured factors such as lifestyle and smoking history may persist. Variations in CT imaging protocols and the absence of long-term cardiac event data further constrain the findings. Nonetheless, the observed associations are biologically plausible and complement existing evidence linking emphysema with heightened cardiovascular risk.
“The study demonstrates that quantitatively assessed emphysema on HRCT is an independent predictor of CAD in COPD patients and correlates with more complex coronary lesions. The identified threshold of 16.95% LAA-950 offers a potential imaging biomarker for cardiovascular risk stratification in this population, though prospective multicenter validation is needed before clinical adoption,” the authors wrote.
“These findings highlight the importance of integrating lung structural assessment into comprehensive cardiovascular management strategies for patients with COPD,” they concluded.
Reference:
Su L, Qian C, Yu C, Weng Z, Zhao H, Chen C. Quantitatively Assessed Emphysema Severity on HRCT Independently Predicts Coronary Artery Disease in COPD: A Retrospective Cohort Study. Int J Chron Obstruct Pulmon Dis. 2025;20:3147-3161. https://doi.org/10.2147/COPD.S540503


Fiber-Reinforced Restorations Improve Fracture Resistance in Endodontically Treated Teeth: Study

A recent study published in Clinical Oral Investigations by Lena Bal, Cangül Keskin, Aybuke Karaca Sakallı, Bilge Ozcan, and İsen Guleç Kocyigit investigated the effect of different reinforcement materials on the fracture resistance of mesio-occluso-distal (MOD) cavity restorations in endodontically treated teeth.

The researchers aimed to determine how applying fiber-reinforced composites to the cervical and coronal segments of teeth could impact their durability. This study is particularly relevant because vertical fractures in MOD restorations are among the most common failures observed in endodontically treated teeth, and improving restoration strength is essential for long-term success and preservation of dental function.

The study involved eighty-four freshly extracted human mandibular molars, prepared with standardized MOD cavities and endodontically treated. Teeth were divided into groups based on the combinations of reinforcement materials applied to the cervical and coronal segments, including flowable composite, posterior composite, EverX flow, EverX posterior, and Ribbond.

After thermocycling, fracture resistance was evaluated using a universal testing machine, and failure patterns were examined under a stereomicroscope. The authors reported that fiber-reinforced structures provided superior fracture resistance compared to conventional composites, particularly when applied to the cervical segment. This suggests that strategic placement of reinforcement materials in restorative procedures can enhance structural integrity, helping to reduce the risk of fractures in teeth subjected to functional load.

According to Bal et al., the findings underscore the clinical importance of material selection and placement in endodontic restorations. Fiber-reinforced composites were not only stronger than Ribbond but also packable and suitable for mesio-occluso-distal cavity restorations, making them practical for routine dental practice. The authors conclude that combining different reinforcement materials in the cervical and coronal segments improves fracture resistance and provides a reliable way to enhance tooth durability after endodontic therapy. The study offers valuable guidance for dental professionals seeking evidence-based strategies to strengthen restorations and extend the longevity of endodontically treated teeth.

Reference:
Bal L, Keskin C, Karaca Sakallı A, Özcan B, Koçyiğit İG. Effect of combined use of reinforcement materials on the fracture resistance of MOD cavity restorations in endodontically treated teeth. Clinical Oral Investigations. 2025;29:481. doi:10.1007/s00784-025-06560-6

Keywords: Fiber-reinforced composites, fracture resistance, MOD cavity, endodontically treated teeth, cervical segment, Lena Bal, Clinical Oral Investigations


Delayed Diagnosis of Venous Thromboembolism Linked to Higher 30-Day Mortality, Study Finds

USA: A new study published in JAMA Network Open has revealed that delayed diagnosis of venous thromboembolism (VTE) is linked to a higher risk of death.

The investigation revealed that most VTE cases were diagnosed after more than 24 hours, and in many instances, delays extended beyond 72 hours. These diagnostic lapses were strongly tied to higher 30-day mortality, particularly when pulmonary embolism was missed. The findings highlight the urgent need for improved detection strategies to enhance patient safety.
The research was led by Min-Jeoung Kang from the Department of Medicine, Division of General Internal Medicine, Brigham and Women’s Hospital, Boston, Massachusetts, along with colleagues from Penn State Health. The team analyzed data from 3,525 patients across two major U.S. healthcare systems: Mass General Brigham (MGB) and Penn State Health (PSH). The study evaluated the use of the Delayed Diagnosis of VTE electronic clinical quality measure (DOVE eCQM), an automated tool designed to quantify diagnostic delays and assess their impact on patient outcomes.
For this purpose, retrospective data from electronic health records (EHRs) were assessed, covering 2016–2021 at MGB and 2019–2022 at PSH. The researchers categorized delays using thresholds of more than 24 hours and more than 72 hours. They also investigated the causes of missed opportunities, classifying them as practitioner-, system-, patient-, or other-related factors. Mortality risks were then compared between patients diagnosed promptly (within 24 hours) and those diagnosed late.
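Conceptually, the delay classification reduces to bucketing the interval between a "clock-start" encounter and the confirmed VTE diagnosis. A minimal sketch with the >24-hour and >72-hour thresholds described in the article (the function and field names are illustrative; the DOVE eCQM itself relies on natural language processing of clinical notes):

```python
from datetime import datetime

def categorize_delay(first_presentation: datetime, vte_diagnosis: datetime) -> str:
    """Classify the diagnostic interval using the >24 h and >72 h thresholds."""
    hours = (vte_diagnosis - first_presentation).total_seconds() / 3600
    if hours <= 24:
        return "timely (within 24 h)"
    if hours <= 72:
        return "delayed (>24-72 h)"
    return "delayed (>72 h)"

# Hypothetical example: symptoms documented Monday morning, VTE confirmed
# Thursday morning -> falls into the >72 h category.
print(categorize_delay(datetime(2021, 3, 1, 8, 0), datetime(2021, 3, 4, 10, 0)))
```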
The study revealed the following notable findings:
  • Diagnostic delays were common, with 79.4% of patients at MGB and 82.4% at PSH receiving a VTE diagnosis after 24 hours.
  • Delays exceeding 72 hours occurred in approximately 70% of cases at both centers.
  • Practitioner-related factors were responsible for most missed diagnoses.
  • At MGB, 30-day all-cause mortality rose from 2.5% for timely diagnoses to 8.3% for delayed diagnoses (RR 3.31).
  • At PSH, 30-day mortality increased from 4.6% to 5.9% with delayed diagnosis (RR 1.28).
  • Many deaths occurring within the first 24 hours were associated with missed pulmonary embolism.
The authors emphasized that the nonspecific symptoms of VTE often hinder timely recognition, which underscores the importance of using systematic tools like the DOVE eCQM. This platform, validated across two different EHR systems, proved effective in quantifying delays and identifying their consequences. By enabling continuous monitoring, it could guide quality improvement initiatives at institutional, regional, and even national levels.
While the study demonstrated the potential of DOVE eCQM, the authors noted that some healthcare systems may face challenges in adopting natural language processing–based platforms. Even so, they argued that such digital solutions are critical to reducing diagnostic delays and improving outcomes in outpatient and primary care settings. Future work, they added, will focus on expanding DOVE eCQM to urgent care and emergency departments, as well as developing clinical decision support systems to help clinicians recognize VTE earlier.
“Overall, the study highlights that delayed recognition of VTE is common and deadly. By leveraging electronic tools such as DOVE eCQM, healthcare systems may be better equipped to reduce missed diagnoses, improve the timeliness of care, and ultimately save lives,” the authors concluded.
Reference:
Kang M, Schreiber R, Baris VK, et al. Delayed Venous Thromboembolism Diagnosis and Mortality Risk. JAMA Netw Open. 2025;8(9):e2533928. doi:10.1001/jamanetworkopen.2025.33928


AI-Driven Model Predicts Which Preterm Infants Benefit From Platelet Transfusions: Study

Netherlands: A multicenter study has described the development of a dynamic prediction tool that helps tailor platelet transfusion decisions for preterm infants with severe thrombocytopenia, showing that the potential benefit or harm of prophylactic transfusion can vary widely based on the infant’s real-time clinical status.

Published in JAMA, the research highlights that using this individualized model could guide clinicians in balancing the risk of major bleeding against the possibility of unnecessary transfusions.

Led by Hilde van der Staaij of the Department of Clinical Epidemiology at Leiden University Medical Center, the Netherlands, the investigators addressed a longstanding challenge in neonatal care: identifying which critically ill preterm infants genuinely benefit from early platelet transfusions. Severe thrombocytopenia—defined as a platelet count below 50 × 10⁹/L—is common in extremely premature babies, but routine prophylactic transfusions have uncertain advantages and may introduce new complications.
To create the prediction model, the team analyzed data from an international cohort of 1,042 infants admitted to 14 neonatal intensive care units across the Netherlands, Sweden, and Germany between 2017 and 2021. All infants were born before 34 weeks of gestation and experienced severe thrombocytopenia. The researchers compared two strategies at repeated two-hour intervals during the first week after onset of thrombocytopenia: administering a platelet transfusion within six hours (prophylaxis) versus withholding transfusion for three days (no prophylaxis). The main outcome was the three-day risk of major bleeding or death.
The model incorporated a broad set of predictors, including gestational and postnatal age, growth restriction, presence of necrotizing enterocolitis or sepsis, need for mechanical ventilation or vasoactive medications, platelet count trends, and prior transfusions. This “landmarking” approach, combined with a clone-censor-weight method, allowed for dynamic updates of each infant’s risk profile as their condition evolved.
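One way to write the per-landmark contrast the model targets (a schematic rendering of the description above, not the authors' notation) is, for each landmark time \(s\) during the first week and an infant with clinical history \(H_s\):

\[
\Delta(s, H_s) = \Pr\{\text{major bleeding or death within 3 days} \mid \text{transfusion within 6 h},\ H_s\}
- \Pr\{\text{major bleeding or death within 3 days} \mid \text{no transfusion for 3 days},\ H_s\}.
\]

A negative \(\Delta\) at a given landmark suggests prophylactic transfusion lowers that infant's short-term risk, while a value near zero or positive suggests little benefit or possible harm, which is the heterogeneity the authors describe.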
Key Findings:
  • Validation used a separate national cohort of 637 Dutch infants treated between 2010 and 2014.
  • The median gestational age in this group was 28 weeks.
  • The median birth weight was 900 g.
  • Major bleeding or death occurred in about one in five infants in both the validation and development cohorts.
  • Model performance for the prophylactic transfusion strategy was acceptable, with a time-dependent area under the receiver operating characteristic curve of 0.69.
  • The time-dependent area under the curve was 0.85 for the no-prophylaxis strategy, indicating good discriminatory ability and calibration.
Crucially, the predicted risks varied substantially depending on the infant’s immediate clinical state. Some babies were projected to gain clear protection from early transfusion, while others faced higher odds of harm or no measurable benefit. This heterogeneity underscores that a single platelet threshold is inadequate for guiding transfusion decisions in this vulnerable population.
The authors conclude that their individualized risk algorithm offers a promising step toward more precise, evidence-based management of severe thrombocytopenia in preterm infants. While prospective trials are needed to confirm clinical impact, the tool could soon help neonatologists move away from routine prophylactic transfusions and toward a personalized strategy that optimizes outcomes and minimizes unnecessary exposure to blood products.
Reference:
van der Staaij H, Prosepe I, Caram-Deelder C, et al. Individualized Prediction of Platelet Transfusion Outcomes in Preterm Infants With Severe Thrombocytopenia. JAMA. Published online September 15, 2025. doi:10.1001/jama.2025.14194


NB-UVB Phototherapy Effective Alternative to Cyclosporine for Refractory CSU: Study

A new clinical study suggests that narrowband ultraviolet B (NB-UVB) phototherapy is an effective, well-tolerated treatment for antihistamine-refractory chronic spontaneous urticaria (CSU), offering a viable alternative to oral cyclosporine.

CSU, marked by persistent hives and itching lasting more than 6 weeks, often leaves patients and clinicians struggling to find effective treatment beyond antihistamines. Cyclosporine, though effective, is typically reserved as a third-line option due to concerns about toxicity and rebound flares. The current study highlights NB-UVB phototherapy as a promising contender in this treatment landscape.

This trial enrolled 50 patients whose CSU had failed to respond to maximum-dose antihistamines. The participants were divided into two groups: one received NB-UVB phototherapy three times weekly, while the other was prescribed cyclosporine at 3 mg/kg/day. Both regimens lasted 90 days, with outcomes tracked for another 90 days post-treatment.

This study assessed patients primarily using the Urticaria Activity Score over 7 days (UAS7). Secondary measures included the Urticaria Control Test (UCT), the Chronic Urticaria Quality of Life (CU-QoL) questionnaire, and changes in serum biomarkers such as IL-6, IL-31, and IgE.
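The UAS7 itself is easy to reproduce: each day the patient rates wheal count (0-3) and itch intensity (0-3), the two daily scores are summed, and the seven daily totals are added, giving a range of 0-42. A small illustrative sketch (the example week is hypothetical):

```python
def uas7(daily_wheal_scores, daily_itch_scores):
    """Weekly Urticaria Activity Score: sum of 7 daily (wheal + itch) scores, 0-42."""
    assert len(daily_wheal_scores) == len(daily_itch_scores) == 7
    return sum(w + i for w, i in zip(daily_wheal_scores, daily_itch_scores))

# Hypothetical week for a patient with moderately active CSU -> prints 28
print(uas7(daily_wheal_scores=[2, 2, 3, 1, 2, 2, 1],
           daily_itch_scores=[2, 3, 3, 2, 2, 1, 2]))
```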

Both treatment groups showed significant symptom relief by Day 15. Cyclosporine worked faster, providing rapid reductions in UAS7, but patients frequently experienced rebound flares after discontinuation. NB-UVB, on the other hand, produced more gradual but longer-lasting control, with sustained improvements noted even after therapy ended.

When comparing efficacy, NB-UVB met the non-inferiority criteria, meaning it was not significantly less effective than cyclosporine in reducing UAS7 scores. Quality of life measures (CU-QoL) improved in both groups, while UCT scores confirmed patients reported better control of their condition under either regimen.
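In general terms (the exact non-inferiority margin is not stated in this summary), declaring NB-UVB non-inferior on UAS7 means the confidence interval for the between-group difference in UAS7 reduction must not cross a prespecified margin \(\delta\):

\[
\text{NB-UVB non-inferior} \iff \text{lower CI bound of } \big(\Delta\mathrm{UAS7}_{\text{NB-UVB}} - \Delta\mathrm{UAS7}_{\text{cyclosporine}}\big) > -\delta,
\]

where \(\Delta\mathrm{UAS7}\) denotes the reduction from baseline in each arm.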

In terms of biomarkers, both groups experienced reductions in serum IgE. However, the cyclosporine group showed a more pronounced decrease in the inflammatory markers IL-6 and IL-31, consistent with the drug's systemic immunosuppressive effect. NB-UVB proved to be well tolerated with minimal side effects, reinforcing its suitability for long-term disease management. Cyclosporine, although effective, carried its usual risks, and the flare-ups after discontinuation raised concerns about sustaining control once the drug is withdrawn.

The limitations of this study include its single-center design and relatively short follow-up of 90 days after treatment cessation. Broader, multi-center research with extended monitoring will be essential to validate these findings. Overall, the results place NB-UVB phototherapy firmly on the map as a potential alternative to cyclosporine in antihistamine-refractory CSU.

Source:

Roshini, N., Mehta, H., Bishnoi, A., Kumar, V., Kumar, A., Parsad, D., & Kumaran, M. S. (2025). Narrow band ultraviolet B phototherapy versus oral cyclosporine in the treatment of chronic urticaria. Photodermatology, Photoimmunology & Photomedicine, 41(5). https://doi.org/10.1111/phpp.70050


High Uric Acid at ICU Admission Signals Higher Risk of Kidney Complications and Mortality: Study

An ancillary analysis of the FROG-ICU cohort, published in Anaesthesia Critical Care & Pain Medicine, has found that elevated serum uric acid levels at the time of intensive care unit admission are strongly associated with adverse outcomes, including higher mortality and kidney complications. The study, known as the URIC-ICU analysis, examined whether uric acid could serve as a prognostic marker in critically ill patients. The authors report that patients with high uric acid at admission had worse survival rates at both 90 days and one year, along with an increased risk of acute kidney injury and major adverse kidney events within 30 days.

Importantly, the study revealed that the link between uric acid and poor outcomes persisted even in patients who had normal kidney function at the time of ICU admission. This suggests that uric acid itself may be more than a simple indicator of kidney dysfunction and could play a role in the pathophysiology of critical illness. The researchers highlight that uric acid reflects metabolic stress, oxidative injury, and systemic inflammation, all of which can contribute to organ damage and mortality. By identifying high-risk patients early through a simple blood test, clinicians may have an opportunity to improve monitoring and consider targeted interventions.

The findings support the use of serum uric acid as a potential biomarker for risk stratification in intensive care. While the study underscores the prognostic value of uric acid, the authors also caution that further research is needed to determine whether lowering uric acid levels can directly improve patient outcomes. Nevertheless, the URIC-ICU analysis offers important insights for critical care practice, suggesting that measuring uric acid at admission could help identify patients who require closer surveillance and more aggressive management.

Reference:
Quenot JP, Darreau C, Lasocki S, Biais M, Besch G, Deye N, Laterre PF, Mira JP, Vinsonneau C, Cariou A, Payen D, Annane D, Vignon P, Mebazaa A, Gayat E. Association between serum uric acid level and outcome in intensive care unit, an ancillary analysis of the FROG-ICU cohort (URIC-ICU). Anaesth Crit Care Pain Med. 2025. doi:10.1016/j.accpm.2025.100451

Keywords: Serum uric acid, intensive care unit, acute kidney injury, mortality, prognostic biomarker, FROG-ICU, URIC-ICU, Anaesthesia Critical Care & Pain Medicine


Vitamin D Deficiency Linked to Meniere’s Disease and Hearing Loss Severity: Study

A new study has found that Vitamin D deficiency is associated with Meniere's disease and greater hearing loss severity. The findings highlight the importance of nutritional factors in ear health, particularly the role of Vitamin D in maintaining balance and auditory function. Meniere's disease is a chronic inner ear condition marked by symptoms such as vertigo, tinnitus, and fluctuating hearing loss. While its exact cause has remained unclear, recent research suggests that Vitamin D levels could play a part in both the onset and progression of the condition.

The study, published in Frontiers in Neurology (Kohli et al., 2025), examined serum 25-hydroxyvitamin D concentrations in patients diagnosed with Meniere’s disease. It revealed that individuals with low Vitamin D were more likely to experience more severe hearing loss compared to those with sufficient levels. The researchers emphasized that Vitamin D contributes not only to bone and immune health but also influences the inner ear’s calcium metabolism, which is vital for hearing and balance. This suggests that addressing Vitamin D deficiency may help reduce the burden of symptoms for patients and potentially improve quality of life.

Although the research does not claim a direct cause-and-effect relationship, it highlights a significant association that warrants further clinical attention. For people living with Meniere’s disease, monitoring Vitamin D intake through diet, supplementation, or lifestyle adjustments like safe sun exposure may prove beneficial. Clinicians may also consider evaluating Vitamin D status as part of routine care for patients presenting with inner ear disorders. As the study notes, better understanding the link between nutrition and auditory health could open new avenues for treatment and management strategies. With no definitive cure for Meniere’s disease, these insights add a valuable dimension to ongoing efforts in reducing its impact on patients.

Reference: Kohli, K.K., et al. (2025). Association of low serum 25-hydroxyvitamin D levels with hearing loss severity in Meniere disease: a cross-sectional study. Frontiers in Neurology. https://doi.org/10.3389/fneur.2025.1638357

Keywords: Vitamin D deficiency, Meniere’s disease, hearing loss, auditory health, inner ear disorders, nutritional factors, balance.


UP NEET Counselling 2025 Round 3 schedule OUT, check details

Lucknow: The Director General of Medical Education and Training, Uttar Pradesh (UPDGME), has issued the time schedule for the third round of UP NEET counselling 2025 for admissions to MBBS and BDS courses in the state-run medical and dental colleges for the academic year 2025-26.

All the concerned candidates are advised to take note of the following details as released by the UPDGME.

In accordance with the schedule issued by the Medical Counselling Committee (MCC), New Delhi, for the NEET UG admissions, the schedule for the third round of online counselling for admission to state quota seats in undergraduate courses (MBBS/BDS) in government/ private medical/ dental colleges /institutions/ medical universities is as follows:-

1. Online registration and uploading of documents: 06-10-2025 (11:00 AM) to 09-10-2025 (11:00 AM); 03 days
2. Depositing of registration and security money: 06-10-2025 (11:00 AM) to 09-10-2025 (02:00 PM); 04 days
3. Declaration of merit list: 10-10-2025; 01 day
4. Online choice filling: 11-10-2025 (11:00 AM) to 13-10-2025 (02:00 PM); 03 days
5. Declaration of allotment result: 15-10-2025; 01 day
6. Downloading of allotment letters and admission: 16-10-2025 to 18-10-2025 and 24-10-2025; 04 days

As per the instructions contained in the government order/brochure of UP NEET UG 2025, candidates should assess their eligibility themselves.

Only those candidates will be eligible for choice filling who have completed the online registration process, whose original documents have been verified online, and who have deposited the required security amount.

The UPDGME has further issued a set of guidelines for the candidates.

To view the full official notice, click on the link below: https://medicaldialogues.in/pdf_upload/2377-302838.pdf

