1. Clinical Utility of Zuckerkandl’s Tubercle as a Predictive Landmark for Recurrent Laryngeal Nerve Identification in Thyroid Surgeries: A Prospective Surgical Audit
Pristy Mol Biju, Achsah Jesintha Dhas, Bestine Varghese
Abstract
Introduction: The Zuckerkandl tubercle (ZT) is an anatomical landmark that can be used for intraoperative identification of the recurrent laryngeal nerve (RLN). The ZT is a lateral or posterior projection from the lateral thyroid lobe, and adequate recognition and dissection of the ZT is essential for successful thyroid surgery. Objectives: To estimate the proportion of individuals with a Zuckerkandl tubercle, and to determine the association between identification of the Zuckerkandl tubercle and tracing of the recurrent laryngeal nerve in individuals who underwent thyroidectomy. Methods: A hospital-based cross-sectional study conducted in the Department of General Surgery, Dr. SMCSI Medical College, Karakonam, from November 2020 to October 2022 (2 years), with an 18-month study period from the date of institutional ethics committee clearance. Results: The mean age of the study population was 44.76 ± 6.13 years, with a minimum of 32 and a maximum of 57 years. The majority of the study population, 65.5 percent, were female and 34.5 percent were male. The RLN was found posterior to the ZT in 56.4 percent of individuals and anterior to the ZT in 9.1 percent. The ZT was identified in 38 of the 55 individuals; in 35 of these 38 the RLN was identified with its help, while in the remaining 3 it was not. On assessing ZT size, a ZT smaller than 10 mm was found in 34.5 percent of the study population, a ZT larger than 10 mm in 34.5 percent, and the ZT was not visualized in 30.9 percent. Conclusion: In this study the ZT was visualized during thyroidectomy in 69% of individuals, more often on the right side (38.2%). Among the 38 patients in whom the ZT was visualized, it helped identify the RLN in 35. This study therefore concludes that the ZT is an important pointer for identifying the RLN during thyroidectomy.
2. Anatomical Prevalence of Zuckerkandl’s Tubercle and Its Morphological Variations in South Indian Population Undergoing Thyroidectomy
Pristy Mol Biju, Achsah Jesintha Dhas, Punitha Thetraravu Oli
Abstract
Background/Introduction: The Zuckerkandl tubercle (ZT) is a posterior projection of the lateral thyroid lobe and serves as an important anatomical landmark for identifying the recurrent laryngeal nerve (RLN) during thyroid surgery. Recognition of ZT can reduce the risk of RLN injury, a common complication in thyroidectomies. Given its close anatomical relationship with the tracheoesophageal groove, the study aimed to assess the frequency and size distribution of ZT in individuals undergoing thyroidectomy and its role in aiding RLN identification. Objectives: To estimate the proportion of Zuckerkandl tubercle in patients undergoing thyroidectomy and evaluate its anatomical variations. Materials and Methods: A hospital-based cross-sectional study was conducted over 2 years (November 2020 – October 2022) at the Department of General Surgery, Dr. SMCSI Medical College, Karakonam. Fifty-five patients undergoing thyroidectomy for benign thyroid conditions were included using non-probability sampling. Patients with malignancy, prior neck surgery or radiation, or unfit for surgery were excluded. ZT presence, laterality, size, and its utility in RLN identification were recorded intraoperatively. Data were analyzed using SPSS software with the Chi-square test applied for associations. A p-value <0.05 was considered statistically significant. Results: ZT was identified in 38 out of 55 patients (69.1%), while absent in 30.9%. ZT was found more frequently on the right side (38.2%) than the left (27.3%), and bilaterally in 3.6% of cases. The size distribution among those with ZT showed equal proportions of ZT <10 mm and ZT >10 mm (both 50%). There was a highly significant association between ZT presence and its size (χ² = 55.000, df = 2, p < 0.001). No significant association was found between ZT presence and age (p = 0.219) or gender (p = 0.250). Mean age was 44.76 ± 6.13 years with a female predominance (65.5%).
Conclusion: ZT was observed in nearly 70% of individuals undergoing thyroidectomy and served as a consistent and reliable anatomical marker for RLN identification. The tubercle was more frequently located on the right side and presented either as a small (<10 mm) or large (>10 mm) structure with equal prevalence. Recognizing and preserving this structure is essential to prevent RLN injury during thyroid surgeries.
3. CT Evaluation of Abdominal Tuberculosis Using Neutral or Positive Oral Contrast Agent
Suparna Sahu, Anjali Prakash, Rashmi Dixit, Dipankar Pal
Abstract
Introduction: Abdominal tuberculosis poses a diagnostic challenge due to its nonspecific symptoms and varied imaging features. CT plays a key role in its evaluation, with oral contrast agents influencing image clarity. This study compares the diagnostic utility of neutral versus positive oral contrast in CT imaging of abdominal TB. Aims: The main purpose of our study was to evaluate intestinal and extra-intestinal findings in patients with abdominal tuberculosis on CT using positive or neutral oral contrast. Materials and Methods: This cross-sectional analytic study was conducted in the Department of Radio-diagnosis, Maulana Azad Medical College and associated Lok Nayak Hospital, New Delhi, over a period of one year. A total of 40 patients diagnosed with abdominal tuberculosis were included in the study. Results: The neutral oral contrast group showed greater bowel distension across all segments (e.g., jejunum: 2.10 ± 0.52 cm vs. 1.70 ± 0.42 cm) and better fold visibility (Grade II in 65% vs. 45%). Appreciable mural enhancement was more frequent with neutral contrast (80% vs. 20%). Among 31 patients with lymphadenopathy, 54.83% had multi-compartment involvement, with mesenteric nodes most common (96.7%) and homogenous enhancement seen in 77.41% of cases. Conclusion: Neutral oral contrast agents showed better bowel distension, fold visibility, and mural enhancement than positive contrast on CT. These findings improve small bowel assessment and diagnostic accuracy. Mesenteric lymphadenopathy was the most common, often with multi-compartment involvement. Overall, neutral contrast enhances CT diagnostic yield in abdominal evaluations.
4. A Comparative Study of Surgical Site Infections in Elective and Emergency Caesarean Surgeries
A. Sai Charitha, Jyosna Devi Rentapalli, K. Bhavani
Abstract
Background: Surgical Site Infections (SSIs) are a leading postoperative complication, particularly after cesarean sections (C-sections), impacting patient outcomes and healthcare resources. The risk of SSIs is significantly higher in emergency procedures compared to elective surgeries, due to multiple modifiable and non-modifiable factors. Aim of the study: To study and compare surgical site infections in emergency and elective Caesarean surgeries. Methodology: This prospective comparative study was conducted on 560 pregnant women undergoing either elective or emergency lower segment cesarean sections (LSCS) at the Government Maternity Hospital, Tirupati over a one-year period. Participants were assessed preoperatively, intraoperatively, and postoperatively for demographic characteristics, clinical risk factors, and signs of wound infection. Data were analyzed using SPSS v24, with chi-square and t-tests employed for statistical significance. Results: SSIs were observed in 8.2% (n=23) of emergency LSCS cases versus 1.07% (n=3) of elective LSCS cases. Emergency procedures showed higher association with risk factors like postoperative anemia (39.13%) and obesity (34.78%). Klebsiella spp. and Staphylococcus aureus were the predominant pathogens isolated in emergency and elective groups, respectively. Wound gaping and need for resuturing were significantly higher in the emergency group. Conclusion: Emergency LSCS is significantly associated with a higher incidence of SSIs compared to elective procedures. Identifying key risk factors like obesity, anemia, and hypothyroidism, along with targeted antibiotic therapy, can help reduce postoperative infections. Preoperative optimization and standardized infection control protocols are crucial to improving maternal outcomes.
5. Effectiveness of Incorporating GBL (Game-Based Learning) in TL (Traditional Lectures) for Phase 3 Part I MBBS Students in Paediatrics
Loveleen Kaur, Kunal Choudhary, Sanjeev Kumar Tiwari, Rajarshi Gupta
Abstract
Background: Medical education is increasingly adopting student-centered approaches, with Game-Based Learning (GBL) emerging as a cutting-edge method to boost engagement, motivation, and knowledge retention. In the past, teaching and learning methods predominantly emphasized knowledge acquisition over immersive educational experiences. Today, there is a growing embrace of playful approaches, with game-based learning seamlessly integrating engagement and education. We implemented game-based applications and traditional learning methods using multiple-choice questions (MCQs) to evaluate their effectiveness in creating an engaging and productive learning experience for medical students. Methods: An interventional study was conducted among 60 Phase 3 Part 1 MBBS students at a Medical College and tertiary care Hospital in Eastern India. A crossover design was used, in which two groups of 30 students each experienced both TL and GBL for different topics. Pre- and post-tests were conducted, and student satisfaction was assessed using a Likert scale. Statistical analysis was done using paired and unpaired t-tests and the Wilcoxon Signed-Ranks Test; a p value < 0.05 was considered significant. Results: In our study, post-test scores showed significant improvement within both groups (p < 0.05). The GBL group demonstrated a greater increase in scores compared to the TL group, with a statistically significant difference (p < 0.01). Student satisfaction ratings were notably higher for GBL, indicating a strong preference for this approach (p = 0.0001). Conclusion: GBL is an effective addition to traditional lectures, enhancing knowledge retention and learner satisfaction. While the study supports the use of GBL in medical education, larger studies are needed for broader validation.
6. Study of Plasma Homocysteine Levels in Subjects with Cerebral Infarct and Myocardial Infarction
A. Joseph Panneer Selvam, A. Ganesh Raja, P.I. Sajith Ali
Abstract
Background: Numerous cross-sectional and retrospective case-control studies have linked elevated total homocysteine levels to peripheral, cerebral, and coronary vascular disease. These investigations have also identified homocysteine as a risk factor independent of the traditional risk factors. Because they might be influenced by a variety of factors, epidemiological findings suggesting a correlation between high homocysteine (HCY) levels and cardiovascular risk do not establish a causative relationship. Moreover, other clinical studies revealed that vitamin supplementation had no discernible impact on cardiovascular risk, despite lowering HCY levels. Hence, our study aimed to evaluate the association between homocysteine and coronary and cerebral vascular disease in the absence of other risk factors such as hyperlipidemia, diabetes mellitus, hypertension, smoking and old age. Methods: This was a case-control study involving 100 subjects of either sex aged between 13 and 40 years: 50 cases with myocardial or cerebral infarct and 50 controls. Plasma total homocysteine was determined by high-performance liquid chromatography (HPLC). Results: The mean homocysteine level was 19.36 ± 8.06 µmol/L among cases, while among controls it was 13.88 ± 4.69 µmol/L. Hyperhomocysteinemia was seen in a higher percentage of cases (58%) compared to controls (38%). The cases had a 1.50-fold higher risk of MI or stroke than controls (relative risk ratio), with an odds ratio of 2.25. The percentage of hyperhomocysteinemia was 36% in the non-vegetarian group and 62% in the vegetarian group. The mean homocysteine was 18.92 µmol/L in stroke and 19.56 µmol/L in MI. Conclusion: Hyperhomocysteinemia is an independent risk factor for coronary artery disease and cerebrovascular disease; in the present study, the cases had a 1.50-fold higher risk of MI or stroke than the controls.
Screening for hyperhomocysteinemia is strongly recommended, especially among young patients with arterial occlusive disease or venous thrombosis without other risk factors.
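As an arithmetic check, the abstract's effect sizes can be reproduced from its percentages. A minimal Python sketch, assuming counts back-calculated from 58% of 50 cases and 38% of 50 controls with hyperhomocysteinemia (the abstract does not state the raw 2×2 table, so these counts are an assumption):

```python
# Counts assumed from the abstract's percentages, not stated directly:
# 58% of 50 cases and 38% of 50 controls had hyperhomocysteinemia.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: exposed cases (a), exposed controls (b),
    unexposed cases (c), unexposed controls (d)."""
    return (a * d) / (b * c)

def exposure_ratio(a, b, c, d):
    """Ratio of exposure proportions among cases vs controls."""
    return (a / (a + c)) / (b / (b + d))

a, b = 29, 19   # hyperhomocysteinemia: 58% of 50 cases, 38% of 50 controls
c, d = 21, 31   # remaining cases / controls

print(round(odds_ratio(a, b, c, d), 2))     # 2.25 — matches the reported odds ratio
print(round(exposure_ratio(a, b, c, d), 2)) # 1.53 — close to the reported 1.50-fold risk
```

The assumed counts reproduce the reported odds ratio exactly, which supports the back-calculation.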
7. An Observational Study of Association of Substance Abuse and Psychiatric Disorders in Alleged Offenders Brought to S.M.S. Medical College, Jaipur
Vikas Soral, Mahesh Soni, Deepali Pathak, D. K. Sharma, Surya Bhan Kushwaha
Abstract
Background: Criminal behaviour often stems from poor social upbringing, disturbed family dynamics, substance dependence, and undiagnosed psychiatric conditions. The overlap between mental illness and substance abuse among offenders poses a major concern for both healthcare systems and the legal framework. Aim: To study the association between substance abuse and psychiatric disorders in relation to criminal behaviour among alleged offenders. Methods: This prospective, descriptive observational study was conducted in the Department of Forensic Medicine and Toxicology at SMS Medical College, Jaipur, from August 2019 to November 2020. A total of 250 alleged offenders were selected based on informed consent and eligibility for medicolegal evaluation. Information was gathered on their demographic profile, criminal background, substance use habits, family circumstances, and mental health status. Results: Out of 980 individuals examined, 250 (25.5%) met the inclusion criteria. Most participants were male (98.4%) and within the 18–30 year age group (64.8%). Substance use was observed in 45.2% of cases, with alcohol (60.2%) and smack (31.9%) being the most used substances. Strong links were found between substance use and factors such as low educational attainment, exposure to family conflict or abuse, and involvement in property-related crimes. Although only 4% of offenders displayed identifiable mental disorders, those with psychiatric issues had a higher incidence of childhood trauma and substance dependence. A history of repeat offenses was strongly associated with long-term addiction (77.4%). Conclusion: This study reveals a clear connection between substance abuse and criminal behaviour, especially when combined with unstable family and social conditions. These findings highlight the urgent need for prison-based rehabilitation, mental health assessment, and early family-focused interventions to reduce repeat offending and support offender recovery.
8. To Study Spectrum of Cervical Cytology in Conventional Pap Smear by Bethesda System 2014 at Tertiary Care Centre Bastar C.G.
Kalpana Nayak, Deepika Dhruw, Sakshi Dubey, K.L. Azad
Abstract
Introduction: Cervical cancer is one of the leading causes of morbidity and mortality among women worldwide. It is preventable through early detection and screening of precursor lesions by Papanicolaou (Pap) smear. Aim: To study the spectrum of cervical cytology in conventional Pap smears by applying the Bethesda System 2014 at a tertiary care centre in Bastar, C.G. Material and Methods: This retrospective study was conducted by the Department of Pathology at Lt. Baliram Kashyap Memorial Government Medical College, Jagdalpur, over a period of 2 years. A total of 1046 cases were included, and slides were reported according to the Bethesda System 2014. Results: Out of 1046 cases, the maximum number were in the 31–40 year age group, comprising 381 (36.42%), followed by 323 (30.87%) cases in the 41–50 year group. The highest number of cases, 370 (35.37%), belonged to NILM, followed by ASC-US with 92 (8.79%) and LSIL with 16 (1.52%); HSIL, SCC and AGC (NOS) were found in 8 (0.8%), 4 (0.38%) and 4 (0.38%) cases respectively. The remaining 16 cases (1.52%) were unsatisfactory. Conclusion: Cervical Pap smears reported by the Bethesda System 2014 help to categorise lesions as infective, inflammatory or neoplastic and guide appropriate treatment by the clinician, especially in rural areas.
9. CT-Based Morphometric Review of Cervical Transverse Foramina in Indian Adults: A Narrative Synthesis and Clinical Implications
Harshul Singh
Abstract
Background & Objectives: Detailed knowledge of the cervical transverse foramina (TF) morphology is crucial for surgical procedures and interventions involving the cervical spine. This review synthesizes computed tomography (CT)-based morphometric data on TFs in Indian adults, emphasizing anatomical variation and potential clinical implications. Methods: A narrative review was performed using data from peer-reviewed literature (published between 2020 and 2024) reporting CT-based morphometric analysis of TFs in Indian adults. Key morphometric parameters such as mean diameter, shape, symmetry, and presence of accessory foramina were extracted and analyzed descriptively. Results: Fifteen studies met the inclusion criteria. The mean TF diameter progressively decreased from C1 (6.3 mm) to C7 (4.1 mm). Oval-shaped foramina predominated in the upper cervical vertebrae, while irregular shapes and accessory foramina increased caudally. Asymmetry was reported in 18.3% of cases, with right-sided dominance. Interpretation & Conclusions: Cervical transverse foramina show considerable anatomical variation in the Indian population. Preoperative CT evaluation is recommended to minimize vertebral artery injury risk during cervical interventions. The study highlights the importance of population-specific anatomical databases.
10. Assessment of Determinants of Undernutrition in Children Aged One to Five Years: A Hospital-Based Study
Urja Dipakbhai Ladani, Khush Jitendrabhai Viramgama, Drashti Chandrakantbhai Patel
Abstract
Background: Undernutrition remains a major public health challenge, particularly among children under five years of age in low- and middle-income countries like India. Despite numerous government programs, a significant proportion of children continue to suffer from stunting, wasting, and underweight due to a complex interplay of dietary, social, and environmental risk factors. Understanding these determinants at the institutional level is crucial for designing targeted interventions to reduce childhood malnutrition. Material and Methods: A hospital-based cross-sectional observational study was conducted over a period of one year in the Pediatric Outpatient Department of a tertiary care teaching hospital. A total of 500 children aged 1 to 5 years were enrolled. Data regarding sociodemographic profile, parental education, socioeconomic status, feeding practices, and environmental conditions were collected through a semi-structured, pre-tested questionnaire. Anthropometric measurements were taken using standard protocols, and nutritional status was assessed using WHO Child Growth Standards (2006). Z-scores were calculated to classify children as underweight, stunted, or wasted. Statistical analysis was performed using Chi-square test, and a p-value < 0.05 was considered significant. Results: Out of the 500 children, 52% were males and 48% females. The majority (36%) belonged to the 1–2 year age group, and 62% were from low-income households. The prevalence of underweight, stunting, and wasting was 34.2%, 41.8%, and 17.6%, respectively. Significant associations were observed between undernutrition and factors such as low maternal and paternal education (p < 0.001), poor socioeconomic status (p < 0.001), and paternal alcohol use (p < 0.001). Exclusive breastfeeding showed a protective trend, though not statistically significant (p = 0.085). Overcrowding and poor environmental conditions were also linked to increased risk of undernutrition. 
Conclusion: The study highlights the multifactorial etiology of undernutrition among under-five children, emphasizing the critical role of parental education, household income, and behavioral factors. Addressing these determinants through community-based education, nutritional counseling, and socioeconomic development programs is essential for improving child health outcomes.
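The WHO z-score classification used in the methods above can be sketched in a few lines. This is an illustrative Python sketch only: the −2 SD cutoff is the standard WHO definition, but the reference median and SD values shown are hypothetical placeholders, not figures from the WHO 2006 growth tables:

```python
def z_score(measured, ref_median, ref_sd):
    """Z-score of a child's measurement against a reference median and SD."""
    return (measured - ref_median) / ref_sd

def classify(waz, haz, whz, cutoff=-2.0):
    """WHO-style flags: underweight (weight-for-age), stunted (height-for-age),
    wasted (weight-for-height); each is flagged below -2 SD."""
    return {
        "underweight": waz < cutoff,
        "stunted": haz < cutoff,
        "wasted": whz < cutoff,
    }

# Hypothetical example: weight 9.0 kg against an assumed reference
# median of 11.0 kg and SD of 1.2 kg (placeholder values).
waz = z_score(9.0, 11.0, 1.2)   # ≈ -1.67, above the -2 cutoff
print(classify(waz, -2.4, -1.1))
# {'underweight': False, 'stunted': True, 'wasted': False}
```

In practice, the reference medians and SDs are looked up per age and sex from the WHO 2006 standards rather than hard-coded.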
11. Right Ventricular Infarction Complicating Inferior Wall Myocardial Infarction and its In-Hospital Adverse Outcome: An Observational Study
Gourab Das, Rajesh Kishore Debbarma, Suman Raul, Abhishek Bhattacharjee, Manodip Mandal
Abstract
Introduction: Inferior wall myocardial infarction (IWMI) is frequently caused by right coronary artery occlusion and can often also involve the right ventricle. Right ventricular infarction (RVI) occurs in a significant proportion of these cases and is associated with worse clinical outcomes such as hemodynamic instability, arrhythmias, and an increased risk of in-hospital complications such as cardiogenic shock and death. Despite its impact, RVI is often underdiagnosed in routine clinical practice; early recognition is essential for timely intervention and improved outcomes. This observational study aims to assess the incidence of RVI in patients with IWMI and evaluate its association with in-hospital adverse events. Aims: To assess RV infarction in patients with acute inferior wall myocardial infarction and its correlation with in-hospital outcome. Materials & Methods: This was a cross-sectional descriptive study conducted in the Department of Medicine, Agartala Government Medical College and GB Pant Hospital, completed within one and a half years: one year for data collection (2023–2024) and 6 months for data management. The total sample comprised 110 patients with acute inferior wall myocardial infarction. Results: Of the 110 patients with acute IWMI, 53 (48.2%) had right ventricular myocardial infarction (RVMI). Arrhythmias were significantly more common in patients without RVMI (26.3%) than in those with RVMI (3.8%). Mortality was higher in the RVMI group (11.3%) than in the non-RVMI group (7.0%). Heart failure occurred more frequently in patients with RVMI (11.3%) than in those without (3.5%). Hypotension or cardiogenic shock was also more prevalent among patients with RVMI (30.2%) than among non-RVMI patients (19.3%). Survival without complications was slightly lower in the RVMI group (43.4%) than in the non-RVMI group (43.9%).
Statistical analysis using the chi-square test showed a significant association between RVMI and in-hospital outcomes, with a chi-square value of 13.2225 and a p-value of 0.0102 (p < 0.05). Conclusion: Of the 110 patients with inferior wall myocardial infarction evaluated in this observational study, 53 had concomitant right ventricular involvement (RVMI), which was associated with adverse in-hospital outcomes.
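The reported chi-square statistic and p-value are mutually consistent if the test had 4 degrees of freedom (e.g., a 2 × 5 table of RVMI status against the five in-hospital outcome categories); the abstract does not state the degrees of freedom, so this is an assumption. A short Python check, using the closed-form chi-square upper tail that exists for even degrees of freedom:

```python
import math

def chi2_sf_even_df(x, df):
    """Chi-square upper-tail probability P(X >= x) for even df,
    using the closed form exp(-x/2) * sum_{i<df/2} (x/2)^i / i!."""
    assert df % 2 == 0 and df > 0
    h = x / 2.0
    return math.exp(-h) * sum(h**i / math.factorial(i) for i in range(df // 2))

p = chi2_sf_even_df(13.2225, 4)  # df = 4 is an assumption (2 x 5 table)
print(round(p, 4))  # 0.0102 — consistent with the reported p-value
```

That the assumed df reproduces the stated p-value to four decimals supports this reading of the analysis.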
12. Sexual Dysfunction in Male Schizophrenic Patients Treated with Risperidone versus Olanzapine in Bhopal Population
Mayur Prakash Shinde, Ravindra Bhumanna Narod, Priya Kaurwad
Abstract
Background: It is established that antipsychotic drug administration elevates prolactin secretion, which suppresses LH and testosterone in males. Hence, two newer drugs with different receptor-affinity profiles were administered to evaluate sexual dysfunction in males. Method: Out of 90 patients, 45 were administered risperidone and the remaining 45 olanzapine. Sexual dysfunction was assessed with the Arizona Sexual Experience Scale (ASEX), a simple rating scale that measures sex drive, arousal, penile erection, ability to reach orgasm, and orgasm satisfaction; total scores range from 5 to 30. Sexual dysfunction was defined as a total score ≥ 19 or a score > 5 on any single item. Results: Comparisons of overall sexual dysfunction, the ASEX "getting and keeping an erection" item, and the ASEX "satisfaction with orgasm" item showed significant differences (p < 0.001). Conclusion: This comparative study shows that risperidone is associated with a higher rate of sexual dysfunction than olanzapine in male schizophrenic patients.
13. Effect of Cardiac Autonomic Function in COPD – Insight from a Cross-Sectional Study in Upper Assam
Priyanka S., Tazkira Begum, Subhalakshmi Das, Abanti Bora Baruah, Rituparna Bora
Abstract
Background: Chronic obstructive pulmonary disease (COPD) is a progressive respiratory disease that induces cardiac autonomic dysfunction and adversely impacts the autonomic nervous system; about 50% of COPD mortality is associated with cardiovascular disease (CVD). COPD severity can be graded by the forced expiratory volume in 1 s (FEV1), while heart rate variability (HRV) is employed for evaluating cardiac autonomic function. Aims and Objectives: To evaluate HRV parameters, including mean HR, NN50, root mean square of successive differences (RMSSD) and pNN50, across the various stages of COPD, and to observe whether these parameters are associated with disease severity. Materials and Methods: A cross-sectional study including 140 COPD patients was conducted. Pulmonary function tests, HRV values, and anthropometric parameters were evaluated, and patients were divided into 4 categories as per the Global Initiative for Chronic Obstructive Lung Disease (GOLD) staging criteria. Analysis of variance (ANOVA) was employed for comparing the means ± standard deviations (SD) of continuous measurements across the 4 groups, with p < 0.05 considered significant. Results: RMSSD levels were lower in very severe, severe, and moderate COPD patients than in mild COPD patients, while mean HR was higher in patients with very severe, severe, and moderate COPD than in those with mild COPD. Mean HR correlated positively with disease severity, whereas RMSSD correlated negatively. Conclusion: We have demonstrated that COPD patients experience cardiac autonomic dysfunction, manifesting as elevated sympathetic and decreased parasympathetic activity; this correlation became more evident as disease severity increased.
14. Neutrophil-Lymphocyte Ratio in Pregnancy Induced Hypertension- A Comparative Study
Popcee Gogoi, Abanti Bora Baruah, Tazkira Begum, Mondita Borgohain, Farzana Zahir
Abstract
Introduction: Hypertension is one of the common medical complications of pregnancy that contributes significantly to maternal and perinatal morbidity and mortality. The neutrophil-to-lymphocyte ratio (NLR) is a marker of subclinical inflammation, used to predict hypertension incidence, especially preeclampsia. Objective: To compare the neutrophil-lymphocyte ratio between subjects with pregnancy-induced hypertension and normotensive pregnant subjects. Materials & Methods: This cross-sectional comparative study was carried out over a period of one year after informed consent and ethical clearance. The study population included 260 antenatal patients of 29–40 weeks of gestation (130 pregnant women with pregnancy-induced hypertension (PIH) and 130 normotensive pregnant women) who were admitted in the Department of Obstetrics & Gynaecology, Assam Medical College & Hospital, Dibrugarh, after fulfilment of the inclusion and exclusion criteria. The primary outcome was the neutrophil-lymphocyte ratio (NLR). Results: NLR was significantly higher in pregnant women with PIH (3.58 ± 0.79) than in normotensive pregnant women (2.38 ± 0.58) (p < 0.001). Conclusion: The study showed that the mean NLR value of pregnant subjects with PIH was noticeably higher than that of normotensive pregnant subjects, suggesting an increased inflammatory response in hypertensive disorders of pregnancy. Unlike many other inflammatory markers, NLR is an inexpensive and readily available biomarker, obtained from complete blood counts, that may be useful for the prediction and diagnosis of pre-eclampsia.
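Since the NLR is derived directly from the complete blood count, its calculation is a single ratio of absolute counts. A minimal Python sketch; the counts below are hypothetical illustrations, not patient data from this study:

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute counts (cells/µL)."""
    return neutrophils / lymphocytes

# Hypothetical differential counts, for illustration only:
# 7200 neutrophils/µL and 2000 lymphocytes/µL.
print(round(nlr(7200, 2000), 2))  # 3.6 — in the range of the PIH group mean (3.58)
```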
15. GNB Uropathogens and their Antibiotic Resistance Trends: A Report in Eastern India
Somosree Ghosh
Abstract
Background: Urinary tract infections (UTI) are among the commonest infections in India and worldwide, leading to increased mortality and morbidity. The prevalence of bacterial agents causing UTI and their antibiograms vary across geographical locations. Culture and sensitivity testing remains the gold standard for diagnosis of UTI, and it is essential to know the local prevalence and antibiogram of uropathogens before starting empirical therapy, pending reports. This study was undertaken to study the etiological agents of UTI and their antibiograms in a tertiary care hospital in Eastern India. Methods: Over a period of one year, urine samples from 5512 admitted and outpatients of KPC Medical College and Hospital, Kolkata were sent to the Microbiology Department: freshly passed mid-stream urine collected in a sterile container from non-catheterized patients, and samples from the aseptic catheter port of catheterized patients. The samples were plated on MacConkey and UTI agar and incubated at 37 °C overnight, then processed the next day in the Vitek automated system for identification and antibiotic susceptibility testing. Results were analysed using appropriate software. Results: Of the 5512 urine samples, the prevalence of UTI was 8.85% (488). Females (60.86%) were more affected than males (39.14%). The age group most susceptible to UTI was elderly individuals above 60 years (61.06%), followed by 46–60 years (13.93%). Of the 488 positive urine cultures, 447 (91.60%) had a Gram-negative etiology, 20 (4.09%) yielded growth of Gram-positive cocci, and 21 (4.31%) grew Candida species. The most common pathogen isolated was Escherichia coli (45.90%), followed by Klebsiella spp. (34.50%). These were sensitive to fosfomycin (88.98%), nitrofurantoin (79.86%), tetracycline (77.85%), cotrimoxazole (75.61%), aminoglycosides (74.94%) and carbapenems (70.25%).
Conclusion: The study helps guide empirical therapy when UTI is suspected; fosfomycin, nitrofurantoin, cotrimoxazole and aminoglycosides are better choices for treating UTI empirically. Hygiene, health education and good infection control practices can help decrease urinary infection rates.
16. Comparative Study of the Accuracy of Preoperative Investigations Such as USG and FNAC with Postoperative Histopathological Findings in Diagnosing the Spectrum of Thyroid Disorders at a Tertiary Care Centre in Idukki
Lillykutty Joseph, Raman M.R., Anilkumar V., Vandana, Haseena R.
Abstract
Introduction: The thyroid gland is the first endocrine gland to develop in the human embryo, beginning its formation by the third week of gestation as a thickening in the floor of the primitive pharynx between the first and second pharyngeal pouches. This thickening gives rise to a diverticulum that migrates caudally in front of the pharyngeal gut while remaining temporarily connected to the tongue by the thyroglossal duct. Aims: To compare the diagnostic accuracy of sonography, fine needle aspiration cytology (FNAC), and histopathological examination in evaluating thyroid disorders, and to assess the correlation of clinical, radiological, and cytological findings with final histopathological outcomes in patients with thyroid swellings. Materials & Methods: This was a prospective observational study conducted over 1 year in the Department of General Surgery, Al Azhar Medical College & Super Speciality Hospital; a total of 107 patients were included. Results: Of the 107 patients, the majority (74 patients, 69.2%) had no comorbidities. Among those with comorbidities, diabetes mellitus (DM) was the most common, seen in 8 patients (7.5%), followed by hypertension (HTN) in 7 patients (6.5%); some patients had combinations of conditions such as DM with HTN, COPD, or dyslipidemia (DLP). Statistical analysis showed a highly significant association (P < 0.00001). The majority of patients were euthyroid (74 patients, 69.2%), i.e., had normal thyroid function; hyperthyroidism was found in 21 patients (19.6%) and hypothyroidism in 12 patients (11.2%), with P < 0.00001, indicating a statistically significant difference. Conclusion: This study, carried out at a tertiary care facility in Idukki, revealed a predominance of benign cytology (Bethesda II) and euthyroid status among patients. A considerable number were diagnosed with multinodular goitre, and some also had coexisting hypertension and diabetes mellitus.
17. Comparison of Conventional v/s High-Sensitivity Troponin Assays in Early Diagnosis of Acute Coronary Syndrome
Thomas Mathew, Ebenezer Yohannan
Abstract
Background: Acute Coronary Syndrome (ACS) is one of the most common causes of morbidity and death globally, and early diagnosis is critical to intervene in time and improve outcomes. The measurement of cardiac troponin (cTn) is a gold standard biomarker for the diagnosis of myocardial damage. Conventional troponin assays (cTn) have been used for decades; however, their lower sensitivity during the first few hours after the onset of symptoms can hamper diagnosis. High-sensitivity troponin assays (hs-cTn) were created to detect trace levels of circulating troponin, allowing for earlier diagnosis of myocardial necrosis. Objective: The objective of this study was to compare the diagnostic performance, time to diagnosis, and clinical utility of conventional vs. high-sensitivity troponin assays in the early diagnosis of ACS in patients with chest pain. Methods: In this prospective comparative study conducted at a tertiary care cardiac center, adult patients presenting to the emergency department with suspected ACS were enrolled within 6 hours of symptom onset. Blood samples were obtained at baseline and at 1, 3, and 6 hours for both conventional cTnI assays and hs-cTnI assays. The primary outcomes were sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the receiver operating characteristic curve (AUC) for ACS diagnosis. Secondary outcomes included the proportion of patients diagnosed within 3 hours and the impact on subsequent clinical decision-making. Results: A total of 420 patients (mean age: 57.6 ± 11.2 years; 68% male) were included. At presentation, hs-cTn detected elevated troponin in 71.4% of confirmed ACS cases compared to 42.9% with conventional assays (p < 0.001). The sensitivity of hs-cTn at baseline was 92.8% versus 68.3% for conventional cTn, while specificity remained comparable (hs-cTn: 94.1%, conventional: 95.0%). AUC for hs-cTn was significantly higher (0.964) compared to conventional cTn (0.835).
Early diagnosis within 3 hours was achieved in 88.6% of ACS patients using hs-cTn compared to 61.2% with conventional assays, reducing the median time to definitive diagnosis by 1.8 hours. Earlier diagnosis led to faster initiation of guideline-directed medical therapy and facilitated timely reperfusion interventions. Conclusion: High-sensitivity troponin assays demonstrate superior sensitivity and diagnostic accuracy for early detection of ACS compared to conventional assays, without compromising specificity. Their application significantly shortens time to diagnosis, allowing for earlier therapeutic intervention and possible enhancement of clinical outcomes. The use of hs-cTn testing in routine ACS evaluation protocols should be weighed in high-resource environments to maximize patient care.
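The diagnostic-accuracy measures reported in the abstract above (sensitivity, specificity, PPV, NPV) all derive from a standard 2×2 confusion matrix. The sketch below shows the arithmetic; the counts used are hypothetical placeholders chosen only to illustrate the calculation, not the study's actual data.

```python
# Sketch: diagnostic-accuracy measures from a 2x2 confusion matrix.
# The counts passed in below are hypothetical, for illustration only.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV for a 2x2 table."""
    sensitivity = tp / (tp + fn)   # true positives among all diseased
    specificity = tn / (tn + fp)   # true negatives among all non-diseased
    ppv = tp / (tp + fp)           # probability of disease given a positive test
    npv = tn / (tn + fn)           # probability of no disease given a negative test
    return sensitivity, specificity, ppv, npv

# Hypothetical example: 93 true positives, 7 false negatives,
# 94 true negatives, 6 false positives.
sens, spec, ppv, npv = diagnostic_metrics(tp=93, fp=6, fn=7, tn=94)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the tested population, which is why they are usually reported alongside the study's case mix.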
18. Comparison of Combined Spinal-Epidural Versus General Anesthesia with Epidural Catheter on Postoperative Quality of Recovery after Abdominal Hysterectomy: A Prospective Observational Study
Sanjeev Singh Guleria, Deepesh Dubey, Deepanshu Yadav, Sumit Bhargava
Abstract
Background: The choice of anesthetic technique for abdominal hysterectomy significantly influences postoperative recovery outcomes. This study aimed to evaluate the effect of combined spinal-epidural (CSE) anesthesia versus general anesthesia with epidural catheter (GE) on the quality of postoperative recovery in patients undergoing abdominal hysterectomy, assessed using the Quality of Recovery-15 (QoR-15) scale. Methods: This prospective, single-center observational study included 120 female patients aged 18-70 years with ASA physical status I-III undergoing elective abdominal hysterectomy. Patients were divided into two groups based on anesthetic technique: CSE group (n=60) and GE group (n=60). The primary outcome was the QoR-15 score at 24 hours postoperatively. Secondary outcomes included postoperative pain scores using the Numerical Rating Scale (NRS), analgesic consumption, incidence of postoperative nausea and vomiting (PONV), time to mobilization, hospital length of stay, and patient satisfaction scores. Results: At 24 hours post-surgery, the CSE group demonstrated significantly higher QoR-15 scores compared to the GE group (134.2 ± 9.1 vs 125.8 ± 12.6, p<0.001). The CSE group required significantly less postoperative analgesic consumption (98.4 ± 28.7 ml vs 132.6 ± 41.2 ml, p=0.001) and had lower incidence of rescue analgesia requirement (8.3% vs 25.0%, p=0.018). Pain scores were significantly lower in the CSE group during the first 6 hours postoperatively (p<0.001). PONV incidence was reduced in the CSE group (11.7% vs 28.3%, p=0.022). Time to mobilization and hospital length of stay showed no significant differences between groups. Conclusions: Combined spinal-epidural anesthesia provides superior postoperative recovery quality for patients undergoing abdominal hysterectomy compared to general anesthesia with epidural catheter. 
CSE technique offers enhanced pain control, reduced opioid requirements, and decreased incidence of postoperative complications, contributing to improved patient comfort and recovery experience.
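The primary comparison in the abstract above (QoR-15 at 24 hours: 134.2 ± 9.1 vs 125.8 ± 12.6, n = 60 per group) can be checked from the summary statistics alone with Welch's t-test. The sketch below is a minimal illustration, assuming equal group sizes of 60 as stated and using a normal approximation for the two-sided p-value, which is adequate at these degrees of freedom; it is not the authors' actual analysis.

```python
import math

# Sketch: Welch's t-test from summary statistics (means, SDs, group sizes),
# applied to the QoR-15 figures reported in the abstract.

def welch_t_from_stats(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic and an approximate two-sided p-value (normal approx.)."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)  # standard error of the difference
    t = (m1 - m2) / se
    # Two-sided p-value via the standard normal survival function.
    p = math.erfc(abs(t) / math.sqrt(2))
    return t, p

t, p = welch_t_from_stats(134.2, 9.1, 60, 125.8, 12.6, 60)
print(f"t = {t:.2f}, p = {p:.1e}")  # consistent with the reported p < 0.001
```

Because the two groups have unequal standard deviations (9.1 vs 12.6), Welch's form, which does not pool the variances, is the safer choice here.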
19. Analysis of Hematology Quality Control using Six Sigma Metrics
Shiyamala Krishnasamy, Fathima Jackia Banu Iqbal, Vinoth Kumar Ganesamoorthy, Suganya Rathinam, S. Priya Banthavi
Abstract
Background: Quality control (QC) in clinical haematology laboratories is crucial for ensuring accurate and reliable test results, which directly impact patient care. Traditional QC programs, relying mainly on Westgard rules, often lead to unnecessary reruns and inefficiencies. Six Sigma metrics offer an advanced statistical approach to assess and enhance analytical performance by quantifying process variation and defects. Aim & Objective: This study aimed to analyse haematology quality control using Six Sigma metrics and to determine appropriate frequencies of quality checks based on Sigma values. Methods: This was a cross-sectional observational study. Quality control (QC) samples were analysed twice daily at three levels for ten consecutive days, yielding 360 observations per control level. Sigma metrics were calculated for selected complete blood count (CBC) parameters, namely RBC, WBC, haemoglobin (Hb), haematocrit (HCT), and platelet count, on a Sysmex XN 330 analyzer using standard formulas. Results: The Sigma metric analysis enabled categorization of analytes based on performance, supporting tailored QC strategies. The majority of parameters, including RBC, WBC, Hb, HCT, and platelets, showed good to excellent performance (sigma >6), and hence one QC level can be run per day under the 13s rule. Platelet count, however, showed poor performance (sigma <3) at L1 (low QC), suggesting maximum quality control runs for that particular level. Conclusion: Six Sigma metrics provide an effective framework for optimizing haematology QC programs, minimizing unnecessary reruns, and improving laboratory efficiency. This method supports evidence-based QC planning and continuous quality improvement in resource-limited settings.
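The "standard formula" referred to in the abstract above is conventionally sigma = (TEa − |bias|) / CV, with allowable total error (TEa), bias, and coefficient of variation all expressed in percent. The sketch below illustrates the arithmetic; the TEa, bias, and CV values are hypothetical placeholders, not the study's measured figures.

```python
# Sketch of the conventional Sigma-metric calculation used in laboratory QC:
#   sigma = (TEa - |bias|) / CV, all terms in percent.
# The numbers below are hypothetical, for illustration only.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric from allowable total error, bias, and imprecision (all %)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical analyte: 7.0% allowable total error, 1.0% bias, 0.9% CV.
sigma = sigma_metric(tea_pct=7.0, bias_pct=1.0, cv_pct=0.9)
print(f"sigma = {sigma:.2f}")  # sigma > 6 falls in the 'excellent' band
```

A sigma above 6 supports a relaxed QC schedule (e.g. a single level per day under the 13s rule, as the abstract notes), while a sigma below 3 calls for the maximum number of control runs and stricter multi-rule checking.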
20. Alarming Rise of Antibiotic Resistance in ENT Infections: A Snapshot from the Developing World
Daisy Bacchani, Vibhor B. Kumar, Pravek Khetani
Abstract
Background: Otitis media is a prevalent ENT infection in developing regions, increasingly complicated by antimicrobial resistance. Empirical treatment without microbiological guidance has led to selective pressure, promoting resistant strains. The need for local surveillance of pathogen profiles and resistance trends to guide successful therapy and stewardship activities is highlighted by rising failure rates and the prevalence of biofilm-forming organisms such as Pseudomonas aeruginosa and Staphylococcus aureus. Methods: A prospective observational study was carried out over ten months at a tertiary care center, involving 390 patients with clinically diagnosed otitis media. All patients presented with profuse otorrhea unresponsive to empirical antibiotics. Aural swabs were collected under aseptic precautions, cultured on standard media, and subjected to antimicrobial susceptibility testing by the Kirby-Bauer disc diffusion method in accordance with CLSI recommendations. Results: Pseudomonas aeruginosa and Proteus mirabilis were the predominant Gram-negative isolates, while S. aureus was the most common Gram-positive organism. High resistance rates (>70%) were observed against ampicillin, amoxicillin-clavulanate, and cotrimoxazole. Sensitivity was better retained with ciprofloxacin, gentamicin, and amikacin, although early signs of resistance were noted. Multidrug-resistant (MDR) phenotypes were common, particularly among Pseudomonas isolates. Conclusion: The microbial landscape of ENT infections is shifting, with a growing prevalence of MDR pathogens and declining efficacy of conventional antibiotics. Empirical treatment protocols must be reevaluated in light of local resistance data. Routine culture and sensitivity testing should be integrated into clinical practice to support evidence-based antibiotic stewardship and improve patient outcomes.
21. Skin Changes in Endocrinological Disorders Other Than Diabetes Mellitus
Rik Goswami, Saswati Halder, Projna Biswas
Abstract
Introduction: Skin is often considered a mirror of internal health, with various endocrinological disorders presenting with characteristic cutaneous manifestations. While diabetes mellitus is widely studied in this regard, numerous other endocrine conditions such as thyroid disorders, adrenal dysfunction, pituitary abnormalities, and gonadal hormone imbalances also lead to distinct skin changes. Recognition of these dermatological signs can aid in early diagnosis, guide management, and improve patient outcomes. Methods: This was a descriptive observational study conducted over a period of one year at the Calcutta School of Tropical Medicine. The study included a total of 50 patients with endocrinological disorders other than diabetes mellitus, attending both outpatient and inpatient departments. Data were collected on demographic variables such as age, gender, and body mass index (BMI), as well as clinical parameters including type of endocrine disorder—specifically thyroid, adrenal, pituitary disorders, and polycystic ovary syndrome (PCOS). Detailed dermatological evaluations were performed to document the presence and type of skin manifestations and specific skin features associated with each endocrine abnormality. All information was systematically recorded and analyzed to identify correlations between hormonal imbalances and cutaneous changes in this patient population. Results: In this study of 50 patients with endocrinological disorders other than diabetes mellitus, the mean age was 38.6 ± 12.4 years, with 56% males and 44% females, and a mean BMI of 25.3 ± 3.8 kg/m². Thyroid disorders were the most common (40%), followed by adrenal disorders (20%), pituitary disorders (16%), PCOS (14%), and other rare endocrine syndromes (10%). The most frequent skin manifestations were dry skin (36%), hyperpigmentation (24%), hirsutism (20%), acne (16%), and striae (12%). 
Among thyroid disorder patients, dry, rough skin (60%), hair thinning (40%), myxedema (25%), and palmar erythema (30%) were prominent. Laboratory analysis showed significant correlations between skin changes and elevated hormone levels, including TSH (8.2 ± 3.5 vs. 3.4 ± 1.2 µIU/mL, p = 0.001), cortisol (24.1 ± 6.3 vs. 15.8 ± 4.7 µg/dL, p = 0.002), testosterone (78 ± 25 vs. 42 ± 15 ng/dL, p = 0.005), and IGF-1 (320 ± 50 vs. 210 ± 45 ng/mL, p = 0.004). Conclusion: Cutaneous manifestations are common and often early indicators of endocrine dysfunction beyond diabetes mellitus. Awareness of these dermatological signs among clinicians can facilitate timely diagnosis and appropriate endocrine evaluation. Early recognition and intervention can prevent complications and improve patient quality of life.
22. Acute Pancreatitis in Scrub Typhus Positive Fever: A Case Report
Satyaki Roy
Abstract
We report the case of a 47-year-old male presenting with abdominal pain and systemic signs of inflammation. Laboratory tests revealed mild anemia, marked neutrophilic leukocytosis, and a critically elevated C-reactive protein (CRP). Pancreatic enzymes were significantly elevated. Contrast-enhanced CT of the abdomen demonstrated features consistent with mild acute pancreatitis. Concurrent urinary findings suggested cystitis, and anatomical anomalies of the renal-ureteric system were detected. Serological testing was positive for Scrub Typhus IgM, raising suspicion of a rickettsial etiology contributing to the systemic inflammatory response. The case highlights the importance of recognizing overlapping infectious and inflammatory etiologies and the value of integrated hematologic, biochemical, and radiological assessment in acute settings.
23. Does a Repeat Dose of Gonadotropin-Releasing Hormone Agonist Trigger in Normo- and Hyper-Responders, Compared to a Single Dose in Antagonist IVF Cycles, Provide a Better Mature Oocyte Yield (MII)? – A Prospective Cohort Study
Meena Srikanth, Anuranjita Pallavi
Abstract
Introduction: Controlled ovarian stimulation (COS) with a GnRH antagonist protocol is commonly used in in-vitro fertilization (IVF) cycles. GnRH agonist (GnRHa) trigger is an established alternative to hCG in preventing ovarian hyperstimulation syndrome (OHSS), especially in high responders. However, concerns remain about the adequacy of oocyte maturation with a single GnRHa trigger dose. Recent studies suggest that a repeat or dual GnRHa trigger may improve outcomes, particularly in specific responder groups. Objective: To evaluate whether a repeat dose of GnRH agonist trigger, compared to a single dose, improves mature oocyte (MII) yield in normo-responders and hyper-responders undergoing IVF with an antagonist protocol. Methods: This prospective observational study was conducted at Craft Hospital and Research Centre, Kerala, from December 2017 to August 2018, involving women undergoing IVF cycles with a GnRH antagonist protocol. The study compared the outcomes of single versus repeat doses of GnRH agonist trigger in normo- and hyper-responders. Key variables assessed included age, type of infertility, AMH, AFC, pre-trigger LH, peak estradiol, post-trigger progesterone and LH levels, number of mature oocytes (MII), and maturity rate. The aim was to evaluate whether a repeat dose offered improved oocyte maturation compared to a single dose. Results: In this prospective observational study, baseline characteristics such as age, AMH, AFC, pre-trigger LH, peak estradiol, total gonadotropin dose, and stimulation duration were comparable between the single-dose (Group A) and repeat-dose (Group B) GnRH agonist trigger groups in antagonist IVF cycles. The majority had primary infertility in both groups. Although Group A showed slightly higher mean mature oocyte (MII) yield (15.69 vs. 14.73) and oocyte maturity rate (84% vs. 78%) than Group B, the differences were not statistically significant. 
Post-trigger LH and progesterone levels also did not differ significantly between groups. Thus, administering a repeat dose of GnRH agonist did not provide a significant advantage in mature oocyte yield compared to a single dose. Conclusion: A repeat-dose GnRH agonist trigger appears to be a safe strategy in both normo- and hyper-responders undergoing GnRH antagonist IVF cycles, but it did not confer a significant advantage in mature oocyte yield over a single dose. It may still be considered in patients with a higher follicular response, as it does not increase the risk of OHSS.
24. Comparison of Subcutaneous Dexmedetomidine Versus Clonidine as an Adjuvant to Spinal Anaesthesia in Lower Limb Surgeries: A Randomized Double Blind Control Trial
Sabarni Sanyal, Debabanhi Barua, Sujata Dalai
Abstract
Introduction: Lower limb surgeries can be performed under local nerve block, regional anaesthesia, or general anaesthesia. Central neuraxial blockade is preferred due to its ease of administration, high success rates, and quick onset. Spinal anaesthesia has a short duration, but adjuvants like fentanyl, morphine, and alpha-2 agonists are used to prolong block duration and improve patient satisfaction. Aims and Objective: The study aimed to compare the effectiveness of subcutaneous dexmedetomidine and subcutaneous clonidine as adjuvants to intrathecal 0.5% hyperbaric bupivacaine in spinal anaesthesia for patients undergoing elective lower limb surgeries, including orthopaedic and plastic surgeries. Methods and Materials: The study was conducted from November 2019 to October 2020 on a total of 92 patients aged 20-60 years undergoing elective lower limb surgeries (orthopaedic and plastic) at Medical College, Kolkata. Result: In Group C, the mean time to first analgesic requirement (mean ± SD) was 246.8261 ± 10.4271; in Group D, it was 270.3478 ± 12.9430. The difference in mean time to first analgesic requirement between the two groups was statistically significant (p < 0.0001). Conclusion: The study found that both clonidine and dexmedetomidine provide effective and safe anaesthesia in elective lower limb orthopaedic and plastic surgery, with dexmedetomidine providing a longer duration of postoperative analgesia.
25. Evaluation of Feto-Maternal Health Outcomes in Post-COVID Pregnant Women in a Rural Tertiary Care Hospital
Sanjukta Mukherjee, Swaralipi Misra, Mriganka Mouli Saha
Abstract
Introduction: Novel coronavirus (SARS-CoV-2) infection, known as COVID-19, in pregnancy is an important concern, but limited data are currently available to predict the risk of infection in pregnancy and its adverse outcomes. Objective/Aim: To evaluate SARS-CoV-2 infection in pregnancy and its adverse outcomes for both mother and fetus. Materials and Methods: In this prospective observational study, a total of 1120 pregnant women admitted to the isolation ward of our institution were included. All women presented with common symptoms like fever, tiredness, headache, sore throat, and cough. Results: Sixty women were diagnosed SARS-CoV-2/COVID-19 positive by reverse transcriptase polymerase chain reaction (RT-PCR) examination of nasopharyngeal (NP) swabs, amounting to almost 5.3% of the isolated women. All neonates tested negative for SARS-CoV-2 infection. All the study subjects recovered with routine care and were sent home after 7 days with advice for a "safe home" (a nearby place where COVID patients can safely stay isolated in a room with all oral medications and report immediately to the healthcare facility if any adverse condition arises) for a further 7 days. Conclusions: SARS-CoV-2 infection in pregnancy most often presents in the latter half of pregnancy, and management is similar to that in the general population. There is no increased risk of severe disease during pregnancy. Intimate partner and social support will promote mental health among COVID-positive mothers. Due to immunological modulation in pregnancy, newborns are mostly protected from disease transmission.
26. Anthracycline-Free or Short-Term Regimen as Adjuvant Chemotherapy for Operable Breast Cancer: A Phase III Randomized Non-Inferiority Trial
Sourav Paul, Juhi
Abstract
Introduction: Anthracycline-based chemotherapy regimens are widely used as adjuvant treatment for operable breast cancer but are associated with significant cardiotoxicity and other adverse effects. Aims: This study aimed to evaluate whether anthracycline-free or short-term chemotherapy regimens are non-inferior to standard anthracycline-containing regimens in terms of disease-free survival (DFS) among patients with operable breast cancer. Materials and Methods: The present study was a prospective study conducted over a period of one year at R. G. Kar Medical College. A total of 100 patients with HER2-negative operable breast cancer were enrolled. The study variables included domain, toxicity, treatment group, median age, hormone receptor positivity, nodal status, tumor size, and grade III tumors. Detailed clinical and pathological data were collected for each patient, and relevant treatment-related toxicities were recorded. All patients were followed systematically to assess outcomes and adverse effects across the different study domains. Results: The study involved 100 patients equally divided into anthracycline-free, short-term anthracycline, and standard anthracycline groups, with comparable baseline characteristics across all groups. At 3 years, disease-free survival (DFS) rates were similar—87.9% for anthracycline-free and standard groups, and 91.2% for the short-term group—with both experimental regimens demonstrating non-inferiority to the standard. Overall survival (OS) rates also showed no significant differences among groups. The anthracycline-free regimen had significantly lower rates of grade 3–4 neutropenia and cardiotoxicity, while nausea and febrile neutropenia rates were comparable. Quality of life assessments favored the anthracycline-free group, showing significantly better physical and emotional well-being and reduced fatigue at 6 months post-treatment. 
Conclusions: Anthracycline-free and short-term anthracycline-containing adjuvant chemotherapy regimens provide non-inferior efficacy compared to standard anthracycline-based regimens in patients with operable breast cancer, with a more favorable safety profile. These findings support the consideration of anthracycline-sparing strategies in selected patients to reduce treatment-related toxicity without compromising clinical outcomes.
27. Classic Versus New Approaches in Management of Intertrochanteric Femur Fractures: Comparative Advantages and Side Effects
Anshul Khare, Sanjay Gupta, Sudip Bhargava
Abstract
Background: Intertrochanteric femur fractures constitute a common orthopedic injury, especially in the elderly. Management strategies have evolved from classic techniques such as Dynamic Hip Screw (DHS) to newer modalities, including Proximal Femoral Nail Anti-rotation (PFNA), Gamma nails, minimally invasive plate osteosynthesis (MIPO), and hip arthroplasty. This paper reviews and compares classic and contemporary management techniques, highlighting their benefits and side effects. Methods: A review of randomized controlled trials, meta-analyses, cohort studies, and clinical reports (2008–2025) comparing traditional and new operative modalities for intertrochanteric femur fractures. Key outcome measures include functional recovery (Harris Hip Score), complications, operative metrics, and patient quality of life. Results: Classic DHS remains cost-effective and reliable for stable fractures but is associated with longer surgical time and delayed mobilization. Intramedullary nails and PFNA show reduced surgical time, earlier weight-bearing, and higher union rates, but with risks of implant-related complications. Hip arthroplasty—though traditionally reserved for fracture fixation failure—now sees expanded use in selected elderly populations, offering superior early mobilization and functional scores. MIPO and locking plates deliver high union rates for comminuted fractures and decrease infection and soft tissue complications. Conclusions: While classic methods like DHS are ideal for stable fractures, newer techniques (PFNA, Gamma nail, MIPO, and arthroplasty) provide improved outcomes in unstable or osteoporotic bones. Choice of therapy should be individualized, considering patient comorbidities, bone quality, fracture pattern, and expected rehabilitation.
28. Prevalence of Myocarditis in Adults with Prior COVID-19 Infection: A Cross-Sectional Study
Sreemanta Madhab Baruah, Anish Hazra, Mriganka Shekhar Chaliha, Pronami Borah, Shyamjith Lakshmanan B.
Abstract
Background: COVID-19 has been linked to a wide spectrum of cardiovascular complications, with myocarditis drawing particular attention. Although vaccination is known to lessen the severity of illness, uncertainty remains about the persistence of myocardial inflammation in individuals who contract the infection after immunization. Methods: A cross-sectional study was carried out over a one-year period in the Department of Medicine, Assam Medical College and Hospital, Dibrugarh. The study recruited 100 adults aged 18–59 years with a documented history of COVID-19 infection and complete vaccination. Participants with pre-existing cardiac, autoimmune, or chronic systemic diseases were excluded. Each subject underwent detailed clinical assessment, electrocardiography (ECG), and contrast-enhanced cardiac magnetic resonance imaging (MRI) to evaluate myocardial changes. Results: Abnormal ECG patterns were common, with sinus tachycardia detected in 48% and non-specific ST–T wave alterations in 38% of cases. Cardiac MRI demonstrated late gadolinium enhancement (LGE), a characteristic marker of myocarditis, in 8% of participants. The relationship between previous COVID-19 infection and myocardial involvement was statistically significant (p = 0.01), with the majority of cases seen among men aged 30–39 years. Although no major structural heart disease was evident, subtle inflammatory changes were observed. Conclusion: This study suggests that even fully vaccinated individuals who recover from COVID-19 may experience silent myocardial injury. Incorporating ECG and cardiac MRI into follow-up protocols can aid in early detection. Younger male patients with lingering symptoms may benefit most from targeted cardiac surveillance.
29. Occurrence and Risk Factors for Complications Following Laparotomy for Benign Gynaecological Surgeries
Sagarika Saha, Gitanjali Deka, Saswati Sanyal Choudhury
Abstract
This prospective observational study was conducted in the Department of Obstetrics and Gynaecology at Gauhati Medical College and Hospital, Guwahati, for a period of one year from October 2022 to September 2023. 350 patients aged between 15 and 65 years undergoing laparotomy for benign gynaecological conditions were studied for post-operative complications for a period of 30 days after surgery, and the associated risk factors were evaluated. The post-operative complications most frequently observed were paralytic ileus, surgical site infection, urinary tract infection, and organ injury. Significant associations were found between certain risk factors and the complications that developed. A positive correlation was found between the development of surgical site infection and diabetes mellitus, obesity, and anaemia. A significant association was also seen between the occurrence of post-operative paralytic ileus and diabetes mellitus and prolonged duration of surgery. Urinary tract infection was associated with diabetes mellitus. Perioperative blood loss was also found to be associated with post-operative complications. No association was found between the age of the patients, raised blood pressure, or hypoalbuminemia and the post-operative complications. It was concluded that the risk factors associated with post-operative complications were anaemia, obesity, diabetes mellitus, perioperative blood transfusion, and prolonged duration of surgery. Hence, optimisation of the patient's condition prior to surgery plays a vital role in the prevention of these complications and will reduce the morbidity and healthcare expense associated with them.
30. A Comparative Study of Surgical Treatment Versus Conservative Treatment of Vocal Cord Nodules in Tertiary Care Hospital
Indranil Khatua, Anupam Ray, Alokendu Bose, Debarshi Jana
Abstract
Introduction: Vocal cord nodules are benign, bilateral, callous-like lesions on the vocal folds, commonly caused by chronic voice abuse or misuse. They are a frequent cause of hoarseness, particularly among professional voice users. Management options include conservative measures—such as voice therapy, vocal hygiene, and medical management—and surgical excision via microlaryngoscopy. While surgery offers rapid lesion removal, conservative treatment addresses the underlying phonotrauma, potentially reducing recurrence. The comparative effectiveness of these approaches remains a topic of clinical interest, particularly in resource-limited tertiary care settings. Objectives: To compare the outcomes of surgical treatment and conservative treatment in patients with vocal cord nodules, in terms of symptomatic improvement, objective voice quality measures, and recurrence rates. Methods: This prospective study was conducted in the Outpatient Department and indoor wards of the Department of ENT and Head Neck Surgery at R. G. Kar Medical College from December 2017 to December 2019. The study included 50 patients attending the ENT OPD during the study period who were diagnosed with benign bilateral vocal cord nodules. Data were collected on demographic variables (age, sex, occupation) and clinical parameters including pre-treatment GRBAS score, post-treatment GRBAS score, first change (Ch1) GRBAS score, second change (Ch2) GRBAS score, and post-follow-up GRBAS score. Patients were divided into two groups: Group A underwent surgical excision, while Group B received conservative management. All patients were evaluated using the GRBAS scale at baseline, post-treatment, and during follow-up to assess voice quality and treatment outcomes. Results: This study of 50 patients with vocal cord nodules found that most were aged 31–40 years (52%) and predominantly female (68%), especially in the surgical group (88.2%). 
Housewives were the most common occupation (36%), followed by singers (14%) and teachers (10%). The mean ages of the surgical (34.53 years) and conservative (34.85 years) groups were similar (p = 0.8586). Pre-treatment GRBAS scores were comparable, but post-treatment improvement was greater in the surgical group (p = 0.0409). While surgery showed faster early gains, follow-up scores revealed no significant long-term difference between the two groups. Conclusion: Both surgical and conservative treatments are effective in improving voice quality in patients with vocal cord nodules. Surgery offers faster symptomatic relief, but long-term outcomes are comparable when appropriate voice therapy is followed. Conservative treatment remains a valuable first-line approach, especially in motivated patients, while surgery is beneficial for refractory cases or professional voice users requiring rapid recovery.
31. Observational Analysis of Unnatural Deaths Based on Autopsy Reports at a Tertiary Care Center
Kamlesh Lad, Ankur Shantilal Kagathara, Swetkumar Maheshbhai Saradava
Abstract
Background: Unnatural deaths, including road traffic accidents, suicides, homicides, burns, poisoning, and drowning, remain a major medico-legal concern in India. Autopsy-based studies provide vital epidemiological data for prevention and policy-making. This study analyzed the demographic profile, manner, and causes of unnatural deaths at a tertiary care center. Material and Methods: This retrospective observational study was conducted in the Department of Forensic Medicine at a tertiary care centre for 1 year. A total of 312 medico-legal autopsies of unnatural deaths were included, excluding incomplete records, decomposed or unidentified bodies, and natural deaths. Data from autopsy reports, inquest papers, police records, and hospital files were analyzed for age, sex, cause and manner of death, seasonal distribution, injury patterns, and toxicological findings. Statistical analysis was performed using SPSS v25, with p < 0.05 considered significant. Results: Males accounted for 75.0% of cases (male-to-female ratio 3:1), with the highest incidence in the 21–30 years age group (31.7%). Accidental deaths predominated (61.2%), followed by suicides (25.9%) and homicides (9.0%). Road traffic accidents were the leading cause (45.5%), followed by hanging (13.5%), poisoning (11.5%), burns (9.3%), and drowning (8.0%). Seasonal peaks occurred during the monsoon (29.8%). Head injuries were the most common fatal injury pattern (43.1%) in trauma cases, and organophosphorus compounds were the most frequent toxic agent (58.3%) in poisoning cases. Conclusion: Young adult males are most affected by unnatural deaths, predominantly due to road traffic accidents. The findings emphasize the need for targeted road safety enforcement, suicide prevention programs, and toxicological surveillance.
32. Evaluation of Liver Enzyme Levels in Malaria Patients
Viral Patel, Ashish Madiya
Abstract
Background and Aim: Increased liver enzyme levels have been documented in both uncomplicated and complicated malaria, often mimicking viral hepatitis. The present study aims to evaluate the pattern and significance of hepatic enzyme alterations in malaria and examine their potential role in prognosticating disease severity. Material and Methods: A total of 120 patients diagnosed with malaria by peripheral smear or rapid diagnostic test were enrolled after informed consent. Blood samples were obtained for liver function tests, including SGOT, SGPT, alkaline phosphatase, and bilirubin, at the time of admission. Enzyme levels were measured using standardized automated biochemical analyzers. Results: The mean SGOT value was 372.18 IU/L, with a χ² value of 9.12 and a p-value of 0.0016, indicating a highly significant correlation between elevated liver enzyme levels and malaria infection, affirming their clinical relevance in prognostic assessment. Conclusion: This study highlights that elevated liver enzymes, especially SGOT, are a common biochemical finding in patients with malaria and are statistically significant in the disease process. The male predominance and middle-aged population reflect the demographic vulnerability in endemic zones.
33. Liver Biochemical Profiles in Congestive Heart Failure
Umesh
Abstract
Introduction: Congestive heart failure (CHF) is a clinical syndrome characterized by the heart’s inability to pump blood adequately, leading to systemic venous congestion. One of the frequently overlooked consequences of CHF is hepatic dysfunction, often termed “cardiac hepatopathy” or “congestive hepatopathy.” Alterations in liver biochemical profiles may reflect the severity of heart failure and have prognostic implications. Objectives: To evaluate the patterns and prevalence of liver function abnormalities in patients with CHF and to correlate these abnormalities with the severity of heart failure. Methods: This cross-sectional observational study included 100 patients diagnosed with CHF (NYHA class II–IV) admitted to a tertiary care hospital. Detailed clinical assessments, echocardiographic evaluations, and liver function tests (LFTs) — including serum bilirubin, AST, ALT, ALP, GGT, total protein, and albumin — were performed. Patients with known chronic liver disease, alcohol abuse, or hepatotoxic drug use were excluded. Data were analyzed to assess the correlation between liver function parameters and the severity of CHF, as indicated by ejection fraction and NYHA class. Results: Liver function abnormalities were observed in 68% of patients with CHF. The most common abnormalities included elevated ALP (52%), GGT (47%), and bilirubin (33%). Hypoalbuminemia was present in 45% of cases. Significant correlations were found between elevated right atrial pressures and raised bilirubin and ALP levels (p < 0.01). Patients with advanced NYHA class (III–IV) and reduced ejection fraction (<40%) exhibited more pronounced derangements in LFTs. Hepatic congestion markers were more prevalent in right-sided or biventricular failure. Conclusion: Liver biochemical abnormalities are common in patients with CHF and tend to worsen with the severity of cardiac dysfunction.
Routine monitoring of liver function in CHF patients may help in early detection of congestive hepatopathy and better risk stratification. Understanding the liver–heart interaction is crucial for comprehensive CHF management.
34. Postoperative Port Site Infections in General Laparoscopy Procedures
Prasenjit Mukherjee, Chandramouli Mukherjee, Debarshi Jana
Abstract
Background: Port site infections (PSIs) are a notable complication following laparoscopic cholecystectomy, potentially delaying recovery and increasing healthcare burden. Identifying risk factors and their clinical implications is essential for optimizing surgical outcomes. Objectives: To assess the incidence, risk factors, microbiological profile, and postoperative outcomes associated with port site infections in patients undergoing laparoscopic cholecystectomy. Materials and Methods: A prospective observational study was conducted to evaluate the incidence, risk factors, and outcomes of port site infections following laparoscopic cholecystectomy. The study was carried out in the Department of General Surgery at a tertiary care hospital, where patients undergoing elective laparoscopic cholecystectomy were enrolled. A total of 50 patients were included in the study, and detailed demographic, intraoperative, and postoperative data were collected and analyzed to assess potential associations with port site infections. Results: In all patients within the PSI group, the umbilical port was used for gallbladder retrieval (100%), compared to 87.5% in the non-PSI group. Although use of the umbilical port was slightly more frequent in the PSI group, this difference did not reach statistical significance (p = 0.08). The remaining 12.5% of non-PSI patients had the gallbladder extracted through other port sites. Regarding port size, the 10 mm trocar was predominantly used for specimen retrieval in both groups—9 patients (90%) in the PSI group and 35 patients (87.5%) in the non-PSI group. A 12 mm port was used in 1 patient (10%) in the PSI group and 5 patients (12.5%) in the non-PSI group. No significant association was observed between port size and infection (p = 0.47). In terms of port closure techniques, 4 patients (40%) in the PSI group had only skin closure, while 6 (60%) underwent both skin and fascial closure. 
In contrast, in the non-PSI group, 6 patients (15%) had skin-only closure and 34 (85%) had combined skin and fascial closure. Although a higher proportion of patients with skin-only closure developed PSI, the difference was not statistically significant (p = 0.11). Conclusion: Although not statistically significant, the use of the umbilical port and skin-only closure was more common among patients who developed port site infections. Port size showed no association with infection. These findings suggest that while port characteristics may not be independent risk factors, meticulous closure techniques—especially including fascial closure—may help reduce the risk of postoperative infections.
35. The Association Between Liver Enzymes and Cardiovascular Risk Factors in Adults with Secondary Dyslipidemia
Sangita Chanda, Biswanath Sharma Sarkar, Protyush Chakraborty
Abstract
Introduction: Secondary dyslipidemia is a common metabolic disturbance that contributes to the development of cardiovascular disease (CVD). Liver enzymes such as alanine aminotransferase (ALT), aspartate aminotransferase (AST), and gamma-glutamyl transferase (GGT) are increasingly recognized as potential markers of metabolic and cardiovascular risk. However, their associations with established cardiovascular risk factors in adults with secondary dyslipidemia remain unclear. Objective: This study aimed to evaluate the relationships between serum liver enzyme levels and traditional cardiovascular risk factors in adults presenting with secondary dyslipidemia. Methods: This cross-sectional observational study was conducted over one year at Raiganj Government Medical College and Hospital. The study enrolled 50 adult patients diagnosed with secondary dyslipidemia. Key variables assessed included age, gender, body mass index (BMI), smoking status, presence of hypertension, liver enzyme levels, lipid profile and other relevant risk factors. Data were collected through clinical evaluation and laboratory investigations. The objective was to analyze the demographic and clinical profile of patients with secondary dyslipidemia and identify associated risk factors. Results: This study of 50 adults with secondary dyslipidemia (mean age 45.6 ± 8.7 years; 56% male) found that most participants were overweight (mean BMI 27.4 ± 3.2 kg/m²), 30% were smokers, and 36% had hypertension. Liver enzyme levels (ALT, AST, GGT, ALP) were generally within normal ranges, though GGT was toward the upper limit. Lipid profiles showed elevated total and LDL cholesterol, low HDL cholesterol, and borderline-high triglycerides. Correlation analysis revealed significant associations: ALT with LDL cholesterol and triglycerides; AST inversely with HDL cholesterol; and GGT with both systolic and diastolic blood pressure. 
Hypertensive individuals had significantly higher ALT, AST, and GGT levels compared to normotensive participants, while ALP differences were not significant. Conclusion: In adults with secondary dyslipidemia, serum GGT and ALT levels are significantly associated with key cardiovascular risk factors such as obesity, dysglycemia, and hypertriglyceridemia. These liver enzymes may serve as accessible biomarkers for early cardiovascular risk stratification in this population. Further longitudinal studies are warranted to confirm their predictive value for cardiovascular events.
36. Analysis of Risk Factors Contributing to Seroma Formation Following Mastectomy in Breast Cancer Patients
Prasenjit Mukherjee, Chandramouli Mukherjee, Debarshi Jana
Abstract
Introduction: Seroma formation is one of the most common postoperative complications following mastectomy in breast cancer patients. Although often self-limiting, it can lead to delayed wound healing, infection, discomfort, and prolonged hospital stay. Identifying the risk factors associated with seroma formation is essential for improving postoperative outcomes. Objectives: To evaluate and analyze the clinical and surgical risk factors contributing to seroma formation in patients undergoing mastectomy for breast cancer. Materials and Methods: This prospective observational study included 60 breast cancer patients who underwent modified radical mastectomy. Data were collected on demographic variables, comorbidities, BMI, number of lymph nodes removed, drain duration, and other clinical parameters. The incidence of seroma formation was recorded and analyzed in relation to these factors using appropriate statistical tests. Results: A statistically significant association was observed between body mass index (BMI) and the incidence of seroma formation. Among the 28 patients with a BMI less than 25 kg/m² (normal weight), only 8 patients (28.6%) developed seroma, while 20 patients (71.4%) did not. In contrast, among the 32 patients who were overweight or obese (BMI ≥25 kg/m²), seroma was present in 20 patients (62.5%) and absent in only 12 (37.5%). The difference was statistically significant, with a p-value of 0.015. Conclusion: The findings of this study indicate a significant association between higher body mass index and the development of postoperative seroma following mastectomy. Patients who were overweight or obese were more likely to develop seroma compared to those with normal BMI. This suggests that elevated BMI is an important modifiable risk factor, and weight optimization prior to surgery may contribute to reducing postoperative complications such as seroma.
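As a consistency check, the 2×2 association reported in this abstract (8 seroma / 20 no seroma among 28 normal-BMI patients; 20 / 12 among 32 overweight or obese patients) can be re-tested with a chi-square test. The sketch below is an illustrative stdlib-Python reconstruction, not the authors' analysis code, and the exact p-value depends on which test variant they used.

```python
import math

# 2x2 counts from the abstract: rows = BMI <25 / BMI >=25, columns = seroma / no seroma
obs = [[8, 20],
       [20, 12]]

n = sum(map(sum, obs))                              # 60 patients in total
row = [sum(r) for r in obs]                         # row totals: 28, 32
col = [obs[0][j] + obs[1][j] for j in range(2)]     # column totals: 28, 32

# Chi-square statistic with Yates' continuity correction (standard for a 2x2 table)
chi2 = 0.0
for i in range(2):
    for j in range(2):
        e = row[i] * col[j] / n                     # expected count under independence
        chi2 += (abs(obs[i][j] - e) - 0.5) ** 2 / e

# With 1 degree of freedom, the chi-square survival function is erfc(sqrt(x/2))
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

With Yates' correction this gives χ² ≈ 5.61 and p ≈ 0.018; the uncorrected statistic is about 6.9 (p ≈ 0.009). Both bracket the reported p = 0.015, which may reflect a different variant such as Fisher's exact test.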
37. A Retrospective Observational Study on the Role of Furosemide in Prevention of Bronchopulmonary Dysplasia among Preterm Infants (28–32 Weeks)
Sankar Narayan Mishra, Taraknath Ghosh, Kaustav Nayek, Sayantani Panda
Abstract
Introduction: Bronchopulmonary dysplasia (BPD) is the most common pulmonary morbidity associated with prematurity, and premature infants with BPD are at an increased risk of death and severe developmental disability. Despite the devastating impact of BPD on premature infants, there are currently no therapies labeled by the US Food and Drug Administration to prevent BPD. Neonatologists commonly use furosemide off-label in premature infants. Aims: To evaluate the role of Furosemide in prevention of Bronchopulmonary Dysplasia in preterm infants. Materials and Methods: This hospital-based retrospective observational study was conducted in the Special Newborn Care Unit (SNCU) and Neonatal Intensive Care Unit (NICU) of Burdwan Medical College and Hospital over a period of two years, from November 2019 to November 2021. A total of 50 preterm babies with a gestational age between 28 and 32 weeks, admitted to the SNCU/NICU during the study period, were included. The study population comprised babies who met the gestational age criteria, and relevant data, particularly regarding exposure to furosemide, were collected retrospectively from hospital medical records. Results: In this retrospective study, baseline characteristics such as gestational age, birth weight, sex ratio, antenatal steroid use, and Apgar scores were comparable between the Furosemide and Non-Furosemide groups. In the Furosemide group, therapy was started at a mean age of 5.4 days, with an average duration of 4.6 days and a total dose of 3.2 mg/kg, mainly for oxygen requirement and PDA management. Neonates receiving Furosemide had a significantly lower incidence of bronchopulmonary dysplasia (20% vs 44%) and shorter oxygen support duration (12 vs 18 days). Respiratory outcomes also favored the Furosemide group, with shorter CPAP use and a trend toward reduced ventilation duration, though re-intubation rates were similar.
Adverse events such as electrolyte imbalance, nephrocalcinosis, and mortality did not differ significantly. Importantly, the hospital stay was significantly shorter in the Furosemide group (26.4 vs 31.2 days). Conclusions: This study concludes that Furosemide use in preterm neonates was associated with reduced incidence of bronchopulmonary dysplasia, shorter oxygen and CPAP support, and decreased hospital stay, without a significant increase in adverse events or mortality.
38. Prevalence of Overweight and Obesity among School Children
Sankar Narayan Mishra, Sayantani Panda, Bidyut Kumar Khuntdar
Abstract
Introduction: Overweight and obesity among school-aged children have emerged as a significant public health concern worldwide. The increasing prevalence of these conditions is associated with a range of adverse health outcomes, including early onset of cardiovascular diseases, type 2 diabetes mellitus, and psychosocial problems. Aims and Objectives: The aim of the study is to assess the prevalence of overweight and obesity among school children aged 6 to 14 years. The objectives include determining the distribution of overweight and obesity by age, gender, and school type, and identifying associated risk factors. Materials and Methods: The study was conducted in the Department of Paediatrics, Outpatient Services, Tamralipta Government Medical College, Tamluk, West Bengal 721636, over a period from March 2024 to February 2025. The study population comprised school children aged 6 to 14 years from selected government and private schools. Children with chronic illnesses or conditions affecting growth were excluded to ensure accurate assessment of overweight and obesity prevalence. A total of 120 children were randomly selected and included in the study. Results: A study of 120 school children aged 6–14 years found that 20.8% were overweight and 12.5% were obese, with a slightly higher prevalence among boys (35.4%) than girls (30.9%). The risk of overweight and obesity increased with age, reaching 45.7% in the 12–14 years age group. Conclusion: The study reveals a substantial presence of overweight and obesity among school children, indicating a growing public health concern. The prevalence of excess weight was found to be higher among boys compared to girls and showed an increasing trend with age.
39. Association of Mean Platelet Volume with HbA1c in Type 2 Diabetes Mellitus
Linasree Baro, Abanti Bora Baruah, Tazkira Begum, Mauchumi Baruah, Anupam Dutta, Gayatri Gogoi
Abstract
Background: Diabetes mellitus is a serious worldwide health issue that is progressively impacting people everywhere. Diabetes is a chronic, broad-spectrum metabolic disorder that requires continuous medical care, in which the organism cannot adequately benefit from carbohydrates, fats, and proteins due to insulin deficiency or defects in the effect of insulin. Due to altered platelet shape and function, platelets may be a causal factor in the micro- and macrovascular disorders that diabetic individuals are more likely to acquire. Aim: The present study aims to assess the relationship between HbA1c levels and platelet activity (MPV) in type 2 diabetics receiving oral and insulin therapy. Materials and Methods: From the Assam Medical College’s outpatient department and wards, 80 persons with type 2 diabetes were selected, 40 of whom were using oral hypoglycemic agents alone, and 40 of whom were combining insulin and oral hypoglycemic agents. Data were analyzed using SPSS (Version 23), with statistical significance set at p < 0.05. Results: MPV was significantly associated with HbA1c ≥6.5% in both the OHA-without-insulin and OHA-with-insulin groups (p<0.001). However, MPV was not significantly associated with HbA1c <6.5% in either group (p = 0.331). Conclusion: Values of MPV are increased in patients with HbA1c ≥6.5% and are significantly higher in diabetic patients treated with OHA without insulin than in those patients on OHA with insulin therapy. Higher MPV and higher HbA1c levels were shown to be strongly positively correlated in our study when the HbA1c levels were ≥6.5%. This result emphasizes the link between elevated platelet activation and inadequate glycemic management, indicating that hyperglycemia causes changes in platelet size and function.
40. Computed Tomographic Analysis of Frontal Sinus Recess Anatomy and Its Relation with Frontal Sinusitis
Ashish Kumar, Deba Kumar Chakrabarty, Mamoona Choudhury
Abstract
Aims and Objectives: This study focused on identifying the prevalence of frontal cells on CT scans in individuals with chronic rhinosinusitis, using the “International Frontal Sinus Anatomy Classification (IFAC)” system. It also explored how certain types of frontal cells may be linked to the development of frontal sinusitis. Methods: CT images from 50 patients, covering 100 sinus sides, were examined. All frontal cell types defined by IFAC were identified, and logistic regression analysis was conducted to compare their presence between patients with and without frontal sinusitis. Results: Agger nasi cells were the most commonly detected, found in 92% of the cases. Other types of frontal cells observed included supra agger cells (28%), supra agger frontal cells (19%), supra bullar cells (32%), supra bullar frontal cells (11%), supraorbital ethmoid cells (6%), and frontal septal cells (9%). Among individuals diagnosed with frontal sinusitis, there was a significantly higher prevalence of supra agger frontal cells (25%), supra bullar cells (46.4%), and supra bullar frontal cells (14.3%) compared to those without sinusitis, who exhibited these cells at rates of 16.7%, 26.4%, and 9.7%, respectively. These differences were statistically significant (p < 0.01), indicating a strong correlation with frontal sinusitis. Conclusion: Based on the IFAC classification, there is a significant link between frontal sinusitis and the presence of supra agger frontal, supra bullar, and supra bullar frontal cells.
41. Jejunoileal Atresia: A 5 Year Institutional Review of Surgical Outcomes, Prognostic Factors and the Impact of Delayed Presentation in a Resource-Limited Setting
Arkaprovo Roy, Rajarshi Kumar, Aloke Kumar Sinhababu
Abstract
Introduction: Jejunoileal atresia (JIA) is a frequent cause of intestinal obstruction in the newborn, occurring in 1–3 per 10,000 live births. The condition is caused by a vascular accident in utero. Antenatal diagnosis remains difficult even though prenatal imaging has improved. Neonates with JIA typically present within the first 24–48 hours of life with bilious emesis and failure to pass meconium, with or without abdominal distension. The gold standard is surgical resection of the atretic segment with end-to-end anastomosis. In neonates with severe sepsis, poor general condition, or severe bowel distension, other surgical methods, such as a stoma or a chimney procedure (e.g., Santulli), can be used. Survival rates in developed countries are more than 90% due to early diagnosis, advancements in neonatal intensive care, pediatric anaesthesia expertise, and access to total parenteral nutrition (TPN). Aims and Objectives: This paper presents an institutional review of jejunoileal atresia cases managed over a period of 5 years, focusing on short-term outcomes and factors influencing survival. Materials and Methods: A retrospective study was conducted on neonates admitted with jejunoileal atresia who were operated on between January 2020 and December 2024. Recorded data included demographics, clinical presentation, preoperative preparation, surgical details, and outcomes, along with associated anomalies, if any. Statistical analysis was performed to establish significant predictors of survival. Results: 33 neonates with jejunoileal atresia were operated on, with a male-to-female ratio of 1.36:1. 79% were low birth weight, and 64% were very low birth weight. The average age at presentation was around 9.5 days. Our study identified timing of presentation and birth weight as significant predictors of survival.
Although overall survival, which was 72% at initial discharge, had declined to 64% at a minimum of 6 months' follow-up, the switchover to the Santulli (chimney) procedure in the final 2 years yielded a survival of more than 90%. Conclusions: Neonates with late presentation and low birth weight did poorly. A high index of suspicion for intestinal obstruction in neonates with bilious emesis is essential for early diagnosis. Survival might be enhanced by antenatal diagnosis and timely intervention. A chimney operation such as the Santulli procedure, found to have a higher survival rate than primary anastomosis in neonates with unfavorable general condition and massive luminal disparity, has proved to be a saviour in resource-limited tertiary centres.
42. Comparative Evaluation of Intrathecal Bupivacaine and Ropivacaine with Dexmedetomidine in Lower Limb Orthopaedic Surgeries: A Prospective Observational Study
Sourav Karmakar, Abik Mallik, Subrata Kumar Mandal
Abstract
Background: Effective postoperative analgesia is crucial in lower limb orthopaedic surgeries. While bupivacaine is a standard intrathecal anaesthetic, ropivacaine—with lower cardiotoxicity—may benefit from adjuvants such as dexmedetomidine to prolong analgesic effects. Objectives: To compare the efficacy and safety of intrathecal 0.5% bupivacaine heavy versus 0.75% ropivacaine heavy combined with 10 μg dexmedetomidine in terms of sensory and motor block characteristics, analgesia duration, and hemodynamic stability. Methods: This prospective, observational study included 100 adult patients (ASA I/II) undergoing lower limb orthopaedic surgeries under spinal anaesthesia. Participants were allocated into Group A (bupivacaine + saline) and Group B (ropivacaine + dexmedetomidine). Key parameters included block onset and duration, analgesia duration, rescue analgesia requirements, and hemodynamic parameters. Statistical significance was defined as p<0.05. Results: Demographics and surgical variables were comparable between groups. Block onset times were similar (p>0.05). However, Group B showed significantly longer duration of sensory block (226.6±41.1 min vs. 198.8±44.2 min), motor block (172.6±44.9 min vs. 154.8±52.8 min), and analgesia (278.4±51.2 min vs. 234.6±39.6 min) (p<0.001). Time to rescue analgesia was prolonged (p<0.001) and total analgesic requirement reduced in Group B (p=0.002). Hemodynamic stability and adverse events were similar across groups (p>0.05). Conclusion: Intrathecal ropivacaine combined with dexmedetomidine offers prolonged postoperative analgesia and reduced analgesic need without compromising safety, making it a viable alternative to bupivacaine in spinal anaesthesia for orthopaedic surgeries.
43. Association of Undiagnosed Hyperprolactinemia in Polycystic Ovary Syndrome: A Descriptive Cross-Sectional Case Study from a Tertiary Care Center in India
Piyali Das, Saikat Kumar Sarkar
Abstract
Background: Polycystic Ovary Syndrome (PCOS) and hyperprolactinemia (HPRL) are two common endocrine disorders in reproductive-age women. The potential overlap between PCOS and HPRL can affect clinical presentation and management. This study aimed to investigate the association of undiagnosed hyperprolactinemia in women diagnosed with PCOS. Methods: A descriptive cross-sectional study was conducted on 120 women diagnosed with PCOS as per Rotterdam criteria at Medical College Hospital, Kolkata. Serum prolactin, LH, FSH levels, and sociodemographic, clinical, and reproductive parameters were analyzed. Results: Hyperprolactinemia was detected in 45% of PCOS patients. A high LH:FSH ratio (≥3) and a positive family history of endocrine disorders were significantly associated with hyperprolactinemia (Adjusted Odds Ratio [AOR] = 6.27 and 21.90, respectively). No significant associations were observed between HPRL and BMI or specific clinical features such as hirsutism, acne, or menstrual irregularities. Conclusions: A substantial proportion of PCOS patients had undiagnosed hyperprolactinemia. Routine prolactin screening is recommended for PCOS patients to optimize diagnosis and treatment pathways.
44. Epidemiological Analysis of Atrial Septal Defect (ASD): A Comparative Study of Pediatric and Adult Populations
Bitan Karmakar, Santanu Dutta, Nitai P. Bhattacharyya, Bhaswati Pandit
Abstract
Background: Atrial Septal Defect (ASD) is a common congenital heart defect characterized by an abnormal communication between the atria, leading to shunting between systemic and pulmonary circulations. It is one of the most prevalent congenital heart diseases (CHD), with increasing recognition in both pediatric and adult populations, especially with advancements in diagnostic imaging. This study aims to assess the prevalence of ASD cases among CHD patients in a tertiary care hospital in eastern India, highlighting demographic factors and associated characteristics. Methods: A prospective observational study was conducted at the Cardiothoracic and Vascular Surgery unit of IPGME&R and SSKM Hospital, Kolkata, India. The study enrolled 250 CHD patients from December 2021 to September 2023. Among these, 112 patients were diagnosed with ASD. Patients were categorized into two groups: pediatric (0-14 years) and adult (>14 years). Data were collected via standardized questionnaires, including demographic, clinical, and family history details. Statistical analysis was performed using descriptive statistics, Student’s t-test, and Chi-square test for comparison. Results: Among the 112 ASD patients, 61 were pediatric and 51 were adult. In the pediatric cohort, there was a near-equal gender distribution, with a mean age at diagnosis of 31.6 months. In contrast, the adult group had a predominance of females (76%) and an average diagnosis age of 346.5 months (approximately 28.9 years). A family history of heart disease was reported in 7.14% of ASD cases. The analysis revealed significant differences in the age at diagnosis and gender distribution between the two groups. Conclusion: The study highlights significant age-related disparities in the diagnosis of ASD, with pediatric patients being diagnosed at a much earlier age compared to adults. A gender bias was observed in the adult group, warranting further investigation into the potential underlying causes. 
Early diagnosis and intervention in pediatric patients are crucial to reducing long-term complications. Additionally, the role of genetic factors and family history underscores the importance of genetic counseling and screening in managing ASD. This study provides valuable epidemiological insights into the prevalence and characteristics of ASD in eastern India, contributing to a better understanding of this congenital heart defect in diverse populations.
45. Prevalence of Smartphone Addiction and Its Correlation with Sleep Disturbances among Undergraduate Medical Students: A Cross-Sectional Study
Shobha Vijaybhai Gadhia, Kanzariya Hardik Dayarambhai, Sonagara Kishan Dipakbhai
Abstract
Background: Smartphone usage has rapidly increased among young adults, particularly medical students, raising concerns about smartphone addiction and its potential adverse effects on sleep quality. Poor sleep has significant implications for academic performance, psychological well-being, and long-term health outcomes. Objectives: To determine the prevalence of smartphone addiction and poor sleep quality among undergraduate medical students and to analyze the correlation between the two. Methods: A cross-sectional study was conducted among 250 MBBS students in a tertiary care medical college over three months. Smartphone addiction was assessed using the Smartphone Addiction Scale–Short Version (SAS-SV), and sleep quality was evaluated using the Pittsburgh Sleep Quality Index (PSQI). Demographic details and daily smartphone usage were also recorded. Data were analyzed using descriptive statistics, chi-square test, and Pearson’s correlation. A p-value <0.05 was considered statistically significant. Results: The prevalence of smartphone addiction was 42.4%, with higher rates in males (48.5%) compared to females (37.6%) (p = 0.03). Poor sleep quality (PSQI > 5) was observed in 61.2% of students. Poor sleep was more common among students with smartphone addiction (79.2%) compared to those without addiction (47.9%) (p < 0.001). A significant dose–response relationship was observed between hours of smartphone use and poor sleep, with prevalence rising from 38.4% in students using smartphones less than 3 hours per day to 82.3% in those using more than 5 hours (p < 0.001). Pearson’s correlation showed a significant positive relationship between smartphone addiction scores and sleep disturbance scores (r = 0.42, p < 0.001). Conclusion: Smartphone addiction and poor sleep quality are highly prevalent among undergraduate medical students and show a strong positive correlation. Male students and those with higher daily smartphone usage were more vulnerable. 
The findings underscore the urgent need for awareness programs, counseling, and digital wellness initiatives to promote healthier technology use and sleep hygiene in medical undergraduates.
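The percentages reported in the smartphone-addiction abstract are internally consistent, and the underlying 2×2 table can be reconstructed from them (106 addicted students, 84 of whom were poor sleepers; 144 non-addicted, 69 poor sleepers). The stdlib-Python sketch below is an illustrative reconstruction, not the authors' analysis code; it rebuilds the counts from the reported prevalences, recovers the overall 61.2% poor-sleep prevalence, and re-runs an uncorrected Pearson chi-square test consistent with the reported p < 0.001.

```python
import math

# Counts implied by the abstract's percentages (n = 250, rounded to whole students)
n = 250
addicted = round(0.424 * n)                     # 106 students with smartphone addiction
non_addicted = n - addicted                     # 144 without addiction
poor_addicted = round(0.792 * addicted)         # 84 poor sleepers among the addicted
poor_non = round(0.479 * non_addicted)          # 69 poor sleepers among the non-addicted

overall = (poor_addicted + poor_non) / n        # 153/250 = 0.612, the reported 61.2%
print(f"overall poor sleep prevalence: {overall:.1%}")

# Pearson chi-square (no continuity correction) on the implied 2x2 table
obs = [[poor_addicted, addicted - poor_addicted],
       [poor_non, non_addicted - poor_non]]
row = [sum(r) for r in obs]                     # row totals: 106, 144
col = [obs[0][j] + obs[1][j] for j in range(2)] # column totals: 153, 97
chi2 = sum((obs[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))
p = math.erfc(math.sqrt(chi2 / 2))              # chi-square survival function, 1 df
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
```

For one degree of freedom the chi-square survival function reduces to erfc(sqrt(x/2)), so no dependency beyond the standard library is needed; the resulting p-value is far below 0.001, matching the abstract.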
46. Forensic Evaluation of Alcohol-Related Deaths in Medicolegal Autopsies: A One-Year Retrospective Study
Kamlesh Lad, Jigar Chavda, Gauravkumar Singh
Abstract
Background: Alcohol consumption is a major risk factor for unnatural deaths, particularly in road traffic accidents, suicides, and violent crimes. Forensic autopsy evaluation of alcohol-related fatalities provides crucial insights into the demographic profile, manner of death, and toxicological findings, which are vital for public health and medico-legal interventions. Materials and Methods: A retrospective observational study was conducted in the Department of Forensic Medicine at a tertiary care center in western Gujarat over a one-year period. All medicolegal autopsies with confirmed alcohol consumption based on toxicological analysis or circumstantial evidence were included, while decomposed or incomplete cases were excluded. Data on age, sex, cause and manner of death, and blood alcohol concentration (BAC) levels were collected and analyzed using descriptive statistics. Institutional Ethics Committee approval was obtained prior to the study. Results: Out of 236 medicolegal autopsies, 54 cases (22.9%) were alcohol-related. Males predominated (90.7%), with the 21–40 years age group most affected (61.1%). Road traffic accidents were the leading category (40.7%), followed by suicides (29.6%), homicides (16.7%), and other accidental deaths (12.9%). Toxicological analysis revealed that 68.5% of cases had BAC above the legal intoxication limit (>80 mg/dl), with 35.2% showing severe intoxication (>150 mg/dl). Conclusion: Alcohol is a significant contributory factor in unnatural deaths, predominantly affecting young adult males, with road traffic accidents and suicides being the major categories. Elevated BAC levels in most victims highlight the urgent need for strict enforcement of drink-and-drive laws, routine toxicological testing in autopsies, and targeted public health interventions to reduce alcohol-related mortality.
47. Prevalence and Predictors of Arrhythmias in Chronic Kidney Disease Patients on Dialysis: A Hospital-Based Observational Study
Paryant Vala, Ankur Shantilal Kagathara, Swetkumar Maheshbhai Saradava
Paryant Vala, Ankur Shantilal Kagathara, Swetkumar Maheshbhai Saradava
Abstract
Background: Chronic kidney disease (CKD) patients on dialysis are at high risk of cardiovascular complications, with arrhythmias being a major contributor to morbidity and sudden cardiac death. Data from India on the prevalence and predictors of arrhythmias in this population remain limited. Methods: This cross-sectional observational study was conducted at a tertiary care center in India and included 100 adult patients with stage 5 CKD on maintenance dialysis. Demographic, clinical, and dialysis-related details were recorded, and arrhythmias were assessed using 12-lead electrocardiography performed pre- and post-dialysis. Selected patients underwent 24-hour Holter monitoring. Patients were grouped by arrhythmia status and compared for risk factors. Logistic regression analysis was used to identify independent predictors. Results: Arrhythmias were observed in 30% of patients, with atrial fibrillation/flutter being the most frequent, followed by premature ventricular complexes, conduction abnormalities, and supraventricular tachycardia. Patients with arrhythmias were older and had a higher prevalence of diabetes and coronary artery disease. Hypokalemia and longer dialysis vintage were also more frequent in this group. On multivariate analysis, older age, diabetes mellitus, coronary artery disease, and hypokalemia were independent predictors of arrhythmia. Conclusion: Nearly one-third of CKD patients on maintenance dialysis developed arrhythmias, with atrial fibrillation being most common. Age, diabetes, coronary artery disease, and hypokalemia significantly increased arrhythmic risk. While non-modifiable comorbidities contribute, careful attention to electrolyte balance and dialysis parameters may help reduce risk. Routine ECG surveillance and targeted preventive strategies are recommended to improve outcomes in this high-risk population.
48. A Comparative Study of Sural Nerve Conduction Between Affected and Non-Affected Limbs in Patients with Unilateral Sciatica
Kamlesh Kumar Mahawar, Abhishek Saini, Tanu Atreya, Jyotsna Shukla, Bhawna Sharma
Kamlesh Kumar Mahawar, Abhishek Saini, Tanu Atreya, Jyotsna Shukla, Bhawna Sharma
Abstract
Background: The Sural nerve, a purely sensory nerve, plays a critical role in detecting early sensory changes in lumbosacral radiculopathy. Despite frequent sensory complaints in Sciatica, its electrophysiological assessment is often underutilized. Objective: To evaluate and compare Sural nerve conduction parameters between the affected and non-affected limbs in patients with unilateral Sciatica. Methods: A cross-sectional observational study was conducted on 40 patients aged 30–40 years with clinically diagnosed unilateral Sciatica. Bilateral Sural nerve conduction studies were performed using standardized techniques. Parameters assessed included distal latency, duration, sensory nerve action potential (SNAP) amplitude, and nerve conduction velocity (NCV). Paired t-tests were used for statistical comparisons. Results: Significant reductions were observed in SNAP amplitude (16.5 ± 6.0 µV vs. 20.5 ± 7.0 µV; p < 0.05) and NCV (55.0 ± 5.0 m/s vs. 58.0 ± 5.5 m/s; p < 0.05) on the affected side. Latency and duration showed no significant differences. These findings suggest axonal and early demyelinating changes in the affected Sural nerve. Conclusion: Sural nerve conduction is significantly impaired on the affected side in unilateral Sciatica. Bilateral NCS enhances diagnostic sensitivity and can detect subclinical sensory dysfunction. Routine Sural nerve assessment is recommended in patients with radicular symptoms, especially where motor findings are inconclusive.
49. Assessment of Microbial Contamination of Indian Currency in Circulation in a Tertiary Care Hospital Setting
Rudramuneswara Swamy B.P., Janakiram K., Mallikarjun Swamy S., Sakthi Vignesh, Navaneeth B.V., Dhanusha V.
Rudramuneswara Swamy B.P., Janakiram K., Mallikarjun Swamy S., Sakthi Vignesh, Navaneeth B.V., Dhanusha V.
Abstract
Background: Currency is a universal medium for the transmission of microorganisms and can serve as an unrecognized reservoir for pathogenic bacteria, contaminated by droplets from coughing and sneezing or by contact with contaminated hands. Paper currency notes are continually contaminated through poor handling and storage practices. Contaminated currency can also cause nosocomial infections, and care should be taken while handling it. Simultaneous handling of currency notes and food can contaminate the food. Pathogenic or potentially pathogenic microorganisms can survive on paper currency for days and may cause healthcare-associated infections. The aim of this study was to investigate the occurrence of, and identify, the microorganisms contaminating Indian currency notes. Methods: This is a prospective, laboratory-based study of various denominations of currency randomly collected from various sources in our hospital. Each currency note was collected from the respective person in a sterile plastic bag and transferred to the laboratory immediately, where swab samples were subjected to laboratory analysis. Similarly, a total of 12 new paper currency notes (mint or uncirculated) and 12 new coins (mint coins), collected from the local bank, were used as negative controls. Results: A total of 90 (69%) organisms were isolated from the 130 currency notes. Among the 90 isolates, 14% were pathogenic and 86% were non-pathogenic environmental organisms. Among the different source categories, the highest contamination occurred in currency from food handlers, followed by hospital visitors and healthcare workers (HCWs), while the lowest contamination occurred in currency from non-healthcare professionals. Conclusion: This study adds to the limited body of literature on microbial contamination of Indian currency.
The outcome of this study shows that paper currency notes are contaminated with microbes and can be a source of microbial transmission causing infectious diseases, foodborne illness, and nosocomial infections, and thus a public health hazard. Public awareness should be created to avoid cross-contamination via currency notes.
50. Vitamin D Levels in Children on Antiseizure Medications: A Comparative Study of Monotherapy and Polytherapy
Ashna Ann Varghese, Varghese Abraham, George Noble
Ashna Ann Varghese, Varghese Abraham, George Noble
Abstract
Background: Epilepsy is a common chronic neurological disorder in children requiring long-term antiseizure medication (ASM) therapy. While ASMs are essential for seizure control, they may adversely affect vitamin D metabolism, thereby impacting bone health. Evidence regarding the comparative effect of monotherapy versus polytherapy on vitamin D levels in children remains limited, particularly in the Indian context. Objectives: To assess serum vitamin D levels in children with epilepsy receiving ASM therapy and to compare the prevalence of deficiency between those on monotherapy and polytherapy. Methods: A hospital-based cross-sectional study was conducted at a tertiary care center in Kerala over 18 months. Sixty-five children (aged 1–18 years) with epilepsy, on ASMs for at least six months, were enrolled. Demographic data, seizure history, and treatment details were recorded. Serum 25-hydroxyvitamin D [25(OH)D] levels were measured and classified according to Indian Academy of Pediatrics guidelines. Comparative analyses between monotherapy and polytherapy groups were performed using appropriate statistical tests. Results: Of the 65 participants, 60% (n=39) were on monotherapy and 40% (n=26) on polytherapy. Median serum vitamin D levels were higher in the monotherapy group (30.60 ng/mL, IQR: 22.00–44.70) than in the polytherapy group (25.75 ng/mL, IQR: 15.45–43.50), though this difference was not statistically significant (P = 0.194). However, vitamin D deficiency was significantly more common in the polytherapy group (42.3%) compared to the monotherapy group (12.8%) (P = 0.007). Among drug regimens, the Valproate + Carbamazepine combination was associated with the lowest median vitamin D levels (16.80 ng/mL, P = 0.009), while Valproate + Levetiracetam was associated with the highest levels (46.80 ng/mL). 
Conclusion: Vitamin D deficiency is common among children with epilepsy, with significantly higher prevalence in those receiving polytherapy compared to monotherapy. These findings emphasize the need for routine monitoring of vitamin D levels and consideration of preventive supplementation, especially in patients on multidrug regimens or enzyme-inducing ASMs.
51. The Impact of SGLT2 Inhibitor Therapy on Postprandial Gut Hormone Response and Gut Microbiome Profile in Patients with Type 2 Diabetes: A 6-Month Prospective Pilot Study
Suchitra Yadav, Naveen Kumar, Jagdish Prasad Sunda
Suchitra Yadav, Naveen Kumar, Jagdish Prasad Sunda
Abstract
Background: The metabolic benefits of sodium-glucose cotransporter-2 inhibitors (SGLT2i) extend beyond glycosuria. We hypothesized that these effects are mediated through modifications of the gut-pancreas axis. Objective: To evaluate the effects of empagliflozin on postprandial glucagon-like peptide-1 (GLP-1) and peptide YY (PYY) levels and on the gut microbiome in patients with type 2 diabetes mellitus (T2DM). Methods: In this prospective, randomized, active-controlled, open-label pilot study, 62 patients with T2DM were assigned to receive either empagliflozin 25 mg (n=31) or glimepiride 4 mg (n=31) daily for 6 months. A standardized mixed-meal test was performed at baseline and study end. Fecal samples were collected for 16S rRNA sequencing. Results: After 6 months, the empagliflozin group demonstrated a significantly greater increase in the incremental area under the curve (iAUC) for both postprandial GLP-1 (+35.2%, p=0.003) and PYY (+28.7%, p=0.011) compared to the glimepiride group. Empagliflozin therapy also led to a significant increase in microbial alpha-diversity (Shannon index, p=0.008) and a rise in the relative abundance of SCFA-producing bacteria, including Roseburia (p=0.002) and Faecalibacterium (p<0.001). Changes in Faecalibacterium abundance were inversely correlated with HbA1c (r = -0.47, p=0.009). Conclusion: This pilot study suggests empagliflozin may enhance the secretion of satiety-inducing gut hormones and modulate the gut microbiome. These potential gastro-centric mechanisms warrant further investigation in larger, controlled trials.
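The alpha-diversity comparison in the abstract above uses the Shannon index, the standard formula H = −Σ pᵢ ln pᵢ over taxon relative abundances. A minimal sketch with made-up abundances (the function name and toy counts are illustrative, not the study's data):

```python
import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln(p_i)) over taxon abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# A community spread evenly over four taxa is more diverse than a skewed one.
even = shannon_index([25, 25, 25, 25])    # equals ln(4)
skewed = shannon_index([85, 5, 5, 5])
```

Higher H after treatment (as reported for empagliflozin) indicates a more even, richer community.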
52. A Cross-Sectional Study of Neurological Soft Signs with Sociodemographic and Clinical Profile in Psychiatric Disorders
Suman, Alok Tyagi, Harshpreet Kour, Gaurav Rajender
Suman, Alok Tyagi, Harshpreet Kour, Gaurav Rajender
Abstract
Background: Neurological soft signs (NSS) are non-localizing neurological abnormalities indicative of subtle cortical-subcortical dysfunction, commonly seen in psychiatric conditions. Though extensively studied in psychotic disorders like schizophrenia, NSS are increasingly recognized in non-psychotic psychiatric conditions as well. Objectives: This study aimed to compare the severity of NSS in psychiatric patients with and without psychotic symptoms, and to explore associated sociodemographic and clinical variables. Methods: A cross-sectional, comparative observational study was conducted among 100 drug-naïve psychiatric patients aged 18–60 years, equally divided into psychotic and non-psychotic groups. Patients were assessed using the MINI-PLUS for diagnostic confirmation and the Neurological Evaluation Scale (NES) for NSS. Additional scales including BPRS, PANSS, HAM-A, MADRS, and Y-BOCS were used to measure symptom severity. Results: The mean NES score was significantly higher in patients with psychotic symptoms (17.70 ± 7.53) compared to those without (6.96 ± 4.41), p < 0.001. Prominent NSS domains in psychotic patients included motor coordination and complex motor sequencing. While NSS were also present in non-psychotic disorders like OCD and anxiety, their severity was markedly lower. Conclusion: NSS are significantly more prevalent in psychiatric disorders with psychotic features, supporting their role as potential neurodevelopmental markers. Their presence, even in non-psychotic conditions, suggests a spectrum of neurological involvement in psychiatric illness. NSS assessment may aid in early diagnosis, clinical differentiation, and intervention planning.
53. Progression of Diabetic Nephropathy in Patients with Uncontrolled Type 2 Diabetes Mellitus
Vivek Kumar Singh, Vikrant Kumar, Vijay Kumar
Vivek Kumar Singh, Vikrant Kumar, Vijay Kumar
Abstract
Background: Diabetic nephropathy (DN) is a leading cause of end-stage renal disease worldwide, and its progression is strongly linked to glycemic control in type 2 diabetes mellitus (T2DM). While the association is well-established, contemporary data on the rate of renal function decline in patients with persistently poor glycemic control versus those with sustained control are needed to reinforce clinical management strategies. Methods: A total of 250 patients with T2DM for ≥5 years and a baseline eGFR >60 mL/min/1.73m² were enrolled and stratified into two groups based on baseline and follow-up glycated hemoglobin (HbA1c) levels: an uncontrolled group (n=120; mean HbA1c >8.5%) and a controlled group (n=130; mean HbA1c <7.0%). Annual assessments included eGFR (calculated via the CKD-EPI 2021 equation) and urine albumin-to-creatinine ratio (UACR). The primary outcomes were the mean annual rate of eGFR decline and the median change in UACR over 5 years. Results: At baseline, both groups were comparable in age, sex, and diabetes duration. The uncontrolled group exhibited a significantly greater mean annual eGFR decline (-4.8 ± 1.2 mL/min/1.73m²) compared to the controlled group (-1.6 ± 0.8 mL/min/1.73m²; p<0.001). The median UACR increased from 25 mg/g [IQR 15–40] to 88 mg/g [IQR 45–155] in the uncontrolled group, a significantly greater change than in the controlled group, which increased from 22 mg/g [IQR 14–35] to 31 mg/g [IQR 20–48] (p<0.001). Furthermore, the incidence of a composite renal endpoint (new-onset macroalbuminuria or a >40% sustained eGFR decline) was significantly higher in the uncontrolled cohort (31.7% vs. 6.2%; p<0.001). Conclusion: Persistent poor glycemic control in patients with T2DM is associated with a three-fold acceleration in the rate of eGFR decline and a substantial increase in albuminuria over a 5-year period.
These findings underscore the critical importance of achieving and maintaining long-term glycemic targets to preserve renal function and prevent the progression of diabetic nephropathy.
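The abstract above calculates eGFR with the CKD-EPI 2021 race-free creatinine equation. A minimal sketch of that published formula, assuming the standard 2021 coefficients (κ, α, the sex multiplier); the function name and rounding are illustrative:

```python
def egfr_ckd_epi_2021(scr_mg_dl: float, age_years: float, female: bool) -> float:
    """CKD-EPI 2021 race-free creatinine equation (mL/min/1.73 m^2)."""
    kappa = 0.7 if female else 0.9          # sex-specific creatinine threshold
    alpha = -0.241 if female else -0.302    # exponent applied below the threshold
    ratio = scr_mg_dl / kappa
    egfr = (142.0
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.200
            * 0.9938 ** age_years
            * (1.012 if female else 1.0))
    return round(egfr, 1)

# A 50-year-old man with serum creatinine 0.9 mg/dL has an eGFR of roughly 104,
# comfortably above the study's enrollment threshold of 60 mL/min/1.73 m^2.
```

Because the equation is monotonically decreasing in creatinine and age, rising creatinine over follow-up translates directly into the eGFR decline the study tracks.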
54. Assessment of Antibiotic Prescription Pattern in In-Patients with Urinary Tract Infection (UTI) in a Tertiary Care Centre in Uttar Pradesh
Rahul Ranjan, Suruchi Prakash
Rahul Ranjan, Suruchi Prakash
Abstract
Background: UTI remains one of the most common hospital-acquired infections, particularly among patients with weakened immunity or indwelling catheters. Inappropriate antibiotic use in these cases contributes heavily to the growing problem of antimicrobial resistance (AMR). Antibiotic use in hospitals should therefore be audited regularly to ensure rational prescribing. Aim: To compare antibiotic use and outcomes in in-patients with UTI given either empirical or culture-directed therapy in a tertiary hospital. Methods: A prospective observational study was performed on 100 adults with UTI. Participants were sorted randomly into Group A, receiving empirical antibiotic treatment, and Group B, receiving therapy matched to the results of their culture tests. Data on patient demographics, medications given, duration of treatment, length of hospital stay, symptomatic outcomes, and adverse reactions were collected and analyzed statistically. Results: The average hospital stay was shorter in the culture-guided group (5.2 ± 1.1 days) than in the empirical group (6.4 ± 1.3 days; p = 0.0012). Symptom resolution by day 3 was more common in Group B (9 out of 10) than in Group A (68 out of 100; p = 0.009). Only 2% of patients in Group B had recurrence of symptoms within 7 days, compared with 5% in Group A (p = 0.048). The use of broad-spectrum antibiotics was greater in the empirical group. Although more adverse reactions occurred in Group A, the difference was not statistically significant. Conclusion: Culture-guided treatment of UTI achieves better clinical outcomes, promotes wiser antibiotic use, and supports measures to address AMR.
55. Anatomic and Morphometric Study of the Adult Spleen in Cadavers
Deepa Verma, Khushboo Sinha, Sanjeev Kumar Sinha, Birendra Kumar Sinha
Deepa Verma, Khushboo Sinha, Sanjeev Kumar Sinha, Birendra Kumar Sinha
Abstract
Background: The spleen is the largest lymphoid organ, with vital haematological and immunological roles. Morphological and morphometric variations are clinically significant in diagnostic, surgical, and radiological contexts. Aim: To evaluate the morphological and morphometric characteristics of adult human cadaveric spleens and compare findings with previous studies. Methodology: A descriptive cross-sectional study was conducted on 34 cadaveric spleens obtained from the Department of Anatomy, Patna Medical College and Hospital, Bihar. Spleens with intact structures were dissected, fixed in 10% formalin, and analyzed for shape, notches, weight, and dimensions. Standard anatomical tools and digital scales were used. Data were statistically analyzed using SPSS v27. Results: Wedge and triangular spleens were the most frequent shapes (35.29% each), followed by tetrahedral (14.71%), oval (8.82%), semilunar (2.94%), and heart-shaped (2.94%). The majority of spleens weighed 50–150 g (38.24%), with a secondary peak at 351–450 g (20.59%). Mean morphometric values were length 9.72 ± 1.08 cm, breadth 6.81 ± 0.91 cm, and thickness 3.59 ± 0.70 cm, closely correlating with previous Indian studies but showing slight variation from international reports. Conclusion: The study highlights significant variability in spleen morphology and morphometry, with wedge and triangular shapes predominating. Regional and population-specific anatomical differences were evident, underscoring the importance of such data for surgical planning, radiological interpretation, and accurate diagnosis of splenic disorders.
56. Role of Serum C-Reactive Protein in Early Detection of Gut Gangrene Among Patients with Intestinal Obstruction: An Observational Study
Parveen Jindal, Parul Jindal, Manish Tardeja, Nia Jindal, Chandra Sekhar Chevuturu, Rachna Singh
Parveen Jindal, Parul Jindal, Manish Tardeja, Nia Jindal, Chandra Sekhar Chevuturu, Rachna Singh
Abstract
Background: Intestinal obstruction is a frequently encountered surgical emergency with variable clinical outcomes depending on the severity and progression of the condition. A major complication associated with delayed or missed diagnosis is bowel gangrene, which carries significant morbidity and mortality. Early identification of intestinal ischemia or gangrene remains a challenge in the absence of definitive diagnostic tools. C-reactive protein (CRP), a widely available acute-phase reactant, may help predict the presence of bowel necrosis due to its association with systemic inflammation. Aim and Objectives: The primary objective of this observational study was to assess the predictive role of raised serum CRP levels in identifying gut gangrene in patients diagnosed with intestinal obstruction. The study also aimed to determine a clinically relevant CRP cut-off value that could be used as a triage tool in emergency surgical settings. Methods: This single-center observational study was conducted in the Department of General Surgery at Gautam Buddha Chikitsa Mahavidyalaya, Dehradun, India, over one year. A total of 120 patients presenting with features of intestinal obstruction were enrolled after applying strict inclusion and exclusion criteria. All patients underwent routine blood investigations, radiological imaging, and serum CRP level estimation at the time of admission. Final diagnosis of bowel viability or gangrene was confirmed intraoperatively and corroborated by histopathological analysis where required. The CRP values were then statistically analyzed in correlation with operative findings to assess sensitivity, specificity, and predictive accuracy. Results: Out of the 120 patients studied, 38 (31.6%) were diagnosed intraoperatively with gangrenous bowel. The mean CRP levels in patients with gangrene (mean ± SD: 96.4 ± 21.8 mg/L) were significantly higher than those with non-gangrenous obstruction (34.7 ± 15.2 mg/L), with a p-value < 0.001.
Receiver Operating Characteristic (ROC) curve analysis revealed that a serum CRP cut-off value of 60 mg/L offered a sensitivity of 87% and a specificity of 79% in detecting gut gangrene. Conclusion: This study underscores the utility of elevated serum CRP as a simple, cost-effective, and early biochemical marker for predicting gut gangrene in cases of intestinal obstruction. Incorporating CRP testing into the diagnostic protocol may aid clinicians in making timely decisions regarding surgical intervention and improving patient outcomes.
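The sensitivity and specificity quoted for the 60 mg/L CRP cutoff follow from the standard 2×2 confusion-table definitions. A minimal sketch, using hypothetical toy values rather than the study's dataset (the function name and numbers are illustrative):

```python
def sens_spec(crp_values, gangrene, cutoff=60.0):
    """Sensitivity and specificity of a CRP threshold test.

    crp_values: admission CRP in mg/L; gangrene: True if gangrene was
    confirmed intraoperatively. CRP >= cutoff counts as test-positive.
    """
    tp = sum(c >= cutoff and g for c, g in zip(crp_values, gangrene))
    fn = sum(c < cutoff and g for c, g in zip(crp_values, gangrene))
    tn = sum(c < cutoff and not g for c, g in zip(crp_values, gangrene))
    fp = sum(c >= cutoff and not g for c, g in zip(crp_values, gangrene))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical illustration: three gangrene cases (two above the cutoff)
# and two viable-bowel cases (one above the cutoff).
sens, spec = sens_spec([96.0, 110.0, 40.0, 35.0, 70.0],
                       [True, True, True, False, False])
```

An ROC curve, as used in the study, is simply this calculation swept over every candidate cutoff, plotting sensitivity against 1 − specificity.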
57. Evaluation of Analgesic Efficacy of Intrathecal Buprenorphine versus Tramadol as Adjuvants to Bupivacaine: A Comparative Study
Negi Deeksha, Piyush Kumar Sengar, Upendra Nath Verma
Negi Deeksha, Piyush Kumar Sengar, Upendra Nath Verma
Abstract
Background: Intrathecal adjuvants are widely used to enhance the quality and duration of spinal anaesthesia, especially in infra-umbilical surgeries. Buprenorphine, a partial μ-opioid receptor agonist, and tramadol, a centrally acting atypical opioid, have both been used as adjuvants to local anaesthetics like bupivacaine. However, their comparative efficacy and safety in prolonging postoperative analgesia remain subjects of clinical interest. Objective: To compare the efficacy of intrathecal bupivacaine with buprenorphine versus bupivacaine with tramadol in terms of onset and duration of sensory and motor block, postoperative analgesia, and associated side effects in patients undergoing lower abdominal and lower limb surgeries. Methods: A prospective, randomized comparative study was conducted at the Department of Anaesthesia, Shaheed Nirmal Mahto Medical College, Dhanbad, Jharkhand, India. A total of 100 ASA Grade I and II patients aged 18–60 years undergoing elective lower abdominal or lower limb surgeries were enrolled and divided equally into two groups (n=50 each). Group B received 3 mL of 0.5% hyperbaric bupivacaine with 60 µg buprenorphine, while Group T received 3 mL of 0.5% hyperbaric bupivacaine with 25 mg tramadol intrathecally. Parameters assessed included onset and duration of sensory and motor block, duration of effective analgesia, and incidence of adverse effects. Results: Group B (buprenorphine) showed significantly prolonged duration of postoperative analgesia (412 ± 38 minutes) compared to Group T (tramadol) (316 ± 41 minutes) (p < 0.001). Onset of sensory block was faster in Group T, while motor block onset and duration were comparable. Mild side effects such as nausea and pruritus were noted more frequently in Group B but were self-limiting and did not require intervention.
Conclusion: Buprenorphine as an intrathecal adjuvant to bupivacaine provides more prolonged and effective postoperative analgesia compared to tramadol, making it a preferable choice in spinal anaesthesia for lower abdominal and limb surgeries despite minor tolerable side effects.
58. Efficacy of Palonosetron versus Ondansetron in Preventing Postoperative Nausea and Vomiting Following Abdominal Surgeries: A Randomized Comparative Analysis
Piyush Kumar Sengar, Negi Deeksha, Upendra Nath Verma
Piyush Kumar Sengar, Negi Deeksha, Upendra Nath Verma
Abstract
Background: Postoperative nausea and vomiting (PONV) are common and distressing complications following abdominal surgeries under general anesthesia. These symptoms can hinder recovery, prolong hospital stay, and adversely affect patient satisfaction. Among available prophylactic options, 5-HT₃ receptor antagonists are widely used—ondansetron being the conventional choice, and palonosetron, with its longer half-life, showing potential for improved outcomes. Objectives: To compare the effectiveness of palonosetron and ondansetron in preventing postoperative nausea and vomiting in patients undergoing abdominal surgeries under general anesthesia. Methods: In this prospective, hospital-based, comparative study, 100 patients aged 18–60 years with ASA physical status I or II, scheduled for elective abdominal surgery, were enrolled and randomly allocated into two groups of 50 each. Group A received a single intravenous dose of palonosetron 0.075 mg, while Group B received ondansetron 8 mg intravenously, 5–10 minutes before induction. All patients underwent standardized anesthetic and analgesic protocols. The incidence and severity of PONV were assessed during three postoperative intervals (0–6 h, 6–12 h, and 12–24 h), and rescue antiemetic requirements were recorded. Categorical data were analyzed using chi-square test and continuous variables by unpaired t-test; p < 0.05 was considered significant. Results: The overall incidence of PONV over 24 h was significantly lower in the palonosetron group (10 of 50; 20 %) compared with the ondansetron group (37 of 50; 74 %) (p < 0.001). During the 0–6 h interval, nausea occurred in 5 of 50 patients (10 %) receiving palonosetron versus 12 of 50 (24 %) receiving ondansetron (p = 0.18). Between 6–12 h, nausea was reported by 3 of 50 (6 %) versus 15 of 50 (30 %) (p = 0.01), and during 12–24 h by 2 of 50 (4 %) versus 18 of 50 (36 %) (p < 0.001). 
Vomiting episodes followed a similar pattern, with 4 of 50 (8 %) versus 10 of 50 (20 %) in 0–6 h (p = 0.12), 2 of 50 (4 %) versus 12 of 50 (24 %) in 6–12 h (p = 0.02), and 0 of 50 (0 %) versus 17 of 50 (34 %) in 12–24 h (p < 0.001). Rescue antiemetic was required in 4 % of patients in the palonosetron group compared to 28 % in the ondansetron group (p = 0.003). Both drugs were well tolerated with minimal adverse effects. Conclusion: Palonosetron demonstrated superior efficacy to ondansetron in preventing both early and late PONV, reduced the need for rescue antiemetics, and offered a favorable safety profile. Its prolonged duration of action makes it a preferred agent for single-dose prophylaxis in abdominal surgeries under general anesthesia.
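The group comparisons above (e.g. overall PONV in 10 of 50 versus 37 of 50 patients) rest on the chi-square test for a 2×2 contingency table. A minimal sketch using the standard shortcut formula, without continuity correction (the function name is illustrative):

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Overall 24-h PONV: 10/50 with palonosetron vs 37/50 with ondansetron.
stat = chi2_2x2(10, 40, 37, 13)
# stat is about 29.3, far above the 1-df critical value of 10.83 for
# p = 0.001, consistent with the reported p < 0.001.
```

The same routine applied to the interval-wise counts reproduces the pattern of significant late-interval differences reported above.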
59. Estimation of Stature Using Metrical Parameters of Upper Limb Long Bones in Humans
Hema Narayan, Barun Kumar
Hema Narayan, Barun Kumar
Abstract
Background: Forensic anthropology is an essential component of the identification of unknown individuals through skeletal analysis, and estimating stature has always been a primary objective of investigators. Long bones of the upper limb, particularly the humerus, radius, and ulna, are used to estimate stature because they are often preserved with skeletal remains. Aim: To develop reliable, sex-specific regression models for stature estimation based on the morphometric analysis of upper limb long bones. Methodology: A descriptive cross-sectional study was undertaken on 120 adult human skeletal specimens (45 male, 75 female), obtained from the adult human skeletal remains housed at Netaji Subhas Medical College, Bihta, Patna. Measurements of the specific bones were taken using standard osteometric instruments, and the known statures were used for regression analysis. Data were statistically analyzed using SPSS v27 to derive predictive equations. Results: Among the three bones, the ulna showed the strongest correlation with stature in both males (r = 0.64; r² = 0.41) and females (r = 0.55; r² = 0.30), followed by the radius and humerus. The standard error of estimate was lowest for the ulna, indicating higher accuracy. Regression formulas based on ulna length yielded highly precise height predictions with minimal differences (±0.05–0.43 cm) from actual stature. Conclusion: The ulna is the most reliable predictor of stature among upper limb long bones. Sex-specific regression equations developed in this study offer an accurate and practical method for forensic stature estimation, especially when complete skeletons are unavailable.
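Regression equations of the kind derived in the abstract above come from ordinary least squares fitted to bone length versus known stature. A minimal sketch with made-up measurements (the helper and sample numbers are illustrative, not the study's data):

```python
def fit_ols(x, y):
    """Least-squares line y = a + b*x, e.g. stature (cm) on ulna length (cm)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    b = sxy / sxx               # slope: cm of stature per cm of ulna length
    a = mean_y - b * mean_x     # intercept
    return a, b

# Illustrative: ulna lengths (cm) and known statures (cm) for four specimens.
ulna = [24.0, 25.5, 26.0, 27.5]
stature = [158.0, 164.0, 166.0, 172.0]
a, b = fit_ols(ulna, stature)
predicted = a + b * 26.5   # estimated stature for a 26.5 cm ulna
```

Fitting male and female specimens separately, as the study does, yields the sex-specific equations; r² and the standard error of estimate then quantify how tightly each bone predicts stature.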
60. An Observational Clinical Study on Antibiotic Utilization Pattern among Patients Undergoing Elective Laparoscopic Cholecystectomy at a Tertiary Care Medical Training Institution in Kolkata
Ghosh S., Chatterje S., Banerjee P., Sarkar S.
Ghosh S., Chatterje S., Banerjee P., Sarkar S.
Abstract
Background: A drug utilization study examines the “marketing, distribution, prescription and the use of drugs in a society with special emphasis on the resulting medical, social and economic consequences”. The current study was conducted to observe the antibiotic utilization pattern in patients undergoing elective laparoscopic cholecystectomy at the College of Medicine and Sagore Dutta Hospital. Methods: This observational, cross-sectional study was conducted over one year to study the antibiotic utilization pattern in patients during the post-operative period. Data were collected from the bedside treatment records of the patients. Results: A total of 384 patients were recruited, and the most commonly used antibiotic regimen for post-operative prophylaxis in patients undergoing elective laparoscopic cholecystectomy was monotherapy with intravenous ceftriaxone (43.23%), followed by dual therapy with intravenous ceftriaxone and metronidazole (22.14%). Among patients who were switched to oral antibiotics, the most commonly used regimen was dual therapy with oral ciprofloxacin and metronidazole (21.16%), followed by monotherapy with oral cefuroxime axetil (18.67%). Conclusion: The results of this study provide baseline data for future research in the area of antibiotic utilization studies, to improve the prescription patterns of antibiotics used for surgical prophylaxis and to combat the rising threat of antimicrobial resistance.
61. A Prospective, Randomized, Controlled Study Comparing the Effects of Normal Saline versus Plasmalyte as Intravenous Fluid on Acid-Base (Base Excess) and Electrolyte Status during Renal Allografting
Akshay Toshniwal, Himanshu Aneejwal, Sangeeta Khanna, Dhrupad Patel
Akshay Toshniwal, Himanshu Aneejwal, Sangeeta Khanna, Dhrupad Patel
Abstract
Background: The study was conducted to analyze and evaluate the effects of Normal Saline (NS) and Plasmalyte (PL) on acid-base status (Base Excess) and other electrolyte status during renal allografting. Methods: This prospective, randomized, comparative study included 70 patients who were randomized into two equal groups: Group 1 (NS, n=35) and Group 2 (PL, n=35). Arterial blood samples were collected at five different time points: just before induction of anesthesia (T₀), 30 minutes after induction (T₁), during anastomosis (T₂), at the time of drain insertion (T₃), and 6 hours post-surgery (T₄). The samples were analyzed for pH, PCO₂, PO₂, Base excess (BE), HCO₃⁻, Na⁺, K⁺, Cl⁻, and lactate. Results: The NS group showed a statistically significant decrease in pH and base excess (BE) and a significant increase in serum chloride levels during the procedure. Specifically, the BE values for the NS group significantly decreased from T₂ (-5.93 ± 1.8) to T₃ (-7.66 ± 1.13) and T₄ (-6.84 ± 1.51), while the PL group’s BE remained relatively stable. The mean chloride levels were significantly higher in the NS group compared to the PL group intra-operatively. In contrast, the Plasmalyte group maintained a more stable acid-base and electrolyte balance. Conclusion: The administration of Normal Saline during renal allografting is associated with the development of significant hyperchloremic metabolic acidosis. This study concludes that Plasmalyte is a superior intravenous fluid choice for maintaining acid-base balance during renal transplantation as compared to Normal Saline.
62. Prevalence of Peripheral Arterial Disease in Patients with Diabetic Foot Ulcers: A Clinical and Observational Study from a Tertiary Care Hospital
Anjani Kumar Anjan, Manoj Kumar Shaw
Abstract
Background: Diabetic foot ulcers (DFUs) are a serious complication of diabetes mellitus and are closely tied to the presence of peripheral vascular disease (PVD). PVD compounds the delayed healing of ulcers, increases susceptibility to infection, and leads to amputation in many cases. Early detection of PVD should improve outcomes for DFU patients. Methods: This prospective observational study was conducted over a one-year period (January 2023 to December 2023) at the Department of General Surgery, Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Nalanda, Bihar, India. A total of 120 patients with diabetic foot ulcers were enrolled. To assess the presence of peripheral vascular disease, detailed history taking, clinical examination, and Doppler ultrasound studies were performed. We also investigated the association of PVD with age, sex, duration of diabetes, smoking, and accompanying comorbidities. Results: Of the 120 study participants, PVD was confirmed in 48 patients (40%). Incidence was higher among males (66.7%) than females (33.3%), and was highest among patients with more than 10 years’ duration of diabetes (62.5%) compared with shorter durations. Smoking history and hypertension were found to be significant risk factors. Ulcers associated with PVD showed delayed healing and more complications. Conclusion: The study showed a high rate of peripheral vascular disease among patients with diabetic foot ulcers, with important clinical implications for wound outcomes. Early identification, timely intervention, and management of PVD are vital to lower morbidity, aid ulcer healing, and avoid amputations. This highlights the importance of vascular assessment as part of routine diabetic foot care.
63. Clinicopathological Evaluation and Outcomes of Postdated Pregnancies: An Observational Study
Smita Mallick
Abstract
Background: Postdated pregnancy, defined as a pregnancy that extends beyond 40 completed weeks, remains an important clinical condition associated with increased maternal and perinatal risks. Understanding the clinical presentations and pathological associations of postdated pregnancies is essential for optimizing obstetric management. Objective: To assess the clinical features, maternal complications, and perinatal outcomes in cases of postdated pregnancy and to evaluate their pathological correlates. Methods: A prospective observational study was conducted in the Department of Obstetrics and Gynaecology, Narayan Medical College and Hospital, Sasaram, Bihar, India, for one year. A total of 118 women with pregnancies extending beyond 40 weeks were included. Detailed clinical history, examination, laboratory tests, and ultrasonographic findings were recorded. Maternal outcomes, labor complications, mode of delivery, placental changes, and neonatal outcomes were systematically analyzed. Results: The majority of women were between 21 and 30 years of age, with multiparous women slightly predominating. Common maternal complications included prolonged labor (22%), meconium-stained liquor (18.6%), and increased rates of operative delivery (32.2%). Placental examinations frequently showed signs of aging, including calcification and infarcts in 27.1% of cases. Neonatal outcomes revealed an increased incidence of low Apgar scores at 1 minute (14.4%) and NICU admissions (11.9%), though perinatal mortality remained low. Conclusion: Postdated pregnancies are associated with increased maternal and perinatal morbidity, largely due to prolonged labor, placental senescence, and higher operative intervention rates. Careful antenatal monitoring, timely induction of labor, and judicious decision-making regarding operative delivery are crucial in reducing complications.
64. Clinical Evaluation of Risk Factors in Difficult Cholecystectomy: A One-Year Observational Study
Manoj Kumar Shaw, Anjani Kumar Anjan
Abstract
Background: Cholecystectomy is one of the most commonly performed general surgical procedures worldwide. However, a subset of cases can be categorized as “difficult cholecystectomy” due to technical challenges, increased operative time, or higher risk of complications. Identifying preoperative and intraoperative risk factors that contribute to difficult cholecystectomy is essential for improving surgical planning, patient counseling, and outcomes. Objective: The purpose of this study was to determine clinical, demographic, and radiologic risk factors for difficult cholecystectomy in patients operated on at a tertiary care teaching hospital. Methods: This prospective observational study was carried out in the Department of General Surgery, Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Nalanda, Bihar, India, from January 2024 to December 2024. A total of 132 patients who underwent cholecystectomy were included. Difficulty was measured in terms of operative findings, surgical time, requirement for conversion to open cholecystectomy, and intraoperative complications. Data were collected regarding age, gender, BMI, clinical presentation, history of acute cholecystitis, ultrasonographic findings, and operative details. Statistical analysis was performed to determine associations between risk factors and difficult cholecystectomy. Results: Among the 132 patients, difficult cholecystectomy was observed in 41 patients (31.1%). Significant risk factors identified were male gender (p<0.05), age above 50 years, obesity (BMI >30), history of recurrent cholecystitis, thickened gallbladder wall on ultrasonography (>4 mm), and presence of impacted stones in the gallbladder neck or cystic duct. Conversion to open cholecystectomy was required in 18 patients (13.6%). Conclusion: Difficult cholecystectomy remains a clinical challenge influenced by multiple demographic and pathological factors. 
Preoperative recognition of risk factors can help surgeons anticipate technical difficulties, ensure patient safety, and adopt appropriate surgical strategies.
65. Comparative Study of Pulse Rate, Oxygen Saturation, and Respiratory Effort in Preterm Newborns after Different Feeding Techniques
Ashutosh Raj, Rajiv Kumar
Abstract
Aim: This study aimed to evaluate the impact of different feeding methods (breast milk, formula, and intravenous [IV]) on pulse rate, oxygen saturation, and respiratory effort in preterm newborns. Methodology: A total of 90 preterm infants were enrolled in a randomized controlled trial at the Department of Paediatrics, Jawaharlal Nehru Medical College and Hospital, Bhagalpur, Bihar, India, with 30 infants assigned to each feeding method. Baseline measurements of pulse rate, oxygen saturation, and respiratory effort were recorded, followed by assessments at 15, 30, and 60 minutes post-feeding. Statistical analyses, including ANOVA and Tukey’s HSD post-hoc tests, were employed to compare the physiological responses across the feeding groups. Results: The analysis revealed significant differences in physiological parameters across the feeding methods. At the 15-minute interval, the formula group exhibited a significantly higher pulse rate compared to the breast milk and IV groups (p = 0.048). Oxygen saturation levels were notably higher in the IV group at all time points, with significant differences compared to the formula group (p = 0.03 at 15 minutes, p = 0.045 at 30 minutes, p = 0.015 at 60 minutes). Respiratory effort scores were significantly elevated in the formula group at the 30-minute mark (p = 0.045), with post-hoc analysis confirming differences from the breast milk group (p = 0.040). Conclusion: The findings indicate that while all feeding methods can be safely utilized, intravenous feeding may provide superior benefits in maintaining oxygen saturation levels, whereas formula feeding is associated with higher pulse rates and respiratory effort. These results highlight the importance of tailored feeding strategies to optimize the health outcomes of preterm infants.
66. Risk Assessment of Surgical Site Infection in Head-Neck Surgeries Haematologically
Anshu Chopra, Madhurima Banerjee, Bibek Deb, Nandita Pramanick
Abstract
Introduction: Patients undergoing any surgery tend to have an elevated total leukocyte count as a result of inflammation. This can lead to overdiagnosis of surgical site infection and overuse of health resources. With this study, we aim to establish a trend of white blood cell count and neutrophil-to-lymphocyte ratio (NLR) in the immediate post-operative period, so that diagnosing surgical site infection (SSI) from laboratory values becomes easier and more reliable. Materials and Methods: We established a data repository of post-operative total leucocyte counts and differential counts of patients on POD1, POD3 and POD7. After analysing them, we identified a pattern of variance on those days that helped us determine baseline values of the aforementioned markers. Results: After entering the data in Microsoft Excel, we arrived at cut-off values for each post-operative day according to the patient’s age, sex, pre-operative total count and the surgery undergone. Conclusions: With the help of these cut-off values, we will be able to predict surgical site infection even before any clinical signs are visible. This will help us start treatment as early as possible, reducing patient morbidity and saving healthcare resources.
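The NLR used in the abstract above is conventionally derived from the differential count. A minimal sketch of that calculation follows (not the authors' code; the sample values are hypothetical):

```python
def nlr(total_count: float, neutrophil_pct: float, lymphocyte_pct: float) -> float:
    """NLR = absolute neutrophil count / absolute lymphocyte count.

    Absolute counts are recovered from the differential percentages of the
    total leucocyte count (cells/µL); since the total cancels, the ratio
    reduces to neutrophil% / lymphocyte%.
    """
    neutrophils = total_count * neutrophil_pct / 100.0
    lymphocytes = total_count * lymphocyte_pct / 100.0
    return neutrophils / lymphocytes

# Hypothetical POD1 sample: TLC 12,000/µL, 75% neutrophils, 15% lymphocytes.
print(round(nlr(12_000, 75, 15), 2))  # → 5.0
```

Because the total count cancels, the same ratio can be read directly off the differential, which is why NLR trends are convenient to track across POD1, POD3 and POD7.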
67. Evaluating the Role of Intrathecal Fentanyl in Reducing Post-Dural Puncture Headache Among Parturient Undergoing Cesarean
Priya Ranjan, Zubia Anjum, Vijay Singh
Abstract
Background: Post-dural puncture headache (PDPH) is a frequent and distressing complication of spinal anesthesia in parturients undergoing cesarean section, causing discomfort and delaying recovery. Intrathecal fentanyl has been proposed as an adjunct to reduce the incidence of PDPH. Aim: To evaluate the role of intrathecal fentanyl in reducing the incidence of PDPH among parturients undergoing cesarean section under spinal anesthesia. Methodology: We conducted a prospective, randomized, double-blind trial in 100 ASA II parturients, randomized into two groups: Group F (n = 50) received 0.5% hyperbaric bupivacaine with 25 µg fentanyl; Group C (n = 50) received bupivacaine with saline. Subarachnoid block (SAB) was induced using a 25G Quincke needle. The patients were monitored for 72 hours post-operatively, then followed up daily until day 14, with PDPH evaluated using the Visual Analogue Scale (VAS). Data were analyzed using the Chi-square test and Student’s t-test. Results: Group F demonstrated a lower incidence of both mild (2% vs 6%) and moderate (0% vs 4%) PDPH than Group C. While the distribution, quality, and associated symptoms of headache (nausea, vertigo, backache) were also lower in Group F, these differences did not reach statistical significance (p > 0.05). Conclusion: Intrathecal fentanyl may reduce the severity and incidence of PDPH, suggesting a potential protective effect as an adjunct to spinal anesthesia in parturients undergoing cesarean section.
68. Comparative Clinical Analysis of Conservative and Surgical Treatment of Distal Radius Fractures in Adults
Waybase Hanumant Ashok, Sagar Shriram Maslekar, Anand Balajirao Anerao, Madhusudhan Kale
Abstract
Background: Distal radius fractures (DRFs) are among the most common orthopedic injuries, particularly in elderly patients, and are often associated with osteoporosis and increased fall risk. Both conservative casting and surgical intervention are widely used treatment options, but the optimal approach remains debated. Aim: To compare the clinical outcomes, functional recovery, quality of life, and complication rates of conservative versus surgical management of DRFs in adults. Methodology: This prospective comparative study enrolled 60 adult patients with DRFs treated either conservatively (closed reduction and casting) or surgically (open reduction and internal fixation or percutaneous pinning). Outcomes assessed at 3 and 6 months included pain (VAS), ROM (range of motion), DASH (Disabilities of the Arm, Shoulder, and Hand) score, and SF-36 quality-of-life scores. Complications were documented and statistically analyzed. Results: Surgical management demonstrated substantially reduced DASH scores across both 3 months (62.45 ± 9.12 vs. 69.10 ± 13.85; p=0.038) and 6 months (52.80 ± 8.75 vs. 59.45 ± 14.62; p=0.019). VAS pain scores and ROM were consistently superior among the surgery cohort (p<0.05). Malunion was more frequent with casting (53.3% vs. 23.3%), whereas surgical site infections (10.0%) and radial nerve injuries (13.3%) occurred only in surgical patients. The SF-36 ratings indicated no substantial difference between the groups. Conclusion: Surgical management of DRFs offers faster functional recovery, improved pain control, and reduced malunion risk, though it carries procedure-specific complications. Casting remains viable for low-demand patients or those unfit for surgery. Individualized treatment selection is essential.
69. Impact of Arthroscopic Meniscus Repair Versus Partial Meniscectomy on Knee Function in Young Adults
Anand Balajirao Anerao, Waybase Hanumant Ashok, Sagar Shriram Maslekar, Madhusudhan Kale
Abstract
Background: The knee menisci are crucial for load distribution, shock absorption, and joint stability. Meniscal injuries are common in young adults, and surgical management includes arthroscopic meniscal repair or partial meniscectomy. Functional outcomes differ between these approaches, influencing long-term joint health. Aim: To evaluate and compare the functional outcomes of arthroscopic meniscal repair versus partial meniscectomy in young adults with meniscal injuries. Methodology: A prospective comparative study was conducted on 90 patients aged 18–40 years with MRI-confirmed meniscal tears. Group A (n=35) underwent arthroscopic meniscal repair, and Group B (n=55) underwent partial meniscectomy. Pre- and post-operative functional assessments included Lysholm, Tegner, WOMAC, HSSK, and VAS scores, with follow-up at 3, 6, and 12 months. Statistical analysis utilized Chi-square, independent t-tests, and repeated-measures ANOVA. Results: Demographics were comparable between groups. Surgery duration was significantly longer in the repair group (84.51 ± 4.76 min) versus meniscectomy (46.13 ± 6.94 min, p<0.0001). Postoperative Lysholm (87.83 vs. 75.94, p=0.0068) and WOMAC scores (4.68 vs. 6.94, p=0.0117) favored meniscectomy, indicating faster short-term functional recovery. Tegner, VAS, and HSSK scores showed no significant differences. Both groups achieved comparable pain relief and activity levels over a mean follow-up of 1.7 years. Conclusion: Partial meniscectomy provides quicker short-term functional improvement, whereas meniscal repair preserves native tissue, supporting long-term knee biomechanics. For young, active patients, meniscal repair is preferable when anatomically feasible to reduce the risk of osteoarthritis, despite longer operative times and slower early recovery.
70. A Comparative Analysis of Functional Outcomes Following Intramedullary Nailing Versus Plating in Diaphyseal Fractures of the Humerus
Sagar Shriram Maslekar, Anand Balajirao Anerao, Waybase Hanumant Ashok, Madhusudhan Kale
Abstract
Background: Humeral shaft fractures account for approximately 3% of all fractures and have traditionally been treated conservatively. Surgery may be required if conservative treatment fails, in polytrauma cases, or if the fracture is comminuted. Dynamic compression plating (DCP) and intramedullary nailing (IMN) are the two most popular means of surgical treatment, but their relative effectiveness is still debated. Objectives: To compare the functional results, complications and fracture union of intramedullary nailing versus plating for diaphyseal humerus fractures. Methods: This prospective comparative study was conducted at Parbhani Medical College and RP Hospital with 30 patients over 1 year. Patients were randomly assigned to one of two groups, Group A (IM nailing, n=15) and Group B (plating, n=15), and functional outcome was assessed at 6 weeks, 3 months and 6 months using the DASH and Constant-Murley scores. SPSS v26.0 was used for statistical analysis, and p<0.05 was considered significant. Results: The average follow-up was 10.17 months. High-velocity injuries were frequent (60%). Younger patients were more likely to receive plating and older patients IM nailing. Functional recovery was similar between groups; union rates and complications did not differ. Conclusion: Both plating and IM nailing are acceptable for diaphyseal humeral fractures. Technique selection should be individualized according to the patient’s age, fracture pattern, and severity of injury for the best functional result.
71. Comparative Analysis of Functional Outcomes Following Arthroscopic Meniscal Repair and Partial Meniscectomy in Young Adults in Tertiary Care Hospital
Tarun Dodiya, Setukumar Jasani, Yogesh Kumar Gadhavi
Abstract
Background: Meniscal injury is a common intra-articular knee disorder in young adults, usually due to sporting injury. Arthroscopic partial meniscectomy (APM) and meniscal repair are frequent surgical procedures with differing short- and long-term results. Although APM provides more rapid symptomatic improvement, meniscal repair is potentially more likely to preserve joint health. Objective: This prospective observational study compares functional outcomes, patient satisfaction, and complications after arthroscopic meniscal repair versus APM in young adults aged 18–40 years. Methods: Ninety-eight patients with symptomatic meniscal tears received either meniscal repair (n=49) or APM (n=49). Patient-reported outcomes (Lysholm Knee Score, KOOS4) and the Patient Acceptable Symptom State (PASS) were measured preoperatively and at 3, 6, and 12 months after surgery. Statistical analysis compared functional recovery and complication rates between groups. Results: APM produced significantly better KOOS4 and Lysholm scores at 3 months (p<0.01), reflecting more rapid early recovery. Meniscal repair, however, showed better function and greater PASS positivity at 12 months (p<0.05). Complication rates were low and similar between groups. Conclusion: Partial meniscectomy allows faster short-term improvement, but repair provides better long-term knee function and patient satisfaction. Meniscal tissue preservation is the preferred option in young adults with repairable tears to achieve optimal joint health, even at the cost of slower early recovery.
72. Evaluation of Varying Doses of Dexmedetomidine for Modulating Hemodynamic and Clinical Responses during Extubation in Open Cholecystectomy Patients: A Randomized Controlled Trial
Muni Lal Gupta, Khusboo Rani, Arunodaya Suman
Abstract
Background: Extubation at the end of surgery is often associated with undesirable hemodynamic and airway responses such as tachycardia, hypertension, coughing, and agitation. These responses may be particularly hazardous in patients undergoing open cholecystectomy due to increased intra-abdominal pressure and risk of bleeding. Dexmedetomidine, a highly selective α2-adrenergic agonist, has emerged as a promising agent to attenuate these responses. However, the optimal dose for balancing efficacy and safety remains unclear. Objectives: The present study aimed to compare the effectiveness of three different doses of dexmedetomidine in attenuating the extubation response in patients undergoing open cholecystectomy. Methods: This prospective, randomized controlled trial was conducted at the Department of Anesthesiology, Bhagwan Mahavir Institute of Medical Sciences, Pawapuri, Nalanda, Bihar, India. A total of 120 adult patients (ASA I–II), aged 18–60 years, scheduled for elective open cholecystectomy under general anesthesia were enrolled. They were randomly allocated into three groups (n = 40 each) to receive intravenous dexmedetomidine at doses of 0.25 µg/kg, 0.5 µg/kg, or 1.0 µg/kg, administered 10 minutes before extubation. Hemodynamic parameters (heart rate, systolic blood pressure, diastolic blood pressure, mean arterial pressure), extubation quality score, sedation score, and adverse effects were assessed. Results: Dexmedetomidine at 0.5 µg/kg and 1.0 µg/kg significantly attenuated the rise in heart rate and blood pressure compared to the 0.25 µg/kg group (p < 0.05). The best extubation quality was observed in the 0.5 µg/kg group, with minimal coughing and agitation. The 1.0 µg/kg group, although effective in hemodynamic control, showed higher sedation and incidence of bradycardia. The 0.25 µg/kg dose was inadequate in suppressing extubation response in most patients. 
Conclusion: Dexmedetomidine is effective in attenuating hemodynamic and airway responses during extubation in open cholecystectomy patients. Among the studied doses, 0.5 µg/kg offers the best balance between efficacy and safety, whereas 1.0 µg/kg, although more potent, may be associated with higher sedation and bradycardia risk.
73. Survey of Clinical Practices in the Use of Cuffed Endotracheal Tubes in Pediatric Anaesthesia: An Observational Study
Muni Lal Gupta, Arunodaya Suman, Khusboo Rani
Abstract
Background: The use of cuffed endotracheal tubes (ETTs) in pediatric anesthesia has long been debated due to concerns of airway injury, subglottic stenosis, and postoperative complications. However, recent technological advancements and cuff design improvements have shifted global practices toward the use of cuffed tubes. In India, there is variability in adoption patterns, and data on prevailing practices are limited. Objectives: The present study aimed to assess the current practices, preferences, and perceptions regarding the use of cuffed ETTs in pediatric anesthesia among Indian anesthesiologists. Methods: A cross-sectional questionnaire-based survey was conducted among practicing anesthesiologists across India. A total of 120 participants were included, representing both teaching institutions and private healthcare settings. The survey collected data on frequency of cuffed tube usage in different pediatric age groups, selection criteria, concerns regarding safety, perioperative monitoring methods, and perceived advantages and limitations. Data were analyzed using descriptive statistics. Results: Among the 120 respondents, 78.3% reported routine use of cuffed ETTs in children above 2 years, while 65% also preferred them in infants less than 1 year. The primary reasons cited were improved airway seal, reduced risk of aspiration, and better ventilation control. Concerns included the risk of mucosal injury (42%) and uncertainty regarding cuff pressure monitoring (38%). The majority (72%) utilized cuff pressure manometers, while others relied on pilot balloon palpation. Institutional protocols supporting cuffed tube use were reported by only 46% of respondents, indicating a lack of standardized guidelines across centers. Conclusion: The study highlights a clear shift toward routine use of cuffed endotracheal tubes in pediatric anesthesia among Indian anesthesiologists, driven by perceived clinical advantages. 
However, concerns about safety and inconsistent cuff pressure monitoring practices underscore the need for nationwide consensus guidelines and training programs.
74. A Study of Acid Base, Electrolyte and Haemodynamic Status of Status Epilepticus Patients Among Children in a Tertiary Care Hospital of West Bengal
Sankar Narayan Mishra, Mahaprasad Pal, Sayantani Panda, Kaustav Nayek
Abstract
Introduction: Status epilepticus (SE) is a life-threatening neurological emergency characterized by prolonged or recurrent seizures without recovery of consciousness. The condition is associated with significant disturbances in acid–base balance, electrolytes, and haemodynamic parameters, which may worsen neuronal injury and increase morbidity and mortality. Early recognition and correction of these derangements are crucial in the comprehensive management of SE. Methods: The present study was a prospective observational study conducted at Burdwan Medical College and Hospital, Bardhaman, West Bengal (PIN 713104) over a period of one year, from August 2021 to July 2022. The study population comprised all patients aged between 3 months and 12 years presenting with status epilepticus (SE) who fulfilled the inclusion criteria. A total of 81 patients with SE were enrolled during the study period. Results: Among 81 patients with status epilepticus (mean age 32.4 ± 15.7 months; 59.3% male), generalized seizures were more common (72.8%) than focal (27.2%). Etiologies included CNS infections (24.7%), metabolic (18.5%), stroke/structural (16.0%), and idiopathic (40.8%). Acid–base analysis showed acidemia (pH 7.31 ± 0.08), raised PaCO₂ (46.2 ± 8.5 mmHg), low HCO₃⁻ (20.8 ± 3.2 mEq/L), base excess –3.5 ± 2.1 mEq/L, and elevated lactate (3.2 ± 1.1 mmol/L). Electrolyte disturbances included hyponatremia (132.6 ± 5.8 mEq/L), hypokalemia (3.4 ± 0.6 mEq/L), hypocalcemia (7.9 ± 0.7 mg/dL), and low magnesium (1.6 ± 0.3 mg/dL). Haemodynamics showed tachycardia (112 ± 18 bpm) with near-normal BP and MAP, and reduced oxygen saturation (93 ± 4%). Longer seizure duration correlated with worsening acidosis, hyponatremia, hypokalemia, and higher lactate. Conclusion: Status epilepticus is frequently complicated by metabolic derangements and haemodynamic instability. 
Prompt identification and correction of acid–base imbalance and electrolyte abnormalities, along with haemodynamic stabilization, are essential for improving outcomes in these patients.
75. Demographic Profile of Head and Neck Cancer Patients Excluding Thyroid Cancer: A Single Tertiary Institution-Based Study in Eastern India
Poulomi Saha, Animesh Ghosh, Arunabha Sengupta
Abstract
Introduction: Head and neck cancers (HNCs) are a significant public health concern in India, with a high burden attributed to modifiable risk factors such as tobacco and alcohol use. Understanding the demographic and clinical profile of affected patients is essential for formulating targeted prevention and early detection strategies. Aims: To assess the demographic characteristics, risk factor distribution, clinical staging, histological types, and performance status of patients diagnosed with head and neck cancers, excluding thyroid malignancies, in a tertiary care setting. Materials & Methods: This prospective, observational, institution-based study was conducted at IPGMER and SSKM Hospital, a premier tertiary care teaching institute located in Kolkata, Eastern India. The study was carried out over a period of 18 months, from July 2023 to January 2025. Data were collected from both the outpatient and inpatient departments of oncology. A total of 384 patients presenting with symptoms suggestive of head and neck cancer were included in the study. Results: In our study population of 384 patients, tobacco use was highly prevalent, with 38.5% being smokers (n=148), 32.3% chewers (n=124), and 16.7% using both forms (n=64), while only 12.5% (n=48) reported no tobacco use. This distribution was statistically significant (p < 0.001). Alcohol consumption was also common, reported by 44.8% (n=172) of patients, compared to 55.2% (n=212) who did not consume alcohol, showing a significant association (p = 0.034). Regarding the site of the primary lesion, the oral cavity was the most commonly affected (n=142, 37.0%), followed by the larynx (n=82, 21.4%), hypopharynx (n=64, 16.7%), oropharynx (n=48, 12.5%), nasopharynx (n=22, 5.7%), and other sites (n=26, 6.7%), with this distribution being statistically significant (p < 0.001). 
Histologically, squamous cell carcinoma was the predominant type (n=332, 86.5%), followed by adenocarcinoma (3.1%), mucoepidermoid carcinoma (2.6%), lymphoma (2.1%), and others (5.7%), with a highly significant p-value (<0.001). Conclusion: Head and neck cancers are strongly linked to preventable lifestyle factors and predominantly affect individuals from lower socioeconomic backgrounds. Late-stage presentation is common, highlighting the need for enhanced awareness, early diagnosis, and targeted public health interventions in high-risk populations.
76. Prevalence and Association of Sensorineural Hearing Loss Among Diabetic Patients: A Community-Based Cross-Sectional Study in Jamuhar, Sasaram, Bihar
Akhil Sareen, Ravi Ranjan, Chandrakant Diwakar
Abstract
Background: Type 2 diabetes mellitus (DM) is a chronic metabolic disorder affecting multiple systems, and growing evidence links it to sensorineural hearing loss (SNHL) through mechanisms such as cochlear microangiopathy and auditory neuropathy. Data from rural India remain scarce. Objective: To estimate the prevalence of SNHL among diabetic patients in Jamuhar, Sasaram, Bihar, and assess its association with disease characteristics using audiological tests. Methodology: A cross-sectional community-based study was performed at Narayan Medical College and Hospital, Jamuhar, Sasaram, Bihar, on 100 subjects: 50 with type 2 DM and 50 age- and gender-matched non-diabetic controls. All participants underwent biochemical analysis and audiological tests: Pure Tone Audiometry (PTA), Distortion Product Otoacoustic Emissions (DPOAE), and Brainstem Evoked Response Audiometry (BERA). Statistical significance was set at p < 0.05. Results: SNHL was more common in diabetics (84% vs. 62%) and more severe, with moderate to profound loss in 52% of diabetic subjects versus 28% of non-diabetic subjects. DPOAE indicated abnormal cochlear function in 80% of diabetic subjects versus 56% of controls, pointing to subclinical cochlear dysfunction. BERA showed prolongation of wave V and I–V interpeak latencies across all test intensities (p < 0.001), consistent with delayed conduction along the auditory pathway. The diabetic subjects were older (mean age 52.3 years vs. 34.6 years) and had poorer glycemic control (mean HbA1c 8.1%). Conclusion: Diabetes markedly increases the prevalence and severity of SNHL by disturbing the cochlear and retrocochlear pathways. Routine audiological screening and strict glycemic control are recommended to mitigate auditory complications.
77. Outcomes of Locked vs. Non-Locked Plating in Distal Fibula Fractures: A Retrospective Analysis
Amarnath Chaturvedi, Sanjeev Kumar, Omprakash Kumar
Abstract
Background: Ankle fractures are increasingly common, particularly among elderly patients with osteoporotic bone, posing challenges for stable fixation and optimal healing. Traditional non-locking plates may be insufficient in complex or osteoporotic fractures, whereas locking compression plates (LCPs) offer angular stability and preserve periosteal blood supply, potentially improving outcomes. Aim: To compare the clinical and radiological outcomes, union rates, and complications of locked versus non-locked plating in distal fibula fractures. Methodology: A retrospective observational study was conducted at the Department of Orthopaedics, Nalanda Medical College and Hospital, Patna, India, over one year. Eighty patients with distal fibula fractures meeting inclusion criteria were analyzed. Patient records were reviewed for demographics, fracture classification, implant type, radiographic union, functional outcomes (AOFAS score), and complications. Statistical analysis was performed using SPSS v27, with significance set at p < 0.05. Results: Locking Compression Distal Fibula Plates (LCDFP) were used in older patients (mean 58 years) with more complex fractures, while Semi-Tubular Plates (STP) and Limited-Contact Dynamic Plates (LCDCP) were used in younger patients with simpler fractures. Mean radiographic union was slightly faster with LCDFP (14.5 weeks) versus STP (15.1 weeks) and LCDCP (15.6 weeks). Full weight-bearing and complication rates were comparable across groups, though STP had a higher frequency of symptomatic hardware removal. Overall functional outcomes were satisfactory. Conclusion: Both locking and non-locking plating techniques provide effective fixation for distal fibula fractures. LCDFP is advantageous in older patients with osteoporotic or complex fractures, while STP and LCDCP remain reliable for simpler fracture patterns. Implant selection should be individualized based on patient age, bone quality, and fracture complexity.
78. Prevalence of Extrapulmonary TB in FNAC in Tertiary Care Centre in 2 Year Duration
Sandip Kumar, Shashi Ranjan Roy, Dilip Kumar
Abstract
Aim: The aim of the present study was to determine the prevalence of extrapulmonary tuberculosis among tuberculosis patients visiting a tertiary care centre over a two-year period. Methods: This was a descriptive cross-sectional study conducted among confirmed adult tuberculosis patients visiting the Department of Pathology, Patna Medical College & Hospital, Patna, Bihar, India, over a period of two years; 80 patients were included in the study. Results: Out of the 80 patients included in our study, the prevalence of extrapulmonary tuberculosis was 50 (62.5%). Of these, 32 (64%) were males and 18 (36%) were females. Most were from urban areas, 37 (74%), while 13 (26%) were from rural areas. Most were farmers, followed by students and housewives, while businessmen were the fewest at 1 (2%). Of all cases, 5 (10%) were HIV seropositive. Among the extrapulmonary cases, 33 (66%) had pleural effusion, followed by disseminated tuberculosis in 6 (12%) and abdominal TB in 3 (6%). Of the 50 extrapulmonary cases, 14 (28%) had loss of weight, 21 (42%) had loss of appetite, and 27 (54%) reported an evening rise of temperature. Among the 50 extrapulmonary cases, ascitic fluid was positive in 7 patients and pleural fluid was positive in 38 patients. On imaging, 32 (64%) patients had pleural effusion, followed by fibrocavitation and consolidation in 4 (8%) each, while 6 (12%) patients had normal radiological findings. Conclusion: Extrapulmonary tuberculosis is common in urban areas, mostly in low socioeconomic populations. Tuberculosis presents with a varied spectrum of symptoms. In countries like India, where TB is widely prevalent, the possibility of extrapulmonary TB should always be kept in mind when patients present. A detailed history, combined with a thorough physical examination and relevant investigations, is necessary, particularly for identifying atypical forms of extrapulmonary TB.
Culture is also an essential step in diagnosis. Sophisticated techniques such as PCR may not be available in resource-limited settings. Histopathological examination is essential for confirmation. Management with anti-tubercular drugs (ATD) is effective.
79. Prospective Evaluation of Surgical Management of Clavicle Fractures Using Anatomically Precontoured Locking Plates
Sanjeev Kumar, Amarnath Chaturvedi, Omprakash Kumar
Abstract
Background: Clavicle fractures, particularly midshaft fractures, are common injuries that impair functional use of the shoulder. Anatomically precontoured locking plates and screws offer stable fixation and allow early mobilization, which may improve outcomes compared with conservative management. Aim: To prospectively evaluate the functional and radiological outcomes of displaced mid-third clavicle fractures managed surgically with anatomically precontoured locking plates. Method: A prospective observational study of 40 adult patients (aged 18–60 years) with displaced midshaft clavicle fractures was undertaken at Nalanda Medical College, Patna. The patients were treated with open reduction and internal fixation using anatomically precontoured locking plates and were followed up until 6 months post-operatively. Functional recovery was evaluated using the Constant-Murley Score, and radiological union was assessed on post-operative plain X-rays. Results: Most patients were young adults aged 18–30 years (30%), and fractures were primarily sustained in road traffic accidents (65%). The right side was more commonly involved (60%). For functional outcome, 28 patients (70%) had excellent results, 8 patients (20%) had good results, and 4 patients (10%) had fair recovery; there were no poor outcomes. Radiological union was achieved in all patients without major complications. Conclusion: Anatomically precontoured locking plates for midshaft clavicle fractures provided stable fixation, successful early mobilization, and good functional recovery, indicating an effective and reliable management modality.