Table of Contents

Medical Journal of the Islamic Republic of Iran
Volume 38, Issue 1, Winter 2024

  • Publication date: 1402/10/11
  • Number of titles: 94
  • Atefeh Zamani, Masoumeh Dehghan Manshadi, Mansoureh Akouchekian*, Iman Salahshouri Far Pages 1-5
    Background

    Multiple sclerosis (MS) is a complex human autoimmune-type inflammatory disease of the central nervous system (CNS). MicroRNA-146a (miR-146a) belongs to a family of endogenous, non-coding RNAs, 18-22 nucleotides long, that modulate the innate and adaptive immune responses.

    Methods

    Our study aimed to investigate a possible association between the rs2910164 and rs2431697 polymorphisms of the miR-146a gene and multiple sclerosis in the Iranian population. A total of 60 MS cases and 100 controls were recruited. Single nucleotide polymorphism (SNP) rs2431697 was genotyped using polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP), and SNP rs2910164 was genotyped using tetra-primer ARMS-PCR. Statistical analysis was conducted with the chi-squared test using SPSS version 21.0. The Hardy-Weinberg equilibrium assumption was evaluated.
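    For readers unfamiliar with the analysis described, a minimal R sketch of a case-control genotype association test and a Hardy-Weinberg equilibrium check is given below; the genotype counts are illustrative placeholders, not the study's data.

    ```r
    # Hypothetical sketch of a case-control genotype association test and a
    # Hardy-Weinberg equilibrium (HWE) check; counts are illustrative only.
    geno <- matrix(c(20, 28, 12,    # cases:    GG, GC, CC (placeholder counts)
                     45, 42, 13),   # controls: GG, GC, CC (placeholder counts)
                   nrow = 2, byrow = TRUE,
                   dimnames = list(c("case", "control"), c("GG", "GC", "CC")))

    chisq.test(geno)                # genotype-wise association test

    # HWE check in controls: compare observed genotype counts with
    # expectations from allele frequencies using a chi-squared statistic.
    ctrl <- geno["control", ]
    n    <- sum(ctrl)
    p    <- (2 * ctrl["GG"] + ctrl["GC"]) / (2 * n)   # frequency of the G allele
    expected  <- n * c(p^2, 2 * p * (1 - p), (1 - p)^2)
    hwe_chisq <- sum((ctrl - expected)^2 / expected)
    pchisq(hwe_chisq, df = 1, lower.tail = FALSE)     # HWE P value (1 df)
    ```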

    Results

    The results of the present study suggest that the rs2431697 polymorphism of the miR-146a gene is not associated with multiple sclerosis. However, there is a significant association between the rs2910164 polymorphism of the miR-146a gene and multiple sclerosis in the studied population (P = 0.012).

    Conclusion

    Our data provide evidence that the miR-146a gene may contribute to susceptibility to MS in the Iranian population.

    Keywords: Multiple Sclerosis, MiR-146a, Tetra-Primer ARMS-PCR, PCR-RFLP
  • Mohammad Hasan Bemanian, Mousa Ghelichi-Ghojogh, Seyed Ali Aghapour* Pages 6-9
    Background

    Anaphylaxis is an allergic reaction that occurs with or without stimulation of the immune system. Hymenoptera stings are common causes of anaphylaxis worldwide. Skin tests are the first-line diagnostic measure for Hymenoptera anaphylaxis. The present study aimed to evaluate the safety of a single-step approach to sensitization testing for Hymenoptera venom.

    Methods

    This cross-sectional study was conducted in 2019 in Golestan province, in the north of Iran. The sample population consisted of 140,000 individuals covered by 84 rural healthcare centers in the vicinity of Gorgan, Iran. Thirty-three patients agreed to receive the diagnostic test. In this research, in contrast to the 2011 ACAAI guideline, venom extracts of three types of Hymenoptera were injected intradermally without any dilution at a concentration of 1 μg/mL.

    Results

    The skin test results in the patients stung by honey bee, yellow jacket, and paper wasp were negative in 15.2%, 15.2%, and 21.2% of the cases, respectively. After the test, no allergic reaction was observed, with the exception of a minor skin reaction, which improved within a short time. Patients were monitored during the test, for the following four hours while present at the test site, and for up to 48 hours afterward through follow-up from the healthcare center to the patient's home.

    Conclusion

    The results of our study showed that the non-diluted single injection of Hymenoptera venom was accompanied by no side effects.

    Keywords: Anaphylaxis, Skin Tests, Venoms, Bites, Stings, Hymenoptera
  • Elshazly Abdul Khalek, Hamouda Abdel-Khalek El-Bahnasy*, Mohamad Alshahat Omar, Mohamed Ibrahim Elraghy, Tarek Ahmed Ahmed Dabash, Mahmoud S. Berengy, Elsayed Abozid, Muhammad Saad Reihan Pages 10-15
    Background

    Diabetes is associated with left ventricular remodeling. Myocardial wall stress is a measurable parameter related to ventricular dimension and pressure as well as to myocardial thickness, and it can be measured by echocardiography. The present study aimed to assess the link between heart failure (HF) and echocardiography-derived myocardial wall stress in diabetic patients with ST-elevation myocardial infarction (STEMI) who were managed with revascularization.

    Methods

    This comparative prospective study took place between February 2022 and February 2023. It included 100 diabetic patients who presented with STEMI and were managed by percutaneous coronary intervention (PCI). Patients were selected from the cardiology departments at Al-Azhar University Hospital, Damietta, Egypt. During the hospital stay, patients were checked for HF symptoms and signs. They were also observed for 3 months after discharge for detection of HF. Those who did not develop HF were assigned to group I, and those with HF were assigned to group II.

    Results

    The mean end-systolic wall stress (ESWS) was 77.09 ± 12.22 and 97 ± 13.44 kPa, and the mean end-diastolic wall stress (EDWS) was 12.61 ± 2.76 and 15.87 ± 2.86 kPa in groups I and II, respectively, with significant differences between the 2 groups. The cutoff point to detect HF was 88 kPa for ESWS and 13.5 kPa for EDWS, with a sensitivity of 70% and 79% and a specificity of 80% and 61% for ESWS and EDWS, respectively.

    Conclusion

    Elevated left ventricular (LV) myocardial wall stress is related to increased HF in diabetic patients managed by PCI after STEMI. LV wall stress is a potentially helpful risk stratification tool, using routine echocardiography, to determine the treatment plan according to the risk status.

    Keywords: Heart Failure, Myocardial Wall Stress, Acute Myocardial Infarction, Revascularization
  • Mahmoud Helmy Elsaied Hussein*, Ibrahim Fadl Mahmoud, Yasser MS. Eita, Mohamed A. Ahmed Aglan, Mohammad Seddiek A. Esmaiel, Gamal Abdelshafy Ibrahim Farag, Neazy Abdmokhles Abdelmottaleb, Mohamed Attia Elkahely, Mohamed A Mansour Pages 16-20
    Background

    Predicting the outcome of blunt chest trauma by scoring systems is of utmost value. We aimed to assess the role of the chest trauma scoring system (CTS) in predicting blunt chest trauma outcomes among Egyptians.  

    Methods

    This prospective observational study included 45 patients admitted to the cardiothoracic emergency unit of Al-Azhar University hospitals. We documented their demographic data, history, cause and mode of trauma, vital parameters, and necessary investigations (e.g., chest X-ray and computed tomography) when the patient was admitted to the cardiothoracic department. All patients were assessed using the chest trauma scale (CTS) and followed up until death or discharge.

    Results

    The patients' ages ranged from 18 to 76 years (mean, 42.67 years). Eighty percent were males, and 48% needed mechanical ventilation (MV). The duration of MV ranged from 1 to 5 days (mean, 2.81 days). Twenty-two patients had pneumonia. Eight patients died, with chest trauma scores ranging from 2 to 12 and a median of 6. About 87 percent of patients had unilateral lesions, and 5 had trauma of criminal cause. Road traffic accidents were the most common cause of trauma (60%). There was a significant relation between mortality among the studied patients and each of MV, length of ICU stay, chest trauma scale, laterality of trauma, and associated injuries. There was also a statistically significant relation between the chest trauma scale and the need for MV, the timing of MV, the presence of pneumonia, and mortality.

    Conclusion

    A CTS ≥ 6.5 can predict mortality with a sensitivity of 100.0%, a specificity of 62.2%, and an accuracy of 68.9%, while a score of ≥ 5.5 can predict the development of pneumonia with a sensitivity of 81.8%, a specificity of 78.3%, and an accuracy of 80%.
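    Cutoff-based sensitivity, specificity, and accuracy of this kind can be derived from a score and a binary outcome with an ROC analysis; the sketch below is a hypothetical illustration using the pROC package on simulated placeholder data, not the study's records.

    ```r
    # Hypothetical sketch: deriving a mortality cutoff for a trauma score via
    # ROC analysis (pROC package); the data are simulated, not study data.
    library(pROC)

    set.seed(1)
    cts   <- sample(2:12, 45, replace = TRUE)            # placeholder trauma scores
    death <- rbinom(45, 1, plogis((cts - 6) / 2))        # placeholder outcome

    roc_obj <- roc(response = death, predictor = cts, quiet = TRUE)

    # Youden-optimal threshold with its sensitivity, specificity, and accuracy
    coords(roc_obj, x = "best", best.method = "youden",
           ret = c("threshold", "sensitivity", "specificity", "accuracy"))
    ```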

    Keywords: Chest Trauma, Blunt, Severity, Mortality, Intensive Care
  • Neda Amoori, Bahman Cheraghian, Payam Amini, Seyed Mohammad Alavi* Pages 21-27
    Background

    Tuberculosis is a principal public health issue. Efforts to reduce and control tuberculosis have not achieved the expected success despite the implementation of effective preventive and therapeutic programs, partly because of delays in definitive diagnosis. Therefore, creating a diagnostic aid system for tuberculosis screening can help in the early diagnosis of this disease. This research aimed to use machine learning techniques to identify economic, social, and environmental factors affecting tuberculosis.

    Methods

    This case-control study included 80 individuals with TB and 172 participants as controls. During January-October 2021, information was collected from thirty-six health centers in Ahvaz, southwest Iran. Five different machine learning approaches were used to identify factors associated with TB, including BMI, sex, age, marital status, education, employment status, family size, monthly income, cigarette smoking, hookah smoking, history of chronic illness, history of imprisonment, history of hospital admission, first-class family, second-class family, third-class family, friend, co-worker, neighbor, market, store, hospital, health center, workplace, restaurant, park, mosque, Basij base, hairdresser, and school. The data were analyzed using R statistical software, version 4.1.1.
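    A minimal R sketch of this kind of classifier comparison is given below, assuming the e1071 and randomForest packages and a data frame `tb` with a factor column `status` and the listed predictors; these names are assumptions, and this is not the authors' pipeline (LSSVM and KNN would follow the same pattern with their respective packages).

    ```r
    # Hypothetical sketch of fitting and comparing classifiers for TB status;
    # `tb` is an assumed data frame with a factor column `status` (TB / control)
    # and the listed predictors. Not the authors' actual pipeline.
    library(e1071)          # svm(), naiveBayes()
    library(randomForest)   # randomForest()

    set.seed(42)
    idx   <- sample(nrow(tb), 0.7 * nrow(tb))   # simple 70/30 train-test split
    train <- tb[idx, ]
    test  <- tb[-idx, ]

    models <- list(
      SVM = svm(status ~ ., data = train),
      RF  = randomForest(status ~ ., data = train),
      NB  = naiveBayes(status ~ ., data = train)
    )

    # Accuracy of each model on the held-out set
    sapply(models, function(m) {
      pred <- predict(m, newdata = test)
      mean(pred == test$status)
    })
    ```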

    Results

    According to the calculated evaluation criteria, the accuracy of the SVM, RF, LSSVM, KNN, and NB models was 0.99, 0.72, 0.97, 0.99, and 0.95, respectively; except for RF, the models had high accuracy. Among the 39 investigated variables, 16 factors, including first-class family (20.83%), friend (17.01%), health center (41.67%), hospital (24.74%), store (18.49%), market (14.32%), workplace (9.46%), history of hospital admission (51.82%), BMI (43.75%), sex (40.36%), age (22.83%), educational status (60.59%), employment status (43.58%), monthly income (63.80%), addiction (44.10%), and history of imprisonment (38.19%), were of the highest importance for tuberculosis.

    Conclusion

    The results demonstrated that machine learning techniques are effective in identifying economic, social, and environmental factors associated with tuberculosis. Identifying these factors plays a significant role in preventing the disease and in performing appropriate and timely interventions to control it.

    Keywords: Tuberculosis, Classification, Risk Factor, Machine Learning
  • Mahdi Aghabagheri, Mahmoud Mehrmohammadi*, Khosrow Bagheri Noaparast Pages 28-34
    Background

    A major contribution to the humanities literature has been the development and application of Vygotsky's sociocultural theory in relevant fields. Constructivism as a paradigm is owed to Vygotsky and his efforts. On-the-spot scaffolding in this regard is one of the innovations that can be triangulated with hermeneutic phenomenology to pave the way for a paradigm shift in the educational system broadly and, more narrowly, for critical thinking among medical students. This study aimed to illuminate the alternative to the behavioristic lesson plan, namely on-the-spot scaffolding, in implementing critical thinking, one of the modules of essential skills for doctors of medicine (Adab-e Pezeshki).

    Methods

    This study was qualitative and longitudinal: longitudinal because of the 3 years of involvement, and qualitative because of the study design approach and the use of discourse analysis and hermeneutic phenomenology as tools.

    Results

    Three main findings emerged from the study's qualitative nature: first, students who attended the sessions or who merely completed the assignments created an on-the-spot critical thinking scaffolding design, which is an alternative to a behavioristic lesson plan; second, the students wrote numerous movie reviews in both Persian and English; and third, these film critiques, together with a student-written, instructor-edited paper submitted to ICHPE 2023, form components of a reflective autobiography.

    Conclusion

    There is an urgent need for a paradigm shift and comparative-historical investigations in the medical education system in Iran.

    Keywords: Hermeneutic Phenomenology, Critical Thinking, Medical Education
  • Sedigheh Tehranchi, Farzaneh Palizban*, Maryam Khoshnood Shariati, Naeeme Taslimi Taleghani, Arefeh Fayazi, Mohammad Farjami Pages 35-40
    Background

    Oropharyngeal colostrum priming (OCP) has been proposed as a potential nutritional option for very low birth weight (VLBW) newborns. This study aimed to determine short-term outcomes of early oral colostrum administration in VLBW neonates.  

    Methods

    This open-label randomized controlled trial was conducted on VLBW neonates admitted to Mahdieh Hospital, Tehran, Iran, between February and December 2022. According to the protocol, all eligible neonates were randomized evenly to the intervention group, which received oral colostrum (OC), and the control group, which received no OC. Finally, short-term outcomes of early OC administration were compared between groups using the independent-samples t test, chi-square, and Fisher exact tests.   

    Results

    Of 80 randomized neonates, 37 and 39 from the intervention and control groups entered the final analysis, respectively. Neonates in the intervention and control groups did not significantly differ in terms of peripherally inserted central catheter (PICC) infection (P = 0.728), sepsis (P = 0.904), necrotizing enterocolitis (NEC) (P > 0.999), intraventricular hemorrhage (IVH) (P = 0.141), retinopathy of prematurity (ROP) (P = 0.923), and bronchopulmonary dysplasia (BPD) (P = 0.633). Furthermore, there was no significant difference between groups considering the time to reach 120 cc/kg feeds (P = 0.557), time to reach birth weight (P = 0.157), length of hospitalization (P = 0.532), and mortality rate (P = 0.628).   

    Conclusion

    The results of our study revealed that, despite being safe, early OC administration did not improve any of the short-term outcomes in VLBW neonates.

    Keywords: Colostrum, Very Low Birth Weight, Neonate, Breast Milk
  • Ali Feizi, Bahar Hafezi*, Saeed Bagheri Faradonbeh, Shahram Tofighi Pages 41-46
    Background

    Inequality in the use of dental services is a primary concern of global health, and few studies have been done in this field in Iran. Therefore, the present study aimed to conduct a decomposition analysis of socioeconomic inequalities in the utilization of oral health services.   

    Methods

    This cross-sectional study included 715 households, comprising 2680 people living in Ahvaz, selected using stratified cluster sampling. Data were collected using a questionnaire. A logistic model and Stata software were used for data analysis and for estimating the elasticity of the influencing factors. The social and economic disparities in oral health variables were broken down into determinant components using the Van Doorslaer and Wagstaff technique.
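    For orientation, the concentration index that underlies the Van Doorslaer and Wagstaff decomposition can be computed with the "convenient covariance" formula; the R sketch below uses assumed variable names (`use` for 0/1 utilization, `ses` for the socioeconomic ranking variable) and simulated data, not the study's dataset.

    ```r
    # Minimal sketch of the concentration index, C = (2 / mean(y)) * cov(y, r),
    # where r is the fractional socioeconomic rank. `use` and `ses` are assumed,
    # illustrative variable names, not the study's variables.
    concentration_index <- function(use, ses) {
      r <- (rank(ses, ties.method = "average") - 0.5) / length(ses)  # fractional rank
      2 * cov(use, r) / mean(use)
    }

    # Example with simulated data: utilization rises with SES, so C > 0
    set.seed(1)
    ses <- runif(500)
    use <- rbinom(500, 1, plogis(-1 + 2 * ses))
    concentration_index(use, ses)
    ```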

    Results

    The key factors determining social and economic inequalities in the utilization of these services were insurance status, education level, income quintile, and occupation. Nearly 31% of utilization inequality can be attributed to the insurance status of households. The education level of household members (about 28%) was the second contributor to inequality. The variables of income quintile and occupation together constituted the third contributor, and the age of household members played a negative role in the socioeconomic inequality.

    Conclusion

    The utilization of oral health services can be improved by improving economic and social variables in society. Therefore, including oral health services in insurance plans and primary health care services and supporting people with low-income levels can play an important role in reducing these inequalities.

    Keywords: Socioeconomic, Inequality, Concentration Index, Decomposition Analysis
  • Masoumeh Sadat Abtahi, Mahdi Aghabagheri* Pages 47-49
  • Fatemeh Mohebbi, Kaveh Alavi, Amir Hossein Jalali Nadoushan, Mahdieh Saeidi, Mahnoush Mahdiar, Fahimeh Bakhshijoibari, Seyed Kazem Malakouti* Pages 50-56
    Background

    Paying attention to the needs of patients with psychiatric disorders has recently come into focus. Failure to meet the needs of patients can affect their quality of life. This study aimed to determine the main areas of the needs of patients with severe psychiatric disorders and evaluate their relationship with the quality of life.  

    Methods

    In this cross-sectional study, 174 patients with severe mental illness referred to Iran Psychiatric Hospital for hospitalization or outpatient treatment were enrolled (68 with schizophrenia and schizoaffective disorder, 106 with bipolar disorder type 1). A qualified psychiatry resident interviewed each patient to determine their needs using the Camberwell Assessment of Need Short Appraisal Schedule (CANSAS) and the severity of their illness using the Hamilton Depression Rating Scale (HAM-D), the Positive and Negative Syndrome Scale (PANSS), and the Young Mania Rating Scale. A demographic checklist and the World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire were completed by patients. Data were analyzed using descriptive statistics. Since the distribution of the number of needs was not normal, we used the Mann-Whitney and Kruskal-Wallis tests, with the chi-square test for qualitative variables.

    Results

    The mean total number of patient needs was 9.1 (SD = 3.7). The most frequent unmet needs were intimate relationships (69.5%), sexual expression (65.5%), and information on condition and treatment (51.1%). Unmet needs showed a negative correlation with quality of life (P < 0.001) and a positive correlation with the severity of depression (P = 0.045), negative symptoms (P = 0.001), and general psychopathology (P < 0.001).

    Conclusion

    A higher number of unmet needs of severe psychiatric patients is associated with lower quality of life and more severe disorders.

    Keywords: Bipolar Disorder, Needs Assessment, Quality Of Life, Schizophrenia
  • Shervan Shoaee, Farshad Sharifi, Pooneh Ghavidel-Parsa, Shayan Sobhaninejad, Mohammad-Hossein Heydari, Ahamd Sofi-Mahmudi* Pages 57-67
    Background

    The prevalence of dental caries among the elderly is high worldwide, and dental caries accounts for a major share of the burden of oral diseases. This meta-analysis aimed to determine the dental caries experience among the elderly in Iran.

    Methods

    A systematic review of the published and grey literature on Iranians aged 65 years or older was performed. Six international and local databases provided the most comprehensive population-based studies. National oral health surveys and national disease and health surveys were considered other primary data sources. The quality of the remaining studies was assessed by a modified tool designed based on the STROBE statement checklist for evaluating cross-sectional studies. R version 3.6.0 was used for statistical analysis. Heterogeneity was assessed using Cochran's Q and I² statistics. Subgroup analysis was performed to detect the source of heterogeneity. Funnel plots and Egger's regression intercept test were used to assess publication bias and selective reporting.
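    A hedged sketch of the pooling, heterogeneity, and publication-bias steps described above is shown below, using the metafor package on placeholder study-level means and variances rather than the included studies' data.

    ```r
    # Hypothetical sketch of pooling mean DMFT values across studies with a
    # random-effects model, plus heterogeneity and publication-bias checks.
    # yi / vi are placeholder study means and sampling variances.
    library(metafor)

    yi <- c(26.1, 27.3, 25.8, 26.9, 27.6)   # placeholder study-level mean DMFT
    vi <- c(0.20, 0.15, 0.30, 0.25, 0.18)   # placeholder sampling variances

    res <- rma(yi = yi, vi = vi, method = "REML")  # random-effects pooling
    summary(res)        # includes Cochran's Q test and I^2 for heterogeneity

    funnel(res)         # funnel plot for selective-reporting assessment
    regtest(res)        # Egger-type regression test for funnel asymmetry
    ```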

    Results

    Overall, 3099 sources were found. After excluding ineligible studies, 46 data points covering 10,411 people aged 65 years or older were included in the meta-analysis. The mean pooled decayed, missing, and filled teeth (DMFT) among older people was 26.84 (range, 26.41-27.28). The DMFT was 26.78 (range, 26.12-27.43) in women and 26.91 (range, 26.32-27.50) in men. The mean number of decayed teeth was 1.48 (range, 1.32-1.65). The mean pooled number of missing teeth was 24.83 (range, 24.20-25.46), and the mean pooled number of filled teeth was 0.14 (range, 0.12-0.17). The majority (92%) of the DMFT was attributable to missing teeth.

    Conclusion

    On average, elderly Iranians have almost 5 sound teeth in their mouths. Iranian oral health policymakers should address this considerable burden of dental caries in designing and implementing better oral health policies for the population, especially older Iranian adults.

    Keywords: Aged, Dental Caries, DMFT Index, Iran, Meta-Analysis
  • Asgar Aghaei Hashjin, Rafat Bagherzadeh, Amrollah Faraji, Mahtab Rouzbahani, Pouria Farrokhi* Pages 68-74
    Background

    The likelihood of poor health outcomes for refugees is increased due to a variety of complicated causes. Lack of access to high-quality care during resettlement is frequently cited by migrants. Therefore, this study was carried out to assess the quality of primary care services from the perspective of refugees and migrants.   

    Methods

    This cross-sectional study was conducted in three health networks affiliated with Iran University of Medical Sciences in 2021. Data were collected using a self-administered questionnaire, the validity and reliability of which were checked and confirmed. The questionnaire was completed by 280 randomly selected migrants and refugees. Data were analyzed using the Kruskal-Wallis test, the Mann-Whitney U test, Spearman correlation, exploratory factor analysis, and Cronbach's α with SPSS 22.

    Results

    According to the results, the overall service quality score was 3.86 out of 5. The highest and lowest mean scores were related to efficiency (4.12 ± 0.64) and tangibility (3.28 ± 0.39), respectively. Furthermore, there was a significant relationship between the perception of service quality and gender, education, residence area, and the rate of center visits (P < 0.05).

    Conclusion

    The quality of services was generally rated favorably by the refugees. Managers and decision-makers are recommended to allocate enough funds to equip and upgrade the amenities at health centers to increase the quality of services.

    Keywords: Primary Health Care, Quality Of Health Care, Migrants, Refugees
  • Ali Kabir, Shahrbanoo Abdolhosseini, Ali Zare-Mirzaei, Abdolreza Pazouki, Mohsen Masoodi, Amirhossein Faghihi Kashani* Pages 75-80
    Background

    Obesity and Helicobacter pylori (H. pylori) infection are public health problems worldwide and in Iran. This study aimed to identify the anatomical site yielding the most accurate results for H. pylori detection. Based on gastric mapping, the study evaluated the prevalence of H. pylori according to the pathology of gastric mapping and the accuracy of the antral rapid urease test (RUT) relative to endoscopic findings.

    Methods

    In this cross-sectional study, upper digestive endoscopy and gastric pathology were studied in 196 obese patients who were candidates for bariatric surgery. Statistical analyses were performed using the t test and the chi-square/Fisher's exact test to compare the groups. Sensitivity, specificity, accuracy, positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), negative likelihood ratio (NLR), and odds ratio (OR) were used to compare the RUT with the pathological H. pylori test of each of the six areas of the stomach. A positive pathology result across the 6 regions of the stomach was set as the gold standard in this study.
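    The diagnostic indices named above all follow from a 2x2 table of RUT results against the pathology gold standard; the R sketch below uses placeholder counts, not the study's table.

    ```r
    # Hypothetical sketch: diagnostic indices of RUT against the pathology gold
    # standard from a 2x2 table; tp / fp / fn / tn are placeholder counts.
    tp <- 108; fp <- 6; fn <- 12; tn <- 70

    sens <- tp / (tp + fn)          # sensitivity
    spec <- tn / (tn + fp)          # specificity
    ppv  <- tp / (tp + fp)          # positive predictive value
    npv  <- tn / (tn + fn)          # negative predictive value
    plr  <- sens / (1 - spec)       # positive likelihood ratio
    nlr  <- (1 - sens) / spec       # negative likelihood ratio
    acc  <- (tp + tn) / (tp + fp + fn + tn)   # accuracy
    dor  <- (tp * tn) / (fp * fn)   # diagnostic odds ratio

    round(c(sensitivity = sens, specificity = spec, PPV = ppv, NPV = npv,
            PLR = plr, NLR = nlr, accuracy = acc, OR = dor), 3)
    ```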

    Results

    The most common areas of the stomach for pathological findings of H. pylori were the incisura (116, 59.2%), greater curvature of the antrum (115, 58.3%), lesser curvature of the antrum (113, 57.7%), lesser curvature of the corpus (112, 57.1%), greater curvature of the corpus (111, 56.6%), and cardia (103, 52.6%). The prevalence of H. pylori was 58.2% (114 cases) and 61.2% (120 cases) with the RUT and gastric pathology, respectively. Mild, moderate, and severe H. pylori infection had the highest percentages in the cardia (58, 29.6%), the greater and lesser curvature of the antrum (61, 31.1%), and the greater curvature of the antrum (37, 18.9%), respectively, compared with other sites of the stomach. The most sensitive area for pathologic biopsy was the incisura (96.6%; 95% confidence interval, 91.7-98.7).

    Conclusion

    Given its highest sensitivity, PLR, NPV, and rate of pathological findings of H. pylori, together with the lowest NLR compared with other parts of the stomach, we highly recommend taking the biopsy from the incisura rather than from other anatomical sites for detecting H. pylori, especially if the strategy is to take only one biopsy.

    Keywords: Bariatric Surgery, Obesity, Helicobacter Pylori, Endoscopy, Pathology
  • Homayoun Sadeghi-Bazargani, Mina Golestani*, Mohammad Saadati, Bahram Samadirad, Saber Azami-Aghdash, Ali Jafari-Khounigh Pages 81-87
    Background

    Online reporting systems can establish and maintain the community-authority connection for safety promotion initiatives and their sustainability. The aim of this study was to report the development, implementation, and evaluation of an online community safety reporting system in safe communities in Iran.  

    Methods

    First, a software development life cycle approach was used for design and implementation, which included 7 steps. Subsequently, an online Community Safety Reporting System (CSRS) was developed with two main interfaces: a web-based application and a phone application. The software was developed using suitable programming languages for the web and as a mobile application for Android and iOS systems.

    Results

    During the six months of implementation, we received 80 reports in different safety areas, which were managed by the administrators, and feedback was provided to reporters. User-friendliness and ease of use were the main strengths declared by users. The CSRS is implemented at two levels of usage: public users, who report safety issues, and city administrators; a functional evaluation of the system was carried out through short interviews with users. Moreover, city authorities believed that the system facilitates community participation in decision-making processes. The address of the web page is www.payamiran.ir.

    Conclusion

    The CSRS provides a way for community voices to be heard and facilitates mutual interaction between the community and authorities. It could be used as a community participation tool to ensure the sustainability of safety promotion initiatives.

    Keywords: Safety Reporting System, Evaluation, Client Voice, Injury, Accident, Traffic Accident
  • Ali Kabir, Davood Rasouli, Kamran Soltani Arabshahi* Pages 88-93
    Background

    Given the changing conditions of education, research, and treatment worldwide, especially the recent pandemic and the increasing use of virtual space, digital professionalism needs to be evaluated in faculty members, the people with the most direct and profound influence on the next generation.

    Methods

    This analytical cross-sectional study was conducted in 2023 on 149 faculty members of Iran University of Medical Sciences, who were invited to participate through various methods (SMS, e-mail, and media messages). The link to the Persian standardized questionnaire was made available to participants. If a person received less than 70% of the score in any area, he or she received suggestions for improving his or her situation in that area at the end of answering the questions. The self-administered questionnaire has 5 fields and 33 questions, with a maximum score of 10 points. Spearman and Pearson correlation coefficients and statistical tests consisting of the chi-square test, t test, Mann-Whitney U test, one-way ANOVA, and Kruskal-Wallis H test were used in the analysis.

    Results

    The mean overall score for the principles of digital professionalism was 0.8. Women and basic sciences faculty members had a significantly better status than men and clinical faculty members in the principles of digital professionalism as a whole (P = 0.001 and P = 0.049, respectively). The domain of "knowledge management and information literacy" had significantly lower scores in professors than in other ranks (instructors, assistant professors, and associate professors) (P = 0.039).

    Conclusion

    The mean score for the principles of digital professionalism was acceptable at 80%. Coherent, timely, and up-to-date training should be provided to ensure the effective, safe, and appropriate use of digital technology, especially for men, professors, and clinical faculty members, who had lower scores than others.

    Keywords: Digital Professionalism, Faculty Members, Basic Sciences, Clinical Sciences
  • Somayyeh Shalchi Oghli, Roya Sadeghi*, Ramesh Omranipour, Abbas Rahimi Foroushani, Mahnaz Ashoorkhani, Yaser Tedadi Pages 94-103
    Background

    Stress is an overwhelming feeling in patients with breast cancer (BC). However, the effect of virtual education has not been fully established. Hence, this study aimed to compare the impact of 2 virtual education methods on perceived stress and stress coping in women with BC.

    Methods

    A 3-armed randomized clinical trial was conducted among 315 women with BC who were referred to the Cancer Institute in Tehran. They were randomly assigned to 3 groups: (a) family-based, receiving a family-based training package; (b) peer-support, receiving a peer-support educational package; and (c) control, receiving routine hospital care. Data were collected using a demographic and disease characteristics form, the Perceived Stress Scale (PSS-14), and the Coping Inventory for Stressful Situations (CISS-21) before and 3 months after the intervention.

    Results

    After controlling for the before-intervention scores, the effect of the group factor on perceived stress and on problem-oriented, emotion-oriented, and avoidance-oriented strategies was P < 0.0001, P = 0.015, P < 0.0001, and P = 0.111, respectively. The effect of the confounding factor of BC disease stage on the dependent variables was P = 0.527, P = 0.275, P = 0.358, and P = 0.609, respectively. Before the intervention, the mean scores of perceived stress and of problem-oriented, emotion-oriented, and avoidance-oriented strategies were 32.00 ± 7.03, 19.36 ± 4.68, 25.10 ± 5.90, and 17.65 ± 6.64, respectively; after the intervention, the mean scores of perceived stress and of emotion-oriented and avoidance-oriented strategies decreased.

    Conclusion

    Virtual family-based education appears far more effective than peer support in increasing problem-oriented coping. Providing women with BC with sufficient information and family support to reduce perceived stress should therefore be considered.

    Keywords: Virtual Education, Family-Based, Peer-Support, Breast Neoplasms, Stress
  • Shahram Yazdani, Jalil Koohpayehzadeh, Somaieh Bosak*, Sadegh Abaei Hasani, Kamran Mohammadi Janbazloufar, Mohammad Hossein Ayati Pages 104-110
    Background

    One of the approaches to health workforce planning is supply-based. It has been emphasized that countries should model their health workforce based on evidence and their own context. The objective of this study was to design a supply-based health workforce planning model for specialties and subspecialties in Iran.

    Methods

    This study used Walker and Avant's (2018) theory synthesis framework to construct the model. This method has three steps. First, the focal concept was determined according to the viewpoint of the research team and the needs of the country. Then, a literature review was done to determine related factors and their relationships. In the third step, based on the review and the viewpoint of the research team, the rationale for the connections between components and the graphic model were presented.

    Results

    "Supply" was selected as the focal concept. In the literature review, 42 components were obtained from the systematic review, 43 components obtained from the study of other texts were combined with the opinion of the research team about the field of Iran, and the connections between them were determined. In the third step, the supply model was designed using the Stock and Flow method. Finally, by applying the "functional full-time coefficient", the number of full-time equivalent physicians was calculated.  

    Conclusion

    The presented model is an evidence-based model that follows stock and flow design. Stock is the number of specialties or subspecialties that exist in the labor market. Flow includes inflow and outflow according to the educational pathway in the context of Iran.
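    As a rough illustration of the stock-and-flow logic and the full-time-equivalent adjustment described above, the R sketch below projects specialist supply over time; the specific inflow and outflow components and the coefficient value are illustrative assumptions, not the model's actual parameters.

    ```r
    # Rough, hypothetical illustration of stock-and-flow projection of specialist
    # supply; the inflow/outflow components and the functional full-time
    # coefficient value are illustrative assumptions, not the study's parameters.
    project_supply <- function(stock0, years, graduates, inflow_other,
                               retirement_rate, other_exit_rate, fft_coef) {
      stock <- numeric(years + 1)
      stock[1] <- stock0
      for (t in 1:years) {
        outflow      <- stock[t] * (retirement_rate + other_exit_rate)
        stock[t + 1] <- stock[t] + graduates + inflow_other - outflow
      }
      # Headcount converted to full-time equivalents via the coefficient
      data.frame(year = 0:years, headcount = stock, fte = stock * fft_coef)
    }

    project_supply(stock0 = 1000, years = 5, graduates = 80, inflow_other = 5,
                   retirement_rate = 0.03, other_exit_rate = 0.01, fft_coef = 0.85)
    ```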

    Keywords: Health Workforce, Health Human Resources, Health Planning, Labor Supply, Health Policy, Physicians
  • Armin Khavandegar, Vali Baigi, Mohammadreza Zafarghandi, Vafa Rahimi-Movaghar, Reza Farahmand-Rad, Seyed-Mohammad Piri, Mahgol Sadat Hassan Zadeh Tabatabaei, Khatereh Naghdi, Payman Salamati* Pages 111-117
    Background

    Lengthy hospitalization may lead to increased hospital-acquired complications, including infections, as well as increased costs for both healthcare systems and patients. Few studies have evaluated the impact of various clinical and demographic variables on patients' length of stay (LOS). Hence, in this study, we aimed to investigate the impact of various variables on trauma patients' LOS.

    Methods

    This is a retrospective, single-center, registry-based study of trauma patients admitted to Taleqani Hospital, a major trauma center in Kermanshah, Iran. A minimal dataset (MDS) was developed to retrieve demographic and clinical trauma data. We used univariable and multiple quantile regression models to evaluate the associations of independent variables, including ISS, GCS, and SBP, with LOS. LOS was defined as the time interval between hospital admission and discharge. LOS durations are presented as median (Q1 to Q3) hours. A P value of <0.05 was considered statistically significant.
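    A minimal sketch of a median (and multi-quantile) regression of LOS on the stated predictors is shown below, assuming the quantreg package and a data frame `trauma` with columns `los_hours`, `iss`, `gcs`, and `sbp`; these are assumed names, not the registry's actual fields.

    ```r
    # Hypothetical sketch of quantile regression of length of stay (hours) on
    # injury severity, GCS, and systolic blood pressure; `trauma` and its
    # column names are assumed, not the registry's actual fields.
    library(quantreg)

    fit_med <- rq(los_hours ~ iss + gcs + sbp, tau = 0.5, data = trauma)
    summary(fit_med, se = "boot")   # bootstrap standard errors for the median model

    # The same model at several quantiles of LOS
    fit_q <- rq(los_hours ~ iss + gcs + sbp, tau = c(0.25, 0.5, 0.75), data = trauma)
    summary(fit_q)
    ```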

    Results

    A total of 2708 cases were included in this study, 1989 (73.4%) of whom were male. The median LOS was 87.00 (48.00 to 144.00) hours. When adjusted for systolic blood pressure (SBP), Glasgow Coma Scale (GCS), Injury Severity Score (ISS), and cause of injury, the two characteristics of spine/back injury and multiple trauma were significantly associated with longer LOS, at 43 (20.5 to 65.48) and 24 (10.39 to 37.60) hours more than extremity injuries (P < 0.001 and P = 0.005, respectively). In addition, the patients admitted due to road traffic injuries (RTI) were discharged 16 and 41 hours later than those admitted for falls and cutting/stabbing (P = 0.008 and P < 0.001, respectively). Moreover, the patients with ISS ≥ 16 and 9 ≤ ISS ≤ 15 had a median of 51 (21 to 80) and 34 (22 to 45) more LOS hours, respectively, compared with 1 ≤ ISS ≤ 8 (P < 0.001). The trauma cases with SBP ≤ 90 mmHg on admission had a median of 41 (20 to 62) more hours of hospitalization than those with SBP > 90 mmHg (P < 0.001). Finally, the patients with GCS of 9 to 12 and GCS of 3 to 8 were hospitalized for 39 and 266 hours more, respectively, than those with GCS of 13 to 15 (P < 0.001).

    Conclusion

    Determining independent determinants of prolonged LOS may help identify at-risk patients on admission. Trauma care providers should consider the following risk factors for increased LOS: higher ISS, lower GCS and SBP, multiple trauma or spine injury, and trauma resulting from road traffic injuries. As a result, the impact of extended LOS might be reduced by intervening in the related influencing factors.

    Keywords: Length Of Stay, Wounds, Injuries, Registries, Clinical, Non-Clinical Factors, Classification, Hospital
  • Rouzbeh Kazemi, Alireza Amirbeigloo, Ali Ghotbi, Mahsa Nazifi, Fahimeh Soheilipour* Pages 118-123
    Background

    Hyperglycemia is common in the early acute stroke phase, especially in patients with diabetes. To the best of our knowledge, no study has evaluated the course of hyperglycemia in patients with diabetes during the post-stroke recovery phase.

    Methods

    This observational study was conducted at the Tabassom Rehabilitation Center for Stroke Patients, Tehran, Iran, from 2018 to 2021. Forty-seven consecutive patients with diabetes and stroke were enrolled and included if at least 3 months had passed since their stroke. Any change in glycemic control before and after stroke was assessed by monitoring the drugs used for diabetes treatment and laboratory results. Categorical variables were assessed with the Pearson chi-squared test. Quantitative variables before and after the stroke were analyzed with the paired-sample t test.

    Results

    The mean age was 63.6 ± 6.9 years, and 22 patients were women. The median time from the occurrence of stroke to the first visit was 5 months and 6 days. Glycemic control improved among patients with diabetes during the post-stroke recovery phase. There was a significant decrease of 0.7% ± 1.3% in HbA1c (P = 0.001). The number and dose of drugs needed for diabetes treatment decreased. No significant correlation was found between changes in HbA1c and weight.

    Conclusion

    Despite the initial increase in glycemia in patients with diabetes in the acute phase of stroke, glycemic control improves after stroke, and often, it is necessary to decrease diabetes drugs to prevent hypoglycemia. This topic is important and should be addressed by guidelines and institutions involved in the care of patients with diabetes and stroke.

    Keywords: Diabetes, Glycemic Control, Rehabilitation, Stroke, Post-Stroke Recovery
  • Ashefet Agete, Girma Altaye, Ebrahim Talebi* Pages 124-130
    Background

    Cardiovascular diseases (CVD) represent a leading cause of global mortality, necessitating proactive identification of risk factors for preventive strategies. This study aimed to uncover prognostic factors influencing cardiovascular patient survival.  

    Methods

    This study included a sample of 410 patients selected by simple random sampling. It was conducted at the Tikur Anbessa Specialist Hospital in Addis Ababa, Ethiopia, between September 2012 and April 2016. Cox proportional hazards (PH) and stratified Cox regression models were used for the analysis.

    Results

    Of the cohort, 200 patients (48.8%) survived through subsequent evaluation, while 210 patients (51.2%) died. Blood pressure (BP), specific CVD type, and education level (EL) showed nonproportionality in scaled Schoenfeld residuals (P < 0.001), prompting stratification. The inadequacy of the Cox proportional hazards model led to favoring the stratified Cox model. Notably, EL, BP, cholesterol level (CL), alcohol use (AU), smoking (SU), and pulse rate (PR) were statistically significant (P < 0.001). The assumption of no interaction in the model, with disease types as strata, was found acceptable. Different cardiovascular conditions served as distinct strata, in which EL, AU, BP, PR, CL, and SU emerged as variables significantly associated with the mortality of patients with CVD.
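    The proportional hazards check and the stratified model described above can be sketched with the survival package; the data frame `cvd` and its column names below are assumptions for illustration, not the study's dataset.

    ```r
    # Hypothetical sketch of a Cox PH fit, a scaled Schoenfeld residual check,
    # and a Cox model stratified by CVD type; `cvd` and its columns are assumed.
    library(survival)

    fit_ph <- coxph(Surv(time, status) ~ education + bp + cholesterol +
                      alcohol + smoking + pulse, data = cvd)
    cox.zph(fit_ph)       # scaled Schoenfeld residual test of proportional hazards

    # If proportionality fails for CVD type, stratify on it instead of adjusting
    fit_strat <- coxph(Surv(time, status) ~ education + bp + cholesterol +
                         alcohol + smoking + pulse + strata(cvd_type), data = cvd)
    summary(fit_strat)
    ```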

    Conclusion

    Implications stress the imperative of widespread awareness among policymakers and the public concerning cardiovascular disease incidence. Such awareness is pivotal in mitigating identified risk factors, guiding more effective healthcare interventions tailored to the multifaceted challenges posed by cardiovascular health.

    Keywords: Cardiovascular, Cox Proportional Hazard Model, Cox-Snell Residual, Noncommunicable Disease, Stratified Cox Regression Model
  • Elham Davtalab Esmaeili, Leila R. Kalankesh, Ali Hossein Zeinalzadeh, Alireza Ghaffari, Saeed Dastgiri* Pages 131-137
    Background

    One of the most crucial objectives of policymakers is to enhance the population's overall health. Establishing a surveillance system is a way to achieve this goal. The Behavioral Risk Factor Surveillance System (BRFSS) is a national system that collects data on the health-related behaviors of United States residents using the Behavioral Risk Factor Surveillance System Questionnaire (BRFSSQ). This survey aims to reduce risk behaviors and their consequences. Given that the cultural environment within each country may affect how behaviors are assessed, this study aimed to develop and cross-culturally adapt a Persian version of the questionnaire and to assess the validity and reliability of the PBRFSSQ.

    Methods

    In this cross-sectional study, 250 individuals were enrolled using stratified sampling between August 2022 and April 2023. The six-step translation and testing method proposed by Sousa et al was used. Content and face validity were calculated, and Cronbach's alpha and test-retest reliability were computed.

    Results

    Of all participants, 54.5% were male, and 69% were aged 30 to 65 years. The scale content validity index was 0.95. The intraclass correlation coefficient (ICC) was 0.86, 0.88, and 0.87 for the core, optional, and total components, respectively. Furthermore, an overall Cronbach's alpha coefficient of 0.85 was obtained.

    Conclusion

    This tool was highly valid and reliable for assessing risky behaviors among the Iranian general population.

    Keywords: Validity, Reliability, Behavior, Psychometrics, Risk Factor
  • Mahnaz Solhi, Tahereh Dehdari, Mahasti Emami Hamzehkolaee*, Hoda Shirafkan, Angela Hamidia Pages 138-145
    Background

    Nurses' resilience in the care of patients with Coronavirus Disease 2019 (COVID-19) is essential. This study aimed to develop and validate an instrument for assessing nurses' resilience control resources in the COVID-19 pandemic.   

    Methods

    In this qualitative study, the initial draft of the instrument, based on a 5-point scale, was prepared using conventional content analysis drawing on a literature review and semi-structured interviews with 20 nurses. The instrument's face validity and content validity were examined with 15 nurses and 15 experts, and construct validity was assessed in 482 nurses recruited by convenience sampling. Data were analyzed in SPSS version 24 using descriptive indexes and analytic tests.

    Results

    Out of 54 items, 18 were confirmed by the expert panel, and these items had content validity ratio and content validity index scores higher than 0.79. According to the results of an exploratory factor analysis, the tool has 4 dimensions: God, chance, internal locus of control, and powerful others, which accounted for 48.06% of the total variance. Confirmatory factor analysis (CFA) indices confirmed the model fit (χ2/df = 1.846; comparative fit index = 0.921; incremental fit index = 0.923; root mean square error of approximation = 0.059; goodness of fit index = 0.905). The reliability of the instrument was acceptable (Ω > 0.70, α > 0.7, CR > 0.60, and intraclass correlation coefficients > 0.70).
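    A minimal lavaan sketch of a 4-factor confirmatory model with the fit indices reported above is shown below; the item names (q1-q18), their assignment to factors, and the data frame `nurses` are placeholders, not the instrument's actual structure.

    ```r
    # Hypothetical sketch of a 4-factor CFA and fit indices; the item-to-factor
    # assignment (q1-q18) and `nurses` are placeholders, not the real structure.
    library(lavaan)

    model <- '
      god      =~ q1 + q2 + q3 + q4 + q5
      chance   =~ q6 + q7 + q8 + q9
      internal =~ q10 + q11 + q12 + q13 + q14
      powerful =~ q15 + q16 + q17 + q18
    '

    fit <- cfa(model, data = nurses, estimator = "ML")
    fitMeasures(fit, c("chisq", "df", "cfi", "ifi", "rmsea", "gfi"))
    ```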

    Conclusion

    The developed tool can be used to measure the control resources of nurses' resilience in caring for patients with COVID-19. It can help identify focus areas for developing appropriate interventions.

    Keywords: Health Locus Of Control, Resilience, COVID-19, Measurement Tools
  • Ehsan Moradi-Joo, Mohsen Barouni*, Leila Vali, Saeid Mahmoudian Pages 146-154
    Background

    Early diagnosis of hearing loss and timely interventions are important to minimize the consequences of this condition, especially for children. This research was conducted to analyze the newborn hearing loss screening program in Iran.  

    Methods

    This qualitative study was conducted using the content analysis method, based on the CIPP model, in 2023. The snowball method was used to recruit a sample with maximum diversity. The criteria for selecting interviewees included having at least three years of experience in the newborn hearing loss screening program and sufficient knowledge of the field. To ensure the reliability of the results, the four criteria proposed by Lincoln and Guba were used. Data analysis was conducted using MAXQDA 2022 software.

    Results

    In the current research, using content analysis within the CIPP framework and based on the viewpoints of the 40 interviewees, the management requirements of the newborn hearing loss screening program were categorized into four main categories: context, input, process, and output. Eight subcategories were identified in the context dimension, four in the input dimension, seven in the process dimension, and four in the output dimension.

    Conclusion

    According to the findings of this research, proper implementation of the newborn hearing loss screening program requires pilot studies, needs assessments, evidence-based programs, and epidemiological studies, as well as prioritization of services and resources. In addition, communication between service delivery levels needs to be improved, and attention should be paid to personnel motivation and to the screening programs themselves.

    Keywords: Program Analysis, Hearing Loss Screening Program, Infants, Iran, CIPP Model, Directed Qualitative
  • Fatemeh Mohseni, Aeen Mohammadi, Nasim Kajavirad, Kamal Basiri, Mahboobeh Khabaz Mafinejad* Pages 155-163
    Background

    Conflict management skills include the ability of team members to actively use appropriate methods and strategies in different conflict situations. Considering the necessity of effective training in conflict management skills for medical students as a member of healthcare teams, this scoping review study aimed at reviewing the appropriate methods for teaching conflict management to medical students.  

    Methods

    In this scoping review, the PubMed, ERIC, ProQuest, Web of Science (WoS), and Scopus databases were systematically searched up to May 21, 2023. Titles, abstracts, and full texts were screened independently by 2 researchers. The quality of the articles was assessed using the Best Evidence Medical Education (BEME) tool. A descriptive synthesis was then performed, and the results were reported. The Kirkpatrick model was used to classify the assessment of educational outcomes.

    Results

    Out of 2888 retrieved studies, 19 were included. Although active and interactive teaching methods such as role-play, group discussion, and interactive workshops were the most frequently used, the results did not demonstrate the superiority of any one method over the others.

    Conclusion

    Based on the results of this scoping review, further research should evaluate the effectiveness of conflict management training methods by focusing on the randomized controlled trial design and using standard and valid tools to assess educational outcomes.

    Keywords: Teaching, Medical Students, Conflict Management, Scoping Review
  • Hamed Tayyebi, Sajad Noorigaravand, Mohammad Reza Baheaddini, Elham Mohammadyahya, Ava Parvandi, Saeid Shirvani, Ali Yeganeh* Pages 164-167
    Background

    In extra-articular distal femoral fractures (EDFFs), nonunion is a serious complication that occurs rarely. In this study, we examined how longer preservation of initial fracture hematoma by delaying the osteosynthesis (OS) affects the fracture union.  

    Methods

    This retrospective cohort study included 98 patients with EDFFs. The OS was done within 2 days of injury in 50 patients (early OS group) and after 2 days of injury in 48 patients (late OS group). Time to callus formation and fracture union, bleeding amount, surgical duration, pain, knee range of motion, knee function, and postoperative complications, including nonunion, knee deformity, infection, and revision, were compared between the 2 groups. Statistical analyses were done with SPSS. Means were compared between the 2 groups with an independent t test or its nonparametric counterpart, and categorical variables were compared using the chi-square or Fisher's exact test. P < 0.05 was considered statistically significant.

    Results

    The mean time to callus formation was 47.1 ± 17.3 days in the early OS group and 46.9 ± 19.7 days in the late OS group (P = 0.950). The mean time to fracture union was 114.9 ± 21 days in the early OS group and 117.4 ± 28.8 days in the late OS group (P = 0.630). The mean operation time and bleeding amount did not differ significantly between the 2 groups (P = 0.230 and P = 0.340, respectively). Knee range of motion, pain, and function were not notably different (P = 0.620, P = 0.790, and P = 0.770, respectively). Nonunion occurred in 3 patients of the early OS group and 2 patients of the late OS group. Other complications were also comparable between the 2 study groups.

    Conclusion

    Delayed OS in EDFF patients has no significant effect on bone healing and fracture union. Further well-designed studies are required to confirm these results.

    Keywords: Distal Femur Fractures, Osteosynthesis, Hematoma, Fracture Nonunion, Fracture Healing
  • Mohammad Hasan Bagheri, Mona Kabiri, Nema Mohamadian Roshan, Seyed Yavar Shams Hojjati, Ezzatollah Rezaei* Pages 168-171
    Background

    Fat graft surgery is one of the most effective procedures in plastic surgery, but because some patients request multiple surgeries and these cases sometimes take hours, the viability of the fat graft can be endangered. In this study, we aimed to evaluate the viability of adipose tissue aspirated with a syringe and stored at refrigerator (4°C) and freezer (–20°C) temperatures.

    Methods

    This was a cross-sectional study. After receiving the ethics committee's approval (IR.MUMS.MEDICAL.REC.1401.423), 17 volunteers entered the study. The harvested fat tissue sample was divided into 3 parts, each of which was transferred to a separate sterile tube. The first tube was sent to the laboratory for preliminary examination of fresh fat, and the second tube was kept in a 4°C refrigerator for 72 hours. The sample from the third tube was first passed through a strainer and, after drying, was transferred to a –20°C freezer for 72 hours. After treatment with trypsin, the samples were centrifuged using the Coleman method. Finally, 3 layers were formed, and the white middle layer was extracted as a fat cell suspension. Tissue samples were stained with trypan blue, and the percentage of viable cells was calculated using an optical microscope.

    Results

    There was a significant difference in the mean number and percentage of viable cells among the 3 groups. Samples stored in the 4°C refrigerator had significantly greater cell viability than those in the –20°C freezer (mean difference, 72.842%; P < 0.001).

    Conclusion

    Our findings showed that after 72 hours at 4°C, adipose tissue has significantly higher survival than at –20°C (98.93% vs 75.31%). Since the survival of fat cells is one of the direct determinants of fat retention, it can affect the results after surgery. The present study recommends fresh adipose tissue for immediate transplantation unless there is an urgent need for cold storage.

    Keywords: Temperature Of Storage, Adipocyte Survival, Cell Viability
  • Leili Iranirad, Mohammad Saleh Sadeghi*, Seyed Fakhreddin Hejazi Pages 172-177
    Background

    Contrast-induced nephropathy (CIN) or contrast-induced acute kidney injury (CI-AKI) refers to an acute kidney injury (AKI) occurring after exposure to contrast media, commonly used in diagnostic procedures or therapeutic angiographic interventions. Recently, Na/K citrate, used for urine alkalinization, has been assessed for preventing CIN. This experiment evaluated Na/K citrate’s efficacy in preventing CIN in high-risk patients undergoing cardiac catheterization.  

    Methods

    This prospective randomized clinical trial involved 400 patients with moderate- to high-risk factors for CIN undergoing elective percutaneous coronary intervention (PCI). They were randomly assigned to either the control or the Na/K citrate group. The Na/K citrate group (n = 200) received a 5-g Na/K citrate solution diluted in 200 mL of water 2 hours before and 4 hours after the first contrast administration, along with intravenous hydration for 2 hours before and 6 hours after the procedure. The control group (n = 200) received only intravenous hydration. Serum creatinine (SCr) levels were measured before contrast exposure and 48 hours afterward. CIN was defined as a 25% or >0.5 mg/dL increase in SCr 48 hours after contrast administration. The significance level was set at P < 0.05.
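    The CIN criterion stated above is a simple rule on the pre- and post-contrast creatinine values; a small hedged check, with assumed variable names and illustrative values, is shown below.

    ```r
    # Hypothetical sketch of flagging CIN from baseline and 48-hour serum
    # creatinine (mg/dL); `scr_base` and `scr_48h` are assumed variable names.
    cin_flag <- function(scr_base, scr_48h) {
      (scr_48h - scr_base) / scr_base >= 0.25 | (scr_48h - scr_base) > 0.5
    }

    cin_flag(scr_base = c(1.0, 1.4, 0.9), scr_48h = c(1.3, 1.5, 1.5))
    # TRUE FALSE TRUE  -> the first and third illustrative patients meet the definition
    ```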

    Results

    CIN was observed in 33 patients (16.5%) in the control group and 6 patients (3%) in the Na/K citrate group. The incidence of CIN differed significantly between the 2 groups 48 hours after administration of the radiocontrast agent (P < 0.001).

    Conclusion

    Our results show that Na/K citrate is helpful and substantially reduces the incidence of CIN.

    Keywords: Contrast Media, Citrate, Percutaneous Coronary Intervention
  • Elahe Askarzade, Hasan Abolghasem Gorji*, Jalal Arabloo Pages 178-186
    Background

    The gradual movement toward universal health coverage (UHC) is an important issue in many countries. The aim of this study was to identify the role of supplementary health insurance in achieving universal coverage.

    Methods

    This comprehensive review was conducted to identify the role of supplementary health insurance in achieving universal health coverage. A total of 4894 articles were found by searching the Scopus, PubMed, and Web of Science databases, and 42 articles were ultimately selected. Articles were screened against title and abstract criteria, and a thematic analysis approach was used to analyze the collected data.

    Results

    The review identified 52 subdimensions within 7 dimensions. Policymakers can draw on international experiences to ensure that private health insurance contributes to achieving universal health coverage by providing clarity within the national health financing policy framework regarding the role of private health insurance; enhancing understanding of how supplementary health insurance affects the performance of the healthcare system; improving oversight of private health insurance; regulating financial protection and consumer support; and implementing thorough market surveillance and proper allocation of health subsidies between the private and public sectors.

    Conclusion

    Supplementary insurance holds promise as a complementary tool in achieving universal health coverage. Addressing gaps in primary insurance and providing additional financial protection can contribute to enhanced access, improved quality of care, and reduced financial barriers to healthcare services. However, careful attention must be given to affordability, equity, regulation, and coordination with primary insurance schemes to ensure its effective implementation and prevent unintended consequences.

    Keywords: Health Insurance, Supplementary Insurance, Universal Health Coverage
  • Parisa Hosseini Koukamari, Roya Nikbakht, Mahmood Karimy, Zahra Mohammadi* Pages 187-195
    Background

    Complaints of the arm, neck, and shoulder (CANS) in the workplace are becoming more prevalent among employees. The Maastricht Upper Extremity Questionnaire (MUEQ) assesses upper extremity complaints in 7 domains: workstation, body posture, break time, job control, job demands, work environment, and social support. The aim of the present study was to translate, adapt, and validate the Persian version of the MUEQ among Iranian office workers.

    Methods

    The psychometric evaluation of the Persian version of the MUEQ employed a comprehensive methodological approach comprising face and content validity assessments, confirmatory factor analysis (CFA), and Cronbach's alpha. A panel of 10 experts assessed the face and content validity of the instrument. In the second phase, through a cross-sectional study, the validity and reliability of the questionnaire were measured by CFA and Cronbach’s alpha in a sample of 420 people from the target population in Tehran, Iran.   

    Results

    The mean age of the participants was 41.40 ± 7.80 years. Examination of upper limb complaints showed that neck pain was the most common complaint among office workers, with a prevalence of 65%. The CFA results confirmed the questionnaire's structure, with 59 items grouped into 7 subscales and fit indices of comparative fit index = 0.87, root mean square error of approximation = 0.08, and goodness of fit index = 0.9. The questionnaire demonstrated strong internal consistency, with Cronbach's alpha values of ≥0.9 for all subscales.

    Conclusion

    The psychometric evaluation of the Persian version of the MUEQ showed that it is a valid and reliable tool for evaluating psychosocial factors in the work environment. Identifying the psychosocial factors that influence musculoskeletal problems allows better planning of behavior change and of constructive interventions. By addressing psychosocial determinants of musculoskeletal issues at both the individual and organizational levels, we can enhance employees' awareness, self-efficacy, and ability to manage their musculoskeletal health and make informed decisions about their well-being.

    Keywords: Office Worker, Psychometric Properties, Workplace, Musculoskeletal Disorder, CANS, MUEQ
  • Mojgan Barati, Mahin Najafian, Najmieh Saadati, Maryam Motefares* Pages 196-201
    Introduction

    Twin pregnancy is associated with a high risk of mortality and morbidity. Estimating the weight difference between the fetuses with a reliable method is necessary to prevent possible complications. This study was conducted to compare the association of estimated fetal weight (EFW) discordance and abdominal circumference (AC) discordance with birth weight discordance in twins.

    Methods

    This was a descriptive-analytical, retrospective study. The statistical population comprised all mothers with twin pregnancies referred to Imam Khomeini Hospital in Ahvaz from 2017 to 2019; the sample was determined by census (540 people). Estimated fetal weight (EFW) was calculated from abdominal circumference (AC), head circumference (HC), femur length (FL), and biparietal diameter (BPD) using the Hadlock formula. The EFW discordance and AC discordance were then calculated and compared with birth weight discordance. Data were analyzed using SPSS 18 with the unpaired two-tailed t test and the Pearson correlation test.
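
    For orientation, the calculation can be sketched as follows. The abstract does not state which Hadlock variant was used, so the four-parameter Hadlock model and the usual inter-twin discordance ratio shown below are offered only as commonly cited forms, not as the authors' exact method.

```latex
% One widely cited Hadlock model (BPD, HC, AC, FL in cm; EFW in g) -- assumed variant
\log_{10}(\mathrm{EFW}) = 1.3596 + 0.0064\,\mathrm{HC} + 0.0424\,\mathrm{AC} + 0.174\,\mathrm{FL}
  + 0.00061\,\mathrm{BPD}\cdot\mathrm{AC} - 0.00386\,\mathrm{AC}\cdot\mathrm{FL}

% Inter-twin discordance, commonly expressed relative to the larger twin
\mathrm{Discordance}\,(\%) = \frac{X_{\text{larger}} - X_{\text{smaller}}}{X_{\text{larger}}} \times 100,
\quad X \in \{\mathrm{EFW},\ \mathrm{AC},\ \text{birth weight}\}
```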

    Results

    The results showed that the mean percentage weight difference in twin pregnancies was 9.25% with the EFW method, 9.89% with the AC method, and 10.72% at birth. The correlation of the inter-twin weight difference with the difference at birth was higher for the AC method (r = 0.922, P < 0.001) than for the EFW method (r = 0.69, P < 0.001). In detecting discordance greater than 20% and 25%, the diagnostic power of AC was good, whereas that of EFW was moderate.

    Conclusion

    Therefore, the AC method has appropriate accuracy and agreement for evaluating the weight and weight difference of twin fetuses. A further large prospective study evaluating the diagnostic performance of AC and EFW discordance according to gestational age at scan, cut-off point, and maternal and placental characteristics is needed to determine the true diagnostic accuracy of ultrasound in predicting growth discordance in twin pregnancy and the optimal subsequent management.

    Keywords: Abdominal Circumference Difference, Estimated Fetal Weights Twins, Birth Weight
  • Shahin Soltani, Kamran Arvan, Behzad Karami Matin, Javad Ghoddoosinejad, Fardin Moradi, Hamid Salehiniya* Pages 202-209
    Background

    People with disabilities (PWD) typically face a range of obstacles when accessing healthcare, particularly compared with the general population. This challenge becomes more pronounced for PWDs in lower socioeconomic groups. This study aimed to assess socioeconomic-related disparities in financial access to rehabilitation services among Iranian PWDs.

    Methods

    A total of 766 Iranian PWDs aged ≥18 years participated in this cross-sectional study. We employed the concentration index (C) to estimate socioeconomic inequality in accessing rehabilitation services.  
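
    As a rough sketch of how such an index is typically computed (the authors' exact implementation is not given in the abstract), the standard "convenient covariance" formula can be coded as follows; the function and variable names are illustrative only.

```python
import numpy as np

def concentration_index(h, ses):
    """Concentration index of a health/access variable over socioeconomic rank.

    h   : array-like health or access variable (e.g., 1 = inadequate financial access)
    ses : array-like socioeconomic score, used only for ranking households

    Illustrative sketch of the standard formula C = 2*cov(h, r)/mean(h),
    where r is the fractional socioeconomic rank (poorest first).
    """
    h = np.asarray(h, dtype=float)
    order = np.argsort(np.asarray(ses))                    # poorest household first
    ranks = np.empty(len(h), dtype=float)
    ranks[order] = (np.arange(len(h)) + 0.5) / len(h)      # fractional rank in (0, 1)
    return 2.0 * np.cov(h, ranks, bias=True)[0, 1] / h.mean()
```

    With this convention, a negative index—such as the C = -0.228 reported in the Results below—indicates that the measured outcome is concentrated among poorer respondents.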

    Results

    In this study, 766 Iranian adults aged 18 to 70 took part, with a mean age of 36.50 (SD, ±10.02) years. The findings revealed that 72.15% (n = 469) of participants had to borrow money to cover the costs of rehabilitation services. The concentration index (C = -0.228, P = 0.004) demonstrated a notable concentration of inadequate financial access to rehabilitation services among individuals with lower socioeconomic status (SES). Decomposition analysis identified the wealth index as the primary contributor to the observed socioeconomic disparities, accounting for 309.48%.  

    Conclusion

    Our findings show that socioeconomic inequalities disproportionately impact PWDs in lower socioeconomic groups. It is recommended that efforts be made to enhance the national capacity for monitoring the financial protection of PWDs and to develop equitable mechanisms that promote prepayment and risk pooling, thus reducing reliance on out-of-pocket payments at the time of service utilization.

    Keywords: Inequality, Socioeconomic Factors, Concentration Index, Rehabilitation, Access To Health Care, Iran
  • Rasool Samimi, Afra Hossein Panahi, Roja Zaboli, Amir Peymani, Samaneh Rouhi*, Somayeh Ahmadi Gooraji, Neda Rajai Pages 210-221
    Background

    Polymorphisms in the vitamin D receptor (VDR) gene play a role in susceptibility to pulmonary tuberculosis (TB). Given the importance of these polymorphisms and their association with pulmonary TB, this study aimed to investigate the prevalence of VDR polymorphisms in people with pulmonary TB.

    Methods

    The search covered studies published from 2009 to 2023 and followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) checklist was used to appraise article quality. Data were entered into STATA version 14; fixed-effects and random-effects models, effect sizes (ES), and the Q test (P < 0.10) were used for analysis at a 95% confidence interval (CI). Two-sided statistical tests were considered with α = 0.05.
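
    For context, between-study heterogeneity in such meta-analyses is commonly judged with Cochran's Q statistic; the form below is the standard definition and is shown only as background, since the abstract does not spell out the computation.

```latex
% Cochran's Q for k study estimates \hat{\theta}_i with inverse-variance weights
Q = \sum_{i=1}^{k} w_i \left( \hat{\theta}_i - \bar{\theta} \right)^2,
\qquad w_i = \frac{1}{\hat{v}_i},
\qquad \bar{\theta} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i}
% Under homogeneity, Q ~ \chi^2_{k-1}; heterogeneity is declared here at P < 0.10.
```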

    Results

    In this research, 28 articles were analyzed. The polymorphisms showed a significant relationship with susceptibility to pulmonary TB (P = 0.000), and significant heterogeneity (P = 0.000) was seen between polymorphisms. After TaqI (95% CI, 0.34-0.77; P = 0.000; ES = 56%), the most frequent polymorphisms were FokI (95% CI, 0.39-0.46; P = 0.000; ES = 43%), ApaI (95% CI, 0.31-0.48; P = 0.000; ES = 39%), and BsmI (95% CI, 0.24-0.50; P = 0.000; ES = 37%).

    Conclusion

    ApaI, BsmI, FokI, and TaqI polymorphisms were found in patients with pulmonary TB, with TaqI polymorphisms being the most frequent. Monitoring vitamin D levels and prescribing vitamin D may be needed in these patients.

    Keywords: Vitamin D Receptor Genes, Polymorphisms, Pulmonary Tuberculosis
  • Shiva Malgard, Sirous Panahi*, Leila Nemati Anaraki, Shadi Asadzandi, Hossein Ghalavand Pages 222-229
    Background

    The present study was motivated by issues with earlier studies on documenting knowledge and experiences. This scoping review investigates and maps the procedures for documenting organizational knowledge and experiences.   

    Methods

    Following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) extension for Scoping Reviews (PRISMA-ScR) guidelines, a scoping review was conducted. Data were obtained by searching PubMed, Web of Science, Scopus, ProQuest, Embase, and Emerald Insight databases and Persian databases, such as Magiran, Noormags, and Ensani. The selected terms were searched using the Boolean AND/OR operators, phrases, parentheses, and truncations in the title, abstract, keywords, and text word fields. The inclusion criteria were resources relevant to the research question, studies in English and Persian, original research articles, and resources published between 2011 and 2022. Finally, 8 related papers were selected as the research population after screening records.    

    Results

    The review of the selected studies indicates that there have been different steps for documenting knowledge and experiences according to the subject's scope and the goals of the studies. The included articles revealed numerous steps for documentation—including planning, acquisition, registration, evaluation, submission, maintenance, publication, application, payment, and compensation.   

    Conclusion

    Although a systematic mechanism for documenting knowledge and experience is essential, many different processes and phases have been proposed for documentation, and a comprehensive review that synthesizes and integrates past findings is still lacking. Several shortcomings in past research on documenting knowledge and expertise prompted the present study. Its results can be of great use to managers and employees of various organizations in areas such as creating standards for documenting knowledge and experiences, organizational-structural planning in this field, and training on different documentation methods.

    Keywords: Knowledge Management, Documenting Knowledge, Documenting Experience, Procedures
  • Najmeh Shamspour*, Maryam Eslami, Jalal Azmandian, Behnam Dalfardi, Azam Dehghani Pages 230-234
    Background

    End-stage kidney disease (ESKD) is a global issue. Although the use of kidney replacement therapy measures has improved outcomes for patients with ESKD, the mortality rate remains significant. Identifying modifiable factors that affect patient outcomes can help improve their survival. The aim of this study was to investigate the factors affecting the clinical outcome of peritoneal dialysis patients.  

    Methods

    This prospective cohort study was conducted between 2018 and 2021. Patients aged between 18 and 75 years with a history of peritoneal dialysis (PD) for at least 6 months were included. Demographic data, Kt/V ratio, medical history, serum levels of albumin, creatinine, triglycerides, total cholesterol, calcium, phosphorus, parathyroid hormone, hemoglobin, and ferritin were recorded before starting PD and during the follow-up period, along with clinical outcomes. Data were described using means, frequencies, and relative frequencies; the chi-square test, analysis of variance, and the Kruskal-Wallis test were used for analytical statistics.

    Results

    A total of 64 patients with a mean age of 51.78 ± 15.31 years were included. Of these, 27 (42.18%) had a history of diabetes mellitus and 38 (59.37%) had a history of hypertension (HTN). In total, 48 patients (75%) survived until the end of the study, while 47 (73.4%) experienced peritonitis. Our findings indicate that variables such as sex, marital status, weight, history of HTN, and serum levels of hemoglobin and ferritin significantly affect outcomes.

    Conclusion

    We found that factors including sex, being married, normal weight, HTN, and normal hemoglobin and ferritin levels were associated with better survival in PD patients. Recurrent peritonitis was the most important cause of switching from PD to hemodialysis (HD).

    Keywords: Peritoneal Dialysis, Survival, End-Stage Kidney Disease
  • Mansoureh Javadipour, Elham Keshtzar, Parivash Parvasi, Seyed Farzad Hosseini, Ali Hassan Rahmani* Pages 235-239
    Background

    A wide variety of electrocardiogram (ECG) changes can occur with antidepressant drugs, at both therapeutic doses and toxic levels. Notably, ECG abnormalities such as QRS widening and QT prolongation may be observed in patients poisoned with tricyclic antidepressants (TCAs), indicating severe conditions that necessitate cardiac monitoring. This study aimed to investigate ECG abnormalities in patients poisoned with tricyclic antidepressants.

    Methods

    This retrospective patient record study was conducted at Razi Hospital in Ahvaz, Iran, from 2006 to 2009. Patient information was extracted from hospital medical records according to the established protocol. The chi-square test was employed for initial analysis; subsequently, logistic regression was applied to identify risk factors associated with abnormal ECG findings. Data were analyzed with SPSS statistical software (Version 19; IBM). P < 0.05 was defined as statistically significant.

    Results

    Among the 210 poisoned patients, comprising 88 men (41.9%) and 122 women (58.1%), the majority fell within the age range of 15 to 25 years. The most commonly ingested drugs were amitriptyline in 134 patients (63.8%) and nortriptyline in 42 patients (20%). A total of 137 patients (65.2%) exhibited poisoning symptoms within 6 hours, while 73 patients (34.8%) showed symptoms between 6 and 24 hours. The initial symptoms included a decreased level of consciousness in 168 patients (80%), nausea and vomiting in 20 patients (9.5%), and various other symptoms. Notably, ECG changes were observed in 70 patients, with 32 patients (15.2%) showing QRS widening (>0.1 sec), 5 patients (2.4%) displaying a tall R wave in aVR, 5 patients (2.4%) exhibiting right axis deviation, and other changes.

    Conclusion

    QRS widening in poisoned patients with tricyclic antidepressants is more frequently observed in symptomatic patients, highlighting the importance of ECG screening in these patients.

    Keywords: ECG Changes, TCA Poisoning, Overdose, Suicide
  • Mohsen Seifollahi, Marzieh Heidarzadeh Arani, Rozita Hoseini Shamsabadi, Shahrbanoo Nakhaie, Maesoumeh Karimi Aghche, Mohammad Javad Azadchehr, Amin Sadat Sharif* Pages 240-243
    Background

    Urinary tract infections (UTIs) are extremely prevalent bacterial infections among children. They have numerous potential causes. Without proper diagnosis and treatment, UTIs can lead to serious complications in children, including impaired growth, high blood pressure, protein in urine, and eventual chronic kidney disease. Zinc and vitamin D in sufficient concentrations help to maintain the health of the immune system. Therefore, their deficiency can cause various infections. Several factors can contribute to the development of UTIs. This article deals with the role of zinc and vitamin D as immune markers in UTI in children without other risk factors.  

    Methods

    In this case-control study, serum zinc and vitamin D levels were examined in 40 healthy children and 40 children with UTIs who had no other risk factors. Data were analyzed in SPSS 26 using the chi-square, Fisher exact, and independent t tests.

    Results

    The study findings demonstrated a statistically significant distinction between the 2 groups regarding serum vitamin D and zinc levels (P < 0.001); 80% of children with UTIs and 17.5% in the healthy group had vitamin D deficiency. Also, 60% of the urinary infection group had zinc deficiency, whereas 17.5% of the healthy group had it.

    Conclusion

    Low serum zinc and vitamin D levels may increase susceptibility to pediatric UTI. Given the data, supplementation with zinc and vitamin D could play a significant role in treating active infections and preventing recurrence in susceptible children.

    Keywords: Zinc, Vitamin D, Children, Pediatric, Urinary Tract Infection
  • Homayoun Sadeghi-Bazargani, Hamid Soori, Seyed Abbas Motevalian, Omid Aboubakri, Ali Jafari-Khounigh, Alireza Razzaghi, Hamid Reza Khankeh, Seyyed Taghi Heydari, Forouzan Rezapur Shahkolai, Mojtaba Sehat, Davoud Khorasani Zavareh, Mohammad Asghari-Jafarabadi, Ali Imani, Mohammad Bagher Alizadeh Aghdam, Hossein Poustchi, Mahdi Rezaei, Mina Golestani* Pages 244-250
    Background

    Measuring socioeconomic status (SES) as an independent variable is challenging, especially in epidemiological and social studies. This issue is more critical in large-scale studies on the national level. The present study aimed to extensively evaluate the validity and reliability of the Iranian SES questionnaire.  

    Methods

    This psychometric, cross-sectional study was conducted on 3000 households, selected via random cluster sampling from various areas in East Azerbaijan province and Tehran, Iran. Moreover, 250 students from Tabriz University of Medical Sciences were selected as interviewers to collect data from 40 districts in Iran. The construct validity and internal consistency of the SES questionnaire were assessed using exploratory and confirmatory factor analyses and the Cronbach's alpha. Data analysis was performed in SPSS and AMOS.   
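
    As background for the internal-consistency figures reported below, Cronbach's alpha for a factor with k items is conventionally defined as follows (standard formula, not specific to this study):

```latex
\alpha = \frac{k}{k-1} \left( 1 - \frac{\sum_{i=1}^{k} \sigma_{i}^{2}}{\sigma_{T}^{2}} \right)
% \sigma_i^2 : variance of item i; \sigma_T^2 : variance of the summed scale score
```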

    Results

    The complete Iranian version of the SES questionnaire consists of 5 factors. The Cronbach's alpha was calculated to be 0.79, 0.94, 0.66, 0.69, and 0.48 for the occupation, self-evaluation of economic capacity, house and furniture, wealth, and health expenditure, respectively. In addition, the confirmatory factor analysis results indicated the data's compatibility with the 5-factor model (comparative fit index = 0.96; goodness of fit index = 0.95; incremental fit index = 0.96; root mean square error of approximation = 0.05).

    Conclusion

    According to the results, the confirmed validity and reliability of the tool indicated that the Iranian version of the SES questionnaire could be utilized with the same structure on an extensive level and could be applicable for measuring the SES in a broader range of populations.

    Keywords: Socioeconomic Status, Generalizability, Validity, Reliability, Factor Structure, Psychometric
  • Ehsan Naderifar, Maryam Tarameshlu, Reza Salehi, Leila Ghelichi*, Arash Bordbar, Negin Moradi, Branda Lessen Pages 251-257
    Background

    The survival rate in premature infants (PIs) has increased, but many have medical and developmental complications. Difficulty with sucking, swallowing, and poor nourishment are common complications. This study aimed to investigate the effects of Kinesio-tape (KT) combined with premature infant oromotor intervention (PIOMI) on feeding efficiency (mean volume intake [%MV]), oromotor skills (Preterm Oral Feeding Readiness Assessment Scale [POFRAS]), and weight gain in PIs.  

    Methods

    In this single-subject study, 5 PIs with feeding problems received the PIOMI-KT for 7 consecutive days. The main outcome measure was the POFRAS. The %MV and weight gain were secondary outcome measures. Measurements were taken before treatment (T0), after the 4th session (T1), and after the 7th session (T3).

    Results

    The POFRAS scores, %MV, and weight gain improved in all infants after treatment. The maximum and minimum changes in level between the baseline and treatment phases were +26 and +16 for POFRAS, +54 and +34 for %MV, and +180 and +100 for weight gain. The treatment trend was upward for all infants, as indicated by the positive slopes. The feeding problems were resolved in all infants after the 7th treatment session.

    Conclusion

    The combination therapy of PIOMI-KT improved feeding function in PIs.

    Keywords: PIOMI, Kinesio-Tape, Premature, Infants, Feeding Problems
  • Saba Dahaghin, Ava Aghakhani*, Azadeh Memarian, Pardis Monjezi, Kamran Aghakhani Pages 258-264
    Background

    A Medical Certificate of Cause of Death (MCCD) is a legal and enforceable document issued by the attending physician. However, according to the regulations, in many cases—such as sudden, unexplained, and unusual deaths, as well as some uncommon circumstances such as suspected homicide—the deceased must be referred to the Iranian Legal Medicine Organization (ILMO). At the same time, unnecessary referral of bodies to the ILMO increases staff workload and ultimately confronts the family of the deceased with high emotional and financial costs.

    Methods

    In this cross-sectional study, the medical records of all deceased patients referred from Hazrat Rasool Hospital to the ILMO (565 cases) over a 3-year period, from April 2016 to March 2019, were reviewed and analyzed using SPSS 22 software with the chi-square and t tests.

    Results

    Among all patients who died during this period (4,239 patients), 565 were referred to the ILMO, accounting for 13.3% of deaths. The most common causes of referral were car and motorcycle accidents (27.1%), dead-on-arrival (DOA) cases (21.3%), and deaths of unknown cause (15.3%). Significant correlations were also detected between the cause of referral and gender, time of death, and age; for example, car accidents were more common in men, whereas lawsuits against medical staff were more common in women.

    Conclusion

    Car and motorcycle accidents, DOA, and unknown causes were the most prevalent causes of referral in this study. In general, few studies have been conducted regarding the causes of referral of the deceased to the Legal Medicine Organization. In this study, we collected relevant variables to investigate this issue thoroughly.

    Keywords: Forensic Medicine, Legal Medicine Organization, Cause Of Referral
  • Akmaral Izbasarova, Aisulu Zholdybayeva, Galiya Kadrzhanova, Khadisha Kashikova*, Asel Izbasarova, Natalya Petrova, Alima Tolybekova Pages 265-269

    This paper presents a unique 12-year case analysis of a girl with Penta-X syndrome, a chromosomal abnormality characterized by five X chromosomes instead of the normal two found in healthy women. Pentasomy X is a genetic but not hereditary disease that affects only females. Our patient demonstrated delayed mental, speech, and motor development along with physical anomalies, such as craniofacial deformities and eye pathology, and was diagnosed with pentasomy of the X chromosome at the age of 3 after cytogenetic examination. She developed epileptic seizures at the age of 9. Magnetic resonance imaging (MRI) revealed leukoencephalopathy with ventriculomegaly. The peculiarity of this observation is that the 49,XXXXX polysomy detected in the patient combined the typical phenotypic presentation with demyelinating leukoencephalopathy, which has not been a typical feature of the disorder.

    Keywords: Pentasomy X, Chromosome Aberrations, Aneuploidy, Mental Retardation, Leukoencephalopathy
  • Atefe Tirkan, Delaram Eskandari, Maryam Roham, Oldooz Aloosh, Tayeb Ramim, Hale Afshar* Pages 270-274
    Background

    Melatonin, a tryptophan derivative, has anti-inflammatory and virucidal effects. This study evaluated whether melatonin is more effective than placebo in critically ill COVID-19 patients.

    Methods

    The present study was a double-blind, randomized clinical trial in patients with COVID-19 hospitalized in the intensive care unit of Rasool Akram Hospital, Tehran, Iran. Patients were randomly divided into 2 groups: the first received melatonin at a therapeutic dose of 10 mg nightly before bed, and the second received a matching placebo on the same schedule, each for 7 days. The chi-square or Fisher exact test was used to compare qualitative variables. The means of the variables under investigation were analyzed by 2-factor repeated-measures analysis of variance at 3 time intervals in those given medication or placebo.

    Results

    The study analyzed 44 patients in the melatonin group and 42 in the placebo group. The mean intensive care unit (ICU) stay was 11.23 ± 4.73 days in the melatonin group and 11.90 ± 6.52 days in the placebo group (P = 0.582). The mean hospital stay was 19.70 ± 8.77 days in the melatonin group and 21.48 ± 10.85 days in the placebo group (P = 0.407). The mean oxygen saturation before and after discharge from the ICU was 81 ± 6.73% and 91.02 ± 1.17% in the melatonin group and 83.36 ± 8.27% and 91.21 ± 1.26% in the placebo group, respectively (P = 0.467 and P = 0.150).

    Conclusion

    Melatonin can significantly reduce inflammation and oxidative stress markers in patients, making it a promising therapeutic option for COVID-19 patients. Further research is needed to determine the optimal treatment dosage and duration. Nonetheless, these results offer a promising avenue for future research and clinical practice.

    Keywords: Melatonin, COVID-19, Intensive Care Unit
  • Raha Tabahfar, Fatemeh Oskouie*, Hamid Haghani Pages 275-280
    Background

    Dying in the place the patient has chosen and feels comfortable remaining in for the rest of their life is one of the main objectives of palliative care for terminally ill cancer patients. Nevertheless, this issue is frequently disregarded. The goal of the present study was to examine the variables that affect cancer patients' decisions about their place of death.

    Methods

    A descriptive cross-sectional study was conducted from May to August 2018. Using a continuous sampling method, 631 patients who had died between 2011 and 2017 were selected from among patients with a history of cancer and hospitalization at Firoozgar Hospital in Tehran. A researcher-made 3-section, 21-item questionnaire was completed through phone calls to the families who confirmed that their patient had died of cancer. Data were managed in SPSS software version 13, and descriptive statistics were used for data analysis.

    Results

    Based on the results, among 631 deceased patients, only 157 (24.9%) chose their place of death, and 474 (75.1%) had not spoken about it during their lifetime. Among the examined variables, age, sex, education, insurance status, duration of disease, activities of daily living, awareness of disease progression, and receiving home care had a significant association with this choice in people who died of cancer.   

    Conclusion

    Despite the importance of patients choosing their place of death in the final days of life, most cancer patients are not given this option. Patients who understand how their disease is progressing at this stage are likely to want to choose where they die. Consequently, the healthcare system must be ready to grant cancer patients the option to choose their place of death and to ensure a comfortable and dignified passing. Future research can build on the results of this study.

    Keywords: Place Of Death, End-Of-Life Care, Palliative Care, Death, Cancer, Preferred Place Of Death
  • Asgar Aghaei Hashjin, Kamran Irandoust*, Hanie Gholampoor, Claudia Fischer, Masoud Maleki Birjandi, Aireza Mazdaki, Hossein Abdolali, Hediye Seval Akgun Pages 281-292
    Background

    Acquiring a thorough understanding of the expenses linked to the education of health sciences students is crucial for effective university planning, budgeting, and overall preparedness. This systematic review aimed to identify and compare the per capita costs associated with educating medical and other health science students internationally—particularly emphasizing the context of Iran.  

    Methods

    A systematic review was conducted in 2023 according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. The search covered the period from January 1, 2000, to November 11, 2022, using MeSH and EMTREE terms. Databases such as PubMed, Scopus, Web of Science, Embase, and Iranian databases were searched. Manual searches were performed using Google and Google Scholar.

    Results

    The study retrieved 1336 publications from bibliometric databases and, following thorough screening, included 8 relevant articles from 5 countries (Australia, Iran, the United States, Thailand, and Vietnam). An additional 17 relevant articles from Iranian databases were also included. Based on USD purchasing power parity (PPP) 2019, the results show that the mean per capita cost of training a medical student for 1 academic year in Iran is $61,493.86 (range, $28,102-$133,603; standard deviation, $35,476.03). In comparison, the cost of training a medical student for 1 year is $263,305 in the United States and $44,674 in Australia. In Thailand and Vietnam, a 6-year medical program costs $284,058 and $69,323, respectively. Moreover, according to most studies, training students in other health sciences in Iran for 1 academic year generally costs less than $20,000 (PPP 2019).

    Conclusion

    The study reveals that the costs associated with medical student education in Iran exceed those of most countries examined, second only to the United States. These findings highlight the importance of such data in improving the efficiency, sustainability, and informed resource allocation of medical education programs worldwide for future planning and budgeting.

    Keywords: Cost, Education, Medical Student, Training
  • Ali Yeganeh, Ava Parvandi, Mohammadreza Mehri, Hamed Tayyebi, Javad Khajemozafari* Pages 293-296
    Background

    It is becoming increasingly important to study pathology at the knee and spine because of their role in causing pain and deformity in one another. Compression of a lumbar nerve root can disrupt innervation to the thigh muscles, cause muscle imbalance, and result in varus deformity. In this study, we examined the relationship between lumbar spine disorders and genu varum to determine whether lumbar spine disorders can cause varus deformity.

    Methods

    In this cross-sectional study, 53 patients with knee varus greater than 20 degrees who visited the orthopedic clinics of Rasoul Akram and Moheb Mehr hospitals, affiliated with Iran University of Medical Sciences, between 2020 and 2022 were investigated for associated lumbar disorders. Demographic characteristics and clinical findings were recorded from the patients' medical profiles using a checklist. Radiographic findings were evaluated through the imaging department using the PACS system. Diagnosis was based on patient history and knee radiograph findings, with the knee angle measured using a goniometer. The frequency of lumbar disorders caused by pressure on the lumbar nerves—including canal stenosis, osteoarthritis, spondylolisthesis, and disc herniation—was investigated in patients with genu varum deformity. Lumbar problems were assessed using patient history, radiographic images, and lumbosacral and knee MRI.

    Results

    The mean age was 66.3 ± 7.66 years; 40 cases (75.5%) were female and 13 (24.5%) were male. Lumbar canal stenosis, in 28 cases (52.8%), and disc herniation, in 32 cases (60.4%), were the most common lumbar disorders in patients with knee varus of more than 20 degrees. The mean age of patients with varus of more than 20 degrees and lumbar disorders was significantly higher than that of patients without lumbar disorders (P = 0.001). There was no significant difference in lumbar disorders by gender among patients with varus of more than 20 degrees. Significant positive correlations were reported between genu varum and lumbar canal stenosis (r = 0.53, P = 0.001), osteoarthritis (r = 0.38, P = 0.004), spondylolisthesis (r = 0.39, P = 0.002), and disc herniation (r = 0.46, P = 0.001).

    Conclusion

    A considerable association was found between lumbar disorders and varus deformity of more than 20 degrees.

    Keywords: Genovarum, Lumbar Disorders, Canal Stenosis, Osteoarthritis, Spondylolisthesis, Disc Herniation
  • Shahed Zandiehrad, Shahla Raghibdoust*, Mohammad Taghi Joghataei, Arsalan Golfam Pages 297-302
    Background

    Various studies have shown that individuals with autism spectrum disorder (ASD) experience significant cognitive impairments during childhood. Individuals often experience various language disorders that can manifest in different ways. There are also studies indicating that these impairments persist into adulthood for individuals with ASD. This study aimed to evaluate and identify cognitive impairments among Persian-speaking adults with ASD.   

    Methods

    This quantitative, experimental study used two subtests from the Persian version of the Montreal Protocol for the Evaluation of Communication (P.M.E.C.): the Metaphor Interpretation and Speech Act Interpretation subtests. Thirteen Persian-speaking men with ASD participated, with ages ranging from 25 to 44 years (mean age, 32.84; standard deviation, 4.17) and educational levels ranging from primary school to 20 years of formal education. The control group consisted of 26 healthy Persian-speaking men matched to the ASD group in age and educational level. The Kolmogorov-Smirnov test and a paired t test were used to compare the two groups.

    Results

    The results indicated that the ASD group performed significantly more poorly than the healthy control group on both the Metaphor Interpretation subtest (P < 0.001) and the Speech Act Interpretation subtest (P = 0.033), suggesting impairments in these cognitive abilities.

    Conclusion

    The findings of this research can be valuable for assessment and intervention purposes in rehabilitation centers, as well as in academic and research settings.

    Keywords: Autism Spectrum Disorder, Persian Language, P.M.E.C. Protocol, Cognitive Skills, Metaphor Comprehension, Speech Act Interpretation
  • Amin Keykhaie Afusi, Marzieh Salehi Shahrabi, Mehrsa Paryab*, Mohammad Javad Kharrazi Fard Pages 303-308
    Background

    Dental caries is a serious health condition in children. Poor diet, poor oral hygiene, and unique anatomy of the primary teeth can all contribute to the development of caries in primary teeth. Developmental structural defects in teeth during the fetal period and the first year after birth are believed to increase caries susceptibility. This study aimed to assess the correlation of the Apgar score with dental caries in 3- to 5-year-old Iranian children.   

    Methods

    This retrospective, descriptive, cross-sectional study was conducted at the Pediatric Dental Clinic of Tehran Dental School in 2022. A total of 123 eligible children aged 3-5 years were enrolled. Parents were asked to complete a checklist covering demographics, birth and infancy conditions, and the child's Apgar score. The children underwent clinical dental examination, and their dmft index was recorded. Data were analyzed with the Pearson correlation and regression tests. P values < 0.1 were considered statistically significant.

    Results

    The Pearson test showed that the 1-minute (P = 0.000) and 5-minute (P = 0.000) Apgar scores had a significant correlation with dmft. The regression analysis of demographic and birth factors revealed significant correlations between duration of breastfeeding (P = 0.066) and age of initiation of toothbrushing (P = 0.019) with dmft. Also, birth weight (P = 0.026) and mother’s educational level (P = 0.090) had significant correlations with the Apgar score.   

    Conclusion

    The results indicated a significant correlation between the Apgar score and dental caries. Thus, newborns with lower Apgar scores are recommended to receive more regular oral and dental care services.

    Keywords: Apgar Score, Dental Caries, Children, Pediatric Patients
  • Fatemeh Goodarzi, Mohammad Mahdi Daei, Samira Dodangeh, Elham Kia Lashaki, Zohreh Yazdi, Majid Hajikarimi* Pages 309-312
    Background

    Non-ST-elevation myocardial infarction (NSTEMI) is a significant component of acute coronary syndrome (ACS) and typically exhibits a relative incidence that is more than double that of ST-segment elevation myocardial infarction (STEMI). Data obtained from the International Long QT Syndrome Registry indicate that the risk of developing malignant arrhythmias in individuals with long QT syndrome is exponentially associated with the duration of the QTc interval. Therefore, the aim of this study was to assess the potential inclusion of prolonged QTc as a prognostic risk factor in NSTEMI patients.  

    Methods

    A cross-sectional study was conducted, by census, on patients with a diagnosis of NSTEMI admitted to Bu-Ali Hospital of Qazvin between April 2021 and September 2021. The QT interval was measured on the admission electrocardiogram and corrected (QTc) using the Hodges formula. The documented GRACE score was calculated, and the relationship between QTc and the GRACE score was then examined as a prognostic factor in ACS patients. Relationships were assessed using the t test and the chi-square test.
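
    For reference, the Hodges correction mentioned above is usually written as follows (QT in milliseconds, HR in beats per minute); this is the textbook form rather than a detail reported by the authors.

```latex
\mathrm{QTc}_{\text{Hodges}} = \mathrm{QT} + 1.75\,(\mathrm{HR} - 60)
```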

    Results

    A total of 60 patients (31.7% female, 63.8% male) with a mean age of 63 ± 12.7 years were evaluated. Most of the patients (68.3%) were in the low-risk GRACE score category. QTc on the admission electrocardiogram was significantly and positively correlated with the total GRACE score (r = 0.497, P < 0.001).

    Conclusion

    This study revealed a significant relationship between patients' QTc interval and the GRACE score, suggesting that QTc can serve as a predictor of mortality in these patients.

    Keywords: NSTEMI, GRACE, Electrocardiogram, ECG
  • Foruzan Orak, Maryam Saadat*, Amal Saki, Amin Behdarvandan, Fateme Esfandiarpour Pages 313-317
    Background

    The National Institute for Health and Care Excellence recommends evaluating venous thromboembolism (VTE) risk with risk assessment scales for every hospitalized patient. The purpose of this study was to compare the predictive accuracy of two common scales, the Autar and Padua deep vein thrombosis (DVT) risk assessment scales.

    Methods

    This prospective cohort study was conducted on 228 ICU-hospitalized patients. The risk of VTE was estimated using the Autar and Padua scales during the first 48 hours after admission. The predictive accuracy of the two scales for VTE in ICU patients was compared based on the area under the receiver operating characteristic (ROC) curve.

    Results

    ROC analysis yielded area under the curve (AUC) values of 0.61 ± 0.05 for the Autar scale and 0.53 ± 0.06 for the Padua scale; the log-rank test showed no difference between the AUCs (P = 0.19). The accuracy of the Autar and Padua scales was 24% and 14%, respectively. Both scales had 100% sensitivity, but their specificity was low (Autar, 14%; Padua, 3%). The positive likelihood ratios (LR+) were 1.17 for Autar and 1.03 for Padua, and the negative likelihood ratios (LR-) were 0 for Autar and 0.89 for Padua. Inter-rater agreement was 0.99 for the Autar scale and 0.95 for the Padua scale.
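
    Under the standard definitions, the likelihood ratios follow directly from the reported sensitivity and specificity; the check below is illustrative and uses only the figures given in this abstract.

```latex
LR^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}}, \qquad
LR^{-} = \frac{1 - \text{sensitivity}}{\text{specificity}}
% e.g., Autar: LR^{+} = 1.00 / (1 - 0.14) \approx 1.16, close to the reported 1.17
```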

    Conclusion

    The AUC, accuracy, and LR+ of the Autar risk assessment scale were higher than those of the Padua scale in predicting VTE. However, both scales had excellent reliability, high sensitivity, and low specificity. It is recommended that VTE risk be recorded with the Autar scale for patients admitted to ICUs; this can help the healthcare team target prophylaxis to those at high risk for VTE.

    Keywords: DVT, PE, VTE, Autar Risk Assessment Scale, Padua DVT Risk Assessment Scales
  • Mohammadreza Sheikhy-Chaman, Aziz Rezapour, Aidin Aryankhesal, Ali Aboutorabi* Pages 318-326
    Background

    Monitoring households' exposure to catastrophic health expenditure (CHE) based on out-of-pocket (OOP) health payments is a critical tool for evaluating the equitable financial protection status within the health system. The COVID-19 pandemic has brought unprecedented global change and potentially affected the mentioned protection indicators. This study aimed to assess the prevalence of CHE among households in Iran during the COVID-19 period.  

    Methods

    The present study employed a retrospective-descriptive design utilizing data derived from two consecutive cross-sectional Annual Household Income and Expenditure Surveys (HIES) undertaken by the Statistical Centre of Iran (SCI) in 2020 and 2021. The average annual OOP health payments and the prevalence of households facing CHE were estimated separately for rural and urban areas, as well as at the national level. Based on the standard method recommended by the World Health Organization (WHO), CHE was identified as situations in which OOP health payments surpass 40% of a household's capacity to pay (CTP). The intensity of CHE was also calculated using the overshoot measure. All statistical analyses were carried out using Excel-2016 and Stata-14 software.  
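
    The headcount and overshoot measures described above can be sketched as follows; this is a minimal illustration of the standard WHO-style definitions with hypothetical variable names, not the authors' code.

```python
def che_headcount_and_overshoot(oop, ctp, z=0.40):
    """Catastrophic health expenditure (CHE) headcount and overshoot.

    oop : household out-of-pocket health payments
    ctp : household capacity to pay (same order as oop)
    z   : catastrophic threshold (0.40 of capacity to pay, as in this study)
    """
    shares = [o / c for o, c in zip(oop, ctp)]          # OOP as a share of capacity to pay
    flags = [s > z for s in shares]                     # households facing CHE
    headcount = sum(flags) / len(shares)                # prevalence of CHE
    overshoot = sum(s - z for s, f in zip(shares, flags) if f) / len(shares)  # mean excess over z
    return headcount, overshoot
```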

    Results

    The average OOP health payments increased in 2021, compared to 2020, across rural and urban areas as well as at the national level. Urban residents consistently experienced higher OOP health payments than rural residents and the national level in both years. At the national level, the prevalence of CHE was 2.92% in 2020 and increased to 3.18% in 2021. In addition, rural residents faced a higher prevalence of CHE based on total health services OOP, outpatient services OOP, and inpatient services OOP compared to urban residents and the national level. Regarding the intensity of CHE using overshoot, the results for 2020 and 2021 revealed that the overshoot ranged between 0.60% and 0.65% in rural areas, between 0.30% and 0.33% in urban areas, and between 0.38% and 0.41% at the national level.  

    Conclusion

    A considerable percentage of households in Iran still incur CHE. This trend has increased in the second year of COVID-19 compared to the first year, as households received more healthcare services. The situation is even more severe for rural residents. There is an urgent need for targeted interventions in the health system, such as strengthening prepayment mechanisms, to reduce OOP and ensure equitable protection for healthcare recipients.

    Keywords: Catastrophic Health Expenditure, Out-Of-Pocket, Health Equity, COVID-19, Iran
  • Hajar Nazari Kangavari, Ahmad Hajebi, Hamid Peyrovi, Masoud Salehi, Mohammad Hossein Taghdisi, Abbas Motevalian* Pages 327-335
    Background

    Success in COVID-19 vaccination depends on understanding why people refuse or hesitate to take the vaccine. This study aims to explore vaccine refusal and hesitancy among Iranians who participated in the national COVID-19 vaccine hesitancy survey.   

    Methods

    A qualitative content analysis approach was used. Twenty-six participants were selected by purposive sampling. In-depth, semi-structured telephone interviews were conducted during the year 2022. A directed content analysis approach was used for analyzing the data by extracting the codes, subcategories, and categories.   

    Results

    Four major categories and their respective subcategories related to refusal of and/or hesitancy toward COVID-19 vaccination emerged: “lack of confidence” (distrust in policymakers and pharmaceutical companies, distrust in national media, belief in conspiracy theories, and lack of confidence in the vaccine's safety and effectiveness), “complacency” (fatalism and philosophical beliefs, low perceived risk, and belief in the adequacy of precautionary principles), “constraints” (personal and psychological barriers), and “coercion” (coercion by relatives and inconsistently imposed mandatory vaccination by the government).

    Conclusion

    Distrust, fatalism, low perceived risk, and overconfidence in traditional Persian medicine were important barriers to COVID-19 vaccine acceptance. A variety of measures is needed to improve vaccine uptake, including enhancing public trust in government and policymakers, clarifying vaccine safety and effectiveness, addressing religious fatalism, and regulating anti-science messages on social media.

    Keywords: Vaccine Hesitancy, Vaccine Refusal, COVID-19, Qualitative Research, Iran
  • Sana Eybpoosh*, Seyyed Amir Yasin Ahmadi Pages 336-339
    Background

    Case-control studies are efficient designs for investigating gene-disease associations. A finding of genome-wide association studies (GWAS) is that many genetic variants are associated with multiple health outcomes and diseases, a phenomenon known as pleiotropy. We aimed to discuss pleiotropic bias in genetic association studies.

    Methods

    The researchers' opinions, grounded in the literature, are presented as a critical review.

    Results

    Pleiotropic effects can bias the results of gene-disease association studies when individuals with pre-existing diseases are used as the control group and the diseases of cases and controls share genetic markers. Accordingly, when the exposure of interest in a case-control study is a genetic marker, selecting controls from diseased individuals who share similar genetic markers may increase the risk of pleiotropic bias. However, the absence of disease symptoms among controls at the time of recruitment does not guarantee that an individual will not develop the disease of interest in the future. Age-matched, disease-free controls may be a better solution in such situations. Different analytical techniques are also available for identifying pleiotropic effects, and known pleiotropic effects can be searched in various online databases.

    Conclusion

    Pleiotropic effects may bias genetic association studies. Suggestions include selecting healthy yet age-matched controls and considering diseases with independent genetic architectures. Checking the relevant databases is recommended before designing a study.

    Keywords: Pleiotropy, Research Design, Observational Study, Genetic Epidemiology, Case-Control Studies, Genetic Association Studies
  • Saeed Dahmardeh, Zahra Heidari* Pages 340-345
    Background

    Despite the implementation of national iodine supplementation programs, structural thyroid diseases are still highly prevalent in most countries. Thus, the link between trace elements other than iodine, such as selenium, and thyroid diseases should be investigated.   

    Methods

    In this case-control study, adult patients with newly diagnosed papillary thyroid carcinoma (PTC), patients with benign thyroid nodules, and healthy euthyroid controls without nodules were recruited. Thyroid function tests and serum selenium levels were assessed and compared between groups. One-way ANOVA was used to compare the means of numerical variables among the 3 groups (PTC, benign nodule, and healthy control), with post hoc pairwise comparisons based on the Bonferroni correction.

    Results

    Data from 182 patients with papillary thyroid carcinoma (PTC), 185 patients with benign thyroid nodules, and 180 healthy controls were analyzed. The mean serum selenium levels in the PTC, benign thyroid nodule, and control groups were 94.9, 121.6, and 134.3 µg/L, respectively (P < 0.001). There was a significant relationship between cancer stage and selenium level in the PTC group: patients at higher stages had lower mean selenium levels (P < 0.001). In univariate logistic regression, TSH and selenium were significant variables for PTC compared with benign thyroid nodules; each unit increase in selenium reduced the odds of PTC by about 6%.

    Conclusion

    The low levels of selenium were associated with PTC. Also, serum selenium levels were inversely correlated with the stage of thyroid cancer.

    Keywords: Selenium, Thyroid Nodule, Thyroid Cancer, Papillary
  • Roya Vaziri Harami, Amirreza Keyvanfar, Yousef Semnani, Hanieh Najafiarab* Pages 346-351
    Background

    Many patients with bipolar disorder (BD) experience sleep problems. Sleep abnormalities are associated with immune dysfunction, which may be reflected by hematological indices.

    Purpose

    This study aimed to investigate the association between sleep quality and the neutrophil-to-lymphocyte ratio (NLR) and the platelet-to-lymphocyte ratio (PLR) in patients with BD.   
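
    For clarity, the two indices examined here are simple ratios derived from a complete blood count (standard definitions, not study-specific):

```latex
\mathrm{NLR} = \frac{\text{absolute neutrophil count}}{\text{absolute lymphocyte count}}, \qquad
\mathrm{PLR} = \frac{\text{platelet count}}{\text{absolute lymphocyte count}}
```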

    Methods

    This cross-sectional study was performed at Imam Hossein Hospital, Tehran, Iran, from March to September 2023. Hospitalized patients newly diagnosed with BD were interviewed to complete the questionnaires. Sleep quality, manic symptoms, and depressive symptoms were assessed using the Pittsburgh Sleep Quality Index (PSQI), the Young Mania Rating Scale (YMRS), and the Hamilton Depression Rating Scale (HDRS), respectively. Blood samples were taken from each patient to investigate hematological indices. Continuous and categorical variables were compared between groups using the independent-samples t test and the chi-square/Fisher exact test, respectively. A Poisson regression model was used to investigate predictors of the PSQI score.

    Results

    Of the 305 patients included in the study, 78.7% were experiencing manic episodes and 21.3% depressive episodes, and 90.20% had poor sleep quality. The prevalence of poor sleep quality was significantly higher in depressed patients than in manic patients (100% vs. 87.5%; P = 0.003). Depressed patients had significantly higher platelet counts (mean difference [MD], 34.09 [95% CI, 9.35-58.83]; P = 0.007) and PLR (MD, 38.14 [95% CI, 10.25-66.02]; P = 0.008) and lower lymphocyte counts (MD, 266.04 [95% CI, 14.41-517.67]; P = 0.038) compared with manic patients. The adjusted Poisson regression model revealed that men (risk ratio [RR], 1.113; P = 0.025), those with lower educational levels (RR, 1.164; P = 0.001), and those with higher HDRS scores (RR, 1.370; P < 0.001) had significantly poorer sleep quality.

    Conclusion

    Most bipolar patients have poor sleep quality, particularly those with depressive episodes. Depressed patients had significantly higher platelet counts and PLR. Also, depressed patients with male sex, lower educational levels, and more severe depressive symptoms had poorer sleep quality.

    Keywords: Bipolar Disorder, Blood Cell Count, Depression, Mania, Sleep Quality
  • Majid Khosravi, Aziz Rezapour*, Najmeh Moradi, Setare Nassiri Zeidi, Namamali Azadi Pages 352-359
    Background

    Spinal muscular atrophy is an inherited neurodegenerative disorder that typically leads to severe physical disability. The present study aimed to determine the subjective (monetary) valuation of screening for this disorder and to analyze its influencing factors in Iran.

    Methods

    A cross-sectional study was performed using data from the second survey of women who were either pregnant or planning to become pregnant in Tehran, the capital of Iran, in 2022. The dependent variable was willingness to pay for the screening test for this disease. The independent variables included sociodemographic, economic, and health characteristics; personal and family history of this disease or other diseases; and knowledge about this disease in the included population. Logistic regression was used to identify independent variables associated with the dependent variable, and the results were reported as unadjusted and adjusted odds ratios and P values with 95% CIs. A questionnaire was used as the research tool, and STATA 17 software was used for data analysis. The monetary value of spinal muscular atrophy (SMA) screening was calculated by estimating willingness to pay using the contingent valuation method.

    Results

    In total, 578 women were included. About 64.85% of respondents were willing to pay for SMA screening, with a mean willingness to pay of $526. University education (P = 0.009) and pregnancy experience (P = 0.021) were associated with the dependent variable.

    Conclusion

    Iranian women expressed their willingness to undergo screening tests, but due to financial constraints, they expected the government and nongovernmental organizations to bear most of the cost.

    Keywords: Spinal Muscular Atrophy, Willingness To Pay, Carrier Screening, Contingent Valuation Method, Logistic Regression Model
  • Farzin Halabchi, Mohmmad Mahdi Tavana*, Vahid Seifi, Marzieh Mahmoudi Zarandi Pages 360-371

    Medial gastrocnemius strain (MGS) is the most common cause of mid-calf pain in athletes and results from stretching of the gastrocnemius muscle when the knee is in extension and the ankle is in dorsiflexion. Chronological age and previous calf injury are the most substantial risk factors for MGS; other risk factors include high body mass index, previous lower limb injuries, L5 radiculopathy, and inadequate warm-up. The dominant presentation of MGS is pain—ranging from acute to insidious—felt in the posteromedial aspect of the calf and often preceded by the sensation of a pop. The signs of MGS include antalgic gait, ecchymosis, swelling, local tenderness, and sometimes a palpable gap along the muscle. Pain on passive dorsiflexion of the ankle or resisted ankle plantarflexion with the knee extended can indicate a more severe injury, while functional tests can elicit milder injuries of the calf muscles, including the gastrocnemius. The diagnosis of MGS is usually made by clinical evaluation; however, imaging modalities—including magnetic resonance imaging and ultrasound—can be helpful when there is doubt. In most cases of MGS, the cornerstone of treatment is nonoperative rehabilitation, which can be delivered as a 4-phase program and should be tailored individually. Some instances of MGS are referred for early or later surgical treatment if indicated. In this article, we review the literature on various aspects of MGS, from diagnosis to treatment and rehabilitation, and propose a structured approach to this injury.

    Keywords: Orthopedics, Sports Medicine, Rehabilitation, Athletic Injuries, Gastrocnemius Muscle
  • Mahsa Rasekhian, Sadra Haji, Maryam Shahabi, Azizeh Barry, Akram Hashemi, Mahdiyeh Ghasemi, Ghobad Ramezani* Pages 372-377
    Background

    Students with higher academic self-efficacy usually show higher levels of academic adaptation and are more flexible in using various learning strategies. In this regard, team-based learning (TBL), a relatively new educational method, can be considered a complementary approach in the education of pharmacy students. This research aimed to compare the effect of TBL with the lecture method on the academic self-efficacy of pharmacy students in a pharmaceutical biotechnology course.

    Methods

    This was a quasi-experimental pretest-posttest study with random assignment to experimental and control groups, examining the effects of team-based learning and lecture methods in a pharmaceutical biotechnology course. In the experimental group, students were divided into 8 groups of 6 and completed 5 sessions of the course with the TBL method, while the control group received the same content through lectures. Both groups completed the self-efficacy instrument before the intervention and again at its end. After approval by the university ethics committee and after obtaining informed consent, the self-efficacy questionnaire was distributed in person and online among the participants. Quantitative data were analyzed in SPSS software version 19 (means and standard deviations, tests of group homogeneity, and the Kolmogorov-Smirnov test).

    Results

    The data analysis showed that most of the participants were 22 years old. The independent samples t test results showed that the 2 groups did not have a statistically significant difference in the mean age (P = 0.058). Also, there was no statistically significant difference between the 2 groups in the pretest (P = 0.391), and the 2 groups had almost the same mean. Still, there was a statistically significant difference between the 2 groups in the academic self-efficacy variable in the posttest (P < 0.001).   

    Conclusion

    Team-based teaching methods can increase students' participation and enthusiasm in learning and applying self-efficacy and self-management skills, introducing more diverse career and educational opportunities for pharmacy students.

    Keywords: Team-Based Learning, Academic Self-Efficacy, Teaching Method, Pharmacy
  • Khadisha Kashikova*, Ergali Nabiyev, Ramazan Askerov, Zhassulan Argynbayev, Ussama Abujazar, Arnat Baizakov, Nurlan Turbekov Pages 378-382
    Background

    Proximal femoral fractures are a global epidemiological concern due to their association with mortality and morbidity in the geriatric population.  

    Methods

    We conducted an epidemiological study using hospital registry data to assess the incidence and associated factors of proximal femur fractures among individuals aged 60 years or older living in Almaty City. Student’s t-test was used to assess for between-group differences.   

    Results

    The data showed that the overall frequency of fractures among the population of Almaty City aged 60 years and older between 2014 and 2019 averaged 169.6 per 100,000, with a higher rate among women (190.3) compared to men (135.8). However, in age groups up to 70 years and over 85 years, the frequency of proximal femur fractures was higher among men. From 2014 to 2019, the incidence of proximal femur fractures increased by 1.6 times. An analysis of the distribution of fracture frequency by season revealed that winter was the most dangerous period.  

    Conclusion

    Our research suggests a need for further epidemiological studies on the incidence of proximal femur fractures in various regions, identifying risk factors, and developing targeted regional prevention programs.

    Keywords: Epidemiology, Fracture, Osteoporosis, Femur, Proximal Femur, Body Mass Index
  • Ergali Nabiyev, Ramazan Askerov, Khadisha Kashikova*, Arnat Baizakov, Zhassulan Argynbayev, Zhenisbek Baubekov, Kuanysh Baikubesov Pages 383-388
    Background

    Medial migration of the cervical screw is a frequent complication of Gamma nails, observed in 4.3%-6% of cases. The causes are violations of the surgical technique of osteosynthesis of a trochanteric fracture, including an unreduced fracture, an incorrect insertion point of the fixator, and a suboptimal position of the cervical screw. However, migration of the Gamma nail cervical screw into the pelvic cavity is exceedingly rare, and only a few cases have been published in the literature.

    Case description: 

    This is the case of a patient born in 1952 who was admitted to the hospital with pelvic pain and dysuria. Clinical examination, X-ray, CT, and MRI of the pelvis revealed medial migration of the Gamma nail cervical screw with damage to the bladder. The patient was operated on urgently on the day of admission. The cervical screw was removed from the bladder, the bladder was sutured, and an epicystostomy was placed. The Gamma nail was also removed from the right femur. There were no intraoperative complications. In the postoperative period, the patient was prescribed antibiotics and analgesics. The duration of hospitalization was six days. At follow-up five weeks after the operation, the patient had no particular complaints and walked without support, although the intertrochanteric fracture of the right femur had healed with malunion. Endoprosthetic replacement of the right hip joint was offered, but the patient and his relatives refused.

    Conclusion

    The traumatologist should be aware of the possibility of such complications after osteosynthesis and their associated risks and should be able to recognize the etiological factors causing medial migration of the cervical screw of the intramedullary fixator. If medial screw migration is detected, the traumatologist should assess the function of internal organs and the condition of the major vessels and take measures to safely remove the migrated fixator from the anatomical cavity of the body as one team with a surgeon, a urologist, and a vascular surgeon.

    Keywords: Trochanter Fracture, Osteosynthesis, Gamma Nail, Cervical Screw, Migration, Bladder, Perforation
  • Sahar Sadr Moharerpour*, Hasan Otukesh, Rozita Hoseini Shamsabadi, Hossein Ghorbani, Shahrbanoo Nakhaiee, Farnoosh Seirafianpour, Parsa Panahi Pages 389-393
    Background

    The invasive, expensive, and time-consuming nature of radiological examinations for vesicoureteral reflux (VUR) has compelled researchers to search for new markers to predict VUR. This study was designed to evaluate the usefulness of serum and urine concentrations of neutrophil gelatinase-associated lipocalin (NGAL) in predicting the existence of VUR.  

    Methods

    This cross-sectional study involved all patients with a first febrile urinary tract infection (UTI) referred to Ali Asghar Children's Hospital. Each patient included in the study had clinical symptoms of pyelonephritis and a positive urine culture. The patients were divided into 2 groups: VUR and non-VUR. Serum and urinary NGAL levels were measured in both groups. The receiver operating characteristic (ROC) curve was used to identify serum and urinary NGAL cut-off points that differentiated the VUR group from the non-VUR group.
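    As a purely illustrative sketch (not the authors' analysis), a ROC-derived cut-off of this kind could be computed with scikit-learn using the Youden index as the selection rule; the `urinary_ngal` and `has_vur` arrays below are hypothetical placeholders, and the abstract does not state which selection criterion was actually used.

    ```python
    # Illustrative only: deriving a urinary NGAL cut-off from a ROC curve via the
    # Youden index. Arrays are hypothetical placeholders, not study data.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    urinary_ngal = np.array([5.2, 18.4, 30.1, 8.7, 22.5, 14.9, 40.3, 6.1])  # ng/mL
    has_vur      = np.array([0,   1,    1,    0,   1,    0,    1,    0])    # 1 = VUR

    fpr, tpr, thresholds = roc_curve(has_vur, urinary_ngal)
    youden = tpr - fpr                              # sensitivity + specificity - 1
    best = int(np.argmax(youden))

    print("AUC:", roc_auc_score(has_vur, urinary_ngal))
    print("cut-off:", thresholds[best],
          "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
    ```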

    Results

    Among the 40 children in the study, 23 belonged to the VUR group. The median age was 2.5 years (range, 0.3-8 years), and 35 patients were girls. ROC curve analysis showed that only the urinary NGAL level was significantly related to VUR. There was no association between serum NGAL levels and VUR. According to the ROC curve, a urinary NGAL level cut-off value of 15 ng/mL was likely to be diagnostic of VUR with 82.6% sensitivity and 58.8% specificity.  

    Conclusion

    The urinary NGAL level, specifically with a cut-off value of 15 ng/mL, can indicate the existence of VUR in patients with UTI with near-acceptable levels of sensitivity and specificity.

    Keywords: Children, Diagnosis, Febrile Urinary Tract Infection, Neutrophil Gelatinase-Associated Lipocalin, Vesicoureteral Reflux
  • Nader Tavakoli*, Nahid Hashemi Madani, Mojtaba Malek, Zahra Emami, Alireza Khajavi, Rokhsareh Aghili, Maryam Honardoost, Fereshteh Abdolmaleki, Mohammad E. Khamseh Pages 394-400
    Background

    Mortality has been shown to be high in patients with underlying diseases. This study aimed to examine which comorbidities are associated with a higher risk of death during the hospital course.

    Methods

    We retrospectively evaluated the risk of in-hospital death in 1368 patients with COVID-19 admitted to 5 academic hospitals in Tehran between February 20 and June 13, 2020. We also assessed the composite end-point of intensive care unit admission, invasive ventilation, and death. A Cox proportional hazards model was used to identify the comorbidities associated with death and serious outcomes.
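    A minimal sketch of a Cox proportional hazards model of this kind, assuming hypothetical per-patient columns (`time`, `death`, `diabetes`, `ckd`, `age`) and the lifelines package rather than the authors' actual software:

    ```python
    # Illustrative only: Cox proportional hazards model with the lifelines package;
    # per-patient rows and column names are hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "time":     [5, 12, 7, 20, 3, 15, 9, 11, 6, 18],   # days until death/censoring
        "death":    [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],        # 1 = in-hospital death
        "diabetes": [1, 0, 1, 0, 1, 1, 0, 0, 0, 1],
        "ckd":      [0, 0, 1, 0, 1, 0, 0, 1, 1, 0],
        "age":      [70, 55, 63, 48, 81, 59, 66, 72, 77, 52],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="death")   # hazard ratios per covariate
    cph.print_summary()
    ```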

    Results

    The retrospective follow-up of patients with COVID-19 over 5 months indicated 280 in-hospital deaths. Patients with diabetes (risk ratio (RR), 1.47 (95% CI, 1.10-1.95); P = 0.008) and chronic kidney disease (RR, 1.72 (95% CI, 1.16-2.56); P = 0.007) showed higher in-hospital mortality. Upon stratifying data by age, patients aged <65 years showed a greater risk of in-hospital death in the presence of 2 (hazard ratio (HR), 2.68 (95% CI, 1.46-4.95); P = 0.002) or more (HR, 3.47 (95% CI, 1.69-7.12); P = 0.001) comorbidities, compared with those aged ≥ 65 years.

    Conclusion

    Having ≥ 2 comorbidities in nonelderly patients is associated with a greater risk of death during hospitalization. To reduce the mortality of COVID-19 infection, younger patients with underlying diseases should be the focus of attention for prevention strategies.

    Keywords: COVID-19, Comorbidity, Invasive Ventilation, Mortality, Iran
  • Zeynab Moradian Haft Cheshmeh, Afshin Ostovar, Ali Ghanbari Motlagh, Mohsen Asadi-Lari* Pages 401-407
    Background

    Complete and accurate data in cancer registries are essential for many aspects of public health decision-making. This study evaluated the completeness of breast cancer (BC) pathology reports within the population-based cancer registration system in Iran from 2016 to 2018.

    Methods

    In this retrospective, descriptive-analytical study, we used secondary data extracted from pathology reports of breast cancer diagnoses recorded in the Integrated Cancer Information Management System database during 2016-2018. A total of 4000 pathology reports were selected from each of the three years. The pathology information comprised tumor type, site, grade, size (T), and lymph node involvement (N). Summary statistics were reported as percentages for categorical variables and mean with standard deviation for continuous variables. Categorical variables were compared using the chi-squared test.
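    A hedged sketch of the chi-squared comparison described here, using purely illustrative counts (rows as age groups, columns as tumor types) rather than the registry data:

    ```python
    # Illustrative only: chi-squared test of tumor type across age groups; counts invented.
    import numpy as np
    from scipy.stats import chi2_contingency

    # rows: age groups (<=50, 50-69, >69); columns: tumor types (IDC, ILC, other)
    counts = np.array([
        [3100, 420, 310],
        [2700, 510, 280],
        [ 520, 110,  60],
    ])
    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.4f}")
    ```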

    Results

    The participants' mean age was 51.8±12.5 years. Among the 12,000 studied patients, 5744 (47.9%) were ≤ 50 years old, 5233 (43.6%) were aged 50-69 years, and 1023 (8.5%) were > 69 years old. The completeness of BC pathology reports varied across variables, and completeness increased with older age groups. The proportion of specific tumor types differed significantly among age groups (P = 0.001); notably, the prevalence of invasive ductal carcinoma was higher in the ≤ 50 years age group compared with the older cohorts. Likewise, notable variations in tumor size were observed (P = 0.009), with a higher prevalence of missing tumor size data in the ≤ 50 years age group. Pathologic T stage also demonstrated age-dependent variations (P = 0.014), with a higher prevalence of missing stages in the ≤ 50 years age group. Finally, tumor grade exhibited a statistically significant difference (P < 0.001), with a higher proportion of grade 1 tumors observed in the 50-69 years age group.

    Conclusion

    Tumor grade had the highest completeness rate, while tumor size, pathologic T stage, and pathologic N stage had the lowest. Therefore, a better understanding of the completeness of pathology reports, together with improved registration of stage through an integrated national system for BC, is warranted.

    Keywords: Completeness, Pathology Report, Breast Cancer, Iran
  • Mohammad Taghi Karimi, Abolghasem Fallahzadeh Abarghuei* Pages 408-415
    Background

    Thoracolumbar fractures are common traumatic injuries that can be treated conservatively or by surgery, depending on the type and severity of the injury. This study aimed to determine the efficiency of various orthoses used for these fractures based on the available literature.  

    Methods

    A search covering 1950 to 2023 was conducted in several databases, including PubMed Central and MEDLINE, ISI Web of Knowledge, the Cochrane Central Register of Controlled Trials (CCTR), Embase, and Scopus. Keywords such as conservative treatment, orthoses, brace, and cast were used in combination with thoracolumbar fracture, burst fracture, and compressive fracture. The quality of the studies was evaluated using the PEDro scale. Two researchers independently reviewed the studies.

    Results

    Based on the inclusion criteria, 20 papers were selected for the final analysis: 12 on the use of spinal braces and casting (with quality scores between 1 and 6), 2 on the no-treatment approach, and 6 comparing the outcomes of treatment with and without spinal braces. Based on the results of the included studies, an orthosis is not necessary for stable burst and compression fractures.

    Conclusion

    Although the use of orthosis and cast is one of the conservative treatments recommended for patients with thoracolumbar fractures, it seems that for stable burst fractures and compression fractures, the use of a brace does not provide any benefits. However, the use of a brace or cast is recommended for burst fractures with more than 1 column fracture.

    Keywords: Thoracolumbar Fractures, Compression Fractures, Burst Fractures, Orthosis, Conservative Treatment
  • Zahra Hosseinpour, Mostafa Rezaei Tavirani*, Mohammad Esmaeil Akbari Pages 416-424
    Background

    Breast cancer is a complex and heterogeneous disease, and understanding its regulatory mechanisms and network characteristics is essential for identifying therapeutic targets and developing effective treatment strategies. This study aimed to unravel the intricate network of interactions involving differentially expressed genes, microRNAs (miRNAs), and proteins in breast cancer through an integrative analysis of multi-omic data from The Cancer Genome Atlas Breast Invasive Carcinoma (TCGA-BRCA) dataset.

    Methods

    The TCGA-BRCA dataset was used for data acquisition, which included RNA sequencing data for gene expression, miRNA sequencing data for miRNA expression, and protein expression quantification data. Various R packages, such as TCGAbiolinks, limma, and RPPA, were employed for data preprocessing and integration. Differential expression analysis, network construction, miRNA regulation exploration, pathway enrichment analysis, and independent dataset validation were performed.  
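    As one hedged illustration of the network-construction step, hub candidates can be ranked by their degree in an interaction network; the sketch below uses networkx with a handful of invented edges and is not the authors' R-based pipeline:

    ```python
    # Illustrative only: ranking hub candidates by degree in a small invented network.
    import networkx as nx

    edges = [
        ("ACTB", "HSP90AA1"), ("ACTB", "FN1"), ("ACTB", "UBC"),
        ("HSP90AA1", "HSPA8"), ("HSPA8", "UBC"), ("CDC42", "ACTB"),
        ("CDH1", "EP300"), ("EP300", "UBC"), ("FN1", "CDH1"),
    ]
    g = nx.Graph(edges)

    hub_scores = sorted(g.degree, key=lambda kv: kv[1], reverse=True)  # (gene, degree)
    print(hub_scores[:5])
    ```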

    Results

    Eight consistently upregulated hub genes (ACTB, HSP90AA1, FN1, HSPA8, CDC42, CDH1, UBC, and EP300) were identified in breast cancer, indicating their potential significance in driving the disease. Pathway enrichment analysis revealed highly enriched pathways in breast cancer, including proteoglycans in cancer, PI3K-Akt, and mitogen-activated protein kinase signaling.

    Conclusion

    This integrated multi-omic data analysis provides valuable insights into the regulatory mechanisms, network characteristics, and functional roles of genes, miRNAs, and proteins in breast cancer. The findings contribute to our understanding of the molecular landscape of breast cancer, facilitate the identification of potential therapeutic targets, and inform strategies for effective treatment.

    Keywords: Breast Cancer, Target Therapy, Multiomic, Hub-Genes
  • Sepehr Shirzadeh, Navid Omidkhoda, Amir Hooshang Mohammadpour, Reza Mannani, Vahid Jomehzadeh* Pages 425-432
    Background

    Inflammation plays an important role in the pathophysiology of acute respiratory distress syndrome (ARDS), and traumatic injuries are among its primary causes. Pentoxifylline, a phosphodiesterase inhibitor, has been shown to have anti-inflammatory effects and to reduce the incidence of ARDS. The present study investigated the impact of preventive pentoxifylline administration in trauma patients prone to developing ARDS.

    Methods

    A total of 62 trauma patients at risk of ARDS admitted to Kamyab Hospital in Mashhad, Iran, who fulfilled the inclusion and exclusion criteria were included in this study. The patients were randomly divided into treatment and placebo groups. The treatment group received 400 mg pentoxifylline 3 times a day for 1 week, while the control group received placebo tablets on the same schedule. Before the intervention and during the study, heart rate, blood pressure, respiration rate, continuous pulse oximetry, CRP, PO2, PCO2, and pH were assessed. The obtained data were analyzed using SPSS Version 26 via a generalized estimating equations model and an independent t test.
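    A minimal sketch of a generalized estimating equations model for such repeated measurements, assuming hypothetical column names (`patient`, `day`, `group`, `hr`) and the statsmodels package rather than SPSS:

    ```python
    # Illustrative only: a GEE model for repeated heart-rate measurements, with an
    # exchangeable working correlation within patients. Column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "patient": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
        "day":     [0, 3, 7, 0, 3, 7, 0, 3, 7, 0, 3, 7],
        "group":   ["pentoxifylline"] * 3 + ["placebo"] * 3
                 + ["pentoxifylline"] * 3 + ["placebo"] * 3,
        "hr":      [98, 92, 88, 101, 99, 97, 95, 90, 86, 103, 100, 99],
    })

    model = smf.gee("hr ~ day + C(group)", groups="patient", data=df,
                    family=sm.families.Gaussian(),
                    cov_struct=sm.cov_struct.Exchangeable())
    print(model.fit().summary())
    ```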

    Results

    The heart rate was significantly lower in the treatment group than in the placebo group (P = 0.036). In addition, PO2 levels were remarkably higher in the treatment group (P = 0.040). Changes in respiratory rate (P = 0.064), CRP (P = 0.341), pH (P = 0.910), PCO2 (P = 0.892), HCO3 (P = 0.172), systolic blood pressure (P = 0.302), and SPO2 (P = 0.350) were not significantly different between the 2 groups. In addition, no significant difference was observed in the incidence and severity of ARDS between the 2 groups.

    Conclusion

    The findings of this study revealed that pentoxifylline administration to trauma patients had no beneficial effect on ARDS, although it improved some vital signs and laboratory parameters.

    Keywords: Acute Respiratory Distress Syndrome, Pentoxifylline
  • Zandulla Nakipov, Dinara Kaliyeva, Assiya Turgambayeva*, Zakira Kerimbayeva, Zhalgaskali Arystanov, Tanagul Arystanova, Nellya Ivanchenko, Nabil Joseph Awadalla Pages 433-440
    Background

    One of the most effective measures to reduce the cost of medicines for both the healthcare system and patients is the use of generic drugs (GDs). The objective of this study was to identify the physicians’ level of knowledge and attitude toward GDs.  

    Methods

    A cross-sectional survey was conducted based on a specially designed validated questionnaire of 19 items. The survey was attended by doctors of various specialties working in polyclinics in six regions in the Republic of Kazakhstan. Construct validity was assessed through principal component factor analysis, whereas reliability was assessed using Cronbach’s alpha coefficient. Group differences were assessed using Mann-Whitney and Kruskal-Wallis nonparametric tests when comparing two and more than two groups, respectively.  
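    For illustration only, Cronbach's alpha can be computed directly from an items-by-respondents matrix; the data below are invented and do not come from the study questionnaire:

    ```python
    # Illustrative only: Cronbach's alpha from an items-by-respondents matrix.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: rows = respondents, columns = questionnaire items."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    responses = np.array([          # invented Likert responses (5 respondents x 4 items)
        [4, 5, 4, 3], [3, 4, 3, 3], [5, 5, 4, 4], [2, 3, 2, 2], [4, 4, 5, 4],
    ])
    print(round(cronbach_alpha(responses), 3))
    ```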

    Results

    The study involved 450 physicians. Only 260 (57.8%) believed that GDs are bioequivalent to the brand-name drug (strongly agree and agree). About 202 (45%) of respondents doubted the effectiveness of GDs, and 144 (32%) believed that they cause more side effects than comparable branded drugs. Also, 320 (71.2%) of the respondents felt that branded drugs should be held to higher safety standards than GDs. Approximately 338 (75%) of the physicians agreed that both physicians and pharmacists need standardized guidelines for the brand-name substitution process. Further, 372 (82.7%) proposed that more information about the safety and efficacy of GDs is needed. Also, 326 (72.4%), 314 (88.2%), and 85 (18.9%) of the respondents believed that patients' socio-economic factors, trust in manufacturers/suppliers, and bonuses on products, respectively, influence the prescribing of medicines.

    Conclusion

    Although the study indicated that physicians in the Republic of Kazakhstan acknowledge the use of GDs, concerns about the effectiveness and safety of GDs remain high. To enhance the use of GDs, targeted educational programs for physicians on the bioequivalence, safety, and efficacy of GDs should be implemented.

    Keywords: Attitude, Generic Drugs, Physicians, Knowledge
  • Faranak Rokhtabnak, Masoud Baghai-Wadji, Parinaz Morovati Sharifabadi, Nasrin Nouri* Pages 441-445

    Bronchobiliary fistula (BBF) in adults is a rare complication characterized by an abnormal connection between the right bronchial system and the biliary tract. BBF may occur due to various causes, including trauma, infections, malignancies, and complications of certain surgical procedures involving the liver or the hepatobiliary system. In this paper, we report a case of BBF that developed after liver hydatid cyst resection in a 58-year-old Iranian man. The patient presented with acute dyspnea and yellowish sputum. After diagnostic measures such as bronchoscopy, computed tomography (CT) scan, and endoscopic retrograde cholangiopancreatography (ERCP) confirmed the diagnosis of BBF, the patient received intravenous (IV) antibiotic therapy, placement of a pleural drain, sphincterotomy, and common bile duct (CBD) stent insertion. These measures were not effective, and the patient became a candidate for thoracotomy with resection of the fistula and the involved lung. During surgery, absolute lung isolation was achieved by insertion of a left-sided double-lumen endobronchial tube, and uneventful anesthesia was maintained for about 5 hours. Patients with BBF present unique challenges in anesthetic management: sepsis, pulmonary impairment, electrolyte imbalances, and malnutrition confront anesthesiologists with many perioperative difficulties. During surgery, absolute lung isolation is typically necessary, and achieving effective isolation can be challenging because of the fistula. Postoperatively, intensive respiratory support, chest tube drainage, and appropriate antibiotic therapy may be required. In addition, a multidisciplinary approach involving anesthesiologists, thoracic surgeons, and other specialists is crucial.

    Keywords: Anesthesia, One-Lung Ventilation, Broncho-Biliary Fistula, BBF, Thoracotomy
  • Shabnam Asadi, Amirreza Haji Azizi, Shiva Soraya, Mohammad Faramarzi, Ruohollah Seddigh, Ali Asadi* Pages 446-451
    Background

    Diagnosing factitious disorder (FD) poses significant medical challenges; diagnostic delays affect patient care and costs. Cultural factors in each country also influence illness behavior and its presentation. This study examines the prevalence, demographics, and clinical features of patients with factitious disorder in Iranian psychiatric hospitals.

    Methods

    This cross-sectional study reviewed patient data from three psychiatric hospitals in Tehran from 2017 to 2022, with FD diagnoses confirmed by psychiatry faculty. The inclusion criterion was a diagnosis of FD according to ICD-10 within the last five years. We recorded demographic data, main stressors, symptoms, and diagnoses and analyzed them with SPSS-25. Data are presented as numbers and percentages and compared between groups by the chi-square test.

    Results

    A total of 17 cases with a diagnosis of factitious disorder were identified over 5 years (4.315 per 10,000 patients). The most frequent age range was 20-30 years, and most of the patients were male. Our results showed that in only 7 cases was there an initial suspicion of factitious disorder or factitious disorder imposed on another (factitious disorder by proxy). Most of the patients had psychiatric comorbidities, the most common being substance use disorder and cluster B personality disorder. The most common grounds for suspecting factitious disorder were a history of multiple previous hospitalizations (65% of cases) and a pattern of repeated symptoms (more than 40% of cases).

    Conclusion

    This study showed that FD is underdiagnosed, and more attention is needed to the signs of this diagnosis in the assessments. Also, the clinical features showed that treatment should account for comorbid disorders.

    Keywords: Factitious Disorder, Psychiatric Inpatients, Munchausen Syndrome
  • Peyman Kamali Hakim, Mehrzad Mehdizade, Fahimeh Zeinalkhani*, Arian Karimi Rouzbahani, Hadise Zeinalkhani, Hamid Rajabi, Hamed Ghorani, Sina Delazar Pages 452-456
    Background

    Developmental dysplasia of the hip (DDH) describes a spectrum of structural abnormalities involving the growing hip. Early diagnosis and treatment are critical to providing the best possible functional outcome. This study aimed to evaluate the prevalence of DDH in neonates with and without risk factors and to determine the role of ultrasound screening in the initial diagnosis.

    Methods

    This prospective cross-sectional study was conducted on 399 infants at the Pediatric Treatment Center, Tehran University of Medical Sciences, between December 2015 and June 2016. Infants with suspected DDH who underwent hip ultrasonography were included, and the presence or absence of each risk factor was documented according to the checklist. The ultrasound findings were also registered in the checklists. The odds ratio (OR) of each risk factor for DDH was calculated. The collected data were analyzed by SPSS software version 18 at a 0.05 significance level.  
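    A small worked sketch of the odds ratio calculation: the 2x2 counts below are back-calculated from the reported percentages (so approximate), and the Fisher exact test is used here simply to attach a P value:

    ```python
    # Worked sketch: odds ratio of DDH given any risk factor. Counts are back-calculated
    # from the reported percentages, so they are approximate, not the study's raw table.
    from scipy.stats import fisher_exact

    #         DDH present, DDH absent
    table = [[85, 244],    # risk factor present
             [ 2,  68]]    # risk factor absent

    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.2f}, P = {p_value:.4g}")
    ```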

    Results

    Over 16 months, 174 (43.6%) male and 225 (56.4%) female infants under the age of 18 months were studied. Risk factors were detected in the medical history of 329 infants. Overall, 230 (57.6%) were firstborn children, 7 (1.75%) had a positive family history of DDH, and 26 (6.5%) had limb anomalies. There was also a history of breech presentation in 16 (4.01%) and of oligohydramnios in 21 (5.1%) of the infants. The prevalence of DDH was 25.8% in infants with risk factors and 2.8% in those without risk factors (OR = 11.84; P < 0.05).

    Conclusion

      In this study, the frequency of DDH was significantly higher in infants with risk factors. The female gender and limb anomalies were stronger risk factors for DDH. Overall, ultrasound showed great potential for DDH screening.

    Keywords: Developmental Dysplasia Of The Hip, Ultrasound, Screening, Risk Factor
  • Seyyed Hadi Jabali, Shahram Yazdani, Hamid Pourasghari, Mohamadreza Maleki* Pages 457-473
    Background

    Evidence-informed policymaking is a complex process that requires adapting to diverse contexts characterized by varying degrees of certainty and agreement. Existing models and frameworks often lack clear guidance for dealing with such contexts. This study aimed to develop a novel contingency model to guide the context-specific use of evidence in health policymaking.  

    Methods

    The study conducted a meta-ethnographic synthesis of 15 existing models and frameworks on evidence-informed policymaking, integrating key factors and concepts influencing the use of evidence in policy decisions. The study also adapted the Stacey Matrix, a tool for understanding the complexity of decision-making, into a quantitative scoring system to assess the levels of certainty and agreement in a given policy context.  

    Results

    The study proposed a contingency model that delineates seven modes of decision-making based on the dimensions of certainty and agreement, ranging from rational to molasses-slow collective. For each mode, the model suggests configuring four aspects: team composition, policy idea generation, problem analysis, and consensus building. The model also highlights the multifaceted influences of evidence, interests, values, and beliefs on policy decisions.  

    Conclusion

    The contingency model offers researchers and policymakers a flexible framework for aligning policymaking processes with available evidence. The model also underscores the importance of context-specific approaches to evidence-informed policymaking. The model could enhance evidence-informed policymaking capacity, improving health outcomes and system performance. Further research should validate and extend the model empirically across diverse contexts.

    Keywords: Evidence-Informed Health Policymaking, Contingency Model, Meta-Model, Meta-Ethnography, Decision-Making
  • Siros Hemmatpour, Ghobad Moradi, Mehdi Zokaei, Zhaleh Karimi, Yousef Moradi, Elham Nouri* Pages 474-480
    Background

    Neonatal mortality is a significant public health issue that can often be prevented. The present study was designed and conducted to determine the causes of neonatal mortality and the status of their mothers in Kurdistan Province in 2019.  

    Methods

    In this matched case-control study, the case group comprised 171 deceased neonates and the control group comprised 171 healthy neonates, along with their mothers, in Kurdistan Province in 2019. For each case, one control from the same city, born within the same week as the case, was randomly selected. Data were collected using a checklist containing information on the neonates and their mothers. The analysis was performed using STATA 17 software and involved descriptive and analytical tests, including frequency calculation, the chi-square test, conditional logistic regression, and a stepwise backward elimination method for multivariate analysis (removal criterion, P = 0.1). A significance level of P < 0.05 was considered statistically significant.
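    As a hedged aside (not the study's STATA code), the quantity that conditional logistic regression estimates for a 1:1-matched binary exposure reduces to the matched-pairs odds ratio, i.e., the ratio of discordant pairs; the sketch below uses invented pairs:

    ```python
    # Illustrative only: matched-pairs odds ratio for a binary exposure in 1:1 matching.
    def matched_pairs_or(pairs):
        """pairs: (case_exposed, control_exposed) tuples with 0/1 values."""
        case_only = sum(1 for c, k in pairs if c == 1 and k == 0)
        control_only = sum(1 for c, k in pairs if c == 0 and k == 1)
        return case_only / control_only   # OR estimated from discordant pairs only

    # Invented data: exposure = prematurity, one (case, control) tuple per matched pair
    pairs = [(1, 0), (1, 0), (1, 0), (0, 1), (1, 1), (0, 0), (1, 0), (0, 1)]
    print(matched_pairs_or(pairs))        # 4 / 2 = 2.0
    ```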

    Results

    The matched analysis using the backward method showed significant associations between neonatal mortality and prematurity (OR, 15.99; 95% CI, 4.38-58.31), complications during pregnancy (OR, 8.64; 95% CI, 2.80-26.66), and weight gain during pregnancy (OR, 3.04; 95% CI, 1.06-8.70).

    Conclusion

    The findings of our study suggest a positive association between neonatal mortality and specific maternal and neonatal factors, namely neonatal prematurity, complications during pregnancy, and inappropriate weight gain during pregnancy. Therefore, there is a compelling need to implement effective measures to control these identified risk factors, with the goal of reducing neonatal mortality.

    Keywords: Risk Factors, Neonatal Mortality, Matched Case-Control
  • Behzad Damari, Mohammad Reza Amir Esmaili, Noora Rafiee*, Ahmad Hajebi Pages 481-491
    Background

    It is now confirmed that mental health promotion policies need innovations beyond the scope of the health sector. In this study, an attempt was made to identify the most effective stakeholders of the public sector in the field of mental health promotion in Iran to help the policy-makers and to encourage inter-sectoral collaboration and further involvement of these effective sectors in mental health promotion plans.  

    Methods

    This was a mixed-methods study. In the first step (a literature review and a survey), the names of public agencies affecting mental health promotion were extracted. In the next step, a checklist for identifying the main stakeholders was developed, and the data from this step were analyzed by the simple additive weighting method. Ultimately, a table was constructed in the form of an institutional map to summarize the organizations affecting each risk factor of mental health promotion.
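    A brief sketch of the simple additive weighting idea, assuming invented stakeholders, criteria, scores, and weights purely for illustration:

    ```python
    # Illustrative only: simple additive weighting (SAW) over invented criteria and scores.
    import numpy as np

    stakeholders = ["Parliament", "Ministry of Interior", "National Broadcasting"]
    # rows = stakeholders, columns = criteria (e.g., authority, resources, interest)
    scores  = np.array([[9, 6, 8],
                        [8, 9, 7],
                        [7, 5, 9]], dtype=float)
    weights = np.array([0.5, 0.3, 0.2])        # criterion weights, summing to 1

    normalized = scores / scores.max(axis=0)   # benefit criteria: divide by column max
    saw = normalized @ weights
    for name, value in sorted(zip(stakeholders, saw), key=lambda x: -x[1]):
        print(f"{name}: {value:.3f}")
    ```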

    Results

    The Islamic Consultative Assembly, the Ministry of Interior, the Islamic Republic of Iran Broadcasting, the Ministry of Cooperatives, Labor, and Social Welfare, and the Ministry of Education were identified as the five institutions with the greatest impacts on the social determinants of mental health in Iran.   

    Conclusion

    Significant impacts can be exerted by institutions such as the Islamic Consultative Assembly (as the legislator), the Ministry of Interior, and its subsidiary entities such as municipalities and governors (as the administrators of homeland security and support for safe and appropriate urban and local facilities), the Islamic Republic of Iran Broadcasting (as the national media), the Ministry of Cooperatives, Labor, and Social Welfare (as the institution in charge of employment, job security, and social welfare), and the Ministry of Education (as the educational institution of the country).

    Keywords: Stakeholder, Intersectoral Collaboration, Health Promotion, Mental Health
  • Hossein Rezai, Hooshang Dadgar*, Amir Kasaeian Pages 492-502
    Background

    Many children with autism spectrum disorder (ASD) are unable to benefit from timely interventions. This research aimed to indirectly enhance play and communication skills in ASD children by providing a video educational package and distance education for their parents.  

    Methods

    In this clinical trial study, 32 parents and their children with ASD were randomly assigned to either the intervention or waitlist control groups. The intervention group received an educational video package along with 24 one-hour online sessions. The frequency of communication, engagement in functional games, and the use of conventional and unconventional gestures were assessed before, immediately after, and 3 months following the completion of the intervention in the participating children. The variables were analyzed within and between the two groups using a mixed between-within-subjects analysis of variance (ANOVA).  
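    A minimal sketch of such a mixed between-within ANOVA, assuming hypothetical scores and the pingouin package rather than the authors' software:

    ```python
    # Illustrative only: mixed between-within ANOVA (group x time) with pingouin.
    import pandas as pd
    import pingouin as pg

    scores = {
        "intervention": [[5, 9, 8], [4, 10, 9], [6, 11, 10], [5, 8, 8]],
        "waitlist":     [[5, 6, 5], [4, 5, 5], [6, 6, 6], [5, 5, 4]],
    }
    rows, child_id = [], 0
    for group, children in scores.items():
        for child_scores in children:
            child_id += 1
            for time_point, score in zip(["pre", "post", "follow_up"], child_scores):
                rows.append({"child": child_id, "group": group,
                             "time": time_point, "score": score})
    df = pd.DataFrame(rows)

    aov = pg.mixed_anova(data=df, dv="score", within="time",
                         subject="child", between="group")
    print(aov)
    ```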

    Results

    The intervention group achieved significantly higher scores than the control group in the frequency of communication (P = 0.003), functional play (P < 0.001), and conventional gestures (P < 0.001). Conversely, the intervention group had significantly lower scores than the control group in unconventional gestures (P < 0.001).   

    Conclusion

    The observed improvements in both parents and children within the intervention group provide compelling support for the effectiveness of telepractice in speech therapy. This suggests that incorporating remote training methods into speech therapy sessions could enhance access for children with ASD to these interventions.

    Keywords: Instructional Film, Video, Autism Spectrum Disorder, Nonverbal Communication, Play, Playthings, Prenatal Education
  • Mehryar Taghavi Gilani, Alireza Bameshki, Majid Razavi*, Ghorbanali Sadeghzadeh Pages 503-507
    Background

    Pulmonary compliance is an important lung parameter and is affected by tidal volume. In this study, static and dynamic compliance with tidal volumes of 6 and 10 mL/kg were evaluated in patients undergoing abdominal cancer surgery.

    Methods

    This randomized clinical trial was conducted on 50 patients who were candidates for abdominal cancer surgery. The study included patients aged 20-65 years without chronic diseases. After induction of anesthesia, the first group was ventilated with a tidal volume of 10 mL/kg at 8 breaths/minute, and the second group was ventilated with a tidal volume of 6 mL/kg at 14 breaths/minute. From the start of ventilation and every 15 minutes thereafter for two hours, expiratory tidal volume, peak and plateau airway pressures, heart rate, and blood pressure were measured. The data were analyzed with SPSS v.20, and P < 0.05 was considered significant.
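    For context, the compliance values behind these measurements follow the standard bedside formulas; the sketch below assumes a PEEP term, which the abstract does not report, and uses invented example numbers:

    ```python
    # Standard bedside formulas (not from the abstract); PEEP is an assumed input.
    def static_compliance(vt_ml: float, p_plateau: float, peep: float) -> float:
        """Cstat = Vt / (Pplat - PEEP), in mL/cmH2O."""
        return vt_ml / (p_plateau - peep)

    def dynamic_compliance(vt_ml: float, p_peak: float, peep: float) -> float:
        """Cdyn = Vt / (Ppeak - PEEP), in mL/cmH2O."""
        return vt_ml / (p_peak - peep)

    # Example: 70-kg patient at 6 mL/kg (Vt = 420 mL), PEEP 5, Pplat 16, Ppeak 22 cmH2O
    print(round(static_compliance(420, 16, 5), 1))   # ~38.2
    print(round(dynamic_compliance(420, 22, 5), 1))  # ~24.7
    ```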

    Results

    There was no significant difference between the two groups in demographic characteristics, nor in dynamic and static compliance during the study. Static compliance decreased in the 6 mL/kg group and increased in the 10 mL/kg group, but the difference was not statistically significant (P = 0.32). Peak pressure, plateau pressure, and hemodynamic parameters were similar in the two groups.

    Conclusion

    In general, static and dynamic compliance did not differ significantly between the two groups, despite a slight decrease in the 6 mL/kg group and a slight increase in the 10 mL/kg group.

    Keywords: Mechanical Ventilation, Tidal Volume, Dynamic Compliance, Static Compliance
  • Omid Moradi Moghaddam, Mohammajavad Gorjizadeh, Mohsen Sedighi, Alireza Amanollahi, Ali Khatibi, Mohammadreza Ghodrati, Mohammad Niakan Lahiji* Pages 508-511
    Background

    Acid-base disturbances are frequently found in intensive care unit (ICU) patients. Base excess (BE) is commonly used to quantify the degree of metabolic impairment. We aimed to compare the predictive value of BE and Sequential Organ Failure Assessment (SOFA) score for mortality in ICU patients.  

    Methods

    This prospective, observational investigation was performed on 87 ICU patients who underwent mechanical ventilation. The SOFA score and acid-base variables at 6 hours after ICU admission were analyzed and compared between survivors and non-survivors. A receiver operating characteristic (ROC) curve was used to analyze the predictive value of BE and SOFA for mortality.

    Results

    The mean age of the patients was 63.91±5.03 years, and 60 (69%) were male. Non-survivors had significantly higher SOFA (P = 0.001) and APACHE II scores (P = 0.001). Non-survivors also had lower bicarbonate (P = 0.002), PO2 (P = 0.001), and pH (P = 0.0021) and higher PCO2 (P = 0.001) compared with survivors, and most patients who died (80%) had a low BE value (< -2) (P = 0.002). The estimated AUCs of SOFA and BE were 0.83 (95% CI, 0.73-0.92) and 0.71 (95% CI, 0.57-0.85), respectively.

    Conclusion

    BE is, to some extent, capable of predicting mortality in ICU patients. However, the SOFA score is a more accurate and reliable parameter than BE for this prediction.

    Keywords: SOFA, Base Excess, Intensive Care, Mortality
  • Ali Bagherpour, Behzad Motaharian, Farzaneh Lal Alizadeh, Maryam Valizadeh, Kosar Hosseini* Pages 512-517
    Background

    The term ponticulus posticus (PP) refers to a complete or partial bony bridge over the groove of the vertebral artery on the superolateral surface of the posterior arch of the atlas. This study was conducted to investigate the prevalence of ponticulus posticus in orthodontic patients referred to Mashhad Dental School.

    Methods

    In this cross-sectional study, one thousand cephalograms were selected from the patients referred to the orthodontics department of Mashhad Dental School between 2017 and 2021. In lateral cephalogram images with appropriate quality, the type of malocclusion was determined using the AudaxCeph software (Audax d.o.o., Ljubljana, Slovenia). Then, the images were evaluated for the presence or absence of PP. For the statistical analysis, chi-square and t-test were used.   

    Results

    In this study, 861 lateral digital cephalograms were analyzed. The overall prevalence of PP in the studied population was 17.5%. The prevalence of PP was higher in males than in females (P < 0.001). The presence of PP (P = 0.056) and the type of PP (P = 0.522) were independent of age group. Although class II subjects showed a higher prevalence of PP, skeletal malocclusion class was not correlated with the presence of PP (P = 0.104) or with its type (P = 0.958).

    Conclusion

    The current study is among the first to provide data on the prevalence of PP in the east of Iran and showed that PP is not rare in this region. More studies with 3D radiological examination are needed to increase the accuracy of diagnosing PP and estimating its prevalence in Iran.

    Keywords: Lateral Cephalogram, Ponticulus Posticus, Malocclusion, Radiography
  • Farhad Fatehi, Seyed Ali Hoseini, Nazila Akbarfahimi*, Abdolreza Yavari Pages 518-524
    Background

    Spinal cord injury (SCI) is a life-long neurological disease. This study reviews the literature on the vocational rehabilitation (VR) of people who experience SCI.   

    Methods

    MEDLINE (via PubMed), Web of Science, EMBASE, Google Scholar, ProQuest, and Science Direct databases were searched. Inclusion criteria were studies describing only adults with SCI, published in English or Persian, and involving people of working age. Conference abstracts, case studies, and editorials were excluded.

    Results

    The eligibility of 186 full-text articles was assessed, and 124 studies met the inclusion criteria. Most studies focused on barriers and facilitators for work in people with SCI.  

    Conclusion

    There are no current services and programs in Iran that support post-injury employment of people with SCI, and therefore, there remains a need for studies addressing employment in this population.

    Keywords: Return To Work, Spinal Cord Injuries, Vocational Rehabilitation
  • Samira Mazaheri, Zahra Soleymani*, Roxanne Hudson, Saeed Talebian Pages 525-532
    Background

    This research compares the effectiveness of two reading interventions in improving reading outcomes for third- to fifth-grade Farsi-speaking students with dyslexia.

    Methods

    In this randomized controlled trial, twenty students in Tehran were randomly assigned to a multi-component intervention group or a comprehension-based intervention group, each receiving 36 sessions of 45 minutes. The effectiveness of the interventions was evaluated using adjusted mean differences with a one-way ANCOVA.
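    A hedged sketch of a one-way ANCOVA of this kind, assuming hypothetical `pre`/`post` scores and group labels and using statsmodels (not necessarily the authors' software):

    ```python
    # Illustrative only: ANCOVA of post-test scores with pre-test as covariate.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "group": ["multi_component"] * 5 + ["comprehension"] * 5,
        "pre":   [12, 15, 10, 14, 11, 13, 12, 16, 10, 14],
        "post":  [15, 18, 13, 17, 14, 19, 18, 22, 15, 20],
    })

    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))  # group effect adjusted for pre-test
    ```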

    Results

    The results revealed the comprehension-based intervention's superior effect size across most outcomes, except for letter strings. The effect size was large for word reading (0.93; CI, -0.002 to 1.85), medium for phoneme deletion (0.67; CI, -0.23 to 1.5), small for text comprehension (0.25; CI, -0.62 to 1.13), and trivial for both rhyme identification (0.1; CI, -0.77 to 0.98) and non-word reading (0.11; CI, -0.76 to 0.98). The multi-component intervention had a greater effect size on letter strings than the other intervention, although it was small (-0.21; CI, -1.09 to 0.66).

    Conclusion

    The study concluded that comprehension-based intervention was more effective for Farsi-speaking students with dyslexia in grades 3-5, emphasizing the need for diverse intervention approaches to address their specific needs.

    Keywords: Reading, Intervention, Farsi, Dyslexia, Multi-Component, Comprehension
  • Maryam Hormozi, Meysam Moulaee, Mahdi Alaee, Nasim Beigi Boroujeni, Mandana Beigi Boroujeni* Pages 533-539
    Background

    Silymarin is a flavonolignan with various medicinal properties, including hepatoprotective, antioxidant, anti-inflammatory, anticancer, and cardioprotective activities. The aim of this study was to investigate the effect of silymarin on the expression of mir-21, matrix metalloproteinases (MMPs), and their tissue inhibitors (TIMPs) in the HepG2 liver cancer cell line.

    Methods

    An in-vitro experimental study was conducted on human HepG2 cells obtained from the Pasteur Institute, Tehran, Iran. Based on the MTT assay, four silymarin concentrations of 0 (control), 50, 100, and 150 µM were chosen as the study groups. Gene expression was assessed using real-time PCR. The studied genes were mir-21, MMP-2, MMP-9, TIMP-1, and TIMP-2; in addition, the apoptosis-related genes BAX, BCL2, and Caspase3 (CAS3) were investigated. GAPDH was used as an internal control. Relative expression was calculated with the REST program, using a t test on log-transformed expression values at a significance level of 0.05.
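    As a simplified illustration only: relative expression from qPCR Ct values is often summarized with the 2^-ddCt rule, whereas REST itself applies a randomization-based test; the Ct values below are invented:

    ```python
    # Simplified 2^-ddCt illustration (REST itself uses a randomization test); Ct values invented.
    def relative_expression(ct_target_treated, ct_ref_treated,
                            ct_target_control, ct_ref_control):
        delta_treated = ct_target_treated - ct_ref_treated    # normalize to GAPDH
        delta_control = ct_target_control - ct_ref_control
        return 2.0 ** -(delta_treated - delta_control)

    # Example: mir-21 vs GAPDH in 100 uM silymarin-treated vs control HepG2 cells
    fold_change = relative_expression(26.1, 18.0, 24.0, 18.2)
    print(round(fold_change, 2))   # < 1 indicates down-regulation vs control
    ```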

    Results

    Significant up-regulation was observed for the TIMP genes at 100 µM and 150 µM and for the apoptosis-activating genes CAS3 and BAX (P < 0.05). Significant down-regulation was observed for MMP-9 at all concentrations, MMP-2 at 100 µM, and the apoptosis-inhibitory gene BCL2 at 50 µM and 100 µM (P < 0.05). In addition, mir-21, an oncogenic microRNA, was significantly down-regulated at all doses (P < 0.05). All comparisons were with the control group.

    Conclusion

    The present study showed that silymarin can affect the HepG2 cell line at the gene expression level by increasing apoptosis and changing the expression of MMP-2, MMP-9, TIMP-1, TIMP-2, and mir-21. These findings were consistent with one another and point toward suppression of tumoral activity in this cell line.

    Keywords: Gene Expression, Apoptosis, Cancer, Hepatocellular Carcinoma
  • Ehsan Fallah, Mobin Naghshbandi, Roya Ghafoury, Nima Hosseini Zare* Pages 540-550
    Background

    Anterior cruciate ligament (ACL) reconstruction is pivotal for restoring knee stability and function in individuals with ACL injuries. While bone-patellar tendon-bone (PT), hamstring tendon (HT), and quadriceps tendon (QT) autografts are commonly employed, their comparative effectiveness remains a subject of ongoing research. This study aims to comprehensively compare the functional outcomes, knee stability, revision rates, and incidence of anterior knee pain associated with these autografts.  

    Methods

    In this randomized clinical trial, adult male participants undergoing primary single-bundle ACL reconstruction were randomized into three groups (PT, HT, QT) using a computer-generated sequence with allocation concealment. Blinded assessments were conducted at 2-, 6-, and 12-months post-surgery to evaluate knee function, stability, and patient satisfaction. The rehabilitation protocol was standardized across groups, including specific exercises and cryotherapy, to minimize postoperative swelling and pain.  

    Results

      A total of 75 participants were followed for 12 months post-surgery. While significant improvements in knee function and stability were observed across all groups, there were no statistically significant differences between the autograft types in terms of revision rates or the incidence of anterior knee pain. Detailed statistical analysis revealed effect sizes and confidence intervals, substantiating the clinical relevance of the findings.  

    Conclusion

    PT, HT, and QT autografts each provide favorable outcomes for ACL reconstruction without significant differences in efficacy up to one year postoperatively. Level of Evidence: Level 2 (Randomized Clinical Trial)

    Keywords: Anterior Cruciate Ligament, Quadriceps Tendon, Hamstring Tendon, Tendon Graft, Patella Tendon Graft
  • Mohammad Haji Aghajani, Roxana Sadeghi, Mohammad Parsa Mahjoob, Amir Heidari, Fatemeh Omidi, Mohammad Sistanizad, Asma Pourhoseingholi, Seyed Saeed Hashemi Nazari, Mahmoud Yousefifard, Reza Miri*, Niloufar Taherpour Pages 551-561
    Background

    The current registry system aims to design a database that can be used for future research as a tool to produce and update new protocols for the diagnosis, treatment, management, and prevention of heart diseases.  

    Methods

    In this hospital-based registry system, established on 27 July 2021, all adult patients (age ≥18 years) with signs and symptoms of cardiac disease undergoing coronary angiography or angioplasty in the cardiac ward of Imam Hossein Hospital, Tehran, Iran, were recruited and followed up until 30 days after discharge in the pilot phase. All data were collected using a researcher-made checklist from face-to-face interviews with patients and from their medical records. The data were registered electronically in web-based software. Quality control (QC) is conducted monthly by the QC team to ensure the quality of the documented data.

    Results

    Among 1265 patients undergoing coronary angiography or angioplasty over one year, 97% (n=1198) were Iranian, and 991 (73.33%) lived in the country's capital, Tehran. About 55% (n=706) of patients were male. The mean age of the total patients was 60.48 ± 12.01 years. In total, 764 (60.39%) patients were diagnosed with coronary artery disease (CAD); of these, 32.72% (n=250) and 1.18% (n=9) had premature and very early CAD, respectively. During one year, 22.54% (n=279) and 7.02% (n=87) of patients underwent PCI and CABG, respectively.

    Conclusion

    Since CVDs, and especially CAD, are among the most common and highest-priority diseases in Iran's health system, establishing a coronary angiography and angioplasty registration system is an opportunity to study the epidemiological and clinical course of CVDs within an accurate registration system.

    Keywords: Coronary Artery Disease, Coronary Angiography, Coronary Angioplasty, Registry System, Iran
  • Sara Parviz, Fahimeh Zeinalkhani*, Masoumeh Gity, Hamidreza Saligheh Rad, Anahita Fathi Kazerooni, Fatemeh Nili, Peyman Kamali Hakim, Hadise Zeinalkhani Pages 562-567

    Primitive neuroectodermal tumors (PNET) are a family of poorly differentiated malignant neoplasms of neuroectodermal origin. According to the location of origin, PNETs can be categorized as central or peripheral. Peripheral PNET (pPNET) is an uncommon type that accounts for 1% of all soft tissue sarcomas and occurs outside the central and sympathetic nervous systems. Ovarian PNET is a very rare tumor with a high mortality rate. We report a case of pPNET originating from the pelvic cavity of a young woman. Ultrasound and magnetic resonance imaging (MRI) findings demonstrated a high-grade malignant ovarian tumor. On microscopic evaluation, the tumor was composed of solid nests and sheets of small rounded cells, and on immunohistochemical (IHC) evaluation, the tumor cells showed intense membranous immunoreactivity for MIC2 protein (CD99). pPNET should be considered in the differential diagnosis of any invasive pelvic tumor in young women.

    Keywords: Primitive Neuroectodermal Tumor (PNET), Ovary, Magnetic Resonance Imaging (MRI) Findings, Tumor, Diagnosis
  • Mansoureh Farhangniya*, Ali Samadikuchaksaraei, Farzaneh Mohamadi Farsani Pages 568-577
    Background

    The skin is the largest organ in the body and has several important protective and regulatory functions. However, wound development can disrupt the natural healing process, leading to challenges such as chronic wounds, persistent infections, and impaired angiogenesis. These issues not only affect individuals' well-being but also pose significant economic burdens on healthcare systems. Despite advances in wound care research, managing chronic wounds remains a pressing concern, with obstacles such as persistent infection and impaired angiogenesis hindering the healing process. Understanding the complex genetic pathways involved in wound healing is crucial for developing effective therapeutic strategies and reducing the socio-economic impact of chronic wounds. Weighted gene co-expression network analysis (WGCNA) offers a promising approach to uncovering key genes and modules associated with different stages of wound healing, providing valuable insights for targeted interventions to enhance tissue repair and promote efficient wound healing.

    Methods

    Data collection involved retrieving microarray gene expression datasets from the Gene Expression Omnibus website, with 65 series selected according to inclusion and exclusion criteria. Preprocessing of raw data was performed using the Robust Multi-array Average (RMA) approach for background correction, normalization, and gene expression calculation. Weighted gene co-expression network analysis was employed to identify co-expression patterns among genes associated with wound healing processes. This involved network construction, topological analysis, module identification, and association with clinical traits. Functional analysis included enrichment analysis and identification of hub genes through gene-gene functional interaction network analysis using the GeneMANIA database.
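    A toy sketch of the co-expression idea that WGCNA formalizes (adjacency as a power of the correlation, then clustering genes into modules); the study used the R WGCNA package, and the expression matrix below is random placeholder data:

    ```python
    # Toy illustration of WGCNA-style co-expression: adjacency = |correlation|^beta,
    # then hierarchical clustering of genes into "modules". Data are random placeholders.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    expr = rng.normal(size=(20, 50))           # 20 genes x 50 samples (placeholder)

    corr = np.corrcoef(expr)                   # gene-gene Pearson correlation
    beta = 6                                   # soft-thresholding power
    dissimilarity = 1 - np.abs(corr) ** beta

    # Condensed upper-triangle distances feed the hierarchical clustering
    condensed = dissimilarity[np.triu_indices(20, k=1)]
    Z = linkage(condensed, method="average")
    modules = fcluster(Z, t=4, criterion="maxclust")   # four flat clusters as "modules"
    print(modules)
    ```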

    Results

    The WGCNA analysis indicated significant correlations between wound healing and the black, brown, and light green modules. These modules were further examined for their relevance to wound healing traits and subjected to functional enrichment analysis. A total of 16 genes were singled out as potential hub genes critical for wound healing. These hub genes were then scrutinized within a gene-gene functional interaction network built on the module network and the KEGG enrichment database. Pathways such as MAPK, EGFR, and ErbB signaling, as well as essential cellular processes including autophagy and mitophagy, emerged as the most significantly enriched.

    Conclusion

    We identified consensus modules relating to wound healing across nine microarray datasets. Among these, 16 hub genes were uncovered within the brown and black modules. KEGG enrichment analysis identified co-expression genes within these modules and highlighted pathways most closely associated with the development of wound healing traits, including autophagy and mitophagy. The hub genes identified in this study represent potential candidates for future research endeavors. These findings serve as a stepping stone toward further exploration of the implications of these co-expressed modules on wound healing traits.

    Keywords: Wound Healing, Gene Expression, Network Analysis, Autophagy, Mitophagy
  • Sara Cheraghi, Maryam Honardoost, Fereshteh Abdolmaleki, Mohammad E. Khamseh* Pages 578-581
    Background

    Papillary thyroid carcinoma is the most frequent type of thyroid cancer. The BRAFV600E mutation is associated with tumor progression. We explored the utility of the BRAF molecular testing on fine needle aspiration fixed specimens of patients with confirmed diagnoses of papillary thyroid carcinoma.  

    Methods

    Fixed thyroid cytology slide specimens of 19 patients with Bethesda II to VI reports were used to detect BRAFV600E mutation by pyrosequencing of extracted DNA.   

    Results

    BRAFV600E mutation was detected in 25% of the specimens with Bethesda category III and IV nodules and in 73% of the nodules with Bethesda category V and VI.   

    Conclusion

    BRAF mutation analysis can be performed on fixed fine needle aspiration cytology specimens. Although the frequency of the mutation is higher in specimens with higher Bethesda category scores, it could support clinical decision-making in thyroid nodules with intermediate Bethesda category scores.

    Keywords: Papillary Thyroid Cancer, Fine Needle Aspiration, BRAFV600E
  • Abdollah Soltan-Tajian, Alireza Jabbari, Nasrin Shaarbafchizadeh*, Peivand Bastani Pages 582-589
    Background

    In recent decades, healthcare purchasing has continuously searched for new approaches to improve performance, and the pressure of expensive services resulting from more advanced health technology has increased the necessity of such changes. Strategic purchasing of health services, although a recommended approach, has scarcely been examined for diagnostic imaging services. This study explores the potential determinants of strategic purchasing in the context of diagnostic imaging services.

    Methods

    This was a qualitative study conducted in 2023 through framework analysis (applying the five stages of familiarization, identifying a thematic framework, indexing, mapping, and interpretation), based on the World Health Organization framework for strategic purchasing. This framework includes 5 questions (what to buy? from whom to buy? for whom to buy? through what mechanism to buy? at what price to buy?) plus a dimension of under what structure to buy. Data were gathered through semi-structured interviews with key informants, and data saturation was reached after 18 interviews. After transcribing each interview, data were analyzed using MAXQDA software.

    Results

    A total of 32 factors were identified as influencing the strategic purchasing of diagnostic imaging services. The most important were the development of an evidence-based service package using a prospective combined payment system, consideration of the burden of disease and health needs, implementation of a referral system and family physician program integrated with the electronic health record, and, above all, political belief and technical capacity.

    Conclusion

    The implementation of a strategic purchasing policy requires a systemic approach to the factors affecting it. A number of specific and sometimes interconnected activities must be carried out in different areas of strategic purchasing. Governance of purchasing is the foundation of strategic purchasing, and this aspect should be investigated further in countries like Iran.

    Keywords: Healthcare, Health System Financing, Purchasing, Strategic Purchasing, Diagnostic Imaging Services
  • Ali Arash Anoushirvani, Seidamir Pasha Tabaeian, Minoo Maarefi, Samira Basir Shabestari* Pages 590-596
    Background

    Both coronavirus disease 2019 (COVID-19) and cancer place a heavy burden on society and on patients' mental health. Spiritual health may play a prominent role in coping with stressful conditions. Considering the existing controversy regarding the correlation between spiritual health and stress related to the COVID-19 pandemic in cancer patients, this study aimed to assess the correlation between spiritual health and COVID-19 stress in cancer patients.

    Methods

    This cross-sectional study was conducted on cancer patients presenting to Rasoul and Firouzgar Hospitals, affiliated with Iran University of Medical Sciences, in 2022. After obtaining written informed consent, eligible patients filled out the spiritual health questionnaire and COVID Stress Scale (CSS). Data were analyzed by the Pearson and Spearman correlation coefficients and one-way ANOVA.  

    Results

    The mean (SD) levels of COVID-19 stress and spiritual health were 106.5 (44.5) and 26.2 (10.9), respectively, both regarded as moderate. An inverse correlation of -0.48 was found between spiritual health and COVID-19 stress (P < 0.001), and spiritual health decreased with increasing cancer stage (P < 0.001). The mean COVID-19 total stress score and its domain scores [except for the post-traumatic stress disorder (PTSD) domain] were significantly higher in patients with poor spiritual health compared with those with good spiritual health. However, the difference was not significant between patients with poor and moderate spiritual health (P > 0.05).

    Conclusion

    This study confirmed the prominent role of spiritual health in the reduction of COVID-19 stress. Promotion of spiritual health in cancer patients should be considered as an inseparable part of patient care to prevent disease aggravation and decrease the stress level of cancer patients, particularly during the COVID-19 pandemic.

    Keywords: Cancer, COVID-19, Health, Pandemics
  • Ehsan Zarei, Iman Yousefi, Saba Shiranirad, Tahmineh Poursaki, Mohamad Mehdi Zahmatkesh, Pouria Farrokhi* Pages 597-608
    Background

    Defensive medicine (DM) refers to taking or not taking clinical actions, mainly to prevent legal or reputational consequences. It increases patient and health system costs and threatens patient safety. This study aimed to provide policy options to reduce DM behaviors and was conducted in two phases.  

    Methods

    First, a scoping review was conducted by searching the Web of Science, PubMed, ProQuest, and Scopus databases for 2000-2023, and interventions and strategies to control DM behaviors were identified. To identify the advantages, disadvantages, and implementation considerations, one focus group discussion (FGD) session with experts was held. Finally, the policies, strategies, advantages, disadvantages, and implementation considerations were refined and categorized during two expert panel sessions.

    Results

    During the search, 1774 articles were retrieved; after screening, 58 articles were included in the study. Four main policy options were formulated: "evidence-based medicine," "legal reforms," "promotion of professional ethics and a supportive environment," and "improving the doctor-patient relationship." In addition, 13 interventions and strategies, 18 advantages, 18 disadvantages, and 21 implementation considerations were identified.

    Conclusion

    Managing and reducing DM behaviors requires interventions at the macro, organizational, and individual levels. At the micro (individual) level, enhancing knowledge and skills is valuable. Organizational interventions that create a supportive culture and promote ethical behavior are also important.

    Keywords: Defensive Medicine, Defensive Practice, Conflict Of Interest, Medical Malpractice, Lawsuits
  • Matine Gharavi, Katayoun Salem*, Elham Shirazi Pages 609-615
    Background

    Behavioral problems in children contribute significantly to non-compliance and lack of cooperation with dentists. This study aimed to assess the impact of parenting styles on the success of conscious sedation with midazolam in uncooperative children aged 4 to 6 years.

    Methods

    This short-term longitudinal study included ninety-six children aged 4-6 years who were classified as uncooperative according to the Frankl Behavior Rating Scale (Frankl I, II) and required pulp treatment and stainless-steel crown (SSC) restoration. Midazolam was administered orally at 0.25 mg/kg. Parents completed the Parental Stress Dental Questionnaire (PSDQ), Strengths and Difficulties Questionnaire (SDQ), and Children's Fear Survey Schedule-Dental Subscale (CFSS-DS). Treatment began at least thirty minutes after drug administration. Vital signs were monitored using a pulse oximeter. Sedation effectiveness was assessed with the Houpt scale at local anesthesia injection (T0), cavity preparation (T1), restoration (T2), and treatment conclusion (T3). Statistical analysis used the Kruskal-Wallis and Mann-Whitney U tests (P < 0.05); a minimal analysis sketch follows this entry.

    Results

    Most parents (69, 71.9%) had an authoritative parenting style, while 10 (10.4%) were authoritarian and 17 (17.7%) were permissive. Authoritative parenting was significantly associated with sedation success (P = 0.001) and reduced dental fear (P = 0.008), whereas authoritarian (P = 0.031) and permissive (P = 0.001) parenting styles were associated with sedation failure. Authoritarian parenting was positively associated with increased dental fear (P = 0.001). No significant association was found between the permissive parenting style and dental fear (P = 0.279), nor between behavioral problems and parenting styles: authoritative (P = 0.625), authoritarian (P = 0.050), and permissive (P = 0.522).

    Conclusion

    Understanding parenting styles aids in predicting the success of conscious sedation with midazolam and in managing uncooperative children during dental procedures.

    Keywords: Parenting, Conscious Sedation, Midazolam, Pediatric Dentistry
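    A minimal sketch of the Kruskal-Wallis and Mann-Whitney comparisons named in the Methods above; the group labels and Houpt scores are hypothetical assumptions, not the study's data.

    ```python
    # Hypothetical sketch: compare Houpt sedation scores across parenting styles
    # with a Kruskal-Wallis test, then follow up one pair with Mann-Whitney U.
    from scipy import stats

    authoritative = [5, 6, 5, 4, 6, 5, 6]
    authoritarian = [2, 3, 2, 4, 3]
    permissive = [3, 2, 4, 3, 2, 3]

    h_stat, p_kw = stats.kruskal(authoritative, authoritarian, permissive)
    u_stat, p_mw = stats.mannwhitneyu(authoritative, authoritarian, alternative="two-sided")

    print(f"Kruskal-Wallis P = {p_kw:.3f}")
    print(f"Authoritative vs authoritarian (Mann-Whitney U) P = {p_mw:.3f}")
    ```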
  • Mahmoud Abdelhamid Elhendawy, Ahmed M Omran*, Sherif Hamdeno, Hazem Dahshan, Ahmed Abu, Elsoud, Ahmed Salem, Mohamed Ali Abdelaziz, Khallad Sholkamy, Saber M. Abdelmaksoud Pages 616-622
    Background

    The anatomy of the eyelid changes with age, and multiple changes occur in the eyelids and surrounding structures, including the malar region. Aging affects the appearance of the eyelids and midface through the formation of tear trough deformity, malar flattening, and ptosis. This study aimed to define the effect of malar fat suspension to the lateral part of the infraorbital wall, combined with orbital fat transposition, on tear trough deformity, malar flattening, and ptosis.

    Methods

    A retrospective study was carried out on 15 patients who underwent surgery between January 2020 and January 2022. The technique combines orbital fat transposition to the medial side of the infraorbital wall with malar fat suspension to the lateral side of the infraorbital wall. The average follow-up period was 12 months. Values were compared using the paired-samples Student t test for quantitative data and the Wilcoxon signed-rank test for qualitative data (a brief analysis sketch follows this entry).

    Results

    There was a significant improvement in tear trough deformity, malar ptosis, and midface lift (P < 0.05). No recurrence was observed during the 12-month follow-up. One patient experienced a minor postoperative complication in the form of prolonged ecchymosis lasting 2 months.

    Conclusion

    Transcutaneous lower blepharoplasty with orbital fat transposition and malar fat suspension to the lateral part of the infraorbital wall can be considered a safe and effective intervention with improved aesthetic outcomes. It is therefore recommended for patients with tear trough deformity and malar ptosis.

    Keywords: Blepharoplasty, Malar Ptosis, Midface Lift, Lower Eyelid
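    A brief sketch of the pre/post comparison named in the Methods above, using a paired-samples t test for a quantitative measure and the Wilcoxon signed-rank test for an ordinal one; the scores below are hypothetical, not the study's data.

    ```python
    # Hypothetical sketch: paired pre/post comparison for 15 patients
    # (quantitative measure -> paired t test; ordinal grade -> Wilcoxon signed-rank).
    from scipy import stats

    grade_pre = [3, 3, 2, 3, 2, 3, 3, 2, 3, 3, 2, 3, 3, 2, 3]    # e.g., tear trough grade before surgery
    grade_post = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 1]   # grade at 12-month follow-up

    t_stat, p_t = stats.ttest_rel(grade_pre, grade_post)   # paired t test
    w_stat, p_w = stats.wilcoxon(grade_pre, grade_post)    # nonparametric alternative

    print(f"Paired t test P = {p_t:.4f}; Wilcoxon signed-rank P = {p_w:.4f}")
    ```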
  • Yulduz Khaidarova*, Gaukhar Kurmanova, Gulzada Nurgaliyeva, Madina Omarova Pages 623-629
    Background

    High titers of antibodies to cyclic citrullinated peptide (ACCP) are often present in the serum of patients with rheumatoid arthritis (RA) and, together with rheumatoid factor (RF), serve as a diagnostic marker of RA. Brucellosis is a zoonotic infection in which osteoarticular involvement occurs in 10-85% of patients, and RF levels in brucellosis patients are significantly higher than in healthy people.

    Methods

    We present 2 cases of brucellosis spondylodiscitis with positive RF and ACCP results, which attracted considerable interest among the rheumatologists of our center.

    Results

    Both patients were men (27 and 60 years old) with arthritis, back pain, and high levels of rheumatoid arthritis-specific antibodies. Tuberculous spondylitis was initially suspected but was excluded by specific tests. During antibacterial therapy, antirheumatoid antibody levels decreased progressively. X-rays of the hand joints revealed no signs of erosive arthritis.

    Conclusion

    All cases of arthritis, spondylitis, and spondylodiscitis in endemic areas require careful analysis and comparison of patients' clinical, laboratory, and instrumental data to prevent misdiagnosis. In brucellosis, inflammation of the joints and spine is reversible with adequate antibacterial therapy.

    Keywords: Rheumatoid Arthritis, Antibodies To Cyclic Citrulline Peptide (ACCP), Chronic Brucellosis, Spondylodiscitis, Differential Diagnosis
  • Arash Khaledi, Hooman Minoonejad*, Hassan Daneshmandi, Mahdieh Akoochakian, Mehdi Gheitasi Pages 630-637
    Background

    Millions of people worldwide suffer from back pain and muscle weakness due to adolescent idiopathic scoliosis (AIS). Schroth exercises (SE) have been found to be the most effective treatment for AIS; however, it remains unclear how combining SE with asymmetric spinal stabilization exercises (ASSE) affects back pain and trunk extensor muscle endurance (TE). This study aimed to compare the effects of SE with and without ASSE on back pain and TE in AIS.

    Methods

    A randomized controlled trial was conducted with 40 boys aged 10 to 18 years with AIS, divided into three groups: SE+ASSE (n = 15), SE only (n = 15), and a waitlist control (n = 10). Participants underwent exercise training for 50-70 minutes, three times a week, for up to 12 weeks. Two variables were evaluated before and after the interventions: back pain, measured with a visual analog scale (VAS), and TE, measured with the Biering-Sorensen test. For statistical analysis, analysis of covariance (ANCOVA) followed by post-hoc Bonferroni tests was used at α = 0.05 (a minimal analysis sketch follows this entry).

    Results

    Patients who underwent the combination of SE and ASSE experienced a significant reduction in back pain (VAS, 2.9±0.8 to 0.1±0.4) compared with those who underwent SE alone (2.7±0.9 to 1.5±1.2) and the control group. No significant difference was found between the SE group and the control group in back pain reduction. Furthermore, there was no significant difference in TE among the three groups, although the combined-exercise group showed a numerical improvement (75.6±52.5 sec to 119.2±62.6 sec) compared with the other groups (P = 0.311).

    Conclusion

    The combination of SE and ASSE is more effective in reducing back pain in AIS than SE alone or no intervention. Although no significant between-group difference was found for TE, the SE+ASSE group showed numerically better results.

    Keywords: Back Pain, Exercise Therapy, Muscle Endurance, Scoliosis
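    A minimal sketch of the ANCOVA with Bonferroni-adjusted post-hoc comparisons named in the Methods above: the post-intervention VAS score is modelled on group with baseline VAS as a covariate, and pairwise tests are then judged against a Bonferroni-corrected alpha. Column names and values are hypothetical, not the trial's data.

    ```python
    # Hypothetical sketch: ANCOVA (post VAS ~ group + baseline VAS) followed by
    # Bonferroni-corrected pairwise comparisons of the post-intervention scores.
    from itertools import combinations

    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    df = pd.DataFrame({
        "group": ["SE+ASSE"] * 3 + ["SE"] * 3 + ["control"] * 3,
        "vas_pre": [2.9, 3.0, 2.8, 2.7, 2.6, 2.9, 2.8, 2.7, 3.0],
        "vas_post": [0.1, 0.3, 0.0, 1.5, 1.7, 1.4, 2.6, 2.8, 2.7],
    })

    ancova = smf.ols("vas_post ~ C(group) + vas_pre", data=df).fit()
    print(ancova.summary())

    pairs = list(combinations(df["group"].unique(), 2))
    alpha_bonferroni = 0.05 / len(pairs)   # Bonferroni-adjusted significance level
    for a, b in pairs:
        _, p = stats.ttest_ind(df.loc[df["group"] == a, "vas_post"],
                               df.loc[df["group"] == b, "vas_post"])
        verdict = "significant" if p < alpha_bonferroni else "not significant"
        print(f"{a} vs {b}: P = {p:.4f} ({verdict} at alpha = {alpha_bonferroni:.4f})")
    ```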
  • Mohsen Barouni, Hossein Farshidi, Somayeh Karimi, Mohammad Arab, Hamed Nazari*, Farzaneh Ghasemi Pages 638-645
    Background

    In Iran, one of the most important sources of financing for primary health care (PHC) is the government budget. This study aimed to evaluate the allocation of the PHC budget and the equality of this allocation.

    Methods

    In this applied descriptive-analytical study, the study population included all 31 provinces of Iran. Data were gathered from the registered statistics of the Ministry of Health and the Iranian Statistics Center for 2021 and 2022. The Gini coefficient and the Lorenz curve were used to measure equality in the allocation of the PHC budget (a minimal computation sketch follows this entry).

    Results

    In 2022, the PHC budget increased by 50% compared with 2021; 20 provinces received less than the national average and 11 received more. The average allocated budget in urban areas was 596,452 million rials ($2,385,808) in 2021 and 854,936 million rials ($2,374,822) in 2022. In rural areas, it was 1,144,350 million rials ($4,577,400) in 2021 and 1,752,936 million rials ($4,869,267) in 2022. The Gini coefficient for budget allocation was 0.20 in 2021 and 0.19 in 2022.

    Conclusion

    The Gini coefficient shows that the allocation of the PHC budget is relatively unequal. Evidence-based advocacy for reallocating health-sector resources according to the deprivation coefficient of population groups is one of the most fundamental ways to support the more deprived and less developed provinces.

    Keywords: Primary Health Care Budget, Equality, Gini Coefficient, Iran
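    A minimal sketch of the Gini coefficient used above to quantify equality of budget allocation across provinces; the per-capita budget figures are hypothetical, and the Lorenz-curve ordering is implicit in the sorting step.

    ```python
    # Hypothetical sketch: Gini coefficient of per-capita PHC budget allocation.
    import numpy as np

    def gini(values):
        """Gini coefficient on sorted data: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n."""
        x = np.sort(np.asarray(values, dtype=float))
        n = x.size
        ranks = np.arange(1, n + 1)
        return 2.0 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1.0) / n

    # Per-capita allocations for a handful of provinces (million rials, hypothetical)
    per_capita_budget = [1.2, 0.9, 1.5, 0.8, 1.1, 1.0, 1.4, 0.7]
    print(f"Gini = {gini(per_capita_budget):.2f}")   # 0 = perfect equality, 1 = maximal inequality
    ```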
  • Ahmad Amani, Reza Fadayevatan, Babak Eshrati, Mohammad Rafiee, Ahmad Ali Akbari Kamrani Pages 646-652
    Background

    Cancer is one of the diseases affecting the elderly and can lead to loss of life years. Skin, breast, gastric, colorectal, and lung cancers are five prevalent cancers in the elderly. The present study was conducted to evaluate the incidence and burden of these cancers in the elderly.

    Methods

    This secondary study was conducted on data extracted from the population-based cancer registry of Markazi province in 2019. Data were extracted for all cases older than 60 years who had lived in Markazi province for more than six months. The information collected comprised cancer incidence and death rates by age and sex group, overall mortality rates, survival and recovery rates of cancer patients, and cancer-attributable disability, based on the global burden of disease (GBD) standard table and various sources, including the latest death registration report, the latest cancer registration report for 2019, and the Iran Statistics Center. Excel and DISMOD2 software were used to check and analyze the data, and the burden of disease was calculated with the standard formula DALY = YLL + YLD (a minimal worked sketch follows this entry). For data validity and reliability, the registration of impossible and uninformative codes was prevented.

    Results

    The incidence rates of skin, breast, gastric, colorectal, and lung cancers in elderly women were 52.87, 59.02, 67.63, 47.95, and 20.90 per 100,000, respectively, and the corresponding DALYs were 63.15, 423.86, 686.37, 366.49, and 385.18. In elderly men, the incidence rates of skin, gastric, colorectal, and lung cancers were 100.84, 135.80, 49.74, and 68.57 per 100,000, respectively, and the corresponding DALYs were 342.31, 1117.01, 337.99, and 452.41. The highest YLL and YLD were related to gastric cancer (493.31/100,000) and breast cancer (220.84/100,000), respectively.

    Conclusion

    Based on the results of this study, the incidence, mortality, and DALYs of skin, breast, gastric, colorectal, and lung cancers were higher in the elderly. The burden of some cancers, such as breast cancer, was lower than in provinces such as Yazd.

    Keywords: Cancer, Incidence Rate, Disability Adjusted Life Years, Elder
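    A minimal worked sketch of the DALY = YLL + YLD formula applied in the Methods above, in its simplest undiscounted form; all inputs are hypothetical.

    ```python
    # Hypothetical sketch: undiscounted burden-of-disease arithmetic (DALY = YLL + YLD).
    def years_of_life_lost(deaths, remaining_life_expectancy):
        """YLL = deaths x standard remaining life expectancy at age of death."""
        return deaths * remaining_life_expectancy

    def years_lived_with_disability(incident_cases, disability_weight, duration_years):
        """YLD = cases x disability weight x average duration of disability."""
        return incident_cases * disability_weight * duration_years

    # Hypothetical figures for one cancer site in one elderly age group
    yll = years_of_life_lost(deaths=40, remaining_life_expectancy=12.0)
    yld = years_lived_with_disability(incident_cases=120, disability_weight=0.45, duration_years=3.0)
    daly = yll + yld

    print(f"YLL = {yll:.0f}, YLD = {yld:.0f}, DALY = {daly:.0f}")
    ```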
  • Leila Mounesan, Safoora Gharibzadeh*, Mahboubeh Parsaeian, Mohammad Mehdi Gouya, Sana Eybpoosh, Ali Hosseini, Leila Haghjou, Aliakbar Haghdoost, Ehsan Mostafavi Pages 653-658
    Background

    To reduce the clinical burden of COVID-19, healthcare providers and policymakers need a clear understanding of illness severity during epidemic waves. This study aimed to identify the clinical severity of patients with COVID-19 during different stages of an epidemic wave (pre-peak, peak, and post-peak) in four provinces of Iran.

    Methods

    We conducted a secondary analysis of data on 25,382 COVID-19 patients admitted to hospitals and recorded in the Medical Care Monitoring Center. The data included adult patients (≥18 years) hospitalized for COVID-19 infection confirmed by a positive SARS-CoV-2 RT-PCR test; no exclusion criteria were applied. A pairwise comparison method was used to evaluate clinical severity, and severity scores were then compared across the stages of the epidemic wave using univariable and multivariable linear regression models (a minimal analysis sketch follows this entry).

    Results

    The findings showed that disease severity was higher during and after the peak in the total population; the mean (SD) severity scores were 0.16 (0.25), 0.18 (0.26), and 0.19 (0.26) before, during, and after the peak, respectively. In addition, age and underlying disease were positively and significantly associated with disease severity.

    Conclusion

    During the middle and late phases of a COVID-19 epidemic wave, hospitals see patients with more severe illness than in the early stages. Enhancing hospital preparedness is essential to avert excess deaths and critical cases. Moreover, ongoing monitoring of clinical symptoms during the recovery phase is important to support individual patients, guide public health policy, and enhance scientific understanding of epidemic recovery processes.

    Keywords: COVID-19, Epidemics, Clinical Severity, Hospitalization
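    A minimal sketch of the univariable and multivariable linear regression comparison named in the Methods above: severity score regressed on wave stage alone, then adjusted for age and underlying disease. Column names and data are hypothetical, not the registry's.

    ```python
    # Hypothetical sketch: severity score vs. epidemic-wave stage, unadjusted and
    # adjusted for age and underlying disease, using ordinary least squares.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "severity": [0.10, 0.20, 0.15, 0.22, 0.30, 0.18, 0.25, 0.12, 0.28, 0.21],
        "stage": ["pre", "peak", "pre", "peak", "post", "pre", "post", "pre", "peak", "post"],
        "age": [45, 67, 52, 71, 80, 39, 63, 50, 74, 68],
        "underlying_disease": [0, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    })

    univariable = smf.ols("severity ~ C(stage, Treatment(reference='pre'))", data=df).fit()
    multivariable = smf.ols(
        "severity ~ C(stage, Treatment(reference='pre')) + age + underlying_disease", data=df
    ).fit()

    print(univariable.params)     # unadjusted differences from the pre-peak stage
    print(multivariable.params)   # differences adjusted for age and underlying disease
    ```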
  • Mahboobeh Farhoudi, Behnam Hajiaghaei*, Hassan Saeedi, Taher Babaee Pages 659-666
    Background

    Individuals who have undergone lower limb amputation often struggle with excessive heat and sweating in their prosthetic sockets. This is due to the closed environment of the socket, which disrupts the body's natural cooling mechanisms and can lead to increased skin temperature, sweating, and various skin problems. This study aimed to develop a new socket to alleviate heat buildup in those with below-knee amputation.

    Methods

    A positive model of the residual limb of a below-knee amputee was used to create a new socket made of copper through electroforming. A cooling system was programmed so that, if the temperature exceeded a predetermined threshold, it would activate to prevent further temperature increase (a minimal control-logic sketch follows this entry). The participant wore the conventional socket and the new socket with the cooling system, and his residual limb skin temperature was monitored using a temperature data logger.

    Results

    Implementing the new socket led to a significant 5°C to 6°C reduction in temperature within the socket, greatly enhancing thermal comfort and reducing heat sensation for the user.

    Conclusion

    By incorporating the new socket and cooling system, substantial reductions in heat accumulation within the prosthetic socket can be achieved.

    Keywords: Prosthetic Socket, Temperature, Below-Knee Amputation, Electroforming, Active Cooling System, Heat Reduction
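    A minimal sketch of the threshold-based control logic described in the Methods above: the cooling element switches on when the measured socket temperature exceeds a preset threshold and off once it falls back below it. The threshold value, hysteresis band, and sensor/actuator functions are hypothetical placeholders, not the study's firmware.

    ```python
    # Hypothetical sketch: threshold-triggered cooling with a small hysteresis band.
    import random
    import time

    THRESHOLD_C = 33.0     # activation threshold (hypothetical)
    HYSTERESIS_C = 0.5     # prevents rapid on/off switching near the threshold

    def read_socket_temperature() -> float:
        """Placeholder for the temperature data-logger reading (simulated here)."""
        return 32.0 + random.random() * 3.0

    def set_cooling(active: bool) -> None:
        """Placeholder for switching the cooling element on or off."""
        print("cooling", "ON" if active else "OFF")

    cooling_on = False
    for _ in range(60):                       # poll once per second for one minute
        temperature = read_socket_temperature()
        if not cooling_on and temperature > THRESHOLD_C:
            set_cooling(True)
            cooling_on = True
        elif cooling_on and temperature < THRESHOLD_C - HYSTERESIS_C:
            set_cooling(False)
            cooling_on = False
        time.sleep(1.0)
    ```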