ABSTRACT
BACKGROUND: Longitudinal evaluation of students appears to be a better way to assess their knowledge than traditional methods of evaluation, such as modular and final tests. Currently, progress testing is the most established form of longitudinal testing. However, despite being well established as an assessment tool in medical education, this type of test is rarely used in residency programs.
OBJECTIVES: This study aimed to investigate residents’ knowledge growth regarding residency training and to describe the implementation of a longitudinal evaluation test in ophthalmological residency training across several medical schools in Brazil. Finally, the study aimed to check whether performance in the tests can be used as a predictor of the results of the specialist title test.
DESIGN AND SETTING: This was a prospective observational study conducted using an online platform.
METHODS: Online tests were developed following the same pattern as the Brazilian Ophthalmology Council specialist tests. All the residents performed the test simultaneously. The tests were conducted once a year at the end of the school year.
RESULTS: A progress test was conducted across 13 services with 259 residents. Our results demonstrated that resident scores improved over the years (P < 0.0001) and had a moderate correlation with the Brazilian Ophthalmology Council specialist test (P = 0.0156).
CONCLUSION: The progress test can be considered a valuable tool to assess residents' knowledge, which increased over the course of residency training. In addition, it can be used as a predictor of the result of the specialist title test.
KEYWORDS (MeSH terms): Ophthalmology; Education, medical; Internship and residency
AUTHORS’ KEYWORDS: Knowledge assessment; Longitudinal evaluation; Online test; Residency; Medical residency
INTRODUCTION
Knowledge assessment plays an important role in medical education since professional expertise development appears to be strongly connected to knowledge.1 Research has shown that assessment may be used in different ways. For example, studies have demonstrated that assessment drives and stimulates learning,2,3 provides educational efficacy information to institutions and teachers, and protects patients.1
The definitions of “to test” in the dictionary are as follows: to discover the worth of something by trial, to obtain more information about the object of assessment, and to improve the quality of something by trial.4 Thus, assessment in the broader sense involves testing, measuring, collecting, combining information, and providing feedback.4
In many medical residency programs, modular, intermediate, or final tests have been used to measure the knowledge level of trainees.5,6 However, these types of tests are associated with the promotion of short-term memorization.5 In addition, residents' performance may not correspond to their real knowledge level, since a single-point measurement does not allow any extrapolation to the knowledge level maintained over time.7 To benefit students' long-term retention, longitudinal testing in the form of the progress test, the best-known and most established kind of longitudinal test, has been suggested.2,8
Progress testing aims to measure students' knowledge at the end level and allows the measurement of knowledge growth.8,9 In addition, progress testing forces students to study over time, encouraging deeper learning,10 since it is impossible for students to cram before the test. Instead, students must acquire information continuously in such a way that it is available when required.11 Progress tests allow for individual learning pathways, which may provide clues about future performance. Finally, progress testing can be organized at a national level7 and can be used to compare the results of candidates from different countries.5
Progress tests have been used in different ways, such as providing feedback to students,12,13 understanding knowledge growth on questions requiring lower and higher orders of cognitive processing,12,13 comparing national14 and international curricula,15 and evaluating the effectiveness of educational strategies.16,17 Many medical schools worldwide have already adopted progress testing as part of their curricula, for example in the Netherlands,18 Canada,19 Germany,6 Indonesia, South Africa, the United States,20 and Brazil.21,22
Despite being a well-established assessment tool in the undergraduate context, progress testing is much less widespread in the postgraduate context, where the best test format remains controversial.23 Some authors believe that, at least in theory, longitudinal tests would also be an interesting approach to knowledge assessment in postgraduate medical education.7 So far, only a few residency programs have included the progress test in their curricula, such as obstetrics and gynecology,24 radiology, and general practice,10,25 with promising results.
The world reference institution for ophthalmology residency programs is the International Council of Ophthalmology (ICO). According to the ICO, medical knowledge is one of the general core competencies expected from ophthalmic specialists, alongside patient care, practice-based learning and improvement, communication skills, professionalism, and systems-based practice.26,27
Progress testing during residency could play an important role in monitoring competence progress. It could also be useful for the quality control of residency programs in Brazil by allowing interventions during the course. In addition, the tests can serve as self-learning tools for residents. Finally, they can be useful to predict residents' results in the specialist test of the Brazilian Ophthalmology Council.28
OBJECTIVE
This study aimed to investigate residents' knowledge growth during their residency training. This study also describes the implementation of a progress test in ophthalmological residency training across several medical schools in Brazil. Finally, this study aimed to investigate whether there was a correlation between the performance of the progress test and the specialist title test.
METHODS
This was a prospective observational study carried out through an online platform.
This study was approved by the ethics committee of Universidade Estadual de Campinas on December 17, 2018 (CAAE number: 02613718.9.0000.5404).
Participants: The study was conducted in 2019. All participants were ophthalmology residents who agreed to participate voluntarily in the study and signed a consent form.
Ophthalmology Residency in Brazil
In Brazil, the ophthalmology residency is a 3-year program.
The institution that represents Brazilian ophthalmology is the Brazilian Ophthalmology Council (Conselho Brasileiro de Oftalmologia, CBO).28 According to the CBO, the minimum pedagogic program required for the ophthalmology specialization consists of the following content:
- Basic sciences: 100% in the 1st year and 0% in the 2nd and 3rd years
- Propaedeutics: 60% in the 1st year, 30% in the 2nd, and 10% in the 3rd year
- Optometry: 50% in the 1st year, 50% in the 2nd, and 0% in the 3rd year
- Surgical techniques: 50% in the 1st year, 50% in the 2nd, and 0% in the 3rd year
- Clinics and surgery: 25% in the 1st year, 50% in the 2nd, and 25% in the 3rd year
Besides this mandatory content, there may be complementary activities, such as clinical case discussions, pathological anatomy sessions, and scientific article discussions.28
Progress test construction and application
The progress test consisted of 125 multiple-choice questions on clinical and surgical issues in ophthalmology. The blueprint followed the same pattern as the Brazilian Ophthalmology Council specialist test:28
- uveitis: 9 questions;
- neuro-ophthalmology: 7 questions;
- orbit: 4 questions;
- lacrimal system: 4 questions;
- ocular plastics: 8 questions;
- ocular tumors: 5 questions;
- cornea: 14 questions;
- contact lenses: 4 questions;
- refractive surgery: 2 questions;
- retina: 13 questions;
- cataract: 10 questions;
- glaucoma: 11 questions;
- refraction: 23 questions;
- strabismus: 7 questions;
- low vision: 4 questions.
Graphic 1 shows the distribution of the test questions.
As each test consisted of 125 multiple-choice questions, each correct answer was worth 0.08 points (10/125); thus, the total score for each test could vary from 0 to 10.
The questions in the tests were taken from the following books: Review Questions in Ophthalmology,29 Clinical Optics and Refraction,30 and Self-tests in Optics and Refraction.31 They were selected by the authors to cover different topics and levels of difficulty.
As the residents came from many parts of the country, the tests were administered online, and all residents from the 1st to the 3rd year of the ophthalmology residency programs took the same test simultaneously, regardless of their year of residency. The tests were conducted once a year, at the end of the school year.
Each service organized the implementation of the tests, and the only requirement was that all residents sat the test simultaneously. Some services used their own computer lab rooms, while those that did not have one allowed their residents to use their own computers, either at the service or at home, at a predetermined time, as long as there was one computer for each resident.
Site
First, participants had to create an account. Once completed, they were able to access the site. Figures 1–4 show a small portion of the site.
The presentation page contains important instructions to read and the consent form that had to be signed before the test itself (Figure 1).
The test page contained the test itself. Once completed, participants had to submit their answers. Immediately after the submission, the participant received feedback (Figure 2).
The feedback page shows the number of correct and incorrect answers, the time spent performing the test, and the score (Figure 3).
Figure 4 shows the correct answers and explanations.
Data analysis
Frequency tables were used for the descriptive analysis of categorical variables, and position and dispersion measures were used for numeric variables. The Kruskal–Wallis test was used to compare differences between years, followed by Dunn's test to identify significant differences.
The Friedman or Wilcoxon test was used to compare the residents' knowledge growth.
To investigate the relationship between the progress test and CBO scores, the Spearman linear correlation coefficient and Wilcoxon test for related samples were conducted.
A significance level of 0.05 was adopted.
Data were analyzed using the SAS (Statistical Analysis System) System for Windows, version 9.4 (SAS Institute Inc., 2002-2012, Cary, North Carolina, United States).
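As an illustration only, the two main comparisons can be sketched in Python with scipy; note that the study's actual analysis was run in SAS 9.4, and the score vectors below are invented for the example:

```python
# Illustrative sketch of the two main analyses; all score data here are
# hypothetical and do NOT come from the study.
from scipy import stats

r1 = [3.8, 4.1, 4.3, 4.5, 4.7]   # made-up 1st-year scores
r2 = [4.8, 5.0, 5.1, 5.3, 5.5]   # made-up 2nd-year scores
r3 = [5.1, 5.3, 5.4, 5.6, 5.8]   # made-up 3rd-year scores

# Kruskal-Wallis: do the score distributions differ across residency years?
h, p_kw = stats.kruskal(r1, r2, r3)
print(f"Kruskal-Wallis H = {h:.2f}, P = {p_kw:.4f}")

# Spearman rank correlation between progress-test and specialist-test scores
progress = [4.9, 5.2, 5.5, 5.8, 6.1, 6.4, 6.8, 7.2]  # made-up scores
cbo      = [5.1, 5.0, 5.9, 5.7, 6.3, 6.2, 7.0, 7.5]  # made-up scores
rho, p_sp = stats.spearmanr(progress, cbo)
print(f"Spearman rho = {rho:.2f}, P = {p_sp:.4f}")
```

Both functions return the test statistic and a P value, which can then be compared against the 0.05 significance level mentioned above.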
RESULTS
Among the ophthalmology residency services across Brazil invited to join the study, 24 accepted the invitation. A total of 297 residents participated in the progress test. Of these, 100 (33.7%) were from the 1st year, 108 (36.4%) from the 2nd year, and 89 (30.0%) from the 3rd year.
Descriptive analysis and comparison of the scores for each residency year
The mean score of the 1st year residents was 4.3, that of the 2nd year residents was 5.1, and that of the 3rd year residents was 5.4. Table 1 and Graphic 2 show the descriptive analysis and comparison of scores for each residency year.
The Kruskal–Wallis test was used to compare the mean scores across the three years of residency. The P value was < 0.0001, indicating a statistically significant difference among the mean scores.
The Wilcoxon test was used for multiple comparisons of the mean scores for each pair of residency years (1st versus 2nd, 1st versus 3rd, and 2nd versus 3rd). There was a significant difference between the 1st and 2nd years and between the 1st and 3rd years of residency (P < 0.0001 in both cases). However, the difference between the 2nd and 3rd years was not significant (P = 0.0619). This may reflect the pedagogic program itself, in which almost all the theoretical content is taught in the first two years of residency, with only a small percentage remaining in the 3rd year.
Relationship between the progress test and Brazilian Ophthalmology Council (CBO) scores (Table 2 and Graphic 3)
For this analysis, only eight residents from the 3rd year were available. Spearman correlation analysis (Graphic 3) showed a positive and significant correlation between the progress test and CBO scores (r = 0.61), meaning that the higher the score on the progress test, the higher the score on the CBO test.
DISCUSSION
In this study, we demonstrated that progress tests could be used for ophthalmology residency training. They helped to detect the residents' knowledge growth over time and had a moderate relationship with the CBO test. Our findings are aligned with previous studies in both undergraduate8,9,32 and residency training.10,30–33,34
For example, Tomic et al. evaluated 4 years of progress testing at a medical school in Brazil and found positive results, with a continuum of cognitive gain during medical training.32
Similarly, previous studies with longitudinal tests in residency programs10,29,30 found that the progress test was able to detect differences33,35 among residency years. Taken together, these studies show that knowledge scores increased over the years.10,35
Concerning the relationship between the progress and CBO tests, our results were partially in concordance with those of previous studies. For example, in an undergraduate context, a study by Hamamoto Filho et al. found a correlation between students' progress testing scores and their performance in a residency selection process in Brazil.36
In the residency context, a descriptive study by Al-Mohammed et al.,37 which compared residents' performance on the American College of Physicians (ACP) Internal Medicine In-Training Examination (IM-ITE) with their results on the certification examination of the American Board of Internal Medicine and the American Board of Surgery Qualifying Examinations in Qatar, found that performance on the ITE could accurately predict performance on both qualifying exams, which is in concordance with our results.
Therefore, our study is in concordance with previous studies performed with residents. What makes our study unique is that, besides being performed in a country where there are almost no similar studies, it is, as far as we know, the only one performed with ophthalmology residents.
For the future
Two additional tests will be developed, and each will be taken at the end of the school year by all residents from the 1st to the 3rd year of the ophthalmology residency programs.
All the tests will have the same number of questions (125) and will follow the same division of topics as the national test; however, the questions will be completely different from one test to another. In other words, all questions will be changed from one year to the next. Thus, at the end of the 3 years of residency, each resident will have performed three different tests.
After each test, the questions will be reviewed, and each resident will receive individual performance feedback through an online program accessed with a personal login and password.
Limitations of the study
In some services, the residents were allowed to take the test at home because the service did not have a computer lab or an appropriate classroom for the tests. This may have introduced bias, because we cannot guarantee that they did not cheat on the test. In addition, as participation in the study was voluntary and the progress test score was not part of the official residency program, some residents may not have taken it seriously. Finally, our sample size for the comparison of the progress and CBO tests was small. However, even with such a small sample size, we found a moderate and significant correlation.
CONCLUSION
Based on the data obtained, the residents' scores improved over the years, meaning that their knowledge increased. In other words, there was progress over the course of residency.
Residents approved of the longitudinal test as a self-learning tool and as a tool for improving residency programs. Therefore, we can say that the implementation of a longitudinal evaluation system in ophthalmological residency schools in Brazil was successful, and it could be implemented in other medical subspecialties.
Acknowledgements:
We would like to thank the statistical sector of the Faculty of Medical Sciences and the team of students and professors of the statistics department of the Institute of Mathematics, Statistics, and Scientific Computing of the State University of Campinas for the data analysis.
REFERENCES
1. Van Der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996;1(1):41-67. PMID: 24178994; https://doi.org/10.1007/BF00596229
2. Cecilio-Fernandes D, Cohen-Schotanus J, Tio RA. Assessment programs to enhance learning. Physical Therapy Reviews. 2018;23(1):17-20. https://doi.org/10.1080/10833196.2017.1341143
3. Wood T. Assessment not only drives learning, it may also help learning. Med Educ. 2009;43(1):5-6. PMID: 19140992; https://doi.org/10.1111/j.1365-2923.2008.03237.x
4. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-14. PMID: 21345060; https://doi.org/10.3109/0142159X.2011.551559
5. Verhoeven BH, Snellen-Balendong HA, Hay IT, et al. The versatility of progress testing assessed in an international context: a start for benchmarking global standardization? Med Teach. 2005;27(6):514-20. PMID: 16199358; https://doi.org/10.1080/01421590500136238
6. Nouns ZM, Georg W. Progress testing in German speaking countries. Med Teach. 2010;32(6):467-70. PMID: 20515374; https://doi.org/10.3109/0142159X.2010.485656
7. Dijksterhuis MG, Scheele F, Schuwirth LW, et al. Progress testing in postgraduate medical education. Med Teach. 2009;31(10):e464-8. PMID: 19877854; https://doi.org/10.3109/01421590902849545
8. Wrigley W, van der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012;34(9):683-97. PMID: 22905655; https://doi.org/10.3109/0142159X.2012.704437
9. Cecilio-Fernandes D, Bicudo AM, Hamamoto Filho PT. Progress testing as a pattern of excellence for the assessment of medical students' knowledge: concepts, history, and perspective. Medicina (Ribeirão Preto). 2021;54(1):e173770. Available from: http://hdl.handle.net/11449/229239. Accessed in 2022 (Jul 5).
10. Rutgers DR, van Raamt F, van Lankeren W, et al. Fourteen years of progress testing in radiology residency training: experiences from The Netherlands. Eur Radiol. 2018;28(5):2208-15. PMID: 29196854; https://doi.org/10.1007/s00330-017-5138-8
11. McHarg J, Bradley P, Chamberlain S, et al. Assessment of progress tests. Med Educ. 2005;39(2):221-7. PMID: 15679690; https://doi.org/10.1111/j.1365-2929.2004.02060.x
12. Cecilio-Fernandes D, Kerdijk W, Jaarsma AD, Tio RA. Development of cognitive processing and judgments of knowledge in medical students: analysis of progress test results. Med Teach. 2016;38(11):1125-9. PMID: 27117670; https://doi.org/10.3109/0142159X.2016.1170781
13. Hamamoto Filho PT, Silva E, Ribeiro ZMT, et al. Relationships between Bloom's taxonomy, judges' estimation of item difficulty and psychometric properties of items from a progress test: a prospective observational study. Sao Paulo Med J. 2020;138(1):33-9. PMID: 32321103; https://doi.org/10.1590/1516-3180.2019.0459.R1.19112019
14. Verhoeven BH, Verwijnen GM, Scherpbier AJJ, et al. An analysis of progress test results of PBL and nonPBL students. Med Teach. 1998;20(4):310-6. https://doi.org/10.1080/01421599880724
15. Albano MG, Cavallo F, Hoogenboom R, et al. An international comparison of knowledge levels of medical students: the Maastricht Progress Test. Med Educ. 1996;30(4):239-45. PMID: 8949534; https://doi.org/10.1111/j.1365-2923.1996.tb00824.x
16. Cecilio-Fernandes D, Aalders WS, Bremers AJA, Tio RA, de Vries J. The impact of curriculum design in the acquisition of knowledge of oncology: comparison among four medical schools. J Cancer Educ. 2018;33(5):1110-4. PMID: 28374229; https://doi.org/10.1007/s13187-017-1219-2
17. Cecilio-Fernandes D, Aalders WS, de Vries J, Tio RA. The impact of massed and spaced-out curriculum in oncology knowledge acquisition. J Cancer Educ. 2018;33(4):922-5. PMID: 28194581; https://doi.org/10.1007/s13187-017-1190-y
18. Verhoeven BH, Verwijnen GM, Scherpbier AJ, van der Vleuten CP. Growth of medical knowledge. Med Educ. 2002;36(8):711-7. PMID: 12191053; https://doi.org/10.1046/j.1365-2923.2002.01268.x
19. Blake JM, Norman GR, Keane DR, et al. Introducing progress testing in McMaster University's problem-based medical curriculum: psychometric properties and effect on learning. Acad Med. 1996;71(9):1002-7. PMID: 9125989; https://doi.org/10.1097/00001888-199609000-00016
20. Ferreira RC. Relação entre o desempenho no teste de progresso e na seleção para residência médica [thesis]. Campinas: Universidade Estadual de Campinas; 2019.
21. Bicudo AM, Hamamoto Filho PT, Abbade JF, Hafner MLMB, Maffei CML. Teste de Progresso em Consórcios para Todas as Escolas Médicas do Brasil. Rev Bras Educ Med. 2019;43(4):151-6. https://doi.org/10.1590/1981-52712015v43n4RB20190018
22. Conselho Federal de Medicina. VII Fórum Nacional de Ensino Médico [Internet]. Available from: http://www.eventos.cfm.org.br/index.php?option=com_content&view=article&id=21090. Accessed in 2022 (Jul 5).
23. Ravesloot C, van der Schaaf M, Haaring C, et al. Construct validation of progress testing to measure knowledge and visual skills in radiology. Med Teach. 2012;34(12):1047-55. PMID: 22931139; https://doi.org/10.3109/0142159X.2012.716177
24. Dijksterhuis MG, Scheele F, Schuwirth LW, et al. Progress testing in postgraduate medical education. Med Teach. 2009;31(10):e464-8. PMID: 19877854; https://doi.org/10.3109/01421590902849545
25. Dijksterhuis MG, Schuwirth LW, Braat DD, Scheele F. An exploratory study into the impact and acceptability of formatively used progress testing in postgraduate obstetrics and gynaecology. Perspect Med Educ. 2013;2(3):126-41. PMID: 27023455; https://doi.org/10.1007/s40037-013-0063-2
26. Tso MO, Goldberg MF, Lee AG, et al. An international strategic plan to preserve and restore vision: four curricula of ophthalmic education. Am J Ophthalmol. 2007;143(5):859-65. PMID: 17452171; https://doi.org/10.1016/j.ajo.2007.01.055
27. International Council of Ophthalmology. ICO Residency Curriculum, 2nd edition, and updated Community Eye Health section. San Francisco, CA: International Council of Ophthalmology; 2016. Available from: https://dansk-oftalmologisk-selskab.dk/wp-content/uploads/2021/05/161216-updated-ico-residency-curriculum.pdf. Accessed in 2022 (Jul 8).
28. Conselho Brasileiro de Oftalmologia [Internet]. Available from: https://www.cbo.net.br/. Accessed in 2022 (Jun 5).
29. Chern KC, Saidel M, editors. Review questions in ophthalmology. 3rd ed. Philadelphia: Wolters Kluwer; 2015.
30. Heidary F, Rahimi A, Gharebaghi R. Clinical optics and refraction: 313 key questions answered. Shiraz, Iran: CreateSpace Independent Publishing Platform; 2013.
31. Chua CN. Self-tests in optics and refraction. Kelana Jaya: Alco Malaysia; 2007.
32. Tomic ER, Martins MA, Lotufo PA, Benseñor IM. Progress testing: evaluation of four years of application in the school of medicine, University of São Paulo. Clinics (Sao Paulo). 2005;60(5):389-96. PMID: 16254675; https://doi.org/10.1590/s1807-59322005000500007
33. Dournes G, Bricault I, Chateil JF. Analysis of the French national evaluation of radiology residents. Diagn Interv Imaging. 2019;100(3):185-93. PMID: 30527527; https://doi.org/10.1016/j.diii.2018.11.006
34. Ray JJ, Sznol JA, Teisch LF, et al. Association between American Board of Surgery In-Training Examination scores and resident performance. JAMA Surg. 2016;151(1):26-31. PMID: 26536059; https://doi.org/10.1001/jamasurg.2015.3088
35. Mauro GP, Najas GF, Carvalho HA, Villar RC. Prospective validation of a core curriculum progress assimilation instrument for radiation oncology residentship. Rep Pract Oncol Radiother. 2020;25(6):951-5. PMID: 33100910; https://doi.org/10.1016/j.rpor.2020.09.003
36. Hamamoto Filho PT, de Arruda Lourenção PLT, do Valle AP, Abbade JF, Bicudo AM. The correlation between students' progress testing scores and their performance in a residency selection process. Med Sci Educ. 2019;29(4):1071-5. PMID: 34457585; https://doi.org/10.1007/s40670-019-00811-4
37. Al-Mohammed A, Al Mohanadi D, Rahil A, et al. Evaluation of progress of an ACGME-International accredited residency program in Qatar. Qatar Med J. 2020;2020(1):6. PMID: 32300550; https://doi.org/10.5339/qmj.2020.6
Publication Dates
- Publication in this collection: 03 Oct 2022
- Date of issue: 2023

History
- Received: 03 Mar 2022
- Reviewed: 05 June 2022
- Accepted: 01 July 2022