MiniCex as an Assessment Tool of a Medical Course Internship Program

MiniCex como Instrumento para Avaliação de Programa no Internato de um Curso de Medicina

Abstract:

Introduction: Program assessment is the process of collecting data about a course or teaching program, taking into consideration aspects of cost-effectiveness, the adequacy of the evaluation to the course purpose, and the program's capacity to yield changes in real life. Such regular assessments provide feedback to the decision-making process, aiming at better teaching and learning practices. The Mini Clinical Evaluation Exercise (MiniCex) is a performance rating scale designed to assess the skills that medical students and residents need in real-life encounters with patients. Considering the importance of program assessment for an institution, the MiniCex data might be of great value for the follow-up of students and of the course, helping the planning process and generating improvements in the institution. Therefore, the objective of this study is to assess the program using the MiniCex at the beginning of the medical internship, aiming to determine in which areas of the basic and pre-clinical course the students have the most difficulties.

Methods: A cross-sectional, descriptive study was carried out using retrospective data obtained from the MiniCex forms applied to ninth-semester medical students, which corresponds to the first semester of the medical internship at the Federal University of Pará. A total of 111 students were assessed, out of the 154 students eligible for the internship, from August 2017 to July 2018.

Results: Of the evaluations performed, 97.2% were requested by the teachers; most of them (72%) concerned new cases, and 45% and 38.7% were of low and moderate complexity, respectively. There was a predominance of musculoskeletal system disorders (27.7%), followed by the gastrointestinal/hepatology system (14.8%). Concerning the skills in each domain, the performance was satisfactory in all of them. We observed that 12.6% of the students had difficulties in at least one area, followed by 6.3% of students with difficulties in two areas and 4.5% with an unsatisfactory performance in three or more areas.

Conclusion: the MiniCex, when applied to internship students, proved to be a source of important and useful information as part of a program assessment of the areas that precede the internship. The analysis of the obtained data was sent to the pre-internship teachers, the internship teachers, and the course management. To the first, with the objective of reviewing their programs, detecting where they can intervene and, thus, making changes aimed at a better acquisition of basic knowledge by the students and, consequently, at improving their performance. To the second, to provide an overview of where they will have to focus their programs according to the needs of the medical students who reach the internship. Finally, to the course management, as a guide to what should be monitored among the teachers of the semesters that precede the internship.

Keywords: Educational Measurement; Education, Medical; Program Evaluation

Resumo:

Introdução:   avaliação de programa é o processo de obtenção de informações sobre um curso ou programa de ensino que leva em consideração aspectos de custo-efetividade, de checagem da adequação da avaliação ao propósito do curso e da capacidade do programa de induzir transformação da realidade. Tais avaliações regulares retroalimentam as tomadas de decisão que almejam melhores práticas de ensino e aprendizagem. O Miniexecício Clínico Avaliativo (Mini Clinical Evaluation Exercise - MiniCex) é uma escala de classificação de desempenho projetada para avaliar as habilidades que os acadêmicos e residentes necessitam em encontros reais com os pacientes. Diante da importância da avaliação de programa para uma instituição, a utilização de dados do MiniCex pode ser de grande valia para o acompanhamento dos alunos e do curso, favorecendo o planejamento e as melhorias na instituição. Objetivo: utilizar o MiniCex como parte de uma avaliação de programa no início do internato do curso de Medicina, visando determinar as áreas do curso básico e pré-clínico nas quais o aluno possui deficiências.

Métodos:  Foi realizado um estudo transversal, de caráter descritivo, com a utilização de dados retrospectivos obtidos por meio das fichas do MiniCex aplicadas aos alunos do nono semestre no módulo de Clínica Médica que correspondeu ao primeiro semestre do internato da Faculdade de Medicina da Universidade Federal do Pará, sendo avaliados um total de 111 alunos dentre os 154 aptos ao internato no período de agosto de 2017 a julho de 2018.

Resultados:  Dentre as avaliações realizadas, com 97,2% solicitadas pelos professores, a maioria (72%) foi de casos novos, 45% e 38,7% de baixa e moderada complexidade, respectivamente. Houve predomínio afecções do sistema musculoesquelético (27,7%), seguido do sistema gastrointestinal/hepatologia (14,8%). Quanto às habilidades em cada domínio, obteve-se rendimento suficiente em todos. Observou-se que 12,6% dos alunos tiveram deficiência em pelo menos uma área, o que foi seguido de 6,3% de alunos insuficientes em duas áreas e 4,5% com rendimento insatisfatório em três ou mais áreas.

Conclusão:  o MiniCex aplicado aos estudantes do internato mostrou-se capaz de fornecer informações importantes e úteis como parte de uma avaliação de programa das áreas prévias ao internato. A análise dos dados obtidos foi encaminhada aos professores do pré-internato e do internato e à direção do curso. Enviou-se a análise aos primeiros para que pudessem rever seus programas e detectar em que ponto podem intervir e fazer as alterações que visem à melhor aquisição de conhecimentos básicos pelos discentes e consequentemente ao aumento do desempenho deles. Quanto aos professores do internato, o objetivo foi apresentar-lhes um panorama dos aspectos em que precisarão concentrar seus programas conforme as carências indicadas pelos acadêmicos que chegam ao internato. Por último, à direção, o material serviu de guia do que deve fiscalizar dos docentes dos semestres que antecedem o internato.

Palavras-chave: Avaliação Educacional; Educação Médica; Avaliação de Programas


INTRODUCTION

Program evaluation is the process of obtaining information about a course or teaching program that takes into account aspects of cost-effectiveness, the adequacy of the evaluation to the course purpose, and the program's capacity to generate changes in real life. This assessment provides information that is analyzed to create reports, issue value judgments and support decision-making aimed at improving the training of future professionals, including health professionals, who will serve society1. An educational program itself is rarely static, so an assessment plan should be designed to allow educators to obtain useful knowledge about the program and to support its continuing development2. Such regular assessments provide feedback for decision-making that aims at better teaching and learning practices1.

Among the program evaluation models is the four-level Kirkpatrick model3, which is based on the assumption of linear relationships between the program components and the results and is useful to help evaluators identify relevant student outcomes. There is also the Logic Model4, which specifies the intended relationships between its evaluation components and may require constant updating as the program evolves, and the CIPP model by Stufflebeam5, which is flexible enough to incorporate studies that support continuous program improvement, as well as summative studies of the results of a completed program6. Another model used to plan and carry out program evaluation is the task-oriented one, which comprises five steps guided by questions that need to be answered during the evaluation process; this model is quite simple and self-explanatory1.

The Mini Clinical Evaluation Exercise (MiniCex) is a performance rating scale developed by the American Board of Internal Medicine in the 1990s, designed to assess the skills that medical students and residents need in real-life encounters with patients7,8.

There is no description in the literature of the use of the MiniCex as a source of information for program evaluation among undergraduate medical students; however, given the quality of the data that can be obtained with this instrument, it was considered that it could be a relevant source of information about the skills acquired throughout the period prior to the internship, which might even be used to guide the activities during the internship.

Therefore, the objective was to use the MiniCex as part of a program evaluation at the beginning of the medical school internship, aiming to determine the areas of the basic and pre-clinical course in which the students have difficulties.

METHOD

A cross-sectional, descriptive study was carried out using retrospective data obtained through the MiniCex forms applied to ninth-semester students in the Clinical Medicine I module, which corresponds to the first semester of the internship at the Medical School of the Federal University of Pará (UFPA). A total of 111 students were evaluated, out of the 154 eligible for the internship, from August 2017 to July 2018.

Only the first assessment obtained through the MiniCex of each student was used for this research, considering that it would represent the skills acquired in the previous eight semesters.

Clinical Medicine I consisted of outpatient activities in clinical medicine, cardiology, rheumatology, hepatology, geriatrics and pneumology, carried out by students in the ninth semester of the medical course.

The instrument used by the institution evaluated the six traditionally observed domains (clinical interview skill, physical examination skill, humanistic quality/professionalism, clinical reasoning and judgment, communication and counseling skills, and organization and efficiency), ending with an overall clinical competence rating. Each item was classified on a 6-point Likert scale: 1 to 3, unsatisfactory; 4 to 6, satisfactory. Other data were also recorded, such as case complexity, patient diagnosis, observation setting, whether it was a new case or a return visit, and the activity module.
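As an illustration of the cut-off just described, the minimal sketch below (in Python) maps a single MiniCex form to a satisfactory/unsatisfactory judgment per domain; the domain labels and the sample scores are hypothetical and do not reproduce the institution's actual data format.

```python
# A minimal sketch of the 6-point Likert cut-off described above:
# ratings 1-3 are read as unsatisfactory and 4-6 as satisfactory.
# The domain labels and the sample form below are illustrative assumptions only.

DOMAINS = [
    "clinical interview",
    "physical examination",
    "humanistic quality/professionalism",
    "clinical reasoning and judgment",
    "communication and counseling",
    "organization and efficiency",
    "overall clinical competence",
]

def classify(score: int) -> str:
    """Map a 1-6 MiniCex rating to the binary judgment used in the study."""
    if not 1 <= score <= 6:
        raise ValueError("MiniCex ratings range from 1 to 6")
    return "satisfactory" if score >= 4 else "unsatisfactory"

# Hypothetical first MiniCex form of one student (scores invented).
form = dict(zip(DOMAINS, [5, 3, 6, 4, 5, 4, 4]))

judgments = {domain: classify(form[domain]) for domain in DOMAINS}
deficient = [d for d in DOMAINS if judgments[d] == "unsatisfactory"]
for domain in DOMAINS:
    print(f"{domain}: {form[domain]} -> {judgments[domain]}")
print(f"areas with difficulty on this form: {len(deficient)}")
```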

On the first day of activities, the students were instructed about the use of the instrument, and both the student and the observer could identify opportunities for evaluation. Feedback was regularly offered immediately after the teacher's observation.

Teachers were trained in workshops carried out by the institution, both in the use of the MiniCex and in providing feedback. The forms were kept by the teachers and were requested by the researchers at the end of each module for data collection and for the preparation of the reports sent to the course management.

The study was approved by the Research Ethics Committee of Instituto de Ciências da Saúde da UFPA, under number 2,250,664 (CAAE 66666017.9.0000.5172) on 08/31/2017. The collected data were organized and analyzed using Microsoft Excel 2007® spreadsheets. Categorical variables were expressed as absolute and percentage values, using the chi-square test to assess differences between the groups. A p value <0.05 was considered statistically significant.
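The analysis itself was carried out in Microsoft Excel spreadsheets; purely as an illustration of the kind of chi-square comparison described above, the sketch below reproduces an equivalent test in Python with scipy, using an invented contingency table rather than the study's data.

```python
# Illustrative chi-square test on a hypothetical contingency table;
# the study's analysis was performed in Excel, and these counts are invented.
from scipy.stats import chi2_contingency

# Rows: two hypothetical groups of students; columns: satisfactory / unsatisfactory.
table = [
    [48, 7],   # group A
    [44, 12],  # group B
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
print("statistically significant" if p_value < 0.05 else "not statistically significant")
```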

RESULTS

Throughout the 12 months of the project, a total of 111 students from the ninth semester of medical school, attending the Clinical Medicine I internship, were evaluated with the MiniCex scale, with the observations being made by four teachers. Of the evaluations performed, 97.2% were requested by the teachers; the majority (72%) concerned new cases, while 45% and 38.7% were of low and moderate complexity, respectively (Table 1).

Table 1
Data obtained through MiniCex on the type and complexity of cases treated by the internship students in Clinical Medicine I, from August 2017 to July 2018.

The diagnoses identified during patient care corresponded to 52 diseases, grouped into 11 systems. There was a predominance of the musculoskeletal system, with 20.7%, followed by the gastrointestinal/hepatology system, with 11.1% (Table 2). As for the skills in each domain, the performance was satisfactory in all of them (Table 3). It was observed that 12.6% of the students had difficulty in at least one area, followed by 6.3% of students with difficulties in two areas and 4.5% with unsatisfactory performance in three or more areas.
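As an illustration of how the proportions of students with difficulties in one, two, or three or more areas can be tallied from the per-domain judgments, the sketch below uses hypothetical per-student counts chosen only so that the resulting percentages match those reported above; it is not the study's dataset.

```python
# Sketch of tallying how many areas each student was rated unsatisfactory in,
# then expressing the groups (1 area, 2 areas, 3+ areas) as percentages.
# The per-student counts below are invented for illustration.
from collections import Counter

# Hypothetical: number of unsatisfactory domains on each student's first form (111 students).
deficient_area_counts = [0] * 85 + [1] * 14 + [2] * 7 + [3] * 5

def bucket(n: int) -> str:
    if n == 0:
        return "no areas"
    if n == 1:
        return "1 area"
    if n == 2:
        return "2 areas"
    return "3 or more areas"

totals = Counter(bucket(n) for n in deficient_area_counts)
n_students = len(deficient_area_counts)
for group in ("no areas", "1 area", "2 areas", "3 or more areas"):
    pct = 100 * totals.get(group, 0) / n_students
    print(f"{group}: {totals.get(group, 0)} students ({pct:.1f}%)")
```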

Table 2
System frequencies in Clinical Medicine I services obtained through the MiniCex applied to internship students from August 2017 to July 2018.
Table 3
Performance of students in the ninth semester of internship, assessed through MiniCex during Clinical Medicine I, from August 2017 to July 2018.

DISCUSSION

This study showed that the information obtained from the use of MiniCex can be used as part of a program evaluation during the medical course.

The internship students demonstrated sufficient overall competence in all skills in the Clinical Medicine module, with emphasis on the “humanistic quality/professionalism” skill, in which 100% of the students showed satisfactory performance, corroborating the study by Baños et al. (2015)9, in which the students also had a satisfactory result and scored highest in the same area. The project showed that 12.6% of the students had difficulty in at least one area, followed by 6.3% of students with difficulties in two areas and 4.5% with an unsatisfactory performance in three or more areas. There are no previous studies in the literature that allow a comparison with our findings.

Even with the lack of literature for comparison, it is noteworthy that feedback to the preclinical teachers responsible for teaching these skills, showing that all observations of the “humanistic quality/professionalism” domain were rated as satisfactory by the internship teachers, would reinforce that they are on the right path regarding this skill. On the other hand, returning to these teachers with the information that almost 30% of the students evaluated at the beginning of the internship showed insufficient capacity in the “physical examination” domain points to the need to review their programs and/or methodologies, aiming to improve this performance. It is precisely this type of information that the MiniCex can offer as part of a program evaluation, objectively pointing out the skills that can be improved and those that are at an appropriate level.

Other assessment instruments could have been used, such as the Objective Structured Clinical Examination (OSCE), considered one of the most reliable methods for assessing the clinical skills of students and residents, in which the examinees rotate through a certain number of stations with real or standardized patients in order to perform different clinical tasks10, as well as Problem-Based Learning (PBL), a method that allows the student to learn by doing and, especially, through direct involvement with real life and the clinical environment11.

However, in this study, the MiniCex was chosen because it is a practical instrument developed to be applied in approximately 20 minutes8. The MiniCex assessment involves direct observation, by an educational supervisor, of a student’s performance in real clinical situations. The evaluation is repeated on several occasions and can occur in several settings (outpatient care, on-call shifts, surgeries, etc.), with subsequent feedback, which should be given as soon as possible12. The main characteristic of this formative assessment instrument for clinical skills is that it reproduces, as closely as possible, the routine of attending physicians in their workplace. It is precisely in the way of dealing with sensitive situations that the interns will show their skills, because in professional life they will not be examining actors or dummies. The students must learn to make decisions under conditions of uncertainty and to deal with ambiguity, complexity, exceptionality and the conflicts of values that almost always escape technical reasoning13. In parallel, as a disadvantage, the instrument requires more than one encounter between the student and the patients in order to yield a more reliable and valid measure of practice and of the development of clinical skills8. This study was limited to presenting the data from the first MiniCex form of each evaluated student, as we aimed to use the instrument as part of a program evaluation. However, it is worth mentioning that other MiniCex forms were filled out and also used for student evaluation, with their respective feedback, according to the opportunities that arose during the Clinical Medicine activities.

The process of evaluating educational programs is the “systematic collection and analysis of information related to the design, implementation and results of a program, aiming at monitoring and improving the quality and effectiveness of the program”14. The initial evaluation phase comprises the moment when the institutions or individuals responsible for a program decide to evaluate it. They must decide on the objective(s) of the evaluation and on who will be responsible for carrying it out2. Medical educators can choose among individual program evaluation models and instruments, or a combination of them, to develop an appropriate evaluation model for their programs6.

Special attention should be paid to the research objectives, the method’s validity and the selected instruments. The assessment can have a formative role, identifying areas where teaching can be improved, or a summative role, assessing the effectiveness of teaching2. As important as evaluating a program is using the instrument to give feedback to the student assessed through the MiniCex, as feedback is one of the educational and evaluation strategies with the greatest evidence of effectiveness in the education of health professionals15, being an important part of the process of improving clinical skills16 as well as of professional development17. As a result, the evaluation becomes a regulatory activity in the teaching-learning process, detecting gaps and providing solutions to the obstacles faced by students, in addition to fostering improvements in didactic tools and possible adjustments in the syllabus or even in the curricular structure. Feedback regulates the teaching-learning process, continuously providing information so that the students can perceive how far from, or how close to, the desired goals they are. The fact that the feedback is continuous allows the adjustments necessary for the best quality of learning to be made early, and not only when the student fails the test at the end of the course, that is, in the summative evaluation18. Providing good-quality and timely feedback plays an essential role in learning and professional development in medicine19.

In this context, the act of evaluating the professional in training acquires a new meaning: it goes beyond establishing whether the student has obtained a passing grade in a given discipline and instead checks whether the educational objectives have been achieved as a whole, that is, whether the student has acquired, in addition to the necessary technical knowledge, the competencies, skills and attitudes required for the new professional profile. Therefore, evaluation divests itself of a solely summative characteristic and assumes a formative role, as an integral part of the new professional’s training20.

Considering the concerns that undergraduate students are not adequately prepared for practice and do not meet the national health needs, evidence is needed to indicate what is happening, where improvements are needed and how to make these improvements. Therefore, it is necessary to identify a system for the continuous screening of the program’s deficiencies and its possible improvements21.

FINAL CONSIDERATIONS

The MiniCex applied to medical internship students proved capable of providing important information, useful as part of a program evaluation of the areas prior to the internship. The analysis of the obtained data was sent to the pre-internship teachers, the internship teachers and the course management. To the first, with the objective of reviewing their programs, detecting where they can intervene and, thus, making changes aimed at a better acquisition of basic knowledge by the students and, consequently, at improving their performance. To the second, aiming to give them an overview of where they will need to focus their programs according to the needs identified in the students who arrive at the internship phase. Finally, to the course management, as a guide to what should be monitored among the teachers of the semesters that precede the internship.

REFERENCES

  • 1 Bollela VR, Castro M. Avaliação de programas educacionais nas profissões da saúde: conceitos básicos. Medicina (Ribeirão Preto) 2014;47(3):333-42.
  • 2 Goldie J. AMEE Education Guide nº 29: Evaluating educational programmes. Med Teach 2006;28(3):210-24.
  • 3 Kirkpatrick D. Revisiting Kirkpatrick’s four-level model. Train Dev 1996;1:54-9.
  • 4 Frechtling JA. Logic modeling methods in program evaluation. San Francisco: Jossey-Bass; 2007.
  • 5 Stufflebeam DL, Shinkfield AJ. Evaluation theory, models, and applications. San Francisco: John Wiley & Sons; 2007.
  • 6 Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE Guide nº 67. Med Teach 2012;34(5):e288-99.
  • 7 Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-Cex (clinical evaluation exercise): a preliminary investigation. Ann Intern Med 1995;123:795-9.
  • 8 Al Ansari A, Ali SK, Donnon T. The construct and criterion validity of the mini-CEX: a meta-analysis of the published research. Acad Med 2013;88(3):413-20.
  • 9 Baños JE, Gomar-Sancho C, Guardiola E, Palés-Argullós J. La utilización del Mini Clinical Evaluation Exercise (mini-CEX) en estudiantes de medicina. Rev. Fund. Educ. Méd. 2015;18(6):417-26.
  • 10 Amaral FTV, Troncon LEA. Participação de estudantes de Medicina como avaliadores em exame estruturado de habilidades clínicas (Osce). Rev. bras. educ. méd. 2007;31(1):81-9.
  • 11 Rego S. Currículo paralelo em Medicina, experiência clínica e PBL: uma luz no fim do túnel? Interface (Botucatu) 1998;2:35-48.
  • 12 Holmboe ES, Huot S, Chung J, Norcini J, Hawkins RE. Construct validity of the miniclinical evaluation exercise (miniCEX). Acad Med 2003;78(8):826-30.
  • 13 Megale L, Gontijo ED, Motta JAC. Evaluation of medical students’ clinical skills using the Mini-Clinical Evaluation Exercise (mini-CEX). Rev bras. educ. méd. 2009;33(2):166-75.
  • 14 Accreditation Council for Graduate Medical Education. Glossary of terms. Chicago: ACGME; 2018 [accessed 2019 Mar 22]. Available from: https://www.acgme.org/Portals/0/PDFs/ab_ACGMEglossary.pdf
  • 15 Pelgrim EAM, Kramer AWM, Mokkink HGA, Van der Vleuten CPM. The process of feedback in workplace‐based assessment: organisation, delivery, continuity. Med Educ 2012;46(6):604-12.
  • 16 Branch JR, William T, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med 2002;77(12):1185-8.
  • 17 Carr S. The Foundation Programme assessment tools: an opportunity to enhance feedback to trainees? Postgrad Med J 2006;82(971):576-9.
  • 18 Borges MC, Miranda CH, Santana RC, Bollela VR. Avaliação formativa e feedback como ferramenta de aprendizado na formação de profissionais da saúde. Medicina (Ribeirão Preto) 2014;47(3):324-31.
  • 19 Burgess A, Mellis C. Feedback and assessment for clinical placements: achieving the right balance. Adv. Med. Educ. Pract. 2015;6:373-81.
  • 20 Sampaio AMB, Pricinote SCMN, Pereira ERS. Avaliação clínica estruturada. Rev. Gest. Saúde 2014;5(2):410-7.
  • 21 Martin P, Zindel M, Nass S. Graduate medical education outcomes and metrics: proceedings of a workshop. Washington: The National Academies Press; 2018.
  • FUNDING SOURCES
    Pro-Rectory of Research and Postgraduate Studies of the Federal University of Pará through the Voluntary Institutional Program for Scientific Initiation - PIVIC, PROPESP Public Edict 09/2018.

Publication Dates

  • Publication in this collection
    27 Feb 2020
  • Date of issue
    2020

History

  • Received
    06 Nov 2019
  • Accepted
    14 Nov 2019