ABSTRACT
Purpose: this study had two aims: (1) To analyse students' perceptions of achieved learning in the simulation workshops implemented in multiple areas of a Speech and Language Therapy curriculum, and (2) To establish the effect of incorporating simulation workshops on the students' comprehensive evaluation of the course.
Methods: a survey on perceived learning, including quantitative and qualitative sections, was validated and administered to the students who participated in the simulation workshops (n=241). Additionally, quantitative data from a systematically applied survey on the comprehensive perception of the courses that implemented the workshops were analysed (n=277).
Results: the quantitative section showed a positive perception of learning through the simulation workshops, which coincided with the positive opinions expressed in the qualitative section. Additionally, the courses that implemented simulation workshops showed a positive evaluation in methodology, feedback, and organization.
Conclusion: the students had a positive perception of the incorporation of clinical simulation workshops. The survey's sections provided complementary information regarding learning through clinical simulation.
Keywords: Simulation Training; Education; Speech, Language and Hearing Sciences
Introduction
Clinical simulation is a teaching-learning method that replaces or amplifies real experiences with guided experiences that evoke or replicate substantial aspects of the real clinical context in an interactive way1. Clinical simulation occurs in a safe environment in which the student can make mistakes and repeat a procedure without negative consequences for the patient1,2. This methodology has been widely implemented in areas of healthcare, facilitating the development of students’ clinical and transversal skills3-6.
There is currently limited but increasing evidence regarding simulation in Speech and Language Therapy (SLT) programs7. The research focuses mainly on experiences with simulated patients, which allow the development of technical, non-technical, and clinical decision-making skills3. The potential benefits of including simulation are the following: (a) Strengthening the ability to address communicative and speech disorders4,8, assess auditory function9, evaluate swallowing10, and manage dysphagia11,12 and tracheostomies13; (b) Creating a safe learning environment in which errors can be corrected without adverse consequences for patients, allowing teachers to focus more on the students than on the patients4; (c) Providing specific and manageable experiences3 that prepare the student for future clinical experiences14,15; and (d) Supporting the safety and confidence needed to face real and difficult clinical situations14,15. Other recent simulation studies in SLT have included simulated parents in the context of early hearing detection16, evaluation of preterm infant feeding17, and interprofessional education scenarios between nutrition and dietetics students and speech and language therapy students18. Howells et al.19 have also described the effect of using simulation-based learning cases on SLT students' confidence and perceptions.
Hill et al.4 evaluated the use of clinical simulation in SLT curricula. The simulated patients were able to reproduce scenarios that allowed the students to interact with them in a standardized way, maximizing the opportunity to demonstrate comparable clinical skills4. In SLT, a patient interview must be performed, diagnostic information must be given, and an intervention must be developed (complex tasks for students entering real clinical environments). Additionally, explaining diagnostic hypotheses and making referrals in cases of severe communicative disorders can have emotional implications for the patient's family. Evident symptoms (word-finding deficits, hand tremors, and hemiparalysis) are often portrayed realistically by standardized patients, but covert symptoms, such as comprehension difficulties, are portrayed far less realistically20. This restriction could limit simulated patients' ability to portray some language disorders relevant to the practice of SLT. Moreover, simulated patients tend almost exclusively to be adults, since no studies have reported the use of children as simulated patients21.
Simulated patients are effective in increasing confidence in the management of difficult interactions12. Despite the reported positive effects, this methodology must be planned appropriately when incorporated into curricula, because the mere existence of simulation workshops does not automatically increase learning12. There is evidence to suggest that simulation can partially replace traditional placement time for SLT students without compromising competency. Hill22 reported that when a mean of 20% of placement time was replaced with simulation, SLT students achieved an equivalent competency level. Educators of SLT programs also agree that simulated experiences could account for up to 25% of the required direct clinical hours in speech-language pathology and audiology23. This supports the value of simulation as an alternative model of practice hours in SLT curricula.
The current demands on higher education institutions regarding the guarantee of graduate quality require the search for strategies to develop clinical competence early. Simulation allows the integration of theoretical aspects and practical skills in a controlled environment24. However, students' perceptions of simulation and their competency development when engaged in simulation-based learning activities are less frequently reported25. Likewise, the effect of implementing simulation-based workshops in multiple areas across a cycle of an SLT curriculum (e.g., the combined effect on students' learning of participating in multiple workshops throughout the degree program) is unknown. Additionally, to our knowledge, there are no reports regarding the effect that such an implementation could have on other aspects of the execution of the course, such as the organization of the course itself, the materials available for learning, and the delivery of feedback, among other aspects that students perceive as relevant when evaluating their learning experience.
Thus, the objectives of this study were: (1) To analyse students' perceptions of achieved learning in the simulation workshops implemented in multiple areas of an SLT curriculum, and (2) To establish the effect of incorporating simulation workshops on the students' evaluation (e.g., methodology, organization, infrastructure and materials, teacher quality, and feedback) of the course in which the workshops were implemented.
Methods
The Ethics Committee of the Faculty of Medicine at Pontificia Universidad Católica de Chile approved the protocol of the study (ID number 170926007). Each student that participated signed an informed consent form.
This study used a mixed-methods design, in which the quantitative research was performed together with the qualitative research and the mixing occurred in the interpretation of the results. For the first objective, third- and fourth-year SLT students were recruited after attending simulation workshops. Since the activity was part of the course program, all students participated in the simulation workshops.
The students answered a survey that compiled quantitative and qualitative data. Concerning the second objective, we collected quantitative data from a survey delivered across all courses in a program, and across subsequent years (systematic application) regarding the comprehensive evaluation of the course. This evaluation was answered by students belonging to courses that already incorporated the simulations, and by others from previous years when simulation workshops were not incorporated yet.
Simulation Workshops
The university where the simulation workshops were implemented is one of the oldest universities in Chile, with more than 20,000 undergraduate students. However, the SLT undergraduate degree is a recent creation at the university in question, with three graduated cohorts. The SLT curriculum consists of five years of training in six areas: the assessment and treatment of speech, voice, language, auditory, swallowing, and orofacial myofunctional disorders. In the first two years, the courses correspond to basic sciences; the preclinical and clinical courses begin in the third year. In the fifth year, the student completes an internship in the areas mentioned above. Five simulation workshops were implemented in three clinical courses. Two of these courses were in the 8th semester (4th year), and the other was in the 6th semester (3rd year), with 277 students participating overall. In parallel to the simulation, the students were involved in clinical training in hospitals and other clinical centers throughout the course (Table 1). The simulation workshops' aims were (1) to develop the communicative and clinical skills necessary to perform patient interviews and communicate diagnostic hypotheses effectively, and (2) to develop the procedural skills needed to perform therapeutic interventions.
All workshops involved simulated patients who had never presented the health conditions they represented. The course teachers were previously trained in simulation methodology and debriefing. The simulated patients were prepared through a pilot training session, in which each course lecturer provided feedback regarding the health condition represented. A day before the workshops began, the activity instructions and relevant information regarding aetiology, clinical records, and the signs presented by the patients were sent to the students. Each workshop had three sections: (1) Initial instructions: at the beginning, the aims of each simulation workshop were stated; (2) Interaction with simulated patients: students had to interact, individually or in pairs, with the simulated patients; and (3) Debriefing: students were guided through a reflective process, identifying the weaknesses and strengths of their performance.
Part 1: Students’ Perceptions of Achieved Learning in the Simulation Workshops
Instrument Development
The ‘Survey of Perception of Simulation Workshop for Non-surgical Procedures’, developed by Villagrán et al.26 and used with medical students, was adapted. A panel of local experts composed of those in charge of clinical courses was constituted. The adapted survey evaluated students' perception of learning in the simulation workshop. It used a Likert-scale format, with seven items (quantitative) and four open questions (qualitative). The newly adapted instrument was named the Survey of Perception of Simulation Workshops in Health Sciences.
Sample Size
The sample size was calculated based on the goodness of fit of the confirmatory factor analysis27. The Root Mean Square Error of Approximation (RMSEA) was used to obtain the sample size needed to test the close-fit hypothesis of the confirmatory model. Thus, considering an RMSEA = 0.04, 80% power, 7 items/one factor, and α = 0.05, the required sample size was 205 participants.
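RMSEA-based sample-size planning of this kind typically follows the MacCallum-Browne-Sugawara power analysis, which compares noncentral chi-square distributions under the null and alternative RMSEA values. The sketch below is a hypothetical illustration of that mechanics, not the authors' actual computation; the alternative RMSEA of 0.10 is an assumed parameter, since the text reports only RMSEA = 0.04, 80% power, and α = 0.05.

```python
# Hedged sketch of MacCallum-Browne-Sugawara power analysis for RMSEA fit tests.
# rmsea_alt = 0.10 is an illustrative assumption not stated in the study.
from scipy.stats import ncx2

def rmsea_power(n, df, rmsea_null, rmsea_alt, alpha=0.05):
    """Power to reject the null RMSEA in favour of a larger alternative RMSEA."""
    lam0 = (n - 1) * df * rmsea_null ** 2   # noncentrality under H0
    lam1 = (n - 1) * df * rmsea_alt ** 2    # noncentrality under H1
    crit = ncx2.ppf(1 - alpha, df, lam0)    # critical chi-square value under H0
    return ncx2.sf(crit, df, lam1)          # P(reject H0 | H1 true)

def min_sample_size(df, rmsea_null, rmsea_alt, power=0.80, alpha=0.05):
    """Smallest N reaching the target power (simple linear search)."""
    n = 10
    while rmsea_power(n, df, rmsea_null, rmsea_alt, alpha) < power:
        n += 1
    return n

# One factor, seven indicators: df = 7*8/2 - (7 loadings + 7 uniquenesses) = 14
print(min_sample_size(df=14, rmsea_null=0.04, rmsea_alt=0.10))
```

The required N grows as the null and alternative RMSEA values move closer together, which is why the hypothesized effect size drives the calculation.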
Validity and Reliability
The content and face validity of the survey were determined through a panel of experts, and the construct validity through exploratory and confirmatory factor analysis. Since the survey was a Likert-type scale, a matrix of polychoric correlations was created. The coefficient of determination (R2) was estimated to quantify the percentage of the variance of each of the survey's assertions explained by the identified factor. Reliability was assessed through Cronbach's alpha coefficient.
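Of these statistics, Cronbach's alpha can be computed directly from the item-response matrix. A minimal sketch with invented Likert data (the real analysis used the students' actual responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Invented responses: 205 respondents x 7 Likert items (1-5), positively
# correlated through a shared per-respondent tendency.
rng = np.random.default_rng(0)
base = rng.integers(3, 6, size=(205, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(205, 7)), 1, 5)
print(round(cronbach_alpha(scores), 2))
```

Higher inter-item correlation drives alpha toward 1; the study's reported value of 0.8 indicates good internal consistency for a seven-item scale.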
Quantitative Analysis
All assertions were rated on a Likert scale with five response options (i.e., strongly disagree, disagree, neutral, agree, and strongly agree). These options were coded from one point for ‘strongly disagree’ to five points for ‘strongly agree’. The Kruskal-Wallis test was used to compare global scores among workshops.
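The coding and comparison described above can be sketched as follows (the response data are invented for illustration; `scipy.stats.kruskal` implements the Kruskal-Wallis H test):

```python
from scipy.stats import kruskal

# Likert coding used in the survey: 1 = strongly disagree ... 5 = strongly agree
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def global_score(responses):
    """Sum the coded answers to the seven assertions (possible range 7-35)."""
    return sum(LIKERT[r] for r in responses)

print(global_score(["agree"] * 3 + ["strongly agree"] * 4))  # -> 32

# Invented global scores for three hypothetical workshops
workshop_a = [34, 35, 32, 33, 35, 31]
workshop_b = [33, 34, 35, 32, 34, 30]
workshop_c = [35, 33, 31, 34, 32, 33]
h_stat, p_value = kruskal(workshop_a, workshop_b, workshop_c)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
```

The Kruskal-Wallis test is appropriate here because summed Likert scores are ordinal and typically not normally distributed.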
Qualitative Analysis
The qualitative analysis allowed the identification of categories and the determination of recording units (text citations regarding the topics broached by students). A categorization system was built, which complied with the characteristics of thoroughness, mutual exclusion, the single-classification principle, objectivity, and appropriateness28. Although previously defined categories were used at the beginning, new subcategories were added.
Data Triangulation
The quantitative and qualitative information was triangulated, following the proposal of O'Cathain et al.29, identifying total agreements, complementary information, and discordances between the qualitative and quantitative sections. Data triangulation was used to compare and consolidate findings across the quantitative and qualitative data sets.
Part 2: Effect of Incorporating Simulation Workshops in the Students' Course Evaluation
Regarding the second objective, an instrument delivered across all courses in the SLT program, and across subsequent years, was applied. Eight aspects were analysed: methodology, teachers, information sources, feedback, grades, organization, infrastructure and materials, and the comprehensive evaluation of the course30. Students rated each aspect on a scale from 1 to 7. In Methodology, students were asked to evaluate the teaching methodologies used and whether they were sufficient. In Teachers, students indicated whether teachers created a safe learning environment and made all aspects of the course clear. When evaluating Sources of Information, students were asked to consider whether the guide texts were adequate and readily available in the library. In Feedback, students considered whether they had received information, as well as recommendations for improvement, during the course. To evaluate Grades and Quizzes, students took into consideration the time required, the quantity of evaluations, the level of difficulty, and the time it took to receive their results. For Organization, students had to rate the general organization of the course, as well as its schedule and workload. To evaluate Infrastructure and Materials, the availability of rooms and laboratories was taken into consideration, together with the number of simulated patients and work materials. In the item named Comprehensive Evaluation, students were asked to evaluate the course as a whole, as well as the importance of what they had learned.
Sample Size
The sample was estimated for comparisons between the evaluations applied in the two years before the simulation workshops were carried out and an evaluation performed immediately after the workshops' implementation. A mixed-effects model was used to make these comparisons. Considering a mean difference of 0.4 points between evaluations, an alpha level of 0.05, 90% power, students nested within 3 clusters, and a model with a random intercept, the required sample size was 277 participants.
Quantitative Analysis
The evaluation was completed by students belonging to courses that had incorporated the simulations and by students from previous years in which the simulation workshops had not been incorporated. There were records of the evaluations from the two years before the implementation of the workshops, as well as the evaluations given upon completion of the workshops. A mixed-effects model was used to compare adjacent evaluations, specifying a random intercept to account for the fact that students' perceptions of the same course would be correlated. Given the non-normal distribution of the model's residuals, the standard error was estimated with bootstrapping.
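A random-intercept model with a bootstrapped standard error might be sketched as follows with `statsmodels`. All data below are simulated for illustration only (the true course effects, effect size, and noise level are assumptions, not the study's data), and rows rather than clusters are resampled for simplicity:

```python
# Hedged sketch: random-intercept comparison of pre/post-workshop ratings with
# a bootstrapped standard error. Data are simulated, not the study's.
import warnings
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

warnings.filterwarnings("ignore")  # silence convergence warnings in the sketch

rng = np.random.default_rng(1)
n = 277
data = pd.DataFrame({
    "course": rng.choice(["A", "B", "C"], size=n),   # 3 course clusters
    "after": rng.integers(0, 2, size=n),             # 0 = pre-workshop, 1 = post
})
course_fx = data["course"].map({"A": -0.2, "B": 0.0, "C": 0.2})
data["rating"] = 5.8 + 0.4 * data["after"] + course_fx + rng.normal(0, 0.6, n)

# Random intercept per course captures correlated ratings of the same course
fit = smf.mixedlm("rating ~ after", data, groups="course").fit()
effect = fit.params["after"]

# Bootstrap the standard error of the pre/post effect
boot = []
for _ in range(100):
    s = data.sample(n, replace=True)
    boot.append(smf.mixedlm("rating ~ after", s, groups="course").fit().params["after"])
print(f"effect = {effect:.2f}, bootstrap SE = {np.std(boot, ddof=1):.3f}")
```

Bootstrapping sidesteps the normality assumption on the residuals, which is why the study reports bootstrapped standard errors.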
Results
Part 1: Students’ Perceptions of Achieved Learning in the Simulation Workshops
Instrument Development
The expert committee adapted the survey initially developed by Villagrán et al.26 for application in the SLT area. The adapted survey maintained the seven Likert-type assertions and four open questions. The phrase ‘simulation models’ (‘Modelos de simulación’) was replaced with ‘workshop methodology’ (‘Metodología del taller’).
Validity and Reliability
The exploratory factor analysis showed that the assertions adequately represented the single construct identified (perception of learning through simulation workshops). This construct explained 87.2% of the variance in scores. All of the factor loadings were greater than 0.4 (Table 2). The R2 values ranged between 0.41 and 0.73, and Cronbach's alpha coefficient reached a value of 0.8.
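The relationship between loadings, explained variance, and per-item R2 can be illustrated with a toy eigendecomposition. The correlation matrix below is invented, and a principal-component approximation of a Pearson matrix stands in for the polychoric factor analysis used in the study:

```python
import numpy as np

# Invented 7x7 correlation matrix with uniform inter-item correlation 0.6
R = np.full((7, 7), 0.6)
np.fill_diagonal(R, 1.0)

eigvals, eigvecs = np.linalg.eigh(R)
i = np.argmax(eigvals)                           # dominant factor
loadings = np.abs(eigvecs[:, i]) * np.sqrt(eigvals[i])
explained = eigvals[i] / eigvals.sum()           # proportion of variance explained
r_squared = loadings ** 2                        # variance of each item explained

print(np.round(loadings, 2))   # each loading well above the 0.4 threshold
print(round(explained, 3))
```

In this toy case a single dominant eigenvalue yields loadings of about 0.81 on every item, mirroring the one-factor structure the survey analysis identified.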
Quantitative Analysis
The overall score had a median of 34 points (35 points was the highest possible score), with the 25th percentile at 32 points and the 75th percentile at 35. There were no significant differences between the medians of the workshops (χ2=7.38; p=0.117) (Table 3).
Table 3. Descriptive statistics (median, 25th percentile, and 75th percentile) of the Survey of Perception of Simulation Workshops in Health Sciences, by workshop.
There was a ceiling effect for all assertions except numbers four and five (Table 2). Scoring differences were observed between assertion four and assertions one (p<0.001) and three (p<0.001), and between assertion five and assertions one, two, three, six, and seven (all p<0.001).
Qualitative Analysis
The results were structured according to the open questions, with 64 identified codes. These codes were grouped into 14 subcategories, which in turn were grouped into four categories.
Category One: Strengths
Three subcategories were identified in the answers to the question: What strengths of the simulation workshop would you highlight? In the Contextual characteristics of the workshop subcategory, students referred to the quality of the acting and the realistic imitation of reality (‘The workshop allowed me to have a real interaction’). Another subcategory was Perception of learning in the simulated context, which alluded to the opportunity to prepare for clinical practice, adapt to different situations, and practice procedures (‘Encountering a procedure allowed me to generate strategies for real-world clinical practice’). The Perception of learning based on reflective practice subcategory referred to valuing the debriefing and recognizing one's own mistakes (‘made me feel more confident and learn from common mistakes’ and ‘we were able to become aware of our mistakes’).
Category Two: Weaknesses
Three subcategories were observed in the answers to the question: What aspects could be improved in the simulation workshop? The first corresponds to Contextual characteristics of the workshop, which referred to the organization, the number of people per group, and the limitation of the interactions with the simulated patient (‘Everyone should have the opportunity to participate’). The Teacher-Student-Actor interaction subcategory alluded to the prior delivery of instructions and the lack of immediate feedback (‘Feedback is required after the evaluation, before beginning the intervention’). In the third subcategory, Perception of one's weaknesses in a simulated scenario, students recognized the difficulty of time management (‘More time is needed to organize the intervention’).
Category Three: Difficulties
Four subcategories were observed in the answers to the question: Did you have difficulties during the simulation workshop? In the first subcategory, Organization and contextual factors, students made references to problems with the number of students, time, and environmental factors (‘I need to optimize the time to perform the patient evaluation’). The Difficulty in cognitive competency subcategory alluded to a lack of experience and study (‘I had difficulties due to the lack of experience with patients with severe speech disorders’). In the third subcategory, Difficulty in procedural competency, students referred to the delivery of the diagnosis, the formulation of questions, detailing, and the order of the evaluation (‘We did not explain the diagnosis to the patient. It was difficult to manage the patient's frustration and tiredness’). In the Difficulty in attitudinal competency subcategory, students referred to seeming nervous or insecure and mentioned obstacles in adapting to the needs of the patient (‘It was difficult for me to adapt to the situation while achieving the objectives of the session’).
Category Four: Recommendations
Three subcategories were observed in the answers to the question: Would you recommend the simulation workshop to other students? The Perception of development of transversal and evaluative skills subcategory referred to the development of these skills (‘yes, since it incorporated strategies that made possible an adequate interaction with the patient’). Preparation for clinical visits and for their futures as professionals, as well as the possibility of making mistakes in a safe environment, was mentioned in the subcategory Preparation before real clinical practice (‘yes, this allowed me to improve before starting clinical practices’). In the Practical learning opportunity subcategory, students referred to the workshop as a meaningful learning experience (‘yes, since it was essential to work on communication skills and the human side, rather than only on technical and theoretical knowledge’).
Triangulation of Information
There was total agreement between quantitative assertion two, related to the usefulness of receiving feedback to recognize strengths and weaknesses, and the qualitative subcategories concerning perception of learning based on reflective practice and identification of strengths and weaknesses. There was also total agreement between assertion six, regarding the opinion that the simulation workshop should be an essential component of teaching, and the qualitative category Recommendations, in which 100% of the participants would recommend the workshop.
The responses to assertion seven, related to being able to prepare oneself before performing clinical procedures, were complemented by the subcategories related to the possibility of learning from mistakes before encountering real patients. Discordance was observed in the category Contextual characteristics of the workshop, which included negative aspects related to the number of students and to how well the simulation imitated reality, identifying weaknesses that are not reflected in the quantitative section.
Part 2: Effect of Incorporating Simulation Workshops in the Students' Course Evaluation
There were no differences in any of the evaluated aspects when comparing the years prior to the implementation of the workshops, except for sources of information (p<0.01) and the comprehensive grade (p<0.01). There were differences when comparing the year before the workshops were implemented with the evaluation performed immediately after their implementation in methodology (p<0.05), feedback (p<0.05), organization (p<0.01), infrastructure and materials (p<0.001), and the comprehensive grade (p<0.001) (Figure 1).
Figure 1. Results from the systematically applied course evaluation survey, including results from the two years since the beginning of the course and the evaluations given upon completion of the workshops (n=277).
Discussion
Clinical simulation workshops were implemented in three courses of the SLT curriculum to develop communicative and procedural skills. The first objective was to analyse students' perceptions of achieved learning in the simulation workshops implemented in multiple areas of the curriculum. A positive perception of the inclusion of the simulation workshops was reported in both the quantitative and qualitative sections of the survey.
In the perception survey, assertions five and seven obtained the highest R2 values, relating strongly to the construct measured. These assertions are linked to an increase in safety and preparation for clinical visits. This finding agrees with studies in SLT in which students, after participating in workshops with simulated patients, reported decreased anxiety and increased confidence4,7,13.
A second objective was to establish the effect of the simulation workshops on the students' evaluation of the course in which the workshops were implemented. There were differences in methodology, feedback, organization, and the overall grade when comparing evaluations before and after the workshops' implementation. The increase in the score for feedback could be associated in part with the debriefing, the space for reflection on clinical practice that allows students to clarify their knowledge and understand the rationale of the simulation31. This improvement is consistent with Clinard & Dudding7, in whose study students identified feedback and communication with the educator as the most significant strengths of the process.
The present study identified the strengths and weaknesses of the simulation workshops through the incorporation of qualitative survey questions. Students' difficulties and their recommendations for future workshops were also gathered. The high scores on the quantitative assertions, with a ceiling effect for some, limit the usefulness of instruments that consider only quantitative aspects. On the other hand, the high scores highlight the fact that the experience was a positive one and that the benefits outweighed the difficulties experienced.
Limitations and Projections
One limitation is the evaluation of learning via perception, without including changes in student performance, which makes this study more of an initial evaluation. Despite this, it was possible to confirm changes in the systematically applied evaluation regarding other aspects that students perceive as relevant when evaluating their learning experience. Additionally, the performance obtained in the courses that included the workshops is high, which makes it challenging to detect possible improvements.
Another limitation is that the improvement in the evaluation of the course in which the workshops were implemented cannot be attributed solely to the simulation workshops, since the study did not control for other activities, such as clinical training. There is a constant increase in the evaluation of courses, which is to be expected given that there is an early (mid-semester) evaluation and that all professors have completed methodology-based teacher training. Nevertheless, the increase in aspects such as organization and methodology is likely attributable to the simulation workshops.
One possible projection is to evaluate the transfer of skills from students who participated in the workshops to students who did not. At the institution where the participants study, it is common for students to become teaching assistants for lower-level courses; thus, it is not unreasonable to assume that some students model others. Barsuk et al.32 observed that medical residents who had not yet received simulation training improved their baseline performance after observing more experienced residents who had completed simulation training.
Conclusion
Clinical simulation workshops were implemented in the SLT curriculum to develop skills before clinical practice. To evaluate them, a learning-perception survey that included quantitative and qualitative sections was adapted, and its validity and reliability were confirmed. Elevated scores showed students' positive perceptions of the implementation of the simulation workshops, which were a learning experience in which students highlighted the benefits of putting their knowledge into practice before facing real patients.
Acknowledgements
To the panel of local experts, composed of directors and clinical lecturers in the Department of Health Sciences at Pontificia Universidad Católica de Chile.
REFERENCES
-
1 Corvetto M, Bravo MP, Montaña R, Utili F, Escudero E, Boza C et al. Simulación en educación médica: una sinopsis. Rev Med Chil. 2013;141(1):70-9. Spanish. doi: 10.4067/S0034-98872013000100010.
» https://doi.org/10.4067/S0034-98872013000100010 - 2 Alanazi AA, Nicholson N, Thomas S. The use of Simulation Training to improve knowledge, skills, and confidence among healthcare students: a systematic review. The Internet Journal of Allied Health Sciences and Practice. 2017;15(3):2.
-
3 Bokken L, Linssen T, Scherpbier A, van der Vleuten C, Rethans JJ. Feedback by simulated patients in undergraduate medical education: a systematic review of the literature. Med Educ. 2009;43(3):202-10. doi: 10.1111/j.1365-2923.2008.03268.x.
» https://doi.org/10.1111/j.1365-2923.2008.03268.x -
4 Hill AE, Davidson BJ, Theodoros DG. The performance of standardized patients in portraying clinical scenarios in speech-language therapy. Int J Lang Commun Disord. 2013;48(6):613-24. doi: 10.1111/1460-6984.12034.
» https://doi.org/10.1111/1460-6984.12034 -
5 Michael M, Abboudi H, Ker J, Shamim Khan M, Dasgupta P, Ahmed K. Performance of technology-driven simulators for medical students - a systematic review. J Surg Res. 2014;192(2):531-43. doi: 10.1016/j.jss.2014.06.043.
» https://doi.org/10.1016/j.jss.2014.06.043 -
6 Lee R, Raison N, Lau WY, Aydin A, Dasgupta P, Ahmed K et al. A systematic review of simulation-based training tools for technical and non-technical skills in ophthalmology. Eye (Lond). 2020;34(10):1737-59. doi: 10.1038/s41433-020-0832-1.
» https://doi.org/10.1038/s41433-020-0832-1 -
7 Clinard ES, Dudding CC. Integrating simulations into communication sciences and disorders clinical curriculum: impact of student perceptions. Am J Speech Lang Pathol. 2019;28(1):136-47. doi: 10.1044/2018_AJSLP-18-0003.
» https://doi.org/10.1044/2018_AJSLP-18-0003 -
8 Syder D. The use of simulated clients to develop the clinical skills of speech and language therapy students. Eur J Disord Commun. 1996;31(2):181-92. doi: 10.3109/13682829609042220.
» https://doi.org/10.3109/13682829609042220 -
9 Wilson WJ, Schmulian D, Sher A, Morris S, Hill AE. Student perceptions of two simulated learning environments in paediatric audiology. Int J Audiol. 2020;59(1):16-23. https://doi.org/10.1080/14992027.2019.1660004
» https://doi.org/10.1080/14992027.2019.1660004 -
10 Potter N, Allen M. Clinical swallow exam for dysphagia: A speech pathology and nursing simulation experience. Clin Simul Nurs. 2013;9(10):e461-e464. doi: 10.1016/j.ecns.2012.08.001
» https://doi.org/10.1016/j.ecns.2012.08.001 -
11 Ward EC, Hill AE, Nund RL, Rumbach AF, Walker-Smith K, Wright SE et al. Developing clinical skills in paediatric dysphagia management using human patient simulation (HPS). Int J Speech Lang Pathol. 2015;17(3):230-40. doi: 10.3109/17549507.2015.1025846.
» https://doi.org/10.3109/17549507.2015.1025846 -
12 Miles A, Friary P, Jackson B, Sekula J, Braakhuis A. Simulation-based dysphagia training: teaching interprofessional clinical reasoning in a hospital environment. Dysphagia. 2016;31(3):407-15. 10.1007/s00455-016-9691-0.
» https://doi.org/10.1007/s00455-016-9691-0 -
13 Miles A, Greig L, Jackson B, Keesing M. Evaluation of a tracheostomy education programme for speech-language therapists. Int J Lang Commun Disord. 2020;55(1):70-84. doi: 10.1111/1460-6984.12504.
» https://doi.org/10.1111/1460-6984.12504 -
14 Bressmann T, Eriks-Brophy A. Use of simulated patients for a student learning experience on managing difficult patient behaviour in speech-language pathology contexts. Int J Speech Lang Pathol. 2012;14(2):165-73. doi: 10.3109/17549507.2011.638727.
» https://doi.org/10.3109/17549507.2011.638727 - 15 Miles A, Donaldson S, Friary P. Training hospital readiness in speech-language pathology students through simulation. Internet j. allied health sci. pract. 2015;13(4): Article 8.
-
16 Alanazi AA, Nicholson N. Audiology and speech-language pathology simulation training on the 1-3-6 early hearing detection and intervention timeline. Am J Audiol. 2019;28(2):348-61. doi: 10.1044/2019_AJA-18-0185.
» https://doi.org/10.1044/2019_AJA-18-0185 -
17 Ferguson NF, Estis JM. Training students to evaluate preterm infant feeding safety using a video-recorded patient simulation approach. Am J Speech Lang Pathol. 2018;27(2):566-73. doi: 10.1044/2017_AJSLP-16-0107.
» https://doi.org/10.1044/2017_AJSLP-16-0107 -
18 Jackson B, Brady A, Friary P, Braakhuis A, Sekula J, Miles A. Educator-student talk during interprofessional simulation- based teaching. BMJ STEL. 2020;6(4):206-13. doi: 10.1136/bmjstel-2019-000455
19 Howells S, Cardell EA, Waite MC, Bialocerkowski A, Tuttle N. A simulation-based learning experience in augmentative and alternative communication using telepractice: speech pathology students' confidence and perceptions. Adv Simul (Lond). 2019;4(Suppl 1):23. doi: 10.1186/s41077-019-0113-x.
20 Baylor C, Burns MI, Struijk J, Herron L, Mach H, Yorkston K. Assessing the believability of standardized patients trained to portray communication disorders. Am J Speech Lang Pathol. 2017;26(3):791-805. doi: 10.1044/2017_AJSLP-16-0068.
21 Carter MD. The effects of computer-based simulations on speech-language pathology student performance. J Commun Disord. 2019;77:44-55. doi: 10.1016/j.jcomdis.2018.12.006.
22 Hill AE, Ward E, Heard R, McAllister S, McCabe P, Penman A et al. Simulation can replace part of speech-language pathology placement time: a randomised controlled trial. Int J Speech Lang Pathol. 2020:1-11. doi: 10.1080/17549507.2020.1722238.
23 Dudding CC, Nottingham EE. A national survey of simulation use in university programs in Communication Sciences and Disorders. Am J Speech Lang Pathol. 2018;27(1):71-81. doi: 10.1044/2017_AJSLP-17-0015.
24 Hope A, Garside J, Prescott S. Rethinking theory and practice: pre-registration student nurses experiences of simulation teaching and learning in the acquisition of clinical skills in preparation for practice. Nurse Educ Today. 2011;31(7):711-5. doi: 10.1016/j.nedt.2010.12.011.
25 Hewat S, Penman A, Davidson B, Baldac S, Howells S, Walters J et al. A framework to support the development of quality simulation-based learning programmes in speech-language pathology. Int J Lang Commun Disord. 2020;55(2):287-300. doi: 10.1111/1460-6984.12515.
26 Villagrán I, Tejos R, Chahuan J, Uslar T, Pizarro M, Varas J et al. Undergraduate students' perception of clinical simulation workshops: assessment of an instrument. Rev Med Chil. 2018;146(6):786-95. doi: 10.4067/s0034-98872018000600786.
27 MacCallum RC, Browne MW, Sugawara HM. Power analysis and determination of sample size for covariance structure modeling. Psychol Methods. 1996;1(2):130-49. doi: 10.1037/1082-989X.1.2.130.
28 Rodríguez G, Gil J, García E. Métodos de investigación cualitativa [Qualitative research methods]. Málaga: Aljibe; 1996.
29 O'Cathain A, Murphy E, Nicholl J. Three techniques for integrating data in mixed methods studies. BMJ. 2010;341:c4587. doi: 10.1136/bmj.c4587.
30 Pérez G, Kattan E, Collins L, Wright AC, Rybertt T, González A et al. Assessment for learning: experience in an undergraduate medical theoretical course. Rev Med Chil. 2015;143(3):329-36. doi: 10.4067/S0034-98872015000300007.
31 McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44(1):50-63. doi: 10.1111/j.1365-2923.2009.03547.x.
32 Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Unexpected collateral effects of simulation-based medical education. Acad Med. 2011;86(12):1513-7. doi: 10.1097/ACM.0b013e318234c493.
Publication Dates
Publication in this collection: 12 Apr 2021
Date of issue: 2021
History
Received: 13 Oct 2020
Accepted: 08 Mar 2021