Simulation Program - Publication Repository
Permanent URI for this collection: https://hdl.handle.net/1807/89967
Browsing Simulation Program - Publication Repository by Title
Now showing 1 - 2 of 2
Item: Bridging Class and Field: Field Instructors’ and Liaisons’ Reactions to Information About Students’ Baseline Performance Derived From Simulated Interviews (2017)
Authors: Bogo, Marion; Lee, Barbara; McKee, Eileen; Ramjattan, Roxanne; Baird, Stephanie L.
Abstract: To strengthen students’ preparation for engaging in field learning, an innovation was implemented to teach and assess foundation-year students’ performance before they entered field education. An Objective Structured Clinical Examination (OSCE) informed the final evaluation of students’ performance in two companion courses on practice theory and skills. The evaluation was used by field instructors and students to develop the field learning plan. This paper reports on a qualitative study that examined field instructors’ and faculty field liaisons’ experiences with, and reactions to, this new approach, and its impact in shaping students’ field learning goals. Implications for supporting the adoption of innovations that strengthen the link between classroom and field teaching are offered, including new institutional policies that resulted from ongoing evaluation.

Item: The Development of an Online Practice-Based Evaluation Tool for Social Work (Sage Publications, 2011-01-18)
Authors: Regehr, Cheryl; Bogo, Marion; Regehr, Glenn
Abstract:
Objective: This paper describes the development of a practice-based evaluation (PBE) tool that allows instructors to represent their students’ clinical performance in a way that is sufficiently authentic to resonate with both instructors and students, is psychometrically sound, and is feasible in the context of real practice.
Method: A new online evaluation tool was designed to address several of the problems associated with previous methods of evaluation and was tested on 190 field instructor-student pairs.
Results: Results demonstrated the feasibility of the tool, high acceptability among students and faculty, high internal consistency, and a clearly reduced ceiling effect compared with a traditional competency-based evaluation (CBE) tool. It did, however, continue to produce a strong skew toward positive evaluations and did not increase the identification of students at risk.
Conclusions: The online PBE tool shows promise in redressing some of the evaluation issues posed by the previous CBE model of evaluation.