REVIEW ARTICLE
Year : 2021  |  Volume : 7  |  Issue : 3  |  Page : 204-210

Workplace-Based Assessment: A Real-Time Assessment Tool


Guru Nanak Eye Centre, Maulana Azad Medical College and associated hospitals, New Delhi, India

Date of Submission: 03-Dec-2021
Date of Acceptance: 03-Dec-2021
Date of Web Publication: 24-Dec-2021

Correspondence Address:
Dr. Kirti Singh
Guru Nanak Eye Centre, Maulana Azad Medical College and associated hospitals, New Delhi
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/mamcjms.mamcjms_127_21

  Abstract 


Workplace-based assessment (WPBA) is a method of assessment that measures how a healthcare professional works when performing his or her duties. The shift of the Indian undergraduate medical curriculum to competency-based medical education makes WPBA a very attractive evaluation option, as it measures competency outcomes in real-time scenarios with patients. This article discusses the essential components of WPBA (direct observation, conduction at the workplace, and constructive, immediate feedback) and details its methodology, types, utility index, strengths, and lacunae, along with strategies to improve implementation.

Keywords: Direct observation, feedback, utility and rating forms, workplace-based assessment


How to cite this article:
Singh K, Singh A. Workplace-Based Assessment: A Real-Time Assessment Tool. MAMC J Med Sci 2021;7:204-10

How to cite this URL:
Singh K, Singh A. Workplace-Based Assessment: A Real-Time Assessment Tool. MAMC J Med Sci [serial online] 2021 [cited 2023 Jun 9];7:204-10. Available from: https://www.mamcjms.in/text.asp?2021/7/3/204/333605




  Introduction


The current paradigm shift in undergraduate medical education to competency-based medical education (CBME) envisages a shift from classroom-based, compartmentalized teaching to competency-based, integrated teaching at the workplace with real patients. Workplace-based assessment (WPBA) is the optimal tool to assess such competencies.[1]

Learning does not occur without assessment. The aphorism “Assessment drives learning” underscores the need for all educators to embed assessment tools in concordance with learning objectives. Traditional methods of undergraduate and postgraduate assessment, whether ward-leaving assessments, symposia, or qualifying examinations, focus more on presentation skills and the cognitive domain of the student, minimizing psychomotor skill evaluation and ignoring the affective domain, for example, communication skills. Recall the practical examinations you faced and you will realize that how you examined the patient (psychomotor skills) and how you interacted with the patient (affective domain) were minimally assessed; what you were tested on was how you presented and discussed the case. This emphasis on knowledge and reasoning skills has produced doctors with a surfeit of theoretical facts, classifications, and disease guidelines, but with suboptimal translation into effective clinical skills and patient handling.[1] This is one key reason for the increasing doctor–patient conflict, with its negative ramifications.

To address this lacuna, Norcini and Burch proposed in 2007 a method of assessment that seeks to measure what it should, namely how a healthcare professional works when performing his duties; they named it WPBA.[2] WPBA is defined as the formal and/or informal assessment of a learner, by anyone present around him, while he is actually performing his job. The three essential components of WPBA are direct observation, conduction at the workplace, and contextual, constructive, immediate feedback.[3],[4],[5]


  Competency Assessment


To understand WPBA, a look at the competency ladder envisioned by George Miller in 1990 is required. “Miller’s pyramid” moves away from the traditional Flexnerian education model, dominated by theoretical knowledge-based assessments, toward performance-based assessment in real-life situations. The pyramid divides the development of clinical competence into four hierarchical levels, starting from knowledge (at the base) and ending with routine application in clinical settings (at the apex) [Figure 1].
Figure 1 Miller’s pyramid of competencies.


Level 1 (Knows) tests knowledge through written examinations and traditional MCQs. Level 2 (Knows how) tests the application of knowledge through essays, clinical problem-solving exercises, and extended MCQs. Level 3 (Shows how) tests clinical skills competency through standardized patient exercises, simulations, and clinical examinations. Level 4 (Does) tests clinical performance through direct observation in real clinical settings.[6] The lower two levels rely on classroom-based assessments, whereas the two higher tiers, which test the behavioral components of clinical competence, use simulated and real clinical settings. WPBA is an assessment tool for the upper two levels.


  Methodology of WPBA


A planned assessment of the trainee’s competencies in any domain (attitude, skills, or knowledge: ASK) is performed, with each assessment labeled an encounter. Each encounter, of short duration (10–15 minutes), is observed by a skilled observer, who assesses the student using the structured checklist and rating scale devised by Norcini and Burch[2] [Figure 2]. The use of such a checklist increases the error detection rate of assessors and prevents the tool from becoming a mere tick-box exercise.[7] The key component of WPBA is specific, structured feedback given immediately after the encounter over 3 to 5 minutes, and never within the patient’s hearing.[8] It concludes with the assessor giving a specific action plan on how to rectify any omissions. The teacher then rates the performance on the chart, signs it, and hands it back to the trainee, who keeps it for the record. If the competency has been acquired appropriately, the skill is considered mastered. The learner and trainer then decide on the topic and probable time of the next encounter. At the end of each semester, the competencies or learning outcomes achieved are confirmed and signed off by the educational supervisor.[9]
Figure 2 Norcini rating scale for the mini-clinical evaluation exercise (mini-CEX) evaluation form. Source: www.hcat.nhs.uk.
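To make the flow concrete, here is a minimal sketch, in Python, of how one such encounter record might be captured; the field names are hypothetical illustrations, not taken from any published WPBA form:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of a single WPBA encounter; field names are
# illustrative, not drawn from any official WPBA instrument.
@dataclass
class WPBAEncounter:
    trainee: str
    assessor: str
    encounter_date: date
    domain: str                 # "attitude", "skills", or "knowledge" (ASK)
    observation_minutes: int    # typically 10-15 minutes
    feedback_minutes: int       # typically 3-5 minutes, given immediately
    rating: int                 # score on the structured rating scale
    action_plan: str            # assessor's plan to rectify omissions
    signed_by_assessor: bool = False

encounter = WPBAEncounter(
    trainee="A. Trainee",
    assessor="Dr. K. Observer",
    encounter_date=date(2021, 12, 3),
    domain="skills",
    observation_minutes=12,
    feedback_minutes=5,
    rating=6,
    action_plan="Practice systematic abdominal palpation.",
    signed_by_assessor=True,
)
```

The signed record stays with the trainee, matching the paper workflow described above.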



  Strengths of WPBA


  1. Constructive, specific feedback allows the trainee to steer his learning toward desired outcomes.[4],[10],[11],[12]
  2. Summative assessments (exit examinations) handicap students with the element of one-day luck, examiner bias, and anxiety-induced aphasia. In addition, the administration has the daunting task of arranging examiners, patient variety, and other resources. WPBA supplements this labor-intensive examination system.
  3. Clinical overload in Indian hospitals curtails the time available for effective formative assessment, which should cover all competencies and all domains of ASK. A structured, learner-led WPBA, conducted during working hours in real-world settings (ward/OPD/OT), addresses this deficiency.[7],[13] It requires no additional infrastructure apart from skilled observers (fellows).
  4. The assessment encourages reflection by students and permits learner-guided course correction.[2]
  5. For underperformers with poor skill mastery, WPBA highlights their specific deficiencies and gives them an opportunity to seek targeted help early in training to acquire the desired competencies.[9] For high achievers, the tool is of limited use; for them, step-by-step analytic guidelines can hamper the learning process, the so-called expertise reversal effect.[14]



  Types of WPBA


These can be categorized as direct observation of trainee performance, individual case discussions, feedback-linked tools, and documentation of work.[15]

1. Direct observation of trainees during clinical encounters: mini-clinical evaluation exercise (mini-CEX), direct observation of procedural skills (DOPS), clinical examination and procedural skills, objective structured assessment of technical skills, and objective structured long examination record.[5]

2. Discussion of individual clinical cases/supervised learning events: case-based discussion, clinical encounter cards, and blinded patient encounters.

3. Feedback linked: multisource feedback (MSF) or 360° assessment. This consists of feedback on regular, routine work from peers, paramedical workers (nursing staff), and other stakeholders such as patients. The tools used are the mini peer assessment tool, the patient satisfaction questionnaire, and the clinical supervisor report.

4. Documentation of work: logbook, clinical encounter cards (CEC), and portfolio.

Direct Observation by Mini-CEX

It is a structured “snapshot” assessment of an observed clinical encounter, designed to provide feedback on skills essential to optimal clinical care.[13],[16] Competencies are assessed in seven areas: history taking, physical examination skills, communication skills, clinical judgment, professionalism, organization and efficiency, and overall clinical care [Table 1].
Table 1 Competencies tested in mini-CEX


A structured, standard mini-CEX form is filled in, rating the trainee on a 9-point scale (1–3 unsatisfactory, 4–6 satisfactory, and 7–9 superior). A free-text space at the bottom of the rating form permits the assessor to identify strengths, areas for development, and an action plan. This constructive, contextual, and immediate feedback, also given verbally, is integral to the mini-CEX activity.[2],[5]

Multiple encounters, around six to eight, covering multiple cases with multiple examiners, are required for optimal reliability and to overcome interrater bias.[17],[18]
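As an illustration of the banding and aggregation just described, a small Python sketch (the bands follow the scale quoted above; the sample scores are invented):

```python
from statistics import mean

def mini_cex_band(score: int) -> str:
    """Map a 1-9 mini-CEX rating to its descriptive band."""
    if not 1 <= score <= 9:
        raise ValueError("mini-CEX scores run from 1 to 9")
    if score <= 3:
        return "unsatisfactory"
    if score <= 6:
        return "satisfactory"
    return "superior"

# Ratings from six encounters with different examiners; averaging
# across encounters is what gives the tool its reliability.
scores = [6, 7, 7, 6, 8, 7]
print(f"{mean(scores):.2f} -> {mini_cex_band(round(mean(scores)))}")
# prints: 6.83 -> superior
```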


  Direct Observation of Procedural Skills


This tool evaluates the diagnostic and interventional skills of trainees in the workplace setting using a structured checklist, scored on a 6-point rating scale (1–2 below the expected level, 3 borderline, 4 meets the expected level, and 5–6 above the expected level).[5],[7] A free-text space at the end enables the assessor to write comments on the trainee’s progress, deficiencies, or strengths [Figure 3]. Psychomotor and affective competencies can be assessed, for example, eliciting a knee jerk, taking visual acuity, or measuring blood pressure.
Figure 3 Norcini rating scale for direct observation of procedural skills (DOPS) assessment form. Source: www.hcat.nhs.uk.


The checklist includes: obtaining informed consent, appropriate analgesia/sedation, technical ability, aseptic technique, help-seeking behavior where appropriate, postprocedure management, communication skills, and consideration of the patient/professionalism.[19]
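A hedged sketch of how a DOPS-style record could be tabulated, assuming the 6-point bands quoted above; the checklist item names paraphrase the list in the text and the scores are invented:

```python
def dops_band(score: int) -> str:
    """Map a 1-6 DOPS rating to its descriptive band."""
    if not 1 <= score <= 6:
        raise ValueError("DOPS scores run from 1 to 6")
    if score <= 2:
        return "below expected level"
    if score == 3:
        return "borderline"
    if score == 4:
        return "meets expected level"
    return "above expected level"

checklist = {
    "informed consent": 5,
    "analgesia/sedation": 4,
    "technical ability": 3,
    "aseptic technique": 6,
    "seeks help where appropriate": 4,
    "postprocedure management": 4,
    "communication skills": 5,
    "consideration of patient/professionalism": 5,
}
for item, score in checklist.items():
    print(f"{item}: {score} ({dops_band(score)})")
```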


  Individual Case Discussion


Case-based discussion

The trainee selects one or two discharged cases managed or observed by him and asks the educational supervisor to conduct a structured discussion on any aspect of the case.[20] Probing inquiry by the assessor identifies knowledge deficits, including the trainee’s ability to recognize these gaps. This tool is a good method of assessing and discussing professional judgment, rational treatment strategies, and ethical dilemmas, and it correlates highly with standardized patient care.[7],[15] In addition, by checking records, it audits both management protocols and record-keeping practices [Figure 4].
Figure 4 Norcini rating scale for case-based discussion (CBD) assessment form. Source: www.hcat.nhs.uk.


Blinded patient encounter

This is a bedside teaching method in which one student from a small group of four or five is observed while performing a focused interview or physical examination as instructed by the clinician educator.[2] Based on his clinical findings, the student is then asked to discuss the differential diagnosis and further care. The entire group learns information gathering and problem solving.

Work Documentation: Portfolio and Logbook

A longitudinal repository of activities (sickness trails, patient prototypes reported, procedures/surgeries performed, critical incidents, counseling, and learning milestones), it maps a trainee’s journey over a defined time along with the attainment of professional competencies. A portfolio differs from a logbook by including the student’s self-reflection and critical thinking.[6],[15]

Portfolio evaluation is performed by an observer trained in qualitative data analysis, to identify the pattern of learning and provide targeted feedback guiding the trainee’s further growth trajectory.

Feedback Linked

Multisource feedback: This tool samples the opinions and assessments of colleagues on the clinical performance, professional behavior, and interpersonal skills of the trainee.[21] The respondents are coworkers in the same workplace, for example, supervisors, nursing staff, fellow trainees, and even patients.[22] Respondents anonymously rate the trainee on a standard form listing a number of qualities or behavioral characteristics against a rating scale. The feedback is communicated to the educational trainer, who adds his own assessment. The use of multiple observers imparts a holistic picture of the trainee’s performance, as each assessor rates from his or her own perspective, giving rise to the synonym 360° feedback. This tool has proven to be of immense use in enhancing communication skills and affective domain constructs.[17]
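For illustration only, a sketch of how anonymous 360° ratings might be aggregated by respondent group; the roles and the 1-5 scale are assumptions for the example, not taken from any published MSF instrument:

```python
from collections import defaultdict
from statistics import mean

# (respondent_role, score) pairs on an assumed 1-5 scale; respondents
# stay anonymous, only their role is recorded.
ratings = [
    ("supervisor", 4), ("supervisor", 5),
    ("nursing staff", 3), ("nursing staff", 4),
    ("fellow trainee", 4), ("patient", 5), ("patient", 4),
]

by_role: dict[str, list[int]] = defaultdict(list)
for role, score in ratings:
    by_role[role].append(score)

# Report each group's mean separately: every role sees the trainee
# from its own perspective, which is the point of 360-degree feedback.
for role, scores in by_role.items():
    print(f"{role}: mean {mean(scores):.1f} over {len(scores)} raters")
```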

The value of feedback in WPBA is immense, especially in the current climate of doctor–patient mistrust and high stress levels among health professionals. A real-time scenario involves a doctor handling patients and their relatives while simultaneously guiding her team to provide optimal care.[5],[17] For such ongoing, complex human interaction, the doctor needs to develop self-perception, work on interpersonal development, and modify behavior, aiming for a healthy working relationship with peers and patients.[11] This concept is beautifully illustrated by the Johari model, given by psychologists Luft and Ingham[23] [Table 2]. The purpose of feedback is to reveal the blind spot, hidden, and unknown elements to the trainee, so as to enable him to adopt remedial measures in time.
Table 2 Johari window

                         Known to self       Not known to self
Known to others          Open (arena)        Blind spot
Not known to others      Hidden (facade)     Unknown



  Utility of WPBA Tools


The utility of any assessment tool is a product of its validity, reliability, feasibility, acceptability, and educational impact.[1],[21],[24] For example, the mini-CEX scores high on validity, acceptability, and educational impact, but its reliability requires multiple encounters with multiple skilled observers, which may not be very feasible for undergraduate training.[9],[25] MSF, on the other hand, scores high on reliability and educational impact but lower on validity and acceptability.
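Expressed as a formula (a common multiplicative reading of van der Vleuten’s utility index; authors weight the terms differently, so this is indicative rather than definitive):

Utility = Validity × Reliability × Feasibility × Acceptability × Educational impact

Because the terms multiply, a near-zero score on any single criterion drives overall utility toward zero, which is why the feasibility constraint on the mini-CEX and the validity concerns around MSF matter despite their strengths elsewhere.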


  Modalities to Improve Utility of WPBA


  1. For validity: Content validity can be increased by incorporating all domains (ASK) and all competencies. The encounters need to be contextual and assessed by trained observers. Trainee induction, with guidelines on process and intent, along with blueprinting (systematic grouping of competencies with appropriate weightage), needs to be performed before initiating the WPBA activity.[17],[26]
  2. For reliability: Conduct multiple encounters with multiple assessors.
  3. For feasibility: Faculty development training, student sensitization, and blueprinting.[1]
  4. For acceptability: Trainee induction and faculty orientation. Creating an atmosphere of mutual trust is critical, as imparting and receiving constructive, nonjudgmental feedback determines the success of any WPBA activity.
  5. For educational impact: Faculty development, curriculum redesign, pilot projects, standardized tools, and blueprinting are required.[1],[21]



  WPBA Relevance in Current Medical Training


Supervised encounters during hospital work empower learners to function with appropriate responsiveness in real-life patient scenarios.

The current paradigm shift in undergraduate medical education to CBME envisages a shift from classroom-based, compartmentalized teaching to competency-based, integrated teaching at the workplace with real patients. WPBA is the optimal tool to assess such competencies.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
  References

1. Singh T, Modi JN. Workplace based assessment: a step to promote competency based postgraduate training. Indian Pediatr 2013;50:553-9.
2. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29:855-71.
3. Barrett A, Galvin R, Steinert Y, Scherpbier A, O’Shaughnessy A, Horgan M, Horsley T. A BEME (Best Evidence in Medical Education) review of the use of workplace-based assessment in identifying and remediating underperformance among postgraduate medical trainees: BEME Guide No. 43. Med Teach 2016;38:1188-98.
4. Anderson P. Giving feedback on clinical skills: are we starving our young? J Grad Med Educ 2012;4:154-8.
5. Singh T, Kundra S, Gupta P. Direct observation and focused feedback for clinical skills training. Indian Pediatr 2014;51:713-7.
6. Poikela E. Developing criteria for knowing and learning at work: towards context. J Workplace Learn 2004;16:267-75.
7. Liu C. Introduction to workplace based assessments. Gastroenterol Hepatol Bed Bench 2012;5:24-8.
8. Daelmans HE, Mak-van der Vossen MC, Croiset G, Kusurkar RA. What difficulties do faculty members face when conducting workplace-based assessments in undergraduate clerkships? Int J Med Educ 2016;7:19-24.
9. Barrett A, Galvin R, Steinert Y, Scherpbier A, O’Shaughnessy A, Walsh G, Horgan M, et al. Profiling postgraduate workplace-based assessment implementation in Ireland: a retrospective cohort study. SpringerPlus 2016;5:133.
10. Ende J. Feedback in clinical medical education. JAMA 1983;250:777-81.
11. Chowdhury RR, Kalu G. Learning to give feedback in medical education. Obstet Gynaecol 2004;6:243-7.
12. Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do’s, don’ts and don’t knows of feedback for clinical education. Perspect Med Educ 2015;4:284-99.
13. Goel A, Singh T. The usefulness of mini clinical evaluation exercise as a learning tool in different pediatric clinical settings. Int J Appl Basic Med Res 2015;5(Suppl S1):32-4.
14. Kalyuga S, Ayres P, Chandler P, Sweller J. The expertise reversal effect. Educ Psychol 2003;38:23-31.
15. Singh T, Sood R. Workplace based assessment: measuring and shaping clinical learning. Natl Med J India 2013;26:42-5.
16. Holmboe ES, Yepes M, Williams F, Huot SJ. Feedback and the mini clinical evaluation exercise. J Gen Intern Med 2004;19(Pt 2):558-61.
17. Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ 2010;341:c5064.
18. Holmboe ES, Hawkins RE, Huot SJ. Direct observation of competence training: a randomized controlled trial. Ann Intern Med 2004;140:874-81.
19. Modi JN, Anshu, Gupta P, Singh T. Teaching and assessing clinical reasoning skills. Indian Pediatr 2015;52:787-94.
20. Tan J, Tengah C, Chong VH, Liew A, Naing L. Workplace based assessment in an Asian context: trainees’ and trainers’ perception of validity, reliability, feasibility, acceptability, and educational impact. J Biomed Educ 2015;2015: Article ID 615169, 8 pages. https://doi.org/10.1155/2015/615169
21. Brinkman WB, Geraghty SR, Lanphear BP, Khoury JC, Gonzalez del Rey JA, DeWitt TG, Britto MT. Effect of multisource feedback on resident communication skills and professionalism: a randomized controlled trial. Arch Pediatr Adolesc Med 2007;161:44-9.
22. Luft J, Ingham H. The Johari Window: a graphic model for interpersonal relations. Hum Relat Train News 1961;5:6-7.
23. van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programs. Med Educ 2005;39:309-17.
24. Govaerts M. Workplace-based assessment and assessment for learning: threats to validity. J Grad Med Educ 2015;7:265-7.
25. Stalmeijer RE, Dolmans DH, Snellen-Balendong HA, van Santen-Hoeufft M, Wolfhagen IH, Scherpbier AJ. Clinical teaching based on principles of cognitive apprenticeship: views of experienced clinical teachers. Acad Med 2013;88:861-5.
26. ten Cate O, Scheele F. Viewpoint: competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med 2007;82:542-7.

