'Clare, you need a blog' - well Anne-Marie, here is my first attempt!
Curiously, as a medical educator who champions the use of blogs as a pedagogic strategy, I have seldom used one myself. However, recent Twitter conversations about workplace-based assessment (WPBA) in medicine led me to extend and share my thinking about the ongoing debate over the value, or otherwise, of these tools.
I have spent the past six years, post MMC, engaging in rich, often challenging, conversations with educational and clinical supervisors, many of whom decry the learning value of WPBA and resort to meaningless, tick-box practices to meet regulatory requirements. Others, wishing to use the tools as intended, raise anxieties about using the 'full range' of performance judgements, on the basis that the mean is 'above expectation' unless the trainee is experiencing difficulties. It is hardly surprising, then, that Collins' (2010) evaluation of foundation training describes assessment of trainees as 'excessive, onerous and not valued'. The move away from WPBA to 'supervised learning events' is poignant, given that the original intent of WPBA was to capture learning arising from everyday working practices. WPBA were meant to 'script in' time for trainers and trainees to come together, to discuss observed practice and to ensure that feedback was an integral part of working life. I wonder, then, what went wrong?
It seems to me that WPBA embodied a fundamental shift in the culture of medical training, a shift from time-served apprenticeship to time-measured training. Assessments, historically, were significant, loaded affairs, based around high-stakes summative judgements allowing entry into the profession, or otherwise. WPBA, in their fledgling foundation form, were designed to be something very different: a 'diagnostic' tool, illuminating profiles of performance, that could form the basis of a shared 'developmental' conversation. They were designed as formative tools, capturing everyday working practices and tracing development over a trainee's working year. For this reason, the foundation WPBA were individually formative (for learning) and collectively summative (of learning). They used both criterion referencing (pre-determined descriptors of performance) and normative referencing, with the norm being what it was reasonable to expect of a trainee at the END of that stage of training. They were valid because they were designed to sample authentic practice (not assessed performance), and reliable in that they were based on multiple assessments, on multiple occasions, by multiple assessors.
In my experience, the lack of faculty development activity to support the introduction of these tools meant that trainers defaulted to known assessment practices: WPBA were viewed as summative hurdles to overcome, and trainees baulked at any judgement below 'at the level of expectation'. Boxes were ticked; feedback sections left empty. All too soon, WPBA started to shape working practices ('we'd better do all your assessments today, as your time here is nearly up') rather than respond to working practices. All too often, trainers, unaware of the normative benchmark, used the tools to judge performance relative to the stage of training, rather than the end point. All too soon, these skewed assessment practices diluted any potential learning value embedded in the tools.
So where next with WPBA? Given that WPBA are being held onto during specialty training, at least for now, are there ways to retrieve their learning value?
1. Assessors should be very clear about the intended purposes and parameters of the tools they are using. Formative or summative? Norm or criterion referenced? Understanding their purpose will lead to more purposeful use.
2. Royal Colleges should be explicit about the design and development of the tools they are advocating. How has their validity and reliability been assessed? What constitutes 'best practice' in their use? All assessments involve some kind of 'trade-off'; being clear about the limitations of tools is just as important as knowing their strengths.
3. Trainers and trainees should have opportunities to familiarise themselves with tools before they are used; where merited, this may include some shared faculty development activity so the learning value of the tools can be enhanced.
4. Rather than seeing WPBA as a driver for learning, see them as a way of capturing and fostering learning. If you are going to use them, use them authentically and, in so doing, add meaning to their use.
5. Feedback is key: not the rather instrumental 'good, bad, good' routine, but a rich, developmental conversation which explores the ways in which performance might be developed, with 'what happens next' being a far more important topic than 'what just happened'.
Clare.
Dear kind people who commented on the original: sorry, your helpful comments have disappeared. I managed to delete the post and its linked comments while renaming my blog. I have managed to retrieve the original content but not to reattach your comments.
Clare
http://www.blogger.com/blogger.g?blogID=8548648637110176926#publishedcomments/src=dashboard
This link should take you to the original set of comments on this post.
Nope - we can't access this page. You can just put it down to experience, or you could copy and paste the comments yourself. I have only lost comments on Blogger once, when the service went down, but I was able to copy and paste from the email notifications I received.
I'd be happy to share my experience of an NCAS WPBA with any academic interested in evaluating NCAS' practice.
Hi Peter
I would be very interested to learn more about your work. My research activity is focussed on workplace-based learning in medicine, and on the ways in which we develop doctors/dentists as educators. I am particularly interested in 'cultures' of learning and practice. Over the past 5 years I have done a lot of work locally with NHS Trusts and Deaneries, developing the training practices of clinical and educational supervisors. WPBA are a frequent topic of discussion!
Clare
Original comment reinstated, in response to Clare: Thank you. You have compressed a massive amount of thinking into a few hundred words. As a relative outsider to the world of post-graduate training, I'm going to have to ask you to unpack some more of this. "All too often, trainers, unaware of the normative benchmark, used the tools to judge performance relative to the stage of training, rather than the end point. All too soon, these skewed assessment practices diluted any potential learning value embedded in the tools." Does your understanding of what happened come from your research, or is it detailed elsewhere? When this 'skewing' was happening, what did it look like? AM
Morning Anne-Marie. My understanding arises from reading but, more importantly (to me at least), from the lived experience of working with over 1000 clinical and educational supervisors asked to put the tools to use. I can comment with security on the use of the foundation tools, as I use them as a case study of WPBA in workshops/study days. In my experience, very few trainers were aware that the tools require them to make a judgement of a trainee's current performance against what they might expect of an average trainee at the END of that year of training. So they used them relative to expectations at that point in training. This first type of 'skewing' means opportunities to identify profiles of performance (e.g. above expectation on communication skills, below expectation on procedural skills) were lost, as were opportunities to trace development over time, i.e. so the trainee sees trends towards the level of expectation and above. The second skewing I observed relates to the pattern of use of WPBA.
Hi. Thanks for getting out to blog.
I am wondering - do you think that it is WPBAs that are not working? Or is it the way in which they are implemented that is the problem?