Blog

The hard task of assessing “soft” skills: Notes from a regional seminar

On the 12th of September, the Western Cape SAAHE regional committee hosted a half-day seminar on the topic of assessing “soft” skills. Forty participants spent a few hours discussing different approaches to assessing things like professionalism, interpersonal skills and graduate attributes. This post presents the notes taken during the seminar discussion, with some additional information and links to external sources added after the fact. We share it here in the hope that others may find it useful.


The point was made that these should perhaps not be called “soft” skills, given that “soft” has the unintended connotation of being “unimportant” or “easy”. Should they rather be referred to as: complex skills? Intrinsic roles (e.g. as described in competency frameworks)? Graduate attributes? Critical cross-field outcomes?


Tools that can be used in the context of a course / module / block

  • Assessment of Physiotherapy Practice (APP) tool
    • This is a Physio-specific tool that is used in the clinical practice module at SU, addressing the development and assessment of soft skills. Students tend to score low when they start using it, then get better over time. Principles to think about here are progress testing and the testing effect (for learning theory and for learning skills). For a nice systematic review of a range of clinical practice assessment tools, including the APP, see O’Connor, A., McGarr, O., Cantillon, P., McCurtin, A., & Clifford, A. (2017). Clinical Performance Assessment Tools in Physiotherapy Practice Education: A Systematic Review.
  • Feedback from patients in Linguistics
    • Has the added advantage of boosting students’ confidence but we also need to bear in mind that patients may need to be prepared in advance. In the experience of some, patients tend to be overly complimentary about student performance (e.g. “everything was perfect”).
    • Scope of undertaking: No indication of how many students
  • Longitudinal matrix
    • The UWC Occupational Therapy department developed a matrix over four years considering the following: undergraduate attributes and core competencies; weighted according to the four years; using Bloom’s taxonomy as a guide for levels, e.g. knowing about communication in the first year; application and integration in the fourth year.
    • Scope of the undertaking: About 50 per year level
  • Module handovers
    • At the end of every term or every semester, lecturers sit and hand over what was covered and what still needs to be covered in upcoming modules to bridge gaps in learning, module expectations and what students struggle with. Individual students who are struggling are raised in regular staff meetings
    • Scope of the undertaking: 50 students
  • Create opportunities for students to develop but then don’t actually assess
    • It is questionable to what degree these things can be taught. These opportunities might consist solely of: 1) do something, 2) get feedback on what you did, 3) reflect.

Practicalities of deploying approaches

  • Designing and managing a distributed curriculum
    • Nursing at UWC has interpersonal skills sessions for first-year students, e.g. death and dying; washing a baby – observation of how the student washes the baby, makes eye contact with the baby, etc.
      • Three times a week; four scenarios
      • Uses simulated patients
      • Facilitated by mentors
      • Fizzles out in later years
      • Scope of undertaking: over 1000 students in the programme; 300-350 in first year
    • UCT has two courses that run in the first year and incorporate soft skills: Becoming a Professional and Becoming a Health Professional (all disciplines are working together in these courses)
      • A challenge is whether the principles addressed here are followed through in later years, particularly in the clinical years
      • In addition, there is some evidence that isolated modules like this aren’t very effective at developing what is essentially an integrated skill that must be used across domains
  • Understanding fully what you are trying to help students to learn
    • For example, are communication skills about mastering isiXhosa as a language, or about understanding culturally appropriate communication, e.g. the role of eye contact and body position in communication?
      • We need to understand the role of cultural humility (see here and here for more information) in the context of understanding and assessing soft skills – ensuring that students recognise that difference of any sort has an impact, and that they moderate themselves and their behaviour accordingly
      • Role modelling is very important in this regard
      • Are we assessing things like “punctuality” and “dress code” under the broad name of “professionalism”?
  • Symbolism is important – soft skills should be communicated as being integral rather than peripheral
    • Shouldn’t be “putting it on the side as a different mark” on a rubric
    • All aspects of the curriculum should be integrated; a signal that clinical expertise is necessary but not sufficient
    • The assessment system should be designed in such a way that a student cannot pass on clinical reasoning alone while having poor communication skills or empathy (see below)
    • Using soft skills tools that do not contribute to decision making can be a two-edged sword, e.g. if Nursing students must log 4000 clinical hours, this equates to an impact curriculum: the student focuses on the technique so much that they forget there is a patient in the bed; rote behaviours are emphasised, rather than soft skills, in order to demonstrate the technique (think here about the quote adapted from Rees & Knight, 2007: “Do we want [health care workers] who are professional, or will we settle for [health care workers] who can act in a professional manner?”)
    • It is important to consider the instrument and purpose of the assessment; if the student doesn’t understand that communication is part of the manual technique, then they will not focus on / address / bother with communication skills
  • We could consider using longitudinally collected narrative data obtained from clinicians, patients, peers, etc.
    • We currently compartmentalise information about students because we have modular systems within universities and institutions
      There is little or no longitudinal collation and consideration of data
    • Consider narrative, qualitative approaches to collecting assessment data: maybe each clinician writes a short paragraph about the student’s behaviour, e.g. “the student rolled their eyes a bit every time I spoke to them” – this says something about a student’s professionalism without having to assign a number to a definition of what “professional” means.
      • If, over four years, only three people out of 25 have a problem with an aspect of student behaviour, maybe it’s not an issue; but if 18 out of 25 note issues and make qualitative, open-ended claims about student behaviour, that is a different story
    • An example of a Physiotherapy school in New Zealand collating narrative feedback alongside quantitative data was discussed in some depth in a podcast you can listen to here.
    • The individual preceptor or tutor isn’t burdened with making pass-fail decisions about a student – all they have to do is write a short narrative where they don’t even have to define the behaviour i.e. is this about “communication” or “professionalism”? All they need to do is describe an observed behaviour.
      • lowering the stakes for the preceptors
    • This helps us to evaluate the aggregate picture over time rather than make decisions based on single snapshots of performance
    • A programmatic approach to assessment suggests applying the criteria used to illustrate rigour in qualitative research: trustworthiness, credibility, dependability and confirmability

How do we use the information?

  • Only make decisions with high-resolution data
    • We can’t make defensible decisions about soft skills using tiny amounts of data collected on a single clinical placement
    • Cees van der Vleuten uses the idea of the pixel resolution of the “picture” we have about a student. He suggests that we gather and collate small numbers of pixels over time to generate a high-resolution picture of what is going on.
  • This takes pressure off any one clinician as being “responsible for” a student’s failure
  • Helps avoid the problem of “failure to fail”
    • If I have to make a defensible decision, I’d rather make no decision
    • Then the student reaches the final year and everyone agrees the student shouldn’t be there
  • Each clinician / tutor / preceptor just contributes their 3 or 7 or 4 pixels worth of data towards building the composite picture, without the pressure of pass-fail decision making attached to the data collection
  • These small pixels don’t form the basis for any single decision but are rather summed over the duration of the programme; so, rather than “professionalism” counting for 5% of a single assessment task, you have many tasks where those marks are added up over time.
  • Separate data collection from decision making
    • This kind of principle could have systemic implications
    • For all of the principles we mention, there are implications for programme design, i.e. it’s hard to fit these ideas into our traditional curriculum models
  • Use collective decision making
    • Anonymise the observations; give them to a panel of five clinicians to make a judgement about competent / not competent, e.g. is this someone you would hire? No? Not competent.
    • Is it defensible?
      • If you have 60-70 observations from 23 preceptors over 2 years, it will be (again, consider the qualitative criteria for rigour mentioned above)
      • Once such a system is in place, there are bound to be legal challenges, but we cannot let that dictate what is sound practice – our current system faces legal challenges anyway; it is not as though it is perfect
    • Avoid spending too much time on students who are clear fails or clear passes (i.e. spending hours “agreeing vigorously”); rather focus energy on borderline students, e.g. those within 5% or 1 SEM (or some such measure) either side of the cut score

Incorporating this into assessment programmes and bureaucratic systems

  • This should be incorporated into rules of progression or for the award of a qualification rather than used at module level
  • We need to circumvent the tyranny of an assessment bureaucracy that demands a numerical mark to be entered against a credit-bearing course code on the university’s computer system
  • Options to consider
    • make a “competent” judgement (don’t use numerical scores) for each of a series of soft skills a prerequisite / DP requirement for access to the next year of study or to the exit exams
    • make a “competent” judgement for each of a series of soft skills a subminimum requirement for the award of a qualification (irrespective of scores in other domains)
    • this would ensure that there are meaningful consequences for not being competent in soft skills that are valued in the programme, and ensure that a student who scores 80% overall but is very unprofessional or a very poor communicator does not qualify
    • Could convert letter grades (what the assessor gives, e.g. A = Excellent, B = Good, C = Borderline, D = Weak) to number grades (what the system needs to capture, e.g. 80%, 70%, 60%, etc.)
  • This would require some planning about what kind of remediation to put in place

#6 – A humanistic pedagogy for student support

In this episode, I talk to Dr Mpho Jama about how a humanistic pedagogy could be key to facilitating student success through enhanced support. She suggests that it is in the human relationships between teachers and students that we must look to provide higher, more subtle levels of support for students.

Dr Jama is the head of the Division of Student Learning and Development in the Faculty of Health Sciences at the University of the Free State. Mpho does research on student retention, humanistic pedagogy and qualitative social research. Her PhD thesis is entitled: Designing an academic support and development programme to combat attrition among non-traditional medical undergraduates.

Resources for this conversation

Jama, M. (2017). Applying a humanistic pedagogy to advance and integrate humane values in a medical school environment. Perspectives in Education, 35(1):28-39.

Jama, M. (2016). Academic Guidance for Undergraduate Students in a South African Medical School: Can we guide them all? Journal of Student Affairs in Africa, 4(2):13-24.

Jama, M. (2010). Designing an academic support and development programme to combat attrition. PhD thesis. 10.13140/RG.2.1.1882.5120.

Jama, M., Monnapula-Mapesela, M. & Beylefeld, A.A. (2008). Theoretical perspectives on factors affecting the academic performance of students. South African Journal of Higher Education, 22(5).

Jama, M. & Beylefeld, A.A. (2007). “Thou shallt know thy student”. What pre-university attributes characterised the first-year medical students that were denied examination access in 2007, and what competencies did they lack? Poster presentation.

More of Dr Jama’s work can be found on her ResearchGate profile.


Note: In order to listen to this podcast you will need to install a podcast app on your phone or tablet. iPhones come with one pre-installed and you can choose from a variety of options on Android devices. Open the podcast app, search for “SAAHE” and then subscribe to the podcast. You will now be able to download any of the episodes for offline listening when you’re out and about.

#5 – A critical pedagogy for online learning, with Michael Rowe

Earlier this year the Critical Physiotherapy Network published Manipulating practices: A critical physiotherapy reader. The book is a collection of critical writing from a variety of authors dealing with a range of topics related to physiotherapy practice and education. One of the interesting features of this collection is that it is completely open access, which means that the authors, and not the publishers, have the intellectual property rights to make choices about what is permissible to do with the content of the book. While the entire book is available in different formats, including PDF, HTML, EPUB and XML, there is no audio version.

This SAAHE podcast is a recording of one chapter in the collection, entitled “A critical pedagogy for online learning in physiotherapy education“. We are using the SAAHE blog to experiment with sharing content in different formats, and would love to hear your feedback on whether or not this is something you would like to see more of.


In order to graduate physiotherapy students who are able to thrive in increasingly complex health systems, professional educators must move away from instrumental, positivist ideologies that disempower both students and lecturers. Certain forms of knowledge are presented as objective, value-free, and legitimate, while others – including the personal lives and experiences of students – are moved to the periphery and regarded as irrelevant for professional education. This has the effect of silencing students’ voices and sending the message that they are not in control of their own learning. While the integration of digital technology has been suggested as a means for developing transformative teaching and learning practices, it is more commonly used to control students through surveillance and measurement. This dominant use of technology does little more than increase the cost-effectiveness and efficiency of information delivery, while also reinforcing the rigid structures of the classroom. Physiotherapy educators who adopt a critical pedagogy may use it to create personal learning environments (PLEs) that enable students to inform their own learning based on meaningful clinical experiences, democratic approaches to learning, and interaction with others beyond the professional programme. These PLEs enable exploration, inquiry and creation as part of the curriculum, and play a role in preparing students to engage with the complex and networked systems of the early 21st century. While the potential for pedagogical transformation via the integration of digital technology is significant, we must be critical of the idea that technology is neutral and be aware that our choices concerning tools and platforms have important implications for practice.

#4 – Case based learning, with Corne Postma

In this episode I speak to Corné Postma from the University of Pretoria. We discuss his PhD research where he looked at the use of case-based learning to develop clinical reasoning in undergraduate Dentistry students. Corné used both quantitative and qualitative data to determine that students’ clinical reasoning ability improved after using a case-based approach to learning.

Corné is an Associate Professor in the Department of Dental Management Sciences, School of Dentistry, at the University of Pretoria. He is a specialist in Community Dentistry by training and his primary teaching responsibility lies in the domain of Comprehensive Patient Care, which includes patient communication, patient administration, clinical reasoning and patient management. He is also involved in developing other non-clinical skills such as self-awareness, ethics, professionalism, leadership, team work and health advocacy skills in dental students.

Corné has a very broad clinical research interest, which correlates with the generalist requirement of Comprehensive Patient Care. He has a particular affinity for health professions education research, which is closely linked to the development of different kinds of soft skills in students. His research outputs can be viewed on Google Scholar. Corné is a SAFRI (Sub-Saharan African Foundation for the Advancement of International Medical Education and Research Regional Institute) as well as a TAU (Teaching Advancement at University) fellow.

Resources for this conversation



#3 – Standard setting, with Scarpa Schoeman

In this episode of the SAAHE podcast I speak to Prof. Scarpa Schoeman, Director of Undergraduate Medical Education at the Wits Medical School, Faculty of Health Sciences, University of the Witwatersrand, where he leads and directs the Graduate Entry Medical Programme. Scarpa and I talk about the (almost) universal pass mark (cut score) of 50% and the problems with this as a standard. We also discuss possible alternatives for standard setting that take into account the validity and reliability of the assessment scores, as well as the difficulty of the test.

Scarpa has published a variety of peer-reviewed articles and presented at international conferences on the topic of medical education and assessment. His research interests include assessment and standard setting (the Cohen method in particular), as well as the educational environment for medical students. His clinical interests and practice focus on Emergency Medicine. He also acts as assessment consultant to the Colleges of Physicians, Obstetrics and Gynaecology, Paediatricians and Anaesthetists of South Africa. He is a Fellow of the Higher Education Academy in the United Kingdom and is a part-time tutor in Assessment and Standard setting for the CME at Dundee University.

Resources for this conversation



#2: Mapping exit-level assessment, with Christina Tan

I recently spoke with Christina Tan, a PhD graduate from the University of Stellenbosch, who conducted research into the validity of assessing exit-level outcomes in an undergraduate medical programme at three medical schools.

This is the second in our podcast series on research in health professions education. If you have any suggestions for future conversations, please let us know in the comments.

If you’d like to read more about Christina’s work, here is one of her recent papers: Tan, C., van Schalkwyk, S., Bezuidenhout, J. & Cilliers, F. (2016). Mapping undergraduate exit-level assessment in a medical programme: A blueprint for clinical competence? African Journal of Health Professions Education 8(1):45-49. DOI:10.7196/AJHPE.2016.v8i1.546



#1: Patient-centredness, with Elize Archer

Welcome to a new SAAHE initiative where we have conversations with people doing interesting work in health professions education. In this conversation I talk to Elize Archer, a recent PhD graduate from the University of Stellenbosch. Elize conducted her research on patient-centred approaches to clinical practice among medical students. In our conversation we discuss different aspects of patient-centred practice, how to think about developing this mindset in students, and some of the challenges to its implementation.

You can read more about Elize’s work here: Archer, E. & van Heerden, B. (2016). Undergraduate medical students’ attitudes towards patient-centredness: a longitudinal study. DOI: https://doi.org/10.15694/mep.2017.000161.

We hope that this is the first of many such conversations and your comments and feedback are welcome. In particular, we’d love to hear your suggestions about PhD and group research projects that have the potential to change practice. If you know of anyone doing work that you think would be valuable to be shared more widely, please do let us know. I apologise for the audio quality at times during the recording. This is something that we’ll work on improving in the future. The conversation is just short of 50 minutes. I hope that you enjoy it.



Two presentations on health professions education, by Paul Worley

Last week Prof. Paul Worley, former Dean of the Flinders University School of Medicine in Adelaide, hosted two seminars during his visit to Cape Town. He very kindly agreed to let us record both sessions and share them here. You can also visit the SAAHE Western Cape Facebook page for more opportunities to engage with the videos.

The future of health professions education

The video is about one and a half hours long, but it is well worth the time invested. Please note that the audio recording for this session is not great.


Decentralised clinical training for the health professions


Thank you to colleagues at Stellenbosch University for hosting the sessions and for their efforts in making the recordings available.

New SAAHE President: Prof. Francois Cilliers

We would like to congratulate Prof. Francois Cilliers on his new position as the incoming President of SAAHE and Chair of the National Executive and Council. Francois has been a member of SAAHE from the very earliest days of the organisation, when it was little more than an informal group of like-minded colleagues in the Western Cape, and has always been an active member at the regional and national levels. We have no doubt that SAAHE will continue to grow and develop under his leadership, as a positive force for health professions education in the country.

We would also like to take this opportunity to thank Prof. Gert van Zyl for his enormous contribution to SAAHE as President over the past few years. His commitment and dedication to the organisation have set a high standard indeed, and we wish him all the best for the future.

Consensus Statement on Decentralised Training in the Health Professions

At the closing ceremony of the 2017 SAAHE national conference in Potchefstroom, delegates adopted a conference declaration in the form of the Consensus Statement on Decentralised Training in the Health Professions, which was endorsed by the SAAHE national council.

This statement was the culmination of discussions over the last two years at SAAHE conferences and national workshops, driven by the Stellenbosch University Collaborative Capacity Enhancement with Districts (SUCCEED) project and the Forum for Rural Clinical Education (FORCE), a SAAHE special interest group which is being re-constituted as a special interest group for decentralised education, amongst others. The focus of these discussions has been on the importance and value of decentralised training in terms of transforming teaching and learning and in addressing the human resources for health needs of our country.

The consensus statement positions decentralised training as being part of the solution to the challenges we face in health care, and calls on all those involved – particularly education and service partners – to work together towards developing a shared vision for such training. It is part of a process that includes the development of a framework that will provide practical guidance for implementing decentralised training.

We invite SAAHE members to support the statement, to use it in advocating for decentralised training and to request organisations that you are part of to consider adding their formal endorsement of it. Please let us know about any such endorsements or formal institutional support via the comments field below.

Download the statement.