Prof. Dr. Sylvia Heeneman, DVM PhD Harold Bok, Lubberta de Jong, Suzanne Schut

Maastricht University, School of Health Profession Education, the Netherlands
Utrecht University, Faculty of Veterinary Medicine, the Netherlands

Workshop description

Competency-based education (CBE) calls for a shift in thinking about learning and assessment practices. Increasing requirements regarding summative evaluations in CBE can interfere with the self-direction of learners. The model of Programmatic Assessment (PA) aims to address these problems by integrating and optimizing the learning and decision-making functions of assessment. Key features of PA are that individual assessments primarily serve as learning opportunities and that decisions are made only on the basis of multiple assessment data points aggregated across many assessment occasions.

The implementation and use of programmatic assessment is seen in more and more programmes and seems to be readily embraced by educators. However, tensions between assessment for learning and assessment of learning, safeguarding the quality of assessment across the programme, and other dilemmas need to be considered carefully when implementing such an approach. This workshop will focus on navigating the interface between the learning and assessment functions of the assessment programme.


Intended outcomes

  • Basic understanding of the model of Programmatic Assessment and the underlying theoretical assumptions and principles
  • Awareness of latest findings in research on Programmatic Assessment
  • Key strategies for implementation, motivated by the shared experiences of the workshop facilitators from two institutes and by evidence from the literature
  • Sharing of best practices in undergraduate and postgraduate education
  • Insight into the applicability of Programmatic Assessment in the local context



References

  • van der Vleuten C, Schuwirth L, Driessen E, Dijkstra J, Tigelaar D, Baartman L, van Tartwijk J: A model for programmatic assessment fit for purpose. Medical teacher 2012, 34:205-14.
  • van der Vleuten C, Schuwirth L, Scheele F, Driessen E, Hodges B: The assessment of professional competence: building blocks for theory development. Best Practice & Research Clinical Obstetrics & Gynaecology 2010, 24:703-19.
  • Heeneman S, Schut S, Donkers J, van der Vleuten C, Muijtjens A: Embedding of the progress test in an assessment program designed according to the principles of programmatic assessment. Medical teacher 2017, 39:44-52.
  • Heeneman S, Oudkerk Pool A, Schuwirth L, van der Vleuten C, Driessen E: The impact of programmatic assessment on student learning: theory versus practice. Medical education 2015, 49:487-98.
  • Schut S, Driessen E, van Tartwijk J, van der Vleuten C, Heeneman S: Stakes in the eye of the beholder: an international study of learners’ perceptions within programmatic assessment. Medical education 2018, doi:10.1111/medu.13532.
  • van der Vleuten C, Schuwirth L, Driessen E, Govaerts M, Heeneman S: Twelve tips for programmatic assessment. Medical teacher 2015, 37:641-46.
  • Bok H, Teunissen P, Favier R, Rietbroek N, Theyse L, Brommer H, Haarhuis J, Beukelen P, van der Vleuten C, Jaarsma D: Programmatic assessment of competency-based workplace learning: When theory meets practice. BMC Medical education 2013, 13:123.
  • Bok H, de Jong L, O’Neill T, Maxey C, Hecker K: Validity evidence for programmatic assessment in competency-based education. In submission.
  • de Jong L, Favier R, van der Vleuten C, Bok H: Students’ motivation toward feedback-seeking in the clinical workplace. Medical teacher 2017, 39:954-58.
  • de Jong L, Bok H, Kremer W, van der Vleuten C: Can we provide validity evidence for saturation of information in portfolio assessment? In preparation.