gms | German Medical Science

GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

ISSN 2366-5017

Exams at Medical Faculties - Quality, Responsibility and Perspectives



  • corresponding author Martin R. Fischer - Gesellschaft für Medizinische Ausbildung (GMA), Ausschuss Prüfungen; Universität Witten/Herdecke, Fakultät für Gesundheit, Institut für Didaktik und Bildungsforschung im Gesundheitswesen, Witten, Germany
  • author Matthias Holzer - Gesellschaft für Medizinische Ausbildung (GMA), Ausschuss Prüfungen; Ludwig-Maximilians-Universität München, Medizinische Klinik Innenstadt, Bayerisches Kompetenznetzwerk Lehre in der Medizin, Kompetenzzentrum Prüfungen, Munich, Germany
  • author Jana Jünger - Gesellschaft für Medizinische Ausbildung (GMA), Ausschuss Prüfungen; Universität Heidelberg, Medizinische Fakultät, Medizinische Klinik, Kompetenzzentrum für Prüfungen in der Medizin in Baden-Württemberg, Heidelberg, Germany

GMS Z Med Ausbild 2010;27(5):Doc66

doi: 10.3205/zma000703, urn:nbn:de:0183-zma0007035

This is the English version of the article.
The German version can be found at: http://www.egms.de/de/journals/zma/2010-27/zma000703.shtml

Received: October 24, 2010
Revised: October 26, 2010
Accepted: October 26, 2010
Published: November 15, 2010

© 2010 Fischer et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.en). You are free: to Share – to copy, distribute and transmit the work, provided the original author and source are credited.


Commentary

In 2008 the GMA Committee for Examinations, together with the Competence Centre for Examinations in Medicine in Baden-Württemberg, published a guideline for faculty-internal written assessments during medical school [1]. The implementation of this guideline as a quality standard was subsequently the subject of lively debate. At the request of the Medizinischer Fakultätentag (MFT), a survey of the current state of examinations under faculty responsibility was carried out in Heidelberg and published [2]. It showed that the multiple-choice (MC) format has so far predominated in written examinations and that there is considerable room for improvement with regard to general quality standards for design, item construction, review, implementation, evaluation and the communication of results. To date, only a small minority of subjects use clinical-practical examination formats such as the OSCE, although by now every faculty conducts an OSCE in at least one subject. A comparison of results between the faculties is not possible because the tests given differ in form and content. To ensure quality and improve exchange between faculties, competence networks for medical education were established, first in Baden-Württemberg and then in Bavaria; the subject of examinations plays a central role in both. Both networks, each comprising five medical schools, have surveyed the implementation of the GMA guideline for faculty examinations [3], [4]. We are pleased to note that the guideline is being taken up by all faculties and has already been implemented to a considerable extent. However, as with the MFT survey [2], the picture is complex, with the degree of implementation varying between faculties.

These achievements are encouraging, but the further development of exam quality and of exchange between faculties remains high on the agenda. No one will relieve the medical schools of the need to do this. Setting up further regional cooperation networks between the faculties is desirable: as Baden-Württemberg and Bavaria have shown, such networks can create synergies for developing the exam culture further, alongside all the necessary competition demanded by politics.

One example of synergistic effects is cooperation on exam question preparation and test item review in a shared interdisciplinary database. These efforts are now bearing fruit in many places. As one example of a number of such approaches, we mention the item management system medicine (IMS-Medicine), which is now shared by 15 faculties and coordinated by the competence centre in Heidelberg. If enough high-quality test questions and OSCE stations are available in a database, this lays the basis for cross-disciplinary examination associations and for a voluntary interfaculty comparison of results before the second written state examination, with appropriate options for correction. Question creation and the review process are available centrally in digital form. In future, the publication of exam content could be discussed again, should a sufficiently large number of high-quality questions make this feasible. The black market in old exam papers would then be a thing of the past. Conducting exams digitally is also well established in many places. The development of interdisciplinary standard solutions is an obvious next step and will not be long in coming. One wonders when the written part of the second national examination will first be conducted in digital form by the Institute of Medical and Pharmaceutical Examination Questions (IMPP).

Sensible learning is possible only if knowledge and skills are tested and appropriate feedback is given. Exams thus represent a learning opportunity for teachers and students alike. However, it should be investigated how this feedback needs to be designed so that everyone involved in the exams can learn from it effectively. Formative exam formats make an important contribution here, especially the Progress Test Medicine of the Charité Berlin, which a third of all medical faculties in Germany now use. Such testing tools can make important contributions to students' study achievement, as shown by the contribution on the progress test by Schmidmaier et al. from Munich [5].

The oral-practical part of the second national examination needs to be better structured and standardised. Here, new collaborative efforts between the faculties and the relevant state examination boards are emerging, which is very welcome. Switzerland is currently piloting its oral-practical state examination with 12 to 16 test stations, which will be compulsory for all medical undergraduates from 2011. In contrast to Germany, this will enable comparison across faculties not only of the knowledge assessed in written examinations but also of knowledge and skills. Switzerland has a national catalogue of learning objectives on which the content of the oral-practical state examination is based. With the National Competency-Based Learning Objectives Catalogue in Medicine [6], initiated for medical education in Germany by the MFT and the GMA, 21 working groups are now working to create the conditions for such a nationally structured oral-practical examination. However, some patience will be required before it is completed. Beyond that, the GMA and its examinations committee will in future collaborate more closely with the medical associations to improve the quality of specialist examinations. There is considerable need for further development in this area, especially in comparison with the neighbouring countries Austria and Switzerland, both of which have a substantial qualitative head start in terms of the objectivity, reliability and validity of their specialist examinations.

It is therefore necessary to learn from one another at the faculty, regional, national and international levels in order to establish a continuous examination culture of high quality. Research questions arise in abundance here, and much remains to be done!


References

1. Gesellschaft für Medizinische Ausbildung, Kompetenzzentrum Prüfungen Baden-Württemberg, Fischer MR. Leitlinie für Fakultäts-interne Leistungsnachweise während des Medizinstudiums: Ein Positionspapier des GMA-Ausschusses Prüfungen und des Kompetenzzentrums Prüfungen Baden-Württemberg. GMS Z Med Ausbild. 2008;25(1):Doc74. Available from: http://www.egms.de/static/de/journals/zma/2008-25/zma000558.shtml
2. Möltner A, Duelli R, Resch F, Schultz JH, Jünger J. Fakultätsinterne Prüfungen an den deutschen medizinischen Fakultäten. GMS Z Med Ausbild. 2010;27(3):Doc44. DOI: 10.3205/zma000681. Available from: http://www.egms.de/static/de/journals/zma/2010-27/zma000681.shtml
3. Reindl M, Holzer M, Fischer MR. Durchführung der Prüfungen nach den Leitlinien des GMA-Ausschusses Prüfungen: Eine Bestandsaufnahme aus Bayern. GMS Z Med Ausbild. 2010;27(4):Doc56. DOI: 10.3205/zma000693. Available from: http://www.egms.de/static/de/journals/zma/2010-27/zma000693.shtml
4. Jünger J, Möltner A, Lammerding-Köppel M, Rau T, Obertacke U, Biller S, Narciß E. Durchführung der universitären Prüfungen im klinischen Abschnitt des Medizinstudiums nach den Leitlinien des GMA-Ausschusses Prüfungen: Eine Bestandsaufnahme der medizinischen Fakultäten in Baden-Württemberg. GMS Z Med Ausbild. 2010;27(4):Doc57. DOI: 10.3205/zma000694. Available from: http://www.egms.de/static/de/journals/zma/2010-27/zma000694.shtml
5. Schmidmaier R, Holzer M, Angstwurm M, Nouns Z, Reincke M, Fischer MR. Querschnittevaluation des Medizinischen Curriculums München (MeCuM) mit Hilfe des Progress Test Medizin (PTM). GMS Z Med Ausbild. 2010;27(5):Doc70. DOI: 10.3205/zma000707. Available from: http://www.egms.de/en/journals/zma/2010-27/zma000707.shtml
6. Hahn EG, Fischer MR. Nationaler Kompetenzbasierter Lernzielkatalog Medizin (NKLM) für Deutschland: Zusammenarbeit der Gesellschaft für Medizinische Ausbildung (GMA) und des Medizinischen Fakultätentages (MFT). GMS Z Med Ausbild. 2009;26(3):Doc35. DOI: 10.3205/zma000627. Available from: http://www.egms.de/static/de/journals/zma/2009-26/zma000627.shtml