• Research article
  • Open access
  • Published: 28 November 2007

Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper

  • Edward J Palmer 1,2 &
  • Peter G Devitt 2

BMC Medical Education volume 7, Article number: 49 (2007)


Reliable and valid written tests of higher cognitive function are difficult to produce, particularly for the assessment of clinical problem solving. Modified Essay Questions (MEQs) are often used to assess these higher order abilities in preference to other forms of assessment, including multiple-choice questions (MCQs). MEQs often form a vital component of end-of-course assessments in higher education. It is not clear how effectively these questions assess higher order cognitive skills. This study was designed to assess the effectiveness of the MEQ to measure higher-order cognitive skills in an undergraduate institution.

An analysis of multiple-choice questions and modified essay questions (MEQs) used for summative assessment in a clinical undergraduate curriculum was undertaken. A total of 50 MCQs and 139 stages of MEQs were examined, drawn from three exams run over two years. The effectiveness of the questions was determined by two assessors and was defined by each question's ability to measure higher cognitive skills, as determined by a modification of Bloom's taxonomy, and its quality, as determined by the presence of item-writing flaws.

Over 50% of all of the MEQs tested factual recall. This was similar to the percentage of MCQs testing factual recall. The modified essay question failed in its role of consistently assessing higher cognitive skills whereas the MCQ frequently tested more than mere recall of knowledge.

Construction of MEQs that will assess higher order cognitive skills cannot be assumed to be a simple task. Well-constructed MCQs should be considered a satisfactory replacement for MEQs if the MEQs cannot be designed to adequately test higher order skills. Such MCQs are capable of withstanding the intellectual and statistical scrutiny imposed by a high stakes exit examination.


Problem-solving skills are an essential component of the medical practitioner's clinical ability and as such must be taught, learned and assessed during training. Entire curricula have been re-designed with this concept in mind. Problem-based learning is used in many teaching institutions and has its supporters and detractors. Despite criticism, it is undeniable that what problem-based learning sets out to achieve in terms of encouraging and developing the skills of synthesis, evaluation and problem-solving are valued components of a good medical education. In conjunction with the promotion of these skills, an effective assessment process is required. It has long been recognised that, in the assessment of clinical competence, problem-solving ability has been one of the most difficult areas to measure and quantify [ 1 ]. The modified essay question (MEQ) is one of several tools developed to try to assess this skill [ 2 ].

The MEQ is a compromise between the multiple-choice question (MCQ) and the essay. A well-constructed MCQ will be unambiguous, clearly set to a defined standard and easy to mark (usually automatically), but more often than not it tests little more than recall of fact [ 3 ]. An essay might test higher powers of reasoning and judgement but will be time-consuming to mark and risks considerable variation in standards of marking [ 4 ]. The MEQ is designed to sit between these two test instruments in terms of its ability to test higher cognitive skills and the ease of marking to a consistent standard. The aim of the modified essay question is to measure broadly both the absolute amount of knowledge retained by the candidate and the ability of the candidate to use that knowledge to reason through and evaluate clinical problems. It accomplishes this by providing a clinical scenario that unfolds in a number of stages. Progression through these stages should test the candidate's ability to understand, reason, evaluate and critique.

Construction of appropriate MEQs can be difficult [ 5 ] and a major criticism of this form of assessment is that MEQs often do little more than test the candidate's ability to recall a list of facts and frustrate the examiner with a large pile of papers to be hand-marked [ 6 ].

Although there is evidence to suggest that well constructed MEQs will test higher order cognitive skills [ 5 ], and that they can test different facets of understanding than MCQs [ 7 ], it is reasonable to ask if MEQ assessments in higher education are well constructed and if they are capable of assessing higher order cognitive skills. This paper describes such a study and is designed to gauge the effectiveness of the MEQ as a summative test tool in a clinical course. We have defined the effectiveness of the questions by their ability to measure higher cognitive skills, as determined by a modification of Bloom's taxonomy, and their quality, as determined by the presence of item-writing flaws.

Fourth Year clinical students at the University of Adelaide underwent a written test as part of their overall assessment of performance for a nine-week surgical attachment. The same test instrument was used at the start of the attachment and on completion. The test material consisted of 50 MCQs and three MEQs (a total of 8 stages) and the questions were designed so that both types would cover similar test material. The content, focusing on core material, was matched in both the MCQ and the MEQ components of the examination. The MCQs had one correct answer and four distractors and were constructed to standard guidelines for MCQ construction [ 8 , 9 ].

In addition, the MEQ components of the Final MB BS examination papers for two consecutive years at the University of Adelaide were analysed. The first paper had 15 MEQs with a total of 68 stages; the other had 15 MEQs with a total of 70 stages. The papers for each examination were assembled by one member of Faculty, who gathered contributions from individual clinicians. There was no formal instruction for the contributors on how to construct an MEQ that would assess higher order cognitive skills, and the examination organiser undertook the final review of the submitted material.

In total, 33 MEQs made up of 146 stages were collected for analysis. The MEQs were written by at least 12 separate authors using the standard methodology for developing assessments within the faculty.

Each multiple-choice question was quantified independently as to the level of cognitive skill tested [ 10 ] and its structural validity [ 11 ] by two assessors. Each modified essay question and its individual components were also categorised independently by the two assessors according to the cognitive level measured by each question and its component parts. The assessors discussed their individual assessments and then produced a final grading for each MCQ and MEQ. The inter-rater agreement was calculated using Kappa statistics.
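The agreement statistic for a pair of raters can be sketched as below. Cohen's kappa is assumed here (the paper says only "Kappa statistics" for two assessors), and the gradings in the example are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: summed products of each rater's category proportions
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical Bloom-level gradings (I, II, III) from two assessors
grades_1 = ["I", "I", "II", "II", "III", "I", "II", "I"]
grades_2 = ["I", "I", "II", "III", "III", "I", "I", "I"]
print(round(cohens_kappa(grades_1, grades_2), 2))
```

Kappa values of 0.6-0.8, as reported below, are conventionally read as good agreement beyond chance.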

The data were classified using a modification of Bloom's hierarchy of cognitive learning [ 12 , 13 ]. Three levels were defined and classified as shown in Table 1. Level I covered knowledge and recall of information; Level II covered comprehension and application, understanding and the ability to interpret data; Level III tested problem-solving, the use of knowledge and understanding in new circumstances.

The rating scale shown in Table 2 was used to judge the rigor of the multiple-choice questions according to the presence of any item-writing flaws.

The item-writing flaws were defined as:

Repetition of part of the stem in an option

Use of qualifiers within an option

Complicated or ambiguous stem

Negative questions not clearly stated

Use of double negatives

Absolute options (e.g., never, always, all-of-the-above)

The cover test has been defined as the ability to surmise the answer from the stem of an item alone, with the correct answer and the distractors covered up [ 9 ].
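In this study the flaws above were judged by human assessors, but some of them are mechanical enough that a naive automated screen can illustrate what is being looked for. The rules and names below are our own hypothetical sketch, covering only absolute options, stem repetition, and double negatives:

```python
import re

# Hypothetical keyword screen for a subset of the item-writing flaws
# listed above; the study itself relied on human assessors.
ABSOLUTE_TERMS = ("never", "always", "all of the above", "none of the above")
STOPWORDS = {"the", "a", "an", "of", "in", "is", "to", "and"}

def screen_item(stem, options):
    """Return a list of flaw labels detected in one MCQ."""
    flaws = []
    stem_words = set(re.findall(r"[a-z']+", stem.lower()))
    for opt in options:
        opt_l = opt.lower()
        # Absolute options (e.g. never, always, all-of-the-above)
        if any(term in opt_l for term in ABSOLUTE_TERMS):
            flaws.append("absolute option: " + opt)
        # Crude check for repetition of part of the stem in an option
        shared = (stem_words & set(re.findall(r"[a-z']+", opt_l))) - STOPWORDS
        if len(shared) >= 2:
            flaws.append("possible stem repetition: " + opt)
    # Two or more negations in the stem suggests a double negative
    if len(re.findall(r"\b(?:not|no)\b", stem.lower())) >= 2:
        flaws.append("double negative in stem")
    return flaws
```

A screen like this can only flag surface patterns; judging an ambiguous stem or applying the cover test still requires a human reader.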

Table 3 illustrates an example of the coding of 2 MCQs. Neither of the MCQs in this table displayed item-writing flaws. Item 1 in the table was judged to be testing lower order cognitive skills than item 2.

Table 4 illustrates stages of an MEQ requiring different levels of cognitive skill to answer. The first two items in the table come from the same MEQ. The last item was obtained from a different question.

The assessors showed a close correlation in their assessment of the questions according to the modified Bloom's taxonomy categorisation. The reliability between the two assessors and the final mark was good, with values of Kappa equal to 0.7 and 0.8 for the MCQs and 0.7 and 0.8 for the MEQs.

The overall performances of the MCQs and the MEQs were compared for their ability to test higher cognitive skills (Figure 1 ). Just over 50% of the MCQs in the Fourth Year examination paper focussed only on recall of knowledge and the largest proportion of MEQs also focussed on this low level cognitive skill. A similar proportion of MCQs and MEQs tested middle order cognitive skills and, rather surprisingly, MCQs were better at addressing the highest order cognitive skills compared with MEQs.

Figure 1. Percentage of MCQs and MEQs addressing different Bloom's levels of cognitive skills.

Each of the Final Examination papers for 2005 and 2006 contained 15 MEQs and there were a total of 68 and 70 sections respectively (average 4.5 and 4.7 sections per question). In the 2005 paper 51% of the questions tested factual recall (Bloom level I), 47% tested data interpretation (Bloom level II) and only 2% tested critical evaluation. The pattern was similar for the 2006 paper with 54% testing Bloom level I cognitive skills and the remainder (46%) testing Bloom level II.

The 33 MEQs had an average Bloom categorisation of 1.35 with a standard deviation of 0.4. The distribution is shown in Figure 2 .
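Summary figures of this kind can be derived as in the short sketch below. The stage gradings are invented, and averaging stages within each MEQ before summarising is our assumption, not a method stated in the paper:

```python
from statistics import mean, pstdev

# Hypothetical Bloom levels (I=1, II=2, III=3) for each stage of three MEQs
meq_stage_levels = [
    [1, 1, 2],      # MEQ 1: mostly recall
    [1, 2, 2, 3],   # MEQ 2: mixed
    [1, 1, 1],      # MEQ 3: recall only
]
# Each MEQ is summarised by the mean level of its stages,
# then the mean and SD are taken across MEQs
per_meq = [mean(stages) for stages in meq_stage_levels]
print(round(mean(per_meq), 2), round(pstdev(per_meq), 2))
```

On this toy data the overall mean sits well below 2, mirroring the paper's finding that most MEQ material clustered at the recall end of the scale.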

Figure 2. Number of MEQs at different modified Bloom's taxonomy levels (consensus of two assessors).

The assessors showed a close correlation in their assessment of the multiple-choice questions according to the item writing flaws categorisation. The reliability between the two assessors and the final mark was moderate, with Kappa equal to 0.5 and 0.6.

An analysis of the structural validity of the MCQs showed that 80% passed the cover test and contained no item-writing flaws. Twenty percent of questions were flawed, but most of these flaws were only of a minor nature and only one question out of the fifty was sufficiently flawed to call into question its structural validity.

For an assessment to be effective, there are a number of issues to be considered. Resource considerations are important and may have some impact on the style of exam chosen. True-false, multiple-choice and extended matching questions can be marked automatically and may have a relatively low impact on academic time, compared with the marking of MEQs and essay questions. Based on resource considerations alone, MEQs may be considered an inferior form of assessment, but there are other issues that must be considered.

The reliability and validity of an assessment is vitally important. A reliable assessment will provide consistent results if applied to equivalent cohorts of students. MCQs benefit from high reliability when the set of questions is valid and there are sufficient numbers of questions, as do True-False questions [ 14 ]. MEQs and standard essay questions can have good reliability provided multiple markers are used. Validation should always be carried out regardless of the type of assessment tool used; at a minimum this should include content validity and construct validity. Other measures of validity, such as concurrent and predictive validity, are also relevant but can be far more challenging to determine. The ability of an assessment to discriminate effectively between good and poor candidates, as well as its fidelity, are also important considerations in evaluating an assessment tool.

We have shown that in a standard mid-course multiple-choice examination paper a substantial component of that examination will focus on testing higher cognitive skills. Yet conversely and perversely, in an examination specifically designed as part of the exit assessment process a disproportionately high percentage of modified essay questions did little more than measure the candidates' ability to recall and write lists of facts. This may be inappropriate when it is considered that the next step for most of the examinees is a world where problem-solving skills are of paramount importance. The analysis has shown that it is possible to produce an MCQ paper that tests a broad spectrum of a curriculum, measures a range of cognitive skills and does so on the basis of structurally sound questions. It is important to recognise that these results are from one institution only, and the processes used to design assessments may not be typical of other institutions. The generalizability of the results is also worth considering. In this study there were many authors involved in writing the questions. Although it was not possible to isolate individual authors, at least a dozen individuals were involved, and there was little variation in the overall Bloom categorization of the MEQs. This suggests that the findings of this study may be transferable to other schools.

The apparent structural failure of the MEQ papers was not likely the result of a conscious design decision on the part of those who wrote the questions, but may have been a lack of appreciation of what an MEQ is designed to test. This resulted in a substantial proportion of the questions measuring nothing more than the candidates' ability to recall and list facts.

This relatively poor performance of MEQs has been observed by others. Feletti and Smith [ 15 ] reported using the MEQ as a test instrument in a problem-based curriculum. In their study the percentage of the examination that tested factual recall varied between 11% and 20%, while the components testing problem-solving skills ranged from 32% to 45%. That the proportion of factual recall questions in the current study was higher than that observed by Feletti might well reflect a lack of peer review when the examination was set. The Feletti data showed that as the number of items in the examination increased, the ability to test cognitive skills other than factual recall fell. In other words, the shorter the time available to answer an item, the more likely the material would focus on recall of fact. The University of Adelaide papers allowed 12 minutes a question, or less than 3 minutes per stage. This is considerably less than the 2-20 minutes per item in the Feletti study.

The open-ended question has low reliability [ 15 ] and an examination based on this format is unable to sample broadly. The essay has only moderate inter-rater reliability for the total scores in free-text marking and low reliability for a single problem [ 16 ]. Such an examination is also expensive to produce and score, particularly when measured against a clinician's time. It makes little sense to use this type of assessment to test factual knowledge, which can be done much more effectively and efficiently with the MCQ.

Our study has confirmed the impressions reported by others that MEQs tend to test knowledge as much as they measure higher cognitive skills [ 5 ]. If an MEQ is to be used to its full value it should present a clinical problem and examine how the student sets about dealing with the situation, with the step-wise inclusion of more data to be analysed and evaluated. Superficially, this is what the MEQs in this study set out to do, but when the questions were examined closely, most failed and did no more than ask the candidates to produce a list of facts.

The present study has shown that it is possible to construct a multiple-choice examination paper that tests those cognitive skills for which the MEQ is supposedly the instrument of choice. These observations raise the question of why it is necessary to have MEQs at all, but the potential dangers of replacing MEQs with MCQs must be considered.

It is generally thought that MCQs focus on knowledge recall and MEQs test the higher cognitive skills. When the content of both assessments is matched the MCQ will correlate well with the MEQ and the former can accurately predict clinical performance [ 2 ]. This undoubtedly relies upon a well-written MCQ designed to measure more than knowledge recall.

A good MCQ is difficult to write. Many will contain item-writing flaws and most will do no more than test factual recall. Our study has shown that this does not necessarily have to be the case, but it cannot be assumed that anyone can write a quality MCQ unaided and without peer review.

If MCQs are to be used to replace MEQs or a similar open-ended format, the issue of cueing must be considered. The effect of cueing is usually positive and can lead to a higher mean score [ 17 ]. Conventional MCQs have a cueing effect which has been reported as giving an 11-point advantage compared with open-ended questions. It has been shown that if open-ended questions do not add to the information gained from an MCQ, this difference in the mean score may not matter, particularly if it can lead to the use of a well-structured MCQ testing a broad spectrum of material with an appropriate range of cognitive testing [ 18 ]. Grading could be adjusted to take into account the benefits of cueing.

Other options to improve the testing abilities of the MCQ format are to use extended matching questions and uncued questions [ 19 ]. These have been put forward as advances on the MCQ, but these test formats can be easily misused, with the result that they may end up focusing only on knowledge recall [ 4 , 19 , 20 ].

The criticisms levelled at MCQs are more a judgement of poor construction [ 11 , 21 ], and the present study suggests that a similar criticism should be levelled at MEQs. We would go further and suggest that assessment with well-written MCQs has more value (in terms of broad sampling of a curriculum and statistical validity of the test instrument) than a casually produced MEQ assessment. This is not to suggest that MEQs should never be used, as they do have the capability to measure higher cognitive skills effectively [ 5 ], and there is evidence to suggest that MEQs do measure some facets of problem solving that an MCQ might not [ 7 ].

The measurement of problem-solving skills is important in medicine. MEQs seem ideally suited for this process, but it is possible to use a combination of MEQs and MCQs in a sequential problem solving process, where the ability to solve problems can be separated to some extent from the ability to retain facts [ 22 ]. The computer may be the ideal format for this, and there are examples of problem solving exercises using the electronic format readily available [ 23 ].

When designing an assessment that may consist of MCQs or MEQs, it is important to recognise the potential strengths of both formats. This study has shown that if an MEQ is going to be used to assess higher order cognitive skills, there needs to be a process in place whereby adequate instruction is given to the MEQ authors. If this instruction is not available, and the authors can construct high quality MCQs, the assessment may be better served by containing more MCQs than MEQs. The reduced effort in marking such an assessment would be of benefit to faculties struggling with limited resources.

Apart from its ability to assess appropriate cognitive skills, any assessment instrument should be able to withstand the scrutiny of content and construct validity, reliability, fidelity and at the same time discriminate the performance levels of the cohort being tested. We suggest that a well-constructed peer-reviewed multiple-choice question meets many of the educational requirements and advocate that this format be considered seriously when assessing students. Benefits of automated marking, and potentially high reliability at low cost make MCQs a viable option when writing high stakes assessments in clinical medicine.

References

1. Marshall J: Assessment of problem-solving ability. Medical Education. 1977, 11: 329-34.
2. Rabinowitz HK: The modified essay question: an evaluation of its use in a family medicine clerkship. Medical Education. 1987, 21: 114-18.
3. Epstein RM: Assessment in Medical Education. N Engl J Med. 2007, 356: 387-96. 10.1056/NEJMra054784.
4. Wood EJ: What are extended Matching Sets Questions?. Bioscience Education eJournal. 2003, 1: [ http://www.bioscience.heacademy.ac.uk/journal/vol1/beej-1-2.pdf ]
5. Irwin WG, Bamber JH: The cognitive structure of the modified essay question. Medical Education. 1982, 16: 326-31.
6. Ferguson KJ: Beyond multiple-choice questions: using case-based learning patient questions to assess clinical reasoning. Med Educ. 2006, 40 (11): 1143. 10.1111/j.1365-2929.2006.02592.x.
7. Rabinowitz HK, Hojat M: A comparison of the modified essay question and multiple choice question formats: Their relationships to clinical performance. Fam Med. 1989, 21: 364-367.
8. Haladyna TM, Downing SM, Rodriguez MC: A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ. 2002, 15: 309-334. 10.1207/S15324818AME1503_5.
9. Case S, Swanson D: Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners. 2003.
10. Bloom B, Englehart M, Furst E, Hill W, Krathwohl D: Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. 1956, New York, Toronto: Longmans, Green.
11. Palmer E, Devitt P: Constructing multiple choice questions as a method for learning. Ann Acad Med Singap. 2006, 35: 604-08.
12. Crooks TJ: The Impact of Classroom Evaluation Practices on Students. Rev Educ Res. 1988, 58: 438-81. 10.2307/1170281.
13. Buckwalter JA, Schumacher R, Albright JP, Cooper RR: Use of an educational taxonomy for evaluation of cognitive performance. J Med Educ. 1981, 56: 115-21.
14. Downing SM: True-false, alternate-choice, and multiple-choice items. Educ Meas Issues Pract. 1992, 11: 27-30. 10.1111/j.1745-3992.1992.tb00248.x.
15. Feletti GI, Smith EKM: Modified Essay Questions: are they worth the effort?. Medical Education. 1986, 20: 126-32.
16. Schuwirth LWT, van der Vleuten CPM: ABC of learning and teaching in medicine: Written assessment. BMJ. 2003, 326: 643-45. 10.1136/bmj.326.7390.643.
17. Schuwirth LWT, van der Vleuten CPM, Donkers HHLM: A closer look at cueing effects in multiple-choice questions. Med Educ. 1996, 30: 44-49.
18. Wilkinson TJ, Frampton CM: Comprehensive undergraduate medical assessments improve prediction of clinical performance. Med Educ. 2004, 38: 1111-16. 10.1111/j.1365-2929.2004.01962.x.
19. Veloski JJ, Rabinowitz HK, Robeson MR: A solution to the cueing effects of multiple choice questions: the Un-Q format. Med Educ. 1993, 27: 371-75.
20. Wood TJ, Cunnington JPW, Norman GR: Assessing the Measurement Properties of a Clinical Reasoning Exercise. Teach Learn Med. 2000, 12: 196-200. 10.1207/S15328015TLM1204_6.
21. Collins J: Education techniques for lifelong learning: writing multiple-choice questions for continuing medical education activities and self-assessment modules. Radiographics. 2006, 26: 543-51. 10.1148/rg.262055145.
22. Berner ES, Bligh TJ, Guerin RO: An indication for a process dimension in medical problem-solving. Med Educ. 1977, 11: 324-328.
23. eMedici. Web page accessed 2007, [ http://www.emedici.com ]

Pre-publication history

The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/7/49/prepub


Author information

Authors and affiliations

Centre for Learning and Professional Development, University of Adelaide, Adelaide, Australia

Edward J Palmer

Dept of Surgery, University of Adelaide, Adelaide, Australia

Edward J Palmer & Peter G Devitt


Corresponding author

Correspondence to Edward J Palmer .

Additional information

Competing interests

The author(s) declare that they have no competing interests.

Authors' contributions

PGD conceived of the study. EP and PGD designed, coordinated and carried out the study. EP carried out the statistical analysis. Both authors participated in the preparation of the manuscript and read and approved the final version.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Palmer, E.J., Devitt, P.G. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper. BMC Med Educ 7 , 49 (2007). https://doi.org/10.1186/1472-6920-7-49


Received : 11 April 2007

Accepted : 28 November 2007

Published : 28 November 2007

DOI : https://doi.org/10.1186/1472-6920-7-49


  • Cognitive Skill
  • Structural Validity
  • Examination Paper
  • Factual Recall
  • High Order Cognitive Skill

BMC Medical Education

ISSN: 1472-6920


The MRCGP Study Book, pp 49–124

Modified Essay Question or MEQ

  • T. A. I. Bouchier Hayes LAH, FRCGP, DRCOG ,
  • John Fry OBE, MD, FRCS, FRCGP ,
  • Eric Gambrill MB, BS, FRCGP, D.OBST. RCOG ,
  • Alistair Moulds MB, CH.B, MRCGP, D.OBST. RCOG &
  • K. Young OBE, MB, B.CH, FRCGP, DTM & H, DPH  


The MEQ is an original development by the examiners of the RCGP of the patient-management problem type of examination format widely used in North America and Australasia. The papers are normally based on a real case and the format is the familiar one in which an evolving clinical problem is unfolded stage by stage. The participant is expected to respond in the appropriate manner at each stage: eliciting further information via history or physical examination; speculating on the diagnostic possibilities; ordering relevant investigations; coming to a working diagnosis; advising and counselling the patient and his family; intervening appropriately by mobilising relatives, nursing or social services; referring to specialists or prescribing drugs or aids; and showing an awareness of risk factors in a situation by demonstrating an ability to anticipate problems which might be expected to arise in the future.


Copyright information

© 1981 Update Books Ltd


Hayes, T.A.I.B., Fry, J., Gambrill, E., Moulds, A., Young, K. (1981). Modified Essay Question or MEQ. In: The MRCGP Study Book. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-7174-6_3


What is … a Modified Essay Question?

DOI: https://doi.org/10.3109/01421598909146276

Medical education is moving to a more problem-orientated basis than was the case formerly. The Modified Essay Question has its origins in this movement, being introduced in the late 1960s as one assessment technique more suited to general practice than other traditional assessment methods. In its original form it is a paper exercise based on an evolving situation presented by a patient in primary care. Experience with the technique in different countries is briefly summarised, and its applications to assessment and to teaching are discussed. Despite shortcomings this method appears to be standing up to the test of time.


Use modified essay questions

Affiliation: Professor of General Practice, Department of General Practice, University of Dundee, 166 Nethergate, Dundee, DD1 4DR, UK.

PMID: 24480002 | DOI: 10.3109/01421598009072166

In the second issue of Medical Teacher (1979, 1, 65-70), Professor R. M. Harden gave an overview of assessment in medical education. Here, Professor Knox describes some of the issues involved in one form of assessment-the Modified Essay Question. With careful preparation, this technique can provide a measure of abilities (including attitudes) which cannot easily be assessed by other means. The MEQ can also provide a brisk learning experience in small groups or at a large plenary session.


Related articles

  1. Modified Essay Questions: MEQ (marking guidance)

    You will be asked to mark one or two sub-questions of a modified essay problem. MEQ answers are usually worth 1 or 2 marks, reflecting the expected amount of detail. For a 2-mark answer, markers should decide whether a candidate's answer for that domain is worth 2, 1 or 0 marks.

  2. Constructed Response Items

    Constructed response items (CRIs) are types of questions used to assess higher levels of the cognitive domain such as knowledge synthesis, evaluation, and creation. Many formats of CRIs exist, including long essay questions, short answer questions (SAQs), and modified essay questions (MEQs). The aim of this chapter is to introduce you ...

  3. Modified essay questions: are they worth the effort?

    The method chosen for important examinations strongly influences the nature of student learning. The Newcastle Medical School in Australia developed a 5-year problem-based curriculum and adopted the Modified Essay Question (MEQ) as the main written instrument for assessing students' problem-solving skills. Even with the best of intentions, the MEQ has been abused by its over-use, as shown in this review of annual assessment for years 1, 3 ...

  4. The modified essay question: Its exit from the exit examination?

    Aims: We have undertaken a critical review of the exit examination from the University of Adelaide, focussing on the written components. This examination consisted of an objective structured clinical examination (OSCE), a multiple choice question (MCQ) paper and a modified essay question (MEQ) paper. Methods: The two written papers were assessed for item writing flaws and taxonomic level using modified Bloom's criteria. Curriculum experts independently assessed adequacy of the ...

  5. Short Answer Questions or Modified Essay Questions: More than a Technical Issue

    The aim of the study was to evaluate whether the results of an examination differ when short answer questions (SAQ) or modified essay questions (MEQ) are used. Forty-nine students ...

  6. MEQ (RANZCP examination)

    The exam asks you to respond to 4-6 modified essay questions, worth up to 38 marks each (total 125 marks). You are allowed 150 minutes to complete the exam, including reading time. Eligibility: you are eligible to apply for the MEQ exam after you have completed 18 months FTE training.

  7. Modified essay question

    The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall, and can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format.

  8. The modified essay question

    Comparatively few had a clear idea of what to say to a patient who had one single doubtful attack of disseminated sclerosis. When questioned about the healthy father who asked for a statutory sickness benefit certificate while his wife was in hospital, candidates who habitually gave certificates as an easy way out also failed to take other more important positive steps to ...

  9. Modified Essay Questions (MEQs)

    The previous articles in this series of writing for professional publication focused on the preparation you need to do before starting to write an article, the practicalities of writing the ...