Erratum: This article corrects "Correlation of the NBME Advanced Clinical Examination in EM and the National EM M4 Exams".
Emergency Medicine; Credentialing Examinations
[This corrects the article on p. 138 in vol. 16, PMID: 25671023.]
Hiller Katherine; Miller Emily S; Lawson Luan; Wald David; Beeson Michael; Heitz Corey; Morrissey Thomas; House Joseph; Poznanski Stacey
The Western Journal of Emergency Medicine
2015
2015-03
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
<a href="http://doi.org/10.5811/westjem.2015.2.25829" target="_blank" rel="noreferrer noopener">10.5811/westjem.2015.2.25829</a>
Correlation of the NBME Advanced Clinical Examination in EM and the National EM M4 Exams.
Humans; United States; Prospective Studies; Linear Models; Emergency Medicine/*education; Clinical Competence; Educational Measurement/*methods; *Clinical Clerkship; Descriptive Statistics; Multicenter Studies; Data Analysis Software; Academic Performance; Education, Medical, Undergraduate; Linear Regression; Emergency Care – Education
INTRODUCTION: Since 2011, two online, validated exams for fourth-year emergency medicine (EM) students have been available (National EM M4 Exams). In 2013 the National Board of Medical Examiners offered the Advanced Clinical Examination in Emergency Medicine (EM-ACE). All of these exams are now in widespread use; however, there are no data on how they correlate. This study evaluated the correlation between the EM-ACE exam and the National EM M4 Exams. METHODS: From May 2013 to April 2014 the EM-ACE and one version of the EM M4 exam were administered sequentially to fourth-year EM students at five U.S. medical schools. Data collected included institution, gross and scaled scores, and version of the EM M4 exam. We performed Pearson's correlation and random effects linear regression. RESULTS: 305 students took the EM-ACE and version 1 (V1) or 2 (V2) of the EM M4 exams (281 and 24, respectively) [corrected]. The mean percent correct for the exams were as follows: EM-ACE 74.9 (SD 9.82), V1 83.0 (SD 6.39), V2 78.5 (SD 7.70) [corrected]. Pearson's correlation coefficient for the V1/EM-ACE was 0.53 (0.43 scaled) and for the V2/EM-ACE was 0.58 (0.41 scaled) [corrected]. The coefficient of determination for V1/EM-ACE was 0.73 and for V2/EM-ACE 0.71 (0.65 and 0.49 for scaled scores) [ERRATUM]. The R-squared values were 0.28 and 0.30 (0.18 and 0.13 scaled), respectively [corrected]. There was a significant cluster effect by institution. CONCLUSION: There was moderate positive correlation of student scores on the EM-ACE exam and the National EM M4 Exams.
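The abstract's two headline statistics are directly related: for a simple one-predictor linear regression, the coefficient of determination (R-squared) is just the square of Pearson's r, which is why the corrected R-squared of 0.28 matches 0.53 squared. A minimal sketch of that relationship, using hypothetical paired exam scores (not the study's data):

```python
import numpy as np

# Hypothetical paired percent-correct scores for illustration only;
# these are not the study's data.
em_ace = np.array([62.0, 70.5, 74.0, 78.5, 81.0, 85.5, 90.0, 68.0])
em_m4 = np.array([71.0, 76.0, 80.5, 82.0, 86.5, 88.0, 92.5, 74.0])

# Pearson's correlation coefficient r between the two exams.
r = np.corrcoef(em_ace, em_m4)[0, 1]

# For simple (one-predictor) linear regression, the coefficient of
# determination R^2 equals r squared: the share of variance in one
# score explained by the other.
r_squared = r ** 2

print(round(r, 3), round(r_squared, 3))
```

Under this identity, a Pearson's r of 0.53 implies R² ≈ 0.28 and r of 0.58 implies R² ≈ 0.34, consistent with the corrected values above.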
Hiller Katherine; Miller Emily S; Lawson Luan; Wald David; Beeson Michael; Heitz Corey; Morrissey Thomas; House Joseph; Poznanski Stacey
The Western Journal of Emergency Medicine
2015
2015-01
<a href="http://doi.org/10.5811/westjem.2014.11.24189" target="_blank" rel="noreferrer noopener">10.5811/westjem.2014.11.24189</a>
The National Emergency Medicine Fourth-year Student (M4) Examinations: Updates and Performance.
Assessment; CDEM; Clinical Clerkship; Clinical Competence; Education, Medical, Undergraduate; Educational Measurement/*methods; Emergency Medicine – Education; Emergency Medicine/*education; Examination; Humans; Medical student
BACKGROUND: Version 1 (V1) of the National Emergency Medicine Fourth-year Student (EM M4) Examination was released in 2011 and revised along with the release of V2 in 2012. Each examination contains 50 multiple-choice questions designed to assess knowledge in the EM M4 clerkship curriculum. Development and initial performance data were described previously. OBJECTIVE: To provide updated V1 performance data, describe the development and revision of V2, and compare performance between academic years, between examination forms, and within academic years. METHODS: Examinations are administered at www.saemtests.org with ongoing performance data provided. After 1 year of use, nine questions on V2 were revised, five because of low discriminatory ability and four because of excessive difficulty. Revision or replacement was done in accordance with the National Board of Medical Examiners (NBME) Item Writing Guidelines. Mean scores were compared for V1 between academic years (i.e., July 2011-June 2012 vs. July 2012-June 2013), for V2 compared with V1, and for each examination version between early and late test takers. RESULTS: V1 has been administered >10,000 times since its release, and the current form mean is 81.5% (SD 3.7). Average discriminatory value (rpb) is 0.204. V2 has been administered >1500 times, with a mean score of 78.4% (SD 4.4) and average rpb 0.253. V1 and V2 current means differ statistically. Scores from examinees completing V1 or V2 early vs. late in the academic year differ statistically. CONCLUSIONS: Performance data for V1 remain stable after 2 years. Revisions of poorly performing questions improved question performance on V2. Questions with low rpb or low pdiff will continue to be revised annually. While examination forms differ statistically, the practical utility of the differences is not defined.
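The discriminatory value (rpb) cited in the abstract is a point-biserial correlation: for each item, the Pearson correlation between examinees' 0/1 score on that item and their total score. A minimal sketch with a hypothetical item-response matrix (illustration only, not SAEM data):

```python
import numpy as np

# Hypothetical item-response matrix: rows = examinees, columns = items
# (1 = correct, 0 = incorrect). Illustration only, not SAEM data.
responses = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
])

# Each examinee's total score across all items.
total = responses.sum(axis=1)

# Point-biserial discrimination (rpb) for each item: the Pearson
# correlation between the dichotomous item score and the total score.
rpb = np.array([
    np.corrcoef(responses[:, j], total)[0, 1]
    for j in range(responses.shape[1])
])

# Items with low (or negative) rpb discriminate poorly between strong
# and weak examinees, making them candidates for revision.
print(np.round(rpb, 3))
```

In practice the item's own score is often removed from the total (a "corrected" item-total correlation) before computing rpb, to avoid inflating the correlation on short exams.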
Heitz Corey R; Lawson Luan; Beeson Michael; Miller Emily S
The Journal of Emergency Medicine
2016
2016-01
<a href="http://doi.org/10.1016/j.jemermed.2015.06.072" target="_blank" rel="noreferrer noopener">10.1016/j.jemermed.2015.06.072</a>