-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1016/j.jemermed.2015.06.072" target="_blank" rel="noreferrer noopener">http://doi.org/10.1016/j.jemermed.2015.06.072</a>
Pages
128–134
Issue
1
Volume
50
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
The National Emergency Medicine Fourth-year Student (M4) Examinations: Updates and Performance.
Publisher
An entity responsible for making the resource available
The Journal of Emergency Medicine
Date
A point or period of time associated with an event in the lifecycle of the resource
2016
2016-01
Subject
The topic of the resource
*Education; assessment; CDEM; Clinical Clerkship; Clinical Competence; Education; Educational Measurement; Educational Measurement/*methods; Emergency Medicine – Education; Emergency Medicine/*education; examination; Humans; Medical; medical student; Undergraduate
Creator
An entity primarily responsible for making the resource
Heitz Corey R; Lawson Luan; Beeson Michael; Miller Emily S
Description
An account of the resource
BACKGROUND: Version 1 (V1) of the National Emergency Medicine Fourth-year Student (EM M4) Examination was released in 2011 and revised along with release of V2 in 2012. Each examination contains 50 multiple-choice questions designed to assess knowledge in the EM M4 clerkship curriculum. Development and initial performance data were described previously. OBJECTIVE: To provide updated V1 performance data, describe development and revision of V2, and to compare performance between academic years and examination forms, and within academic years. METHODS: Examinations are administered at www.saemtests.org with ongoing performance data provided. After 1 year of use, nine questions on V2 were revised, five because of low discriminatory ability and four because of excessive difficulty. Revision or replacement was done in accordance with the National Board of Medical Examiners (NBME) Item Writing Guidelines. Mean scores were compared for V1 between academic years (i.e., July 2011-June 2012 vs. July 2012-June 2013), V2 compared with V1, and for each examination version for early and late test takers. RESULTS: V1 has been administered >10,000 times since its release, and the current form mean is 81.5% (SD 3.7). Average discriminatory value (rpb) is 0.204. V2 has been administered >1500 times, with a mean score of 78.4% (SD 4.4) and average rpb 0.253. V1 and V2 current means differ statistically. Scores from examinees completing V1 or V2 early vs. late in the academic year differ statistically. CONCLUSIONS: Performance data for V1 remain stable after 2 years. Revisions of poorly performing questions improved question performance on V2. Questions with low rpb or low pdiff will continue to be revised annually. While examination forms differ statistically, the practical utility of the differences is not defined.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1016/j.jemermed.2015.06.072" target="_blank" rel="noreferrer noopener">10.1016/j.jemermed.2015.06.072</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
Tags
*Education; 2016; assessment; Beeson Michael; CDEM; Clinical Clerkship; Clinical Competence; Department of Emergency Medicine; Education; Educational Measurement; Educational Measurement/*methods; Emergency Medicine – Education; Emergency Medicine/*education; examination; Heitz Corey R; Humans; Lawson Luan; Medical; medical student; Miller Emily S; NEOMED College of Medicine; The Journal of Emergency Medicine; Undergraduate
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.5811/westjem.2014.11.24189" target="_blank" rel="noreferrer noopener">http://doi.org/10.5811/westjem.2014.11.24189</a>
Pages
138–142
Issue
1
Volume
16
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Correlation of the NBME advanced clinical examination in EM and the national EM M4 exams.
Publisher
An entity responsible for making the resource available
The Western Journal of Emergency Medicine
Date
A point or period of time associated with an event in the lifecycle of the resource
2015
2015-01
Subject
The topic of the resource
Humans; United States; Prospective Studies; Linear Models; Emergency Medicine/*education; Clinical Competence; Educational Measurement/*methods; *Clinical Clerkship; Human; Descriptive Statistics; Multicenter Studies; Data Analysis Software; Academic Performance; Undergraduate; Medical; *Education; Linear Regression; Emergency Care – Education
Creator
An entity primarily responsible for making the resource
Hiller Katherine; Miller Emily S; Lawson Luan; Wald David; Beeson Michael; Heitz Corey; Morrissey Thomas; House Joseph; Poznanski Stacey
Description
An account of the resource
INTRODUCTION: Since 2011 two online, validated exams for fourth-year emergency medicine (EM) students have been available (National EM M4 Exams). In 2013 the National Board of Medical Examiners offered the Advanced Clinical Examination in Emergency Medicine (EM-ACE). All of these exams are now in widespread use; however, there are no data on how they correlate. This study evaluated the correlation between the EM-ACE exam and the National EM M4 Exams. METHODS: From May 2013 to April 2014 the EM-ACE and one version of the EM M4 exam were administered sequentially to fourth-year EM students at five U.S. medical schools. Data collected included institution, gross and scaled scores and version of the EM M4 exam. We performed Pearson's correlation and random effects linear regression. RESULTS: 305 students took the EM-ACE and versions 1 (V1) or 2 (V2) of the EM M4 exams (281 and 24, respectively) [corrected]. The mean percent correct for the exams were as follows: EM-ACE 74.9 (SD-9.82), V1 83.0 (SD-6.39), V2 78.5 (SD-7.70) [corrected]. Pearson's correlation coefficient for the V1/EM-ACE was 0.53 (0.43 scaled) and for the V2/EM-ACE was 0.58 (0.41 scaled) [corrected]. The coefficient of determination for V1/EM-ACE was 0.73 and for V2/EM-ACE 0.71 (0.65 and 0.49 for scaled scores) [ERRATUM]. The R-squared values were 0.28 and 0.30 (0.18 and 0.13 scaled), respectively [corrected]. There was significant cluster effect by institution. CONCLUSION: There was moderate positive correlation of student scores on the EM-ACE exam and the National EM M4 Exams.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.5811/westjem.2014.11.24189" target="_blank" rel="noreferrer noopener">10.5811/westjem.2014.11.24189</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
Tags
*Clinical Clerkship; *Education; 2015; Academic Performance; Beeson Michael; Clinical Competence; Data Analysis Software; Department of Emergency Medicine; Descriptive Statistics; Educational Measurement/*methods; Emergency Care – Education; Emergency Medicine/*education; Heitz Corey; Hiller Katherine; House Joseph; Human; Humans; Lawson Luan; Linear Models; Linear Regression; Medical; Miller Emily S; Morrissey Thomas; Multicenter Studies; NEOMED College of Medicine; Poznanski Stacey; Prospective Studies; The Western Journal of Emergency Medicine; Undergraduate; United States; Wald David
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1097/00001888-199407000-00016" target="_blank" rel="noreferrer noopener">http://doi.org/10.1097/00001888-199407000-00016</a>
Pages
583–587
Issue
7
Volume
69
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Using chart reviews to assess residents' performances of components of physical examinations: a pilot study.
Publisher
An entity responsible for making the resource available
Academic Medicine: Journal of the Association of American Medical Colleges
Date
A point or period of time associated with an event in the lifecycle of the resource
1994
1994-07
Subject
The topic of the resource
Adult; Ambulatory Care; Clinical Competence; Educational Measurement/*methods; Female; Humans; Internship and Residency/*standards; Male; Ohio; Physical Examination/*standards; Pilot Projects; Program Evaluation
Creator
An entity primarily responsible for making the resource
Ognibene A J; Jarjoura D G; Illera V A; Blend D A; Cugino A E; Whittier F C
Description
An account of the resource
PURPOSE: To evaluate chart review as a method of assessing residents' performances of physical examinations in an ambulatory care setting. METHOD: In 1992, nurse authors at the Affiliated Hospitals at Canton of the Northeastern Ohio Universities College of Medicine assessed whether 22 internal medicine residents performed ten components of the physical examination by interviewing patient volunteers immediately after the patients' examinations. A total of 89 patient interviewees were included in the analysis; these patients were all new outpatients who had been scheduled for initial visits to obtain complete histories and physical examinations. Charts for the same patients were then retrospectively reviewed. The residents and faculty were blinded to both the chart reviews and the interviews. Statistical methods used were Pearson correlational analysis and variance-component analysis. RESULTS: The interviews and chart reviews showed 81% agreement in component performance. Completeness of the physical examination (whether measured by chart review or interview) did not correlate with other standard methods of resident evaluation, and completeness did not show a significant association with characteristics of the residents and patients. Two of the 22 residents assessed were identified as having completeness scores so low as to be unsatisfactory. CONCLUSION: That residents were identified as failing to perform examination components suggests that chart reviews, especially when independently verified by patient interviews, may be a useful evaluation tool for identifying inadequate performance of components of the physical examination and may identify the need for remediation.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1097/00001888-199407000-00016" target="_blank" rel="noreferrer noopener">10.1097/00001888-199407000-00016</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
Tags
1994; Academic Medicine: Journal of the Association of American Medical Colleges; Adult; Ambulatory Care; Blend D A; Clinical Competence; Cugino A E; Department of Internal Medicine; Educational Measurement/*methods; Female; Humans; Illera V A; Internship and Residency/*standards; Jarjoura D G; Male; NEOMED College of Medicine; Ognibene A J; Ohio; Physical Examination/*standards; Pilot Projects; Program Evaluation; Whittier F C