Modern medical education requires frequent competency assessment. The Accreditation Council for Graduate Medical Education (ACGME) provides a descriptive framework of competencies and milestones but not standardized instruments to assess and track trainee competency over time. Entrustable professional activities (EPAs) offer a workplace-based method to assess the achievement of competency milestones at the point of care that can be applied to anesthesiology training in the United States.
Experts in education and competency assessment were recruited to participate in a 6-step process using a modified Delphi method with iterative rounds to reach consensus on an entrustment scale, a list of EPAs and procedural skills, detailed definitions for each EPA, a mapping of the EPAs to the ACGME milestones, and a target level of entrustment for graduating US anesthesiology residents for each EPA and procedural skill. The defined EPAs and procedural skills were implemented using a website and mobile app. The assessment system was piloted at 7 anesthesiology residency programs. After 2 months, faculty were surveyed on their attitudes toward the usability and utility of the assessment system. The number of evaluations submitted per month was collected for 1 year.
Participants in EPA development included 18 education experts from 11 different programs. The Delphi rounds produced a final list of 20 EPAs, each differentiated as simple or complex, a defined entrustment scale, a mapping of the EPAs to milestones, and graduation entrustment targets. A list of 159 procedural skills was similarly developed. Results of the faculty survey demonstrated favorable ratings on all questions regarding app usability as well as the utility of the app and EPA assessments. Over the 2-month pilot period, 1636 EPA and 1427 procedure assessments were submitted. All programs continued to use the app for the remainder of the academic year, resulting in 12,641 submitted assessments.
A list of 20 anesthesiology EPAs and 159 procedural skills assessments was developed using a rigorous methodology to reach consensus among education experts. The assessments were pilot tested at 7 US anesthesiology residency programs, demonstrating the feasibility of implementation using a mobile app and the ability to collect assessment data. Adoption at the pilot sites was variable; however, the use of the system was not mandatory for faculty or trainees at any site.