How general pediatricians learn procedures: implications for training and practice

ABSTRACT The Accreditation Council for Graduate Medical Education (ACGME) requires General Pediatricians (GPeds) to learn thirteen procedures during training. However, GPeds infrequently perform these procedures in practice. We sought to determine: 1) how GPeds learned procedures, 2) whether GPeds self-reported achieving competence in the required ACGME procedures during training, and 3) whether GPeds maintained these skills into practice. We conducted this mixed methods study from 2019 to 2020. Fifty-one GPeds from central Ohio and the American Board of Pediatrics General Examination Committee were recruited via email or snowball sampling and participated in semi-structured, recorded phone interviews probing procedural performance during training and current practice. Participants represented varied geographic regions and clinical settings. We employed Sawyer's 'Learn, See, Practice, Prove, Do, Maintain' mastery learning pedagogical framework as a lens for thematic analysis. Participants did not demonstrate competence in all ACGME-required procedures during training, nor did they sustain procedural skills in practice. Most participants learned procedures through a 'see one, do one' apprenticeship model. GPeds reported never being formally assessed on procedural competence during residency. All GPeds referred out at least one procedure. GPeds also believed that skill maintenance was unwarranted for procedures irrelevant to their current practice. GPeds did not sufficiently demonstrate competence in all ACGME-required procedures during training, which may partly explain why they infrequently perform some procedures. Alternatively, these required procedures may not be relevant to their practice. Pediatric residency procedures education might consider using mastery learning for practice-specific procedures and surface-level methods (learning without mastery) for other skills.


Introduction
Therapeutic procedures, once considered the surgeon's exclusive domain, are now practiced in almost all medical specialties [1]. Even though general pediatrics is not known to be a 'procedures-heavy' discipline, general pediatricians (GPeds) are expected to provide a medical home for their patients, which includes performing common procedures safely and effectively [2][3][4][5][6]. For nearly 30 years, common pediatric procedures have been incorporated into residency program curricula in the USA [7]. In 2006, the Accreditation Council for Graduate Medical Education (ACGME) introduced procedure requirements: compulsory skills that residents must competently perform prior to graduation. Competence is 'individual characteristics (knowledge, abilities and attitudes) that allow a person to practice an activity in an autonomous fashion, to continuously improve practice and to adapt to a rapidly [changing] environment' [8]. Demonstrating competence requires a trainee to 'show how' they are sufficiently skilled to perform a procedure [9]. While the ACGME requires that graduates demonstrate competence in required procedures, it leaves it to the discretion of residency programs to determine core methods of competence assessment, often with input from program-specific clinical competency committees [10]. Table 1 displays the most recent list of ACGME required procedures for pediatric residents [11].
Learning procedures under mastery learning models is effective both in procedure-heavy disciplines, such as surgery, and in primary care specialties such as internal medicine, pediatrics, or obstetrics and gynecology [12][13][14][15]. Mastery learning models that incorporate deliberate practice, the process of performing the skill under direct observation of an instructor with immediate formative feedback, are favored over traditional frameworks such as the common apprenticeship 'see one, do one' technique, which requires learners to demonstrate only rudimentary skill [16,17]. Deliberate practice builds learners' skills until they have achieved a level of performance comparable to that of a master. Training to mastery levels of performance is effective because it reliably predicts future skill performance, resulting in optimal health care outcomes, whereas minimum competency learning, a potential outcome of 'see one, do one' education, leads to rapid skill performance deterioration (decay) [18][19][20].
Sawyer's 'Learn, See, Practice, Prove, Do, Maintain' pedagogical framework is one such mastery learning model for learning procedures (Figure 1) [21]. Prior to 'seeing' a procedure, this model requires trainees to first learn the foundational underpinnings of what a procedure entails (the steps, indications, contraindications, complications, pain management, post-procedure care, and interpretation of applicable results) through readings, didactics, or web-based modules [11,21]. Then, trainees see demonstrations of the procedure and engage in efforts to model the collective tasks that comprise the procedure through deliberate practice and feedback [17]. At the point during the deliberate practice stage where individual learners believe they are ready, the learner advances to the 'prove' phase of Sawyer's model. Once they have demonstrated competence at the 'prove' phase, they are considered qualified to perform the procedure under supervision in the clinical setting. As learners progress through these stages, direct supervision gradually decreases until learners reach unsupervised performance. The final phase of Sawyer's framework involves the maintenance of procedural skills over time.
From prior studies, we know that GPeds do not perform most of the currently required ACGME procedures in their practice for a variety of reasons, including lack of supplies or personnel, time constraints, decreased self-confidence, decreased clinical opportunities, or reimbursement barriers [22][23][24]. GPeds also suggested that they may not have learned all procedures to a level of competence during training [24]. We wondered whether current GPeds' disinclination to perform procedures was associated with the latter, particularly how they learned procedures and whether they received procedural competency assessments during residency. Using Sawyer's model as a conceptual framework for evaluating procedural learning, the objectives of this study were to determine: 1) how GPeds learned procedures, 2) whether GPeds self-reported achieving competence in the required ACGME procedures during training, and 3) whether GPeds maintained these skills into practice.

Setting and participants
The target population was GPeds who completed residency and currently practice in the USA. We drew our sample from two sources: a database containing the names and contact information for GPeds practicing throughout central Ohio [22,23] and the roster for the American Board of Pediatrics (ABP) General Pediatrics Examination Committee. The central Ohio GPeds database represented a range of practicing GPeds and allowed us to effectively sample GPeds who practiced in urban, suburban, and rural settings. Oversampling GPeds from rural settings was particularly critical for us since these practitioners generally self-report performing more procedures than those in suburban and urban settings [22]. We used snowball sampling from the central Ohio GPeds database to increase rural GPeds representation [24]. The inclusion of members of the ABP General Examination Committee provided more breadth of practicing GPeds throughout the USA, as the ABP chooses pediatricians from academic and community settings and diverse training backgrounds. We ensured that sampled GPeds practiced across urban, suburban, and rural geographic settings as defined by proximity to a Level 1 or 2 pediatric trauma center (PTC) (Urban: <10 miles; Suburban: 10-30 miles; Rural: >30 miles from a PTC). We sent central Ohio GPeds a postcard, followed by a telephone call or email to schedule an interview. Members of the General Pediatrics Examination Committee were recruited through an email from the ABP.

Design, measures, and data collection
We employed a convergent parallel mixed methods design [25,26] with an original semi-structured interview combined with a close-ended question instrument (Appendix A). Specifically, we asked GPeds about their residency training and how they learned and performed the 13 required ACGME procedures, their level of skill to perform these procedures at graduation and their current skill level, their opinions about maintenance of procedural skill over the course of their careers, how they manage procedures in their current practice, and their practice type and distance from a Level 1 or 2 PTC. We piloted these instruments among general pediatricians who did not participate in the study to ensure clarity and understanding. We conducted interviews until we achieved thematic and geographic sufficiency [26][27][28]. We provided a $50 MasterCard gift card to each interviewee for their participation. IRB Statement: The study was deemed exempt by the institutional review board.

Data analysis
We used Sawyer's model as a framework for qualitative data interpretation and Braun and Clarke's six steps of thematic analysis to guide our analytic approach [29].
Our qualitative analysis involved intensive group discussion among three study team members (MSI, DPW, CBL) for familiarizing us with the data, generating initial codes across all data, identifying and reviewing themes, and creating a thematic map prior to developing a codebook. These study team members met to code five randomly selected transcripts, following Braun and Clarke's six-step process. After development of the codebook, we divided the remaining transcripts such that two of these three study members coded each individual transcript. When the two coders disagreed, a discussion was held with all three members to negotiate agreement. All study authors reviewed the final themes to ensure appropriate representation.
For the closed-ended questions, we converted responses to numbers and analyzed them with descriptive statistics (means, standard deviations, frequencies, and percentages). We used paired t-tests to compare GPeds' self-reported level of skill performance at residency graduation to their current performance. These data were analyzed using SPSS (IBM Corp. Released 2019. IBM SPSS Statistics for Windows, Version 26.0. Armonk, NY: IBM Corp).
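To illustrate the paired comparison described above, the sketch below computes a paired t statistic and Cohen's d for one procedure, using only the Python standard library. The ratings are entirely hypothetical (the study data are not reproduced here), and the function name `paired_t` is our own; the study itself used SPSS for this analysis.

```python
import math
from statistics import mean, stdev

def paired_t(grad, now):
    """Paired t-test on two equal-length rating lists.

    Returns (t statistic, degrees of freedom, Cohen's d for paired data).
    """
    diffs = [g - n for g, n in zip(grad, now)]
    n = len(diffs)
    d_mean, d_sd = mean(diffs), stdev(diffs)
    t = d_mean / (d_sd / math.sqrt(n))   # paired t statistic
    cohens_d = d_mean / d_sd             # effect size of the mean difference
    return t, n - 1, cohens_d

# Hypothetical 5-point self-ratings: at graduation vs. current practice
grad = [4, 5, 3, 4, 5, 4, 3, 5]
now  = [2, 3, 2, 3, 4, 2, 2, 3]
t, df, d = paired_t(grad, now)
```

A positive t with a large Cohen's d would correspond to the pattern reported in this study: self-rated preparedness significantly lower now than at graduation, with moderate to large effect sizes.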

Results
From June 2019 to January 2020, 51 GPeds (40 from central Ohio and 11 from the ABP) completed the survey and interview. Interviews averaged 33 minutes in length (range: 23-96 minutes).

Participant and practice characteristics
Most of the participants were female (64.7%; n = 33) and more than half (56.8%; n = 29) had been practicing for greater than 11 years. Most (80.4%; n = 41) worked full time and practiced an average of 30.9 miles (SD = 28.4) from a tertiary medical care center. Participants represented 26 different residency programs throughout the USA. There were no differences in themes extracted between central Ohio GPeds' and ABP General Examination GPeds' responses. Three participants reported completing more procedures because of their training location (one in an urban setting and two in a rural setting). Otherwise, there were no associations between procedures learned and residency program setting. Practice characteristics of participants varied widely, and there was equal representation of urban, suburban, and rural GPeds. Participants represented a diversity of practice types, with most (n = 41 of 51; 80.4%) spending an average of 80.5% of their time in an office/private practice setting (Table 2). Every GPed said that they currently refer at least one procedure to emergency departments, hospitals, or outpatient subspecialists, regardless of their own practice type (academic medical center, hospital, private practice).

Reported procedural learning experiences and preparedness
Nearly 10% of participants described having little experience in performing the 13 required ACGME procedures, aside from umbilical catheter placements and lumbar punctures. More than 50% of the participants said they never gave immunizations, placed a temporary splint, or inserted a peripheral IV during training. None of the participants reduced a dislocation other than a nursemaid's elbow. Table 3 shows the number of GPeds who had limited experience for each required procedure with quotations that highlight their lack of experience in learning these skills.
GPeds also reported feeling less prepared in practice than they did at graduation across five specific procedures: neonatal endotracheal intubations, umbilical catheter placements, lumbar punctures, simple laceration repairs, and peripheral IV placements (Table 4). For these five procedures, not only were their self-reported ratings of preparedness significantly lower now than their personal ratings at graduation, but the effect sizes related to these differences were moderate to large [30].
A number of GPeds reported that a portion of their practice is performed in emergency department (ED) (n = 7 of 51; 13.7%), labor and delivery (L&D) (n = 4 of 51; 7.8%), and newborn nursery (n = 14 of 51; 27.5%) settings. While these GPeds spent less than 40% of their time in these acute care settings (38%, 20%, and 28% respectively), they were more likely to report performing common procedures such as reductions of dislocations (ED), incision and drainage of abscesses (ED), and bag mask ventilation (L&D, newborn nursery). The most common non-ACGME procedure performed was circumcision (newborn nursery). These same GPeds said that they learned these specific procedures either on-the-job by observing colleagues or through formal life support skill classes (Pediatric Advanced Life Support or Neonatal Resuscitation Program).

How does GPeds procedures education compare to Sawyer's pedagogical framework?
We present our analyses of how GPeds' learned procedures during residency through the lens of Sawyer's model. Table 5 displays relevant themes and quotations for each component of this model.

GPeds did not prove competence in training
None of the participants graduated from residency programs that provided structured summative assessment opportunities for their residents to prove procedural competence. Almost all programs used procedure logs as an indirect measure of competence, sometimes with a minimum number of required encounters per procedure. Some programs used self-reported procedure logs, while others required a supervisor to sign off that they observed the procedure. Most participants did not feel that procedure logs were defensible evidence of procedural competence, particularly since data could be fabricated.

GPeds did not learn or teach back procedures prior to 'doing'
Very few participants learned about a procedure through lectures or readings prior to being shown the procedure. Some participated in simulations; however, these were most often part of certification programs outside of residency, such as Pediatric Advanced Life Support (PALS) or the Neonatal Resuscitation Program (NRP). None of the participants reported 'teaching back' as part of learning procedures, a fundamental part of the 'see' step in Sawyer's education model. Instead, they reported simply jumping to the 'do' portion of the model to perform the procedure on a child with or without supervision.

GPeds did not engage in deliberate practice
With the exception of PALS or NRP courses taken during or after training, the participants did not commonly engage in deliberate practice during training, whether in a simulated or clinical environment. Some reported receiving feedback and coaching while performing procedures on live patients, but were not required to prove that they were competent prior to performing the procedure in the clinical setting. Others simply avoided performing procedures they did not feel competent to perform.

GPeds did not participate in graduated supervision
Participants did not mention graduated supervision as part of their learning. Some participants took almost complete responsibility for learning procedures on their own. Others mentioned that, depending on their future professional plans, they pursued additional opportunities to learn and practice procedures outside of the residency program, such as moonlighting, elective rotations, or external courses. Even in those settings, they reported not learning these skills through graduated supervision.

GPeds did not maintain all skills and do not desire formal skill maintenance
Only two participants pursued post-graduate training workshops in procedural skills. A few GPeds commented that they have let their life support certifications lapse since these were not needed in their current medical practices. When asked how they would 'relearn' a procedure if they happened to relocate or change job types, most said that they would prefer to shadow or be observed by peers as opposed to attending training workshops. They wanted their employer or local medical center to ensure they were credentialed and/or to complete 'check-offs' for competency. Furthermore, most participants voiced concerns about introducing formal procedural competence assessments as a component of maintenance of certification (MOC). Participants stated that since GPeds' scope of practice varied so broadly, governing medical bodies and boards could not fairly implement requirements. In addition, they reported that requiring procedural MOC would be costly, burdensome, and add to the ever-growing list of administrative requirements that occupy a physician's time. Above all, GPeds believed that procedural MOC is unnecessary since not all procedures are relevant to the practice of all GPeds.

Discussion
Using Sawyer's framework as a lens, our results suggest that although GPeds had some experience with most required procedures, they never had the opportunity to learn all these procedures to the level of competence during training. Moreover, residency training programs did not formally assess procedural competence, so many GPeds graduated without documented evidence, or the self-perception, that they were proficient at performing the procedures we asked them about in this study. Recent studies are consistent with these findings. For example, a recent survey of graduating pediatric residents showed that 33.3% may not have completed one or more of the required procedures successfully in training [31]. Program directors have also reported that pediatric residents fail to achieve competence in procedures such as venipunctures, neonatal endotracheal intubations, and administering immunizations [32]. Furthermore, although participants reported completing procedure logs, such documentation is designed for program evaluation and not for providing evidence of individual competence [24,33,34].
From prior studies, we know GPeds believe procedures are an important part of their residency education for five reasons. They want to be: 1) adaptable to potential changes in their type of practice or practice location, 2) prepared for emergencies in which a life-saving procedure is needed, 3) sufficiently knowledgeable to describe procedures to patient families, 4) able to perform the procedure in a situation where they are too far from a PTC, and 5) aligned with conceptualizations of GPeds in forming their professional identity [24]. We also know that not learning procedures during residency training impedes GPeds from performing procedures in practice [24]. Our data showed this to be particularly salient for certain high-risk, low-frequency procedures (endotracheal intubation, lumbar puncture, umbilical catheter placement) that GPeds reported feeling less prepared to perform. Not learning and performing procedures, therefore, made maintenance of skills nearly impossible. In fact, most GPeds suggested that they had not maintained such skills and would have to relearn them if required to demonstrate competence during formal MOC assessment or in practice. Together, these results may at least partially explain GPeds' disinclination to perform some ACGME procedures in practice.
Our results also call into question the need for GPeds to learn the specific 13 procedures currently required by the ACGME. Pediatric healthcare is evolving. Trends toward the use of procedural specialists or procedure technicians, referrals of procedures to pediatric sub-specialists or to emergency departments and urgent care centers, and not receiving adequate procedures education during pediatric residency are all existing barriers to GPeds performing these procedures in their current practice [35,36]. With this in mind, perhaps GPeds do not need to become proficient in all of these procedures during residency training. For example, in this and in other studies, it has been reported that in private practice or office-based settings, the administration of immunizations is done primarily by medical assistants and nurses rather than GPeds [23]. All GPeds interviewed for this study said that they had referred at least one procedure to an emergency department or subspecialist. This suggests that many of these procedures have likely fallen out of the GPed's scope of practice and that graduate medical education should adapt accordingly.

How should pediatric residency programs teach procedures?
We used Sawyer's framework as a gold-standard lens to evaluate procedures education because of its comprehensive focus on procedural steps and skill maintenance to prevent decay. Through this lens and from our GPeds' responses, implementing mastery-based learning more broadly into pediatric procedures education would not necessarily be beneficial, given the investment of time required for mastery learning and the fact that some procedures seem to have fallen out of GPeds' scope of practice. We also found that the setting in which the GPeds in our sample practiced, the type of patients they saw in that environment, and the other types of providers available all determined the specific procedures they needed to perform. Study participants reported being significantly less prepared currently, as compared to graduation from residency, on 5 of 13 procedures (neonatal endotracheal intubation, umbilical catheter placement, lumbar puncture, simple laceration repair, and peripheral intravenous catheter placement), simply because the patients they see in their current practice settings do not need these procedures. Other procedures from our list (giving immunizations or venipuncture) may have been performed in their current practice settings, but were often delegated to other healthcare practitioners. Finally, there seem to be a few procedures, such as splinting, that GPeds were reluctant to perform due to the cost of maintaining supplies in an office-based setting, and others that they do perform that are currently not taught in pediatric residencies. A critical question as we consider training modifications becomes: should procedures education in pediatric residencies consider the type of future practice in which the general pediatrician plans to work? Track-based education or customized procedural training may help to meet the needs of future general practitioners, without overwhelming an already dense pediatric program curriculum.
Such tailored education could differentiate procedures education for individuals who plan on various subspecialty fellowships or practice types. For example, those going into critical care or who plan to practice in a rural, military, or global setting would need to master emergent care procedures such as airway management and rapid intravenous access, whereas those planning on primary care practices in urban or suburban settings would need to master office-based procedures such as bladder catheterization [20,37]. Within each track, mastery of specific required procedures, through clinical practice and/or simulation, and surface-level coverage (learning without the expectation of eventual competent performance) of other less relevant skills would streamline procedures education. This could then ensure that practitioners are learning procedures relevant to their future practice and that those specific procedures are mastered in order to promote maintenance over time and reduce the associated medical errors or adverse patient outcomes that come from deskilling [38,39]. While low clinical volumes may impact the ability to maintain skills, alternative methods of learning, such as rolling refreshers (e.g., just-in-time in situ training sessions), procedural simulations, and semi-frequent post-graduation skill assessment in practice-specific procedures could be adopted [40][41][42].

Further implications of adapting procedural education requirements
If the ACGME and residency programs modify procedure requirements to the future needs of pediatric residents, or if the ACGME eliminates procedural requirements altogether, adaptations will be needed for core educational requirements. Further research using practice pattern analyses could be used to determine which procedures are relevant to each type of practice and each pediatric subspecialty. This would also entail investigating the depth to which each required procedure needs to be learned for various types of clinical settings. In addition, changes to procedural requirements would have significant implications for formalizing standards of practice for how these procedures are referred and to whom, highlighting the role that GPeds play in the medical homes for their patients [43].
The primary limitation in this study was that participants' responses depended on recall of their training experiences. Most (94%) of our interviewees have been in practice for thirty years or less and had some experience during training with all of the procedures we inquired about. Some of those who had been out of training for a long time had difficulty answering questions about when and where they learned specific procedures. However, almost none had difficulty recalling 'how' or 'how well' they learned each procedure. In addition, the majority of the sampled GPeds worked in the central region of the country and completed residency programs in that region (many in the State of Ohio). Accordingly, our findings may not be completely generalizable to other regions of the U.S. or other countries. Finally, GPeds were not asked their beliefs about the optimal methods of procedural learning, and thus, their suggestions for how best to deliver procedures education were not offered.

Conclusions
Accreditation bodies such as the ACGME establish rules and regulations designed to guide residency programs. For pediatrics programs, they have historically recommended or required that residents demonstrate competence in clinical procedures prior to graduation. The GPeds interviewed in our study said that they never learned many of the currently required procedures, nor were they required to demonstrate competence through formal assessment of their skills. This lack of training may partially explain why GPeds infrequently perform these procedures and why they are more likely to refer them to specialists. An alternative, and equally compelling explanation, is that these required procedures simply may not be needed in the practice of most GPeds. Further research is needed to establish the procedural scope of practice for GPeds and pediatric subspecialists. Once this is achieved, procedures education can be tailored to the needs of the pediatrician based on their future practice plans.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Notes on contributor
Dr. Iyer was supported by a grant to the American Board of Medical Specialties Research and Education Foundation from the American Board of Medical Specialties. The funder/sponsor did not participate in the work.

Disclaimer
This content is solely the responsibility of the authors and does not represent the official views of the American Board of Pediatrics or the American Board of Medical Specialties.