Remote Instruction in Focused Assessment With Sonography for Trauma (FAST) Exams for Surgery Residents: A Pilot Study

Background The Focused Assessment with Sonography for Trauma (FAST) exam is an important component of the evaluation of trauma patients. With advances in technology and limitations on in-person meetings imposed by COVID-19, remote instruction and learning have gained popularity. We sought to determine whether remote instruction of FAST exams was feasible as sustainable surgical education and a possible alternative to traditional in-person teaching. Methods General surgery residents completed a baseline survey and skills assessment on FAST exams and were then randomized to remote or in-person instruction. The remote group participated in an instructional session with a content expert through video conference and then practiced on a simulated mannequin while the expert remotely provided feedback. The in-person group received the same experience with the content expert in the room. Both groups completed a post-course survey immediately after the session and a follow-up survey and objective assessment at six months. Results were compared with two-way analysis of variance (ANOVA). Results Fourteen residents underwent the curriculum, seven in each group. There was a significant increase in self-reported confidence when comparing pre- and immediate post-course results for both the remote and in-person groups. At six months, confidence scores remained elevated and skill assessment scores improved, although the latter did not reach statistical significance. There was no significant difference in post-course results between the groups. Conclusions Remote instruction of FAST exams was feasible. Pilot data demonstrated an increase in confidence and suggest outcomes similar to in-person instruction, which has positive implications for future remote educational and potentially clinical initiatives.

• Although hands-on skills, such as performing a FAST exam, have traditionally been taught in person, it is feasible to offer the same training remotely utilizing video conferencing software.
• Data on subjective and objective outcomes of FAST exam performance among a pilot group of surgical trainees suggest that remote instruction may be non-inferior to in-person instruction.

Introduction
Performing a Focused Assessment with Sonography for Trauma (FAST) exam is an essential component of the clinical armamentarium acquired during surgical residency training. The FAST exam provides rapid assessment of patients at the bedside and has become an integral step in both surgical and critical care management algorithms. Its emergence as a key element in the assessment of traumatic illness is a product of its core features: it is noninvasive, can be performed rapidly, and carries minimal risk of complications. 1 Nonetheless, a significant limitation of this modality is that it is user-dependent; ensuring that users can successfully perform this assessment has necessitated the implementation of formal training curricula within residency programs. Previously published curricula often include didactic instruction, live demonstrations by a content expert, and interactive simulation experiences. 2,3 The quality and accuracy of the FAST exam are expected to increase with both the number of exams performed and the general confidence level of the trainee. The median number of examinations required to ensure user proficiency is reported as 50, though documented variability is high, ranging from 10 to 200. 4,5
Traditionally, this training has been conducted in person, but restrictions posed by the COVID-19 pandemic severely limited gatherings and effectively halted in-person delivery. As in-person gatherings fell, a paradigm shift toward remote conferencing encouraged the continuation of trainee educational endeavors while maintaining a low-risk learning environment. 6-9 Our study examines the implications of a FAST exam training curriculum executed both in the traditional live manner and via teleconference to determine the feasibility and preliminary efficacy of remote learning and mentoring. We hypothesize that remote learning will demonstrate non-inferiority to in-person instruction when teaching general surgery residents to conduct FAST exams.

Methods
After discussion with our Institutional Review Board (IRB), this study was exempted from IRB review as it is a quality improvement project intended to improve trainee ultrasound education rather than experimental human subject research. General surgery residents of all postgraduate year (PGY) levels were included for participation in this study. The curriculum was designed to be conducted through phases similar to the previously described TEAMS (Tele-Education-Assisted Mentorship in Surgery) methodology 10 : baseline survey, online didactics, baseline assessment, live instructional session, immediate post-course survey, and six-month survey and assessment (Figure 1). A baseline survey regarding confidence and experience with FAST exams was first distributed to the participating residents (available in Supplemental File 1). Survey items included the number of FAST exams residents had previously performed, their self-perceived confidence in obtaining each of the 4 window views (pericardial, perihepatic, perisplenic, and pelvic), and their willingness to proceed to the operating room based on their examination findings.
Following this survey, all residents completed the American College of Surgeons UltraSound Essentials for Residents (USER) online course to introduce or reinforce foundational knowledge through didactics. They subsequently underwent an objective assessment of their baseline FAST examination skills using a training mannequin and Butterfly iQ+ portable ultrasound (Butterfly Network, Inc., Guilford, CT). The training mannequin (CAE Blue Phantom™ FAST Exam Ultrasound Training Model, CAE Healthcare, Montreal, Canada) was a full human torso that allowed fluid to be injected into or aspirated from the 4 windows, thus displaying a "negative" or "positive" result. The resident was asked to determine whether each of the 4 spaces was positive or negative for fluid, and an independent examiner determined their FAST exam accuracy, which was scored from zero to 100%. If the resident verbally expressed uncertainty or did not demonstrate ultrasound images supporting their final decision, the response was counted as incorrect.
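For illustration, the scoring rule described above can be sketched in code (a hypothetical helper, not the study's actual scoring instrument; the window names and response fields are assumptions based on the description):

```python
# Hedged sketch of the 4-window FAST scoring rule: a response counts as
# correct only if the call matches the true fluid status, the resident
# expressed no uncertainty, and supporting ultrasound images were shown.
WINDOWS = ["pericardial", "perihepatic", "perisplenic", "pelvic"]

def fast_accuracy(responses, truth):
    """Score a 4-window FAST assessment from 0 to 100%.

    responses: dict mapping window -> (call, certain, images_shown),
    where call is True for "positive" (fluid present).
    truth: dict mapping window -> True/False ground-truth fluid status.
    """
    correct = 0
    for window in WINDOWS:
        call, certain, images_shown = responses[window]
        # Uncertainty or missing image support counts as incorrect.
        if call == truth[window] and certain and images_shown:
            correct += 1
    return 100.0 * correct / len(WINDOWS)
```

With four binary windows, possible scores fall in 25% increments, which limits the resolution of the assessment (a point revisited in the Discussion).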
Residents were then randomized to either a remote cohort session or an in-person cohort session, both held at our institution's surgical simulation lab. These small group sessions began with a presentation by a fellowship-trained trauma faculty surgeon with board certification in critical care, considered the content expert, that detailed how to properly perform a FAST exam and provided multiple examples of positive and negative findings. Three faculty instructors were recruited to teach the course. To minimize variability between instructors, all were provided with identical teaching materials (didactics information, PowerPoint slides, etc.) and primed for the session in the same manner. The faculty instructor to resident ratio was 1:3 or 1:4. Following this presentation, residents practiced obtaining the proper views on the mannequin under the guidance of the faculty surgeon, who provided real-time feedback as they viewed the ultrasound images together.
The remote sessions were conducted using the Zoom Meeting (San Jose, CA) platform with 3 devices connected to the call (Figure 2). The first device was a tablet connected to the Butterfly ultrasound, which the resident used to perform FAST exams. The tablet's screen was shareable to allow the faculty instructor to simultaneously observe the ultrasound views obtained by the participant (Figure 3A, B). The second device was the instructor's computer, located elsewhere in the simulation center, isolated from the residents. The third device was the overhead simulation center camera, which allowed the instructor to view where residents were placing the ultrasound probe on the mannequin and provide feedback if needed. Conversely, the in-person session occurred with the faculty instructor present in the room; however, the same ultrasound and mannequin setup was utilized (Figure 3C). Immediately following this session, the residents completed a post-course survey that repeated the subjective questions present in the baseline survey to evaluate perceived ability and confidence. The residents then returned in six months to repeat the objective assessment of their FAST exam skills, scored by the same independent examiner, and complete the survey for a third time.
Statistical analysis was performed using GraphPad Prism 9 (GraphPad Software Inc., La Jolla, CA).
Surveys utilized Likert scale responses, ranging from 1 (strongly disagree) to 5 (strongly agree). Continuous data were expressed as mean values ± standard deviations (SD). Differences between the remote and in-person groups at the baseline, immediate post-course, and 6-month follow-up time points were evaluated using a two-way analysis of variance (ANOVA). Two-tailed P-values less than .05 were considered significant.
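The analysis itself was run in GraphPad Prism; for readers working in open tools, an equivalent balanced two-way ANOVA (instruction group × time point, with interaction) can be sketched in Python with NumPy and SciPy. This is illustrative only; the function name and data layout are assumptions, not the study's code:

```python
import numpy as np
from scipy import stats

def two_way_anova(data):
    """Balanced two-way ANOVA with interaction.

    data: array of shape (a, b, n) -- factor A levels (e.g., group) x
    factor B levels (e.g., time point) x replicates per cell (n > 1).
    Returns a dict of (F, p) for the A, B, and A x B effects.
    """
    a, b, n = data.shape
    grand = data.mean()
    mean_a = data.mean(axis=(1, 2))    # marginal means per level of A
    mean_b = data.mean(axis=(0, 2))    # marginal means per level of B
    mean_cell = data.mean(axis=2)      # per-cell means

    # Sums of squares for main effects, interaction, and error.
    ss_a = n * b * np.sum((mean_a - grand) ** 2)
    ss_b = n * a * np.sum((mean_b - grand) ** 2)
    ss_ab = n * np.sum(
        (mean_cell - mean_a[:, None] - mean_b[None, :] + grand) ** 2
    )
    ss_err = np.sum((data - mean_cell[:, :, None]) ** 2)

    df = {"A": a - 1, "B": b - 1, "AB": (a - 1) * (b - 1),
          "err": a * b * (n - 1)}
    ms_err = ss_err / df["err"]
    out = {}
    for name, ss in (("A", ss_a), ("B", ss_b), ("AB", ss_ab)):
        f_val = (ss / df[name]) / ms_err
        out[name] = (f_val, stats.f.sf(f_val, df[name], df["err"]))
    return out
```

The p-values come from the upper tail of the F distribution with the effect and error degrees of freedom, mirroring the standard ANOVA table Prism reports.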

Results
Fourteen general surgery residents completed the ultrasound course, with seven residents randomized to each cohort. Eight (57%) were junior residents, defined as being in their first or second year of training. Eleven (79%) had performed at least one FAST exam prior to the course, with seven having done one to 10 FAST exams. Nine (64%) had received previous instruction in FAST exams before the study, although only three reported being satisfied with that training. There were no significant differences in baseline characteristics between the two groups regarding gender, PGY level, previous FAST exam training, and the number of prior FAST exams performed (Table 1).

Survey Responses
Baseline pre-course survey responses did not differ significantly between the remote and in-person instruction groups. This was true for individual questions, as well as for the average of survey responses (Table 2). Immediately following the course, both the in-person and remote groups provided higher survey ratings than pre-course (Figure 4). Interestingly, the responses to individual questions were not significantly higher than those obtained pre-course, but the average of all responses had significantly increased. This was true for both remote and in-person groups, with the average of responses increasing from 3.12 ± .35 (SD) to 4.13 ± .36 and 3.17 ± .39 to 4.02 ± .28, respectively. There were no significant differences in immediate post-course surveys between the remote and in-person groups; this was true for both individual questions and the overall average.
At the six-month follow-up, there were decreases in response scores for all survey questions, and thus the average response as well, compared to the immediate post-course results, but these did not reach statistical significance. The six-month survey responses remained higher than those obtained pre-course, and this was observed in both the remote and in-person groups. Similar to what was observed immediately post-course, the responses to individual questions at six months did not differ significantly from those of pre-course, but the average of responses at six months was significantly higher than the pre-course average. This applied to both groups, with the remote average of 3.92 ± .36 (compared to 3.12 ± .35 pre-course) and the in-person average of 3.80 ± .38 (3.17 ± .39 pre-course). There were no significant differences in six-month survey responses between the remote and in-person groups.

Objective Skills Assessment
Prior to the course, the remote group demonstrated 70.8 ± 33.2% accuracy during the FAST exam objective assessment. The in-person group's accuracy was 75 ± 20.4%, which did not differ significantly from the remote group. At six months, each group's accuracy had increased from its pre-course baseline, with the in-person group increasing to 87.5 ± 13.7% and the remote group increasing to 85 ± 13.7%. However, this increase was not statistically significant (Figure 5).

Discussion
This pilot study was intended to evaluate the feasibility of a remote FAST exam course for general surgery trainees. Our results demonstrate that this virtual method of education was attainable, and pilot data yielded performance outcomes and self-reported confidence levels similar to those of traditional in-person learning. FAST exams are considered a highly "hands-on" skill and therefore would be seemingly easier to teach with an instructor physically present in the same location as the trainee. However, our findings demonstrated that with adequate communication and visualization, the same educational content could be delivered through remote instruction.
Interestingly, the concept of virtual instruction is not a recent innovation, as nearly a decade prior, minimally invasive surgeons in North America successfully taught the Fundamentals of Laparoscopic Surgery (FLS) tasks to surgeons in Botswana via Skype. 11 The cost was found to be significantly less and the long-term learning effects more sustained compared to the traditional in-person three-day courses held previously, although connection issues and dropped calls were frequently reported drawbacks. While that landmark endeavor represents significant progress for medical education, internet accessibility and connectivity have since become more reliable, and the quality of video and audio technology has continued to improve. More recently, Skype has been used to instruct medical professionals with no formal ultrasound training on how to accurately obtain lung views and diagnose or exclude a pneumothorax. 12 As this examination requires second-to-second real-time visual feedback to verify the subtle pleural sliding, the results of this virtual course showcase the capacity of technological advancement in the field.
To date, a limited number of studies have investigated the role of remote instruction in teaching FAST exam competency. Terry et al. describe the process by which Ugandan emergency care providers received virtual FAST exam education from instructors located in the United States, which led to improvement in the quality of subsequent ultrasound images and greater point-of-care ultrasound (POCUS) utilization at their institution. 13 However, these evaluations were in the form of written comments and ratings received via email several days later, representing a relative delay in feedback that may not be generalizable nor beneficial to resident education. In another study, the Philips Lumify (Philips Medical System, Bothell, WA) portable ultrasound system was utilized in undergraduate medical education to teach FAST exams remotely to medical students. 14 The authors demonstrated this method to be non-inferior to in-person teaching, but a significant weakness was its short-term follow-up, as it only assessed immediate post-course retention.
Our results expand upon the findings reported by Drake and Terry et al. by trialing remote instruction on a different set of learners, that is, general surgery residents, and examining longer-term outcomes at six months. Our pilot group exhibited significant increases in confidence ratings following the course, especially immediately after the session. Although some decrease in confidence was identified at the six-month follow-up, the average rating remained significantly higher than pre-course values. This suggests there may be educational value in offering the course on a regular basis or incorporating "skills refresher" sessions to mitigate the attrition associated with time and lack of practice. Furthermore, there were no significant differences between the two teaching groups at any of the three time points (pre-, immediate post-, and six-month follow-up), indicating that remote instruction has effects on self-reported confidence comparable to traditional in-person teaching.
The objective skills assessment scores both before and after the course also appeared similar between the teaching groups. At six-month follow-up, these scores were higher; however, the increase did not reach statistical significance. The lack of significance may be multifactorial and represents a few limitations of our study. The number of participants in each group was small, though this was anticipated given the pilot design; accordingly, all data can only be interpreted as preliminary, with the expectation that additional participants will follow to strengthen the study. Additionally, the assessment consisted of only four parts, limiting the variability in possible scores. Future studies may consider increasing the number of assessment components as well as integrating a speed or efficiency metric to better detect progression in FAST exam skills and provide further insight into the quality of these training sessions. Another limitation was the inability to account for the effect that clinical experience obtained between the course and six-month follow-up may have exerted on the final self-reported confidence and objective performance. Some trainees had completed rotations on the trauma service between the baseline time point and six-month follow-up, which may have provided additional undetected ultrasonographic practice. In future training sessions, trainees can be queried on the amount of additional FAST exam experience received during the six-month period; since we are unable to eliminate these external factors, this would at least allow us to quantify the degree of additional on-the-job experience obtained and assess its impact on outcomes.
The generalizability of our study may be limited due to the resources that were utilized. We are fortunate to have a simulation center equipped with the mannequin, ultrasound probes, and camera setup necessary for the remote sessions and acknowledge that may not be true for all institutions. However, there are cost-effective alternatives that may be implemented so that remote learning is more attainable in resource-limited settings, such as utilizing live volunteers in place of the mannequin or a smartphone as the third-person overhead camera. Finally, there were differences between remote and in-person learning that were not measurable or captured by our metrics. Aspects of non-verbal communication, body language, and hands-on feedback can be lost through teleconference, representing possible disadvantages of remote learning not investigated in this study.
Although the implementation of this project coincided with the onset of COVID-19, we believe the utility of these findings extends beyond pandemic restrictions. The technology in remote learning overcomes barriers related to distance or travel and allows access to global resources, including internationally known experts and instructors. There are also potential benefits to direct clinical work, in addition to education, via tele-mentoring medical personnel with patients in the field. The advent of FAST exam education via "tele-ultrasound" is beneficial to students and trainees of all experience levels and may also assist in bridging the resource gap encountered by health care providers in low- and middle-income countries. It is our hope that by verifying its feasibility, this study will serve as a foundation for future investigations to continue enhancing the caliber of surgical education.

Figure 1. Study methodology. FAST: Focused Assessment with Sonography for Trauma; ACS USER: American College of Surgeons UltraSound Essentials for Residents.

Figure 2. Remote cohort Zoom meeting setup with 3 connected devices.

Figure 3. Remote (A, B) and in-person (C) FAST exam instruction sessions. (A) Resident practicing on a mannequin with shared ultrasound images and Zoom instructor meeting on wall television. (B) Computer screen of Zoom meeting with shared ultrasound image, faculty instructor, and simulation room overhead view. (C) Faculty instructor providing feedback to a resident during in-person session.

Figure 4. Average survey responses during pre-course, immediate post-course, and six-month follow-up periods for remote and in-person instruction groups. **P ≤ .01 and ***P ≤ .001.

Figure 5. Objective skills assessment of FAST exam accuracy during pre-course and six-month follow-up periods for remote and in-person instruction groups.

Table 2. Individual Survey Question Confidence Ratings and Averages of Survey Responses. FAST: Focused Assessment with Sonography for Trauma; OR: Operating Room. *Differed significantly from pre-course value with P < .05.