Abstract

Objective

To systematically review and synthesize the published literature regarding the education of general practitioners (GPs) and GPs in training (GPTs) in the use of ultrasonography.

Design

This systematic review was prospectively registered in PROSPERO and conducted according to the Cochrane recommendations. We combined studies identified in a previous systematic review with studies from an updated literature search using the same search string. We searched the following databases: MEDLINE via PubMed, Embase via Ovid, CINAHL via EBSCO, Web of Science and the Cochrane Register of Controlled Trials using the words ‘ultrasonography’ and ‘general practice’. Two reviewers independently screened articles, extracted data and assessed the quality of the included papers according to the Downs and Black quality assessment tool. Disagreements were resolved by involving a third reviewer.

Results

Thirty-three papers were included. Ultrasound training was described to include both theoretical and practical training sessions. Theoretical training was achieved through introductory e-learning and/or didactic lectures. Practical training included focussed hands-on training sessions, while some papers described additional longitudinal practical training through proctored scans during clinical work or through self-study practice with continuous feedback on recorded scans.

Conclusion

There was a large variation in ultrasound training programs for GPs and GPTs, with an overall emphasis on focussed practical training. Few studies included a longitudinal learning process in the training program. However, diagnostic accuracy seemed to improve with hours of practical training, and studies including continuous feedback on scans conducted during clinical patient encounters showed superior results.

Lay Summary

Point-of-care ultrasonography is increasingly used by general practitioners (GPs) working in primary care as a diagnostic tool providing earlier and more precise diagnoses. However, ultrasonography is a user-dependent technology and obtaining competence requires both training and practice. Today, there is no consensus about which ultrasound training GPs should have before they start scanning patients in their clinics.

Key Messages
  • Point-of-care ultrasonography is increasingly used by general practitioners (GPs).

  • Ultrasound training programs had an overall emphasis on focussed training sessions.

  • Higher diagnostic accuracy was found with longitudinal practical training.

  • More research is needed to establish how GPs are best trained in ultrasonography.

This systematic review summarizes the available literature describing ultrasound training programs for GPs. Following international recommendations for conducting systematic literature reviews, we included 33 individual articles for analysis, showing considerable variation in the amount of training and the educational elements included in the training. However, most educational programs included both theoretical and practical training sessions, with an overall emphasis on focussed hands-on training sessions.

The ultrasound competences obtained during the training programs were assessed using a variety of different written and practical tests, and generally high pass rates were reported. Less attention was given to the longitudinal learning process and the maintenance of scanning competence following training sessions, but we found that studies including continuous supervised training during regular patient encounters reported higher diagnostic accuracies.

Introduction

The use of point-of-care ultrasonography (PoC-US) is increasing (1–4) across medical specialties, and position papers have stressed that appropriate training is needed for clinicians using PoC-US (5–8). Although developments in technology have made ultrasound images easier to interpret, the technology remains highly user dependent. Educational programs have been developed for clinicians working in hospital settings (9,10). There is no consensus, however, regarding the ultrasound training that general practitioners (GPs) working in primary care should have (11).

GPs are generalists and the use of ultrasound in general practice is not restricted to just one or two applications but covers a broad range of selected applications that are relevant in a primary care setting (11,12). To secure patient safety, it is important that GPs gain sufficient skills to perform ultrasound within the applications they use. Insufficient training may lead to false-positive results and concomitant unwarranted patient anxiety and additional superfluous examinations. On the other hand, false-negative results may lead to serious conditions being overlooked or delays in treatment.

Future GPs are expected to use PoC-US routinely in primary care (13,14) and, based on knowledge from the secondary sector, the American Academy of Family Physicians recommends an ultrasound training program for family medicine residents (15). Introductory ultrasound training sessions may be transferred from the secondary sector as they may be universal for any doctors striving to learn ultrasonography. Longitudinal educational elements, however, such as continuous proctored scans during clinical work and continuous training under supervision, may not be possible in an office-based general practice where GPs often work alone or with only a few colleagues. Furthermore, the opportunities for practicing scanning skills, and maintaining skills over time, are challenged by a high workload, short consultation times and a low frequency of conditions suited to an ultrasound examination in general practice. Hence, there is a pressing need for guidelines based on evidence from general practice to tailor an ultrasound training program that is suitable for GPs working in primary care.

This systematic literature review aims to review and synthesize the published literature about ultrasonography education for GPs and GPs in training (GPTs) to provide a detailed overview of how educational programs are composed, which educational elements are included and how the participants’ technical ultrasonography skills are assessed.

Methods

Information source and search strategy

This systematic review was prospectively registered in PROSPERO (Registration number: CRD42018102738) and conducted according to the Cochrane recommendations (16). Reporting followed PRISMA guidelines (17).

We combined studies identified in a previous, broad-scope, systematic review (11) with studies from an updated literature search using the same search string and methodological rigor. The original literature search was conducted in August 2017, and the updated searches were conducted in July 2018 and February 2020 by the same investigator. Supplementary file 1 provides the full search string and the results.

We searched the following databases: MEDLINE via PubMed, Embase via Ovid, CINAHL via EBSCO, Web of Science and the Cochrane Register of Controlled Trials, using the words ‘ultrasonography’ and ‘general practice’ in combination, both as free text and as the corresponding thesaurus terms in each database. No restrictions were made concerning the year of publication or type of publication.

Eligibility criteria

Articles were included if a published full text was available containing original data from a clinical trial, an observational study, an audit or a case series, and if they described training in the use of ultrasonography among GPs or GPTs. GPs were defined as primary care doctors specializing in family medicine and GPTs as medical doctors or residents working towards specialization in family medicine. Studies were excluded if they described thermal therapeutic ultrasound or ultrasound without production of an image for the clinician to view, or if they were not published in Danish, English, Norwegian or Swedish.

Selection of studies

The papers from the updated literature search were independently screened by two reviewers according to the inclusion and exclusion criteria. The two reviewers compared their individually selected studies for consistency after abstract and full-text reading. All studies from the original systematic review (11) were included for the full-text screening and assessed according to the additional inclusion criteria of this study. Disagreements over whether to include a study were resolved through discussion or by involving a third reviewer. If a paper described a training program without elaborating on the content of the program, the corresponding author was contacted and asked for additional information (Supplementary file 2). The selection of studies is illustrated in Figure 1.

Figure 1. PRISMA flow diagram.

Data extraction

Data on all studies were extracted by two reviewers using a modified version of Cochrane’s data extraction template (Supplementary file 3). The reviewers also assessed the quality of the included studies using the Downs and Black checklist for assessing the methodological quality of both randomized and non-randomized studies of health care interventions (18). As most included studies were non-interventional, we excluded items 19, 21–24 and 27 from the Downs and Black quality assessment checklist, leaving a scale from 0 to 21. Disagreements were resolved through discussion and by involving a third reviewer. We aimed to describe the results narratively as the previous literature review described large variation in the included studies.
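
As a reading aid, the sketch below tallies a modified Downs and Black total once items 19, 21–24 and 27 are dropped. The item numbering follows the checklist, but the assumption that each retained item is scored 0 or 1 (matching the 0–21 scale reported here) and the example ratings are ours for illustration only.

```python
# Minimal sketch of the modified Downs and Black scoring used in this review,
# assuming every retained item is scored 0 or 1 (giving a 0-21 scale); the
# full checklist has 27 items and a more detailed scoring of some of them.
EXCLUDED_ITEMS = {19, 21, 22, 23, 24, 27}  # items dropped for non-interventional studies
RETAINED_ITEMS = [i for i in range(1, 28) if i not in EXCLUDED_ITEMS]  # 21 items remain


def modified_downs_black_score(item_scores: dict) -> int:
    """Sum the scores of the retained items; items not rated count as 0."""
    return sum(item_scores.get(item, 0) for item in RETAINED_ITEMS)


# Hypothetical example: a study fulfilling 13 of the 21 retained items.
example_ratings = {item: 1 for item in RETAINED_ITEMS[:13]}
print(len(RETAINED_ITEMS))                           # 21
print(modified_downs_black_score(example_ratings))   # 13
```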

Results

Study selection

The updated literature search identified 3153 records, of which 636 were duplicates. After title and abstract screening, 126 individual papers were included for full-text screening together with the 51 papers from the previous literature review. After an initial assessment, two papers (19,20) were found to describe the same clinical study, data collection and ultrasound training program. Consequently, we only included one of the papers (19) in the analysis and considered the other paper (20) to be elaborating material. We attempted to contact the corresponding authors of 29 papers that had insufficient information about the training described (Supplementary file 2). Through this correspondence, one additional paper was identified (21).

Thirty-three papers (19,21–52) provided sufficient information on ultrasound training programs to proceed to data extraction. In the remaining papers, the information provided was insufficient. Figure 1 depicts the identification and selection of the studies included and Supplementary file 2 provides information about excluded papers.

Study characteristics

Most studies were observational and only two randomized controlled trials (RCTs) were included (40,46). The papers were published between 1988 and 2020: seven papers (23,33–35,42,43,45) were published before 2003 and the remaining 26 papers (19,21,22,24–32,36–41,44,46–52) in 2012 or later. Thirteen studies originated from North America (21,23–27,33–35,42,43,45,47), 12 from Europe (19,22,30,32,36,38–41,44,46,50), 2 from Africa (29,49), 4 from Asia (28,37,48,51) and 3 from Central or South America (28,31,52); one study (28) was conducted in both Peru and Nepal and therefore appears in two regions.

The methodological quality of the studies varied considerably and low scores were mainly given due to study design, poor reporting or the selection of participating GPs. Downs and Black scores varied from 5 to 17 with a mean score of 12.5 (out of 21). Table 1 provides an overview of the included studies.

Table 1.

Characteristics of the studies describing ultrasound training of GPs or GPs in training, identified in a literature search from inception to 2020.

Study | Year | Origin | Study design | Participants | n | Quality score a
Avramescu et al. (22) | 2017 | Italy, Bulgaria, Romania | Prospective, observational | GPs, other | 102 | 5
Bailey et al. (23) | 2001 | USA | Prospective, observational | GPT | 16 | 10
Blois et al. (24) | 2012 | Canada | Prospective, observational | GP | 1 | 13
Bornemann et al. (25) | 2014 | USA | Prospective, observational | GP | 15 | 5
Bornemann et al. (26) | 2015 | USA | Prospective, observational | GP | 4 | 16
Bornemann et al. (27) | 2017 | USA | Prospective, observational | GPT | 17 | 13
Chavez et al. (28) | 2015 | Peru/Nepal | Prospective, observational | GP | 2 | 17
Chebli et al. (29) | 2017 | Morocco | Cross-sectional | GP | 24 | 13
Colli et al. (30) | 2015 | Italy | Prospective, observational | GP, other | 90 | 12
Del Carpio et al. (31) | 2012 | Argentina | Prospective, observational | GP, GPT | 18 | 10
Dornhofer et al. (48) | 2020 | Indonesia | Prospective, observational | GP, other | 19 | 10
Evangelista et al. (32) | 2016 | Spain | Prospective, observational | GP | 14 | 15
Franklin et al. (21) | 2017 | USA | Prospective, observational | GPT, other | 4 | 12
Hahn et al. (33) | 1988 | USA | Prospective, observational | GP | 3 | 9
Hahn et al. (34) | 1988 | USA | Prospective, observational | GP | 13 | 11
Jones et al. (49) | 2020 | Kenya | Prospective, observational | GP, GPT | 41 | 8
Keith et al. (35) | 2001 | USA | Retrospective, chart review | GPT | ND | 13
Le Lous et al. (36) | 2017 | France | Prospective, controlled trial | GP | 26 | 14
Lee et al. (37) | 2017 | Indonesia | Prospective, observational | GP | 41 | 11
Lindgaard et al. (38) | 2017 | Denmark | Prospective, observational | GP | 5 | 13
Mjølstad et al. (39) | 2012 | Norway | Prospective, observational | GP | 7 | 17
Morbach et al. (40) | 2018 | Germany | Randomized controlled trial | GP | 48 | 15
Mumoli et al. (41) | 2017 | Italy | Prospective, observational | GP | 18 | 14
Nilsson et al. (50) | 2019 | Sweden | Prospective, observational | GP, GPT | 6 | 15
Ornstein et al. (42) | 1990 | USA | Retrospective, observational | GP | 4 | 13
Pervaiz et al. (51) | 2019 | Bangladesh | Prospective, observational | GP | 25 | 10
Rodney et al. (43) | 1990 | USA | Prospective, observational | GP | 2 | 14
Rominger et al. (52) | 2018 | Mexico | Prospective, observational | GP | 8 | 12
Siso-Almirall et al. (44) | 2017 | Spain | Prospective, interventional | GP | 4 | 15
Smith et al. (45) | 1991 | USA | Prospective, observational | GPT | 12 | 16
Szwamel et al. (19) | 2017 | Poland | Cross-sectional | GP | 81 | 13
Todsen et al. (46) | 2016 | Denmark | Randomized controlled trial | GPs, GPT, other | 64 | 17
Wong et al. (47) | 2013 | USA | Prospective, observational | GP | 8 | 9

n, number of GPs and GPTs participating in the training program. ND, no data available.

a Quality assessment according to the modified Downs and Black checklist (0–21).

The composition of ultrasound training or educational programmes

Eighteen studies (19,24,28–30,32,33,37,39–44,47,48,51,52) described the training of GPs, six studies (21,23,27,35,36,45) described the ultrasound training of GPTs and nine studies included both groups in the training (22,25,26,31,34,38,46,49,50). Fifteen studies described using an adapted version of an already established educational program targeted at emergency physicians (24,37,49), obstetricians (35), cardiologists (40), medical students (48) or GPs (19,33,34,38,42), while four studies (21,25–27) described using online course material from ultrasound societies or course providers. Nine studies (21–23,27,33,34,40,47,52) described the ultrasound training as a pilot for GP ultrasound education, but no studies provided a reference for a national ultrasound education or residency training program for GPs.

The training programs included ultrasound examinations of different organs and applications (Figs. 2 and 3). Seven, mainly older, studies described training GPs or GPTs to perform comprehensive obstetric ultrasound examinations for both screening and diagnostic purposes (33–36,42,43,45). One study (22) described an e-learning program divided into modules with different comprehensive ultrasound applications. Five studies described training in PoC-US for screening purposes for abdominal aortic aneurysm (23,24,44) or cystic echinococcosis (29,31), and the remaining studies described training in PoC-US for limited ultrasound examinations of the heart (21,26,32,39,40,50), the lungs (28,51) or the veins in the legs (41). Eleven studies described an ultrasound training program including several PoC-US applications (19,25,27,30,37,38,46–49,52).

Figure 2. Educational elements in point-of-care ultrasound training programs.

Figure 3. Educational elements in training programs for comprehensive ultrasound exams.

The extent of the described ultrasound training varied considerably. Allocated focussed course hours, including preparation, lectures and hands-on training, at a course facility varied from 2 to 52 hours, while longitudinal practical training during clinical work varied from 4 weeks to 3 years. Figure 2 provides an overview of the different compositions of ultrasound training. Supplementary file 4 describes the participants’ experiences with the educational programmes.

Theoretical educational elements

Thirty-one studies (Figs. 2 and 3) had an introductory theoretical educational element that comprised 2% to 100% of the entire training program. Self-study of videos, e-learning or electronic didactic modules was described in nine studies, with durations varying from 20 minutes to 28 hours (21–23,26,27,35,38,41,50). In three studies, e-learning constituted the entire theoretical content of the training program (26,27,38) and, in one study, the entire ultrasound training program (22). Didactic lectures, however, were more commonly described: in 27 studies, participants received didactic teaching lasting from 48 minutes to 12 hours (19,23,28–30,35–37,39–41,46–50,52). The teachers were described as sonographers (34), radiologists (19,28,31,44), obstetricians/gynaecologists (36,52), cardiologists (39), emergency physicians (47,52), equipment manufacturers (34), medical students (37,48) or simply ultrasound experts (21,25,38,40,41,46,51).

Practical educational elements

All studies but one (22) included practical educational elements in the training (Fig. 2), comprising between 43% and 100% of the entire training program. Focussed practical training in the use of ultrasonography was achieved through hands-on training sessions in 32 studies. These hands-on sessions lasted from 1 to 30 hours (19,23,26,28–30,35–41,46–49,52) and included scanning volunteers (19,21,23,31,32,38,39,46–48,52), patients (19,24,25,27–31,33–36,39,41,42,45,50) or simulators/phantoms (25,27,32,36,47) using hand-held portable ultrasound scanners (25,26,29–32,37,39,40,47,50,52), mid-range scanners (19,21,28,38,41,46,48,51) or high-end ultrasound devices (23,35,36). Four older studies (33,34,42,45) used ultrasound devices that are outdated today.

In seven studies (19,21,31,36,38,46,48), the hands-on training was conducted in smaller teams of one to four participants. Hands-on instructors were described as radiologists (28,31,44), expert sonographers with different educational backgrounds (19,21,24,25,27,29,34,38,40–42,46), obstetricians/gynaecologists (36,52), cardiologists (39), emergency physicians (47,52), medical students (37,46,48) or GPs (23,26,45).

Fourteen studies described additional longitudinal practical training of ultrasound competences achieved through supervision during clinical patient encounters at hospital departments (24,30,33,36,41,42,45,50,51) or in a general practice setting (25,27,34,35,52). The participants were supervised by more experienced faculty (24,25,27,35,36,45,51,52), obstetricians (34,42), an ultrasound technician (50) or by a radiologist using telemedicine (33). The supervised scans during clinical work were described to include 5–78 ultrasound examinations (24,25,33,34,41,45,50,51) or to span a period of 3 days to 1 week (30,33,34,41,42,48,52).

Other educational elements

Eleven studies described other longitudinal educational elements as part of the ultrasound training program. In eight studies (19,25–27,38,39,50,52), a time period with self-study practice was included and, in four studies (27,33,34,42), theoretical teaching was also attained through sessions where specialists reviewed video recordings of the participants’ scans together with the participants. In three studies (27,38,52), participants continuously uploaded video recordings of ultrasound scans for review by experts. The participants then received feedback as a written report with suggested points for improvement. One study described a recapitulation meeting, where participants presented three selected cases and the corresponding ultrasound scans from 2 weeks of practice in their own clinics (19).

Assessment of ultrasound competences following training

In 20 studies, the training program ended with an assessment of the participants’ ultrasound competences (19,22,23,25,27,31–34,36–38,40,42,44,46–49,51). There was, however, a large variation in the assessment tools; 10 studies described a written test and 16 studies described a practical test. Table 2 provides an overview of these tests and assessments. In 12 studies (22,23,25,31,32,37,40,44,46–49), the assessment followed immediately after the focussed training sessions and, in seven studies (19,27,33,34,36,38,42), the assessment was performed after 1–14 months of longitudinal practical training.

Table 2.

Assessment of competence following an ultrasound training program.

Area of application | Written test | Practical test | Pass rate | Test score | Extent of training | Ref.
Broad application | Test with 20 quizzes | – | 97% | 120–200 points | 28 hours | (22)
Broad application | Multiple-choice test | OSCE | – | Multiple-choice scores: 84%; OSCE scores: 85% | 4 weeks | (27)a
Broad application | – | OSCE and OSAUS | 100% | – | 12 months | (38)a
Broad application | Non-specified test | Non-specified test | No statistics: ‘The vast majority of students had no problems with the exam’ | – | 44 hours | (19)a
Broad application | Non-specified test | Competence was rated (1–3 points) | 82.9% | Test scores: 82.1%; practical exam score: 83.2% | 4 weeks | (37)a
Broad application | – | OSAUS and diagnostic test on four patients | – | Mean OSAUS 27.4 (max = 40) | 5 hours | (46)a
Broad application | – | Observational evaluation | – | – | ND | (25)a
Broad application | – | Observational evaluation using checklist | 100% | – | 8 hours | (47)a
Broad application | – | OSCE | – | – | 3 days | (49)a
Broad application | Multiple-choice test | Competence was rated (1–3 points) | 79% written; 79% practical | 83% written; 83% practical | 24 hours | (48)a
Aorta | – | Observational evaluation | – | – | 25 hours | (44)a
Aorta | – | Competence was rated (1–4 points) | – | 62.5% (10/16) residents reached Level 4 after 3.4 scans | 2.3 hours | (23)a
Heart | Case-based multiple-choice test | – | 100% | κ = 0.72; κ = 0.81 | 3.9 hours | (40)a
Heart | – | Observational evaluation | – | – | 28 hours | (32)a
Lung | Interpretation of 25 recorded scans | Observational evaluation | – | – | 7 days | (51)a
Abdomen | Non-specified test | – | 100% | – | 20 hours | (31)a
Obstetrics | – | Images were scored blindly | – | 70% | 4 hours | (36)a
Obstetrics | Non-specified test | Non-specified test | – | – | 14 months | (42)
Obstetrics | Non-specified test | – | – | – | 52 hours | (33)
Obstetrics | ARDMS test | Competence was rated in 21 performance areas (1–5 points) | 100% | Written test: mean 3.8; practical test: mean 159 | 14 months | (34)

OSCE, objective structured clinical examination; OSAUS, objective structured assessment of ultrasound skills; ARDMS, American Registry for Diagnostic Medical Sonographers; ND, no data available.

a Studies describing the use of point-of-care ultrasound.

The assessment of the participants’ ultrasound skills was performed by radiologists (33,35,38,44,46), sonographers (21,24,32,34,42), cardiologists (39), obstetricians (36), GPs experienced in using ultrasonography (23,26,27,45) or other qualified assessors (19,40,47,51). The results of the tests assessing participants’ ultrasound skills following their training were described in 13 studies (19,22,23,27,31,34,36–38,40,46–48). Generally, test scores were good and the studies reported almost 100% pass rates, except in four studies where participants either performed full detailed ultrasound examinations after just a few hours of training (23,36) or were trained by first-year medical students in a very extensive PoC-US curriculum (37,48). Four studies (34,37,40,48) described how diagnostic skills improved with training by using pre- and post-assessment tests.

Diagnostic accuracy following training

Fifteen studies (23,24,26,30,32,33,35,38,39,41,44–46,50,51) compared ultrasound examinations performed by GPs or GPTs to an ultrasound specialist’s examination or interpretation (Supplementary file 5). It was not possible to conduct a meta-analysis of the results as application, scanning protocols and available information varied considerably between the included studies.

Across applications, there was no overall association between hours of training, number of performed scans and diagnostic accuracy. Instead, diagnostic accuracy seemed to depend on the area of application; for example, one study (23) found that an average of 3.4 exams was sufficient to achieve an independent competence level for scanning the aorta, while another study (21) found that fewer than 35 examinations would be insufficient for a reliable estimation of left ventricular mass index using limited echocardiography. Scanning routine seemed to be of importance for heart examinations as four studies found that variability in measurements decreased (21,26) and agreement on diagnosis slightly increased (32,50) with the number of performed heart examinations. Moreover, the ability to rule out pathology in ultrasound examinations of the heart seemed to improve with hours of focussed practical training (Supplementary file 5). Two controlled trials (36,46) also demonstrated how added simulation-based focussed training increased participants’ diagnostic performance in other areas of application.
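
For readers unfamiliar with the agreement statistics reported in some of these comparisons (e.g., the κ values listed for one cardiac study (40) in Table 2), such values are typically chance-corrected agreement coefficients of the Cohen's kappa type; the standard definition is shown below as a reference only, not as the specific analysis performed in any particular included study.

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```

Here $p_o$ is the observed proportion of agreement between the trainee's and the expert's interpretations and $p_e$ is the agreement expected by chance; $\kappa = 1$ indicates perfect agreement and $\kappa = 0$ indicates agreement no better than chance.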

When the training programs included more applications, diagnostic accuracy seemed to improve with a longer period of practical training. Studies with longitudinal practical training during clinical work, including continuous feedback (33,38,41), showed superior results.

Five studies provided other quality estimates of GPs’ ultrasound examinations. Two studies (42,43) found little discrepancy between GPs’ obstetric ultrasound examinations and subsequent birth outcomes; two studies reported that only 85% (33) and 70% (36) of images obtained from GPs’ ultrasound examinations were adequate to make a diagnosis; and, finally, one study (31), which described screening for echinococcosis, reported that all cases found by GPs in the first screening round following training were false positives.
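
As a general epidemiological point (not an analysis reported in the cited screening study), a predominance of false positives in a first screening round is what one would expect when the target condition is rare, because the positive predictive value (PPV) of any imperfect examination falls with prevalence:

```latex
\mathrm{PPV} = \frac{\mathrm{Se}\cdot p}{\mathrm{Se}\cdot p + (1-\mathrm{Sp})\cdot(1-p)}
```

where Se is sensitivity, Sp is specificity and $p$ is the prevalence of the condition; for a rare condition such as cystic echinococcosis, the $(1-\mathrm{Sp})(1-p)$ term dominates the denominator, so even a fairly specific scan can yield mostly false-positive findings.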

Discussion

Statement of principle findings

This systematic literature review reports how most educational programs for GPs and GPTs include both theoretical and practical training sessions, with an overall emphasis on focussed practical training of ultrasound competences. There was large variation in the extent of training, the educational elements included in the training and in the curricula described in the included papers. Theoretical training was typically achieved through didactic lectures, although more recent papers described the use of e-learning programs. Most commonly, practical training of ultrasound competences consisted of focussed hands-on training sessions. Some papers, however, also described additional longitudinal practical training through proctored scans during clinical work in hospital departments or in general practice. Other papers described longitudinal self-study practice with continuous feedback on recorded scans. The teachers in the focussed theoretical and practical training sessions were ultrasound specialists and only a few studies described involving GPs as supervisors in longitudinal training. The ultrasound competences obtained during the training programs were assessed using a variety of different written and practical tests and, generally, high pass rates were reported. Only a few studies reported diagnostic accuracy of the participants’ scans and, for each area of application, we generally found that quality increased with the number of hours of practical training.

Strengths and weaknesses of the study

This study followed the Cochrane guidelines for conducting systematic literature reviews and we searched the five largest databases that were expected to include relevant studies. We reviewed all reference lists for grey literature and contacted corresponding authors for elaborating material. Hence, we aimed to include the totality of information regarding training in PoC-US in general practice. Nevertheless, a total of 20 studies were excluded because no full-text paper was available, and one study was excluded due to language. Hence, some information might have been missed.

To avoid bias and personal interpretations, the screening process, data extraction and quality assessment were done independently by two reviewers, involving a third party in case of inconsistencies in the evaluations. However, our conclusions are limited by the fact that the quality of the included papers was poor and that the training programs were heterogeneous. Furthermore, only a few authors returned to us with elaborating material. Hence, it was difficult to compare the data and a meta-analysis was not possible.

Additionally, seven studies were more than 10 years old and described a different use of ultrasound than the PoC-US known today. Developments in ultrasound technology have made the interpretation of images easier. Thus, the change in technology may limit the relevance of results from older studies.

Findings in relation to other studies

The previous literature review (11) reported large differences in GPs’ ultrasound training without elaboration on the composition of educational programs. In line with international recommendations (8,53), we found that most training programs included both theoretical and practical educational elements. The Royal College of Radiologists (8) has stated that basic theoretical training, including knowledge about the physics of ultrasound, knobology and artefacts, is a prerequisite for any practical ultrasound training. In most of the studies included in our review, introductory theoretical training was achieved through either e-learning or lectures and previous studies have found no difference between the two (6). Indeed, the World Federation for Ultrasound in Medicine and Biology (WFUMB) position paper on PoC-US (6) recommends using web-based theoretical teaching. This may be particularly beneficial for GPs as it allows for flexible study hours, which may reduce absence from their clinics. Furthermore, having continuous access to online knowledge resources may be advantageous for the longitudinal learning process.

Focussed practical training was an essential part of the training programs described in the included studies and only one study (22) failed to include or mention the need for practical training of ultrasound competences. Previous research (54) shows that, even though GPs may acquire skills in short, focussed training sessions, skills obtained at a training facility do not automatically transfer into clinical use, and longitudinal training is necessary to develop expertise. This is supported by guidelines (6,8,10,15,55) recommending that practical ultrasound training should include both focussed and longitudinal educational elements. Longitudinal training, however, was only described in a few of the studies included in this review.

WFUMB (6) has suggested that focussed hands-on training might involve phantoms and virtual reality equipment. We found that most training programs involved training with patients and only a few studies involved training with phantoms, possibly due to the area of application (transvaginal ultrasonography). Whereas the two included controlled trials (36,46) found that simulation-based training improved diagnostic skills, training on phantoms or healthy volunteers does not sufficiently train the recognition of pathology (8). Hence, educational programs for GPs should also include scanning patients with pathological ultrasound findings.

For longitudinal ultrasound training, guidelines (8,53) recommend supervised scans during patient encounters. Yet, there is no consensus regarding the number of supervised examinations needed to obtain proficiency (55). For focussed emergency ultrasound, five proctored scans per week during the training period and a minimum of 50 scans for ultrasound novices have been suggested (8). Other guidelines suggest 150–200 scans (5). In our review, only five of the studies describing PoC-US included proctored scans in the training program and only one of them reported more than 50 supervised scans per participant (24). The possibilities for continuous supervision and feedback might be limited in an office-based general practice. However, three of the included studies (27,32,38) described web-based solutions where participants continuously uploaded scans for review by experts and, in two of the studies (27,38), they subsequently received written feedback on performance. Other studies (33,34,42) described feedback sessions where ultrasound specialists reviewed video recordings of scans together with the participants. Feedback sessions offer the opportunity to improve image acquisition and image interpretation and to incorporate ultrasound results into clinical decision-making (55). Still, despite the high learning potential, such review sessions are more resource demanding and may not be possible on a larger scale.

A previous qualitative study (12) has described how GPs continuously train their ultrasound competences during their daily routines. To support this longitudinal learning process, some guidelines (8,15,55) have recommended that ultrasound novices are assigned an ultrasound mentor following participation in a training program. The European Federation for Ultrasound in Medicine and Biology (53) suggests that ultrasound-competent practitioners train novice practitioners in the use of PoC-US as this will increase the understanding of the clinical setting and application. Hence, if ultrasound is to be implemented in general practice on a larger scale, attention must be given to the education of GPs who are ultrasound experts and may act as teachers in focussed teaching sessions and as mentors in the longitudinal learning process.

An assessment of ultrasound competences following a training program has also been recommended (15,53,55). However, the descriptions of such assessments are ambiguous and unclear, and the 20 studies in this review that included an assessment described a variety of different assessment tools and tests. Future educational programs for GPs and GPTs must include assessments that reflect the level of competence needed in general practice and uniform, validated assessment tools that account for interrater variability. Furthermore, educational programs must allow for the low frequency of ultrasound examinations in general practice by including arrangements that secure the maintenance of ultrasound skills over time.

In our review, we found no overall difference between training programs for GPs and GPTs. Only one study (35) described a continuous educational program for family medicine residents over a period of 3 years, including annual theoretical and practical ultrasound sessions. If ultrasonography is the future diagnostic tool for every clinician, ultrasound training should be integrated into the residency programs. However, consensus needs to be established as to which ultrasound examinations are best suited for general practice in terms of relevance and complexity. The organization of the health care system will influence the relevance of different applications; for example, pelvic ultrasonography is less relevant for GPs in countries where patients have open access to gynaecologists and, as this review has shown, some ultrasound applications require more training than others. A substantial number of the studies in this review included several ultrasound applications in the same training program. Nonetheless, module-based educational programs for GPs or GPTs, with a stepwise introduction of new ultrasound examinations, may provide the opportunity for participants to develop confidence in the diagnostic process before moving on to the next module and gaining new skills. Longitudinal training, including continuous feedback on performed scans in general practice, seems to yield good long-term ultrasound proficiency. It is not known whether combining short module-based courses with longitudinal supervision by an ultrasound-competent GP mentor may be similarly effective.

Meaning of the study

This study expands on the reported differences in ultrasound training of GPs found in the previous review (11). The study emphasized the need for both theoretical and practical training to become proficient in using PoC-US. However, there is a lack of more precise knowledge regarding the educational content and the extent of training needed to optimize learning. Future studies are needed to expand on this topic. Ideally, RCTs should compare different training programs to best assess which training elements to include. Focus should be on a thorough description of the educational elements. Furthermore, external validity should be strengthened by a broader selection of participants that includes GPs without a special interest in ultrasonography. We need a greater understanding of the way GPs learn and maintain competences over time. Hence, future studies should include an assessment of competences after a follow-up period to learn more about GPs’ ability to maintain skills.

Acknowledgements

The authors would like to thank the corresponding authors of the identified papers who returned elaborating information and material to us.

Declaration

Funding: this study is independent research funded by the Center for General Practice at Aalborg Universitet, Denmark.

Ethics approval: not applicable.

Conflicts of interest: the authors declare that they have no competing interests.

Data availability

Data are available upon reasonable request from the corresponding author.

References

1. Smallwood N, Dachsel M. Point-of-care ultrasound (POCUS): unnecessary gadgetry or evidence-based medicine? Clin Med (Lond) 2018;18(3):219–24.
2. Wittenberg M. Will ultrasound scanners replace the stethoscope? BMJ 2014;348:g3463.
3. Bhagra A, Tierney DM, Sekiguchi H, et al. Point-of-care ultrasonography for primary care physicians and general internists. Mayo Clin Proc 2016;91(12):1811–27.
4. Barron KR, Wagner MS, Hunt PS, et al. A primary care ultrasound fellowship: training for clinical practice and future educators. J Ultrasound Med 2019;38(4):1061–8.
5. spocus.org. Guidelines for Point of Care Ultrasound Utilization in Clinical Practice. San Antonio, TX: The Society of Point of Care Ultrasound, 2017. https://spocus.org/Practice-Guidelines (accessed on 18 May 2020).
6. Dietrich CF, Goudie A, Chiorean L, et al. Point of care ultrasound: a WFUMB position paper. Ultrasound Med Biol 2017;43(1):49–58.
7. Soni NJ, Schnobrich D, Mathews BK, et al. Point-of-care ultrasound for hospitalists: a position statement of the Society of Hospital Medicine. J Hosp Med 2019;14:E1–6.
8. rcr.ac.uk. Ultrasound Training Recommendations for Medical and Surgical Specialities. 3rd edn. London, UK: The Royal College of Radiologists, 2017. https://www.rcr.ac.uk/system/files/publication/field_publication_files/bfcr173_ultrasound_training_med_surg.pdf (accessed on 18 May 2020).
9. Arntfield R, Millington S, Ainsworth C, et al. Canadian recommendations for critical care ultrasound training and competency. Can Respir J 2014;21(6):341–5.
10. Atkinson P, Bowra J, Lambert M, et al. International Federation for Emergency Medicine point of care ultrasound curriculum. CJEM 2015;17(2):161–70.
11. Andersen CA, Holden S, Vela J, et al. Point-of-care ultrasound in general practice: a systematic review. Ann Fam Med 2019;17(1):61–9.
12. Andersen CA, Davidsen AS, Brodersen J, et al. Danish general practitioners have found their own way of using point-of-care ultrasonography in primary care: a qualitative study. BMC Fam Pract 2019;20(1):89.
13. Steinmetz P, Oleskevich S. The benefits of doing ultrasound exams in your office. J Fam Pract 2016;65(8):517–23.
14. Bornemann P, Barreto T. Point-of-care ultrasonography in family medicine. Am Fam Physician 2018;98(4):200.
15. aafp.org. Recommended Curriculum Guidelines for Family Medicine Residents: Point of Care Ultrasound. Chicago, IL: American Academy of Family Physicians, 2016. https://www.aafp.org/dam/AAFP/documents/medical_education_residency/program_directors/Reprint290D_POCUS.pdf (accessed on 18 May 2020).
16. Higgins JPT, Thomas J, Chandler J, et al. (eds). Cochrane Handbook for Systematic Reviews of Interventions. 2nd edn. Chichester, UK: John Wiley & Sons, 2019.
17. Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009;339:b2535.
18. Downs SH, Black N. The feasibility of creating a checklist for the assessment of the methodological quality both of randomized and non-randomized studies of health care interventions. J Epidemiol Community Health 1998;52(6):377–84.
19. Szwamel K, Polanski P, Kurpas D. Weekend courses on ultrasonography as a form of teaching knowledge and the skills necessary to perform ultrasounds in the family physician’s practice. Family Med Prim Care Rev 2017;19(3):270–6.
20. Szwamel K, Polański P, Kurpas D. Experiences of family physicians after a CME ultrasound course. Family Med Prim Care Rev 2017;19(1):62–9.
21. Franklin F, Lee L, Mays K, et al. Quantifying the amount of training needed for novice ultrasonographers to become proficient in measuring left ventricular mass index. EC Cardiol 2017;3(1):26–33.
22. Avramescu ET, Mitrache DM. Management of on line learning platforms in medical education. EpSBS 2017;23:1559–69.
23. Bailey RP, Ault M, Greengold NL, et al. Ultrasonography performed by primary care residents for abdominal aortic aneurysm screening. J Gen Intern Med 2001;16(12):845–9.
24. Blois B. Office-based ultrasound screening for abdominal aortic aneurysm. Can Fam Physician 2012;58(3):172.
25. Bornemann P, Bornemann G. Military family physicians’ perceptions of a pocket point-of-care ultrasound device in clinical practice. Mil Med 2014;179(12):1474–7.
26. Bornemann P, Johnson J, Tiglao S, et al. Assessment of primary care physicians’ use of a pocket ultrasound device to measure left ventricular mass in patients with hypertension. J Am Board Fam Med 2015;28(6):706–12.
27. Bornemann P. Assessment of a novel point-of-care ultrasound curriculum’s effect on competency measures in family medicine graduate medical education. J Ultrasound Med 2017;36(6):1205–11.
28. Chavez MA, Naithani N, Gilman RH, et al. Agreement between the World Health Organization algorithm and lung consolidation identified using point-of-care ultrasound for the diagnosis of childhood pneumonia by general practitioners. Lung 2015;193(4):531–8.
29. Chebli H, Idrissi LE, Benazzouz M, et al. Human cystic echinococcosis in Morocco: ultrasound screening in the Mid Atlas through an Italian-Moroccan partnership. PLoS Negl Trop Dis 2017;11(3):e0005384.
30. Colli A, Prati D, Fraquelli M, et al. The use of a pocket-sized ultrasound device improves physical examination: results of an in- and outpatient cohort study. PLoS One 2015;10(3):e0122181.
31. Del Carpio M, Mercapide CH, Salvitti JC, et al. Early diagnosis, treatment and follow-up of cystic echinococcosis in remote rural areas in Patagonia: impact of ultrasound training of non-specialists. PLoS Negl Trop Dis 2012;6(1):e1444.
32. Evangelista A, Galuppo V, Mendez J, et al. Hand-held cardiac ultrasound screening performed by family doctors with remote expert support interpretation. Heart 2016;102(5):376–82.
33. Hahn RG, Ho S, Roi LD, et al. Cost-effectiveness of office obstetrical ultrasound in family practice: preliminary considerations. J Am Board Fam Pract 1988;1(1):33–8.
34. Hahn RG, Roi LD, Ornstein SM, et al. Obstetric ultrasound training for family physicians. Results from a multi-site study. J Fam Pract 1988;26(5):553–8.
35. Keith R, Frisch L. Fetal biometry: a comparison of family physicians and radiologists. Fam Med 2001;33(2):111–4.
36. Le Lous M, De Chanaud N, Bourret A, et al. Improving the quality of transvaginal ultrasound scan by simulation training for general practice residents. Adv Simul (Lond) 2017;2:24.
37. Lee JB, Tse C, Keown T, et al. Evaluation of a point of care ultrasound curriculum for Indonesian physicians taught by first-year medical students. World J Emerg Med 2017;8(4):281–6.
38. Lindgaard K, Riisgaard L. Validation of ultrasound examinations performed by general practitioners. Scand J Prim Health Care 2017;35(3):256–61.
39. Mjolstad OC, Snare SR, Folkvord L, et al. Assessment of left ventricular function by GPs using pocket-sized ultrasound. Fam Pract 2012;29(5):534–40.
40. Morbach C, Buck T, Rost C, et al. Point-of-care B-type natriuretic peptide and portable echocardiography for assessment of patients with suspected heart failure in primary care: rationale and design of the three-part Handheld-BNP program and results of the training study. Clin Res Cardiol 2018;107(2):95–107.
41. Mumoli N, Vitale J, Giorgi-Pierfranceschi M, et al. General practitioner-performed compression ultrasonography for diagnosis of deep vein thrombosis of the leg: a multicenter, prospective cohort study. Ann Fam Med 2017;15(6):535–9.
42. Ornstein SM, Smith MA, Peggs J, et al. Obstetric ultrasound by family physicians. Adequacy as assessed by pregnancy outcome. J Fam Pract 1990;30(4):403–8.
43. Rodney WM, Prislin MD, Orientale E, et al. Family practice obstetric ultrasound in an urban community health center. Birth outcomes and examination accuracy of the initial 227 cases. J Fam Pract 1990;30(2):163–8.
44. Siso-Almirall A, Kostov B, Navarro Gonzalez M, et al. Abdominal aortic aneurysm screening program using hand-held ultrasound in primary healthcare. PLoS One 2017;12(4):e0176877.
45. Smith CB, Sakornbut EL, Dickinson LC, et al. Quantification of training in obstetrical ultrasound: a study of family practice residents. J Clin Ultrasound 1991;19(8):479–83.
46. Todsen T, Jensen ML, Tolsgaard MG, et al. Transfer from point-of-care ultrasonography training to diagnostic performance on patients—a randomized controlled trial. Am J Surg 2016;211(1):40–5.
47. Wong F, Franco Z, Phelan MB, et al. Development of a pilot family medicine hand-carried ultrasound course. WMJ 2013;112(6):257–61.
48. Dornhofer K, Farhat A, Guan K, et al. Evaluation of a point-of-care ultrasound curriculum taught by medical students for physicians, nurses, and midwives in rural Indonesia. J Clin Ultrasound 2020;48(3):145–51.
49. Jones L, Gathu C, Szkwarko D, et al. Expanding point-of-care ultrasound training in a low- and middle-income country: experiences from a collaborative short-training workshop in Kenya. Fam Med 2020;52(1):38–42.
50. Nilsson G, Söderström L, Alverlind K, et al. Hand-held cardiac ultrasound examinations performed in primary care patients by nonexperts to identify reduced ejection fraction. BMC Med Educ 2019;19(1):282.
51. Pervaiz F, Hossen S, Chavez MA, et al. Training and standardization of general practitioners in the use of lung ultrasound for the diagnosis of pediatric pneumonia. Pediatr Pulmonol 2019;54(11):1753–9.
52. Rominger AH, Gomez GAA, Elliott P. The implementation of a longitudinal POCUS curriculum for physicians working at rural outpatient clinics in Chiapas, Mexico. Crit Ultrasound J 2018;10(1):19.
53. The European Federation for Ultrasound in Medicine and Biology. Minimum training recommendations for the practice of medical ultrasound. Ultraschall Med 2006;27(1):79–105.
54. Wearne S. Teaching procedural skills in general practice. Aust Fam Physician 2011;40(1–2):63–7.
55. Olgers TJ, Azizi N, Blans MJ, et al. Point-of-care ultrasound (PoCUS) for the internist in acute medicine: a uniform curriculum. Neth J Med 2019;77(5):168–76.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model)